Parallel evolutionary training algorithms for “hardware-friendly” neural networks

Cited by: 53
Authors
Vassilis P. Plagianakos
Michael N. Vrahatis
Affiliations
[1] University of Patras,Department of Mathematics and Artificial Intelligence Research Center–UPAIRC
[2] University of Patras,Department of Mathematics
[3] University of Patras Artificial Intelligence Research Center–UPAIRC
Keywords
"hardware-friendly" implementations; integer weight neural networks; "on-chip" training; parallel differential evolution algorithms; threshold activation functions
DOI
10.1023/A:1016545907026
Abstract
In this paper, parallel evolutionary algorithms for integer weight neural network training are presented. To this end, each processor is assigned a subpopulation of potential solutions. The subpopulations are independently evolved in parallel, and occasional migration is employed to allow cooperation between them. The proposed algorithms are applied to train neural networks using threshold activation functions and weight values confined to a narrow band of integers. We constrain the weights and biases to the range [−3, 3], so each can be represented by just 3 bits. Such neural networks are better suited for hardware implementation than real-weight ones. These algorithms have been designed with the consideration that the resulting integer weights require fewer bits to store and that the digital arithmetic operations between them are easier to implement in hardware. Another advantage of the proposed evolutionary strategies is that they are capable of continuing the training process "on-chip", if needed. Our intention is to present results of parallel evolutionary algorithms on this difficult task. Based on the application of the proposed class of methods to classical neural network problems, our experience is that these methods are effective and reliable.
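To make the described scheme concrete, the following is a minimal single-population sketch in Python of differential-evolution-style training with integer weights confined to [−3, 3] and hard threshold activations. The 2-2-1 network, the XOR task, and all population settings are illustrative assumptions, not the authors' setup; the paper's parallel scheme additionally evolves several such subpopulations on different processors and occasionally migrates individuals between them.

```python
# Sketch (not the authors' code): integer-weight training of a threshold
# network with a DE/rand/1/bin-style loop. Weights stay in [-3, 3] (3 bits).
import numpy as np

rng = np.random.default_rng(0)

# Illustrative task: XOR with a 2-2-1 network of hard threshold units.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
N_WEIGHTS = 2 * 2 + 2 + 2 * 1 + 1   # weights plus biases = 9

def forward(w, x):
    """Evaluate the 2-2-1 threshold network for one input pattern."""
    W1, b1 = w[0:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = (x @ W1 + b1 > 0).astype(float)   # hard threshold activation
    return float(h @ W2 + b2 > 0)

def error(w):
    """Number of misclassified training patterns."""
    return sum(forward(w, x) != t for x, t in zip(X, y))

def clip_int(v):
    """Round and confine weight values to the integer band [-3, 3]."""
    return np.clip(np.rint(v), -3, 3)

POP, F, CR, GENS = 20, 0.5, 0.9, 200
pop = rng.integers(-3, 4, size=(POP, N_WEIGHTS)).astype(float)

for gen in range(GENS):
    for i in range(POP):
        # Mutation and crossover, followed by projection onto the integers.
        idx = rng.choice([j for j in range(POP) if j != i], 3, replace=False)
        a, b, c = pop[idx]
        mutant = clip_int(a + F * (b - c))
        cross = rng.random(N_WEIGHTS) < CR
        trial = np.where(cross, mutant, pop[i])
        if error(trial) <= error(pop[i]):   # greedy DE selection
            pop[i] = trial
    best = min(pop, key=error)
    if error(best) == 0:
        break

print("best integer weights:", best.astype(int), "errors:", error(best))
```

Because every candidate is projected onto the integer band after mutation, the evolved weights can be stored in 3 bits each and the same loop could, in principle, be continued "on-chip" without floating-point arithmetic.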
Pages: 307–322
Number of pages: 15