AN ANALYSIS ON THE PERFORMANCE OF SILICON IMPLEMENTATIONS OF BACKPROPAGATION ALGORITHMS FOR ARTIFICIAL NEURAL NETWORKS

Cited by: 21
Authors
REYNERI, LM
FILIPPI, E
Affiliation
[1] Dipartimento di Elettronica, Politecnico di Torino, 10129, Torino
Keywords
ARTIFICIAL NEURAL NETWORKS; BACKPROPAGATION RULES; LEARNING ALGORITHMS; MULTILAYER PERCEPTRON; NEURAL SIMULATIONS; OPTIMIZATION ALGORITHMS; VLSI IMPLEMENTATIONS;
DOI
10.1109/12.106223
Chinese Library Classification (CLC)
TP3 [Computing technology, computer technology];
Subject Classification Code
0812;
Abstract
This work describes the effects that the constraints imposed by silicon implementations of Artificial Neural Systems have on the behavior of the backpropagation rule. The effects on learning performance of limited weight resolution, range limitations, and the steepness of the activation function are considered. A minimum resolution of about 20 to 22 bits is generally required, but this figure can be reduced to about 14 to 15 bits by properly choosing the learning parameter η. An algorithm is described which finds a near-optimum value of η that attains good performance in the presence of limited resolution; performance can be further improved using a modified batch backpropagation rule. The theoretical analysis has been compared with ad hoc simulations, and the results are discussed in detail.
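The abstract outlines how limited weight resolution constrains backpropagation learning unless the learning rate η is chosen appropriately. As a rough illustration of that mechanism, and not of the paper's own algorithm, the following Python sketch re-quantizes the weights to a fixed number of bits after every update, so updates smaller than one quantization step are lost. The quantize helper, the XOR task, and all parameter values are assumptions made for the sketch.

import numpy as np

def quantize(w, bits, w_range=4.0):
    # Model limited-resolution weight storage: round each weight to the
    # nearest representable level in [-w_range, +w_range].
    levels = 2 ** bits
    step = 2.0 * w_range / levels
    return np.clip(np.round(w / step) * step, -w_range, w_range)

def train_xor(bits=16, eta=0.5, epochs=5000, seed=0):
    # Single-hidden-layer perceptron trained with batch backpropagation,
    # with weights re-quantized after every update step.
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = quantize(rng.uniform(-1, 1, (2, 4)), bits)
    W2 = quantize(rng.uniform(-1, 1, (4, 1)), bits)
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    for _ in range(epochs):
        # Forward pass through the two sigmoid layers.
        H = sig(X @ W1)
        Y = sig(H @ W2)
        # Backward pass: standard delta rule for sigmoid units.
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        # Re-quantization discards any update smaller than one step,
        # which is the resolution effect the paper analyzes.
        W2 = quantize(W2 - eta * H.T @ dY, bits)
        W1 = quantize(W1 - eta * X.T @ dH, bits)
    return np.mean((sig(sig(X @ W1) @ W2) - T) ** 2)

# Coarse resolutions tend to lose small updates and may stall learning,
# while 14-16 bits with a suitable eta usually converges on this toy task.
print(train_xor(bits=8), train_xor(bits=16))

Comparing the final errors for different bit widths and values of eta gives a rough, qualitative feel for the resolution-versus-learning-rate trade-off that the paper studies analytically.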
Pages: 1380-1389
Page count: 10