A METHOD FOR SELF-DETERMINATION OF ADAPTIVE LEARNING RATES IN BACK PROPAGATION

Cited by: 67
Author
Weir, M. K.
Institution
[1] Computational Science Department, Scotland
Keywords
NEURAL NETWORK; SELF-DETERMINATION; ADAPTIVE; LEARNING RATE; BACK PROPAGATION; MOMENTUM
DOI
10.1016/0893-6080(91)90073-E
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
A method for the self-determination of adaptive learning rates during training is presented for back propagation in simulated neural networks. The inherent limitations of a learning rate fixed a priori, with respect to overshooting the goal, are analysed, and an optimum step length based on an adaptive learning rate is established. Using a self-determined learning rate in this way makes first-time learning more feasible. A height- and gradient-based algorithm for determining the learning rate is described, together with its computational expense. Experimental results are given comparing the new method with standard back propagation, both with and without momentum-like augmentation. The training times for the new method over a single trial are of a similar order to those of the best fixed learning rates found empirically over multiple trials, with potential for further improvement.
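The height- and gradient-based rule described in the abstract amounts to choosing the step length from the current error "height" and its gradient so that a single step does not overshoot the zero-error goal. The following is a minimal sketch of one such self-determined step, assuming the step length is taken as the error divided by the squared gradient norm (the step that would reach zero error if the surface were locally linear); the XOR task, network sizes, safeguard cap, and variable names are illustrative assumptions, not the paper's exact algorithm.

```python
# Illustrative sketch (not the paper's exact algorithm): back propagation on XOR
# with a self-determined learning rate computed each epoch from the error
# "height" E and its gradient, eta = E / ||grad E||^2, so a single step would
# reach E = 0 under a locally linear model of the error surface.
import numpy as np

rng = np.random.default_rng(0)

# XOR training set
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 3 sigmoid units, sigmoid output (sizes are assumptions)
W1 = rng.normal(scale=0.5, size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)

    # Error "height": half the summed squared error
    E = 0.5 * np.sum((y - T) ** 2)

    # Backward pass (standard back propagation)
    d_out = (y - T) * y * (1 - y)          # dE/d(net) at the output layer
    d_hid = (d_out @ W2.T) * h * (1 - h)   # dE/d(net) at the hidden layer
    gW2, gb2 = h.T @ d_out, d_out.sum(axis=0)
    gW1, gb1 = X.T @ d_hid, d_hid.sum(axis=0)

    # Self-determined learning rate from height and gradient; the cap is a
    # practical safeguard added for this sketch, not part of the paper.
    g_sq = sum(np.sum(g * g) for g in (gW1, gb1, gW2, gb2))
    eta = min(E / (g_sq + 1e-12), 10.0)

    W1 -= eta * gW1; b1 -= eta * gb1
    W2 -= eta * gW2; b2 -= eta * gb2

print(f"final error: {E:.4f}")
```

Because the step length is recomputed from the current height and gradient at every update, no fixed learning rate needs to be found empirically over multiple trials, which is the sense in which the abstract says first-time learning becomes more feasible.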
Pages: 371-379
Number of pages: 9