RESCALING OF VARIABLES IN BACK PROPAGATION LEARNING

Cited by: 54
Authors
RIGLER, AK [1 ]
IRVINE, JM [1 ]
VOGL, TP [1 ]
Affiliation
[1] ENVIRONM RES INST MICHIGAN,ANN ARBOR,MI 48107
Keywords
BACKWARD ERROR PROPAGATION; LAYERED NETWORKS; RESCALING; PRECONDITIONING;
DOI
10.1016/0893-6080(91)90006-Q
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Use of the logistic derivative in backward error propagation suggests one source of ill-conditioning to be the decreasing multiplier in the computation of the elements of the gradient at each layer. A compensatory rescaling is suggested, based heuristically upon the expected value of the multiplier. Experimental results demonstrate an order of magnitude improvement in convergence.
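The heuristic in the abstract can be sketched in code. The logistic derivative is f'(x) = f(x)(1 - f(x)); if the activation f is treated as uniform on [0, 1], its expected value is E[y(1-y)] = 1/6, so backpropagated gradients shrink by roughly a factor of 6 per layer. The sketch below is an illustrative reconstruction under that uniform-activation assumption, not the authors' code; the function names and the per-layer factor 6**k are mine.

```python
def expected_logistic_derivative(n=100_000):
    """Numerically estimate E[y(1 - y)] for y uniform on [0, 1]
    via the midpoint rule; the exact value is 1/6."""
    total = 0.0
    for i in range(n):
        y = (i + 0.5) / n          # midpoint of the i-th subinterval
        total += y * (1.0 - y)
    return total / n

def rescale_deltas(deltas):
    """Compensatory rescaling: given per-layer backprop deltas ordered
    from the output layer toward the input, multiply the deltas k
    layers below the output by 6**k to offset the expected per-layer
    shrinkage of 1/6."""
    return [[d * 6.0 ** k for d in layer] for k, layer in enumerate(deltas)]

if __name__ == "__main__":
    print(expected_logistic_derivative())            # close to 1/6
    print(rescale_deltas([[0.5], [0.05], [0.004]]))  # deeper layers boosted
```

The factor 6**k is only the heuristic's expected-value compensation; actual activations are not uniform, so in practice this serves as a preconditioner rather than an exact correction.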
Pages: 225-229
Page count: 5
References
8 in total
[2] Curry H. B., 1944, Q APPL MATH, V2, P258, DOI 10.1090/qam/10667
[3] Fletcher R., 1981, PRACTICAL METHODS OP
[4] FORSYTHE AI, 1954, NBS APPL MATH SER, V39, P55
[5] JACOBS RA. Increased rates of convergence through learning rate adaptation [J]. NEURAL NETWORKS, 1988, 1(4):295-307
[6] RIPLEY BD. Uses and abuses of statistical simulation [J]. MATHEMATICAL PROGRAMMING, 1988, 42(1):53-68
[7] Rumelhart DE, 1986, ENCY DATABASE SYST, P45
[8] VOGL TP; MANGIS JK; RIGLER AK; ZINK WT; ALKON DL. Accelerating the convergence of the back-propagation method [J]. BIOLOGICAL CYBERNETICS, 1988, 59(4-5):257-263