Sufficient conditions for error back flow convergence in dynamical recurrent neural networks

Cited by: 1
Author
Aussem, A. [1]
Affiliation
[1] Univ Clermont Ferrand 2, ISIMA, LIMOS, F-63173 Aubiere, France
Source
IJCNN 2000: PROCEEDINGS OF THE IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOL IV | 2000
Keywords
recurrent neural networks; gradient descent; forgetting behavior;
DOI
10.1109/IJCNN.2000.860833
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper extends previous analyses of gradient decay to a class of discrete-time fully recurrent networks, called Dynamical Recurrent Neural Networks (DRNN), obtained by modelling synapses as Finite Impulse Response (FIR) filters instead of multiplicative scalars. Using elementary matrix manipulations, we provide an upper bound on the norm of the weight matrix that ensures the gradient vector, when propagated backward in time through the error-propagation network, decays exponentially to zero. These bounds apply to all FIR architecture proposals as well as to fixed-point recurrent networks, regardless of delay and connectivity. In addition, we show that the computational overhead of the learning algorithm can be reduced drastically by taking advantage of the exponential decay of the gradient.
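The record contains no code, but the abstract's central claim lends itself to a short numerical illustration. The sketch below is a hypothetical minimal example, not the paper's DRNN with FIR synapses but the plain fixed-point recurrent network the bounds also cover: when the spectral norm of the weight matrix W is below 1 and the activation is tanh (so |tanh'| <= 1), the back-propagated gradient norm at lag k is bounded by ||W||^k and thus decays exponentially, which is also what justifies truncating the backward pass early. The network size, sequence length, norm value 0.8, and truncation threshold are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 8, 50

# Hypothetical fixed-point recurrent net x_{t+1} = tanh(W x_t + u_t).
# Rescale W so its spectral norm is 0.8 < 1, satisfying the decay condition.
W = rng.normal(size=(n, n))
W *= 0.8 / np.linalg.norm(W, 2)

x = np.zeros(n)
states = []
for t in range(T):
    x = np.tanh(W @ x + rng.normal(size=n))
    states.append(x)

# Propagate a unit error backward through time. The Jacobian of one step is
# diag(1 - x_{t+1}^2) W, so ||grad at lag k|| <= ||W||_2^k * ||initial error||.
g = np.ones(n) / np.sqrt(n)          # unit-norm error injected at the last step
norms = []
for t in reversed(range(T - 1)):
    g = W.T @ ((1.0 - states[t + 1] ** 2) * g)   # tanh'(a) = 1 - tanh(a)^2
    norms.append(np.linalg.norm(g))

for k in (1, 5, 10, 20):
    print(f"lag {k:2d}: ||grad|| = {norms[k - 1]:.2e}   bound 0.8^k = {0.8 ** k:.2e}")

# Exploit the decay to truncate the backward pass once the gradient is
# negligible, cutting the cost of the learning algorithm.
eps = 1e-6
k_trunc = next((k for k, v in enumerate(norms, 1) if v < eps), len(norms))
print(f"gradient norm drops below {eps:g} at lag {k_trunc}; truncate there")
```

The printed norms sit below the ||W||^k bound at every lag, and the final line shows the lag beyond which further back-propagation contributes nothing measurable, which is the source of the computational savings the abstract describes.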
Pages: 577-582
Page count: 6
Related papers
17 in total
[11] Piche, S.W. Steepest descent algorithms for neural-network controllers and filters [J]. IEEE Transactions on Neural Networks, 1994, 5(2): 198-212.
[12] Pineda, F.J. Generalization of back-propagation to recurrent neural networks [J]. Physical Review Letters, 1987, 59(19): 2229-2232.
[13] Rumelhart, D.E. Parallel Distributed Processing: Explorations in the Microstructure of Cognition, 1987, p. 318.
[14] Tsoi, A.C.; Back, A.D. Locally recurrent globally feedforward networks: a critical review of architectures [J]. IEEE Transactions on Neural Networks, 1994, 5(2): 229-239.
[15] Wan, E.A. Ph.D. thesis, Stanford University, 1993.
[16] Werbos, P.J. Ph.D. thesis, 1974.
[17] Williams, R.J. Connection Science, 1989, 1: 87. DOI: 10.1080/09540098908915631.