A SIMPLIFIED GRADIENT ALGORITHM FOR IIR SYNAPSE MULTILAYER PERCEPTRONS

Cited: 29
Authors
BACK, AD
TSOI, AC
DOI
10.1162/neco.1993.5.3.456
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
A network architecture with a global feedforward, local recurrent construction was presented recently as a new means of modeling nonlinear dynamic time series (Back and Tsoi 1991a). The training rule used was based on minimizing the least mean square (LMS) error and performed well, although the memory required for large networks may become significant when many feedback connections are used. In this note, a modified training algorithm based on a technique for linear adaptive filters is presented, simplifying the gradient calculations significantly. The memory requirement is reduced from O[n_a(n_a + n_b)N_s] to O[(2n_a + n_b)N_s], where n_a is the number of feedback delays, n_b is the number of feedforward delays, and N_s is the total number of synapses. The new algorithm also reduces the number of multiply-adds needed to train each synapse by n_a at each time step. Simulations indicate that the new algorithm performs almost identically to the previous one.
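To make the stated complexity reduction concrete, here is a minimal sketch (not from the paper; the function names are illustrative, and it assumes the O[.] expressions can be read directly as per-network state-variable counts) comparing the memory footprints of the original and simplified gradient algorithms:

```python
def memory_original(n_a: int, n_b: int, n_s: int) -> int:
    """State variables for the original gradient algorithm,
    following O[n_a * (n_a + n_b) * N_s] (Back and Tsoi 1991a)."""
    return n_a * (n_a + n_b) * n_s

def memory_simplified(n_a: int, n_b: int, n_s: int) -> int:
    """State variables for the simplified algorithm,
    following O[(2*n_a + n_b) * N_s]."""
    return (2 * n_a + n_b) * n_s

# Hypothetical network: 10 feedback delays, 10 feedforward delays,
# 500 synapses in total.
n_a, n_b, n_s = 10, 10, 500
print(memory_original(n_a, n_b, n_s))    # 100000
print(memory_simplified(n_a, n_b, n_s))  # 15000
```

Under these assumed sizes, the simplified algorithm needs less than a sixth of the storage, and the gap widens as n_a grows, since the original cost is quadratic in the number of feedback delays while the simplified cost is linear.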
Pages: 456-462
Page count: 7
Related Papers
4 items in total
[1] Back, A. D.; Tsoi, A. C. FIR and IIR synapses, a new neural network architecture for time series modeling. Neural Computation, 1991, 3(3): 375-385.
[2] Back, A. D. Artificial Neural Networks, Vols. 1 and 2, 1991, p. 961.
[3] Hsia, T. C. A simplified adaptive recursive filter design. Proceedings of the IEEE, 1981, 69(9): 1153-1155.
[4] White, S. A. Proc. 9th Asilomar Conf. Circuits Syst., 1975, p. 21.