FIR and IIR Synapses, a New Neural Network Architecture for Time Series Modeling

Cited by: 124
Authors
Back, A. D. [1 ]
Tsoi, A. C. [1 ]
Affiliations
[1] Univ Queensland, Dept Elect Engn, Brisbane, Qld 4072, Australia
Keywords
DOI
10.1162/neco.1991.3.3.375
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A new neural network architecture is proposed, involving either a local-feedforward global-feedforward and/or a local-recurrent global-feedforward structure. A learning rule minimizing a mean-square-error criterion is derived. The performance of this algorithm (the local-recurrent global-feedforward architecture) is compared with that of a local-feedforward global-feedforward architecture, and the local-recurrent global-feedforward model is shown to perform better.
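The abstract above describes replacing the scalar synaptic weights of a feedforward network with linear filters: an FIR synapse is a tapped delay line over past inputs, while an IIR synapse adds local feedback over its own past outputs (local recurrence), with the network connectivity between neurons remaining feedforward (global feedforward). A minimal sketch of the two synapse types, assuming a simple direct-form filter and a tanh squashing function (function names and coefficient layouts here are illustrative, not the authors' exact formulation):

```python
import math

def fir_synapse(x_hist, b):
    """FIR synapse: weighted sum over current and past inputs
    (tapped delay line), y[n] = sum_k b[k] * x[n-k]."""
    return sum(bk * xk for bk, xk in zip(b, x_hist))

def iir_synapse(x_hist, y_hist, b, a):
    """IIR synapse: FIR part plus feedback over the synapse's own
    past outputs, y[n] = sum_k b[k]*x[n-k] + sum_j a[j]*y[n-1-j].
    The feedback is local to the synapse (local recurrent)."""
    return (sum(bk * xk for bk, xk in zip(b, x_hist))
            + sum(aj * yj for aj, yj in zip(a, y_hist)))

def neuron(synapse_outputs, bias=0.0):
    """Neuron squashes the sum of its synaptic filter outputs;
    connections between neurons stay feedforward (global feedforward)."""
    return math.tanh(sum(synapse_outputs) + bias)
```

Training such a network by gradient descent on a mean-square-error criterion requires propagating derivatives through the filter dynamics as well as the weights, which is the learning rule the paper derives.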
Pages: 375-385
Page count: 11
References (5 items)
[1] [Anonymous], THESIS
[2] Jordan, M. I., 1988, SUPERVISED LEARNING, p. 88
[3] Lapedes, A., 1987, LA UR 87 2662 C 8706
[4] Lippmann, R. P., 1988, Computer Architecture News, 16, p. 7. DOI: 10.1109/MASSP.1987.1165576, 10.1145/44571.44572
[5] Waibel, A., Hanazawa, T., Hinton, G., Shikano, K., Lang, K. J. Phoneme recognition using time-delay neural networks. IEEE Transactions on Acoustics, Speech, and Signal Processing, 1989, 37(3), 328-339.