GRADIENT CALCULATIONS FOR DYNAMIC RECURRENT NEURAL NETWORKS - A SURVEY

Cited: 358
Authors
PEARLMUTTER, BA
Affiliation
[1] Learning Systems Department at Siemens Corporate Research, Princeton
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1995 / Vol. 6 / No. 5
Funding
U.S. National Science Foundation
Keywords
DOI
10.1109/72.410363
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We survey learning algorithms for recurrent neural networks with hidden units and put the various techniques into a common framework. We discuss fixed-point learning algorithms, namely recurrent backpropagation and deterministic Boltzmann machines, and non-fixed-point algorithms, namely backpropagation through time, Elman's history cutoff, and Jordan's output feedback architecture. Forward propagation, an on-line technique that uses adjoint equations, and variations thereof, are also discussed. In many cases, the unified presentation leads to generalizations of various sorts. We discuss advantages and disadvantages of temporally continuous neural networks in contrast to clocked ones, and continue with some "tricks of the trade" for training, using, and simulating continuous-time and recurrent neural networks. We present some simulations, and at the end address issues of computational complexity and learning speed.
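The abstract names backpropagation through time (BPTT) among the surveyed techniques. As an illustration only (not the paper's own notation or code), the following is a minimal sketch of BPTT for a simple tanh recurrent network, computing the gradient of a final-step squared error with respect to the recurrent weight matrix; all names here (`bptt_grad`, the loss choice, the network form) are illustrative assumptions.

```python
import numpy as np

def bptt_grad(W, U, xs, h0, target):
    """Illustrative backpropagation through time for a simple tanh RNN.

    Forward dynamics: h_t = tanh(W @ h_{t-1} + U @ x_t)
    Loss (assumed for this sketch): L = 0.5 * ||h_T - target||^2
    Returns (loss, dL/dW), the gradient w.r.t. the recurrent weights.
    """
    hs = [h0]
    for x in xs:                          # forward pass, storing all activations
        hs.append(np.tanh(W @ hs[-1] + U @ x))
    err = hs[-1] - target
    loss = 0.5 * float(err @ err)

    dW = np.zeros_like(W)
    delta = err                           # dL/dh_T
    for t in range(len(xs), 0, -1):       # backward pass through time
        pre = delta * (1.0 - hs[t] ** 2)  # backprop through tanh
        dW += np.outer(pre, hs[t - 1])    # accumulate weight gradient at step t
        delta = W.T @ pre                 # propagate error to h_{t-1}
    return loss, dW
```

The stored forward activations make the backward pass exact at the cost of memory proportional to sequence length, which is the complexity trade-off the survey discusses; the gradient can be checked against finite differences on the loss.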
Pages: 1212-1228
Page count: 17