Optimal convergence of on-line backpropagation

Cited by: 39
Authors
Gori, M.
Maggini, M.
Affiliation
[1] Dipartimento di Sistemi e Informatica, Università di Firenze, Via di S. Marta, 50139 Firenze, Italy
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1996 / Vol. 7 / No. 01
DOI
10.1109/72.478415
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Many researchers are quite skeptical about the actual behavior of neural network learning algorithms like backpropagation. One of the major problems is the lack of clear theoretical results on optimal convergence, particularly for pattern mode algorithms. In this paper, we prove the companion of Rosenblatt's PC (perceptron convergence) theorem for feedforward networks, stating that pattern mode backpropagation converges to an optimal solution for linearly separable patterns.
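
As an illustration of the setting the theorem addresses, the sketch below trains a small sigmoid feedforward network by pattern mode (on-line) backpropagation on a linearly separable toy problem, the logical AND. This is a minimal sketch in Python, assuming a 2-2-1 architecture, a fixed learning rate, and a squared-error loss; none of these choices are taken from the paper itself.

import numpy as np

# Minimal sketch of pattern mode (on-line) backpropagation.
# The 2-2-1 sigmoid network, learning rate, and AND dataset are
# illustrative assumptions, not the construction used in the paper.
rng = np.random.default_rng(0)

# Linearly separable patterns: logical AND.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer with two sigmoid units, one sigmoid output unit.
W1 = rng.normal(scale=0.5, size=(2, 2))
b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=2)
b2 = 0.0
eta = 0.5  # learning rate

for epoch in range(5000):
    for x, t in zip(X, y):  # pattern mode: update after every single pattern
        # Forward pass.
        h = sigmoid(W1 @ x + b1)
        o = sigmoid(W2 @ h + b2)
        # Backward pass for a squared-error loss with sigmoid units.
        delta_o = (o - t) * o * (1.0 - o)
        delta_h = delta_o * W2 * h * (1.0 - h)
        # Gradient step on this single pattern.
        W2 = W2 - eta * delta_o * h
        b2 = b2 - eta * delta_o
        W1 = W1 - eta * np.outer(delta_h, x)
        b1 = b1 - eta * delta_h

# The outputs should approach [0, 0, 0, 1] on this separable problem.
print([round(float(sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2)), 2) for x in X])

Pattern mode here means the weights are updated after each individual pattern, as opposed to batch mode, which accumulates the gradient over the whole training set before each update; the paper's convergence result concerns the former.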
Pages: 251 - 254
Number of pages: 4
References
16 in total
[11] Parlos, A.G.; Fernandez, B.; Atiya, A.F.; Muthusami, J.; Tsai, W.K. An accelerated learning algorithm for multilayer perceptron networks. IEEE Transactions on Neural Networks, 1994, 5(3): 493-497.
[12] Rosenblatt, F. Cornell Aeronautical Laboratory Report VG-1196-G4, 1960.
[13] Rumelhart, D.E., et al. Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1. MIT Press, 1986.
[14] Sontag, E.D. In: Proceedings of the International Joint Conference on Neural Networks, June 1989, Vol. 1, p. 639.
[15] Wang, S.D. In: Proceedings of the International Joint Conference on Neural Networks, November 1991, p. 183.
[16] Yu, X.H. Can backpropagation error surface not have local minima. IEEE Transactions on Neural Networks, 1992, 3(6): 1019-1021.