CAN BACKPROPAGATION ERROR SURFACE NOT HAVE LOCAL MINIMA

Cited by: 59
Author
YU, XH
Affiliation
[1] Communication Lab, Department of Radio Engineering, Southeast University, Nanjing 210018, Jiangsu
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1992, Vol. 3, No. 6
DOI
10.1109/72.165604
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
From a theoretical point of view, we show that for an arbitrary T-element training set with t (t ≤ T) distinct inputs, the backpropagation error surface has no suboptimal local minima if the network is capable of exactly implementing an arbitrary training set consisting of t distinct patterns. As a special case, the error surface of a backpropagation network with one hidden layer and t - 1 hidden units has no local minima if the network is trained on an arbitrary T-element set with t distinct inputs.
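The special case in the abstract can be probed numerically: a one-hidden-layer network with t - 1 hidden units trained on t distinct inputs should, if the error surface indeed has no suboptimal local minima, be driven toward the global minimum by plain gradient descent. The sketch below is illustrative only and is not from the paper; the architecture (sigmoid hidden units, linear output), the training data, and all hyperparameters are assumptions chosen for a minimal demonstration.

```python
import numpy as np

# Illustrative setup (not from the paper): t = 3 distinct scalar inputs,
# H = t - 1 = 2 sigmoid hidden units, one linear output unit.
rng = np.random.default_rng(0)
X = np.array([[0.0], [0.5], [1.0]])   # t distinct inputs, shape (t, 1)
Y = np.array([[0.1], [0.9], [0.3]])   # arbitrary targets, shape (t, 1)
t = X.shape[0]
H = t - 1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights: input -> hidden (W1, b1), hidden -> output (W2, b2).
W1 = rng.normal(0.0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 1.0, (H, 1)); b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, h @ W2 + b2             # linear output unit

def loss(Y_hat):
    return 0.5 * np.mean((Y_hat - Y) ** 2)

_, Y_hat = forward(X)
initial_loss = loss(Y_hat)

# Plain batch gradient descent on the squared error.
lr = 0.5
for _ in range(5000):
    h, Y_hat = forward(X)
    g = (Y_hat - Y) / t               # dL/dY_hat
    gW2 = h.T @ g;  gb2 = g.sum(0)
    gh = (g @ W2.T) * h * (1.0 - h)   # backprop through the sigmoid
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, Y_hat = forward(X)
final_loss = loss(Y_hat)
print(initial_loss, final_loss)
```

With t - 1 hidden units the network can exactly implement the t patterns, so the training error should be reducible essentially to zero; a run of this sketch checks only that gradient descent makes steady progress, which is consistent with (but of course does not prove) the absence of suboptimal local minima.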
Pages: 1019-1021
Page count: 3