A SIMPLE METHOD TO DERIVE BOUNDS ON THE SIZE AND TO TRAIN MULTILAYER NEURAL NETWORKS

Cited by: 119
Authors
SARTORI, MA
ANTSAKLIS, PJ
Affiliation
[1] Dept. of Electrical Engineering, Univ. of Notre Dame, Notre Dame, IN
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1991 / Vol. 2 / No. 4
DOI
10.1109/72.88168
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
For an arbitrary training set with p training patterns, a multilayer neural network with one hidden layer and with p - 1 hidden layer neurons can exactly implement the training set. Previous derivations proved these bounds by separating the input patterns with particular hyperplanes and using the equations describing the hyperplanes to choose the weights for the hidden layer. Here, the bounds are derived by simply satisfying a rank condition on the output of the hidden layer. The weights for the hidden layer can be chosen almost arbitrarily, and the weights for the output layer are found by solving p linear equations.
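The construction in the abstract can be sketched in a few lines of NumPy. This is an illustrative example, not the authors' code: the network size, activation function, and random-weight choice below are assumptions. The hidden-layer weights are drawn at random ("almost arbitrarily"); appending a bias column to the hidden-layer output gives a p x p matrix, and the rank condition amounts to this matrix being nonsingular, so the output-layer weights come from solving p linear equations.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_in = 5, 3                      # p training patterns, n_in inputs (example sizes)

X = rng.standard_normal((p, n_in))  # arbitrary training inputs
t = rng.standard_normal(p)          # arbitrary target outputs

# Hidden layer with p - 1 neurons; weights chosen (almost) arbitrarily.
W = rng.standard_normal((n_in, p - 1))
b = rng.standard_normal(p - 1)
H = np.tanh(X @ W + b)              # hidden-layer outputs, shape (p, p - 1)

# Append a bias column. The rank condition requires this p x p matrix
# to be nonsingular, which holds for almost every choice of W and b.
H_aug = np.hstack([H, np.ones((p, 1))])

# Output-layer weights from the p linear equations H_aug @ v = t.
v = np.linalg.solve(H_aug, t)

y = H_aug @ v                       # network output on the training set
print(np.allclose(y, t))            # True: the training set is implemented exactly
```

With a linear output neuron, the fit is exact up to floating-point error; a sigmoidal output neuron would instead require solving against the targets passed through the inverse activation.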
Pages: 467-471 (5 pages)
References (5)
[1] Baum, E. B., 1988, Journal of Complexity, Vol. 4, p. 193. DOI: 10.1016/0885-064X(88)90020-9
[2] Huang, S. C.; Huang, Y. F. Bounds on the number of hidden neurons in multilayer perceptrons. IEEE Transactions on Neural Networks, 1991, 2(1), 47-55.
[3] Nilsson, N. J., 1965, Learning Machines.
[4] Rosenblatt, F., 1962, Principles of Neurodynamics.
[5] Widrow, B., 1987, Proc. IEEE Int. Conf. on Neural Networks, Vol. 1, p. 145.