CONSISTENCY OF MULTILAYER PERCEPTRON REGRESSION ESTIMATORS

Cited: 25
Authors
MIELNICZUK, J
TYRCHA, J
Keywords
MULTILAYER PERCEPTRON; LEAST SQUARES REGRESSION ESTIMATOR; ENTROPY; BACK PROPAGATION; VAPNIK-CHERVONENKIS CLASS;
DOI
10.1016/S0893-6080(09)80011-7
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In the paper a three-layer perceptron with one hidden layer and an output layer consisting of a single neuron is considered. This is a commonly used architecture for regression problems, where one seeks a perceptron minimizing the mean squared error criterion over the data points (x(k), y(k)), k = 1, ..., N. It is shown that in the model y(k) = g0(x(k)) + epsilon(k), k = 1, ..., N, where x(k) is independent of the zero-mean error term epsilon(k), this procedure is consistent as N --> infinity, provided that g0 can be represented as a three-layer perceptron with the Heaviside transfer function. The same result holds when the transfer function is an arbitrary continuous function with bounded limits at +/- infinity and the hidden-to-output weights in the considered family of perceptrons are bounded.
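The estimation procedure the abstract analyzes can be sketched in NumPy: fit a three-layer perceptron (one hidden layer, one linear output neuron) by minimizing the mean squared error with batch gradient descent (back propagation). This is a minimal illustration under assumed choices, not the authors' implementation; the function name `fit_mlp` and all hyperparameter values are hypothetical. The sigmoid hidden units are continuous with bounded limits at +/- infinity, matching the second case treated in the paper.

```python
import numpy as np

def fit_mlp(x, y, n_hidden=10, lr=0.5, epochs=3000, seed=0):
    """Least-squares fit of a three-layer perceptron: one sigmoid hidden
    layer, one linear output neuron, trained by batch gradient descent
    on the mean squared error.  All hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    x = x.reshape(-1, 1)
    n = len(y)
    W1 = rng.normal(size=(1, n_hidden))        # input-to-hidden weights
    b1 = np.zeros(n_hidden)                    # hidden biases
    w2 = rng.normal(scale=0.1, size=n_hidden)  # hidden-to-output weights
    b2 = 0.0                                   # output bias
    for _ in range(epochs):
        h = 1.0 / (1.0 + np.exp(-(x @ W1 + b1)))  # hidden activations
        pred = h @ w2 + b2                        # linear output neuron
        err = pred - y                            # d(MSE/2)/d(pred)
        dh = np.outer(err, w2) * h * (1.0 - h)    # back-propagated error
        W1 -= lr * (x.T @ dh) / n
        b1 -= lr * dh.mean(axis=0)
        w2 -= lr * (h.T @ err) / n
        b2 -= lr * err.mean()
    def predict(xnew):
        h = 1.0 / (1.0 + np.exp(-(np.asarray(xnew).reshape(-1, 1) @ W1 + b1)))
        return h @ w2 + b2
    return predict

# Data from the regression model y(k) = g0(x(k)) + epsilon(k),
# with x(k) independent of the zero-mean error term epsilon(k).
rng = np.random.default_rng(1)
x = rng.uniform(-3.0, 3.0, 500)
y = np.tanh(2.0 * x) + rng.normal(scale=0.1, size=500)
predict = fit_mlp(x, y)
mse = float(np.mean((predict(x) - y) ** 2))
```

The consistency result says that, as N grows, such a least-squares perceptron estimator converges to the true regression function g0 when g0 itself is representable by a perceptron of this form and the hidden-to-output weights are kept bounded.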
Pages: 1019-1022
Page count: 4