Analysis of the rate of convergence of least squares neural network regression estimates in case of measurement errors

Cited by: 12
Authors:
Kohler, Michael [1]
Mehnert, Jens [1]
Affiliations:
[1] Tech Univ Darmstadt, Fachbereich Math, D-64289 Darmstadt, Germany
Keywords:
Least squares estimates; Measurement error; Neural networks; Rate of convergence; Regression estimates; L2 error; Nonparametric regression; Bounds
DOI: 10.1016/j.neunet.2010.11.003
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
Estimation of a regression function from data consisting of an independent and identically distributed sample of the underlying distribution, with additional measurement errors in the independent variables, is considered. The measurement errors are allowed to be dependent and to have nonzero mean. It is shown that the rate of convergence of suitably defined least squares neural network estimates applied to these data is similar to the rate of convergence of least squares neural network estimates applied to an independent and identically distributed sample of the underlying distribution, provided the measurement errors are small. (C) 2011 Published by Elsevier Ltd.
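The setting described in the abstract can be illustrated with a minimal sketch (this is not the authors' estimator; the network size, step size, error model, and regression function below are assumptions made purely for illustration): a single-hidden-layer network fitted by least squares to data whose covariates are observed with a small, non-centered measurement error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated regression data: true covariates x are unobserved; we only see
# x_obs, which carries a small measurement error with nonzero mean (+0.05).
n = 200
x = rng.uniform(-1.0, 1.0, size=n)                 # true covariates (unobserved)
y = np.sin(np.pi * x) + 0.1 * rng.normal(size=n)   # responses
x_obs = x + 0.05 + 0.02 * rng.normal(size=n)       # observed, error-contaminated covariates

# Single-hidden-layer network with k tanh units, fitted by minimizing the
# empirical L2 (least squares) criterion via full-batch gradient descent.
k = 10
w = rng.normal(scale=0.5, size=k)
b = rng.normal(scale=0.5, size=k)
c = np.zeros(k)
c0 = 0.0

def predict(t):
    """Network output for inputs t (uses the current parameters)."""
    return np.tanh(np.outer(t, w) + b) @ c + c0

lr = 0.1
for _ in range(3000):
    h = np.tanh(np.outer(x_obs, w) + b)     # (n, k) hidden activations
    r = h @ c + c0 - y                      # residuals of the least squares fit
    gc = h.T @ r / n                        # gradient w.r.t. output weights
    gc0 = r.mean()                          # gradient w.r.t. output bias
    gh = np.outer(r, c) * (1.0 - h**2) / n  # backprop through tanh
    gw = x_obs @ gh                         # gradient w.r.t. input weights
    gb = gh.sum(axis=0)                     # gradient w.r.t. hidden biases
    c -= lr * gc
    c0 -= lr * gc0
    w -= lr * gw
    b -= lr * gb

# Empirical L2 error of the estimate against the true regression function.
l2_error = np.mean((predict(x_obs) - np.sin(np.pi * x)) ** 2)
```

With the error scale kept small, as in the paper's assumption, the fitted estimate still tracks the true regression function; increasing the bias or variance of the covariate error degrades the achievable L2 error.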
Pages: 273-279
Page count: 7