Prediction intervals for neural networks via nonlinear regression

Cited by: 92
Authors
De Veaux, RD [1]
Schumi, J
Schweinsberg, J
Ungar, LH
Affiliations
[1] Williams Coll, Dept Math, Williamstown, MA 01267 USA
[2] Iowa State Univ, Dept Stat, Ames, IA 50010 USA
[3] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
[4] Univ Penn, Dept Comp & Informat Sci, Philadelphia, PA 19104 USA
Keywords
backpropagation; high-dimensional data; nonparametric regression; smoothing;
DOI
10.2307/1270528
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Standard methods for computing prediction intervals in nonlinear regression can be effectively applied to neural networks when the number of training points is large. Simulations show, however, that these methods can generate unreliable prediction intervals on smaller datasets when the network is trained to convergence. Stopping the training algorithm prior to convergence, to avoid overfitting, reduces the effective number of parameters but can lead to prediction intervals that are too wide. We present an alternative approach to estimating prediction intervals using weight decay to fit the network and show via a simulation study that this method may be effective in overcoming some of the shortcomings of the other approaches.
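The abstract's approach can be sketched as follows: fit a network with weight decay, then form a delta-method prediction interval from the Jacobian of the network outputs with respect to the weights, with the decay penalty entering the covariance in sandwich form. This is a minimal illustrative sketch, not the authors' code; the one-hidden-unit model, the penalty `lam`, the learning rate, and all function names are assumptions chosen for brevity.

```python
import numpy as np

def model(x, w):
    # Tiny one-hidden-unit "network" (illustrative): a * tanh(b*x + c)
    a, b, c = w
    return a * np.tanh(b * x + c)

def jacobian(x, w, eps=1e-6):
    # Numerical Jacobian of model outputs with respect to the weights
    J = np.zeros((len(x), len(w)))
    for j in range(len(w)):
        wp, wm = w.copy(), w.copy()
        wp[j] += eps
        wm[j] -= eps
        J[:, j] = (model(x, wp) - model(x, wm)) / (2 * eps)
    return J

def fit(x, y, w0, lam=1e-3, lr=0.05, steps=5000):
    # Gradient descent on squared error plus a weight-decay (ridge) penalty
    w = w0.astype(float).copy()
    for _ in range(steps):
        r = model(x, w) - y
        grad = 2 * jacobian(x, w).T @ r / len(x) + 2 * lam * w
        w -= lr * grad
    return w

def prediction_interval(x, y, w, xnew, lam=1e-3, z=1.96):
    # Delta-method interval; the (J'J + lam*I)^{-1} sandwich reflects
    # that the weights were estimated under weight decay.
    n, p = len(x), len(w)
    J = jacobian(x, w)
    resid = y - model(x, w)
    s2 = resid @ resid / (n - p)               # noise variance estimate
    A = np.linalg.inv(J.T @ J + lam * np.eye(p))
    g = jacobian(np.atleast_1d(xnew), w)[0]    # gradient at the new point
    var = s2 * (1 + g @ A @ (J.T @ J) @ A @ g)
    yhat = model(np.atleast_1d(xnew), w)[0]
    half = z * np.sqrt(var)
    return yhat - half, yhat + half

# Usage on synthetic data: y = tanh(x) + noise
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 50)
y = np.tanh(x) + 0.1 * rng.standard_normal(50)
w = fit(x, y, np.array([0.8, 1.2, 0.1]))
lo, hi = prediction_interval(x, y, w, 0.5)
```

With no penalty (`lam = 0`) this reduces to the standard nonlinear-regression interval the abstract says can be unreliable on small samples; the penalized sandwich form is one way to account for the shrinkage introduced by weight decay.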
Pages: 273-282
Page count: 10