Training neural networks with noisy data as an ill-posed problem

Cited by: 16
Authors
Burger, M [1]
Engl, HW [1]
Affiliation
[1] Johannes Kepler Univ Linz, Ind Math Inst, A-4040 Linz, Austria
Keywords
ill-posed problems; least-squares collocation; neural networks; network training; regularization
DOI
10.1023/A:1016641629556
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Discipline Code
070104 [Applied Mathematics]
Abstract
This paper is devoted to the analysis of network approximation in the framework of approximation and regularization theory. It is shown that training neural networks and similar network approximation techniques are equivalent to least-squares collocation for a corresponding integral equation with mollified data. Results on convergence and convergence rates for exact data are derived from well-known convergence results on least-squares collocation. Finally, the stability properties with respect to errors in the data are examined and stability bounds are obtained, which yield rules for the choice of the number of network elements.
Pages: 335-354 (20 pages)
References (40 in total; 10 shown)
[1] Amato U, Vuza DT. Besov regularization, thresholding and wavelets for smoothing data. Numerical Functional Analysis and Optimization, 1997, 18(5-6): 461-493.
[2] [Anonymous], 1977, SOLUTIONS INVERSE PR.
[3] Atkinson KE. Math Comput, 1991, 56: 119. DOI 10.1090/S0025-5718-1991-1052084-0.
[4] Barron AR. Universal approximation bounds for superpositions of a sigmoidal function. IEEE Transactions on Information Theory, 1993, 39(3): 930-945.
[5] Bishop CM, 1995, NEURAL NETWORKS PATT.
[6] Bishop CM, 1995, REGULARIZATION COMPL, P141.
[7] Burger M, UNPUB ERROR BOUNDS A.
[8] Chui CK, Li X. Approximation by ridge functions and neural networks with one hidden layer. Journal of Approximation Theory, 1992, 70(2): 131-141.
[9] Daubechies I. Ten Lectures on Wavelets, 1993, 28: 350.
[10] Engl H. Mathematics and Its Applications, 1996, 375. DOI 10.1007/978-94-009-1740-8.