Limitations of the approximation capabilities of neural networks with one hidden layer

Cited by: 45
Authors
Chui, CK
Li, X
Mhaskar, HN
Institutions
[1] TEXAS A&M UNIV,DEPT MATH,COLLEGE STN,TX 77843
[2] UNIV NEVADA,DEPT MATH SCI,LAS VEGAS,NV 89154
[3] CALIF STATE UNIV LOS ANGELES,DEPT MATH,LOS ANGELES,CA 90032
Keywords
neural networks; Sobolev spaces; spline approximation; ridge functions
DOI
10.1007/BF02124745
CLC classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Let s ≥ 1 be an integer and W be the class of all functions having integrable partial derivatives on [0, 1]^s. We are interested in the minimum number of neurons in a neural network with a single hidden layer required to provide a mean approximation order of a preassigned ε > 0 to each function in W. We prove that this number cannot be O(ε^{-s} log(1/ε)) if a spline-like localization is required. This cannot be improved even if different neurons are allowed to evaluate different activation functions, even depending upon the target function. Nevertheless, for any δ > 0, a network with O(ε^{-s-δ}) neurons can be constructed to provide this order of approximation, with localization. Analogous results are also valid for other L^p norms.
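The abstract's two bounds can be restated in display form (a sketch using the abstract's own symbols, where N(ε) denotes the number of hidden neurons required to achieve mean approximation order ε on W):

```latex
% Lower bound: with spline-like localization, even if each neuron may use
% its own activation function (possibly depending on the target function),
% the neuron count cannot satisfy
\[
  N(\varepsilon) = O\!\left(\varepsilon^{-s}\,\log\tfrac{1}{\varepsilon}\right).
\]
% Upper bound: for every \delta > 0, a localized single-hidden-layer
% network achieving approximation order \varepsilon exists with
\[
  N(\varepsilon) = O\!\left(\varepsilon^{-s-\delta}\right).
\]
```

The gap between the two bounds is the factor ε^{-δ} versus log(1/ε), i.e., the construction is optimal up to an arbitrarily small polynomial loss in the exponent.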
Pages: 233-243
Page count: 11