APPROXIMATION CAPABILITIES OF MULTILAYER FEEDFORWARD NETWORKS

Cited by: 3874
Author
HORNIK, K
Affiliation
[1] Technische Universität Wien, Vienna
Keywords
MULTILAYER FEEDFORWARD NETWORKS; ACTIVATION FUNCTION; UNIVERSAL APPROXIMATION CAPABILITIES; INPUT ENVIRONMENT MEASURE; LP(MU) APPROXIMATION; UNIFORM APPROXIMATION; SOBOLEV SPACES; SMOOTH APPROXIMATION;
DOI
10.1016/0893-6080(91)90009-T
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We show that standard multilayer feedforward networks with as few as a single hidden layer and arbitrary bounded and nonconstant activation function are universal approximators with respect to L(p)(mu) performance criteria, for arbitrary finite input environment measures mu, provided only that sufficiently many hidden units are available. If the activation function is continuous, bounded and nonconstant, then continuous mappings can be learned uniformly over compact input sets. We also give very general conditions ensuring that networks with sufficiently smooth activation functions are capable of arbitrarily accurate approximation to a function and its derivatives.
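The abstract's claim can be illustrated numerically: a single hidden layer with a bounded, nonconstant activation (here tanh) approximates a continuous target uniformly on a compact set, with error shrinking as hidden units are added. The sketch below is illustrative only and is not from the paper; the random-feature construction, weight scales, and the helper `fit_error` are assumptions chosen for a minimal demonstration, and least squares is used merely to pick the output weights.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)        # compact input set [-1, 1]
target = np.sin(np.pi * x)             # continuous mapping to approximate

def fit_error(n_hidden):
    """Sup-norm error of a one-hidden-layer net with n_hidden tanh units.

    Hidden weights/biases are drawn at random (an illustrative choice,
    not the paper's construction); output weights are fit by least squares.
    """
    w = rng.normal(scale=5.0, size=n_hidden)
    b = rng.normal(scale=5.0, size=n_hidden)
    H = np.tanh(np.outer(x, w) + b)               # hidden-layer outputs
    c, *_ = np.linalg.lstsq(H, target, rcond=None)  # output weights
    return float(np.max(np.abs(H @ c - target)))    # uniform (sup-norm) error

for n in (2, 10, 50):
    print(n, fit_error(n))  # error shrinks as hidden units are added
```

Any bounded, nonconstant activation could replace tanh here; the theorem's point is that approximation power comes from the number of hidden units, not from a special choice of activation.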
Pages: 251-257
Page count: 7
Related Papers
16 items in total
[1] Adams, R.A., 1975, Sobolev Spaces.
[2] Anonymous, 1989, IEEE International Conference on Neural Networks.
[3] Cybenko, G., 1989, Mathematics of Control, Signals, and Systems, V2, P303, DOI 10.1007/BF02551274.
[4] Friedman, A., 1982, Foundations of Modern Analysis.
[5] Funahashi, K., 1989, On the approximate realization of continuous mappings by neural networks, Neural Networks, 2(3), P183-192.
[6] Gallant, A.R., 1989, Learning Derivatives.
[7] Gallant, A.R., 1988, 2nd IEEE International Conference on Neural Networks, P657.
[8] Hecht-Nielsen, R., 1989, IJCNN: International Joint Conference on Neural Networks, P593, DOI 10.1109/IJCNN.1989.118638.
[9] Hornik, K., Stinchcombe, M., White, H., 1989, Multilayer feedforward networks are universal approximators, Neural Networks, 2(5), P359-366.
[10] Hornik, K., 1990, Neural Networks.