Kolmogorov's Theorem Is Relevant

Cited by: 98
Author
Kurkova, Vera [1]
Affiliation
[1] Czechoslovak Acad Sci, Inst Comp Sci, POB 5, 182 07 Prague 8, Czech Republic
DOI
10.1162/neco.1991.3.4.617
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
We show that Kolmogorov's theorem on representations of continuous functions of n variables by sums and superpositions of continuous functions of one variable is relevant in the context of neural networks. We give a version of this theorem with all of the one-variable functions approximated arbitrarily well by linear combinations of compositions of affine functions with some given sigmoidal function. We derive an upper estimate of the number of hidden units.
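For orientation, a standard statement of the representation the abstract refers to (Kolmogorov's superposition theorem; the notation below is ours, not taken from the paper): every continuous function $f$ on the $n$-dimensional unit cube admits the exact representation

$$ f(x_1, \ldots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \psi_{pq}(x_p) \right), $$

where the inner functions $\psi_{pq}$ are fixed continuous one-variable functions independent of $f$, and the outer functions $\Phi_q$ are continuous one-variable functions depending on $f$. The paper's version replaces these one-variable functions by functions approximable arbitrarily well by linear combinations of compositions of affine maps with a given sigmoidal function, which yields the upper estimate on the number of hidden units mentioned above.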
Pages: 617-622
Page count: 6