REPRESENTATION OF FUNCTIONS BY SUPERPOSITIONS OF A STEP OR SIGMOID FUNCTION AND THEIR APPLICATIONS TO NEURAL NETWORK THEORY

Cited by: 158
Author
ITO, Y
Institution
[1] Nagoya University College of Medical Technology, Nagoya
Keywords
3-LAYERED NEURAL NETWORK; HEAVISIDE FUNCTION; SIGMOID FUNCTION; RADON TRANSFORM; INVERSE RADON TRANSFORM; INTEGRAL REPRESENTATION; FINITE SUM APPROXIMATION;
DOI
10.1016/0893-6080(91)90075-G
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The starting point of this article is the inversion formula of the Radon transform, and the article aims to contribute to the theory of three-layered neural networks. Let H be the Heaviside function. Then, for any function f ∈ S(ℝⁿ), there is a function g_f such that f can be represented on ℝⁿ by the integral ∫ H(x·ω − t) g_f(t, ω) dt dμ(ω), where μ is the uniform measure on the unit sphere Sⁿ⁻¹, t ∈ ℝ and ω ∈ Sⁿ⁻¹. Furthermore, f can be approximated uniformly and arbitrarily well on the whole space ℝⁿ by a finite sum of the form Σₖ aₖ H(x·ωₖ − tₖ). Let H_σ be a sigmoid function on ℝ defined by H_σ(t) = ∫ H(t − x·ω) dσ(x), where σ is a spherically symmetric probability measure. Suppose that σ satisfies a few further conditions. Then, for any f ∈ S(ℝⁿ), there is a function g_{f,σ} such that f can be written as ∫ H_σ(x·ω − t) g_{f,σ}(t, ω) dt dμ(ω), with the unscaled sigmoid function H_σ fixed beforehand. This expression, too, can be approximated uniformly and arbitrarily well on ℝⁿ by a finite sum.
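As a rough numerical illustration of the finite-sum form Σₖ aₖ H_σ(x·ωₖ − tₖ) described in the abstract, the following is a minimal sketch under assumed settings, not the paper's construction: the fixed, unscaled sigmoid is taken to be the logistic function, the target f is a Gaussian bump on ℝ², the directions ωₖ and thresholds tₖ are placed on a grid, and the outer coefficients aₖ are fitted by least squares rather than derived from the inverse Radon transform. All names in the snippet (sigmoid, target, Phi, etc.) are hypothetical.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's Radon-transform construction):
# approximate a target f on R^2 by a finite sum  sum_k a_k * H_sigma(x·ω_k − t_k),
# i.e. a three-layered network whose hidden units are ridge functions built from
# one fixed, unscaled sigmoid.

def sigmoid(u):
    """Fixed, unscaled sigmoid H_sigma (here assumed to be the logistic function)."""
    return 1.0 / (1.0 + np.exp(-u))

def target(x):
    """Example rapidly decreasing target f: a Gaussian bump on R^2."""
    return np.exp(-np.sum(x**2, axis=-1))

# Hidden units: all pairs (ω_k, t_k) on a grid of directions and thresholds.
n_dirs, n_thresh = 24, 25
angles = np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)
omegas = np.stack([np.cos(angles), np.sin(angles)], axis=1)   # points of S^1
thresholds = np.linspace(-3.0, 3.0, n_thresh)

# Sample points on a box covering the region where f is non-negligible.
grid = np.linspace(-2.0, 2.0, 41)
X = np.stack(np.meshgrid(grid, grid), axis=-1).reshape(-1, 2)  # (N, 2)
y = target(X)

# Design matrix: column k holds H_sigma(x·ω_k − t_k) at every sample point x.
projections = X @ omegas.T                                     # (N, n_dirs)
Phi = sigmoid(projections[:, :, None] - thresholds[None, None, :]).reshape(len(X), -1)

# Outer coefficients a_k by least squares (the paper instead obtains the
# representation from the inverse Radon transform of f).
a, *_ = np.linalg.lstsq(Phi, y, rcond=None)

approx = Phi @ a
print("max abs error on the sampling grid:", float(np.max(np.abs(approx - y))))
```

The printed maximum error is only a crude check of how well the fixed-sigmoid finite sum matches f on the sampling grid; a denser grid of pairs (ωₖ, tₖ) would be needed for a tighter uniform fit.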
Pages: 385-394
Page count: 10