NETWORKS FOR APPROXIMATION AND LEARNING

Cited by: 1890
Authors
POGGIO, T [1]
GIROSI, F [1]
Affiliations
[1] MIT, Center for Biological Information Processing, Cambridge, MA 02139
Keywords
DOI: 10.1109/5.58326
Chinese Library Classification (CLC): TM [Electrical engineering]; TN [Electronic and communication technology]
Discipline codes: 0808; 0809
Abstract
Learning an input-output mapping from a set of examples, of the type that many neural networks have been constructed to perform, can be regarded as synthesizing an approximation of a multidimensional function, that is, solving the problem of hypersurface reconstruction. From this point of view, this form of learning is closely related to classical approximation techniques, such as generalized splines and regularization theory. This paper considers the problem of the approximation of nonlinear mappings, especially continuous mappings. We develop a theoretical framework for approximation based on regularization techniques that leads to a class of three-layer networks that we call regularization networks, which include as a special case the well-known Radial Basis Functions method. Regularization networks are not only equivalent to generalized splines, but are also closely related to pattern recognition methods such as Parzen windows and potential functions and to several neural network algorithms, such as Kanerva's associative memory, backpropagation, and Kohonen's topology preserving map. They also have an interesting interpretation in terms of prototypes that are synthesized and optimally combined during the learning stage. This paper generalizes the theory of regularization networks to a formulation that turns out to include task-dependent clustering and dimensionality reduction. We also discuss briefly some intriguing analogies with neurobiological data. © 1990 IEEE
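The Radial Basis Functions method named in the abstract can be sketched as follows: approximate the unknown function by a weighted sum of radial basis functions centered on the training examples, with the weights obtained from the interpolation conditions. This is a minimal illustrative sketch, not code from the paper; the Gaussian basis width `sigma` and the helper names are assumptions for the example.

```python
import numpy as np

def gaussian(r, sigma):
    """Gaussian radial basis function G(r) = exp(-(r/sigma)^2)."""
    return np.exp(-(r / sigma) ** 2)

def rbf_fit(x, y, sigma):
    """Solve for coefficients c in f*(x) = sum_i c_i G(|x - x_i|),
    so that the interpolation conditions f*(x_i) = y_i hold."""
    G = gaussian(np.abs(x[:, None] - x[None, :]), sigma)  # G[i,j] = G(|x_i - x_j|)
    return np.linalg.solve(G, y)

def rbf_eval(xq, x, c, sigma):
    """Evaluate the RBF expansion at query points xq."""
    G = gaussian(np.abs(xq[:, None] - x[None, :]), sigma)
    return G @ c

# Reconstruct a smooth 1-D function from 8 examples.
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * x)
c = rbf_fit(x, y, sigma=0.2)

# The interpolant passes through the training examples:
print(np.allclose(rbf_eval(x, x, c, sigma=0.2), y))  # True
```

In the paper's terms this is the special case of a regularization network whose hidden units are centered on the data points; each Gaussian unit plays the role of a "prototype" that is optimally combined during learning.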
Pages: 1481-1497
Page count: 17