INITIALIZING BACK PROPAGATION NETWORKS WITH PROTOTYPES

Cited by: 88
Authors
DENOEUX, T [1]
LENGELLE, R [1]
Affiliation
[1] COMPIEGNE ARTIFICIAL INTELLIGENCE LAB, COMPIEGNE, FRANCE
Keywords
INITIALIZATION; FEEDFORWARD NEURAL NETWORKS; BACK PROPAGATION; PROTOTYPES; SUPERVISED LEARNING; PATTERN RECOGNITION; FUNCTION APPROXIMATION; RADIAL BASIS FUNCTIONS;
DOI
10.1016/0893-6080(93)90003-F
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
This paper addresses the problem of initializing the weights in back propagation networks with one hidden layer. The proposed method relies on the use of reference patterns, or prototypes, and on a transformation which maps each vector in the original feature space onto a unit-length vector in a space with one additional dimension. This scheme applies to pattern recognition tasks, as well as to the approximation of continuous functions. Issues related to the preprocessing of input patterns and to the generation of prototypes are discussed, and an algorithm for building appropriate prototypes in the continuous case is described. Also examined is the relationship between this approach and the theory of radial basis functions. Finally, simulation results are presented, showing that initializing back propagation networks with prototypes generally results in (a) drastic reductions in training time, (b) improved robustness against local minima, and (c) better generalization.
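A minimal sketch of the kind of construction described in the abstract is given below, assuming the lifting to one additional dimension is done by appending a coordinate and rescaling so that every transformed vector has unit length, and assuming each hidden unit's initial weight vector is a scaled copy of one lifted prototype. The function names, the bound R, and the gain factor are illustrative assumptions, not values or formulas taken from the paper.

```python
import numpy as np

def lift_to_sphere(X, R):
    """Map each row of X (n x d) onto a unit-length vector in d+1 dimensions.

    Assumed construction: append sqrt(R**2 - ||x||**2) and divide by R,
    where R is chosen so that ||x|| <= R for every input pattern.
    """
    norms_sq = np.sum(X ** 2, axis=1, keepdims=True)        # squared norms, shape (n, 1)
    extra = np.sqrt(np.maximum(R ** 2 - norms_sq, 0.0))      # added coordinate
    return np.hstack([X, extra]) / R                          # rows have unit Euclidean norm

def init_hidden_weights_from_prototypes(prototypes, R, gain=5.0):
    """Initialize hidden-layer weights from reference patterns (prototypes).

    Each hidden unit's weight vector points toward one lifted prototype, so its
    activation is largest for inputs close to that prototype. The gain is an
    illustrative sharpening constant, not a value from the paper.
    """
    P = lift_to_sphere(prototypes, R)     # (n_prototypes, d+1) unit vectors
    W = gain * P                          # one hidden unit per prototype
    b = np.zeros(len(P))                  # biases start at zero in this sketch
    return W, b

# Example: two 2-D class prototypes used to seed two hidden units
protos = np.array([[1.0, 0.0], [0.0, 2.0]])
W0, b0 = init_hidden_weights_from_prototypes(protos, R=3.0)
```

With inputs preprocessed by the same lifting, the pre-activation of hidden unit j is the gain times the cosine of the angle between the lifted input and lifted prototype j, so each hidden unit initially responds most strongly to patterns near its prototype.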
Pages: 351-363
Page count: 13