A CONVERGENCE THEOREM FOR SEQUENTIAL LEARNING IN 2-LAYER PERCEPTRONS

Cited by: 68
Authors
MARCHAND, M [1 ]
GOLEA, M [1 ]
RUJAN, P [1 ]
Affiliations
[1] FORSCHUNGSZENTRUM JULICH, INST FESTKORPERFORSCH, W-5170 JULICH 1, GERMANY
Source
EUROPHYSICS LETTERS | 1990, Vol. 11, No. 6
DOI
10.1209/0295-5075/11/6/001
Chinese Library Classification: O4 [Physics]
Discipline Code: 0702
Abstract
We consider a perceptron with N_i input units, one output and a yet unspecified number of hidden units. This perceptron must be able to learn a given but arbitrary set of input-output examples. By sequential learning we mean that groups of patterns, pertaining to the same class, are sequentially separated from the rest by successively adding hidden units until the remaining patterns are all in the same class. We prove that the internal representations obtained by such procedures are linearly separable. Preliminary numerical tests of an algorithm implementing these ideas are presented and compare favourably with results of other growth algorithms.
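The sequential idea can be illustrated with a minimal one-dimensional toy sketch: repeatedly split off a maximal same-class group of patterns with a new threshold ("hidden unit") until the remaining patterns all share a class. The function names and the peel-from-the-right heuristic below are illustrative assumptions, not the authors' actual algorithm; the nested form of the resulting internal representations is what makes them linearly separable.

```python
def sequential_learn_1d(points):
    """Toy sequential learning on 1-D data.

    points: list of (x, label) pairs.
    Repeatedly peels the maximal same-label run off the right end of the
    sorted sample, adding one threshold unit per run, until the remaining
    points all share a label.  Returns the units and the default label.
    """
    remaining = sorted(points)            # sort by input value
    units = []                            # list of (threshold, run_label)
    while len({label for _, label in remaining}) > 1:
        run_label = remaining[-1][1]
        i = len(remaining) - 1
        while i > 0 and remaining[i - 1][1] == run_label:
            i -= 1                        # extend the same-label run leftwards
        theta = (remaining[i - 1][0] + remaining[i][0]) / 2.0
        units.append((theta, run_label))  # new hidden unit separates the run
        remaining = remaining[:i]         # remove the separated group
    return units, remaining[0][1]


def classify(x, units, default):
    """The internal representation (1[x > theta_1], ..., 1[x > theta_k])
    is nested, since the thresholds decrease in the order they were
    created; such nested codes are linearly separable, and the first
    active unit determines the class."""
    for theta, label in units:
        if x > theta:
            return label
    return default
```

For the alternating sample x = 0, 1, 2, 3 with labels 0, 1, 0, 1, three units are created (thresholds 2.5, 1.5, 0.5) and every training pattern is classified correctly by the first-active-unit rule.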
Pages: 487-492 (6 pages)