LEARNING FROM EXAMPLES IN LARGE NEURAL NETWORKS

Cited by: 110
Authors
SOMPOLINSKY, H
TISHBY, N
SEUNG, HS
Affiliations
[1] AT&T BELL LABS,MURRAY HILL,NJ 07974
[2] HARVARD UNIV,DEPT PHYS,CAMBRIDGE,MA 02138
DOI
10.1103/PhysRevLett.65.1683
CLC classification
O4 [Physics];
Discipline code
0702;
Abstract
A statistical-mechanical theory of learning from examples in layered networks at finite temperature is studied. When the training error is a smooth function of continuously varying weights, the generalization error falls off asymptotically as the inverse number of examples. By analytical and numerical studies of single-layer perceptrons, we show that when the weights are discrete, the generalization error can exhibit a discontinuous transition to perfect generalization. For intermediate sizes of the example set, the state of perfect generalization coexists with a metastable spin-glass state. © 1990 The American Physical Society.
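The smooth decay of the generalization error with the number of examples, described above for continuously varying weights, can be illustrated with a minimal teacher-student simulation. This is an illustrative sketch only, not the paper's replica calculation: the network size `N`, the classic perceptron learning rule, and the epoch count are arbitrary choices, and the error is measured via the teacher-student overlap as arccos(R)/pi.

```python
# Illustrative sketch (not the paper's method): a student perceptron with
# continuous weights learns a random teacher's labels from P examples.
# The generalization error eps = arccos(R)/pi, with R the normalized
# teacher-student overlap, shrinks as P grows, echoing the smooth decay
# the abstract describes for continuous weights.
import numpy as np

def generalization_error(P, N=50, epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    teacher = rng.standard_normal(N)          # fixed teacher weight vector
    X = rng.standard_normal((P, N))           # P random training inputs
    y = np.sign(X @ teacher)                  # teacher-assigned labels
    w = np.zeros(N)                           # student weights
    for _ in range(epochs):
        for x, t in zip(X, y):
            if np.sign(w @ x) != t:           # perceptron update on mistakes
                w += t * x
    R = (w @ teacher) / (np.linalg.norm(w) * np.linalg.norm(teacher))
    return np.arccos(np.clip(R, -1.0, 1.0)) / np.pi

for P in (25, 100, 400):
    print(P, round(generalization_error(P), 3))
```

With more examples the student aligns more closely with the teacher, so the printed error decreases down the list; the discontinuous transition to perfect generalization discussed in the paper arises only when the weights are constrained to be discrete.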
Pages: 1683-1686
Page count: 4
References
(18 total)
[1] Ahmad S, 1989, ADV NEURAL INF PROCE, P160
[2] Baum E B, Haussler D. What Size Net Gives Valid Generalization? Neural Computation, 1989, 1(1): 151-160
[3] Carnevali P, Patarnello S. Exhaustive Thermodynamical Analysis of Boolean Learning Networks. Europhysics Letters, 1987, 4(10): 1199-1204
[4] Carnevali P, 1987, Europhysics Letters, 4: 503
[5] Denker J, 1987, Complex Systems, 1: 877
[6] Fontanari J F, in press
[7] Gardner E, Derrida B. Three Unfinished Works on the Optimal Storage Capacity of Networks. Journal of Physics A: Mathematical and General, 1989, 22(12): 1983-1994
[8] Gyorgyi G. First-Order Transition to Perfect Generalization in a Neural Network with Binary Synapses. Physical Review A, 1990, 41(12): 7097-7100
[9] Gyorgyi G, 1990, NEURAL NETWORKS SPIN
[10] Hansel D, Sompolinsky H. Learning from Examples in a Single-Layer Neural Network. Europhysics Letters, 1990, 11(7): 687-692