The Transition to Perfect Generalization in Perceptrons

Cited by: 16
Authors
Baum, Eric B. [1 ]
Lyuu, Yuh-Dauh [1 ]
Institution
[1] NEC Res Inst, Princeton, NJ 08540 USA
Keywords
DOI
10.1162/neco.1991.3.3.386
CLC classification number
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Several recent papers (Gardner and Derrida 1989; Györgyi 1990; Sompolinsky et al. 1990) have found, using methods of statistical physics, that a transition to perfect generalization occurs in training a simple perceptron whose weights can only take values ±1. We give a rigorous proof of such a phenomenon. That is, we show, for α = 2.0821, that if at least αn examples x are drawn from the uniform distribution on {+1, −1}^n and classified as positive or negative by a target perceptron w_t ∈ {+1, −1}^n according to whether w_t · x is nonnegative or negative, then the probability is 2^(−√n) that there is any other such perceptron consistent with the examples. Numerical results indicate further that perfect generalization holds for α as low as 1.5.
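The setup in the abstract can be simulated by brute force for small n: enumerate all 2^n candidate perceptrons with ±1 weights and count how many agree with a random ±1 target on ⌈αn⌉ uniform random ±1 examples. This is a minimal illustrative sketch, not code from the paper; the function name and parameters are my own, and brute-force enumeration is only feasible for small n.

```python
import itertools
import math
import random

def consistent_perceptrons(n, alpha, seed=0):
    """Count +/-1 weight vectors consistent with ceil(alpha*n) uniform
    random +/-1 examples, labeled by a random +/-1 target perceptron.
    A count of exactly 1 corresponds to perfect generalization: only
    the target itself is consistent with the training data."""
    rng = random.Random(seed)
    hypotheses = list(itertools.product((-1, 1), repeat=n))
    target = rng.choice(hypotheses)
    m = math.ceil(alpha * n)
    examples = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(m)]

    def label(w, x):
        # Sign convention from the abstract: positive iff w . x >= 0.
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1

    labels = [label(target, x) for x in examples]
    return sum(
        all(label(w, x) == y for x, y in zip(examples, labels))
        for w in hypotheses
    )
```

The target is always consistent with its own labels, so the count is at least 1; the transition discussed in the paper is that, for α above the critical value, the count drops to exactly 1 with high probability as n grows.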
Pages: 386-401
Page count: 16
References
7 items
  • [1] Baum, E. B. (1990). Neural Networks (EURASIP), Vol. 412, p. 2.
  • [2] Baum, E. B., & Haussler, D. (1989). What size net gives valid generalization? Neural Computation, 1(1), 151-160.
  • [3] Bollobás, B. (1985). Random Graphs.
  • [4] Gardner, E., & Derrida, B. (1989). Three unfinished works on the optimal storage capacity of networks. Journal of Physics A: Mathematical and General, 22(12), 1983-1994.
  • [5] Graham, R. L., Knuth, D. E., & Patashnik, O. (1989). Concrete Mathematics: A Foundation for Computer Science.
  • [6] Györgyi, G. (1990). First-order transition to perfect generalization in a neural network with binary synapses. Physical Review A, 41(12), 7097-7100.
  • [7] Sompolinsky, H., Tishby, N., & Seung, H. S. (1990). Learning from examples in large neural networks. Physical Review Letters, 65(13), 1683-1686.