Neural networks with quadratic VC dimension

Cited by: 42
Authors: Koiran, P [1]; Sontag, ED [1]
Institution: [1] Rutgers State Univ, Dept Math, New Brunswick, NJ 08903
DOI: 10.1006/jcss.1997.1479
Chinese Library Classification: TP3 (computing technology, computer technology)
Discipline code: 0812
Abstract
This paper shows that neural networks which use continuous activation functions have VC dimension at least as large as the square of the number of weights w. This result settles a long-standing open question, namely whether the well-known O(w log w) bound, known for hard-threshold nets, also holds for more general sigmoidal nets. Implications for the number of samples needed for valid generalization are discussed. (C) 1997 Academic Press.
Pages: 190-198 (9 pages)