FINITE-SIZE EFFECTS IN LEARNING AND GENERALIZATION IN LINEAR PERCEPTRONS

Cited by: 20
Authors
SOLLICH, P
Institution
[1] Department of Physics, University of Edinburgh
Source
JOURNAL OF PHYSICS A-MATHEMATICAL AND GENERAL | 1994 / Vol. 27 / No. 23
Keywords
DOI
10.1088/0305-4470/27/23/020
CLC classification number
O4 [Physics]
Discipline classification code
0702
Abstract
Most properties of learning and generalization in linear perceptrons can be derived from the average response function G. We present a method for calculating G using only simple matrix identities and partial differential equations. Using this method, we first rederive the known result for G in the thermodynamic limit of perceptrons of infinite size N, which has previously been calculated using replica and diagrammatic methods. We also show explicitly that the response function is self-averaging in the thermodynamic limit. Extensions of our method to more general learning scenarios with anisotropic teacher-space priors, input distributions, and weight-decay terms are discussed. Finally, finite-size effects are considered by calculating the O(1/N) correction to G. We verify the result by computer simulations and discuss the consequences for generalization and learning dynamics in linear perceptrons of finite size.
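As a rough illustration of the quantity discussed in the abstract, the following short Python sketch estimates the average response function numerically at finite N and compares it with the standard thermodynamic-limit expression. The specific conventions used here are assumptions based on common usage in this literature, not a verbatim restatement of the paper's notation: G is taken as (1/N) Tr <(lambda*I + A)^{-1}> with input correlation matrix A = (1/N) sum_mu xi^mu xi^mu^T, isotropic Gaussian inputs, load alpha = p/N, and weight decay lambda; the limit formula is the positive root of lambda*G^2 + (lambda + alpha - 1)*G - 1 = 0. The shrinking gap between the finite-N estimate and the limit value merely illustrates the O(1/N) finite-size correction referred to above.

# Hedged sketch (assumed conventions, see lead-in; not the paper's exact notation)
import numpy as np

def g_empirical(N, alpha, lam, trials=200, seed=None):
    """Monte Carlo estimate of (1/N) Tr <(lam*I + A)^{-1}> for perceptron size N."""
    rng = np.random.default_rng(seed)
    p = int(round(alpha * N))
    vals = []
    for _ in range(trials):
        xi = rng.standard_normal((p, N))      # p random input vectors of dimension N
        A = xi.T @ xi / N                     # empirical input correlation matrix
        M = lam * np.eye(N) + A
        vals.append(np.trace(np.linalg.inv(M)) / N)
    return float(np.mean(vals))

def g_thermodynamic(alpha, lam):
    """Positive root of lam*G^2 + (lam + alpha - 1)*G - 1 = 0 (N -> infinity limit)."""
    b = lam + alpha - 1.0
    return (-b + np.sqrt(b * b + 4.0 * lam)) / (2.0 * lam)

if __name__ == "__main__":
    alpha, lam = 2.0, 0.1                     # example load alpha = p/N and weight decay
    g_inf = g_thermodynamic(alpha, lam)
    for N in (20, 40, 80, 160):
        g_N = g_empirical(N, alpha, lam, seed=0)
        # The deviation g_N - g_inf should shrink roughly like 1/N.
        print(f"N={N:4d}  G_N={g_N:.4f}  G_inf={g_inf:.4f}  diff={g_N - g_inf:+.4f}")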
Pages: 7771-7784
Number of pages: 14
Related references
11 items in total
[1] BARBER, D. J. Phys. A (unpublished).
[2] DERRIDA, B. J. Phys. A, 1991, 24: 4907.
[3] JOHN, F. Partial Differential Equations, 1978.
[4] KINZEL, W. Models of Neural Networks, 1991: 149.
[5] KROGH, A.; HERTZ, J.A. Generalization in a linear perceptron in the presence of noise. Journal of Physics A: Mathematical and General, 1992, 25(5): 1135-1147.
[6] KROGH, A. Learning with noise in a linear perceptron. Journal of Physics A: Mathematical and General, 1992, 25(5): 1119-1133.
[7] MARION, G. Data dependent … , 1994 (unpublished).
[8] OPPER, M. Learning in neural networks: solvable dynamics. Europhysics Letters, 1989, 8(4): 389-392.
[9] SAAD, D. Neural Comput., 1994 (unpublished).
[10] SOLLICH, P. Minimum entrop… , 1994 (unpublished).