CONVERGENCE OF LEARNING ALGORITHMS WITH CONSTANT LEARNING RATES

Cited by: 80
Authors
KUAN, CM [1 ]
HORNIK, K [1 ]
Affiliations
[1] VIENNA TECH UNIV,INST STAT & WAHRSCHEINLICHKEITSTHEORIE,A-1040 VIENNA,AUSTRIA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1991 / Vol. 2 / No. 5
DOI
10.1109/72.134285
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We investigate the behavior of neural network learning algorithms with a small, constant learning rate ε in stationary, random input environments. It is rigorously established that the sequence of weight estimates can be approximated by the solution of a certain ordinary differential equation, in the sense of weak convergence of random processes as ε tends to zero. As applications, back-propagation in feedforward architectures and some feature extraction algorithms are studied in more detail.
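To make the ODE approximation concrete, the sketch below (our illustration, not code from the paper) runs a scalar LMS rule with a constant rate ε on i.i.d. Gaussian data and compares the weight path, indexed by rescaled time t = kε, against the closed-form solution of the corresponding mean ODE dw/dt = E[x(d − xw)]. All parameter values, and the choice of LMS as the learning rule, are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stationary random environment: x ~ N(0, 1), target d = a*x + noise.
a, noise_std = 2.0, 0.5
eps = 0.01                      # small constant learning rate (the paper's ε)
n_steps = 5000

# Constant-rate LMS: w_{k+1} = w_k + ε x_k (d_k - w_k x_k).
w = 0.0
path = np.empty(n_steps + 1)
path[0] = w
for k in range(n_steps):
    x = rng.normal()
    d = a * x + noise_std * rng.normal()
    w += eps * x * (d - w * x)
    path[k + 1] = w

# Mean ODE: dw/dt = E[x d] - E[x^2] w = a - w (since E[x^2] = 1),
# whose solution is w(t) = a + (w(0) - a) * exp(-t), with t = k*eps.
t = eps * np.arange(n_steps + 1)
ode = a + (path[0] - a) * np.exp(-t)

print("final LMS weight :", path[-1])
print("final ODE value  :", ode[-1])
print("max |LMS - ODE|  :", np.abs(path - ode).max())
```

Shrinking eps (while running long enough to reach the same rescaled time t = kε) tightens the match between the stochastic path and the ODE trajectory, which is the ε → 0 weak-convergence statement of the paper in this simple setting.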
Pages: 484-489
Page count: 6