A STOCHASTIC VERSION OF THE DELTA-RULE

Cited by: 35
Authors
HANSON, SJ [1 ]
Affiliation
[1] PRINCETON UNIV,COGNIT SCI LAB,PRINCETON,NJ 08542
Source
PHYSICA D | 1990 / Vol. 42 / Iss. 1-3
DOI
10.1016/0167-2789(90)90081-Y
CLC Classification
O29 [Applied Mathematics]
Subject Classification
070104
Abstract
Self-organizing networks of neuron-like elements naturally lead to high-dimensional, nonlinear parameter spaces that are difficult to search. Back-propagation is one of the simplest neural network/connectionist models that uses gradient descent (the delta rule) in a high-dimensional parameter space. Search in such a space is subject to many difficulties, including minima that are locally stable but do not provide solutions. Search time can also be shown to scale poorly even under relatively favorable conditions. Although gradient descent in error is attractive as a "weak" learning method, it also suffers from many search inefficiencies. A principle is proposed about the relation between constrained local noise injection and global search. The delta rule is modified to include synaptic noise both in the transmission of information and in the modification of connection strengths. This stochastic version of the delta rule seems to promote escape from poor locally stable minima, and can improve convergence speed and likelihood. © 1990.
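The abstract describes weights carrying synaptic noise during both transmission and update. A minimal NumPy sketch of that idea follows, assuming the commonly cited form of the stochastic delta rule: each weight is a Gaussian whose mean follows the ordinary delta rule, while its standard deviation grows with the gradient magnitude and decays geometrically toward zero. The XOR task, network size, and the hyperparameters alpha, beta, and zeta are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR with a 2-4-1 sigmoid network (bias via an appended ones column).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
Xb = np.hstack([X, np.ones((4, 1))])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Each weight is treated as a Gaussian N(mu, sigma). alpha (learning rate),
# beta (noise growth), and zeta (noise decay) are illustrative values only.
alpha, beta, zeta = 0.5, 0.05, 0.99
mu1, s1 = rng.normal(0.0, 1.0, (3, 4)), np.full((3, 4), 0.5)
mu2, s2 = rng.normal(0.0, 1.0, (5, 1)), np.full((5, 1), 0.5)

for step in range(5000):
    # "Synaptic noise in transmission": sample concrete weights each pass.
    W1 = mu1 + s1 * rng.standard_normal(mu1.shape)
    W2 = mu2 + s2 * rng.standard_normal(mu2.shape)

    h = sigmoid(Xb @ W1)
    hb = np.hstack([h, np.ones((4, 1))])
    out = sigmoid(hb @ W2)

    # Back-propagate squared error through the *sampled* weights.
    d_out = (out - y) * out * (1 - out)
    g2 = hb.T @ d_out
    d_h = (d_out @ W2[:-1].T) * h * (1 - h)
    g1 = Xb.T @ d_h

    # Delta rule on the means; noise amplitude tracks the gradient magnitude
    # and anneals geometrically, so search is noisy early and settles later.
    mu1 -= alpha * g1
    mu2 -= alpha * g2
    s1 = zeta * (s1 + beta * np.abs(g1))
    s2 = zeta * (s2 + beta * np.abs(g2))

# Evaluate with the mean weights (the noise has largely annealed away).
hb = np.hstack([sigmoid(Xb @ mu1), np.ones((4, 1))])
print("predictions:", sigmoid(hb @ mu2).ravel().round(2))
```

The annealing schedule plays the same role as temperature in simulated annealing: early noise helps the search escape poor locally stable minima, and the geometric decay lets the network converge once a good basin is found.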
Pages: 265-272
Page count: 8