Slow stochastic Hebbian learning of classes of stimuli in a recurrent neural network

Cited by: 56
Authors
Brunel, N
Carusi, F
Fusi, S
Affiliations
[1] Univ Paris 06, CNRS, Ecole Normale Super, LPS, F-75231 Paris 05, France
[2] Univ Paris 07, CNRS, Ecole Normale Super, LPS, F-75231 Paris, France
[3] Univ Rome La Sapienza, Ist Nazl Fis Nucl, I-00185 Rome, Italy
[4] Hebrew Univ Jerusalem, Racah Inst Phys, IL-91904 Jerusalem, Israel
DOI
10.1088/0954-898X/9/1/007
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
We study unsupervised Hebbian learning in a recurrent network in which synapses have a finite number of stable states. Stimuli received by the network are drawn at random at each presentation from a set of classes. Each class is defined as a cluster in stimulus space, centred on the class prototype. The presentation protocol is chosen to mimic the protocols of visual memory experiments in which a set of stimuli is presented repeatedly in a random way. The statistics of the input stream may be stationary or changing. Each stimulus induces, in a stochastic way, transitions between stable synaptic states. The learning dynamics are studied analytically in the slow learning limit, in which a given stimulus has to be presented many times before it is memorized, i.e. before synaptic modifications enable a pattern of activity correlated with the stimulus to become an attractor of the recurrent network. We show that in this limit the synaptic matrix becomes more correlated with the class prototypes than with any of the instances of the class. We also show that the number of classes that can be learned increases sharply when the coding level decreases, and we determine the speeds of learning and forgetting of classes when the statistics of the input stream change.
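
To make the learning scheme above concrete, here is a minimal Python sketch (not the authors' code) of stochastic Hebbian learning with two-state synapses: binary patterns at coding level f, class instances drawn by resampling a fraction of the prototype's bits, and potentiation and depression implemented as stochastic 0 -> 1 and 1 -> 0 transitions with small probabilities q_plus and q_minus (the slow-learning limit). All parameter values, and the specific depression condition (exactly one of the two neurons active), are illustrative assumptions rather than the paper's exact rule.

import numpy as np

rng = np.random.default_rng(0)
N, P, f = 200, 5, 0.1          # neurons, classes, coding level (assumed values)
q_plus, q_minus = 0.02, 0.02   # small transition probabilities: slow-learning limit
noise = 0.1                    # fraction of prototype bits resampled per instance (assumed)

prototypes = (rng.random((P, N)) < f).astype(np.int8)
J = np.zeros((N, N), dtype=np.int8)        # synapses with two stable states {0, 1}

def draw_instance(proto):
    """Draw one class instance: resample a random fraction of the prototype's bits."""
    xi = proto.copy()
    mask = rng.random(N) < noise
    xi[mask] = (rng.random(mask.sum()) < f).astype(np.int8)
    return xi

for _ in range(5000):                      # random, repeated presentation protocol
    xi = draw_instance(prototypes[rng.integers(P)])
    both_on = np.outer(xi, xi) == 1        # pre- and post-synaptic neurons both active
    one_on = np.add.outer(xi, xi) == 1     # exactly one of the pair active
    J[both_on & (J == 0) & (rng.random((N, N)) < q_plus)] = 1   # stochastic potentiation
    J[one_on & (J == 1) & (rng.random((N, N)) < q_minus)] = 0   # stochastic depression

# The synaptic matrix should end up more correlated with a class
# prototype than with any single instance of that class:
proto_corr = np.corrcoef(J.ravel(), np.outer(prototypes[0], prototypes[0]).ravel())[0, 1]
inst = draw_instance(prototypes[0])
inst_corr = np.corrcoef(J.ravel(), np.outer(inst, inst).ravel())[0, 1]
print(f"correlation with prototype: {proto_corr:.3f}, with one instance: {inst_corr:.3f}")

Because q_plus and q_minus are small, each presentation leaves only a faint synaptic trace, so a stimulus must be seen many times before its activity pattern becomes an attractor; the synaptic matrix therefore averages over the instances of each class, which is why it aligns with the prototypes.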
Pages: 123-152
Page count: 30