Hebbian learning of context in recurrent neural networks

Cited: 55
Author
Brunel, N [1]
Affiliation
[1] UNIV ROMA LA SAPIENZA, INST FIS, IST NAZL FIS NUCL, I-00185 ROME, ITALY
DOI
10.1162/neco.1996.8.8.1677
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Single-electrode recordings in the inferotemporal cortex of monkeys during delayed visual memory tasks provide evidence for attractor dynamics in the observed region. The persistent elevated delay activities could be internal representations of features of the learned visual stimuli shown to the monkey during training. When uncorrelated stimuli are presented during training in a fixed sequence, these experiments display significant correlations between the internal representations. Recently a simple attractor neural network model has quantitatively reproduced the measured correlations. An underlying assumption of the model is that the synaptic matrix formed during the training phase contains in its efficacies information about the contiguity of persistent stimuli in the training sequence. We present here a simple unsupervised learning dynamics that produces such a synaptic matrix if sequences of stimuli are repeatedly presented to the network in a fixed order. The resulting matrix is then shown to convert temporal correlations during training into spatial correlations between attractors. The scenario is that, in the presence of selective delay activity, at the presentation of each stimulus the activity distribution in the neural assembly contains information about both the current stimulus and the previous one (carried by the attractor). Thus the recurrent synaptic matrix can code not only for each of the stimuli presented to the network but also for their context. We combine the idea that, for learning to be effective, synaptic modification should be stochastic with the fact that attractors provide learnable information about two consecutive stimuli. We calculate explicitly the probability distribution of synaptic efficacies as a function of the training protocol, that is, the order in which stimuli are presented to the network.
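The mechanism described above — stochastic Hebbian modification acting on delay activity that carries over part of the previous attractor — can be sketched in a few lines. This is a minimal toy illustration, not the paper's model: the binary synapses, the transition probabilities `q_pot` and `q_dep`, the carry-over probability `lam`, and the network size are all assumed illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200    # neurons (toy size)
P = 5      # stimuli in the fixed training sequence
f = 0.1    # coding level: fraction of neurons active per stimulus
q_pot = q_dep = 0.05   # stochastic potentiation/depression probabilities
lam = 0.5  # probability that a neuron of the previous attractor stays active

# Uncorrelated binary stimuli and binary synapses J_ij in {0, 1}
xi = (rng.random((P, N)) < f).astype(float)
J = np.zeros((N, N))

for _ in range(200):                 # repeated presentations of the sequence
    prev = None
    for mu in range(P):              # fixed presentation order
        # Activity at presentation mu mixes the current stimulus with part
        # of the previous attractor (the "context" carried over the delay).
        act = xi[mu].copy()
        if prev is not None:
            act[(prev > 0) & (rng.random(N) < lam)] = 1.0
        pre, post = np.meshgrid(act, act)   # pre[i, j] = act[j]
        # Stochastic Hebbian transitions: potentiate when pre and post are
        # both active, depress when exactly one of them is.
        pot = (rng.random((N, N)) < q_pot) & (pre * post > 0.5)
        dep = (rng.random((N, N)) < q_dep) & (np.abs(pre - post) > 0.5)
        J[pot] = 1.0
        J[dep] = 0.0
        prev = xi[mu]

def coupling(a, b):
    """Mean synaptic efficacy from pattern b's neurons onto pattern a's."""
    return a @ J @ b / (N * f) ** 2

consec = np.mean([coupling(xi[mu], xi[mu + 1]) for mu in range(P - 1)])
distant = np.mean([coupling(xi[mu], xi[nu])
                   for mu in range(P) for nu in range(P) if abs(mu - nu) > 1])
print(consec, distant)
```

Because potentiation occurs between neurons of the current stimulus and carried-over neurons of the previous one, consecutive stimuli end up more strongly coupled than non-adjacent ones — the temporal order of training is converted into spatial correlations in the synaptic matrix.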
We then solve for the dynamics of a network composed of integrate-and-fire excitatory and inhibitory neurons, with a matrix of synaptic collaterals resulting from the learning dynamics. The network has stable spontaneous activity, and stable delay activity develops after a critical learning stage. The availability of a learning dynamics makes possible a number of experimental predictions for the dependence of the delay-activity distributions, and of the correlations between them, on the learning stage and the learning protocol. In particular, it makes specific predictions for pair-associate delay experiments.
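A single leaky integrate-and-fire unit of the kind composing such networks can be sketched as follows (the time constant, threshold, and currents are illustrative values, not the paper's calibrated parameters). It shows the basic threshold behavior that makes two stable activity regimes possible: subthreshold input produces no firing (spontaneous-like silence), while input lifted above threshold — as recurrent excitation does for stimulus-selective cells — sustains elevated firing.

```python
# Minimal leaky integrate-and-fire neuron: dV/dt = (-V + I) / tau,
# with a spike and reset to V_reset whenever V crosses the threshold theta.
tau, theta, V_reset, dt = 0.02, 1.0, 0.0, 0.0001  # s, threshold, reset, step (s)

def simulate(I, T=1.0):
    """Return the firing rate (spikes/s) for a constant input current I."""
    V, spikes = 0.0, 0
    for _ in range(int(T / dt)):
        V += dt * (-V + I) / tau   # forward-Euler integration of the leak
        if V >= theta:
            V, spikes = V_reset, spikes + 1
    return spikes / T

print(simulate(0.8), simulate(1.5))  # subthreshold vs suprathreshold input
```

With I = 0.8 the membrane potential saturates below threshold and the neuron is silent; with I = 1.5 it fires periodically, at a rate set by the time to reach threshold from reset.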
Pages: 1677-1710
Page count: 34