LEARNING INTERNAL REPRESENTATIONS IN AN ATTRACTOR NEURAL NETWORK WITH ANALOG NEURONS

Cited by: 42
Authors
AMIT, DJ
BRUNEL, N
DOI
10.1088/0954-898X/6/3/004
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A learning attractor neural network (LANN), with a double dynamics of neural activities and synaptic efficacies operating on two different timescales, is studied by simulations in preparation for an electronic implementation. The present network includes several quasi-realistic features: neurons are represented by their afferent currents and output spike rates; excitatory and inhibitory neurons are separated; attractor spike rates, as well as coding levels in arriving stimuli, are low; learning takes place only between excitatory units. The synaptic dynamics is an unsupervised, analogue Hebbian process, but long-term memory in the absence of neural activity is maintained by a refresh mechanism which, on long timescales, discretizes the synaptic values, converting learning into an asynchronous stochastic process induced by the stimuli on the synaptic efficacies. The network is intended to learn a set of attractors from the statistics of freely arriving stimuli, which are represented by external synaptic inputs injected into the excitatory neurons. In the simulations, different types of sequences of many thousands of stimuli are presented to the network, without a learning phase being distinguished from retrieval in the dynamics. Stimulus sequences differ in their pre-assigned global statistics (including time-dependent statistics); in the order of presentation of individual stimuli within given statistics; in the lengths of the presentation intervals; and in the intervals separating one stimulus from the next. We find that the network effectively learns a set of attractors representing the statistics of the stimuli, and is able to modify its attractors when the input statistics change. Moreover, as the global input statistics change, the network can also forget attractors related to stimulus classes no longer presented. Forgetting takes place only through the arrival of new stimuli. The performance of the network and the statistics of the attractors are studied as a function of the input statistics. Most of the large-scale characteristics of the learning dynamics can be captured theoretically. This model recasts a previous implementation of a LANN composed of discrete neurons into a network of more realistic neurons. The different elements have been designed to facilitate their implementation in silicon.
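To make the two-timescale mechanism concrete, the following is a minimal, hypothetical Python sketch of the coupled dynamics described above: fast rate dynamics driven by afferent currents, a slow unsupervised analogue Hebbian drift of the efficacies, and a refresh term that pushes each efficacy toward one of two stable values so that memory persists in the absence of neural activity. All names and parameter values (N, tau_r, tau_w, the sigmoid transfer function, the sign-based refresh, the coding level) are illustrative assumptions, not the authors' specification; inhibition is collapsed here into a single mean-field term.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100          # excitatory units (illustrative size)
tau_r = 10.0     # fast timescale: rate dynamics
tau_w = 1000.0   # slow timescale: synaptic drift
dt = 1.0

J = rng.uniform(0.0, 1.0, (N, N))   # analog efficacies in [0, 1]
np.fill_diagonal(J, 0.0)
r = np.zeros(N)                      # output spike rates

def gain(h, theta=0.5, beta=4.0):
    """Monotonic current-to-rate transfer function (sigmoid stand-in)."""
    return 1.0 / (1.0 + np.exp(-beta * (h - theta)))

def step(stimulus, g_inh=0.8, lam=0.01):
    """One Euler update of the coupled fast/slow dynamics."""
    global r, J
    # Fast dynamics: afferent current = recurrent excitation
    # minus global inhibition plus the external stimulus.
    h = J @ r / N - g_inh * r.mean() + stimulus
    r += dt / tau_r * (-r + gain(h))
    # Slow dynamics: unsupervised analog Hebbian learning
    # between excitatory units only.
    J += dt / tau_w * np.outer(r, r)
    # Refresh: drift each efficacy toward the nearer of two
    # stable values (0 or 1), discretizing on long timescales.
    J += dt / tau_w * lam * np.sign(J - 0.5)
    np.clip(J, 0.0, 1.0, out=J)
    np.fill_diagonal(J, 0.0)

# Present a sparse (low coding level) stimulus for a while.
xi = (rng.random(N) < 0.1).astype(float)   # coding level f = 0.1
for _ in range(200):
    step(0.5 * xi)
```

The design point the sketch tries to capture is the separation of timescales: tau_w is much larger than tau_r, so synapses integrate over many stimulus presentations, while the refresh term makes each synapse effectively bistable, turning analog learning into stimulus-driven stochastic transitions between discrete efficacy values.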
Pages: 359-388
Page count: 30