A VLSI recurrent network of integrate-and-fire neurons connected by plastic synapses with long-term memory

Cited by: 144
Authors
Chicca, E [1]
Badoni, D
Dante, V
D'Andreagiovanni, M
Salina, G
Carota, L
Fusi, S
Del Giudice, P
Affiliations
[1] Univ Zurich & ETH Zurich, Inst Neuroinformat, CH-8057 Zurich, Switzerland
[2] INFN Roma 2, I-00133 Rome, Italy
[3] Italian Natl Inst Hlth, Phys Lab, I-00161 Rome, Italy
[4] INFN Sanita, I-00161 Rome, Italy
[5] Univ Aquila, Dept Phys, I-67100 L'Aquila, Italy
[6] Univ Bern, Inst Physiol, CH-3012 Bern, Switzerland
[7] Ist Super Sanita, Phys Lab, I-00161 Rome, Italy
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2003 / Vol. 14 / No. 5
Keywords
integrate-and-fire neurons; learning systems; neuromorphic aVLSI; synaptic plasticity
DOI
10.1109/TNN.2003.816367
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Electronic neuromorphic devices with on-chip, on-line learning should be able to quickly modify the synaptic couplings to acquire information about new patterns to be stored (synaptic plasticity) and, at the same time, preserve this information on very long time scales (synaptic stability). Here, we illustrate the electronic implementation of a simple solution to this stability-plasticity problem, recently proposed and studied in various contexts. It is based on the observation that reducing the analog depth of the synapses to the extreme (bistable synapses) does not necessarily disrupt the performance of the device as an associative memory, provided that 1) the number of neurons is large enough; 2) the transitions between stable synaptic states are stochastic; and 3) learning is slow. The drastic reduction of the analog depth of the synaptic variable also makes this solution appealing from the point of view of electronic implementation and offers a simple methodological alternative to the technological solution based on floating gates. We describe the full-custom analog very large-scale integration (VLSI) realization of a small network of integrate-and-fire neurons connected by bistable deterministic plastic synapses which can implement the idea of stochastic learning. In the absence of stimuli, the memory is preserved indefinitely. During stimulation, the synapse undergoes quick temporary changes driven by the activities of the pre- and postsynaptic neurons; those changes stochastically result in a long-term modification of the synaptic efficacy. The intentionally disordered pattern of connectivity allows the system to generate a randomness suited to drive the stochastic selection mechanism. We check by a suitable stimulation protocol that the stochastic synaptic plasticity produces the expected pattern of potentiation and depression in the electronic network. The proposed implementation requires only 69 x 83 μm² for the neuron and 68 x 47 μm² for the synapse (using a 0.6-μm, three-metal CMOS technology) and, hence, is particularly suitable for the integration of a large number of plastic synapses on a single chip.
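The learning scheme summarized in the abstract (bistable synapses whose pre/post-driven candidate transitions become long-term changes only with small probability) can be illustrated in software. The Python snippet below is a minimal sketch under stated assumptions, not the chip's circuit: the network size, the Hebbian potentiation/depression conditions, and the transition probabilities q_plus and q_minus are illustrative choices, not values taken from the paper.

```python
# Minimal sketch of stochastic learning with bistable synapses.
# All parameter values and the exact transition conditions are assumptions
# made for illustration; the paper implements this in analog VLSI hardware.
import numpy as np

rng = np.random.default_rng(0)

N = 200                                        # number of neurons (assumed)
J = rng.integers(0, 2, (N, N)).astype(float)   # bistable synapses: 0 (depressed) or 1 (potentiated)

q_plus, q_minus = 0.05, 0.05                   # slow, stochastic transition probabilities (assumed)

def stimulate(J, pattern):
    """One stimulation step: pre/post activity drives candidate transitions;
    only a random subset of them becomes a long-term change."""
    pre = pattern[np.newaxis, :]    # presynaptic activity (0/1), one column per source neuron
    post = pattern[:, np.newaxis]   # postsynaptic activity (0/1), one row per target neuron
    # Hebbian candidates (assumed rule): potentiate when pre and post are both
    # active, depress when post is active and pre is inactive.
    ltp = (pre == 1) & (post == 1) & (J == 0)
    ltd = (pre == 0) & (post == 1) & (J == 1)
    J[ltp & (rng.random(J.shape) < q_plus)] = 1.0
    J[ltd & (rng.random(J.shape) < q_minus)] = 0.0
    return J

# Store a few random binary patterns; because transitions are slow and
# stochastic, each new pattern overwrites only a small fraction of the
# previously modified synapses.
patterns = rng.integers(0, 2, (5, N))
for p in patterns:
    J = stimulate(J, p)

# With no stimulation the two synaptic states are stable, so the stored
# memory persists indefinitely (there is no decay term by construction).
print("fraction potentiated:", J.mean())
```

In this sketch the randomness is supplied by the pseudorandom generator; in the chip, the disordered connectivity itself generates the variability that drives the stochastic selection mechanism.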
Pages: 1297-1307
Number of pages: 11