Storage capacity diverges with synaptic efficiency in an associative memory model with synaptic delay and pruning

Cited by: 13
Authors
Miyoshi, S [1 ]
Okada, M
Affiliations
[1] Univ Tokyo, Grad Sch Frontier Sci, Tokyo, Japan
[2] RIKEN, Brain Sci Inst, Lab Math Neurosci, Wako, Saitama 3510198, Japan
[3] Japan Sci & Technol Agcy, Kyoto 6190288, Japan
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2004, Vol. 15, No. 5
Keywords
associative memory; delay; neural network; statistical neurodynamics; synaptic pruning;
DOI
10.1109/TNN.2004.832711
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
It is known that the storage capacity per synapse of a correlation-type associative memory model increases with synaptic pruning; however, the storage capacity of the entire network then decreases. To overcome this difficulty, we propose decreasing the connectivity while keeping the total number of synapses constant by introducing delayed synapses. In this paper, a discrete synchronous model with both delayed synapses and pruning is discussed as a concrete example of the proposal. First, we explain the Yanai-Kim theory, which uses statistical neurodynamics to derive macrodynamical equations for a network with serial delay elements. Next, exploiting the translational symmetry of these equations, we rederive the macroscopic steady-state equations of the model by means of the discrete Fourier transformation and analyze the storage capacities quantitatively. Two types of synaptic pruning are then treated analytically: random pruning and systematic pruning. In both cases, the storage capacity increases as the length of delay increases and the connectivity of the synapses decreases while the total number of synapses is held constant. Moreover, under random pruning the storage capacity asymptotically approaches 2/π, whereas under systematic pruning it diverges in proportion to the logarithm of the length of delay, with proportionality constant 4/π. These results theoretically support the significance of pruning following the overgrowth of synapses in the brain, and they suggest that the brain may prefer to store dynamic attractors, such as sequences and limit cycles, rather than equilibrium states.
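The abstract describes the model only verbally. As a rough illustration of the kind of network it refers to, the sketch below simulates a small discrete synchronous associative memory with L serial delay elements, a correlation (Hebbian) sequence-learning rule, and random pruning at connectivity c = 1/L, so that the total number of synapses is held constant as L grows. The learning rule, parameter values, and names (N, L, c, P, xi, J) are illustrative assumptions rather than the authors' exact formulation, and the script only checks that a short stored sequence is retrieved; it does not reproduce the statistical-neurodynamics capacity analysis.

```python
import numpy as np

# Minimal sketch (assumed form, not the authors' exact model) of a discrete
# synchronous associative memory with L serial delay elements and random
# synaptic pruning at connectivity c = 1/L, so that the total number of
# synapses (c * L * N^2) does not depend on L.
rng = np.random.default_rng(0)

N = 500          # number of neurons
L = 4            # length of delay (number of serial delay elements)
c = 1.0 / L      # connectivity chosen to keep the total synapse count constant
P = 40           # length of the stored pattern sequence (loading P/N = 0.08)

# Random +/-1 patterns forming a cyclic sequence xi[0] -> xi[1] -> ... -> xi[0].
xi = rng.choice([-1.0, 1.0], size=(P, N))

# Correlation (Hebbian) learning for each delay l: J[l] associates the pattern
# seen l steps earlier with the pattern that should appear next in the
# sequence; a random 0/1 mask then prunes each synapse with probability 1 - c.
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(P):
        J[l] += np.outer(xi[(mu + 1) % P], xi[(mu - l) % P])
    mask = rng.random((N, N)) < c
    J[l] = mask * J[l] / (c * N)

# Synchronous update: s(t+1) = sgn( sum_l J[l] s(t - l) ).
history = [xi[(-l) % P].copy() for l in range(L)]   # [s(t), s(t-1), ...]
T = 3 * P                                           # number of update steps
for t in range(T):
    h = sum(J[l] @ history[l] for l in range(L))
    s_next = np.where(h >= 0.0, 1.0, -1.0)
    history = [s_next] + history[:-1]

# After T steps starting from xi[0], the network should have reached xi[T % P].
target = xi[T % P]
print("overlap with expected pattern:", float(history[0] @ target) / N)
```

At this small loading the stored sequence is retrieved with overlap close to 1. The paper's results concern the limit of this construction: with the total synapse count fixed and connectivity c = 1/L, the storage capacity approaches 2/π as L grows under random pruning and grows like (4/π) ln L under systematic pruning; establishing those limits requires the statistical-neurodynamics analysis rather than a single simulation of this kind.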
Pages: 1215-1227
Number of pages: 13