Neural potentials as stimuli for attractor neural networks

Cited by: 17
Authors
Amit, Daniel J. [1 ]
Parisi, G. [2 ]
Nicolis, S. [1 ]
Affiliations
[1] Univ Rome La Sapienza, Dipartimento Fis, Ist Nazl Fis Nucl, I-00185 Rome, Italy
[2] Univ Roma Tor Vergata 2, Dipartimento Fis, I-00173 Rome, Italy
DOI: 10.1088/0954-898X/1/1/006
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
The retrieval properties of attractor neural networks are studied under a modified representation of the stimulus input. In the present study the stimulus imposes an initial state on the network and then remains as a persistent potential input (local field) of weakened amplitude, containing the same number of errors as the initial, strong input. The dynamics of this network is analysed for three types of error distribution in the persistent stimulus: Gaussian, discrete, and hidden units. It is found that, below saturation of the free network, the persistent stimulus does not degrade the retrieval properties of the free network up to rather high values of the field amplitude. Even above saturation, at all loading levels, the network corrects errors provided the field amplitude is not too low. The system is studied by solving mean-field equations for its attractors, and the solutions are compared with simulations. The results lead to the conclusion that the memory loading level at which the network stops acting as an effective associative memory is determined, biologically, by the size of a plausible stimulus input and by the level of errors that can be biologically tolerated. To extend the study to a potential sequence of error-correcting oversaturated networks, we introduce random synaptic dilution, intended to suppress correlations between the error bits in the dynamics of the two coupled networks. This compound system is studied by simulations. Finally, we discuss a potential theoretical project in which the dynamics of the coupled networks could be studied analytically.
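The retrieval protocol described in the abstract lends itself to a compact simulation. Below is a minimal sketch in Python, assuming a Hopfield-type network of +/-1 neurons with Hebbian couplings and synchronous zero-temperature updates; the parameter names (N, P, h, flip_frac) and their values are illustrative assumptions, not taken from the paper. The same noisy stimulus both sets the initial network state and persists as a weak local field of amplitude h, as described above.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper).
N = 500          # number of neurons
P = 25           # stored patterns, loading level alpha = P/N = 0.05
h = 0.3          # amplitude of the weakened persistent field
flip_frac = 0.2  # fraction of error bits in the stimulus

# Store P random +/-1 patterns with the Hebbian rule.
patterns = rng.choice([-1, 1], size=(P, N))
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0.0)

# Noisy stimulus: the first pattern with a fraction of its bits flipped.
stimulus = patterns[0].copy()
stimulus[rng.random(N) < flip_frac] *= -1

# The stimulus imposes the initial state AND persists as a weak local field.
# Synchronous updates are used here for brevity; asynchronous updates are
# more standard for Hopfield dynamics.
s = stimulus.astype(float)
for _ in range(100):
    s_new = np.sign(J @ s + h * stimulus)
    s_new[s_new == 0] = 1.0              # break ties deterministically
    if np.array_equal(s_new, s):         # fixed point reached
        break
    s = s_new

# Overlap m with the stored pattern; m = 1 means all errors corrected.
m = float(s @ patterns[0]) / N
print(f"overlap with stored pattern: {m:.3f}")

With these settings, well below saturation and with the field aiding retrieval, the network should typically relax to an overlap near 1, i.e. it corrects the stimulus errors; raising P toward and beyond saturation, or lowering h, lets one probe the regimes the abstract discusses.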
Pages: 75-88 (14 pages)