Iterative retrieval of sparsely coded associative memory patterns

Cited by: 68
Authors
Schwenker, F
Sommer, FT
Palm, G
Affiliation
[1] Dept. of Neural Info. Processing, University of Ulm
Keywords
neural auto-associative memory; Hebbian learning; iterative retrieval; threshold control; sparse coding;
DOI
10.1016/0893-6080(95)00112-3
Chinese Library Classification (CLC)
TP18 [artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We investigate the pattern completion performance of neural auto-associative memories composed of binary threshold neurons for sparsely coded binary memory patterns. By focusing on iterative retrieval, we are able to introduce effective threshold control strategies. These are investigated by means of computer simulation experiments and analytical treatment. To evaluate the system's performance we consider the completion capacity C and the mean retrieval errors. The asymptotic completion capacity for the recall of sparsely coded binary patterns in one-step retrieval is known to be ln 2 / 4 ≈ 17.32% for binary Hebbian learning and 1/(8 ln 2) ≈ 18% for additive Hebbian learning. These values are achieved with vanishing error probability and are higher than those obtained in other known neural memory models. Recent investigations of binary Hebbian learning have proved that iterative retrieval, as a more refined retrieval method, does not improve the asymptotic completion capacity of one-step retrieval. In a finite-size auto-associative memory, however, we show that iterative retrieval achieves higher capacity and better error correction than one-step retrieval. One-step retrieval produces high retrieval errors at optimal memory load; iterative retrieval reduces these errors within a few iteration steps (t ≤ 5). Experiments with additive Hebbian learning show that in the finite model binary Hebbian learning exhibits much better performance, so the main concern of this paper is binary Hebbian learning. We examine iterative retrieval in experiments with up to n = 20,000 threshold neurons. At this system size one-step retrieval yields a completion capacity of about 16%, the second retrieval step increases this value to 17.9%, and with iterative retrieval we obtain 19%. The first two retrieval steps in the finite system have also been treated analytically.
For one-step retrieval the asymptotic capacity value is approached from below with growing system size. In the second retrieval step (and, as the experiments suggest, also for iterative retrieval) the finite-size behaviour is different: the capacity exceeds the asymptotic value, reaches an optimum at finite system size, and then decreases to the asymptotic limit. Copyright (C) 1996 Elsevier Science Ltd
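The storage and retrieval scheme the abstract describes — binary (clipped) Hebbian learning on sparse binary patterns, followed by threshold-controlled iterative retrieval — can be sketched as below. The network size, the number of stored patterns, and the specific threshold rule (threshold set to the current number of active units) are illustrative assumptions for a small demonstration, not the paper's exact parameters or its precise threshold control strategy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumed, far smaller than the paper's n = 20,000):
n, M, k = 1000, 500, 10   # neurons, stored patterns, active bits per pattern

# Store M sparse binary patterns with binary (clipped) Hebbian learning:
# W[i, j] = 1 if units i and j are jointly active in at least one pattern.
patterns = np.zeros((M, n), dtype=np.uint8)
for mu in range(M):
    patterns[mu, rng.choice(n, size=k, replace=False)] = 1

W = np.zeros((n, n), dtype=np.uint8)
for x in patterns:
    W |= np.outer(x, x)

def retrieve(cue, steps=5):
    """Iterative retrieval with a simple threshold control rule (an
    assumption for illustration): the firing threshold equals the number
    of currently active units, since a unit belonging to the stored
    pattern receives input from every active cue unit."""
    x = cue.copy()
    for _ in range(steps):
        s = W @ x.astype(np.int64)          # dendritic potentials
        theta = int(x.sum())                # threshold = current activity
        x_new = (s >= theta).astype(np.uint8)
        if np.array_equal(x_new, x):        # fixed point reached
            break
        x = x_new
    return x

# Cue the memory with half of the active bits of pattern 0.
cue = np.zeros(n, dtype=np.uint8)
active = np.flatnonzero(patterns[0])
cue[active[: k // 2]] = 1

completed = retrieve(cue)
# At moderate memory load the iteration should converge to patterns[0]
# within the first one or two steps.
```

At this load (about 5% of the weight matrix set to 1), a half-pattern cue typically drives all k pattern units above threshold in the first step while spurious units remain below it, so the iteration reaches a fixed point quickly — the finite-size error-correction effect the abstract reports for t ≤ 5.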
Pages: 445-455
Page count: 11