A DISTRIBUTED OUTSTAR NETWORK FOR SPATIAL PATTERN LEARNING

Cited by: 17
Authors
CARPENTER, GA [1]
Affiliation
[1] BOSTON UNIV, DEPT COGNIT & NEURAL SYST, BOSTON, MA 02215
Keywords
SPATIAL PATTERN LEARNING; DISTRIBUTED CODE; OUTSTAR; ADAPTIVE THRESHOLD; RECTIFIED BIAS; ATROPHY DUE TO DISUSE; TRANSMISSION FUNCTION; NEURAL NETWORK;
DOI
10.1016/0893-6080(94)90064-7
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
The distributed outstar, a generalization of the outstar neural network for spatial pattern learning, is introduced. In the outstar, signals from a source node cause weights to learn and recall arbitrary patterns across a target field of nodes. The distributed outstar replaces the outstar source node with a source field of arbitrarily many nodes, whose activity pattern may be arbitrarily distributed or compressed. Learning proceeds according to a principle of atrophy due to disuse, whereby a path weight decreases in joint proportion to the transmitted path signal and the degree of disuse of the target node. During learning, the total signal to a target node converges toward that node's activity level. Weight changes at a node are apportioned according to the distributed pattern of converging signals. Three synaptic transmission functions, a product rule, a capacity rule, and a threshold rule, are examined for this system. The three rules are computationally equivalent when source field activity is maximally compressed, or winner-take-all. When source field activity is distributed, catastrophic forgetting may occur. Only the threshold rule solves this problem. Analysis of spatial pattern learning by distributed codes thereby leads to the conjecture that the unit of long-term memory in such a system is an adaptive threshold, rather than the multiplicative path weight widely used in neural models.
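As a rough numerical illustration of the learning principle described in the abstract, the sketch below assumes a discrete-time, threshold-rule variant: each adaptive threshold rises (so the corresponding path signal atrophies) in joint proportion to the transmitted path signal and the target node's degree of disuse, taken here as the excess of total incoming signal over the target node's activity. The function name, array shapes, and the learning-rate parameter are illustrative assumptions, not the paper's exact dynamical equations.

```python
import numpy as np

def distributed_outstar_step(y, x, tau, lr=0.1):
    """One schematic learning step for a source field y (length m), a
    target field x (length n), and an m x n array of adaptive thresholds
    tau. Returns the updated thresholds."""
    # Transmitted path signal: source activity above the adaptive threshold.
    S = np.maximum(y[:, None] - tau, 0.0)        # shape (m, n)
    # Total converging signal at each target node.
    sigma = S.sum(axis=0)                        # shape (n,)
    # Degree of disuse: excess of total signal over target activity.
    disuse = np.maximum(sigma - x, 0.0)          # shape (n,)
    # Thresholds rise (paths atrophy) in joint proportion to the transmitted
    # signal and the target node's disuse, so sigma drifts toward x.
    return tau + lr * S * disuse[None, :]

# Example: one fully active source node, thresholds initialized to zero.
y = np.array([1.0, 0.0])
x = np.array([0.3, 0.7])
tau = np.zeros((2, 2))
for _ in range(200):
    tau = distributed_outstar_step(y, x, tau)
# The total signal to each target node approaches that node's activity
# level (0.3 and 0.7), consistent with the convergence property stated
# in the abstract.
```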
Pages: 159-168
Page count: 10