Fixed-point attractor analysis for a class of neurodynamics

Cited by: 8
Authors
Feng, JF [1]
Brown, D [1]
Affiliations
[1] Babraham Inst, Biomath Lab, Cambridge CB2 4AT, England
DOI
10.1162/089976698300017944
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Nearly all neural network models start from the assumption that the input-output characteristic is a sigmoidal function. We present a systematic and feasible method, formulated in parameter space, for analyzing the whole spectrum of attractors (all-saturated, all-but-one-saturated, all-but-two-saturated, and so on) of a neurodynamical system whose input-output characteristic is a saturated sigmoidal function. We argue that, under a mild condition, only all-saturated or all-but-one-saturated attractors are observable in the neurodynamics. For any given all-saturated configuration $\underline{\xi}$ (all-but-one-saturated configuration $\underline{\eta}$), the article shows how to construct an exact parameter region $R(\underline{\xi})$ ($\bar{R}(\underline{\eta})$) such that $\underline{\xi}$ ($\underline{\eta}$) is an attractor (a fixed point) of the dynamics if and only if the parameters fall within $R(\underline{\xi})$ ($\bar{R}(\underline{\eta})$). The parameter region for an all-saturated fixed-point attractor is independent of the specific choice of saturated sigmoidal function, whereas the region for an all-but-one-saturated fixed point is sensitive to the input-output characteristic. Based on a similar idea, we discuss the role of the weight normalization realized by a saturated sigmoidal function in competitive learning. A necessary and sufficient condition is provided to distinguish two kinds of competitive learning: stable competitive learning, in which the weight vectors represent extremes of the input space and are themselves fixed-point attractors, and unstable competitive learning. We apply our results to Linsker's model and, using extreme value theory from statistics, to the Hopfield model, obtaining novel results for both.
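To make the objects in the abstract concrete, here is a minimal numerical sketch (not from the paper): it iterates the discrete-time dynamics $x_{t+1} = \sigma(Wx_t + h)$ with a piecewise-linear saturated sigmoid and reports how many coordinates of the final iterate are unsaturated, so a count of 0 corresponds to an all-saturated configuration and 1 to an all-but-one-saturated one. The weight matrix, threshold, and all function names are illustrative assumptions, not constructions from the article.

```python
import numpy as np

def sat_sigmoid(x, lo=-1.0, hi=1.0):
    """Piecewise-linear saturated sigmoid: identity clipped to [lo, hi]."""
    return np.clip(x, lo, hi)

def iterate(W, h, x0, steps=1000, tol=1e-10):
    """Iterate x_{t+1} = sigma(W x_t + h); return the last iterate.

    Stops early on approximate convergence; if no fixed point is reached
    within `steps` (cycles are possible), the final iterate is returned.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x_next = sat_sigmoid(W @ x + h)
        if np.max(np.abs(x_next - x)) < tol:
            return x_next
        x = x_next
    return x

def unsaturated_count(x, lo=-1.0, hi=1.0, eps=1e-8):
    """Number of coordinates NOT pinned at a saturation bound:
    0 -> all-saturated, 1 -> all-but-one-saturated, and so on."""
    saturated = np.sum((np.abs(x - lo) < eps) | (np.abs(x - hi) < eps))
    return len(x) - saturated

# Hypothetical example: random symmetric weights and zero threshold.
rng = np.random.default_rng(0)
n = 8
W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
W = (W + W.T) / 2
h = np.zeros(n)
x_star = iterate(W, h, rng.uniform(-1, 1, n))
print("unsaturated coordinates:", unsaturated_count(x_star))
```

Running this for many random initial conditions typically yields counts of 0 or 1, which is one informal way to visualize the abstract's claim that only all-saturated or all-but-one-saturated attractors are observable.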
Pages: 189-213
Page count: 25