Fixed-point attractor analysis for a class of neurodynamics

Cited by: 8
Authors
Feng, JF [1]
Brown, D [1]
Affiliation
[1] Babraham Inst, Biomath Lab, Cambridge CB2 4AT, England
DOI
10.1162/089976698300017944
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Nearly all models in neural networks start from the assumption that the input-output characteristic is a sigmoidal function. On parameter space, we present a systematic and feasible method for analyzing the whole spectrum of attractors (all-saturated, all-but-one-saturated, all-but-two-saturated, and so on) of a neurodynamical system with a saturated sigmoidal function as its input-output characteristic. We argue that, under a mild condition, only all-saturated or all-but-one-saturated attractors are observable in the neurodynamics. For any given all-saturated configuration $\underline{\xi}$ (all-but-one-saturated configuration $\underline{\eta}$), the article shows how to construct an exact parameter region $R(\underline{\xi})$ ($\bar{R}(\underline{\eta})$) such that $\underline{\xi}$ ($\underline{\eta}$) is an attractor (a fixed point) of the dynamics if and only if the parameters fall within $R(\underline{\xi})$ ($\bar{R}(\underline{\eta})$). The parameter region for an all-saturated fixed-point attractor is independent of the specific choice of saturated sigmoidal function, whereas for an all-but-one-saturated fixed point it is sensitive to the input-output characteristic. Based on a similar idea, the role of the weight normalization realized by a saturated sigmoidal function in competitive learning is discussed. A necessary and sufficient condition is provided to distinguish two kinds of competitive learning: stable competitive learning, in which the weight vectors represent extremes of the input space and are fixed-point attractors, and unstable competitive learning. We apply our results to Linsker's model and, using extreme value theory from statistics, to the Hopfield model, obtaining some novel results on both.
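The saturation phenomenon the abstract describes can be illustrated with a minimal numerical sketch. The recurrent map below, $x \leftarrow \sigma(Wx + h)$ with $\sigma$ a piecewise-linear saturated sigmoid clipping to $[0, 1]$, is an assumed stand-in for the paper's dynamics (the weights, inputs, and exact update rule here are hypothetical, chosen only to show a trajectory settling with most coordinates pinned at the saturation levels):

```python
import numpy as np

# Piecewise-linear saturated sigmoid: clips its argument to [0, 1].
# This is one common choice of saturated input-output characteristic;
# the paper's analysis covers a class of such functions.
def sat_sigmoid(u):
    return np.clip(u, 0.0, 1.0)

rng = np.random.default_rng(0)
n = 5
W = rng.normal(0.0, 1.0, size=(n, n))   # hypothetical weight matrix
h = rng.normal(0.0, 0.5, size=n)        # hypothetical external input

x = rng.uniform(0.0, 1.0, size=n)       # random initial state in [0, 1]^n
for _ in range(200):                     # iterate x <- sigma(W x + h)
    x = sat_sigmoid(W @ x + h)

# Count coordinates pinned at the saturation levels 0 or 1; an
# all-saturated fixed point has every coordinate at 0 or 1, an
# all-but-one-saturated fixed point has exactly one interior coordinate.
saturated = int(np.sum((x == 0.0) | (x == 1.0)))
print("state after iteration:", x)
print(f"{saturated} of {n} coordinates saturated")
```

For a generic draw of parameters one typically observes convergence to a configuration with all, or all but one, coordinates saturated, consistent with the abstract's claim that only these two attractor types are observable under a mild condition.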
Pages: 189-213 (25 pages)