LOCATION AND STABILITY OF THE HIGH-GAIN EQUILIBRIA OF NONLINEAR NEURAL NETWORKS

Cited by: 53
Author
VIDYASAGAR, M
Affiliation
[1] Centre for Artificial Intelligence and Robotics, Raj Bhavan Circle, High Grounds, Bangalore, India
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1993, Vol. 4, No. 4
Keywords
DOI
10.1109/72.238320
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper analyzes the number, location, and stability behavior of the equilibria of arbitrary nonlinear neural networks without resorting to energy arguments based on assumptions of symmetric interactions or no self-interactions. The class of networks studied consists of very general continuous-time continuous-state (CTCS) networks that contain the standard Hopfield network as a special case. The emphasis is on the case where the slopes of the sigmoidal nonlinearities become larger and larger, i.e., the high-gain limit. The following results are proved: Let H = (0, 1)^n and H̄ = [0, 1]^n denote the open and closed n-dimensional hypercubes, respectively, on which the neural network evolves, and let I denote the (constant) vector of external inputs. Then, as the neural sigmoid characteristics become steeper and steeper, the following statements are true for all I except those belonging to a set of measure zero.
1) There are only finitely many equilibria in any compact subset of H. If there are no self-interactions, then these equilibria cannot be exponentially stable, and under mild conditions they are in fact unstable. If the network has symmetric (nonlinear) interactions, whether or not it has self-interactions, then the stable manifolds of all these equilibria have the same dimension, which can be computed explicitly. If the network also has no self-interactions, then all of these equilibria are unstable.
2) There are only finitely many equilibria in any face of H. If there are no self-interactions, then there are no equilibria in an edge of H. If the network has symmetric interactions, then the stable manifolds of equilibria in parallel faces of H have the same dimension, which can be computed explicitly. If the network also has no self-interactions, then all equilibria in the faces of H are unstable.
3) A systematic procedure is given for determining which corners of H contain equilibria, and it is shown that all equilibria in the corners of H are asymptotically stable.
One corollary of these results is that the standard Hopfield network can have asymptotically stable equilibria only in the corners of H, and trajectories starting at almost all initial conditions approach the corners of H. It is important to note that the proofs here are not based on energy arguments. As a result, the results are "hardy" in the sense that they continue to hold even if the network dynamics are slightly perturbed.
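The corner result and its corollary can be illustrated with a small numerical sketch. This is not the paper's exact procedure; it assumes a standard Hopfield network du_i/dt = -u_i + Σ_j w_ij s(u_j) + I_i with a steep sigmoid s(u) = 1/(1 + exp(-u/ε)), and tests the natural high-gain sign condition at each corner c ∈ {0, 1}^n: the net input (Wc + I)_i should be positive where c_i = 1 and negative where c_i = 0. The weight matrix W, input I, and function names below are illustrative choices, not from the paper.

```python
import numpy as np

def corner_equilibria(W, I):
    """Enumerate corners of [0,1]^n passing the high-gain sign test:
    (W c + I)_i > 0 wherever c_i = 1, and < 0 wherever c_i = 0."""
    n = len(I)
    corners = []
    for k in range(2 ** n):
        c = np.array([(k >> i) & 1 for i in range(n)], dtype=float)
        net = W @ c + I
        if all((net[i] > 0) == (c[i] == 1) for i in range(n)):
            corners.append(c)
    return corners

def simulate(W, I, u0, eps=0.01, dt=0.01, steps=5000):
    """Forward-Euler integration of du/dt = -u + W s(u) + I
    with the steep sigmoid s(u) = 1/(1 + exp(-u/eps))."""
    u = u0.astype(float).copy()
    for _ in range(steps):
        s = 1.0 / (1.0 + np.exp(-u / eps))
        u += dt * (-u + W @ s + I)
    return 1.0 / (1.0 + np.exp(-u / eps))  # output state in (0,1)^n

# Symmetric interactions with no self-interactions (zero diagonal),
# matching the hypotheses of the paper's sharpest results.
W = np.array([[0.0, 2.0], [2.0, 0.0]])
I = np.array([-1.0, -1.0])

stable_corners = corner_equilibria(W, I)
x = simulate(W, I, np.array([0.9, 0.8]))
```

For this W and I the sign test selects the corners (0, 0) and (1, 1), and a trajectory started in the interior settles near one of them, consistent with the corollary that almost all trajectories of the standard Hopfield network approach corners of H.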
Pages: 660-672
Page count: 13