Stability analysis of discrete-time recurrent neural networks

Cited by: 93
Authors
Barabanov, NE [1 ]
Prokhorov, DV
Affiliations
[1] St Petersburg Electrotech Univ, Dept Software Engn, St Petersburg, Russia
[2] Ford Res Lab, Grp Artif Neural Syst & Fuzzy Logic, Dearborn, MI 48121 USA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2002, Vol. 13, No. 2
Keywords
bias weight; discrete-time recurrent neural network; exponential stability; linear matrix inequality (LMI); Lyapunov stability; recurrent multilayer perceptrons (RMLPs); recurrent neural network (RNN); sector monotone nonlinearity; state-space transformation;
DOI
10.1109/72.991416
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification
081104 [Pattern Recognition and Intelligent Systems]; 0812 [Computer Science and Technology]; 0835 [Software Engineering]; 1405 [Intelligent Science and Technology];
Abstract
We address the problem of global Lyapunov stability of discrete-time recurrent neural networks (RNNs) in the unforced (unperturbed) setting. It is assumed that the network weights are fixed to some values, for example, those attained after training. Based on classical results from the theory of absolute stability, we propose a new approach to the stability analysis of RNNs with sector-type monotone nonlinearities and nonzero biases. We devise a simple state-space transformation that converts the original RNN equations into a form suitable for our stability analysis (without compositions of nonlinearities). We then write appropriate linear matrix inequalities (LMIs) whose solution determines whether the system under study is globally exponentially stable. Unlike previous treatments, our approach readily accounts for the nonzero biases usually present in RNNs for improved approximation capabilities. We show how recent results of others on the stability analysis of RNNs can be interpreted as special cases within our approach, and we illustrate its use with examples. Though illustrated on the stability analysis of recurrent multilayer perceptrons (RMLPs), the proposed approach can also be applied to other forms of time-lagged RNNs.
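The abstract's core idea, certifying global exponential stability of a fixed-weight recurrent network via a Lyapunov condition expressible as an LMI, can be illustrated with a much simpler sufficient test than the paper's full sector-nonlinearity machinery. The sketch below is not the authors' method: it checks only the linear comparison system x(k+1) = W x(k) for a hypothetical weight matrix W, using the discrete-time Lyapunov equation WᵀPW − P = −Q (a special case of the LMI WᵀPW − P ≺ 0 with P ≻ 0). For slope-restricted nonlinearities such as tanh in the sector [0, 1], stability of this linear part is a common starting point; the paper's contribution is handling the nonlinearities and nonzero biases directly.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical recurrent weight matrix (illustrative, not from the paper).
W = np.array([[0.5, 0.2],
              [-0.3, 0.4]])

# Solve the discrete Lyapunov equation W' P W - P = -Q with Q = I.
# solve_discrete_lyapunov(a, q) solves a X a^H - X + q = 0, so passing
# a = W.T yields W' P W - P + Q = 0, i.e. the Lyapunov certificate we want.
Q = np.eye(2)
P = solve_discrete_lyapunov(W.T, Q)

# A valid certificate requires P positive definite; equivalently, the
# spectral radius of W must be strictly less than one.
spectral_radius = np.max(np.abs(np.linalg.eigvals(W)))
certified = bool(np.all(np.linalg.eigvalsh(P) > 0) and spectral_radius < 1)

print(f"spectral radius = {spectral_radius:.3f}, certified stable: {certified}")
```

In the paper's setting the scalar inequality above is replaced by an LMI feasibility problem (solvable with, e.g., an interior-point SDP solver) that additionally encodes the sector bounds on the nonlinearities and the effect of the bias weights.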
Pages: 292-303 (12 pages)