Efficient neural networks for solving variational inequalities

Cited by: 10
Authors
Jiang, Suoliang [2 ]
Han, Deren [1 ]
Yuan, Xiaoming [3 ,4 ]
Affiliations
[1] Nanjing Normal Univ, Sch Math Sci, Key Lab NSLSCS Jiangsu Prov, Nanjing 210046, Jiangsu, Peoples R China
[2] Nanjing Univ, Sch Comp Sci, Nanjing 210097, Jiangsu, Peoples R China
[3] Hong Kong Baptist Univ, Dept Math, Kowloon, Hong Kong, Peoples R China
[4] Hong Kong Math Soc, Hong Kong, Hong Kong, Peoples R China
Keywords
Projection neural networks; Variational inequalities; Exponential stability; Pseudo-monotone mapping; Co-coercive mappings; POLYAK PROJECTION METHOD; CONVERGENT NEWTON METHOD; OPTIMIZATION PROBLEMS; MINIMAX PROBLEMS; SYSTEMS; CONSTRAINTS; STABILITY;
DOI
10.1016/j.neucom.2012.01.020
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In this paper, we propose efficient neural network models for solving a class of variational inequality problems. Our first model can be viewed as a generalization of the basic projection neural network proposed by Friesz et al. [3]. Like the basic projection neural network, it needs only function evaluations and projections onto the constraint set, which makes the model very easy to implement, especially when the constraint set has special structure such as a box or a ball. Under the condition that the underlying mapping F is pseudo-monotone with respect to a solution, a condition much weaker than those required by the basic projection neural network, we prove the global convergence of the proposed neural network. If F is strongly pseudo-monotone, we prove its global exponential stability. Then, to improve the efficiency of the neural network, we modify it by choosing a new direction that is bounded away from zero. Under the condition that the underlying mapping F is co-coercive, a condition slightly stronger than pseudo-monotonicity but still weaker than those required by the basic projection neural network, we prove the exponential stability and global convergence of the improved model. We also report computational results illustrating that the new method is more efficient than that of Friesz et al. [3]. (C) 2012 Elsevier B.V. All rights reserved.
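The basic projection neural network referred to above is the projection dynamical system dx/dt = P_Omega(x - alpha*F(x)) - x, whose equilibria coincide with the solutions of the variational inequality. A minimal simulation sketch of that dynamics under forward Euler integration (the affine mapping F, the box constraint, and all step-size values below are illustrative assumptions, not data from the paper):

```python
import numpy as np

# Illustrative monotone affine mapping F(x) = M x + q (assumed example,
# not from the paper); the VI is posed over the box Omega = [0, 1]^2.
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -1.0])
F = lambda x: M @ x + q

# Projection onto the box [0, 1]^2 is closed-form (componentwise clipping),
# which is what makes projection neural networks cheap for box/ball sets.
proj = lambda x: np.clip(x, 0.0, 1.0)

# Forward-Euler integration of dx/dt = P_Omega(x - alpha*F(x)) - x.
alpha, h = 0.2, 0.1          # illustrative step sizes
x = np.array([1.0, 0.0])     # arbitrary initial state
for _ in range(5000):
    x = x + h * (proj(x - alpha * F(x)) - x)

# A point solves the VI iff it is a fixed point of the projection map.
residual = np.linalg.norm(proj(x - alpha * F(x)) - x)
print(x, residual)           # x approaches [1/3, 1/3], where F vanishes
```

Each step costs one evaluation of F and one projection, matching the abstract's point that only function evaluations and projections are needed when Omega has simple structure.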
Pages: 97-106 (10 pages)
References (36 total)
[1] [Anonymous], 2003, SPRINGER SERIES OPER, DOI 10.1007/978-0-387-21815-16
[2] Bertsekas D.P., 1989, PARALLEL DISTRIBUTED
[3] Facchinei F., 2003, FINITE DIMENSIONAL V, V2, DOI 10.1007/B97543
[4] Friesz TL, Bernstein D, Mehta NJ, Tobin RL, Ganjalizadeh S. Day-to-day dynamic network disequilibria and idealized traveler information systems. Operations Research, 1994, 42(6): 1120-1136.
[5] Gao XB, Liao LZ, Qi LQ. A novel neural network for variational inequalities with linear and nonlinear constraints. IEEE Transactions on Neural Networks, 2005, 16(6): 1305-1317.
[6] Gao XB. A novel neural network for nonlinear convex programming. IEEE Transactions on Neural Networks, 2004, 15(3): 613-621.
[7] Gao XB. Exponential stability of globally projected dynamic systems. IEEE Transactions on Neural Networks, 2003, 14(2): 426-431.
[8] Gao XB, Liao LZ. A novel neural network for a class of convex quadratic minimax problems. Neural Computation, 2006, 18(8): 1818-1846.
[9] Goldstein AA. Convex programming in Hilbert space. Bulletin of the American Mathematical Society, 1964, 70(5): 709.
[10] Han DR, 2004, COMPUT MATH APPL, V47, P1817, DOI 10.1016/j.camwa.2003.12.002