A globally and superlinearly convergent algorithm for nonsmooth convex minimization

Cited by: 72
Authors
Fukushima, M. [1]
Qi, L.Q. [1]
Affiliation
[1] University of New South Wales, School of Mathematics, Sydney, NSW 2052, Australia
Keywords
nonsmooth convex optimization; Moreau-Yosida regularization; global convergence; superlinear convergence; semismoothness;
DOI
10.1137/S1052623494278839
Chinese Library Classification (CLC)
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
It is well known that a possibly nondifferentiable convex minimization problem can be transformed into a differentiable convex minimization problem by way of the Moreau-Yosida regularization. This paper presents a globally convergent algorithm that is designed to solve the latter problem. Under additional semismoothness and regularity assumptions, the proposed algorithm is shown to have a Q-superlinear rate of convergence.
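For reference, a minimal sketch of the standard Moreau-Yosida construction the abstract refers to; the parameter lambda and the symbols F_lambda, p_lambda are our notation and not necessarily the paper's. For a proper, lower semicontinuous convex function f on R^n and lambda > 0,

\[
F_\lambda(x) \;=\; \min_{y \in \mathbb{R}^n} \Bigl\{ f(y) + \tfrac{1}{2\lambda}\,\lVert y - x \rVert^2 \Bigr\},
\qquad
p_\lambda(x) \;=\; \operatorname*{arg\,min}_{y \in \mathbb{R}^n} \Bigl\{ f(y) + \tfrac{1}{2\lambda}\,\lVert y - x \rVert^2 \Bigr\}.
\]

The regularization F_\lambda is convex and continuously differentiable with \nabla F_\lambda(x) = (x - p_\lambda(x))/\lambda, and x minimizes f if and only if it minimizes F_\lambda; this equivalence is what allows the nonsmooth problem to be treated through the differentiable one.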
Pages: 1106-1120
Page count: 15