Keywords:
conjugate gradient method;
global convergence;
unconstrained optimization;
large-scale optimization;
DOI:
10.1137/0802003
Chinese Library Classification:
O29 [Applied Mathematics];
Discipline Code:
070104;
Abstract:
This paper explores the convergence of nonlinear conjugate gradient methods without restarts and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher-Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak-Ribière method. Numerical experiments are presented.
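The two update rules named in the abstract differ only in how the parameter β is computed from successive gradients. As a rough illustration (not the paper's exact algorithm or line-search conditions), the following sketch implements nonlinear conjugate gradient iterations with either the Fletcher-Reeves β or the nonnegatively truncated Polak-Ribière β ("PR+"), using a simple Armijo backtracking line search in place of the Wolfe conditions assumed by the convergence theory; all function names here are illustrative.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="PR+", tol=1e-8, max_iter=500):
    """Minimal nonlinear CG sketch (illustrative, not the paper's algorithm).

    beta_rule: "FR" (Fletcher-Reeves) or "PR+" (Polak-Ribiere with beta
    truncated at zero).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Safeguard: restart along steepest descent if d is not a descent direction.
        if g.dot(d) >= 0.0:
            d = -g
        # Armijo backtracking line search (a practical stand-in for the
        # Wolfe line search used in the convergence analysis).
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        if beta_rule == "FR":
            beta = g_new.dot(g_new) / g.dot(g)                 # Fletcher-Reeves
        else:
            beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)   # Polak-Ribiere, truncated
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For example, minimizing the quadratic f(x) = ½ xᵀ diag(1, 10) x from x₀ = (3, 4) with either rule drives the iterates to the minimizer at the origin.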