Communications in Mathematical Sciences
Volume 21 (2023)
Number 1
Globally convergent Dai–Liao conjugate gradient method using quasi-Newton update for unconstrained optimization
Pages: 281 – 297
DOI: https://dx.doi.org/10.4310/CMS.2023.v21.n1.a13
Abstract
Using a quasi-Newton update and an acceleration scheme, a new Dai–Liao conjugate gradient method is developed for unconstrained optimization that requires neither computing nor storing any approximate Hessian matrix of the objective function. It is shown that the search direction derived from a modified Perry matrix not only satisfies the sufficient descent condition but also fulfills the Dai–Liao conjugacy condition at each iteration. Under certain assumptions, we establish the global convergence of the proposed method for uniformly convex functions and for general functions, respectively. The numerical results illustrate that the presented method effectively improves numerical performance and successfully solves test problems with dimensions up to $100000$.
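To make the family of methods concrete, the sketch below implements a generic Dai–Liao conjugate gradient iteration with the standard update $\beta_k = (g_{k+1}^{\top} y_k - t\, g_{k+1}^{\top} s_k)/(d_k^{\top} y_k)$, where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. This is not the paper's specific variant (the modified Perry matrix and the acceleration scheme are omitted); the function name `dai_liao_cg`, the parameter $t = 0.1$, and the Armijo backtracking line search are illustrative assumptions.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Generic Dai-Liao CG sketch (illustrative; not the paper's exact method).

    beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                  # safeguard: restart if not a descent direction
            d = -g
        # Backtracking (Armijo) line search -- an illustrative choice
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        # Safeguard: fall back to steepest descent if the denominator degenerates
        beta = 0.0 if abs(denom) < 1e-12 else (g_new @ y - t * (g_new @ s)) / denom
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize a small convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = dai_liao_cg(f, grad, np.zeros(2))
```

Note that, as in the abstract's claim, no Hessian (or approximation of it) is ever formed or stored: only gradients and the vectors $s_k$, $y_k$ enter the update.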
Keywords
conjugate gradient, unconstrained optimization, sufficient descent, conjugacy condition, global convergence
2010 Mathematics Subject Classification
65K05, 90C06, 90C30
Received 7 April 2021
Received revised 16 March 2022
Accepted 29 May 2022
Published 27 December 2022