Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
Gonglin Yuan¹ · Xiaoliang Wang² · Zhou Sheng³

Received: 18 October 2018 / Accepted: 24 July 2019
© Springer Science+Business Media, LLC, part of Springer Nature 2019
Abstract
It is well known that conjugate gradient algorithms are widely applied in many practical fields, for instance, engineering problems and finance models, as they are straightforward and characterized by a simple structure and low storage requirements. However, challenging problems remain, such as the convergence of PRP algorithms for nonconvex functions under an inexact line search, obtaining sufficient descent for all conjugate gradient methods, and other theoretical properties regarding global convergence and the trust-region feature for nonconvex functions. This paper studies family conjugate gradient formulas based on the six classic formulas PRP, HS, CD, FR, LS, and DY, where the family conjugate gradient algorithms have better theoretical properties than the underlying formulas themselves. Furthermore, the presented technique can be extended to any two-term conjugate gradient formula. This paper designs family conjugate gradient algorithms for nonconvex functions, which have the following features without further conditions: (i) the sufficient descent property holds, (ii) the trust-region feature is true, and (iii) global convergence holds under standard assumptions. Numerical results show that the given conjugate gradient algorithms are competitive with standard methods.

Zhou Sheng: [email protected]
Gonglin Yuan: [email protected]
Xiaoliang Wang: [email protected]

1 College of Mathematics and Information Science, Guangxi University, Nanning, Guangxi 530004, China
2 School of Mathematical Sciences, Dalian University of Technology, Dalian, Liaoning 116024, China
3 Department of Mathematics, Nanjing University of Aeronautics and Astronautics, Nanjing, Jiangsu 211106, China
Numerical Algorithms
Keywords Nonconvex functions · Sufficient descent · Trust region · Inexact line search · Global convergence

Mathematics Subject Classification (2010) 90C26
1 Introduction

Consider the unconstrained optimization problem

    min { f(x) | x ∈ ℝⁿ },    (1.1)
where the function f : ℝⁿ → ℝ is twice continuously differentiable. The nonlinear conjugate gradient (CG) method for (1.1) is designed by the iteration

    x_{k+1} = x_k + α_k d_k,  k = 0, 1, 2, · · · ,

where x_k is the current iterate, α_k is the so-called stepsize, and d_k is the search direction of the CG method, defined by

    d_k = −g_k + β_{k−1} d_{k−1},  if k ≥ 1,
    d_k = −g_k,                    if k = 0,    (1.2)

where g_k = ∇f(x_k) is the gradient and β_k ∈ ℝ is a scalar. There exist six classic CG formulas:

    β_k^PRP = g_{k+1}^T (g_{k+1} − g_k) / ‖g_k‖²,            [22, 23]
    β_k^FR  = g_{k+1}^T g_{k+1} / ‖g_k‖²,                    [10]
    β_k^CD  = g_{k+1}^T g_{k+1} / (−d_k^T g_k),              [9]
    β_k^LS  = g_{k+1}^T (g_{k+1} − g_k) / (−d_k^T g_k),      [19]
    β_k^HS  = g_{k+1}^T (g_{k+1} − g_k) / ((g_{k+1} − g_k)^T d_k),  [1
    β_k^DY  = g_{k+1}^T g_{k+1} / ((g_{k+1} − g_k)^T d_k),
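To make the scheme concrete, the following is a minimal sketch (not the paper's proposed family algorithm) of the two-term CG iteration (1.2) with the PRP choice of β_k and a backtracking Armijo line search. The test function, starting point, and line-search parameters are illustrative assumptions.

```python
import numpy as np

def prp_beta(g_new, g_old):
    # beta_k^PRP = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def armijo(f, x, d, g, rho=0.5, c=1e-4):
    # Backtracking line search: accept a once f(x + a d) <= f(x) + c a g^T d
    a = 1.0
    while f(x + a * d) > f(x) + c * a * (g @ d):
        a *= rho
    return a

def cg_prp(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g @ d >= 0:              # safeguard restart if d_k is not a descent direction
            d = -g
        a = armijo(f, x, d, g)
        x = x + a * d               # x_{k+1} = x_k + alpha_k d_k
        g_new = grad(x)
        d = -g_new + prp_beta(g_new, g) * d   # d_{k+1} = -g_{k+1} + beta_k d_k
        g = g_new
    return x

# Example: a simple convex quadratic f(x) = 0.5 x^T A x - b^T x (assumed test data)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_prp(f, grad, np.zeros(2))
```

The descent safeguard reflects the difficulty the abstract highlights: without extra conditions, the plain PRP direction need not satisfy the sufficient descent property, which is precisely what the family formulas studied in this paper guarantee by construction.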