Hybrid Riemannian conjugate gradient methods with global convergence properties

Hiroyuki Sakai · Hideaki Iiduka
Department of Computer Science, Meiji University, 1-1-1 Higashimita, Tama-ku, Kawasaki-shi, Kanagawa 214-8571, Japan

Received: 4 February 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract
This paper presents Riemannian conjugate gradient methods and global convergence analyses under the strong Wolfe conditions. The main idea of the proposed methods is to combine the good global convergence properties of the Dai–Yuan method with the efficient numerical performance of the Hestenes–Stiefel method. One of the proposed algorithms is a generalization to Riemannian manifolds of the hybrid conjugate gradient method of Dai and Yuan in Euclidean space. Numerical comparisons show that the proposed methods perform well against existing methods on several Riemannian optimization problems. Python implementations of the methods used in the numerical experiments are available at https://github.com/iiduka-researches/202008-hybrid-rcg.

Keywords: Conjugate gradient method · Riemannian optimization · Hybrid conjugate gradient method · Global convergence · Strong Wolfe conditions
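For orientation, recall the Euclidean form of the two parameter choices being combined. Writing $g_k = \nabla f(x_k)$ for the gradient, $d_k$ for the search direction, and $y_k = g_{k+1} - g_k$, the Hestenes–Stiefel and Dai–Yuan parameters and the Dai–Yuan hybrid rule are

\[
\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
\beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^{2}}{d_k^{\top} y_k}, \qquad
\beta_k = \max\bigl\{0, \min\bigl\{\beta_k^{\mathrm{HS}}, \beta_k^{\mathrm{DY}}\bigr\}\bigr\}.
\]

This is only the Euclidean sketch; the paper's contribution is the Riemannian generalization, in which quantities at different iterates must be compared via vector transport.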

1 Introduction

This paper focuses on the conjugate gradient method. Nonlinear conjugate gradient methods in Euclidean space are an important class of methods for solving unconstrained optimization problems. In [10], Hestenes and Stiefel developed a conjugate gradient method for solving linear systems with a symmetric positive-definite coefficient matrix. In [7], Fletcher and Reeves extended the conjugate gradient method to unconstrained nonlinear optimization problems; theirs is the first nonlinear conjugate gradient method in Euclidean space. Al-Baali [3] showed that the Fletcher–Reeves method converges globally and generates descent directions
with an inexact line search when the step size satisfies the strong Wolfe conditions [22, 23]. Polak and Ribière [13] introduced a conjugate gradient method with good numerical performance. Dai and Yuan [4] introduced a conjugate gradient method with a better global convergence property than that of the Fletcher–Reeves method. The Hestenes–Stiefel and Polak–Ribière–Polyak methods do not always converge under the strong Wolfe conditions; for this reason, hybrid conjugate gradient methods have been presented in [5, 11, 19]. Touati-Ahmed and Storey [19] and Hu and Storey [11] proposed methods combining the Fletcher–Reeves and Polak–Ribière–Polyak methods. Moreover, Dai and Yuan [5] proposed a hybrid conjugate gradient method that combines the Dai–Yuan method and the Hestenes–Stiefel method. These nonlinear conjugate gradient methods in Euclidean space are summarized by Hager and Zhang [8]. The conjugate gradient method in Euclidean space can be extended to Riemannian manifolds. In [18], Smith introduced the notion of Ri
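Since the strong Wolfe conditions recur throughout this discussion, we recall their standard Euclidean form (a sketch; the Riemannian statement used in the paper replaces the line segment $x_k + \alpha d_k$ with a retraction curve): a step size $\alpha_k > 0$ satisfies the strong Wolfe conditions if

\[
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^{\top} d_k,
\qquad
\bigl|\nabla f(x_k + \alpha_k d_k)^{\top} d_k\bigr| \le c_2 \bigl|\nabla f(x_k)^{\top} d_k\bigr|,
\]

where $0 < c_1 < c_2 < 1$.

To illustrate how a hybrid parameter enters the iteration, the following minimal Python sketch runs the Dai–Yuan hybrid rule in the Euclidean setting. The function names are ours, and the backtracking step-size rule is a simplification: the convergence theory requires a strong Wolfe line search, and the paper's methods operate on Riemannian manifolds rather than on $\mathbb{R}^n$.

    import numpy as np

    def hybrid_cg(f, grad, x, max_iter=200, tol=1e-6):
        """Euclidean sketch of a conjugate gradient method using the
        Dai-Yuan hybrid parameter beta = max(0, min(beta_HS, beta_DY))."""
        g = grad(x)
        d = -g  # first search direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Backtracking (Armijo) step size; a placeholder for the
            # strong Wolfe line search required by the theory.
            alpha, c1 = 1.0, 1e-4
            while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
                alpha *= 0.5
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g                       # gradient difference
            denom = d.dot(y)
            if abs(denom) < 1e-12:              # guard against division by zero
                break
            beta_hs = g_new.dot(y) / denom      # Hestenes-Stiefel
            beta_dy = g_new.dot(g_new) / denom  # Dai-Yuan
            beta = max(0.0, min(beta_hs, beta_dy))
            d = -g_new + beta * d               # next conjugate direction
            x, g = x_new, g_new
        return x

    # Example: minimize the convex quadratic f(v) = v^T A v / 2 - b^T v.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    x_min = hybrid_cg(lambda v: 0.5 * v @ A @ v - b @ v,
                      lambda v: A @ v - b,
                      np.zeros(2))

On the Riemannian side this update cannot be written verbatim, since $g_{k+1}$ and $d_k$ live in different tangent spaces; this is precisely where retractions and vector transports enter.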