A double parameter self-scaling memoryless BFGS method for unconstrained optimization
Neculai Andrei1
Received: 18 November 2018 / Revised: 2 December 2019 / Accepted: 19 February 2020
© SBMAC - Sociedade Brasileira de Matemática Aplicada e Computacional 2020
Abstract
A double parameter self-scaling memoryless BFGS method for unconstrained optimization is presented. In this method, the first two terms of the self-scaling memoryless BFGS matrix are scaled with a positive parameter, while the third one is scaled with another positive parameter. The first parameter, which scales the first two terms, is determined so as to cluster the eigenvalues of the memoryless BFGS matrix. The second parameter, which scales the third term, is computed as a preconditioner to the Hessian of the minimizing function, combined with the minimization of the conjugacy condition, in order to shift the large eigenvalues of the self-scaling memoryless BFGS matrix to the left. The stepsize is determined by the Wolfe line search conditions. The global convergence of this method is proved under the assumption that the minimizing function is uniformly convex. Preliminary computational experiments on a set of 80 unconstrained optimization test functions show that this algorithm is more efficient and more robust than the self-scaling BFGS updates of Oren and Luenberger and of Oren and Spedicato. With respect to the CPU time metric, CG-DESCENT is the top performer. Comparisons with L-BFGS show that our algorithm is more efficient.

Keywords Unconstrained optimization · Self-scaling memoryless BFGS method · Global convergence · Numerical comparisons

Mathematics Subject Classification 49M07 · 49M10 · 65K05 · 90C30
1 Introduction

For solving the unconstrained optimization problem
$$\min_{x \in \mathbb{R}^n} f(x), \qquad (1)$$
Communicated by Orizon Pereira Ferreira.
Neculai Andrei [email protected]
1 Center for Advanced Modeling and Optimization, Academy of Romanian Scientists, Splaiul Independenței nr. 54, Sector 5, Bucharest, Romania
where $f : \mathbb{R}^n \to \mathbb{R}$ is a continuously differentiable function, the self-scaling memoryless BFGS method is one of the most efficient and robust methods (see Perry 1977; Shanno 1978). This method generates a sequence $\{x_k\}$ as
$$x_{k+1} = x_k + \alpha_k d_k, \qquad (2)$$
$k = 0, 1, \ldots$, where $d_k \in \mathbb{R}^n$ is the search direction, computed as the solution of a linear algebraic system involving the self-scaling memoryless BFGS approximation of the Hessian of the minimizing function. Given the initial matrix $B_0 = \delta_0 I$, where $\delta_0$ is a positive scalar, the self-scaling memoryless BFGS method is defined by the updating formula
$$B_{k+1} = \delta_k I - \delta_k \frac{s_k s_k^T}{\|s_k\|^2} + \frac{y_k y_k^T}{y_k^T s_k}, \qquad (3)$$
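As a quick sanity check on the structure of update (3), the following minimal NumPy sketch forms $B_{k+1}$ from made-up data and verifies that it satisfies the secant equation $B_{k+1} s_k = y_k$. The names `s`, `y`, and `delta` are illustrative stand-ins for $s_k$, $y_k$, and $\delta_k$, with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$ in the usual quasi-Newton notation (an assumption here, not a definition taken from this paper):

```python
import numpy as np

# Illustrative data only: s stands for s_k and y for y_k
# (standard quasi-Newton notation); delta is the scaling parameter delta_k.
rng = np.random.default_rng(0)
n = 5
s = rng.standard_normal(n)
y = s + 0.3 * rng.standard_normal(n)   # close to s, so that y^T s > 0
assert y @ s > 0                       # curvature condition
delta = 2.0                            # some positive scaling parameter

# Update (3): B = delta*I - delta * s s^T / ||s||^2 + y y^T / (y^T s)
B = delta * np.eye(n) \
    - delta * np.outer(s, s) / (s @ s) \
    + np.outer(y, y) / (y @ s)

# The update satisfies the secant equation B s = y:
print(np.allclose(B @ s, y))   # True
```

Note that the check succeeds for any positive $\delta_k$: applied to $s_k$, the two $\delta_k$-dependent terms of (3) cancel exactly, leaving $y_k$, so the secant property is independent of the scaling.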
$k = 0, 1, \ldots$, where $\delta_k$ is the scaling parameter. The search direction corresponding to this method is computed as
$$d_{k+1} = -B_{k+1}^{-1} g_{k+1}, \qquad (4)$$
where $B_{k+1}^{-1}$ is the inverse of the self-scaling memoryless BFGS approximation given by (3), which is computed as
$$B_{k+1}^{-1} = \frac{1}{\delta_k} I - \frac{1}{\delta_k}\,\frac{s_k y_k^T + y_k s_k^T}{y_k^T s_k} + \left(1 + \frac{1}{\delta_k}\,\frac{\|y_k\|^2}{y_k^T s_k}\right)\frac{s_k s_k^T}{y_k^T s_k}. \qquad (5)$$
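The practical appeal of (5) is that the search direction (4) can be evaluated with a handful of inner products and vector updates, in $O(n)$ time and storage, without forming or factoring any $n \times n$ matrix. The sketch below (made-up data again; variable names are illustrative, not from the paper) computes $d_{k+1} = -B_{k+1}^{-1} g_{k+1}$ via (5) and cross-checks it against the explicit matrix (3):

```python
import numpy as np

# Illustrative data only; names do not come from the paper.
rng = np.random.default_rng(1)
n = 6
s = rng.standard_normal(n)              # s_k
y = s + 0.2 * rng.standard_normal(n)    # y_k, chosen so that y^T s > 0
assert y @ s > 0
delta = 1.5                             # delta_k
g = rng.standard_normal(n)              # gradient g_{k+1}

ys, sg, yg = y @ s, s @ g, y @ g
# d_{k+1} = -B_{k+1}^{-1} g_{k+1} via (5): only inner products and
# scaled vector sums, so no n x n matrix is ever stored.
d = -(g / delta
      - (s * yg + y * sg) / (delta * ys)
      + (1.0 + (y @ y) / (delta * ys)) * (sg / ys) * s)

# Cross-check: the explicit matrix from (3) maps d back to -g,
# confirming that (5) really is the inverse of (3).
B = delta * np.eye(n) \
    - delta * np.outer(s, s) / (s @ s) \
    + np.outer(y, y) / ys
print(np.allclose(B @ d, -g))   # True
```

This matrix-free evaluation is exactly what makes the memoryless approach attractive for large-scale problems: each iteration costs a few vector operations rather than a dense solve.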
From (4), using (5) the search dire