Scaled three-term derivative-free methods for solving large-scale nonlinear monotone equations
Qun Li1 · Bing Zheng1

Received: 11 January 2020 / Accepted: 19 August 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020
Abstract

In this paper, two effective derivative-free methods are proposed for solving large-scale nonlinear monotone equations, in which the search directions are sufficiently descent and independent of the line search. The methods extend the conjugate gradient methods proposed by Bojari and Eslahchi (Numer. Algorithms 83, pp. 901–933, 2020) by combining them with the hyperplane projection technique. Our approaches require low storage and are derivative-free, which makes them suitable for large-scale nonsmooth monotone nonlinear equations. Under proper assumptions, we analyze the global convergence of the proposed methods. Finally, numerical experiments show that the proposed methods outperform some existing ones.

Keywords Nonlinear monotone equations · Derivative-free method · Projection method · Global convergence

Mathematics Subject Classification (2010) 65K05 · 90C06 · 90C52 · 90C56
Bing Zheng
[email protected]

1 School of Mathematics and Statistics, Lanzhou University, Lanzhou, 730000, People’s Republic of China

Numerical Algorithms

1 Introduction

Consider the following nonlinear equations:

F(x) = 0,    (1)

where F : Rn → Rn is continuous and monotone, i.e.,

⟨F(x) − F(y), x − y⟩ ≥ 0, for all x, y ∈ Rn.
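As a concrete illustration (not taken from the paper), the monotonicity condition above can be checked numerically for a simple affine map F(x) = Ax + b with A positive semidefinite, since then ⟨F(x) − F(y), x − y⟩ = (x − y)ᵀA(x − y) ≥ 0. The particular map and sampling below are our own illustrative choices:

```python
import numpy as np

# Illustrative monotone map: F(x) = A x + b with A positive semidefinite.
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M.T @ M                      # positive semidefinite by construction
b = rng.standard_normal(n)

def F(x):
    return A @ x + b

# Verify the monotonicity inequality on random pairs of points.
for _ in range(100):
    x, y = rng.standard_normal(n), rng.standard_normal(n)
    assert np.dot(F(x) - F(y), x - y) >= 0.0
```

Note that F here is not the gradient of any particular objective; monotonicity of the map itself is all that the methods discussed below require.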
The solution set of the problem (1) is convex under the monotonicity condition. Systems of monotone equations have many practical applications. For example, they appear as subproblems in the generalized proximal algorithms with Bregman distances [15]. Some monotone variational inequality problems and nonlinear complementarity problems can also be converted into nonlinear monotone equations [29]. Moreover, l1-norm regularized optimization problems in compressive sensing can be reformulated as monotone nonlinear equations [25].

Many different iterative methods have been designed for solving systems of nonlinear equations, such as the Newton method, the quasi-Newton method, the Gauss-Newton method, the Levenberg-Marquardt method, and their variants; see [7, 12, 16, 19, 30]. These methods enjoy fast local convergence given a good starting point. However, they are not suitable for large-scale nonlinear equations, because the Jacobian matrix of the equations, or an approximation to it, must be computed at each iteration, which requires a large amount of computation.

In recent years, gradient-type methods have been proposed for solving large-scale systems of nonlinear equations (see [9, 10, 20, 26]); they have attracted researchers' attention due to their simplicity, global convergence, and low memory requirements. Zhang and Zhou [27] combined the spectral gradient method [5] with the projection technique [24] to solve nonlinear monotone equations. Hu and Wei [14] presented the Wei-Yao-Liu conjugate gradient projection algorithm for nonlinear monotone equations.
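The hyperplane projection technique of [24] referenced above can be sketched as follows. This is a minimal, illustrative implementation of the generic projection framework: a trial point z_k = x_k + t_k d_k is found by a derivative-free line search, and the next iterate is the projection of x_k onto the hyperplane {u : ⟨F(z_k), u − z_k⟩ = 0}, which separates x_k from the solution set. The simple residual direction d_k = −F(x_k) stands in for the scaled three-term directions developed in the paper; the function name, parameter values, and test problem are our own choices, not the authors'.

```python
import numpy as np

def projection_method(F, x0, tol=1e-8, max_iter=1000, beta=0.5, sigma=1e-4):
    """Generic hyperplane-projection sketch for monotone F, with the
    simple derivative-free direction d_k = -F(x_k) (illustrative only)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx                              # stand-in search direction
        # Derivative-free backtracking: find t with
        #   -<F(x + t d), d> >= sigma * t * ||d||^2
        t = 1.0
        while True:
            z = x + t * d
            if -np.dot(F(z), d) >= sigma * t * np.dot(d, d):
                break
            t *= beta
            if t < 1e-12:                    # safeguard against stalling
                break
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            x = z
            break
        # Project x onto the separating hyperplane {u : <F(z), u - z> = 0}.
        x = x - (np.dot(Fz, x - z) / np.dot(Fz, Fz)) * Fz
    return x

# Usage: F(x) = x + sin(x) (componentwise) is monotone with root x = 0.
root = projection_method(lambda x: x + np.sin(x), np.array([2.0, -1.5]))
```

Because each iterate is the projection of the previous one onto a hyperplane separating it from the solution set, the distance from the iterates to any solution is nonincreasing; this Fejér-monotonicity is the key ingredient in the global convergence analyses of the methods surveyed here.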