Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
Mohammed Yusuf Waziri¹ · Kabiru Ahmed Hungu¹ · Jamilu Sabi’u²

¹ Department of Mathematical Sciences, Faculty of Science, Bayero University, Kano, Nigeria
² Department of Mathematics, Faculty of Sciences, Northwest University, Kano, Nigeria

Corresponding author: Mohammed Yusuf Waziri ([email protected])

Received: 9 April 2019 / Accepted: 22 October 2019
© Springer Science+Business Media, LLC, part of Springer Nature 2020
Abstract  In this paper, we present a family of Perry conjugate gradient methods for solving large-scale systems of monotone nonlinear equations. The methods are developed by combining modified versions of the Perry (Oper. Res. Tech. Notes 26(6), 1073–1078, 1978) conjugate gradient method with the hyperplane projection technique of Solodov and Svaiter (1998). Global convergence of the methods is established, and preliminary numerical results show that the proposed methods are promising and more effective than some existing methods in the literature.

Keywords  Nonlinear equations · Eigenvalue analysis · Hyperplane projection · Monotonicity property · Global convergence

Mathematics Subject Classification (2010)  90C30 · 65K05 · 90C53 · 49M37 · 15A18
1 Introduction

Systems of nonlinear equations form a family of problems that are closely related to optimization problems, and they often arise in science, technology, and industry. In recent years, researchers have considered various examples in this area. A typical system of nonlinear equations is represented by the following:

$$F(x) = 0, \qquad \text{subject to } x \in \mathbb{R}^n, \tag{1}$$

where $F : \mathbb{R}^n \to \mathbb{R}^n$ is a continuously differentiable mapping on $\mathbb{R}^n$. Nonlinear monotone equations form a class of nonlinear equations in which $F(x)$ satisfies the monotonicity condition

$$(F(x) - F(y))^T (x - y) \ge 0, \qquad \forall x, y \in \mathbb{R}^n. \tag{2}$$
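To make condition (2) concrete, the short sketch below (not taken from the paper; the mapping and all names are illustrative) builds a simple monotone mapping and numerically spot-checks (2) on random pairs of points.

```python
import numpy as np

# Illustrative sketch (not from the paper): F(x) = A x + max(x, 0) with A
# symmetric positive semidefinite is a monotone mapping, since both terms
# satisfy condition (2).  All names here are chosen for illustration only.
def F(x, A):
    return A @ x + np.maximum(x, 0.0)

rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = B @ B.T  # symmetric positive semidefinite

# Spot-check (F(x) - F(y))^T (x - y) >= 0 on random pairs; this is a
# sanity check, not a proof of monotonicity.
for _ in range(1000):
    x, y = rng.standard_normal(n), rng.standard_normal(n)
    assert (F(x, A) - F(y, A)) @ (x - y) >= -1e-10
print("condition (2) holds on all sampled pairs")
```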
Systems of nonlinear monotone equations appear in various applications. For example, they arise as subproblems in generalized proximal algorithms with Bregman distances [48]. By means of fixed point mappings or normal mappings [70], some monotone variational inequality problems can also be converted into nonlinear monotone equations, and some ℓ1-norm regularized optimization problems arising in compressive sensing can be reformulated as monotone equations [58]. For more examples of such applications, we refer the reader to [43, 70]. Iterative methods for solving these problems include Newton and quasi-Newton schemes [10, 15, 29, 55], Gauss-Newton methods [17, 29], Levenberg-Marquardt methods [24, 28, 40], derivative-free methods [56], subspace methods [64], tensor methods [9], and trust-region methods [52, 65, 69].

Conjugate gradient (CG) methods form an important class of algorithms for solving large-scale unconstrained optimization problems. They are an attractive choice for mathematicians and engineers working on large-scale problems because of their low memory requirements.
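Although the Perry-type directions are developed later in the paper, the hyperplane projection framework of Solodov and Svaiter (1998) on which the proposed methods build can be sketched as follows. This is a minimal illustration assuming the standard derivative-free line search and projection step; the direction d = -F(x) is a placeholder rather than the paper's Perry-type direction, and the function name and parameters are illustrative.

```python
import numpy as np

def hyperplane_projection_solve(F, x0, sigma=1e-4, beta=0.5, tol=1e-6, max_iter=500):
    """Generic projection scheme in the spirit of Solodov and Svaiter (1998).

    The search direction d = -F(x) is a simple placeholder; the paper's
    Perry-type conjugate gradient directions would take its place.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx  # placeholder descent direction
        # Derivative-free line search: find t with
        # -F(x + t d)^T d >= sigma * t * ||d||^2.
        t = 1.0
        for _ in range(50):  # cap the number of backtracking steps
            if -F(x + t * d) @ d >= sigma * t * (d @ d):
                break
            t *= beta
        z = x + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z
        # Project x onto the hyperplane {u : F(z)^T (u - z) = 0}, which
        # separates x from the solution set when F is monotone.
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return x

# Example: solve F(x) = x + sin(x) = 0 (a monotone mapping with root x = 0).
sol = hyperplane_projection_solve(lambda x: x + np.sin(x), np.ones(4))
print(np.linalg.norm(sol))  # close to zero
```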