Versions of the Subgradient Extragradient Method for Pseudomonotone Variational Inequalities



Phan Quoc Khanh1 · Duong Viet Thong2 · Nguyen The Vinh3

Received: 7 June 2019 / Accepted: 7 May 2020 © Springer Nature B.V. 2020

Abstract We develop versions of the subgradient extragradient method for variational inequalities in Hilbert spaces and establish sufficient conditions for their convergence. First, we prove a sufficient condition for weak convergence of a recent algorithm under relaxed assumptions. Then, we propose two further algorithms. Both weak and strong convergence of the considered algorithms are studied. Under additional strong pseudomonotonicity and Lipschitz continuity assumptions, we also obtain a Q-linear convergence rate for these algorithms. Our results improve some recent contributions in the literature. Illustrative numerical experiments are provided at the end of the paper.

Keywords Extragradient method · Subgradient extragradient method · Variational inequality · Pseudomonotonicity · Weak and strong convergence · Q-linear convergence rate

Mathematics Subject Classification 47H09 · 47H10 · 47J20 · 47J25

Corresponding author: D.V. Thong, [email protected] · P.Q. Khanh, [email protected] · N.T. Vinh, [email protected]

1 Department of Mathematics, International University, Vietnam National University-Hochiminh City, Linh Trung, Thu Duc, Hochiminh City, Vietnam

2 Applied Analysis Research Group, Faculty of Mathematics and Statistics, Ton Duc Thang University, Ho Chi Minh City, Vietnam

3 Department of Mathematics, University of Transport and Communications, Hanoi City, Vietnam

1 Introduction

Variational inequalities (VIs), introduced by Stampacchia [27] (see details and developments in the books [9, 18, 19]), have proved to be a simple, natural, and unified framework encompassing a broad spectrum of problems in applied mathematics, especially optimization-related problems such as constrained minimization, complementarity problems, fixed-point problems, and traffic networks. These mathematical models in turn have various practical applications in physics, engineering, the social sciences, and digital processing in many fields (e.g., image restoration, data classification, neural networks). In particular, numerical solutions are increasingly needed nowadays, so algorithms are becoming ever more important in mathematical applications. For VIs, the extragradient algorithm introduced by Korpelevich [20] and Antipin [1] is one of the most important and popular methods. This algorithm requires two projections onto the feasible set per iteration. Computing such a projection amounts to minimizing the distance function to the feasible set, which is easy only for very simple sets; the computation is therefore expensive when the feasible set has a complicated structure. Hence, many efforts have been made to reduce the overall number of projections or to use projections onto simpler sets such as closed half-spaces. The subgradient extragradient method proposed in [3–5] is an im