Linearized symmetric multi-block ADMM with indefinite proximal regularization and optimal proximal parameter
Xiaokai Chang · Jianchao Bai · Dunjiang Song · Sanyang Liu

Received: 8 November 2019 / Revised: 2 October 2020 / Accepted: 10 October 2020
© Istituto di Informatica e Telematica (IIT) 2020
Abstract
The proximal term plays a significant role in the literature on the proximal Alternating Direction Method of Multipliers (ADMM), since (positive-definite or indefinite) proximal terms can promote convergence of ADMM and simplify the involved subproblems. However, an overlarge proximal parameter slows convergence numerically, even though convergence can still be established with it. In this paper, we therefore focus on a Linearized Symmetric ADMM (LSADMM) with proper proximal terms for solving a family of multi-block separable convex minimization models, and we determine the optimal (smallest) value of the proximal parameter for which convergence of the LSADMM can still be ensured. Our LSADMM partitions the variables into two groups and updates the Lagrange multiplier twice, in different forms and with suitable step sizes. The region of the proximal parameter involved in the second group of subproblems is partitioned into the union of three different sets. We establish the global convergence and the sublinear ergodic convergence rate of LSADMM in two of these cases, and we give a counter-example showing that convergence of LSADMM cannot be guaranteed in the remaining case. Theoretically, we thus obtain the optimal lower bound of the proximal parameter. Numerical experiments on the so-called latent variable Gaussian graphical model selection problems demonstrate the performance of the proposed algorithm and the significant advantage of using the optimal lower bound of the proximal parameter.

Keywords Alternating direction method of multipliers · Convex programming · Prediction–correction · Linearized technique · Indefinite proximal term · Global convergence · Complexity

Mathematics Subject Classification 65K10 · 90C25 · 90C30
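For orientation, the following schematic records the generic symmetric proximal ADMM iteration that the abstract alludes to, written for the two-block case ($p = q = 1$) with penalty $\beta > 0$, dual step sizes $r, s$, and proximal matrices $P, Q$. It is the standard template from the proximal ADMM literature, not the paper's exact scheme; the grouping of variables, the admissible step-size region, and the optimal proximal parameter are developed in the body of the paper:
\[
\begin{aligned}
\mathcal{L}_\beta(x, y, \lambda) &= f(x) + g(y) - \lambda^{\top}(Ax + By - c) + \tfrac{\beta}{2}\|Ax + By - c\|^2, \\
x^{k+1} &= \operatorname*{arg\,min}_{x \in \mathcal{X}} \; \mathcal{L}_\beta(x, y^k, \lambda^k) + \tfrac{1}{2}\|x - x^k\|_P^2, \\
\lambda^{k+\frac{1}{2}} &= \lambda^k - r\beta\,(Ax^{k+1} + By^k - c), \\
y^{k+1} &= \operatorname*{arg\,min}_{y \in \mathcal{Y}} \; \mathcal{L}_\beta(x^{k+1}, y, \lambda^{k+\frac{1}{2}}) + \tfrac{1}{2}\|y - y^k\|_Q^2, \\
\lambda^{k+1} &= \lambda^{k+\frac{1}{2}} - s\beta\,(Ax^{k+1} + By^{k+1} - c).
\end{aligned}
\]
An indefinite proximal term corresponds to allowing the proximal matrix attached to the second-group subproblem to have negative eigenvalues, e.g. $Q = \tau I - \beta B^{\top}B$ with $\tau$ below $\beta\|B\|^2$; the smallest such $\tau$ for which convergence survives is the optimal proximal parameter studied here.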
Xiaokai Chang [email protected]
Jianchao Bai [email protected]

Extended author information available on the last page of the article
1 Introduction

In this paper, we consider the following multi-block separable convex minimization model:
\[
\begin{array}{rl}
\min & \displaystyle \sum_{i=1}^{p} f_i(x_i) + \sum_{j=1}^{q} g_j(y_j) \\[6pt]
\text{s.t.} & \displaystyle \sum_{i=1}^{p} A_i x_i + \sum_{j=1}^{q} B_j y_j = c, \\[6pt]
 & x_i \in \mathcal{X}_i, \quad i = 1, \ldots, p, \qquad y_j \in \mathcal{Y}_j, \quad j = 1, \ldots, q,
\end{array}
\tag{1}
\]
where $f_i(x_i) : \mathcal{X}_i \to \mathbb{R}$ and $g_j(y_j) : \mathcal{Y}_j \to \mathbb{R}$ are closed (that is, lower semicontinuous) and proper convex functions (possibly nonsmooth); $A_i : \mathcal{X}_i \to \mathbb{R}^m$ and $B_j : \mathcal{Y}_j \to \mathbb{R}^m$ are linear operators; $c \in \mathbb{R}^m$ is a given vector; $\mathcal{X}_i$ and $\mathcal{Y}_j$ are closed convex sets; and $p, q \ge 1$ are positive integers. A wide range of problems in image and signal processing, statistics, and machine learning can be formulated as the multi-block model (1). Here we give some examples, beginning with the Latent Variable Gaussian Graphical Model Selection (LVGGMS) problem in statistical learning; a sketch of the linearized mechanics underlying such schemes follows.
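To make the linearized, symmetric, dual-updated-twice mechanics concrete before the examples, here is a minimal runnable sketch for a toy two-block instance of (1) ($p = q = 1$, $f = \|\cdot\|_1$, $g$ quadratic, $B = I$). The parameter choices ($\beta$, $\tau$, $r$, $s$) and the problem data are illustrative assumptions, not the paper's optimal values; in particular, $\tau$ is set below the classical bound $\beta\|A\|_2^2$, in the spirit of indefinite proximal regularization.

import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Toy instance of (1) with p = q = 1:
#   min ||x||_1 + (rho/2) * ||y - d||^2   s.t.   A x + y = c
rng = np.random.default_rng(0)
m, n = 30, 60
A = rng.standard_normal((m, n))
c = rng.standard_normal(m)
d = rng.standard_normal(m)
rho = 1.0

beta = 1.0                                   # penalty parameter (illustrative)
tau = 0.8 * beta * np.linalg.norm(A, 2)**2   # proximal parameter: below the
                                             # classical bound beta*||A||_2^2,
                                             # i.e. the indefinite regime
r, s = 0.5, 0.5                              # dual step sizes (illustrative)

x, y, lam = np.zeros(n), np.zeros(m), np.zeros(m)
for k in range(500):
    # Linearized x-step: one proximal step on the augmented Lagrangian;
    # equivalent to adding the (possibly indefinite) proximal term
    # (tau/2)||x - x^k||^2 - (beta/2)||A(x - x^k)||^2 to the subproblem.
    grad = A.T @ (beta * (A @ x + y - c) - lam)
    x = soft_threshold(x - grad / tau, 1.0 / tau)
    # First dual update (step size r).
    lam = lam - r * beta * (A @ x + y - c)
    # Exact y-step: closed form since y's coefficient operator is the identity.
    y = (rho * d + lam + beta * (c - A @ x)) / (rho + beta)
    # Second dual update (step size s).
    lam = lam - s * beta * (A @ x + y - c)

print("primal residual:", np.linalg.norm(A @ x + y - c))

Taking $\tau$ smaller, as this paper's optimal lower bound permits, makes each linearized step more aggressive and typically faster in practice, which is the numerical advantage the abstract refers to.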