The CoMirror algorithm with random constraint sampling for convex semi-infinite programming



Wei Bo¹ · William B. Haskell² · Zhao Sixiang³

© Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract

The CoMirror algorithm, by Beck et al. (Oper Res Lett 38(6):493–498, 2010), is designed to solve convex optimization problems with one functional constraint. At each iteration, it performs a mirror-descent update using either the subgradient of the objective function or the subgradient of the constraint function, depending on whether or not the constraint violation is below some tolerance. In this paper, we combine the CoMirror algorithm with inexact cut generation to create the SIP-CoM algorithm for solving semi-infinite programming (SIP) problems. First, we provide general error bounds for SIP-CoM. Then, we propose two specific random constraint sampling schemes to approximately solve the cut generation problem for generic SIP. When the objective and constraint functions are generally convex, randomized SIP-CoM achieves an O(1/√N) convergence rate in expectation (in terms of the optimality gap and SIP constraint violation). When the objective and constraint functions are all strongly convex, this rate can be improved to O(1/N).

Keywords Semi-infinite programming · Random constraint sampling · Cooperative stochastic approximation

1 Introduction

Mirror-descent can solve minimization problems with simple feasible sets. The CoMirror algorithm, first introduced in Beck et al. (2010), is designed to handle an additional functional constraint. Suppose we are given a general convex programming problem:

min_{x ∈ X} { f(x) : G(x) ≤ 0 },

Corresponding author: William B. Haskell, [email protected]

1 Institute of Operations Research and Analytics, National University of Singapore, Singapore 119007, Singapore

2 Krannert School of Management, Purdue University, West Lafayette 47906, USA

3 Sino-US Global Logistics Institute, Shanghai Jiao Tong University, Shanghai 200030, China


Annals of Operations Research

where X ⊂ R^n is compact and convex, and f and G are both convex and subdifferentiable functions on X. The CoMirror algorithm works as follows. Let {x_k}_{k≥1} denote the sequence of iterates of the algorithm, {γ_k}_{k≥1} a sequence of step-sizes with all γ_k > 0, and η > 0 an error tolerance for constraint violation. At iteration k ≥ 1, we calculate G(x_k) and then check whether G(x_k) ≤ η. Depending on this check, we perform a mirror-descent step with step-size γ_k along either f′(x_k) ∈ ∂f(x_k) (if the constraint violation is below tolerance) or G′(x_k) ∈ ∂G(x_k) (if the constraint violation is above tolerance). This procedure essentially includes the case of multiple functional constraints, i.e., take G(x) = max_{i=1,2,...,m} g_i(x). Similar to the CoMirror algorithm, the cooperative stochastic approximation (CSA) method is developed in Lan and Zhou (2020) to deal with stochastic optimization problems with functional or expectation constraints. The adjective "cooperative" in this context means that the update direction alternates between subgradients of the objective and of the constraint, depending on the observed constraint violation.
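The iteration just described can be sketched in code. The following is a minimal illustration, not the paper's implementation: it specializes the mirror map to the Euclidean case, in which the mirror-descent update reduces to a projected subgradient step, and the toy objective, constraint, set X, and step-size schedule are all assumptions chosen for demonstration.

```python
import numpy as np

def comirror_euclidean(f, f_sub, G, G_sub, project, x0, step_sizes, eta, N):
    """CoMirror iteration with the Euclidean mirror map (projected subgradient).

    f, G: objective / constraint value oracles; f_sub, G_sub: subgradient oracles.
    project: Euclidean projection onto the compact convex set X (assumed cheap).
    eta: tolerance for constraint violation; N: number of iterations.
    """
    x = np.asarray(x0, dtype=float)
    best = None  # best iterate observed with G(x) <= eta
    for k in range(N):
        if G(x) <= eta:
            # Violation within tolerance: step along an objective subgradient.
            d = f_sub(x)
            if best is None or f(x) < f(best):
                best = x.copy()
        else:
            # Otherwise: step along a constraint subgradient to reduce violation.
            d = G_sub(x)
        x = project(x - step_sizes[k] * d)
    return best if best is not None else x

# Toy instance (illustrative): minimize f(x) = ||x - c||^2 over the box
# X = [-1, 1]^2, subject to G(x) = x1 + x2 - 1 <= 0. The optimum is (1, 0).
c = np.array([2.0, 0.0])
f = lambda x: float(np.sum((x - c) ** 2))
f_sub = lambda x: 2.0 * (x - c)
G = lambda x: float(x[0] + x[1] - 1.0)
G_sub = lambda x: np.array([1.0, 1.0])
project = lambda x: np.clip(x, -1.0, 1.0)
steps = [0.5 / np.sqrt(k + 1) for k in range(500)]  # O(1/sqrt(k)) schedule

x_star = comirror_euclidean(f, f_sub, G, G_sub, project, np.zeros(2), steps, 1e-3, 500)
```

With a general (non-Euclidean) mirror map, the update `project(x - step * d)` would be replaced by a Bregman proximal step, but the alternation between objective and constraint subgradients is unchanged.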