On the Gradient Projection Method for Weakly Convex Functions on a Proximally Smooth Set



M. V. Balashov*

Institute of Control Sciences of Russian Academy of Sciences, Moscow, 117997 Russia
*E-mail: [email protected]

Received March 23, 2020; in final form, May 5, 2020; accepted May 14, 2020

Abstract—Let a weakly convex function (in the general case, nonconvex and nonsmooth) satisfy the quadratic growth condition. It is proved that the gradient projection method for minimizing such a function over a proximally smooth (nonconvex) set of special form (for example, a smooth manifold) converges with linear rate, provided that the constant of weak convexity of the function is less than the constant in the quadratic growth condition and the constant of proximal smoothness for the set is sufficiently large. The connection between the quadratic growth condition on the function and other conditions is discussed.

DOI: 10.1134/S0001434620110024

Keywords: weak convexity, quadratic growth, gradient projection method, proximal smoothness, nonsmooth analysis.

1. INTRODUCTION

We consider the problem $\min_{x \in Q} f(x)$. The set $Q$ is assumed to be closed. In the case of a differentiable function $f$ and $Q = \mathbb{R}^n$, to solve the problem, we can apply the gradient descent method:
$$x_0 \in \mathbb{R}^n, \qquad x_{k+1} = x_k - t_k f'(x_k),$$

where $f'(x_k)$ is the gradient of the function $f$ at the point $x_k$ and $t_k > 0$ is the step of the method. The gradient descent method for a nonconvex function $f$ with Lipschitz continuous gradient and a fixed step $t_k = t > 0$, as well as conditions for its convergence, were studied in [1]. In the case $Q \ne \mathbb{R}^n$, the gradient descent method can be generalized to the gradient projection method:
$$x_0 \in Q, \qquad x_{k+1} \in P_Q\bigl(x_k - t_k f'(x_k)\bigr).$$
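As a concrete illustration (a minimal sketch, not from the paper; the objective, the set $Q$, and the fixed step are assumptions chosen for the example), the iteration can be coded directly once the metric projection $P_Q$ is available in closed form. Here $Q$ is taken to be the unit sphere, a standard example of a proximally smooth nonconvex set, whose metric projection is $x / \|x\|$ for $x \ne 0$:

```python
import numpy as np

def gradient_projection(grad_f, proj_Q, x0, t=0.1, iters=200):
    """Iterate x_{k+1} = P_Q(x_k - t * f'(x_k)) with a fixed step t."""
    x = x0
    for _ in range(iters):
        x = proj_Q(x - t * grad_f(x))
    return x

# Illustrative problem (an assumption for this sketch, not from the paper):
# minimize f(x) = ||x - a||^2 over the unit sphere, a proximally smooth
# nonconvex set whose metric projection is x / ||x|| for x != 0.
a = np.array([2.0, 1.0])
grad_f = lambda x: 2.0 * (x - a)               # f'(x) for f(x) = ||x - a||^2
proj_sphere = lambda x: x / np.linalg.norm(x)  # P_Q for the unit sphere

x_star = gradient_projection(grad_f, proj_sphere, x0=np.array([0.0, 1.0]))
print(x_star)  # tends to a / ||a|| ~ [0.894, 0.447], the minimizer on the sphere
# With proj_Q = lambda x: x (the identity), the same loop is plain gradient descent.
```

Taking proj_Q to be the identity recovers the unconstrained gradient descent method above.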

Here $P_Q$ is the metric projection operator onto the set $Q$. For the classes of convex and strongly convex Lipschitz differentiable functions $f$ and a convex set $Q$, the gradient projection method and its convergence were studied in [2]. In the case of a nonsmooth and nonconvex function, the application of the gradient descent method and the gradient projection method involves many difficulties. Polyak [3, Theorem 2, Chap. 5, Sec. 3] proved the linear convergence of the gradient descent method (i.e., convergence at the rate of a geometric progression) for a convex nonsmooth function under the so-called sharpness condition: there exists a $\gamma > 0$ such that
$$f(x) - f_* \ge \gamma \varrho_\Omega(x) \quad \text{for all} \quad x \in \mathbb{R}^n.$$
Here $\Omega$ is the solution set of the problem $f_* = \min_{x \in \mathbb{R}^n} f(x)$ (we require $\Omega$ to be nonempty) and $\varrho_\Omega(x) = \inf_{y \in \Omega} \|x - y\|$ is the distance function from the point $x$ to the set $\Omega$. The linear convergence of the gradient projection method on a convex closed set and of the gradient descent method were proved in [4], [5] for weakly convex functions under the sharpness condition.
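For intuition, consider a simple illustrative example (ours, not from the paper), with $n = 1$ and $\Omega = \{0\}$: the nonsmooth function $f(x) = |x|$ is sharp with $\gamma = 1$, since $f(x) - f_* = |x| = \varrho_\Omega(x)$, whereas the smooth function $f(x) = x^2$ is not sharp, because
$$\frac{f(x) - f_*}{\varrho_\Omega(x)} = |x| \to 0 \quad \text{as} \quad x \to 0,$$
so no fixed $\gamma > 0$ can work near the minimizer; $f(x) = x^2$ does, however, satisfy the quadratic growth condition mentioned below.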


The sharpness condition may be restrictive, because the function under consideration must then be nonsmooth at the points of minimum. On the other hand, there exists another well-known condition, the quadratic growth condition