On the convergence of algorithms with Tikhonov regularization terms



Bruno Dinis¹ · Pedro Pinto²

Bruno Dinis: [email protected] · Pedro Pinto: [email protected]

¹ Departamento de Matemática, Faculdade de Ciências da Universidade de Lisboa, Campo Grande, Edifício C6, 1749-016 Lisbon, Portugal

² Department of Mathematics, Technische Universität Darmstadt, Schlossgartenstrasse 7, 64289 Darmstadt, Germany

Received: 14 May 2020 / Accepted: 19 August 2020
© Springer-Verlag GmbH Germany, part of Springer Nature 2020

Abstract

We consider the strongly convergent modified versions of the Krasnosel'skiĭ-Mann, the forward-backward and the Douglas-Rachford algorithms with Tikhonov regularization terms, introduced by Radu Boț, Ernö Csetnek and Dennis Meier. We obtain quantitative information for these modified iterations, namely rates of asymptotic regularity and metastability. Furthermore, our arguments avoid the use of sequential weak compactness and use only a weak form of the projection argument.

Keywords Fixed points of nonexpansive mappings · Tikhonov regularization · Splitting algorithms · Metastability · Asymptotic regularity

1 Introduction

In nonlinear analysis one is often confronted with the need to find zeros of a sum of monotone operators. Two well-known iterative splitting methods to approximate such zeros are the forward-backward and the Douglas-Rachford algorithms (see [1]). The former weakly converges to a zero of the sum of a multi-valued maximal monotone operator with a single-valued cocoercive operator, while the latter weakly converges to a zero of the sum of two maximal monotone operators. These convergence results stem from the weak convergence of the Krasnosel'skiĭ-Mann (KM) iteration to a fixed point of a nonexpansive map T in a Hilbert space H:


x_{n+1} = x_n + λ_n (T(x_n) − x_n),    (KM)

with x_0 ∈ H a starting point and (λ_n) ⊂ [0, 1] a sequence of real numbers.
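For concreteness, the following sketch (in Python, with R² standing in for the Hilbert space H) implements the (KM) update. The rotation map T, the constant choice λ_n ≡ 1/2 and the number of iterations are illustrative assumptions, not taken from the paper.

import numpy as np

def krasnoselskii_mann(T, x0, lam, n_iter=2000):
    """(KM): x_{n+1} = x_n + lam(n) * (T(x_n) - x_n)."""
    x = np.asarray(x0, dtype=float)
    for n in range(n_iter):
        x = x + lam(n) * (T(x) - x)
    return x

# Illustrative nonexpansive map: a rotation of the plane (an isometry),
# whose unique fixed point is the origin.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
T = lambda x: R @ x

x_approx = krasnoselskii_mann(T, x0=[1.0, 0.5], lam=lambda n: 0.5)
print(x_approx)  # approaches the unique fixed point (0, 0)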

One way to obtain strong convergence is to impose stronger conditions on the operators, such as strong monotonicity or strong convexity. However, in certain applications this may be unfeasible, as the stronger conditions may prove to be too restrictive. The proximal-Tikhonov algorithm (see [11]) was introduced as an alternative way to obtain, under mild conditions, strong convergence to a zero of a maximal monotone operator A. This algorithm is based on the proximal point algorithm [13,16] and, in a first step, switches from the operator A to the operator A + μ_n Id, where (μ_n) is a sequence of nonnegative real numbers. Since the original proximal point algorithm is, in general, only weakly convergent, the added term μ_n Id is crucial in guaranteeing the strong convergence of the iteration. Motivated by this method, in [3] Radu Boț, Ernö Csetnek and Dennis Meier considered the (KM) iteration with Tikhonov regularization terms as follows:

x_{n+1} = β_n x_n + λ_n (T(β_n x_n) − β_n x_n),    (T-KM)

where x_0 ∈ H is the starting point, (λ_n) and (β_n) are sequences of positive numbers, and T : H → H is a nonexpansive mapping.
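As a hypothetical companion to the sketch above, the (T-KM) update can be written as follows; the parameter choices β_n = 1 − 1/(n + 2) and λ_n ≡ 1/2 are assumptions made only for illustration and are not the conditions imposed in [3].

import numpy as np

def tikhonov_km(T, x0, lam, beta, n_iter=2000):
    """(T-KM): x_{n+1} = beta(n)*x_n + lam(n) * (T(beta(n)*x_n) - beta(n)*x_n)."""
    x = np.asarray(x0, dtype=float)
    for n in range(n_iter):
        y = beta(n) * x                # Tikhonov-regularized point beta_n * x_n
        x = y + lam(n) * (T(y) - y)    # usual KM step performed at y
    return x

# Illustrative usage with the rotation map T from the previous sketch:
# x_approx = tikhonov_km(T, x0=[1.0, 0.5],
#                        lam=lambda n: 0.5,
#                        beta=lambda n: 1 - 1 / (n + 2))

Compared with (KM), the factor β_n < 1 damps the iterate towards the origin at every step; this damping plays the role of the Tikhonov regularization terms in (T-KM).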