Secant variable projection method for solving nonnegative separable least squares problems



Xiongfeng Song1 · Wei Xu2 · Ken Hayami3 · Ning Zheng4

Received: 26 March 2019 / Accepted: 22 October 2019
© Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract

The variable projection method is a classical and efficient method for solving separable nonlinear least squares (SNLLS) problems. However, it is difficult to handle constrained SNLLS problems, since the explicit form of the Jacobian matrix is required in each iteration. In this paper, we propose a secant variable projection (SVP) method, which employs a rank-one update to estimate the Jacobian matrices. The main advantages of our method are its efficiency and its ease of application to constrained SNLLS problems. The local convergence of the SVP method is also analyzed. Finally, some data fitting and image processing problems are solved to compare the performance of the proposed method with the classical variable projection method. Numerical results illustrate the efficiency and stability of the proposed SVP method in solving the SNLLS problems arising from blind deconvolution.

Keywords Separable nonlinear least squares problem · Secant method · Automatic differentiation · Nonnegativity constraints · Jacobian matrix · Blind deconvolution

Wei Xu
[email protected]

Xiongfeng Song
tjmath [email protected]

Ken Hayami
[email protected]

Ning Zheng
[email protected]

1 School of Mathematical Sciences, Tongji University, Shanghai, 200092, China

2 Department of Mathematics, Ryerson University, Toronto, ON, Canada

3 National Institute of Informatics / The Graduate University for Advanced Studies, SOKENDAI, 2-1-2, Hitotsubashi, Chiyoda-ku, Tokyo 101-8430, Japan

4 RIKEN Center for Advanced Intelligence Project, Nihonbashi 1-chome Mitsui Building, 15th floor, Nihonbashi, Chuo-ku, Tokyo, Japan

Numerical Algorithms

1 Introduction

Separable nonlinear least squares (SNLLS) problems [12], in which the unknown variables in the objective function can be separated into two disjoint sets, linear variables and nonlinear variables, are very common in a variety of scientific applications [8], such as nonlinear data fitting [21], signal analysis [18], medical and biological imaging [2, 3], spectroscopy and microscopy [18], and neural networks [15]. Given a set of observations {b_i, t_i}, i = 1, 2, ..., m, and nonlinear functions ϕ_j(y, t), j = 1, 2, ..., n, where y ∈ R^q is unknown (m ≥ n + q), the vector x ∈ R^n and y are determined to fit the model

η(x, y; t) = ∑_{j=1}^{n} ϕ_j(y, t) x_j,    (1.1)
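To make the separable structure concrete, here is a minimal sketch in Python. The exponential basis ϕ_j(y, t) = exp(−y_j t) is an assumption chosen for illustration, not a model from the paper; it shows that for fixed nonlinear variables y, the optimal linear variables x solve an ordinary linear least squares problem.

```python
import numpy as np

# Hypothetical basis functions phi_j(y, t) = exp(-y_j * t), so the model
# is eta(x, y; t) = sum_j x_j * exp(-y_j * t).
def model_matrix(y, t):
    """Build A(y) with (A(y))_ij = phi_j(y, t_i) = exp(-y_j * t_i)."""
    return np.exp(-np.outer(t, y))  # shape (m, n)

# For fixed y, the optimal linear variables x solve the ordinary linear
# least squares problem min_x ||b - A(y) x||_2.
def optimal_linear(y, t, b):
    A = model_matrix(y, t)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Sanity check: data generated exactly by the model is recovered.
t = np.linspace(0.0, 2.0, 50)
y_true = np.array([0.5, 3.0])
x_true = np.array([2.0, -1.0])
b = model_matrix(y_true, t) @ x_true
x_fit = optimal_linear(y_true, t, b)
print(np.allclose(x_fit, x_true))  # True
```

This two-level structure, an easy inner linear problem nested inside a harder nonlinear one, is what variable projection exploits.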

which leads to the minimization problem

min_{x,y} ‖b − A(y)x‖₂²,    (1.2)

where A : R^q → R^{m×n}, with (A(y))_{ij} = ϕ_j(y, t_i), is a matrix function that maps the vector y to an m × n matrix, and b = (b_1, b_2, ..., b_m)^T is the vector of given data. An intuitive method for solving such problems is block coordinate descent (BCD), which alternates between minimizations over the linear variables and the nonlinear variables, such as the nonlinear iterati
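The alternating structure of BCD can be sketched as follows. This is a hedged illustration under assumptions, not code from the paper: the exponential basis, the Nelder–Mead step for the nonlinear block, and the iteration count are all choices made for the example; the nonnegativity constraint on x is handled by nonnegative least squares (NNLS).

```python
import numpy as np
from scipy.optimize import nnls, minimize

# Assumed exponential basis: (A(y))_ij = exp(-y_j * t_i).
def A(y, t):
    return np.exp(-np.outer(t, y))

# Block coordinate descent for min_{x >= 0, y} ||b - A(y) x||_2^2.
def bcd(t, b, y0, iters=20):
    y = np.asarray(y0, dtype=float)
    x = np.zeros(len(y))
    for _ in range(iters):
        # Linear block: nonnegative least squares in x for fixed y.
        x, _ = nnls(A(y, t), b)
        # Nonlinear block: derivative-free minimization over y for fixed x.
        res = minimize(lambda yy: np.sum((b - A(yy, t) @ x) ** 2), y,
                       method="Nelder-Mead")
        y = res.x
    return x, y

# Synthetic data generated exactly by the model with nonnegative x.
t = np.linspace(0.0, 2.0, 60)
b = A(np.array([0.4, 2.5]), t) @ np.array([1.5, 0.7])
x, y = bcd(t, b, y0=[1.0, 1.0])
resid = np.sum((b - A(y, t) @ x) ** 2)
print(resid < np.sum(b ** 2))  # residual strictly reduced from the zero fit
```

Each block update is non-increasing in the objective, but BCD can converge slowly when the linear and nonlinear variables are strongly coupled, which is one motivation for variable projection methods.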