Computing and analyzing recoverable supports for sparse reconstruction

Christian Kruschel · Dirk A. Lorenz

Received: 12 September 2013 / Accepted: 5 January 2015 © Springer Science+Business Media New York 2015

Abstract Designing computational experiments involving $\ell_1$ minimization with linear constraints in a finite-dimensional, real-valued space so as to obtain a sparse solution with a prescribed number k of nonzero entries is, in general, difficult. Several conditions have been introduced which guarantee that, for example for small k or for certain matrices, simply placing entries with desired characteristics on a randomly chosen support will produce vectors which can be recovered by $\ell_1$ minimization. In this work, we consider the case of large k and introduce a method which constructs vectors whose support has cardinality k and which can be recovered via $\ell_1$ minimization. In particular, such vectors with the largest possible support can be constructed. Further, we propose a methodology to quickly check whether a given vector is recoverable. This check can be cast as a linear program, and we compare it with solving the $\ell_1$ minimization directly. Moreover, we gain new insights into recoverability in a non-asymptotic regime. Our proposal for quickly checking vectors is based on optimality conditions for exact solutions of the $\ell_1$ minimization. These conditions can be used to establish equivalence classes of recoverable vectors whose supports have the same cardinality. Further, from these conditions we deduce a geometrical interpretation which identifies an equivalence class with a face of a hypercube that is cut by a certain affine subspace. Based on this new geometrical interpretation we derive new results on the number of equivalence classes, which are illustrated by computational experiments.

Communicated by: Raymond H. Chan

C. Kruschel
Institute for Numerical and Applied Mathematics, University of Goettingen, 37083 Goettingen, Germany
e-mail: [email protected]

D. A. Lorenz
Institute for Analysis and Algebra, TU Braunschweig, 38092 Braunschweig, Germany
e-mail: [email protected]


Keywords Sparse recovery · Hypercube sections · Phase transition · Compressed sensing

Mathematics Subject Classification (2010) 52B05 · 94A12 · 15A29

1 Introduction

The difficulty of finding suitable test instances is a serious problem in the field of sparse reconstruction. A common and promising method to reconstruct a vector $x^* \in \mathbb{R}^n$ with only a few nonzero entries from a linear transformation, realized by a matrix $A \in \mathbb{R}^{m\times n}$ with $m < n$, is $\ell_1$ minimization, i.e.

$$x^* = \arg\min_y \|y\|_1 \quad \text{s.t.} \quad Ay = Ax^*. \tag{1}$$

This optimization problem was introduced in [26] and is called Basis Pursuit. Under certain conditions (see, e.g., [11, 21, 31]) the vector $x^*$ is also a solution with the smallest number of nonzero entries; a vector $x^*$ with exactly $k$ nonzero entries is called $k$-sparse. A popular method for finding a $k$-sparse vector $x^*$ satisfying (1) for a given matrix $A \in \mathbb{R}^{m\times n}$
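Problem (1) can be handed to any linear-programming solver after the standard splitting $y = u - v$ with $u, v \geq 0$. The sketch below (assuming NumPy and SciPy, with an arbitrary random Gaussian test instance that is not taken from the paper) illustrates this reformulation.

```python
import numpy as np
from scipy.optimize import linprog


def basis_pursuit(A, b):
    """Solve min ||y||_1 s.t. A y = b as a linear program.

    Standard reformulation: write y = u - v with u, v >= 0 and
    minimize sum(u) + sum(v) subject to A u - A v = b.
    """
    m, n = A.shape
    c = np.ones(2 * n)                # objective: 1^T u + 1^T v
    A_eq = np.hstack([A, -A])         # A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    if not res.success:
        raise RuntimeError(res.message)
    u, v = res.x[:n], res.x[n:]
    return u - v


# Random test instance (not prescribed by the paper): Gaussian A, k-sparse x_star.
rng = np.random.default_rng(0)
m, n, k = 20, 50, 5
A = rng.standard_normal((m, n))
x_star = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_star[support] = rng.standard_normal(k)

x_rec = basis_pursuit(A, A @ x_star)
print("recovered:", np.allclose(x_rec, x_star, atol=1e-6))
```

Whether such a randomly drawn $x^*$ is actually recovered depends on $m$, $n$, and $k$; constructing instances that are guaranteed to be recoverable for large $k$ is precisely the problem addressed in this paper.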