A Note on the Guarantees of Total Variation Minimization



1 College of Computer, National University of Defense Technology, Changsha 410072, China
[email protected]
2 College of Science, National University of Defense Technology, Changsha 410072, China
3 The State Key Laboratory of High Performance Computation, National University of Defense Technology, Changsha 410072, China

Abstract. In this paper, we provide a simplified understanding of the guarantees of one-dimensional total variation minimization. We consider a slightly modified total variation minimization model rather than the original one. The modified model can be transformed into an \(\ell_1\) minimization problem by several provable mathematical tools. With techniques developed in random sampling theory, estimates involving the Gaussian mean width are provided for both Gaussian and sub-Gaussian sampling. We also present a sufficient condition for exact recovery under Gaussian sampling.

1 Introduction

“Reconstructing structured signal from linear sampling is a core research in contemporary signal processing” [1]. Mathematically, the procedure of linear sampling can be presented as follows. Let \(x \in \mathbb{R}^N\) be the unknown structured signal and \(U \in \mathbb{R}^{M \times N}\) be the known random sampling matrix. The observation data \(b \in \mathbb{R}^M\) are acquired as
\[
b = Ux + e,
\]
where \(e\) is the unknown but bounded noise. Under the assumption \(\frac{1}{M}\|e\|_1 \le \sigma\), one can reconstruct the structured signal by employing the following convex program:
\[
\min_x \|x\|_K \quad \text{s.t.} \quad \frac{1}{M}\|Ux - b\|_1 \le \sigma, \tag{1}
\]
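In the noiseless case (\(\sigma = 0\) with \(\|\cdot\|_K = \|\cdot\|_1\)), program (1) reduces to basis pursuit, \(\min_x \|x\|_1\) s.t. \(Ux = b\), which can be cast as a linear program by splitting \(|x_i| \le t_i\). The following is a minimal sketch of that reduction using SciPy's `linprog`; the library choice, dimensions, and random test signal are illustrative assumptions, not part of the paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, M, s = 40, 25, 2                      # ambient dim, measurements, sparsity

# Sparse ground-truth signal and Gaussian sampling matrix
x_true = np.zeros(N)
x_true[rng.choice(N, size=s, replace=False)] = rng.standard_normal(s)
U = rng.standard_normal((M, N)) / np.sqrt(M)
b = U @ x_true

# LP over z = [x, t]: minimize sum(t) s.t. -t <= x <= t and Ux = b
c = np.concatenate([np.zeros(N), np.ones(N)])
I = np.eye(N)
A_ub = np.block([[I, -I],                #  x - t <= 0
                 [-I, -I]])              # -x - t <= 0
b_ub = np.zeros(2 * N)
A_eq = np.hstack([U, np.zeros((M, N))])  # Ux = b (t plays no role here)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
              bounds=(None, None))       # x is free; t >= |x| is implied
x_hat = res.x[:N]
```

With these favorable dimensions the Gaussian matrix recovers the sparse signal exactly (up to solver tolerance), illustrating the exact-recovery regime the paper analyzes.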

This research is partly supported by the National High Technology Research and Development Program of China (No. 2012AA01A301), National Natural Science Foundation of China (No. 61402495, No. 61170046, No. 11401580, No. 61402496, No. 61303189, No. 61170049), Science Project of National University of Defense Technology (JC120201) and National Natural Science Foundation of Hunan Province in China (13JJ2001). © Springer International Publishing Switzerland 2016. D.-S. Huang and K.-H. Jo (Eds.): ICIC 2016, Part II, LNCS 9772, pp. 222–231, 2016. DOI: 10.1007/978-3-319-42294-7_19


where \(\|\cdot\|_K\) is some Minkowski functional that reflects the structure of the signal (in sparse recovery, \(\|\cdot\|_K\) is usually set to \(\|\cdot\|_1\)). An important task in signal processing is to estimate the error between the solution of Model (1) and the original signal. Another is to determine how many measurements are sufficient for exact recovery when \(\sigma = 0\). The literature [1–6, 18, 19] provides state-of-the-art answers to these questions for Gaussian and sub-Gaussian measurements. In practice, many signals are not sparse unless transformed by some operator (which may be ill-posed). A typical example is a signal with a sparse gradient (piecewise constant), which arises frequently in imaging. More precisely, given a signal \(x \in \mathbb{R}^N\), let \([Dx]_i = x_{i+1} - x_i,\ i = 1, 2, \ldots, N-1\). Then \(Dx\) is sparse if \(x\) is piecewise constant. This observation motivates us to reconstruct \(x\) by solving the following minimization \(\min_x \|Dx\|_1\)
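The forward-difference operator \(D\) is easy to realize explicitly, and a small sketch makes the sparse-gradient observation concrete. The piecewise-constant test signal below is an illustrative choice of our own, not from the paper:

```python
import numpy as np

def diff_op(N):
    """Forward-difference matrix D in R^{(N-1) x N}: [Dx]_i = x_{i+1} - x_i."""
    D = np.zeros((N - 1, N))
    idx = np.arange(N - 1)
    D[idx, idx] = -1.0
    D[idx, idx + 1] = 1.0
    return D

# A piecewise-constant signal with two jumps (after samples 5 and 12)
x = np.concatenate([np.full(5, 1.0), np.full(7, 3.0), np.full(4, -2.0)])
D = diff_op(len(x))
g = D @ x  # gradient is nonzero only at the two jump locations
```

Although \(x\) itself has no zero entries, \(Dx\) has exactly two nonzeros, which is why minimizing \(\|Dx\|_1\) is the natural surrogate for this signal class.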