A unified linear convergence analysis of k-SVD
REGULAR RESEARCH PAPER
Zhiqiang Xu¹ · Yiping Ke² · Xin Cao³ · Chunlai Zhou⁴ · Pengfei Wei⁵ · Xin Gao⁶
Received: 22 September 2020 / Accepted: 1 October 2020
© Springer-Verlag GmbH Germany, part of Springer Nature 2020
Abstract
Eigenvector computation, e.g., k-SVD for finding top-k singular subspaces, is often of central importance to many scientific and engineering tasks. There has been resurgent interest recently in analyzing relevant methods in terms of singular value gap dependence. In particular, when the gap vanishes, the convergence of k-SVD is considered to be capped by a gap-free sub-linear rate. We argue in this work, both theoretically and empirically, that this is not necessarily the case, refreshing our understanding of this significant problem. Specifically, we leverage the recently proposed structured gap in a careful analysis to establish a unified linear convergence of k-SVD to one of the ground-truth solutions, regardless of the target matrix and the target rank k. Theoretical results are evaluated and verified by experiments on synthetic and real data.

Keywords Eigenvector computation · k-SVD · Riemannian optimization · Convergence analysis
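To make the objects named in the abstract concrete, here is a minimal, self-contained Python sketch of the k-SVD problem, i.e., recovering the top-k left singular subspace of a given matrix. The helper names (topk_svd_exact, subspace_iteration, subspace_distance) and the use of a plain block power iteration as the iterative solver are illustrative choices made here, not code or the method from the paper (whose analysis, per the keywords, concerns a Riemannian formulation).

```python
import numpy as np

def topk_svd_exact(A, k):
    """Reference solution: top-k left singular vectors of A via a full SVD."""
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return U[:, :k]

def subspace_iteration(A, k, iters=200, seed=0):
    """Generic block power (subspace) iteration for the top-k left singular
    subspace of A; a standard baseline, not the method analyzed in the paper."""
    rng = np.random.default_rng(seed)
    X, _ = np.linalg.qr(rng.standard_normal((A.shape[0], k)))
    for _ in range(iters):
        # One iteration: apply A A^T to the block, then re-orthonormalize.
        X, _ = np.linalg.qr(A @ (A.T @ X))
    return X

def subspace_distance(U, X):
    """Sine of the largest principal angle between the column spans of U and X."""
    cosines = np.linalg.svd(U.T @ X, compute_uv=False)
    return np.sqrt(max(1.0 - cosines.min() ** 2, 0.0))

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 50))   # an arbitrary test matrix
k = 5                                # target rank
U_k = topk_svd_exact(A, k)
X = subspace_iteration(A, k)
print("distance to the top-k singular subspace:", subspace_distance(U_k, X))
```

When the gap at rank k is positive, the printed distance decays geometrically with the number of iterations; the point of the paper is that, with the structured gap, a linear rate can be established even when the plain gap at the target rank vanishes.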
¹ Baidu Research, Beijing, China
² Nanyang Technological University, Singapore, Singapore
³ University of New South Wales, Sydney, Australia
⁴ Renmin University of China, Beijing, China
⁵ National University of Singapore, Singapore, Singapore
⁶ King Abdullah University of Science and Technology, Thuwal, Saudi Arabia

1 Introduction

Eigenvector computation often plays central roles in many scientific and engineering problems, such as electronic structure calculation [19], structural analysis [23], dynamical control systems [13], combinatorial optimization [14], as well as data mining and machine learning [7,16]. There has been resurgent interest recently in proposing and analyzing
new methods for this problem. The state of the art is a dichotomy of theoretical convergence behaviors among these algorithms in terms of gap dependence, i.e., gap-dependent [3,8,11,20,22,24,27] versus gap-free [15,21] convergence rates. Gap-dependent rates are linear and depend on a positive gap at the target rank. Note that gap values are defined as the forward difference between two consecutive singular values in descending order, and are thus always non-negative. In contrast, gap-free rates, which concern worst-case performance, are independent of any gap and sub-linear. When the gap at the target rank vanishes, the convergence is therefore expected to assume a sub-linear rate, as in the gap-free case. Given the great importance of eigenvector computation in many scientific fields, it is always instructive to refresh our knowledge and deepen our understanding of this problem in one way or another.
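To make the gap notion above explicit, the following display writes it out; the notation ($\sigma_i$ for the $i$-th largest singular value of the target matrix, $\Delta_k$ for the gap at the target rank $k$) is introduced here for illustration rather than taken verbatim from the paper:

$$
\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_k \ge \sigma_{k+1} \ge \cdots \ge 0,
\qquad
\Delta_k := \sigma_k - \sigma_{k+1} \ge 0 .
$$

In these terms, gap-dependent analyses give a linear (geometric) contraction whose rate deteriorates as $\Delta_k \to 0$, whereas gap-free analyses remove the dependence on $\Delta_k$ at the price of a sub-linear rate; the structured-gap analysis of this paper is aimed at retaining a linear rate even when $\Delta_k = 0$.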