Rank minimization on tensor ring: an efficient approach for tensor decomposition and completion
Longhao Yuan1,2,3 · Chao Li3 · Jianting Cao2,3,4 · Qibin Zhao1,3
Received: 2 May 2019 / Revised: 22 July 2019 / Accepted: 16 September 2019 © The Author(s), under exclusive licence to Springer Science+Business Media LLC, part of Springer Nature 2019
Abstract

In recent studies, tensor ring decomposition (TRD) has become a promising model for tensor completion. However, TRD suffers from the rank selection problem due to its undetermined multilinear rank. For tensor decomposition with missing entries, the sub-optimal rank selection of traditional methods leads to overfitting or underfitting. In this paper, we first explore the latent space of TRD and theoretically prove the relationship between the TR-rank and the ranks of the tensor unfoldings. We then propose two tensor completion models that impose different low-rank regularizations on the TR-factors, by which the TR-rank of the underlying tensor is minimized and its low-rank structure is exploited. By employing the alternating direction method of multipliers (ADMM) scheme, our algorithms obtain the TR-factors and the underlying tensor simultaneously. In tensor completion experiments, our algorithms show robustness to rank selection and high computational efficiency in comparison with traditional low-rank approximation algorithms.

Keywords: Tensor ring decomposition · Tensor completion · Structured nuclear norm · ADMM scheme
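To make the tensor ring model concrete: a TR decomposition represents each entry of an order-N tensor as the trace of a product of matrix slices taken from N third-order cores. The sketch below reconstructs a full tensor from a set of TR cores using NumPy; the core shapes and ranks are illustrative choices, not values from the paper, and this is a minimal reference implementation rather than the authors' algorithm.

```python
# A minimal sketch of tensor ring (TR) reconstruction, assuming hypothetical
# TR-ranks and dimensions; not the paper's completion algorithm.
import numpy as np

def tr_to_tensor(cores):
    """Reconstruct a full tensor from TR cores.

    Each core G_k has shape (r_k, n_k, r_{k+1}), with r_{N+1} = r_1 so the
    chain of ranks closes into a ring. Entry-wise,
    X(i_1, ..., i_N) = trace(G_1[:, i_1, :] @ ... @ G_N[:, i_N, :]).
    """
    full = cores[0]  # shape (r_1, n_1, r_2)
    for core in cores[1:]:
        # Contract the trailing rank index of `full` with the leading
        # rank index of the next core.
        full = np.tensordot(full, core, axes=([-1], [0]))
    # `full` now has shape (r_1, n_1, ..., n_N, r_1); close the ring by
    # tracing out the first and last rank indices.
    return np.trace(full, axis1=0, axis2=-1)

rng = np.random.default_rng(0)
ranks = [2, 3, 4]   # illustrative TR-ranks (r_1, r_2, r_3), with r_4 = r_1
dims = [5, 6, 7]    # illustrative tensor dimensions (n_1, n_2, n_3)
cores = [rng.standard_normal((ranks[k], dims[k], ranks[(k + 1) % 3]))
         for k in range(3)]
X = tr_to_tensor(cores)
print(X.shape)  # (5, 6, 7)
```

The ring structure (closing the rank chain with a trace) is what distinguishes TR from the tensor train format, where the boundary ranks are fixed to 1.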
Editors: Kee-Eung Kim and Jun Zhu.
Jianting Cao [email protected]
Qibin Zhao [email protected]
Longhao Yuan [email protected]
Chao Li [email protected]
1. School of Automation, Guangdong University of Technology, Guangzhou, China
2. Graduate School of Engineering, Saitama Institute of Technology, Fukaya, Japan
3. Tensor Learning Unit, RIKEN Center for Advanced Intelligence Project (AIP), Tokyo, Japan
4. School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou, China
Machine Learning
1 Introduction

Tensors are natural representations of higher-order data (Kolda and Bader 2009; Sidiropoulos et al. 2017) and have been successfully applied to machine learning (Chen et al. 2018; Novikov et al. 2015; Zhao et al. 2012), computer vision (Liu et al. 2013; Zhao et al. 2015), signal processing (Cichocki et al. 2015), remote sensing (Du et al. 2017), collaborative filtering (Hu et al. 2015), and so on. Most datasets in these applications are only partially observed, which has motivated extensive study of the tensor completion problem (Long et al. 2018; Song et al. 2019). Tensor completion aims to recover the missing entries of a tensor from sparse observations. Existing methods impose various low-rank priors to discover the underlying tensor. According to the type of low-rank assumption, tensor completion methods can be divided into two categories, based on tensor decomposition and on low-rank regularization, respectively. Tensor decomposition is to find t