Relaxed group low rank regression model for multi-class classification



Shuangxi Wang 1,2 · Hongwei Ge 1,2 · Jinlong Yang 1,2 · Yubing Tong 3

Received: 22 May 2020 / Revised: 4 September 2020 / Accepted: 13 October 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract

Least squares regression is an effective multi-classification method; however, in practical applications, many models based on the least squares regression method are significantly affected by noise (and outliers). Therefore, effectively reducing the adverse effects of noise is conducive to obtaining better classification performance. Moreover, preserving the intrinsic characteristics of samples to the greatest extent possible is beneficial for improving the discriminative ability of the model. Based on this analysis, we propose the relaxed group low-rank regression model for multi-class classification. The model effectively captures the hidden structural information of samples by exploiting the group low-rank constraint. Meanwhile, with the group low-rank constraint and the graph embedding constraint, the proposed method is more tolerant to noise (and outliers). The feature matrix with the L21-norm and the graph embedding constraint complement each other to capture the intrinsic characteristics of the samples. In addition, a sparsity error term with the L21-norm is utilized to relax the strict target label matrix. These factors guarantee that the original samples are converted into a more compact and discriminative characteristic space. Finally, we compare the proposed model with various popular algorithms on several benchmark datasets. The experimental results demonstrate that the proposed method outperforms state-of-the-art methods.

Keywords: Group low-rank representation · Label relaxation · Image classification · Graph embedding

* Hongwei Ge [email protected]

1 School of Artificial Intelligence and Computer Science, Jiangnan University, Wuxi, Jiangsu 214122, China

2 Key Laboratory of Advanced Process Control for Light Industry (Jiangnan University), Ministry of Education, Wuxi 214122, China

3 Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, PA 19104, USA

Multimedia Tools and Applications

1 Introduction

Least squares regression (LSR) is an optimization technique commonly used in pattern recognition. It finds the optimal transformation matrix P by minimizing the squared error between the label matrix Y and the transformed matrix PX. The optimization problem can be formulated as:

$$\arg\min_{P} \; \| Y - PX \|_F^2$$
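As a minimal sketch of the LSR formulation above, the minimizer of ‖Y − PX‖²_F has the closed form P = YXᵀ(XXᵀ)⁻¹ when XXᵀ is invertible. The snippet below illustrates this; the small ridge term `lam` is an assumption added here for numerical stability and is not part of the paper's formulation, and all variable names (`lsr_fit`, `lsr_predict`) are illustrative, not from the paper.

```python
import numpy as np

def lsr_fit(X, Y, lam=1e-6):
    """Closed-form LSR: P = Y X^T (X X^T + lam*I)^{-1}.

    X: (d, n) feature matrix, one sample per column.
    Y: (c, n) label matrix, e.g. one-hot columns.
    lam: small ridge term (an assumption, added for stability).
    """
    d = X.shape[0]
    return Y @ X.T @ np.linalg.inv(X @ X.T + lam * np.eye(d))

def lsr_predict(P, X):
    # Predicted class = index of the largest response in each column of PX.
    return np.argmax(P @ X, axis=0)

# Toy example: 2 classes, 3-dimensional features, 4 samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
Y = np.eye(2)[:, [0, 0, 1, 1]]   # one-hot labels, shape (2, 4)
P = lsr_fit(X, Y)                # transformation matrix, shape (2, 3)
```

In practice, a small regularizer of this kind is what distinguishes ridge regression from plain LSR; the methods discussed in this paper replace it with structured constraints (group low-rank, graph embedding) instead.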

Because of its simplicity and efficiency, LSR is widely used in pattern recognition and computer vision. Examples include image classification [14, 28], feature extraction [20, 35], and speech recognition [36]. Moreover, support vector machines (SVM) and their variants [5, 6] are LSR-based methods, and principal component analysis (PCA) and its variants [7, 10] can be extended to the LSR framework. Meanwhile, som