Relaxed local preserving regression for image feature extraction
Jiaqi Bao 1 & Zhihui Lai 1,2 & Xuechen Li 1

1 The College of Computer Science and Software Engineering, Shenzhen University, Shenzhen, China
2 Institute of Textiles & Clothing, The Hong Kong Polytechnic University, Hong Kong, China

* Corresponding author: Zhihui Lai, [email protected]

Received: 4 November 2019 / Revised: 4 August 2020 / Accepted: 2 September 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020
Abstract
The latest least squares regression (LSR) methods effectively improve the performance of image feature extraction by relaxing the strict zero-one labels into slack forms. However, these methods have three disadvantages: 1) LSR-based methods are sensitive to noise and may lose effectiveness in feature extraction tasks; 2) they focus only on the global structure of the data and ignore the locality that is important for improving performance; 3) they suffer from the small-class problem, i.e., the number of projections they can learn is limited by the number of classes. To address these problems, we propose a novel method called Relaxed Local Preserving Regression (RLPR) for image feature extraction. By incorporating a relaxed label matrix and a similarity-graph-based regularization term, RLPR can not only explore the latent structural information of the data but also solve the small-class problem. To enhance robustness to noise, we further propose an extended version of RLPR based on the l2,1-norm, termed ERLPR. Experimental results on image databases consistently show that the recognition rates of RLPR and ERLPR are superior to those of the compared methods and reach 98% in normal cases. Notably, even on corrupted databases, the proposed methods still achieve a classification accuracy of more than 58%.

Keywords: Image feature extraction · Label relaxation · Least squares regression (LSR) · Manifold learning
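Since ERLPR's robustness rests on the l2,1-norm, it may help to recall that norm's standard definition; the display below is the textbook definition, not an equation quoted from this paper:

```latex
% The l_{2,1}-norm of W in R^{m x c} sums the l_2-norms of the rows of W
% (w^i denotes the i-th row). Because each row is penalized linearly
% rather than quadratically, a few large rows caused by noisy samples
% contribute far less than under the squared Frobenius norm, which is
% the usual robustness argument for l_{2,1}-based models.
\[
  \lVert W \rVert_{2,1}
  = \sum_{i=1}^{m} \sqrt{\sum_{j=1}^{c} W_{ij}^{2}}
  = \sum_{i=1}^{m} \lVert w^{i} \rVert_{2}
\]
```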
1 Introduction

Multivariate linear regression is one of the most widely used techniques in pattern recognition [36]. Least squares regression (LSR), which is one of the fundamental regression analysis
methods, has been developed into many variants, including orthogonal LSR [48, 49], partial LSR [31], rotated LSR [10], discriminative LSR (DLSR) [42], margin scalable DLSR [37] and robust LSR (RoDLSR) [39]. The goal of linear regression (LR) [35] is to fit the target labels as closely as possible with the predicted results. LR can be defined as

$$\min_{W} \; \lVert Y - X^{T} W \rVert_{F}^{2} \tag{1}$$
where the matrix X = [x1, x2, …, xn] ∈ R^{m×n} consists of n training samples and Y ∈ R^{n×c} is the normalized class indicator matrix: if xi belongs to the j-th class, then Yij = 1; otherwise, Yij = 0. W ∈ R^{m×c} is the transformation matrix learned from (1), which is used to predict a test sample x ∈ R^{m×1} through W^T x. However, such linear regression models may encounter the singularity problem or the overfitting problem when the dimensionality of the data is higher than the number of samples. To perform accurate regression o
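To make Eq. (1) and its notation concrete, here is a minimal NumPy sketch of plain LSR with a zero-one indicator matrix. The function names, the synthetic data, and the small ridge term `eps` (added precisely to sidestep the singularity problem mentioned above) are our own illustrative choices, not part of the paper:

```python
import numpy as np

def lsr_fit(X, labels, n_classes, eps=1e-6):
    """Solve min_W ||Y - X^T W||_F^2 for the plain LSR model of Eq. (1).

    X: (m, n) data matrix, one sample per column; labels: (n,) integer classes.
    eps adds a tiny ridge so that X X^T stays invertible when m > n.
    """
    m, n = X.shape
    Y = np.zeros((n, n_classes))
    Y[np.arange(n), labels] = 1.0          # Y_ij = 1 iff x_i is in class j
    # Normal equations: W = (X X^T + eps I)^{-1} X Y
    return np.linalg.solve(X @ X.T + eps * np.eye(m), X @ Y)

def lsr_predict(W, x):
    """Classify a test sample x (length m) as the argmax of W^T x."""
    return int(np.argmax(W.T @ x))

# Toy usage on synthetic data (m = 100 features > n = 40 samples,
# exactly the regime where the unregularized problem is singular).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 40))
labels = rng.integers(0, 3, size=40)
W = lsr_fit(X, labels, n_classes=3)
print(lsr_predict(W, X[:, 0]) == labels[0])   # recovers the training label
```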