Support vector machine ensembles for discriminant analysis for ranking principal components
Tiene A. Filisbino1 · Gilson A. Giraldi1 · Carlos E. Thomaz2

Received: 18 July 2019 / Revised: 1 June 2020 / Accepted: 5 June 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020
Abstract
The problem of ranking linear subspaces in principal component analysis (PCA), for multi-class classification tasks, has been addressed by building support vector machine (SVM) ensembles with the AdaBoost.M2 technique. This methodology, named multi-class discriminant principal components analysis (Multi-Class.M2 DPCA), is motivated by the fact that the first PCA components do not necessarily represent important discriminant directions to separate sample groups. The Multi-Class.M2 DPCA proposal raises fundamental issues related to the weakening methodology, parametrization, the strategy for the SVM bias, and classification versus reconstruction performance. Also, there is a lack of comparisons between Multi-Class.M2 DPCA and feature weighting techniques. Motivated by these facts, this paper first presents a unified formulation to generate weakened SVM approaches and to derive different strategies from the literature. These strategies are analyzed within the Multi-Class.M2 DPCA methodology, together with its parametrization, to identify the best one for ranking PCA features in face image analysis. Moreover, this work proposes variants that improve that Multi-Class.M2 DPCA configuration using strategies that incorporate the SVM bias and sensitivity analysis results. The resulting Multi-Class.M2 DPCA setups are applied in computational experiments for both classification and reconstruction problems. The results show that Multi-Class.M2 DPCA achieves higher recognition rates using fewer PCA features, as well as robust reconstruction and interpretation of the data.

Keywords PCA · Ranking PCA components · Separating hyperplanes · Ensemble methods · AdaBoost · Face image analysis
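To convey the ranking principle summarized in the abstract, the sketch below orders PCA features by the weight they receive from the separating hyperplanes of a single multi-class linear SVM. This is only a hypothetical simplification for illustration: the paper's Multi-Class.M2 DPCA relies on an AdaBoost.M2 ensemble of weakened SVMs, and the function name and parameters here are assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import LinearSVC

def rank_pca_by_discriminance(X, y, n_components=30):
    """Rank PCA features by the weight they receive from SVM separating hyperplanes."""
    Z = PCA(n_components=n_components).fit_transform(X)  # variance-ordered PCA features
    svm = LinearSVC().fit(Z, y)                          # one-vs-rest linear SVMs
    score = np.abs(svm.coef_).sum(axis=0)                # aggregate |w| per PCA feature
    return np.argsort(score)[::-1]                       # most discriminant features first
```

The returned index order can then be used to keep the most discriminant PCA features instead of the highest-variance ones, which is the kind of reordering the DPCA methodology aims at.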
Tiene A. Filisbino
[email protected]

Gilson A. Giraldi
[email protected]

Carlos E. Thomaz
[email protected]

1 Coordination of Mathematical and Computational Methods, National Laboratory for Scientific Computing, Quitandinha, Petrópolis, RJ, 25651-075, Brazil

2 Department of Electrical Engineering, FEI, São Bernardo do Campo, SP, 09850-901, Brazil
1 Introduction

Dimensionality reduction and discriminant analysis are essential operations due to the necessity to eliminate redundancy and reduce the feature space dimension in image analysis tasks [6, 32, 49]. Among linear techniques for dimensionality reduction, a successful approach is principal component analysis (PCA) [6]. However, PCA considers the covariance structure of the whole data set to select the principal components as the ones with the largest eigenvalues (variances). Such a strategy does not necessarily highlight discriminant directions for pattern recognition tasks, like classification [59]. The main motivation of this work is to approach discriminant analysis techniques for ranking principal components.
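To make the variance-based selection concrete, the sketch below (a minimal NumPy illustration with hypothetical function and variable names) computes the principal components from the data covariance and orders them by eigenvalue; class labels play no role in this ordering, which is precisely why the leading components need not be the most discriminant ones.

```python
import numpy as np

def pca_components(X):
    """PCA via eigendecomposition of the data covariance; components ordered by variance."""
    Xc = X - X.mean(axis=0)                  # center the samples
    cov = np.cov(Xc, rowvar=False)           # covariance of the whole data set
    eigvals, eigvecs = np.linalg.eigh(cov)   # symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1]        # largest eigenvalue (variance) first
    return eigvals[order], eigvecs[:, order]

# Hypothetical usage: keep the k leading components, regardless of class labels.
# variances, components = pca_components(X)
# Z = (X - X.mean(axis=0)) @ components[:, :k]
```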