Nonparallel Support Vector Machine Based on One Optimization Problem for Pattern Recognition
Ying-Jie Tian 1,2 · Xu-Chan Ju 1,3
Received: 22 April 2015 / Revised: 6 June 2015 / Accepted: 25 August 2015 © Operations Research Society of China, Periodicals Agency of Shanghai University, Science Press, and Springer-Verlag Berlin Heidelberg 2015
Abstract In this paper, we present a novel nonparallel support vector machine based on one optimization problem (NSVMOOP) for binary classification. NSVMOOP is formulated to separate the classes with the largest possible angle between the normal vectors of the two decision hyperplanes in the feature space, while at the same time implementing the structural risk minimization principle. Unlike other nonparallel classifiers, such as the representative twin support vector machine, it constructs the two nonparallel hyperplanes simultaneously by solving a single quadratic programming problem, for which a modified sequential minimization optimization algorithm is explored. NSVMOOP is analyzed theoretically and implemented experimentally. Experimental results on both artificial and publicly available benchmark datasets show its feasibility and effectiveness.
This work was partially supported by the National Natural Science Foundation of China (Nos. 61472390, 11271361 and 71331005), Major International (Regional) Joint Research Project (No. 71110107026), and the Ministry of Water Resources Special Funds for Scientific Research on Public Causes (No. 201301094).
Corresponding author: Xu-Chan Ju, [email protected]
Ying-Jie Tian, [email protected]
1 Research Center on Fictitious Economy & Data Science, Chinese Academy of Sciences, Beijing 100190, China
2 Key Laboratory of Big Data Mining and Knowledge Management, Chinese Academy of Sciences, Beijing 100190, China
3 School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing 101408, China
Keywords Pattern recognition · Support vector machines · Nonparallel hyperplanes · Sequential minimization optimization
Mathematics Subject Classification 68T10 · 68W99 · 99R20 · 97N60
1 Introduction
Support vector machines (SVMs) are computationally powerful tools for pattern classification and regression [1–5], and have been successfully applied in a wide variety of fields such as face recognition, text categorization, and bioinformatics [6–12]. The success of SVMs rests on the implementation of margin maximization, dual theory, and the kernel trick. The standard support vector classification (SVC) constructs two parallel supporting hyperplanes with maximal margin and takes the hyperplane midway between them as the final separating hyperplane. Recently, nonparallel-hyperplane classifiers have been developed and have attracted considerable interest. Representative algorithms include the generalized eigenvalue proximal support vector machine (GEPSVM) [13] and the twin support vector machine (TWSVM) [14]. TWSVM seeks two nonparallel proximal hyperplanes such that each hyperplane is closer to one of the two classes.
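As a point of reference for the standard SVC described above, the following is a minimal sketch (not the paper's NSVMOOP) of training a linear soft-margin SVC with scikit-learn on a toy two-class problem: the fitted model exposes the single separating hyperplane w^T x + b = 0, whose geometric margin 2/||w|| is what margin maximization enlarges. The dataset and parameter choices here are illustrative assumptions, not from the paper.

```python
import numpy as np
from sklearn.svm import SVC

# Two well-separated Gaussian point clouds as a toy binary dataset
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=[2.0, 2.0], scale=0.3, size=(20, 2))
X_neg = rng.normal(loc=[-2.0, -2.0], scale=0.3, size=(20, 2))
X = np.vstack([X_pos, X_neg])
y = np.hstack([np.ones(20), -np.ones(20)])

# Linear soft-margin SVC; C controls the margin/error trade-off
clf = SVC(kernel="linear", C=1.0).fit(X, y)

w = clf.coef_[0]          # normal vector of the separating hyperplane
b = clf.intercept_[0]     # offset: the hyperplane is w^T x + b = 0
margin = 2.0 / np.linalg.norm(w)  # geometric margin between the two
                                  # parallel supporting hyperplanes
```

Nonparallel methods such as TWSVM replace this single hyperplane with two hyperplanes, one per class, and classify a new point by its distance to each; NSVMOOP, as the abstract notes, obtains both from one quadratic program rather than two.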