
ORIGINAL ARTICLE

Least squares recursive projection twin support vector machine for multi-class classification Zhi-Min Yang1 • He-Ji Wu2 • Chun-Na Li1 • Yuan-Hai Shao1

Received: 17 September 2014 / Accepted: 27 June 2015 © Springer-Verlag Berlin Heidelberg 2015

Abstract  Multiple recursive projection twin support vector machine (MPTSVM) is a recently proposed classifier and has proved to be outstanding in pattern recognition. However, MPTSVM is computationally expensive since it involves solving a series of quadratic programming problems. To relieve the training burden, in this paper we propose a novel multiple least squares recursive projection twin support vector machine (MLSPTSVM), based on the least squares recursive projection twin support vector machine (LSPTSVM), for the multi-class classification problem. For a K-class (K > 2) classification problem, MLSPTSVM seeks K groups of projection axes, one for each class, that separate it from all the other classes. Since it only requires solving a series of linear equations, our algorithm is relatively simple and fast. Moreover, a recursive procedure is introduced to generate multiple orthogonal projection axes for each class to enhance its performance. Experimental results on several synthetic and UCI datasets, as well as on relatively large datasets, demonstrate that our MLSPTSVM achieves comparable classification accuracy while taking significantly less computing time compared with MPTSVM, and also obtains better performance than several other SVM-related methods used for the multi-class classification problem.

Keywords  Pattern recognition · Multi-class classification · Multiple recursive projection · Projection twin support vector machine · Least squares recursive projection twin support vector machine

Corresponding author: Yuan-Hai Shao, [email protected]

Zhi-Min Yang, [email protected]; He-Ji Wu, [email protected]; Chun-Na Li, [email protected]

1 Zhijiang College, Zhejiang University of Technology, Hangzhou 310024, People’s Republic of China

2 College of Science, Zhejiang University of Technology, Hangzhou 310023, People’s Republic of China
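To make the one-versus-rest scheme summarized above concrete, the following minimal NumPy sketch mimics its structure: for each class it builds a small set of mutually orthogonal projection axes by solving regularized linear systems (no quadratic programming), and a test point is assigned to the class whose projected class mean is nearest. The per-class linear system used here is a simplified Fisher-style surrogate chosen only for illustration, not the exact LSPTSVM system derived in this paper, and the function names (fit_projection_axes, predict) are hypothetical.

import numpy as np

def fit_projection_axes(X, y, n_axes=2, reg=1e-3):
    """For each class, recursively build orthogonal projection axes by
    solving small regularized linear systems (illustrative surrogate)."""
    d = X.shape[1]
    axes = {}
    for k in np.unique(y):
        Xk = X[y == k].astype(float)          # samples of class k
        Xr = X[y != k].astype(float)          # samples of all other classes
        W = []
        for _ in range(n_axes):
            mu_k, mu_r = Xk.mean(axis=0), Xr.mean(axis=0)
            Sk = (Xk - mu_k).T @ (Xk - mu_k)  # within-class scatter
            # One linear solve per axis instead of a quadratic program.
            w = np.linalg.solve(Sk + reg * np.eye(d), mu_k - mu_r)
            norm = np.linalg.norm(w)
            if norm < 1e-12:
                break
            w /= norm
            W.append(w)
            # Recursive step: deflate the data along w so the next solve
            # yields an axis orthogonal to the ones already found.
            Xk -= np.outer(Xk @ w, w)
            Xr -= np.outer(Xr @ w, w)
        axes[k] = np.vstack(W)
    return axes

def predict(X_test, X_train, y_train, axes):
    """Assign each test point to the class whose projected mean is nearest."""
    classes = sorted(axes)
    dists = []
    for k in classes:
        W = axes[k]                           # shape (n_axes, d)
        mu_k = X_train[y_train == k].mean(axis=0)
        dists.append(np.linalg.norm((X_test - mu_k) @ W.T, axis=1))
    return np.asarray(classes)[np.argmin(np.vstack(dists), axis=0)]

For example, with a data matrix X of shape (n, d) and integer labels y, calling axes = fit_projection_axes(X, y) and then predict(X_new, X, y, axes) returns one label per row of X_new; the per-class cost is dominated by a handful of d-by-d linear solves, which is the property the abstract contrasts with the quadratic programs required by MPTSVM.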

1 Introduction

Support vector machine (SVM) [1, 2], widely used for pattern classification and regression problems, was introduced by Vapnik and his co-workers in the early 1990s. Previous studies have demonstrated the superiority of SVM [3–5]. By employing the structural risk minimization (SRM) principle [6], SVM seeks a decision hyperplane that separates the data points of two classes well by constructing two parallel support hyperplanes such that the margin between them is maximized. However, SVM needs to solve a quadratic programming problem (QPP), which restricts its application to large-scale problems. To address this issue, numerous approaches have been proposed [7–12]. For binary classification, some nonparallel hyperplane classifiers have attracted much attention. Mangasarian and Wild [9] proposed the generalized eigenvalue proximal support vector machine (GEPSVM), which aims at finding two nonparallel hyperplanes such that each hyperplane is closer