Multi-view semi-supervised least squares twin support vector machines with manifold-preserving graph reduction



ORIGINAL ARTICLE

Xijiong Xie1

Received: 8 September 2019 / Accepted: 15 April 2020
© Springer-Verlag GmbH Germany, part of Springer Nature 2020

Abstract
Multi-view semi-supervised support vector machines exploit multi-view unlabeled data to boost learning performance. However, they have several defects: they must solve quadratic programming problems, so their time complexity is quite high, and when a large number of multi-view unlabeled examples are available, more outliers and noisy examples may be introduced, degrading performance. Therefore, in this paper we propose two novel multi-view semi-supervised support vector machines, called the multi-view Laplacian least squares twin support vector machine and its improved version with manifold-preserving graph reduction, which enhances the robustness of the algorithm. They reduce the time complexity by replacing the inequality constraints with a series of equality constraints, which leads to a pair of systems of linear equations. The linear multi-view Laplacian least squares twin support vector machine and its improved version with manifold-preserving graph reduction are further generalized to the nonlinear case via the kernel trick. Experimental results demonstrate that our proposed methods are effective.

Keywords Multi-view semi-supervised learning · Least squares twin support vector machines · Semi-supervised learning · Manifold-preserving graph reduction

1 Introduction

The support vector machine (SVM) has been widely investigated [1–4]; it implements the structural risk minimization principle of statistical learning theory. Compared with other classification algorithms such as artificial neural networks [5], SVM attains better generalization. In recent years, non-parallel hyperplane classifiers have emerged and attracted much attention from researchers. The twin support vector machine (TSVM) [6] is a typical non-parallel hyperplane classifier: it constructs two non-parallel hyperplanes such that each hyperplane is close to one class and at a certain distance from the other. Although each of TSVM's problems is smaller in scale than the single SVM problem, TSVM still needs to solve two quadratic programming problems (QPPs). The least squares twin support vector machine (LSTSVM) [7–10] replaces the QPPs with systems of linear equations, making its learning speed faster than that of TSVM.
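To make the QPP-versus-linear-system contrast concrete, the following is a minimal sketch of a standard linear LSTSVM (in the style of Kumar and Gopal), not the multi-view semi-supervised method proposed in this paper: each hyperplane is obtained by solving one regularized linear system, and a test point is assigned to the class of its nearer hyperplane. The function names and the small ridge term `reg` are illustrative choices, not from the paper.

```python
import numpy as np

def lstsvm_fit(A, B, c1=1.0, c2=1.0, reg=1e-6):
    """Linear LSTSVM sketch. A holds (+1)-class rows, B holds (-1)-class rows.
    Each augmented weight vector z = [w; b] solves one linear system
    instead of a QPP."""
    G = np.hstack([A, np.ones((A.shape[0], 1))])  # [A  e]
    H = np.hstack([B, np.ones((B.shape[0], 1))])  # [B  e]
    eA = np.ones(A.shape[0])
    eB = np.ones(B.shape[0])
    R = reg * np.eye(G.shape[1])  # small ridge for numerical stability
    # plane 1: close to class A, pushed unit distance from class B
    z1 = -np.linalg.solve(G.T @ G / c1 + H.T @ H + R, H.T @ eB)
    # plane 2: close to class B, pushed unit distance from class A
    z2 = np.linalg.solve(H.T @ H / c2 + G.T @ G + R, G.T @ eA)
    return (z1[:-1], z1[-1]), (z2[:-1], z2[-1])

def lstsvm_predict(X, plane1, plane2):
    """Assign each row of X to the class of its nearer hyperplane."""
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)
```

The key point is that `np.linalg.solve` on a small square system replaces an iterative QPP solver, which is the source of LSTSVM's speed advantage over TSVM.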

* Xijiong Xie
[email protected]

1 The School of Information Science and Engineering, Ningbo University, Zhejiang 315211, China

In real applications, collecting labeled examples costs considerable time and manual labor, while collecting unlabeled examples is relatively easy. Semi-supervised learning [11–14] was proposed to address this problem: when unlabeled data are used appropriately, it can outperform the counterpart supervised learning approach. Several semi-supervised variants of SVM and TSVM exist, such as the transductive SVM [15], semi-supervised support vector machines [16], Laplacian support v