Generalized eigenvalue proximal support vector regressor for the simultaneous learning of a function and its derivatives
ORIGINAL ARTICLE
Reshma Khemchandani1 · Keshav Goyal2 · Suresh Chandra2
Received: 25 May 2015 / Accepted: 29 April 2017 © Springer-Verlag Berlin Heidelberg 2017
Abstract Generalized eigenvalue proximal support vector regressor (GEPSVR) determines a pair of 𝜖-insensitive bounding regressors by solving a pair of generalized eigenvalue problems. On the lines of GEPSVR, in this paper we propose a novel regressor for the simultaneous learning of a function and its derivatives, termed GEPSVR of a Function and its Derivatives. The proposed method is fast, as it requires the solution of a pair of generalized eigenvalue problems rather than the large quadratic programming problem required by other existing approaches. Experimental results on several benchmark functions of more than one variable prove the efficacy of the proposed method.

Keywords Support vector machines · Regression · 𝜖-insensitive bound · Generalized eigenvalues · Function approximation
* Reshma Khemchandani [email protected] · Keshav Goyal [email protected] · Suresh Chandra [email protected]

1 Department of Computer Science, Faculty of Mathematics and Computer Science, South Asian University, Delhi, India

2 Department of Mathematics, Indian Institute of Technology, Delhi, India

1 Introduction
The last decade has witnessed the evolution of support vector machines (SVMs) as a powerful paradigm for pattern classification and regression [2, 3, 18, 24, 25]. SVMs emerged from research in statistical learning theory on how to regulate the trade-off between structural complexity and empirical risk. Their applications span a wide spectrum of research areas [4, 9, 11, 20]. Support vector regression (SVR) fits a regressor through a given set of samples by solving a quadratic minimization problem with linear inequality constraints. The standard SVR is an 𝜖-insensitive model: it sets an 𝜖 tube around the data points within which errors are discarded using an 𝜖-insensitive loss function (a short sketch of this loss appears at the end of this section).

Recently, Mangasarian and Wild [19] proposed a nonparallel plane classifier for binary data classification, which they termed the generalized eigenvalue proximal support vector machine (GEPSVM). In this approach, the data points of each class are proximal to one of two distinct non-parallel planes. Each plane is closest to one of the two classes and, at the same time, as far as possible from the other class. The two non-parallel planes are obtained by solving two corresponding generalized eigenvalue problems. On the lines of GEPSVM, quadratic-programming-based non-parallel plane classifiers, namely twin support vector machines [7] and twin support vector machine based regression [14], were proposed.

In a spirit similar to GEPSVM, Khemchandani et al. [12] proposed a non-parallel plane based regressor, which they termed the generalized eigenvalue proximal support vector regressor (GEPSVR). GEPSVR determines a pair of 𝜖-insensitive bounding regressors by solving a pair of generalized eigenvalue problems.
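Since each non-parallel plane in GEPSVM (and each bounding regressor in GEPSVR) arises as the minimizer of a Rayleigh quotient, the core computation reduces to a generalized eigenvalue problem. The following is a minimal Python sketch, not the authors' exact formulation: the synthetic point sets A and B, the helper augmented_scatter, and the small Tikhonov term delta are illustrative assumptions, and here delta is added to both scatter matrices simply to keep them positive definite.

```python
import numpy as np
from scipy.linalg import eigh

def min_rayleigh_quotient(G, H):
    """Return (lam, z) minimizing z^T G z / z^T H z via the generalized
    symmetric eigenproblem G z = lam H z (smallest eigenvalue)."""
    # eigh solves the generalized problem when a second matrix is given;
    # eigenvalues are returned in ascending order.
    vals, vecs = eigh(G, H)
    return vals[0], vecs[:, 0]

def augmented_scatter(X, delta=1e-4):
    """Scatter matrix of [X, e]; the delta term keeps it positive definite."""
    Xe = np.hstack([X, np.ones((X.shape[0], 1))])
    return Xe.T @ Xe + delta * np.eye(Xe.shape[1])

# Illustrative data: rows of A cluster near one plane, rows of B near another.
rng = np.random.default_rng(0)
A = rng.normal(size=(40, 2)) + np.array([2.0, 0.0])
B = rng.normal(size=(50, 2)) + np.array([-2.0, 0.0])

# Plane proximal to A and far from B: minimize the A-scatter over the B-scatter.
lam, z = min_rayleigh_quotient(augmented_scatter(A), augmented_scatter(B))
w, b = z[:-1], z[-1]
print("smallest generalized eigenvalue:", lam)
print("plane coefficients w, b:", w, b)
```

Swapping the roles of A and B in the last step yields the second plane, which is the sense in which one pair of eigenvalue problems replaces a large quadratic program.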
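For reference, the 𝜖-insensitive loss underlying the standard SVR mentioned above can be written in a few lines. This sketch (the function name and the value of epsilon are illustrative) shows how residuals inside the 𝜖 tube incur no penalty:

```python
import numpy as np

def epsilon_insensitive_loss(y_true, y_pred, epsilon=0.1):
    """Vapnik's eps-insensitive loss: residuals inside the eps tube cost
    nothing; outside the tube the cost grows linearly."""
    return np.maximum(0.0, np.abs(y_true - y_pred) - epsilon)

# A residual of 0.05 falls inside the tube (zero loss); 0.3 is penalized.
y = np.array([1.0, 2.0])
f = np.array([1.05, 2.3])
print(epsilon_insensitive_loss(y, f, epsilon=0.1))  # [0.  0.2]
```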