AEKOC+: Kernel Ridge Regression-Based Auto-Encoder for One-Class Classification Using Privileged Information



Chandan Gautam1 · Aruna Tiwari1 · M. Tanveer2

Received: 14 February 2019 / Accepted: 28 November 2019 © Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract

In recent years, non-iterative kernel learning approaches have received considerable attention from researchers, and kernel ridge regression (KRR) is one of them. Recently, a KRR-based auto-encoder was developed for the one-class classification (OCC) task and named AEKOC. OCC is generally used for outlier or novelty detection. The brain can detect outliers after learning from normal samples alone; similarly, OCC uses only normal samples to train the model, and the trained model can then be used for outlier detection. In this paper, AEKOC is extended to utilize privileged information, which is usually present in human learning but is ignored by AEKOC and other traditional machine learning techniques. For this purpose, we combine the learning using privileged information (LUPI) framework with AEKOC and propose a classifier referred to as AEKOC+. Privileged information is available only during training, not during testing; therefore, AEKOC cannot exploit it when building the model. AEKOC+, however, handles privileged information efficiently due to the inclusion of the LUPI framework. Experiments conducted on the MNIST dataset and on various other datasets from the UCI machine learning repository demonstrate the superiority of AEKOC+ over AEKOC. Our formulation shows that AEKOC does not utilize privileged features in learning, whereas the formulation of AEKOC+ lets it learn from privileged features differently from the regular features and improves the generalization performance of AEKOC. Moreover, AEKOC+ also outperforms two LUPI-based one-class classifiers, OCSVM+ and SSVDD+.

Keywords One-class classification · Kernel learning · Kernel ridge regression (KRR) · Learning using privileged information (LUPI)
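To make the setting concrete, the following is a minimal sketch (in Python/NumPy) of a KRR-based auto-encoder used as a one-class classifier in the spirit of AEKOC: the model is trained on normal samples only, and a test sample with large reconstruction error is flagged as an outlier. The RBF kernel, the regularization constant C, and the percentile-based rejection threshold are illustrative assumptions, not the exact formulation derived in this paper.

```python
# Illustrative sketch of a KRR auto-encoder for one-class classification.
# Hypothetical hyperparameters: C (regularization), gamma (RBF width),
# quantile (threshold on training reconstruction errors).
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

class KRRAutoEncoderOCC:
    def __init__(self, C=1.0, gamma=0.1, quantile=95):
        self.C, self.gamma, self.quantile = C, gamma, quantile

    def fit(self, X):
        # Closed-form KRR solution with the input itself as the target:
        # beta = (K + I/C)^{-1} X, so K @ beta reconstructs X.
        self.X_train = X
        K = rbf_kernel(X, X, self.gamma)
        n = K.shape[0]
        self.beta = np.linalg.solve(K + np.eye(n) / self.C, X)
        # Rejection threshold taken from the training reconstruction errors.
        err = np.linalg.norm(K @ self.beta - X, axis=1)
        self.threshold = np.percentile(err, self.quantile)
        return self

    def decision_function(self, X):
        # Reconstruction error of each test sample (higher = more anomalous).
        K = rbf_kernel(X, self.X_train, self.gamma)
        return np.linalg.norm(K @ self.beta - X, axis=1)

    def predict(self, X):
        # +1 = normal, -1 = outlier.
        return np.where(self.decision_function(X) <= self.threshold, 1, -1)
```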

 Chandan Gautam
[email protected]; [email protected]

Aruna Tiwari
[email protected]

M. Tanveer
[email protected]

1 Discipline of Computer Science and Engineering, Indian Institute of Technology Indore, Simrol, Indore, 453552, India

2 Discipline of Mathematics, Indian Institute of Technology Indore, Simrol, Indore, 453552, India

Introduction

Over the past decade, researchers have exploited additional information to improve the generalization ability of classifiers [1–10]. In the real world, additional information generally exists alongside the training samples, but a traditional classifier does not utilize this information when building the classification model. Vapnik and Vashist [11] addressed this issue by proposing a novel framework, learning using privileged information (LUPI). This framework enables a traditional classifier to utilize the additional information available with the training set. This additional in