LoRAS: an oversampling approach for imbalanced datasets



Saptarshi Bej · Narek Davtyan · Markus Wolfien · Mariam Nassar · Olaf Wolkenhauer

Received: 20 August 2019 / Revised: 21 July 2020 / Accepted: 8 September 2020
© The Author(s) 2020

Abstract

The Synthetic Minority Oversampling TEchnique (SMOTE) is widely used for the analysis of imbalanced datasets. SMOTE, however, frequently over-generalizes the minority class, leading to misclassifications of majority-class samples and affecting the overall balance of the model. In this article, we present an approach that overcomes this limitation of SMOTE, employing Localized Random Affine Shadowsampling (LoRAS) to oversample from an approximated data manifold of the minority class. We benchmarked our algorithm on 14 publicly available imbalanced datasets using three different Machine Learning (ML) algorithms, comparing the performance of LoRAS against SMOTE and several SMOTE extensions that, like LoRAS, oversample using convex combinations of minority-class data points. We observed that LoRAS, on average, generates better ML models in terms of F1-score and balanced accuracy. Another key observation is that while most of the SMOTE extensions we tested improve the F1-score relative to SMOTE on average, they compromise the balanced accuracy of the classification model. LoRAS, by contrast, improves both F1-score and balanced accuracy and thus produces better classification models. Moreover, to explain the success of the algorithm, we construct a mathematical framework to prove that the LoRAS oversampling technique provides a better estimate of the mean of the underlying local data distribution of the minority-class data space.

Keywords: Imbalanced datasets · Oversampling · Synthetic sample generation · Data augmentation · Manifold learning
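To make the shared concept concrete, the following is a minimal sketch of convex-combination oversampling in the style of SMOTE (the baseline that LoRAS and the benchmarked extensions build on): each synthetic point is an interpolation between a minority-class sample and one of its k nearest minority-class neighbours. The function name and parameters are our own illustration, not the authors' implementation.

```python
import numpy as np

def smote_like_oversample(X_min, n_new, k=5, rng=None):
    """Generate n_new synthetic minority samples as convex combinations
    of each point and one of its k nearest minority-class neighbours."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # pairwise distances within the minority class; exclude self-matches
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    neighbours = np.argsort(d, axis=1)[:, :k]
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)                 # pick a minority point at random
        j = neighbours[i, rng.integers(k)]  # and one of its k nearest neighbours
        lam = rng.random()                  # interpolation weight in [0, 1]
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.asarray(synthetic)
```

Because every synthetic point lies on a line segment between two existing minority samples, the generated set stays inside the convex hull of the minority class; the over-generalization discussed in the abstract arises when such segments cross into regions occupied by the majority class.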

Editor: Nathalie Japkowicz.

Electronic supplementary material: The online version of this article (https://doi.org/10.1007/s10994-020-05913-4) contains supplementary material, which is available to authorized users.

* Olaf Wolkenhauer
  olaf.wolkenhauer@uni-rostock.de
  https://www.sbi.uni-rostock.de/

Extended author information available on the last page of the article.




Machine Learning

1 Introduction

Imbalanced datasets occur frequently across a large spectrum of fields in which Machine Learning (ML) has found applications, including business, finance and banking, as well as biomedical science. Oversampling approaches are a popular choice for dealing with imbalanced datasets (Chawla et al. 2002; Han et al. 2005; Haibo et al. 2008; Bunkhumpornpat et al. 2009; Barua et al. 2014). Here we present Localized Random Affine Shadowsampling (LoRAS), which produces better ML models for imbalanced datasets than state-of-the-art oversampling techniques such as SMOTE and several of its extensions. We use computational analyses and a mathematical proof to demonstrate that drawing samples from a locally approximated data manifold of the minority class can p