Evaluation of Sampling Methods for Learning from Imbalanced Data

Garima Goel, Liam Maguire, Yuhua Li, Sean McLoone

Research output: Chapter in Book/Report/Conference proceeding › Chapter

8 Citations (Scopus)

Abstract

The problem of learning from imbalanced data is of critical importance in a large number of application domains and can be a bottleneck for conventional learning methods that assume a balanced data distribution. The class imbalance problem arises when one class massively outnumbers the other. If imbalanced data is used directly, the imbalance between majority and minority classes can bias machine learning models and produce unreliable outcomes. There has been increasing interest in this research area and a number of algorithms have been developed; however, independent evaluation of these algorithms remains limited. This paper evaluates the performance of five representative data sampling methods that address class imbalance, namely SMOTE, ADASYN, BorderlineSMOTE, SMOTETomek and RUSBoost. A comparative study is conducted and the performance of each method is critically analysed in terms of assessment metrics.
Original language: English
Title of host publication: Lecture Notes in Computer Science: Intelligent Computing Theories
Publisher: Springer
Pages: 392-401
Volume: 7995
ISBN (Print): 978-3-642-39478-2
DOI: https://doi.org/10.1007/978-3-642-39479-9_47
Publication status: Published - 2013


Cite this

Goel, G., Maguire, L., Li, Y., & McLoone, S. (2013). Evaluation of Sampling Methods for Learning from Imbalanced Data. In Lecture Notes in Computer Science: Intelligent Computing Theories (Vol. 7995, pp. 392-401). Springer. https://doi.org/10.1007/978-3-642-39479-9_47