1. Introduction
Class imbalance arises in real-life datasets with uneven class distributions, formally known as imbalanced data. When the instances of one class heavily over-represent those of the remaining classes, the dataset is said to suffer from the class imbalance problem (Das et al., 2013). In practice, the class with lower coverage is often the significant one, associated with a higher misclassification cost (Lopez et al., 2013; Elkan, 2001). Conventional learning models, when applied to such imbalanced data, tend to be biased towards the over-represented class, which degrades the performance of the learning model and increases the misclassification of minority class instances. The resulting cost can be severe when it is crucial to classify the minority class instances correctly (Oh, 2011; Kang & Cho, 2006).
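The bias described above can be seen with a minimal sketch: on a hypothetical dataset with 95 "neg" and 5 "pos" labels (values chosen purely for illustration), a degenerate classifier that always predicts the majority class still scores high accuracy while misclassifying every minority instance.

```python
from collections import Counter

# Hypothetical toy labels: 95 majority ("neg") and 5 minority ("pos") instances.
labels = ["neg"] * 95 + ["pos"] * 5

# A degenerate "classifier" that always predicts the majority class.
majority = Counter(labels).most_common(1)[0][0]
predictions = [majority for _ in labels]

accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
minority_recall = (
    sum(p == y for p, y in zip(predictions, labels) if y == "pos")
    / labels.count("pos")
)

print(accuracy)         # 0.95 -- looks strong
print(minority_recall)  # 0.0  -- every minority instance is misclassified
```

This is why plain accuracy is a misleading measure under imbalance: the 95% score hides a 0% recall on the class that matters most.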
The solutions proposed for the class imbalance problem fall into two major groups: (a) data-level solutions, formally known as data sampling, which modify the data distribution to yield a revised set with a balanced class distribution, and (b) algorithmic-level solutions, which modify the classifier itself in order to improve its accuracy (Das et al., 2013; Lopez et al., 2013; Phua et al., 2004). Data-level solutions either undersample (eliminate majority class instances) or oversample (add duplicate or synthetic minority class instances); each approach has its own significant drawbacks, such as discarding potentially informative majority examples or overfitting to duplicated minority examples. Examples of data-level solutions include condensed nearest neighbor (CNN) (Phua et al., 2004), one-sided selection (OSS) (Turney, 2000), Tomek links (Hart, 1968), cluster-based undersampling (Tomek, 1976), inverse random undersampling (Kubat & Matwin, 1997), the Synthetic Minority Oversampling Technique (SMOTE) (García & Herrera, 2009), Borderline-SMOTE (Agrawal et al., 2015), ADASYN (Guo et al., 2008), and Safe-Level-SMOTE (Ling & Li, 1998). Examples of algorithmic-level solutions include cost-sensitive learning (CSL) (Nguyen et al., 2010), the improved weighted extreme learning machine (IW-ELM) (Lu et al., 2019), RUSBoost (Seiffert et al., 2010), and SMOTEBoost (Rahman & Davis, 2013).
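The two basic data-level operations can be sketched with the standard library alone. This is a minimal illustration, not any of the cited methods: the dataset, its 90/10 split, and the "neg"/"pos" labels are all assumptions made for the example, and balancing is done by plain random sampling.

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical imbalanced dataset: (feature, label) pairs, 90 "neg" vs 10 "pos".
data = [(i, "neg") for i in range(90)] + [(i, "pos") for i in range(10)]
neg = [d for d in data if d[1] == "neg"]
pos = [d for d in data if d[1] == "pos"]

# Random undersampling: discard majority instances until the classes match.
undersampled = random.sample(neg, len(pos)) + pos

# Random oversampling: duplicate minority instances (with replacement)
# until the classes match.
oversampled = neg + random.choices(pos, k=len(neg))

print(Counter(y for _, y in undersampled))  # 10 neg, 10 pos
print(Counter(y for _, y in oversampled))   # 90 neg, 90 pos
```

The drawbacks mentioned above are visible here: undersampling throws away 80 majority instances, while oversampling merely repeats the same 10 minority points, which is what motivates synthetic generators such as SMOTE.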