Everything about class-imbalanced/long-tail learning: papers, codes, frameworks, and libraries | 有关类别不平衡/长尾学习的一切:论文、代码、框架与库
Class imbalance (also known as the long-tail problem) refers to the fact that the classes are not represented equally in a classification problem, which is quite common in practice: typical examples include fraud detection, prediction of rare adverse drug reactions, and prediction of gene families. Failure to account for class imbalance often degrades the predictive performance of many classification algorithms. Imbalanced learning aims to tackle the class imbalance problem and learn an unbiased model from imbalanced data.
Inspired by awesome-machine-learning.
Check out Zhining's other open-source projects!
- Imbalanced-Ensemble [PythonLib]
- Machine Learning [Awesome]
- Self-paced Ensemble [ICDE]
- Meta-Sampler [NeurIPS]
imbalanced-ensemble [Github][Documentation][Gallery][Paper]
NOTE: written in Python, easy to use.
imbalanced-ensemble is a Python toolbox for quickly implementing and deploying ensemble learning algorithms on class-imbalanced data.
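A minimal usage sketch, assuming the `imbens` import name and the scikit-learn-style `SelfPacedEnsembleClassifier` estimator described in the documentation (see the links above for the exact API):

```python
from collections import Counter

from imbens.ensemble import SelfPacedEnsembleClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Toy ~10:1 imbalanced binary classification task.
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)
print(Counter(y_train))  # roughly 9 majority samples per minority sample

# Fit a self-paced ensemble of 10 base estimators, then evaluate as usual.
clf = SelfPacedEnsembleClassifier(n_estimators=10, random_state=42)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))
```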
imbalanced-learn [Github][Documentation][Paper]
NOTE: written in Python, easy to use.
imbalanced-learn is a Python package offering a number of re-sampling techniques commonly used in datasets showing strong between-class imbalance. It is compatible with scikit-learn and is part of the scikit-learn-contrib projects.
smote_variants [Documentation][Github] - A collection of 85 minority over-sampling techniques for imbalanced learning, with multi-class oversampling and model selection features (all written in Python; R and Julia are also supported).
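Both packages follow scikit-learn conventions, so re-sampling is a one-liner; a minimal sketch using imbalanced-learn's `SMOTE` over-sampler:

```python
from collections import Counter

from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
print(Counter(y))  # imbalanced, e.g. Counter({0: 897, 1: 103})

# SMOTE synthesizes new minority samples by interpolating between a minority
# point and one of its minority-class nearest neighbors.
X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
print(Counter(y_res))  # balanced, e.g. Counter({0: 897, 1: 897})
```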
KEEL [Github][Paper] - KEEL provides a simple GUI based on data flow to design experiments with different datasets and computational intelligence algorithms (paying special attention to evolutionary algorithms) in order to assess the behavior of the algorithms. This tool includes many widely used imbalanced learning techniques such as (evolutionary) over/under-sampling, cost-sensitive learning, algorithm modification, and ensemble learning methods.
NOTE: wide variety of classical classification, regression, preprocessing algorithms included.
Learning from imbalanced data (IEEE TKDE, 2009, 6000+ citations) [Paper]
Learning from imbalanced data: open challenges and future directions (2016, 900+ citations) [Paper]
Learning from class-imbalanced data: Review of methods and applications (2017, 900+ citations) [Paper]
Self-paced Ensemble (ICDE 2020, 20+ citations) [Paper][Code][Slides][Zhihu/知乎][PyPI]
NOTE: versatile solution with outstanding performance and computational efficiency.
MESA: Boost Ensemble Imbalanced Learning with MEta-SAmpler (NeurIPS 2020) [Paper][Code][Video][Zhihu/知乎]
NOTE: learning an optimal sampling policy directly from data.
Exploratory Undersampling for Class-Imbalance Learning (IEEE Trans. on SMC, 2008, 1300+ citations) [Paper]
NOTE: simple but effective solution.
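EasyEnsemble, one of the two methods proposed in this paper, is also available in imbalanced-learn; a minimal sketch of how it might be used:

```python
from imblearn.ensemble import EasyEnsembleClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Each base learner (AdaBoost by default) is trained on the full minority class
# plus an equally sized random subset of the majority class.
clf = EasyEnsembleClassifier(n_estimators=10, random_state=0).fit(X_train, y_train)
print(clf.score(X_test, y_test))
```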
Bagging (1996, 20000+ citations) [Paper][Code] - Bagging predictors
Diversity Analysis on Imbalanced Data Sets by Using Ensemble Models (2009, 400+ citations) [Paper]
NOTE: See more over-sampling methods at smote-variants.
A Study of the Behavior of Several Methods for Balancing Training Data (2004, 2000+ citations) [Paper]
NOTE: extensive experimental evaluation involving 10 different over/under-sampling methods.
SMOTE-RSB (2012, 210+ citations) [Paper][Code] - Hybrid Preprocessing using SMOTE and Rough Sets Theory
SMOTE-IPF (2015, 180+ citations) [Paper][Code] - SMOTE with Iterative-Partitioning Filter
A systematic study of the class imbalance problem in convolutional neural networks (2018, 330+ citations) [Paper]
Survey on deep learning with class imbalance (2019, 50+ citations) [Paper]
NOTE: a recent comprehensive survey of the class imbalance problem in deep learning.
Focal loss for dense object detection (ICCV 2017, 2600+ citations) [Paper][Code (detectron2)][Code (unofficial)] - A uniform loss function that focuses training on a sparse set of hard examples to prevent the vast number of easy negatives from overwhelming the detector during training.
NOTE: elegant solution, high influence.
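Since the focal loss is only a small modulation of standard cross-entropy, a compact PyTorch sketch captures the idea (binary case; `alpha` and `gamma` follow the paper's notation):

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Binary focal loss: FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t)."""
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)        # probability of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * ce).mean()  # down-weights easy examples

# Example: a batch dominated by easy negatives.
logits = torch.randn(8)
targets = torch.tensor([0., 0., 0., 0., 0., 0., 1., 1.])
print(focal_loss(logits, targets))
```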
Training deep neural networks on imbalanced data sets (IJCNN 2016, 110+ citations) [Paper] - Mean (squared) false error that can equally capture classification errors from both the majority class and the minority class.
Deep imbalanced attribute classification using visual attention aggregation (ECCV 2018, 30+ citations) [Paper][Code]
Imbalanced deep learning by minority class incremental rectification (TPAMI 2018, 60+ citations) [Paper] - Class Rectification Loss for minimizing the dominant effect of majority classes by discovering sparsely sampled boundaries of minority classes in an iterative batch-wise learning process.
Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss (NeurIPS 2019, 10+ citations) [Paper][Code] - A theoretically-principled label-distribution-aware margin (LDAM) loss motivated by minimizing a margin-based generalization bound.
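A condensed PyTorch sketch of the LDAM loss following the paper's formulation, where class j receives a margin proportional to n_j^(-1/4) (the official code additionally uses a deferred re-weighting schedule, omitted here):

```python
import torch
import torch.nn.functional as F

def ldam_loss(logits, targets, cls_num_list, max_m=0.5, s=30.0):
    """Subtract a per-class margin m_j proportional to n_j^(-1/4) from the true-class logit."""
    n = torch.tensor(cls_num_list, dtype=torch.float, device=logits.device)
    m = 1.0 / n ** 0.25
    m = m * (max_m / m.max())  # rescale so the largest (rarest-class) margin is max_m
    adjusted = logits.clone()
    adjusted[torch.arange(len(targets)), targets] -= m[targets]
    return F.cross_entropy(s * adjusted, targets)

# Example: 3 classes with long-tailed counts 1000 / 100 / 10.
logits = torch.randn(4, 3)
targets = torch.tensor([0, 1, 2, 2])
print(ldam_loss(logits, targets, cls_num_list=[1000, 100, 10]))
```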
Gradient harmonized single-stage detector (AAAI 2019, 40+ citations) [Paper][Code] - Compared to Focal Loss, which only down-weights "easy" negative examples, GHM also down-weights "very hard" examples as they are likely to be outliers.
Class-Balanced Loss Based on Effective Number of Samples (CVPR 2019, 70+ citations) [Paper][Code] - a simple and generic class-reweighting mechanism based on Effective Number of Samples.
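The re-weighting rule itself fits in a few lines; a sketch following the paper's formula, where a class with n samples has effective number E_n = (1 - beta^n) / (1 - beta):

```python
import numpy as np

def class_balanced_weights(samples_per_class, beta=0.9999):
    """Per-class weights proportional to 1 / E_n (Cui et al., CVPR 2019)."""
    n = np.asarray(samples_per_class, dtype=float)
    effective_num = (1.0 - np.power(beta, n)) / (1.0 - beta)
    weights = 1.0 / effective_num
    return weights / weights.sum() * len(n)  # normalize to sum to the class count

# Long-tailed counts: rare classes receive much larger weights.
print(class_balanced_weights([1000, 100, 10]))
# The result can be plugged into e.g. torch.nn.CrossEntropyLoss(weight=...).
```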
Influence-Balanced Loss for Imbalanced Visual Classification (ICCV 2021) [Paper][Code]
AutoBalance: Optimized Loss Functions for Imbalanced Data (NeurIPS 2021) [Paper]
Label-Imbalanced and Group-Sensitive Classification under Overparameterization (NeurIPS 2021) [Paper][Code]
Learning to model the tail (NIPS 2017, 70+ citations) [Paper] - Transfer meta-knowledge from the data-rich classes in the head of the distribution to the data-poor classes in the tail.
Learning to reweight examples for robust deep learning (ICML 2018, 150+ citations) [Paper][Code] - Implicitly learn a weight function to reweight the samples in gradient updates of DNN.
NOTE: representative work to solve the class imbalance problem through meta-learning.
Meta-weight-net: Learning an explicit mapping for sample weighting (NeurIPS 2019) [Paper][Code] - Explicitly learn a weight function (with an MLP as the function approximator) to reweight the samples in gradient updates of DNN.
Learning Data Manipulation for Augmentation and Weighting (NeurIPS 2019) [Paper][Code]
Learning to Balance: Bayesian Meta-Learning for Imbalanced and Out-of-distribution Tasks (ICLR 2020) [Paper][Code]
MESA: Boost Ensemble Imbalanced Learning with MEta-SAmpler (NeurIPS 2020) [Paper][Code][Video]
NOTE: meta-learning-powered ensemble learning
Learning deep representation for imbalanced classification (CVPR 2016, 220+ citations) [Paper]
Supervised Class Distribution Learning for GANs-Based Imbalanced Classification (ICDM 2019) [Paper]
Decoupling Representation and Classifier for Long-tailed Recognition (ICLR 2020) [Paper][Code]
NOTE: interesting findings on representation learning and classifier learning
Supercharging Imbalanced Data Learning With Energy-based Contrastive Representation Transfer (NeurIPS 2021) [Paper]
Tailoring Self-Supervision for Supervised Learning (ECCV 2022) [Paper][Code]
Rethinking the Value of Labels for Improving Class-Imbalanced Learning (NeurIPS 2020) [Paper][Code][Video]
NOTE: semi-supervised training / self-supervised pre-training helps imbalanced learning.
Distribution Aligning Refinery of Pseudo-label for Imbalanced Semi-supervised Learning (NeurIPS 2020) [Paper][Code]
ABC: Auxiliary Balanced Classifier for Class-imbalanced Semi-supervised Learning (NeurIPS 2021) [Paper][Code]
Improving Contrastive Learning on Imbalanced Data via Open-World Sampling (NeurIPS 2021) [Paper]
DASO: Distribution-Aware Semantics-Oriented Pseudo-label for Imbalanced Semi-Supervised Learning (CVPR 2022) [Paper][Code]
Brain tumor segmentation with deep neural networks (2017, 1200+ citations) [Paper][Code (unofficial)] - Two-phase training: pre-train on a balanced dataset, then fine-tune only the last output layer (before the softmax) on the original, imbalanced data.
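A minimal PyTorch sketch of this two-phase recipe (the backbone, class count, and hyperparameters below are placeholders, not from the paper):

```python
import torch
import torch.nn as nn
import torchvision.models as models

# Phase 1 (assumed already done): `model` was trained on a class-balanced
# version of the data, e.g. obtained by re-sampling.
model = models.resnet18(num_classes=10)

# Phase 2: freeze the learned representation and re-train only the output
# layer on the original, imbalanced data.
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 10)  # fresh head; trainable by default
optimizer = torch.optim.SGD(model.fc.parameters(), lr=1e-3, momentum=0.9)
# ... followed by a standard training loop over the imbalanced dataset.
```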
imbalanced-learn datasets
This collection of datasets comes from `imblearn.datasets.fetch_datasets`.
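Each dataset in the table below can be fetched by name; a minimal sketch:

```python
from collections import Counter

from imblearn.datasets import fetch_datasets

# Downloads (and caches) the benchmark collection as a dict of Bunch objects.
datasets = fetch_datasets(filter_data=("ecoli",))
ecoli = datasets["ecoli"]
X, y = ecoli.data, ecoli.target  # y is in {-1, 1}; 1 marks the minority class
print(X.shape, Counter(y))       # (336, 7) Counter({-1: 301, 1: 35})
```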
ID | Name | Repository & Target | Imbalance Ratio | #Samples | #Features |
---|---|---|---|---|---|
1 | ecoli | UCI, target: imU | 8.6:1 | 336 | 7 |
2 | optical_digits | UCI, target: 8 | 9.1:1 | 5,620 | 64 |
3 | satimage | UCI, target: 4 | 9.3:1 | 6,435 | 36 |
4 | pen_digits | UCI, target: 5 | 9.4:1 | 10,992 | 16 |
5 | abalone | UCI, target: 7 | 9.7:1 | 4,177 | 10 |
6 | sick_euthyroid | UCI, target: sick euthyroid | 9.8:1 | 3,163 | 42 |
7 | spectrometer | UCI, target: >=44 | 11:1 | 531 | 93 |
8 | car_eval_34 | UCI, target: good, v good | 12:1 | 1,728 | 21 |
9 | isolet | UCI, target: A, B | 12:1 | 7,797 | 617 |
10 | us_crime | UCI, target: >0.65 | 12:1 | 1,994 | 100 |
11 | yeast_ml8 | LIBSVM, target: 8 | 13:1 | 2,417 | 103 |
12 | scene | LIBSVM, target: >one label | 13:1 | 2,407 | 294 |
13 | libras_move | UCI, target: 1 | 14:1 | 360 | 90 |
14 | thyroid_sick | UCI, target: sick | 15:1 | 3,772 | 52 |
15 | coil_2000 | KDD, CoIL, target: minority | 16:1 | 9,822 | 85 |
16 | arrhythmia | UCI, target: 06 | 17:1 | 452 | 278 |
17 | solar_flare_m0 | UCI, target: M->0 | 19:1 | 1,389 | 32 |
18 | oil | UCI, target: minority | 22:1 | 937 | 49 |
19 | car_eval_4 | UCI, target: vgood | 26:1 | 1,728 | 21 |
20 | wine_quality | UCI, wine, target: <=4 | 26:1 | 4,898 | 11 |
21 | letter_img | UCI, target: Z | 26:1 | 20,000 | 16 |
22 | yeast_me2 | UCI, target: ME2 | 28:1 | 1,484 | 8 |
23 | webpage | LIBSVM, w7a, target: minority | 33:1 | 34,780 | 300 |
24 | ozone_level | UCI, ozone, data | 34:1 | 2,536 | 72 |
25 | mammography | UCI, target: minority | 42:1 | 11,183 | 6 |
26 | protein_homo | KDD CUP 2004, minority | 111:1 | 145,751 | 74 |
27 | abalone_19 | UCI, target: 19 | 130:1 | 4,177 | 10 |
Imbalanced Databases
Thanks goes to these wonderful people (emoji key):
Zhining Liu 💻 🚧 🌍 |
曾阿信 🚧 |
WonJun Moon 💻 |
Gang Liu 💻 |
This project follows the all-contributors specification. Contributions of any kind welcome!