Imbalanced-Classification

Learning on long-tailed CIFAR10 or CIFAR100 (CVPR2019) [paper]

Imbalanced ratio=10

python3 train.py --dataset CIFAR10 (or CIFAR100) --gpu 0 --ratio 0.1

Imbalanced ratio=100

python3 train.py --dataset CIFAR10 (or CIFAR100) --gpu 0 --ratio 0.01

Imbalanced ratio=200

python3 train.py --dataset CIFAR10 (or CIFAR100) --gpu 0 --ratio 0.005

Imbalanced ratio=500

python3 train.py --dataset CIFAR10 (or CIFAR100) --gpu 0 --ratio 0.002
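The --ratio flag is the inverse of the imbalance ratio: the long-tailed CIFAR protocol (CVPR 2019) keeps an exponentially decaying number of samples per class, so the rarest class retains n_max * ratio samples. A minimal sketch of how the per-class counts are typically derived (the function name is illustrative, not this repo's exact code):

```python
import numpy as np

def longtail_counts(n_max, num_classes, ratio):
    # Exponential decay across classes: class i keeps
    # n_max * ratio**(i / (C - 1)) samples, so the head class keeps n_max
    # and the tail class keeps n_max * ratio (standard long-tailed protocol).
    mu = ratio ** (1.0 / (num_classes - 1))
    return [int(round(n_max * mu ** i)) for i in range(num_classes)]

# CIFAR10 with --ratio 0.01 (imbalance ratio 100): head 5000, tail 50 samples.
counts = longtail_counts(5000, 10, 0.01)
print(counts)
```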

Learning with effective loss function

To train the model with a different loss function, run one of the following commands.

Focal loss (ICCV2017) [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss Focal
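The focal loss scales each sample's cross-entropy by (1 - p_t)^gamma, so well-classified examples contribute less. A NumPy sketch of the math only, not this repo's implementation:

```python
import numpy as np

def focal_loss(logits, targets, gamma=2.0):
    # Focal loss (ICCV 2017): down-weight easy examples by (1 - p_t)^gamma,
    # where p_t is the predicted probability of the true class.
    z = logits - logits.max(axis=1, keepdims=True)        # stable softmax
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    p_t = p[np.arange(len(targets)), targets]
    return float(np.mean(-((1.0 - p_t) ** gamma) * np.log(p_t)))
```

With gamma = 0 this reduces to plain cross-entropy; larger gamma focuses training on hard examples.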

Class balanced loss (CVPR2019) [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss CBW

Generalized reweight loss (CVPR2021) [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss GR

Balanced softmax loss (NeurIPS2020) [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss BS
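Balanced softmax compensates for the label prior by shifting each logit by the log of its class count before the softmax. An illustrative NumPy sketch (not the repo's code):

```python
import numpy as np

def balanced_softmax_loss(logits, targets, class_counts):
    # Balanced Softmax (NeurIPS 2020): add log(n_c) to each class logit,
    # then take the usual cross-entropy. With uniform counts this reduces
    # to the standard softmax cross-entropy.
    adj = logits + np.log(np.asarray(class_counts, dtype=float))
    z = adj - adj.max(axis=1, keepdims=True)
    log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return float(-log_p[np.arange(len(targets)), targets].mean())
```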

LADE loss (CVPR2021) [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss LADE

LDAM loss (NeurIPS2019) [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss LDAM --norm

Logit adjusted loss (ICLR2021) [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss LA

Vector scaling loss (NeurIPS2021) [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss VS

Influence-Balanced loss (ICCV2021) [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss IB
python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss IBFL

ELM loss (SMC2023) [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss ELM --norm

False cross-entropy loss [soon]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss FCE
python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --loss LAFCE

Learning with class balancing weight [paper]

To apply class-balancing weights, run one of the following commands.

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --weight_rule CBReweight
python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --weight_rule IBReweight
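CBReweight follows the effective-number weighting of the class-balanced loss paper: each class is weighted by (1 - beta) / (1 - beta^{n_c}), so rare classes get larger weights. A minimal sketch, with the normalization convention assumed here (weights sum to the number of classes):

```python
import numpy as np

def class_balanced_weights(class_counts, beta=0.9999):
    # Effective number of samples per class: E_c = (1 - beta^{n_c}) / (1 - beta).
    # Class weight is the inverse of E_c, normalized to sum to C.
    counts = np.asarray(class_counts, dtype=float)
    effective_num = (1.0 - beta ** counts) / (1.0 - beta)
    w = 1.0 / effective_num
    return w * len(counts) / w.sum()
```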

Learning with weighting scheduler [paper]

If you want to apply the weighting scheduler (which was proposed in the LDAM loss paper), you can directly run the following code to train the model.

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --weight_rule CBReweight --weight_scheduler DRW
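DRW (deferred re-weighting, from the LDAM paper) trains with uniform weights first and switches to the class-balanced weights only after a warm-up stage. A hypothetical helper sketching the schedule:

```python
def drw_weights(epoch, warmup_epochs, per_class_weights):
    # Deferred re-weighting: uniform weights during warm-up, then the
    # class-balanced weights for the remaining epochs.
    if epoch < warmup_epochs:
        return [1.0] * len(per_class_weights)
    return list(per_class_weights)
```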

Learning with hard augmentation

To further improve performance, you can apply hard augmentations during training.

Mixup [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --augmentation Mixup
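Mixup forms convex combinations of input pairs and their labels with a mixing coefficient drawn from Beta(alpha, alpha). A NumPy sketch of the idea, not this repo's training loop:

```python
import numpy as np

def mixup(x, y_onehot, alpha=1.0, rng=None):
    # Mixup: blend each sample with a randomly paired one, and blend the
    # one-hot labels with the same coefficient lambda ~ Beta(alpha, alpha).
    if rng is None:
        rng = np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    idx = rng.permutation(len(x))
    x_mix = lam * x + (1.0 - lam) * x[idx]
    y_mix = lam * y_onehot + (1.0 - lam) * y_onehot[idx]
    return x_mix, y_mix, lam
```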

CutMix [paper]

python3 train.py --dataset CIFAR10 --gpu 0 --ratio 0.1 --augmentation CutMix
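CutMix pastes a random rectangular patch from a shuffled batch into each image and mixes the labels in proportion to the patch area. A NumPy sketch for NCHW batches (illustrative only, not this repo's implementation):

```python
import numpy as np

def cutmix(x, y_onehot, alpha=1.0, rng=None):
    # CutMix: cut a patch whose area fraction is about (1 - lambda),
    # paste it from a permuted batch, and mix labels by the actual area kept.
    if rng is None:
        rng = np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    n, _, h, w = x.shape
    cut_h = int(h * np.sqrt(1.0 - lam))
    cut_w = int(w * np.sqrt(1.0 - lam))
    cy, cx = rng.integers(h), rng.integers(w)
    y1, y2 = np.clip([cy - cut_h // 2, cy + cut_h // 2], 0, h)
    x1, x2 = np.clip([cx - cut_w // 2, cx + cut_w // 2], 0, w)
    idx = rng.permutation(n)
    out = x.copy()
    out[:, :, y1:y2, x1:x2] = x[idx][:, :, y1:y2, x1:x2]
    # Re-derive lambda from the clipped patch area.
    lam_adj = 1.0 - (y2 - y1) * (x2 - x1) / (h * w)
    y_mix = lam_adj * y_onehot + (1.0 - lam_adj) * y_onehot[idx]
    return out, y_mix, lam_adj
```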