- Cross-Entropy Loss
- Focal Loss (ICCV'17)
- Class-Balanced Re-weighting (CVPR'19)
- LDAM Loss (NeurIPS'19)
- Balanced Softmax Loss (NeurIPS'20)
- Decouple (ICLR'20): cRT
- Decouple (ICLR'20): tau-normalization
- Decouple (ICLR'20): LWS
- RIDE (ICLR'21)
- Logit Adjustment Loss (ICLR'21)
- DisAlign (CVPR'21)
- Bayias Loss (NeurIPS'21)
- Adaptive Logit Adjustment Loss (arXiv'21)
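As a concrete example of the methods above, below is a minimal sketch of the logit adjustment loss (ICLR'21). It assumes `class_counts` holds the per-class training frequencies; this is an illustration of the idea, not necessarily the exact implementation in this repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LogitAdjustLoss(nn.Module):
    """Sketch of logit-adjusted cross-entropy (Menon et al., ICLR'21)."""

    def __init__(self, class_counts, tau=1.0):
        super().__init__()
        prior = class_counts / class_counts.sum()           # class prior pi_y
        self.register_buffer("offset", tau * prior.log())   # tau * log(pi_y)

    def forward(self, logits, target):
        # Shift each logit by the log-prior of its class, then apply the
        # standard softmax cross-entropy to the adjusted logits.
        return F.cross_entropy(logits + self.offset, target)

# Usage (hypothetical counts for a 3-class long-tailed problem):
# criterion = LogitAdjustLoss(torch.tensor([500., 100., 10.]))
```

Shifting the logits by the log-prior penalizes over-confident predictions on head classes, which is what makes the loss suitable for long-tailed data.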
Here we show a subset of the results to demonstrate that our implementation is reasonable.
Method | Backbone | Reported Result | Our Implementation |
---|---|---|---|
CE | ResNet-10 | 34.8 | 35.3 |
Decouple-cRT | ResNet-10 | 41.8 | 41.8 |
Decouple-LWS | ResNet-10 | 41.4 | 41.6 |
BalanceSoftmax | ResNet-10 | 41.8 | 41.4 |
CE | ResNet-50 | 41.6 | 43.2 |
LDAM-DRW* | ResNet-50 | 48.8 | 51.2 |
Decouple-cRT | ResNet-50 | 47.3 | 48.7 |
Decouple-LWS | ResNet-50 | 47.7 | 49.3 |
Method | Dataset | Reported Result | Our Implementation |
---|---|---|---|
CE | CIFAR100-LT | 39.1 | 40.3 |
LDAM-DRW | CIFAR100-LT | 42.04 | 42.9 |
LogitAdjust | CIFAR100-LT | 43.89 | 45.3 |
BalanceSoftmax$^{\dagger}$ | CIFAR100-LT | 45.1 | 46.47 |
- Python >= 3.7, < 3.9
- PyTorch >= 1.6
- tqdm (used in test.py)
- tensorboard >= 1.14 (for visualization)
- pandas
- numpy
CIFAR data will be downloaded automatically by the dataloader. We use the data in the same way as classifier-balancing. For ImageNet-LT and iNaturalist, please prepare the data in the data directory. ImageNet-LT can be found at this link. The iNaturalist data should be the 2018 version from this repo (note that it now requires payment to download). The annotations can be found here. Please organize the files as shown below:
```
data
├── cifar-100-python
│   ├── meta
│   ├── test
│   └── train
├── cifar-100-python.tar.gz
├── ImageNet_LT
│   ├── ImageNet_LT_open.txt
│   ├── ImageNet_LT_test.txt
│   ├── ImageNet_LT_train.txt
│   ├── ImageNet_LT_val.txt
│   ├── Tiny_ImageNet_LT_train.txt (Optional)
│   ├── Tiny_ImageNet_LT_val.txt (Optional)
│   ├── Tiny_ImageNet_LT_test.txt (Optional)
│   ├── test
│   ├── train
│   └── val
└── iNaturalist18
    ├── iNaturalist18_train.txt
    ├── iNaturalist18_val.txt
    └── train_val2018
```
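For reference, below is a minimal sketch of reading one of the split files above, assuming each line has the form `<image_path> <class_index>` (the format used by the classifier-balancing lists); the dataloaders in this repository handle this for you.

```python
def load_split(txt_path):
    """Return (paths, labels) from a split file with '<path> <label>' lines."""
    paths, labels = [], []
    with open(txt_path) as f:
        for line in f:
            if not line.strip():
                continue                          # skip blank lines
            path, label = line.rsplit(maxsplit=1) # label is the last token
            paths.append(path)
            labels.append(int(label))
    return paths, labels

# Usage:
# paths, labels = load_split("data/ImageNet_LT/ImageNet_LT_train.txt")
```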
To train a model:

```
python train.py -c path_to_config_file
```

For example, to train a model with LDAM Loss on CIFAR-100-LT:

```
python train.py -c configs/CIFAR-100/LDAMLoss.json
```
To run the second stage of decoupled training from stage-one checkpoints (the flag matches the classifier type, e.g., -crt for cRT and -lws for LWS):

```
python train.py -c path_to_config_file -crt path_to_stage_one_checkpoints
```

For example, to train a model with the LWS classifier on ImageNet-LT:

```
python train.py -c configs/ImageNet-LT/R50_LWS.json -lws path_to_stage_one_checkpoints
```
To test a checkpoint, place it in the same directory as its corresponding config file and run:

```
python test.py -r path_to_checkpoint
```
To resume training from a checkpoint:

```
python train.py -c path_to_config_file -r path_to_resume_checkpoint
```
Please see the PyTorch template that we use for more general usage of this project.
Setting fp16 in utils/util.py enables FP16 training. However, this feature is experimental: it may not work in all settings or with all models (only some models work; see the autograd-related code), and we do not plan to provide support for it, since it is outside our focus and exists only for faster training and lower memory usage. Please double-check your results if you use it. In our experiments, FP16 training did not reduce model accuracy, on either small datasets (CIFAR-LT) or large ones (ImageNet-LT, iNaturalist).
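For context, FP16 (mixed-precision) training of this kind typically looks like the torch.cuda.amp loop below. This is an illustrative sketch, not the exact code in utils/util.py; `model`, `loader`, `criterion`, and `optimizer` are assumed to be defined elsewhere.

```python
import torch

scaler = torch.cuda.amp.GradScaler()             # manages loss scaling
for images, target in loader:                    # loader assumed defined
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():              # run forward pass in FP16
        loss = criterion(model(images), target)
    scaler.scale(loss).backward()                # scale loss to avoid underflow
    scaler.step(optimizer)                       # unscale grads, then step
    scaler.update()                              # adjust the scale factor
```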
We use tensorboard as a visualization tool and log the accuracy of each class and of different class groups during training:

```
tensorboard --logdir path_to_dir
```
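For illustration, per-group accuracies of this kind can be logged roughly as below; `evaluate_groups` is a hypothetical helper, and the scalar tags may not match the ones this project actually writes.

```python
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter("saved/log/my_experiment")    # hypothetical log dir
for epoch in range(num_epochs):                      # num_epochs assumed defined
    # Hypothetical helper returning e.g. {"many": 0.62, "medium": 0.48, "few": 0.31}
    group_acc = evaluate_groups(model, val_loader)
    for name, acc in group_acc.items():
        writer.add_scalar(f"val_accuracy/{name}", acc, epoch)
writer.close()
```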
We also provide simple code to visualize the feature distribution with t-SNE and the calibration with reliability diagrams. Please check the parameters in plot_tsne.py and plot_ece.py, then run:

```
python plot_tsne.py
```

or

```
python plot_ece.py
```
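For background, a reliability diagram bins predictions by confidence, and the associated expected calibration error (ECE) can be computed as in the sketch below. plot_ece.py may use different binning; `probs` (N x C softmax outputs) and `labels` (N,) are assumed to be numpy arrays.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=15):
    conf = probs.max(axis=1)                     # predicted confidence
    correct = probs.argmax(axis=1) == labels     # prediction correctness
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (conf > lo) & (conf <= hi)        # samples in this bin
        if mask.any():
            gap = abs(correct[mask].mean() - conf[mask].mean())
            ece += mask.mean() * gap             # weight gap by bin population
    return ece
```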
This project is based on this PyTorch template. The template's README explains its functionality; we list the most frequently used features in this README.
This project is licensed under the MIT License. See LICENSE for more details. The parts described below retain their original licenses.
This project is mainly based on RIDE's codebase. In reproducing and organizing the code, we also referred to other excellent repositories, such as Decouple and LDAM.