MICS: Midpoint Interpolation to Learn Compact and Separated Representations for Few-Shot Class-Incremental Learning [Paper] [Supp]
Solang Kim, Yuho Jeong, Joon Sung Park, Sung Whan Yoon
In WACV 2024.
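For context, MICS trains with mixup-style midpoint samples synthesized between pairs of classes and supervises them with soft labels. The snippet below is only a rough, simplified sketch of that interpolation step under our own assumptions (the function name, shapes, and the plain two-class soft label are illustrative); see the paper and the code in this repository for the exact formulation.

```python
import torch
import torch.nn.functional as F

def midpoint_mix(x_a, x_b, y_a, y_b, num_classes, gamma=0.5):
    """Interpolate two mini-batches at (or near) the midpoint and build soft labels.

    gamma = 0.5 gives the exact midpoint; y_a and y_b are integer class indices.
    """
    x_mix = gamma * x_a + (1.0 - gamma) * x_b
    y_mix = gamma * F.one_hot(y_a, num_classes).float() + \
            (1.0 - gamma) * F.one_hot(y_b, num_classes).float()
    return x_mix, y_mix

# Toy usage with random tensors standing in for images from two different classes.
x_a, x_b = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)
y_a, y_b = torch.randint(0, 60, (8,)), torch.randint(0, 60, (8,))
x_mix, y_mix = midpoint_mix(x_a, x_b, y_a, y_b, num_classes=60)
print(x_mix.shape, y_mix.shape)  # torch.Size([8, 3, 32, 32]) torch.Size([8, 60])
```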
- Clone this repository: `git clone http://github.com/solang/mics.git`
- Install the required dependencies.
  - `conda create -y -n mics python=3.9`
  - `conda activate mics`
  - `bash install.sh`
- Recommended: check your CUDA version with `nvcc -V` and update the torch version in the `install.sh` script accordingly. You can find the compatible PyTorch build for your CUDA release at this link.
- Our code is tested on Ubuntu 18.04 with Python 3.9.5 and PyTorch 1.9.0. We used an NVIDIA RTX A5000 for mini-ImageNet (CUDA 10.1) and a GeForce RTX 3090 for CIFAR-100 and CUB-200-2011 (CUDA 11.1).
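After installation, a quick sanity check (plain PyTorch, not part of this repository) confirms which build you ended up with and whether it can see your GPU:

```python
import torch

# Report the installed PyTorch version, the CUDA toolkit it was built against,
# and whether a GPU is visible to this build.
print("torch:", torch.__version__)
print("built for CUDA:", torch.version.cuda)
print("GPU available:", torch.cuda.is_available())
```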
- Download the FSCIL benchmark datasets.
- CIFAR100: https://www.cs.toronto.edu/~kriz/cifar.html
- mini-ImageNet: There is no official website for mini-ImageNet. You can use the learn2learn Python package (see the sketch after this list) or the unofficial Google Drive links to download it.
- CUB-200-2011: https://www.vision.caltech.edu/datasets/cub_200_2011/
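For mini-ImageNet, a minimal download sketch using the learn2learn package is shown below; the destination directory is just an example, so point `root` at the dataset path your scripts expect.

```python
import learn2learn as l2l

# Download each split of mini-ImageNet to a local directory (example path).
for mode in ("train", "validation", "test"):
    l2l.vision.datasets.MiniImagenet(root="./data/miniimagenet", mode=mode, download=True)
```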
- Run the script for your chosen dataset: `cd scripts`, then `bash [dataset]-mics.sh`.
- Before executing the scripts, set your dataset path and pre-trained model path options in the scripts.
- You can also download our pre-trained weights: Link (base session weights)
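If you use the downloaded base-session weights, the sketch below shows one way to inspect and load them with plain PyTorch; the file name and checkpoint layout are assumptions, so match them to what the scripts actually expect.

```python
import torch

# Hypothetical file name; replace it with the path of the downloaded checkpoint.
checkpoint = torch.load("base_session_weights.pth", map_location="cpu")

# Depending on how the checkpoint is packaged, the weights may be the file itself
# or stored under a key such as "state_dict"; print the keys to find out.
print(list(checkpoint.keys())[:10] if isinstance(checkpoint, dict) else type(checkpoint))
# model.load_state_dict(checkpoint)  # or checkpoint["state_dict"], depending on packaging
```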
If you find our work useful, please cite:

@inproceedings{kim2024mics,
title={MICS: Midpoint Interpolation to Learn Compact and Separated Representations for Few-Shot Class-Incremental Learning},
author={Kim, Solang and Jeong, Yuho and Park, Joon Sung and Yoon, Sung Whan},
booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
year={2024}
}