
PyTorch implementation of "Online Hyperparameter Optimization for Class-Incremental Learning" (AAAI 2023 Oral)


Online Hyperparameter Optimization for Class-Incremental Learning


[Paper] [Project Page]

This repository contains the PyTorch implementation of the AAAI 2023 paper "Online Hyperparameter Optimization for Class-Incremental Learning" by Yaoyao Liu, Yingying Li, Bernt Schiele, and Qianru Sun. If you have any questions about this repository or the related paper, feel free to create an issue or send me an email.

Getting Started

To run the code in this repository, we recommend installing Python 3.6 and PyTorch 1.2.0 with Anaconda.

You may download Anaconda and read the installation instructions on the official website: https://www.anaconda.com/download/

Create a new environment and install PyTorch and torchvision in it:

conda create --name AANets-PyTorch python=3.6
conda activate AANets-PyTorch
conda install pytorch=1.2.0 -c pytorch
conda install torchvision -c pytorch

Then, you need to install the following packages using pip:

pip install tqdm scipy scikit-learn tensorboardX Pillow==6.2.2

Next, clone this repository and enter the folder online-hyperparameter-optimization:

git clone https://github.com/yaoyao-liu/online-hyperparameter-optimization.git
cd online-hyperparameter-optimization

Download the Datasets

CIFAR-100

CIFAR-100 will be downloaded automatically by torchvision when you run the experiments.
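For reference, the automatic download is just `torchvision.datasets.CIFAR100` with `download=True`. A minimal sketch of that mechanism (the `data` root directory here is an assumption; check the training scripts for the exact path they use):

```python
def download_cifar100(root="data"):
    """Fetch CIFAR-100 the same way torchvision does automatically.

    NOTE: root="data" is an assumed download directory; the training
    scripts may use a different path.
    """
    from torchvision.datasets import CIFAR100  # imported lazily

    train_set = CIFAR100(root=root, train=True, download=True)
    test_set = CIFAR100(root=root, train=False, download=True)
    # CIFAR-100 ships 50,000 training and 10,000 test images.
    return len(train_set), len(test_set)
```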

ImageNet-Subset

We created ImageNet-Subset following LUCIR. You may download the dataset using the following links:

File information:

File name: ImageNet-Subset.tar
Size: 15.37 GB
MD5: ab2190e9dac15042a141561b9ba5d6e9

Untar the downloaded file and put the folder seed_1993_subset_100_imagenet in the data folder.
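Given the archive's size, it is worth verifying the MD5 checksum before untarring. A small standard-library sketch (the file name and expected hash are taken from the file information above):

```python
import hashlib

EXPECTED_MD5 = "ab2190e9dac15042a141561b9ba5d6e9"  # from the file information above

def md5sum(path, chunk_size=1 << 20):
    """Compute a file's MD5 by streaming it in 1 MiB chunks,
    so the ~15 GB archive is never loaded into memory at once."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# After downloading:
# assert md5sum("ImageNet-Subset.tar") == EXPECTED_MD5
```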

Please note that ImageNet-Subset is derived from ImageNet, which may only be downloaded by researchers for non-commercial research and educational purposes. See the terms of ImageNet here.

Running Experiments

Running Experiments w/ LUCIR on CIFAR-100

python run_tfh_exp.py # Training from half
python run_tfs_exp.py # Training from scratch

We will update the code for other baselines later.

Citations

Please cite our paper if it is helpful to your work:

@article{Liu2023Online, 
  title   = {Online Hyperparameter Optimization for Class-Incremental Learning}, 
  author  = {Liu, Yaoyao and Li, Yingying and Schiele, Bernt and Sun, Qianru}, 
  journal = {Proceedings of the AAAI Conference on Artificial Intelligence}, 
  volume  = {37}, 
  number  = {7}, 
  year    = {2023}, 
  month   = {Jun.}, 
  pages   = {8906-8913},
  URL     = {https://ojs.aaai.org/index.php/AAAI/article/view/26070}, 
  DOI     = {10.1609/aaai.v37i7.26070}, 
}