This repository contains the code for the paper:
Fast Few-Shot Classification by Few-Iteration Meta-Learning
Ardhendu Shekhar Tripathi, Martin Danelljan, Radu Timofte, Luc Van Gool
ICRA 2021
Autonomous agents interacting with the real world need to learn new concepts efficiently and reliably. This requires learning in a low-data regime, which is a highly challenging problem. We address this task by introducing a fast optimization-based meta-learning method for few-shot classification. It consists of an embedding network, providing a general representation of the image, and a base learner module. The latter learns a linear classifier during inference through an unrolled optimization procedure. We design an inner learning objective composed of (i) a robust classification loss on the support set and (ii) an entropy loss, allowing transductive learning from unlabeled query samples. By employing an efficient initialization module and a Steepest Descent based optimization algorithm, our base learner predicts a powerful classifier within only a few iterations. Further, our strategy enables important aspects of the base learner objective to be learned during meta-training. To the best of our knowledge, this work is the first to integrate both induction and transduction into the base learner in an optimization-based meta-learning framework. We perform a comprehensive experimental analysis, demonstrating the speed and effectiveness of our approach on four few-shot classification datasets.
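For intuition only, below is a minimal, self-contained PyTorch sketch of the idea described above: a linear base learner is fitted on frozen embeddings by a few unrolled gradient steps on a support-set cross-entropy loss plus an entropy loss on the unlabeled query set. This is not the repository's implementation; the function name, zero initialization, fixed step size, entropy weight, and iteration count are illustrative assumptions (the actual method uses a learned initialization module and a Steepest Descent based step, and meta-learns parts of the objective).

# Minimal illustrative sketch (not the repository's implementation) of the base
# learner idea: a linear classifier fitted on frozen embeddings by a few unrolled
# gradient steps on support cross-entropy plus query entropy. The zero
# initialization, fixed step size, loss weight and iteration count are assumptions.
import torch
import torch.nn.functional as F

def unrolled_base_learner(z_support, y_support, z_query, n_way,
                          num_steps=5, step_size=0.1, entropy_weight=0.1):
    """Return linear weights W (n_way x dim) after a few unrolled descent steps."""
    W = torch.zeros(n_way, z_support.shape[1], requires_grad=True)
    for _ in range(num_steps):
        # (i) classification loss on the labelled support set
        loss = F.cross_entropy(z_support @ W.t(), y_support)
        # (ii) entropy of the predictions on the unlabeled query set (transduction)
        probs_q = F.softmax(z_query @ W.t(), dim=1)
        loss = loss - entropy_weight * (probs_q * torch.log(probs_q + 1e-8)).sum(dim=1).mean()
        # one unrolled descent step; the paper instead derives a steepest-descent
        # step length rather than using this fixed step size
        (grad,) = torch.autograd.grad(loss, W, create_graph=True)
        W = W - step_size * grad
    return W

# Toy 5-way episode with random 64-d "embeddings"
z_s, y_s, z_q = torch.randn(5, 64), torch.arange(5), torch.randn(75, 64)
W = unrolled_base_learner(z_s, y_s, z_q, n_way=5)
print((z_q @ W.t()).argmax(dim=1))  # transductive query predictions

Meta-training would then backpropagate an outer query-set loss through these unrolled steps into the embedding network, which is why create_graph=True is kept in the sketch.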
- Python 3.6+
- PyTorch 1.1.0+
- qpth 0.0.11+
- tqdm
- Clone this repository:
git clone https://github.com/4rdhendu/FIML.git
cd FIML
- Download and decompress the dataset files.
- For each dataset loader, specify the path to the corresponding dataset directory (an optional path check is sketched after this step). For example, in FIML/data/mini_imagenet.py, set:
_MINI_IMAGENET_DATASET_DIR = 'path/to/miniImageNet'
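As a quick, optional sanity check (not part of the repository; it only reuses the path you configured above), you can verify that the dataset directory exists before launching a long training run:

import os

# should match the value set in FIML/data/mini_imagenet.py
_MINI_IMAGENET_DATASET_DIR = 'path/to/miniImageNet'

if not os.path.isdir(_MINI_IMAGENET_DATASET_DIR):
    raise FileNotFoundError('miniImageNet directory not found: ' + _MINI_IMAGENET_DATASET_DIR)
print('Found dataset directory:', _MINI_IMAGENET_DATASET_DIR)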
- To train FIML on the 5-way miniImageNet benchmark with a ResNet backbone:
python train.py --gpu 0,1,2,3 --save-path "./experiments/miniImageNet_FIML" --train-shot 15 \
--head FIML --network ResNet_DC --dataset miniImageNet --eps 0.1 --learn-rate 0.1 --val-shot 5
- To fine-tune the shot-specific hyperparameters of the trained model:
python train_finetune.py --gpu 0,1,2,3 --load "./experiments/miniImageNet_FIML/best_model.pth" --save-path "./experiments/miniImageNet_FIML" --train-shot 5 \
--head FIML --network ResNet_DC --dataset miniImageNet --eps 0.1 --val-shot 5
- To test FIML on the 5-way, 5-shot miniImageNet benchmark:
python test.py --gpu 0,1,2,3 --load ./experiments/miniImageNet_FIML/best_model.pth --episode 1000 \
--way 5 --shot 5 --query 15 --head FIML --network ResNet_DC --dataset miniImageNet
All results reported in the paper can be reproduced by varying the options of the run scripts above (e.g., --dataset, --network, --shot, and --way); an example variation is given below.
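For instance, assuming the flags behave exactly as in the test command above, the 1-shot miniImageNet setting is evaluated by changing only --shot:
python test.py --gpu 0,1,2,3 --load ./experiments/miniImageNet_FIML/best_model.pth --episode 1000 \
--way 5 --shot 1 --query 15 --head FIML --network ResNet_DC --dataset miniImageNet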
This code is based on the implementation of MetaOptNet.
If you use this code for your research, please cite our paper:
@inproceedings{tripathi2021fast,
  title={Fast Few-Shot Classification by Few-Iteration Meta-Learning},
  author={Tripathi, Ardhendu Shekhar and Danelljan, Martin and Timofte, Radu and Van Gool, Luc},
  booktitle={ICRA},
  year={2021}
}