This repository contains the PyTorch implementation of *Matching Feature Sets for Few-shot Image Classification*, accepted at the Conference on Computer Vision and Pattern Recognition (CVPR) 2022 (paper, poster). The paper introduces a set-based representation that intrinsically builds a richer representation of images from the base classes, which subsequently transfers better to the few-shot classes. To do so, we propose to adapt existing feature extractors to instead produce *sets* of feature vectors from images. Our approach, dubbed SetFeat, embeds shallow self-attention mechanisms inside existing encoder architectures. The attention modules are lightweight, so our method yields encoders with approximately the same number of parameters as their original versions. During both training and inference, a set-to-set matching metric is used to perform image classification.
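To make the set-to-set matching idea concrete, here is a minimal, illustrative sketch of one common set-to-set metric (a mean-of-minimum-distances matching between two sets of feature vectors). The function names are hypothetical and this is not necessarily the exact metric used in the paper; see the code and paper for the actual formulation.

```python
import torch


def set_distance(query_set, support_set):
    """Distance between two sets of feature vectors.

    query_set: (n_q, d) tensor, support_set: (n_s, d) tensor.
    Uses a mean-of-minima matching: each query vector is matched to its
    closest support vector, and the matched distances are averaged.
    """
    pairwise = torch.cdist(query_set, support_set)   # (n_q, n_s)
    return pairwise.min(dim=1).values.mean()


def classify(query_set, class_sets):
    """Assign the query set to the class whose support set is closest."""
    dists = torch.stack([set_distance(query_set, s) for s in class_sets])
    return int(dists.argmin())
```

A query image is then classified by comparing its feature set against the per-class support sets with this metric.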
In the evaluations, we used CUDA 11.0 with the following dependencies:
- Python 3.8.10;
- Numpy 1.21.2;
- PyTorch 1.9.1+cu111;
- Torchvision 0.10.1;
- PIL 7.0.0;
- Einops 0.3.0.
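A quick way to compare your environment against the versions listed above is a small version report. This helper is not part of the repository; it simply imports each listed package (if installed) and prints its version:

```python
import importlib
import sys


def report_versions():
    """Return the versions of the dependencies listed above (or 'not installed')."""
    deps = ["numpy", "torch", "torchvision", "PIL", "einops"]
    info = {"python": sys.version.split()[0]}
    for name in deps:
        try:
            mod = importlib.import_module(name)
            info[name] = getattr(mod, "__version__", "unknown")
        except ImportError:
            info[name] = "not installed"
    return info


if __name__ == "__main__":
    for name, version in report_versions().items():
        print(f"{name:12s} {version}")
```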
- For dataset and backbone specifications, please see Tables 1 and 2 of our supplementary material.
- Download the CUB dataset from www.vision.caltech.edu
- Copy the dataset to "./benchmarks/cub/"
- In the "./benchmarks/cub/" directory, run `cub_traintestval.py`: `python cub_traintestval.py`. Feel free to copy your own dataset into the benchmarks directory and specify its path in `args.py`.
- Go to the main directory and run `main.py`: `python main.py`
Please visit the project webpage for more information.
The code runs with the SetFeat12* backbone by default. To use SetFeat4-64 instead, set the `-backbone` argument to SetFeat4 in `args.py`.
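As a sketch of how such a backbone flag might be defined with `argparse` (the option name and choices here are illustrative; check `args.py` for the exact definitions):

```python
import argparse

# illustrative sketch of a backbone option as it might appear in args.py;
# the real option names and defaults in the repository may differ
parser = argparse.ArgumentParser(description="SetFeat few-shot classification")
parser.add_argument("-backbone", type=str, default="SetFeat12",
                    choices=["SetFeat12", "SetFeat4"],
                    help="feature-set encoder: SetFeat12* (default) or SetFeat4-64")

args = parser.parse_args([])  # parse defaults; use sys.argv in practice
print(args.backbone)
```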
@inproceedings{afrasiyabi2022matching,
  title={Matching Feature Sets for Few-Shot Image Classification},
  author={Afrasiyabi, Arman and Larochelle, Hugo and Lalonde, Jean-Fran{\c{c}}ois and Gagn{\'e}, Christian},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={9014--9024},
  year={2022}
}