HPA

Official PyTorch implementation of "Holistic Prototype Activation for Few-Shot Segmentation" (IEEE TPAMI 2022). MIT License.

Holistic Prototype Activation for Few-Shot Segmentation

This repo contains the code for our IEEE TPAMI 2022 paper "Holistic Prototype Activation for Few-Shot Segmentation" by Gong Cheng, Chunbo Lang, and Junwei Han.

📋 Note

Please refer to our BAM repository for the latest training/testing scripts. HPA can also serve as a stronger meta-learner within the state-of-the-art BAM framework, with potential for further improvement.

Dependencies

  • Python 3.6
  • PyTorch 1.3.1
  • CUDA 9.0
  • torchvision 0.4.2
  • tensorboardX 2.1

Datasets

Usage

  1. Download the prior prototypes of base categories from our Google Drive and put them under HPA/initmodel/prototypes.
  2. Download the pre-trained backbones from here.
  3. Change configuration via the .yaml files in HPA/config, then run the .sh scripts for training and testing.
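The steps above can be sketched as follows. The `HPA/initmodel/prototypes` and `HPA/config` paths come from the steps above; the commented filenames and script name are illustrative only, not the repository's actual file names:

```shell
# Step 1: create the expected directory for the downloaded prior prototypes.
mkdir -p HPA/initmodel/prototypes

# After downloading, the tree should look roughly like:
#   HPA/initmodel/prototypes/<prototype files from Google Drive>
#   HPA/initmodel/<pre-trained backbone weights>       (step 2)
#   HPA/config/<dataset>.yaml   <- edit hyperparameters here (step 3)
#
# Training/testing are then launched via the provided .sh scripts, e.g.:
#   sh train.sh   # script name illustrative; see the repo's scripts
```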

To-Do List

  • Support different backbones
  • Support various annotations for training/testing
  • Zero-Shot Segmentation (ZSS)
  • FSS-1000 dataset
  • Multi-GPU training

References

This repo is built based on PFENet and DANet. Thanks for their great work!

BibTeX

If you find our work and this repository useful, please consider giving it a star ⭐ and a citation 📚.

@article{lang2022hpa,
  title={Holistic Prototype Activation for Few-Shot Segmentation},
  author={Cheng, Gong and Lang, Chunbo and Han, Junwei},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2022},
}