SUPMER

PyTorch implementation of SUPMER: Self-supervised Meta-Prompt Learning with Meta-Gradient Regularization for Few-shot Generalization (EMNLP 2023 Findings)


Self-supervised Meta-Prompt Learning with Meta-Gradient Regularization for Few-shot Generalization

Kaihang Pan1, Juncheng Li1†, Hongye Song2, Jun Lin2, Xiaozhong Liu3, Siliang Tang1

1Zhejiang University, 2DAMO Academy, Alibaba Group, 3Worcester Polytechnic Institute

†Corresponding Author

This repo contains the PyTorch implementation of Self-supervised Meta-Prompt Learning with Meta-Gradient Regularization for Few-shot Generalization, accepted to the Findings of EMNLP 2023.

Installation

This repo is built on the codebase of PERFECT. Please refer to facebookresearch/perfect for instructions on setting up the Python environment.

Meta-trained checkpoints

Run the code

To run meta-training and downstream prompt-tuning, refer to the scripts provided at scripts/meta-train.sh and scripts/prompt-tuning.sh.
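A minimal sketch of the two-stage workflow the scripts implement; exact flags and paths live inside the scripts themselves, and the checkpoint directory name below is an assumption, not part of the repo:

```shell
# Stage 1 (sketch): self-supervised meta-training of the soft prompt.
# See scripts/meta-train.sh for the actual arguments.
bash scripts/meta-train.sh

# Stage 2 (sketch): few-shot prompt-tuning on a downstream task,
# initialized from the meta-trained prompt. The --checkpoint path here
# is hypothetical; adjust it to where stage 1 wrote its output.
bash scripts/prompt-tuning.sh  # e.g. pointing at checkpoints/meta-trained/
```

Run the meta-training stage once, then reuse its checkpoint across all downstream few-shot tasks.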

Acknowledgment

Our project builds on the following repositories:

  • PERFECT: Prompt-free and Efficient Few-shot Learning with Language Models

  • PPT: Pre-trained Prompt Tuning for Few-shot Learning

Citation

If you find this work useful, please consider citing our paper:

@article{pan2023self,
  title={Self-supervised Meta-Prompt Learning with Meta-Gradient Regularization for Few-shot Generalization},
  author={Pan, Kaihang and Li, Juncheng and Song, Hongye and Lin, Jun and Liu, Xiaozhong and Tang, Siliang},
  journal={arXiv preprint arXiv:2303.12314},
  year={2023}
}