Dynamic Global Memory for Document-level Argument Extraction

Code and data for the ACL 2022 paper (link).


Dependencies

  • pytorch=1.6
  • transformers=3.1.0
  • pytorch-lightning=1.0.6
  • spacy=3.0 # conflicts with transformers
  • pytorch-struct=0.4
  • sentence-transformers=2.1.0
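The pins above can be collected into a pip requirements file (a sketch; the PyPI package names, e.g. `torch` for pytorch and `torch-struct` for pytorch-struct, are assumed):

```
torch==1.6.0
transformers==3.1.0
pytorch-lightning==1.0.6
spacy==3.0.0  # note: may conflict with transformers, as flagged above
torch-struct==0.4
sentence-transformers==2.1.0
```

Install with `pip install -r requirements.txt` into a fresh virtual environment to avoid version clashes.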


Datasets

  • WikiEvents (included in this repo)


Training and Evaluation

  • Normal data setting

    Train: ./scripts/train_kairos.sh
    Test: ./scripts/test_kairos.sh

  • Adversarial data setting

    Train: ./scripts/train_kairos_adv.sh
    Test: ./scripts/test_kairos_adv.sh


Citation

If you use our code or data/outputs, please cite:

  @inproceedings{du2022dynamic,
    author = {Du, Xinya and Li, Sha and Ji, Heng},
    title = {Dynamic Global Memory for Document-level Argument Extraction},
    booktitle = {Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics},
    year = {2022},
  }