Code and datasets for our paper "Cross-stitching Text and Knowledge Graph Encoders for Distantly Supervised Relation Extraction" (EMNLP 2022)
Install the dependencies (consider using a virtual environment):

```bash
pip install -r requirements.txt
```
Then, follow the instructions to install apex.
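As a sketch, one common way to install apex from source is shown below; check the NVIDIA apex repository for the currently recommended command (e.g., flags for building the CUDA extensions):

```bash
# Clone and install NVIDIA apex (Python-only build; see the apex README
# for the extra flags needed to build the C++/CUDA extensions)
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --disable-pip-version-check --no-cache-dir ./
```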
We provide preprocessed versions of NYT10 and Medline21. Please download them from here: NYT10 (i.e., nyt.zip) and Medline21 (i.e., bio.zip), and unzip them under data/.
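After unzipping, a quick sanity check can confirm the data landed in the right place. The helper below is hypothetical (not part of this repo), and it assumes the archives unpack to folders named nyt/ and bio/ under data/, inferred from the zip file names:

```python
import os

def missing_datasets(root="data", expected=("nyt", "bio")):
    """Return the expected dataset folder names not found under root."""
    return [d for d in expected if not os.path.isdir(os.path.join(root, d))]

if __name__ == "__main__":
    missing = missing_datasets()
    if missing:
        print("Missing dataset folders:", ", ".join(missing))
    else:
        print("All dataset folders found under data/")
```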
Please download our pre-trained KG encoders from: NYT10 KG enc. and Medline21 KG enc., and put them under code_xbe/xbe/ckpt_kg/
Run the following script to train XBE on NYT10:

```bash
cd code_xbe/xbe/
bash train_xbe_nyt.sh
```
Run the following script to train XBE on Medline21:

```bash
cd code_xbe/xbe/
bash train_xbe_bio.sh
```
If you want to use our pre-trained XBE models directly, you can download them from: NYT10 XBE and Medline21 XBE, and put them under save/nyt/ and save/bio/, respectively.
Please run the following scripts to test the trained XBE model on the NYT10 and Medline21 datasets, respectively:

```bash
cd code_xbe/xbe/
bash test_xbe_nyt.sh
```

```bash
cd code_xbe/xbe/
bash test_xbe_bio.sh
```
- Run the following scripts to train and test the XBE model without the cross-stitch mechanism:

  ```bash
  cd code_xbe/xbe/
  bash train_test_xbe_nyt_wo_xstitch.sh
  bash train_test_xbe_bio_wo_xstitch.sh
  ```
- Run the following scripts for the XBE model without the text encoder:

  ```bash
  cd code_xbe/xbe/
  bash train_test_xbe_nyt_wo_tx.sh
  bash train_test_xbe_bio_wo_tx.sh
  ```
- Run the following scripts for the XBE model without the KG encoder:

  ```bash
  cd code_xbe/xbe/
  bash train_test_xbe_nyt_wo_kg.sh
  bash train_test_xbe_bio_wo_kg.sh
  ```
- Run the following scripts for the XBE model with a frozen KG encoder:

  ```bash
  cd code_xbe/xbe/
  bash train_test_xbe_nyt_freeze_kg.sh
  bash train_test_xbe_bio_freeze_kg.sh
  ```
- Run the following scripts for the XBE model with a randomly initialized KG encoder:

  ```bash
  cd code_xbe/xbe/
  bash train_test_xbe_nyt_random_kg.sh
  bash train_test_xbe_bio_random_kg.sh
  ```
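The ablation scripts above follow a common naming pattern, so a small wrapper can enumerate them. This is a hypothetical dry-run helper (not part of the repo) that prints the commands for one dataset; pipe its output to bash from inside code_xbe/xbe/ to actually run them:

```bash
# List the ablation commands for a dataset ("nyt" or "bio") without running them.
# Assumes the train_test_xbe_* scripts from this repo exist in the current directory.
list_ablation_scripts() {
  local dataset="$1"
  for variant in wo_xstitch wo_tx wo_kg freeze_kg random_kg; do
    echo "bash train_test_xbe_${dataset}_${variant}.sh"
  done
}

list_ablation_scripts nyt
```

Example: `list_ablation_scripts bio | bash` would run all five Medline21 ablations in sequence.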