This project has adopted the Microsoft Open Source Code of Conduct.
Resources:
- Microsoft Open Source Code of Conduct
- Microsoft Code of Conduct FAQ
- Contact opencode@microsoft.com with questions or concerns
Source code for the paper "Transformer-XH: Multi-evidence Reasoning with Extra Hop Attention" (ICLR 2020).
First, run `python setup.py develop` to install the required dependencies for Transformer-XH. Also install apex (for distributed training) following the official documentation here.
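The setup above can be sketched as the following one-time sequence. The apex clone and install commands are assumptions based on NVIDIA's standard instructions, not part of this repository:

```shell
# Install Transformer-XH dependencies in development mode.
python setup.py develop

# Assumed apex install steps (check the official apex docs for your CUDA setup);
# apex is only needed for distributed / fp16 training.
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --no-cache-dir ./
cd ..
```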
You can run the download script: `bash download.sh`.
For HotpotQA, we provide the processed graph (Transformer-XH) input here. After downloading, unzip it and put it into the `./data` folder. We also provide a trained model here; unzip the downloaded model and put it into the `./experiments` folder.
Similarly, for FEVER we provide the processed graph here and the trained model here.
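After unzipping the downloads, the layout described above can be sanity-checked with a small sketch (folder names taken from the instructions; `mkdir -p` is included only so the check runs standalone):

```shell
# Processed graphs belong in ./data, trained checkpoints in ./experiments.
mkdir -p data experiments   # no-op if the unzipped folders are already in place
for d in data experiments; do
  [ -d "$d" ] && echo "ok: ./$d exists"
done
```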
Use `hotpot_train.sh` for training on the HotpotQA task and `hotpot_eval.sh` for evaluation (fp16 training by default).
Similarly, use `fever_train.sh` for training on the FEVER task and `fever_eval.sh` for evaluation (fp16 training by default).
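The full train-then-evaluate workflow can be sketched as a dry run over the four scripts named above (the `echo` only prints the order; replace it with a real `bash` invocation to execute them):

```shell
# Dry run: print each repo script in the order it would be invoked.
for script in hotpot_train.sh hotpot_eval.sh fever_train.sh fever_eval.sh; do
  echo "would run: bash $script"
done
```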
If you have questions, suggestions, or bug reports, please email chenz@cs.umd.edu and/or Chenyan.Xiong@microsoft.com.