P2_WebNLG2020

This is the GitHub repo for our paper "P2: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation" by Qipeng Guo, Zhijing Jin, Ning Dai, Xipeng Qiu, Xiangyang Xue, David Wipf, and Zheng Zhang.

Our model achieved the #1 performance on the English track of the WebNLG 2020 Challenge at the INLG 2020 Workshop.

Model Introduction

Our P2 model consists of two steps:

1. Plan: order the input knowledge-graph triples into a content plan.
2. Pretrain: generate the text from the planned triples with a large pretrained sequence-to-sequence model.
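A minimal sketch of such a plan-then-generate pipeline (the function names, the sort-based stand-in planner, and the linearization tokens below are illustrative assumptions, not the authors' actual implementation):

```python
from typing import List, Tuple

Triple = Tuple[str, str, str]  # (subject, relation, object)

def plan(triples: List[Triple]) -> List[Triple]:
    # Step 1 (plan): order the input triples. A real planner predicts an
    # order; as a hypothetical stand-in we sort by subject, then relation.
    return sorted(triples, key=lambda t: (t[0], t[1]))

def linearize(triples: List[Triple]) -> str:
    # Flatten the planned triples into a sequence that a pretrained
    # seq2seq model (step 2) would take as input.
    return " ".join(f"<S> {s} <P> {p} <O> {o}" for s, p, o in triples)

triples = [("Alan_Turing", "field", "Computer_Science"),
           ("Alan_Turing", "birthPlace", "London")]
print(linearize(plan(triples)))
```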

Codes

Run run.sh for training. fix_nonenglish.py is a post-processing script that maps characters back to their original non-English forms.
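The exact mapping used by fix_nonenglish.py is not shown here; a minimal sketch of this kind of post-processing step, with a purely hypothetical mapping table:

```python
# Hypothetical placeholder-to-character table; the real fix_nonenglish.py
# defines its own mapping.
CHAR_MAP = {
    "a'": "á",
    "e'": "é",
    "n~": "ñ",
}

def restore_non_english(text: str) -> str:
    """Map ASCII placeholder sequences back to the original characters."""
    for placeholder, original in CHAR_MAP.items():
        text = text.replace(placeholder, original)
    return text

print(restore_non_english("Espan~a"))
```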

Our model output on WebNLG 2020 test set is available at output.txt.

If you have any questions, please feel free to email the first author, Qipeng Guo, at qpguo16@fudan.edu.cn.

Citation

@article{guo2020p2,
  title={P2: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation},
  author={Guo, Qipeng and Jin, Zhijing and Dai, Ning and Qiu, Xipeng and Xue, Xiangyang and Wipf, David and Zhang, Zheng},
  year={2020}
}