Distill-BERT-Textgen

Research code for the ACL 2020 paper "Distilling Knowledge Learned in BERT for Text Generation".

Primary language: Python. License: MIT.
