laihuiyuan/pre-trained-formality-transfer
Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer (ACL 2021)
Python · MIT License