Training script of NLL.
Closed this issue · 2 comments
Chunngai commented
Hello. Thanks for your work and for kindly releasing the code. Could you also provide the fine-tuning script for NLL, besides the ones for CL- and CL?
MichaelCaohn commented
Hi,
Sorry, I did not save the fine-tuning script for NLL. As far as I can recall:
For GEC-BART, we used the fine-tuning script provided at https://github.com/Katsumata420/generic-pretrained-GEC/blob/master/BART-GEC/train.sh , but to fit on a V100 GPU, we changed MAX_TOKENS from 4000 to 1400 and UPDATE_FREQ from 1 to 4.
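If it helps, the change described above would look roughly like this in train.sh. This is a hypothetical sketch, not the actual saved script; the variable names come from the linked repository, and the comments explain the memory tradeoff: a smaller per-step batch combined with gradient accumulation keeps the effective batch size in a similar range.

```shell
# Hypothetical excerpt of BART-GEC's train.sh, adjusted to fit a V100 GPU.
MAX_TOKENS=1400   # was 4000; fewer tokens per step lowers peak GPU memory
UPDATE_FREQ=4     # was 1; accumulate gradients over 4 steps to compensate

# Effective tokens per optimizer update = MAX_TOKENS * UPDATE_FREQ
echo "effective batch (tokens): $((MAX_TOKENS * UPDATE_FREQ))"
```

With these values the effective batch is 1400 * 4 = 5600 tokens per update, versus 4000 * 1 = 4000 in the original script, so the training dynamics stay roughly comparable while each forward/backward pass uses far less memory.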
For GEC-PD, we reported the NLL result from the CWEB dataset paper (https://aclanthology.org/2020.emnlp-main.680.pdf).
Chunngai commented
Got it. Thank you!