zhuchen03/FreeLB

Could you add some comments in the code?

Opened this issue · 1 comment

It's hard to understand the code, including the bash shell scripts.

Hi Pengbo,

Sorry for not writing enough comments. I just added comments on the hyperparameters used in fairseq-RoBERTa/launch/FreeLB/mnli-fp32-clip.sh and huggingface-transformers/launch/run_glue.sh, so that you can start reading the code from these scripts...

The fairseq code is more convoluted, but the Huggingface transformers version should be much easier to read. The algorithm is entirely contained in huggingface-transformers/examples/run_glue_freelb.py, plus some modifications for the dropout mask in the ALBERT model. The fairseq code also includes our implementations of FreeAT and YOPO, but it will take more time to read.
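In case it helps while the comments are still sparse, here is a minimal sketch of what one FreeLB training step looks like, following the description in the paper rather than the exact code in run_glue_freelb.py. The function name freelb_step, the batch dictionary keys, and the default hyperparameter values are illustrative assumptions; the real script wires the same loop into its own training loop with gradient clipping, fp16, etc.

```python
import torch

def freelb_step(model, batch, optimizer, adv_steps=3, adv_lr=1e-1,
                adv_init_mag=2e-2, adv_max_norm=0.0):
    """One FreeLB-style update: a hedged sketch, not the repo's exact code.

    Assumes a HuggingFace-style model that accepts `inputs_embeds`,
    `attention_mask` and `labels`, and returns the loss as its first output.
    """
    input_ids = batch["input_ids"]
    attention_mask = batch["attention_mask"]
    labels = batch["labels"]

    embed_layer = model.get_input_embeddings()
    with torch.no_grad():
        init_embeds = embed_layer(input_ids)   # only used to shape/scale the init

    # Random init of the perturbation inside a small ball, masked to real tokens.
    if adv_init_mag > 0:
        lengths = attention_mask.sum(1, keepdim=True).unsqueeze(-1).float()
        delta = torch.zeros_like(init_embeds).uniform_(-1, 1) * attention_mask.unsqueeze(-1)
        delta = delta * adv_init_mag / torch.sqrt(lengths * init_embeds.size(-1))
    else:
        delta = torch.zeros_like(init_embeds)

    optimizer.zero_grad()
    for _ in range(adv_steps):
        delta.requires_grad_()
        embeds = embed_layer(input_ids)        # recomputed so each backward has a fresh graph
        outputs = model(inputs_embeds=embeds + delta,
                        attention_mask=attention_mask, labels=labels)
        loss = outputs[0] / adv_steps          # average parameter gradients over the ascent steps
        loss.backward()                        # accumulates grads on both the parameters and delta

        # Gradient *ascent* on the perturbation with a normalized step, then detach.
        grad = delta.grad.detach()
        norm = grad.view(grad.size(0), -1).norm(dim=1).clamp_min(1e-8).view(-1, 1, 1)
        delta = (delta + adv_lr * grad / norm).detach()
        if adv_max_norm > 0:                   # optional projection back into an L2 ball
            d_norm = delta.view(delta.size(0), -1).norm(dim=1).view(-1, 1, 1)
            delta = delta * (adv_max_norm / d_norm).clamp(max=1.0)

    optimizer.step()                           # single update with the accumulated gradients
```

Dividing the loss by adv_steps is what makes the extra ascent steps "free" in FreeLB: the same backward passes that craft the perturbation also accumulate (and effectively average) the parameter gradients, so a single optimizer step uses gradients from all K adversarial views of the batch.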

I will add more comments to the code soon!