SelennLamson/AITextGenerator

Clarification of GPT2_BLOCK_SIZE


Hello authors, first of all, really interesting paper. It's quite a creative approach, making use of the various contexts to do meaningful text generation.

I was wondering about utils.py, where you set GPT2_BLOCK_SIZE to 1020 even though the maximum block size GPT-2 allows is 1024. May I ask the reason for choosing 1020?
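For context, here's roughly how I read the constant. This is only my own sketch, assuming GPT2_BLOCK_SIZE caps the tokenized input length; the checkpoint name and the truncation call below are my illustration, not code from the repo:

```python
from transformers import GPT2Config, GPT2Tokenizer

GPT2_BLOCK_SIZE = 1020  # value from utils.py

# GPT-2's positional-embedding limit is 1024 tokens.
config = GPT2Config.from_pretrained("gpt2")
print(config.n_positions)  # 1024

# My guess: capping inputs at 1020 leaves 1024 - 1020 = 4 positions free,
# e.g. for special tokens wrapped around the text -- an assumption on my
# part, not something stated in the repo.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
ids = tokenizer.encode(
    "some long input text", max_length=GPT2_BLOCK_SIZE, truncation=True
)
assert len(ids) <= GPT2_BLOCK_SIZE
```

If it's headroom for special tokens like that, a short comment in utils.py would clear it up.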

P.S. I'm intending to write a new finetuning script making use of the new transformers.Trainer class (a sketch of what I have in mind is below), and will submit a pull request if you are interested :)
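To give an idea, here's a minimal sketch of such a script. The dataset, output path, and hyperparameters are placeholders of mine, assuming a standard causal-LM finetuning setup rather than your exact pipeline:

```python
import torch
from transformers import (
    DataCollatorForLanguageModeling, GPT2LMHeadModel, GPT2Tokenizer,
    Trainer, TrainingArguments,
)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Placeholder data; the real script would load the project's corpus.
texts = ["First training example.", "Second training example."]
enc = tokenizer(texts, truncation=True, max_length=1020, padding=True)

class ToyDataset(torch.utils.data.Dataset):
    def __len__(self):
        return len(enc["input_ids"])

    def __getitem__(self, i):
        return {k: v[i] for k, v in enc.items()}

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="finetune-out",  # placeholder output path
        num_train_epochs=1,
        per_device_train_batch_size=2,
    ),
    # mlm=False makes the collator emit causal-LM labels (the model
    # shifts them internally when computing the loss).
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
    train_dataset=ToyDataset(),
)
trainer.train()
```

Since Trainer handles batching, optimization, and checkpointing, the custom training loop in the current script could shrink considerably.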