Why are only padding tokens generated after a period of training, and no words?
greens007 opened this issue · 0 comments
greens007 commented
Hi, Lisa
I am trying to reproduce your work, but I have run into some trouble: after a period of training, the model generates only padding tokens and no words. Have you ever encountered this before? Looking forward to your reply!
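One common cause of this symptom (an assumption on my part, not something confirmed in this thread) is that the training loss is computed over padding positions, so the model learns that emitting pad tokens is the cheapest output. Hugging Face-style trainers skip positions whose label is `-100`. A minimal sketch in plain Python, assuming a pad token id of 0 (the actual id depends on your tokenizer):

```python
PAD_TOKEN_ID = 0      # assumed pad id; check your tokenizer's actual value
IGNORE_INDEX = -100   # conventional "ignore" label for cross-entropy loss

def mask_pad_labels(labels, pad_token_id=PAD_TOKEN_ID, ignore_index=IGNORE_INDEX):
    """Replace pad-token positions with ignore_index so they carry no loss."""
    return [ignore_index if tok == pad_token_id else tok for tok in labels]

# Example: a batch row padded with 0s after the real tokens.
labels = [101, 2023, 2003, 102, 0, 0, 0]
print(mask_pad_labels(labels))  # → [101, 2023, 2003, 102, -100, -100, -100]
```

If your labels are not masked this way (or via an equivalent attention/loss mask), that would be the first thing I would check.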