facebookresearch/CodeGen

Bug in epoch calculation

dineshkh opened this issue · 0 comments

At line 1483 of codegen_sources/model/src/trainer.py, the code is

self.n_sentences += params.batch_size

I think it should be

self.n_sentences += len1.size(0)

since the actual number of sentences in a batch (len1.size(0)) can be smaller than params.batch_size.

With the bug above, the notion of one epoch becomes wrong because of the check at the following line:

while trainer.n_sentences < trainer.epoch_size:
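A toy simulation of that loop illustrates the effect. The numbers and the helper `sentences_per_epoch` are hypothetical, not taken from the repo; they just assume batches can really hold fewer sentences than `params.batch_size` (e.g. a short final batch or short length buckets):

```python
PARAM_BATCH_SIZE = 32   # stands in for params.batch_size
EPOCH_SIZE = 100        # stands in for trainer.epoch_size

def sentences_per_epoch(actual_batch_len, count_with_param):
    """Run one epoch loop; return how many sentences were really processed."""
    n_sentences = 0
    processed = 0
    while n_sentences < EPOCH_SIZE:            # the epoch check in the training loop
        processed += actual_batch_len          # sentences actually trained on
        # buggy: count the configured size; fixed: count the real batch length
        n_sentences += PARAM_BATCH_SIZE if count_with_param else actual_batch_len
    return processed

# Suppose every batch really contains only 10 sentences:
print(sentences_per_epoch(10, count_with_param=True))   # → 40  (epoch ends early)
print(sentences_per_epoch(10, count_with_param=False))  # → 100 (matches epoch_size)
```

With the buggy counter the loop exits after only 40 real sentences while believing it saw 128, so an "epoch" covers far less data than `epoch_size` promises; counting `len1.size(0)` keeps the two in agreement.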