nlpyang/BertSum

default batch_size is 3000, I don't quite understand, why so huge?

yongzhuo opened this issue · 1 comment


The batch_size here is not batch size in the usual sense (number of examples). It refers to the number of tokens in a batch, so 3000 means roughly 3000 tokens per batch, not 3000 documents. See the `batch` method in data_loader.py for details.
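To illustrate the idea, here is a minimal sketch of token-count batching: examples are accumulated until adding the next one would push the total token count past the cap. This is a simplified illustration of the concept, not the repo's actual `batch` implementation (which also handles padding and sorting); the function name `batch_by_tokens` is made up for this example.

```python
def batch_by_tokens(examples, max_tokens=3000):
    """Group examples into batches capped by total token count.

    Illustrative sketch of token-based batching (not BertSum's
    actual `batch` method). Each example is a list of token ids.
    """
    batch, batch_tokens = [], 0
    for ex in examples:
        n = len(ex)
        # Flush the current batch if this example would exceed the cap.
        if batch and batch_tokens + n > max_tokens:
            yield batch
            batch, batch_tokens = [], 0
        batch.append(ex)
        batch_tokens += n
    if batch:
        yield batch
```

With a 3000-token cap, a batch might hold one long document or many short ones, which keeps GPU memory usage roughly constant regardless of document length.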