distributed training dataloader setting
wjlyclll opened this issue · 2 comments
wjlyclll commented
Thanks for publishing this useful work! May I ask a question: during distributed training, do you keep the training data on different GPUs at the same aspect ratio within one iteration (i.e., do the BucketManagers in different processes share the same seed)? Hope for your reply ~~
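For context, here is a minimal sketch of the behavior the question is asking about: if every process seeds its bucket-selection RNG identically, the bucket (aspect ratio) chosen at each iteration is the same on all ranks, even though each rank may draw different samples within that bucket. The bucket list, function names, and seeding scheme below are illustrative assumptions, not the repo's actual `BucketManager` API.

```python
import random

# Hypothetical aspect-ratio buckets (width, height) for illustration only.
BUCKETS = [(512, 512), (640, 384), (384, 640)]

def bucket_schedule(seed, num_iters):
    """Return a deterministic per-iteration bucket choice.

    Because the RNG is seeded with the same value on every rank,
    all ranks see the identical sequence of buckets.
    """
    rng = random.Random(seed)
    return [rng.choice(BUCKETS) for _ in range(num_iters)]

# Two "ranks" constructed with the same seed agree on the bucket
# used at every iteration:
rank0 = bucket_schedule(seed=42, num_iters=5)
rank1 = bucket_schedule(seed=42, num_iters=5)
assert rank0 == rank1
```

With different seeds per rank, the schedules would diverge and ranks could train on mismatched resolutions in the same step, which is presumably what the question is trying to rule out.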