anosorae/IRRA

Multi-GPU training problem

Opened this issue · 2 comments

Thank you for your excellent work! I am very interested in it and am currently using multiple GPUs for distributed training. As a beginner, I would like to ask: is it normal for the number of iterations per epoch not to decrease when using multiple GPUs?

Have you solved this problem? I have the same question! I am looking forward to your reply!

Splitting the training workload across multiple GPUs speeds up training, but it does not affect the number of iterations in each epoch.
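
For anyone else hitting the same confusion, here is a minimal sketch of why that happens, assuming a PyTorch `DataParallel`-style setup (this thread does not confirm which parallelism mode is in use, and the names below are illustrative, not taken from this repo). The dataloader still yields `len(dataset) / batch_size` iterations per epoch; each batch is merely split across the available GPUs inside the forward pass, so only the per-iteration time drops:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset: 1024 samples, batch size 64 -> 16 iterations per epoch.
dataset = TensorDataset(torch.randn(1024, 16), torch.randint(0, 2, (1024,)))
loader = DataLoader(dataset, batch_size=64)

model = nn.Linear(16, 2)
if torch.cuda.device_count() > 1:
    # DataParallel splits each 64-sample batch across the GPUs inside
    # forward(); the training loop below still sees 16 batches either way.
    model = nn.DataParallel(model).cuda()

print(f"iterations per epoch: {len(loader)}")  # unchanged by GPU count
```

(If instead you use `DistributedDataParallel` with a `DistributedSampler` and a fixed per-GPU batch size, each process sees only its shard of the data, so the per-process iteration count does shrink; which behavior you observe depends on how the launcher and sampler are configured.)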