Multi-GPU training slower than single GPU
PallottaEnrico commented
Hi, I just wanted to know whether anyone else has faced the same issue.
I'm running a training job on Cityscapes; after a few tests I decided to scale it up, but I noticed something weird.
Training with 4 GPUs is much slower than training on a single GPU.
I tried different model settings (smaller and bigger), and it looks like 100 training steps take about 4x as long.
EDIT: I also tried with 2 GPUs on my PC; there the training time is 2x that of single-GPU training.
The code looks fine to me.
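
For reference, this is roughly the kind of timing comparison I mean. A minimal sketch, assuming PyTorch `DistributedDataParallel` on a single node (I'm not showing the actual training code here; the model, batch size, and data below are placeholders):

```python
# Launch with: torchrun --nproc_per_node=4 bench.py
# (or --nproc_per_node=1 for the single-GPU baseline)
import os
import time

import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model standing in for the real segmentation network.
    model = nn.Sequential(
        nn.Conv2d(3, 64, 3, padding=1),
        nn.ReLU(),
        nn.Conv2d(64, 19, 1),  # 19 Cityscapes classes
    ).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # Synthetic per-GPU batch (stands in for the real Cityscapes loader).
    x = torch.randn(4, 3, 512, 1024, device=local_rank)
    y = torch.randint(0, 19, (4, 512, 1024), device=local_rank)

    # Time 100 optimization steps, syncing so GPU work is fully counted.
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(100):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    torch.cuda.synchronize()
    elapsed = time.time() - start

    if dist.get_rank() == 0:
        imgs = 100 * 4 * dist.get_world_size()  # steps * per-GPU batch * ranks
        print(f"100 steps in {elapsed:.1f}s ({imgs / elapsed:.1f} imgs/s)")
    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Since each rank processes its own batch, 100 steps on 4 GPUs cover 4x the images, so throughput (imgs/s) is the fairer metric; even so, wall-clock time per step shouldn't grow roughly linearly with GPU count the way I'm seeing.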