The batch didn't split when using multi-GPU
Opened this issue · 1 comment
1453042287 commented
When I use two GPUs and set batch_size=16, I found that the batch is 16 per GPU, not 8. Why?
s-JoL commented
config["batch_size"] *= len(config["parallels"])
batch_size means the batch size per device.
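A minimal sketch of what the line above implies, assuming config["parallels"] is the list of GPU ids used for training (the exact key names come from the snippet in this thread; the helper function name is made up for illustration):

```python
def scale_batch_size(config):
    """Multiply the per-device batch size by the number of parallel devices,
    so each GPU still receives the configured per-device batch."""
    config = dict(config)  # copy to avoid mutating the caller's dict
    config["batch_size"] *= len(config["parallels"])
    return config

# Two GPUs, batch_size=16 per device -> 32 samples per training step in total,
# 16 on each GPU, which matches the behavior reported above.
cfg = {"batch_size": 16, "parallels": [0, 1]}
scaled = scale_batch_size(cfg)
print(scaled["batch_size"])
```

So with batch_size=16 and two GPUs, each GPU processes 16 samples and the effective global batch per step is 32; to get 8 per GPU, you would set batch_size=8.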