BobLiu20/YOLOv3_PyTorch

The batch isn't split when using multi-GPU

Opened this issue · 1 comment

When I use two GPUs and set batch_size=16, I find that the batch is 16 on each GPU, not 8. Why?

s-JoL commented

The training script scales the configured batch size by the number of GPUs:

config["batch_size"] *= len(config["parallels"])

So batch_size means the batch size per device. The dataloader yields the scaled total batch, and DataParallel then splits it back across the GPUs, which is why each GPU still sees 16.
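A minimal sketch of the effect, assuming two visible GPUs; the model and tensor here are illustrative, not the repo's actual network:

```python
import torch
import torch.nn as nn

# Illustrative config mirroring the keys quoted above; values are examples.
config = {"batch_size": 16, "parallels": [0, 1]}  # 16 per device, GPUs 0 and 1

# The script multiplies the per-device batch size by the number of GPUs,
# so the dataloader produces the *total* batch.
config["batch_size"] *= len(config["parallels"])   # 16 * 2 = 32

# DataParallel splits the leading (batch) dimension across device_ids,
# so each replica processes 32 / 2 = 16 samples: the per-device batch size.
model = nn.DataParallel(nn.Linear(8, 4), device_ids=config["parallels"]).cuda()

batch = torch.randn(config["batch_size"], 8).cuda()
out = model(batch)   # each GPU sees a chunk of 16 samples
print(out.shape)     # torch.Size([32, 4])
```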