BobLiu20/YOLOv3_PyTorch

Batch_size changed to 64

Opened this issue · 0 comments

Thank you for sharing your work. I want to change the batch size to 64, but I run out of GPU memory. In Darknet there is a parameter called "subdivisions" for this. Do you have any suggestions on how to split one mini-batch into smaller pieces so that I can increase the maximum effective batch size?
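For reference, this is the kind of gradient accumulation I had in mind (a minimal sketch; the model, data, and optimizer here are placeholders, not this repo's code):

```python
# Gradient-accumulation sketch: split one effective batch of 64 into
# 4 sub-batches of 16, similar to Darknet's "subdivisions" setting.
# The model and data below are dummies, only to keep the example runnable.
import torch
import torch.nn as nn

effective_batch_size = 64
subdivisions = 4                                    # sub-batches per optimizer step
micro_batch = effective_batch_size // subdivisions  # 16 images per forward/backward pass

model = nn.Conv2d(3, 16, 3, padding=1)              # placeholder for the YOLOv3 network
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

optimizer.zero_grad()
for step in range(subdivisions * 2):                # stand-in for a dataloader yielding micro-batches
    images = torch.randn(micro_batch, 3, 64, 64)
    targets = torch.randn(micro_batch, 16, 64, 64)

    loss = criterion(model(images), targets)
    # Scale the loss so the accumulated gradients match one full batch of 64.
    (loss / subdivisions).backward()

    if (step + 1) % subdivisions == 0:
        optimizer.step()                            # one weight update per 64 images
        optimizer.zero_grad()
```

Only `micro_batch` images are resident on the GPU per forward pass, so memory use stays close to a batch-16 run while the weight update behaves roughly like a batch-64 step (batch-norm statistics are still computed per sub-batch, though).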