LARC-CMU-SMU/FoodSeg103-Benchmark-v1

Question about batch_size


In the file FoodSeg103/mmseg/datasets/build.py:

```python
if dist:
    sampler = DistributedSampler(
        dataset, world_size, rank, shuffle=shuffle)
    shuffle = False
    batch_size = samples_per_gpu    # 2
    num_workers = workers_per_gpu
else:
    sampler = None
    batch_size = num_gpus * samples_per_gpu    # 2
    num_workers = num_gpus * workers_per_gpu
```
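
For context, here is a minimal sketch of where samples_per_gpu and workers_per_gpu usually come from in an MMSegmentation-style config; the values and field names below are illustrative, not necessarily the exact FoodSeg103 settings:

```python
# Hypothetical MMSegmentation-style config excerpt (illustrative values only).
data = dict(
    samples_per_gpu=2,   # per-GPU batch size; becomes batch_size in the snippet above
    workers_per_gpu=2,   # dataloader worker processes per GPU
)
```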

When dist is True, batch_size = 2, but the paper reports a batch size of 8 (with 4 GPUs), so I don't understand why. Is this batch_size (= 2) the per-GPU value?

Yes, we use 4 GPU cards to run the experiments, and for each GPU we set the batch size to 2. In total, the batch size is 2 × 4 = 8.
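
To spell out the arithmetic under distributed training (variable names follow the snippet above, with the numbers from this thread):

```python
# Each GPU runs its own process with a DistributedSampler shard, and its
# DataLoader uses batch_size = samples_per_gpu. The effective (global) batch
# size per iteration is therefore the per-GPU value times the GPU count.
samples_per_gpu = 2   # per-GPU batch size set in the config
num_gpus = 4          # number of GPU cards used in the experiments

effective_batch_size = samples_per_gpu * num_gpus
print(effective_batch_size)  # -> 8, the batch size reported in the paper
```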

Thanks very much