CoinCheung/triplet-reid-pytorch

The speed of batch sampler

hzh8311 opened this issue · 1 comments

I use the batch sampler in this repo, but the data fetch speed is quite slow: ~250s for 50 batches. The batch size is 256 (P=32, K=8), with 4 workers and 4 GPUs. Have you encountered this problem?

The 250s is data-loading time only; the model training time for those batches is ~40s.
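For context, a P×K batch sampler of the kind described above groups each batch as P identities with K samples per identity. The following is a minimal, hypothetical sketch of that sampling scheme (class name, label layout, and replacement-sampling choice are illustrative, not this repo's actual implementation); when such a sampler is passed to a `DataLoader` via `batch_sampler=`, slow fetch times usually come from the per-sample I/O inside the `Dataset`, not from the sampler logic itself:

```python
import random
from collections import defaultdict

class PKSampler:
    """Hypothetical P x K sampler sketch: each batch contains
    P identities with K samples drawn per identity."""

    def __init__(self, labels, p=32, k=8):
        self.p, self.k = p, k
        # map each identity label to the dataset indices it owns
        self.index_by_label = defaultdict(list)
        for idx, label in enumerate(labels):
            self.index_by_label[label].append(idx)

    def __iter__(self):
        labels = list(self.index_by_label)
        random.shuffle(labels)
        # walk identities in groups of P; drop the incomplete tail group
        for i in range(0, len(labels) - self.p + 1, self.p):
            batch = []
            for label in labels[i:i + self.p]:
                pool = self.index_by_label[label]
                # draw K indices per identity (with replacement, so
                # identities with fewer than K images still fill a batch)
                batch.extend(random.choices(pool, k=self.k))
            yield batch

# toy check: 64 identities x 10 images each -> 2 batches of 32*8 = 256
labels = [i // 10 for i in range(640)]
batches = list(PKSampler(labels, p=32, k=8))
```

Since the sampler only shuffles integer indices, timing it in isolation versus timing a full `DataLoader` pass is a quick way to confirm the bottleneck is in image decoding/augmentation rather than in sampling.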