Num workers for data loaders must be 1, otherwise data is duplicated
KennethEnevoldsen opened this issue · 0 comments
KennethEnevoldsen commented
See an explanation (and potential solution) here:
https://discuss.pytorch.org/t/iterable-pytorch-dataset-with-multiple-workers/135475
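For reference, a minimal sketch of the worker-sharding approach discussed in that thread (the class name and the in-memory list are just for illustration): each worker uses `torch.utils.data.get_worker_info()` to yield only its own slice, so the same items are not produced by every worker.

```python
import torch
from torch.utils.data import IterableDataset, DataLoader, get_worker_info


class ShardedIterableDataset(IterableDataset):
    """Hypothetical example: each worker yields a disjoint shard of the data."""

    def __init__(self, data):
        self.data = data

    def __iter__(self):
        worker_info = get_worker_info()
        if worker_info is None:
            # Single-process loading (num_workers=0): yield everything.
            yield from self.data
        else:
            # Each worker takes every num_workers-th item, offset by its id,
            # so the shards are disjoint and cover the whole dataset.
            yield from self.data[worker_info.id :: worker_info.num_workers]


if __name__ == "__main__":
    ds = ShardedIterableDataset(list(range(8)))
    loader = DataLoader(ds, num_workers=2, batch_size=None)
    items = [int(x) for x in loader]
    print(sorted(items))  # [0, 1, 2, 3, 4, 5, 6, 7] -- no duplicates
```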
An alternative might be to use a map-style (non-iterable) dataset. This is usually slower because the data is not loaded in chunks, but it could be notably faster if multiple workers can process it in parallel (see the sketch below).
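A rough sketch of that alternative, assuming the data can be indexed directly (again, names are illustrative): with a map-style dataset the DataLoader assigns indices to workers itself, so `num_workers > 1` does not duplicate anything.

```python
from torch.utils.data import Dataset, DataLoader


class MapStyleDataset(Dataset):
    """Hypothetical map-style alternative: indexed access lets the DataLoader
    split indices across workers without duplication."""

    def __init__(self, data):
        self.data = data

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        return self.data[idx]


if __name__ == "__main__":
    loader = DataLoader(MapStyleDataset(list(range(8))), num_workers=2, batch_size=4)
    for batch in loader:
        print(batch)  # each index is fetched exactly once across workers
```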