PyTorch DistributedDataParallel Template

A small, quick example of running distributed training with PyTorch. Using DistributedDataParallel is recommended even on a single multi-GPU node, because it is faster than DataParallel (one process per GPU instead of a single process replicating the model).
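Below is a minimal sketch of what such a template typically looks like; it is not the repository's actual code. The file name `minimal_ddp.py`, the toy linear model, and the synthetic dataset are placeholders assumed for illustration. It is written for `torchrun`, which sets the `RANK`, `LOCAL_RANK`, and `WORLD_SIZE` environment variables for each spawned process.

```python
# minimal_ddp.py -- launch with:
#   torchrun --standalone --nproc_per_node=NUM_GPUS minimal_ddp.py
import os

import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler


def main():
    # torchrun provides RANK, LOCAL_RANK, and WORLD_SIZE to every process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Toy model and synthetic data; replace with your own.
    model = nn.Linear(10, 1).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    dataset = TensorDataset(torch.randn(1024, 10), torch.randn(1024, 1))
    # DistributedSampler gives each rank a distinct shard of the dataset.
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards differently each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()  # gradients are all-reduced across ranks here
            optimizer.step()
        if dist.get_rank() == 0:
            print(f"epoch {epoch} loss {loss.item():.4f}")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

The key difference from DataParallel is that each GPU runs its own process with its own data shard, and gradients are synchronized with an all-reduce during `backward()`, so the per-step work scales better with the number of GPUs.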