Lance0218/Pytorch-DistributedDataParallel-Training-Tricks
A guide that integrates PyTorch DistributedDataParallel with Apex, warmup, and a learning-rate scheduler, and also covers setting up early stopping and a fixed random seed.
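As an illustration of two of the tricks the guide mentions, here is a minimal sketch (not the repository's own code) of a linear warmup learning-rate schedule and a patience-based early-stopping counter; the function and class names, default values, and overall structure are assumptions for demonstration only.

```python
# Illustrative sketch only: linear LR warmup and patience-based early
# stopping, two tricks commonly combined with DistributedDataParallel
# training. Names and defaults are hypothetical, not the repo's API.

def warmup_lr(step, base_lr=0.1, warmup_steps=5):
    """Ramp the LR linearly from 0 to base_lr over warmup_steps, then hold."""
    if step < warmup_steps:
        return base_lr * (step + 1) / warmup_steps
    return base_lr


class EarlyStopping:
    """Signal a stop when validation loss fails to improve for `patience` checks."""

    def __init__(self, patience=3):
        self.patience = patience
        self.best = float("inf")
        self.counter = 0

    def step(self, val_loss):
        # Reset the counter on improvement; otherwise count stagnant epochs.
        if val_loss < self.best:
            self.best = val_loss
            self.counter = 0
        else:
            self.counter += 1
        return self.counter >= self.patience  # True means: stop training
```

In a real DDP setup these would be driven from the training loop on every rank, with the stop decision broadcast so all processes exit together.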
Python · MIT License