Pytorch-DistributedDataParallel-Training-Tricks

A guide that integrates PyTorch DistributedDataParallel, Apex mixed-precision training, learning-rate warmup, and learning-rate scheduling, and also covers setting up early stopping and fixing the random seed.
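As a rough illustration of how these pieces fit together (a minimal sketch, not this repository's actual code), the skeleton below combines DDP initialization, a fixed random seed, a `DistributedSampler`, and a linear warmup scheduler built from `LambdaLR`. The model, dataset, and hyperparameters are placeholders; Apex mixed precision would additionally wrap the model and optimizer with `apex.amp.initialize` after construction, but it is omitted here to keep the sketch dependency-free.

```python
import os
import random

import numpy as np
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler


def set_seed(seed: int) -> None:
    # Fix every RNG that can affect training so runs are reproducible.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)


def main() -> None:
    # torchrun sets LOCAL_RANK for each spawned process.
    local_rank = int(os.environ["LOCAL_RANK"])
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(local_rank)
    set_seed(42)

    # Placeholder model and synthetic dataset for illustration only.
    model = torch.nn.Linear(10, 1).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    # DistributedSampler gives each process a disjoint shard of the data.
    dataset = TensorDataset(torch.randn(1024, 10), torch.randn(1024, 1))
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # Linear warmup over the first `warmup_steps` steps, constant afterwards.
    warmup_steps = 100
    scheduler = torch.optim.lr_scheduler.LambdaLR(
        optimizer, lambda step: min(1.0, (step + 1) / warmup_steps)
    )

    loss_fn = torch.nn.MSELoss()
    for epoch in range(3):
        sampler.set_epoch(epoch)  # reshuffle shards differently each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
            scheduler.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

A script like this would be launched with `torchrun --nproc_per_node=NUM_GPUS train.py`, which spawns one process per GPU and sets the `LOCAL_RANK` environment variable read above.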

Primary language: Python. License: MIT.
