data-parallel-sgd
There are 3 repositories under the data-parallel-sgd topic.
Tabrizian/learning-to-quantize
Code for "Adaptive Gradient Quantization for Data-Parallel SGD", published in NeurIPS 2020.
fartashf/nuqsgd
PyTorch Code for the paper "NUQSGD: Provably Communication-efficient Data-parallel SGD via Nonuniform Quantization"
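The two repositories above both study gradient quantization as a way to reduce communication cost in data-parallel SGD. Below is a minimal NumPy sketch of that general idea, under stated assumptions: each worker quantizes its local gradient with a simple unbiased stochastic uniform quantizer before the gradients are averaged. The quantizer, the toy quadratic losses, and all names here are illustrative only; they are not the adaptive or nonuniform schemes implemented in learning-to-quantize or nuqsgd.

```python
import numpy as np

def quantize(grad, rng, num_levels=16):
    # Unbiased stochastic uniform quantizer (illustrative, not the
    # adaptive/nonuniform quantizers from the papers above): scale each
    # coordinate by the vector norm, snap to one of `num_levels` levels,
    # and round up or down at random so the result is unbiased.
    norm = np.linalg.norm(grad)
    if norm == 0.0:
        return grad
    scaled = np.abs(grad) / norm * num_levels        # in [0, num_levels]
    lower = np.floor(scaled)
    levels = lower + (rng.random(grad.shape) < scaled - lower)
    return np.sign(grad) * levels / num_levels * norm

# Toy data-parallel loop: each worker computes a gradient on its own data,
# quantizes it, and the quantized gradients are averaged before the update.
rng = np.random.default_rng(0)
dim, num_workers, lr = 10, 4, 0.1
targets = rng.normal(size=(num_workers, dim))        # each worker's local data
w = np.zeros(dim)

for step in range(200):
    grads = [quantize(w - t, rng) for t in targets]  # grad of 0.5*||w - t||^2
    w -= lr * np.mean(grads, axis=0)

print("distance to average optimum:", np.linalg.norm(w - targets.mean(axis=0)))
```

In a real data-parallel setup the averaging would happen over an all-reduce or parameter server rather than a Python list, with the quantized gradients encoded compactly before transmission; the sketch only shows where quantization sits in the update.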
thadikari/consensus_optimization
Code for numerical results in the ICASSP 2020 paper "Decentralized optimization with non-identical sampling in presence of stragglers".
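For the decentralized setting of the third repository, the sketch below shows a generic consensus-plus-gradient-step loop over a ring of workers. It is an assumption-laden illustration: the mixing matrix, topology, and quadratic losses are made up here, and the paper's non-identical sampling and straggler handling are not modelled.

```python
import numpy as np

# Minimal decentralized-SGD sketch: each worker keeps its own parameter copy,
# mixes with its neighbours via a doubly stochastic matrix, then takes a local
# gradient step. Straggler handling from the ICASSP paper is not modelled.
rng = np.random.default_rng(0)
num_workers, dim, lr = 4, 5, 0.1

# Ring topology: each worker averages with itself and its two neighbours.
W = np.zeros((num_workers, num_workers))
for i in range(num_workers):
    W[i, i] = W[i, (i - 1) % num_workers] = W[i, (i + 1) % num_workers] = 1 / 3

targets = rng.normal(size=(num_workers, dim))   # each worker's local optimum
x = np.zeros((num_workers, dim))                # per-worker parameter copies

for step in range(200):
    x = W @ x                                   # consensus (mixing) step
    x -= lr * (x - targets)                     # grad of local 0.5*||x_i - t_i||^2

print("disagreement:", np.linalg.norm(x - x.mean(axis=0)))
print("distance to average optimum:", np.linalg.norm(x.mean(axis=0) - targets.mean(axis=0)))
```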