data-parallel-sgd

There are 3 repositories under the data-parallel-sgd topic.

  • Tabrizian/learning-to-quantize

    Code for "Adaptive Gradient Quantization for Data-Parallel SGD", published in NeurIPS 2020.

    Language: Jupyter Notebook
  • fartashf/nuqsgd

    PyTorch code for the paper "NUQSGD: Provably Communication-efficient Data-parallel SGD via Nonuniform Quantization"

    Language: Jupyter Notebook
  • thadikari/consensus_optimization

    Code for numerical results in the ICASSP 2020 paper "Decentralized optimization with non-identical sampling in presence of stragglers".

    Language: Python
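
The repositories above share a common pattern: each worker computes a gradient on its own data shard, compresses (quantizes) it to reduce communication cost, and the compressed gradients are averaged before the parameter update. The sketch below is a minimal, hypothetical illustration of that pattern simulated on a single process; it is not code from any of these projects. The QSGD-style `quantize` function, the 4-level setting, and the least-squares problem are illustrative assumptions.

```python
# Minimal sketch of data-parallel SGD with stochastic gradient quantization,
# simulated in one process (hypothetical example, not from the repos above).
import torch

def quantize(grad: torch.Tensor, levels: int = 4) -> torch.Tensor:
    """Unbiased stochastic uniform quantization of a gradient tensor."""
    norm = grad.norm()
    if norm == 0:
        return grad
    scaled = grad.abs() / norm * levels            # values in [0, levels]
    lower = scaled.floor()
    prob = scaled - lower                          # probability of rounding up
    rounded = lower + torch.bernoulli(prob)        # stochastic rounding
    return grad.sign() * rounded * norm / levels   # unbiased reconstruction

def worker_gradient(w, x, y):
    """Least-squares gradient on one worker's data shard."""
    residual = x @ w - y
    return x.t() @ residual / len(y)

torch.manual_seed(0)
d, n_workers, lr = 10, 4, 0.1
w_true = torch.randn(d)

# Each simulated worker holds its own shard of the data.
shards = []
for _ in range(n_workers):
    x = torch.randn(64, d)
    shards.append((x, x @ w_true + 0.01 * torch.randn(64)))

w = torch.zeros(d)
for step in range(200):
    # Workers quantize their local gradients before "communicating" them;
    # the server averages the quantized gradients and takes an SGD step.
    grads = [quantize(worker_gradient(w, x, y)) for x, y in shards]
    w -= lr * torch.stack(grads).mean(dim=0)

print("parameter error:", (w - w_true).norm().item())
```

Because the quantizer is unbiased, the averaged update is an unbiased estimate of the full data-parallel gradient, which is the property the quantization schemes in these papers build on while controlling the added variance.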