dpsgd-optimizer

An amortized version of the differentially private SGD (DP-SGD) algorithm published in "Deep Learning with Differential Privacy" by Abadi et al. It enforces privacy during training by clipping per-example gradients and sanitizing them with Gaussian noise.
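The core DP-SGD step from the Abadi et al. paper can be sketched as follows. This is a minimal illustration, not this repository's actual implementation: the function name `dp_sgd_step` and its parameters (`clip_norm`, `noise_multiplier`) are hypothetical names chosen for clarity, and the example assumes per-example gradients are already available as a NumPy array.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.1, rng=None):
    """One DP-SGD update: clip each example's gradient to L2 norm
    `clip_norm`, sum, add Gaussian noise, average, then step.

    per_example_grads: array of shape (batch_size, num_params).
    Illustrative sketch only; parameter names are not the repo's API.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Per-example L2 norms, kept as a column for broadcasting.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    # Scale each gradient down so its norm is at most clip_norm.
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_example_grads * scale
    # Sum clipped gradients and add Gaussian noise calibrated to the
    # clipping bound (standard deviation = noise_multiplier * clip_norm).
    summed = clipped.sum(axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=summed.shape)
    noisy_mean = (summed + noise) / per_example_grads.shape[0]
    return params - lr * noisy_mean

# Example: two per-example gradients, one exceeding the clipping bound.
grads = np.array([[3.0, 4.0],   # norm 5.0 -> clipped to 1.0
                  [0.1, 0.2]])  # norm < 1.0 -> unchanged
params = np.zeros(2)
new_params = dp_sgd_step(params, grads, rng=np.random.default_rng(42))
```

The privacy guarantee comes from the pairing of the two mechanisms: clipping bounds any single example's influence on the update, which lets the added Gaussian noise mask that influence at a quantifiable (ε, δ) level via the paper's moments accountant.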

Primary language: Python. License: MIT.
