
Differential Privacy & Federated Learning

Curated notebooks on how to train neural networks using differential privacy and federated learning.

Intro Notebooks

Before you start learning about Differential Privacy and Federated Learning, it's important to understand tensors, the fundamental data structures for neural networks.
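For readers new to PyTorch, here is a quick, generic illustration of what a tensor is (not taken from the notebooks):

```python
import torch

# A tensor is an n-dimensional array; here are a few ways to build and combine them.
a = torch.tensor([[1., 2.], [3., 4.]])   # 2x2 tensor from nested lists
b = torch.randn(2, 3)                    # 2x3 tensor of random values
c = a @ torch.ones(2, 3)                 # matrix multiplication -> 2x3 tensor

print(a.shape, b.shape, c.shape)
```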

Learn about tensors:

Creating simple neural networks

Creating Dense Networks with MNIST data

Transfer Learning

Most of the time you won't want to train a whole convolutional network yourself. Modern ConvNets are trained on huge datasets like ImageNet, which can take weeks on multiple GPUs. Transfer Learning helps you solve this problem.
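A common transfer-learning pattern is sketched below: reuse a pretrained ImageNet backbone as a frozen feature extractor and train only a new classifier head. The choice of ResNet-18 and a 10-class head are assumptions for illustration, not the setup used in the notebooks.

```python
import torch
from torch import nn
from torchvision import models

# Load a ResNet-18 pretrained on ImageNet
# (older torchvision API; newer versions use the `weights=` argument instead).
model = models.resnet18(pretrained=True)

# Freeze the pretrained weights so only the new head is updated.
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer with one sized for the new task (e.g. 10 classes).
model.fc = nn.Linear(model.fc.in_features, 10)

# Only the new head's parameters are passed to the optimizer.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```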

What is Differential Privacy?

Differential Privacy is a set of techniques for preventing a model from accidentally memorizing secrets present in a training dataset during the learning process.

The key ideas behind Differential Privacy are:

  • Make a promise to each data subject: you will not be affected, adversely or otherwise, by allowing your data to be used in any analysis, no matter what other studies, datasets, or information sources are available.
  • Ensure that models learning from sensitive data learn only what they are supposed to learn, without accidentally memorizing anything they are not supposed to learn from that data (a simplified sketch follows this list).
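To make this concrete, here is a simplified, illustrative DP-SGD step: clip each example's gradient so no single record can dominate the update, then add Gaussian noise before applying it. The model, batch, and hyperparameters below are assumptions for illustration, not values from the notebooks.

```python
import torch
from torch import nn

CLIP_NORM = 1.0         # assumed: maximum L2 norm allowed for one example's gradient
NOISE_MULTIPLIER = 1.1  # assumed: noise standard deviation, relative to CLIP_NORM
LR = 0.1

model = nn.Linear(10, 2)
loss_fn = nn.CrossEntropyLoss()

# Toy batch standing in for sensitive training data.
x = torch.randn(8, 10)
y = torch.randint(0, 2, (8,))

summed_grads = [torch.zeros_like(p) for p in model.parameters()]

for xi, yi in zip(x, y):
    model.zero_grad()
    loss = loss_fn(model(xi.unsqueeze(0)), yi.unsqueeze(0))
    loss.backward()

    # Bound this example's influence by clipping its gradient to CLIP_NORM.
    grads = [p.grad.detach().clone() for p in model.parameters()]
    total_norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
    scale = torch.clamp(CLIP_NORM / (total_norm + 1e-6), max=1.0)
    for acc, g in zip(summed_grads, grads):
        acc.add_(g * scale)

# Add calibrated Gaussian noise so the update does not reveal any single example.
with torch.no_grad():
    for p, acc in zip(model.parameters(), summed_grads):
        noisy = acc + torch.randn_like(acc) * NOISE_MULTIPLIER * CLIP_NORM
        p -= LR * noisy / len(x)
```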

Here are some notebooks that explain the concept further:

Federated Learning

Instead of bringing all the data to one place for training, federated learning brings the model to the data. This allows a data owner to keep the only copy of their information.
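One common way to combine the locally trained models is federated averaging (FedAvg). The sketch below simulates a few clients whose data never leaves them; only model weights are shared and averaged. The toy clients, model, and round count are assumptions for illustration.

```python
import copy
import torch
from torch import nn

def local_update(global_model, data, targets, epochs=1, lr=0.1):
    """Train a copy of the global model on one client's local data."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(data), targets)
        loss.backward()
        opt.step()
    return model.state_dict()

def average_weights(state_dicts):
    """Average the clients' model weights into a new global state."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        for sd in state_dicts[1:]:
            avg[key] += sd[key]
        avg[key] = avg[key] / len(state_dicts)
    return avg

# Simulated clients, each holding data that never leaves them.
clients = [(torch.randn(16, 10), torch.randint(0, 2, (16,))) for _ in range(3)]
global_model = nn.Linear(10, 2)

for round_ in range(5):
    # Each client trains locally; only the resulting weights are shared.
    local_states = [local_update(global_model, x, y) for x, y in clients]
    global_model.load_state_dict(average_weights(local_states))
```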

This notebook on Federated Learning explains the technique in more detail.