Adversarial Attacks on MNIST

This is an example of generating adversarial examples that exploit a deep convolutional network trained on MNIST. It is inspired by Intriguing Properties of Neural Networks, Explaining and Harnessing Adversarial Examples, and Breaking Convnets.
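As a rough illustration of the idea, below is a minimal sketch of the fast gradient sign method (FGSM) from Explaining and Harnessing Adversarial Examples, written with TensorFlow/Keras. It assumes you already have a trained Keras classifier (`model`) and an MNIST image scaled to [0, 1]; the notebook's exact implementation may differ.

```python
import tensorflow as tf

def fgsm_attack(model, image, label, epsilon=0.1):
    """Fast gradient sign method: nudge the input pixels in the direction
    that increases the model's loss on the true label."""
    image = tf.convert_to_tensor(image[None, ...], dtype=tf.float32)  # add batch dim
    label = tf.convert_to_tensor([label])
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()

    with tf.GradientTape() as tape:
        tape.watch(image)
        prediction = model(image)
        loss = loss_fn(label, prediction)

    # Gradient of the loss with respect to the input pixels
    gradient = tape.gradient(loss, image)
    # Take a single step of size epsilon in the sign of the gradient
    adversarial = image + epsilon * tf.sign(gradient)
    # Keep pixel values in the valid [0, 1] range
    return tf.clip_by_value(adversarial, 0.0, 1.0)
```

Even with a small epsilon, the perturbed digit usually looks unchanged to a human but is often misclassified by the network.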

Running the code

To run the code, open the notebook in Google Colab: https://colab.research.google.com/github/souravsingh/mnist-adversarial-attack/blob/master/Keras_MNIST_attack.ipynb

The link opens the contents of the notebook in this repository. Make sure to change your Colab runtime to GPU so that training is faster.

Both the Colab link and the notebook in this repository use the Keras library. A PyTorch notebook will be added soon.
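For reference, the sketch below shows the kind of Keras convolutional classifier such an attack targets. The layer sizes here are illustrative assumptions, not necessarily the exact architecture used in the notebook.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_mnist_cnn():
    """Small convolutional classifier for 28x28 grayscale MNIST digits."""
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Load MNIST, scale pixels to [0, 1], and train briefly before attacking
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0
model = build_mnist_cnn()
model.fit(x_train, y_train, epochs=1, validation_data=(x_test, y_test))
```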