This is a library dedicated to adversarial machine learning. Its purpose is to allow rapid crafting and analysis of attack and defense methods for machine learning models. The Adversarial Robustness Toolbox (ART) provides implementations of many state-of-the-art methods for attacking and defending classifiers.
The library is still under development. Feedback, bug reports and extensions are highly appreciated. Get in touch with us on Slack (invite here)!
The library contains implementations of the following attacks (a brief usage sketch follows the list):
- DeepFool (Moosavi-Dezfooli et al., 2015)
- Fast Gradient Method (Goodfellow et al., 2014)
- Basic Iterative Method (Kurakin et al., 2016)
- Jacobian Saliency Map (Papernot et al., 2016)
- Universal Perturbation (Moosavi-Dezfooli et al., 2016)
- Virtual Adversarial Method (Miyato et al., 2015)
- C&W Attack (Carlini and Wagner, 2016)
- NewtonFool (Jang et al., 2017)
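Running any of these attacks amounts to wrapping a trained model in an ART classifier and passing the wrapper to an attack class. The sketch below is illustrative rather than version-exact: it follows the module layout of the more recent 1.x releases (art.attacks.evasion, art.estimators.classification), while the 0.x line documented here exposed the same classes under art.attacks and art.classifiers with slightly different signatures, so adapt the imports to your installed version.

```python
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # ART's Keras wrapper runs in graph mode under TF 2.x

from art.attacks.evasion import FastGradientMethod
from art.estimators.classification import KerasClassifier
from art.utils import load_mnist

# Load MNIST, already scaled to [0, 1] and shaped (N, 28, 28, 1).
(x_train, y_train), (x_test, y_test), min_pixel, max_pixel = load_mnist()

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"])

# Wrap the model so ART can query its predictions and loss gradients.
classifier = KerasClassifier(model=model, clip_values=(min_pixel, max_pixel))
classifier.fit(x_train, y_train, nb_epochs=3, batch_size=128)

# eps bounds the perturbation size; generate() returns the adversarial inputs.
attack = FastGradientMethod(estimator=classifier, eps=0.1)
x_test_adv = attack.generate(x=x_test)

preds = np.argmax(classifier.predict(x_test_adv), axis=1)
print("Accuracy on adversarial test set: {:.2%}".format(np.mean(preds == np.argmax(y_test, axis=1))))
```

Swapping in another attack from the list above is then a one-line change, since all attacks share the generate() interface.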
The following defense methods are also supported (a corresponding sketch follows the list):
- Feature squeezing (Xu et al., 2017)
- Spatial smoothing (Xu et al., 2017)
- Label smoothing (Warde-Farley and Goodfellow, 2016)
- Adversarial training (Szegedy et al., 2013)
- Virtual adversarial training (Miyato et al., 2015)
- Gaussian data augmentation (Zantedeschi et al., 2017)
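Continuing the attack sketch above (same classifier, attack, and data), the defenses come in two flavors: input preprocessors such as feature squeezing, and model hardening such as adversarial training. This is again a hedged sketch against the 1.x module layout; in 0.x releases these classes sit directly under art.defences, with some differences in signatures.

```python
from art.defences.preprocessor import FeatureSqueezing
from art.defences.trainer import AdversarialTrainer

# Preprocessing defense: reduce the bit depth of inputs before classification.
squeezer = FeatureSqueezing(clip_values=(min_pixel, max_pixel), bit_depth=4)
x_test_squeezed, _ = squeezer(x_test_adv)  # recent releases return an (x, y) pair

# Model hardening: retrain on a mix of clean and adversarial samples
# produced by the attack from the previous sketch (ratio = adversarial share).
trainer = AdversarialTrainer(classifier, attacks=attack, ratio=0.5)
trainer.fit(x_train, y_train, nb_epochs=3, batch_size=128)
```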
The toolbox is designed to run with Python 2 and 3.
The library can be installed from the PyPI repository using pip:
pip install adversarial-robustness-toolbox
For the most recent version of the library, either download the source code or clone the repository into a directory of your choice:
git clone https://github.com/IBM/adversarial-robustness-toolbox
To install ART, run the following command in the project folder:
pip install .
The library comes with a basic set of unit tests. To check your installation, run all the unit tests by calling the test script in the install folder:
bash run_tests.sh
Some examples of how to use ART when writing your own code can be found in the examples folder. See examples/README.md for more information about what each example does. To run an example, use the following command:
python examples/<example_name>.py
The notebooks folder contains Jupyter notebooks with detailed walkthroughs of some usage scenarios.
If you use ART for research, please consider citing the following reference paper:
@article{art2018,
    title = {Adversarial Robustness Toolbox v0.3.0},
    author = {Nicolae, Maria-Irina and Sinn, Mathieu and Tran, Minh~Ngoc and Rawat, Ambrish and Wistuba, Martin and Zantedeschi, Valentina and Baracaldo, Nathalie and Chen, Bryant and Ludwig, Heiko and Molloy, Ian and Edwards, Ben},
    journal = {CoRR},
    volume = {1807.01069},
    year = {2018},
    url = {https://arxiv.org/pdf/1807.01069}
}