deel-lip

Build and train Lipschitz constrained networks: TensorFlow implementation of k-Lipschitz layers



Explore DEEL-LIP docs »

👋 Welcome to deel-lip documentation!

Controlling the Lipschitz constant of a layer or a whole neural network has many applications ranging from adversarial robustness to Wasserstein distance estimation.
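As a toy illustration of the robustness application (a pure-Python sketch, not the deel-lip API): if a binary classifier f is 1-Lipschitz, a perturbation δ can change f(x) by at most ‖δ‖, so the predicted sign cannot flip inside a ball of radius |f(x)| around x.

```python
import math

def f(x):
    # A linear map x -> w.x / ||w|| is exactly 1-Lipschitz in the L2 norm.
    w = [3.0, 4.0]
    w_norm = math.sqrt(sum(v * v for v in w))  # ||w|| = 5
    return sum(wi * xi for wi, xi in zip(w, x)) / w_norm

x = [1.0, 1.0]
radius = abs(f(x))  # certified: no L2 perturbation smaller than this flips the sign
# f(x) = 7/5 = 1.4, so any x' with ||x' - x|| < 1.4 still satisfies f(x') > 0.
```

This certificate is exactly why controlling the Lipschitz constant yields robustness guarantees by design rather than by empirical defense.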

This library provides an efficient implementation of k-Lipschitz layers for keras.

Caution

Incompatibility with TensorFlow >= 2.16 and Keras 3

Due to significant changes introduced in TensorFlow version 2.16 and Keras 3, this package is currently incompatible with TensorFlow versions 2.16 and above. Users are advised to use TensorFlow versions lower than 2.16 to ensure compatibility and proper functionality of this package.

 We are actively working on updating the package to support Keras 3. Please stay tuned for updates. For now, make sure to install an earlier version of TensorFlow by specifying it in your environment.
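For example, one way to pin a compatible TensorFlow release when installing (the version bounds below are illustrative):

```shell
pip install "tensorflow>=2.0,<2.16" deel-lip
```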


🚀 Quick Start

You can install deel-lip directly from pypi:

pip install deel-lip

In order to use deel-lip, you also need a valid TensorFlow installation. deel-lip supports TensorFlow 2.x versions below 2.16 (see the caution above).

🔥 Tutorials

The following tutorial notebooks can be opened in Google Colab:

  • Getting Started 1 - Creating a 1-Lipschitz neural network
  • Getting Started 2 - Training an adversarially robust 1-Lipschitz neural network
  • Wasserstein distance estimation on toy example
  • HKR classifier on toy dataset
  • HKR classifier on MNIST dataset
  • HKR multiclass and fooling

📦 What's Included

  • k-Lipschitz variants of keras layers such as Dense, Conv2D and Pooling,
  • activation functions compatible with keras,
  • kernel initializers and kernel constraints for keras,
  • loss functions that make use of Lipschitz constrained networks (see our paper for more information),
  • tools to monitor the singular values of kernels during training,
  • tools to convert k-Lipschitz networks into regular networks for faster inference.
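To give a flavour of how such constraints are enforced: a standard technique behind spectrally-constrained dense layers is to estimate the largest singular value of the weight matrix (e.g. by power iteration) and rescale the matrix by it, which bounds the layer's Lipschitz constant by 1. Below is a minimal pure-Python sketch of that idea; it is a conceptual illustration, not the deel-lip API.

```python
import math
import random

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def transpose(W):
    return [list(col) for col in zip(*W)]

def norm(x):
    return math.sqrt(sum(v * v for v in x))

def spectral_norm(W, iters=100):
    """Estimate the largest singular value of W by power iteration on W^T W."""
    random.seed(0)  # deterministic start vector, for reproducibility
    v = [random.random() for _ in W[0]]
    Wt = transpose(W)
    for _ in range(iters):
        u = matvec(W, v)   # u = W v
        v = matvec(Wt, u)  # v = W^T W v
        n = norm(v)
        v = [vi / n for vi in v]
    # For the top right singular vector v, ||W v|| equals sigma_max.
    return norm(matvec(W, v))

def normalize(W):
    """Rescale W so its spectral norm (hence its Lipschitz constant) is 1."""
    s = spectral_norm(W)
    return [[w / s for w in row] for row in W]

W = [[3.0, 1.0], [0.0, 2.0]]
Wn = normalize(W)
# The map x -> Wn x is now 1-Lipschitz: ||Wn x - Wn y|| <= ||x - y|| for all x, y.
```

In practice, libraries keep a running estimate of the singular vector across training steps so that the normalization stays cheap.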

πŸ‘ Contributing

To contribute, you can open an issue, or fork this repository and then submit changes through a pull request. We use black to format the code and follow the PEP 8 convention. To check that your code will pass the lint checks, you can run:

tox -e py36-lint

You need tox in order to run this. You can install it via pip:

pip install tox

👀 See Also

More from the DEEL project:

  • Xplique, a Python library exclusively dedicated to explaining neural networks.
  • Influenciae, a Python toolkit dedicated to computing influence values for the discovery of potentially problematic samples in a dataset.
  • deel-torchlip, a Python library for training k-Lipschitz neural networks on PyTorch.
  • DEEL White paper, a summary by the DEEL team on the challenges of certifiable AI and the role of data quality, representativity and explainability for this purpose.

πŸ™ Acknowledgments

This project received funding from the French "Investing for the Future – PIA3" program within the Artificial and Natural Intelligence Toulouse Institute (ANITI). The authors gratefully acknowledge the support of the DEEL project.

πŸ—žοΈ Citation

This library has been built to support the work presented in the paper Achieving robustness in classification using optimal transport with Hinge regularization, which aims at provable and efficient robustness by design.

This work can be cited as:

@misc{2006.06520,
    Author = {Mathieu Serrurier and Franck Mamalet and Alberto GonzΓ‘lez-Sanz and Thibaut Boissin and Jean-Michel Loubes and Eustasio del Barrio},
    Title = {Achieving robustness in classification using optimal transport with hinge regularization},
    Year = {2020},
    Eprint = {arXiv:2006.06520},
}

πŸ“ License

The package is released under the MIT License.