POT: Python Optimal Transport


This open-source Python library provides several solvers for optimization problems related to Optimal Transport in signal processing, image processing, and machine learning.

It provides the following solvers:

  • OT Network Flow solver for the linear program / Earth Mover's Distance [1].
  • Entropic regularization OT solver with the Sinkhorn-Knopp algorithm [2], stabilized version [9][10] and greedy Sinkhorn [22], with optional GPU implementation (requires cupy); see the sketch after this list.
  • Sinkhorn divergence [23] and entropic regularization OT from empirical data.
  • Smooth optimal transport solvers (dual and semi-dual) for KL and squared L2 regularizations [17].
  • Non-regularized Wasserstein barycenters [16] with LP solver (small scale only).
  • Bregman projections for Wasserstein barycenter [3], convolutional barycenter [21] and unmixing [4].
  • Optimal transport for domain adaptation with group lasso regularization [5].
  • Conditional gradient [6] and generalized conditional gradient for regularized OT [7].
  • Linear OT [14] and joint OT matrix and mapping estimation [8].
  • Wasserstein Discriminant Analysis [11] (requires autograd + pymanopt).
  • Gromov-Wasserstein distances and barycenters ([13] and regularized [12]).
  • Stochastic optimization for large-scale optimal transport (semi-dual problem [18] and dual problem [19]).
  • Non-regularized free support Wasserstein barycenters [20].
  • Unbalanced OT with KL relaxation distance and barycenter [10, 25].
  • Screening Sinkhorn algorithm for OT [26].
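
The different entropic solvers above are exposed through the method argument of ot.sinkhorn. The following is a minimal sketch, not taken from the library documentation: the point clouds and regularization value are made up for illustration, and the method strings are assumed to match those documented for ot.sinkhorn in recent releases.

import numpy as np
import ot

# two small empirical distributions with uniform weights
n = 50
rng = np.random.RandomState(0)
xs = rng.randn(n, 2)
xt = rng.randn(n, 2) + 2.0
a = ot.unif(n)  # source histogram (uniform)
b = ot.unif(n)  # target histogram (uniform)

M = ot.dist(xs, xt)  # squared Euclidean ground cost
M /= M.max()

reg = 1e-1
T = ot.sinkhorn(a, b, M, reg)                                     # Sinkhorn-Knopp [2]
T_stab = ot.sinkhorn(a, b, M, reg, method='sinkhorn_stabilized')  # stabilized version [9][10]
T_greedy = ot.sinkhorn(a, b, M, reg, method='greenkhorn')         # greedy Sinkhorn [22]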

Some demonstrations (both in Python and Jupyter Notebook format) are available in the examples folder.

Using and citing the toolbox

If you use this toolbox in your research and find it useful, please cite POT using the following bibtex reference:

@misc{flamary2017pot,
  title={POT Python Optimal Transport library},
  author={Flamary, R{\'e}mi and Courty, Nicolas},
  url={https://github.com/rflamary/POT},
  year={2017}
}

Installation

The library has been tested on Linux, macOS and Windows. It requires a C++ compiler for building/installing the EMD solver and relies on the following Python modules:

  • Numpy (>=1.11)
  • Scipy (>=1.0)
  • Cython (>=0.23)
  • Matplotlib (>=1.5)

Pip installation

Note that, due to a limitation of pip, Cython and NumPy need to be installed prior to installing POT. This can be done easily with:

pip install numpy cython

You can install the toolbox through PyPI with:

pip install POT

or get the very latest version by downloading it and then running:

python setup.py install --user # for user install (no root)

Anaconda installation with conda-forge

If you use the Anaconda python distribution, POT is available in conda-forge. To install it and the required dependencies:

conda install -c conda-forge pot

Post installation check

After a correct installation, you should be able to import the module without errors:

import ot

Note that for easier access the module is named ot instead of pot.
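
If you want to confirm which version was installed, you can also print the package version; this is just a quick sanity check using the standard __version__ attribute that POT exposes.

import ot

# print the installed POT version (the module is imported as `ot`)
print(ot.__version__)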

Dependencies

Some sub-modules require additional dependencies, which are discussed below:

  • ot.dr (Wasserstein dimensionality reduction) depends on autograd and pymanopt that can be installed with:
pip install pymanopt autograd
  • ot.gpu (GPU accelerated OT) depends on cupy, which has to be installed following the instructions on this page.

Obviously, you will also need CUDA installed and a compatible GPU. A quick check of the optional ot.dr sub-module is sketched below.
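
As a sanity check that ot.dr works once pymanopt and autograd are installed, the sketch below runs Wasserstein Discriminant Analysis on toy data. The data are made up for illustration and the call follows the ot.dr.wda interface (X, y, target dimension p, entropic regularization reg); double-check the exact signature against your installed version.

import numpy as np
import ot.dr

# toy two-class data in 10 dimensions
n = 100
rng = np.random.RandomState(0)
y = rng.randint(0, 2, n)                 # binary labels
X = rng.randn(n, 10) + 2.0 * y[:, None]  # class-dependent shift on every feature

# learn a 2D discriminant subspace and the associated projection function
P, proj = ot.dr.wda(X, y, p=2, reg=1)
X_proj = proj(X)                         # data projected onto the learned subspace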

Examples

Short examples

  • Import the toolbox
import ot
  • Compute Wasserstein distances (a self-contained version of these snippets is sketched after this list)
# a, b are 1D histograms (positive and summing to 1)
# M is the ground cost matrix
Wd = ot.emd2(a, b, M)                # exact linear program
Wd_reg = ot.sinkhorn2(a, b, M, reg)  # entropic regularized OT
# if b is a matrix, compute all distances to a and return a vector
  • Compute OT matrix
# a, b are 1D histograms (positive and summing to 1)
# M is the ground cost matrix
T = ot.emd(a, b, M)                # exact linear program
T_reg = ot.sinkhorn(a, b, M, reg)  # entropic regularized OT
  • Compute Wasserstein barycenter
# A is a n*d matrix containing d 1D histograms
# M is the ground cost matrix
ba = ot.barycenter(A, M, reg)  # reg is the regularization parameter
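
The snippets above assume that the histograms a, b, A and the cost matrix M already exist. Here is a self-contained sketch that builds toy data and runs the same computations end to end; the point clouds, bin grid and regularization values are made up for illustration.

import numpy as np
import ot

# toy point clouds with uniform weights
n = 100
rng = np.random.RandomState(42)
xs = rng.randn(n, 2)                         # source samples
xt = rng.randn(n, 2) + np.array([4.0, 4.0])  # target samples, shifted
a, b = ot.unif(n), ot.unif(n)                # 1D histograms (positive, sum to 1)

M = ot.dist(xs, xt)  # squared Euclidean ground cost matrix
M /= M.max()
reg = 1e-1

# Wasserstein distances
Wd = ot.emd2(a, b, M)                # exact linear program
Wd_reg = ot.sinkhorn2(a, b, M, reg)  # entropic regularized OT

# OT matrices (couplings)
T = ot.emd(a, b, M)                # exact linear program
T_reg = ot.sinkhorn(a, b, M, reg)  # entropic regularized OT

# Wasserstein barycenter of four Gaussian-like 1D histograms on a regular grid
nbins = 50
x = np.arange(nbins, dtype=np.float64)
A = np.vstack([np.exp(-(x - mu) ** 2 / 20.0) for mu in (10, 20, 30, 40)]).T
A /= A.sum(axis=0)  # each column is a histogram
M_grid = ot.dist(x.reshape(-1, 1), x.reshape(-1, 1))
M_grid /= M_grid.max()
ba = ot.barycenter(A, M_grid, reg=1e-2)  # entropic regularized barycenter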

Examples and Notebooks

The examples folder contains several examples and use cases for the library. The full documentation is available on Readthedocs.

Here is a list of the Python notebooks available here if you want a quick look:

You can also see the notebooks with Jupyter nbviewer.

Acknowledgements

This toolbox has been created and is maintained by Rémi Flamary and Nicolas Courty.

The contributors to this library are

This toolbox benefits a lot from open source research and we would like to thank the following persons for providing some code (in various languages):

Contributions and code of conduct

Every contribution is welcome and should respect the contribution guidelines. Each member of the project is expected to follow the code of conduct.

Support

You can ask questions and join the development discussion:

You can also post bug reports and feature requests in Github issues. Make sure to read our guidelines first.

References

[1] Bonneel, N., Van De Panne, M., Paris, S., & Heidrich, W. (2011, December). Displacement interpolation using Lagrangian mass transport. In ACM Transactions on Graphics (TOG) (Vol. 30, No. 6, p. 158). ACM.

[2] Cuturi, M. (2013). Sinkhorn distances: Lightspeed computation of optimal transport. In Advances in Neural Information Processing Systems (pp. 2292-2300).

[3] Benamou, J. D., Carlier, G., Cuturi, M., Nenna, L., & Peyré, G. (2015). Iterative Bregman projections for regularized transportation problems. SIAM Journal on Scientific Computing, 37(2), A1111-A1138.

[4] S. Nakhostin, N. Courty, R. Flamary, D. Tuia, T. Corpetti, Supervised planetary unmixing with optimal transport, Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS), 2016.

[5] N. Courty, R. Flamary, D. Tuia, A. Rakotomamonjy, Optimal Transport for Domain Adaptation, in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PP, no. 99, pp. 1-1.

[6] Ferradans, S., Papadakis, N., Peyré, G., & Aujol, J. F. (2014). Regularized discrete optimal transport. SIAM Journal on Imaging Sciences, 7(3), 1853-1882.

[7] Rakotomamonjy, A., Flamary, R., & Courty, N. (2015). Generalized conditional gradient: analysis of convergence and applications. arXiv preprint arXiv:1510.06567.

[8] M. Perrot, N. Courty, R. Flamary, A. Habrard (2016), Mapping estimation for discrete optimal transport, Neural Information Processing Systems (NIPS).

[9] Schmitzer, B. (2016). Stabilized Sparse Scaling Algorithms for Entropy Regularized Transport Problems. arXiv preprint arXiv:1610.06519.

[10] Chizat, L., Peyré, G., Schmitzer, B., & Vialard, F. X. (2016). Scaling algorithms for unbalanced transport problems. arXiv preprint arXiv:1607.05816.

[11] Flamary, R., Cuturi, M., Courty, N., & Rakotomamonjy, A. (2016). Wasserstein Discriminant Analysis. arXiv preprint arXiv:1608.08063.

[12] Gabriel Peyré, Marco Cuturi, and Justin Solomon (2016). Gromov-Wasserstein averaging of kernel and distance matrices, International Conference on Machine Learning (ICML).

[13] Mémoli, Facundo (2011). Gromov–Wasserstein distances and the metric approach to object matching. Foundations of Computational Mathematics, 11(4), 417-487.

[14] Knott, M. and Smith, C. S. (1984). On the optimal mapping of distributions, Journal of Optimization Theory and Applications, Vol. 43.

[15] Peyré, G., & Cuturi, M. (2018). Computational Optimal Transport.

[16] Agueh, M., & Carlier, G. (2011). Barycenters in the Wasserstein space. SIAM Journal on Mathematical Analysis, 43(2), 904-924.

[17] Blondel, M., Seguy, V., & Rolet, A. (2018). Smooth and Sparse Optimal Transport. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics (AISTATS).

[18] Genevay, A., Cuturi, M., Peyré, G. & Bach, F. (2016) Stochastic Optimization for Large-scale Optimal Transport. Advances in Neural Information Processing Systems (2016).

[19] Seguy, V., Bhushan Damodaran, B., Flamary, R., Courty, N., Rolet, A. & Blondel, M. (2018). Large-scale Optimal Transport and Mapping Estimation. International Conference on Learning Representations (ICLR).

[20] Cuturi, M. and Doucet, A. (2014). Fast Computation of Wasserstein Barycenters. International Conference on Machine Learning (ICML).

[21] Solomon, J., De Goes, F., Peyré, G., Cuturi, M., Butscher, A., Nguyen, A. & Guibas, L. (2015). Convolutional Wasserstein distances: Efficient optimal transportation on geometric domains. ACM Transactions on Graphics (TOG), 34(4), 66.

[22] J. Altschuler, J. Weed, P. Rigollet (2017). Near-linear time approximation algorithms for optimal transport via Sinkhorn iteration, Advances in Neural Information Processing Systems (NIPS) 31.

[23] Genevay, A., Peyré, G., Cuturi, M. (2018). Learning Generative Models with Sinkhorn Divergences, Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics (AISTATS) 21.

[24] Vayer, T., Chapel, L., Flamary, R., Tavenard, R. and Courty, N. (2019). Optimal Transport for structured data with application on graphs, Proceedings of the 36th International Conference on Machine Learning (ICML).

[25] Frogner, C., Zhang, C., Mobahi, H., Araya-Polo, M., Poggio, T. (2015). Learning with a Wasserstein Loss, Advances in Neural Information Processing Systems (NIPS).

[26] Alaya M. Z., Bérar M., Gasso G., Rakotomamonjy A. (2019). Screening Sinkhorn Algorithm for Regularized Optimal Transport, Advances in Neural Information Processing Systems 33 (NeurIPS).