Learning-Embeddings-into-Entropic-Wasserstein-Spaces-ENSAE

A thorough review of the paper "Learning Embeddings into Entropic Wasserstein Spaces" by Frogner et al. Includes a reproduction of the results on word embeddings.


Spring 2019 Geometric Methods in ML project at ENSAE

Work done for the Spring 2019 class of Geometric Methods in ML at ENSAE, taught by Marco Cuturi.

The report starts with a review of some optimal transport topics (Wasserstein spaces and Sinkhorn divergences). It then moves on to the problem of learning Wasserstein embeddings, with a focus on word embeddings. It finishes with a thorough study of word2cloud. We provide our own implementation, and the results we obtained are very promising.
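As a quick illustration of the entropic regularization reviewed in the report, here is a minimal NumPy sketch of the Sinkhorn iteration, which computes the regularized transport cost between two weighted point clouds. The function name, parameters, and toy data are illustrative assumptions, not code from this repository (the actual experiments live in the notebooks); note also that the debiased Sinkhorn divergence additionally subtracts the self-transport terms.

```python
import numpy as np

def entropic_ot_cost(a, b, C, eps=0.1, n_iters=200):
    """Entropy-regularized OT cost via Sinkhorn iterations (illustrative sketch).

    a, b : histograms (weights of the two point clouds), shapes (n,) and (m,)
    C    : cost matrix between the clouds, shape (n, m)
    eps  : entropic regularization strength
    """
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):          # alternate scaling updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]   # approximate optimal transport plan
    return np.sum(P * C)              # transport cost <P, C>

# Toy usage: two small point clouds with uniform weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))           # hypothetical "cloud" embedding of a word
Y = rng.normal(size=(6, 2))
C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # squared Euclidean costs
a = np.full(5, 1 / 5)
b = np.full(6, 1 / 6)
print(entropic_ot_cost(a, b, C))
```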

Final grade: 20/20

Paper

Frogner et al., Learning Embeddings into Entropic Wasserstein Spaces (ICLR 2019) [link]