
Stein Variational Gradient Descent

A simple application of Stein Variational Gradient Descent, including some visualizations. This was put together as part of the lecture Monte Carlo Methods in Machine Learning and Artificial Intelligence at TU Berlin.

What is it?

Stein Variational Gradient Descent (SVGD) is a Monte Carlo method for drawing samples from a target distribution. Starting with a set of particles drawn from a reference distribution, it performs gradient descent in a reproducing kernel Hilbert space (RKHS) to find a perturbation direction that maximally decreases the KL divergence to the target. By iteratively applying these transformations, the particles converge towards the target distribution.
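The update described above can be sketched as follows. This is a minimal NumPy implementation of the SVGD update from Liu and Wang (2016), using an RBF kernel with the median-heuristic bandwidth; the function names (`rbf_kernel`, `svgd_step`) and the fixed step size are illustrative choices, not taken from this repository's code.

```python
import numpy as np

def rbf_kernel(X, h=-1):
    """RBF kernel matrix and the summed kernel gradients for particles X (n, d)."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(X**2, axis=1)[None, :]
                - 2 * X @ X.T)
    if h < 0:  # median heuristic for the bandwidth
        h = np.sqrt(0.5 * np.median(sq_dists) / np.log(X.shape[0] + 1))
    K = np.exp(-sq_dists / (2 * h**2))
    # sum_j grad_{x_j} k(x_j, x_i) for the RBF kernel
    grad_K = (K.sum(axis=1)[:, None] * X - K @ X) / h**2
    return K, grad_K

def svgd_step(X, grad_log_p, stepsize=0.1):
    """One SVGD update: phi(x_i) = mean_j [k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i)]."""
    K, grad_K = rbf_kernel(X)
    phi = (K @ grad_log_p(X) + grad_K) / X.shape[0]
    return X + stepsize * phi

# Example: transport particles towards a standard normal target,
# whose score is grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=-2.0, scale=0.5, size=(100, 1))
for _ in range(500):
    X = svgd_step(X, lambda X: -X)
```

After the loop, the particle mean should be near 0 and the standard deviation near 1, matching the target distribution.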

References

Paper: Liu, Qiang and Wang, Dilin (2016). Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm (arXiv.org)

Code: github.com/DartML/Stein-Variational-Gradient-Descent
The SVGD kernel and update functions are adapted from this repository.