
Mixture of Discrete Normalizing Flows for Variational Inference

The repository provides the source code and demo Jupyter notebooks:

  1. notebooks - Jupyter notebooks illustrating the use of MDNF with various models
  2. mdnf - the main files implementing flows, mixtures, inference, etc.

Specification of dependencies

The code was tested with Python 3.7.4 (on a Linux platform), using tensorflow 2.2.0 and tensorflow_probability 0.9.0 (both can be installed with pip install tensorflow tensorflow_probability). It also requires numpy, pandas, sklearn and scipy, which can be installed with pip install numpy pandas sklearn scipy but are also available by default in, for example, the Anaconda Python distribution. Potential problems with scipy 1.4.1 can be solved by downgrading it to version 1.2.1 with pip install scipy==1.2.1.
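As a quick sanity check of the environment (this snippet is not part of the repository), the following Python code prints the installed versions of the core dependencies and flags those that differ from the tested versions listed above:

```python
# Environment check: compare installed dependency versions against the
# versions the code was reportedly tested with. Not part of the repository.
import tensorflow as tf
import tensorflow_probability as tfp
import numpy, pandas, scipy, sklearn

tested = {
    "tensorflow": "2.2.0",
    "tensorflow_probability": "0.9.0",
    "scipy": "1.2.1",  # suggested downgrade target if scipy 1.4.1 causes problems
}

installed = {
    "tensorflow": tf.__version__,
    "tensorflow_probability": tfp.__version__,
    "numpy": numpy.__version__,
    "pandas": pandas.__version__,
    "scipy": scipy.__version__,
    "scikit-learn": sklearn.__version__,
}

for name, version in installed.items():
    note = f" (tested with {tested[name]})" if name in tested and version != tested[name] else ""
    print(f"{name}: {version}{note}")
```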

The .ipynb notebooks can be previewed with Jupyter Notebook and run from the command line with runipy. Visualizing the results requires matplotlib and seaborn (pip install matplotlib seaborn).
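As an alternative to the runipy CLI (a sketch only, not part of the repository; the notebook path below is a placeholder), a notebook can also be executed programmatically with nbformat and nbclient:

```python
# Sketch: execute a notebook programmatically (alternative to the runipy CLI).
# "notebooks/example.ipynb" is a placeholder path, not a file in this repository.
import nbformat
from nbclient import NotebookClient

nb = nbformat.read("notebooks/example.ipynb", as_version=4)
client = NotebookClient(nb, timeout=600, kernel_name="python3")
client.execute()  # runs all cells in order, raising on the first failing cell
nbformat.write(nb, "notebooks/example_executed.ipynb")
```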

Parts of the code for Bayesian networks require pgmpy (pip install pgmpy==0.1.10), and the code for Gaussian mixture models builds on the Python code implementing algorithms described in Bishop's book (can be installed with git clone https://github.com/ctgk/PRML; cd PRML; python setup.py install).
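For orientation only (the network below is made up and unrelated to the repository's models), a minimal pgmpy example of defining a discrete Bayesian network looks roughly like this:

```python
# Minimal pgmpy sketch (pgmpy 0.1.10 API): a two-node discrete Bayesian network.
# The structure and probabilities are purely illustrative.
from pgmpy.models import BayesianModel
from pgmpy.factors.discrete import TabularCPD

model = BayesianModel([("Rain", "WetGrass")])
cpd_rain = TabularCPD("Rain", 2, [[0.8], [0.2]])
cpd_wet = TabularCPD(
    "WetGrass", 2,
    [[0.9, 0.2],   # P(WetGrass=0 | Rain=0), P(WetGrass=0 | Rain=1)
     [0.1, 0.8]],  # P(WetGrass=1 | Rain=0), P(WetGrass=1 | Rain=1)
    evidence=["Rain"], evidence_card=[2],
)
model.add_cpds(cpd_rain, cpd_wet)
assert model.check_model()  # fails if the CPDs are inconsistent with the structure
```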

Finally, the code comparing partial and location-scale flows uses Edward2, which can be installed with pip install "git+https://github.com/google/edward2.git#egg=edward2".
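As a sketch only (not taken from this repository), Edward2 provides random variables that wrap TensorFlow Probability distributions; a minimal example:

```python
# Minimal Edward2 sketch: a random variable wrapping a TFP Normal distribution.
# Purely illustrative; not taken from this repository.
import edward2 as ed

z = ed.Normal(loc=0.0, scale=1.0, name="z")  # a sample is drawn on construction
print(z.value)                       # the sampled tensor
print(z.distribution.log_prob(0.0))  # log-density of the wrapped distribution
```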