This repository is a collection of notebooks about Bayesian machine learning. The following links display some of the notebooks via nbviewer to ensure proper rendering of formulas.
- Reliable uncertainty estimates for neural network predictions. Applies noise contrastive priors to Bayesian neural networks to obtain more reliable uncertainty estimates for out-of-distribution (OOD) data. Implemented with TensorFlow 2 and TensorFlow Probability.
- Variational inference in Bayesian neural networks. Demonstrates how to implement a Bayesian neural network and variational inference of network parameters. Example implementation with Keras (see also PyMC4 implementation).
- Latent variable models, part 2: Stochastic variational inference and variational autoencoders. Introduction to stochastic variational inference with a variational autoencoder as application example. Implementation with TensorFlow 2.x.
- Latent variable models, part 1: Gaussian mixture models and the EM algorithm. Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example implementation with plain NumPy/SciPy and scikit-learn for comparison (see also PyMC3 implementation).
- Bayesian optimization. Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with the libraries scikit-optimize and GPyOpt. Hyperparameter tuning as application example (a minimal usage sketch follows this list).
- Gaussian processes. Introduction to Gaussian processes. Example implementations with plain NumPy/SciPy as well as with the libraries scikit-learn and GPy (see also the sketch after this list).
- Bayesian regression with linear basis function models. Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn for comparison (see also PyMC4 implementation and PyMC3 implementation).
- Deep feature consistent variational autoencoder. Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example implementation with Keras.
- Conditional generation via Bayesian optimization in latent space. Describes an approach to conditionally generate outputs with desired properties by running Bayesian optimization in the latent space learned by a variational autoencoder. Example application implemented with Keras and GPyOpt.