Introduction to Uncertainty Quantification

This version of the course is taught at Purdue University during Spring 2020 under the course codes ME 59700 and MA 59800. The instructor is Prof. Ilias Bilionis. The class meets every Tuesday and Thursday, 12:00pm-1:15pm, in WALC 2127.

The goal of this course is to introduce the fundamentals of uncertainty quantification to advanced undergraduate or graduate engineering and science students with research interests in the field of predictive modeling. Upon completion of this course, students should be able to:

  • Represent mathematically the uncertainty in the parameters of physical models.
  • Propagate parametric uncertainty through physical models to quantify the induced uncertainty on quantities of interest.
  • Calibrate the uncertain parameters of physical models using experimental data.
  • Combine multiple sources of information to enhance the predictive capabilities of models.
  • Pose and solve design optimization problems under uncertainty involving expensive computer simulations.

Student Evaluation

  • 10% Participation
  • 60% Homework
  • 30% Final Project

Lectures

  • Lecture 1 - Introduction, 01/14/2020.

  • Lecture 2 - Introduction to Predictive Modeling, 01/16/2020.

    • Topics: Predictive modeling, structural causal models and their graphical representation, aleatory vs epistemic uncertainties, the uncertainty propagation problem, the model calibration problem.
    • Lecture notebook
  • Lecture 3 - Introduction to Probability Theory (Part I), 01/21/2020.

    • Topics: Interpretation of probability as a representation of our state of knowledge, basic rules of probability, practice examples.
    • Lecture notebook
  • Lecture 4 - Introduction to Probability Theory (Part II), 01/23/2020.

    • Topics: Discrete random variables, probability mass function, cumulative distribution function, expectation, variance, covariance, joint probability mass function, marginals, independence, conditional probability, the Bernoulli distribution, the Binomial distribution, the categorical distribution, the Poisson distribution.
    • Lecture notebook
  • Lecture 5 - Introduction to Probability Theory (Part III), 01/28/2020.

    • Topics: Continuous random variables, the uniform distribution, the Gaussian distribution, analytical Bayesian inference examples.
    • Lecture notebook
  • Lecture 6 - Introduction to Probability Theory (Part IV), 01/30/2020.

    • Topics: Bayesian parameter estimation, credible intervals, Bayesian decision making, analytical Bayesian inference examples.
    • Lecture notebook
  • Lecture 7 - Introduction to Probability Theory (Part V), 02/04/2020.

    • Topics: Pseudo-random number generators, sampling the uniform distribution, the empirical cumulative distribution function, the Kolmogorov-Smirnov test, sampling the Bernoulli distribution, sampling any discrete distribution, limiting behavior of the binomial distribution, the central limit theorem and the ubiquitousness of the Gaussian distribution, sampling continuous distributions using inverse sampling and rejection sampling.
    • Lecture notebook
  • Lecture 8 - Uncertainty Propagation: Introduction to Monte Carlo Sampling, 02/06/2020.

    • Topics: The curse of dimensionality, estimating multi-dimensional integrals using Monte Carlo, quantification of epistemic uncertainty in Monte Carlo estimates, example of uncertainty propagation through partial differential equations. (A minimal Monte Carlo sketch appears after the lecture list.)
    • Lecture notebook
  • Lecture 9 - Uncertainty Propagation: Advanced Monte Carlo Sampling, 02/11/2020.

    • Topics: Importance sampling, Latin hypercube designs, example of uncertainty propagation through partial differential equations.
    • Lecture notebook
  • Lecture 10 - Uncertainty Propagation: Perturbation Methods, 02/13/2020.

    • Topics: Taylor series expansions, the Laplace approximation, low-order perturbation methods for dynamical systems, the method of adjoints.
    • Lecture notebook
  • Lecture 11 - Model Checking and Evaluation, 02/18/2020.

    • Topics: External validity, posterior predictive checking, test statistics, Bayesian p-values, examples.
    • Lecture notebook
  • Lecture 12 - Basics of Curve Fitting: The Generalized Linear Model, 02/20/2020.

    • Topics: Supervised learning, regression, generalized linear model, least squares, maximum likelihood.
    • Lecture notebook
  • Lecture 13 - Basics of Curve Fitting: Bayesian Linear Regression, 02/25/2020.

    • Topics: Maximum a posteriori estimates, Bayesian linear regression, evidence approximation, automatic relevance determination.
    • Lecture notebook
  • Lecture 14 - Advanced Curve Fitting: Gaussian Processes to Encode Prior Knowledge about Functions, 02/27/2020.

    • Topics: Stochastic processes, random fields, Gaussian process, mean functions, covariance functions, sampling from a Gaussian process, encoding prior knowledge about functions.
    • Lecture notebook
  • Lecture 15 - Advanced Curve Fitting: Gaussian Process Regression I, 03/03/2020.

    • Topics: Conditioning Gaussian random fields on exact and noisy observations. (A minimal GPy regression sketch appears after the lecture list.)
    • Lecture notebook
  • Lecture 16 - Advanced Curve Fitting: Gaussian Process Regression II, 03/05/2020.

    • Topics: Diagnostics for curve fitting, estimating the hyperparameters of covariance functions.
    • Lecture notebook (same as lecture 15)
  • Lecture 17 - Advanced Curve Fitting: Multivariate Gaussian Process Regression and Automatic Relevance Determination, 03/10/2020.

    • Topics: Multivariate Gaussian process regression, automatic relevance determination, the curse of dimensionality, active subspaces, high-dimensional model representation.
    • Lecture notebook
  • Lecture 18 - Application of Gaussian Process Regression: Optimizing expensive black-box functions, 03/12/2020.

    • Topics: Bayesian global optimization without noise, maximum upper interval, probability of improvement, expected improvement, quantifying epistemic uncertainty in the location of the maximum, Bayesian global optimization with noise.
    • Lecture notebook
  • No lecture on Tuesday 03/17/2020 (spring break).

  • No lecture on Thursday 03/19/2020 (spring break).

  • Lecture 19 - Inverse Problems/Model Calibration: Classical Approach, 03/24/2020.

    • Topics: Formulate inverse problems as optimization problems, reaction kinetics example, shortcomings of the classical approach.
    • Lecture notebook
  • Lecture 20 - Inverse Problems/Model Calibration: Bayesian Approach, 03/26/2020.

    • Topics: Ill-posed problems, Bayesian formulation, reminder of the Laplace approximation, reaction kinetics example, shortcomings of the Laplace approximation.
    • Lecture notebook
  • Lecture 21 - Sampling from Posteriors: The Metropolis Algorithm, 03/31/2020.

    • Topics: Basics of Markov chains, random walks, the Metropolis algorithm, Bayesian calibration of the catalysis problem. (A minimal Metropolis sketch appears after the lecture list.)
    • Lecture notebook
  • Practice Lab - Sampling from Posteriors: The Metropolis Algorithm, 04/02/2020.

  • Lecture 22 - Sampling from Posteriors: The Metropolis-Hastings Algorithm, 04/07/2020.

    • Topics: Metropolis-Hastings, Metropolis-Adjusted Langevin Dynamics, Gibbs sampling, Hierarchical Bayes.
    • Lecture notebook
  • Practice Lab - Sampling from Posteriors: The Metropolis-Hastings Algorithm, 04/09/2020.

  • Lecture 23 - PyMC3 Tutorial, 04/14/2020.

    • Topics: Defining distributions and models with PyMC3, performing MCMC simulations with pymc3.sample, real-world dataset examples. (A minimal PyMC3 sketch appears after the lecture list.)
    • Lecture notebook
  • Lecture 24 - Bayesian Model Selection using Sequential Monte Carlo, 04/16/2020.

  • Practice Lab - Bayesian Model Selection using Sequential Monte Carlo, 04/21/2020.

  • Lecture 25 - Estimating Posteriors: Variational Inference, 04/23/2020.

    • Topics: Introduction to variational inference, VI with PyMC3.
    • Lecture notebook
  • Practice Lab - Estimating Posteriors: Variational Inference, 04/28/2020.

  • Lecture 26 - Advanced Curve Fitting: Deep Neural Networks, 04/30/2020.

    • Topics: Introduction to basic concepts in deep neural networks, setting up DNNs in PyTorch, setting up Bayesian DNNs in pyro, physics-informed deep neural networks for solving PDEs.
    • Lecture notebook
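
The sketch below illustrates the Monte Carlo idea from Lecture 8: estimate a multi-dimensional integral as a sample average and quantify the epistemic uncertainty of the estimate via the central limit theorem. The integrand g is a made-up test function, not one from the lecture notebooks.

import numpy as np

# Estimate I = E[g(X)] for X uniform on the 5-dimensional unit cube.
# g is a made-up test integrand chosen only for this illustration.
def g(x):
    return np.exp(-np.sum(x ** 2, axis=-1))

rng = np.random.default_rng(0)
n = 10_000
x = rng.random((n, 5))           # n samples from [0, 1]^5
y = g(x)

I_hat = y.mean()                 # Monte Carlo estimate of the integral
se = y.std(ddof=1) / np.sqrt(n)  # standard error from the central limit theorem
print(f"I = {I_hat:.4f} +/- {1.96 * se:.4f}")  # rough 95% interval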
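
Lectures 15-16 do Gaussian process regression with GPy. Here is a minimal sketch of the GPy workflow on synthetic data; the kernel and noise level are placeholder choices, not the lecture's example.

import numpy as np
import GPy

# Synthetic 1D data: noisy observations of a smooth function.
rng = np.random.default_rng(1)
X = 10.0 * rng.random((20, 1))
Y = np.sin(X) + 0.1 * rng.standard_normal(X.shape)

# Squared-exponential (RBF) covariance function.
kernel = GPy.kern.RBF(input_dim=1, variance=1.0, lengthscale=1.0)
model = GPy.models.GPRegression(X, Y, kernel)

# Fit the hyperparameters by maximizing the marginal likelihood.
model.optimize(messages=False)

# Posterior mean and variance at test inputs.
X_star = np.linspace(0.0, 10.0, 100)[:, None]
mean, var = model.predict(X_star)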
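
Lecture 21's Metropolis algorithm in its simplest form: a Gaussian random-walk proposal targeting an unnormalized log-density. The target here is a standard Gaussian so the output is easy to check; this does not reproduce the catalysis example from the lecture.

import numpy as np

def log_target(x):
    """Unnormalized log-density of the target; here a standard Gaussian."""
    return -0.5 * x ** 2

def metropolis(log_p, x0, n_steps, step_size, rng):
    """Random-walk Metropolis with a Gaussian proposal."""
    samples = np.empty(n_steps)
    x, lp = x0, log_p(x0)
    for i in range(n_steps):
        x_new = x + step_size * rng.standard_normal()
        lp_new = log_p(x_new)
        # Accept with probability min(1, p(x_new) / p(x)).
        if np.log(rng.random()) < lp_new - lp:
            x, lp = x_new, lp_new
        samples[i] = x
    return samples

rng = np.random.default_rng(2)
samples = metropolis(log_target, x0=0.0, n_steps=50_000, step_size=1.0, rng=rng)
print(samples[1_000:].mean(), samples[1_000:].std())  # about 0 and 1 after burn-in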
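
For Lecture 23, here is a minimal PyMC3 model together with a pymc3.sample call. The data and the priors are placeholders chosen for the illustration:

import numpy as np
import pymc3 as pm

# Fake data: draws from a Gaussian with unknown mean and standard deviation.
rng = np.random.default_rng(3)
data = rng.normal(loc=1.0, scale=2.0, size=100)

with pm.Model():
    # Placeholder priors for this illustration.
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    sigma = pm.HalfNormal("sigma", sigma=5.0)
    # Likelihood of the observed data.
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data)
    # MCMC with PyMC3's default sampler (NUTS).
    trace = pm.sample(2000, tune=1000)

print(trace["mu"].mean(), trace["sigma"].mean())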

Homework Notebooks

Project submission timeline

  • Title and abstract, due 02/15/2020.

  • Final report, due 05/04/2020.

Running the notebooks on Google Colab

Make sure you have a Google account before you start. There are many ways to do this; here is the simplest one:

Google Colab using directly this GitHub site

  • Go to the Google Colab website and login with your Google account (if you are not already logged in).

  • Then hit File->Open Notebook.

  • In the pop-up window that opens, click on GitHub.

  • Write: https://github.com/PredictiveScienceLab/uq-course.git and hit enter.

  • Now you can select the notebook you would like to open. For example, select "lecture_01.ipynb".

  • That's it.

Google Colab using notebooks on your computer

  • First, download this repository to your computer. Use this link. Unzip the file and make sure you know where it is.

  • Go to the Google Colab website and login with your Google account (if you are not already logged in).

  • Google Colab can see your Google Drive, so you should be able to open any notebook you keep there. One way is to drop the course directory into your Google Drive; you can then find the notebooks via File->Open Notebook under the Google Drive tab.

  • The other way is to upload notebooks individually: on the Google Colab page, hit File->Upload Notebook and drop in the notebook you would like to open.

Installing software on Google Colab

When running on Google Colab, you will have to install some software manually every time you run a notebook. For example, to install the Python module GPy, you need to add a code cell with:

!pip install GPy
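
If a notebook needs several of the extra packages at once, you can install them in a single cell. Which of these Colab already ships with varies with the Colab image, so treat this particular list as a guess:

!pip install GPy pydoe fipy graphviz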

Running the notebooks on your personal computer

Find and download the right version of Anaconda for Python 3.7 from Continuum Analytics. This package contains most of the software we are going to need. Note: you do need Python 3, not Python 2. The notebooks will not work with Python 2.

OS Specific Instructions

Microsoft Windows

  • We need C, C++, and Fortran compilers, as well as the Python sources. Start the command line by opening "Anaconda Prompt" from the start menu. In the command line, type:
conda config --append channels https://repo.continuum.io/pkgs/free
conda install mingw libpython
  • Finally, you need git. As you install it, make sure to indicate that you want to use "Git from the command line and also from 3rd party software".

Apple OS X

  • Download and install the latest version of Xcode.

Linux

If you are using Linux, I am sure that you can figure it out on your own.

Installation of Required Python Packages

Regardless of your operating system, use the command line to install the following Python packages:

  • seaborn for statistical data visualization:
conda install seaborn
  • PyMC3 for MCMC sampling:
conda install pymc3
  • GPy for Gaussian process regression:
pip install GPy
  • pydoe for generating experimental designs:
pip install pydoe
  • fipy for solving partial differential equations using the finite volume method:
pip install fipy

*** Windows Users ***

You may receive the error

ModuleNotFoundError: No module named 'future'

If so, please install future and then install fipy:

pip install future
  • scikit-learn for some standard machine learning algorithms implemented in Python:
conda install scikit-learn
  • graphviz for visualizing probabilistic graphical models:
pip install graphviz
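
Once everything is installed, a quick way to verify the setup is to try importing the packages from a Python prompt. A minimal sanity check, assuming you installed everything listed above:

# Each import should succeed without errors.
import seaborn
import pymc3
import GPy
import pyDOE
import fipy
import sklearn
import graphviz

for module in (seaborn, pymc3, GPy, pyDOE, fipy, sklearn, graphviz):
    print(module.__name__, getattr(module, "__version__", "unknown"))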

Running the notebooks

  • Open the command line.
  • cd to your favorite folder.
  • Then, type:
git clone https://github.com/PredictiveScienceLab/uq-course.git
  • This will download the contents of this repository into a folder called uq-course.
  • Enter the uq-course folder:
cd uq-course
  • Start the jupyter notebook by typing the command:
jupyter notebook
  • Use the browser to navigate the course, experiment with the code, etc.
  • If the course content has been updated, type the following command (while being inside uq-course) to get the latest version:
git pull origin master

Keep in mind that if you have made local changes to the repository, you may have to commit them before moving on.