Introduction to Uncertainty Quantification

This version of the course is being taught at Purdue University during Spring 2018. The course codes are ME 59700 and MA 598. The instructor is Prof. Ilias Bilionis. The class meets every Tuesday and Thursday, 12:00pm-1:15pm, in GRIS 102.

The goal of this course is to introduce the fundamentals of uncertainty quantification to advanced undergraduate and graduate engineering and science students with research interests in the field of predictive modeling. Upon completion of this course, students should be able to:

  • Represent mathematically the uncertainty in the parameters of physical models.
  • Propagate parametric uncertainty through physical models to quantify the induced uncertainty on quantities of interest.
  • Calibrate the uncertain parameters of physical models using experimental data.
  • Combine multiple sources of information to enhance the predictive capabilities of models.
  • Pose and solve design optimization problems under uncertainty involving expensive computer simulations.

Student Evaluation

  • 10% Participation
  • 60% Homework
  • 30% Final Project

Lectures

  • Lecture 1 - Introduction, 01/09/2018.

  • Lecture 2 - Quantifying Uncertainties in Physical Models, 01/11/2018.

  • Lecture 3 - Introduction to Probability Theory (Part I), 01/16/2018.

    • Topics: Dynamics of coin toss; Interpretation of probability; Basic rules of probability; Practice examples; Probability as an extension of Aristotelian logic.
    • Notebook
    • Slides
    • Handwritten Notes
    • Video
  • Lecture 4 - Introduction to Probability Theory (Part II), 01/18/2018.

    • Topics: Independence; Conditional independence; Graphical representation of probability models; Causality; Discrete random variables; Continuous random variables; Expectations.
    • Slides
    • Handwritten Notes
    • Video
  • Lecture 5 - Common Random Variables, 01/23/2018.

    • Topics: Uniform distribution; Generating uniform random numbers; Bernoulli distribution and how to sample it; Binomial distribution; Poisson distribution. (A short sampling sketch follows this entry.)
    • Notebook
    • Slides
    • Video
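To give a concrete taste of the sampling ideas in this lecture, here is a minimal Python sketch (an illustration with made-up parameter values, not one of the course notebooks) that turns uniform random numbers into Bernoulli and binomial samples:

import numpy as np

np.random.seed(0)  # for reproducibility

# Uniform random numbers on [0, 1)
u = np.random.rand(10000)

# Bernoulli(theta) samples: X = 1 if U <= theta, else 0
theta = 0.3
x_bern = (u <= theta).astype(int)
print("Bernoulli mean (should be close to 0.3): %1.3f" % x_bern.mean())

# Binomial(n, theta) samples as sums of n independent Bernoulli trials
n = 10
x_binom = (np.random.rand(10000, n) <= theta).sum(axis=1)
print("Binomial mean (should be close to n * theta = 3): %1.3f" % x_binom.mean())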
  • Lecture 6 - Turning Prior Information to Probability Statements, 01/25/2018.

    • Topics: Principle of insufficient reason; maximum entropy principle; statistical mechanics.
    • Notebook
    • Slides
    • Video
  • Lecture 7 - Generalized Linear Models (Part I), 01/30/2018.

    • Topics: Supervised learning; regression; generalized linear models; least squares; maximum likelihood. (A small least-squares sketch follows this entry.)
    • Notebook
    • Slides
    • Video
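To make the least-squares idea concrete, the following sketch (a toy example with synthetic data, not a course handout) fits a polynomial generalized linear model by ordinary least squares, which coincides with the maximum-likelihood estimate under Gaussian noise:

import numpy as np

np.random.seed(1)

# Synthetic data: y = sin(2*pi*x) + Gaussian noise (made-up example)
x = np.random.rand(30)
y = np.sin(2. * np.pi * x) + 0.1 * np.random.randn(30)

# Design matrix of polynomial basis functions phi_j(x) = x ** j
degree = 5
Phi = np.vander(x, degree + 1, increasing=True)

# Least-squares weights (the maximum-likelihood estimate under
# independent Gaussian noise)
w = np.linalg.lstsq(Phi, y)[0]

# Predict at a few test points
x_test = np.linspace(0., 1., 5)
Phi_test = np.vander(x_test, degree + 1, increasing=True)
print("Predictions: " + str(Phi_test.dot(w)))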
  • Lecture 8 - Generalized Linear Models (Part II), 02/01/2018.

  • Lecture 9 - Generalized Linear Models (Part III), 02/06/2018.

    • Topics: The evidence approximation; automatic relevance determination.
    • Notebook (same as the Lecture 8 handout)
    • Slides
    • Video
  • Lecture 10 - Priors on Function Spaces, 02/08/2018.

  • Lecture 11 - Conditioning a Random Field on Observations, 02/13/2018.

  • Lecture 12 - Reducing the Dimensionality of Random Fields, 02/15/2018.

    • Topics: Karhunen-Loève expansion (KLE); Nyström approximation to the KLE. (A small discretized-KLE sketch follows this entry.)
    • Notebook
    • Slides
    • Video
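For a flavor of the discretized Karhunen-Loève expansion, here is a minimal sketch (illustrative length scale and grid, not the course notebook) that builds a squared-exponential covariance matrix on a 1D grid, truncates its eigendecomposition, and samples the truncated field:

import numpy as np

# 1D grid and a squared-exponential covariance (assumed, illustrative choices)
x = np.linspace(0., 1., 200)
ell, s2 = 0.1, 1.0  # length scale and variance
C = s2 * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)

# Eigendecomposition (discrete analogue of the KLE eigenvalue problem)
lam, phi = np.linalg.eigh(C)
lam, phi = lam[::-1], phi[:, ::-1]  # sort in decreasing order

# Keep enough terms to capture 95% of the energy
M = np.searchsorted(np.cumsum(lam) / lam.sum(), 0.95) + 1
print("Number of KLE terms kept: %d" % M)

# Sample the truncated field: f(x) ~ sum_i sqrt(lam_i) * xi_i * phi_i(x)
xi = np.random.randn(M)
f = phi[:, :M].dot(np.sqrt(lam[:M]) * xi)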
  • Lecture 13 - Uncertainty Propagation: Sampling Methods I, 02/20/2018.

    • Topics: Monte Carlo; high-dimensional integration; error estimates; convergence. (A minimal Monte Carlo sketch follows this entry.)
    • Notebook
    • Slides
    • Video
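The basic Monte Carlo estimate and its error bar fit in a few lines; the sketch below (a toy integrand, not a course handout) estimates E[g(X)] for X ~ N(0, 1) together with the standard error, which decays like 1/sqrt(N):

import numpy as np

np.random.seed(2)

# Toy quantity of interest: g(x) = exp(sin(x)) with X ~ N(0, 1)
g = lambda x: np.exp(np.sin(x))

N = 10000
samples = g(np.random.randn(N))

# Monte Carlo estimate of E[g(X)] and its standard error sigma / sqrt(N)
mu_hat = samples.mean()
std_err = samples.std(ddof=1) / np.sqrt(N)
print("E[g(X)] ~ %1.4f +/- %1.4f (2 standard errors)" % (mu_hat, 2. * std_err))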
  • Lecture 14 - Uncertainty Propagation: Sampling Methods II, 02/22/2018.

    • Topics: Importance sampling; Latin hypercube designs; multilevel Monte Carlo. (A small importance-sampling sketch follows this entry.)
    • Notebook
    • Slides
    • Video
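As an illustration of importance sampling (a toy rare-event example, not the course notebook), the sketch below estimates P[X > 4] for X ~ N(0, 1) by sampling from a shifted proposal and reweighting:

import numpy as np
from scipy.stats import norm

np.random.seed(4)

# Plain Monte Carlo would need an enormous number of samples for this
# rare event; instead, draw from the proposal q = N(4, 1) and reweight.
N = 10000
x = np.random.randn(N) + 4.0            # samples from q = N(4, 1)
w = norm.pdf(x) / norm.pdf(x, loc=4.0)  # importance weights p(x) / q(x)
estimate = np.mean((x > 4.0) * w)
print("IS estimate: %1.3e, exact: %1.3e" % (estimate, norm.sf(4.0)))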
  • Lecture 15 - Uncertainty Propagation: Perturbation Methods, 02/27/2018.

    • Topics: Taylor series expansions; The Laplace Approximation; Low-order perturbation methods for dynamical systems; Method of adjoints.
    • Notebook
    • Slides
    • Video
  • Lecture 16 - Uncertainty Propagation: Polynomial Chaos I, 03/01/2018.

    • Topics: Hilbert space of square integrable functions; orthogonal polynomials; constructing orthonormal polynomials in 1D; Hermite, Laguerre, Legendre polynomials; constructing multi-dimensional orthonormal polynomials; solving stochastic dynamical systems with polynomial chaos. (A small 1D Hermite chaos sketch follows this entry.)
    • Notebook
    • Slides
    • Video
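As a small illustration of 1D Hermite polynomial chaos (a toy function, not the lecture notebook), the sketch below projects f(X) = exp(X), with X standard normal, onto probabilists' Hermite polynomials using Gauss-Hermite quadrature:

import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

# Toy function of a standard normal random variable X
f = lambda x: np.exp(x)

# Gauss-Hermite quadrature for the weight exp(-x^2 / 2); dividing the
# weights by sqrt(2*pi) gives expectations under the standard normal
x, w = hermegauss(20)
w = w / sqrt(2. * pi)

# Chaos coefficients: c_n = E[f(X) He_n(X)] / n!
order = 6
c = np.array([np.sum(w * f(x) * hermeval(x, np.eye(order + 1)[n])) / factorial(n)
              for n in range(order + 1)])

# The zeroth coefficient is the mean: E[exp(X)] = exp(1/2)
print("c_0 = %1.4f, exact mean = %1.4f" % (c[0], np.exp(0.5)))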
  • Lecture 17 - Uncertainty Propagation: Polynomial Chaos II, 03/06/2018.

    • Topics: Quadrature rules in 1D; sparse grid collocation; intrusive solution of stochastic dynamical systems; stochastic harmonic oscillator.
    • Notebook
    • Slides
    • Video
  • Lecture 18 - Uncertainty Propagation: Polynomial Chaos III, 03/08/2018.

    • Topics: Intrusive polynomial chaos for stochastic dynamical systems; stochastic exponential decay; stochastic harmonic oscillator; Non-intrusive polynomial chaos.
    • Notebook
    • Slides
    • Handwritten Notes
    • Video
  • No lecture on Tuesday 03/12/2018 (spring break).

  • No lecture on Thursday 03/15/2018 (spring break).

  • Lecture 19 - Inverse Problems/Model Calibration: Classic Approaches, 03/20/2018.

    • Topics: Formulation of inverse problems as optimization problems; method of adjoints revisited; calibration of reaction kinetics problem.
    • Notebook
    • Slides
    • Video
  • No lecture on Thursday 03/22/2018 (The instructor will be at 2018 NSF Design Circle Workshop: Designing and Developing Global Engineering Systems).

  • Lecture 20 - Inverse Problems/Model Calibration: Bayesian Approaches, 03/27/2018.

    • Topics: Stochastic formulation of inverse problems; the Laplace approximation; solving inverse problems with MCMC; hierarchical Bayes modeling.
    • Notebook
    • Slides
    • Video
  • Lecture 21 - Markov Chain Monte Carlo I, 03/29/2018.

    • Topics: Basics of Markov chains; random walks; Metropolis algorithm; Bayesian calibration of the catalysis problem. (A minimal Metropolis sketch follows this entry.)
    • Notebook
    • Slides
    • Video
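As an illustration of the random-walk Metropolis algorithm (a minimal sketch with a toy target density, not the course's catalysis example), consider sampling a standard normal:

import numpy as np

np.random.seed(3)

# Log of an (unnormalized) target density; toy choice: standard normal
log_target = lambda x: -0.5 * x ** 2

# Random-walk Metropolis with Gaussian proposals
num_steps, step_size = 10000, 1.0
x = 0.0
samples = np.empty(num_steps)
num_accepted = 0
for i in range(num_steps):
    x_prop = x + step_size * np.random.randn()
    # Accept with probability min(1, target(x_prop) / target(x))
    if np.log(np.random.rand()) < log_target(x_prop) - log_target(x):
        x = x_prop
        num_accepted += 1
    samples[i] = x

print("Acceptance rate: %1.2f" % (float(num_accepted) / num_steps))
print("Sample mean / std: %1.3f / %1.3f" % (samples.mean(), samples.std()))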
  • Lecture 22 - Markov Chain Monte Carlo II, 04/03/2018.

    • Topics: Metropolis-Hastings; Metropolis-Adjusted Langevin Dynamics; Gibbs sampling; Hierarchical Bayes.
    • Notebook
    • Slides
    • Video
  • Lecture 23 - Markov Chain Monte Carlo III, 04/05/2018.

    • Topics: Hierarchical Bayes examples; Logistic regression; PyMC tutorial.
    • Notebook
    • Slides: No slides. This is a hands-on session.
  • Lecture 24 - Bayesian Model Selection I, 04/10/2018.

    • Topics: Model evidence; Sequential Monte Carlo; Adaptive Importance Sampling.
    • Notebook
    • Slides
    • Video
  • Lecture 25 - Bayesian Model Selection II, 04/12/2018.

    • Topics: PySMC tutorial.
    • Notebook
    • Slides: No slides. This is a hands-on session.
  • No lecture on Tuesday 04/17/2018 (The instructor will be at the SIAM Conference for Uncertainty Quantification 2018).

  • No lecture on Thursday 04/19/2018 (The instructor will be at the SIAM Conference for Uncertainty Quantification 2018).

  • Lecture 26 - Accelerating Bayesian Statistics, 04/24/2018.

    • Topics: Kullback-Leibler divergence; expectation propagation; variational inference. (A small KL-divergence sketch follows this entry.)
    • Notebook
    • Slides
    • Video
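Since both expectation propagation and variational inference are built around the Kullback-Leibler divergence, here is a tiny sketch (the closed-form KL between two 1D Gaussians, with made-up parameters) that highlights its asymmetry:

import numpy as np

def kl_gaussians(mu0, s0, mu1, s1):
    # KL(N(mu0, s0^2) || N(mu1, s1^2)) between two 1D Gaussians
    return np.log(s1 / s0) + (s0 ** 2 + (mu0 - mu1) ** 2) / (2. * s1 ** 2) - 0.5

# KL is not symmetric, which is one reason variational inference
# (which minimizes KL(q || p)) and expectation propagation (which
# minimizes KL(p || q)) behave differently.
print("KL(q || p) = %1.4f" % kl_gaussians(0., 1., 1., 2.))
print("KL(p || q) = %1.4f" % kl_gaussians(1., 2., 0., 1.))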
  • Lecture 27 - Bayesian Algorithms for Solving Stochastic Optimization Problems with Expensive Information Sources, 04/26/2018.

    • Topics: Bayesian global optimization; expected improvement; probability of improvement; knowledge gradient; expected improvement in dominated hypervolume. (A small expected-improvement sketch follows this entry.)
    • Notebook
    • Slides
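For reference, the expected improvement acquisition function has a closed form when the surrogate's prediction at a candidate point is Gaussian; the sketch below (an illustration with made-up posterior values, not the course notebook) evaluates it for a minimization problem:

import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, y_best):
    # Expected improvement for minimization, given the posterior mean mu,
    # posterior standard deviation sigma, and best observed value y_best
    sigma = np.maximum(sigma, 1e-12)  # guard against zero predictive variance
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Toy usage with made-up posterior values at three candidate points
mu = np.array([0.2, 0.0, -0.1])
sigma = np.array([0.3, 0.05, 0.2])
print(expected_improvement(mu, sigma, y_best=0.1))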

Homework Notebooks

Installation of Required Software for Viewing the Notebooks

Find and download the right version of Anaconda for Python 2.7 from Continuum Analytics. This package contains most of the software we are going to need.

OS Specific Instructions

Microsoft Windows

  • We need C, C++, and Fortran compilers, as well as the Python sources. Start a command line (look for cmd) and type:
conda install mingw libpython
  • Finally, you need git. As you install it, make sure you select that you want to use it from the Windows command prompt.

Apple OS X

  • Download and install Xcode
  • Agree to the license of Xcode by opening a terminal and typing:
sudo xcrun cc
  • Install your favorite version of the GNU compiler suite. You can do this with Homebrew (after you install it, of course) by typing in the terminal:
brew install gcc

Alternatively, you may use MacPorts.

Linux

Nothing special is required.

Installation of Required Python Packages

Independently of the operating system, use the command line to install the following Python packages (a quick import check follows the list):

  • seaborn for statistical data visualization:
conda install seaborn
  • PyMC for MCMC sampling:
conda install pymc
  • GPy for Gaussian process regression:
pip install GPy
  • py-design for generating designs for computer codes:
pip install py-design
  • py-orthpol for generating orthogonal polynomials with respect to arbitrary probability measures:
pip install py-orthpol
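After installing, you can quickly verify that the core packages import correctly (run this with the Python that Anaconda installed; py-design and py-orthpol are omitted here, so consult their documentation for the corresponding import names):

# Quick sanity check that the core packages import correctly
import numpy
import scipy
import matplotlib
import seaborn
import pymc
import GPy
print("Core packages imported successfully.")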

Running the Notebooks

  • Open the command line.
  • cd to your favorite folder.
  • Then, type:
git clone https://github.com/PredictiveScienceLab/uq-course.git
  • This will download the contents of this repository into a folder called uq-course.
  • Enter the uq-course folder:
cd uq-course
  • Start the Jupyter notebook server by typing:
jupyter notebook
  • Use the browser to navigate the course, experiment with the code, etc.
  • If the course content is updated, type the following command (while inside the uq-course folder) to get the latest version:
git pull origin master

Keep in mind that if you have made local changes to the repository, you may have to commit or stash them before pulling.