Course material for Bayesian Inference and Modern Statistical Methods, STA360/601, Duke University, Spring 2015.
The first half of the course was based on my own lecture notes (Chapters 1–6 of Lecture Notes on Bayesian Statistics, Jeffrey W. Miller, 2015).
For the second half of the course, we used A First Course in Bayesian Statistical Methods, Peter D. Hoff, 2009, New York: Springer. http://www.stat.washington.edu/people/pdhoff/book.php
Bayes’ theorem, Definitions & notation, Decision theory, Beta-Bernoulli model, Gamma-Exponential model, Gamma-Poisson model
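As a quick illustration of the Beta-Bernoulli case, the conjugate update has a closed form; here is a minimal sketch (assuming numpy, with made-up data and hyperparameters):

```python
import numpy as np

# Beta(a, b) prior on the success probability theta of a Bernoulli likelihood.
a, b = 1.0, 1.0                      # illustrative prior hyperparameters
x = np.array([1, 0, 1, 1, 0, 1])     # made-up binary observations

# Conjugacy: the posterior is Beta(a + sum(x), b + n - sum(x)).
a_post = a + x.sum()
b_post = b + len(x) - x.sum()

print("posterior:", (a_post, b_post))
print("posterior mean:", a_post / (a_post + b_post))
```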
What is Bayesian inference? Why use Bayes? A brief history of statistics
One-parameter exponential families, Natural/canonical form, Conjugate priors, Multi-parameter exponential families, Motivations for using exponential families
Normal with conjugate Normal-Gamma prior, Sensitivity to outliers
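A minimal sketch of the conjugate update, assuming one common NormalGamma(m, c, a, b) parameterization with rate parameter b; the data and hyperparameters are made up:

```python
import numpy as np

# Prior: lambda ~ Gamma(a, rate=b), theta | lambda ~ Normal(m, 1 / (c * lambda)).
m, c, a, b = 0.0, 1.0, 2.0, 2.0            # illustrative hyperparameters
x = np.array([2.1, 1.7, 2.4, 1.9, 2.2])    # made-up data
n, xbar = len(x), x.mean()

# Conjugate update: the posterior is again NormalGamma.
m_post = (c * m + n * xbar) / (c + n)
c_post = c + n
a_post = a + n / 2.0
b_post = b + 0.5 * np.sum((x - xbar) ** 2) + c * n * (xbar - m) ** 2 / (2 * (c + n))

print("posterior hyperparameters:", m_post, c_post, a_post, b_post)
```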
Graphical models, De Finetti's theorem, Exchangeability
Monte Carlo, Rejection sampling, Importance sampling
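For instance, a small comparison of plain Monte Carlo and importance sampling for a Gaussian tail probability (a sketch assuming numpy and scipy; the N(4, 1) proposal is an illustrative choice):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000

# Target quantity: P(X > 4) for X ~ N(0, 1).
print("exact:", stats.norm.sf(4.0))

# Plain Monte Carlo: almost no draws land in the tail, so the estimate is noisy.
x = rng.standard_normal(n)
print("plain MC:", np.mean(x > 4.0))

# Importance sampling with a proposal centered in the tail.
y = rng.normal(4.0, 1.0, size=n)
w = stats.norm.pdf(y) / stats.norm.pdf(y, loc=4.0, scale=1.0)   # importance weights p/q
print("importance sampling:", np.mean(w * (y > 4.0)))
```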
Markov chain Monte Carlo (MCMC) with Gibbs sampling, Markov chain basics, MCMC diagnostics
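A minimal Gibbs sampler for the semi-conjugate Normal model (unknown mean and precision) might look like this; the priors, synthetic data, and burn-in length are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(5.0, 2.0, size=50)     # synthetic data, for illustration only
n = len(x)

# Semi-conjugate priors: mu ~ Normal(m0, 1/p0), lam ~ Gamma(a0, rate=b0),
# data x_i ~ Normal(mu, 1/lam).  Hyperparameters are made up.
m0, p0, a0, b0 = 0.0, 0.01, 0.1, 0.1

mu, lam = 0.0, 1.0
samples = []
for _ in range(5000):
    # Full conditional of mu given lam is Normal.
    prec = p0 + n * lam
    mean = (p0 * m0 + lam * x.sum()) / prec
    mu = rng.normal(mean, 1.0 / np.sqrt(prec))
    # Full conditional of lam given mu is Gamma (numpy uses a scale parameter).
    lam = rng.gamma(a0 + n / 2.0, 1.0 / (b0 + 0.5 * np.sum((x - mu) ** 2)))
    samples.append((mu, lam))

mu_s, lam_s = np.array(samples[1000:]).T   # discard burn-in
print("posterior mean of mu:", mu_s.mean(), " of sigma:", np.mean(1 / np.sqrt(lam_s)))
```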
Normal distribution, Wishart distribution, Normal with Normal-Wishart prior
Linear regression, Basis functions, Regularized least squares, Bayesian linear regression
Hierarchical models, Comparing multiple groups
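For example, with a Gaussian prior on the weights and known noise precision, the posterior over the weights is available in closed form (a sketch assuming numpy; alpha, beta, and the synthetic data are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2.0 * x - 1.0 + noise (illustrative).
x = rng.uniform(-1, 1, size=30)
y = 2.0 * x - 1.0 + rng.normal(0, 0.3, size=30)

# Design matrix of basis functions (here just intercept + linear term).
X = np.column_stack([np.ones_like(x), x])

# Gaussian prior w ~ N(0, I/alpha), Gaussian noise with precision beta (assumed known).
alpha, beta = 1.0, 1.0 / 0.3 ** 2

# Posterior over weights is Gaussian with covariance S and mean m;
# the mean coincides with regularized least squares with lambda = alpha / beta.
S = np.linalg.inv(alpha * np.eye(X.shape[1]) + beta * X.T @ X)
m = beta * S @ X.T @ y
print("posterior mean of weights:", m)
```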
Testing hypotheses, Model selection/inference, Variable selection in linear regression
Informative vs. non-informative, proper vs. improper, Jeffreys priors
Metropolis algorithm, Metropolis–Hastings algorithm
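A minimal random-walk Metropolis sampler (symmetric proposal, so the Hastings correction cancels); the target density and step size are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(theta):
    # Unnormalized log-density: an equal-weight mixture of N(-2, 1) and N(2, 1).
    return np.logaddexp(-0.5 * (theta + 2.0) ** 2, -0.5 * (theta - 2.0) ** 2)

theta, step = 0.0, 1.0
samples = []
for _ in range(20000):
    prop = theta + step * rng.standard_normal()   # symmetric random-walk proposal
    # Accept with probability min(1, target(prop) / target(theta)).
    if np.log(rng.uniform()) < log_target(prop) - log_target(theta):
        theta = prop
    samples.append(theta)

print("sample mean/sd:", np.mean(samples), np.std(samples))
```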
GLMs and examples (logistic, probit, Poisson)
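As one example, logistic regression coefficients can be sampled with the same random-walk Metropolis idea (a sketch assuming numpy; the prior, step size, and synthetic data are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic logistic-regression data: P(y = 1 | x) = sigmoid(1.5 * x - 0.5).
x = rng.normal(size=200)
X = np.column_stack([np.ones_like(x), x])
y = rng.uniform(size=200) < 1.0 / (1.0 + np.exp(-(X @ np.array([-0.5, 1.5]))))

def log_post(beta):
    # Bernoulli log-likelihood with logit link, plus a N(0, 10^2 I) prior on beta.
    eta = X @ beta
    return np.sum(y * eta - np.logaddexp(0.0, eta)) - 0.5 * np.sum(beta ** 2) / 100.0

# Random-walk Metropolis over the coefficient vector.
beta, draws = np.zeros(2), []
for _ in range(20000):
    prop = beta + 0.1 * rng.standard_normal(2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
        beta = prop
    draws.append(beta)

print("posterior mean of coefficients:", np.mean(draws[5000:], axis=0))
```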
See LICENSE.