The Bayesian method is the natural approach to inference, yet it is hidden from readers behind chapters of slow, mathematical analysis. The typical text on Bayesian inference involves two to three chapters on probability theory, and only then covers what Bayesian inference is. Unfortunately, due to the mathematical intractability of most Bayesian models, the reader is only shown simple, artificial examples. This can leave the reader with a "so what?" feeling about Bayesian inference. In fact, this was the author's own prior opinion.
After some recent successes of Bayesian methods in machine-learning competitions, I decided to investigate the subject again. Even with my mathematical background, it took me three straight days of reading examples and trying to put the pieces together to understand the methods. There was simply not enough literature bridging theory to practice. The problem with my misunderstanding was the disconnect between Bayesian mathematics and probabilistic programming. That being said, I suffered then so that the reader would not have to now. This book attempts to bridge the gap.
If Bayesian inference is the destination, then mathematical analysis is one particular path towards it. On the other hand, computing power is cheap enough that we can afford to take an alternate route via probabilistic programming. The latter path is much more useful, as it removes the need for mathematical intervention at each step; that is, we remove often-intractable mathematical analysis as a prerequisite to Bayesian inference. Simply put, this computational path proceeds via small intermediate jumps from beginning to end, whereas the mathematical path proceeds by enormous leaps, often landing far away from our target. Furthermore, without a strong mathematical background, the analysis required by the first path cannot even take place.
Probabilistic Programming and Bayesian Methods for Hackers is designed as an introduction to Bayesian inference from a computational/understanding-first, and mathematics-second, point of view. Of course, as an introductory book, we can only leave it at that: an introductory book. The mathematically trained may cure the curiosity this text generates with other texts designed with mathematical analysis in mind. For the enthusiast with a less mathematical background, or one who is not interested in the mathematics but simply the practice of Bayesian methods, this text should be sufficient and entertaining.
The choice of PyMC as the probabilistic programming language is two-fold. First, as of this writing there is no central resource for examples and explanations in the PyMC universe; the official documentation assumes prior knowledge of Bayesian inference and probabilistic programming. We hope this book encourages users at every level to look at PyMC. Second, with recent core developments and the growing popularity of the scientific stack in Python, PyMC is likely to become a core component soon enough.
PyMC does have dependencies to run, namely NumPy and (optionally) SciPy. To not limit the user, the examples in this book will rely on PyMC, NumPy and SciPy only.
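To give a flavour of what probabilistic programming looks like in PyMC, below is a minimal sketch of a model, using PyMC 2.x syntax; the data and variable names are made up purely for illustration and do not come from the book's chapters.

```python
import numpy as np
import pymc as pm  # PyMC 2.x assumed

# Hypothetical data: daily counts of some event (e.g. text messages received).
count_data = np.array([13, 24, 8, 24, 7, 35, 14, 11, 15, 11])

# Prior on the unknown Poisson rate.
lambda_ = pm.Exponential("lambda_", 1.0)

# Likelihood: the observed counts, modeled as Poisson-distributed.
obs = pm.Poisson("obs", lambda_, value=count_data, observed=True)

# Draw posterior samples with MCMC and inspect the posterior of the rate.
model = pm.Model([lambda_, obs])
mcmc = pm.MCMC(model)
mcmc.sample(20000, 5000)
posterior_lambda = mcmc.trace("lambda_")[:]
print(posterior_lambda.mean())
```

The prior, the likelihood, and the sampler are all ordinary Python objects; this is the style of modeling the book builds on.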
(The below chapters are rendered via the nbviewer at nbviewer.ipython.org/, are read-only, and are rendered in real time. Interactive notebooks and examples can be downloaded by cloning!)
-
Prologue. Why we do it.
-
Chapter 1: Introduction to Bayesian Methods. Introduction to the philosophy and practice of Bayesian methods, answering the question, "What is probabilistic programming?" Examples include:
- Inferring human behaviour changes from text message rates.
-
Chapter 2: A little more on PyMC. We explore modeling Bayesian problems using Python's PyMC library through examples. How do we create Bayesian models? Examples include:
- Detecting the frequency of cheating students, while avoiding liars.
- Calculating probabilities of space-shuttle disasters.
-
Chapter 3: Opening the Black Box of MCMC. We discuss how MCMC operates, along with diagnostic tools. Examples include:
- Bayesian clustering with mixture models
-
Chapter 4: The Greatest Theorem Never Told. We explore an incredibly useful, and dangerous, theorem: the Law of Large Numbers (a tiny simulation of it follows this entry). Examples include:
- Exploring a Kaggle dataset and the pitfalls of naive analysis
- How to sort Reddit comments from best to worst (not as easy as you think)
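To preview the Law of Large Numbers discussed in Chapter 4, a few lines of NumPy are enough to see it at work: the running average of random samples settles towards the true expected value as the sample size grows. The Poisson rate below is an arbitrary choice for illustration.

```python
import numpy as np

np.random.seed(1)
true_rate = 4.5  # arbitrary "true" expected value
samples = np.random.poisson(true_rate, size=100000)

# Running average of the first n samples, for every n.
running_mean = np.cumsum(samples) / np.arange(1, len(samples) + 1)

for n in (10, 100, 1000, 100000):
    print(n, running_mean[n - 1])  # approaches true_rate as n grows
```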
-
Chapter 5: Would you rather lose an arm or a leg? An introduction to loss functions and their (awesome) use in Bayesian methods. Examples include:
- Solving the Price is Right's Showdown
- Optimizing financial predictions
- Winning solution to the Kaggle Dark Worlds competition.
-
Chapter 6: Getting our prior-ities straight. Probably the most important chapter. We draw on expert opinions to answer questions like:
- How do we pick priors?
- What is the relationship between data sample size and the prior?
We explore useful tips for staying objective in analysis and common pitfalls of prior choice; a short conjugate-update sketch of the sample-size question follows this entry.
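As a small sketch of the sample-size question above, a conjugate Beta-Binomial update shows the prior's influence fading as data accumulates. The prior parameters and the "true" coin bias below are arbitrary choices for illustration, not values from the chapter.

```python
import numpy as np

np.random.seed(0)
true_p = 0.35                  # arbitrary "true" probability of heads
prior_a, prior_b = 10.0, 10.0  # a fairly opinionated Beta(10, 10) prior centred at 0.5

for n in (0, 10, 100, 10000):
    flips = np.random.binomial(1, true_p, size=n)
    heads = int(flips.sum())
    # Conjugate update: posterior is Beta(a + heads, b + tails).
    post_a, post_b = prior_a + heads, prior_b + (n - heads)
    print(n, round(post_a / (post_a + post_b), 3))  # posterior mean drifts from 0.5 towards true_p
```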
-
Chapter X1: Bayesian Markov Models
-
Chapter X2: Bayesian methods in Machine Learning. We explore how to resolve the overfitting problem, plus popular ML methods. Also included are probabilistic explanations of Ridge Regression and LASSO Regression. Examples include:
- Bayesian spam filtering plus how to defeat Bayesian spam filtering
- Tim Salimans' winning solution to Kaggle's Don't Overfit problem
-
Chapter X3: More PyMC Hackery. We explore the gritty details of PyMC. Examples include:
- Analysis of real-time GitHub repo stars and forks.
-
Chapter X4: Troubleshooting and debugging
More questions about PyMC? Please post your modeling, convergence, or any other PyMC questions on Cross Validated, the statistics Stack Exchange.
The book can be read in three different ways, starting from most recommended to least recommended:
-
The most recommended option is to clone the repository and download the .ipynb files to your local machine. If you have IPython installed, you can view the chapters in your browser, and edit and run the code provided (and try some practice questions). This is the preferred way to read this book, though it comes with some dependencies (see Installation and configuration below).
-
The second option is to use the nbviewer.ipython.org site, which displays IPython notebooks in the browser (example). The contents are updated synchronously as commits are made to the book. You can use the Contents section above to link to the chapters.
-
The most traditional approach, but the least recommended, is to read the chapters as PDFs contained in the previews folder. The content in these PDFs is not guaranteed to be the most recent, as the PDFs are only compiled periodically. Nor will the book be interactive in this form.
This book has an unusual development design. The content is open-sourced, meaning anyone can be an author. Authors submit content or revisions using the GitHub interface. After a major revision or addition, we collect all the content, compile it to a PDF, and increment the version of Probabilistic Programming and Bayesian Methods for Hackers.
Thanks to all our contributing authors, including (in chronological order):
- Cameron Davidson-Pilon
- Stef Gibson
- Vincent Ohprecio
- Lars Buitinck
- Paul Magwene
- Matthias Bussonnier
- Jens Rantil
- y-p
We would like to thank the Python community and the statistics community for building an amazing architecture.
Similarly, the book is only possible because of the PyMC library. A big thanks to the core devs of PyMC: Chris Fonnesbeck, Anand Patil, David Huard and John Salvatier.
One final thanks. This book was generated by IPython Notebook, a wonderful tool for developing in Python. We thank the IPython community for developing the Notebook interface. All IPython notebook files are available for download on the GitHub repository.
####What to contribute?
- The current chapter list is not finalized. If you see something that is missing (MCMC, MAP, Bayesian networks, good prior choices, Potential classes etc.), feel free to start there.
- Cleaning up Python code and making code more PyMC-esque.
- Giving better explanations
- Contributing to the IPython notebook styles.
####Installation and configuration
- IPython 0.13 is a requirement to view the ipynb files. It can be downloaded here
- For Linux users, you should not have a problem installing NumPy, SciPy and PyMC. For Windows users, check out pre-compiled versions if you have difficulty.
- In the styles/ directory are a number of files that are used to make things pretty. These are not only designed for the book, but they offer many improvements over the default settings of matplotlib and the IPython notebook. The in-notebook style has not been finalized yet (a short loading snippet follows this list).
- Currently the formatting of the style is not set, so try to follow what has been used so far, but inconsistencies are fine.
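If you would like to reuse the book's notebook styling in your own work, something along the following lines can load it from inside a notebook. The path styles/custom.css, and the assumption that the file contains its own <style> markup, are illustrative; adjust them to whatever actually lives in the styles/ directory.

```python
from IPython.core.display import HTML

def css_styling():
    # Assumes a stylesheet at styles/custom.css (relative to the notebook)
    # that wraps its rules in a <style> tag so HTML() renders it.
    styles = open("styles/custom.css", "r").read()
    return HTML(styles)

css_styling()
```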
####Committing
- All commits are welcome, even if they are minor ;)
- If you are unfamiliar with GitHub, you can email contributions to the address below.
####Contact
Contact the main author, Cam Davidson-Pilon, at cam.davidson.pilon@gmail.com or @cmrndp