Markov chain Monte Carlo
(1.2 hours to learn)
Markov Chain Monte Carlo (MCMC) is a set of techniques for approximately sampling from a probability distribution p by running a Markov chain which has p as its stationary distribution. Gibbs sampling and Metropolis-Hastings are the most common examples.
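As a concrete illustration (a minimal sketch, not drawn from the resources below), a random-walk Metropolis-Hastings sampler targeting a standard normal distribution can be written in a few lines of Python. Only an unnormalized log-density is needed; the function name and parameters here are illustrative:

```python
import math
import random

def metropolis_hastings(log_p, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal.

    log_p: unnormalized log-density of the target distribution.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        x_new = x + rng.gauss(0.0, step)
        # Symmetric proposal, so accept with probability min(1, p(x_new)/p(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < log_p(x_new) - log_p(x):
            x = x_new
        samples.append(x)
    return samples

# Target: standard normal; the normalizing constant can be dropped.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=50000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

After enough iterations the empirical mean and variance of the chain approach the target's 0 and 1, which is the sense in which the chain's stationary distribution is p.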
This concept has the prerequisites:
Core resources (read/watch one of the following)
→ Coursera: Probabilistic Graphical Models (2013)
→ Machine learning summer school: Markov chain Monte Carlo (2009)
A video tutorial on MCMC methods.
Location: 29:08 to 69:40
→ Information Theory, Inference, and Learning Algorithms
A graduate-level textbook on machine learning and information theory.
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 11.2, pages 537-542
- Additional dependency: multivariate Gaussian distribution
→ Probabilistic Graphical Models: Principles and Techniques
A very comprehensive textbook for a graduate-level course on probabilistic AI.
Location: Section 12.3-12.3.3, pages 505-515
- Some commonly used MCMC algorithms include:
- Gibbs sampling, where one variable is resampled given the others
- Metropolis-Hastings, a very general technique
- While MCMC is normally used as an approximate inference technique, it can also be used to get exact samples.
- We can analyze the mixing rate of MCMC samplers using spectral graph theory.
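To make the Gibbs sampling bullet above concrete, here is a hypothetical sketch (not taken from any of the listed resources) for a bivariate standard normal with correlation rho, where each full conditional is itself Gaussian: x given y is N(rho·y, 1 − rho²), and symmetrically for y:

```python
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampler for a bivariate standard normal with correlation rho.

    Each conditional distribution is N(rho * other, 1 - rho**2).
    """
    rng = random.Random(seed)
    sd = (1.0 - rho * rho) ** 0.5
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # resample x given the current y
        y = rng.gauss(rho * x, sd)  # resample y given the current x
        samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n_samples=50000)

# The empirical correlation of the chain should approach rho.
xs = [s[0] for s in samples]
ys = [s[1] for s in samples]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
cov = sum((a - mx) * (b - my) for a, b in samples) / len(samples)
vx = sum((a - mx) ** 2 for a in xs) / len(xs)
vy = sum((b - my) ** 2 for b in ys) / len(ys)
corr = cov / (vx * vy) ** 0.5
```

Note that each update conditions on the most recent value of the other variable; this is what makes the joint distribution of (x, y) converge to the target.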