# Markov chain Monte Carlo

(1.2 hours to learn)

## Summary

Markov Chain Monte Carlo (MCMC) is a family of techniques for approximately sampling from a probability distribution p by simulating a Markov chain whose stationary distribution is p. Gibbs sampling and Metropolis-Hastings are the most common examples.

## Context

This concept has the prerequisites:

- Monte Carlo estimation (MCMC is a kind of sampling method.)
- conditional distributions (MCMC is often used to sample from conditional distributions.)
- Markov chains

## Core resources (read/watch one of the following)

## -Free-

→ Coursera: Probabilistic Graphical Models (2013)

An online course on probabilistic graphical models.

Other notes:

- Click on "Preview" to see the videos.

→ Machine learning summer school: Markov chain Monte Carlo (2009)

→ Information Theory, Inference, and Learning Algorithms

A graduate-level textbook on machine learning and information theory.

## -Paid-

→ Pattern Recognition and Machine Learning

A textbook for a graduate machine learning course, with a focus on Bayesian methods.

Location:
Section 11.2, pages 537-542

Additional dependencies:

- multivariate Gaussian distribution

→ Probabilistic Graphical Models: Principles and Techniques

A very comprehensive textbook for a graduate-level course on probabilistic AI.

Location:
Sections 12.3-12.3.3, pages 505-515

## See also

- Some commonly used MCMC algorithms include:
  - Gibbs sampling, where one variable at a time is resampled conditioned on all the others
  - Metropolis-Hastings, a very general technique

- While MCMC is normally used as an approximate inference technique, it can also be used to get exact samples.
- We can analyze the mixing rate of MCMC samplers using spectral graph theory.