importance sampling

(45 minutes to learn)


Importance sampling is a way of estimating expectations under an intractable distribution p by sampling from a tractable distribution q and reweighting the samples according to the ratio of the probabilities. While importance sampling can have extremely large (even infinite) variance when the proposal q is a poor match for p, it forms the basis for some very effective Monte Carlo estimators.
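The reweighting idea rests on the identity E_p[f(x)] = E_q[f(x) p(x)/q(x)]: draw samples from q and weight each by w = p(x)/q(x). As a minimal sketch (not from the resources above), here is an estimate of E_p[x^2] = 1 under a standard normal target p, using a wider normal proposal q; the function names and parameter choices are illustrative assumptions.

```python
import math
import random

random.seed(0)

def p_pdf(x):
    # Target density: standard normal N(0, 1). Assumed intractable
    # to sample from (only for illustration -- it is tractable here).
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def q_pdf(x):
    # Proposal density: a wider normal N(0, 2^2), easy to sample from.
    # Its heavier spread relative to p keeps the weights well behaved.
    return math.exp(-x * x / 8) / (2 * math.sqrt(2 * math.pi))

def importance_estimate(f, n=100_000):
    # Monte Carlo estimate of E_p[f(x)] using samples from q,
    # each reweighted by the density ratio p(x)/q(x).
    total = 0.0
    for _ in range(n):
        x = random.gauss(0, 2)        # sample from q
        w = p_pdf(x) / q_pdf(x)       # importance weight
        total += w * f(x)
    return total / n

est = importance_estimate(lambda x: x * x)  # true value: E_p[x^2] = 1
print(est)
```

If q were narrower than p instead of wider, occasional samples in p's tails would receive enormous weights, which is one way the naive estimator's variance blows up.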


This concept has the prerequisites:

Core resources (read/watch one of the following)


Information Theory, Inference, and Learning Algorithms
A graduate-level textbook on machine learning and information theory.
Author: David MacKay
Additional dependencies:
  • multivariate Gaussian distribution


Supplemental resources (the following are optional, but you may find them useful)


Machine learning summer school: Markov chain Monte Carlo (2009)
A video tutorial on MCMC methods.
Location: 15:36 to 22:37
Author: Iain Murray


See also