annealed importance sampling
(2.4 hours to learn)
Annealed importance sampling (AIS) is a Monte Carlo algorithm based on sampling from a sequence of distributions that interpolate between a tractable initial distribution and the intractable target distribution. It returns a set of weighted samples, and in the limit of infinitely many intermediate distributions, the variance of the weights approaches zero. Its most common use is in estimating partition functions.
Goals:
- Know the steps of the AIS algorithm.
- Know how to obtain weighted samples and estimates of the partition function from the algorithm's outputs.
- Show that the variance of the weights approaches zero in the limit of infinitely many intermediate distributions (assuming the transition operator returns perfect samples).
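The procedure can be sketched in code. Below is a minimal illustration, not taken from the resources listed here: it anneals from a standard normal to an unnormalized Gaussian target along a geometric path, using a few Metropolis steps as the transition operator at each temperature. All distributions, parameter values, and function names are chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_f0(x):
    # Initial distribution: standard normal (unnormalized); Z0 = sqrt(2*pi).
    return -0.5 * x**2

def log_fT(x):
    # Unnormalized target: Gaussian with mean 3, std 0.5 (example choice).
    # Its true partition function is 0.5 * sqrt(2*pi).
    return -0.5 * ((x - 3.0) / 0.5) ** 2

def log_fj(x, beta):
    # Geometric path between f0 and fT, indexed by beta in [0, 1].
    return (1.0 - beta) * log_f0(x) + beta * log_fT(x)

def ais(n_runs=600, n_betas=100, n_mh=5, step=0.5):
    """Return final samples and their log importance weights."""
    betas = np.linspace(0.0, 1.0, n_betas)
    samples = np.empty(n_runs)
    log_w = np.empty(n_runs)
    for i in range(n_runs):
        x = rng.standard_normal()      # exact sample from the initial distribution
        lw = 0.0
        for b_prev, b in zip(betas[:-1], betas[1:]):
            # Accumulate the importance weight: ratio of consecutive densities.
            lw += log_fj(x, b) - log_fj(x, b_prev)
            # Metropolis transitions leaving the current f_b invariant.
            for _ in range(n_mh):
                prop = x + step * rng.standard_normal()
                if np.log(rng.random()) < log_fj(prop, b) - log_fj(x, b):
                    x = prop
        samples[i] = x
        log_w[i] = lw
    return samples, log_w

samples, log_w = ais()

# Partition function estimate: Z_target ≈ Z0 * mean of the weights.
Z0 = np.sqrt(2.0 * np.pi)
Z_hat = Z0 * np.mean(np.exp(log_w))

# Weighted samples can also estimate expectations under the target,
# e.g. its mean (which is 3 by construction here).
w = np.exp(log_w - log_w.max())
mean_hat = np.sum(w * samples) / np.sum(w)
```

With more intermediate distributions (larger `n_betas`), the spread of `log_w` shrinks, illustrating the variance result stated above.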
Core resources (read/watch one of the following)
→ Annealed importance sampling
- Section 2, "The annealed importance sampling procedure"
- Section 4, "Efficiency of annealed importance sampling"
Supplemental resources (the following are optional, but you may find them useful)
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 11.6, "Estimating the partition function"