# annealed importance sampling

(2.4 hours to learn)

## Summary

Annealed importance sampling (AIS) is a Monte Carlo algorithm that samples from a sequence of distributions interpolating between a tractable initial distribution and the intractable target distribution. It returns a set of weighted samples, and in the limit of infinitely many intermediate distributions, the variance of the weights approaches zero. The most common use is in estimating partition functions.
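To make the procedure concrete, here is a minimal sketch of AIS on a toy 1-D problem (all distributions, temperature schedules, and step sizes below are illustrative choices, not part of any standard API): the initial distribution is a standard normal, the unnormalized target is a Gaussian centered at 2 with standard deviation 0.5, the intermediate distributions follow the usual geometric path, and a few Metropolis moves serve as the MCMC transition operator at each temperature.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_f0(x):
    # Normalized initial density: standard normal, so Z_0 = 1.
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_fT(x):
    # Unnormalized target: Gaussian at mean 2, std 0.5, constant dropped.
    # True Z_T = sqrt(2 * pi * 0.25) ~ 1.2533.
    return -(x - 2.0) ** 2 / (2 * 0.25)

def log_f(x, beta):
    # Geometric path: f_beta = f0^(1-beta) * fT^beta.
    return (1 - beta) * log_f0(x) + beta * log_fT(x)

def ais_run(betas, n_mcmc=5, step=0.7):
    x = rng.standard_normal()        # exact sample from the initial distribution
    logw = 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Accumulate the importance weight for moving to the next distribution.
        logw += log_f(x, b) - log_f(x, b_prev)
        # Metropolis transitions leaving f_b invariant.
        for _ in range(n_mcmc):
            prop = x + step * rng.standard_normal()
            if np.log(rng.random()) < log_f(prop, b) - log_f(x, b):
                x = prop
    return x, logw

betas = np.linspace(0.0, 1.0, 51)    # temperature schedule from 0 to 1
samples, logws = zip(*(ais_run(betas) for _ in range(1000)))

# The mean importance weight estimates Z_T / Z_0; here Z_0 = 1.
Z_hat = float(np.exp(np.array(logws)).mean())
```

With 50 intermediate distributions the weights have low variance, so `Z_hat` lands close to the true value of about 1.2533; with fewer temperatures the estimate degrades, illustrating why the limit of many intermediate distributions matters.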

## Context

This concept has the prerequisites:

- Markov chain Monte Carlo (AIS uses MCMC transition operators.)
- importance sampling (AIS gives a weighted sample from the distribution.)
- inference in MRFs (AIS is often used to estimate the partition function.)
- Central Limit Theorem (The Central Limit Theorem is used to show that the variance of the weights approaches zero.)

## Goals

- Know the steps of the AIS algorithm.

- Know how to obtain weighted samples and estimates of the partition function from the algorithm's outputs.

- Show that the variance of the weights approaches zero in the limit of infinitely many intermediate distributions (assuming the transition operator returns perfect samples).
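As a sketch of the second goal, the snippet below turns a batch of log weights into a numerically stable estimate of the log partition-function ratio (via the log-sum-exp trick) and a self-normalized weighted expectation; the numeric values are hypothetical placeholders standing in for the outputs of an actual AIS run.

```python
import numpy as np

# Hypothetical AIS outputs: final samples x_i with log importance weights logw_i.
logw = np.array([-0.3, 0.1, -0.5, 0.2, 0.0])
x = np.array([1.8, 2.1, 1.5, 2.3, 2.0])

# Stable estimate of log(Z_T / Z_0): log of the mean weight via log-sum-exp.
m = logw.max()
log_Z_ratio = m + np.log(np.exp(logw - m).mean())

# Self-normalized importance-weighted expectation of x under the target.
w = np.exp(logw - m)
E_x = float((w * x).sum() / w.sum())
```

Subtracting the maximum log weight before exponentiating avoids overflow when weights span many orders of magnitude, which is common when the temperature schedule is coarse.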

## Core resources (read/watch one of the following)

## -Free-

→ Annealed importance sampling

- Section 2, "The annealed importance sampling procedure"
- Section 4, "Efficiency of annealed importance sampling"

## Supplemental resources (the following are optional, but you may find them useful)

## -Paid-

→ Pattern Recognition and Machine Learning

A textbook for a graduate machine learning course, with a focus on Bayesian methods.

Location:
Section 11.6, "Estimating the partition function"

## See also

- AIS is commonly used to estimate the partition function of a probabilistic model.
- Tempered transitions is another MCMC algorithm based on modifying the target distribution.