# Collapsed Gibbs sampling

## Summary

MCMC samplers can often be improved by marginalizing out a subset of the variables in closed form and running MCMC over only the remaining variables. This is more statistically efficient, since each sample effectively accounts for a larger part of the distribution, and it can also improve mixing by allowing the sampler to take larger steps in the remaining variables.
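
To make this concrete, here is a minimal sketch of a collapsed Gibbs sampler for a toy two-component mixture model. The component means and the Dirichlet concentration `alpha` are assumed values chosen for illustration; the key point is that the mixture weights are integrated out analytically, so the conditional for each assignment depends on the counts of the other assignments rather than on sampled weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two fixed Gaussian components, N(-2, 1) and N(+2, 1).
mus = np.array([-2.0, 2.0])
true_z = rng.integers(0, 2, size=200)
x = rng.normal(mus[true_z], 1.0)

K = 2
alpha = 1.0  # symmetric Dirichlet prior on the (collapsed) mixture weights
z = rng.integers(0, K, size=x.size)               # initial assignments
counts = np.bincount(z, minlength=K).astype(float)

def loglik(xi):
    # log N(xi | mu_k, 1) for each component k, up to a constant
    return -0.5 * (xi - mus) ** 2

for sweep in range(100):
    for i in range(x.size):
        counts[z[i]] -= 1                         # remove x_i from its cluster
        # Collapsed conditional: the weights are integrated out, so the
        # prior term is the predictive count (n_{-i,k} + alpha), not pi_k.
        logp = np.log(counts + alpha) + loglik(x[i])
        p = np.exp(logp - logp.max())
        p /= p.sum()
        z[i] = rng.choice(K, p=p)
        counts[z[i]] += 1
```

Because the weights never appear as a sampled variable, each update conditions on all the other assignments directly, which is exactly the larger-step behavior the summary describes.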

## Context

This concept has the prerequisites:

- Gibbs sampling
- MCMC convergence (Collapsed sampling is a way of improving mixing.)
- multivariate distributions (Collapsed Gibbs sampling involves marginalizing out some of the variables.)

## Goals

- Be able to derive the update rules for collapsed Gibbs sampling

- Be aware of the motivations in terms of:
  - greater statistical efficiency (from the Rao-Blackwell theorem)
  - faster mixing
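
The statistical-efficiency motivation can be sketched in one line via the law of total variance. If $f(x, z)$ is any test function and collapsing $z$ replaces $f$ with its conditional expectation $g(x) = \mathbb{E}[f(x, z) \mid x]$, then

```latex
\operatorname{Var}[f(x, z)]
  = \mathbb{E}\!\left[\operatorname{Var}[f(x, z) \mid x]\right]
  + \operatorname{Var}\!\left[\mathbb{E}[f(x, z) \mid x]\right]
  \;\ge\; \operatorname{Var}[g(x)],
```

so a Monte Carlo estimator built from $g$ over the remaining variables never has higher variance than one built from $f$ over all the variables.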

## Core resources

We're sorry, we haven't finished tracking down resources for this concept yet.

## Supplemental resources

The following resources are optional, but you may find them useful.

### Free

→ Machine learning summer school: Markov chain Monte Carlo (2009)

### Paid

→ Machine Learning: a Probabilistic Perspective

A very comprehensive graduate-level machine learning textbook.

Location:
Section 24.2.4, pages 841-844

→ Probabilistic Graphical Models: Principles and Techniques

A very comprehensive textbook for a graduate-level course on probabilistic AI.

Location:
Section 12.4, pages 526-532

Additional dependencies:

- importance sampling
- Bayesian networks

## See also

No additional notes.