collapsed Gibbs sampling
MCMC samplers can often be improved by marginalizing out a subset of the variables in closed form and performing MCMC over the remaining variables, a strategy known as collapsing. Collapsed samplers are more statistically efficient, since each sample accounts exactly for the full conditional distribution of the marginalized variables rather than a single point, and they often mix faster because the sampler takes steps in a lower-dimensional space.
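Below is a minimal sketch of the idea for a two-component Bernoulli mixture; the model, names, and hyperparameters are illustrative assumptions, not taken from the resources listed on this page. The mixing weight and both component biases have conjugate Beta priors, so they can be integrated out in closed form, and the sampler walks only over the assignment vector z:

```python
import numpy as np

rng = np.random.default_rng(0)

def collapsed_gibbs(x, n_iters=500, a=1.0, b=1.0):
    """Sample assignments z for a 2-component Bernoulli mixture, with the
    mixing weight (Beta(a, a) prior) and the per-component biases
    (Beta(b, b) priors) marginalized out analytically."""
    n = len(x)
    z = rng.integers(0, 2, size=n)          # random initial assignments
    # Sufficient statistics per component: member counts, and counts of ones.
    counts = np.array([np.sum(z == k) for k in (0, 1)], dtype=float)
    ones = np.array([np.sum(x[z == k]) for k in (0, 1)], dtype=float)
    for _ in range(n_iters):
        for i in range(n):
            counts[z[i]] -= 1               # remove x_i from its component
            ones[z[i]] -= x[i]
            # Collapsed conditional:
            #   p(z_i = k | z_{-i}, x) propto (n_{-i,k} + a) * pred_k(x_i),
            # where the Beta-Bernoulli posterior predictive of x_i = 1 is
            #   pred_k(1) = (ones_{-i,k} + b) / (n_{-i,k} + 2b).
            p_one = (ones + b) / (counts + 2 * b)
            lik = p_one if x[i] == 1 else 1.0 - p_one
            probs = (counts + a) * lik
            probs /= probs.sum()
            z[i] = rng.choice(2, p=probs)
            counts[z[i]] += 1               # add x_i back under its new label
            ones[z[i]] += x[i]
    return z

# Toy data: 50 draws from a coin with bias 0.9, then 50 with bias 0.1.
x = np.concatenate([rng.binomial(1, 0.9, 50), rng.binomial(1, 0.1, 50)])
z = collapsed_gibbs(x)
```

Because the parameters are integrated out, reassigning one z[i] changes the predictive distribution for every other point in its component; that coupling is why the sufficient statistics are updated incrementally inside the loop rather than recomputed from fixed parameter values.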
This concept has the prerequisites:
- importance sampling
- Bayesian networks
Goals:
- Be able to derive the update rules for collapsed Gibbs sampling (the general pattern is sketched just after this list)
- Be aware of the motivations in terms of:
  - greater statistical efficiency (from the Rao-Blackwell theorem)
  - faster mixing
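As a rough guide to the first goal (generic notation of ours, not taken from the resources below): collapsing a parameter theta replaces the usual Gibbs conditional p(z_i | theta, z_{-i}) with its average under the posterior of theta given everything except z_i,

\[
p(z_i \mid z_{-i}) = \int p(z_i \mid \theta, z_{-i}) \, p(\theta \mid z_{-i}) \, \mathrm{d}\theta ,
\]

which is available in closed form whenever the prior on theta is conjugate. Note that collapsing theta couples the remaining variables, so the z_i are resampled one at a time.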
Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)
Supplemental resources (the following are optional, but you may find them useful)
→ Machine learning summer school: Markov chain Monte Carlo (2009)
A video tutorial on MCMC methods.
Location: Part 2, 16:47 to 21:37
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location: Section 24.2.4, pages 841-844
→ Probabilistic Graphical Models: Principles and Techniques
A very comprehensive textbook for a graduate-level course on probabilistic AI.
Location: Section 12.4, pages 526-532