(1.2 hours to learn)
Slice sampling is a method for sampling from a one-dimensional probability distribution by performing Gibbs sampling in an auxiliary variable model. A major virtue is that it doesn't require specifying a step size, which makes it a useful tool for constructing MCMC samplers that don't need hand-tuned step size parameters.
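To make the auxiliary-variable idea concrete, here is a minimal sketch of a one-dimensional slice sampler using the common stepping-out and shrinkage procedure. The function name, the initial bracket width `w`, and the example target are illustrative assumptions rather than anything taken from the resources below; `w` is only a rough guess that the bracket adapts around, not a step size that must be tuned.

```python
import numpy as np

def slice_sample(log_p, x0, n_samples, w=1.0, rng=None):
    """Sketch of a 1-D slice sampler with stepping-out and shrinkage.

    log_p : unnormalized log-density of the target distribution
    x0    : initial point
    w     : initial bracket width (a rough guess; the bracket adapts)
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    samples = []
    for _ in range(n_samples):
        # 1. Auxiliary "height" variable: u ~ Uniform(0, p(x)),
        #    drawn on the log scale for numerical stability.
        log_u = log_p(x) + np.log(rng.uniform())

        # 2. Stepping out: expand a bracket [left, right] around x until
        #    both ends lie outside the slice {x' : log p(x') > log u}.
        left = x - w * rng.uniform()
        right = left + w
        while log_p(left) > log_u:
            left -= w
        while log_p(right) > log_u:
            right += w

        # 3. Shrinkage: propose uniformly from the bracket, shrinking it
        #    toward x whenever a proposal falls outside the slice.
        while True:
            x_new = rng.uniform(left, right)
            if log_p(x_new) > log_u:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return np.array(samples)

# Example usage: draw from a standard normal via its unnormalized log-density.
draws = slice_sample(lambda x: -0.5 * x**2, x0=0.0, n_samples=5000)
```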
This concept has the prerequisites:
Core resources (read/watch one of the following)
→ Information Theory, Inference, and Learning Algorithms
A graduate-level textbook on machine learning and information theory.
→ Machine learning summer school: Markov chain Monte Carlo (2009)
A video tutorial on MCMC methods.
Location: Part 2, 22:58 to 39:44
Supplemental resources (the following are optional, but you may find them useful)
→ Bayesian Reasoning and Machine Learning
A textbook for a graduate machine learning course.
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 11.4, pages 546-548
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location: Section 24.5.2, pages 864-866
- Other auxiliary variable sampling methods include:
- The Swendsen-Wang algorithm is a powerful sampling method for Ising models.
- Hamiltonian Monte Carlo (HMC) uses gradient information to sample from continuous distributions.