slice sampling
(1.2 hours to learn)
Summary
Slice sampling is a method for sampling from a one-dimensional probability distribution by performing Gibbs sampling in an auxiliary variable model. A major virtue is that it does not require a carefully chosen step size, which makes it a useful tool for constructing MCMC samplers with no step size parameters to tune.
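To make the auxiliary-variable idea concrete, here is a minimal Python sketch of a univariate slice sampler using the stepping-out and shrinkage procedure (in the spirit of Neal's formulation, which the resources below cover). The function name and parameters are illustrative, not from any particular library; w is only an initial guess at the slice width, since the stepping-out and shrinkage loops adapt the bracket automatically.

import math
import random

def slice_sample(log_pdf, x0, w=1.0, n_samples=1000, max_steps=100):
    """Univariate slice sampler with stepping-out and shrinkage.

    log_pdf   : log of an (unnormalized) target density
    x0        : initial point
    w         : initial bracket width (a rough guess, not a tuned step size)
    """
    samples = []
    x = x0
    for _ in range(n_samples):
        # Auxiliary variable: draw a height uniformly under the density
        # at x, defining the slice {x' : log_pdf(x') > log_u}.
        log_u = log_pdf(x) + math.log(1.0 - random.random())

        # Stepping out: expand the bracket [left, right] until both
        # endpoints fall outside the slice (or a step cap is hit).
        left = x - w * random.random()
        right = left + w
        steps = 0
        while log_pdf(left) > log_u and steps < max_steps:
            left -= w
            steps += 1
        steps = 0
        while log_pdf(right) > log_u and steps < max_steps:
            right += w
            steps += 1

        # Shrinkage: propose uniformly from the bracket, shrinking it
        # toward x whenever the proposal lands outside the slice.
        while True:
            x_new = left + (right - left) * random.random()
            if log_pdf(x_new) > log_u:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return samples

# Example: draw from a standard normal via its unnormalized log-density.
draws = slice_sample(lambda x: -0.5 * x * x, x0=0.0, w=2.0, n_samples=5000)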
Context
This concept has the prerequisites:
- Gibbs sampling (Slice sampling is a special case of Gibbs sampling)
- Metropolis-Hastings algorithm (When it's intractable to sample exactly from a slice, we need to use a more general M-H update.)
Core resources (read/watch one of the following)
-Free-
→ Information Theory, Inference, and Learning Algorithms
A graduate-level textbook on machine learning and information theory.
→ Machine learning summer school: Markov chain Monte Carlo (2009)
Supplemental resources (the following are optional, but you may find them useful)
-Free-
→ Bayesian Reasoning and Machine Learning
A textbook for a graduate machine learning course.
-Paid-
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location:
Section 11.4, pages 546-548
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location:
Section 24.5.2, pages 864-866
See also
- Other auxiliary variable sampling methods include:
- The Swendsen-Wang algorithm is a powerful sampling method for Ising models.
- Hamiltonian Monte Carlo (HMC) uses gradient information to sample from a continuous model.
- Elliptical slice sampling is designed for models with a multivariate Gaussian prior.
- The No-U-Turn Sampler (NUTS) extends Hamiltonian Monte Carlo to eliminate tuning of the trajectory length.