Hamiltonian Monte Carlo
Hamiltonian Monte Carlo (HMC) is an MCMC algorithm which makes use of gradient information to avoid random-walk behavior and move more quickly toward regions of high probability. It is based on a discretization of Hamiltonian dynamics, with a Metropolis-Hastings accept/reject step to ensure that the sampler has the correct stationary distribution.
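To make this concrete, here is a minimal sketch of HMC in Python/NumPy: a leapfrog discretization of the dynamics followed by a Metropolis-Hastings accept/reject on the joint energy. The function names, the step size, and the number of leapfrog steps are illustrative choices, not part of any standard API.

```python
import numpy as np

def hmc_sample(log_prob, grad_log_prob, x0, n_samples=1000,
               step_size=0.1, n_leapfrog=20, rng=None):
    """One chain of Hamiltonian Monte Carlo with a leapfrog integrator."""
    if rng is None:
        rng = np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)  # resample momentum each iteration
        x_new, p_new = x.copy(), p.copy()
        # Leapfrog integration: half step in momentum, alternating full
        # steps in position and momentum, then a final half step.
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        for i in range(n_leapfrog):
            x_new += step_size * p_new
            if i < n_leapfrog - 1:
                p_new += step_size * grad_log_prob(x_new)
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        # Metropolis-Hastings accept/reject on the Hamiltonian
        # H(x, p) = -log p(x) + (1/2) p'p
        current_H = -log_prob(x) + 0.5 * p @ p
        proposed_H = -log_prob(x_new) + 0.5 * p_new @ p_new
        if np.log(rng.uniform()) < current_H - proposed_H:
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Example: sample from a standard 2-D Gaussian (log density known up to a constant)
log_prob = lambda x: -0.5 * x @ x
grad_log_prob = lambda x: -x
samples = hmc_sample(log_prob, grad_log_prob, np.zeros(2), n_samples=2000)
print(samples.mean(axis=0))  # should be near [0, 0]
```

Note that the discretization error of the leapfrog integrator is exactly what the accept/reject step corrects for: if the dynamics were simulated perfectly, every proposal would be accepted.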
This concept has the prerequisites:
- Metropolis-Hastings algorithm (HMC is an example of an M-H sampler.)
- gradient descent (HMC uses the gradient of the log probability in its update rule.)
Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)
Supplemental resources (the following are optional, but you may find them useful)
→ Bayesian Reasoning and Machine Learning
A textbook for a graduate machine learning course.
Location: Section 27.5.1, pages 553-555
→ Machine learning summer school: Markov chain Monte Carlo (2009)
A video tutorial on MCMC methods.
Location: Part 2, 22:58 to 24:10 and 39:44 to 43:53
→ Information Theory, Inference, and Learning Algorithms
A graduate-level textbook on machine learning and information theory.
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 11.5, pages 548-554
See also:
- the No-U-Turn Sampler (NUTS) is a parameter-free version of HMC
- Langevin methods are another sampling algorithm that makes use of gradient information
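For contrast with HMC, here is a hedged sketch of a Langevin method, the Metropolis-adjusted Langevin algorithm (MALA): the proposal takes a single gradient step plus Gaussian noise, and an M-H correction accounts for the asymmetric proposal. The function name and step size are illustrative.

```python
import numpy as np

def mala_step(x, log_prob, grad_log_prob, step_size, rng):
    """One Metropolis-adjusted Langevin (MALA) step."""
    # Langevin proposal: gradient drift plus Gaussian noise
    mean_fwd = x + step_size * grad_log_prob(x)
    x_prop = mean_fwd + np.sqrt(2 * step_size) * rng.standard_normal(x.shape)
    # M-H correction: the proposal q(x'|x) is Gaussian but not symmetric,
    # so both forward and reverse densities enter the acceptance ratio.
    mean_rev = x_prop + step_size * grad_log_prob(x_prop)
    log_q_fwd = -np.sum((x_prop - mean_fwd) ** 2) / (4 * step_size)
    log_q_rev = -np.sum((x - mean_rev) ** 2) / (4 * step_size)
    log_alpha = log_prob(x_prop) - log_prob(x) + log_q_rev - log_q_fwd
    return x_prop if np.log(rng.uniform()) < log_alpha else x

# Example: sample a standard 2-D Gaussian
rng = np.random.default_rng(0)
x = np.zeros(2)
draws = []
for _ in range(5000):
    x = mala_step(x, lambda z: -0.5 * z @ z, lambda z: -z, 0.1, rng)
    draws.append(x)
print(np.mean(draws, axis=0))  # should be near [0, 0]
```

MALA can be viewed as HMC with a single leapfrog step, which is why it still exhibits some random-walk behavior that HMC's longer trajectories suppress.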