HMM inference as belief propagation
The forward-backward algorithm for computing posterior marginals in an HMM can be viewed as a special case of sum-product belief propagation. Similarly, the Viterbi algorithm for computing the most likely state sequence can be viewed as a special case of max-product belief propagation.
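To make the correspondence concrete, here is a minimal NumPy sketch (the toy HMM parameters are invented for illustration, not taken from any source) in which the forward-backward marginals and the Viterbi path fall out of the same chain-structured message passing, differing only in whether incoming messages are combined by summation (sum-product) or maximization (max-product). A practical implementation would also rescale the messages to avoid underflow on long sequences.

```python
import numpy as np

# Hypothetical toy HMM: 2 hidden states, 3 observation symbols.
pi = np.array([0.6, 0.4])                    # initial state distribution
A = np.array([[0.7, 0.3],                    # A[i, j] = p(z_{t+1}=j | z_t=i)
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],               # B[i, k] = p(x_t=k | z_t=i)
              [0.1, 0.3, 0.6]])
obs = [0, 2, 1, 2]                           # observed symbol sequence

def forward_backward(pi, A, B, obs):
    """Sum-product BP on the HMM chain: posterior marginals p(z_t | x_{1:T})."""
    T, S = len(obs), len(pi)
    alpha = np.zeros((T, S))                 # forward messages alpha_t(j)
    beta = np.zeros((T, S))                  # backward messages beta_t(i)
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]   # sum over previous state
    beta[T - 1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])  # sum over next state
    gamma = alpha * beta                     # unnormalized marginals
    return gamma / gamma.sum(axis=1, keepdims=True)

def viterbi(pi, A, B, obs):
    """Max-product BP: the same recursion with sums replaced by maxima,
    plus backpointers to decode the most likely state sequence."""
    T, S = len(obs), len(pi)
    delta = np.zeros((T, S))                 # max-product forward messages
    back = np.zeros((T, S), dtype=int)       # argmax pointers for decoding
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A   # scores[i, j]: from state i to j
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    path = [int(delta[T - 1].argmax())]
    for t in range(T - 1, 0, -1):            # follow backpointers in reverse
        path.append(int(back[t][path[-1]]))
    return path[::-1]

print(forward_backward(pi, A, B, obs))       # one row of marginals per time step
print(viterbi(pi, A, B, obs))                # most likely state sequence
```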
This concept has the prerequisites:
Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)
Supplemental resources (the following are optional, but you may find them useful)
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 13.2.3, pages 625-627
- Building on this result, the Baum-Welch algorithm for learning HMM parameters can be seen as a [special case](baum_welch_as_em) of [Expectation-Maximization](expectation_maximization).
- Kalman smoothing can be seen as a [special case](kalman_as_bp) of the forward-backward algorithm, and hence a special case of BP.