Gaussian BP on trees
Marginalization in a Gaussian MRF can be performed exactly in cubic time by inverting the precision matrix, but this is too slow for large models. If the model is tree-structured, belief propagation can compute all the means and single-node variances in linear time. Unlike in general MRFs, it turns out that the loopy version, when it converges, yields the correct means (though generally not the correct variances).
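The linear-time computation can be sketched as message passing in information form. The following is a minimal illustrative sketch (not from any of the resources below), assuming scalar nodes and the information-form parameterization p(x) ∝ exp(−½ xᵀJx + hᵀx); the function name `gaussian_bp` and the synchronous message schedule are our own choices.

```python
import numpy as np

def gaussian_bp(J, h, edges):
    """Gaussian BP with scalar nodes on a tree given in information form.

    J: symmetric precision matrix (n x n), h: potential vector (n,),
    edges: list of undirected edges (i, j). Returns (means, variances).
    """
    n = len(h)
    nbrs = {i: set() for i in range(n)}
    for i, j in edges:
        nbrs[i].add(j)
        nbrs[j].add(i)

    # Each directed message (i -> j) is a pair (dJ, dh): the precision and
    # potential corrections that node i sends to node j.
    msg = {(i, j): (0.0, 0.0) for i in range(n) for j in nbrs[i]}

    # Synchronous updates; on a tree, n sweeps cover the longest path, so
    # the messages have converged to their fixed point afterwards.
    for _ in range(n):
        new = {}
        for (i, j) in msg:
            # Combine node i's local potential with all incoming messages
            # except the one from j, then marginalize out x_i.
            Jii = J[i, i] + sum(msg[(k, i)][0] for k in nbrs[i] if k != j)
            hii = h[i] + sum(msg[(k, i)][1] for k in nbrs[i] if k != j)
            new[(i, j)] = (-J[j, i] * J[i, j] / Jii, -J[j, i] * hii / Jii)
        msg = new

    # Marginal at node i: combine local potential with all incoming messages.
    means, variances = np.zeros(n), np.zeros(n)
    for i in range(n):
        Ji = J[i, i] + sum(msg[(k, i)][0] for k in nbrs[i])
        hi = h[i] + sum(msg[(k, i)][1] for k in nbrs[i])
        variances[i] = 1.0 / Ji
        means[i] = hi / Ji
    return means, variances
```

On a chain such as `edges = [(0, 1), (1, 2)]`, the results agree with direct inversion: the means equal `np.linalg.solve(J, h)` and the variances equal `np.diag(np.linalg.inv(J))`, but each BP sweep costs only linear time in the number of nodes.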
This concept has the prerequisites:
Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)
Supplemental resources (the following are optional, but you may find them useful)
→ Walk-sums and belief propagation in Gaussian graphical models
Location: Section 2.2, up to "Loopy belief propagation"
→ Gaussian Belief Propagation: Theory and Application
Location: Section 2, pages 8-16
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location: Section 23.2.3, pages 710-712
→ Probabilistic Graphical Models: Principles and Techniques
A very comprehensive textbook for a graduate-level course on probabilistic AI.
Location: Section 14.2-14.2.2, pages 608-615
- If we apply the same update rules to a graph with cycles, the algorithm often still works well in practice. This is known as loopy belief propagation.
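The loopy behavior can be checked numerically. Below is an illustrative sketch of our own (not taken from the resources above): the same scalar information-form updates are iterated on a 3-node cycle with a diagonally dominant precision matrix, where the iteration converges. The fixed-point means match the exact solution, while the fixed-point variances do not.

```python
import numpy as np

# A diagonally dominant precision matrix on a 3-node cycle (every pair of
# nodes is connected), so loopy Gaussian BP converges.
J = np.array([[3.0, -1.0, -1.0],
              [-1.0, 3.0, -1.0],
              [-1.0, -1.0, 3.0]])
h = np.array([1.0, 2.0, 3.0])
nbrs = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

# Directed messages (i -> j) as (dJ, dh) pairs, initialized to zero.
msg = {(i, j): (0.0, 0.0) for i in nbrs for j in nbrs[i]}

# Synchronous loopy updates, iterated to numerical convergence.
for _ in range(200):
    new = {}
    for i in nbrs:
        for j in nbrs[i]:
            Jii = J[i, i] + sum(msg[(k, i)][0] for k in nbrs[i] if k != j)
            hii = h[i] + sum(msg[(k, i)][1] for k in nbrs[i] if k != j)
            new[(i, j)] = (-J[j, i] * J[i, j] / Jii, -J[j, i] * hii / Jii)
    msg = new

means = np.array([(h[i] + sum(msg[(k, i)][1] for k in nbrs[i]))
                  / (J[i, i] + sum(msg[(k, i)][0] for k in nbrs[i]))
                  for i in nbrs])
bp_vars = np.array([1.0 / (J[i, i] + sum(msg[(k, i)][0] for k in nbrs[i]))
                    for i in nbrs])

print(np.allclose(means, np.linalg.solve(J, h)))        # True: means are exact
print(np.allclose(bp_vars, np.diag(np.linalg.inv(J))))  # False: variances differ
```

Here the true single-node variance is 0.5 at every node, but loopy BP converges to 1/√5 ≈ 0.447, an underestimate.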