loopy belief propagation
(1.8 hours to learn)
Summary
The sum-product and max-product algorithms give exact answers on tree-structured graphical models, but if we apply the same update rules to a general graph, the results are often surprisingly good. This is known as loopy belief propagation, a widely used approximate inference algorithm in coding theory and low-level vision.
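To make the idea concrete, here is a minimal sketch of loopy sum-product on a toy pairwise model: three binary variables in a cycle (the smallest loopy graph), with a hypothetical unary potential biasing node 0 and pairwise potentials that favour agreement. The model, the potentials, and the fixed iteration count are all illustrative assumptions, not part of any particular reference implementation; exact marginals from brute-force enumeration are printed alongside for comparison.

```python
import itertools
import math

# Hypothetical toy model: binary MRF on the 3-cycle 0-1-2-0.
J = 0.5                                  # coupling strength (assumed)
phi = {0: [1.0, 2.0],                    # unary potential biasing node 0 toward state 1
       1: [1.0, 1.0],
       2: [1.0, 1.0]}
edges = [(0, 1), (1, 2), (2, 0)]
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

def psi(xi, xj):
    # Pairwise potential favouring agreement between neighbours.
    return math.exp(J if xi == xj else -J)

# Messages m[(i, j)][x_j] from node i to node j, initialised uniformly.
msgs = {(i, j): [1.0, 1.0] for i in neighbors for j in neighbors[i]}

# Run the tree update rules anyway, for a fixed number of sweeps.
# On a loopy graph convergence is not guaranteed in general.
for _ in range(50):
    new = {}
    for (i, j) in msgs:
        m = [sum(phi[i][xi] * psi(xi, xj)
                 * math.prod(msgs[(k, i)][xi] for k in neighbors[i] if k != j)
                 for xi in (0, 1))
             for xj in (0, 1)]
        z = sum(m)
        new[(i, j)] = [v / z for v in m]   # normalise for numerical stability
    msgs = new

def belief(i):
    # Approximate marginal: unary potential times incoming messages, normalised.
    b = [phi[i][x] * math.prod(msgs[(k, i)][x] for k in neighbors[i])
         for x in (0, 1)]
    z = sum(b)
    return [v / z for v in b]

def exact_marginal(i):
    # Brute-force marginal by summing over all 2^3 joint configurations.
    scores = [0.0, 0.0]
    for x in itertools.product((0, 1), repeat=3):
        w = math.prod(phi[n][x[n]] for n in range(3))
        w *= math.prod(psi(x[a], x[b]) for a, b in edges)
        scores[x[i]] += w
    z = sum(scores)
    return [s / z for s in scores]

print("loopy BP belief at node 0:", belief(0))
print("exact marginal at node 0: ", exact_marginal(0))
```

Because this graph is a single loop, the messages do converge here, and the beliefs track the exact marginals qualitatively (both favour state 1 at node 0), though on loopy graphs the fixed point need not equal the true marginals.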
Context
This concept has the prerequisites:
- sum-product on trees (Loopy BP uses the same update rules as the tree-based version.)
- max-product on trees (Loopy BP uses the same update rules as the tree-based version.)
Core resources (read/watch one of the following)
-Paid-
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location:
Sections 22.1-22.2.5, pages 767-774
Supplemental resources (the following are optional, but you may find them useful)
-Free-
→ Coursera: Probabilistic Graphical Models (2013)
An online course on probabilistic graphical models.
Other notes:
- Click on "Preview" to see the videos.
-Paid-
→ Probabilistic Graphical Models: Principles and Techniques
A very comprehensive textbook for a graduate-level course on probabilistic AI.
Location:
Sections 11.3-11.3.3 (pages 391-401) and boxes 11.B and 11.C (pages 407-411)
Other notes:
- This uses the cluster graph formalism, which is more general than the factor graph version, but has more cumbersome notation.
See also
- The Brouwer fixed point theorem implies that loopy BP has at least one fixed point.
- Loopy BP can be viewed as a [variational inference](variational_inference) algorithm which tries to minimize the Bethe free energy.
- In the case of Gaussian graphical models, loopy BP is guaranteed to converge to the correct mean.
- Loopy BP is guaranteed to converge in graphs with a single loop.
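The Bethe free energy mentioned above can be written down explicitly. In factor graph notation (following the standard Yedidia–Freeman–Weiss formulation, stated here as a sketch rather than a derivation), with factor beliefs $b_f$, variable beliefs $b_i$, and $d_i$ the number of factors touching variable $i$:

```latex
F_{\mathrm{Bethe}}(b)
  = \sum_{f} \sum_{\mathbf{x}_f} b_f(\mathbf{x}_f)
      \ln \frac{b_f(\mathbf{x}_f)}{f(\mathbf{x}_f)}
  \;-\; \sum_{i} (d_i - 1) \sum_{x_i} b_i(x_i) \ln b_i(x_i)
```

Fixed points of loopy BP correspond to stationary points of this objective subject to the local consistency constraints (each $b_f$ marginalizes to the matching $b_i$); on a tree the Bethe free energy equals the exact variational free energy, which is one way to see why BP is exact there.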