# loopy BP as variational inference

(2.8 hours to learn)

## Summary

Loopy belief propagation sounds like a hack, but it can be interpreted as a variational inference algorithm. In particular, its updates are fixed-point equations for an approximate variational problem in which both the energy functional (whose entropy term is replaced by the Bethe entropy) and the marginal polytope (relaxed to the local polytope of pairwise-consistent pseudomarginals) are approximated. While this analysis doesn't lead to any strong guarantees, it is the basis for generalizations of loopy BP which have stronger guarantees.
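To make the "fixed point" claim concrete, the sketch below runs parallel sum-product updates on a toy, hypothetical model (a cycle of three binary variables with arbitrary unary potentials and an attractive pairwise coupling) until the messages stop changing, then compares the resulting beliefs to exact marginals computed by brute force. All potentials and the coupling strength `J` are made up for illustration; on a single loop the beliefs are close to, but not exactly equal to, the true marginals.

```python
import itertools
import math

# Toy pairwise MRF on a 3-cycle of binary variables (hypothetical example):
# unary potentials phi[i][x]; pairwise psi(x, y) = exp(J) if x == y, else 1.
J = 0.5
phi = [[1.0, 2.0], [1.5, 1.0], [1.0, 1.0]]  # arbitrary unary potentials
edges = [(0, 1), (1, 2), (2, 0)]

def psi(x, y):
    return math.exp(J) if x == y else 1.0

def neighbors(i):
    return [b if a == i else a for (a, b) in edges if i in (a, b)]

# Messages m[(i, j)][x_j]: message from i to j, initialized uniform.
msgs = {(i, j): [1.0, 1.0] for (a, b) in edges for (i, j) in [(a, b), (b, a)]}

# Parallel sum-product updates: a fixed-point iteration whose stationary
# points correspond to stationary points of the Bethe variational problem.
for _ in range(100):
    new = {}
    for (i, j) in msgs:
        out = []
        for xj in (0, 1):
            s = 0.0
            for xi in (0, 1):
                prod = phi[i][xi] * psi(xi, xj)
                for k in neighbors(i):
                    if k != j:
                        prod *= msgs[(k, i)][xi]
                s += prod
            out.append(s)
        z = sum(out)
        new[(i, j)] = [v / z for v in out]
    delta = max(abs(new[e][x] - msgs[e][x]) for e in msgs for x in (0, 1))
    msgs = new
    if delta < 1e-12:
        break

def belief(i):
    # Belief at node i: unary potential times all incoming messages.
    b = [phi[i][x] * math.prod(msgs[(k, i)][x] for k in neighbors(i))
         for x in (0, 1)]
    z = sum(b)
    return [v / z for v in b]

def exact(i):
    # Exact marginal by brute-force enumeration, for comparison.
    p = [0.0, 0.0]
    for xs in itertools.product((0, 1), repeat=3):
        w = math.prod(phi[v][xs[v]] for v in range(3))
        w *= math.prod(psi(xs[a], xs[b]) for (a, b) in edges)
        p[xs[i]] += w
    z = sum(p)
    return [v / z for v in p]
```

Because the graph has a single loop, the iteration converges quickly, and the beliefs approximate (but do not exactly match) the brute-force marginals; on a tree, they would agree exactly.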

## Context

This concept has the prerequisites:

- loopy belief propagation
- variational inference
- Lagrange multipliers (Lagrange multipliers are needed to derive the update rules.)

## Core resources (read/watch one of the following)

## -Free-

→ Graphical models, exponential families, and variational inference (2008)

An in-depth review of exact and approximate inference methods for graphical models.

Location:
Sections 4.1-4.1.4, pages 76-91

## Supplemental resources (the following are optional, but you may find them useful)

## -Paid-

→ Machine Learning: a Probabilistic Perspective

A very comprehensive graduate-level machine learning textbook.

Location:
Sections 22.3-22.3.5, pages 776-782

Additional dependencies:

- exponential families
- mean field approximation

→ Probabilistic Graphical Models: Principles and Techniques

A very comprehensive textbook for a graduate-level course on probabilistic AI.

Location:
Sections 11.2 (pages 386-390), 11.3.5 (pages 404-407) and 11.3.6 (pages 411-414)

Additional dependencies:

- junction trees

## See also

- If loopy BP converges on a Gaussian graphical model, it yields the correct posterior means (though not necessarily the correct variances).
- Loopy BP is guaranteed to converge for a graph with a single loop.
- Tree-reweighted belief propagation is an algorithm inspired by the same ideas, but where the approximation to KL divergence is convex and gives an upper bound on the partition function.
- Some other inference algorithms based on variational principles:
  - expectation propagation, which approximates BP messages in terms of expectations
  - mean field approximation, where different variables are approximated as independent in the posterior
  - variational Bayes, a general framework for posterior inference in Bayesian models