# expectation propagation

(2.7 hours to learn)

## Summary

In some graphical models, it is intractable even to compute the messages in loopy belief propagation. Expectation propagation (EP) is a way of approximating these messages in terms of expectations of sufficient statistics. It can be viewed as a variational inference algorithm, and it often gives much more accurate results than mean-field approximations.
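To make the idea concrete, here is a minimal sketch of EP on Minka's classic "clutter problem": observations are drawn from a mixture of a Gaussian centered at an unknown `theta` and a broad "clutter" Gaussian, and EP approximates the posterior over `theta` by a single Gaussian. Each mixture-likelihood factor gets a Gaussian "site" approximation; EP repeatedly removes one site (forming the cavity distribution), computes the moments of the cavity times the exact factor, and updates the site to match those moments. All variable names and the specific toy dataset below are illustrative choices, not from the source.

```python
import math

def norm_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def ep_clutter(xs, w=0.2, prior_var=100.0, clutter_var=10.0, obs_var=1.0, iters=20):
    """EP for the clutter problem: infer theta from
    x_i ~ (1 - w) * N(theta, obs_var) + w * N(0, clutter_var),
    approximating the posterior over theta by a Gaussian."""
    n = len(xs)
    # Site approximations in natural parameters: precision r[i] and
    # precision-times-mean b[i], initialised to zero (i.e. flat).
    r = [0.0] * n
    b = [0.0] * n
    tau0, nu0 = 1.0 / prior_var, 0.0  # natural parameters of the N(0, prior_var) prior

    for _ in range(iters):
        for i, x in enumerate(xs):
            # Global approximation, then remove site i -> cavity distribution.
            tau = tau0 + sum(r)
            nu = nu0 + sum(b)
            tau_c, nu_c = tau - r[i], nu - b[i]
            if tau_c <= 0:  # skip the update if the cavity would be improper
                continue
            v_c, m_c = 1.0 / tau_c, nu_c / tau_c
            # Moments of the tilted distribution (cavity times exact factor);
            # analytic here because the factor is a two-component Gaussian mixture.
            z_in = (1 - w) * norm_pdf(x, m_c, v_c + obs_var)
            z_out = w * norm_pdf(x, 0.0, clutter_var)
            rho = z_in / (z_in + z_out)  # responsibility of the non-clutter part
            m1 = m_c + v_c * (x - m_c) / (v_c + obs_var)
            v1 = v_c - v_c ** 2 / (v_c + obs_var)
            mean = rho * m1 + (1 - rho) * m_c
            second = rho * (v1 + m1 ** 2) + (1 - rho) * (v_c + m_c ** 2)
            var = second - mean ** 2
            # Update site i so the new global approximation matches the moments.
            r[i] = 1.0 / var - tau_c
            b[i] = mean / var - nu_c

    tau = tau0 + sum(r)
    nu = nu0 + sum(b)
    return nu / tau, 1.0 / tau  # posterior mean and variance

# Five inliers near theta = 2, plus one clutter point at 9.0.
mean, var = ep_clutter([1.8, 2.2, 2.1, 1.9, 2.0, 9.0])
```

Note that the site updates can produce negative site precisions `r[i]`; this is normal in EP (sites need not be valid distributions on their own), which is why the code only guards against an improper cavity.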

## Context

This concept has the prerequisites:

- loopy belief propagation (EP is an approximation to loopy BP.)
- variational inference (EP is a variational inference algorithm.)
- exponential families (EP requires that the approximating distribution be an exponential family.)

## Core resources (read/watch one of the following)

## -Paid-

→ Pattern Recognition and Machine Learning

A textbook for a graduate machine learning course, with a focus on Bayesian methods.

Location:
Section 10.7, pages 505-517

Additional dependencies:

- Lagrange multipliers
- Gaussian distribution

## Supplemental resources (the following are optional, but you may find them useful)

## -Free-

→ Gaussian Processes for Machine Learning

A graduate-level machine learning textbook focusing on Gaussian processes.

Location:
Section 3.6, pages 52-60

Additional dependencies:

- Gaussian process classification

Other notes:

- This gives the special case of EP for Gaussian process classification.

## -Paid-

→ Machine Learning: a Probabilistic Perspective

A very comprehensive graduate-level machine learning textbook.

Location:
Sections 22.5-22.5.3, pages 787-793

Additional dependencies:

- Lagrange multipliers
- Gaussian distribution

→ Probabilistic Graphical Models: Principles and Techniques

A very comprehensive textbook for a graduate-level course on probabilistic AI.

Location:
Sections 11.4-11.4.4, pages 430-445

Other notes:

- This explains EP in the context of a more general framework for approximate BP messages.

## See also

- EP can be used to estimate the partition function of a graphical model.
- Variational Bayes is an alternative variational inference method that is often simpler and faster, but less accurate.