expectation propagation

(2.7 hours to learn)


In some graphical models, it is intractable even to compute the messages in loopy belief propagation. Expectation propagation (EP) is a way of approximating these messages in terms of expectations of sufficient statistics. It can be viewed as a variational inference algorithm, and it often gives much more accurate results than mean-field approximations.
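To make the idea concrete, here is a minimal sketch of EP on the classic "clutter problem": inferring the mean theta of a unit-variance Gaussian when a fraction of the observations comes from a broad clutter component. The exact mixture likelihood factors are intractable to combine, so each is replaced by a Gaussian "site," and each site is refined by moment matching against the tilted distribution (cavity times exact factor). All names and parameter values below are illustrative choices, not part of any particular reference implementation.

```python
import numpy as np

def norm_pdf(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

def ep_clutter(y, prior_var=100.0, w=0.25, clutter_var=10.0, n_sweeps=20):
    """EP sketch for the clutter problem: y_i ~ (1-w) N(theta, 1) + w N(0, clutter_var),
    with prior theta ~ N(0, prior_var). Each mixture factor is approximated
    by a Gaussian site, stored in natural parameters (precision r, precision-mean b)."""
    n = len(y)
    r = np.zeros(n)   # site precisions
    b = np.zeros(n)   # site precision-means
    prior_r = 1.0 / prior_var
    for _ in range(n_sweeps):
        for i in range(n):
            # Cavity: global approximation with site i removed
            r_cav = prior_r + r.sum() - r[i]
            b_cav = b.sum() - b[i]
            v_cav = 1.0 / r_cav
            m_cav = b_cav * v_cav
            # Tilted distribution = cavity * exact mixture factor.
            # Signal component: product of two Gaussians in theta
            z1 = (1 - w) * norm_pdf(y[i], m_cav, 1.0 + v_cav)
            v1 = 1.0 / (1.0 / v_cav + 1.0)
            m1 = v1 * (m_cav / v_cav + y[i])
            # Clutter component: independent of theta, leaves the cavity unchanged
            z2 = w * norm_pdf(y[i], 0.0, clutter_var)
            Z = z1 + z2
            # Moment matching: mean and variance of the tilted distribution
            mean = (z1 * m1 + z2 * m_cav) / Z
            second = (z1 * (v1 + m1 ** 2) + z2 * (v_cav + m_cav ** 2)) / Z
            var = second - mean ** 2
            # New site = matched moments divided by the cavity (natural parameters)
            r_new = 1.0 / var - r_cav
            b_new = mean / var - b_cav
            if r_new > 0:  # skip updates that would yield a negative-variance site
                r[i], b[i] = r_new, b_new
    R = prior_r + r.sum()
    return b.sum() / R, 1.0 / R  # posterior mean and variance of theta
```

Each inner-loop iteration is one approximate message update: the expectations (first and second moments) of the tilted distribution are the sufficient statistics that the Gaussian site is fit to, which is exactly the sense in which EP approximates messages "in terms of expectations of sufficient statistics."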


This concept has the prerequisites:

Core resources (read/watch one of the following)


Supplemental resources (the following are optional, but you may find them useful)


Gaussian Processes for Machine Learning
A graduate-level machine learning textbook focusing on Gaussian processes.
Authors: Carl E. Rasmussen, Christopher K. I. Williams
Additional dependencies:
  • Gaussian process classification
Other notes:
  • This gives the special case of EP for Gaussian process classification.


See also