mean field approximation
(1 hour to learn)
Summary
In variational inference, we approximate an intractable distribution with a tractable one; mean field is probably the most common example. The approximating distribution factorizes into independent terms corresponding to different variables or groups of variables. Variational Bayes and variational Bayes EM are important applications of mean field to Bayesian parameter estimation.
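A classic illustration of this factorization (worked through in, e.g., Bishop's PRML, Section 10.1.2) is approximating a correlated bivariate Gaussian by a product of two independent Gaussians. The sketch below uses NumPy; the particular mean and covariance are illustrative choices, not from any specific source:

```python
import numpy as np

# Target: a correlated bivariate Gaussian p(x1, x2) with mean mu, covariance Sigma.
mu = np.array([1.0, -1.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])
Lam = np.linalg.inv(Sigma)  # precision matrix

# Mean field: q(x1, x2) = q1(x1) q2(x2), each factor Gaussian.
# Standard result: the optimal factor q_i has variance 1/Lam[i, i], and the
# factor means satisfy coupled equations solved by coordinate ascent:
#   m1 <- mu1 - (Lam12 / Lam11) * (m2 - mu2), and symmetrically for m2.
m = np.zeros(2)  # initial guesses for the factor means
for _ in range(50):
    m[0] = mu[0] - (Lam[0, 1] / Lam[0, 0]) * (m[1] - mu[1])
    m[1] = mu[1] - (Lam[1, 0] / Lam[1, 1]) * (m[0] - mu[0])

print(m)                 # the factor means converge to the true means
print(1 / np.diag(Lam))  # factor variances (note: smaller than the marginals)
```

The factor means recover the true means exactly, but the factor variances 1/Lam[i, i] underestimate the true marginal variances, a well-known property of the mean field approximation.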
Context
This concept has the prerequisites:
- variational inference (Mean field is an instance of variational inference.)
- independent random variables (Mean field makes the approximation that certain random variables, or sets of random variables, are independent.)
- Lagrange multipliers (Lagrange multipliers are used to derive the mean field updates.)
Core resources (read/watch one of the following)
-Paid-
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location:
Section 21.3, pages 735-739
Additional dependencies:
- Markov random fields
Supplemental resources (the following are optional, but you may find them useful)
-Free-
→ Graphical models, exponential families, and variational inference (2008)
An in-depth review of exact and approximate inference methods for graphical models.
Location:
Sections 5.1-5.4, pages 127-142
Additional dependencies:
- Gaussian MRFs
- convex optimization
-Paid-
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location:
Sections 10.1-10.1.3
Additional dependencies:
- Bayesian parameter estimation: Gaussian distribution
→ Probabilistic Graphical Models: Principles and Techniques
A very comprehensive textbook for a graduate-level course on probabilistic AI.
Location:
Sections 11.5-11.5.1, pages 448-456
Additional dependencies:
- Markov random fields
See also
- Mean field updates in an Ising model have a similar form to updates in a Hopfield network
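The parallel in the bullet above can be made concrete: the mean field update for an Ising model has the same "weighted sum of neighbors, then squash" form as a Hopfield update, with tanh in place of a hard threshold. A minimal sketch; the couplings J, biases h, and inverse temperature beta are illustrative random values:

```python
import numpy as np

# Ising model with random symmetric couplings J and biases h (illustrative).
rng = np.random.default_rng(0)
n = 8
J = rng.normal(size=(n, n))
J = (J + J.T) / 2          # symmetric couplings
np.fill_diagonal(J, 0.0)   # no self-coupling
h = rng.normal(size=n)
beta = 0.1  # small inverse temperature keeps the fixed-point iteration a contraction

# Mean field: approximate each spin's magnetization m_i = E[s_i] by iterating
#   m_i <- tanh(beta * (sum_j J_ij m_j + h_i))
m = np.zeros(n)
for _ in range(500):
    m = np.tanh(beta * (J @ m + h))

# A Hopfield network update has the same form with a hard threshold:
#   s_i <- sign(sum_j J_ij s_j + h_i)
s = np.sign(J @ np.sign(h) + h)  # one synchronous update from an initial state
```

The mean field iteration converges to soft magnetizations in (-1, 1), whereas the Hopfield dynamics move between binary states; structurally the two updates differ only in the nonlinearity.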