# Gaussian BP on trees

## Summary

Marginalization in a Gaussian MRF can be performed exactly by inverting the precision matrix, but this costs cubic time in the number of variables, which is too slow for some applications. If the model is tree-structured, belief propagation computes all the means and single-node variances in linear time. Unlike in general MRFs, the loopy version also has a useful guarantee: if it converges, the means it computes are exact (though the variances generally are not).
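As a rough sketch of the idea, the following compares Gaussian BP on a chain (the simplest tree) against the cubic-time answer from matrix inversion. The model is written in information form, p(x) ∝ exp(−½ xᵀJx + hᵀx) with a tridiagonal precision matrix J; the specific numbers are synthetic and purely illustrative.

```python
import numpy as np

# Illustrative chain model: p(x) ∝ exp(-1/2 x^T J x + h^T x),
# with J tridiagonal (a chain is a tree, so BP is exact here).
n = 5
rng = np.random.default_rng(0)
d = rng.uniform(2.0, 3.0, n)        # diagonal precisions J[i, i]
w = rng.uniform(-0.5, 0.5, n - 1)   # off-diagonals J[i, i+1] (diagonally dominant)
h = rng.normal(size=n)              # linear potential

J = np.diag(d) + np.diag(w, 1) + np.diag(w, -1)

# Messages are themselves Gaussians in information form (J_msg, h_msg).
# Jf[i], hf[i]: message from node i-1 into node i (zero at the left end).
# Jb[i], hb[i]: message from node i+1 into node i (zero at the right end).
Jf = np.zeros(n); hf = np.zeros(n)
Jb = np.zeros(n); hb = np.zeros(n)

# Forward sweep: integrate out node i-1 and pass the result to node i.
for i in range(1, n):
    Jhat = d[i - 1] + Jf[i - 1]
    hhat = h[i - 1] + hf[i - 1]
    Jf[i] = -w[i - 1] ** 2 / Jhat
    hf[i] = -w[i - 1] * hhat / Jhat

# Backward sweep: integrate out node i+1 and pass the result to node i.
for i in range(n - 2, -1, -1):
    Jhat = d[i + 1] + Jb[i + 1]
    hhat = h[i + 1] + hb[i + 1]
    Jb[i] = -w[i] ** 2 / Jhat
    hb[i] = -w[i] * hhat / Jhat

# Each node combines its local potential with all incoming messages.
Jmarg = d + Jf + Jb
hmarg = h + hf + hb
means_bp = hmarg / Jmarg
vars_bp = 1.0 / Jmarg

# Cubic-time reference answer: mean = J^{-1} h, variances = diag(J^{-1}).
Sigma = np.linalg.inv(J)
print(np.allclose(means_bp, Sigma @ h))       # means agree
print(np.allclose(vars_bp, np.diag(Sigma)))   # variances agree (exact on a tree)
```

The two sweeps touch each edge once in each direction, which is where the linear-time claim comes from; on a general tree the same messages are passed from the leaves toward a root and back.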

## Context

This concept has the prerequisites:

- Gaussian MRFs (Gaussian BP is an inference algorithm for Gaussian graphical models)
- computations on multivariate Gaussians (Gaussian BP is defined in terms of common operations on multivariate Gaussians)
- sum-product on trees
- max-product on trees

## Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)

## Supplemental resources (the following are optional, but you may find them useful)

## -Free-

→ Walk-sums and belief propagation in Gaussian graphical models

→ Gaussian Belief Propagation: Theory and Application

## -Paid-

→ Machine Learning: a Probabilistic Perspective

A very comprehensive graduate-level machine learning textbook.

Location:
Section 23.2.3, pages 710-712

→ Probabilistic Graphical Models: Principles and Techniques

A very comprehensive textbook for a graduate-level course on probabilistic AI.

Location:
Section 14.2-14.2.2, pages 608-615

## See also

- If we apply the same update rules to a graph with cycles, the algorithm often still works well in practice. This is known as loopy belief propagation.