# computations on multivariate Gaussians

(1.1 hours to learn)

## Summary

Multivariate Gaussians are widely used in the computational sciences because many useful operations can be performed efficiently. Marginalization is easy: we simply pull out the relevant entries of the mean vector and the corresponding rows and columns of the covariance matrix. Conditioning requires only a matrix inversion.
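As a concrete sketch of both operations, consider a joint Gaussian over $(x_1, x_2)$ with made-up example parameters. Marginalizing out $x_2$ amounts to slicing the mean and covariance; conditioning on $x_2 = a$ uses the standard formulas $\mu_{1|2} = \mu_1 + \Sigma_{12}\Sigma_{22}^{-1}(a - \mu_2)$ and $\Sigma_{1|2} = \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}$ (the partition indices and numbers below are illustrative, not from any particular source):

```python
import numpy as np

# Joint Gaussian over a 3-dimensional vector; x1 = dimension 0, x2 = dimensions 1 and 2.
# These parameter values are arbitrary, chosen only for illustration.
mu = np.array([1.0, 2.0, 3.0])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])
i1 = [0]
i2 = [1, 2]

# Marginalization: p(x1) is Gaussian with the corresponding
# sub-vector of the mean and sub-block of the covariance.
mu_1 = mu[i1]
Sigma_11 = Sigma[np.ix_(i1, i1)]

# Conditioning on an observed value x2 = a:
#   mean = mu_1 + Sigma_12 Sigma_22^{-1} (a - mu_2)
#   cov  = Sigma_11 - Sigma_12 Sigma_22^{-1} Sigma_21
a = np.array([1.5, 2.5])
mu_2 = mu[i2]
Sigma_12 = Sigma[np.ix_(i1, i2)]
Sigma_22 = Sigma[np.ix_(i2, i2)]

# Use solve() rather than forming the inverse explicitly; this is the
# "matrix inversion" step mentioned above, done in a numerically stabler way.
cond_mean = mu_1 + Sigma_12 @ np.linalg.solve(Sigma_22, a - mu_2)
cond_cov = Sigma_11 - Sigma_12 @ np.linalg.solve(Sigma_22, Sigma_12.T)
```

Note that the conditional variance `cond_cov` is never larger than the marginal variance `Sigma_11`: observing $x_2$ can only reduce our uncertainty about $x_1$.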

## Context

This concept has the prerequisites:

- multivariate Gaussian distribution
- conditional distributions (Conditioning is one operation we commonly need to perform.)
- multivariate distributions (Marginalization is one operation we commonly need to perform.)
- matrix inverse (Many operations require computing matrix inverses.)
- covariance matrices (Many operations involve manipulating covariance matrices.)

## Core resources (read/watch one of the following)

## -Paid-

→ Machine Learning: a Probabilistic Perspective

A very comprehensive graduate-level machine learning textbook.

Location:
Sections 4.3.1-4.3.2, pages 111-115

→ Pattern Recognition and Machine Learning

A textbook for a graduate machine learning course, with a focus on Bayesian methods.

Location:
Sections 2.3.1-2.3.2, pages 85-90

## See also

- Some probabilistic models allow for more efficient conditioning by exploiting problem structure:
  - Gaussian Markov random fields, which exploit sparsity in the precision matrix
  - Linear-Gaussian models, which exploit sparsity in the Cholesky decomposition