# the Laplace approximation

(40 minutes to learn)

## Summary

The Laplace approximation is a way of approximating Bayesian parameter estimation and Bayesian model comparison. It is based on a second-order Taylor approximation of the log posterior around the MAP estimate, which results in a Gaussian approximation to the posterior.
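As a concrete illustration, here is a minimal sketch in Python for a hypothetical coin-flip model with a Beta(2, 2) prior (the data and prior are invented for illustration): find the MAP estimate, take the second derivative of the log posterior there, and read off the Gaussian approximation.

```python
import numpy as np
from scipy import optimize

# Hypothetical example: coin-flip data with a Beta(2, 2) prior.
# The exact posterior is Beta(2 + heads, 2 + tails), which lets us
# sanity-check the Gaussian (Laplace) approximation.
heads, tails = 14, 6
a, b = 2.0, 2.0

def neg_log_post(theta):
    # Unnormalized negative log posterior for the coin's bias theta.
    return -((a + heads - 1) * np.log(theta)
             + (b + tails - 1) * np.log(1 - theta))

# Step 1: find the MAP estimate by minimizing the negative log posterior.
res = optimize.minimize_scalar(neg_log_post, bounds=(1e-6, 1 - 1e-6),
                               method="bounded")
theta_map = res.x

# Step 2: second derivative of the log posterior at the MAP,
# estimated here by central finite differences.
h = 1e-5
d2 = -(neg_log_post(theta_map + h) - 2 * neg_log_post(theta_map)
       + neg_log_post(theta_map - h)) / h**2

# Step 3: the Laplace approximation is N(theta_map, -1 / d2).
laplace_var = -1.0 / d2
print(theta_map, laplace_var)
```

In a multivariate model, the scalar second derivative is replaced by the Hessian of the log posterior, and the approximate posterior covariance is the negative inverse Hessian at the MAP.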

## Context

This concept has the prerequisites:

- Bayesian parameter estimation (The Laplace approximation is an approximation to Bayesian parameter estimation.)
- Bayesian model comparison (The Laplace approximation is a method for Bayesian model comparison.)
- MAP parameter estimation (The Laplace approximation is a second-order Taylor expansion around the MAP estimate.)
- Taylor approximations (The Laplace approximation is a second-order Taylor expansion around the MAP estimate.)
- multivariate Gaussian distribution (The posterior distribution is approximated with a multivariate Gaussian.)

## Core resources (read/watch one of the following)

## -Free-

→ Information Theory, Inference, and Learning Algorithms

A graduate-level textbook on machine learning and information theory.

Location:
Chapter 27, pages 341-342

## -Paid-

→ Pattern Recognition and Machine Learning

A textbook for a graduate machine learning course, with a focus on Bayesian methods.

Location:
Section 4.4, pages 213-217

## Supplemental resources (the following are optional, but you may find them useful)

## -Paid-

→ Machine Learning: a Probabilistic Perspective

A very comprehensive graduate-level machine learning textbook.

Location:
Section 8.4.1, page 255

## See also

- Other approximations to Bayesian parameter estimation: the Bayesian information criterion (BIC) can be [justified](justifying_aic_and_bic) in terms of the Laplace approximation.
- Other ways to approximate the model evidence: the Laplace approximation represents the posterior with a Gaussian; other Gaussian approximations are more accurate, though often slower and more complicated to implement.
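For reference, the evidence approximation underlying these model comparisons can be written out. A standard form of the result (notation roughly following Bishop's Section 4.4, with $d$ the number of parameters) is:

$$
p(\mathcal{D}) \approx p(\mathcal{D} \mid \theta_{\text{MAP}})\, p(\theta_{\text{MAP}})\, (2\pi)^{d/2}\, |\mathbf{A}|^{-1/2},
\qquad
\mathbf{A} = -\nabla\nabla \ln p(\theta \mid \mathcal{D}) \big|_{\theta = \theta_{\text{MAP}}}
$$

That is, the evidence is the unnormalized posterior at the MAP estimate times the volume of the Gaussian that the Laplace approximation fits around it.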