# forward-backward algorithm

(1.8 hours to learn)

## Summary

The forward-backward algorithm computes posterior marginals in a hidden Markov model (HMM). It is based on dynamic programming and has complexity linear in the length of the sequence. It is used as a component of several other algorithms, such as the Baum-Welch algorithm and block Gibbs sampling in factorial HMMs.
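To make the summary concrete, here is a minimal NumPy sketch of the scaled forward-backward recursions for a discrete HMM. The parameter names (`pi`, `A`, `B`) and the toy numbers in the example are illustrative assumptions, not from any of the resources below.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Posterior marginals p(z_t | x_{1:T}) for a discrete HMM.

    pi:  (K,)   initial state distribution
    A:   (K, K) transition matrix, A[i, j] = p(z_{t+1}=j | z_t=i)
    B:   (K, M) emission matrix,   B[k, m] = p(x_t=m | z_t=k)
    obs: length-T sequence of observation indices
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))   # scaled forward messages
    beta = np.zeros((T, K))    # scaled backward messages
    c = np.zeros(T)            # per-step normalizers (avoid underflow)

    # Forward pass: alpha[t] is proportional to p(z_t, x_{1:t}).
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]

    # Backward pass: beta[t] is proportional to p(x_{t+1:T} | z_t),
    # reusing the forward normalizers to keep the values bounded.
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        beta[t] /= c[t + 1]

    # Combine: gamma[t, k] = p(z_t = k | x_{1:T}).
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Example: a made-up 2-state HMM with binary observations.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])
gamma = forward_backward(pi, A, B, [0, 0, 1])
print(gamma)  # each row sums to 1: a posterior over states at that step
```

Both passes touch each time step once, which is where the linear-in-T complexity comes from; each step costs O(K^2) for the matrix-vector products.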

## Context

This concept has the prerequisites:

- hidden Markov models (Forward-backward is an algorithm for inference in HMMs.)
- multivariate distributions (The forward-backward algorithm is an algorithm for marginalization.)
- conditional independence (The justification of the algorithm relies on the HMM's conditional independence properties.)

## Core resources (read/watch one of the following)

## -Free-

→ Mathematical Monk: Machine Learning (2011)

## -Paid-

→ Artificial Intelligence: A Modern Approach

A textbook giving a broad overview of all of AI.

Location:
Sections 15.2, up to "Finding the most likely sequence" (pages 541-547) and 15.3 (pages 549-551)

→ Machine Learning: A Probabilistic Perspective

A very comprehensive graduate-level machine learning textbook.

Location:
Sections 17.4-17.4.3, pages 606-612

## Supplemental resources (the following are optional, but you may find them useful)

## -Free-

→ Bayesian Reasoning and Machine Learning

## -Paid-

→ Pattern Recognition and Machine Learning

A textbook for a graduate machine learning course, with a focus on Bayesian methods.

Location:
Sections 13.2.2 (pages 618-625) and 13.2.4 (pages 627-629)

Additional dependencies:

- d-separation

## See also

- The forward-backward algorithm can be interpreted as a special case of belief propagation.
- Kalman smoothing can be viewed as a special case of forward-backward, where the variables are all jointly Gaussian.
- This algorithm is part of the Baum-Welch algorithm for learning HMM parameters.
- The forward-backward algorithm is an example of dynamic programming.