Bayes' rule

(2 hours to learn)

Summary

Bayes' rule is a formula for combining prior beliefs with observed evidence to obtain a "posterior" distribution. It is central to Bayesian statistics, where one infers a posterior over the parameters of a statistical model given the observed data.
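
For reference, the statement of Bayes' rule and its one-line derivation from the definition of conditional probability are given below. The notation, with H for a hypothesis and E for observed evidence, is illustrative rather than taken from any of the resources listed:

\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)},
\qquad \text{since} \qquad
P(H \mid E)\,P(E) \;=\; P(H, E) \;=\; P(E \mid H)\,P(H).
\]

Here P(H) is the prior, P(H | E) is the posterior, and P(E | H) is the likelihood of the evidence under the hypothesis; the rule requires P(E) > 0.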

Context

This concept has the following prerequisites:

  • Conditional probability

Goals

  • Know the statement of Bayes' rule
  • Be able to use it to combine prior information with evidence
  • Derive Bayes' rule from the definition of conditional probability
  • Know the terminology: prior, posterior
  • Be able to reason intuitively about Bayes' rule in terms of odds ratios (see the sketch after this list)
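
As a worked illustration of using Bayes' rule and of the odds-ratio intuition, here is a minimal Python sketch with made-up numbers (a hypothetical diagnostic test with 1% prevalence, a 95% true-positive rate, and a 5% false-positive rate); it computes the posterior both directly from Bayes' rule and via the odds-ratio form:

# Bayes' rule for a binary hypothesis, with made-up example numbers.
prior = 0.01               # P(disease): prior probability of the hypothesis
p_pos_given_d = 0.95       # P(positive | disease): likelihood of the evidence
p_pos_given_not_d = 0.05   # P(positive | no disease): false-positive rate

# Direct form: posterior = likelihood * prior / evidence
evidence = p_pos_given_d * prior + p_pos_given_not_d * (1 - prior)
posterior = p_pos_given_d * prior / evidence

# Odds form: posterior odds = likelihood ratio * prior odds
prior_odds = prior / (1 - prior)
likelihood_ratio = p_pos_given_d / p_pos_given_not_d
posterior_odds = likelihood_ratio * prior_odds
posterior_from_odds = posterior_odds / (1 + posterior_odds)

print(posterior)             # ~0.161
print(posterior_from_odds)   # same value, ~0.161

The odds form makes the intuition explicit: the positive test multiplies the prior odds by a likelihood ratio of 19, but because the prior odds are so small, the posterior probability only rises from 1% to about 16%.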

Core resources (read/watch one of the following)

-Free-

Mathematical Monk: Probability Primer (2011)
Online videos on probability theory.
Other notes:
  • This uses the measure-theoretic notion of probability, but it should still be accessible without that background. Refer to Lecture 1.S for unfamiliar terms.

Supplemental resources (the following are optional, but you may find them useful)

-Free-

BerkeleyX: Introduction to Statistics: Probability
An online course on basic probability.
Location: Lecture 1.6, "Bayes' Rule"

See also

  • Bayes nets are a framework for probabilistic reasoning about many variables of interest; inference in them builds on tools such as Bayes' rule.
  • Bayesian statistics is a branch of statistics built around using Bayes' rule to infer a posterior distribution over model parameters from observed data.