(2 hours to learn)
Bayes' rule is a formula for combining prior beliefs with observed evidence to obtain a "posterior" distribution. It is central to Bayesian statistics, where one infers a posterior over the parameters of a statistical model given the observed data.
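The rule described above can be sketched in a few lines of Python. The disease-screening numbers below are hypothetical, chosen only for illustration; the denominator expands P(evidence) by the law of total probability.

```python
def posterior(prior, likelihood, likelihood_given_not):
    """Bayes' rule for a binary hypothesis H and evidence E:
    P(H | E) = P(E | H) P(H) / P(E),
    where P(E) = P(E | H) P(H) + P(E | not H) P(not H)."""
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical screening test: 1% prevalence (prior), 95% sensitivity,
# 5% false-positive rate.
p = posterior(prior=0.01, likelihood=0.95, likelihood_given_not=0.05)
print(round(p, 3))  # → 0.161
```

Even with a fairly accurate test, the posterior stays low because the prior is small; this is the standard base-rate illustration of combining prior information with evidence.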
This concept has the learning goals:
- Know the statement of Bayes' Rule
- Be able to use it to combine prior information with evidence
- Derive Bayes' Rule from the definition of conditional probability
- Know terminology: prior, posterior
- Be able to reason intuitively about Bayes' Rule in terms of odds ratios
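The odds-ratio reasoning in the last goal can be sketched as follows. In odds form, Bayes' rule says posterior odds = likelihood ratio × prior odds; the numbers below are the same hypothetical screening figures used for illustration (1% prevalence, 95% sensitivity, 5% false-positive rate).

```python
def posterior_odds(prior_odds, likelihood_ratio):
    # Odds form of Bayes' rule: multiply the prior odds by the
    # likelihood ratio P(E | H) / P(E | not H).
    return likelihood_ratio * prior_odds

# Prior odds = 0.01 / 0.99; likelihood ratio = 0.95 / 0.05 = 19.
odds = posterior_odds(0.01 / 0.99, 0.95 / 0.05)
prob = odds / (1 + odds)  # convert odds back to a probability
print(round(prob, 3))  # → 0.161
```

This matches the probability-form calculation, and it makes the update rule easy to reason about: each piece of evidence simply multiplies the current odds by its likelihood ratio.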
Core resources (read/watch one of the following)
→ Mathematical Monk: Probability Primer (2011)
Online videos on probability theory.
- This uses the measure-theoretic notion of probability, but it should still be accessible without that background. Refer to Lecture 1.S for unfamiliar terms.
→ A First Course in Probability
An introductory probability textbook.
Location: Section 3.3, "Bayes' Formula," pages 72-87
→ Probability and Statistics
An introductory textbook on probability theory and statistics.
Location: Section 2.3, "Bayes' Theorem," pages 66-77
Supplemental resources (the following are optional, but you may find them useful)
→ BerkeleyX: Introduction to Statistics: Probability
An online course on basic probability.
Location: Lecture 1.6, "Bayes' Rule"