Expectation and variance

(3.7 hours to learn)


The expectation of a random variable is the value that it takes "on average," and the variance is a measure of how much the random variable deviates from that value "on average." Expectation and variance have several convenient properties that often allow one to abstract away the underlying PDFs or PMFs.
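As a concrete sketch of these definitions (not taken from the resources below), the snippet computes the expectation and variance of a fair six-sided die directly from its PMF, then checks two of the convenient properties mentioned above, linearity of expectation and the scaling rule for variance, for a hypothetical transformed variable Y = aX + b:

```python
# Illustrative sketch: expectation and variance of a fair six-sided die,
# computed directly from its PMF.
pmf = {x: 1 / 6 for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())               # E[X] = 3.5
var = sum((x - mean) ** 2 * p for x, p in pmf.items())  # Var(X) = 35/12

# Properties that let one abstract away the PMF, checked for Y = a*X + b:
a, b = 2, 1
mean_y = sum((a * x + b) * p for x, p in pmf.items())
var_y = sum((a * x + b - mean_y) ** 2 * p for x, p in pmf.items())
assert abs(mean_y - (a * mean + b)) < 1e-12  # E[aX + b] = a E[X] + b
assert abs(var_y - a ** 2 * var) < 1e-12     # Var(aX + b) = a^2 Var(X)
```

Note that the transformed mean and variance follow from the properties alone, without re-deriving the PMF of Y.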


This concept has the prerequisites:

Core resources (read/watch one of the following)


Sets, Counting, and Probability
Online lectures on basic probability theory.
Location: Lecture "Expectation I"
Mathematical Monk: Probability Primer (2011)
Online videos on probability theory.
Other notes:
  • This uses the measure theoretic notion of probability, but should still be accessible without that background. Refer to Lecture 1.S for unfamiliar terms.


See also

  • Some other statistics which are based on expected value:
    • variance, which reflects how much a random variable typically deviates from its expected value
    • moment generating functions, a mathematical representation convenient for analyzing sums of random variables
    • Markov's inequality, which bounds the probability that a random variable takes on extreme values, based on its expected value
  • Martingales are a kind of sequence of random variables whose expected value at the next time step is the current value.
  • Monte Carlo techniques are a way of estimating expectations by sampling from a distribution.
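A minimal sketch of the Monte Carlo idea from the last bullet: to estimate an expectation, average the quantity of interest over random samples. Here the example (my choice, not from the source) estimates E[X²] for X ~ Uniform(0, 1), whose true value is 1/3:

```python
import random

# Monte Carlo sketch: estimate E[X^2] for X ~ Uniform(0, 1) by averaging
# x^2 over random samples; the true value is 1/3.
random.seed(0)  # fixed seed so the run is reproducible
n = 100_000
estimate = sum(random.random() ** 2 for _ in range(n)) / n
# By the law of large numbers, `estimate` approaches 1/3 as n grows.
```

The same averaging recipe works for any function of the samples, which is why Monte Carlo methods let one estimate expectations without ever writing down the underlying PDF in closed form.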