asymptotics of maximum likelihood

(3.4 hours to learn)

Summary

Under certain regularity conditions, the maximum likelihood estimator is consistent, i.e. it converges to the true parameter value as the sample size grows. Its sampling distribution (when rescaled appropriately) approaches a normal distribution whose variance is the inverse of the Fisher information. By the Cramér-Rao bound, this is the smallest asymptotic variance an unbiased estimator can achieve. The asymptotic analysis is useful for constructing approximate confidence intervals for parameter estimates.
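
As a compact reference (not part of the original summary; stated for a one-dimensional parameter under the usual i.i.d. regularity conditions, with MLE \hat{\theta}_n from n observations with density f(x;\theta), true value \theta_0, and standard normal quantile z_{\alpha/2}), the asymptotic normality result and the Wald-type interval it yields can be written as:

\[
\sqrt{n}\,\bigl(\hat{\theta}_n - \theta_0\bigr) \;\xrightarrow{d}\; \mathcal{N}\!\bigl(0,\; I(\theta_0)^{-1}\bigr),
\qquad
I(\theta) = \mathbb{E}_{\theta}\!\left[\left(\frac{\partial}{\partial \theta}\log f(X;\theta)\right)^{2}\right],
\]
\[
\hat{\theta}_n \;\pm\; z_{\alpha/2}\,\bigl(n\,I(\hat{\theta}_n)\bigr)^{-1/2}
\quad\text{(approximate $(1-\alpha)$ confidence interval).}
\]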

Context

This concept has the prerequisites:

Goals

  • Understand basic properties of maximum likelihood estimators:
    • they are consistent (they approach the correct value in the limit)
    • asymptotically, their sampling distribution (rescaled appropriately) approaches a normal distribution whose variance is the inverse Fisher information (see the simulation sketch after this list)
    • they are efficient (no unbiased estimator has smaller variance asymptotically)
  • Note: for the multivariate version of the asymptotic normality result, you'll want to know about the Fisher information matrix.
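
To make the asymptotic normality statement concrete, here is a minimal simulation sketch (Python/NumPy, an illustration added here rather than anything from this page). It uses an Exponential model with rate lam, where the MLE is 1/(sample mean) and the Fisher information is I(lam) = 1/lam**2:

# Illustration (assumed example, not from the original page): simulate the
# sampling distribution of the MLE of an exponential rate and compare its
# rescaled spread with the inverse Fisher information, I(lam)^{-1} = lam**2.
import numpy as np

rng = np.random.default_rng(0)
lam_true = 2.0        # true rate parameter
n = 500               # sample size per replication
reps = 20_000         # number of simulated data sets

# Exponential(rate=lam) has mean 1/lam; the MLE of lam is 1 / sample mean.
samples = rng.exponential(scale=1.0 / lam_true, size=(reps, n))
mle = 1.0 / samples.mean(axis=1)

# Rescaled estimation error: sqrt(n) * (lam_hat - lam_true)
z = np.sqrt(n) * (mle - lam_true)

# Asymptotic theory predicts variance I(lam_true)^{-1} = lam_true**2 = 4.0 here.
print("empirical variance of sqrt(n)*(mle - lam):", z.var())
print("inverse Fisher information (lam^2):       ", lam_true**2)

# A 95% Wald interval for a single data set, using plug-in Fisher information:
# standard error = sqrt(I(lam_hat)^{-1} / n) = lam_hat / sqrt(n).
x = rng.exponential(scale=1.0 / lam_true, size=n)
lam_hat = 1.0 / x.mean()
se = lam_hat / np.sqrt(n)
print("95% CI:", (lam_hat - 1.96 * se, lam_hat + 1.96 * se))

With n = 500 the empirical variance of the rescaled error should land close to lam_true**2 = 4, and the printed interval is the usual Wald interval described in the summary above.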

Core resources (read/watch one of the following)


Supplemental resources (the following are optional, but you may find them useful)


See also