(1.1 hours to learn)
The Cramer-Rao bound gives the minimum possible variance of an unbiased estimator of the parameters of a probability distribution. It is used to prove the asymptotic efficiency of the maximum likelihood estimator.
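For reference, the scalar form of the bound (under standard regularity conditions) can be written as:

```latex
% Fisher information of a single observation X ~ f(x; \theta)
I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta} \log f(X;\theta)\right)^{2}\right]

% Cramer-Rao bound: for any unbiased estimator \hat\theta of \theta
% based on n i.i.d. observations,
\operatorname{Var}(\hat\theta) \;\ge\; \frac{1}{n\, I(\theta)}
```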
The goals of learning this concept are:
- Prove the Cramer-Rao theorem, which bounds the variance of any unbiased estimator of model parameters.
- Use the result to compute the asymptotic relative efficiency of an estimator.
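As a concrete illustration of the second goal (the Bernoulli model and the function names below are my own choices, not taken from the listed resources), the bound can be computed in closed form and checked by simulation. For n i.i.d. Bernoulli(p) observations, the Fisher information per observation is 1/(p(1-p)), so the bound is p(1-p)/n, and the sample mean attains it exactly:

```python
import random

# Illustrative sketch: Cramer-Rao bound for n i.i.d. Bernoulli(p) samples.
# Fisher information of one observation is I(p) = 1 / (p * (1 - p)), so any
# unbiased estimator of p satisfies Var >= p * (1 - p) / n.
def cramer_rao_bound(p, n):
    return p * (1 - p) / n

def sample_mean_variance(p, n, trials=20000, seed=0):
    """Monte Carlo estimate of Var(sample mean) for Bernoulli(p) data."""
    rng = random.Random(seed)
    means = [sum(rng.random() < p for _ in range(n)) / n for _ in range(trials)]
    mu = sum(means) / trials
    return sum((m - mu) ** 2 for m in means) / (trials - 1)

if __name__ == "__main__":
    p, n = 0.3, 50
    print("Cramer-Rao bound:", cramer_rao_bound(p, n))
    print("Var(sample mean):", sample_mean_variance(p, n))
```

Because the sample mean is unbiased and achieves the bound, it is an efficient estimator of p; the simulated variance should match the bound up to Monte Carlo noise.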
Core resources (read/watch one of the following)
→ Probability and Statistics
An introductory textbook on probability theory and statistics.
Location: Section 8.8, "Fisher information," subsections "The information inequality" and "Efficient estimators," pages 518-522
→ Mathematical Statistics and Data Analysis
An undergraduate statistics textbook.
Location: Section 8.7, "Efficiency and the Cramer-Rao lower bound," pages 298-305
- The bound implies that maximum likelihood estimation is asymptotically efficient.
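A minimal sketch of what asymptotic efficiency means in practice, using the exponential rate parameter as an assumed example (the model and function names are illustrative, not from the resources above). The MLE for the rate is 1/(sample mean); the Fisher information per observation is 1/lam^2, so the Cramer-Rao benchmark is lam^2/n, and asymptotic efficiency says n * Var(MLE) approaches lam^2 as n grows:

```python
import random

# Sketch of asymptotic efficiency: MLE of the rate lam of an Exponential(lam)
# distribution. The MLE is lam_hat = 1 / sample_mean. Fisher information per
# observation is I(lam) = 1 / lam**2, so the Cramer-Rao benchmark for n
# samples is lam**2 / n. Asymptotically, n * Var(lam_hat) -> lam**2.
def mle_scaled_variance(lam, n, trials=5000, seed=1):
    """Monte Carlo estimate of n * Var(lam_hat) for sample size n."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(trials):
        xbar = sum(rng.expovariate(lam) for _ in range(n)) / n
        estimates.append(1.0 / xbar)
    mu = sum(estimates) / trials
    var = sum((e - mu) ** 2 for e in estimates) / (trials - 1)
    return n * var  # should approach lam**2 as n grows

if __name__ == "__main__":
    lam = 2.0
    for n in (10, 100, 1000):
        print(n, mle_scaled_variance(lam, n))
```

Note that this MLE is biased for finite n, so the bound applies only asymptotically here: the scaled variance exceeds lam^2 at small n and converges toward it as n increases.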