Bayesian decision theory
(1 hour to learn)
When we use Bayesian parameter estimation techniques, it is often because we want to make a decision. In Bayesian decision theory, we choose the action that minimizes the expected loss under the posterior. Computing a statistic such as the mode or the mean of the predictive distribution can then be interpreted as the decision-theoretic solution under a particular loss function.
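The rule above can be written as a formula. This is a minimal sketch in standard notation (the symbols here are my own choice, not fixed by the text): $a$ ranges over actions, $\theta$ over parameters, $x$ is the observed data, and $L(a, \theta)$ is the loss incurred by taking action $a$ when the true parameter is $\theta$:

```latex
a^{*} \;=\; \operatorname*{argmin}_{a} \; \mathbb{E}_{p(\theta \mid x)}\!\bigl[ L(a, \theta) \bigr]
       \;=\; \operatorname*{argmin}_{a} \int L(a, \theta)\, p(\theta \mid x)\, d\theta .
```

That is, the optimal action minimizes the loss averaged over the posterior, rather than the loss at any single point estimate of $\theta$.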
This concept has the prerequisites:
- Know how the optimal decision is defined (in terms of minimizing expected loss with respect to the posterior)
- Derive the form of the estimator for some particular loss functions:
  - 0-1 loss
  - quadratic loss
  - absolute loss
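The correspondence between loss functions and estimators in the list above can be checked numerically: given samples from a posterior, a grid search over candidate actions should recover the posterior mean under quadratic loss and the posterior median under absolute loss. This is a sketch, not part of the original text; the skewed Gamma "posterior" is an arbitrary illustrative choice (0-1 loss, which yields the posterior mode, needs a density rather than samples and is omitted here):

```python
import numpy as np

# Hypothetical posterior: 10,000 samples from a skewed Gamma distribution.
rng = np.random.default_rng(0)
samples = rng.gamma(shape=2.0, scale=1.5, size=10_000)

def expected_loss(action, samples, loss):
    """Monte Carlo estimate of the posterior expected loss of one action."""
    return loss(action, samples).mean()

quadratic = lambda a, th: (a - th) ** 2
absolute = lambda a, th: np.abs(a - th)

# Grid search over candidate actions for each loss function.
grid = np.linspace(samples.min(), samples.max(), 2001)
best_quad = grid[np.argmin([expected_loss(a, samples, quadratic) for a in grid])]
best_abs = grid[np.argmin([expected_loss(a, samples, absolute) for a in grid])]

# Quadratic loss picks out the posterior mean; absolute loss the median.
print(best_quad, samples.mean())
print(best_abs, np.median(samples))
```

Up to the grid spacing, `best_quad` matches `samples.mean()` and `best_abs` matches `np.median(samples)`, mirroring the derivations the prerequisites ask for.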
Core resources (read/watch one of the following)
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location: Sections 5.7-5.7.1, pages 176-180
Supplemental resources (the following are optional, but you may find them useful)
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 1.5, pages 38-48
See also:
- linear regression
- Influence diagrams are a graphical model formalism for decision-theoretic problems.