Bayesian parameter estimation: Gaussian distribution
(1.6 hours to learn)
Using the Bayesian framework, we can infer the mean parameter of a Gaussian distribution, the scale parameter, or both. Since Gaussians are widely used in probabilistic modeling, the computations that go into this are common motifs in Bayesian machine learning more generally.
Goals:
- Derive the conjugate priors for three cases:
  - unknown mean, but known variance
  - known mean, but unknown variance
  - unknown mean and unknown variance
- Derive the posterior distribution and the posterior predictive distribution for each of these cases.
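The first two cases above have simple closed-form updates, which can be sketched in code. The snippet below is an illustrative sketch, not taken from the resources listed here: for an unknown mean with known variance, a Gaussian prior yields a Gaussian posterior (precisions add); for a known mean with unknown variance, an inverse-gamma prior yields an inverse-gamma posterior. The function names and parameterizations are my own choices.

```python
import numpy as np

def posterior_mean_known_variance(x, sigma2, mu0, tau0_sq):
    """Posterior over the Gaussian mean mu when the variance sigma2 is known.

    Prior: mu ~ N(mu0, tau0_sq). Returns (posterior mean, posterior variance).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    # Precisions (inverse variances) of prior and likelihood terms add.
    post_precision = 1.0 / tau0_sq + n / sigma2
    post_var = 1.0 / post_precision
    # Posterior mean is a precision-weighted average of prior mean and data.
    post_mean = post_var * (mu0 / tau0_sq + x.sum() / sigma2)
    return post_mean, post_var

def posterior_variance_known_mean(x, mu, a0, b0):
    """Posterior over the Gaussian variance sigma2 when the mean mu is known.

    Prior: sigma2 ~ Inverse-Gamma(a0, b0). Returns the updated (a, b).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    # Shape grows by n/2; rate absorbs half the sum of squared deviations.
    a_n = a0 + n / 2.0
    b_n = b0 + 0.5 * np.sum((x - mu) ** 2)
    return a_n, b_n
```

For example, with prior N(0, 1), known variance 1, and four observations all equal to 1, the posterior over the mean is N(0.8, 0.2): the posterior precision is 1 + 4 = 5, and the posterior mean shrinks the sample mean toward the prior.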
Core resources (read/watch one of the following)
→ Information Theory, Inference, and Learning Algorithms
A graduate-level textbook on machine learning and information theory.
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Sections 2.3.6-2.3.7 (except for the part about multivariate Gaussians), pgs. 97-105