computations on multivariate Gaussians
(1.1 hours to learn)
Multivariate Gaussians are widely used in the computational sciences because many useful operations can be performed efficiently. Marginalization is easy: we simply pull out the relevant entries of the mean and the relevant rows and columns of the covariance. Conditioning on a subset of the variables requires only a matrix inversion (or, in practice, a linear solve) against the covariance block of the observed variables.
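As a concrete sketch of both operations, the snippet below partitions a joint Gaussian over (x1, x2) and computes the marginal of x1 and the conditional of x1 given x2. The particular mean, covariance, and observed value are made-up illustrative numbers; the standard formulas are cond_mean = mu1 + Sigma12 Sigma22^-1 (a - mu2) and cond_cov = Sigma11 - Sigma12 Sigma22^-1 Sigma21, with the inverse replaced by a linear solve for numerical stability.

```python
import numpy as np

# Joint Gaussian over (x1, x2), where x1 is the first k coordinates.
mu = np.array([1.0, 2.0, 3.0])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.5, 0.2],
                  [0.3, 0.2, 1.0]])
k = 2  # dimension of x1

# Marginalization: keep the relevant entries / rows and columns.
mu1, Sigma11 = mu[:k], Sigma[:k, :k]

# Conditioning on x2 = a: a linear solve against Sigma22
# stands in for the matrix inversion.
mu2, Sigma22 = mu[k:], Sigma[k:, k:]
Sigma12 = Sigma[:k, k:]
a = np.array([2.5])  # hypothetical observed value of x2

cond_mean = mu1 + Sigma12 @ np.linalg.solve(Sigma22, a - mu2)
cond_cov = Sigma11 - Sigma12 @ np.linalg.solve(Sigma22, Sigma12.T)
```

Note that the marginal requires no computation at all beyond slicing, while the cost of conditioning is dominated by the solve against Sigma22, whose size is the number of observed variables.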
This concept has the prerequisites:
Core resources (read/watch one of the following)
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location: Sections 4.3.1-4.3.2, pages 111-115
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Sections 2.3.1-2.3.2, pages 85-90