A covariance matrix generalizes the idea of variance to multiple dimensions: the (i, j) entry of the covariance matrix is the covariance between the i-th and j-th random variables. Covariance matrices appear throughout statistics and machine learning and often arise when dealing with multivariate distributions.
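As a concrete illustration of this definition, the sketch below (using NumPy, with made-up random data) computes a sample covariance matrix entry by entry from centered variables and checks it against NumPy's built-in estimator:

```python
import numpy as np

# The (i, j) entry of a covariance matrix is
#   Cov(X_i, X_j) = E[(X_i - E[X_i]) (X_j - E[X_j])].
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))  # 500 samples of a 3-dimensional variable

# Sample covariance from the definition: center each variable, then
# average the products of deviations (unbiased, dividing by n - 1).
Xc = X - X.mean(axis=0)
C = (Xc.T @ Xc) / (X.shape[0] - 1)

# Matches NumPy's built-in estimator (rowvar=False: columns are variables).
assert np.allclose(C, np.cov(X, rowvar=False))
```

Note that the result is always symmetric, since Cov(X_i, X_j) = Cov(X_j, X_i).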
This concept has the prerequisites:
- positive definite matrices (The covariance matrix is a PSD matrix.)
Goals:
- Understand how to calculate the entries of a covariance matrix
- Understand the difference between positive and negative covariances
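To make the second goal concrete, here is a small sketch (with synthetic data) showing that a variable moving with another yields a positive covariance entry, while one moving against it yields a negative entry:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
y_pos = x + 0.1 * rng.normal(size=1000)   # tends to move with x
y_neg = -x + 0.1 * rng.normal(size=1000)  # tends to move against x

# Off-diagonal entries of the 2x2 covariance matrices carry the sign
# of the relationship between the two variables.
assert np.cov(x, y_pos)[0, 1] > 0  # positive covariance
assert np.cov(x, y_neg)[0, 1] < 0  # negative covariance
```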
Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)
Supplemental resources (the following are optional, but you may find them useful)
- The Analysis Factor (article: Covariance Matrix)
- The multivariate Gaussian is a widely used distribution parameterized in terms of a mean vector and covariance matrix.
- The Cauchy–Schwarz inequality for covariance follows from the fact that covariance matrices are PSD.
- Principal component analysis is a data analysis method based on the eigenvectors of the covariance matrix.
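Tying the last two follow-ons together, this sketch (again with synthetic data) eigendecomposes a covariance matrix, checks the PSD property numerically, and recovers the principal components as eigenvectors sorted by decreasing eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(2)
# Correlated 2-D data: a linear mix of independent Gaussians.
X = rng.normal(size=(1000, 2)) @ np.array([[2.0, 1.5], [0.0, 0.5]])

C = np.cov(X, rowvar=False)            # 2x2 covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)   # eigh: for symmetric matrices

# PSD check: all eigenvalues are non-negative (up to rounding error).
assert (eigvals >= -1e-10).all()

# Principal components = eigenvectors sorted by decreasing eigenvalue;
# the eigenvalues are the variances along those directions.
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]
```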