# covariance matrices

## Summary

A covariance matrix generalizes the idea of variance to multiple dimensions: the (i, j) element of the covariance matrix is the covariance between the i-th and j-th random variables. Covariance matrices are common throughout both statistics and machine learning and often arise when dealing with multivariate distributions.
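As a concrete illustration of the definition above, the following sketch builds the covariance matrix entry by entry from hypothetical sample data (the data and variable names are invented for this example) and checks it against NumPy's built-in `np.cov`:

```python
import numpy as np

# Hypothetical data: 200 samples of a 3-dimensional random vector,
# one sample per row.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))

# Entry (i, j) of the covariance matrix is Cov(X_i, X_j): the average
# product of the deviations of variables i and j from their means.
mu = X.mean(axis=0)
n = X.shape[0]
cov = (X - mu).T @ (X - mu) / (n - 1)   # unbiased (n - 1) normalization

# NumPy's np.cov treats rows as variables by default, so pass
# rowvar=False when samples are in rows.
assert np.allclose(cov, np.cov(X, rowvar=False))
```

Note that the diagonal entries are just the variances of the individual variables, and the matrix is symmetric because Cov(X_i, X_j) = Cov(X_j, X_i).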

## Context

This concept has the prerequisites:

- covariance
- positive definite matrices (The covariance matrix is a positive semidefinite (PSD) matrix.)

## Goals

- Understand how to calculate the entries of a covariance matrix

- Understand the difference between positive and negative covariances
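To make the second goal concrete, this sketch (with invented data) constructs one variable that moves with `x` and one that moves against it, and inspects the signs of the resulting covariances:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1000)

# y tends to increase with x -> positive covariance;
# z tends to decrease as x increases -> negative covariance.
y = 2.0 * x + rng.normal(scale=0.1, size=1000)
z = -2.0 * x + rng.normal(scale=0.1, size=1000)

data = np.stack([x, y, z])   # np.cov expects variables in rows by default
cov = np.cov(data)

assert cov[0, 1] > 0   # x and y covary positively
assert cov[0, 2] < 0   # x and z covary negatively
```

The sign of an off-diagonal entry thus records whether the two variables tend to deviate from their means in the same direction (positive) or in opposite directions (negative).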

## Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)

## Supplemental resources (the following are optional, but you may find them useful)

### Free

- The Analysis Factor
- Wikipedia

## See also

- The multivariate Gaussian is a widely used distribution parameterized in terms of a mean vector and covariance matrix.
- The Cauchy-Schwarz inequality for covariance follows from the fact that covariance matrices are PSD.
- Principal component analysis is a data analysis method applied to the covariance matrix.