positive definite matrices
(1.8 hours to learn)
Summary
A symmetric matrix A is positive definite if x^T A x > 0 for every nonzero vector x, and positive semidefinite (PSD) if x^T A x >= 0 for all x. A symmetric matrix is positive definite exactly when all of its eigenvalues are positive, or equivalently when all of the pivots in Gaussian elimination are positive. Examples of PSD matrices include covariance matrices and the Hessian matrices of convex functions. The singular value decomposition (SVD) is closely related to the eigendecomposition of a positive semidefinite matrix.
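The two characterizations above can be checked numerically. This is a minimal sketch using NumPy, with a small symmetric matrix chosen for illustration:

```python
import numpy as np

# A small symmetric matrix chosen as an example.
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

# Eigenvalue test: a symmetric A is positive definite
# iff all of its eigenvalues are strictly positive.
eigvals = np.linalg.eigvalsh(A)   # here: 1 and 3
is_pd = bool(np.all(eigvals > 0))

# Quadratic-form test: x^T A x > 0 for nonzero x.
rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal(2)
    assert x @ A @ x > 0

print(is_pd)  # True
```

For this matrix the eigenvalues are 1 and 3, so both tests agree that it is positive definite.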
Context
This concept has the prerequisites:
- spectral decomposition (A symmetric matrix is positive definite iff the eigenvalues are all positive.)
- matrix transpose (PD matrices are usually assumed to be symmetric.)
- matrix multiplication (PD matrices are defined in terms of matrix-vector multiplications.)
- eigenvalues and eigenvectors (A symmetric matrix is positive definite iff the eigenvalues are all positive.)
- matrix inverse (PD matrices are closed under matrix inverses.)
Goals
- Know the definition of a positive (semi-)definite matrix (in terms of the quadratic form)
- Show that a symmetric matrix is PD if and only if its eigenvalues are all positive
Core resources (read/watch one of the following)
-Free-
→ MIT Open Courseware: Linear Algebra (2011)
Videos for an introductory linear algebra course focusing on numerical methods.
Additional dependencies:
- Gaussian elimination
-Paid-
→ Introduction to Linear Algebra
An introductory linear algebra textbook with an emphasis on numerical methods.
Location:
Section 6.5, "Positive definite matrices," pages 342-347
Additional dependencies:
- Gaussian elimination
See also
- For a symmetric PSD matrix, the singular values in the singular value decomposition coincide with the eigenvalues
- Some common examples of PSD matrices:
- covariance matrices, which represent dependencies between different random variables
- kernel matrices, a way of representing similarity in machine learning
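Both points in the "See also" list can be verified on a small example. The sketch below uses synthetic data (the sample size and dimension are arbitrary choices): a sample covariance matrix has the form M^T M up to scaling, so it is PSD, and its singular values match its eigenvalues.

```python
import numpy as np

# Hypothetical data: 100 samples of 3 random variables.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))

# The sample covariance C = X_c^T X_c / (n-1), so
# x^T C x = ||X_c x||^2 / (n-1) >= 0, i.e. C is PSD.
C = np.cov(X, rowvar=False)
eigvals = np.linalg.eigvalsh(C)
assert np.all(eigvals >= -1e-12)  # nonnegative up to rounding

# For a symmetric PSD matrix, singular values = eigenvalues.
singvals = np.linalg.svd(C, compute_uv=False)
assert np.allclose(np.sort(singvals), np.sort(eigvals))
```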