# orthonormal bases

(1.3 hours to learn)

## Summary

An orthonormal basis is a basis in which every vector has unit length and any two distinct vectors are orthogonal. Stacking the vectors of an orthonormal basis for the full space as the columns of a matrix Q gives an orthogonal matrix, which satisfies Q^T Q = I, so its inverse equals its transpose. These algebraic properties make orthogonal matrices especially convenient for computing projections and least squares solutions.
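As a quick sketch (using NumPy, with an arbitrary full-rank matrix chosen for illustration), one can build an orthonormal basis via QR decomposition and verify the defining properties numerically:

```python
import numpy as np

# Illustrative example: start from three linearly independent vectors
# and orthonormalize them (Gram-Schmidt, performed here via QR).
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q, R = np.linalg.qr(A)  # columns of Q form an orthonormal basis of R^3

# Each column has unit length and distinct columns are orthogonal,
# so Q^T Q = I and the inverse of Q is its transpose.
print(np.allclose(Q.T @ Q, np.eye(3)))     # prints True
print(np.allclose(np.linalg.inv(Q), Q.T))  # prints True
```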

## Context

This concept has the prerequisites:

- bases
- dot product (The dot product is needed to define orthogonality and unit vectors.)
- matrix multiplication (A key property of orthogonal matrices is that the inverse equals the transpose.)
- matrix inverse (A key property of orthogonal matrices is that the inverse equals the transpose.)
- matrix transpose (A key property of orthogonal matrices is that the inverse equals the transpose.)
- projection onto a subspace (Orthogonal matrices are convenient for projection.)
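To see why orthonormal bases make projection convenient, here is a small sketch (the specific matrix and vector are made up for illustration): if Q holds an orthonormal basis for a subspace in its columns, the projection onto that subspace is simply Q @ Q.T, with no matrix inverse required.

```python
import numpy as np

# A tall matrix whose two columns span a plane in R^3 (chosen arbitrarily).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
Q, _ = np.linalg.qr(A)  # 3x2 matrix with orthonormal columns

P = Q @ Q.T                    # projection onto the column space of A
b = np.array([1.0, 2.0, 3.0])
p = P @ b                      # the point in the subspace nearest to b

# The residual b - p is orthogonal to the subspace, which is exactly
# the condition characterizing the least squares solution.
print(np.allclose(A.T @ (b - p), 0.0))  # prints True
```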

## Core resources (read/watch one of the following)

## -Free-

→ Khan Academy: Linear Algebra

→ MIT Open Courseware: Linear Algebra (2011)

Videos for an introductory linear algebra course focusing on numerical methods.

## -Paid-

→ Introduction to Linear Algebra

An introductory linear algebra textbook with an emphasis on numerical methods.

Location:
Section 4.4, "Orthogonal bases and Gram-Schmidt," pages 234-238

## Supplemental resources (the following are optional, but you may find them useful)

## -Paid-

→ Multivariable Mathematics

A textbook on linear algebra and multivariable calculus with proofs.

Location:
Section 5.5.2, "Orthogonal bases," pages 232-238

## See also

- The eigenvalues of an orthogonal matrix all have unit norm (absolute value 1), though they may be complex.
- Orthogonal matrices are used in the spectral decomposition of a symmetric matrix.
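The eigenvalue property above can be checked numerically on a simple orthogonal matrix; a 2D rotation (angle chosen arbitrarily for illustration) works well, since its eigenvalues are a complex conjugate pair on the unit circle:

```python
import numpy as np

# A 2D rotation matrix is orthogonal for any angle theta.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.linalg.eigvals(Q)  # complex pair e^{+i*theta}, e^{-i*theta}
print(np.allclose(np.abs(eigvals), 1.0))  # prints True: unit norm
```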