Linear least squares
(1.9 hours to learn)
Summary
Linear least squares gives a value of x that minimizes the norm of Ax - b. It is well defined even when Ax = b has no exact solution. It is the basis of linear regression, one of the most widely used methods in statistics.
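As a quick illustration, here is a minimal sketch using NumPy (the matrix A and vector b are made-up example data): it solves an overdetermined system in the least-squares sense and checks the answer against the normal equations.

```python
import numpy as np

# Overdetermined system: 4 equations, 2 unknowns, so Ax = b has no exact solution.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.0, 2.0, 2.0, 4.0])

# Least-squares solution: the x minimizing ||Ax - b||.
x, residual, rank, sv = np.linalg.lstsq(A, b, rcond=None)

# The same x solves the normal equations A^T A x = A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

print(x, x_normal)  # the two solutions agree
```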
Context
This concept has the prerequisites:
- linear systems as matrices (Linear least squares generalizes solving systems of linear equations.)
- projection onto a subspace (Projection is used in the solution.)
- matrix transpose (The matrix transpose is used in the normal equations.)
- partial derivatives (The solution can be derived using partial derivatives.)
- four fundamental subspaces (The solution can be derived geometrically in terms of column spaces and nullspaces.)
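For reference, here is a compact sketch of how these prerequisites fit together in the standard derivation: setting the partial derivatives of ||Ax - b||^2 to zero yields the normal equations, and the fitted vector A x-hat is then the projection of b onto the column space of A.

```latex
\min_x \|Ax - b\|^2
\quad\Longrightarrow\quad
\nabla_x \|Ax - b\|^2 = 2A^\top (A\hat{x} - b) = 0
\quad\Longrightarrow\quad
A^\top A \hat{x} = A^\top b
```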
Core resources (read/watch one of the following)
-Free-
→ Khan Academy: Linear Algebra
- Lecture "Least squares approximation"
- Lecture "Least squares examples"
- Lecture "Another least squares example"
→ MIT OpenCourseWare: Linear Algebra (2011)
Videos for an introductory linear algebra course focusing on numerical methods.
-Paid-
→ Multivariable Mathematics
A textbook on linear algebra and multivariable calculus with proofs.
Location:
Section 5.5, "Projections, least squares, and inner product spaces," up to "Orthogonal bases," pages 225-232
Additional dependencies:
- Lagrange multipliers
→ Introduction to Linear Algebra
An introductory linear algebra textbook with an emphasis on numerical methods.
Location:
Section 4.3, "Least squares approximation," pages 218-225
See also
- Linear least squares is the basis for linear regression, a widely used statistical model.
- Some ways of solving the linear least squares problem include (see the sketch after this list):
- the QR decomposition
- the pseudoinverse
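As a rough sketch of those two approaches in NumPy (the array values are the same made-up example data as above), both recover the least-squares solution without forming A^T A explicitly:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.0, 2.0, 2.0, 4.0])

# Via the (reduced) QR decomposition: A = QR with Q^T Q = I,
# so the normal equations reduce to the triangular system R x = Q^T b.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# Via the pseudoinverse: x = A^+ b, computed by NumPy from the SVD of A.
x_pinv = np.linalg.pinv(A) @ b

print(x_qr, x_pinv)  # both give the least-squares solution
```

The QR route is the usual numerical choice because it avoids the conditioning penalty of forming A^T A; the pseudoinverse route also handles rank-deficient A, returning the minimum-norm solution.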