Ridge regression

(1.3 hours to learn)

Summary

A problem with vanilla linear regression is that it can overfit: the learned parameters are free to match every idiosyncrasy of the training data. Ridge regression, or regularized linear regression, extends the cost function with a regularizer that penalizes large weights. This leads to simpler solutions and often improves generalization performance. The same idea of regularization can be used to improve the generalization of many other statistical models as well.
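
To make the summary concrete, here is a minimal sketch (an illustration, not taken from the resources below) of ridge regression via its closed-form solution; the regularization strength lam and the helper name ridge_fit are assumptions for this example only:

    import numpy as np

    def ridge_fit(X, y, lam=1.0):
        # Minimize ||y - X w||^2 + lam * ||w||^2 in closed form.
        # The penalty lam * I shrinks the weights toward zero;
        # lam = 0 recovers ordinary least squares.
        n_features = X.shape[1]
        A = X.T @ X + lam * np.eye(n_features)
        return np.linalg.solve(A, X.T @ y)

    # Tiny usage example on noisy synthetic data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 5))
    w_true = np.array([1.0, 0.0, -2.0, 0.5, 0.0])
    y = X @ w_true + 0.1 * rng.normal(size=50)
    print(ridge_fit(X, y, lam=1.0))

In practice, the intercept is usually left unpenalized and the features are standardized before fitting, so that the penalty treats all weights on a comparable scale.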

Context

This concept has the prerequisites:

Core resources (read/watch one of the following)

-Free-

The Elements of Statistical Learning
A graduate-level statistical learning textbook with a focus on frequentist methods.
Location: Section 3.4.3, subsection "Ridge regression," pages 59-64
Authors: Trevor Hastie, Robert Tibshirani, and Jerome Friedman

Supplemental resources (the following are optional, but you may find them useful)

-Free-

Coursera: Machine Learning (2013)
An online machine learning course aimed at a broad audience.
Author: Andrew Y. Ng
Other notes:
  • Click on "Preview" to see the videos.

-Paid-

See also