AdaBoost

(1.1 hours to learn)

Summary

AdaBoost is an example of a boosting algorithm, where the goal is to take a "weak classifier" (one which performs slightly above chance) and make it into a "strong classifier" (one which performs well on the training set). It is widely used in data mining, especially in conjunction with decision trees, because of its simplicity and effectiveness.
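
As a rough illustration of that simplicity, the following sketch boosts decision stumps with scikit-learn's AdaBoostClassifier (which uses depth-1 decision trees as its weak learners by default). The synthetic dataset and hyperparameter values here are illustrative assumptions, not taken from the resources listed below.

# Illustrative sketch: boosting decision stumps with scikit-learn.
# The dataset and hyperparameters are made-up choices for demonstration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# By default, each weak learner is a depth-1 decision tree (a "stump").
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))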

Context

-this concept has no prerequisites-

Goals

  • Understand the steps in the AdaBoost algorithm (a minimal sketch follows this list)
  • Be aware of the underlying motivation: turning a "weak classifier" that performs only slightly above chance into a "strong classifier" that classifies the whole training set correctly.
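
To make those steps concrete, here is a minimal from-scratch sketch of discrete AdaBoost with decision stumps, following the usual textbook formulation (labels in {-1, +1}, weighted error err, vote weight alpha = 0.5 * ln((1 - err) / err)). The stump search and toy dataset are simplified assumptions for illustration, not code taken from the resources below.

import numpy as np

def fit_stump(X, y, w):
    """Find the single-feature threshold rule with the lowest weighted error."""
    best = None
    for j in range(X.shape[1]):
        for thresh in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] > thresh, sign, -sign)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, j, thresh, sign)
    return best

def adaboost(X, y, n_rounds=20):
    """Labels y must be in {-1, +1}. Returns a list of weighted stumps."""
    n = len(y)
    w = np.full(n, 1.0 / n)                        # start with uniform example weights
    ensemble = []
    for _ in range(n_rounds):
        err, j, thresh, sign = fit_stump(X, y, w)  # fit a weak learner to the weighted data
        err = max(err, 1e-10)                      # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)      # this stump's vote in the final classifier
        pred = np.where(X[:, j] > thresh, sign, -sign)
        w *= np.exp(-alpha * y * pred)             # up-weight the examples this stump got wrong
        w /= w.sum()                               # renormalize to a distribution
        ensemble.append((alpha, j, thresh, sign))
    return ensemble

def predict(ensemble, X):
    """Classify by a weighted majority vote of the weak learners."""
    score = np.zeros(len(X))
    for alpha, j, thresh, sign in ensemble:
        score += alpha * np.where(X[:, j] > thresh, sign, -sign)
    return np.sign(score)

# Tiny made-up example: a diagonal decision boundary that no single stump can match.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = adaboost(X, y)
print("training accuracy:", np.mean(predict(model, X) == y))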

Core resources (read/watch one of the following)

-Free-

The Elements of Statistical Learning
A graduate-level statistical learning textbook with a focus on frequentist methods.
Authors: Trevor Hastie, Robert Tibshirani, Jerome Friedman
Other notes:
  • Read the introductory chapters if you're not familiar with the basic machine learning setup.
Coursera: Machine Learning
An online machine learning course aimed at advanced undergraduates.
Author: Pedro Domingos
Other notes:
  • Watch the Week One lectures if you're not familiar with the basic machine learning setup.
  • Click on "Preview" to see the videos.
