(2.9 hours to learn)
Decision trees are a kind of tree-structured model used in machine learning and data mining. Each leaf node corresponds to a prediction, and each internal node divides the data points into two or more sets depending on the value of one of the input variables. Decision trees are widely used because of their simplicity and their ability to handle heterogeneous input features.
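To make the structure concrete, here is a minimal sketch of how a fitted decision tree produces a prediction. The tree is represented as nested dictionaries; the features, thresholds, and labels are purely illustrative, not taken from any particular dataset or library.

```python
def predict(node, x):
    """Walk from the root down to a leaf; the leaf carries the prediction."""
    while isinstance(node, dict):          # internal node: a split on one feature
        if x[node["feature"]] <= node["threshold"]:
            node = node["left"]
        else:
            node = node["right"]
    return node                            # leaf node: the predicted label

# A hand-built tree: split on feature 0 at 2.5, then on feature 1 at 1.0.
tree = {
    "feature": 0, "threshold": 2.5,
    "left": "A",
    "right": {"feature": 1, "threshold": 1.0, "left": "B", "right": "C"},
}

print(predict(tree, [1.0, 0.0]))  # → A  (feature 0 is <= 2.5)
print(predict(tree, [3.0, 2.0]))  # → C  (feature 0 > 2.5, feature 1 > 1.0)
```

Each input variable can appear at many internal nodes (or none), which is what lets trees handle heterogeneous features: each split only ever looks at one variable at a time.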
-this concept has no prerequisites-
- Know what a decision tree is.
- Give examples of functions that can't be represented compactly by decision trees (e.g. majority, parity).
- Be able to fit a decision tree using a recursive greedy strategy.
- Know what the information gain criterion is, and why it produces better splits than classification accuracy.
- Be aware that decision trees can be unstable, in that the tree structure can change dramatically with small changes to the training data.
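The recursive greedy strategy and the information gain criterion from the goals above can be sketched as follows. This is an illustrative toy implementation, not from any of the listed resources: at each node it tries every (feature, threshold) split, scores each by information gain (the drop in label entropy), and recurses on the two resulting subsets. Note that a split can have zero gain in classification accuracy yet still reduce entropy, which is one reason information gain tends to produce better trees.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, left, right):
    """Entropy of the parent minus the weighted entropy of the children."""
    n = len(labels)
    return (entropy(labels)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))

def fit(X, y):
    """Greedily grow a tree: pick the highest-gain split, then recurse."""
    if len(set(y)) == 1:                       # pure node: make a leaf
        return y[0]
    best = None
    for j in range(len(X[0])):                 # try every feature...
        for t in sorted({x[j] for x in X}):    # ...and every observed threshold
            li = [i for i, x in enumerate(X) if x[j] <= t]
            ri = [i for i, x in enumerate(X) if x[j] > t]
            if not li or not ri:
                continue
            g = information_gain(y, [y[i] for i in li], [y[i] for i in ri])
            if best is None or g > best[0]:
                best = (g, j, t, li, ri)
    if best is None:                           # no valid split: majority leaf
        return Counter(y).most_common(1)[0][0]
    _, j, t, li, ri = best
    return {"feature": j, "threshold": t,
            "left":  fit([X[i] for i in li], [y[i] for i in li]),
            "right": fit([X[i] for i in ri], [y[i] for i in ri])}
```

The greediness is the instability mentioned above: because each split is chosen locally and all later splits depend on it, a small change in the training data can flip the root split and change the entire tree below it.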
Core resources (read/watch one of the following)
→ The Elements of Statistical Learning
A graduate-level statistical learning textbook with a focus on frequentist methods.
- Read the introductory chapters if you're not familiar with the basic machine learning setup.
→ Coursera: Machine Learning
An online machine learning course aimed at advanced undergraduates.
- The rest of the "decision tree induction" section is optional but useful.
- Watch the Week One lectures if you're not familiar with the basic machine learning setup.
- Click on "Preview" to see the videos.
→ Artificial Intelligence: a Modern Approach
A textbook giving a broad overview of all of AI.
Location: Section 18.3, "Learning decision trees," pages 653-664
- Read sections 18.1 and 18.2 if you're not familiar with the basic machine learning setup.
Supplemental resources (the following are optional, but you may find them useful)
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location: Section 16.2, "Classification and regression trees (CART)," pages 544-552
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 14.4, "Tree-based models," pages 663-666