Berkeley CS281a: Statistical Learning Theory

Created by: Colorado Reed
Intended for: students taking CS281a at Berkeley

The content of this roadmap follows Prof. Jordan's lectures/textbook. I'll update it as I/we work through the material. Send me an email if you'd like to contribute (colorado AT berkeley DOT edu).

Conditional Independence and Factorization

Exact Inference

  • The variable elimination algorithm is based on interchanging sums and products in the definitions of marginals or partition functions (see the first sketch below).
  • The sum-product algorithm is a belief propagation algorithm based on dynamic programming. Its advantage over naive variable elimination is that it reuses computations, so a single round of message passing yields the marginals for all nodes in the graph (see the second sketch below).
  • Junction trees generalize the sum-product algorithm to arbitrary graphs by grouping variables into cliques such that the cliques form a tree.
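
To make the interchange of sums and products concrete, here is a minimal variable elimination sketch on a three-variable binary chain MRF (x1 - x2 - x3). The potentials psi12 and psi23 and the elimination order are illustrative assumptions, not taken from the lectures.

```python
import numpy as np

# Toy chain MRF: x1 - x2 - x3, each variable binary, with pairwise
# potentials psi12 and psi23 (arbitrary positive values for illustration).
psi12 = np.array([[1.0, 0.5],
                  [0.5, 2.0]])
psi23 = np.array([[1.5, 1.0],
                  [0.2, 1.0]])

# Brute force: p(x1) is proportional to sum over x2, x3 of psi12(x1,x2) * psi23(x2,x3).
joint = psi12[:, :, None] * psi23[None, :, :]   # shape (x1, x2, x3)
p_x1_brute = joint.sum(axis=(1, 2))
p_x1_brute /= p_x1_brute.sum()

# Variable elimination: push the sum over x3 inside the product
# (interchange sums and products), eliminating x3 first, then x2.
m3_to_2 = psi23.sum(axis=1)    # intermediate factor from eliminating x3, indexed by x2
m2_to_1 = psi12 @ m3_to_2      # eliminate x2, leaving a table over x1
p_x1_ve = m2_to_1 / m2_to_1.sum()

print(p_x1_brute, p_x1_ve)     # the two marginals agree
```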
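The sum-product algorithm on the same toy chain illustrates the computational reuse: one forward and one backward pass of messages gives every node's marginal, whereas naive elimination would rerun from scratch for each node. This is a sketch under the same assumed potentials, not a general factor-graph implementation.

```python
import numpy as np

# Sum-product on the toy chain x1 - x2 - x3 (illustrative potentials as above).
psi12 = np.array([[1.0, 0.5],
                  [0.5, 2.0]])
psi23 = np.array([[1.5, 1.0],
                  [0.2, 1.0]])

# Forward messages (left to right) and backward messages (right to left).
# Each message is a table over the receiving variable.
fwd_2 = psi12.sum(axis=0)   # message x1 -> x2: sum_x1 psi12(x1, x2)
fwd_3 = fwd_2 @ psi23       # message x2 -> x3: sum_x2 fwd_2(x2) psi23(x2, x3)
bwd_2 = psi23.sum(axis=1)   # message x3 -> x2: sum_x3 psi23(x2, x3)
bwd_1 = psi12 @ bwd_2       # message x2 -> x1: sum_x2 psi12(x1, x2) bwd_2(x2)

def normalize(v):
    return v / v.sum()

# Each node's marginal is the normalized product of its incoming messages,
# so a single forward/backward sweep gives all marginals at once.
p_x1 = normalize(bwd_1)
p_x2 = normalize(fwd_2 * bwd_2)
p_x3 = normalize(fwd_3)

print(p_x1, p_x2, p_x3)
```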

Sampling-based Inference

Statistical Concepts

We discussed Bayesian vs. frequentist inference; I'll add the specific topics we touched on as I fill out this section.
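
As a small, concrete illustration of the Bayesian vs. frequentist contrast, here is a Bernoulli coin-flip sketch comparing the maximum likelihood estimate with the posterior mean under a Beta prior. The data and prior parameters are made up for illustration and are not from the lectures.

```python
import numpy as np

# Toy data: 8 coin flips, 6 heads and 2 tails (assumed for illustration).
flips = np.array([1, 1, 1, 0, 1, 1, 0, 1])
n_heads = flips.sum()
n_tails = len(flips) - n_heads

# Frequentist point estimate: the maximum likelihood estimate of theta.
theta_mle = n_heads / len(flips)

# Bayesian inference: place a Beta(a, b) prior on theta; since the Beta is
# conjugate to the Bernoulli likelihood, the posterior is again a Beta.
a, b = 2.0, 2.0                               # prior pseudo-counts (assumed)
a_post, b_post = a + n_heads, b + n_tails     # posterior is Beta(a + heads, b + tails)
theta_post_mean = a_post / (a_post + b_post)  # posterior mean shrinks toward 1/2

print(theta_mle, theta_post_mean)             # 0.75 vs. ~0.67
```

With more data the posterior mean approaches the MLE; the prior matters most when the sample is small.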

Linear Regression and the Least Mean Squares Algorithm

Linear Classifiers

Note: I'll fill out this section as I read through the material and as we discuss it more thoroughly in class.

Generalized Linear Models and Generative vs. Discriminative Models

Note: I'll fill out this section as I read through the material and as we discuss it more thoroughly in class.