Coursera: Machine Learning

Created by: Colorado Reed
Intended for: Coursera Machine Learning Students

Metacademy Primer

Metacademy is an open source platform designed to help you efficiently learn about any topic that you're interested in---it currently specializes in machine learning and artificial intelligence topics. The idea is that you click on a concept that interests you, and Metacademy produces a "learning plan" that will help you learn the concept and all of its prerequisite concepts that you don't already know.

Metacademy's learning experience revolves around two central components: the concept graph view and the ordered list view of your learning plan.

You can tell Metacademy that you understand a [prerequisite] concept by clicking the checkmark next to the concept's title in either the graph or the list view. You can then click the "hide" button in the upper right to hide the concepts you already understand (Metacademy remembers the concepts you've learned, so it will automatically apply them in the future).

Coursera Roadmap

This roadmap is a supplement to Andrew Ng's Coursera machine learning course. Use it to review the essential concepts presented in each section, find detailed resources for each of the discussed concepts, and brush up on the prerequisite concepts for the covered material (especially if you'd like to learn the concepts in greater detail). This roadmap is a work in progress, but you may still find it useful in its current incarnation. If you have any comments or suggestions, you can contact me at _olorado@meta_ademy.org (replace _ with a c).

P.S. After completing this course, I highly recommend furthering your machine learning knowledge via Roger Grosse's excellent Bayesian Machine Learning Roadmap.

Section I (Introduction)

  • No specific concepts for this section, which provides a general motivation for machine learning and quickly covers higher-level concepts such as supervised learning and unsupervised learning.
  • If you're new to machine learning, take a moment to read Pedro Domingos's practical machine learning overview paper, and make sure to reflect on this paper as you progress in your machine learning endeavors.

Section II (Linear Regression with One Variable)

Section III (Linear Algebra Review)

  • matrix inverse and its prerequisites will bring you up to speed on the linear algebra concepts necessary for most of this course

Section IV (Linear Regression with Multiple Variables)

  • linear regression (covered in Section II); pay attention to the multivariate case
  • basis function expansions, which tie in with the notion of "feature selection" discussed in the lecture (a particular choice of basis functions for linear regression yields "polynomial regression," as discussed in the lectures)
  • linear regression closed form solution: typically used in lieu of gradient descent optimization for smaller datasets (roughly N < 10,000); this solution yields the so-called "normal equations" (see the sketch after this list)
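
A minimal NumPy sketch of the last two bullets (not from the course; the names polynomial_features and normal_equation_fit are illustrative): it expands a 1-D input with polynomial basis functions and then fits the weights in closed form by solving the normal equations rather than running gradient descent.

    import numpy as np

    def polynomial_features(x, degree):
        """Expand a 1-D input into columns [1, x, x^2, ..., x^degree]."""
        return np.vander(x, N=degree + 1, increasing=True)

    def normal_equation_fit(X, y):
        """Solve min_theta ||X theta - y||^2 via the normal equations
        (X^T X) theta = X^T y."""
        return np.linalg.solve(X.T @ X, X.T @ y)

    # toy usage: recover the coefficients of a noisy quadratic
    rng = np.random.default_rng(0)
    x = np.linspace(-3, 3, 50)
    y = 1.0 + 2.0 * x - 0.5 * x ** 2 + rng.normal(scale=0.3, size=x.shape)
    theta = normal_equation_fit(polynomial_features(x, degree=2), y)
    print(theta)  # approximately [1.0, 2.0, -0.5]

For large N or an ill-conditioned design matrix, gradient descent or an SVD-based solver such as np.linalg.lstsq is usually preferable to forming X^T X directly.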

Section VI (Logistic Regression)

  • logistic regression; note the content on regularized logistic regression (this topic is covered in the next section) -- a sketch of the regularized cost and gradient appears below
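
As a rough illustration of the regularized variant (assuming a design matrix X whose first column is all ones and a 0/1 label vector y; the helper names are illustrative, not the course's), the cost and gradient look like this in NumPy, with the bias term left unregularized as in the lectures:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def cost_and_gradient(theta, X, y, lam):
        """Regularized logistic regression cost and gradient;
        theta[0] is the unregularized intercept term."""
        m = len(y)
        h = sigmoid(X @ theta)
        cost = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
        cost += (lam / (2 * m)) * np.sum(theta[1:] ** 2)
        grad = X.T @ (h - y) / m
        grad[1:] += (lam / m) * theta[1:]
        return cost, grad

    # toy usage: batch gradient descent on synthetic data
    rng = np.random.default_rng(0)
    X = np.hstack([np.ones((100, 1)), rng.normal(size=(100, 2))])
    y = (X[:, 1] + X[:, 2] > 0).astype(float)
    theta = np.zeros(X.shape[1])
    for _ in range(2000):
        _, grad = cost_and_gradient(theta, X, y, lam=1.0)
        theta -= 0.1 * grad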

Section VII (Regularization)

Section VIII (Neural Networks: Representation)

  • This section provides a higher-level overview and justification of neural networks and multiclass classification.
  • If you'd like to dive deeper into neural networks after completing Sections VIII and IX, consider enrolling in Geoffrey Hinton's Neural Networks for Machine Learning Coursera course.

Section IX (Neural Networks: Learning)

Section X (Advice for Applying Machine Learning)

Section XI (Machine Learning System Design)

Section XII (Support Vector Machines)

Section XIII (Clustering)

  • k-means
  • Also see k-means++ for a simple k-means initialization routine that yields an approximation guarantee (in expectation) on the final clustering cost -- a sketch of the seeding step follows this list
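
A minimal sketch of the k-means++ seeding step, assuming the data sit in an (n, d) NumPy array (the name kmeans_pp_init is illustrative; in practice scikit-learn's KMeans uses this initialization by default):

    import numpy as np

    def kmeans_pp_init(X, k, seed=None):
        """Pick k initial centers: the first uniformly at random, each
        subsequent one with probability proportional to its squared
        distance from the nearest center chosen so far."""
        rng = np.random.default_rng(seed)
        centers = [X[rng.integers(len(X))]]
        for _ in range(k - 1):
            diffs = X[:, None, :] - np.array(centers)[None, :, :]
            d2 = np.min((diffs ** 2).sum(axis=-1), axis=1)
            centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
        return np.array(centers)

    # toy usage: three well-separated 2-D clusters
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(loc=c, size=(50, 2)) for c in (-5.0, 0.0, 5.0)])
    print(kmeans_pp_init(X, k=3, seed=0))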

Section XIV (Dimensionality Reduction)

Section XV (Anomaly Detection)

Section XVI (Recommender Systems)

Section XVII (Large Scale Machine Learning)

Section XVIII (Application Example: Photo OCR)

  • sorry, nothing for this section

What next?