EM algorithm for PCA
(35 minutes to learn)
While probabilistic PCA has a closed-form maximum likelihood solution, computing it requires an eigendecomposition of the data covariance matrix, which becomes infeasible for large, high-dimensional datasets. The expectation-maximization (EM) algorithm provides an alternative way to fit the model. Despite its iterative nature, it can be far more computationally efficient, since each iteration avoids constructing or decomposing the full covariance matrix.
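To make the iteration concrete, here is a minimal sketch of EM for probabilistic PCA with latent dimension q, following the updates derived in Bishop, Section 12.2.2. The function name and variable names are illustrative, not taken from either textbook.

```python
import numpy as np

def ppca_em(X, q, n_iter=100, seed=0):
    """Fit a q-dimensional probabilistic PCA model to data X (N x D) via EM."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    Xc = X - X.mean(axis=0)          # center the data once
    W = rng.standard_normal((D, q))  # random initial loading matrix
    sigma2 = 1.0                     # initial noise variance

    for _ in range(n_iter):
        # E step: posterior moments of the latent variables z_n
        M = W.T @ W + sigma2 * np.eye(q)              # q x q
        Minv = np.linalg.inv(M)
        Ez = Xc @ W @ Minv                            # N x q; row n is E[z_n]
        sum_Ezz = N * sigma2 * Minv + Ez.T @ Ez       # sum over n of E[z_n z_n^T]

        # M step: maximize the expected complete-data log likelihood
        W_new = (Xc.T @ Ez) @ np.linalg.inv(sum_Ezz)  # D x q
        sigma2 = (np.sum(Xc**2)
                  - 2 * np.sum(Ez * (Xc @ W_new))
                  + np.trace(sum_Ezz @ W_new.T @ W_new)) / (N * D)
        W = W_new

    return W, sigma2
```

Note that each iteration only involves matrices of size D x q and q x q, so the cost scales with O(NDq) rather than the O(ND^2) needed to form the covariance matrix; this is the source of the computational advantage when q is much smaller than D.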
This concept has the prerequisites:
- Expectation-Maximization algorithm
- probabilistic PCA (EM is a way of fitting the probabilistic PCA model.)
- maximum likelihood: multivariate Gaussians (The M step involves maximum likelihood for multivariate Gaussians.)
Core resources (read/watch one of the following)
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location: Section 12.2.5, pages 396-398
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 12.2.2, pages 577-580