EM algorithm for PCA
(35 minutes to learn)
Summary
While probabilistic PCA has a closed-form solution, computing it can be expensive for large, high-dimensional datasets because it requires an eigendecomposition of the data covariance matrix. The expectation-maximization (EM) algorithm provides an alternative way to fit the model. Despite its iterative nature, it can be far more computationally efficient, since each iteration avoids forming or decomposing the full covariance matrix; a rough sketch of the updates is given below.
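As a minimal illustration (not part of the original resources), here is a NumPy sketch of the standard EM updates for probabilistic PCA, in the form covered by the textbooks listed below. The function name, initialization, and iteration count are illustrative choices, not a prescribed implementation.

```python
import numpy as np

def ppca_em(X, k, n_iter=100, seed=0):
    """Sketch of EM for probabilistic PCA.

    X: (N, D) data matrix; k: latent dimensionality.
    Returns the loading matrix W (D, k) and noise variance sigma2.
    """
    rng = np.random.default_rng(seed)
    N, D = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu                      # center the data once
    W = rng.standard_normal((D, k))  # random initial loadings
    sigma2 = 1.0                     # initial noise variance

    for _ in range(n_iter):
        # E-step: posterior moments of the latent variables z_n
        M = W.T @ W + sigma2 * np.eye(k)        # (k, k)
        Minv = np.linalg.inv(M)
        Ez = Xc @ W @ Minv                      # rows are E[z_n]^T
        EzzT = N * sigma2 * Minv + Ez.T @ Ez    # sum_n E[z_n z_n^T]

        # M-step: closed-form updates for W and sigma^2
        W_new = Xc.T @ Ez @ np.linalg.inv(EzzT)
        sigma2 = (np.sum(Xc ** 2)
                  - 2 * np.sum((Xc @ W_new) * Ez)
                  + np.trace(EzzT @ W_new.T @ W_new)) / (N * D)
        W = W_new

    return W, sigma2
```

Note that the per-iteration cost involves only k-by-k matrix inversions and matrix products with the data, which is the source of the efficiency advantage over an explicit eigendecomposition when the data dimension is large.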
Context
This concept has the prerequisites:
- Expectation-Maximization algorithm
- probabilistic PCA (EM is a way of fitting the probabilistic PCA model.)
- maximum likelihood: multivariate Gaussians (The M step involves maximum likelihood for multivariate Gaussians.)
Core resources (read/watch one of the following)
-Paid-
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location:
Section 12.2.5, pages 396-398
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location:
Section 12.2.2, pages 577-580
See also
-No Additional Notes-