variational mixture of Gaussians
(2.7 hours to learn)
Variational Bayes EM can be applied to fitting a mixture of Gaussians model. Unlike a standard mixture of Gaussians fit with maximum-likelihood EM, the variational algorithm automatically controls the model complexity (unneeded components have their mixing weights driven toward zero) and yields a lower bound on the marginal likelihood, which can be used for model comparison.
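As a minimal illustration of the complexity-control behavior, the sketch below uses scikit-learn's `BayesianGaussianMixture` (which implements variational inference for this model) on synthetic two-cluster data. The number of components is deliberately overspecified; with a small `weight_concentration_prior`, the variational algorithm should leave most components with negligible weight. The data, prior value, and threshold here are illustrative choices, not from the readings.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: two well-separated 2-D Gaussian clusters.
X = np.vstack([
    rng.normal(loc=-3.0, scale=0.5, size=(200, 2)),
    rng.normal(loc=+3.0, scale=0.5, size=(200, 2)),
])

# Deliberately overspecify the number of components; variational Bayes
# prunes extras by shrinking their mixing weights toward zero.
vbgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior=1e-3,  # small value favors fewer active components
    max_iter=500,
    random_state=0,
).fit(X)

# Count components that retain non-negligible weight.
effective = int(np.sum(vbgmm.weights_ > 0.01))
print("effective components:", effective)
```

In contrast, a maximum-likelihood `GaussianMixture` with `n_components=10` would happily use all ten components, illustrating the automatic Occam's-razor effect of the variational treatment.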
This concept has the prerequisites:
Core resources (read/watch one of the following)
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 10.2, pages 474-486
Supplemental resources (the following are optional, but you may find them useful)
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location: Section 21.6, pages 749-756