GP classification with the Laplace approximation
(1.4 hours to learn)
Unlike GP regression, GP classification has no closed-form solution for the posterior. The most basic method for approximating it is the Laplace approximation, which reduces inference to an optimization problem: finding the mode of the posterior over the latent function values.
This concept has the prerequisites:
- Gaussian process classification
- the Laplace approximation
- fitting logistic regression with iteratively reweighted least squares (We can optimize the objective function using the same IRLS method as in logistic regression.)
- learning GP hyperparameters (Part of fitting a GP classification model is learning the hyperparameters.)
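To make the connection to IRLS concrete, here is a minimal sketch of the Newton/IRLS iteration for finding the posterior mode under a logistic likelihood, in the spirit of Algorithm 3.1 from Rasmussen and Williams. The kernel choice, hyperparameter values, and iteration count are illustrative assumptions, not part of any particular reference.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # squared-exponential (RBF) kernel; hyperparameters are illustrative
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_mode(K, y, n_iter=20):
    """Newton (IRLS) iterations for the mode of p(f | y) with a logistic
    likelihood and labels y in {0, 1}. Uses the numerically stable update
    via B = I + W^{1/2} K W^{1/2} (cf. GPML Algorithm 3.1)."""
    n = len(y)
    f = np.zeros(n)
    for _ in range(n_iter):
        pi = sigmoid(f)
        W = pi * (1 - pi)              # negative Hessian of the log-likelihood
        sqrt_W = np.sqrt(W)
        B = np.eye(n) + sqrt_W[:, None] * K * sqrt_W[None, :]
        L = np.linalg.cholesky(B)
        b = W * f + (y - pi)           # IRLS "working response" term
        a = b - sqrt_W * np.linalg.solve(L.T, np.linalg.solve(L, sqrt_W * (K @ b)))
        f = K @ a                      # new estimate of the latent mode
    return f

# Toy usage: 1D inputs whose label is determined by sign
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
f_hat = laplace_mode(rbf_kernel(X, X), y)
```

At the converged mode, the Laplace approximation replaces the true posterior with a Gaussian centered at `f_hat` with covariance `(K^{-1} + W)^{-1}`, which is then used for prediction and for approximating the marginal likelihood.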
Core resources (read/watch one of the following)
→ Gaussian Processes for Machine Learning
A graduate-level machine learning textbook focusing on Gaussian processes.
Location: Section 3.4, pages 41-48
Supplemental resources (the following are optional, but you may find them useful)
→ Bayesian Reasoning and Machine Learning
A textbook for a graduate machine learning course.
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location: Section 15.3.1, pages 525-528
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 6.4.6, pages 315-318