GP classification with the Laplace approximation
(1.4 hours to learn)
Summary
Unlike GP regression, GP classification has no closed-form posterior. The most basic way to approximate it is the Laplace approximation, which fits a Gaussian centered at the posterior mode, thereby reducing inference to an optimization problem.
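As a rough illustration of that optimization problem, the sketch below finds the posterior mode for binary GP classification with a logistic likelihood using Newton iterations, in the spirit of the stable scheme given in Rasmussen and Williams (Algorithm 3.1). The kernel, hyperparameters, and toy dataset are illustrative choices, not part of any particular reference implementation.

```python
# Hedged sketch: Laplace-approximation mode finding for binary GP
# classification with a logistic link. Kernel, hyperparameters, and the
# toy dataset below are assumptions for illustration only.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential covariance between two sets of 1-D inputs.
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_mode(K, y, n_iter=20):
    """Newton iterations for the mode f_hat of p(f | X, y).

    y has entries in {-1, +1}; the likelihood is sigmoid(y_i * f_i).
    Returns the mode f_hat and the Hessian weights W at the mode.
    """
    n = len(y)
    f = np.zeros(n)
    for _ in range(n_iter):
        pi = sigmoid(f)            # predicted class-1 probabilities
        t = (y + 1) / 2            # targets mapped to {0, 1}
        grad = t - pi              # gradient of the log-likelihood
        W = pi * (1 - pi)          # negative Hessian (diagonal)
        # Newton step in the numerically stable Cholesky form:
        sqrt_W = np.sqrt(W)
        B = np.eye(n) + sqrt_W[:, None] * K * sqrt_W[None, :]
        L = np.linalg.cholesky(B)
        b = W * f + grad
        a = b - sqrt_W * np.linalg.solve(L.T, np.linalg.solve(L, sqrt_W * (K @ b)))
        f = K @ a
    return f, W

# Toy 1-D dataset: negative labels on the left, positive on the right.
X = np.linspace(-3, 3, 20)
y = np.where(X > 0, 1.0, -1.0)
K = rbf_kernel(X, X) + 1e-6 * np.eye(len(X))  # jitter for stability
f_hat, W = laplace_mode(K, y)
print("fraction of correct latent signs:", np.mean(np.sign(f_hat) == y))
```

The same Newton update is what the IRLS prerequisite refers to: each iteration solves a reweighted least-squares problem with weights W, exactly as in logistic regression, but regularized by the GP prior through K.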
Context
This concept has the prerequisites:
- Gaussian process classification
- the Laplace approximation
- fitting logistic regression with iterative reweighted least squares (We can optimize the objective function using the same IRLS method as in logistic regression.)
- learning GP hyperparameters (Part of fitting a GP classification model is learning the hyperparameters.)
Core resources (read/watch one of the following)
-Free-
→ Gaussian Processes for Machine Learning
A graduate-level machine learning textbook focusing on Gaussian processes.
Location:
Section 3.4, pages 41-48
Supplemental resources (the following are optional, but you may find them useful)
-Free-
→ Bayesian Reasoning and Machine Learning
A textbook for a graduate machine learning course.
-Paid-
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location:
Section 15.3.1, pages 525-528
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location:
Section 6.4.6, pages 315-318
See also
-No Additional Notes-