Gaussian process regression
(1.3 hours to learn)
Gaussian process regression is a Bayesian model for nonparametric regression. (That is, nonparametric in the sense that the complexity of the regression function grows with the amount of data.) The model places a prior directly on the output values without reference to an underlying parametric model.
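To make the idea concrete, here is a minimal numpy sketch of the GP posterior predictive equations, assuming a zero-mean prior and a squared-exponential (RBF) kernel. Both choices, and all hyperparameter values, are illustrative defaults rather than anything prescribed by the resources below.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    # Squared-exponential kernel; a common (but not the only) choice of prior covariance.
    sqdist = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * sqdist / length_scale**2)

def gp_posterior(X_train, y_train, X_test, noise_var=1e-2):
    # Posterior mean and covariance of a zero-mean GP with Gaussian observation noise.
    K = rbf_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                                # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))  # K^{-1} y
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v
    return mean, cov

# Toy data: noisy observations of sin(x).
rng = np.random.default_rng(0)
X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(20)
mu, cov = gp_posterior(X, y, X)
```

Note that the prior is placed on the function values themselves (via the kernel), so there is no parametric regression function to fit; prediction reduces to the linear algebra above.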
This concept has the prerequisites:
Core resources (read/watch one of the following)
→ Gaussian Processes for Machine Learning
A graduate-level machine learning textbook focusing on Gaussian processes.
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Section 6.4-6.4.2, pages 303-311
Supplemental resources (the following are optional, but you may find them useful)
→ Bayesian Reasoning and Machine Learning
A textbook for a graduate machine learning course.
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location: Section 15.1-15.2.3, pages 515-521
- To use GP regression in practice, it is usually necessary to learn the hyperparameters (such as the kernel length scale) from the data.
- Some other uses for Gaussian processes in machine learning:
- black-box optimization (where we only get to evaluate the function, and doing so is expensive)
- reinforcement learning
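As a rough illustration of hyperparameter learning, the sketch below selects a kernel length scale by maximizing the log marginal likelihood over a small grid. The grid, data, and noise level are made up for the example; practical implementations typically optimize the (log) hyperparameters with gradient-based methods instead.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale):
    sqdist = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-0.5 * sqdist / length_scale**2)

def log_marginal_likelihood(X, y, length_scale, noise_var=1e-2):
    # log p(y | X, theta) for a zero-mean GP:
    #   -1/2 y^T K^{-1} y - 1/2 log|K| - n/2 log(2 pi)
    K = rbf_kernel(X, X, length_scale) + noise_var * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))   # 1/2 log|K| via the Cholesky factor
            - 0.5 * len(X) * np.log(2 * np.pi))

# Toy data: noisy observations of sin(2x).
rng = np.random.default_rng(0)
X = np.linspace(0, 5, 30)[:, None]
y = np.sin(2 * X).ravel() + 0.05 * rng.standard_normal(30)

grid = [0.01, 0.1, 0.3, 1.0, 3.0, 10.0]
best = max(grid, key=lambda ls: log_marginal_likelihood(X, y, ls))
```

The marginal likelihood automatically trades off data fit against model complexity, so very small length scales (which treat the data as noise) and very large ones (which cannot follow the signal) both score poorly.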