Bayesian logistic regression
(1.8 hours to learn)
Summary
A Bayesian version of logistic regression.
Context
This concept has the prerequisites:
- logistic regression
- Bayesian parameter estimation
- Bayesian linear regression (Many of the ideas from Bayesian linear regression transfer to Bayesian logistic regression.)
- the Laplace approximation (The Laplace approximation is a simple way to approximate Bayesian logistic regression.)
- the evidence approximation (The evidence approximation is a simple way to choose hyperparameters in Bayesian logistic regression.)
- Bayesian decision theory (Decision theory tells us how to make predictions from Bayesian parameter estimation.)
- probit regression (The posterior predictive distribution is often approximated using probit regression.)
Goals
- Know the form of the Bayesian logistic regression model
- Be able to estimate the parameters of the model computationally (e.g. with the Laplace approximation or EP)
- Be able to approximate the predictive distribution computationally (e.g. with sampling or the probit approximation)
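The second and third goals can be sketched together in code. The following is a minimal illustrative sketch, not code from either textbook: it fits the MAP weights by Newton's method under a spherical Gaussian prior, takes the Laplace (Gaussian) approximation around the MAP solution, and computes the predictive distribution with the probit-style moderated output. The function names and the prior precision `alpha` are assumptions made for the example.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def laplace_fit(X, y, alpha=1.0, n_iter=50):
    """Laplace approximation: Gaussian N(w_map, S) around the MAP weights.
    Model: w ~ N(0, alpha^{-1} I), y_i ~ Bernoulli(sigmoid(x_i @ w))."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        p = sigmoid(X @ w)
        grad = X.T @ (y - p) - alpha * w            # gradient of the log joint
        R = p * (1.0 - p)
        H = -(X.T * R) @ X - alpha * np.eye(d)      # Hessian of the log joint
        w = w - np.linalg.solve(H, grad)            # Newton step (maximization)
    S = np.linalg.inv(-H)                           # posterior covariance at the MAP
    return w, S

def predict_probit(X_new, w_map, S):
    """Probit ("moderated output") approximation to the predictive distribution:
    sigmoid(kappa * mu) with kappa = (1 + pi * var / 8)^(-1/2)."""
    mu = X_new @ w_map
    var = np.sum((X_new @ S) * X_new, axis=1)       # predictive variance of x @ w
    kappa = 1.0 / np.sqrt(1.0 + np.pi * var / 8.0)
    return sigmoid(kappa * mu)
```

Note the effect of the moderated output: because kappa is at most 1, the probit approximation pulls probabilities toward 0.5 relative to plugging in the MAP weights, reflecting posterior uncertainty about w.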
Core resources (read/watch one of the following)
-Paid-
→ Machine Learning: a Probabilistic Perspective
A very comprehensive graduate-level machine learning textbook.
Location:
Section 8.4, pages 254-261
Additional dependencies:
- probit regression
- Monte Carlo estimation
Supplemental resources (the following are optional, but you may find them useful)
-Paid-
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location:
Section 4.5, pages 217-220
Additional dependencies:
- probit regression
See also
- Bayesian logistic regression is intractable to solve exactly. Some approximation methods include the Laplace approximation, expectation propagation (EP), and sampling.
- Gaussian process classification is a nonlinear analogue.
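The sampling route to the predictive distribution mentioned in the goals can be sketched as follows. This assumes a Gaussian (e.g. Laplace) posterior N(w_map, S) over the weights is already available; the function name and arguments are illustrative, not from either textbook.

```python
import numpy as np

def predict_mc(X_new, w_map, S, n_samples=2000, seed=0):
    """Monte Carlo predictive distribution: draw weight samples from the
    approximate Gaussian posterior N(w_map, S) and average the resulting
    Bernoulli probabilities sigmoid(x @ w) over the samples."""
    rng = np.random.default_rng(seed)
    W = rng.multivariate_normal(w_map, S, size=n_samples)   # (n_samples, d)
    probs = 1.0 / (1.0 + np.exp(-(X_new @ W.T)))            # (n, n_samples)
    return probs.mean(axis=1)
```

Unlike the probit approximation, this estimate converges to the exact predictive integral (under the Gaussian posterior) as the number of samples grows, at the cost of more computation per prediction.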