optimization problems
(1.3 hours to learn)
Summary
In an optimization problem, one is interested in minimizing or maximizing a function, possibly subject to equality or inequality constraints. The extrema must occur on the boundary of the domain, at points where the function is not differentiable, or at points where all the partial derivatives are zero.
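As a minimal sketch of the last case, the interior critical points of a smooth function can be found by solving for where all partial derivatives vanish. The function below is a hypothetical example chosen for illustration, not one taken from the resources:

```python
import sympy as sp

x, y = sp.symbols("x y")
# Hypothetical example function: f(x, y) = x^2 + y^2 - 2x
f = x**2 + y**2 - 2*x

# Critical points occur where both partial derivatives vanish.
grad = [sp.diff(f, v) for v in (x, y)]
critical_points = sp.solve(grad, (x, y), dict=True)
print(critical_points)  # [{x: 1, y: 0}]
```

Here the unique critical point (1, 0) is a minimum; in general one would still check the boundary and any non-differentiable points.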
Context
This concept has the prerequisites:
- partial derivatives (Partial derivatives are necessary to characterize the critical points.)
Core resources (read/watch one of the following)
-Free-
→ MIT OpenCourseWare: Multivariable Calculus (2010)
Video lectures for MIT's introductory multivariable calculus class.
-Paid-
→ Multivariable Mathematics
A textbook on linear algebra and multivariable calculus with proofs.
Location:
Section 5.2, "Maximum/minimum problems," pages 202-207
Additional dependencies:
- linear approximation
Other notes:
- See Section 5.1 for the statement of the Maximum Value Theorem.
→ Multivariable Calculus
An introductory multivariable calculus textbook.
Location:
Section 13.5, "Multivariable optimization problems," pages 878-886
Supplemental resources (the following are optional, but you may find them useful)
-Paid-
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location:
Appendix D (Calculus of Variations)
Other notes:
- Good, quick review of the calculus of variations
See also
- Lagrange multipliers can be used to solve optimization problems with equality constraints; the KKT conditions extend them to inequality constraints.
- Some examples of optimization problems:
- Fitting parameters of a statistical model using maximum likelihood
- Linear least squares
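As a sketch of the linear least squares example: minimizing the squared residual ||A p - y||² is an unconstrained optimization problem, and setting its gradient to zero yields the normal equations AᵀA p = Aᵀy. The data below is hypothetical, chosen so the exact fit is known:

```python
import numpy as np

# Hypothetical data lying exactly on the line y = 2*t + 1.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Design matrix for the model y ≈ m*t + c.
A = np.column_stack([t, np.ones_like(t)])

# lstsq minimizes ||A @ params - y||^2, i.e. it solves the
# optimization problem whose critical-point condition is the
# normal equations A^T A params = A^T y.
params, *_ = np.linalg.lstsq(A, y, rcond=None)
m, c = params
print(m, c)  # ≈ 2.0, 1.0
```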