(1.3 hours to learn)
In an optimization problem, one seeks to minimize or maximize a function, possibly subject to equality or inequality constraints. For a continuous function on a closed and bounded set, any extremum must occur on the boundary of the set, at a point where the function is not differentiable, or at a critical point, where all of the partial derivatives are zero.
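As a minimal sketch of this candidate-checking procedure (the function and interval here are an assumed example, not taken from the resources below): to optimize f(x) = x^3 - 3x on [-2, 2], we evaluate f at the endpoints and at the critical points where f'(x) = 3x^2 - 3 = 0, i.e. x = -1 and x = 1.

```python
# Find the extrema of f(x) = x**3 - 3*x on [-2, 2] by checking
# the candidate points: the two endpoints plus the two critical
# points where the derivative 3*x**2 - 3 vanishes (x = -1, 1).

def f(x):
    return x**3 - 3*x

candidates = [-2.0, -1.0, 1.0, 2.0]  # endpoints + critical points

minimizer = min(candidates, key=f)
maximizer = max(candidates, key=f)

print(minimizer, f(minimizer))  # a global minimizer and its value
print(maximizer, f(maximizer))  # a global maximizer and its value
```

Here the minimum value -2 is attained at both x = -2 and x = 1, and the maximum value 2 at both x = -1 and x = 2; checking the finite candidate list is what guarantees nothing is missed.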
This concept has the prerequisites:
Core resources (read/watch one of the following)
→ Multivariable Mathematics
A textbook on linear algebra and multivariable calculus with proofs.
Location: Section 5.2, "Maximum/minimum problems," pages 202-207
- linear approximation
- See Section 5.1 for the statement of the Maximum Value Theorem.
→ Multivariable Calculus
An introductory multivariable calculus textbook.
Location: Section 13.5, "Multivariable optimization problems," pages 878-886
Supplemental resources (the following are optional, but you may find them useful)
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location: Appendix D (Calculus of Variations)
- A good, quick review of the calculus of variations
- Lagrange multipliers can be used to solve optimization problems with equality constraints; the KKT conditions extend the method to inequality constraints.
- Some examples of optimization problems: convex optimization problems are a very broad class of optimization problems for which we can often find a global optimum.
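To illustrate the Lagrange multiplier condition mentioned above (the objective and constraint here are an assumed example of my choosing): maximize f(x, y) = xy subject to the equality constraint g(x, y) = x + y - 1 = 0. The stationarity condition grad f = lambda * grad g gives (y, x) = lambda * (1, 1), so x = y = lambda, and the constraint then forces x = y = 1/2.

```python
# Verify the Lagrange multiplier conditions for maximizing
# f(x, y) = x*y subject to g(x, y) = x + y - 1 = 0.
# Solving grad f = lam * grad g by hand gives x = y = lam = 0.5.

def f(x, y):
    return x * y

x = y = 0.5   # solution of the Lagrange system
lam = 0.5     # the multiplier

grad_f = (y, x)        # gradient of the objective at (x, y)
grad_g = (1.0, 1.0)    # gradient of the constraint function

# Stationarity: grad f = lam * grad g, and feasibility: g(x, y) = 0.
assert grad_f == (lam * grad_g[0], lam * grad_g[1])
assert x + y - 1.0 == 0.0

print(f(x, y))  # 0.25, the constrained maximum value
```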
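A quick sketch of why convexity matters (the function and step size are an assumed example): for a convex function such as f(x) = (x - 3)^2, the single stationary point is the global minimum, so a simple gradient descent converges to it from any starting point.

```python
# Gradient descent on the convex function f(x) = (x - 3)**2.
# Each step moves against the gradient f'(x) = 2*(x - 3); because
# f is convex, the iterates approach the global minimum x = 3
# regardless of where we start.

def grad(x):
    return 2.0 * (x - 3.0)

x = -10.0                # arbitrary starting point
step = 0.1               # fixed step size
for _ in range(1000):
    x -= step * grad(x)

print(x)  # approximately 3.0, the global minimizer
```

For non-convex functions the same iteration can stall at a local minimum or saddle point, which is what makes the convex class special.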