(1.4 hours to learn)
The asymptotic (time) complexity of an algorithm describes how its running time grows as a function of the input size. It is typically expressed in big-O notation, which bounds the running time from above, up to a multiplicative constant, for all sufficiently large inputs.
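As a minimal sketch of the definition (illustrative only, not drawn from the resources below): f(n) is O(g(n)) if there exist constants c > 0 and n0 such that f(n) ≤ c·g(n) for all n ≥ n0. The operation count 3n + 5 is a hypothetical example:

```python
def steps(n):
    """Hypothetical operation count of some algorithm: f(n) = 3n + 5."""
    return 3 * n + 5

# f(n) = 3n + 5 is O(n): the witnesses c = 4, n0 = 5 work,
# since 3n + 5 <= 4n whenever n >= 5.
c, n0 = 4, 5
assert all(steps(n) <= c * n for n in range(n0, 10_000))
```

The constants c and n0 are not unique; any larger pair also witnesses the bound, which is why big-O statements ignore constant factors.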
-this concept has no prerequisites-
- Know the precise definition of big-O notation
- Be able to analyze the asymptotic complexity of simple algorithms
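As a sketch of the second goal, counting the basic operations of a nested loop shows why an all-pairs comparison is O(n^2) (an illustrative example, not taken from the resources below):

```python
def pair_comparisons(n):
    """Count the comparisons made by the classic all-pairs nested loop."""
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1  # one comparison per pair (i, j)
    return count

# The exact count is n(n-1)/2, which is at most n^2 for all n >= 0,
# so the nested loop runs in O(n^2) time.
for n in range(50):
    assert pair_comparisons(n) == n * (n - 1) // 2
    assert pair_comparisons(n) <= n * n
```

In practice one rarely counts operations exactly; identifying the dominant loop structure is usually enough to read off the asymptotic bound.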
Core resources (read/watch one of the following)
→ MIT 16.070: Introduction to Computers and Programming
Lecture notes for an introductory computer science course
Location: Notes for "Big-O notation"
→ Introduction to the Theory of Computation
An undergraduate textbook on automata and the theory of computation.
Location: Section 7.1, "Measuring complexity," pages 247-256
-No Additional Notes-