Time & Space Complexity
Common time complexities: O(1), O(log n), O(n), O(n log n), and O(n²)
Time Complexities
- O(1) → constant time: the running time does not depend on the input size.
- O(log n) → logarithmic time: the problem size shrinks by a constant fraction each step, as in binary search (see the sketch after this list).
- O(√n) → often appears in number theory (e.g. trial-division primality testing, sketched below), grid traversal optimizations, or block decomposition.
- O(k·n) or O(n + m) → linear in the (combined) input size; common in graph problems, where n = vertices and m = edges.
- O(n log n) → arises when the input is repeatedly split into subproblems solved independently, with O(n) work per level of recursion, as in merge sort (sketched below), or more generally when an O(n) pass is combined with logarithmic recursion.
- O(n²), O(n³), …, and O(nᶜ) for an arbitrary constant c > 1 → polynomial time in general.
- O(n!) → factorial growth, classic in brute-force permutation problems.
- O(n^n) → extremely rare, but worth knowing for completeness.
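As a concrete illustration of the O(log n) entry, here is a minimal binary search sketch in Python; the function name and sample data are my own, chosen just for illustration. Each iteration discards half of the remaining range, so roughly log₂ n iterations are enough.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Each loop iteration halves the search range, so the running time
    is O(log n), using only a constant amount of extra space.
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # discard the lower half
        else:
            hi = mid - 1   # discard the upper half
    return -1

# 7 elements -> at most ~3 halvings before the range is empty.
print(binary_search([1, 3, 5, 7, 9, 11, 13], 9))   # 4
print(binary_search([1, 3, 5, 7, 9, 11, 13], 8))   # -1
```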
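The O(√n) entry often shows up as trial-division primality testing: a composite number n always has a divisor no larger than √n, so it is enough to test candidates up to √n. A minimal sketch, with the helper name `is_prime` chosen only for illustration:

```python
import math

def is_prime(n):
    """Trial division up to sqrt(n): O(sqrt(n)) time, O(1) extra space."""
    if n < 2:
        return False
    for d in range(2, math.isqrt(n) + 1):   # divisors beyond sqrt(n) need not be checked
        if n % d == 0:
            return False
    return True

print([x for x in range(2, 30) if is_prime(x)])   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```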
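For the O(n log n) entry, a rough merge sort sketch (my own illustration, not a reference implementation): the list is split in half about log₂ n times, and each level of recursion does O(n) merging work. The temporary lists it builds also make it an example of O(n) extra space, which connects to the space section below.

```python
def merge_sort(items):
    """Sort a list: O(n log n) time, O(n) extra space for the merged halves."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # each half is solved independently
    right = merge_sort(items[mid:])

    # Merge step: O(n) work at each of the ~log n levels of recursion.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]
```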
Space Complexities
- O(1) uses a constant amount of extra space, regardless of input size
- O(log n) is typical of recursion: space proportional to the logarithm of the input, e.g. the call stack of a recursive binary search
- O(n) uses space proportional to the input size (see the sketch after this list contrasting O(1) and O(n) extra space)
- O(n log n) appears in certain divide-and-conquer algorithms that allocate extra data structures at each level of recursion
- O(n²) uses space proportional to the square of the input size, e.g. an n × n table or adjacency matrix
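To make the O(1) vs O(n) extra-space distinction concrete, here is a small sketch contrasting an in-place list reversal (constant extra space) with one that builds a new list (space proportional to the input); both function names are my own, for illustration only.

```python
def reverse_in_place(items):
    """O(1) extra space: swap elements with two indices, no copies made."""
    lo, hi = 0, len(items) - 1
    while lo < hi:
        items[lo], items[hi] = items[hi], items[lo]
        lo += 1
        hi -= 1
    return items

def reverse_copy(items):
    """O(n) extra space: allocates a second list as large as the input."""
    return [items[i] for i in range(len(items) - 1, -1, -1)]

data = [1, 2, 3, 4, 5]
print(reverse_copy(data))       # [5, 4, 3, 2, 1] (original list untouched)
print(reverse_in_place(data))   # [5, 4, 3, 2, 1] (original list modified)
```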