This chapter formalizes the asymptotic notation introduced informally in Chapter 2 and catalogs the standard mathematical functions that arise in algorithm analysis. It provides the precise definitions of Θ, O, Ω, o, and ω notations, establishes their properties, and reviews floors, ceilings, logarithms, exponentials, factorials, Fibonacci numbers, and the iterated logarithm function.
Θ-notation (Theta) — Asymptotically tight bound:
Θ(g(n)) = {f(n) : ∃ positive constants c₁, c₂, n₀ such that 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀}
f(n) is "sandwiched" between c₁g(n) and c₂g(n) for sufficiently large n.
Example: ½n² − 3n = Θ(n²) (choose c₁ = 1/14, c₂ = 1/2, n₀ = 7).
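The sandwich can be checked numerically for the stated witnesses. The following sketch is an empirical illustration over a finite range, not a proof; the functions f and g and the loop bound are chosen here for demonstration:

```python
# Empirical check (not a proof) that c1 = 1/14, c2 = 1/2, n0 = 7
# witness (1/2)n^2 - 3n = Theta(n^2): the sandwich
# 0 <= c1*g(n) <= f(n) <= c2*g(n) must hold for every n >= n0.
def f(n):
    return 0.5 * n * n - 3 * n

def g(n):
    return n * n

c1, c2, n0 = 1 / 14, 0.5, 7

for n in range(n0, 10_000):
    assert 0 <= c1 * g(n) <= f(n) <= c2 * g(n)

print("sandwich holds for 7 <= n < 10000")
```

At n = 7 the lower bound is tight: c₁g(7) = 49/14 = 3.5 = f(7), which is why n₀ cannot be made smaller for this choice of c₁.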
O-notation (Big-Oh) — Asymptotic upper bound:
O(g(n)) = {f(n) : ∃ positive constants c, n₀ such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n₀}
Gives an upper bound to within a constant factor. Note: Θ(g(n)) ⊆ O(g(n)).
Bounding the worst-case running time with O-notation bounds the running time on every input.
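Finding O-witnesses usually amounts to over-approximating each lower-order term. As a hedged illustration (the polynomial f(n) = 2n² + 3n + 1 is a hypothetical example, not from the text): for n ≥ 1, 2n² + 3n + 1 ≤ 2n² + 3n² + n² = 6n², so c = 6 and n₀ = 1 suffice.

```python
# Empirical check (not a proof) that the example polynomial
# f(n) = 2n^2 + 3n + 1 is O(n^2), using the witnesses c = 6, n0 = 1
# obtained by bounding each lower-order term by n^2.
def f(n):
    return 2 * n * n + 3 * n + 1

c, n0 = 6, 1
for n in range(n0, 10_000):
    assert 0 <= f(n) <= c * n * n

print("upper bound holds for 1 <= n < 10000")
```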
Ω-notation (Big-Omega) — Asymptotic lower bound:
Ω(g(n)) = {f(n) : ∃ positive constants c, n₀ such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n₀}
Stating that a running time is Ω(g(n)) means that for sufficiently large n, no input causes the algorithm to run in less than cg(n) time, for some constant c > 0.
Theorem 3.1: f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
o-notation (little-oh) — Upper bound that is NOT asymptotically tight:
o(g(n)) = {f(n) : for every constant c > 0, ∃ a constant n₀ > 0 such that 0 ≤ f(n) < cg(n) for all n ≥ n₀}
Equivalently: lim(n→∞) f(n)/g(n) = 0.
Example: 2n = o(n²), but 2n² ≠ o(n²).
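The limit characterization makes the example above easy to see numerically. A minimal sketch (an illustration of the ratios, not a proof):

```python
# The ratio f(n)/g(n) must tend to 0 for f(n) = o(g(n)).
# For f(n) = 2n, g(n) = n^2 the ratio is 2/n, which shrinks toward 0,
# so 2n = o(n^2); for f(n) = 2n^2 the ratio is constant at 2,
# so 2n^2 is O(n^2) but not o(n^2).
for n in [10, 100, 1000, 10_000]:
    print(n, (2 * n) / n**2, (2 * n**2) / n**2)
```

The first ratio is 2/n and vanishes as n grows; the second stays fixed at 2, which is exactly why the strict inequality f(n) < cg(n) fails for any c ≤ 2.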