Overview
Dynamic programming solves optimization problems by breaking them into overlapping subproblems, solving each subproblem only once, and storing results in a table to avoid redundant computation. Unlike divide-and-conquer, which partitions problems into disjoint subproblems, dynamic programming exploits the fact that subproblems recur, transforming exponential-time recursive algorithms into efficient polynomial-time solutions. The chapter introduces the method through four canonical problems: rod cutting, matrix-chain multiplication, longest common subsequence (LCS), and optimal binary search trees.
Key Concepts
- Four-step development method: (1) Characterize the structure of an optimal solution, (2) recursively define the value of an optimal solution, (3) compute the optimal value bottom-up (or top-down with memoization), (4) construct an optimal solution from the computed information.
- Optimal substructure: A problem exhibits optimal substructure if an optimal solution contains within it optimal solutions to subproblems. This is verified using a "cut-and-paste" argument: if a subproblem solution were not optimal, replacing it would improve the overall solution, yielding a contradiction.
- Overlapping subproblems: The recursive solution to the problem solves the same subproblems repeatedly. The total number of distinct subproblems is typically polynomial in the input size, even though a naive recursion tree is exponentially large.
- Top-down with memoization: Write a recursive solution naturally, but cache results in a table; on each call, check whether the result is already stored before computing it.
- Bottom-up method: Sort subproblems by size and solve them smallest-first; when a subproblem is reached, all prerequisite subproblems are already solved. Often has better constant factors due to no recursion overhead.
- Subproblem graph: A directed graph where each vertex is a distinct subproblem and edges represent direct dependencies. The running time of a DP algorithm is often linear in the number of vertices and edges of this graph.
- Independence of subproblems: For DP to work, subproblems must be independent (they do not share resources). The longest simple path problem fails this requirement because using vertices in one subproblem prevents their use in another.
- Reconstructing solutions: Store the choice made at each subproblem in an auxiliary table (e.g., s[i,j]) to reconstruct the actual optimal solution in addition to its value.
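To make the contrast between the two tabulation styles concrete, here is a minimal Python sketch of both on the Fibonacci recurrence, used purely as a toy example (it is not one of the chapter's four problems): the top-down version recurses naturally and caches each result, while the bottom-up version fills a table smallest-first.

```python
def fib_memoized(n, cache=None):
    """Top-down with memoization: recurse naturally, but store each
    result in a cache the first time it is computed, and return the
    cached value on any revisit."""
    if cache is None:
        cache = {}
    if n in cache:                    # already solved? return stored result
        return cache[n]
    result = n if n < 2 else fib_memoized(n - 1, cache) + fib_memoized(n - 2, cache)
    cache[n] = result
    return result

def fib_bottom_up(n):
    """Bottom-up: solve subproblems in order of increasing size, so
    every prerequisite is already in the table when it is needed."""
    if n < 2:
        return n
    table = [0] * (n + 1)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```

Either way the number of distinct subproblems is n + 1, so both run in Θ(n) time, whereas the uncached recursion would take exponential time; the bottom-up loop avoids the recursion overhead, matching the constant-factor remark above.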
Algorithms and Techniques
1. Rod Cutting
- Problem: Given a rod of length n and a price table p[1..n], determine cuts that maximize revenue.
- Recurrence: r_n = max_{1 ≤ i ≤ n} (p_i + r_{n−i}), with r_0 = 0.
- Naive CUT-ROD: Runs in exponential time, making T(n) = 2^n recursive calls on a rod of length n, because it recomputes the same subproblems over and over.
- MEMOIZED-CUT-ROD: Top-down recursive approach with an array r[0..n] initialized to −∞; returns cached results on revisits.
- BOTTOM-UP-CUT-ROD: Iterates j = 1 to n, computing r[j] = max_{1 ≤ i ≤ j} (p[i] + r[j−i]). Both the memoized and bottom-up versions run in Θ(n²), since the doubly nested loop performs 1 + 2 + ⋯ + n iterations.
- EXTENDED-BOTTOM-UP-CUT-ROD: Also records s[j] (the optimal first-piece size) to enable solution reconstruction via PRINT-CUT-ROD-SOLUTION.
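The rod-cutting procedures above can be sketched in Python as follows; this is a 0-indexed adaptation of the textbook pseudocode, with the s[] table recording the optimal first-piece size so the actual cuts can be reconstructed:

```python
def extended_bottom_up_cut_rod(p, n):
    """Bottom-up rod cutting. p[i] is the price of a piece of length i
    (p[0] is unused). Returns r, s where r[j] is the maximum revenue
    for a rod of length j and s[j] is the optimal first-piece size."""
    r = [0] * (n + 1)                 # r[0] = 0: a rod of length 0 earns nothing
    s = [0] * (n + 1)
    for j in range(1, n + 1):         # solve subproblems smallest-first
        q = float("-inf")
        for i in range(1, j + 1):     # try every size i for the first piece
            if p[i] + r[j - i] > q:
                q = p[i] + r[j - i]
                s[j] = i              # remember the choice that achieved q
        r[j] = q
    return r, s

def print_cut_rod_solution(p, n):
    """Reconstruct an optimal list of piece sizes by following s[]."""
    _, s = extended_bottom_up_cut_rod(p, n)
    pieces = []
    while n > 0:
        pieces.append(s[n])           # cut off the recorded first piece
        n -= s[n]                     # and solve the rest of the rod
    return pieces
```

With the chapter's sample price table p = [0, 1, 5, 8, 9, 10, 17, 17, 20, 24, 30], a rod of length 4 yields revenue r[4] = 10 via two pieces of length 2, illustrating how the s[] table turns the optimal value into an optimal solution.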
2. Matrix-Chain Multiplication