Imagine having a single formula that could solve virtually any divide-and-conquer recurrence instantly. No recursion tree drawing, no tedious expansion, no guessing—just plug in the parameters and read off the answer.
That tool exists. It's called the Master Theorem, and it's one of the most powerful results in algorithm analysis. Once you internalize it, you'll be able to analyze the complexity of most standard D&C algorithms in seconds.
By the end of this page, you will understand the Master Theorem's three cases, know how to apply it to any recurrence of the form T(n) = aT(n/b) + f(n), and develop deep intuition for why each case produces its particular complexity. You'll also learn when the Master Theorem doesn't apply and what to do in those situations.
The Master Theorem provides asymptotic solutions for recurrences of the form:
T(n) = aT(n/b) + f(n)
Where:
- a ≥ 1 is the number of subproblems each call spawns,
- b > 1 is the factor by which the problem size shrinks, and
- f(n) is the non-recursive work (dividing and combining) done per call.
The Three Cases:
Let c_crit = log_b(a). This is the critical exponent. The solution depends on how f(n) compares to n^{c_crit}:
Case 1: f(n) = O(n^c) where c < c_crit
T(n) = Θ(n^{c_crit}) = Θ(n^{log_b a})
The recursive work dominates. Leaves of the recursion tree control the complexity.
Case 2: f(n) = Θ(n^{c_crit} × log^k n) for k ≥ 0
T(n) = Θ(n^{c_crit} × log^{k+1} n)
Balanced: the work is evenly distributed across levels.
Case 3: f(n) = Ω(n^c) where c > c_crit, AND the regularity condition holds
T(n) = Θ(f(n))
The non-recursive work dominates. The root of the tree controls the complexity.
The Master Theorem is fundamentally a comparison between two forces:
- the recursive work, which totals Θ(n^{log_b a}) across the leaves of the recursion tree, and
- the non-recursive work f(n) done at the root and the internal nodes.

Whichever force dominates determines the complexity. If they're balanced, you get an extra log factor.
The critical exponent c_crit = log_b(a) is the key to the Master Theorem. It represents the rate at which the number of subproblems grows as we descend the recursion tree.
What c_crit Tells Us:
| a (subproblems) | b (division) | c_crit = log_b(a) | Interpretation |
|---|---|---|---|
| 1 | 2 | log₂(1) = 0 | Single path: n⁰ = 1 leaf |
| 2 | 2 | log₂(2) = 1 | Perfect binary: n¹ = n leaves |
| 4 | 2 | log₂(4) = 2 | Four-way split: n² leaves |
| 3 | 2 | log₂(3) ≈ 1.585 | Karatsuba pattern |
| 7 | 2 | log₂(7) ≈ 2.807 | Strassen pattern |
| 2 | 4 | log₄(2) = 0.5 | Two subproblems from quartering |
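The table's values are easy to verify yourself; a quick sketch in Python, using `math.log` with an explicit base:

```python
import math

def critical_exponent(a: float, b: float) -> float:
    """Compute c_crit = log_b(a) for a recurrence T(n) = a*T(n/b) + f(n)."""
    return math.log(a, b)

# Rows from the table above:
print(critical_exponent(1, 2))  # 0.0
print(critical_exponent(4, 2))  # 2.0
print(critical_exponent(3, 2))  # ≈ 1.585 (Karatsuba)
print(critical_exponent(7, 2))  # ≈ 2.807 (Strassen)
print(critical_exponent(2, 4))  # 0.5
```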
Deriving the Critical Exponent:
Let's see why the leaves contribute n^{log_b a} work. The tree has depth log_b n, and each level multiplies the node count by a, so there are a^{log_b n} leaves. The identity a^{log_b n} = n^{log_b a} converts this count into a power of n:
a^{log_b n} = a^{(log_a n / log_a b)}
= (a^{log_a n})^{1/log_a b}
= n^{1/log_a b}
= n^{log_b a}
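The identity can also be spot-checked numerically; here is a small sketch using the Strassen parameters a = 7, b = 2 and an arbitrary power-of-two input size:

```python
import math

a, b, n = 7, 2, 4096.0

lhs = a ** math.log(n, b)   # a^(log_b n): the leaf count, directly
rhs = n ** math.log(a, b)   # n^(log_b a): the same count as a power of n

print(lhs, rhs)
assert math.isclose(lhs, rhs, rel_tol=1e-9)
```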
Understanding c_crit lets you predict complexity without memorizing formulas. If you're creating more subproblems than the work-per-level can "pay off," leaves dominate. If work-per-level grows faster, the root dominates. If they match, you get balanced log factors.
When This Applies:
Case 1 applies when the non-recursive work f(n) grows slower than the leaf count n^{log_b a}. The explosive growth of subproblems overwhelms the divide/combine work.
Solution:
T(n) = Θ(n^{log_b a})
Example 1: T(n) = 2T(n/2) + O(1)
This is the D&C max/min pattern: O(1) work per node and n leaves, so T(n) = Θ(n^{log₂ 2}) = Θ(n).
Example 2: T(n) = 8T(n/2) + O(n²)
This is naive D&C matrix multiplication: 8 subproblems of size n/2, with O(n²) combine work (the additions). The 8 recursive calls dominate, giving T(n) = Θ(n^{log₂ 8}) = Θ(n³).
Example 3: T(n) = 7T(n/2) + O(n²)
Strassen's algorithm. Reducing from 8 to 7 subproblems drops the complexity from Θ(n³) to Θ(n^{log₂ 7}) ≈ Θ(n^{2.807}).
In Case 1, imagine a recursion tree where the work at each level increases as you go down (because you have more nodes). The bottom level (leaves) does the most work overall. Total complexity is dominated by counting the leaves.
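You can watch Case 1 happen by evaluating Example 1's recurrence exactly. A minimal sketch, assuming base case T(1) = 1 and n a power of two:

```python
def T(n: int) -> int:
    """Evaluate T(n) = 2*T(n/2) + 1 with T(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + 1

# Case 1 predicts Θ(n^{log2 2}) = Θ(n); the exact closed form here is 2n - 1,
# i.e. the n leaves dominate the n - 1 internal nodes.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == 2 * n - 1
print(T(1024))  # 2047
```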
When This Applies:
Case 2 applies when f(n) grows at exactly the same rate as the leaf count, possibly with logarithmic factors. Each level of the tree contributes roughly equal work.
Solution:
T(n) = Θ(n^{log_b a} × log^{k+1} n)
The most common subcase is k = 0: f(n) = Θ(n^{log_b a}), giving T(n) = Θ(n^{log_b a} log n).
Example 1: T(n) = 2T(n/2) + O(n) — The Classic
The merge sort recurrence. Every level does O(n) work, and there are O(log n) levels, so T(n) = Θ(n log n).
Example 2: T(n) = T(n/2) + O(1)
Binary search. Here c_crit = log₂(1) = 0, so f(n) = O(1) = Θ(n⁰) matches Case 2 with k = 0: O(1) work per level × O(log n) levels gives T(n) = Θ(log n).
Note: some textbooks treat c_crit = 0 as a degenerate instance of Case 2; the result is the same.
Example 3: T(n) = 4T(n/2) + O(n² log n) — Extended Case 2
When f(n) carries a log factor, it passes through with the power raised by one: here c_crit = log₂(4) = 2 and k = 1, so T(n) = Θ(n² log² n).
Case 2 is special because the work is perfectly distributed. Each level contributes Θ(n^{log_b a}) work, and there are Θ(log n) levels. The log factor comes from summing equal contributions across logarithmically many levels.
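The balance is visible if you evaluate the merge sort recurrence exactly. A sketch assuming base case T(1) = 0 and n a power of two:

```python
def T(n: int) -> int:
    """Evaluate T(n) = 2*T(n/2) + n with T(1) = 0, for n a power of two."""
    if n == 1:
        return 0
    return 2 * T(n // 2) + n

# Case 2 predicts Θ(n log n); for powers of two the sum is exactly n * log2(n):
# log2(n) levels, each contributing exactly n work.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k
print(T(1024))  # 10240
```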
When This Applies:
Case 3 applies when the non-recursive work f(n) grows faster than the leaf count. The work at the root level dominates all recursive contributions.
Solution:
T(n) = Θ(f(n))
Important: The Regularity Condition
Case 3 comes with a requirement: f(n) must satisfy the regularity condition:
a × f(n/b) ≤ c × f(n) for some constant c < 1 and all sufficiently large n
This ensures that f(n) truly dominates—the work decreases sufficiently fast as we go down the tree.
Example 1: T(n) = T(n/2) + O(n)
The quickselect average case. O(n) partitioning at the top level, then recurse on half. The work halves each level, so total is O(n).
Example 2: T(n) = 2T(n/2) + O(n²)
A hypothetical algorithm with quadratic combine. The root's O(n²) work dominates the O(n) leaf contributions.
Example 3: T(n) = 3T(n/4) + O(n log n)
Here c_crit = log₄(3) ≈ 0.79 and f(n) = n log n = Ω(n¹) with 1 > c_crit, so Case 3 applies. Regularity holds: 3 × (n/4) log(n/4) ≤ (3/4) n log n. Thus T(n) = Θ(n log n).
The regularity condition is often glossed over, but it matters. For polynomial f(n) = n^c, regularity automatically holds. But for functions with unusual growth (like n / log n), you need to verify. If regularity fails, the Master Theorem doesn't apply directly.
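A small sketch of checking regularity empirically, for Example 2 above (a = 2, b = 2, f(n) = n², where c = 1/2 works) and for the problematic f(n) = n / log n:

```python
import math

def regularity_holds(a, b, f, c, ns):
    """Empirically check a*f(n/b) <= c*f(n) for each sample size in ns."""
    return all(a * f(n / b) <= c * f(n) for n in ns)

# Example 2 above: 2 * (n/2)^2 = n^2 / 2, so regularity holds with c = 1/2.
print(regularity_holds(2, 2, lambda n: n ** 2, 0.5, range(1, 1000)))   # True

# For f(n) = n / log2(n): 2 * f(n/2) = n / (log2(n) - 1) > n / log2(n),
# so no constant c < 1 works.
print(regularity_holds(2, 2, lambda n: n / math.log2(n), 0.99, range(4, 1000)))  # False
```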
Here's the systematic procedure for applying the Master Theorem to any recurrence:

1. Identify a, b, and f(n) from the recurrence.
2. Compute the critical exponent c_crit = log_b(a).
3. Express f(n) as a power of n (times log factors, if any).
4. Compare the exponent of f(n) to c_crit to select the case.
5. If Case 3, verify the regularity condition.
6. Write down the solution.
Worked Example: T(n) = 9T(n/3) + n
Step 1: a = 9, b = 3, f(n) = n
Step 2: c_crit = log₃(9) = 2
Step 3: f(n) = n = n¹, so c = 1
Step 4: Compare: c = 1 < 2 = c_crit → Case 1
Step 5: No regularity check needed for Case 1
Step 6: T(n) = Θ(n^{log₃ 9}) = Θ(n²)
Worked Example: T(n) = T(2n/3) + 1
Step 1: a = 1, b = 3/2, f(n) = 1
Step 2: c_crit = log_{3/2}(1) = 0
Step 3: f(n) = 1 = n⁰, so c = 0
Step 4: Compare: c = 0 = c_crit → Case 2 (k = 0)
Step 5: Case 2 applies directly
Step 6: T(n) = Θ(n⁰ × log n) = Θ(log n)
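The whole procedure can be sketched as a small classifier. This is a simplified version that assumes f(n) = Θ(n^c log^k n) and leaves the Case 3 regularity check to be done by hand:

```python
import math

def master_theorem(a: float, b: float, c: float, k: int = 0):
    """Classify T(n) = a*T(n/b) + Θ(n^c * log^k n); return (case, solution)."""
    c_crit = math.log(a, b)
    if math.isclose(c, c_crit):
        return 2, f"Θ(n^{c_crit:.4g} * log^{k + 1} n)"
    if c < c_crit:
        return 1, f"Θ(n^{c_crit:.4g})"
    return 3, f"Θ(n^{c:.4g} * log^{k} n)"  # regularity must still be verified by hand

# Worked example 1: T(n) = 9T(n/3) + n   ->  Case 1, Θ(n^2)
print(master_theorem(9, 3, 1))
# Worked example 2: T(n) = T(2n/3) + 1   ->  Case 2 with k = 0, Θ(log n)
print(master_theorem(1, 1.5, 0))
```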
The Master Theorem is powerful but not universal. Here are the cases where it doesn't apply:
1. Unequal Subproblem Sizes
The Master Theorem requires all subproblems to have the same size n/b.
T(n) = T(n/3) + T(2n/3) + n ✗ Can't apply Master Theorem
Solution: Use the Akra-Bazzi theorem (generalization) or the recursion tree method.
This recurrence (quicksort with a guaranteed one-third/two-thirds partition split) solves to O(n log n) via other methods.
2. Non-Polynomial f(n)
The Master Theorem assumes f(n) has polynomial growth. Functions like 2ⁿ, n!, or even n/log n don't fit the standard cases.
T(n) = 2T(n/2) + 2ⁿ ✗ f(n) is exponential
T(n) = 2T(n/2) + n/log n ✗ Falls between cases
Solution: Use recursion trees or generating functions for unusual f(n).
3. Non-Constant a or b
If the number of subproblems or the division factor varies with n, the Master Theorem doesn't apply.
T(n) = T(√n) + 1 ✗ Division factor is √n, not constant
T(n) = nT(n/2) + 1 ✗ Number of subproblems varies with n
Solution: Substitution or recursion tree analysis.
Note: T(n) = T(√n) + 1 can be solved by substituting m = log n.
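A sketch of why that substitution works: repeatedly taking square roots reaches a constant in about log₂ log₂ n steps (assuming we stop once n ≤ 2):

```python
import math

def sqrt_depth(n: float) -> int:
    """Count iterations of n -> sqrt(n) until n <= 2, i.e. solve T(n) = T(sqrt(n)) + 1."""
    steps = 0
    while n > 2:
        n = math.sqrt(n)
        steps += 1
    return steps

# Substituting m = log2(n) turns T(n) = T(sqrt(n)) + 1 into S(m) = S(m/2) + 1,
# which is Θ(log m) = Θ(log log n).
for k in (16, 256, 65536):
    print(k, sqrt_depth(k), round(math.log2(math.log2(k))))
```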
4. Gap Between Cases 1 and 2, or 2 and 3
The Master Theorem has gaps when f(n) sits close to n^{c_crit} without being polynomially separated from it, e.g. f(n) = Θ(n^{c_crit} / log^k n) for k > 0: such an f(n) is smaller than n^{c_crit}, yet not O(n^c) for any c < c_crit.
T(n) = 2T(n/2) + n/log n ✗ Between Case 1 and Case 2
Solution: Extended Master Theorem variants or direct analysis.
When the Master Theorem fails, the recursion tree method almost always works. Draw the tree, compute work at each level, sum across levels. It's more work but applies universally.
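For instance, the unequal-split recurrence T(n) = T(n/3) + T(2n/3) + n from above can be evaluated directly. A sketch assuming integer sizes (rounding down) and base case T(n ≤ 1) = 1, compared against n log n:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    """Evaluate T(n) = T(n/3) + T(2n/3) + n with T(n <= 1) = 1, rounding down."""
    if n <= 1:
        return 1
    return T(n // 3) + T(2 * n // 3) + n

# Every tree level sums to at most n and the deepest path has O(log n) levels,
# so the ratio T(n) / (n * log2(n)) should stay bounded by constants.
for n in (10**3, 10**4, 10**5):
    print(n, round(T(n) / (n * math.log2(n)), 3))
```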
Here's everything you need to apply the Master Theorem in one place:
The Recurrence Form:
T(n) = aT(n/b) + f(n) where a ≥ 1, b > 1
The Critical Exponent:
c_crit = log_b(a)
The Three Cases:
| Case | Condition | Solution |
|---|---|---|
| 1 | f(n) = O(n^c), c < c_crit | T(n) = Θ(n^{c_crit}) |
| 2 | f(n) = Θ(n^{c_crit} log^k n) | T(n) = Θ(n^{c_crit} log^{k+1} n) |
| 3 | f(n) = Ω(n^c), c > c_crit AND regularity | T(n) = Θ(f(n)) |
| Recurrence | a | b | c_crit | Case | Solution |
|---|---|---|---|---|---|
| T(n/2) + 1 | 1 | 2 | 0 | 2 | O(log n) |
| T(n/2) + n | 1 | 2 | 0 | 3 | O(n) |
| 2T(n/2) + 1 | 2 | 2 | 1 | 1 | O(n) |
| 2T(n/2) + n | 2 | 2 | 1 | 2 | O(n log n) |
| 2T(n/2) + n² | 2 | 2 | 1 | 3 | O(n²) |
| 3T(n/2) + n | 3 | 2 | 1.58 | 1 | O(n^{1.58}) |
| 7T(n/2) + n² | 7 | 2 | 2.81 | 1 | O(n^{2.81}) |
| 4T(n/2) + n² | 4 | 2 | 2 | 2 | O(n² log n) |
What's Next:
You now have the formal machinery to solve D&C recurrences. The next page develops intuition—understanding why these results are true without formal proofs, so you can reason about recurrences even when you don't remember the exact formulas.
You've mastered the Master Theorem, the most powerful tool for analyzing divide-and-conquer complexity. You can now systematically solve any recurrence of the form T(n) = aT(n/b) + f(n) by identifying which case applies.