You now know what recursion trees are and how to draw them. But the real power of recursion trees isn't just visualization—it's rigorous complexity analysis.
Recursion trees provide a systematic method to derive time complexity without memorizing formulas or relying on intuition alone. By analyzing the structure of the tree—its height, the number of nodes at each level, and the work done at each node—we can derive tight asymptotic bounds.
This technique is particularly valuable when a recurrence doesn't match a pattern you've memorized, when the Master Theorem doesn't directly apply, or when you want to see exactly where an algorithm's cost concentrates.
By the end of this page, you will master the technique of deriving time complexity from recursion trees. You'll learn to calculate work per level, sum across all levels, and recognize when geometric series arise. These skills transform recursion trees from informal sketches into rigorous analytical tools.
Time complexity analysis with recursion trees rests on one simple principle:
Total Time = Sum of Work Done at Every Node
But summing over every individual node is tedious. We can organize this more cleverly:
Total Time = Sum of (Work at Level 0 + Work at Level 1 + ... + Work at Level h)
Where h is the tree's height. If we can express the work at level k as a formula W(k), the total time is:
T(n) = Σ W(k) for k = 0 to h
```
Total Time = Work at all nodes
           = (Work at Level 0) + (Work at Level 1) + ... + (Work at Level h)

To find Work at Level k:
  1. Count the number of nodes at level k
  2. Determine the work each node at level k does
  3. Multiply: Work at Level k = (# nodes at k) × (work per node at k)

Then sum across all levels:
  T(n) = Σ [Nodes(k) × WorkPerNode(k)] for k = 0 to height
```

Grouping by levels is powerful because nodes at the same level often have similar properties—same problem size, same amount of work, same branching pattern. This regularity turns chaotic node-by-node counting into clean mathematical expressions, often geometric series with closed-form sums.
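To make the recipe concrete, here is a minimal sketch of the level sum in Python. The `level_sum` helper and the two lambdas are hypothetical names of our own, not part of any library; they model a merge-sort-shaped tree (2ᵏ nodes at level k, each doing n/2ᵏ work) purely as a numeric sanity check.

```python
import math

# Minimal sketch of the level-sum recipe: T(n) = sum over levels of
# Nodes(k) x WorkPerNode(k). `nodes_at` and `work_per_node` are
# hypothetical stand-ins you would define per algorithm.
def level_sum(n, nodes_at, work_per_node, height):
    return sum(nodes_at(k) * work_per_node(n, k) for k in range(height + 1))

n = 1024
height = int(math.log2(n))  # a halving tree has log2(n) levels
total = level_sum(
    n,
    nodes_at=lambda k: 2 ** k,              # 2^k nodes at level k
    work_per_node=lambda n, k: n / 2 ** k,  # each node handles n/2^k items
    height=height,
)
print(total)   # 11264.0 = 1024 * 11 = n * (log2(n) + 1)
```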
The first step in level-sum analysis is counting how many nodes exist at each level. This depends on the branching factor—how many recursive calls each invocation makes.
For a constant branching factor b, level k contains bᵏ nodes.
This assumes a complete tree where every internal node has exactly b children. Real trees may be incomplete, but this provides upper bounds.
| Branching Pattern | Nodes at Level k | Example |
|---|---|---|
| Linear (b = 1) | 1 (constant) | Factorial, linear search |
| Binary (b = 2) | 2ᵏ | Fibonacci, binary recursion |
| Ternary (b = 3) | 3ᵏ | Some divide-and-conquer |
| n-way (b = n) | nᵏ (or decreases with depth) | Permutations, exhaustive search |
```
FIBONACCI (b = 2):
Level 0: 1 node  → 2⁰ = 1
Level 1: 2 nodes → 2¹ = 2
Level 2: 4 nodes → 2² = 4
Level 3: 8 nodes → 2³ = 8
Level k: 2ᵏ nodes

Total nodes ≈ 2⁰ + 2¹ + 2² + ... + 2ⁿ = 2ⁿ⁺¹ - 1 = O(2ⁿ)

BINARY SEARCH (b = 1, halving):
Level 0: 1 node
Level 1: 1 node
Level 2: 1 node
...
Level k: 1 node

Total nodes = height = O(log n)

MERGE SORT (b = 2, balanced):
Level 0: 1 node  (1 problem of size n)
Level 1: 2 nodes (2 problems of size n/2)
Level 2: 4 nodes (4 problems of size n/4)
...
Level log(n): n nodes (n problems of size 1)

Total nodes = 1 + 2 + 4 + ... + n = 2n - 1 = O(n)
```

Real Fibonacci trees aren't perfectly complete—the left subtree is deeper than the right. But for Big-O analysis, using 2ᵏ as an upper bound at each level still gives the correct O(2ⁿ) result. In complexity analysis, upper bounds are often sufficient.
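If you want to verify the Fibonacci node counts empirically, a small sketch like the following (the `count_levels` helper is our own, recursing exactly like the naive two-branch Fibonacci) tallies the real number of nodes on each level and compares it to the 2ᵏ bound:

```python
from collections import Counter

# Tally how many calls land on each level of the naive Fibonacci tree,
# recursing exactly like fib(n) = fib(n-1) + fib(n-2).
def count_levels(n, depth=0, counts=None):
    counts = Counter() if counts is None else counts
    counts[depth] += 1     # this call is one node at `depth`
    if n > 1:
        count_levels(n - 1, depth + 1, counts)
        count_levels(n - 2, depth + 1, counts)
    return counts

for level, count in sorted(count_levels(10).items()):
    print(f"level {level}: {count} nodes (bound: 2^{level} = {2 ** level})")
```

The counts fall below 2ᵏ at deeper levels (right subtrees bottom out early), which is exactly why 2ᵏ is an upper bound rather than an exact count.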
Once we know how many nodes exist at each level, we need to determine how much work each node does. The key insight: as we go deeper in the tree, problem sizes typically shrink, which affects work per node.
Common patterns:
Constant work per node (Fibonacci, factorial): Each call does O(1) work besides recursing. At level k with 2ᵏ nodes, total level work = O(2ᵏ).
Work proportional to problem size (merge sort): At level k, each subproblem has size n/2ᵏ. With 2ᵏ subproblems, total level work = 2ᵏ × O(n/2ᵏ) = O(n).
Work depends on path from root (varies): Some algorithms do more or less work depending on the choices made earlier.
```
FIBONACCI:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Work per node: O(1) — just addition and base case check
Node count at level k: ≤ 2ᵏ
Work at level k: 2ᵏ × O(1) = O(2ᵏ)

Level 0: O(1)
Level 1: O(2)
Level 2: O(4)
Level 3: O(8)
...
Level n: O(2ⁿ)

This is a geometric series increasing by factor 2.
Total = O(1 + 2 + 4 + ... + 2ⁿ) = O(2ⁿ⁺¹) = O(2ⁿ)

MERGE SORT:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Work per node: O(size of subproblem) for merging
At level k: each subproblem has size n/2ᵏ
Node count at level k: 2ᵏ
Work at level k: 2ᵏ × O(n/2ᵏ) = O(n)

Level 0: 1 × O(n)   = O(n)
Level 1: 2 × O(n/2) = O(n)
Level 2: 4 × O(n/4) = O(n)
...
Level log(n): n × O(1) = O(n)

Every level does O(n) work!
Total = O(n) × log(n) levels = O(n log n)

BINARY SEARCH:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Work per node: O(1) — just one comparison
Node count at every level: 1
Work at every level: 1 × O(1) = O(1)

Total = O(1) × log(n) levels = O(log n)
```

Merge sort beautifully illustrates how the number of nodes and work per node can perfectly balance. As we go deeper, nodes double but work per node halves, keeping total work per level constant. This O(n) per level across O(log n) levels is the secret to O(n log n).
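To see the merge sort cancellation numerically, here is a quick check in plain Python, assuming nothing beyond the 2ᵏ × n/2ᵏ structure described above:

```python
# Each level k of merge sort has 2^k subproblems of size n/2^k.
# Their product, the total work on that level, is always n.
n = 64
levels = 6                    # log2(64) = 6
for k in range(levels + 1):
    nodes = 2 ** k            # node count doubles each level
    size = n // 2 ** k        # subproblem size halves each level
    print(f"level {k}: {nodes} x {size} = {nodes * size}")
# Every line ends in "= 64": constant O(n) work per level.
```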
After calculating work per level, we sum across all levels. This often produces geometric series, which have well-known closed forms.
Geometric Series Review:
A geometric series has the form: a + ar + ar² + ar³ + ... + arⁿ
Where a is the first term and r is the common ratio.
Sum formula: S = a × (rⁿ⁺¹ - 1) / (r - 1), valid for r ≠ 1
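A one-off numeric check of the closed form (the values of a, r, and n are arbitrary, chosen only for illustration):

```python
# Verify S = a * (r^(n+1) - 1) / (r - 1) against a direct sum (r != 1).
a, r, n = 3, 2, 10
direct = sum(a * r ** k for k in range(n + 1))   # a + ar + ... + ar^n
closed = a * (r ** (n + 1) - 1) // (r - 1)
print(direct, closed)                            # both print 6141
```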
Key insight: The sum's behavior depends on whether r < 1, r = 1, or r > 1.
| Ratio (r) | Series Behavior | Big-O Result | Example |
|---|---|---|---|
| r < 1 | Decreasing — dominated by first term | O(first term) | Decreasing work per level |
| r = 1 | Constant — all terms equal | O(# terms × each term) | Merge sort (constant O(n) per level) |
| r > 1 | Increasing — dominated by last term | O(last term) | Fibonacci (doubling each level) |
```
CASE 1: Work INCREASES each level (r > 1) — Fibonacci
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Work at level k: O(2ᵏ)
Sum: 1 + 2 + 4 + 8 + ... + 2ⁿ = 2ⁿ⁺¹ - 1

r = 2 > 1, so the sum is dominated by the LAST term.
Total = O(2ⁿ) — the bottom level dominates

CASE 2: Work CONSTANT each level (r = 1) — Merge Sort
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Work at level k: O(n)
Sum: n + n + n + ... + n (log n times) = n × log n

r = 1, so every term contributes equally.
Total = O(n log n) — all levels contribute

CASE 3: Work DECREASES each level (r < 1) — Rare but exists
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Suppose work at level k: n/2ᵏ (with constant nodes per level)
Sum: n + n/2 + n/4 + n/8 + ...

r = 1/2 < 1, so the sum is dominated by the FIRST term.
Total = O(n) — the root dominates

Example: Certain tree algorithms where most work happens at the root
```

In geometric series with r ≠ 1, one end dominates. If r > 1, the last term (deepest level) dominates—characteristic of exponential algorithms. If r < 1, the first term (root) dominates. This principle lets you quickly estimate complexity by identifying the dominant level.
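The "one end dominates" behavior is easy to observe numerically. This sketch sums twenty terms of a·rᵏ with a = 1 and reports how much of the total the single largest term accounts for:

```python
# For r != 1 the largest term carries a constant fraction of the sum,
# so the whole series is O(that term); for r = 1 no term stands out.
for r, label in [(0.5, "r < 1 (root dominates)"),
                 (1.0, "r = 1 (all levels equal)"),
                 (2.0, "r > 1 (deepest level dominates)")]:
    terms = [r ** k for k in range(20)]
    share = max(terms) / sum(terms)
    print(f"{label}: largest term = {share:.0%} of the total")
```

For r = 2 the last term alone is about half the sum (the total is at most twice the last term); for r = 1/2 the first term is; and for r = 1 all twenty terms contribute equally. These are exactly the three Big-O cases in the table above.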
Let's work through a complete complexity derivation for the naive Fibonacci algorithm using the recursion tree method.
```python
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)
```

```
STEP 1: Draw/analyze the tree structure
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
                 fib(n)
                /      \
          fib(n-1)    fib(n-2)
          /     \      /     \
      (n-2)  (n-3)  (n-3)  (n-4)
       ...    ...    ...    ...

Branching factor: 2 (each call makes 2 recursive calls)
Tree height: n (the left branch goes n → n-1 → n-2 → ... → 1 → 0)
Note: Right branch is shorter, but we use n as upper bound

STEP 2: Count nodes per level (upper bound)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
If every internal node has 2 children (upper bound):
• Level 0: 1 node = 2⁰
• Level 1: 2 nodes = 2¹
• Level 2: 4 nodes = 2²
• Level k: 2ᵏ nodes
• Level n: 2ⁿ nodes (at most)

STEP 3: Determine work per node
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Each call does:
  - One or two comparisons: O(1)
  - One addition (for non-base cases): O(1)
Total work per node: O(1)

STEP 4: Calculate work per level
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Work at level k = (nodes at level k) × (work per node)
                = 2ᵏ × O(1)
                = O(2ᵏ)

STEP 5: Sum across all levels
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Total = Σ O(2ᵏ) for k = 0 to n
      = O(2⁰ + 2¹ + 2² + ... + 2ⁿ)
      = O(2ⁿ⁺¹ - 1)
      = O(2ⁿ)

STEP 6: Conclusion
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
T(n) = O(2ⁿ)

The naive Fibonacci is exponential because:
• The tree branches by factor 2
• Tree height is n (linear in input)
• Geometric series with r = 2 > 1 is dominated by deepest level
```

The true bound for Fibonacci is O(φⁿ) where φ ≈ 1.618 (golden ratio), because the right subtree is always one level shorter. But O(2ⁿ) is a valid upper bound and sufficient for recognizing the algorithm is exponentially slow.
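You can watch the exponential growth directly by counting calls. This sketch uses a `calls` helper of our own that mirrors the recursion of `fib` and compares the actual call count with the φⁿ and 2ⁿ bounds:

```python
# Number of nodes in the fib(n) recursion tree: one for this call plus
# the nodes of the two subtrees.
def calls(n):
    return 1 if n <= 1 else 1 + calls(n - 1) + calls(n - 2)

phi = (1 + 5 ** 0.5) / 2
for n in (10, 15, 20):
    print(n, calls(n), round(phi ** n), 2 ** n)
# The call count grows like phi^n and stays below 2^n, as derived above.
```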
Now let's analyze merge sort, which represents the important class of balanced divide-and-conquer algorithms.
```python
def mergeSort(arr, l, r):
    if l >= r:
        return
    m = (l + r) // 2
    mergeSort(arr, l, m)      # Sort left half
    mergeSort(arr, m + 1, r)  # Sort right half
    merge(arr, l, m, r)       # Merge: O(r - l + 1)
```

```
STEP 1: Draw/analyze the tree structure
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
              sort(0, n-1)                [problem size n]
             /            \
   sort(0, n/2-1)    sort(n/2, n-1)       [size n/2 each]
      /      \          /      \
    ...      ...      ...      ...        [size n/4 each]
     /   |   |  ...  |   \
  [1][1][1]  ...  [1]                     [n problems of size 1]

Branching factor: 2
Tree is perfectly balanced (always splits in half)
Tree height: log₂(n) — each level halves problem size

STEP 2: Count nodes per level
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
• Level 0: 1 node = 2⁰
• Level 1: 2 nodes = 2¹
• Level 2: 4 nodes = 2²
• Level k: 2ᵏ nodes
• Level log₂(n): n nodes = 2^(log n)

STEP 3: Determine work per node
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
The merge operation takes O(size of subproblem):
• At level k, each subproblem has size n/2ᵏ
• Work at a single level-k node: O(n/2ᵏ)

STEP 4: Calculate work per level
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Work at level k = (nodes at level k) × (work per node at level k)
                = 2ᵏ × O(n/2ᵏ)
                = O(n)

CRITICAL INSIGHT: The 2ᵏ and n/2ᵏ CANCEL OUT!
Every level does exactly O(n) work.

STEP 5: Sum across all levels
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Total = Σ O(n) for k = 0 to log₂(n)
      = O(n) + O(n) + O(n) + ... [log(n) + 1 times]
      = O(n) × (log n + 1)
      = O(n log n)

STEP 6: Conclusion
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
T(n) = O(n log n)

Merge sort achieves this because:
• The branching (×2) perfectly cancels with problem shrinking (÷2)
• This keeps per-level work constant at O(n)
• With log(n) levels, total = O(n log n)
```

Merge sort exemplifies perfect balance: doubling nodes × halving work per node = constant work per level. This O(n) per level pattern is the signature of optimal divide-and-conquer algorithms. When you see this balance in a recursion tree, you know the complexity is O(n × height) = O(n log n).
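A short instrumented variant makes the O(n)-per-level claim observable. This standalone merge sort returns a new list rather than sorting in place, so it differs slightly from the version above; the `work` dictionary is our instrumentation, not part of the algorithm:

```python
from collections import defaultdict

work = defaultdict(int)   # level -> elements touched by merges there

def merge_sort(arr, depth=0):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid], depth + 1)
    right = merge_sort(arr[mid:], depth + 1)
    work[depth] += len(arr)            # merging touches every element once
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

merge_sort(list(range(16, 0, -1)))     # sort 16 elements
print(dict(sorted(work.items())))      # {0: 16, 1: 16, 2: 16, 3: 16}
```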
The Master Theorem is a formula for solving divide-and-conquer recurrences of the form:
T(n) = a·T(n/b) + f(n)
Where a is the number of recursive subproblems per call, b is the factor by which the problem size shrinks (each subproblem has size n/b), and f(n) is the non-recursive work each call does to divide and combine.
The Master Theorem is essentially pre-computed recursion tree analysis. Understanding how recursion trees work lets you see why the Master Theorem has its three cases.
```
For recurrence T(n) = a·T(n/b) + f(n):

Tree structure:
• Level 0: 1 node, work = f(n)
• Level 1: a nodes, each does f(n/b) work
• Level 2: a² nodes, each does f(n/b²) work
• Level k: aᵏ nodes, each does f(n/bᵏ) work
• Height: log_b(n) levels

Work at level k:
  = aᵏ × f(n/bᵏ)

Number of leaves (bottom level):
  = a^(log_b n) = n^(log_b a)

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

THE THREE CASES correspond to which level dominates:

CASE 1: Leaves dominate (exponentially more work at bottom)
  When f(n) grows slower than n^(log_b a)
  Total = O(n^(log_b a)) = O(number of leaves)

  Example: T(n) = 8T(n/2) + n
    a=8, b=2, n^(log₂ 8) = n³, f(n)=n
    n grows slower than n³, so leaves dominate: O(n³)

CASE 2: All levels equal (perfect balance)
  When f(n) = Θ(n^(log_b a))
  Total = O(n^(log_b a) × log n)

  Example: T(n) = 2T(n/2) + n  (Merge sort!)
    a=2, b=2, n^(log₂ 2) = n, f(n)=n
    f(n) = n^(log_b a), perfect balance: O(n log n)

CASE 3: Root dominates (most work at top)
  When f(n) grows faster than n^(log_b a)
  Total = O(f(n))

  Example: T(n) = 2T(n/2) + n²
    a=2, b=2, n^(log₂ 2) = n, f(n)=n²
    n² grows faster than n, so root dominates: O(n²)
```

The Master Theorem's three cases correspond exactly to our geometric series analysis: increasing (leaves win), constant (all contribute), decreasing (root wins). If you understand recursion trees, you understand the Master Theorem—not as a formula to memorize, but as a consequence of tree structure.
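For recurrences where f(n) is a simple polynomial nᵈ, the case selection fits in a few lines. The `master_case` function below is a sketch under that restriction; it ignores the theorem's regularity condition and polylog refinements:

```python
import math

def master_case(a, b, d):
    """Classify T(n) = a*T(n/b) + n^d against the critical exponent."""
    crit = math.log(a, b)   # log_b(a): exponent of the leaf count
    if math.isclose(d, crit):
        return f"Case 2 (balanced): O(n^{d:g} log n)"
    if d < crit:
        return f"Case 1 (leaves dominate): O(n^{crit:g})"
    return f"Case 3 (root dominates): O(n^{d:g})"

print(master_case(8, 2, 1))   # T(n) = 8T(n/2) + n   -> Case 1: O(n^3)
print(master_case(2, 2, 1))   # T(n) = 2T(n/2) + n   -> Case 2: O(n^1 log n)
print(master_case(2, 2, 2))   # T(n) = 2T(n/2) + n^2 -> Case 3: O(n^2)
```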
Recursion trees also reveal space complexity. The key insight: the call stack at any moment contains exactly one path from root to current node.
Space complexity = Maximum stack depth = Tree height
This follows because each recursive call pushes one frame onto the stack, frames pop as soon as their call returns, and at any moment the live frames form exactly one root-to-current-node path. The deepest such path is the tree's height.
| Algorithm | Tree Height | Space Complexity | Notes |
|---|---|---|---|
| Fibonacci | O(n) | O(n) | Left branch goes n → 0 |
| Merge Sort | O(log n) | O(log n) + O(n) | Log n for stack, O(n) for merge |
| Binary Search | O(log n) | O(log n) | Halves each call |
| Factorial | O(n) | O(n) | n frames on stack |
| Quick Sort (avg) | O(log n) | O(log n) | Balanced partitions |
| Quick Sort (worst) | O(n) | O(n) | Degenerate partitions |
```
FIBONACCI SPACE ANALYSIS:

                fib(5)
               /      \
           fib(4)    fib(3)
           /    \       ...
       fib(3)  fib(2)
       /    \     ...
   fib(2)  fib(1)
   /    \
fib(1)  fib(0)   ← deepest level

When the leftmost path reaches fib(1):
Stack: [fib(5), fib(4), fib(3), fib(2), fib(1)]
Depth: 5 frames = O(n)

Note: fib(3) on the right branch is NOT on the stack yet.
We complete the left side before starting the right side.
The stack never contains both branches simultaneously.

MERGE SORT SPACE ANALYSIS:

However, merge sort has an additional space consideration:
• Stack depth: O(log n) for recursion
• Auxiliary space: O(n) for the merge step's temporary array

Total space: O(n), dominated by merge auxiliary array
```

Don't confuse space with time. Fibonacci has O(2ⁿ) TIME complexity (total nodes) but only O(n) SPACE complexity (tree height). The tree is wide but the stack is deep only along one path at a time. This distinction matters for understanding memory vs time tradeoffs.
Strengthen your skills by analyzing these recursive functions using the tree method. For each, draw the tree (at least 3 levels), count nodes per level, calculate work per level, and derive the total complexity.
```python
# Problem 1: What's the complexity?
def mystery1(n):
    if n <= 1:
        return 1
    return mystery1(n // 2) + mystery1(n // 2)

# Problem 2: What's the complexity?
def mystery2(n):
    if n <= 1:
        return 1
    for i in range(n):   # O(n) work at each node
        print(i)
    return mystery2(n - 1)

# Problem 3: What's the complexity?
def mystery3(n):
    if n <= 1:
        return 1
    return mystery3(n - 1) + mystery3(n - 1) + mystery3(n - 1)

# Problem 4: What's the complexity?
def mystery4(n):
    if n <= 1:
        return 1
    for i in range(n):   # O(n) work
        print(i)
    return mystery4(n // 2) + mystery4(n // 2)
```

Solutions: Problem 1: O(n) — the tree has O(log n) levels but n leaves, and the work at the leaves dominates. Problem 2: O(n²) — a linear tree of height n where each node does O(k) work, k being the remaining size; the sum n + (n-1) + ... + 1 = O(n²). Problem 3: O(3ⁿ) — a ternary tree with height n and 3ⁿ nodes. Problem 4: O(n log n) — like merge sort, each level does O(n) total work and there are O(log n) levels.
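If you want to check your answers empirically, one option is a small call-counting wrapper. The `counted` decorator below is a hypothetical helper of our own, shown here on mystery3:

```python
def counted(f):
    """Wrap f so that every call increments wrapper.calls."""
    def wrapper(n):
        wrapper.calls += 1
        return f(n)
    wrapper.calls = 0
    return wrapper

@counted
def mystery3(n):
    if n <= 1:
        return 1
    return mystery3(n - 1) + mystery3(n - 1) + mystery3(n - 1)

for n in (4, 8, 12):
    mystery3.calls = 0
    mystery3(n)
    print(n, mystery3.calls, 3 ** n)   # the call count tracks 3^n growth
```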
You now have a powerful, principled method for deriving complexity from recursion trees. Let's consolidate:

- Count the nodes at each level (a constant branching factor b gives bᵏ).
- Determine the work per node, which usually shrinks as problem sizes shrink.
- Multiply and sum across levels; the sums are often geometric series.
- Classify the ratio: r > 1 means the deepest level dominates, r = 1 means every level contributes, r < 1 means the root dominates.
- Read space complexity off the tree height, not off the total node count.
What's next:
We've analyzed time and space complexity using recursion trees. But trees reveal something else: redundancy. When the same subproblem appears multiple times in a tree, we're computing the same thing repeatedly. The next page explores how to spot and quantify this redundancy—the key insight that leads to memoization and dynamic programming.
You can now derive time and space complexity from recursion trees. You understand how to count nodes per level, calculate work per level, sum across levels using geometric series, and connect tree analysis to the Master Theorem. These skills transform complexity analysis from memorization to derivation.