Understanding what a recursion tree is constitutes only half the battle. The real skill lies in drawing accurate recursion trees from any recursive function you encounter.
Many students struggle with this step. They understand the concept in the abstract but freeze when faced with actual code. "Where do I start? How many levels do I draw? What labels go on each node?" These questions often derail analysis before it begins.
This page eliminates that uncertainty. We'll develop a systematic, repeatable methodology for transforming any recursive algorithm into a clear, analyzable recursion tree. By the end, you'll have a step-by-step process you can apply to any recursive code—whether it's a simple factorial or a complex backtracking algorithm.
Specifically, you'll learn how to identify the recursive structure from code, choose appropriate node labels, expand the tree systematically, and apply these techniques to diverse recursive patterns.
Drawing a recursion tree is a structured process. Follow these five steps, and you'll produce an accurate tree for any recursive function:

1. Identify the parameters (they become your node labels).
2. Find the base cases (they become the leaf nodes).
3. Identify the recursive calls (they determine the branching).
4. Determine the work per node (the non-recursive work, needed for complexity analysis).
5. Expand the tree level by level until every leaf is a base case.
Let's walk through each step in detail, then apply the complete method to several examples.
At first, these steps will feel mechanical. With practice, they become automatic—you'll glance at recursive code and immediately visualize its tree structure. The explicit methodology is training wheels; eventually, you'll internalize it.
The first step in drawing a recursion tree is understanding what parameters the recursive function takes. These parameters become the labels for your tree nodes.
Why parameters matter:
Each node in a recursion tree represents a specific function call with specific argument values. Two nodes with different parameters represent different computations—even if they use the same function. Two nodes with identical parameters represent the same computation (potential redundancy).
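To make that redundancy concrete, here's a small sketch (an illustration, not part of the methodology itself) that counts how often each parameter value appears as a node label in fibonacci(5)'s recursion tree:

```python
from collections import Counter

calls = Counter()  # node label -> how many tree nodes carry it

def fib(n):
    calls[n] += 1              # record this node's parameter
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

fib(5)
print(dict(calls))  # fib(2) appears 3 times: three identical subtrees
```

The three nodes labeled fib(2) each rebuild the exact same subtree, which is precisely the "potential redundancy" that identical labels signal.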
Common parameter patterns:
- Single size parameter: f(5), f(3), f(1)
- Two range indices: f(0,7), f(0,3), f(4,7)
- Multiple changing parameters: C(5,2), LCS(3,4)
- Array replaced by indices: solve([1,2,3,4]) → solve(0,3) where the indices indicate the subarray
```python
# Single parameter — label nodes as f(n)
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)
# Tree labels: factorial(5), factorial(4), ..., factorial(1)

# Two parameters — label nodes as f(i, j)
def binarySearch(arr, low, high, target):
    if low > high:
        return -1
    mid = (low + high) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binarySearch(arr, mid+1, high, target)
    return binarySearch(arr, low, mid-1, target)
# Tree labels: search(0,7), search(4,7), search(4,5), etc.

# Multiple changing parameters — label with all
def combinations(n, k):
    if k == 0 or k == n:
        return 1
    return combinations(n-1, k-1) + combinations(n-1, k)
# Tree labels: C(5,2), C(4,1), C(4,2), etc.

# Array parameter — use indices instead
def mergeSort(arr, left, right):
    if left >= right:
        return
    mid = (left + right) // 2
    mergeSort(arr, left, mid)
    mergeSort(arr, mid+1, right)
    merge(arr, left, mid, right)
# Tree labels: sort(0,7), sort(0,3), sort(4,7), etc.
```

Good node labels are concise but unambiguous. For complex functions, abbreviate the function name (fib, sort, dfs) and include only parameters that change between calls. If an array is passed but only indices change, label with indices. The goal is readable diagrams that clearly distinguish different subproblems.
Base cases are the termination conditions—the scenarios where the function returns without making further recursive calls. In your recursion tree, base cases become leaf nodes.
Why base cases matter for trees:
The base cases determine when to stop expanding a branch. Without correctly identifying base cases, you might:

- expand a branch forever, never reaching leaves;
- stop expanding too early, undercounting nodes and work;
- mislabel internal nodes as leaves, skewing the analysis.
How to identify base cases:
Look for conditional returns that don't involve recursive calls. These typically appear at the beginning of the function.
```python
# Single base case
def factorial(n):
    if n <= 1:       # BASE CASE: n is 0 or 1
        return 1     # Returns directly, no recursion
    return n * factorial(n - 1)

# Multiple base cases
def fibonacci(n):
    if n == 0:       # BASE CASE 1: n is 0
        return 0
    if n == 1:       # BASE CASE 2: n is 1
        return 1
    return fibonacci(n - 1) + fibonacci(n - 2)

# Compound base case
def binarySearch(arr, low, high, target):
    if low > high:             # BASE CASE 1: empty range
        return -1
    mid = (low + high) // 2
    if arr[mid] == target:     # BASE CASE 2: found target
        return mid
    # ... recursive calls

# Implicit base case through condition
def subsets(nums, index, path, result):
    if index == len(nums):     # BASE CASE: processed all elements
        result.append(path[:])
        return
    # ... recursive calls
```

| Pattern | Example Condition | Typical Use |
|---|---|---|
| Size becomes trivial | if n <= 1 | Factorial, Fibonacci, tree problems |
| Range becomes empty | if low > high | Binary search, divide-and-conquer |
| Index reaches bound | if index == len(arr) | Subset generation, traversal |
| Target found | if arr[mid] == target | Search algorithms |
| Remaining moves depleted | if moves == 0 | Game algorithms, path counting |
| Null pointer reached | if node is None | Tree traversals, linked lists |
Some functions have implicit base cases hidden in loop conditions or wrapped inside other logic. A for loop that iterates 0 times is effectively a base case. Always trace what happens when the function receives its smallest valid input—that reveals the true base case behavior.
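To see an implicit base case in action, here's a subset generator (a hedged variant of the subsets function shown earlier, simplified for illustration) whose recursion stops only because the loop's range becomes empty:

```python
def subsets(nums, start, path, result):
    result.append(path[:])     # record the subset built so far
    # No explicit base case: when start == len(nums),
    # range(start, len(nums)) is empty, the loop runs zero
    # times, and the call simply returns — an implicit leaf.
    for i in range(start, len(nums)):
        subsets(nums, i + 1, path + [nums[i]], result)

out = []
subsets([1, 2, 3], 0, [], out)
print(len(out))  # 8 subsets = 2^3, one per tree node
```

Tracing the smallest input (start equal to len(nums)) reveals the leaf behavior, exactly as the tip above suggests.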
The recursive calls determine the branching structure of your tree. For each non-base-case node, you need to know:

- how many recursive calls the function makes, and
- how the arguments are transformed in each call.
The number of recursive calls determines how many children each internal node has. The argument transformations determine the labels on those child nodes.
```python
# ONE recursive call → linear tree (chain)
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)   # ONE call: f(n-1)
# f(5) → f(4) → f(3) → f(2) → f(1)
# Chain structure: each node has exactly 1 child

# TWO recursive calls → binary tree
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)   # TWO calls
#          f(5)
#         /    \
#      f(4)    f(3)
#     /    \   /   \
#   ...   ... ...  ...
# Binary structure: each internal node has 2 children

# TWO calls with halving → balanced binary tree
def mergeSort(arr, l, r):
    if l >= r:
        return
    m = (l + r) // 2
    mergeSort(arr, l, m)      # Call 1: left half
    mergeSort(arr, m+1, r)    # Call 2: right half
    merge(arr, l, m, r)
#            sort(0,7)
#           /         \
#     sort(0,3)     sort(4,7)
#     /      \       /      \
#  (0,1)   (2,3)  (4,5)   (6,7)

# VARIABLE number of calls → n-ary tree
def permutations(nums, path):
    if len(path) == len(nums):
        return
    for num in nums:              # LOOP of calls
        if num not in path:
            permutations(nums, path + [num])
#               perm([])
#             /    |     \
#     perm([1]) perm([2]) perm([3])
#      /    \      ...      ...
#    ...    ...
```

Pay close attention to HOW the arguments change. If f(n) calls f(n-1), you get linear depth. If f(n) calls f(n/2), you get logarithmic depth. If f(n) calls both f(n-1) and f(n-2), you get exponential nodes. The argument transformation encodes the algorithm's complexity DNA.
Each function call does some work besides making recursive calls. This "non-recursive work" must be accounted for when analyzing complexity.
Why work per node matters:
Two algorithms might have identically shaped recursion trees but vastly different complexities because of the work done at each node. The total time is:
Total Time = Sum of (Work at Each Node)
Some examples:
```python
# O(1) work per node
def fibonacci(n):
    if n <= 1:
        return n                               # O(1): comparison, return
    return fibonacci(n-1) + fibonacci(n-2)     # O(1): addition
# Work: Each node does constant work
# If tree has O(2^n) nodes, total = O(2^n)

# O(n) work per node
def mergeSort(arr, l, r):
    if l >= r:
        return                # O(1)
    m = (l + r) // 2          # O(1)
    mergeSort(arr, l, m)
    mergeSort(arr, m+1, r)
    merge(arr, l, m, r)       # O(r - l) = O(n) at root
# Work: Root does O(n), next level each does O(n/2), etc.
# Tree has O(n) nodes, but work per level is O(n)
# Total = O(n log n)

# O(k) work per node (variable)
def subsets(nums, index, path, result):
    result.append(path[:])    # O(len(path)) = O(k) to copy
    for i in range(index, len(nums)):
        subsets(nums, i + 1, path + [nums[i]], result)
# Work: Copying path takes O(k) where k is path length
# Need to sum over all nodes with varying k

# O(1) work at leaves, O(n) at internal nodes
def quickSort(arr, l, r):
    if l >= r:
        return                    # O(1) at leaves
    p = partition(arr, l, r)      # O(r - l) = O(n) at root
    quickSort(arr, l, p - 1)
    quickSort(arr, p + 1, r)
# Must account for different work at different levels
```

| Algorithm | Work Per Node | Tree Structure | Total Complexity |
|---|---|---|---|
| Fibonacci (naive) | O(1) | O(2ⁿ) nodes | O(2ⁿ) |
| Factorial | O(1) | O(n) nodes | O(n) |
| Merge Sort | O(node's range size) | O(n) nodes, O(n) per level | O(n log n) |
| Binary Search | O(1) | O(log n) nodes | O(log n) |
| Subset Sum (brute) | O(1) | O(2ⁿ) nodes | O(2ⁿ) |
| Tree Traversal | O(1) | O(n) nodes | O(n) |
Merge sort is a classic example where counting nodes isn't enough. The tree has O(n) nodes, but if we counted each as O(1), we'd get O(n)—wrong! The merge step at each node takes time proportional to the range size. The tree structure reveals that each LEVEL does O(n) total work, and there are O(log n) levels, giving O(n log n).
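This level-wise accounting can be checked empirically. The sketch below (using sorted() as a stand-in for a real linear-time merge, an assumption made for brevity) tallies how many elements the merges at each depth touch:

```python
from collections import defaultdict

level_work = defaultdict(int)   # depth -> elements merged at that depth

def merge_sort(arr, l, r, depth=0):
    if l >= r:
        return                              # leaf: no merge work
    m = (l + r) // 2
    merge_sort(arr, l, m, depth + 1)
    merge_sort(arr, m + 1, r, depth + 1)
    level_work[depth] += r - l + 1          # merge scans the whole range
    arr[l:r + 1] = sorted(arr[l:r + 1])     # stand-in for the real merge()

data = [5, 3, 8, 1, 9, 2, 7, 4]
merge_sort(data, 0, len(data) - 1)
print(dict(level_work))  # {0: 8, 1: 8, 2: 8}: every level totals n
```

Each of the log n internal levels sums to n elements of merge work, which is the O(n log n) argument in miniature.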
Now we're ready to draw. The key is to expand level by level, ensuring you handle each node before moving to the next level.
The expansion procedure:

1. Draw the root node, labeled with the initial call's arguments.
2. Check each unexpanded node: if its arguments satisfy a base case, mark it as a leaf.
3. Otherwise, add one child per recursive call, labeled with the transformed arguments.
4. Repeat for every node at the current level before moving to the next level.
5. Stop when every branch ends in a base-case leaf.
Practical tips for drawing:

- Use consistent notation (always f(n), not sometimes f(n) and sometimes just n).
- For large or repetitive subtrees, use ... to indicate continuation rather than drawing everything.

Worked example: Building a recursion tree for fibonacci(5) step by step:
```text
STEP 1: Root node

    fib(5)

STEP 2: Expand fib(5)
- Is n=5 a base case? No (5 > 1)
- Recursive calls: fib(4) and fib(3)
- Add as children:

         fib(5)
        /      \
    fib(4)    fib(3)

STEP 3: Expand level 1 nodes
- fib(4): not base case → children: fib(3), fib(2)
- fib(3): not base case → children: fib(2), fib(1)

              fib(5)
            /        \
       fib(4)        fib(3)
      /      \      /      \
  fib(3)  fib(2) fib(2)  fib(1)
                            ↑ BASE CASE

STEP 4: Expand level 2 nodes
- fib(3): not base → children: fib(2), fib(1)
- fib(2): not base → children: fib(1), fib(0)
- fib(2): not base → children: fib(1), fib(0)
- fib(1): BASE CASE (leaf, no children)

                     fib(5)
                  /          \
             fib(4)           fib(3)
            /      \         /      \
       fib(3)    fib(2)   fib(2)   fib(1)✓
       /    \    /    \    /    \
  fib(2) fib(1)✓ fib(1)✓ fib(0)✓ fib(1)✓ fib(0)✓

STEP 5: Expand remaining non-leaves
- fib(2): not base → children: fib(1), fib(0)

                     fib(5)
                  /          \
             fib(4)           fib(3)
            /      \         /      \
       fib(3)    fib(2)   fib(2)   fib(1)✓
       /    \    /    \    /    \
  fib(2) fib(1)✓ fib(1)✓ fib(0)✓ fib(1)✓ fib(0)✓
  /    \
fib(1)✓ fib(0)✓

COMPLETE! All leaves are base cases (n ≤ 1)
Total nodes: 15
Tree height: 5 levels
```

Let's apply the complete five-step method to merge sort, a divide-and-conquer algorithm with a different tree structure than Fibonacci.
```python
def mergeSort(arr, left, right):
    if left >= right:    # Base case: single element or empty
        return
    mid = (left + right) // 2
    mergeSort(arr, left, mid)        # Sort left half
    mergeSort(arr, mid + 1, right)   # Sort right half
    merge(arr, left, mid, right)     # Merge sorted halves

# Initial call for array of 8 elements:
# mergeSort(arr, 0, 7)
```

Step 1: Identify Parameters
The function takes arr, left, and right. The array itself doesn't change (same reference throughout), so we'll label nodes with indices: sort(left, right), or abbreviated (l, r).
Step 2: Find Base Cases
if left >= right: return — when the range has 0 or 1 elements. These become leaves.
Step 3: Identify Recursive Calls
Two calls:

- mergeSort(arr, left, mid) — left half
- mergeSort(arr, mid + 1, right) — right half

This is binary recursion with halving of the problem size.
Step 4: Determine Work Per Node
The merge operation takes O(right - left + 1) = O(range size). This is linear in the subarray size.
Step 5: Expand the Tree
```text
Level 0:                     sort(0,7)                 Work: merge 8 elements = O(8)
                 ┌───────────┴───────────┐
Level 1:     sort(0,3)               sort(4,7)         Work: O(4) + O(4) = O(8)
          ┌─────┴─────┐           ┌─────┴─────┐
Level 2: sort(0,1) sort(2,3)   sort(4,5) sort(6,7)     Work: O(2) × 4 = O(8)
          │    │    │    │      │    │    │    │
Level 3: (0,0)(1,1)(2,2)(3,3) (4,4)(5,5)(6,6)(7,7)     Leaves: O(1) × 8

ANALYSIS:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
• Total levels: log₂(8) + 1 = 4 (including leaves)
• Nodes at level k: 2^k
• Work at level k: 2^k × (n/2^k) = n = O(8) for all levels!
• Total work: O(n) per level × O(log n) levels = O(n log n)

KEY INSIGHT: Unlike Fibonacci, work per LEVEL stays constant at O(n).
The tree has O(n) nodes, but the crucial observation is that
each level contributes O(n) to the total work.
```

Notice how the merge sort tree is perfectly balanced—each path from root to leaf has the same length. This is because we always divide exactly in half. The balanced structure, combined with O(n) work per level, gives the famous O(n log n) complexity. Any divide-and-conquer that halves the problem produces this balanced tree structure.
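If you'd rather let the computer draw the levels, a tiny tracer (an illustrative sketch, not part of merge sort itself) prints the same balanced tree, indenting children under their parents:

```python
def trace_sort(l, r, depth=0):
    """Print mergeSort's recursion tree without doing any sorting."""
    print("    " * depth + f"sort({l},{r})")
    if l >= r:
        return                       # base case: leaf, nothing to expand
    m = (l + r) // 2
    trace_sort(l, m, depth + 1)      # left half
    trace_sort(m + 1, r, depth + 1)  # right half

trace_sort(0, 7)  # prints all 15 nodes, depths 0 through 3
```

Printing a tree this way is a handy sanity check when your hand-drawn diagram and the code seem to disagree.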
The power function (computing xⁿ) is illuminating because small changes in the recursive formulation dramatically change the tree structure. Let's compare two approaches:
```python
def power_naive(x, n):
    if n == 0:
        return 1
    return x * power_naive(x, n-1)

# power_naive(2, 8)
# Makes n recursive calls
```

```python
def power_fast(x, n):
    if n == 0:
        return 1
    half = power_fast(x, n // 2)
    if n % 2 == 0:
        return half * half
    return half * half * x

# power_fast(2, 8)
# Makes log(n) recursive calls
```

```text
NAIVE POWER (Linear Tree):

power(2,8)
    |
power(2,7)
    |
power(2,6)
    |
power(2,5)
    |
power(2,4)
    |
power(2,3)
    |
power(2,2)
    |
power(2,1)
    |
power(2,0)   ← base case

Height: 8 (equals n)
Total nodes: 9
Work per node: O(1)
Total work: O(n)

FAST POWER (Logarithmic Tree):

power(2,8)
    |
power(2,4)
    |
power(2,2)
    |
power(2,1)
    |
power(2,0)   ← base case

Height: 4 (≈ log₂(n))
Total nodes: 5
Work per node: O(1)
Total work: O(log n)

THE DIFFERENCE:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Both are linear chains (single recursive call), but:
• Naive: decreases n by 1 each time → n steps
• Fast: halves n each time → log(n) steps

For n = 1,000,000:
• Naive: 1,000,000 calls
• Fast: 20 calls

The tree structure reveals the exponential improvement!
```

Both power functions produce linear trees (chains), but the key difference is how fast they reach the base case. The naive version takes n steps; the fast version takes log(n) steps. This shows why argument transformation matters so much—it determines tree height, which for linear trees equals total complexity.
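A quick experiment confirms the two chain lengths by counting calls directly (the mutable counter argument is an addition for illustration, not part of the original functions):

```python
def power_naive(x, n, counter):
    counter[0] += 1                    # count this tree node
    if n == 0:
        return 1
    return x * power_naive(x, n - 1, counter)

def power_fast(x, n, counter):
    counter[0] += 1                    # count this tree node
    if n == 0:
        return 1
    half = power_fast(x, n // 2, counter)
    return half * half if n % 2 == 0 else half * half * x

naive, fast = [0], [0]
power_naive(2, 64, naive)
power_fast(2, 64, fast)
print(naive[0], fast[0])  # 65 vs 8: chain heights of n+1 vs ~log2(n)+2
```

For n = 64 the fast chain visits 64, 32, 16, 8, 4, 2, 1, 0, exactly the logarithmic shrinkage the diagram shows.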
You don't always need to draw the complete recursion tree. For complexity analysis, recognizing the pattern is often enough. Here's how to work efficiently:
Recognizing common patterns:
| Tree Pattern | Recognition Signs | Typical Complexity |
|---|---|---|
| Linear chain | Single recursive call, constant decrement | O(n) nodes, O(n) total |
| Logarithmic chain | Single call, halving each time | O(log n) nodes, O(log n) total |
| Complete binary tree | Two calls, halving each | O(n) nodes, O(n log n) if O(n) work/level |
| Fibonacci-style | Two calls, subtracting by 1 and 2 | O(φⁿ) ≈ O(1.618ⁿ) nodes |
| Exponential (k calls) | k recursive calls, constant decrement | O(kⁿ) nodes |
| Subset-style | Loop of calls, excluding one element | O(2ⁿ) nodes |
| Permutation-style | Loop of calls, varying loop count | O(n!) nodes |
With practice, you'll glance at recursive code and instantly recognize its pattern: 'This is Fibonacci-style, so exponential', 'This halves the problem, so logarithmic height', 'This tries all subsets, so 2ⁿ'. Building this intuition is the payoff of drawing many trees explicitly.
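One way to build that intuition is to count tree nodes for different argument transformations. The helper below is a generic sketch: the children rule (a lambda here) is a stand-in for reading the recursive calls off real code.

```python
def count_nodes(n, children):
    """Count recursion-tree nodes, given a rule mapping n to its subproblems."""
    return 1 + sum(count_nodes(c, children) for c in children(n))

# Fibonacci-style: two calls on n-1 and n-2 -> exponential node growth
fib_nodes = count_nodes(10, lambda n: [n - 1, n - 2] if n > 1 else [])

# Logarithmic chain: one call on n // 2 -> about log2(n) nodes
chain_nodes = count_nodes(1024, lambda n: [n // 2] if n > 0 else [])

print(fib_nodes, chain_nodes)  # 177 vs 12
```

Running the Fibonacci rule with n = 5 returns 15, matching the worked example's node count, which is a useful cross-check on both the table and your hand-drawn trees.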
Drawing recursion trees is a learnable skill. Let's consolidate the key techniques:

- Label nodes with the parameters that change between calls.
- Base cases become the leaves; they tell you when to stop expanding.
- The number of recursive calls sets the branching factor; the argument transformations set the depth.
- The work done at each node, summed over the whole tree, gives the total complexity.
- Expand level by level until every branch ends in a base case.
What's next:
Now that you can draw recursion trees, we'll harness them for their primary purpose: analyzing time complexity. The next page shows how to derive precise complexity bounds directly from tree structure—turning visual diagrams into rigorous mathematical statements.
You now have a systematic methodology for drawing recursion trees from any recursive algorithm. You can identify parameters, base cases, and recursive calls, then expand the tree level by level. Practice on diverse algorithms to build fluency—the skill compounds with every tree you draw.