You've learned the mechanics of reducing 2D DP to 1D and the rolling array technique. But a critical skill remains: recognizing when these optimizations are possible.
Not every DP problem can have its space optimized. Some recurrences have dependencies that span the entire table. Others need the full table for solution reconstruction. Understanding when optimization applies — and when it doesn't — separates proficient DP practitioners from true experts.
This page builds your pattern recognition skills, giving you a systematic framework to quickly assess any DP problem's space optimization potential.
By the end of this page, you will be able to look at any DP recurrence and immediately identify: (1) whether space can be optimized, (2) what the optimal space complexity is, and (3) what you lose by applying the optimization. You'll build a mental checklist that becomes second nature with practice.
The single most important criterion for space optimization is:
The dependency window must be bounded and independent of input size.
In concrete terms: when computing dp[i][j], we should only need cells from a constant number of previous rows (or columns), not from all rows 0 through i-1.
Let's formalize this:
- **Fixed lookback of k rows:** dp[i][j] depends only on rows i-1, i-2, ..., i-k for some fixed k. Optimizable to O(k × n) space.
- **Previous row only:** dp[i][j] depends only on row i-1. Optimizable to O(n) or O(2n) space. This is the most common and powerful case.
- **Fixed lookback in 1D:** dp[i] depends only on dp[i-1] (and maybe dp[i-2], etc.). Optimizable to O(1) space with rolling variables.
- **Unbounded lookback:** dp[i][j] depends on rows 0, 1, ..., i-1 — the number of dependencies grows with i. NOT directly optimizable. Full table required.

If computing dp[n] requires looking at ALL of dp[0], dp[1], ..., dp[n-1], you cannot reduce space below O(n). The information from every previous state must be retained. This is fundamentally different from problems with fixed lookback.
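To make the fixed-lookback case concrete, here is a minimal sketch (a hypothetical `climb_stairs` helper, not from this page) showing a k = 2 recurrence reduced to two rolling variables:

```python
def climb_stairs(n: int) -> int:
    """dp[i] = dp[i-1] + dp[i-2] has a fixed lookback of k = 2,
    so two rolling variables replace the full O(n) array."""
    if n <= 2:
        return n
    prev2, prev1 = 1, 2  # dp[1], dp[2]
    for _ in range(3, n + 1):
        # Slide the window: dp[i] = dp[i-1] + dp[i-2]
        prev2, prev1 = prev1, prev2 + prev1
    return prev1
```

The same two-variable pattern generalizes to any constant k: keep the last k values and discard everything older.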
Here's a systematic checklist to quickly assess any DP problem for space optimization potential. Run through these questions in order:
1. **Dependencies:** Which cells does dp[i][j] depend on? List them all: dp[i-1][j], dp[i][j-1], dp[i-1][j-1], etc.
2. **Row span:** Does dp[i][...] depend only on row i-1? Or also on i-2? Or on rows 0 through i-1?
3. **Lookback distance:** What is the farthest row i-k referenced? This determines the number of rolling arrays needed.
4. **Same-row dependencies:** Does dp[i][j] depend on dp[i][j-1] or dp[i][j+1]? This affects iteration order but doesn't prevent optimization.
5. **Reconstruction:** Do you need only the optimal value, or the actual solution? Reconstruction usually requires the full table.
```python
# Applying the checklist to common DP problems

# 1. 0/1 Knapsack
# Recurrence: dp[i][w] = max(dp[i-1][w], dp[i-1][w-wt[i]] + val[i])
# Row dependencies: Only row i-1 ✓
# Lookback distance: k = 1 ✓
# Same-row dependencies: None ✓
# Reconstruction needed: Usually no (just max value)
# VERDICT: Optimizable to O(W) space ✓

# 2. Longest Increasing Subsequence (LIS) - Standard DP
# Recurrence: dp[i] = max(dp[j] + 1) for all j < i where arr[j] < arr[i]
# Row dependencies: ALL previous values dp[0], dp[1], ..., dp[i-1]
# Lookback distance: k = i (grows with input!)
# VERDICT: NOT optimizable with standard DP approach
# (Note: O(n log n) solution uses binary search, not a DP table)

# 3. Edit Distance
# Recurrence: dp[i][j] = min(dp[i-1][j]+1, dp[i][j-1]+1, dp[i-1][j-1]+cost)
# Row dependencies: Only row i-1 ✓
# Lookback distance: k = 1 ✓
# Same-row dependencies: dp[i][j-1] (left cell) - requires left-to-right
# Reconstruction needed: Often yes (to get the edit sequence)
# VERDICT: Optimizable to O(min(m,n)) if only the value is needed ✓

# 4. Matrix Chain Multiplication
# Recurrence: dp[i][j] = min(dp[i][k] + dp[k+1][j] + cost) for i ≤ k < j
# Dependencies: Cells within the same "band" and previous bands
# This is INTERVAL DP - depends on subintervals, not just adjacent rows
# VERDICT: Typically NOT optimizable - need full 2D table

# 5. Coin Change (Minimum Coins)
# Recurrence: dp[i][a] = min(dp[i-1][a], dp[i][a-coins[i]] + 1)
# Row dependencies: Row i-1, and same row (for unlimited coins)
# Lookback distance: k = 1 ✓
# Same-row dependencies: dp[i][a-coins[i]] - need left-to-right
# VERDICT: Optimizable to O(amount) space ✓
```

Let's catalog the common DP patterns and their space optimization potential. These patterns will help you quickly recognize optimization opportunities.
| Pattern | Typical Recurrence | Dependence | Optimal Space |
|---|---|---|---|
| Sequential Decision | dp[i] = f(dp[i-1], dp[i-2]) | Fixed lookback | O(1) |
| 2D Sequential | dp[i][j] = f(dp[i-1][...]) | Previous row only | O(n) |
| Grid Path | dp[i][j] = f(dp[i-1][j], dp[i][j-1]) | Above + left | O(n) |
| String Matching | dp[i][j] = f(dp[i-1][j], dp[i][j-1], dp[i-1][j-1]) | Adjacent cells | O(n) |
| Knapsack Family | dp[i][w] = f(dp[i-1][w], dp[i-1][w-k]) | Previous row | O(W) |
| Interval DP | dp[i][j] = f(dp[i][k], dp[k+1][j]) | Many subintervals | O(n²)† |
| All-pairs | dp[i][j] = f(dp[i][k], dp[k][j]) | Entire row/column | O(n²)† |
† These patterns typically cannot be space-optimized beyond their natural dimensionality.
Key insight: Most 'sequential' patterns (where we process items/characters one by one and decisions only depend on the immediately previous state) can be space-optimized. 'Interval' and 'all-pairs' patterns typically cannot.
If your DP processes elements in a sequence (items in knapsack, characters in string, cells row-by-row in grid) and each step only looks at the 'previous' step, space optimization is almost always possible. This covers the vast majority of introductory and intermediate DP problems.
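As a sketch of the grid-path case, here is a hypothetical `unique_paths` function (counting monotone paths in an m × n grid) that keeps a single row: each `row[j]` still holds the previous row's value (the cell above) until it is overwritten left to right:

```python
def unique_paths(m: int, n: int) -> int:
    """Grid-path pattern: dp[i][j] = dp[i-1][j] + dp[i][j-1].
    A single row suffices because, while sweeping left to right,
    row[j] still contains the value from the row above."""
    row = [1] * n  # first row: exactly one path to each cell
    for _ in range(1, m):
        for j in range(1, n):
            row[j] += row[j - 1]  # above (old row[j]) + left (new row[j-1])
    return row[-1]
```

This reduces O(m × n) space to O(n), matching the "Grid Path" row in the table above.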
Understanding which patterns cannot be optimized is equally important. Here are the key blockers:
```python
# Example 1: Longest Increasing Subsequence - cannot optimize space
def lis_standard_dp(nums):
    """
    dp[i] = length of LIS ending at index i
    Recurrence: dp[i] = max(dp[j] + 1) for all j < i where nums[j] < nums[i]

    Why can't we optimize?
    To compute dp[5], we might need dp[0], dp[2], dp[4] - any subset!
    We can't predict which previous values we'll need.
    We must keep ALL previous dp values.
    """
    n = len(nums)
    dp = [1] * n  # Each element is a LIS of length 1 by itself
    for i in range(1, n):
        for j in range(i):  # Check ALL previous elements
            if nums[j] < nums[i]:
                dp[i] = max(dp[i], dp[j] + 1)
    return max(dp)  # O(n) space - cannot reduce further with this approach

# Note: There's an O(n log n) solution using binary search that uses
# O(n) space for a different reason (storing the 'tails' array), but
# that's a different algorithm, not space-optimized DP.


# Example 2: Matrix Chain Multiplication - cannot optimize space
def matrix_chain_order(dimensions):
    """
    dp[i][j] = minimum cost to multiply matrices i through j
    Recurrence: dp[i][j] = min(dp[i][k] + dp[k+1][j] + cost(i,k,j))
                for all k in [i, j)

    Why can't we optimize?
    dp[1][5] depends on (dp[1][1], dp[2][5]), (dp[1][2], dp[3][5]),
    (dp[1][3], dp[4][5]), (dp[1][4], dp[5][5]).
    These are scattered across the table, not in adjacent rows.
    The entire upper triangle must be computed and stored.
    """
    n = len(dimensions) - 1
    dp = [[0] * n for _ in range(n)]

    # Fill by increasing chain length
    for chain_len in range(2, n + 1):
        for i in range(n - chain_len + 1):
            j = i + chain_len - 1
            dp[i][j] = float('inf')
            for k in range(i, j):
                cost = (dp[i][k] + dp[k+1][j] +
                        dimensions[i] * dimensions[k+1] * dimensions[j+1])
                dp[i][j] = min(dp[i][j], cost)
    return dp[0][n-1]  # O(n²) space - cannot reduce
```

When DP space cannot be optimized, sometimes alternative algorithms with better space complexity exist. LIS has an O(n log n) time, O(n) space solution using binary search and a 'patience sorting' approach.
Always consider whether a different algorithmic approach might be more efficient.
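For reference, the patience-sorting approach mentioned above can be sketched as follows (`lis_length` is an illustrative name; the `tails` array still uses O(n) space, but as part of a different algorithm, not as a DP table):

```python
import bisect

def lis_length(nums: list[int]) -> int:
    """O(n log n) LIS: tails[k] holds the smallest possible tail
    of any increasing subsequence of length k+1. Binary search
    replaces the O(n) inner loop of the standard DP."""
    tails: list[int] = []
    for x in nums:
        i = bisect.bisect_left(tails, x)  # first tail >= x
        if i == len(tails):
            tails.append(x)  # x extends the longest subsequence
        else:
            tails[i] = x     # x gives a smaller tail for length i+1
    return len(tails)
```

Note that `tails` is not a DP table indexed by position; it is an auxiliary structure maintained by a greedy-plus-binary-search strategy.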
One of the most common situations where space optimization cannot be applied — even when the recurrence supports it — is when you need to reconstruct the actual solution, not just compute its value.
Consider Longest Common Subsequence (LCS):
```python
def lcs_length_only(s1: str, s2: str) -> int:
    """
    Space-optimized: O(min(m, n)) space
    Returns only the LENGTH of the LCS
    """
    if len(s1) < len(s2):
        s1, s2 = s2, s1

    prev = [0] * (len(s2) + 1)
    curr = [0] * (len(s2) + 1)

    for i in range(1, len(s1) + 1):
        for j in range(1, len(s2) + 1):
            if s1[i-1] == s2[j-1]:
                curr[j] = prev[j-1] + 1
            else:
                curr[j] = max(prev[j], curr[j-1])
        prev, curr = curr, prev

    return prev[len(s2)]


def lcs_with_string(s1: str, s2: str) -> str:
    """
    Requires the full O(m × n) table to reconstruct the actual LCS string
    """
    m, n = len(s1), len(s2)
    dp = [[0] * (n + 1) for _ in range(m + 1)]

    # Fill the DP table
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s1[i-1] == s2[j-1]:
                dp[i][j] = dp[i-1][j-1] + 1
            else:
                dp[i][j] = max(dp[i-1][j], dp[i][j-1])

    # Backtrack to find the actual LCS
    lcs = []
    i, j = m, n
    while i > 0 and j > 0:
        if s1[i-1] == s2[j-1]:
            lcs.append(s1[i-1])
            i -= 1
            j -= 1
        elif dp[i-1][j] > dp[i][j-1]:
            i -= 1
        else:
            j -= 1
    return ''.join(reversed(lcs))

# The backtracking step walks backwards through the table, making
# decisions based on which cell the current value came from.
# Without the full table, we lose this information.
```

For LCS and Edit Distance, there's an advanced technique called Hirschberg's Algorithm that achieves O(n) space while STILL allowing solution reconstruction. It uses a divide-and-conquer approach, computing mid-points and recursively solving subproblems. This is rarely needed in practice but demonstrates that clever algorithms can sometimes overcome apparent limitations.
With practice, you'll develop intuition for spotting optimization opportunities instantly. Here are the mental shortcuts experts use:
- A recurrence of the form dp[i] = f(dp[i-1], dp[i-2], ..., dp[i-k]) with constant k reduces to O(k) space.
- A state dp[i][j] meaning 'subproblem from i to j' is usually interval DP and cannot be easily space-optimized.

| Problem Type | Key Indicator | Optimizable? | Target Space |
|---|---|---|---|
| Knapsack variants | dp[i][w] uses row i-1 | Yes | O(W) |
| String DP (LCS, Edit) | dp[i][j] uses i-1, j-1 | Yes* | O(min(m,n)) |
| Grid paths | dp[i][j] uses above, left | Yes | O(n) |
| Fibonacci family | dp[i] uses i-1, i-2 | Yes | O(1) |
| Coin change | dp[a] uses smaller amounts | Yes | O(amount) |
| LIS (standard) | dp[i] uses all j < i | No | O(n) |
| Matrix chain | dp[i][j] uses subintervals | No | O(n²) |
| Floyd-Warshall | dp[i][j] uses intermediate k | No | O(n²) |
String DP problems (LCS, Edit Distance) are space-optimizable for computing the VALUE. If you need the actual solution (the LCS string, the edit sequence), you need the full table for backtracking — unless you use advanced techniques like Hirschberg's algorithm.
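To illustrate the value-only case for Edit Distance, here is a sketch using two rolling rows (`edit_distance` is a hypothetical helper; it returns the distance but, as discussed above, cannot recover the edit sequence):

```python
def edit_distance(s1: str, s2: str) -> int:
    """Levenshtein distance in O(min(m, n)) space with two rows.
    The full table is never stored, so no backtracking is possible."""
    if len(s1) < len(s2):
        s1, s2 = s2, s1  # make s2 the shorter string

    prev = list(range(len(s2) + 1))  # distance from the empty prefix
    for i in range(1, len(s1) + 1):
        curr = [i] + [0] * len(s2)
        for j in range(1, len(s2) + 1):
            if s1[i-1] == s2[j-1]:
                curr[j] = prev[j-1]  # match: no edit needed
            else:
                curr[j] = 1 + min(prev[j],      # delete
                                  curr[j-1],    # insert
                                  prev[j-1])    # substitute
        prev = curr
    return prev[len(s2)]
```

Swapping arguments so the shorter string indexes the rows is the same trick used for LCS above: it makes the row length min(m, n).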
Let's sharpen your skills with some practice problems. For each recurrence, determine: (1) whether space can be optimized, (2) the optimal space complexity, and (3) the implementation approach.
Problem: Subset Sum
dp[i][s] = dp[i-1][s] OR dp[i-1][s - nums[i]]
Meaning: can we pick a subset of the first i numbers that sums to exactly s?
Analysis: dp[i][s] depends only on row i-1, with lookback distance k = 1 and no same-row dependencies.

Verdict: ✅ Optimizable to O(S) space
Approach: Single array with right-to-left iteration (same as 0/1 knapsack). We need dp[s - nums[i]] from the previous row, so we iterate decreasing s.
```python
dp = [False] * (target + 1)
dp[0] = True
for num in nums:
    for s in range(target, num - 1, -1):  # Right to left
        dp[s] = dp[s] or dp[s - num]
```
Let's consolidate the key insights from this page:

- Space optimization requires a bounded dependency window: the number of previous rows or states needed must not grow with the input size.
- Sequential patterns (knapsack, string DP, grid paths, Fibonacci-style recurrences) are almost always optimizable; interval and all-pairs patterns typically are not.
- Unbounded lookback, as in standard LIS, blocks any reduction below O(n).
- Needing the actual solution (not just its value) usually forces keeping the full table, unless you apply an advanced technique like Hirschberg's algorithm.
What's next:
In the final page of this module, we'll explore the trade-offs between space optimization and other concerns — when the optimization is worth it, when it isn't, and how to make informed engineering decisions.
You now have a systematic framework for recognizing space optimization opportunities in DP problems. This pattern recognition skill will save you time in both competitive programming and real-world engineering — quickly identifying what's possible before diving into implementation.