Imagine you're given a sorted array and asked to find two numbers that add up to a target sum. The naive approach—checking every possible pair—requires O(n²) time. But what if you could solve this in a single pass through the array? What if, instead of checking n² pairs, you could find the answer by examining at most n elements?
This is the power of the two-pointer technique: a deceptively simple algorithmic pattern that transforms quadratic-time brute-force solutions into elegant linear-time algorithms. It's one of the most frequently used patterns in technical interviews, appearing in countless problems from array manipulation to string processing to linked list traversal.
The technique is exactly what it sounds like: you maintain two pointers (or indices) that traverse your data structure according to specific rules. But the elegance lies in how these rules are designed—each pointer movement eliminates large swaths of the search space, allowing you to solve problems that would otherwise require nested loops with a single, intelligent traversal.
By the end of this page, you will understand the fundamental concept of the two-pointer technique, why it works, and how to recognize problems where it applies. You'll gain the mental model that makes this pattern intuitive rather than a trick to memorize.
At its heart, the two-pointer technique is about structured exploration. Instead of exhaustively examining every possibility, you use two pointers that move through the data in a coordinated way, making intelligent decisions about which possibilities to skip.
The fundamental insight:
In many problems, the brute-force approach examines pairs (or subsequences) that can be logically ruled out in advance. The two-pointer technique exploits this by:

- Moving each pointer according to rules that guarantee the skipped candidates cannot be solutions
- Never moving a pointer backward, so no position is revisited and the total work stays linear
This is not a heuristic or approximation—when applicable, the two-pointer technique gives correct answers in optimal time.
In the context of arrays, a 'pointer' is simply an integer index. We call them pointers because they 'point' to specific positions in the array. The same concept applies to linked lists (where they're actual node references) and strings (where they're character positions). The technique remains the same regardless of the underlying data structure.
Visualizing the technique:
Consider an array [1, 3, 5, 7, 9] and imagine two pointers—one at the start (left = 0) and one at the end (right = 4). They're looking at values 1 and 9.
```
Array: [1, 3, 5, 7, 9]
        ↑           ↑
      left        right
   (value=1)   (value=9)
```
Now, each pointer can move inward. If we're searching for a target sum:
- If the current sum is too small, move left one step rightward (toward larger values)
- If the current sum is too large, move right one step leftward (toward smaller values)

Each movement shrinks the search space, and we never revisit a position. This is why the algorithm is O(n)—the two pointers together make at most n − 1 moves.
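To make these movement rules concrete, here's a small illustrative sketch (the function name `trace_pair_search` is ours, not from the original page) that prints each decision as the pointers close in:

```python
def trace_pair_search(arr: list[int], target: int):
    """Print each pointer movement while searching for a pair summing to target."""
    left, right = 0, len(arr) - 1
    while left < right:
        s = arr[left] + arr[right]
        if s == target:
            print(f"{arr[left]} + {arr[right]} = {s}: found")
            return (left, right)
        if s < target:
            print(f"{arr[left]} + {arr[right]} = {s} < {target}: move left inward")
            left += 1
        else:
            print(f"{arr[left]} + {arr[right]} = {s} > {target}: move right inward")
            right -= 1
    print("pointers crossed: no pair")
    return None

trace_pair_search([1, 3, 5, 7, 9], 8)   # finds 1 + 7 at indices (0, 3)
```

Running the trace shows exactly two comparisons for target 8: one that rules out 9, and one that finds the pair.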
The two-pointer technique works because of a crucial property: each pointer movement logically eliminates multiple possibilities.
Let's prove this with the two-sum problem on a sorted array. Given [1, 3, 5, 7, 9] with target 10:
| Step | Left Index | Right Index | Left Value | Right Value | Sum | Action |
|---|---|---|---|---|---|---|
| 1 | 0 | 4 | 1 | 9 | 10 | Found! (1 + 9 = 10) |
In this case, we got lucky and found the answer immediately. But let's try target 8:
| Step | Left Index | Right Index | Left Value | Right Value | Sum | Action |
|---|---|---|---|---|---|---|
| 1 | 0 | 4 | 1 | 9 | 10 | Too large → move right leftward |
| 2 | 0 | 3 | 1 | 7 | 8 | Found! (1 + 7 = 8) |
Why moving right leftward was correct:

When 1 + 9 = 10 > 8, we knew that 9 was too large. Here's the key insight: since the array is sorted, 1 is the smallest element still in range, and even paired with it, 9 overshoots the target. Pairing 9 with any larger remaining element would push the sum even higher. Therefore, 9 cannot participate in any valid pair, and we can eliminate it from consideration entirely.
This is the secret sauce: one comparison eliminates an entire row or column of the pair matrix.
Think of all possible pairs as a 2D matrix where rows are left indices and columns are right indices. The brute-force approach examines every cell. The two-pointer technique starts at a corner and moves either down a row or left a column based on each comparison—but each move eliminates an entire row or column. That's why n² pairs can be handled in O(n) movements.
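The row-and-column elimination can be measured directly. Here's a minimal sketch (the helper `count_examined_pairs` is ours, for illustration) that counts how many pairs the scan actually inspects versus how many exist:

```python
def count_examined_pairs(arr: list[int], target: int) -> int:
    """Count how many pairs the two-pointer scan explicitly inspects."""
    left, right = 0, len(arr) - 1
    examined = 0
    while left < right:
        examined += 1
        s = arr[left] + arr[right]
        if s == target:
            break
        if s < target:
            left += 1
        else:
            right -= 1
    return examined

arr = list(range(0, 2000, 2))            # 1000 sorted even numbers
total_pairs = len(arr) * (len(arr) - 1) // 2
print(total_pairs)                        # 499500 candidate pairs in the matrix
print(count_examined_pairs(arr, -1))      # 999: at most n - 1 explicit checks
```

Even with no valid pair (so the scan runs to exhaustion), only n − 1 of the ~500,000 cells are ever touched; every other cell is eliminated wholesale.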
The invariant that makes it work:
Throughout the algorithm, we maintain an important invariant: all pairs with left indices < current left or right indices > current right have already been implicitly considered and rejected.
When we move left rightward, we're saying "all pairs involving the old left index have been handled."
When we move right leftward, we're saying "all pairs involving the old right index have been handled."
At termination, every pair has been considered—most were eliminated without explicit examination, but the logic guarantees they weren't valid solutions.
Two-pointer techniques come in two primary flavors, each suited to different problem types: opposite-direction (the pointers start at both ends and move toward each other) and same-direction (both pointers move forward, typically at different speeds or with one trailing the other).
When to use which variant:
| Problem Characteristic | Suggested Variant |
|---|---|
| Need to find pairs with a property | Opposite-direction |
| Data is sorted and you're searching | Opposite-direction |
| Checking symmetry (palindromes) | Opposite-direction |
| In-place modifications while traversing | Same-direction |
| Partitioning elements by condition | Same-direction |
| Detecting cycles or meeting points | Same-direction |
The choice isn't always obvious, but pattern recognition comes quickly with practice.
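The code shown elsewhere on this page is opposite-direction; for contrast, here's a minimal same-direction sketch (an illustrative read/write-pointer example of our choosing: in-place deduplication of a sorted array):

```python
def dedupe_sorted(arr: list[int]) -> int:
    """Same-direction pointers: a 'write' pointer trails a 'read' pointer,
    compacting unique values in place. Returns the count of unique elements;
    arr[:count] then holds them in order."""
    if not arr:
        return 0
    write = 1                              # next slot for a unique value
    for read in range(1, len(arr)):
        if arr[read] != arr[write - 1]:    # new value not yet written?
            arr[write] = arr[read]
            write += 1
    return write

nums = [1, 1, 2, 3, 3, 3, 5]
k = dedupe_sorted(nums)
print(nums[:k])   # [1, 2, 3, 5]
```

Note the same-direction signature: both pointers only move forward, and `write` never overtakes `read`, so the pass is O(n) time and O(1) extra space.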
One of the most valuable skills is recognizing when a brute-force solution can be transformed into a two-pointer solution. Let's see this transformation explicitly.
Problem: Given a sorted array, find if there exist two elements whose sum equals a target value.
```python
def has_pair_with_sum(arr: list[int], target: int) -> bool:
    """
    Two-pointer approach: exploit sorted order.

    Time:  O(n) - single pass with two pointers
    Space: O(1) - only two index variables

    Key insight: in a sorted array, moving the left pointer
    increases the sum; moving the right pointer decreases it.
    """
    left = 0
    right = len(arr) - 1

    while left < right:
        current_sum = arr[left] + arr[right]
        if current_sum == target:
            return True    # Found the pair
        elif current_sum < target:
            left += 1      # Sum too small, need larger values
        else:
            right -= 1     # Sum too large, need smaller values

    return False           # Pointers crossed, no pair found

# Example usage
arr = [1, 3, 5, 7, 9]
print(has_pair_with_sum(arr, 8))   # True (1 + 7)
print(has_pair_with_sum(arr, 20))  # False
```

The two-pointer approach makes at most n − 1 loop iterations in total, since each iteration moves exactly one pointer inward. For 10,000 elements, that's at most 10,000 operations—roughly a 5000x improvement over the ~50 million pair checks of brute force.
The opposite-direction two-pointer technique for pair sums requires sorted data. If the input isn't sorted, you have two options: (1) Sort first in O(n log n), making total time O(n log n), or (2) Use a hash set for O(n) time but O(n) space. The two-pointer technique on sorted data gives O(n) time with O(1) space.
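The hash-set option mentioned above can be sketched in a few lines (the function name `has_pair_with_sum_unsorted` is ours, for illustration):

```python
def has_pair_with_sum_unsorted(arr: list[int], target: int) -> bool:
    """Hash-set alternative: O(n) time, O(n) space, no sorting required."""
    seen: set[int] = set()
    for x in arr:
        if target - x in seen:   # complement already encountered?
            return True
        seen.add(x)
    return False

print(has_pair_with_sum_unsorted([9, 1, 5, 3, 7], 8))   # True (1 + 7)
print(has_pair_with_sum_unsorted([9, 1, 5, 3, 7], 20))  # False
```

This trades the two-pointer version's O(1) space for the freedom to work on unsorted input in a single pass.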
It's one thing to see that the two-pointer technique works on examples. It's another to understand why it's correct—why it doesn't accidentally skip valid solutions.
Theorem: For a sorted array, if a valid pair (i, j) exists where i < j and arr[i] + arr[j] = target, the two-pointer algorithm will find it.
Proof by Invariant:
Throughout the algorithm, we maintain this invariant:
If a solution exists with indices (i, j) where left ≤ i < j ≤ right, then we will find it.
Base case: Initially, left = 0 and right = n-1, so all pairs are potentially in range.
Inductive step: Consider what happens at each iteration:

- If arr[left] + arr[right] < target: for every j with left < j ≤ right, arr[j] ≤ arr[right], so arr[left] + arr[j] < target. No in-range pair involves left, and moving left rightward preserves the invariant.
- If arr[left] + arr[right] > target: for every i with left ≤ i < right, arr[i] ≥ arr[left], so arr[i] + arr[right] > target. No in-range pair involves right, and moving right leftward preserves the invariant.
- If arr[left] + arr[right] == target: we've found a solution and stop.
Termination: The algorithm terminates when left >= right, at which point no valid pairs remain to check.
Completeness: Every pair (i, j) with i < j is either:

- examined directly as (left, right) at some iteration, or
- eliminated when one of its endpoints was ruled out by a pointer move, which the inductive step shows is always safe.
Since every pair is accounted for, the algorithm is correct.
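A proof gives certainty, but a randomized cross-check against brute force is a useful sanity test in practice. Here's a self-contained sketch of that idea (the harness itself is ours, not from the original page):

```python
import itertools
import random

def has_pair_with_sum(arr: list[int], target: int) -> bool:
    """Two-pointer pair search on a sorted array."""
    left, right = 0, len(arr) - 1
    while left < right:
        s = arr[left] + arr[right]
        if s == target:
            return True
        if s < target:
            left += 1
        else:
            right -= 1
    return False

def brute_force(arr: list[int], target: int) -> bool:
    """Reference oracle: check every pair explicitly."""
    return any(a + b == target for a, b in itertools.combinations(arr, 2))

random.seed(0)
for _ in range(1000):
    arr = sorted(random.randint(-20, 20) for _ in range(random.randint(0, 12)))
    target = random.randint(-40, 40)
    assert has_pair_with_sum(arr, target) == brute_force(arr, target)
print("all cases agree")
```

If you ever suspect an off-by-one in your own implementation, a harness like this will usually surface a small failing input within seconds.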
When your two-pointer implementation has bugs, understanding why the algorithm works helps you find them. Common bugs: using <= instead of < in the while condition (causing infinite loops), forgetting to move pointers (causing infinite loops), or moving the wrong pointer based on the condition. The proof tells you exactly what should happen at each step.
Understanding the complexity of two-pointer algorithms requires careful analysis of how the pointers move.
Time Complexity: O(n)
The analysis is straightforward:
- The left pointer starts at 0 and can only increase (never decrease)
- The right pointer starts at n − 1 and can only decrease (never increase)
- Each loop iteration moves exactly one pointer, so there are at most n − 1 iterations before they cross

Note that this is O(n), not O(n/2). While each pointer moves roughly n/2 positions when the movements are balanced, in the worst case one pointer might stay fixed while the other traverses the entire array.
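That worst case is easy to exhibit. Here's an illustrative counter (the helper `iterations_until_done` is ours) on an input where the sum is always too small, so only left ever moves:

```python
def iterations_until_done(arr: list[int], target: int) -> int:
    """Count loop iterations; each moves exactly one pointer,
    so the total is at most len(arr) - 1."""
    left, right = 0, len(arr) - 1
    steps = 0
    while left < right:
        steps += 1
        s = arr[left] + arr[right]
        if s == target:
            break
        if s < target:
            left += 1
        else:
            right -= 1
    return steps

# Worst case: every sum is 2 < 3, so `right` stays fixed while `left`
# walks the whole array -- still only n - 1 iterations.
arr = [1] * 10_000
print(iterations_until_done(arr, 3))   # 9999
```

Even in this lopsided case the bound holds: the iteration count equals n − 1, never n²/anything.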
Space Complexity: O(1)
The algorithm uses only a constant amount of extra memory:
- left pointer: 1 integer
- right pointer: 1 integer
- current_sum (optional): 1 integer

| Approach | Time | Space | Trade-offs |
|---|---|---|---|
| Brute Force | O(n²) | O(1) | Simple but doesn't scale |
| Hash Set | O(n) | O(n) | Fast, works on unsorted data, but uses extra space |
| Two-Pointer | O(n) | O(1) | Optimal time and space, but requires sorted input |
If the input isn't already sorted, you'll pay O(n log n) to sort it first. The total time becomes O(n log n + n) = O(n log n). This is still better than O(n²), but if you only need to check once and the data is unsorted, a hash set approach might be preferable since it's O(n) without requiring sorted data.
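The sort-then-scan combination looks like this (a minimal sketch; note that sorting discards the original indices, which is fine when you only need a yes/no answer):

```python
def has_pair_after_sorting(arr: list[int], target: int) -> bool:
    """Unsorted input: sort first (O(n log n)), then two-pointer scan (O(n)).
    Total: O(n log n), dominated by the sort."""
    arr = sorted(arr)                # work on a sorted copy
    left, right = 0, len(arr) - 1
    while left < right:
        s = arr[left] + arr[right]
        if s == target:
            return True
        if s < target:
            left += 1
        else:
            right -= 1
    return False

print(has_pair_after_sorting([7, 2, 9, 4], 11))  # True (2 + 9)
print(has_pair_after_sorting([7, 2, 9, 4], 5))   # False
```

If you'll run many queries against the same array, sorting once and reusing the sorted copy amortizes the O(n log n) cost across all of them.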
Not every problem can be solved with two pointers. The technique requires specific structural properties in the problem. Understanding these prerequisites helps you quickly identify two-pointer opportunities.
Prerequisites for Opposite-Direction Two-Pointer:

- The data is sorted, or otherwise ordered so that comparing the two pointed-to elements tells you which pointer to move
- Each comparison must let you provably eliminate all candidates involving the abandoned position
Prerequisites for Same-Direction Two-Pointer:

- The answer can be built in a single forward pass, with one pointer's state depending only on positions the other has already visited
- Each element needs to be examined at most a constant number of times
Two-pointer doesn't work for all problems! If there's no ordering, no monotonic relationship, or no way for pointer movements to eliminate large chunks of the search space, the technique likely doesn't apply. Trying to force two-pointer on an unsuitable problem leads to incorrect solutions.
Even experienced programmers make mistakes with two-pointer algorithms. Here are the most common pitfalls and how to avoid them:
- Off-by-one in the loop condition: using left <= right when you should use left < right, or vice versa. For pair sums, you typically want left < right (strict inequality), since a number can't pair with itself.
- Forgetting to move a pointer on some branch, which causes an infinite loop.
- Moving the wrong pointer for the comparison result, which walks away from the answer.
- Unclear match handling: when sum == target, do you return, or do you need to continue searching for more pairs?
```python
# BUG 1: Wrong condition (should be <, not <=)
while left <= right:  # Bug: allows left == right,
    ...               # which could match an element with itself

# FIX: Use strict inequality for pairs
while left < right:
    ...               # correct

# BUG 2: Forgetting to move pointers
while left < right:
    current_sum = arr[left] + arr[right]
    if current_sum == target:
        print("Found!")   # Bug: never moves pointers, infinite loop!
    elif current_sum < target:
        left += 1
    else:
        right -= 1

# FIX: Always ensure pointers move (or return)
while left < right:
    current_sum = arr[left] + arr[right]
    if current_sum == target:
        print("Found!")
        return True       # Or move both pointers if finding all pairs
    elif current_sum < target:
        left += 1
    else:
        right -= 1

# BUG 3: Moving wrong pointer
if current_sum < target:
    right -= 1  # Bug: decreasing sum when it's already too small!

# FIX: Think carefully about direction
if current_sum < target:
    left += 1   # Correct: move to larger element
```

You've now established a solid foundation for understanding the two-pointer technique. Let's consolidate the key insights:

- Two coordinated pointers can replace nested loops because each movement provably eliminates many candidates at once
- The opposite-direction variant relies on sorted (or otherwise monotonic) data; the same-direction variant relies on a single forward pass
- The result is O(n) time and O(1) space where brute force needs O(n²)
What's next:
In the following pages, we'll dive deep into each variant:

- The opposite-direction variant: pair-finding, palindrome checks, and other problems on sorted or symmetric data
- The same-direction variant: in-place modification, partitioning, and fast/slow pointer patterns
You now understand what the two-pointer technique is, why it works, and when it applies. This conceptual foundation will make the subsequent deep-dives into specific variants and problems much more intuitive.