Permutations are everywhere in computer science and everyday life. How many ways can you arrange books on a shelf? In how many orders can a scheduler execute tasks? How many distinct passwords can be formed from a set of characters? These questions all reduce to the same fundamental problem: generating all permutations of a set of elements.
A permutation is an ordered arrangement of elements. Given a set of n distinct elements, a permutation is a sequence containing each element exactly once. The number of such arrangements is n! (n factorial)—a quantity that grows explosively with n. For 3 elements, there are 6 permutations. For 10 elements, there are 3,628,800. For 20 elements, there are over 2.4 quintillion.
This explosive growth makes naively generating and storing all permutations infeasible for even moderately large inputs. Yet with backtracking, we can systematically explore the permutation space, generating each arrangement exactly once without redundant computation.
By the end of this page, you will deeply understand:

• The mathematical structure underlying permutations
• Why backtracking is the natural paradigm for permutation generation
• Multiple implementation strategies with their tradeoffs
• How to reason about correctness and completeness
• Performance characteristics and optimization opportunities
Before diving into algorithms, we must establish a rigorous mathematical foundation. Understanding the combinatorial structure of permutations illuminates why certain algorithmic approaches work and guides us toward efficient implementations.
Formal Definition:
A permutation of a set S = {a₁, a₂, ..., aₙ} is a bijective function π: {1, 2, ..., n} → {1, 2, ..., n}. Equivalently, it is the ordered tuple (π(1), π(2), ..., π(n)); under one convention, π(i) gives the position of element aᵢ in the arrangement, while under the inverse convention, the element placed at position i is the one with index π(i).
For practical purposes, we represent a permutation as an ordered sequence of the elements themselves. Given [1, 2, 3], the permutations are:
(1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), (3, 2, 1)
The number of permutations of n distinct elements is n! (n factorial), computed as n × (n-1) × (n-2) × ... × 2 × 1.
Intuition: For the first position, we have n choices. For the second, n-1 remain. For the third, n-2, and so on until only one element remains for the last position. By the multiplication principle: n × (n-1) × ... × 1 = n!
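The multiplication principle translates directly into code; a minimal sketch (the `factorial` helper name is illustrative):

```typescript
// Compute n! by the multiplication principle: n × (n-1) × ... × 2 × 1.
function factorial(n: number): number {
  let result = 1;
  for (let k = 2; k <= n; k++) {
    result *= k; // n choices, then n-1, then n-2, ...
  }
  return result;
}

console.log(factorial(3));  // 6
console.log(factorial(10)); // 3628800
```

Note that results stay exact in a JavaScript number only up to about 18!, since larger factorials exceed 2⁵³.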
| n | n! (Permutations) | Approximate Time at 10⁶ ops/sec |
|---|---|---|
| 3 | 6 | 6 microseconds |
| 5 | 120 | 120 microseconds |
| 8 | 40,320 | 40 milliseconds |
| 10 | 3,628,800 | 3.6 seconds |
| 12 | 479,001,600 | 8 minutes |
| 15 | 1,307,674,368,000 | 15 days |
| 20 | 2,432,902,008,176,640,000 | 77,000 years |
Key Observations from the Growth Table:
Factorial grows faster than exponential. While 2ⁿ doubles with each increment of n, n! multiplies by an ever-larger factor. At n=20, n! dwarfs 2²⁰ (≈ 1 million) by a factor of over 2 trillion.
Practical limits are severe. Real-world permutation generation is feasible only for small n—typically n ≤ 10-12 for interactive applications, and perhaps n ≤ 15-18 with significant computational resources.
This constrains algorithm design. Since we must generate n! outputs, no clever algorithm can do better than O(n!) for exhaustive generation. Our goal is to achieve O(n!) with minimal overhead—ideally O(n) per permutation or O(n × n!) total.
Lexicographic Ordering:
Permutations have a natural ordering when we treat them as numbers or strings. The lexicographic order (dictionary order) arranges permutations such that [1,2,3] < [1,3,2] < [2,1,3] < ... < [3,2,1]. This ordering provides a deterministic sequence through all permutations and is often the expected output format.
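Lexicographic comparison of two arrangements can be sketched with a position-by-position comparator (`compareLex` is a hypothetical helper, not a standard API):

```typescript
// Illustrative helper: compare two equal-length arrays lexicographically.
// Returns a negative number if a < b, positive if a > b, and 0 if equal.
function compareLex(a: number[], b: number[]): number {
  for (let i = 0; i < a.length; i++) {
    if (a[i] !== b[i]) return a[i] - b[i]; // first differing position decides
  }
  return 0;
}

console.log(compareLex([1, 3, 2], [2, 1, 3]) < 0); // true: [1,3,2] < [2,1,3]
console.log(compareLex([3, 2, 1], [3, 1, 2]) > 0); // true: [3,2,1] > [3,1,2]
```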
Generating all permutations fits perfectly into the backtracking framework. The problem exhibits a clear decision-tree structure: each level of the tree corresponds to one position in the output, and each branch corresponds to a choice of element for that position.
The Decision Tree Perspective:
Consider generating permutations of [1, 2, 3]. The decision tree looks like this:
```
              [start]
            /    |    \
           1     2     3      ← Choose first element
          / \   / \   / \
         2   3 1   3 1   2    ← Choose second element
         |   | |   | |   |
         3   2 3   1 2   1    ← Only one choice remains
```
Each root-to-leaf path generates exactly one permutation. The tree has n! leaves (one per permutation) and the total number of nodes is bounded by n × n!.
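The node-count claim can be spot-checked numerically: depth k of the tree holds n!/(n−k)! nodes (the ordered k-prefixes), and summing over all depths stays below the n × n! bound. A small sketch, with illustrative helper names:

```typescript
// Count decision-tree nodes: depth k has n!/(n-k)! nodes.
function treeNodes(n: number): number {
  let total = 0;
  let nodesAtDepth = 1; // depth 0: the root
  for (let k = 0; k <= n; k++) {
    total += nodesAtDepth;
    nodesAtDepth *= n - k; // each node at depth k has (n-k) children
  }
  return total;
}

function factorial(n: number): number {
  let f = 1;
  for (let k = 2; k <= n; k++) f *= k;
  return f;
}

for (const n of [3, 5, 8]) {
  console.log(n, treeNodes(n), n * factorial(n)); // total nodes vs. the bound
}
```

For n = 3 the tree has 1 + 3 + 6 + 6 = 16 nodes, comfortably under the bound of 3 × 3! = 18.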
The universal backtracking template maps directly to permutation generation:
• Choose: Select an unused element for the current position
• Explore: Recursively fill remaining positions
• Unchoose: Mark the element as available again (backtrack)
This pattern ensures we explore all branches while maintaining the invariant that each element appears exactly once in the current partial solution.
Why Not Other Paradigms?
Why not iterative enumeration? While possible (Heap's algorithm, next-permutation iteration), recursive backtracking is more intuitive, easier to modify for variants, and naturally extends to constrained permutations.
Why not divide-and-conquer? Permutations don't decompose into independent subproblems the way sorting does. The choice for position 1 affects all subsequent choices.
Why not dynamic programming? There's no overlapping subproblem structure. Each permutation is unique, and computing one doesn't help compute another.
Why not greedy? There's no 'best' choice at each step—we need all choices, not a single optimal one.
Backtracking is thus the natural fit: we're systematically exploring a search space, generating all valid configurations, with no pruning possible (every permutation is valid).
The most straightforward implementation uses an auxiliary boolean array to track which elements have been used. This approach is pedagogically clear and works for any type of elements, not just integers.
Algorithm Overview:
Maintain two auxiliary structures during the recursion:

• current — an array representing the partial permutation
• used — a boolean array marking which elements are already in current

When current has n elements, record the complete permutation.
```typescript
function generatePermutations<T>(elements: T[]): T[][] {
  const results: T[][] = [];
  const n = elements.length;
  const used: boolean[] = new Array(n).fill(false);
  const current: T[] = [];

  function backtrack(): void {
    // Base case: complete permutation found
    if (current.length === n) {
      results.push([...current]); // Copy to avoid mutation
      return;
    }

    // Try each unused element in the current position
    for (let i = 0; i < n; i++) {
      if (used[i]) continue; // Skip already-used elements

      // Choose: add element to current permutation
      current.push(elements[i]);
      used[i] = true;

      // Explore: recursively fill remaining positions
      backtrack();

      // Unchoose: remove element and mark as available
      current.pop();
      used[i] = false;
    }
  }

  backtrack();
  return results;
}

// Example usage
const input = [1, 2, 3];
const perms = generatePermutations(input);
console.log("Permutations of [1,2,3]:");
perms.forEach(p => console.log(p.join(", ")));
// Output:
// 1, 2, 3
// 1, 3, 2
// 2, 1, 3
// 2, 3, 1
// 3, 1, 2
// 3, 2, 1
```

Detailed Execution Trace:
Let's trace through the algorithm for input [A, B, C]:
```
backtrack() called with current=[], used=[F,F,F]
  i=0: Choose A → current=[A], used=[T,F,F]
    backtrack() called with current=[A], used=[T,F,F]
      i=0: A is used, skip
      i=1: Choose B → current=[A,B], used=[T,T,F]
        backtrack() called with current=[A,B], used=[T,T,F]
          i=0,1: used, skip
          i=2: Choose C → current=[A,B,C], used=[T,T,T]
            backtrack() → length=3 → RECORD [A,B,C]
          Unchoose C → current=[A,B], used=[T,T,F]
        return
      Unchoose B → current=[A], used=[T,F,F]
      i=2: Choose C → current=[A,C], used=[T,F,T]
        backtrack() → ... → RECORD [A,C,B]
      Unchoose C → current=[A], used=[T,F,F]
    return
  Unchoose A → current=[], used=[F,F,F]
  i=1: Choose B → ... → RECORD [B,A,C], [B,C,A]
  i=2: Choose C → ... → RECORD [C,A,B], [C,B,A]
```
Notice results.push([...current]) makes a copy of the current array. This is essential! Without the copy, all entries in results would reference the same array, which ends up empty after all backtracking completes. This is a common source of bugs in backtracking implementations.
Complexity Analysis:
Time Complexity: O(n × n!)

• There are n! complete permutations, and each is copied into the results at O(n) cost.
• Each recursive call iterates over all n elements, checking used[i] and potentially recursing. The total number of recursive calls equals the number of nodes in the decision tree, which is O(n × n!).

Space Complexity: O(n) for the recursion (excluding output storage)

• Recursion depth: O(n)
• used array: O(n)
• current array: O(n)

Per-Permutation Overhead: O(n) for copying the result array.
An elegant alternative to the 'used' array approach is to generate permutations in-place using swaps. Instead of building a separate current array and tracking used elements, we permute the original array directly by systematically swapping elements.
Key Insight:
Consider the array as divided into two regions:

• A fixed prefix [0, index-1], whose positions are already decided for the current branch
• A remaining region [index, n-1], holding the candidates for the current position

At each step, we select one element from the remaining region to place at position index by swapping it with whatever is currently at index. After recursing to handle positions [index+1, n-1], we swap back to restore the original state.
The Invariant:
At recursion depth index, positions [0, index-1] are fixed for this branch, and all elements originally in the array are still present—just in different positions. The region [index, n-1] contains exactly the elements not in the fixed prefix.
```typescript
function permuteBySwapping<T>(elements: T[]): T[][] {
  const results: T[][] = [];
  const arr = [...elements]; // Work on a copy to preserve input

  function backtrack(index: number): void {
    // Base case: all positions fixed
    if (index === arr.length) {
      results.push([...arr]); // Record current arrangement
      return;
    }

    // Try each element from index onwards at position 'index'
    for (let i = index; i < arr.length; i++) {
      // Choose: swap element at 'i' to position 'index'
      [arr[index], arr[i]] = [arr[i], arr[index]];

      // Explore: fix position 'index' and permute the rest
      backtrack(index + 1);

      // Unchoose: swap back to restore original arrangement
      [arr[index], arr[i]] = [arr[i], arr[index]];
    }
  }

  backtrack(0);
  return results;
}

// Example with tracing
function permuteWithTrace<T>(elements: T[]): void {
  const arr = [...elements];
  const indent = (d: number) => "  ".repeat(d);

  function backtrack(index: number, depth: number): void {
    console.log(`${indent(depth)}backtrack(index=${index}), arr=[${arr.join(",")}]`);
    if (index === arr.length) {
      console.log(`${indent(depth)}→ RECORD: [${arr.join(",")}]`);
      return;
    }
    for (let i = index; i < arr.length; i++) {
      console.log(`${indent(depth)}Swap positions ${index} ↔ ${i}`);
      [arr[index], arr[i]] = [arr[i], arr[index]];
      backtrack(index + 1, depth + 1);
      [arr[index], arr[i]] = [arr[i], arr[index]];
      console.log(`${indent(depth)}Restored arr=[${arr.join(",")}]`);
    }
  }

  backtrack(0, 0);
}

permuteWithTrace([1, 2, 3]);
```

Trace Output for [1, 2, 3]:
```
backtrack(index=0), arr=[1,2,3]
Swap positions 0 ↔ 0
  backtrack(index=1), arr=[1,2,3]
  Swap positions 1 ↔ 1
    backtrack(index=2), arr=[1,2,3]
    Swap positions 2 ↔ 2
      backtrack(index=3), arr=[1,2,3]
      → RECORD: [1,2,3]
    Restored arr=[1,2,3]
  Restored arr=[1,2,3]
  Swap positions 1 ↔ 2
    backtrack(index=2), arr=[1,3,2]
    Swap positions 2 ↔ 2
      backtrack(index=3), arr=[1,3,2]
      → RECORD: [1,3,2]
...
```
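The invariant stated earlier—the array always contains exactly the original elements—can also be checked mechanically at runtime. A sketch with an inline assertion (the function name and number-only element type are my assumptions):

```typescript
// Swap-based permutation generation with a runtime check of the invariant:
// at every recursive call, arr is a rearrangement of the original input.
function permuteWithInvariantCheck(elements: number[]): number[][] {
  const arr = [...elements];
  const expected = [...elements].sort((a, b) => a - b).join(",");
  const results: number[][] = [];

  function backtrack(index: number): void {
    // Invariant check: arr still holds exactly the original elements.
    const actual = [...arr].sort((a, b) => a - b).join(",");
    if (actual !== expected) throw new Error("invariant violated");

    if (index === arr.length) {
      results.push([...arr]);
      return;
    }
    for (let i = index; i < arr.length; i++) {
      [arr[index], arr[i]] = [arr[i], arr[index]];
      backtrack(index + 1);
      [arr[index], arr[i]] = [arr[i], arr[index]];
    }
  }

  backtrack(0);
  return results;
}

console.log(permuteWithInvariantCheck([1, 2, 3]).length); // 6
```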
Advantages of the Swap Approach:

• No auxiliary used array—the split between the fixed prefix and the remaining region implicitly tracks which elements are taken
• No separate current array—the permutation is built in place
• Each choose/unchoose step is a single O(1) swap

Disadvantages:

• The input order is disturbed during recursion (though restored on return)
• The output is not produced in lexicographic order
The swap-based approach does not produce permutations in lexicographic order. For [1,2,3], it produces: [1,2,3], [1,3,2], [2,1,3], [2,3,1], [3,2,1], [3,1,2]. Notice [3,2,1] appears before [3,1,2], which is incorrect for lexicographic ordering. If sorted output is required, either sort the results afterward or use a different algorithm.
For any algorithmic procedure, especially a recursive one, we should rigorously establish correctness. For permutation generation, we must prove two properties: soundness (every recorded output is a valid permutation of the input) and completeness (every permutation of the input is recorded, exactly once).
Soundness Proof (Using 'Used' Array Approach):
Claim: At the base case when current.length === n, current contains exactly the n input elements, each appearing exactly once.
Proof by induction on current.length:
Base case: When current.length = 0, no elements are in current and used[i] = false for all i. The invariant holds vacuously.
Inductive hypothesis: Assume when current.length = k, current contains k distinct elements from the input, and used[i] = true if and only if elements[i] is in current.
Inductive step: When we add an element:

• We only add elements[i] if !used[i] (i.e., it is not already in current)
• We immediately set used[i] = true
• Therefore current.length = k+1, and current holds k+1 distinct elements

At current.length = n, all n elements are in current, each exactly once. ∎
Completeness Proof:
Claim: Every permutation of the input is generated exactly once.
Proof:
Consider any permutation P = (p₁, p₂, ..., pₙ). We show the algorithm visits P exactly once.
Existence: At each depth k, the algorithm iterates through all unused elements. Since P is a permutation, pₖ is an element not used by (p₁, ..., pₖ₋₁). Thus, when filling position k, the algorithm will eventually try pₖ. By induction, the path through the recursion tree corresponding to P's construction will be traversed.
Uniqueness: The algorithm's structure ensures no permutation is generated twice. Each permutation corresponds to a unique path from root to leaf in the decision tree. Different paths generate different sequences of choices, producing different permutations. ∎
Corollary: The algorithm generates exactly n! permutations.
Proof: By completeness, all n! permutations are generated. By soundness, every output is a valid permutation. Since each valid permutation appears exactly once, the count is exactly n!. ∎
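For small n, the corollary is easy to spot-check empirically. A self-contained sketch (repeating the used-array generator so the snippet runs on its own):

```typescript
// Spot-check: the number of generated permutations equals n!,
// and all outputs are distinct.
function generatePermutations<T>(elements: T[]): T[][] {
  const results: T[][] = [];
  const used: boolean[] = new Array(elements.length).fill(false);
  const current: T[] = [];

  function backtrack(): void {
    if (current.length === elements.length) {
      results.push([...current]);
      return;
    }
    for (let i = 0; i < elements.length; i++) {
      if (used[i]) continue;
      current.push(elements[i]);
      used[i] = true;
      backtrack();
      current.pop();
      used[i] = false;
    }
  }

  backtrack();
  return results;
}

function factorial(n: number): number {
  let f = 1;
  for (let k = 2; k <= n; k++) f *= k;
  return f;
}

for (let n = 1; n <= 6; n++) {
  const items = Array.from({ length: n }, (_, i) => i + 1);
  const perms = generatePermutations(items);
  const distinct = new Set(perms.map(p => p.join(",")));
  // Each line should report: n, true, true
  console.log(n, perms.length === factorial(n), distinct.size === perms.length);
}
```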
You might wonder why we prove such 'obvious' properties. In practice, the invariants these proofs rely on are exactly the ones subtle implementation bugs violate—forgetting to copy the result array, or failing to reset a used flag on backtrack—so knowing the proof tells you precisely which invariants to check when debugging.
Many applications require permutations in lexicographic (dictionary) order. For a sorted input, the 'used' array approach naturally produces sorted output because we iterate through elements in their original order and choose the smallest available at each position.
Ensuring Lexicographic Order:
The 'used' array method produces lexicographic output if and only if the input is sorted. The first element chosen at each position is the smallest unused, leading to the lexicographically smallest permutation starting with that prefix.
Why the Swap Method Fails:
The swap method disrupts element ordering in the suffix. After swapping element at position 0 with element at position 2, the suffix is now [original[0], original[2]] with elements potentially out of order. Subsequent iterations don't select the smallest available element.
Forcing Lexicographic Order with Swaps:
One can modify the swap approach to restore sorted order in the suffix after each choice, but this adds O(n log n) overhead per choice, making it less efficient than the used-array approach.
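As one concrete sketch of such a modification (this particular variant is my own illustration, not the only option): rotate the chosen element to the front of the suffix instead of swapping, so a sorted suffix stays sorted and the output comes out in lexicographic order, at O(n) shifting cost per choice:

```typescript
// In-place lexicographic permutations via rotations (sketch).
// The input is sorted first so that every suffix starts out sorted.
function permuteLexInPlace(elements: number[]): number[][] {
  const arr = [...elements].sort((a, b) => a - b);
  const results: number[][] = [];

  function backtrack(index: number): void {
    if (index === arr.length) {
      results.push([...arr]);
      return;
    }
    for (let i = index; i < arr.length; i++) {
      // Rotate arr[index..i] right: arr[i] moves to position 'index',
      // the others shift up one slot, keeping the suffix sorted.
      const chosen = arr[i];
      for (let j = i; j > index; j--) arr[j] = arr[j - 1];
      arr[index] = chosen;

      backtrack(index + 1);

      // Undo the rotation to restore the sorted arrangement.
      for (let j = index; j < i; j++) arr[j] = arr[j + 1];
      arr[i] = chosen;
    }
  }

  backtrack(0);
  return results;
}

console.log(permuteLexInPlace([3, 1, 2]).map(p => p.join("")));
// lexicographic: 123, 132, 213, 231, 312, 321
```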
```typescript
function permuteLexicographic<T>(elements: T[]): T[][] {
  // Sort input to ensure lexicographic output
  const sorted = [...elements].sort((a, b) => {
    if (a < b) return -1;
    if (a > b) return 1;
    return 0;
  });

  const results: T[][] = [];
  const n = sorted.length;
  const used: boolean[] = new Array(n).fill(false);
  const current: T[] = [];

  function backtrack(): void {
    if (current.length === n) {
      results.push([...current]);
      return;
    }
    // Iterating in sorted order ensures lexicographic output
    for (let i = 0; i < n; i++) {
      if (used[i]) continue;
      current.push(sorted[i]);
      used[i] = true;
      backtrack();
      current.pop();
      used[i] = false;
    }
  }

  backtrack();
  return results;
}

// Verify lexicographic order
const chars = ["c", "a", "b"];
console.log("Lexicographic permutations of ['c','a','b']:");
permuteLexicographic(chars).forEach((p, i) => {
  console.log(`${i + 1}: ${p.join("")}`);
});
// Output:
// 1: abc
// 2: acb
// 3: bac
// 4: bca
// 5: cab
// 6: cba
```

The Next-Permutation Algorithm:
An alternative for generating lexicographic permutations is the next-permutation algorithm, which transforms the current permutation into its lexicographic successor in O(n) time. This is useful when you need to:

• Produce permutations one at a time without recursion
• Resume enumeration from an arbitrary starting permutation
• Keep only O(1) extra state beyond the array itself
The algorithm works by:

1. Finding the rightmost position (the pivot) whose element is smaller than its successor
2. Finding the rightmost element greater than the pivot
3. Swapping those two elements
4. Reversing the suffix after the pivot position

If no pivot exists (the array is the last permutation), return an indicator that enumeration is complete.
```typescript
function nextPermutation<T>(arr: T[]): boolean {
  const n = arr.length;

  // Step 1: Find the rightmost element smaller than its successor
  let pivot = n - 2;
  while (pivot >= 0 && arr[pivot] >= arr[pivot + 1]) {
    pivot--;
  }

  // No pivot found means this is the last permutation
  if (pivot < 0) return false;

  // Step 2: Find rightmost element greater than pivot
  let successor = n - 1;
  while (arr[successor] <= arr[pivot]) {
    successor--;
  }

  // Step 3: Swap pivot with successor
  [arr[pivot], arr[successor]] = [arr[successor], arr[pivot]];

  // Step 4: Reverse suffix after pivot
  let left = pivot + 1;
  let right = n - 1;
  while (left < right) {
    [arr[left], arr[right]] = [arr[right], arr[left]];
    left++;
    right--;
  }

  return true;
}

// Generate all permutations iteratively
function allPermutationsIterative<T>(elements: T[]): T[][] {
  const arr = [...elements].sort((a, b) => (a < b ? -1 : a > b ? 1 : 0));
  const results: T[][] = [];
  do {
    results.push([...arr]);
  } while (nextPermutation(arr));
  return results;
}

console.log("Iterative generation of [1,2,3] permutations:");
allPermutationsIterative([1, 2, 3]).forEach(p => console.log(p));
```

When implementing permutation generation in production systems, several practical factors influence algorithm choice and implementation details.
Memory Management:
Storing all n! permutations is often infeasible. For n = 12, just storing the permutations (without any overhead) requires about 5.8 GB. Alternative strategies:

• Generate lazily with a generator and process each permutation as it is produced
• Iterate with next-permutation, keeping only the current arrangement in memory
• Process and discard each permutation inside the recursion instead of collecting results
```typescript
function* permuteGenerator<T>(elements: T[]): Generator<T[], void, unknown> {
  const n = elements.length;
  const used: boolean[] = new Array(n).fill(false);
  const current: T[] = [];

  function* backtrack(): Generator<T[], void, unknown> {
    if (current.length === n) {
      yield [...current]; // Yield each permutation
      return;
    }
    for (let i = 0; i < n; i++) {
      if (used[i]) continue;
      current.push(elements[i]);
      used[i] = true;
      yield* backtrack(); // Delegate to recursive generator
      current.pop();
      used[i] = false;
    }
  }

  yield* backtrack();
}

// Usage: Process permutations one at a time
console.log("Processing permutations without storing all:");
let count = 0;
for (const perm of permuteGenerator([1, 2, 3, 4])) {
  count++;
  if (count <= 3) console.log(`Permutation ${count}: [${perm}]`);
}
console.log(`Total permutations processed: ${count}`);

// Can also stop early
console.log("\nFinding first permutation ending with 4:");
for (const perm of permuteGenerator([1, 2, 3, 4])) {
  if (perm[perm.length - 1] === 4) {
    console.log(`Found: [${perm}]`);
    break;
  }
}
```

Performance Optimizations:

• If the consumer only inspects each permutation and never stores it, yield current directly instead of copying
• Typed arrays such as Int32Array or Uint8Array can be faster than generic arrays for numerical elements

Heap's Algorithm generates permutations such that consecutive permutations differ by exactly one swap. This minimizes array manipulation and can be implemented non-recursively with O(1) extra space (beyond the array itself). It's the theoretically optimal algorithm for raw permutation enumeration speed, but produces non-lexicographic order.
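Heap's algorithm itself can be sketched recursively as follows (`heapPermutations` is an illustrative name; a non-recursive form with explicit counters also exists):

```typescript
// Heap's algorithm: each recorded permutation differs from the
// previous one by a single swap of two positions.
function heapPermutations<T>(elements: T[]): T[][] {
  const arr = [...elements];
  const results: T[][] = [];

  function generate(k: number): void {
    if (k === 1) {
      results.push([...arr]);
      return;
    }
    for (let i = 0; i < k - 1; i++) {
      generate(k - 1);
      // For even k swap positions i and k-1; for odd k swap 0 and k-1
      const j = k % 2 === 0 ? i : 0;
      [arr[j], arr[k - 1]] = [arr[k - 1], arr[j]];
    }
    generate(k - 1);
  }

  generate(arr.length);
  return results;
}

const out = heapPermutations([1, 2, 3]);
console.log(out.length); // 6
```

For [1, 2, 3] this yields all 6 permutations, with every consecutive pair differing in exactly two positions.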
We've explored the fundamental problem of generating all permutations through the lens of backtracking. Let's consolidate the key insights:
Core Concepts:

• Permutations of n distinct elements number n!, a quantity that outgrows any exponential
• Permutation generation maps onto a decision tree with n! leaves and O(n × n!) nodes
• The choose / explore / unchoose pattern guarantees each arrangement is produced exactly once

Implementation Approaches:

• The 'used' array method: clear, type-generic, and lexicographic when the input is sorted
• The swap method: in-place with O(1) choose/unchoose, but non-lexicographic output
• Next-permutation iteration: O(n) per step, lexicographic, with O(1) extra space
• Generators: constant memory for the output stream, with support for early termination

What's Next:

In the following pages, we'll extend this foundation to:

• Permutations with duplicate elements
• Combinations and subsets
• Constrained permutation problems
You now possess a comprehensive understanding of permutation generation through backtracking. You can implement multiple approaches, reason about their correctness, analyze their complexity, and choose appropriately for practical constraints. This foundation extends naturally to the combination problems we'll explore next.