You now understand the elegant design of the LRU cache: a hash map pointing to nodes in a doubly linked list, with sentinel nodes simplifying edge cases. It's time to translate this design into production-quality code.
Implementation is where theory meets reality. Small mistakes — an off-by-one error, a forgotten pointer update, an incorrect order of operations — can break the entire data structure. This page provides a battle-tested implementation, explains every line, and discusses the subtle details that separate working code from correct code.
By the end of this page, you'll have not just code that compiles, but a complete, bug-free LRU cache implementation you deeply understand and can confidently modify, debug, and explain in any interview or code review. You'll know the rationale behind every implementation decision, how to handle edge cases correctly, and how to field common interview follow-ups and variations.
The node class is the fundamental building block. Each node lives simultaneously in two structures: it's a value in the hash map and an element in the doubly linked list.
Design considerations:
```typescript
/**
 * Doubly Linked List Node for LRU Cache
 *
 * Each node stores:
 * - key: Required for removing from HashMap during eviction
 * - value: The actual cached data
 * - prev/next: Pointers for doubly linked list structure
 *
 * @template K - Key type (typically number or string)
 * @template V - Value type (the cached data)
 */
class LRUNode<K, V> {
  key: K;
  value: V;
  prev: LRUNode<K, V> | null;
  next: LRUNode<K, V> | null;

  constructor(key: K, value: V) {
    this.key = key;
    this.value = value;
    this.prev = null;
    this.next = null;
  }
}

/**
 * Alternative: Using a simple object for languages without classes
 * This pattern works well in JavaScript interviews
 */
function createNode<K, V>(key: K, value: V): LRUNode<K, V> {
  return { key, value, prev: null, next: null };
}
```

In Python, defining __slots__ on the node class prevents the creation of a per-instance __dict__, reducing memory overhead by roughly 50% for small objects. With millions of cache entries, this saves significant memory. Always use __slots__ for node classes in performance-critical Python code.
The LRU cache class orchestrates the hash map and doubly linked list. Let's build it step by step, starting with the class structure and constructor.
Class structure:
```typescript
/**
 * LRU Cache Implementation
 *
 * Combines a HashMap with a Doubly Linked List to achieve:
 * - O(1) get operations
 * - O(1) put operations
 * - O(1) eviction of least recently used item
 *
 * @template K - Key type
 * @template V - Value type
 */
class LRUCache<K, V> {
  // === PRIVATE MEMBERS ===

  /**
   * Maps keys to their corresponding nodes in the linked list.
   * Enables O(1) lookup of any cache entry by key.
   */
  private cache: Map<K, LRUNode<K, V>>;

  /**
   * Maximum number of items the cache can hold.
   * When exceeded, the LRU item is evicted.
   */
  private readonly capacity: number;

  /**
   * Sentinel node marking the head (MRU end) of the list.
   * head.next points to the most recently used real node.
   * Note: head itself is never removed or replaced.
   */
  private readonly head: LRUNode<K, V>;

  /**
   * Sentinel node marking the tail (LRU end) of the list.
   * tail.prev points to the least recently used real node.
   * Note: tail itself is never removed or replaced.
   */
  private readonly tail: LRUNode<K, V>;

  constructor(capacity: number) {
    if (capacity <= 0) {
      throw new Error("Capacity must be positive");
    }

    this.capacity = capacity;
    this.cache = new Map();

    // Initialize sentinel nodes with dummy values
    // These are structural markers, not real cache entries
    this.head = new LRUNode<K, V>(null as unknown as K, null as unknown as V);
    this.tail = new LRUNode<K, V>(null as unknown as K, null as unknown as V);

    // Connect sentinels: head <-> tail
    // This represents an empty cache
    this.head.next = this.tail;
    this.tail.prev = this.head;
  }

  /**
   * Returns the current number of items in the cache.
   */
  get size(): number {
    return this.cache.size;
  }
}
```

Constructor analysis:
The constructor establishes a valid empty state with all invariants satisfied: the capacity is validated, the map is empty, and the two sentinels point at each other, so every later operation can assume a well-formed list.
Marking sentinels as readonly/const expresses intent: these specific node instances never change. While their prev/next pointers change, the sentinel nodes themselves are fixed for the cache's lifetime. This prevents accidental reassignment bugs.
Before implementing get() and put(), we need helper methods for linked list manipulation. These encapsulate the pointer surgery, making the main methods cleaner and less error-prone.
The three essential helpers:
- removeNode(node) — Disconnect a node from its current position
- addToHead(node) — Insert a node right after the head sentinel
- moveToHead(node) — Shorthand for remove + addToHead
```typescript
// Inside LRUCache class:

/**
 * Removes a node from its current position in the linked list.
 *
 * This does NOT remove from the hash map — only from the list.
 * After this call, the node is "floating" (not in any list).
 *
 * Time: O(1) — just 4 pointer updates
 *
 * @param node - The node to remove (must be a real node, not a sentinel)
 */
private removeNode(node: LRUNode<K, V>): void {
  // Get neighbors
  const prevNode = node.prev!;
  const nextNode = node.next!;

  // Bypass this node: prev <-> next (cutting out 'node')
  prevNode.next = nextNode;
  nextNode.prev = prevNode;

  // Clear the node's pointers (optional but good practice)
  // Helps garbage collection and makes debugging easier
  node.prev = null;
  node.next = null;
}

/**
 * Inserts a node right after the head sentinel (making it MRU).
 *
 * The node should be a "floating" node (not currently in any list).
 * After this call, the node is the first real node in the list.
 *
 * Time: O(1) — just 4 pointer updates
 *
 * @param node - The node to insert (should not currently be in the list)
 */
private addToHead(node: LRUNode<K, V>): void {
  // Current first real node (might be tail if list is empty)
  const firstNode = this.head.next!;

  // Insert between head and firstNode
  // head <-> node <-> firstNode
  node.prev = this.head;
  node.next = firstNode;

  // Update neighbors to point to the new node
  this.head.next = node;
  firstNode.prev = node;
}

/**
 * Moves an existing node to the head (making it MRU).
 *
 * Combines removeNode + addToHead into a single logical operation.
 * Used when a cache hit occurs (get or put on existing key).
 *
 * Time: O(1)
 *
 * @param node - The node to move (must already be in the list)
 */
private moveToHead(node: LRUNode<K, V>): void {
  this.removeNode(node);
  this.addToHead(node);
}
```

In addToHead, we must read head.next BEFORE overwriting it. Writing 'this.head.next = node' first would lose the reference to the old first node. This order-of-operations bug is subtle and common — always read old values before writing new ones.
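To see the read-before-write rule in action, here is a minimal standalone sketch (using a simplified DemoNode type, not the full cache) that contrasts the correct insertion order with the buggy one:

```typescript
// Minimal node type for this demo only (not the full LRUNode class)
interface DemoNode {
  id: string;
  prev: DemoNode | null;
  next: DemoNode | null;
}

function makeNode(id: string): DemoNode {
  return { id, prev: null, next: null };
}

// Correct: capture head.next BEFORE overwriting it
function addToHeadCorrect(head: DemoNode, node: DemoNode): void {
  const first = head.next!; // read the old first node first
  node.prev = head;
  node.next = first;
  head.next = node;
  first.prev = node;
}

// Buggy: head.next is overwritten before it is read, so the
// reference to the old first node is lost and node.next ends
// up pointing at node itself
function addToHeadBuggy(head: DemoNode, node: DemoNode): void {
  head.next = node;      // old first node reference lost here
  node.prev = head;
  node.next = head.next; // reads the NEW value: node itself!
}

// Correct version keeps the list intact
const head = makeNode("head");
const tail = makeNode("tail");
head.next = tail;
tail.prev = head;
const a = makeNode("a");
addToHeadCorrect(head, a);
console.log(a.next!.id); // "tail"

// Buggy version produces a self-loop
const head2 = makeNode("head");
const tail2 = makeNode("tail");
head2.next = tail2;
tail2.prev = head2;
const b = makeNode("b");
addToHeadBuggy(head2, b);
console.log(b.next === b); // true — list is broken
```

Tracing the buggy version by hand (as above) is a quick way to convince yourself why the helper reads the neighbor into a local variable before any writes.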
The get() method retrieves a value by key and updates recency. It's the simpler of the two main methods.
Algorithm:
```typescript
/**
 * Retrieves the value associated with the given key.
 *
 * If the key exists:
 * - Returns the associated value
 * - Marks the key as most recently used (moves to head)
 *
 * If the key doesn't exist:
 * - Returns null (or -1 for LeetCode-style int caches)
 *
 * Time Complexity: O(1) average
 * - Hash map lookup: O(1) average
 * - Move to head: O(1) always
 *
 * @param key - The key to look up
 * @returns The value if found, null otherwise
 */
get(key: K): V | null {
  // Step 1: Hash map lookup — O(1)
  const node = this.cache.get(key);

  // Step 2: Cache miss — key not found
  if (node === undefined) {
    return null;
  }

  // Step 3: Cache hit — update recency
  // Move this node to the head (most recently used position)
  this.moveToHead(node);

  // Step 4: Return the cached value
  return node.value;
}

/**
 * LeetCode-style get() that returns -1 instead of null.
 * Use this for interview problems with integer values.
 */
getLeetCode(key: K): V | -1 {
  const node = this.cache.get(key);
  if (node === undefined) {
    return -1;
  }
  this.moveToHead(node);
  return node.value;
}
```

Key observations:
In real codebases, get() returns Optional/null for missing keys. LeetCode problems often use -1 for missing integer values. Know which convention your interviewer expects, or ask to clarify.
The put() method is more complex because it handles both updates and inserts, and may trigger eviction.
Algorithm:
```typescript
/**
 * Inserts or updates a key-value pair in the cache.
 *
 * If the key already exists:
 * - Updates the value
 * - Marks as most recently used
 *
 * If the key is new:
 * - If at capacity, evicts the LRU item first
 * - Inserts the new key-value pair
 * - Marks as most recently used
 *
 * Time Complexity: O(1) average
 * - Hash map operations: O(1) average
 * - Linked list operations: O(1) always
 *
 * @param key - The key to insert or update
 * @param value - The value to associate with the key
 */
put(key: K, value: V): void {
  // Step 1: Check if key already exists
  const existingNode = this.cache.get(key);

  if (existingNode !== undefined) {
    // === CASE A: UPDATE EXISTING KEY ===

    // Update the value in place
    existingNode.value = value;

    // Move to head (most recently used)
    this.moveToHead(existingNode);

    // Done — no size change, no eviction needed
    return;
  }

  // === CASE B: INSERT NEW KEY ===

  // Step 2: Check if eviction is needed
  if (this.cache.size >= this.capacity) {
    // Evict the least recently used item (node before tail sentinel)
    const lruNode = this.tail.prev!;

    // CRITICAL: Remove from BOTH structures
    // First, remove from linked list
    this.removeNode(lruNode);

    // Then, remove from hash map (using the key stored in the node)
    this.cache.delete(lruNode.key);
  }

  // Step 3: Create and insert new node
  const newNode = new LRUNode(key, value);

  // Add to hash map
  this.cache.set(key, newNode);

  // Add to linked list head (most recently used position)
  this.addToHead(newNode);
}
```

The most common bug in LRU cache implementation: removing from the linked list but forgetting to remove from the hash map. This creates a 'zombie' entry — the hash map thinks the key exists, but its node is disconnected from the list. Always remove from BOTH structures during eviction.
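One way to catch the zombie-entry bug early is a debug-only invariant check. The sketch below (a hypothetical checkInvariants helper, not part of the cache's API above, shown standalone with a simplified ListNode type) walks the list from head to tail and cross-checks it against the map:

```typescript
// Simplified node type for this standalone demo
interface ListNode {
  key: number;
  prev: ListNode | null;
  next: ListNode | null;
}

// Hypothetical debug helper: every list node must be in the map,
// and the node count must equal the map size. A zombie entry
// (in the map but unlinked from the list) shows up as a size mismatch.
function checkInvariants(
  cache: Map<number, ListNode>,
  head: ListNode,
  tail: ListNode
): void {
  let count = 0;
  let node = head.next!;
  while (node !== tail) {
    if (cache.get(node.key) !== node) {
      throw new Error(`List node ${node.key} not in map (or stale)`);
    }
    count++;
    node = node.next!;
  }
  if (count !== cache.size) {
    throw new Error(`List has ${count} nodes but map has ${cache.size} entries`);
  }
}

// Demo: a two-node list wired up by hand, with sentinels
const mk = (key: number): ListNode => ({ key, prev: null, next: null });
const dHead = mk(-1), dTail = mk(-1), n1 = mk(1), n2 = mk(2);
dHead.next = n1; n1.prev = dHead;
n1.next = n2;    n2.prev = n1;
n2.next = dTail; dTail.prev = n2;
const dMap = new Map<number, ListNode>([[1, n1], [2, n2]]);

checkInvariants(dMap, dHead, dTail); // consistent: no error

// Simulate the zombie bug: unlink n2 from the list, keep it in the map
n1.next = dTail; dTail.prev = n1;
try {
  checkInvariants(dMap, dHead, dTail);
} catch (e) {
  console.log((e as Error).message); // reports the size mismatch
}
```

Calling such a check at the end of put() in a debug build turns a silent corruption into an immediate, localized failure.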
Here's the complete, production-ready LRU cache implementation with all components together. This is the code you'd write in an interview or use in production.
```typescript
/**
 * Complete LRU Cache Implementation
 *
 * Time Complexity: O(1) for both get and put (amortized)
 * Space Complexity: O(capacity)
 *
 * This implementation uses:
 * - HashMap for O(1) key lookup
 * - Doubly Linked List for O(1) recency updates
 * - Sentinel nodes for simplified edge case handling
 */

class LRUNode<K, V> {
  key: K;
  value: V;
  prev: LRUNode<K, V> | null = null;
  next: LRUNode<K, V> | null = null;

  constructor(key: K, value: V) {
    this.key = key;
    this.value = value;
  }
}

class LRUCache<K, V> {
  private cache: Map<K, LRUNode<K, V>>;
  private readonly capacity: number;
  private readonly head: LRUNode<K, V>;
  private readonly tail: LRUNode<K, V>;

  constructor(capacity: number) {
    if (capacity <= 0) {
      throw new Error("Capacity must be positive");
    }

    this.capacity = capacity;
    this.cache = new Map();

    // Sentinel nodes
    this.head = new LRUNode<K, V>(null as unknown as K, null as unknown as V);
    this.tail = new LRUNode<K, V>(null as unknown as K, null as unknown as V);
    this.head.next = this.tail;
    this.tail.prev = this.head;
  }

  get size(): number {
    return this.cache.size;
  }

  get(key: K): V | null {
    const node = this.cache.get(key);
    if (!node) return null;
    this.moveToHead(node);
    return node.value;
  }

  put(key: K, value: V): void {
    const existingNode = this.cache.get(key);

    if (existingNode) {
      existingNode.value = value;
      this.moveToHead(existingNode);
      return;
    }

    if (this.cache.size >= this.capacity) {
      const lruNode = this.tail.prev!;
      this.removeNode(lruNode);
      this.cache.delete(lruNode.key);
    }

    const newNode = new LRUNode(key, value);
    this.cache.set(key, newNode);
    this.addToHead(newNode);
  }

  private removeNode(node: LRUNode<K, V>): void {
    node.prev!.next = node.next;
    node.next!.prev = node.prev;
  }

  private addToHead(node: LRUNode<K, V>): void {
    node.next = this.head.next;
    node.prev = this.head;
    this.head.next!.prev = node;
    this.head.next = node;
  }

  private moveToHead(node: LRUNode<K, V>): void {
    this.removeNode(node);
    this.addToHead(node);
  }
}

// Usage example
const cache = new LRUCache<number, number>(2);
cache.put(1, 1);
cache.put(2, 2);
console.log(cache.get(1)); // 1
cache.put(3, 3);           // Evicts key 2
console.log(cache.get(2)); // null (was evicted)
console.log(cache.get(3)); // 3
```

A correct implementation must handle all edge cases. Here's a comprehensive test suite that covers the tricky scenarios:
Edge cases to test:
```typescript
/**
 * Comprehensive LRU Cache Test Suite
 */

function testLRUCache(): void {
  console.log("Running LRU Cache tests...");

  // Test 1: Basic get/put
  (() => {
    const cache = new LRUCache<number, number>(2);
    cache.put(1, 100);
    cache.put(2, 200);
    console.assert(cache.get(1) === 100, "Test 1.1 failed");
    console.assert(cache.get(2) === 200, "Test 1.2 failed");
    console.log("✓ Test 1: Basic get/put");
  })();

  // Test 2: Get nonexistent key
  (() => {
    const cache = new LRUCache<number, number>(2);
    console.assert(cache.get(999) === null, "Test 2.1 failed");
    cache.put(1, 100);
    console.assert(cache.get(999) === null, "Test 2.2 failed");
    console.log("✓ Test 2: Get nonexistent key");
  })();

  // Test 3: Update existing key (no eviction)
  (() => {
    const cache = new LRUCache<number, number>(2);
    cache.put(1, 100);
    cache.put(2, 200);
    cache.put(1, 111); // Update, not new insert
    console.assert(cache.get(1) === 111, "Test 3.1: Value not updated");
    console.assert(cache.get(2) === 200, "Test 3.2: Key 2 incorrectly evicted");
    console.log("✓ Test 3: Update existing key");
  })();

  // Test 4: Eviction when at capacity
  (() => {
    const cache = new LRUCache<number, number>(2);
    cache.put(1, 100);
    cache.put(2, 200);
    cache.put(3, 300); // Should evict key 1 (LRU)
    console.assert(cache.get(1) === null, "Test 4.1: Key 1 not evicted");
    console.assert(cache.get(2) === 200, "Test 4.2: Key 2 incorrectly evicted");
    console.assert(cache.get(3) === 300, "Test 4.3: Key 3 not present");
    console.log("✓ Test 4: Eviction at capacity");
  })();

  // Test 5: Get() updates recency (prevents eviction)
  (() => {
    const cache = new LRUCache<number, number>(2);
    cache.put(1, 100);
    cache.put(2, 200);
    cache.get(1);      // Touch key 1 → now key 2 is LRU
    cache.put(3, 300); // Should evict key 2, not key 1
    console.assert(cache.get(1) === 100, "Test 5.1: Key 1 incorrectly evicted");
    console.assert(cache.get(2) === null, "Test 5.2: Key 2 not evicted");
    console.assert(cache.get(3) === 300, "Test 5.3: Key 3 not present");
    console.log("✓ Test 5: Get() updates recency");
  })();

  // Test 6: Capacity 1
  (() => {
    const cache = new LRUCache<number, number>(1);
    cache.put(1, 100);
    console.assert(cache.get(1) === 100, "Test 6.1 failed");
    cache.put(2, 200); // Immediate eviction of key 1
    console.assert(cache.get(1) === null, "Test 6.2: Key 1 not evicted");
    console.assert(cache.get(2) === 200, "Test 6.3: Key 2 not present");
    console.log("✓ Test 6: Capacity 1");
  })();

  // Test 7: LeetCode example
  (() => {
    const cache = new LRUCache<number, number>(2);
    cache.put(1, 1);
    cache.put(2, 2);
    console.assert(cache.get(1) === 1, "Test 7.1 failed");
    cache.put(3, 3); // evicts key 2
    console.assert(cache.get(2) === null, "Test 7.2 failed");
    cache.put(4, 4); // evicts key 1
    console.assert(cache.get(1) === null, "Test 7.3 failed");
    console.assert(cache.get(3) === 3, "Test 7.4 failed");
    console.assert(cache.get(4) === 4, "Test 7.5 failed");
    console.log("✓ Test 7: LeetCode example");
  })();

  console.log("✓ All tests passed!");
}

testLRUCache();
```

In an interview, verbally walk through these test cases: "Let me trace through putting 1, putting 2, getting 1, putting 3. After put(3), key 2 should be evicted because get(1) made key 1 more recent than key 2." This demonstrates you understand the algorithm, not just the code.
You now have a complete, production-quality LRU cache implementation. Let's summarize the key points and preview advanced topics:
Advanced variations and follow-ups:
| Variation | Key Change | Challenge |
|---|---|---|
| TTL (Time-To-Live) | Entries expire after fixed time | Track insertion time, lazy or eager expiration |
| Thread-safe LRU | Concurrent get/put from multiple threads | Fine-grained locking or lock-free techniques |
| LFU Cache | Evict least frequently used | Additional frequency tracking, more complex |
| Bounded memory LRU | Limit by bytes, not count | Track entry sizes, variable eviction count |
| Distributed LRU | Cache shared across machines | Consistency, network partitions, replication |
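As a taste of the TTL variation from the table, here is a minimal sketch of lazy expiration (an assumed design, not a reference implementation, and without the LRU list for brevity): each entry records its insertion time, and get() treats entries older than the TTL as misses, deleting them on access. The clock is injectable so the behavior can be tested without real waiting.

```typescript
// Minimal TTL cache sketch: lazy expiration only (no LRU eviction).
// A production version would combine this timestamp check with the
// hash map + doubly linked list design from this page.
class TTLCache<K, V> {
  private entries = new Map<K, { value: V; insertedAt: number }>();

  constructor(
    private readonly ttlMs: number,
    private readonly now: () => number = Date.now // injectable clock
  ) {}

  put(key: K, value: V): void {
    this.entries.set(key, { value, insertedAt: this.now() });
  }

  get(key: K): V | null {
    const entry = this.entries.get(key);
    if (entry === undefined) return null;
    if (this.now() - entry.insertedAt > this.ttlMs) {
      // Lazy expiration: remove on access instead of with a timer
      this.entries.delete(key);
      return null;
    }
    return entry.value;
  }
}

// Demo with a fake clock
let clock = 0;
const ttl = new TTLCache<string, number>(100, () => clock);
ttl.put("a", 1);
clock = 50;
console.log(ttl.get("a")); // 1 (still fresh)
clock = 200;
console.log(ttl.get("a")); // null (expired and removed)
```

The alternative to lazy expiration is eager expiration (a background sweep or timer per entry); lazy expiration is simpler and does no work for keys nobody asks for, at the cost of stale entries occupying memory until touched.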
Congratulations! You've mastered the LRU cache — one of the most important and frequently asked data structure problems. You understand the eviction policy, the O(1) requirements, the elegant hash map + doubly linked list design, and the complete implementation. This knowledge applies directly to production caching systems like Redis, Memcached, and browser caches.