The entire purpose of implementing a queue with a linked list and tail pointer is to achieve O(1) time complexity for both enqueue and dequeue. In this page, we'll implement these operations with meticulous attention to detail, analyze their complexity rigorously, and understand exactly why they're constant time.
This isn't merely about writing working code—it's about understanding the operations at a level where you can implement them from memory, justify their complexity, handle every edge case, and trace any sequence of operations.
Let's build these operations step by step, with complete code and exhaustive analysis.
By the end of this page, you will have production-ready implementations of enqueue and dequeue, understand exactly why each achieves O(1) time complexity, master all edge cases, and be able to trace through any sequence of operations with confidence.
Enqueue adds an element to the rear of the queue. In our linked list implementation, this means appending a new node after the current tail and updating the tail pointer.
Conceptual Steps:
1. Create a new node holding the element.
2. If the queue is non-empty, link the current tail's next pointer to the new node.
3. Update the tail pointer to the new node.
4. If the queue was empty, point head at the new node as well.
5. Increment the count.
The Complete Implementation:
```typescript
/**
 * Adds an element to the rear of the queue.
 * Time Complexity: O(1) - constant time, always
 * Space Complexity: O(1) - only allocates one node
 *
 * @param element The value to add to the queue
 */
enqueue(element: T): void {
  // Step 1: Create the new node
  const newNode = new QueueNode<T>(element);

  // Step 2: Link new node after current tail (if tail exists)
  // This sets up the chain: ... -> oldTail -> newNode
  if (this.tail !== null) {
    this.tail.next = newNode;
  }

  // Step 3: Update tail pointer to new node
  // The new node is now the rear of the queue
  this.tail = newNode;

  // Step 4: Handle empty queue case
  // If head is null, queue was empty, so new node is also the front
  if (this.head === null) {
    this.head = newNode;
  }

  // Step 5: Update count
  this.count++;
}

// Invariants preserved:
// - tail always points to the last node (or null if empty)
// - tail.next is always null (we just created newNode with next=null)
// - if queue has one element, head === tail
// - count matches the number of nodes
```

Visual walkthrough of enqueue to non-empty queue:
We update tail.next BEFORE updating tail. If we updated tail first, we'd lose our reference to the old tail and couldn't link the new node into the chain. Similarly, we check isEmpty (via head === null) BEFORE updating head, so we correctly detect when the queue was empty.
Let's rigorously analyze the time and space complexity of the enqueue operation.
Time Complexity: O(1)
To prove O(1), we must show that the number of operations is constant, regardless of queue size:
| Operation | Number of Executions | Cost |
|---|---|---|
| Create new node | 1 | O(1) - single allocation |
| Check if tail is null | 1 | O(1) - comparison |
| Set tail.next (if applicable) | 0 or 1 | O(1) - pointer assignment |
| Update tail pointer | 1 | O(1) - pointer assignment |
| Check if head is null | 1 | O(1) - comparison |
| Update head pointer (if applicable) | 0 or 1 | O(1) - pointer assignment |
| Increment count | 1 | O(1) - arithmetic |
Total operations: At most 7 primitive operations, regardless of whether the queue has 0, 100, or 1 million elements.
Crucially: There are no loops. We don't traverse the list. We go directly to the tail via the tail pointer and append there. This is the key difference from a linked list without a tail pointer, which would require O(n) traversal to find the end.
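To make the contrast concrete, here is a sketch (names like `SlowQueue` are illustrative, not from this page) of what enqueue looks like on a singly linked list *without* a tail pointer—the traversal loop is exactly what our tail pointer eliminates:

```typescript
class SlowNode<T> {
  next: SlowNode<T> | null = null;
  constructor(public data: T) {}
}

// Hypothetical queue with a head pointer only — no tail.
class SlowQueue<T> {
  private head: SlowNode<T> | null = null;

  // O(n): must walk the entire list to find the rear before appending.
  enqueue(element: T): void {
    const newNode = new SlowNode<T>(element);
    if (this.head === null) {
      this.head = newNode;
      return;
    }
    let current = this.head;
    while (current.next !== null) { // the loop the tail pointer removes
      current = current.next;
    }
    current.next = newNode;
  }

  // O(1): dequeue is unaffected — head access is still direct.
  dequeue(): T {
    if (this.head === null) {
      throw new Error("Cannot dequeue from an empty queue");
    }
    const data = this.head.data;
    this.head = this.head.next;
    return data;
  }
}
```

The FIFO behavior is identical; only the cost of enqueue changes, from O(1) to O(n) per call.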
Space Complexity: O(1) auxiliary
Space per element: Each element in the queue occupies one node, which costs the storage for the data itself plus one next pointer—O(1) per element, so n elements occupy O(n) total. The enqueue operation itself allocates exactly one node, so its auxiliary space is O(1).
Unlike array-based implementations where resizing creates occasional O(n) operations (leading to O(1) amortized complexity), linked list enqueue is O(1) in the worst case. Every single enqueue takes constant time. This makes linked lists preferable in real-time systems where occasional latency spikes are unacceptable.
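For contrast, here is a sketch (an assumed `ArrayQueue`, not part of this page's implementation) of a circular array-backed queue whose backing store doubles when full. Most enqueues are O(1), but each resize copies every element—the occasional O(n) spike that the linked version never incurs:

```typescript
class ArrayQueue<T> {
  private buf: (T | undefined)[] = new Array(4); // small initial capacity
  private headIdx = 0;
  private count = 0;
  resizes = 0; // instrumentation: counts the O(n) copy events

  enqueue(element: T): void {
    if (this.count === this.buf.length) {
      this.resize(); // occasional O(n) spike
    }
    this.buf[(this.headIdx + this.count) % this.buf.length] = element;
    this.count++;
  }

  dequeue(): T {
    if (this.count === 0) {
      throw new Error("Cannot dequeue from an empty queue");
    }
    const data = this.buf[this.headIdx] as T;
    this.buf[this.headIdx] = undefined;
    this.headIdx = (this.headIdx + 1) % this.buf.length;
    this.count--;
    return data;
  }

  size(): number {
    return this.count;
  }

  // Doubles capacity and copies all elements into the new buffer — O(n).
  private resize(): void {
    this.resizes++;
    const bigger: (T | undefined)[] = new Array(this.buf.length * 2);
    for (let i = 0; i < this.count; i++) {
      bigger[i] = this.buf[(this.headIdx + i) % this.buf.length];
    }
    this.buf = bigger;
    this.headIdx = 0;
  }
}
```

Averaged over many operations the copies amortize to O(1) per enqueue, but any individual enqueue that triggers a resize pays the full O(n) cost—which is why the worst-case guarantee of the linked version matters in latency-sensitive code.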
Dequeue removes and returns the element at the front of the queue. In our linked list implementation, this means removing the head node, returning its data, and advancing the head pointer.
Conceptual Steps:
1. If the queue is empty, throw an error.
2. Save the head node's data.
3. Advance the head pointer to head.next.
4. If head is now null, set tail to null as well.
5. Decrement the count and return the saved data.
The Complete Implementation:
```typescript
/**
 * Removes and returns the element at the front of the queue.
 * Time Complexity: O(1) - constant time, always
 * Space Complexity: O(1) - no additional allocation
 *
 * @returns The element that was at the front of the queue
 * @throws Error if the queue is empty
 */
dequeue(): T {
  // Step 1: Check for empty queue
  if (this.head === null) {
    throw new Error("Cannot dequeue from an empty queue");
  }

  // Step 2: Store the data to return
  // We need to capture this before we lose the reference to the node
  const data: T = this.head.data;

  // Step 3: Advance head pointer to the next node
  // If there was only one element, head.next is null, so head becomes null
  this.head = this.head.next;

  // Step 4: Handle transition to empty queue
  // If head is now null, we just removed the last element
  // tail must also be set to null to maintain invariants
  if (this.head === null) {
    this.tail = null;
  }

  // Step 5: Update count
  this.count--;

  // Step 6: Return the data
  return data;
}

// Invariants preserved:
// - head points to the first node (or null if empty)
// - if head is null, tail is also null
// - if queue has one element after dequeue, head === tail
// - count matches the number of remaining nodes
```

Visual walkthrough of dequeue from multi-element queue:
In garbage-collected languages (JavaScript, Python, Java), the old head node becomes eligible for garbage collection once no references to it remain. In manual memory management languages (C, C++), you must explicitly free/delete the old node to avoid memory leaks.
Let's rigorously analyze the time and space complexity of the dequeue operation.
Time Complexity: O(1)
Again, we show that operations are constant regardless of queue size:
| Operation | Number of Executions | Cost |
|---|---|---|
| Check if head is null | 1 | O(1) - comparison |
| Read head.data | 1 | O(1) - field access |
| Read head.next | 1 | O(1) - field access |
| Update head pointer | 1 | O(1) - pointer assignment |
| Check if head is null (after update) | 1 | O(1) - comparison |
| Update tail pointer (if applicable) | 0 or 1 | O(1) - pointer assignment |
| Decrement count | 1 | O(1) - arithmetic |
Total operations: At most 7 primitive operations, regardless of queue size.
Key observation: Like enqueue, there are no loops. We don't traverse the list. We access the head directly, retrieve its data, and update the head pointer in a single step. The queue could have millions of elements, and dequeue still performs the same constant number of operations.
Space Complexity: O(1)
Both enqueue and dequeue achieve O(1) because we maintain direct pointers to the relevant positions. Head gives us O(1) access to the front; tail gives us O(1) access to the rear. Without the tail pointer, enqueue would be O(n). Without the head pointer (if we only had tail), dequeue would be O(n). The two-pointer architecture is essential.
The peek (or front) operation returns the element at the front of the queue without removing it. This is trivial with our linked list implementation.
Implementation:
```typescript
/**
 * Returns the element at the front of the queue without removing it.
 * Time Complexity: O(1)
 * Space Complexity: O(1)
 *
 * @returns The element at the front of the queue
 * @throws Error if the queue is empty
 */
front(): T {
  if (this.head === null) {
    throw new Error("Cannot peek at an empty queue");
  }
  return this.head.data;
}

// Alias for front()
peek(): T {
  return this.front();
}

/**
 * Returns the element at the rear of the queue without removing it.
 * Time Complexity: O(1) - thanks to our tail pointer!
 *
 * Note: This is a bonus operation enabled by our tail pointer.
 * Standard queue ADT doesn't require this, but it's free for us.
 */
rear(): T {
  if (this.tail === null) {
    throw new Error("Cannot peek rear of an empty queue");
  }
  return this.tail.data;
}
```

Complexity: O(1) time and space. We simply return head.data—a direct field access with no traversal.
Bonus rear() operation: Because we maintain a tail pointer, viewing the rear element is also O(1). This isn't part of the standard queue ADT, but it's a free bonus of our implementation. Some applications (like certain scheduling algorithms) benefit from knowing both ends.
Peek/front is read-only—it doesn't modify the queue structure at all. This makes it safe to call repeatedly without affecting queue state. It's useful for conditional logic: 'process next item only if it meets criteria' without committing to dequeue.
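The peek-before-dequeue pattern can be sketched as follows. The `Job` type and `drainReady` function are illustrative names, and a minimal queue is included so the snippet runs standalone:

```typescript
interface Job {
  readyAt: number; // time at which this job may be processed
  name: string;
}

class MiniNode<T> {
  next: MiniNode<T> | null = null;
  constructor(public data: T) {}
}

// Minimal linked queue with head + tail pointers (same design as this page).
class MiniQueue<T> {
  private head: MiniNode<T> | null = null;
  private tail: MiniNode<T> | null = null;

  enqueue(element: T): void {
    const n = new MiniNode(element);
    if (this.tail !== null) this.tail.next = n;
    this.tail = n;
    if (this.head === null) this.head = n;
  }

  dequeue(): T {
    if (this.head === null) throw new Error("empty");
    const data = this.head.data;
    this.head = this.head.next;
    if (this.head === null) this.tail = null;
    return data;
  }

  front(): T {
    if (this.head === null) throw new Error("empty");
    return this.head.data;
  }

  isEmpty(): boolean {
    return this.head === null;
  }
}

// Dequeue jobs only while the front job is ready; peek lets us inspect
// the front without committing to removal.
function drainReady(q: MiniQueue<Job>, now: number): string[] {
  const done: string[] = [];
  while (!q.isEmpty() && q.front().readyAt <= now) {
    done.push(q.dequeue().name); // safe: we already checked the front
  }
  return done;
}
```

A job whose criterion isn't met stays at the front, untouched, preserving FIFO order for the next call.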
These utility operations complete the queue interface. Both are trivially O(1) thanks to our maintained state.
isEmpty(): Returns whether the queue contains no elements.
```typescript
/**
 * Checks if the queue contains no elements.
 * Time Complexity: O(1)
 * Space Complexity: O(1)
 *
 * @returns true if the queue is empty, false otherwise
 */
isEmpty(): boolean {
  // Any of these would work due to our invariants:
  // return this.head === null;
  // return this.tail === null;
  // return this.count === 0;

  // We use head for clarity (front of queue is most intuitive)
  return this.head === null;
}

/**
 * Returns the number of elements in the queue.
 * Time Complexity: O(1) - we maintain a counter
 * Space Complexity: O(1)
 *
 * @returns The number of elements currently in the queue
 */
size(): number {
  return this.count;
}

// Alternative: If we didn't maintain count, size() would be O(n):
// let count = 0;
// let current = this.head;
// while (current !== null) {
//   count++;
//   current = current.next;
// }
// return count;
```

Why maintain a count?
Without a count field, size() would require traversing the entire list—O(n). By incrementing on enqueue and decrementing on dequeue, we get O(1) size queries at the cost of one extra integer field and a single increment or decrement per mutating operation.
This trade-off is almost always worthwhile. Size queries are common (for logging, monitoring, capacity decisions), and the overhead is negligible.
The count field is technically redundant—we could compute it by traversing. But maintaining it gives us O(1) size queries AND an extra invariant to check during debugging. If count doesn't match the actual node count, something is wrong. Redundant state, properly maintained, catches bugs earlier.
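That debugging idea can be sketched as a standalone check. The `checkCount` helper and `DbgNode` class are assumed names for illustration—a function like this could live behind a debug flag in a real implementation:

```typescript
class DbgNode<T> {
  next: DbgNode<T> | null = null;
  constructor(public data: T) {}
}

// Recomputes the node count by traversal and compares it with the
// maintained counter. A mismatch means an invariant was violated.
function checkCount<T>(head: DbgNode<T> | null, count: number): boolean {
  let actual = 0;
  for (let cur = head; cur !== null; cur = cur.next) {
    actual++;
  }
  return actual === count;
}

// Build a 3-node chain by hand to exercise the check.
const a = new DbgNode(1);
const b = new DbgNode(2);
const c = new DbgNode(3);
a.next = b;
b.next = c;
```

The check itself is O(n), so it belongs in tests or assertions, not on the hot path—the whole point of the counter is to avoid this traversal in production.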
Here's the complete, production-ready implementation combining all operations with proper documentation, error handling, and utility methods:
```typescript
/**
 * A node in the linked queue.
 * Each node holds data and a reference to the next node.
 */
class QueueNode<T> {
  data: T;
  next: QueueNode<T> | null;

  constructor(data: T) {
    this.data = data;
    this.next = null;
  }
}

/**
 * A queue implementation using a singly linked list with tail pointer.
 * All operations are O(1) time complexity.
 */
class LinkedQueue<T> {
  private head: QueueNode<T> | null = null;
  private tail: QueueNode<T> | null = null;
  private count: number = 0;

  /** Add element to the rear of the queue. O(1) */
  enqueue(element: T): void {
    const newNode = new QueueNode<T>(element);
    if (this.tail !== null) {
      this.tail.next = newNode;
    }
    this.tail = newNode;
    if (this.head === null) {
      this.head = newNode;
    }
    this.count++;
  }

  /** Remove and return element from the front. O(1) */
  dequeue(): T {
    if (this.head === null) {
      throw new Error("Queue is empty");
    }
    const data = this.head.data;
    this.head = this.head.next;
    if (this.head === null) {
      this.tail = null;
    }
    this.count--;
    return data;
  }

  /** View front element without removing. O(1) */
  front(): T {
    if (this.head === null) {
      throw new Error("Queue is empty");
    }
    return this.head.data;
  }

  /** Alias for front() */
  peek(): T {
    return this.front();
  }

  /** Check if queue is empty. O(1) */
  isEmpty(): boolean {
    return this.head === null;
  }

  /** Get number of elements. O(1) */
  size(): number {
    return this.count;
  }

  /** Remove all elements. O(1) */
  clear(): void {
    this.head = null;
    this.tail = null;
    this.count = 0;
  }

  /** Convert to array (front to rear). O(n) */
  toArray(): T[] {
    const result: T[] = [];
    let current = this.head;
    while (current !== null) {
      result.push(current.data);
      current = current.next;
    }
    return result;
  }

  /** String representation. O(n) */
  toString(): string {
    return `Queue[${this.toArray().join(" -> ")}]`;
  }
}
```

This implementation handles all edge cases, maintains invariants correctly, provides useful utility methods, and achieves O(1) for all core operations.
You can use this as a template or reference for your own queue implementations.
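Here is a quick usage sketch showing FIFO order end to end. A condensed copy of the `LinkedQueue` class is included so the snippet runs on its own; in your own code you would simply import or reuse the full class:

```typescript
// Condensed copy of the LinkedQueue from this page, for a self-contained demo.
class QNode<T> {
  next: QNode<T> | null = null;
  constructor(public data: T) {}
}

class LinkedQueue<T> {
  private head: QNode<T> | null = null;
  private tail: QNode<T> | null = null;
  private count = 0;

  enqueue(element: T): void {
    const n = new QNode(element);
    if (this.tail !== null) this.tail.next = n;
    this.tail = n;
    if (this.head === null) this.head = n;
    this.count++;
  }

  dequeue(): T {
    if (this.head === null) throw new Error("Queue is empty");
    const data = this.head.data;
    this.head = this.head.next;
    if (this.head === null) this.tail = null; // empty-queue transition
    this.count--;
    return data;
  }

  front(): T {
    if (this.head === null) throw new Error("Queue is empty");
    return this.head.data;
  }

  isEmpty(): boolean {
    return this.head === null;
  }

  size(): number {
    return this.count;
  }
}

// FIFO in action: elements leave in the order they arrived.
const demo = new LinkedQueue<string>();
demo.enqueue("first");
demo.enqueue("second");
demo.enqueue("third");

const served: string[] = [];
while (!demo.isEmpty()) {
  served.push(demo.dequeue()); // "first", then "second", then "third"
}
```

Enqueuing again after the queue drains exercises the empty-queue transition in dequeue—the step that resets tail to null so the next enqueue correctly re-seeds both pointers.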
We've implemented and analyzed the core queue operations in detail. Let's consolidate the key insights:
| Operation | Time | Space | Notes |
|---|---|---|---|
| enqueue(element) | O(1) | O(1) | Append at tail, update pointers |
| dequeue() | O(1) | O(1) | Remove head, update pointers |
| front() / peek() | O(1) | O(1) | Return head.data |
| isEmpty() | O(1) | O(1) | Check if head is null |
| size() | O(1) | O(1) | Return maintained count |
| clear() | O(1) | O(1) | Set pointers to null |
What's next:
With the core operations mastered, we'll examine why the tail pointer is not just useful but essential for practical queue implementations. We'll explore what happens without it and understand the precise conditions under which O(1) enqueue is achievable.
You now have production-ready implementations of enqueue and dequeue with rigorous complexity analysis. These implementations handle all edge cases correctly and maintain proper invariants. You can implement, analyze, and debug linked queue operations with confidence.