You have already learned that a stack is one of the most fundamental data structures in computer science—a simple container following the Last-In, First-Out (LIFO) principle. With just push, pop, and peek operations, stacks power everything from expression evaluation to the call stack that executes your programs.
But what happens when we add a single constraint to the stack? What if we demand that the elements in the stack always maintain a specific order—never violating that order, even as elements come and go?
This seemingly small constraint transforms the humble stack into something remarkably powerful: the monotonic stack. This specialized structure enables elegant O(n) solutions to problems that would otherwise require O(n²) brute-force approaches.
By the end of this page, you will understand what makes a stack 'monotonic,' why maintaining order matters, and how this constraint unlocks efficient solutions to a family of important algorithmic problems. You'll develop the intuition to recognize when a monotonic stack is the right tool for the job.
The term monotonic comes from mathematics, describing a function or sequence that is entirely non-increasing or non-decreasing. A monotonically increasing sequence only goes up (or stays flat); it never decreases. A monotonically decreasing sequence only goes down (or stays flat); it never increases.
When we apply this property to a stack, we get a monotonic stack: a stack where all elements, from bottom to top, maintain a monotonic order. This means:

- In a monotonically increasing stack, each element is greater than or equal to the one below it: the smallest value sits at the bottom, the largest at the top.
- In a monotonically decreasing stack, each element is less than or equal to the one below it: the largest value sits at the bottom, the smallest at the top.
We can further classify monotonic stacks as strictly monotonic (no equal elements allowed) or non-strictly monotonic (equal elements permitted). The choice depends on the problem requirements. For instance, if we're finding the next strictly greater element, equal elements must not resolve each other, so duplicates wait on the stack together: we use a non-strictly decreasing stack. If equal elements do count as an answer ('greater or equal'), we pop on equality as well, and the stack stays strictly decreasing.
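To make that concrete, here is a minimal sketch (ours, not from the lesson; Python, with indices stored on the stack) showing that strictness comes down to a single comparison operator in the pop condition:

```python
# A sketch of how one comparison operator controls strictness. Both passes
# use a decreasing stack to answer a "next greater" style question.

def next_strictly_greater(arr):
    """Pop on '<': equal values wait together, so duplicates are permitted
    on the stack (non-strictly decreasing)."""
    result = [-1] * len(arr)
    stack = []  # indices of elements still waiting for an answer
    for i, x in enumerate(arr):
        while stack and arr[stack[-1]] < x:
            result[stack.pop()] = x
        stack.append(i)
    return result

def next_greater_or_equal(arr):
    """Pop on '<=': equal values resolve each other, so the stack stays
    strictly decreasing (no duplicates)."""
    result = [-1] * len(arr)
    stack = []
    for i, x in enumerate(arr):
        while stack and arr[stack[-1]] <= x:
            result[stack.pop()] = x
        stack.append(i)
    return result

print(next_strictly_greater([2, 2, 3]))  # [3, 3, -1]
print(next_greater_or_equal([2, 2, 3]))  # [2, 3, -1]
```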
At first glance, maintaining a monotonic property seems like an arbitrary constraint. Why would we deliberately limit ourselves? The answer lies in what this constraint reveals.
Consider a common problem pattern: for each element in an array, find the next greater element—the first element to its right that is strictly larger. The brute-force approach examines every element to the right of each position, yielding O(n²) complexity.
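For a concrete baseline, here is what that brute force might look like (a quick Python sketch of ours):

```python
# Brute-force "next greater element": for each position, scan everything
# to its right and take the first larger value. O(n^2) worst case
# (e.g., a strictly decreasing array, where no inner scan stops early).
def next_greater_brute_force(arr):
    n = len(arr)
    result = [-1] * n  # -1 means "no greater element exists to the right"
    for i in range(n):
        for j in range(i + 1, n):
            if arr[j] > arr[i]:
                result[i] = arr[j]
                break  # the first match is the nearest one
    return result

print(next_greater_brute_force([4, 2, 5, 3, 7]))  # [5, 5, 7, 7, -1]
```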
But observe what happens when we process elements in order while maintaining a monotonically decreasing stack:

- Whenever the incoming element is larger than the stack's top, the top element has just met its next greater element: the incoming one. We pop it and record the answer.
- We repeat until the top is at least as large as the incoming element (or the stack is empty), then push the incoming element.
- No element is ever compared against every later element; it simply waits on the stack until the first larger value arrives.
The monotonic structure is not arbitrary—it's precisely what we need to efficiently answer relationship queries between elements.
The monotonic property ensures that:

- every element still on the stack is still waiting for its next greater element, and
- when a larger element finally arrives, all the elements it resolves sit at the top of the stack, ready to be popped one after another.
This transforms a quadratic problem into a linear one—a dramatic improvement that becomes critical as input sizes grow.
Imagine you're tracking stock prices over time, and for each day, you want to know when the price will next exceed today's price. You could check every future day (O(n²)), or you could maintain a waiting list of 'unresolved' days. When a new high price arrives, all the days with lower prices can be resolved simultaneously. The monotonic stack is that waiting list, organized so resolutions happen in the right order.
An invariant is a property that remains true throughout the execution of an algorithm. For monotonic stacks, the invariant is simple but powerful:
At all times, the stack elements maintain monotonic order from bottom to top.
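During development it can help to assert this invariant after every push. A tiny checker, as a sketch (ours; Python, decreasing variant, with the stack stored as a list whose index 0 is the bottom):

```python
def holds_decreasing_invariant(stack):
    """True if values never increase from bottom (index 0) to top."""
    return all(stack[i] >= stack[i + 1] for i in range(len(stack) - 1))

assert holds_decreasing_invariant([7, 5, 3, 3])   # valid: never increases upward
assert not holds_decreasing_invariant([7, 5, 8])  # 8 sitting above 5 breaks the order
```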
To maintain this invariant, we must modify our push operation. Before pushing a new element, we pop all elements that would violate the monotonic order if the new element were added. This 'cleaning' process is essential—and it's where the magic happens.
```
// Monotonically INCREASING stack push operation
// (each element is ≥ the one below it)
function pushIncreasing(stack, newElement):
    // Pop all elements GREATER than newElement
    // (they would violate the increasing order if we kept them below newElement)
    while stack is not empty AND stack.top() > newElement:
        popped = stack.pop()
        // At this moment, 'newElement' is the "next smaller" for 'popped'
        recordNextSmaller(popped, newElement)
    stack.push(newElement)

// Monotonically DECREASING stack push operation
// (each element is ≤ the one below it)
function pushDecreasing(stack, newElement):
    // Pop all elements SMALLER than newElement
    // (they would violate the decreasing order if we kept them below newElement)
    while stack is not empty AND stack.top() < newElement:
        popped = stack.pop()
        // At this moment, 'newElement' is the "next greater" for 'popped'
        recordNextGreater(popped, newElement)
    stack.push(newElement)
```

Notice the symmetry:

- The increasing stack pops elements greater than the newcomer, and each popped element thereby learns its next smaller element.
- The decreasing stack pops elements smaller than the newcomer, and each popped element thereby learns its next greater element.
- In both cases, the answer for a popped element is exactly the incoming element that forced the pop.
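The same two operations as runnable code, in a minimal Python sketch (ours): plain lists serve as stacks, and the callback parameters are hypothetical stand-ins for recordNextSmaller / recordNextGreater:

```python
def push_increasing(stack, new_element, record_next_smaller):
    # Pop elements greater than the newcomer; each one has just met
    # its next smaller element (the newcomer itself).
    while stack and stack[-1] > new_element:
        record_next_smaller(stack.pop(), new_element)
    stack.append(new_element)

def push_decreasing(stack, new_element, record_next_greater):
    # Pop elements smaller than the newcomer; each one has just met
    # its next greater element (the newcomer itself).
    while stack and stack[-1] < new_element:
        record_next_greater(stack.pop(), new_element)
    stack.append(new_element)

stack = []
for x in [3, 1, 4, 2]:
    push_decreasing(stack, x,
                    lambda popped, by: print(f"next greater of {popped} is {by}"))
# Prints:
#   next greater of 1 is 4
#   next greater of 3 is 4
```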
This relationship between the stack type and the answers it provides is fundamental. The stack type determines what question you're answering:
| Stack Type | What We Pop | What Incoming Element Represents |
|---|---|---|
| Monotonically Increasing | Elements > incoming | Next Smaller Element for popped items |
| Monotonically Decreasing | Elements < incoming | Next Greater Element for popped items |
One of the most common sources of errors is confusing which stack type solves which problem. Remember: the answer for a popped element is always the element that caused it to be popped. An increasing stack pops on smaller elements (so it finds next smaller); a decreasing stack pops on greater elements (so it finds next greater).
Let's walk through a concrete example to solidify understanding. Suppose we want to find the next greater element for each position in the array [4, 2, 5, 3, 7]. We'll use a monotonically decreasing stack (because we're looking for 'next greater').
We'll track both values and indices because problems often require knowing where the answer is, not just what it is.
| Step | Current Element | Stack Before (values) | Action | Stack After (values) | Answers Found |
|---|---|---|---|---|---|
| 1 | 4 (index 0) | [] | Push 4 | [4] | None |
| 2 | 2 (index 1) | [4] | 2 < 4, Push 2 | [4, 2] | None |
| 3 | 5 (index 2) | [4, 2] | 5 > 2, Pop 2 → answer for index 1 is 5; 5 > 4, Pop 4 → answer for index 0 is 5; Push 5 | [5] | NGE[1]=5, NGE[0]=5 |
| 4 | 3 (index 3) | [5] | 3 < 5, Push 3 | [5, 3] | None |
| 5 | 7 (index 4) | [5, 3] | 7 > 3, Pop 3 → answer for index 3 is 7; 7 > 5, Pop 5 → answer for index 2 is 7; Push 7 | [7] | NGE[3]=7, NGE[2]=7 |
| End | — | [7] | Input exhausted; 7 (index 4) never finds a greater element and is drained with no answer | [] | NGE[4]=-1 (none) |
Final Result: [5, 5, 7, 7, -1]
Observe the critical patterns:

- The stack's values always read in decreasing order from bottom to top; the invariant never breaks because violating elements are popped before each push.
- Answers are discovered at pop time: the incoming element that forces a pop is the answer for the element being popped.
- Elements still on the stack when the input is exhausted have no next greater element and receive -1.
In practice, we typically store indices on the stack rather than values. This allows us to access the value via arr[stack.top()] while also knowing the position. Storing indices is essential when you need to calculate distances (e.g., 'how many days until a warmer temperature?') or when building result arrays.
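Putting those pieces together, the index-storing version of the next-greater pass might look like this (a Python sketch of ours; the distance array shows why indices matter for 'how many days until warmer' style questions):

```python
def next_greater_with_indices(arr):
    """For each position, find the next greater value (-1 if none) and
    how many steps to the right it sits (0 if none)."""
    n = len(arr)
    values = [-1] * n  # next greater VALUE for each position
    dist = [0] * n     # distance to that value, in positions
    stack = []         # indices; arr[stack] is decreasing from bottom to top
    for i in range(n):
        while stack and arr[stack[-1]] < arr[i]:
            j = stack.pop()
            values[j] = arr[i]  # arr[i] is the first larger value right of j
            dist[j] = i - j     # computable only because we stored the index
        stack.append(i)
    return values, dist

values, dist = next_greater_with_indices([4, 2, 5, 3, 7])
print(values)  # [5, 5, 7, 7, -1] -- matches the walkthrough above
print(dist)    # [2, 1, 2, 1, 0]
```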
One might look at the nested while loop inside the main for loop and worry about O(n²) complexity. But this concern misses the crucial insight revealed by amortized analysis.
Amortized analysis examines the total work done across all operations, not just the worst case of a single operation. For monotonic stacks, the key observation is:
Every element is pushed onto the stack exactly once and popped from the stack at most once. Regardless of how the while loop behaves on individual iterations, the total number of pushes and pops across the entire algorithm is at most 2n. Therefore, the amortized time complexity is O(n).
Let's prove this more rigorously:

- Each of the n elements is pushed exactly once, when the scan reaches it: n pushes in total.
- Each element can be popped at most once, because a popped element never returns to the stack: at most n pops in total.
- Total: n + n = 2n stack operations → O(n).

This is an example of aggregate analysis: even though individual iterations might pop many elements (the while loop runs many times), the total across all iterations is bounded.
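To see the 2n bound empirically, a small instrumented run (ours, not part of the proof) counts every push and pop across a full pass:

```python
# Count all stack operations in one next-greater pass and confirm the
# aggregate bound: pushes + pops never exceed 2n.
def count_stack_operations(arr):
    pushes = pops = 0
    stack = []
    for x in arr:
        while stack and stack[-1] < x:
            stack.pop()
            pops += 1
        stack.append(x)
        pushes += 1
    return pushes, pops

for arr in ([4, 2, 5, 3, 7], list(range(1000)), list(range(1000, 0, -1))):
    pushes, pops = count_stack_operations(arr)
    n = len(arr)
    assert pushes + pops <= 2 * n
    print(f"n={n}: {pushes} pushes + {pops} pops = {pushes + pops} <= {2 * n}")
```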
Contrast with Brute Force:
| Approach | Time Complexity | Why |
|---|---|---|
| Brute Force | O(n²) | For each element, scan all elements to its right |
| Monotonic Stack | O(n) | Each element pushed/popped at most once |
The improvement from O(n²) to O(n) is not incremental; it's transformative. For an array of 100,000 elements:

- Brute force: up to 100,000² = 10,000,000,000 (ten billion) comparisons.
- Monotonic stack: at most 2 × 100,000 = 200,000 stack operations.
That's a 50,000x speedup. This is why understanding monotonic stacks matters in competitive programming and production systems alike.
Recognizing when a monotonic stack applies is a crucial skill. Look for these signatures in problem statements:

- "Find the next greater/smaller element" for each position.
- "Find the previous greater/smaller element" to the left of each position.
- "How long until..." questions, such as how many days until a warmer temperature.
- "For each element, find the nearest element to one side" satisfying an ordering condition.
The common thread is relative ordering questions: for each element, you need to find another element that stands in a particular ordering relationship, constrained by position (to the left, to the right, within a window). If you're comparing each element to potentially all others, you likely have suboptimal O(n²) structure. A monotonic stack can collapse this.
Anti-patterns (when monotonic stacks don't apply):

- You need the k-th greater element, not the first one in a given direction.
- The question is global rather than positional: a single max, min, or full sort answers it.
- You need arbitrary range minimum/maximum queries, where structures like segment trees or sparse tables are a better fit.
One profound way to understand monotonic stacks is to think of them as selectively remembering the past.
As we scan through an array left to right, we encounter elements in order. A regular approach might remember everything (an auxiliary array) or nothing (pure streaming). The monotonic stack occupies a middle ground: it remembers exactly the elements that might still matter for future queries.
Why does an element stop mattering? Consider finding the next greater element:

- The moment an element B arrives that is greater than a waiting element A, A's question is answered: B is A's next greater element. A is popped, its answer recorded, and it is never examined again.
- Elements that remain on the stack are precisely those still waiting for a larger value to arrive.
The stack is a compressed history of elements that could still answer future queries. The monotonic property is precisely what determines which elements are 'superseded' and can be discarded.
Think of the stack as a waiting room for elements that haven't found their 'answer' yet. When a new element arrives that IS the answer for some waiting elements, those elements are resolved and leave the waiting room. The order of the waiting room (monotonic) ensures that when an answer arrives, we can efficiently identify all elements it resolves.
This perspective also explains why the algorithm is correct:
Completeness: Every element eventually either gets an answer (when a resolving element arrives) or is determined to have no answer (when it remains on the stack at the end).
Correctness: When an element is popped by incoming element X, X is indeed the first element to satisfy the condition (e.g., being greater), because any previous greater element would have already caused the pop.
Efficiency: We never re-examine elements. Once popped, an element is fully processed.
We've established the conceptual foundation for monotonic stacks. Let's consolidate the key insights:

- A monotonic stack is an ordinary stack plus one invariant: elements stay in sorted order from bottom to top.
- The invariant is maintained by popping violating elements before each push, and each pop produces an answer: the incoming element is the answer for the popped one.
- Increasing stacks find next smaller elements; decreasing stacks find next greater elements.
- Every element is pushed once and popped at most once, so the whole pass is amortized O(n).
- Store indices rather than values when positions or distances matter.
What's Next:
In the next page, we'll dive deeper into the distinction between monotonically increasing and monotonically decreasing stacks. You'll learn precisely when to use each variant, see more examples of both, and develop the muscle memory to choose correctly under exam or interview pressure.
You now understand what a monotonic stack is, why the monotonic property matters, and how it enables O(n) solutions to ordering-relationship problems. You have the conceptual foundation needed to learn the specific variants and classic applications in the following pages.