In any hierarchical structure—whether an organizational chart, a file system directory, or a family tree—understanding where an element sits within the hierarchy is fundamental. How many levels down from the top is this employee? How many folders deep is this file? How many generations separate this ancestor from this descendant?
In tree data structures, we formalize this notion of "how far down" with a precise measurement: the depth of a node. This seemingly simple concept underpins nearly every algorithm that operates on trees, from basic traversals to sophisticated balancing operations in self-balancing trees.
By the end of this page, you will understand the formal definition of node depth, be able to compute depth for any node in any tree, and recognize how depth drives algorithmic complexity analysis. You'll see why mastering this simple measurement is essential for reasoning about tree performance.
Depth measures how far a node is from the root of the tree. Specifically:
The depth of a node is the number of edges on the path from the root to that node.
This definition is elegant in its simplicity, yet it carries profound implications for how we analyze tree algorithms.
We count edges, not nodes. The path from the root to itself contains zero edges, so the root has depth 0. A child of the root is connected by one edge, so it has depth 1. This edge-counting convention is nearly universal in computer science literature, though some sources count nodes instead (making the root depth 1). We use the edge-counting convention throughout this course.
Key properties of depth:
The root always has depth 0 — There are no edges between the root and itself.
Depth is non-negative — No node can have negative depth since paths cannot have negative edges.
Depth increases by exactly 1 for each parent-to-child edge — If a node has depth d, all its children have depth d + 1.
Depth is unique for each node — Every node has exactly one path from the root (a fundamental property of trees), so depth is well-defined.
Depth defines levels — All nodes at the same depth form a "level" of the tree (we'll explore this in detail later).
| Node Position | Number of Edges from Root | Depth |
|---|---|---|
| Root node | 0 | 0 |
| Child of root | 1 | 1 |
| Grandchild of root | 2 | 2 |
| Great-grandchild of root | 3 | 3 |
| Node k edges from root | k | k |
Understanding depth becomes intuitive when we visualize it. Consider the following binary tree structure and observe how depth changes as we move down the hierarchy:
```
              A              (depth 0)
             / \
            /   \
           B     C           (depth 1)
          / \     \
         D   E     F         (depth 2)
        /         / \
       G         H   I       (depth 3)

Depth Breakdown:
─────────────────────────────────────────
Depth 0: [A]         → 1 node (root)
Depth 1: [B, C]      → 2 nodes
Depth 2: [D, E, F]   → 3 nodes
Depth 3: [G, H, I]   → 3 nodes
─────────────────────────────────────────

Path examples:
• Path to G: A → B → D → G   (3 edges, depth = 3)
• Path to F: A → C → F       (2 edges, depth = 2)
• Path to B: A → B           (1 edge,  depth = 1)
• Path to A: A               (0 edges, depth = 0)
```

Observations from the visualization:
Depth stratifies the tree horizontally — All nodes at the same depth appear at the same "layer" when the tree is drawn with the root at the top.
Siblings share the same depth — Nodes B and C are both children of A, so both have depth 1. Nodes D and E are both children of B, so both have depth 2.
Depth reveals ancestry — If node X has depth d, then all ancestors of X have depths 0, 1, 2, ..., d-1, and the root (depth 0) is always an ancestor.
The path length equals the depth — To reach node G, we traverse 3 edges (A→B, B→D, D→G), confirming G's depth is 3.
Think of depth like generations in a family tree. The founding ancestor (root) is generation 0. Their children are generation 1. Grandchildren are generation 2. The "depth" of a person is how many generations they are removed from the founder. This analogy helps when reasoning about tree algorithms that process nodes by generation (level-order traversal).
Computing the depth of a node requires traversing from the root to that node, counting edges along the way. There are two primary approaches: recursive definition and iterative traversal.
Depth has an elegant recursive definition:
depth(node) = 0 if node is the root
depth(node) = 1 + depth(parent(node)) otherwise
This says: the root has depth 0, and every other node has depth one greater than its parent. This recursive structure mirrors the tree's own recursive nature.
```typescript
// Tree node definition
interface TreeNode<T> {
  value: T;
  parent: TreeNode<T> | null; // Parent pointer enables upward traversal
  children: TreeNode<T>[];
}

/**
 * Computes the depth of a node by traversing up to the root.
 *
 * Time Complexity: O(d) where d is the depth of the node
 * Space Complexity: O(1) - iterative version uses constant space
 *
 * @param node - The node whose depth we want to compute
 * @returns The depth of the node (0 for root)
 */
function computeDepth<T>(node: TreeNode<T>): number {
  let depth = 0;
  let current: TreeNode<T> | null = node;

  // Traverse upward until we reach the root
  while (current.parent !== null) {
    depth++;
    current = current.parent;
  }

  return depth;
}

/**
 * Recursive version - more elegant but uses O(d) stack space
 */
function computeDepthRecursive<T>(node: TreeNode<T>): number {
  // Base case: root has no parent
  if (node.parent === null) {
    return 0;
  }

  // Recursive case: depth is 1 + parent's depth
  return 1 + computeDepthRecursive(node.parent);
}

// Example usage
function demonstrateDepthComputation() {
  // Build a small tree: A
  //                    / \
  //                   B   C
  //                  /
  //                 D
  const nodeA: TreeNode<string> = { value: 'A', parent: null, children: [] };
  const nodeB: TreeNode<string> = { value: 'B', parent: nodeA, children: [] };
  const nodeC: TreeNode<string> = { value: 'C', parent: nodeA, children: [] };
  const nodeD: TreeNode<string> = { value: 'D', parent: nodeB, children: [] };

  nodeA.children = [nodeB, nodeC];
  nodeB.children = [nodeD];

  console.log(`Depth of A: ${computeDepth(nodeA)}`); // 0
  console.log(`Depth of B: ${computeDepth(nodeB)}`); // 1
  console.log(`Depth of C: ${computeDepth(nodeC)}`); // 1
  console.log(`Depth of D: ${computeDepth(nodeD)}`); // 2
}
```

The above implementation assumes nodes have parent pointers. Many tree implementations omit parent pointers to save memory. Without parent pointers, computing depth requires a top-down search from the root to find the target node, making it O(n) in the worst case rather than O(d). Alternatively, depth can be tracked as a parameter during tree traversal.
Many practical tree implementations—especially binary trees used in algorithms—don't maintain parent pointers. In these cases, depth is typically computed or tracked during traversal rather than queried for individual nodes.
There are two common patterns:
The most common approach passes the current depth as a parameter during recursive traversal. This pattern appears constantly in tree algorithms:
```typescript
interface BinaryTreeNode<T> {
  value: T;
  left: BinaryTreeNode<T> | null;
  right: BinaryTreeNode<T> | null;
}

/**
 * Traverses the tree, tracking depth at each node.
 * This pattern is used extensively in tree algorithms.
 *
 * Time Complexity: O(n) - visits every node once
 * Space Complexity: O(h) - recursive call stack, where h is tree height
 */
function traverseWithDepth<T>(
  node: BinaryTreeNode<T> | null,
  depth: number = 0,
  process: (node: BinaryTreeNode<T>, depth: number) => void
): void {
  if (node === null) return;

  // Process current node with its depth
  process(node, depth);

  // Recurse to children with depth + 1
  traverseWithDepth(node.left, depth + 1, process);
  traverseWithDepth(node.right, depth + 1, process);
}

/**
 * Example: Find the depth of all nodes with a specific value.
 */
function findDepthsOfValue<T>(
  root: BinaryTreeNode<T> | null,
  targetValue: T
): number[] {
  const depths: number[] = [];

  traverseWithDepth(root, 0, (node, depth) => {
    if (node.value === targetValue) {
      depths.push(depth);
    }
  });

  return depths;
}

/**
 * Example: Group nodes by depth (compute all levels).
 */
function groupNodesByDepth<T>(
  root: BinaryTreeNode<T> | null
): Map<number, T[]> {
  const levels = new Map<number, T[]>();

  traverseWithDepth(root, 0, (node, depth) => {
    if (!levels.has(depth)) {
      levels.set(depth, []);
    }
    levels.get(depth)!.push(node.value);
  });

  return levels;
}
```

When you need to find the depth of a specific node (identified by reference or value) without parent pointers:
```typescript
/**
 * Finds the depth of a target node by searching from the root.
 * Returns -1 if the node is not found in the tree.
 *
 * Time Complexity: O(n) worst case - may need to visit all nodes
 * Space Complexity: O(h) - recursive call stack depth
 */
function findNodeDepth<T>(
  root: BinaryTreeNode<T> | null,
  target: BinaryTreeNode<T>,
  currentDepth: number = 0
): number {
  // Base case: null node
  if (root === null) {
    return -1; // Node not found in this subtree
  }

  // Found the target node
  if (root === target) {
    return currentDepth;
  }

  // Search in left subtree
  const leftResult = findNodeDepth(root.left, target, currentDepth + 1);
  if (leftResult !== -1) {
    return leftResult;
  }

  // Search in right subtree
  return findNodeDepth(root.right, target, currentDepth + 1);
}

/**
 * Find depth by value (finds first occurrence).
 * Useful when nodes are identified by value rather than reference.
 */
function findDepthByValue<T>(
  root: BinaryTreeNode<T> | null,
  targetValue: T,
  currentDepth: number = 0
): number {
  if (root === null) {
    return -1;
  }

  if (root.value === targetValue) {
    return currentDepth;
  }

  const leftResult = findDepthByValue(root.left, targetValue, currentDepth + 1);
  if (leftResult !== -1) {
    return leftResult;
  }

  return findDepthByValue(root.right, targetValue, currentDepth + 1);
}
```

In practice, if you need node depths frequently, it's more efficient to compute them once during a single traversal and store them (in a Map or as a node property) rather than repeatedly searching from the root. The choice between parent pointers, parameter passing, and caching depends on your access patterns and memory constraints.
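As a minimal sketch of that caching idea (the `buildDepthCache` name is assumed here for illustration and is not part of the implementations above), a single pass with `traverseWithDepth` can record every node's depth in a `Map` keyed by node reference:

```typescript
/**
 * Illustrative sketch: one O(n) traversal records every node's depth so that
 * later lookups are O(1) map reads instead of O(n) searches from the root.
 * Reuses BinaryTreeNode and traverseWithDepth defined earlier on this page.
 */
function buildDepthCache<T>(
  root: BinaryTreeNode<T> | null
): Map<BinaryTreeNode<T>, number> {
  const depthCache = new Map<BinaryTreeNode<T>, number>();

  traverseWithDepth(root, 0, (node, depth) => {
    depthCache.set(node, depth);
  });

  return depthCache;
}

// Usage sketch: build once, query many times.
// const depths = buildDepthCache(root);
// depths.get(someNode); // a number, or undefined if the node is not in the tree
```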
Understanding depth is crucial for analyzing the complexity of tree algorithms. The depth of nodes directly affects:
For many tree operations, the time required is proportional to the depth of the target node:
Accessing a node: Following a path from root requires visiting each node on the path. For a node at depth d, this takes O(d) time.
Insertion (in many tree types): Finding the correct position often requires traversing from root to a leaf. Insertion at depth d takes O(d) time.
Deletion: Similarly, deleting a node at depth d requires O(d) time to locate it (plus any rebalancing).
| Operation | Time Complexity | Why Depth Matters |
|---|---|---|
| Find node at depth d | O(d) | Must traverse d edges from root |
| Insert at depth d | O(d) | Must navigate to insertion point |
| Path from root to node | O(d) | Path length equals depth |
| All ancestors of node | O(d) | Node at depth d has exactly d ancestors |
| Compute depth (with parent ptr) | O(d) | Must traverse up d edges |
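To make the O(d) path-following cost in the table concrete, here is a small illustrative sketch; the `walkPath` helper and its `'L'`/`'R'` direction encoding are assumptions for demonstration, not part of the course code:

```typescript
/**
 * Illustrative sketch: follow a root-to-node path described as a sequence of
 * left/right moves. The loop runs once per edge, so reaching a node at depth d
 * costs O(d) steps.
 */
function walkPath<T>(
  root: BinaryTreeNode<T> | null,
  directions: ('L' | 'R')[]
): BinaryTreeNode<T> | null {
  let current = root;

  for (const step of directions) {
    if (current === null) return null; // Path leads outside the tree
    current = step === 'L' ? current.left : current.right;
  }

  return current; // If non-null, this node sits at depth = directions.length
}

// Example: walkPath(root, ['L', 'L']) traverses 2 edges to reach a depth-2 node.
```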
Depth directly impacts the space complexity of recursive tree algorithms. Each recursive call adds a frame to the call stack, and the maximum call stack depth equals the maximum node depth (tree height).
For a tree with height h (the maximum depth of any node), a recursive traversal can require up to O(h) stack space before the recursion unwinds.
In languages without tail call optimization, recursive algorithms on deep trees can cause stack overflow. For a degenerate tree with 1 million nodes, the recursive call stack would be 1 million frames deep—far exceeding typical stack size limits. This is why iterative implementations using explicit stacks are sometimes necessary for production code.
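As a sketch of that iterative alternative, the depth-tracking traversal from earlier can be rewritten with an explicit stack of (node, depth) pairs; the `traverseWithDepthIterative` name is assumed here for illustration:

```typescript
/**
 * Iterative version of the depth-tracking traversal: an explicit stack of
 * (node, depth) pairs replaces the recursive call stack, so very deep trees
 * cannot overflow the language's call stack.
 */
function traverseWithDepthIterative<T>(
  root: BinaryTreeNode<T> | null,
  process: (node: BinaryTreeNode<T>, depth: number) => void
): void {
  if (root === null) return;

  const stack: Array<[BinaryTreeNode<T>, number]> = [[root, 0]];

  while (stack.length > 0) {
    const [node, depth] = stack.pop()!;
    process(node, depth);

    // Children sit exactly one level deeper than their parent
    if (node.right) stack.push([node.right, depth + 1]);
    if (node.left) stack.push([node.left, depth + 1]);
  }
}
```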
The relationship between depth and performance explains why balanced trees (AVL, Red-Black, B-trees) are so important:
| Tree Shape | Maximum Depth | Operation Time | Space for Recursion |
|---|---|---|---|
| Balanced binary tree | O(log n) | O(log n) | O(log n) |
| Degenerate (linked list) | O(n) | O(n) | O(n) |
Balanced trees guarantee that no node has excessive depth, ensuring efficient operations. This is the fundamental motivation for self-balancing tree structures, which we'll explore in later chapters.
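As a rough, illustrative comparison of the two table rows (the helper names below are assumptions; the formulas describe a strict single-child chain and a complete binary tree, respectively):

```typescript
/**
 * Illustrative sketch: maximum node depth in a degenerate chain versus a
 * complete (well-balanced) binary tree holding the same number of nodes.
 */
function maxDepthDegenerate(n: number): number {
  return n - 1; // Every node has a single child: one long path from the root
}

function maxDepthBalanced(n: number): number {
  return Math.floor(Math.log2(n)); // Depth grows logarithmically with n
}

// For n = 1,000,000 nodes:
// maxDepthDegenerate(1_000_000) => 999,999 edges from root to the deepest node
// maxDepthBalanced(1_000_000)   => 19 edges from root to the deepest node
```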
The concept of depth applies uniformly across all tree types, but its distribution and bounds vary significantly:
The definition of depth—number of edges from root to node—doesn't change based on tree type. What changes is the range of depths, distribution of nodes across depths, and guarantees about maximum depth. This uniformity makes depth a powerful tool for comparing tree structures.
Depth is a key tool in many tree algorithms. Here are patterns you'll encounter repeatedly:
Many problems require processing nodes level by level (all nodes at depth 0, then all at depth 1, etc.). This is the basis of Breadth-First Search (BFS) on trees:
```typescript
/**
 * Level-order traversal: process nodes by increasing depth.
 * Uses a queue to ensure nodes are processed in order of depth.
 */
function levelOrderTraversal<T>(root: BinaryTreeNode<T> | null): T[][] {
  if (root === null) return [];

  const result: T[][] = [];
  const queue: BinaryTreeNode<T>[] = [root];

  while (queue.length > 0) {
    const levelSize = queue.length;
    const currentLevel: T[] = [];

    // Process all nodes at current depth
    for (let i = 0; i < levelSize; i++) {
      const node = queue.shift()!;
      currentLevel.push(node.value);

      // Add children (next depth) to queue
      if (node.left) queue.push(node.left);
      if (node.right) queue.push(node.right);
    }

    result.push(currentLevel);
  }

  return result;
  // Result: [[depth-0 nodes], [depth-1 nodes], [depth-2 nodes], ...]
}
```

Finding the maximum depth in a tree is a classic problem with an elegant recursive solution:
```typescript
/**
 * Finds the maximum depth (height) of a binary tree.
 *
 * Time Complexity: O(n) - visits every node
 * Space Complexity: O(h) - recursive stack depth
 */
function maxDepth<T>(root: BinaryTreeNode<T> | null): number {
  // Base case: empty tree has depth -1 (or 0, depending on convention)
  if (root === null) {
    return -1; // Using convention: empty tree has depth -1
  }

  // Recursive case: max depth is 1 + max of children's depths
  const leftDepth = maxDepth(root.left);
  const rightDepth = maxDepth(root.right);

  return 1 + Math.max(leftDepth, rightDepth);
}

// Note: Some sources define maxDepth of empty tree as 0.
// Adjust the base case accordingly for your use case.
```

Many problems require finding nodes at specific depths or within depth ranges:
```typescript
/**
 * Collect all nodes at a specific depth.
 */
function getNodesAtDepth<T>(
  root: BinaryTreeNode<T> | null,
  targetDepth: number,
  currentDepth: number = 0
): T[] {
  if (root === null) return [];

  if (currentDepth === targetDepth) {
    return [root.value];
  }

  // Target depth is deeper - recurse to children
  return [
    ...getNodesAtDepth(root.left, targetDepth, currentDepth + 1),
    ...getNodesAtDepth(root.right, targetDepth, currentDepth + 1),
  ];
}

/**
 * Find the minimum depth (shallowest leaf).
 * Useful for quickly determining if tree is balanced.
 */
function minDepth<T>(root: BinaryTreeNode<T> | null): number {
  if (root === null) return -1;

  // If a leaf node
  if (root.left === null && root.right === null) {
    return 0;
  }

  // If only one child exists, go that way
  if (root.left === null) {
    return 1 + minDepth(root.right);
  }
  if (root.right === null) {
    return 1 + minDepth(root.left);
  }

  // Both children exist - take minimum
  return 1 + Math.min(minDepth(root.left), minDepth(root.right));
}
```

We've thoroughly explored the concept of node depth—one of the fundamental measurements in tree data structures. Let's consolidate the key insights:

- Depth counts edges on the unique path from the root, so the root has depth 0 and every child sits exactly one level deeper than its parent.
- With parent pointers, a node's depth can be computed in O(d) time; without them, depth is typically passed as a parameter during traversal or cached in a single O(n) pass.
- Operations that follow a root-to-node path take time proportional to the node's depth, and recursive algorithms consume stack space proportional to the deepest node.
- Balanced trees keep the maximum depth at O(log n), which is what makes their operations efficient.
What's next:
We've established depth as a measure from the root downward. In the next page, we'll explore height—the complementary measurement that looks from a node downward to its deepest descendant. Together, depth and height provide a complete picture of a node's position in the tree hierarchy.
You now understand how depth measures a node's position relative to the root, how to compute depth algorithmically, and how depth impacts the complexity of tree operations. This foundation is essential for analyzing any tree algorithm and understanding why balanced trees are so valuable.