You are given the root node of a binary tree. Your task is to calculate the arithmetic mean (average) of all node values at each depth level of the tree.
For a tree with L levels (the root is at level 0 and the deepest nodes are at level L − 1), you should return an array of L floating-point numbers. The element at index i represents the mean value of all nodes at level i.
Traverse the tree level-by-level (breadth-first), compute the sum of all node values at each level, divide by the count of nodes at that level, and store the resulting mean. Return the complete array of level means from the root level (level 0) down to the deepest level.
Note: Results within 10⁻⁵ of the expected answer are considered correct due to floating-point precision considerations.
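The level-by-level procedure described above can be sketched with a standard breadth-first traversal. This is one possible implementation, not the only one; the `TreeNode` class and `average_of_levels` name are illustrative conventions, and a depth-first traversal that accumulates per-level sums and counts would work equally well.

```python
from collections import deque

class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def average_of_levels(root):
    """Return the mean node value at each level, from root down."""
    means = []
    queue = deque([root]) if root else deque()
    while queue:
        # All nodes currently in the queue belong to the same level.
        level_size = len(queue)
        level_sum = 0
        for _ in range(level_size):
            node = queue.popleft()
            level_sum += node.val
            # Enqueue children; they form the next level.
            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)
        means.append(level_sum / level_size)
    return means
```

For the tree `[3,9,20,null,null,15,7]`, this returns `[3.0, 14.5, 11.0]`, matching the first example below.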
Example 1:
Input: root = [3,9,20,null,null,15,7]
Output: [3.0, 14.5, 11.0]
Explanation: At level 0, there is only the root node with value 3, so the mean is 3.0. At level 1, nodes 9 and 20 give us (9 + 20) / 2 = 14.5. At level 2, nodes 15 and 7 give us (15 + 7) / 2 = 11.0.
Example 2:
Input: root = [3,9,20,15,7]
Output: [3.0, 14.5, 11.0]
Explanation: Level 0 contains node 3 (mean = 3.0). Level 1 contains nodes 9 and 20 (mean = 14.5). Level 2 contains nodes 15 and 7 (mean = 11.0).
Example 3:
Input: root = [1,2,3,4,5,6,7]
Output: [1.0, 2.5, 5.5]
Explanation: This is a complete binary tree. Level 0 has node 1 (mean = 1.0). Level 1 has nodes 2 and 3 (mean = 2.5). Level 2 has nodes 4, 5, 6, and 7 (mean = 5.5).
Constraints: