In predictive analytics and machine learning, evaluating the quality of predictions is crucial for understanding model performance. One of the most fundamental and interpretable metrics for measuring prediction accuracy is the Average Absolute Deviation (AAD) score, which quantifies the average magnitude of errors between predicted and actual values.
Unlike metrics that square the errors (which can over-emphasize large outliers), the AAD treats all errors linearly based on their absolute magnitude. This makes it particularly intuitive and interpretable—the AAD value is expressed in the same units as the original data, representing the average distance between predictions and reality.
Mathematical Definition:
Given a set of n actual values y₁, y₂, ..., yₙ and their corresponding predicted values ŷ₁, ŷ₂, ..., ŷₙ, the Average Absolute Deviation is computed as:
$$AAD = \frac{1}{n} \sum_{i=1}^{n} |y_i - \hat{y}_i|$$
Where:
• n is the number of observations
• yᵢ is the i-th actual (ground truth) value
• ŷᵢ is the i-th predicted value

Key Properties:
• AAD is always non-negative; a score of 0 indicates that every prediction exactly matches its actual value.
• It is expressed in the same units as the original data, so it reads as an average error magnitude.
• Each error contributes linearly to the score, so large outliers are not over-emphasized the way they are by squared-error metrics.
Your Task:
Implement a function that computes the Average Absolute Deviation score between two arrays: one containing the actual (ground truth) values and another containing the predicted values. Both arrays will have the same length and contain floating-point numbers.
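One possible implementation is a direct translation of the formula: sum the absolute differences pairwise and divide by the number of elements. The function name `average_absolute_deviation` is a placeholder; your problem environment may require a different signature.

```python
def average_absolute_deviation(y_true, y_pred):
    """Compute the average absolute deviation between actual and predicted values."""
    if len(y_true) != len(y_pred):
        raise ValueError("y_true and y_pred must have the same length")
    # Sum of |y_i - yhat_i| over all pairs, divided by n
    return sum(abs(actual - predicted)
               for actual, predicted in zip(y_true, y_pred)) / len(y_true)


print(average_absolute_deviation([3.0, -0.5, 2.0, 7.0],
                                 [2.5, 0.0, 2.0, 8.0]))  # 0.5
```

The same quantity is known elsewhere as the mean absolute error, so in practice a library routine such as `sklearn.metrics.mean_absolute_error` computes it too; here the point is to implement it directly.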
Example 1:
y_true = [3.0, -0.5, 2.0, 7.0]
y_pred = [2.5, 0.0, 2.0, 8.0]
Expected output: 0.5

Let's calculate the absolute deviation for each pair:
• |3.0 - 2.5| = 0.5
• |-0.5 - 0.0| = 0.5
• |2.0 - 2.0| = 0.0
• |7.0 - 8.0| = 1.0
Sum of absolute deviations = 0.5 + 0.5 + 0.0 + 1.0 = 2.0
Average Absolute Deviation = 2.0 / 4 = 0.5
On average, the predictions deviate from actual values by 0.5 units.
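The arithmetic above can be checked in one line with NumPy (an illustrative sketch, not part of the required solution):

```python
import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# Element-wise absolute differences, then their mean
aad = np.mean(np.abs(y_true - y_pred))
print(aad)  # 0.5
```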
Example 2:
y_true = [1.0, 2.0, 3.0, 4.0, 5.0]
y_pred = [1.0, 2.0, 3.0, 4.0, 5.0]
Expected output: 0.0

When predictions perfectly match actual values:
• |1.0 - 1.0| = 0.0
• |2.0 - 2.0| = 0.0
• |3.0 - 3.0| = 0.0
• |4.0 - 4.0| = 0.0
• |5.0 - 5.0| = 0.0
Sum of absolute deviations = 0.0
Average Absolute Deviation = 0.0 / 5 = 0.0
A perfect score of 0 indicates that all predictions exactly match the actual values.
Example 3:
y_true = [10.0, 20.0, 30.0]
y_pred = [11.0, 21.0, 31.0]
Expected output: 1.0

Each prediction is exactly 1 unit higher than the actual value:
• |10.0 - 11.0| = 1.0
• |20.0 - 21.0| = 1.0
• |30.0 - 31.0| = 1.0
Sum of absolute deviations = 1.0 + 1.0 + 1.0 = 3.0
Average Absolute Deviation = 3.0 / 3 = 1.0
The model consistently over-predicts by exactly 1 unit, which could indicate a systematic bias in the predictions.
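One way to spot such a systematic bias is to compare the AAD with the mean signed error: when every prediction errs in the same direction, the two agree in magnitude. This is a supplementary diagnostic sketch, not part of the required task.

```python
y_true = [10.0, 20.0, 30.0]
y_pred = [11.0, 21.0, 31.0]
n = len(y_true)

# Average magnitude of the errors (the AAD score)
aad = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
# Average signed error: positive means the model over-predicts on average
bias = sum(p - t for t, p in zip(y_true, y_pred)) / n

print(aad, bias)  # 1.0 1.0 -> |bias| == aad, so every error has the same sign
```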
Constraints: