The softmax function is one of the most essential transformations in machine learning and deep learning. It converts a vector of arbitrary real numbers (often called logits or raw scores) into a valid probability distribution — a vector of positive numbers that sum to exactly 1.
Given an input vector z = [z₁, z₂, ..., zₙ] containing n real numbers, the softmax function computes the output vector σ(z) where each element is defined as:
$$\sigma(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{n} e^{z_j}}$$
In this formula:
- zᵢ is the i-th element of the input vector,
- e^{zᵢ} is the exponential of that element, which guarantees every output is positive,
- the denominator Σⱼ e^{zⱼ} normalizes the outputs so they sum to 1.
When implementing softmax, a critical challenge is numerical overflow. For large input values, computing e^{zᵢ} directly can exceed floating-point limits (for 64-bit floats, e^x overflows once x exceeds roughly 709). The solution is to subtract the maximum value from all inputs before exponentiation:
$$\sigma(z)_i = \frac{e^{z_i - \max(z)}}{\sum_{j=1}^{n} e^{z_j - \max(z)}}$$
This transformation is mathematically equivalent but prevents overflow since the largest exponent becomes e⁰ = 1.
Your Task: Write a Python function that computes the softmax probability distribution for a given list of input scores. Your implementation should handle numerical stability to prevent overflow when dealing with large input values. Return the result as a list of floating-point numbers rounded to 4 decimal places.
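The task above can be sketched in a few lines of pure Python. This is a minimal reference sketch, not the only valid solution; the function name `softmax` is an assumption, and the rounding follows the task statement.

```python
import math

def softmax(scores):
    """Numerically stable softmax, rounded to 4 decimal places.

    A minimal sketch: shift by the maximum so the largest exponent
    becomes e^0 = 1, then exponentiate and normalize.
    """
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [round(e / total, 4) for e in exps]

print(softmax([1, 2, 3]))           # [0.09, 0.2447, 0.6652]
print(softmax([1000, 1001, 1002]))  # same result; no overflow despite huge inputs
```

Without the `max` shift, the second call would raise `OverflowError`, since e¹⁰⁰⁰ far exceeds the largest representable 64-bit float.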
Example: scores = [1, 2, 3] → output [0.09, 0.2447, 0.6652]

Step-by-step calculation:
1. Subtract the maximum (3): [-2, -1, 0]
2. Exponentiate: [e⁻², e⁻¹, e⁰] ≈ [0.1353, 0.3679, 1.0]
3. Sum the exponentials: ≈ 1.5032
4. Divide each by the sum: [0.1353/1.5032, 0.3679/1.5032, 1.0/1.5032] ≈ [0.09, 0.2447, 0.6652]
Notice how the largest input (3) receives the highest probability (≈66.5%), reflecting the softmax's tendency to emphasize larger values.
Example: scores = [0, 0] → output [0.5, 0.5]
When all input values are identical, softmax produces a uniform distribution — each outcome has equal probability. This makes intuitive sense: if the model has no preference among options, it assigns equal likelihood to each.
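The uniform-distribution behavior can be checked directly. The `softmax` helper below is an assumed sketch mirroring the stable formula from the task, not part of the original problem.

```python
import math

def softmax(scores):
    # Stable softmax sketch: shift by the max, exponentiate, normalize.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [round(e / total, 4) for e in exps]

# Identical inputs yield a uniform distribution, whatever the shared value:
print(softmax([0, 0]))        # [0.5, 0.5]
print(softmax([7, 7, 7, 7]))  # [0.25, 0.25, 0.25, 0.25]
```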
Example: scores = [-1, 0, 1] → output [0.09, 0.2447, 0.6652]

Softmax is translation-invariant:
Key insight: The output is identical to [1, 2, 3] because softmax only cares about relative differences between values, not their absolute magnitudes. Adding or subtracting a constant from all inputs doesn't change the result.
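Translation invariance follows from the algebra: e^(zᵢ + c) / Σⱼ e^(zⱼ + c) = e^c · e^(zᵢ) / (e^c · Σⱼ e^(zⱼ)), so the constant e^c cancels. A quick check, again using an assumed `softmax` sketch matching the stable formula above:

```python
import math

def softmax(scores):
    # Stable softmax sketch: shift by the max, exponentiate, normalize.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [round(e / total, 4) for e in exps]

# Shifting every score by the same constant leaves the output unchanged:
print(softmax([1, 2, 3]))        # [0.09, 0.2447, 0.6652]
print(softmax([-1, 0, 1]))       # same result (shift by -2)
print(softmax([101, 102, 103]))  # same result (shift by +100)
```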
Constraints