The Smooth Rectifier (commonly known as Softplus) is a differentiable activation function used extensively in neural networks as a smooth approximation of the Rectified Linear Unit (ReLU). Unlike ReLU, which has a sharp corner at zero, the smooth rectifier provides a continuously differentiable curve that enables more stable gradient flow during backpropagation.
The smooth rectifier function is defined mathematically as:
$$f(x) = \ln(1 + e^x)$$
This function combines the exponential and the natural logarithm into a smooth, monotonically increasing curve that:

- is strictly positive for every input x,
- approaches 0 as x → −∞,
- approaches the line f(x) = x as x → +∞,
- has the sigmoid σ(x) = 1/(1 + e⁻ˣ) as its derivative.
When implementing this function, you must handle numerical overflow and underflow carefully:
For large positive x (e.g., x > 20): The exponential e^x becomes astronomically large, potentially causing overflow. However, ln(1 + e^x) ≈ x for large x, so the function can be simplified.
For large negative x (e.g., x < -20): The exponential e^x becomes vanishingly small, and ln(1 + e^x) ≈ e^x, which is numerically stable.
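These two asymptotic regimes can be checked numerically; here is a minimal sketch using Python's standard `math` module:

```python
import math

# Large positive x: ln(1 + e^x) = x + ln(1 + e^-x), so the correction
# over the line f(x) = x shrinks roughly like e^-x.
x = 20.0
correction = math.log(1.0 + math.exp(x)) - x
print(correction)                 # ~2.06e-09, essentially e^-20

# Large negative x: ln(1 + e^x) ~ e^x; log1p keeps precision where
# log(1 + tiny) would round the tiny term away.
x = -20.0
print(math.log1p(math.exp(x)))    # ~2.06e-09, essentially e^-20
```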
A numerically stable implementation uses the identity:
$$f(x) = \max(x, 0) + \ln(1 + e^{-|x|})$$
which equals ln(1 + eˣ) algebraically but only ever exponentiates a non-positive argument.
Implement the smooth rectifier activation function so that it returns f(x) = ln(1 + eˣ) for any real input and remains numerically stable for large positive and large negative values of x.
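One possible implementation sketch in Python (the function name `softplus` and the use of `math.log1p` are choices made here, not mandated by the problem):

```python
import math

def softplus(x: float) -> float:
    """Numerically stable smooth rectifier f(x) = ln(1 + e^x).

    The rewrite ln(1 + e^x) = max(x, 0) + ln(1 + e^-|x|) means exp()
    only ever sees a non-positive argument, so it can underflow to 0
    but never overflow.
    """
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))
```

With this form, `softplus(1000.0)` returns `1000.0` (the correction term underflows to zero) and `softplus(-1000.0)` returns `0.0`, with no overflow in either direction.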
Input: x = 2 → Output: 2.1269

For x = 2, we compute the smooth rectifier as:
f(2) = ln(1 + e²) = ln(1 + 7.389...) = ln(8.389...) ≈ 2.1269
The output is slightly larger than x because f(x) = x + ln(1 + e⁻ˣ), and the correction term ln(1 + e⁻ˣ) is always positive. As x increases, the difference between f(x) and x shrinks toward zero.
Input: x = 0 → Output: 0.6931

For x = 0, we compute:
f(0) = ln(1 + e⁰) = ln(1 + 1) = ln(2) ≈ 0.6931
At x = 0, the smooth rectifier returns ln(2). This is the curve's point of symmetry, where the derivative equals 1/2 and the function transitions from near-zero behavior to near-linear behavior.
Input: x = -2 → Output: 0.1269

For x = -2, we compute:
f(-2) = ln(1 + e⁻²) = ln(1 + 0.1353...) = ln(1.1353...) ≈ 0.1269
For negative inputs, the function approaches zero but never reaches it. Unlike ReLU, which outputs exactly 0 for negative inputs, the smooth rectifier maintains a small positive value, so the gradient never completely vanishes.
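The three worked examples above can be reproduced with the stable formulation (a quick sanity check, assuming the `max(x, 0) + ln(1 + e⁻|x|)` rewrite discussed earlier):

```python
import math

def softplus(x: float) -> float:
    # Stable rewrite of ln(1 + e^x); exp() argument is never positive.
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

for x in (2.0, 0.0, -2.0):
    print(f"f({x}) = {softplus(x):.4f}")
# f(2.0) = 2.1269
# f(0.0) = 0.6931
# f(-2.0) = 0.1269
```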