In the realm of neural network architectures, activation functions play a pivotal role in introducing non-linearity and enabling networks to learn complex patterns. Among these, the Softsign activation function stands out as an elegant alternative to traditional hyperbolic tangent (tanh) functions.
The Softsign function is defined mathematically as:
$$\text{softsign}(x) = \frac{x}{1 + |x|}$$
This function possesses several remarkable properties that make it valuable for neural network design:
Key Characteristics:
Mathematical Properties: The Softsign function exhibits polynomial decay rather than the exponential decay seen in tanh. This means its gradient, 1 / (1 + |x|)², diminishes much more slowly for large |x| than tanh's gradient does, so the function saturates more gently and can mitigate vanishing gradients during training.
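To make the decay difference concrete, here is a small illustrative sketch (not part of the problem itself) comparing the two gradients. The helper names `softsign_grad` and `tanh_grad` are introduced here purely for illustration:

```python
import math

def softsign_grad(x: float) -> float:
    # d/dx [x / (1 + |x|)] = 1 / (1 + |x|)^2  -- polynomial decay
    return 1.0 / (1.0 + abs(x)) ** 2

def tanh_grad(x: float) -> float:
    # d/dx tanh(x) = 1 - tanh(x)^2  -- exponential decay
    return 1.0 - math.tanh(x) ** 2

# At large |x| the Softsign gradient remains orders of magnitude
# larger than tanh's, so saturation sets in far more gently.
for x in (1.0, 5.0, 10.0):
    print(f"x={x}: softsign'={softsign_grad(x):.6f}, tanh'={tanh_grad(x):.2e}")
```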
Your Task: Implement a Python function that computes the Softsign activation value for any given real number input. Your implementation should correctly handle positive, negative, and zero inputs, returning the result rounded to 4 decimal places.
Input: x = 1
Output: 0.5

For x = 1:
softsign(1) = 1 / (1 + |1|) = 1 / (1 + 1) = 1 / 2 = 0.5
The positive input is transformed to exactly half its theoretical maximum output, demonstrating how the function compresses values toward zero.
Input: x = 0
Output: 0.0

For x = 0:
softsign(0) = 0 / (1 + |0|) = 0 / 1 = 0.0
The function passes through the origin, confirming its zero-centered nature. This property is essential for maintaining balanced activations in neural networks.
Input: x = -2
Output: -0.6667

For x = -2:
softsign(-2) = -2 / (1 + |-2|) = -2 / (1 + 2) = -2 / 3 ≈ -0.6667
The negative input produces a negative output of similar magnitude (but bounded), showcasing the function's odd symmetry: softsign(-x) = -softsign(x).
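The worked examples above suggest a minimal implementation; one straightforward sketch of the required function is:

```python
def softsign(x: float) -> float:
    """Compute the Softsign activation x / (1 + |x|),
    rounded to 4 decimal places as the task requires."""
    return round(x / (1 + abs(x)), 4)

# Reproduces the worked examples:
print(softsign(1))   # 0.5
print(softsign(0))   # 0.0
print(softsign(-2))  # -0.6667
```

Because the denominator 1 + |x| is always at least 1, no special-case handling is needed for zero or negative inputs.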
Constraints