The sigmoid function, also known as the logistic function, is one of the most iconic mathematical transformations in machine learning and statistics. It serves as a fundamental activation function that elegantly maps any real-valued number to a probability-like value between 0 and 1.
The sigmoid function is mathematically defined as:
$$\sigma(z) = \frac{1}{1 + e^{-z}}$$
Where $z$ is the input (any real number) and $e$ is Euler's number ($\approx 2.71828$).
Key Properties of the Sigmoid Function:
S-Shaped Curve: The sigmoid produces a characteristic "S" or sigmoidal shape when plotted, transitioning smoothly from 0 to 1.
Symmetric Center Point: At z = 0, the sigmoid returns exactly 0.5, representing the decision boundary.
Asymptotic Behavior: As z approaches +∞, σ(z) approaches 1. As z approaches -∞, σ(z) approaches 0.
Monotonically Increasing: The function is always increasing—larger inputs always produce larger outputs.
Differentiable: The sigmoid has a smooth derivative: σ'(z) = σ(z) × (1 - σ(z)), which is crucial for gradient-based learning.
Historical Significance:
The sigmoid function has been central to the development of neural networks since the 1980s. While modern deep learning often prefers ReLU and its variants, sigmoid remains essential in output layers for binary classification, gates in LSTM/GRU networks, and probabilistic modeling.
Your Task:
Write a Python function that computes the sigmoid activation for a given input value. The function should return the result rounded to four decimal places to ensure consistent precision.
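One way to sketch this (the function name `sigmoid` and its signature are a natural choice, not mandated by the problem) is a direct translation of the formula, with `round` applied for the required four-decimal precision:

```python
import math

def sigmoid(z: float) -> float:
    """Compute the sigmoid (logistic) activation of z,
    rounded to four decimal places."""
    result = 1.0 / (1.0 + math.exp(-z))
    return round(result, 4)
```

Note that `math.exp(-z)` overflows for very large negative `z` (roughly below −709 in double precision), so a production implementation may want a numerically stable variant; for typical inputs the direct formula suffices.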
Example 1: z = 0 → 0.5. When z = 0, the sigmoid calculation is σ(0) = 1 / (1 + e⁰) = 1 / (1 + 1) = 1 / 2 = 0.5. This represents the exact midpoint of the sigmoid curve: the point of maximum uncertainty, or the decision boundary in binary classification.
Example 2: z = 1 → 0.7311. For z = 1, we calculate σ(1) = 1 / (1 + e⁻¹) = 1 / (1 + 0.3679) ≈ 1 / 1.3679 ≈ 0.7311. Notice how a small positive input pushes the output above 0.5 but not all the way to 1.
Example 3: z = -1 → 0.2689. For z = -1, we calculate σ(-1) = 1 / (1 + e¹) = 1 / (1 + 2.7183) ≈ 1 / 3.7183 ≈ 0.2689. This demonstrates the symmetry σ(-1) = 1 - σ(1), showing how negative inputs produce outputs below the 0.5 midpoint.
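The three worked values, and the symmetry property σ(-z) = 1 − σ(z) they illustrate, can be verified in a few lines (a sketch, reusing the direct formula):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# The three worked examples above, rounded to four decimal places
for z, expected in [(0, 0.5), (1, 0.7311), (-1, 0.2689)]:
    assert round(sigmoid(z), 4) == expected

# Symmetry about the midpoint: sigma(-z) = 1 - sigma(z)
assert math.isclose(sigmoid(-1), 1 - sigmoid(1))
print("all examples check out")
```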
Constraints