In multivariate calculus and machine learning, understanding how changes in input variables affect output values is crucial for optimization and sensitivity analysis. The Gradient Field Matrix (also known as the Jacobian matrix) provides a complete picture of these relationships for vector-valued functions.
Given a vector-valued function f: ℝⁿ → ℝᵐ that maps an n-dimensional input to an m-dimensional output, the gradient field matrix is an m × n matrix where each entry (i, j) contains the partial derivative of the i-th output component with respect to the j-th input variable:
$$J_{ij} = \frac{\partial f_i}{\partial x_j}$$
For analytical computation, you would differentiate each component function symbolically. However, in many practical scenarios—especially with complex or black-box functions—numerical differentiation using finite differences is the preferred approach:
$$\frac{\partial f_i}{\partial x_j} \approx \frac{f_i(x + h \cdot e_j) - f_i(x - h \cdot e_j)}{2h}$$
where h is a small step size and eⱼ is the j-th unit vector (1 in position j, 0 elsewhere). For central differences in double precision, h around 10⁻⁵ to 10⁻⁶ typically balances truncation error against floating-point rounding error; much smaller values (such as 10⁻⁸) tend to amplify rounding noise.
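The central-difference formula above can be sketched for a single partial derivative. This is a minimal illustration, assuming f takes a list of floats and returns a list of floats; the name `partial_derivative` and the default step size are illustrative choices, not a prescribed API.

```python
def partial_derivative(f, x, i, j, h=1e-5):
    """Estimate the partial derivative of f_i with respect to x_j
    at point x using the central-difference formula."""
    x_plus, x_minus = list(x), list(x)
    x_plus[j] += h   # x + h * e_j
    x_minus[j] -= h  # x - h * e_j
    return (f(x_plus)[i] - f(x_minus)[i]) / (2 * h)

# f(x, y) = [x^2, x*y]; the derivative of f_0 = x^2 w.r.t. x at (2, 3) is 2x = 4
f = lambda v: [v[0] ** 2, v[0] * v[1]]
print(partial_derivative(f, [2.0, 3.0], 0, 0))  # approximately 4.0
```

Note that each partial derivative costs two function evaluations; a full m × n matrix therefore needs 2n evaluations of f when columns share the perturbed points.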
Your Task: Implement a function that computes the gradient field matrix of a given vector-valued function at a specified point using numerical differentiation. The function should handle different types of input functions and return the complete matrix of partial derivatives.
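One way the task can be sketched: loop over input dimensions, perturb each coordinate in turn, and fill the matrix column by column. This is a sketch under the assumption that the function accepts and returns plain Python lists; the name `numerical_jacobian` is my choice, not part of the problem's required signature.

```python
def numerical_jacobian(f, point, h=1e-5):
    """Return the m x n gradient field matrix J with
    J[i][j] = df_i/dx_j at `point`, via central differences."""
    n = len(point)
    m = len(f(point))
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        x_plus, x_minus = list(point), list(point)
        x_plus[j] += h
        x_minus[j] -= h
        f_plus, f_minus = f(x_plus), f(x_minus)
        for i in range(m):
            J[i][j] = (f_plus[i] - f_minus[i]) / (2 * h)
    return J
```

For example, `numerical_jacobian(lambda v: [v[0]**2, v[0]*v[1], v[1]**2], [2.0, 3.0])` should return a matrix close to [[4.0, 0.0], [3.0, 2.0], [0.0, 6.0]].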
function_type = "polynomial_xyz"
point = [2.0, 3.0]
params = {"type": "x2_xy_y2"}

Output: [[4.0, 0.0], [3.0, 2.0], [0.0, 6.0]]

The function f(x, y) = [x², xy, y²] maps ℝ² → ℝ³. We compute the 3×2 gradient field matrix:
Row 1 (derivatives of f₁ = x²): • ∂(x²)/∂x = 2x = 2(2) = 4.0 • ∂(x²)/∂y = 0 = 0.0
Row 2 (derivatives of f₂ = xy): • ∂(xy)/∂x = y = 3.0 • ∂(xy)/∂y = x = 2.0
Row 3 (derivatives of f₃ = y²): • ∂(y²)/∂x = 0 = 0.0 • ∂(y²)/∂y = 2y = 2(3) = 6.0
The resulting gradient field matrix captures how each output component responds to infinitesimal changes in each input variable at the point (2, 3).
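The row-by-row arithmetic above can be checked directly by evaluating the analytic partial derivatives at (2, 3):

```python
# Analytic Jacobian of f(x, y) = [x^2, x*y, y^2], evaluated at (2, 3).
x, y = 2.0, 3.0
J = [[2 * x, 0.0],    # row 1: d(x^2)/dx = 2x, d(x^2)/dy = 0
     [y,     x],      # row 2: d(xy)/dx  = y,  d(xy)/dy  = x
     [0.0,   2 * y]]  # row 3: d(y^2)/dx = 0,  d(y^2)/dy = 2y
print(J)  # [[4.0, 0.0], [3.0, 2.0], [0.0, 6.0]]
```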
function_type = "linear_transform"
point = [1.0, 2.0]
params = {"type": "sum_diff"}

Output: [[1.0, 1.0], [1.0, -1.0]]

For the linear function f(x, y) = [x + y, x - y] (sum and difference), we compute the 2×2 gradient field matrix:
Row 1 (derivatives of f₁ = x + y): • ∂(x + y)/∂x = 1.0 • ∂(x + y)/∂y = 1.0
Row 2 (derivatives of f₂ = x - y): • ∂(x - y)/∂x = 1.0 • ∂(x - y)/∂y = -1.0
Note that for linear functions, the gradient field matrix is constant regardless of the evaluation point. Up to a factor of √2, this matrix is orthogonal (it is the 2-point Haar transform), a fundamental transformation in signal processing for converting between original coordinates and their sum/difference.
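The constancy claim is easy to confirm numerically: central differences give the same matrix for f(x, y) = [x + y, x - y] at any two evaluation points. A self-contained sketch (the helper `jacobian` here is an illustrative implementation, not a required interface):

```python
def jacobian(f, p, h=1e-5):
    """Central-difference gradient field matrix of f at point p."""
    m, n = len(f(p)), len(p)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        a, b = list(p), list(p)
        a[j] += h
        b[j] -= h
        fa, fb = f(a), f(b)
        for i in range(m):
            J[i][j] = (fa[i] - fb[i]) / (2 * h)
    return J

f = lambda v: [v[0] + v[1], v[0] - v[1]]
J1 = jacobian(f, [1.0, 2.0])    # at the example point
J2 = jacobian(f, [-7.0, 5.0])   # at an arbitrary other point
# Both are approximately [[1.0, 1.0], [1.0, -1.0]].
```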
function_type = "single_variable"
point = [3.0]
params = {"type": "x_squared"}

Output: [[6.0]]

For the single-variable function f(x) = x², the gradient field matrix reduces to a 1×1 matrix containing just the ordinary derivative:
Entry (1,1): • df/dx = d(x²)/dx = 2x = 2(3) = 6.0
This special case shows that the gradient field matrix generalizes the concept of a derivative from single-variable calculus to multivariable vector functions. At x = 3, the instantaneous rate of change is 6, meaning a small change Δx produces approximately 6Δx change in the output.
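The 1×1 case collapses the general formula to a single central difference, which can be verified in a few lines (step size h = 10⁻⁵ is an illustrative choice):

```python
# f(x) = x^2 at x = 3: the 1x1 gradient field matrix is just [f'(3)].
h = 1e-5
x = 3.0
derivative = ((x + h) ** 2 - (x - h) ** 2) / (2 * h)  # central difference
print([[derivative]])  # approximately [[6.0]]
```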
Constraints