In numerical linear algebra, the Orthogonal-Triangular Factorization (also known as QR decomposition) is a powerful technique for expressing a matrix as the product of two matrices with special structural properties. This factorization is fundamental to solving linear systems, computing eigenvalues, and implementing least-squares optimization.
Given a matrix A of dimensions n × m, the goal is to decompose it into two matrices: Q, an n × m matrix whose columns are orthonormal, and R, an m × m upper triangular matrix.
The relationship A = Q × R must hold after the decomposition.
The Gram-Schmidt Orthonormalization Process:
The classical approach to computing this factorization uses the Gram-Schmidt process, which iteratively constructs orthonormal vectors from the columns of the input matrix:
Normalize the first column of A to create the first column of Q: $$q_1 = \frac{a_1}{\|a_1\|}$$
For each subsequent column $a_k$, remove its projections onto all previous $q_j$ vectors to create an orthogonal component, then normalize: $$u_k = a_k - \sum_{j=1}^{k-1} (q_j^T a_k) q_j$$ $$q_k = \frac{u_k}{\|u_k\|}$$
Build R by computing the dot products: $R_{jk} = q_j^T a_k$ for $j \leq k$, and $R_{jk} = 0$ for $j > k$.
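The steps above can be sketched in Python with NumPy. This is a minimal illustration of classical Gram-Schmidt, not a numerically robust implementation; the `tol` guard for linearly dependent columns is an assumption added beyond the steps listed above.

```python
import numpy as np

def qr_gram_schmidt(A, tol=1e-10):
    """QR factorization via classical Gram-Schmidt.

    Returns (Q, R) rounded to 4 decimal places. When a column's residual
    norm is near zero (a linearly dependent column), its q-vector is left
    as the zero vector -- an assumed convention, not part of the classical
    algorithm.
    """
    A = np.asarray(A, dtype=float)
    n, m = A.shape
    Q = np.zeros((n, m))
    R = np.zeros((m, m))
    for k in range(m):
        # Remove projections of column k onto all previous q_j vectors
        u = A[:, k].copy()
        for j in range(k):
            R[j, k] = Q[:, j] @ A[:, k]   # R_jk = q_j^T a_k
            u -= R[j, k] * Q[:, j]
        R[k, k] = np.linalg.norm(u)
        if R[k, k] > tol:
            Q[:, k] = u / R[k, k]         # normalize the residual
    return np.round(Q, 4), np.round(R, 4)
```

Classical Gram-Schmidt is chosen here because it mirrors the steps as written; in practice, modified Gram-Schmidt or Householder reflections are preferred for numerical stability.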
Your Task: Implement a function that performs orthogonal-triangular factorization on a given matrix using the Gram-Schmidt process. Return both the orthogonal matrix Q and the upper triangular matrix R as a tuple, with all values rounded to 4 decimal places.
Example 1:
Input: A = [[3.0, 0.0], [4.0, 5.0]]
Output: Q = [[0.6, -0.8], [0.8, 0.6]], R = [[5.0, 4.0], [0.0, 3.0]]

Step 1: Process the first column of A
• First column: a₁ = [3, 4]ᵀ
• Compute norm: ‖a₁‖ = √(3² + 4²) = √25 = 5
• Normalize: q₁ = [3/5, 4/5]ᵀ = [0.6, 0.8]ᵀ
• R₁₁ = ‖a₁‖ = 5
Step 2: Process the second column of A
• Second column: a₂ = [0, 5]ᵀ
• Compute projection onto q₁: R₁₂ = q₁ᵀa₂ = (0.6)(0) + (0.8)(5) = 4
• Subtract projection: u₂ = a₂ - R₁₂·q₁ = [0, 5]ᵀ - 4·[0.6, 0.8]ᵀ = [-2.4, 1.8]ᵀ
• Compute norm: ‖u₂‖ = √(2.4² + 1.8²) = √9 = 3
• Normalize: q₂ = [-2.4/3, 1.8/3]ᵀ = [-0.8, 0.6]ᵀ
• R₂₂ = ‖u₂‖ = 3
Final Result:
• Q = [[0.6, -0.8], [0.8, 0.6]] — orthonormal columns
• R = [[5.0, 4.0], [0.0, 3.0]] — upper triangular
Verification: Q × R = [[0.6×5 + (-0.8)×0, 0.6×4 + (-0.8)×3], [0.8×5 + 0.6×0, 0.8×4 + 0.6×3]] = [[3, 0], [4, 5]] = A ✓
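This check can be reproduced numerically, a small sketch assuming NumPy is available:

```python
import numpy as np

Q = np.array([[0.6, -0.8], [0.8, 0.6]])
R = np.array([[5.0, 4.0], [0.0, 3.0]])
A = np.array([[3.0, 0.0], [4.0, 5.0]])

# The product Q @ R recovers A up to floating-point round-off
assert np.allclose(Q @ R, A)
```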
Example 2:
Input: A = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
Output: Q = [[0.7071, 0.4082, -0.5774], [0.7071, -0.4082, 0.5774], [0.0, 0.8165, 0.5774]], R = [[1.4142, 0.7071, 0.7071], [0.0, 1.2247, 0.4082], [0.0, 0.0, 1.1547]]

For this 3×3 matrix, the Gram-Schmidt process is applied iteratively:
Processing Column 1:
• a₁ = [1, 1, 0]ᵀ, ‖a₁‖ = √2 ≈ 1.4142
• q₁ = [1/√2, 1/√2, 0]ᵀ ≈ [0.7071, 0.7071, 0.0]ᵀ
• R₁₁ = 1.4142

Processing Column 2:
• Project a₂ = [1, 0, 1]ᵀ onto q₁: R₁₂ = q₁ᵀa₂ = 0.7071
• Subtract projection and normalize to get q₂ ≈ [0.4082, -0.4082, 0.8165]ᵀ
• R₂₂ = 1.2247

Processing Column 3:
• Project a₃ = [0, 1, 1]ᵀ onto q₁ and q₂
• Orthogonalize and normalize to get q₃ ≈ [-0.5774, 0.5774, 0.5774]ᵀ
• R₁₃ = 0.7071, R₂₃ = 0.4082, R₃₃ = 1.1547
The orthonormal property of Q can be verified: each column has unit length, and dot products between different columns equal zero.
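That property can be stated compactly as QᵀQ = I and checked numerically; a sketch assuming NumPy, using the rounded Q from the example above:

```python
import numpy as np

Q = np.array([[0.7071, 0.4082, -0.5774],
              [0.7071, -0.4082, 0.5774],
              [0.0, 0.8165, 0.5774]])

# Unit-length columns and zero pairwise dot products mean QᵀQ ≈ I
# (only approximately here, since the entries are rounded to 4 decimals)
assert np.allclose(Q.T @ Q, np.eye(3), atol=1e-3)
```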
Example 3:
Input: A = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
Output: Q = [[0.2425, 0.9701, 0.0], [0.9701, -0.2425, 0.0]], R = [[4.1231, 5.3358, 6.5485], [0.0, 0.7276, 1.4552], [0.0, 0.0, 0.0]]

This example demonstrates factorization of a rectangular matrix (2×3), which has more columns than rows:
Key Observation: When n < m (a wide matrix, with more columns than rows), A cannot have full column rank. The third row of R will contain only zeros, since the column space of A is at most 2-dimensional.
Processing:
• The first two columns of Q span the column space of A
• R captures the coefficients relating A's columns to the orthonormal basis
• The third row of R is all zeros because the input matrix only has 2 rows
This case is particularly important in underdetermined systems, where there are more features (columns) than observations (rows); the complementary tall case (n > m) is the one that arises in overdetermined least-squares regression.
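The rank deficiency in this example can be seen directly; a short sketch assuming NumPy:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])

# A matrix with 2 rows has a column space of dimension at most 2
assert np.linalg.matrix_rank(A) == 2

# Concretely, the third column is a linear combination of the first two
# (a3 = 2*a2 - a1), so Gram-Schmidt leaves a zero residual for column 3
assert np.allclose(A[:, 2], 2 * A[:, 1] - A[:, 0])
```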
Constraints