Chapter 3 — Matrix Properties
Understanding different types of matrices and their properties is crucial in AI & ML, especially in linear regression, optimization, and dimensionality reduction. This chapter explains these concepts with examples and Python demonstrations.
3.1 Identity Matrix
An identity matrix is a square matrix with 1s on the diagonal and 0s elsewhere. It behaves like the number 1 in matrix multiplication: I × A = A × I = A.
Example (3x3):
I = [[1,0,0],
[0,1,0],
[0,0,1]]
Importance in AI/ML: Identity matrices are used to initialize weights in certain neural network layers and are fundamental in linear algebra operations where you want to preserve the original matrix.
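As a quick sanity check (assuming NumPy is available), multiplying any matrix by the identity of matching size returns the matrix unchanged:

```python
import numpy as np

A = np.array([[2, 1],
              [5, 3]])
I = np.eye(2)  # 2x2 identity matrix

# I @ A and A @ I both leave A unchanged
print(np.allclose(I @ A, A))  # True
print(np.allclose(A @ I, A))  # True
```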
3.2 Inverse Matrix
The inverse of a square matrix A is a matrix A⁻¹ such that A × A⁻¹ = A⁻¹ × A = I. It plays the role that division plays in scalar arithmetic. Note that not every square matrix has an inverse: A⁻¹ exists only when the determinant of A is non-zero.
Example (2x2):
A = [[2,1],
[5,3]]
A_inv = [[3,-1],
[-5,2]]
A × A_inv = I
Importance in AI/ML: Inverse matrices are key to solving linear systems, which appear in linear regression (the normal equations), least squares solutions, and other optimization tasks. The inverse gives the closed-form (analytic) solution; in practice, libraries usually solve the system directly instead of forming A⁻¹ explicitly, which is faster and numerically more stable.
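A short sketch of both approaches, using the 2x2 matrix from above and a right-hand side b chosen for illustration:

```python
import numpy as np

A = np.array([[2, 1],
              [5, 3]], dtype=float)
b = np.array([3.0, 8.0])  # example right-hand side of A x = b

# Closed-form route: x = A^{-1} b
x_inv = np.linalg.inv(A) @ b

# Preferred in practice: solve A x = b directly (more stable, faster)
x_solve = np.linalg.solve(A, b)

print(x_inv)    # [1. 1.]
print(x_solve)  # [1. 1.]  -- same solution
```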
3.3 Orthogonal Matrices
An orthogonal matrix is a square matrix whose columns (and rows) are orthonormal: distinct columns are perpendicular (dot product = 0) and each column has unit length. Its inverse is equal to its transpose: Q⁻¹ = Qᵀ.
Example:
Q = [[1,0],
[0,-1]]
Qᵀ × Q = I
Importance in AI/ML: Orthogonal matrices are used in dimensionality reduction (e.g., PCA) to rotate data without changing distances. They maintain numerical stability in computations.
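A small demonstration of the distance-preserving property, using a 2x2 rotation matrix (rotations are a standard example of orthogonal matrices):

```python
import numpy as np

theta = np.pi / 4  # rotate by 45 degrees
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Q^T Q = I, so Q's inverse is its transpose
print(np.allclose(Q.T @ Q, np.eye(2)))  # True

# Rotating a vector does not change its length
x = np.array([3.0, 4.0])
print(np.linalg.norm(x))      # 5.0
print(np.linalg.norm(Q @ x))  # 5.0
```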
3.4 Diagonal Matrices
A diagonal matrix has all of its off-diagonal elements equal to zero; only the entries on the main diagonal may be non-zero.
Example (3x3):
D = [[4,0,0],
[0,5,0],
[0,0,2]]
Importance in AI/ML: Diagonal matrices simplify computations. For example, the eigenvalue matrix in PCA is diagonal. They also appear in scaling operations, and the covariance matrix of uncorrelated features is diagonal.
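The computational simplicity is easy to see in NumPy: multiplying by a diagonal matrix just scales each component, and its inverse is obtained by inverting each diagonal entry.

```python
import numpy as np

d = np.array([4.0, 5.0, 2.0])
D = np.diag(d)

# D @ x scales each component of x by the matching diagonal entry
x = np.array([1.0, 1.0, 1.0])
print(D @ x)  # [4. 5. 2.]

# The inverse of a diagonal matrix is diagonal with reciprocal entries
D_inv = np.diag(1.0 / d)
print(np.allclose(D @ D_inv, np.eye(3)))  # True
```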
3.5 Symmetric Matrices
A symmetric matrix is equal to its transpose: A = Aᵀ.
Example:
A = [[2,3],
[3,4]]
Aᵀ = [[2,3],
[3,4]]
Importance in AI/ML: Symmetric matrices are everywhere in ML — e.g., covariance matrices, kernel matrices. They have special properties that simplify eigenvalue decomposition, which is used in PCA and other dimensionality reduction techniques.
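One of those special properties is that a real symmetric matrix has real eigenvalues and orthonormal eigenvectors. NumPy provides np.linalg.eigh specifically for symmetric (Hermitian) matrices, as sketched below:

```python
import numpy as np

S = np.array([[2.0, 3.0],
              [3.0, 4.0]])

# eigh is specialized for symmetric matrices: it returns real
# eigenvalues and an orthonormal matrix of eigenvectors
eigvals, eigvecs = np.linalg.eigh(S)

# Eigenvectors form an orthogonal matrix: V^T V = I
print(np.allclose(eigvecs.T @ eigvecs, np.eye(2)))  # True

# Reconstruct S from its eigendecomposition: S = V diag(lambda) V^T
print(np.allclose(eigvecs @ np.diag(eigvals) @ eigvecs.T, S))  # True
```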
3.6 Quick NumPy Examples (Practical)
import numpy as np
# Identity matrix
I = np.eye(3)
print("Identity matrix:\n", I)
# Inverse
A = np.array([[2,1],[5,3]])
A_inv = np.linalg.inv(A)
print("Inverse of A:\n", A_inv)
# Orthogonal check
Q = np.array([[1,0],[0,-1]])
print("Q.T @ Q =\n", Q.T @ Q)
# Diagonal matrix
D = np.diag([4,5,2])
print("Diagonal matrix:\n", D)
# Symmetric check
S = np.array([[2,3],[3,4]])
print("S == S.T?", np.all(S == S.T))
3.7 Exercises
- Create a 4x4 identity matrix in NumPy.
- Compute the inverse of [[3,1],[2,4]] and verify that multiplying with the original gives the identity.
- Check whether [[0,1],[-1,0]] is orthogonal.
- Create a diagonal matrix from the list [7,2,5] and print it.
- Verify whether [[1,2],[2,3]] is symmetric.
Answers / Hints
- np.eye(4)
- Inverse: np.linalg.inv([[3,1],[2,4]]); multiply with the original to check that the product is the identity.
- Orthogonality: check whether Q.T @ Q equals np.eye(2).
- Diagonal: np.diag([7,2,5])
- Symmetry: np.all(A == A.T) returns True.
3.8 Practice Projects / Mini Tasks
- Implement a function to check if a matrix is orthogonal.
- Visualize diagonal matrices as heatmaps to understand their structure.
- Use a symmetric covariance matrix to perform PCA on a small dataset.
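A minimal starting point for the first project (the function name and tolerance are my own choices, not fixed by the text):

```python
import numpy as np

def is_orthogonal(M, tol=1e-8):
    """Return True if M is (numerically) orthogonal, i.e. M.T @ M = I."""
    M = np.asarray(M, dtype=float)
    if M.ndim != 2 or M.shape[0] != M.shape[1]:
        return False  # orthogonal matrices must be square
    return np.allclose(M.T @ M, np.eye(M.shape[0]), atol=tol)

print(is_orthogonal([[0, 1], [-1, 0]]))  # True (a 90-degree rotation)
print(is_orthogonal([[1, 2], [3, 4]]))   # False
```

Using np.allclose with a tolerance rather than exact equality matters here, because floating-point rounding makes exact == comparisons unreliable.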
3.9 Further Reading & Videos
- 3Blue1Brown — Essence of Linear Algebra (YouTube series).
- NumPy documentation on np.linalg and matrix operations.
- Linear algebra books (e.g., Gilbert Strang), chapters on matrix properties.
Next chapter: Determinants & Rank — learn how to compute determinants, understand rank, and why they matter in AI & ML models.