Linear Algebra For AIML - Chapter 5 Linear Transformations - IndianTechnoEra

Chapter 5 — Linear Transformations

Linear transformations are operations that map vectors from one space to another, often using matrices. They are fundamental in AI & ML for manipulating data, performing feature engineering, and transforming images in computer vision.

5.1 What is a Linear Transformation?

A linear transformation is a function T that maps vectors from one vector space to another while preserving vector addition and scalar multiplication: for all vectors u, v and scalars α,

T(u + v) = T(u) + T(v)
T(α * v) = α * T(v)

In matrix form, applying a transformation amounts to multiplying a matrix A by a vector v:
v_transformed = A × v
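Both defining properties can be checked numerically for any matrix; a quick sketch (the matrix and vectors here are illustrative):

```python
import numpy as np

# Any matrix A defines a linear map T(v) = A @ v.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # illustrative example matrix
u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])
alpha = 4.0

# Additivity: T(u + v) == T(u) + T(v)
print(np.allclose(A @ (u + v), A @ u + A @ v))        # True

# Homogeneity: T(alpha * v) == alpha * T(v)
print(np.allclose(A @ (alpha * v), alpha * (A @ v)))  # True
```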

5.2 Types of Linear Transformations

Common linear transformations in 2D or 3D:

  • Rotation: Rotates a vector around the origin.
  • Scaling: Changes the size of vectors in one or more directions.
  • Shearing: Slants the shape along an axis.
  • Translation: Shifts vectors by adding a constant vector (note: strictly speaking, translation is affine, not linear).

Importance in AI/ML: Transformations allow us to manipulate features, normalize data, rotate images, or adjust coordinate systems — essential for preprocessing, augmentation, and geometric operations in models.
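Translation, noted above as affine rather than linear, can still be written as a single matrix product by moving to homogeneous coordinates (append a 1 to each vector and use a 3×3 matrix). A minimal sketch:

```python
import numpy as np

# Translate the 2D point (1, 1) by (tx, ty) = (3, -2) using a 3x3
# homogeneous-coordinates matrix.
tx, ty = 3.0, -2.0
T = np.array([[1.0, 0.0, tx],
              [0.0, 1.0, ty],
              [0.0, 0.0, 1.0]])
p = np.array([1.0, 1.0, 1.0])   # (1, 1) in homogeneous coordinates

p_translated = T @ p
print(p_translated[:2])  # [ 4. -1.]
```

This trick is why graphics pipelines and robotics libraries represent rigid motions with 3×3 (2D) or 4×4 (3D) matrices.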

5.3 Matrix Representation

Every linear transformation can be represented as a matrix multiplication:

import numpy as np

A = np.array([[2, 0],
              [0, 3]])       # scaling matrix
v = np.array([1, 1])
v_transformed = A @ v        # array([2, 3])

Here, A scales x by 2 and y by 3.

5.4 Quick NumPy Examples (Practical)

import numpy as np

# Define a 2D vector
v = np.array([1, 1])

# Scaling transformation
A = np.array([[2, 0],
              [0, 3]])
v_scaled = A @ v
print("Scaled vector:", v_scaled)    # [2 3]

# Rotation 90 degrees counterclockwise
theta = np.pi/2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
v_rotated = R @ v
print("Rotated vector:", v_rotated)  # approximately [-1.  1.]

# Shearing transformation
S = np.array([[1, 1],
              [0, 1]])
v_sheared = S @ v
print("Sheared vector:", v_sheared)  # [2 1]

5.5 Geometric Intuition

- Scaling stretches or compresses vectors along axes.
- Rotation spins vectors around the origin.
- Shearing slants shapes, like turning a rectangle into a parallelogram.
- Visualizing transformations helps understand how data is manipulated in feature space or images.
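One more piece of geometric intuition: composing transformations corresponds to multiplying their matrices, applied right to left. A small sketch:

```python
import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotate 90 degrees CCW
S = np.array([[2.0, 0.0],
              [0.0, 2.0]])                        # scale by 2

v = np.array([1.0, 0.0])

# "Scale, then rotate" is the single matrix R @ S.
step_by_step = R @ (S @ v)
composed = (R @ S) @ v
print(np.allclose(step_by_step, composed))  # True
print(composed)                             # approximately [0., 2.]
```

Note the order: the matrix nearest the vector acts first, so R @ S means "scale, then rotate".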

5.6 AI/ML Use Cases (Why Linear Transformations Matter)

  • Computer Vision: Image rotations, scaling, and shearing for data augmentation improve model robustness.
  • Feature Engineering: Transforming features (e.g., scaling, PCA) to optimize model performance.
  • Neural Networks: Each linear layer in a neural network is a learned linear transformation.
  • Graphics & Robotics: Coordinate transformations for 3D modeling and movement planning.
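The neural-network point above can be made concrete: a dense layer computes an affine map W @ x + b followed by a nonlinearity, with W learned during training. A minimal sketch with made-up weights (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# A dense "layer": x -> relu(W @ x + b), where W is a learned linear map.
W = rng.standard_normal((3, 2))   # illustrative weights: 2 inputs -> 3 outputs
b = np.zeros(3)

def layer(x):
    return np.maximum(W @ x + b, 0.0)   # ReLU keeps the positive part

x = np.array([1.0, -1.0])
print(layer(x).shape)  # (3,)
```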

5.7 Exercises

  1. Scale the vector [3, 4] by 2x in x and 0.5x in y using a matrix.
  2. Rotate vector [1, 0] by 180° using a rotation matrix.
  3. Shear the vector [2,1] along x-axis with shear factor 1.
  4. Verify that a rotation preserves the vector magnitude (||v||).

Answers / Hints
  1. Scaling matrix: [[2,0],[0,0.5]], result: [6, 2]
  2. Rotation matrix 180°: [[-1,0],[0,-1]], result: [-1,0]
  3. Shear matrix: [[1,1],[0,1]], result: [3,1]
  4. Rotation matrices are orthogonal (RᵀR = I), so ||R v|| = ||v||: the magnitude is unchanged.
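The answers above can be verified directly in NumPy:

```python
import numpy as np

# Exercise 1: scale [3, 4] by 2 in x and 0.5 in y.
print(np.array([[2, 0], [0, 0.5]]) @ np.array([3, 4]))   # [6. 2.]

# Exercise 2: rotate [1, 0] by 180 degrees.
print(np.array([[-1, 0], [0, -1]]) @ np.array([1, 0]))   # [-1  0]

# Exercise 3: shear [2, 1] along x with shear factor 1.
print(np.array([[1, 1], [0, 1]]) @ np.array([2, 1]))     # [3 1]

# Exercise 4: rotation preserves magnitude.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
v = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(R @ v), np.linalg.norm(v)))  # True
```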

5.8 Practice Projects / Mini Tasks

  • Take a small grayscale image (matrix of pixels) and apply scaling, rotation, and shear using NumPy.
  • Implement a simple data augmentation pipeline for a dataset of 2D points.
  • Visualize original vs transformed vectors using matplotlib to see effects of linear transformations.
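The augmentation task might be sketched like this, assuming the "dataset" is just an (N, 2) array of points (the function name and parameter ranges are illustrative choices, not a fixed recipe):

```python
import numpy as np

def augment(points, rng):
    """Apply a random rotation and a random uniform scale to (N, 2) points."""
    theta = rng.uniform(0.0, 2.0 * np.pi)
    scale = rng.uniform(0.8, 1.2)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    # Transform every point at once: (2, 2) @ (2, N) -> (2, N), then transpose.
    return (scale * R @ points.T).T

rng = np.random.default_rng(42)
points = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
augmented = augment(points, rng)
print(augmented.shape)  # (3, 2)
```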

5.9 Further Reading & Videos

  • 3Blue1Brown — Essence of Linear Algebra (YouTube series on transformations).
  • NumPy documentation — matrix operations, rotation/scaling matrices.
  • Linear Algebra textbooks — chapters on linear transformations and eigenvectors.

Next chapter: Eigenvalues & Eigenvectors — learn how to compute them, their geometric meaning, and their importance in PCA, SVD, and ML models.
