Linear Algebra For AIML - Chapter 1 Vectors - IndianTechnoEra

Chapter 1 — Vectors

Vectors are the building blocks of data representation in AI & ML. This chapter explains vectors from first principles and shows practical examples with Python (NumPy).

1.1 What is a Vector?

A vector is an ordered list of numbers. We can think of a vector as a point in a coordinate system or as an arrow that has direction and magnitude. Example: [2, 5, -3].

Notation: A vector is often written as v or in boldface v. In math we usually use column vectors:
v = [2, 5, -3]^T (the superscript T denotes transpose, so this row is read as a column).

1.2 Scalars vs Vectors

- Scalar: a single number (e.g., 5 or -2.3).
- Vector: an ordered list of scalars (e.g., [1, 0, 2]).

1.3 Dimension

The dimension (or length) of a vector is how many elements it contains. Example: [2,5,-3] is a 3-dimensional vector (written v ∈ R³).

1.4 Vector Addition & Subtraction

You add or subtract vectors element-wise. Both vectors must have the same dimension.

Example:

// Mathematical notation
u = [1, 2, 3]
v = [4, 0, -1]

u + v = [1+4, 2+0, 3+(-1)] = [5, 2, 2]
u - v = [1-4, 2-0, 3-(-1)] = [-3, 2, 4]
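The element-wise rule above can be sketched in plain Python (the NumPy equivalents appear in section 1.10):

```python
def vec_add(u, v):
    # Element-wise sum; both vectors must have the same dimension.
    assert len(u) == len(v), "dimension mismatch"
    return [a + b for a, b in zip(u, v)]

def vec_sub(u, v):
    # Element-wise difference.
    assert len(u) == len(v), "dimension mismatch"
    return [a - b for a, b in zip(u, v)]

u = [1, 2, 3]
v = [4, 0, -1]
print(vec_add(u, v))  # [5, 2, 2]
print(vec_sub(u, v))  # [-3, 2, 4]
```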

1.5 Scalar Multiplication

Multiply each element of the vector by the scalar.


alpha = 3
v = [1, -2, 4]

alpha * v = [3*1, 3*-2, 3*4] = [3, -6, 12]
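In plain Python, scalar multiplication is a one-line comprehension:

```python
def scalar_mul(alpha, v):
    # Multiply every component of v by the scalar alpha.
    return [alpha * x for x in v]

print(scalar_mul(3, [1, -2, 4]))  # [3, -6, 12]
```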

1.6 Magnitude (Length) of a Vector

The magnitude (or Euclidean norm) of a vector v = [v1, v2, ..., vn] is:

||v|| = sqrt(v1^2 + v2^2 + ... + vn^2)

Example: For v = [3, 4], ||v|| = sqrt(3^2 + 4^2) = 5.
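The norm formula translates directly to code; a minimal plain-Python sketch:

```python
import math

def magnitude(v):
    # Euclidean norm: square root of the sum of squared components.
    return math.sqrt(sum(x * x for x in v))

print(magnitude([3, 4]))  # 5.0
```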

1.7 Unit Vectors

A unit vector has magnitude 1. To convert a vector to a unit vector (normalize it), divide by its magnitude:

u = v / ||v||

Example: normalizing v = [3, 4] gives u = [3/5, 4/5].
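Normalization is the magnitude computation followed by an element-wise division; note the zero-vector edge case:

```python
import math

def normalize(v):
    # Divide each component by the vector's magnitude.
    m = math.sqrt(sum(x * x for x in v))
    if m == 0:
        raise ValueError("cannot normalize the zero vector")
    return [x / m for x in v]

print(normalize([3, 4]))  # [0.6, 0.8]
```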

1.8 Dot Product (Inner Product)

The dot product of two vectors a and b (same dimension) is:

a · b = a1*b1 + a2*b2 + ... + an*bn

Properties:

  • Gives a scalar.
  • Related to the cosine of the angle θ between vectors: a · b = ||a|| ||b|| cos(θ).

Use in ML: Dot product is used to compute similarity, projections, and inside linear layers and attention mechanisms.
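The two properties above can be checked in plain Python: the dot product of perpendicular vectors is zero, and inverting the cosine formula recovers the angle between them:

```python
import math

def dot(a, b):
    # Sum of element-wise products; inputs must have the same dimension.
    assert len(a) == len(b), "dimension mismatch"
    return sum(x * y for x, y in zip(a, b))

def angle_between(a, b):
    # theta = arccos( (a . b) / (||a|| ||b||) )
    na = math.sqrt(dot(a, a))
    nb = math.sqrt(dot(b, b))
    return math.acos(dot(a, b) / (na * nb))

a, b = [1, 0], [0, 1]
print(dot(a, b))                          # 0
print(math.degrees(angle_between(a, b)))  # 90.0
```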

1.9 Cross Product (3D)

The cross product is defined only in 3D (vectors in R³) and returns a vector perpendicular to both inputs. It is useful in geometry and 3D transforms.

Given a = [a1, a2, a3], b = [b1, b2, b3],
a × b = [ a2*b3 - a3*b2,
          a3*b1 - a1*b3,
          a1*b2 - a2*b1 ]

Note: Cross product is less frequently used directly in ML models but important in 3D computer vision and graphics.
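The component formula above maps directly to code; for example, crossing the x-axis with the y-axis yields the z-axis:

```python
def cross(a, b):
    # 3D only: returns a vector perpendicular to both a and b.
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

print(cross([1, 0, 0], [0, 1, 0]))  # [0, 0, 1]
```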

1.10 Quick NumPy Examples (Practical)

Install NumPy if you don't have it:

pip install numpy

Then, in Python:

import numpy as np

# create vectors
u = np.array([1, 2, 3])
v = np.array([4, 0, -1])

# addition & subtraction
print("u+v =", u + v)
print("u-v =", u - v)

# scalar multiplication
print("3 * v =", 3 * v)

# magnitude / norm
print("||u|| =", np.linalg.norm(u))

# normalize
u_unit = u / np.linalg.norm(u)
print("u normalized =", u_unit)

# dot product
dot = np.dot(u, v)
print("u · v =", dot)

# cosine similarity
cos_sim = dot / (np.linalg.norm(u) * np.linalg.norm(v))
print("cosine similarity =", cos_sim)

# cross product (3D)
cross = np.cross(u, v)
print("u × v =", cross)

1.11 Visual Intuition

- Adding two vectors is like placing arrows head-to-tail and drawing the resultant arrow.
- Dot product measures how aligned two vectors are: if it's large positive, they point similarly; if near zero, they're nearly orthogonal.

1.12 AI/ML Use Cases (Why Vectors Matter)

  • Word embeddings: Words are mapped to vectors (e.g., Word2Vec, GloVe, BERT tokens). Similar words have similar vectors.
  • Cosine similarity: Used to compare embeddings (e.g., semantic search).
  • Attention mechanism: In transformers, dot products between query and key vectors compute attention scores.
  • Image data: Images are vectors (flattened) or matrices/tensors; operations on them rely on vector math.

1.13 Common Pitfalls & Tips

  • Always check vector shapes before operations (shape mismatch is a common error in code).
  • Normalize vectors when comparing directions (use unit vectors for cosine similarity).
  • Be careful with broadcasting rules in NumPy when mixing arrays and scalars.
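A minimal sketch of the first tip: comparing `.shape` attributes before operating turns a confusing broadcasting error into a clear message (the mismatched vectors here are made up for illustration):

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, 0, -1, 7])  # deliberately a different length

# Checking shapes up front avoids a cryptic ValueError from u + v.
if u.shape == v.shape:
    print(u + v)
else:
    print("shape mismatch:", u.shape, "vs", v.shape)
```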

1.14 Exercises

  1. Compute the dot product and cosine similarity of a=[2,1,3] and b=[-1,4,0]. (Try by hand and then verify in Python.)
  2. Normalize the vector v=[0, -3, 4]. What is the unit vector?
  3. Given two vectors representing word embeddings, why might cosine similarity be preferred to Euclidean distance?
  4. Using NumPy, implement a function that checks if two vectors are orthogonal.

Answers / Hints
  1. Dot product: a·b = 2*(-1) + 1*4 + 3*0 = -2 + 4 + 0 = 2.
    Magnitudes: ||a|| = sqrt(4+1+9)=sqrt(14), ||b|| = sqrt(1+16+0)=sqrt(17).
    Cosine similarity = 2 / (sqrt(14)*sqrt(17)). Verify with NumPy.
  2. Magnitude of v is sqrt(0 + 9 + 16) = 5, so unit vector is [0, -3/5, 4/5].
  3. Cosine similarity measures orientation (angle) and ignores magnitude — useful when embeddings only need direction to show semantic similarity. Euclidean distance is sensitive to vector length.
  4. Orthogonal check: the dot product equals zero (within a floating-point tolerance). In code:
    import numpy as np

    def is_orthogonal(a, b, tol=1e-9):
        return abs(np.dot(a, b)) < tol

1.15 Practice Projects / Mini Tasks

  • Load two pre-trained word vectors (from a small word2vec file) and compute similarity between word pairs.
  • Visualize 2D vectors using a simple plot (matplotlib) to see addition and scalar multiplication results.
  • Implement a tiny search: given a query vector and a list of document vectors, return the top-3 most similar documents using cosine similarity.
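The tiny-search task can be sketched as follows; the document vectors and query here are invented for illustration, and real embeddings would come from a model:

```python
import numpy as np

def top_k_similar(query, docs, k=3):
    # Rank documents by cosine similarity to the query vector
    # and return the indices of the k best matches.
    q = query / np.linalg.norm(query)
    sims = []
    for i, d in enumerate(docs):
        sims.append((float(np.dot(q, d / np.linalg.norm(d))), i))
    sims.sort(reverse=True)
    return [i for _, i in sims[:k]]

docs = [np.array([1.0, 0.0]), np.array([0.9, 0.1]),
        np.array([0.0, 1.0]), np.array([0.5, 0.5])]
query = np.array([1.0, 0.05])
print(top_k_similar(query, docs))  # indices of the 3 most similar documents
```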

1.16 Further Reading & Videos

  • 3Blue1Brown — Essence of Linear Algebra (YouTube series) — for visual intuition.
  • NumPy documentation — for operations and broadcasting rules.
  • Chapters on vectors in any standard linear algebra text (e.g., Gilbert Strang).

Next chapter: Matrices — we'll cover matrix multiplication, transpose, inverse, and how matrices represent linear transformations.
