Linear Transformations

Understand functions that preserve the structure of vector spaces

When you resize an image, rotate a 3D model, or apply a filter to a photograph, you are performing a transformation. A transformation is simply a function that takes a vector as input and produces a vector as output. But not all transformations are created equal. Some transformations have a special property: they preserve the fundamental structure of the vector space. These are called linear transformations, and they are the heart of linear algebra.

What makes linear transformations so important? They are the transformations that “play nicely” with the operations of vector addition and scalar multiplication. When you understand linear transformations, you understand how matrices really work. Every matrix represents a linear transformation, and every linear transformation (on finite-dimensional spaces) can be represented by a matrix. This deep connection between algebra and geometry is one of the most powerful ideas in mathematics.

In this lesson, you will learn what makes a transformation linear, explore geometric examples like rotations, reflections, and projections, and discover how matrices arise naturally as the language for describing linear transformations.

Core Concepts

What Is a Transformation?

A transformation (also called a map or function) from a vector space $V$ to a vector space $W$ is a rule that assigns to each vector in $V$ exactly one vector in $W$. We write $T: V \to W$ to indicate that $T$ takes inputs from $V$ and produces outputs in $W$.

If $\vec{v}$ is a vector in $V$, we write $T(\vec{v})$ for the output. This output is called the image of $\vec{v}$ under $T$.

For example, consider the transformation $T: \mathbb{R}^2 \to \mathbb{R}^2$ defined by:

$$T\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 2x \\ 3y \end{pmatrix}$$

This transformation takes a point $(x, y)$ and stretches it horizontally by a factor of 2 and vertically by a factor of 3. Every point in the plane has exactly one image under this transformation.
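As a concrete illustration, this stretch can be written as a tiny Python function (the name `T` simply mirrors the notation; this is our sketch, not part of any library):

```python
def T(v):
    """Stretch a 2D point horizontally by 2 and vertically by 3."""
    x, y = v
    return (2 * x, 3 * y)

# Every point has exactly one image under T.
print(T((1, 1)))   # (2, 3)
print(T((-2, 5)))  # (-4, 15)
```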

What Makes a Transformation “Linear”?

A transformation $T: V \to W$ is linear if it satisfies two properties:

1. Additivity (Preserves Addition):

$$T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v})$$

The image of a sum equals the sum of the images.

2. Homogeneity (Preserves Scalar Multiplication):

$$T(c\vec{v}) = cT(\vec{v})$$

The image of a scaled vector equals the scaled image.

These two properties can be combined into a single condition: $T$ is linear if and only if

$$T(c_1\vec{v}_1 + c_2\vec{v}_2) = c_1T(\vec{v}_1) + c_2T(\vec{v}_2)$$

for all vectors $\vec{v}_1, \vec{v}_2$ and all scalars $c_1, c_2$. This generalizes to any linear combination: $T$ preserves all linear combinations.
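The combined condition suggests a simple numeric spot check: evaluate both sides on sample vectors and scalars. Passing such a check is evidence, not proof, of linearity; the helper below is an illustrative sketch (both function names are ours).

```python
def T(v):
    """The stretch T(x, y) = (2x, 3y)."""
    x, y = v
    return (2 * x, 3 * y)

def spot_check_linearity(T, vectors, scalars):
    """Check T(c1*v1 + c2*v2) == c1*T(v1) + c2*T(v2) on all sample pairs."""
    for v1 in vectors:
        for v2 in vectors:
            for c1 in scalars:
                for c2 in scalars:
                    combo = (c1 * v1[0] + c2 * v2[0], c1 * v1[1] + c2 * v2[1])
                    lhs = T(combo)
                    rhs = (c1 * T(v1)[0] + c2 * T(v2)[0],
                           c1 * T(v1)[1] + c2 * T(v2)[1])
                    if lhs != rhs:
                        return False
    return True

samples = [(1, 0), (0, 1), (2, -3), (4, 4)]
print(spot_check_linearity(T, samples, [-1, 0, 2]))  # True
```

Note that a translation such as $T(x, y) = (x + 1, y)$ fails this check immediately: taking $c_1 = c_2 = 0$ in the combined condition forces $T(\vec{0}) = \vec{0}$.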

Why These Properties Matter

The linearity properties might seem like technical conditions, but they have profound implications:

The origin stays fixed: If $T$ is linear, then $T(\vec{0}) = \vec{0}$. To see why, note that $T(\vec{0}) = T(0 \cdot \vec{v}) = 0 \cdot T(\vec{v}) = \vec{0}$.

Lines through the origin map to lines (or points): Linear transformations preserve the linear structure of the space. A line through the origin maps to another line through the origin (or collapses to the origin itself).

Parallelism is preserved: If two lines are parallel before the transformation, their images are parallel after (or both collapse to the same line or point).

A transformation that shifts every point by a fixed amount, like $T(x, y) = (x + 1, y)$, is not linear because it moves the origin. Transformations that include any “constant term” (translation) are called affine transformations, not linear transformations.

Geometric Examples of Linear Transformations

Let us explore some fundamental linear transformations in $\mathbb{R}^2$. Each of these preserves vector addition and scalar multiplication.

Rotation: Rotating all vectors by an angle $\theta$ counterclockwise around the origin is a linear transformation. If you rotate two vectors and then add them, you get the same result as adding them first and then rotating.

Reflection: Reflecting all vectors across a line through the origin is linear. The reflection of a sum equals the sum of the reflections.

Scaling (Dilation): Multiplying all vectors by a constant factor, or stretching/compressing along specific directions, is linear.

Projection: Projecting all vectors onto a line or plane through the origin is linear. The projection of a sum equals the sum of the projections.

Shear: A shear transformation slides points parallel to some axis by an amount proportional to their distance from that axis. This is linear.

Non-Linear Transformations

To appreciate what linear means, consider some transformations that are not linear:

Translation: $T(x, y) = (x + 1, y)$ is not linear because $T(\vec{0}) = (1, 0) \neq \vec{0}$.

Squaring: $T(x) = x^2$ is not linear because $T(2x) = 4x^2$, while $2T(x) = 2x^2$.

Magnitude: $T(\vec{v}) = |\vec{v}|$ is not linear. Homogeneity fails for negative scalars, since $T(-\vec{v}) = |\vec{v}| \neq -T(\vec{v})$, and additivity fails in general because the triangle inequality $|\vec{u} + \vec{v}| \leq |\vec{u}| + |\vec{v}|$ is usually strict.
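The failure of additivity for the magnitude map is easy to see numerically, using Python's standard `math` module (the helper name `mag` is ours):

```python
import math

def mag(v):
    """Euclidean magnitude |v| of a 2D vector."""
    return math.hypot(v[0], v[1])

u, v = (1.0, 0.0), (0.0, 1.0)
s = (u[0] + v[0], u[1] + v[1])
print(mag(s))            # sqrt(2) ~ 1.414..., the magnitude of the sum
print(mag(u) + mag(v))   # 2.0 -- the sum of the magnitudes; additivity fails
```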

A Linear Transformation Is Determined by Its Action on Basis Vectors

Here is one of the most important facts about linear transformations: if you know what a linear transformation does to the basis vectors, you know what it does to every vector.

Why? Because every vector can be written as a linear combination of basis vectors, and linear transformations preserve linear combinations.

Suppose $T: \mathbb{R}^2 \to \mathbb{R}^2$ is linear, and we use the standard basis $\vec{e}_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\vec{e}_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$.

Any vector $\vec{v} = \begin{pmatrix} x \\ y \end{pmatrix}$ can be written as:

$$\vec{v} = x\vec{e}_1 + y\vec{e}_2$$

Therefore:

$$T(\vec{v}) = T(x\vec{e}_1 + y\vec{e}_2) = xT(\vec{e}_1) + yT(\vec{e}_2)$$

Once you know $T(\vec{e}_1)$ and $T(\vec{e}_2)$, you can compute $T(\vec{v})$ for any vector $\vec{v}$.
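This fact translates directly into code: given only the two images $T(\vec{e}_1)$ and $T(\vec{e}_2)$, we can evaluate $T$ anywhere. A minimal sketch (the helper name is ours):

```python
def apply_from_basis_images(Te1, Te2, v):
    """Compute T(v) = x*T(e1) + y*T(e2) for v = (x, y)."""
    x, y = v
    return (x * Te1[0] + y * Te2[0], x * Te1[1] + y * Te2[1])

# 90-degree counterclockwise rotation sends e1 -> (0, 1) and e2 -> (-1, 0).
print(apply_from_basis_images((0, 1), (-1, 0), (1, 1)))  # (-1, 1)
```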

The Matrix of a Linear Transformation

Since a linear transformation is determined by what it does to basis vectors, we can encode this information in a matrix.

If $T: \mathbb{R}^n \to \mathbb{R}^m$ is a linear transformation, its standard matrix $A$ is the $m \times n$ matrix whose columns are the images of the standard basis vectors:

$$A = \begin{pmatrix} | & | & & | \\ T(\vec{e}_1) & T(\vec{e}_2) & \cdots & T(\vec{e}_n) \\ | & | & & | \end{pmatrix}$$

Then for any vector $\vec{v}$:

$$T(\vec{v}) = A\vec{v}$$

This is the fundamental connection: every linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$ corresponds to an $m \times n$ matrix, and every $m \times n$ matrix defines a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$.

To find the matrix of a transformation:

  1. Apply $T$ to each standard basis vector
  2. Write the results as columns
  3. The resulting matrix is the standard matrix of $T$
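These three steps can be carried out mechanically. Here is a sketch in pure Python, with matrices stored as lists of rows (the function names are ours):

```python
def standard_matrix(T, n):
    """Build the standard matrix of T: R^n -> R^m by applying T to each
    standard basis vector e_i and writing the results as columns."""
    cols = [T(tuple(1 if j == i else 0 for j in range(n))) for i in range(n)]
    m = len(cols[0])
    # Transpose the list of columns into a list of rows.
    return [[cols[j][i] for j in range(n)] for i in range(m)]

def stretch(v):
    """T(x, y) = (2x, 3y) from Example 1."""
    return (2 * v[0], 3 * v[1])

print(standard_matrix(stretch, 2))  # [[2, 0], [0, 3]]
```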

Composition of Transformations

If you apply one linear transformation and then another, the result is also a linear transformation. Moreover, the matrix of the combined transformation is the product of the individual matrices.

Suppose $T: \mathbb{R}^n \to \mathbb{R}^m$ has matrix $A$ and $S: \mathbb{R}^m \to \mathbb{R}^p$ has matrix $B$. Then the composition $S \circ T$ (first apply $T$, then apply $S$) is given by:

$$(S \circ T)(\vec{v}) = S(T(\vec{v})) = B(A\vec{v}) = (BA)\vec{v}$$

The matrix of $S \circ T$ is $BA$. Notice the order: we multiply in the reverse order of application. This is why matrix multiplication is defined the way it is.
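The order reversal can be confirmed numerically with a small matrix-product helper (the function names are ours): applying $T$ and then $S$ to a vector gives the same result as applying the single matrix $BA$.

```python
def matmul(B, A):
    """Matrix product BA -- the matrix of the composition S after T."""
    return [[sum(B[i][k] * A[k][j] for k in range(len(A)))
             for j in range(len(A[0]))] for i in range(len(B))]

def matvec(M, v):
    """Apply the matrix M to the vector v."""
    return tuple(sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M)))

A = [[1, 0], [0, -1]]   # T: reflection across the x-axis
B = [[0, -1], [1, 0]]   # S: rotation by 90 degrees counterclockwise
v = (3, 2)
print(matvec(B, matvec(A, v)))   # apply T, then S: (2, 3)
print(matvec(matmul(B, A), v))   # same answer via the single matrix BA: (2, 3)
```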

The Kernel and Image

Two important sets associated with a linear transformation $T: V \to W$:

The kernel (null space) is the set of all vectors that $T$ sends to zero:

$$\ker(T) = \{\vec{v} \in V : T(\vec{v}) = \vec{0}\}$$

The kernel measures how much information $T$ “loses.” If the kernel contains only $\vec{0}$, then $T$ is injective (one-to-one): different inputs always produce different outputs.

The image (range) is the set of all possible outputs:

$$\text{Im}(T) = \{T(\vec{v}) : \vec{v} \in V\}$$

The image tells you what vectors in $W$ can actually be “reached” by $T$. If the image equals all of $W$, then $T$ is surjective (onto).

Notation and Terminology

| Term | Meaning | Example |
| --- | --- | --- |
| $T: V \to W$ | Transformation from $V$ to $W$ | $T: \mathbb{R}^2 \to \mathbb{R}^2$ |
| Linear transformation | Preserves addition and scaling | $T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v})$ |
| Image $T(\vec{v})$ | Output of $T$ applied to $\vec{v}$ | $T\begin{pmatrix} 1 \\ 2 \end{pmatrix} = \begin{pmatrix} 2 \\ 6 \end{pmatrix}$ |
| Kernel (null space) | All $\vec{v}$ where $T(\vec{v}) = \vec{0}$ | $\ker(T) = \{\vec{v} : A\vec{v} = \vec{0}\}$ |
| Standard matrix | Columns are $T(\vec{e}_1), T(\vec{e}_2), \ldots$ | Rotation by 90 degrees: $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ |
| Composition $S \circ T$ | Apply $T$ first, then $S$ | $(S \circ T)(\vec{v}) = S(T(\vec{v}))$ |

Examples

Example 1: Is $T(x, y) = (2x, 3y)$ a Linear Transformation?

Determine whether $T: \mathbb{R}^2 \to \mathbb{R}^2$ defined by $T(x, y) = (2x, 3y)$ is a linear transformation.

Solution:

We need to check two properties: additivity and homogeneity.

Step 1: Check additivity.

Let $\vec{u} = \begin{pmatrix} x_1 \\ y_1 \end{pmatrix}$ and $\vec{v} = \begin{pmatrix} x_2 \\ y_2 \end{pmatrix}$.

$$T(\vec{u} + \vec{v}) = T\begin{pmatrix} x_1 + x_2 \\ y_1 + y_2 \end{pmatrix} = \begin{pmatrix} 2(x_1 + x_2) \\ 3(y_1 + y_2) \end{pmatrix} = \begin{pmatrix} 2x_1 + 2x_2 \\ 3y_1 + 3y_2 \end{pmatrix}$$

$$T(\vec{u}) + T(\vec{v}) = \begin{pmatrix} 2x_1 \\ 3y_1 \end{pmatrix} + \begin{pmatrix} 2x_2 \\ 3y_2 \end{pmatrix} = \begin{pmatrix} 2x_1 + 2x_2 \\ 3y_1 + 3y_2 \end{pmatrix}$$

These are equal, so additivity holds.

Step 2: Check homogeneity.

For any scalar $c$:

$$T(c\vec{v}) = T\begin{pmatrix} cx \\ cy \end{pmatrix} = \begin{pmatrix} 2(cx) \\ 3(cy) \end{pmatrix} = \begin{pmatrix} 2cx \\ 3cy \end{pmatrix}$$

$$cT(\vec{v}) = c\begin{pmatrix} 2x \\ 3y \end{pmatrix} = \begin{pmatrix} 2cx \\ 3cy \end{pmatrix}$$

These are equal, so homogeneity holds.

Conclusion: Yes, $T(x, y) = (2x, 3y)$ is a linear transformation.

Geometric interpretation: This transformation scales horizontally by 2 and vertically by 3. It stretches the plane, keeping the origin fixed. The matrix is $\begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$.

Example 2: Is $T(x, y) = (x + 1, y)$ a Linear Transformation?

Determine whether $T: \mathbb{R}^2 \to \mathbb{R}^2$ defined by $T(x, y) = (x + 1, y)$ is a linear transformation.

Solution:

The quickest way to check is to verify whether $T(\vec{0}) = \vec{0}$. For any linear transformation, this must be true.

Step 1: Check the zero vector.

$$T\begin{pmatrix} 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 + 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$$

Since $T(\vec{0}) = \begin{pmatrix} 1 \\ 0 \end{pmatrix} \neq \begin{pmatrix} 0 \\ 0 \end{pmatrix}$, the transformation does not send the zero vector to itself.

Conclusion: No, $T(x, y) = (x + 1, y)$ is not a linear transformation.

Alternative approach: We could also show additivity fails. Let $\vec{u} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\vec{v} = \begin{pmatrix} 2 \\ 0 \end{pmatrix}$.

$$T(\vec{u} + \vec{v}) = T\begin{pmatrix} 3 \\ 0 \end{pmatrix} = \begin{pmatrix} 4 \\ 0 \end{pmatrix}$$

$$T(\vec{u}) + T(\vec{v}) = \begin{pmatrix} 2 \\ 0 \end{pmatrix} + \begin{pmatrix} 3 \\ 0 \end{pmatrix} = \begin{pmatrix} 5 \\ 0 \end{pmatrix}$$

Since $\begin{pmatrix} 4 \\ 0 \end{pmatrix} \neq \begin{pmatrix} 5 \\ 0 \end{pmatrix}$, additivity fails.

Geometric interpretation: This transformation shifts every point one unit to the right. It is a translation, and translations (except the zero translation) are not linear because they move the origin.

Example 3: Find the Matrix for Rotation by 90 Degrees Counterclockwise

Find the standard matrix for the linear transformation $T: \mathbb{R}^2 \to \mathbb{R}^2$ that rotates every vector by 90 degrees counterclockwise around the origin.

Solution:

To find the standard matrix, we need to determine what $T$ does to the standard basis vectors $\vec{e}_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\vec{e}_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$.

Step 1: Find $T(\vec{e}_1)$.

The vector $\vec{e}_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ points to the right along the positive $x$-axis.

Rotating 90 degrees counterclockwise takes it to the positive $y$-axis:

$$T(\vec{e}_1) = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$$

Step 2: Find $T(\vec{e}_2)$.

The vector $\vec{e}_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$ points upward along the positive $y$-axis.

Rotating 90 degrees counterclockwise takes it to the negative $x$-axis:

$$T(\vec{e}_2) = \begin{pmatrix} -1 \\ 0 \end{pmatrix}$$

Step 3: Form the matrix.

The standard matrix has $T(\vec{e}_1)$ as its first column and $T(\vec{e}_2)$ as its second column:

$$A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$

Verification: Let us check that this matrix rotates $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$ correctly. This vector points at a 45-degree angle, so its image should point at 135 degrees and have the same length.

$$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \cdot 1 + (-1) \cdot 1 \\ 1 \cdot 1 + 0 \cdot 1 \end{pmatrix} = \begin{pmatrix} -1 \\ 1 \end{pmatrix}$$

Indeed, $\begin{pmatrix} -1 \\ 1 \end{pmatrix}$ is at a 135-degree angle with the same magnitude $\sqrt{2}$. The rotation works.

Answer: The matrix for 90-degree counterclockwise rotation is $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$.
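The same construction works for any angle via the general rotation matrix $\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$, whose columns are the rotated images of $\vec{e}_1$ and $\vec{e}_2$. A quick sketch (the function name is ours):

```python
import math

def rotation_matrix(theta):
    """Standard matrix of counterclockwise rotation by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

R = rotation_matrix(math.pi / 2)
# Up to floating-point roundoff, R is [[0, -1], [1, 0]].
print([[round(x, 10) for x in row] for row in R])
```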

Example 4: Find the Matrix for Reflection Across the Line $y = x$

Find the standard matrix for the linear transformation $T: \mathbb{R}^2 \to \mathbb{R}^2$ that reflects every vector across the line $y = x$.

Solution:

Step 1: Understand the reflection geometrically.

When you reflect a point across the line $y = x$, you swap its $x$ and $y$ coordinates. The point $(a, b)$ reflects to $(b, a)$.

Step 2: Find $T(\vec{e}_1)$.

$$T\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$$

The point $(1, 0)$ reflects to $(0, 1)$.

Step 3: Find $T(\vec{e}_2)$.

$$T\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$$

The point $(0, 1)$ reflects to $(1, 0)$.

Step 4: Form the matrix.

$$A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$$

Verification: Let us check a point not on the coordinate axes. Reflect $\begin{pmatrix} 3 \\ 2 \end{pmatrix}$:

$$\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 3 \\ 2 \end{pmatrix} = \begin{pmatrix} 0 \cdot 3 + 1 \cdot 2 \\ 1 \cdot 3 + 0 \cdot 2 \end{pmatrix} = \begin{pmatrix} 2 \\ 3 \end{pmatrix}$$

Yes, $(3, 2)$ reflects to $(2, 3)$, which is correct.

Interesting property: Notice that $A^2 = I$:

$$\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$

This makes geometric sense: reflecting twice across the same line brings you back to where you started.

Answer: The matrix for reflection across $y = x$ is $\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$.

Example 5: Find the Matrix for Projection onto the Line $y = 2x$

Find the standard matrix for the linear transformation $T: \mathbb{R}^2 \to \mathbb{R}^2$ that projects every vector onto the line $y = 2x$.

Solution:

Step 1: Understand projection geometrically.

Projecting a vector $\vec{v}$ onto a line means finding the point on the line closest to $\vec{v}$. This is done by dropping a perpendicular from $\vec{v}$ to the line.

Step 2: Find a direction vector for the line.

The line $y = 2x$ passes through the origin with slope 2. A direction vector is $\vec{u} = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$.

Step 3: Use the projection formula.

The projection of $\vec{v}$ onto $\vec{u}$ is:

$$\text{proj}_{\vec{u}}(\vec{v}) = \frac{\vec{v} \cdot \vec{u}}{\vec{u} \cdot \vec{u}} \vec{u}$$

For any $\vec{v} = \begin{pmatrix} x \\ y \end{pmatrix}$:

$$\vec{v} \cdot \vec{u} = x \cdot 1 + y \cdot 2 = x + 2y$$

$$\vec{u} \cdot \vec{u} = 1^2 + 2^2 = 5$$

$$\text{proj}_{\vec{u}}(\vec{v}) = \frac{x + 2y}{5} \begin{pmatrix} 1 \\ 2 \end{pmatrix} = \begin{pmatrix} \frac{x + 2y}{5} \\ \frac{2(x + 2y)}{5} \end{pmatrix} = \begin{pmatrix} \frac{x + 2y}{5} \\ \frac{2x + 4y}{5} \end{pmatrix}$$

Step 4: Find $T(\vec{e}_1)$.

$$T\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} \frac{1 + 0}{5} \\ \frac{2 + 0}{5} \end{pmatrix} = \begin{pmatrix} \frac{1}{5} \\ \frac{2}{5} \end{pmatrix}$$

Step 5: Find $T(\vec{e}_2)$.

$$T\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} \frac{0 + 2}{5} \\ \frac{0 + 4}{5} \end{pmatrix} = \begin{pmatrix} \frac{2}{5} \\ \frac{4}{5} \end{pmatrix}$$

Step 6: Form the matrix.

$$A = \begin{pmatrix} \frac{1}{5} & \frac{2}{5} \\ \frac{2}{5} & \frac{4}{5} \end{pmatrix} = \frac{1}{5}\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}$$

Verification: Let us check that a point on the line stays fixed. The point $\begin{pmatrix} 1 \\ 2 \end{pmatrix}$ is on the line $y = 2x$:

$$\frac{1}{5}\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\begin{pmatrix} 1 \\ 2 \end{pmatrix} = \frac{1}{5}\begin{pmatrix} 1 + 4 \\ 2 + 8 \end{pmatrix} = \frac{1}{5}\begin{pmatrix} 5 \\ 10 \end{pmatrix} = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$$

Points on the line are indeed fixed by the projection.

Check a point off the line: Project $\begin{pmatrix} 5 \\ 0 \end{pmatrix}$:

$$\frac{1}{5}\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\begin{pmatrix} 5 \\ 0 \end{pmatrix} = \frac{1}{5}\begin{pmatrix} 5 \\ 10 \end{pmatrix} = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$$

The point $(5, 0)$ projects to $(1, 2)$, which is on the line $y = 2x$. Correct.

Answer: The matrix for projection onto the line $y = 2x$ is $\begin{pmatrix} \frac{1}{5} & \frac{2}{5} \\ \frac{2}{5} & \frac{4}{5} \end{pmatrix}$.

General formula: For projection onto a line through the origin with direction vector $\vec{u}$, the projection matrix is:

$$P = \frac{\vec{u}\vec{u}^T}{\vec{u}^T\vec{u}}$$

For $\vec{u} = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$: $\vec{u}\vec{u}^T = \begin{pmatrix} 1 \\ 2 \end{pmatrix}\begin{pmatrix} 1 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}$ and $\vec{u}^T\vec{u} = 5$.
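The formula $P = \vec{u}\vec{u}^T / (\vec{u}^T\vec{u})$ is easy to implement exactly with Python's standard `fractions` module (the function name is ours):

```python
from fractions import Fraction

def projection_matrix(u):
    """P = u u^T / (u^T u): projection onto the line through the origin along u."""
    uu = sum(x * x for x in u)
    n = len(u)
    return [[Fraction(u[i] * u[j], uu) for j in range(n)] for i in range(n)]

P = projection_matrix((1, 2))
print(P)  # [[Fraction(1, 5), Fraction(2, 5)], [Fraction(2, 5), Fraction(4, 5)]]
```

Projecting twice changes nothing, so any projection matrix satisfies $P^2 = P$; you can verify this for the $P$ above by direct multiplication.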

Key Properties and Rules

Properties of Linear Transformations

  • $T(\vec{0}) = \vec{0}$: Every linear transformation sends the zero vector to itself.

  • $T(-\vec{v}) = -T(\vec{v})$: Linear transformations preserve negation.

  • $T(c_1\vec{v}_1 + c_2\vec{v}_2 + \cdots + c_k\vec{v}_k) = c_1T(\vec{v}_1) + c_2T(\vec{v}_2) + \cdots + c_kT(\vec{v}_k)$: Linear transformations preserve all linear combinations.

Matrix-Transformation Correspondence

For transformations $T: \mathbb{R}^n \to \mathbb{R}^m$:

| Transformation | Matrix |
| --- | --- |
| Identity: $T(\vec{v}) = \vec{v}$ | $I_n$ (identity matrix) |
| Scaling by $k$: $T(\vec{v}) = k\vec{v}$ | $kI_n$ |
| Zero transformation: $T(\vec{v}) = \vec{0}$ | Zero matrix |
| Rotation by $\theta$ in $\mathbb{R}^2$ | $\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$ |

Composition Rules

  • The composition of linear transformations is linear.
  • If $T$ has matrix $A$ and $S$ has matrix $B$, then $S \circ T$ has matrix $BA$.
  • Matrix multiplication encodes composition of transformations.
  • The order reverses: apply $T$ first, then $S$, but multiply $BA$ (B times A).

Kernel and Image Properties

  • $\ker(T)$ is always a subspace of the domain.
  • $\text{Im}(T)$ is always a subspace of the codomain.
  • $T$ is injective (one-to-one) if and only if $\ker(T) = \{\vec{0}\}$.
  • $T$ is surjective (onto) if and only if $\text{Im}(T)$ equals the entire codomain.
  • Rank-Nullity Theorem: $\dim(\ker(T)) + \dim(\text{Im}(T)) = \dim(\text{domain})$.
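These properties can be spot-checked on the matrix $A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}$: its columns are parallel, so its image is a line (dimension 1), and rank-nullity then forces the kernel to be a line as well. A small sketch (the helper name is ours):

```python
def matvec(A, v):
    """Apply the matrix A (list of rows) to the vector v."""
    return tuple(sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A)))

A = [[1, 2], [2, 4]]       # second column = 2 * first column, so dim Im = 1
print(matvec(A, (-2, 1)))  # (0, 0): the kernel contains the line through (-2, 1)
# Rank-nullity check: dim ker + dim im = 1 + 1 = 2 = dim of the domain R^2.
```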

Real-World Applications

Computer Graphics Transformations

Every time you play a video game or watch a 3D animated movie, your computer performs millions of linear transformations. Objects are rotated, scaled, reflected, and projected using matrices. The graphics pipeline applies a sequence of matrices:

  • Model matrix: Transforms an object from its own coordinate system to world coordinates.
  • View matrix: Transforms from world coordinates to camera coordinates (where is the camera looking?).
  • Projection matrix: Projects the 3D scene onto a 2D screen.

These matrices are multiplied together to create a single transformation that is applied to every vertex of every object. Graphics cards (GPUs) are specifically designed to perform these matrix multiplications extremely fast.

Data Transformations in Statistics

When you have data with many variables, linear transformations can simplify analysis:

  • Standardization: Centering data and scaling by standard deviation is a linear transformation (up to the translation).
  • Principal Component Analysis (PCA): Finds a new coordinate system (basis) where the data is easier to understand. This is a linear transformation to a new basis.
  • Whitening: Transforms data so that variables become uncorrelated with unit variance.

These transformations preserve the linear structure of the data while making patterns easier to detect.

Encoding and Decoding Signals

In communications, signals are often encoded using linear transformations for efficiency and error correction:

  • Fourier Transform: Decomposes a signal into frequency components. This is a linear transformation from the “time domain” to the “frequency domain.”
  • Error-Correcting Codes: Encode messages using matrix multiplication. The receiver can detect and correct errors by analyzing the transformed signal.
  • Data Compression: JPEG and MP3 compression use linear transformations (discrete cosine transform) to identify which components of an image or sound can be discarded without noticeable quality loss.

Quantum Gates in Quantum Computing

In quantum computing, the state of a quantum system is represented as a vector, and operations on qubits are linear transformations represented by special matrices (unitary matrices). Quantum gates like the Hadamard gate, Pauli gates, and CNOT gate are all linear transformations:

$$H = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$$

The composition of quantum gates (running them in sequence) corresponds to matrix multiplication, just as with classical linear transformations.
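For instance, the Hadamard gate is its own inverse: applying it twice is the identity. We can check $H^2 = I$ numerically with a small Python sketch (the helper name is ours):

```python
import math

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]  # Hadamard gate as a 2x2 matrix

def matmul(B, A):
    """Product of two 2x2 matrices."""
    return [[sum(B[i][k] * A[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

HH = matmul(H, H)
# H * H equals the identity matrix up to floating-point roundoff.
print([[round(x, 10) for x in row] for row in HH])  # [[1.0, 0.0], [0.0, 1.0]]
```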

Self-Test Problems

Problem 1: Is $T(x, y) = (x - y, x + y)$ a linear transformation?

Show Answer

Yes, it is linear.

Check additivity: Let $\vec{u} = (x_1, y_1)$ and $\vec{v} = (x_2, y_2)$.

$T(\vec{u} + \vec{v}) = T(x_1 + x_2, y_1 + y_2) = ((x_1+x_2) - (y_1+y_2), (x_1+x_2) + (y_1+y_2))$

$T(\vec{u}) + T(\vec{v}) = (x_1 - y_1, x_1 + y_1) + (x_2 - y_2, x_2 + y_2) = ((x_1-y_1) + (x_2-y_2), (x_1+y_1) + (x_2+y_2))$

These are equal.

Check homogeneity: $T(c\vec{v}) = T(cx, cy) = (cx - cy, cx + cy) = c(x-y, x+y) = cT(\vec{v})$.

Both properties hold, so $T$ is linear.

The matrix is $\begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}$.

Problem 2: Is $T(x, y) = (xy, x)$ a linear transformation?

Show Answer

No, it is not linear.

Check homogeneity: $T(c\vec{v}) = T(cx, cy) = ((cx)(cy), cx) = (c^2xy, cx)$

But $cT(\vec{v}) = c(xy, x) = (cxy, cx)$.

Since $c^2xy \neq cxy$ in general (for example, when $c = 2$ and $xy = 1$), homogeneity fails.

The transformation involves multiplication of the input components, which is a nonlinear operation.

Problem 3: Find the standard matrix for the transformation that reflects vectors across the $x$-axis.

Show Answer

Find where the basis vectors go:

$T(\vec{e}_1) = T\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ (points on the $x$-axis are unchanged)

$T(\vec{e}_2) = T\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ -1 \end{pmatrix}$ (points above the $x$-axis reflect below)

Matrix: $\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$

Problem 4: Find the standard matrix for rotation by 180 degrees.

Show Answer

Rotating by 180 degrees sends every vector $\vec{v}$ to $-\vec{v}$.

$T(\vec{e}_1) = T\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} -1 \\ 0 \end{pmatrix}$

$T(\vec{e}_2) = T\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ -1 \end{pmatrix}$

Matrix: $\begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix} = -I$

Alternatively, use the rotation formula with $\theta = 180° = \pi$:

$\begin{pmatrix} \cos\pi & -\sin\pi \\ \sin\pi & \cos\pi \end{pmatrix} = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}$

Problem 5: If $T$ has matrix $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$, compute $T\begin{pmatrix} 2 \\ -1 \end{pmatrix}$.

Show Answer

Apply the matrix to the vector:

$T\begin{pmatrix} 2 \\ -1 \end{pmatrix} = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}\begin{pmatrix} 2 \\ -1 \end{pmatrix} = \begin{pmatrix} 1(2) + 2(-1) \\ 3(2) + 4(-1) \end{pmatrix} = \begin{pmatrix} 2 - 2 \\ 6 - 4 \end{pmatrix} = \begin{pmatrix} 0 \\ 2 \end{pmatrix}$

Problem 6: Let $S$ be rotation by 90 degrees counterclockwise and $T$ be reflection across the $x$-axis. Find the matrix for $S \circ T$ (first reflect, then rotate).

Show Answer

Matrix for $T$ (reflection across $x$-axis): $A = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$

Matrix for $S$ (90-degree rotation): $B = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$

For $S \circ T$, we compute $BA$:

$BA = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} = \begin{pmatrix} 0(1) + (-1)(0) & 0(0) + (-1)(-1) \\ 1(1) + 0(0) & 1(0) + 0(-1) \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$

Interestingly, this is the matrix for reflection across the line $y = x$.

Problem 7: Find the kernel of the linear transformation with matrix $A = \begin{pmatrix} 1 & 2 \ 2 & 4 \end{pmatrix}$.

Show Answer

The kernel consists of all $\vec{v} = \begin{pmatrix} x \\ y \end{pmatrix}$ such that $A\vec{v} = \vec{0}$.

$\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$

This gives the system:

  • $x + 2y = 0$
  • $2x + 4y = 0$ (this is just 2 times the first equation)

From $x + 2y = 0$, we get $x = -2y$.

Let $y = t$. Then $x = -2t$.

$\ker(A) = \left\{ t\begin{pmatrix} -2 \\ 1 \end{pmatrix} : t \in \mathbb{R} \right\}$

The kernel is a line through the origin with direction vector $\begin{pmatrix} -2 \\ 1 \end{pmatrix}$.

Summary

  • A transformation $T: V \to W$ is a function from one vector space to another. The output $T(\vec{v})$ is called the image of $\vec{v}$.

  • A transformation is linear if it preserves vector addition and scalar multiplication: $T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v})$ and $T(c\vec{v}) = cT(\vec{v})$.

  • Linear transformations always send the zero vector to itself: $T(\vec{0}) = \vec{0}$. Transformations that include translations are not linear.

  • Geometric examples of linear transformations include rotations, reflections, projections, scaling, and shears. All of these preserve the origin.

  • A linear transformation is completely determined by what it does to basis vectors. If you know $T(\vec{e}_1), T(\vec{e}_2), \ldots, T(\vec{e}_n)$, you can compute $T(\vec{v})$ for any vector.

  • The standard matrix of $T: \mathbb{R}^n \to \mathbb{R}^m$ has $T(\vec{e}_i)$ as its $i$-th column. Then $T(\vec{v}) = A\vec{v}$ for all $\vec{v}$.

  • Every linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$ corresponds to a unique $m \times n$ matrix, and every $m \times n$ matrix defines a linear transformation.

  • Composition of transformations corresponds to matrix multiplication. If $T$ has matrix $A$ and $S$ has matrix $B$, then $S \circ T$ has matrix $BA$. Note the order reversal.

  • The kernel of $T$ is the set of vectors that map to $\vec{0}$. The image of $T$ is the set of all possible outputs.

  • Applications include computer graphics (rotations, projections, scaling), data analysis (PCA, standardization), signal processing (Fourier transforms), and quantum computing (quantum gates).