Eigenvalues and Eigenvectors

Discover the special vectors that only get scaled by a transformation

When a matrix transforms space, most vectors get both stretched and rotated, changing direction as well as length. But some special vectors experience something simpler: they only get scaled, pointing in the same direction (or the exact opposite direction) after the transformation. These remarkable vectors are called eigenvectors, and the scaling factors are called eigenvalues. Understanding them unlocks powerful applications across mathematics, science, and engineering.

Think about pushing a swing. If you push at random times, the motion is chaotic. But if you push at just the right frequency, the swing responds by moving in a simple, predictable pattern. That “natural frequency” is an eigenvalue, and the swinging motion is an eigenvector. Eigenvalues and eigenvectors reveal the natural modes of a system, the directions in which a transformation acts most simply.

In this lesson, you will learn how to identify eigenvectors and eigenvalues, compute them systematically using the characteristic equation, and understand what these special values tell you about a matrix and the transformation it represents.

Core Concepts

What Directions Are Preserved by a Transformation?

When you apply a matrix $A$ to a vector $\vec{v}$, the result $A\vec{v}$ is usually a completely different vector. It might point in a new direction, have a different length, or both. But for certain special vectors, something simpler happens: the output $A\vec{v}$ points in the same direction as the input $\vec{v}$ (or the exact opposite direction). The transformation merely scales the vector without rotating it.

These are the vectors that reveal the “preferred directions” of a transformation. Along these directions, the matrix acts simply by stretching or compressing.

For example, consider a transformation that stretches space horizontally by a factor of 3 while leaving the vertical direction unchanged. Any vector pointing purely horizontally gets scaled by 3. Any vector pointing purely vertically gets scaled by 1 (unchanged). These horizontal and vertical directions are the eigenvector directions, and 3 and 1 are the eigenvalues.

The Definition: $A\vec{v} = \lambda\vec{v}$

A non-zero vector $\vec{v}$ is called an eigenvector of a square matrix $A$ if

$$A\vec{v} = \lambda\vec{v}$$

for some scalar $\lambda$. The scalar $\lambda$ is called the eigenvalue corresponding to the eigenvector $\vec{v}$.

Let us unpack this definition:

  • $A\vec{v}$ is the result of applying the matrix $A$ to the vector $\vec{v}$.
  • $\lambda\vec{v}$ is the vector $\vec{v}$ scaled by the factor $\lambda$.
  • The equation says these two are equal: transforming $\vec{v}$ by $A$ produces the same result as simply scaling $\vec{v}$ by $\lambda$.

The word “eigen” comes from German, meaning “own” or “characteristic.” Eigenvectors are the matrix’s “own vectors,” the directions that are characteristic of how the matrix behaves.

Important: An eigenvector must be non-zero. The zero vector satisfies $A\vec{0} = \vec{0} = \lambda\vec{0}$ for any $\lambda$, but it carries no directional information. We exclude it from being an eigenvector.
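
The definition is easy to check numerically. A quick sketch (using NumPy, which is a tooling assumption, not part of the lesson) tests whether $A\vec{v}$ is a scalar multiple of $\vec{v}$, using the horizontal-stretch example from above:

```python
import numpy as np

# The earlier horizontal-stretch example: scale x by 3, leave y alone.
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])

v = np.array([1.0, 0.0])   # purely horizontal: should be an eigenvector
w = np.array([1.0, 1.0])   # diagonal: stretched AND rotated, so it is not

print(A @ v)   # [3. 0.]  -- exactly 3 * v, so lambda = 3
print(A @ w)   # [3. 1.]  -- not a scalar multiple of w
```

The second output is the point: $A\vec{w}$ is no longer parallel to $\vec{w}$, so $\vec{w}$ fails the eigenvector test.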

Geometric Interpretation

Geometrically, an eigenvector points along a direction that the transformation preserves (up to scaling). If you draw the eigenvector as an arrow, after applying the matrix, the arrow still points along the same line. It might get longer, shorter, or even flip to point the opposite way, but it stays on that line.

The eigenvalue tells you the scaling factor:

  • If $\lambda > 1$, the eigenvector gets stretched.
  • If $0 < \lambda < 1$, the eigenvector gets compressed.
  • If $\lambda = 1$, the eigenvector is unchanged.
  • If $\lambda < 0$, the eigenvector gets flipped to point in the opposite direction (and possibly scaled).
  • If $\lambda = 0$, the eigenvector gets collapsed to the zero vector (the matrix sends it to the origin).

The Characteristic Equation

How do you find the eigenvalues of a matrix? The key insight comes from rewriting the eigenvalue equation.

Starting with $A\vec{v} = \lambda\vec{v}$, we want to find the values of $\lambda$ for which there exists a non-zero solution $\vec{v}$.

Rewrite the equation:

$$A\vec{v} - \lambda\vec{v} = \vec{0}$$

$$(A - \lambda I)\vec{v} = \vec{0}$$

Here, $I$ is the identity matrix (the same size as $A$), and $A - \lambda I$ is a new matrix formed by subtracting $\lambda$ from each diagonal entry of $A$.

This equation says that $\vec{v}$ is in the null space of $(A - \lambda I)$. For a non-zero solution to exist, the matrix $(A - \lambda I)$ must be singular (not invertible). And a matrix is singular exactly when its determinant is zero.

This gives us the characteristic equation:

$$\det(A - \lambda I) = 0$$

The eigenvalues of $A$ are the values of $\lambda$ that satisfy this equation.

The Characteristic Polynomial

When you compute $\det(A - \lambda I)$, you get a polynomial in $\lambda$. This polynomial is called the characteristic polynomial of $A$.

For an $n \times n$ matrix, the characteristic polynomial has degree $n$. Therefore, an $n \times n$ matrix has exactly $n$ eigenvalues (counting multiplicity, and including complex eigenvalues).

For a $2 \times 2$ matrix:

$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$

The characteristic polynomial is:

$$\det\begin{pmatrix} a - \lambda & b \\ c & d - \lambda \end{pmatrix} = (a - \lambda)(d - \lambda) - bc = \lambda^2 - (a + d)\lambda + (ad - bc)$$

This equals $\lambda^2 - \text{trace}(A)\lambda + \det(A)$, where the trace is the sum of diagonal entries.
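
As a numerical sanity check (a sketch; the $2 \times 2$ matrix here is arbitrary, not one from the lesson), the roots of $\lambda^2 - \text{trace}(A)\lambda + \det(A)$ match the eigenvalues NumPy computes directly:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 0.0]])

# Characteristic polynomial of a 2x2 matrix:
# lambda^2 - trace(A)*lambda + det(A)
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]

roots = np.sort(np.roots(coeffs))          # roots of the polynomial
direct = np.sort(np.linalg.eigvals(A))     # eigenvalues computed directly

print(roots, direct)   # both approximately [-2, 3]
```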

Finding Eigenvectors

Once you have an eigenvalue $\lambda$, you find the corresponding eigenvectors by solving:

$$(A - \lambda I)\vec{v} = \vec{0}$$

This is a homogeneous system of linear equations. The solutions form a subspace (the null space of $A - \lambda I$), and any non-zero vector in this subspace is an eigenvector for $\lambda$.

The process:

  1. Compute $A - \lambda I$ (subtract $\lambda$ from each diagonal entry).
  2. Row reduce to find the null space.
  3. The null space vectors are the eigenvectors for $\lambda$.
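
A minimal implementation sketch of this process (the function name `eigenspace_basis` is my own; it extracts the null space from the SVD instead of hand row reduction, which computes the same subspace but is numerically robust for floating-point matrices):

```python
import numpy as np

def eigenspace_basis(A, lam, tol=1e-10):
    """Columns spanning the eigenspace of lam: the null space of A - lam*I.

    The null space is read off the SVD: right singular vectors whose
    singular value is numerically zero.
    """
    M = A - lam * np.eye(A.shape[0])
    _, s, Vt = np.linalg.svd(M)
    return Vt[s < tol].T

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
V = eigenspace_basis(A, 4.0)
print(V)   # one unit-length column, proportional to (1, 1)
```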

Eigenspaces

For a given eigenvalue $\lambda$, the set of all vectors $\vec{v}$ satisfying $A\vec{v} = \lambda\vec{v}$ (including the zero vector) is called the eigenspace of $\lambda$. This is exactly the null space of $(A - \lambda I)$.

The eigenspace is a subspace of $\mathbb{R}^n$ (or $\mathbb{C}^n$). It contains all eigenvectors for that eigenvalue, plus the zero vector. The dimension of the eigenspace tells you how many linearly independent eigenvectors share that eigenvalue.

Algebraic vs. Geometric Multiplicity

An eigenvalue can appear multiple times in the characteristic polynomial. The number of times $\lambda$ appears as a root of the characteristic polynomial is called the algebraic multiplicity of $\lambda$.

The geometric multiplicity of $\lambda$ is the dimension of its eigenspace, which equals the number of linearly independent eigenvectors for $\lambda$.

A fundamental fact: the geometric multiplicity is always less than or equal to the algebraic multiplicity.

When the geometric multiplicity equals the algebraic multiplicity for all eigenvalues, the matrix has enough eigenvectors to form a basis for the entire space. Such matrices are called diagonalizable and have especially nice properties.

Complex Eigenvalues

The characteristic equation is a polynomial equation, and polynomials do not always have real roots. Some matrices have complex eigenvalues, even when all their entries are real.

This happens when the transformation involves rotation. A rotation by any angle other than 0 or 180 degrees preserves no direction in the plane, so the rotation matrix has no real eigenvectors. Its eigenvalues turn out to be complex numbers.

For real matrices, complex eigenvalues always come in conjugate pairs: if $a + bi$ is an eigenvalue, then so is $a - bi$.

Notation and Terminology

| Term | Meaning | Example |
| --- | --- | --- |
| Eigenvalue | Scalar $\lambda$ in $A\vec{v} = \lambda\vec{v}$ | If $A\begin{pmatrix} 1 \\ 1 \end{pmatrix} = 4\begin{pmatrix} 1 \\ 1 \end{pmatrix}$, then $\lambda = 4$ |
| Eigenvector | Non-zero vector $\vec{v}$ in $A\vec{v} = \lambda\vec{v}$ | In the above example, $\vec{v} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ |
| Characteristic polynomial | $\det(A - \lambda I)$ | For $A = \begin{pmatrix} 2 & 0 \\ 0 & 5 \end{pmatrix}$: $(2-\lambda)(5-\lambda)$ |
| Characteristic equation | $\det(A - \lambda I) = 0$ | $(2-\lambda)(5-\lambda) = 0$ |
| Eigenspace | All eigenvectors for a given $\lambda$, plus $\vec{0}$ | $E_\lambda = \text{null}(A - \lambda I)$ |
| Spectrum | Set of all eigenvalues of $A$ | $\{2, 5\}$ for the diagonal matrix above |
| Algebraic multiplicity | Number of times $\lambda$ appears as a root | $\lambda = 2$ has algebraic multiplicity 2 in $(\lambda - 2)^2$ |
| Geometric multiplicity | Dimension of the eigenspace for $\lambda$ | $\dim(E_\lambda)$ |

Examples

Example 1: Verify That a Vector Is an Eigenvector

Verify that $\vec{v} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is an eigenvector of $A = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}$, and find the corresponding eigenvalue.

Solution:

To verify that $\vec{v}$ is an eigenvector, we compute $A\vec{v}$ and check if the result is a scalar multiple of $\vec{v}$.

Step 1: Compute $A\vec{v}$.

$$A\vec{v} = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}\begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \cdot 1 + 1 \cdot 1 \\ 1 \cdot 1 + 3 \cdot 1 \end{pmatrix} = \begin{pmatrix} 4 \\ 4 \end{pmatrix}$$

Step 2: Check if this is a scalar multiple of $\vec{v}$.

$$\begin{pmatrix} 4 \\ 4 \end{pmatrix} = 4\begin{pmatrix} 1 \\ 1 \end{pmatrix} = 4\vec{v}$$

Yes, $A\vec{v} = 4\vec{v}$.

Conclusion: The vector $\vec{v} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is an eigenvector of $A$ with eigenvalue $\lambda = 4$.

Geometric interpretation: When this matrix transforms the vector $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$, it simply stretches it by a factor of 4 without changing its direction.

Example 2: Find Eigenvalues of a Diagonal Matrix

Find the eigenvalues of $A = \begin{pmatrix} 2 & 0 \\ 0 & 5 \end{pmatrix}$.

Solution:

For a diagonal matrix, the eigenvalues are simply the diagonal entries. Let us verify this using the characteristic equation.

Step 1: Set up the characteristic equation.

$$\det(A - \lambda I) = 0$$

$$\det\begin{pmatrix} 2 - \lambda & 0 \\ 0 & 5 - \lambda \end{pmatrix} = 0$$

Step 2: Compute the determinant.

For a $2 \times 2$ matrix, $\det\begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc$.

$$\det\begin{pmatrix} 2 - \lambda & 0 \\ 0 & 5 - \lambda \end{pmatrix} = (2 - \lambda)(5 - \lambda) - (0)(0) = (2 - \lambda)(5 - \lambda)$$

Step 3: Solve the characteristic equation.

$$(2 - \lambda)(5 - \lambda) = 0$$

This gives $\lambda = 2$ or $\lambda = 5$.

Answer: The eigenvalues are $\lambda_1 = 2$ and $\lambda_2 = 5$.

Key insight: For diagonal matrices, the eigenvalues are always the diagonal entries. The eigenvectors are the standard basis vectors: $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ for $\lambda = 2$ and $\begin{pmatrix} 0 \\ 1 \end{pmatrix}$ for $\lambda = 5$.

Example 3: Find All Eigenvalues of a 2x2 Matrix

Find all eigenvalues of $A = \begin{pmatrix} 4 & 2 \\ 1 & 3 \end{pmatrix}$.

Solution:

Step 1: Set up $A - \lambda I$.

$$A - \lambda I = \begin{pmatrix} 4 & 2 \\ 1 & 3 \end{pmatrix} - \begin{pmatrix} \lambda & 0 \\ 0 & \lambda \end{pmatrix} = \begin{pmatrix} 4 - \lambda & 2 \\ 1 & 3 - \lambda \end{pmatrix}$$

Step 2: Compute the determinant and set it equal to zero.

$$\det(A - \lambda I) = (4 - \lambda)(3 - \lambda) - (2)(1)$$

$$= 12 - 4\lambda - 3\lambda + \lambda^2 - 2$$

$$= \lambda^2 - 7\lambda + 10$$

Step 3: Solve the characteristic equation.

$$\lambda^2 - 7\lambda + 10 = 0$$

Factor: $(\lambda - 5)(\lambda - 2) = 0$

(Check: $(-5)(-2) = 10$ and $-5 + (-2) = -7$.)

Step 4: Find the roots.

$$\lambda = 5 \quad \text{or} \quad \lambda = 2$$

Answer: The eigenvalues are $\lambda_1 = 5$ and $\lambda_2 = 2$.

Note: We can verify by checking that the trace (sum of diagonal entries) equals the sum of eigenvalues: $4 + 3 = 7 = 5 + 2$. And the determinant equals the product of eigenvalues: $(4)(3) - (2)(1) = 10 = 5 \times 2$.
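
The same answer, cross-checked with NumPy's built-in routine (a sketch; `np.linalg.eigvals` returns the eigenvalues without eigenvectors):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
vals = np.sort(np.linalg.eigvals(A))
print(vals)    # approximately [2, 5]

# The trace/determinant checks from the note above:
assert np.isclose(vals.sum(), np.trace(A))          # 2 + 5 = 7
assert np.isclose(vals.prod(), np.linalg.det(A))    # 2 * 5 = 10
```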

Example 4: Find Eigenvectors for Each Eigenvalue

For the matrix $A = \begin{pmatrix} 4 & 2 \\ 1 & 3 \end{pmatrix}$ with eigenvalues $\lambda_1 = 5$ and $\lambda_2 = 2$, find the corresponding eigenvectors.

Solution:

For $\lambda_1 = 5$:

Step 1: Compute $A - 5I$.

$$A - 5I = \begin{pmatrix} 4 - 5 & 2 \\ 1 & 3 - 5 \end{pmatrix} = \begin{pmatrix} -1 & 2 \\ 1 & -2 \end{pmatrix}$$

Step 2: Solve $(A - 5I)\vec{v} = \vec{0}$.

This gives the system:

  • $-x + 2y = 0$
  • $x - 2y = 0$

Both equations say the same thing: $x = 2y$.

Step 3: Express the general solution.

Let $y = t$ (a free parameter). Then $x = 2t$.

$$\vec{v} = \begin{pmatrix} 2t \\ t \end{pmatrix} = t\begin{pmatrix} 2 \\ 1 \end{pmatrix}$$

Eigenvector for $\lambda_1 = 5$: Any non-zero scalar multiple of $\begin{pmatrix} 2 \\ 1 \end{pmatrix}$.

We typically write $\vec{v}_1 = \begin{pmatrix} 2 \\ 1 \end{pmatrix}$ as a representative eigenvector.

For $\lambda_2 = 2$:

Step 1: Compute $A - 2I$.

$$A - 2I = \begin{pmatrix} 4 - 2 & 2 \\ 1 & 3 - 2 \end{pmatrix} = \begin{pmatrix} 2 & 2 \\ 1 & 1 \end{pmatrix}$$

Step 2: Solve $(A - 2I)\vec{v} = \vec{0}$.

This gives the system:

  • $2x + 2y = 0$
  • $x + y = 0$

Both equations say: $x + y = 0$, or $x = -y$.

Step 3: Express the general solution.

Let $y = t$. Then $x = -t$.

$$\vec{v} = \begin{pmatrix} -t \\ t \end{pmatrix} = t\begin{pmatrix} -1 \\ 1 \end{pmatrix}$$

Eigenvector for $\lambda_2 = 2$: Any non-zero scalar multiple of $\begin{pmatrix} -1 \\ 1 \end{pmatrix}$.

We write $\vec{v}_2 = \begin{pmatrix} -1 \\ 1 \end{pmatrix}$ as a representative eigenvector.

Verification:

For $\lambda_1 = 5$: $A\vec{v}_1 = \begin{pmatrix} 4 & 2 \\ 1 & 3 \end{pmatrix}\begin{pmatrix} 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 10 \\ 5 \end{pmatrix} = 5\begin{pmatrix} 2 \\ 1 \end{pmatrix}$. Correct.

For $\lambda_2 = 2$: $A\vec{v}_2 = \begin{pmatrix} 4 & 2 \\ 1 & 3 \end{pmatrix}\begin{pmatrix} -1 \\ 1 \end{pmatrix} = \begin{pmatrix} -2 \\ 2 \end{pmatrix} = 2\begin{pmatrix} -1 \\ 1 \end{pmatrix}$. Correct.

Summary:

  • Eigenvalue $\lambda_1 = 5$ has eigenvector $\vec{v}_1 = \begin{pmatrix} 2 \\ 1 \end{pmatrix}$
  • Eigenvalue $\lambda_2 = 2$ has eigenvector $\vec{v}_2 = \begin{pmatrix} -1 \\ 1 \end{pmatrix}$
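
NumPy returns both pieces at once via `np.linalg.eig` (a quick cross-check; note that NumPy normalizes each eigenvector to unit length, so its columns are scalar multiples of the representatives above):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
vals, vecs = np.linalg.eig(A)   # vecs[:, i] pairs with vals[i]

for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)   # the defining equation holds
    print(lam, v)
```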

Example 5: Complex Eigenvalues (Rotation Matrix)

Find the eigenvalues and eigenvectors of the rotation matrix $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$.

Solution:

This matrix rotates vectors by 90 degrees counterclockwise. Geometrically, no direction in the real plane is preserved by a 90-degree rotation, so we should not expect real eigenvectors.

Step 1: Set up the characteristic equation.

$$A - \lambda I = \begin{pmatrix} 0 - \lambda & -1 \\ 1 & 0 - \lambda \end{pmatrix} = \begin{pmatrix} -\lambda & -1 \\ 1 & -\lambda \end{pmatrix}$$

Step 2: Compute the determinant.

$$\det(A - \lambda I) = (-\lambda)(-\lambda) - (-1)(1) = \lambda^2 + 1$$

Step 3: Solve the characteristic equation.

$$\lambda^2 + 1 = 0$$

$$\lambda^2 = -1$$

$$\lambda = \pm i$$

where $i = \sqrt{-1}$ is the imaginary unit.

Answer (Eigenvalues): $\lambda_1 = i$ and $\lambda_2 = -i$.

Finding the eigenvectors:

For $\lambda_1 = i$:

$$A - iI = \begin{pmatrix} -i & -1 \\ 1 & -i \end{pmatrix}$$

Solve $(A - iI)\vec{v} = \vec{0}$:

From the first row: $-ix - y = 0$, so $y = -ix$.

Let $x = 1$. Then $y = -i$.

$$\vec{v}_1 = \begin{pmatrix} 1 \\ -i \end{pmatrix}$$

For $\lambda_2 = -i$:

$$A - (-i)I = \begin{pmatrix} i & -1 \\ 1 & i \end{pmatrix}$$

From the first row: $ix - y = 0$, so $y = ix$.

Let $x = 1$. Then $y = i$.

$$\vec{v}_2 = \begin{pmatrix} 1 \\ i \end{pmatrix}$$

Summary:

  • Eigenvalue $\lambda_1 = i$ has eigenvector $\vec{v}_1 = \begin{pmatrix} 1 \\ -i \end{pmatrix}$
  • Eigenvalue $\lambda_2 = -i$ has eigenvector $\vec{v}_2 = \begin{pmatrix} 1 \\ i \end{pmatrix}$

Interpretation: The complex eigenvalues $\pm i$ can be written in polar form as $e^{\pm i\pi/2}$. The angle $\pi/2$ corresponds to 90 degrees, which is exactly the rotation angle. In general, complex eigenvalues $a \pm bi$ of a real $2 \times 2$ matrix correspond (in a suitable basis) to rotation by angle $\theta = \arctan(b/a)$ combined with scaling by the factor $\sqrt{a^2 + b^2}$. Here, $|i| = 1$ confirms that a pure rotation does not change lengths.

Verification: Let us verify for $\lambda_1 = i$:

$$A\vec{v}_1 = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 1 \\ -i \end{pmatrix} = \begin{pmatrix} 0 \cdot 1 + (-1)(-i) \\ 1 \cdot 1 + 0 \cdot (-i) \end{pmatrix} = \begin{pmatrix} i \\ 1 \end{pmatrix}$$

$$i\vec{v}_1 = i\begin{pmatrix} 1 \\ -i \end{pmatrix} = \begin{pmatrix} i \\ -i^2 \end{pmatrix} = \begin{pmatrix} i \\ 1 \end{pmatrix}$$

Indeed, $A\vec{v}_1 = i\vec{v}_1$. The eigenvector equation is satisfied.
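
NumPy handles the complex case transparently (a sketch; the order in which `np.linalg.eig` returns the conjugate pair is not guaranteed):

```python
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])    # 90-degree counterclockwise rotation
vals, vecs = np.linalg.eig(R)
print(vals)                    # the conjugate pair i and -i

# The eigenvector equation holds over the complex numbers:
for lam, v in zip(vals, vecs.T):
    assert np.allclose(R @ v, lam * v)
```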

Key Properties and Rules

Summary of Eigenvalue Properties

| Property | Statement |
| --- | --- |
| Sum of eigenvalues | Equals the trace of $A$ (sum of diagonal entries) |
| Product of eigenvalues | Equals $\det(A)$ |
| Eigenvalues of $A^{-1}$ | Are $1/\lambda$ for each eigenvalue $\lambda$ of $A$ |
| Eigenvalues of $A^k$ | Are $\lambda^k$ for each eigenvalue $\lambda$ of $A$ |
| Eigenvalues of $A + cI$ | Are $\lambda + c$ for each eigenvalue $\lambda$ of $A$ |
| Eigenvalues of $cA$ | Are $c\lambda$ for each eigenvalue $\lambda$ of $A$ |
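
Each of these properties can be verified numerically; a sketch using the matrix from Example 3 (eigenvalues 2 and 5):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])                 # eigenvalues 2 and 5
lam = np.sort(np.linalg.eigvals(A))

def spec(M):
    """Sorted eigenvalues of M."""
    return np.sort(np.linalg.eigvals(M))

assert np.allclose(lam.sum(), np.trace(A))                  # sum = trace
assert np.allclose(lam.prod(), np.linalg.det(A))            # product = det
assert np.allclose(spec(np.linalg.inv(A)), np.sort(1/lam))  # A^{-1}: 1/lambda
assert np.allclose(spec(A @ A), lam**2)                     # A^2: lambda^2
assert np.allclose(spec(A + 3*np.eye(2)), lam + 3)          # A + 3I: lambda + 3
assert np.allclose(spec(2*A), 2*lam)                        # 2A: 2*lambda
print("all six properties check out")
```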

Finding Eigenvalues: The Algorithm

  1. Compute $A - \lambda I$ by subtracting $\lambda$ from each diagonal entry.
  2. Compute $\det(A - \lambda I)$ to get the characteristic polynomial.
  3. Solve $\det(A - \lambda I) = 0$ to find the eigenvalues.

Finding Eigenvectors: The Algorithm

For each eigenvalue $\lambda$:

  1. Compute $A - \lambda I$.
  2. Row reduce to find the null space of $A - \lambda I$.
  3. The non-zero vectors in the null space are the eigenvectors for $\lambda$.

Special Cases

  • Diagonal matrices: Eigenvalues are the diagonal entries. Eigenvectors are the standard basis vectors.
  • Triangular matrices: Eigenvalues are the diagonal entries. (Eigenvectors must still be computed.)
  • Real symmetric matrices: Always have real eigenvalues and orthogonal eigenvectors.
  • Rotation matrices: Have complex conjugate eigenvalues $e^{\pm i\theta}$ where $\theta$ is the rotation angle.

The Big Picture: Why Eigenvalues Matter

If a matrix $A$ has $n$ linearly independent eigenvectors, then $A$ can be diagonalized: there exists an invertible matrix $P$ such that

$$P^{-1}AP = D$$

where $D$ is a diagonal matrix with the eigenvalues on the diagonal. This makes computing powers of $A$ easy:

$$A^k = PD^kP^{-1}$$

Since $D$ is diagonal, $D^k$ is obtained by raising each diagonal entry to the $k$-th power. This is much faster than multiplying $A$ by itself $k$ times.
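
A sketch of this speedup in NumPy (this matrix has two independent eigenvectors, so it is diagonalizable and `P` is invertible):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
lam, P = np.linalg.eig(A)       # columns of P are eigenvectors

k = 10
# A^k = P D^k P^{-1}; D^k just raises each eigenvalue to the k-th power.
Ak = P @ np.diag(lam**k) @ np.linalg.inv(P)

assert np.allclose(Ak, np.linalg.matrix_power(A, k))   # matches repeated multiplication
print(Ak)
```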

Real-World Applications

Google PageRank Algorithm

The PageRank algorithm, which made Google the dominant search engine, is built on eigenvalues and eigenvectors. The internet is modeled as a giant matrix where entry $(i, j)$ represents the probability of clicking from page $j$ to page $i$. The ranking of web pages is determined by the eigenvector corresponding to eigenvalue 1 of this matrix.

Pages that are linked to by many important pages get higher components in this eigenvector, making them rank higher. The eigenvector represents the long-term distribution of a random web surfer clicking links, and this is exactly what PageRank uses to measure importance.
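
A toy sketch of the idea (the three-page link structure is made up for illustration, and real PageRank adds a damping factor, omitted here): power iteration converges to the eigenvector with eigenvalue 1.

```python
import numpy as np

# Hypothetical 3-page web. Column j holds the probabilities of clicking
# from page j to each page, so every column sums to 1.
# Page 1 links to pages 2 and 3; page 2 links to page 3; page 3 links to page 1.
M = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])

r = np.full(3, 1/3)          # start the random surfer uniformly
for _ in range(100):
    r = M @ r                # power iteration

assert np.allclose(M @ r, r)   # r is an eigenvector with eigenvalue 1
print(r)                       # long-run importance of each page
```

Here pages 1 and 3 end up ranked equally and above page 2, matching the link structure: page 2 receives only half of page 1's clicks.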

Principal Component Analysis (PCA) in Data Science

When you have data with many variables (high-dimensional data), PCA finds the directions of maximum variance. These directions are the eigenvectors of the covariance matrix, and the eigenvalues tell you how much variance lies in each direction.

By keeping only the eigenvectors with the largest eigenvalues, you can reduce the dimensionality of your data while preserving most of the information. This is essential for visualizing high-dimensional data, speeding up machine learning algorithms, and removing noise.
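
A minimal PCA sketch on synthetic data (the data generation is made up; `np.linalg.eigh` exploits the symmetry of covariance matrices and returns eigenvalues in ascending order):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data, stretched most strongly along the (1, 1) direction.
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 1.5],
                                          [1.5, 2.0]])

C = np.cov(X, rowvar=False)        # 2x2 covariance matrix
vals, vecs = np.linalg.eigh(C)     # ascending eigenvalues, orthonormal vectors

# Last column = direction of maximum variance (first principal component);
# its eigenvalue is the variance captured along that direction.
print(vals[-1], vecs[:, -1])
```

With this data the top eigenvector comes out close to $(1, 1)/\sqrt{2}$, recovering the stretch direction from the samples alone.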

Vibration Modes in Mechanical Systems

When engineers design bridges, buildings, or aircraft, they need to understand how the structure vibrates. Each structure has natural frequencies at which it “wants” to vibrate. These natural frequencies are eigenvalues of a matrix describing the system, and the vibration patterns (mode shapes) are the eigenvectors.

Matching an external force to a natural frequency causes resonance, which can be catastrophic (like the Tacoma Narrows Bridge collapse). Engineers use eigenvalue analysis to ensure natural frequencies are far from expected forcing frequencies.

Stability Analysis in Differential Equations

Systems of differential equations, which model everything from population dynamics to electrical circuits, can be analyzed using eigenvalues. The eigenvalues of the coefficient matrix determine whether the system is stable (solutions decay to equilibrium), unstable (solutions blow up), or oscillatory.

If all eigenvalues have negative real parts, the system is stable. If any eigenvalue has a positive real part, the system is unstable. Purely imaginary eigenvalues (like $\pm i$ in the rotation example) correspond to sustained oscillations without growth or decay.
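
A sketch of this test in code (the function name `classify_equilibrium` is my own; the tolerance guards against floating-point noise in the computed real parts):

```python
import numpy as np

def classify_equilibrium(A, tol=1e-9):
    """Classify the system x' = A x by the real parts of the eigenvalues of A."""
    re = np.real(np.linalg.eigvals(A))
    if np.all(re < -tol):
        return "stable"
    if np.any(re > tol):
        return "unstable"
    return "marginal (possible sustained oscillation)"

print(classify_equilibrium(np.array([[-1.0, 0.0], [0.0, -2.0]])))  # stable
print(classify_equilibrium(np.array([[0.0, -1.0], [1.0,  0.0]])))  # marginal
print(classify_equilibrium(np.array([[1.0,  0.0], [0.0, -3.0]])))  # unstable
```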

Self-Test Problems

Problem 1: Verify that $\vec{v} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is an eigenvector of $A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}$ and find the eigenvalue.

Show Answer

Compute $A\vec{v}$:

$$A\vec{v} = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}\begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 4 \cdot 1 + 1 \cdot 1 \\ 2 \cdot 1 + 3 \cdot 1 \end{pmatrix} = \begin{pmatrix} 5 \\ 5 \end{pmatrix}$$

Is this a scalar multiple of $\vec{v} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$?

$$\begin{pmatrix} 5 \\ 5 \end{pmatrix} = 5\begin{pmatrix} 1 \\ 1 \end{pmatrix} = 5\vec{v}$$

Yes, $A\vec{v} = 5\vec{v}$.

Answer: The vector $\vec{v} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is an eigenvector of $A$ with eigenvalue $\lambda = 5$.

Problem 2: Find the eigenvalues of $A = \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix}$.

Show Answer

Characteristic equation: $\det(A - \lambda I) = 0$.

$$\det\begin{pmatrix} 1 - \lambda & 4 \\ 2 & 3 - \lambda \end{pmatrix} = (1-\lambda)(3-\lambda) - 8$$

$$= 3 - \lambda - 3\lambda + \lambda^2 - 8 = \lambda^2 - 4\lambda - 5$$

Solve $\lambda^2 - 4\lambda - 5 = 0$:

Factor: $(\lambda - 5)(\lambda + 1) = 0$

Eigenvalues: $\lambda_1 = 5$ and $\lambda_2 = -1$.

Check: trace $= 1 + 3 = 4 = 5 + (-1)$. Determinant $= 1 \cdot 3 - 4 \cdot 2 = -5 = 5 \cdot (-1)$. Both check out.

Problem 3: For the matrix in Problem 2, find an eigenvector for $\lambda = 5$.

Show Answer

Compute $A - 5I$:

$$A - 5I = \begin{pmatrix} 1 - 5 & 4 \\ 2 & 3 - 5 \end{pmatrix} = \begin{pmatrix} -4 & 4 \\ 2 & -2 \end{pmatrix}$$

Solve $(A - 5I)\vec{v} = \vec{0}$:

Row 1: $-4x + 4y = 0 \Rightarrow x = y$

Row 2: $2x - 2y = 0 \Rightarrow x = y$ (same condition)

Let $y = t$, then $x = t$.

Eigenvector: $\vec{v} = t\begin{pmatrix} 1 \\ 1 \end{pmatrix}$. A representative eigenvector is $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$.

Problem 4: Find the eigenvalues of $A = \begin{pmatrix} 3 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 5 \end{pmatrix}$.

Show Answer

This is a diagonal matrix. The eigenvalues are the diagonal entries:

Eigenvalues: $\lambda_1 = 3$ (with algebraic multiplicity 2) and $\lambda_2 = 5$ (with algebraic multiplicity 1).

Problem 5: If $A$ has eigenvalue $\lambda = 4$, what is the corresponding eigenvalue of $A^2$? What about $A^{-1}$ (assuming $A$ is invertible)?

Show Answer

If $A\vec{v} = 4\vec{v}$, then:

For $A^2$: $A^2\vec{v} = A(A\vec{v}) = A(4\vec{v}) = 4(A\vec{v}) = 4 \cdot 4\vec{v} = 16\vec{v}$

The eigenvalue of $A^2$ is $\lambda^2 = 16$.

For $A^{-1}$: From $A\vec{v} = 4\vec{v}$, multiply both sides by $A^{-1}$:

$\vec{v} = 4A^{-1}\vec{v}$

$A^{-1}\vec{v} = \frac{1}{4}\vec{v}$

The eigenvalue of $A^{-1}$ is $1/\lambda = 1/4$.

Problem 6: Find the eigenvalues of $A = \begin{pmatrix} 2 & -1 \\ 1 & 2 \end{pmatrix}$.

Show Answer

Characteristic polynomial:

$$\det\begin{pmatrix} 2 - \lambda & -1 \\ 1 & 2 - \lambda \end{pmatrix} = (2-\lambda)^2 + 1 = \lambda^2 - 4\lambda + 4 + 1 = \lambda^2 - 4\lambda + 5$$

Use the quadratic formula:

$$\lambda = \frac{4 \pm \sqrt{16 - 20}}{2} = \frac{4 \pm \sqrt{-4}}{2} = \frac{4 \pm 2i}{2} = 2 \pm i$$

Eigenvalues: $\lambda_1 = 2 + i$ and $\lambda_2 = 2 - i$ (complex conjugate pair).

This matrix represents rotation combined with scaling. The magnitude $|2 + i| = \sqrt{5}$ is the scaling factor, and $\arctan(1/2)$ is the rotation angle.

Problem 7: A $3 \times 3$ matrix $A$ has eigenvalues $1$, $2$, and $3$. What is $\det(A)$? What is the trace of $A$?

Show Answer

The determinant equals the product of eigenvalues:

$$\det(A) = 1 \cdot 2 \cdot 3 = 6$$

The trace equals the sum of eigenvalues:

$$\text{trace}(A) = 1 + 2 + 3 = 6$$

Summary

  • An eigenvector of a matrix $A$ is a non-zero vector $\vec{v}$ such that $A\vec{v} = \lambda\vec{v}$ for some scalar $\lambda$. The scalar $\lambda$ is the corresponding eigenvalue.

  • Geometrically, eigenvectors point in directions that the transformation preserves (up to scaling). The eigenvalue tells you the scaling factor along that direction.

  • To find eigenvalues, solve the characteristic equation: $\det(A - \lambda I) = 0$. The left side is the characteristic polynomial.

  • To find eigenvectors for a given $\lambda$, solve $(A - \lambda I)\vec{v} = \vec{0}$, which means finding the null space of $A - \lambda I$.

  • The eigenspace for $\lambda$ is the set of all eigenvectors for that eigenvalue, plus the zero vector. It is a subspace.

  • Algebraic multiplicity is how many times $\lambda$ appears as a root of the characteristic polynomial. Geometric multiplicity is the dimension of the eigenspace.

  • Some matrices have complex eigenvalues. This happens when the characteristic polynomial has no real roots, typically for matrices involving rotation.

  • For diagonal and triangular matrices, the eigenvalues are simply the diagonal entries.

  • Key relationships: the sum of eigenvalues equals the trace, and the product of eigenvalues equals the determinant.

  • Applications include Google PageRank (finding the dominant eigenvector), PCA (eigenvalues of covariance matrices), vibration analysis (natural frequencies), and stability analysis (sign of eigenvalue real parts).