ℹ️ Skipped - page is already crawled
| Filter | Status | Condition | Details |
|---|---|---|---|
| HTTP status | PASS | download_http_code = 200 | HTTP 200 |
| Age cutoff | PASS | download_stamp > now() - 6 MONTH | 1 month ago |
| History drop | PASS | isNull(history_drop_reason) | No drop reason |
| Spam/ban | PASS | fh_dont_index != 1 AND ml_spam_score = 0 | ml_spam_score=0 |
| Canonical | PASS | meta_canonical IS NULL OR = '' OR = src_unparsed | Not set |
| Property | Value |
|---|---|
| URL | https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main |
| Last Crawled | 2026-03-14 09:11:59 (29 days ago) |
| First Indexed | 2019-01-24 18:55:13 (7 years ago) |
| HTTP Status Code | 200 |
| Meta Title | EIG-0020: Finding Eigenvalues and Eigenvectors - Ximera |
| Meta Description | Ximera provides the backend technology for online courses |
| Meta Canonical | null |
| Boilerpipe Text | We explore the theory behind finding the eigenvalues and associated eigenvectors of a
square matrix.
EIG-0020: Finding Eigenvalues and Eigenvectors
Let be an matrix. In Module EIG-0010 we learned that the eigenvectors
and eigenvalues of are vectors and scalars that satisfy the equation
We listed a few reasons why we are interested in finding eigenvalues and eigenvectors,
but we did not give any process for finding them. In this module we will focus on the
process.
If a vector is an eigenvector satisfying Equation (
def:eigen
), then it also satisfies the following
equations.
This shows that any eigenvector of is in the null space of the related matrix, . Since
eigenvectors are non-zero vectors, this means that will have eigenvectors if and only
if the null space of is nontrivial. The only way that can be nontrivial is if
.
If the rank of an matrix is less than , then the matrix is singular. Since must
be singular for any eigenvalue , Theorem
th:detofsingularmatrix
implies that is an eigenvalue
of if and only if
Eigenvalues
In theory, then, to find the eigenvalues of , one can solve Equation (
eqn:chareqn
) for
.
The equation
is called the
characteristic equation
of . The left-hand side of the equation is a
polynomial in and is called the
characteristic polynomial
of .
Let . Compute the eigenvalues of this matrix using the characteristic equation.
The characteristic equation has solutions and . These are the eigenvalues of
.
Let . Compute the eigenvalues of using the characteristic equation.
and
Let . Compute the eigenvalues of using the characteristic equation.
Matrix has eigenvalues and .
In Example
ex:3x3eig
, the factor appears twice in the characteristic polynomial. This
repeated factor gives rise to the eigenvalue . We say that has
algebraic multiplicity
.
The three examples above are a bit contrived. It is not always possible to completely
factor the characteristic polynomial. However, a fundamental fact from algebra is
that every degree polynomial has roots (counting multiplicity) provided that we
allow complex numbers. This is why sometimes eigenvalues and their corresponding
eigenvectors involve complex numbers. The next example illustrates this
point.
Let . Compute the eigenvalues of this matrix.
So one of the eigenvalues of is . To get the other eigenvalues we must solve .
Using the quadratic formula, we compute that and are also eigenvalues of
.
Let . Compute the eigenvalues of this matrix.
What do you observe about the eigenvalues?
The eigenvalues are the diagonal entries
of the matrix.
What property of the matrix makes this “coincidence” possible?
is a triangular matrix.
The matrix in Exploration Problem
init:3x3tri
is a triangular matrix, and the property you
observed holds in general.
Let be a triangular matrix. Then the eigenvalues of are the entries on the main
diagonal.
Proof
See Practice Problem
prob:eigtri
.
Let be a diagonal matrix. Then the eigenvalues of are the entries on the main
diagonal.
One final note about eigenvalues. We began this section with the sentence, “In
theory, then, to find the eigenvalues of , one can solve Equation (
eqn:chareqn
) for .”
In general, one does not attempt to compute eigenvalues by solving the
characteristic equation of a matrix, as there is no simple way to solve such an
equation for . Instead, one can often approximate the eigenvalues using
iterative
methods
.
Eigenvectors
Once we have computed an eigenvalue of an matrix , the next step is to compute
the associated eigenvectors. In other words, we seek vectors such that , or
equivalently,
For any given eigenvalue there are infinitely many eigenvectors associated with it. In
fact, the eigenvectors associated with form a subspace of . (see Practice Problems
prob:eigenspace1
and
prob:eigenspace2
) This motivates the following definition.
The set of all eigenvectors associated with a given eigenvalue of a matrix is known as
the
eigenspace
associated with that eigenvalue.
So given an eigenvalue , there is an associated eigenspace , and our goal is to
find a basis of , for then any eigenvector will be a linear combination of
the vectors in that basis. Moreover, we are trying to find a basis for the
set of vectors that satisfy Equation
eqn:nullspace
, which means we seek a basis for . We
have already learned how to compute a basis of a null space - see Module
VSP-0040.
Let’s return to the examples we did in the first section of this module.
(Finding eigenvectors for Example
ex:2x2eig
)
Recall that has eigenvalues and . Compute a basis for the eigenspace
associated with each of these eigenvalues.
Eigenvectors associated with the
eigenvalue are in the null space of . So we seek a basis for . We compute:
From this we see that the eigenspace associated with consists of vectors of the form .
This means that is one possible basis for .
In a similar way, we compute a basis for , the subspace of all
eigenvectors associated with the eigenvalue . Now we compute:
Vectors in the null space have the form This means that is one possible basis for the
eigenspace .
(Finding eigenvectors for Example
ex:2x2eig2
) We know from Example
ex:2x2eig2
that has eigenvalues and .
Compute a basis for the eigenspace associated with each of these eigenvalues.
Let’s begin
by finding a basis for the eigenspace , which is the subspace of consisting of eigenvectors
corresponding to the eigenvalue . We need to compute a basis for . We compute:
From this we see that an eigenvector in has the form . This means that is one
possible basis for the eigenspace . By letting , we obtain an arguably nicer-looking
basis: .
To compute a basis for , the subspace of all eigenvectors associated to the eigenvalue , we compute:
From this we find that is one possible basis for the eigenspace .
(Finding eigenvectors for Example
ex:3x3eig
) We know from Example
ex:3x3eig
that has
eigenvalues and . Compute a basis for the eigenspace associated to each of these
eigenvalues.
We first find a basis for the eigenspace . We need to compute a basis for . We compute:
Notice that there are two free variables. The eigenvectors in have the form
So one possible basis for the eigenspace is given by .
Next we find a basis for the eigenspace . We need to compute a basis for . We compute:
This time there is one free variable. The eigenvectors in have the form , so a possible
basis for the eigenspace is given by .
(Finding eigenvectors for Example
ex:3x3_complex_eig
) We know from Example
ex:3x3_complex_eig
that has eigenvalues ,
, and . Compute a basis for the eigenspace associated with each eigenvalue.
We first
find a basis for the eigenspace . We need to compute a basis for . We compute:
From this we see that for any eigenvector in we have and , but is a free variable.
So one possible basis for the eigenspace is given by
Next we find a basis for the eigenspace . We need to compute a basis for . We compute:
There is one free variable. Setting , we get and . From this we see that
eigenvectors in have the form , so a possible basis for the eigenspace is
given by . We ask you in Practice Problem
prob:3x3_complex_ev
to show that is a basis for
.
Practice Problems
In this exercise we will prove that the eigenvectors associated with an eigenvalue of
an matrix form a subspace of .
Let and be eigenvectors of associated with . Show
that is also an eigenvector of associated with . (This shows that the set of
eigenvectors of associated with is closed under addition).
Show that the set of eigenvectors of associated with is closed under scalar
multiplication.
Compute the eigenvalues of the given matrix and find the corresponding eigenspaces.
Answer: (List the eigenvalues in an increasing order.)
A basis for is . A basis for is .
Answer:
A basis for is . A basis for is .
Let . Compute a basis for each of the eigenspaces of this matrix, , , and
.
Let .
Compute the eigenvalues of this matrix.
One of the eigenvalues of is
-3.
Answer:
(List your answers in an increasing order.)
Compute a basis for each of the eigenspaces of this matrix, , , and .
Answer: A basis for is , a basis for is ,
and a basis for is .
Complete Example
ex:3x3_complex_ev
by showing that a basis for is given by , where is the
eigenspace associated with the eigenvalue of the matrix .
Prove Theorem
th:eigtri
. (HINT: Proceed by induction on the dimension n. For the
inductive step, compute by expanding along the first column (or row) if is upper
(lower) triangular.)
The following set of problems deals with geometric interpretation of eigenvalues and
eigenvectors, as well as linear transformations of the plane. Please use EIG-0010 and
LTR-0070 for reference.
Recall that a vertical stretch/compression of the plane is a
linear transformation whose standard matrix is
Find the eigenvalues of . Find a basis for the eigenspace corresponding to each
eigenvalue.
Answer: A basis for is and a basis for is
Sketch several vectors in each eigenspace and use geometry to explain why the
eigenvectors you sketched make sense.
Recall that a horizontal shear of the plane is a linear transformation whose standard
matrix is
Find the eigenvalue of .
Answer:
Find a basis for the eigenspace corresponding to .
Answer: A basis for is
Sketch several vectors in the eigenspace and use geometry to explain why the
eigenvectors you sketched make sense.
Recall that a counterclockwise rotation of the plane through angle is a linear
transformation whose standard matrix is
Verify that the eigenvalues of are
Explain why is a real number if and only if is a multiple of . (Compare this to
Practice Problem
prob:rotmatrixrealeig1
of EIG-0010.)
Suppose is a multiple of . Then the eigenspaces corresponding to the two eigenvalues
are the same. Which of the following describes the eigenspace?
All vectors in .
All vectors along the -axis.
All vectors along the -axis.
All vectors along the line .
Recall that a reflection of the plane about the line is a linear transformation whose
standard matrix is
Verify that the eigenvalues of are
Find a basis for eigenspaces and . (For simplicity, assume that .)
Answer: A basis for is and a basis for is
Choose the best description of .
All vectors in .
All vectors with “slope” .
All
vectors with “slope” .
All vectors with “slope” .
Choose the best description of .
All vectors along the line .
All vectors parallel to
the -axis.
All vectors parallel to the -axis.
All vectors perpendicular to the line .
Use geometry to explain why the eigenspaces you found make sense.
Exercise Source
Practice Problem
prob:3x3fromKuttler1
is adapted from Problem 7.1.11 of Ken Kuttler’s
A First Course in
Linear Algebra
. (CC-BY)
Ken Kuttler, A First Course in Linear Algebra, Lyryx 2017, Open Edition, p.
361. |
| Markdown |
### EIG-0020: Finding Eigenvalues and Eigenvectors
Let $A$ be an $n \times n$ matrix. In Module EIG-0010 we learned that the eigenvectors and eigenvalues of $A$ are vectors $\vec{x}$ and scalars $\lambda$ that satisfy the equation $A\vec{x} = \lambda \vec{x}$.
We listed a few reasons why we are interested in finding eigenvalues and eigenvectors, but we did not give any process for finding them. In this module we will focus on the process.
If a vector $\vec{x}$ is an eigenvector satisfying Equation ([def:eigen](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#def:eigen)), then it also satisfies the following equations: $A\vec{x} - \lambda\vec{x} = \vec{0}$, so $(A - \lambda I)\vec{x} = \vec{0}$.
This shows that any eigenvector $\vec{x}$ of $A$ is in the null space of the related matrix $A - \lambda I$. Since eigenvectors are non-zero vectors, $A$ will have eigenvectors if and only if the null space of $A - \lambda I$ is nontrivial. The only way that the null space of $A - \lambda I$ can be nontrivial is if $\operatorname{rank}(A - \lambda I) < n$.
If the rank of an $n \times n$ matrix is less than $n$, then the matrix is singular. Since $A - \lambda I$ must be singular for any eigenvalue $\lambda$, Theorem [th:detofsingularmatrix](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#th:detofsingularmatrix) implies that $\lambda$ is an eigenvalue of $A$ if and only if
#### Eigenvalues
In theory, then, to find the eigenvalues of $A$, one can solve Equation ([eqn:chareqn](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#eqn:chareqn)) for $\lambda$.
The equation $\det(A - \lambda I) = 0$ is called the *characteristic equation* of $A$. The left-hand side of the equation is a polynomial in $\lambda$ and is called the *characteristic polynomial* of $A$.
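Numerically, the recipe just described (set the characteristic polynomial to zero and find its roots) can be sketched as follows. The matrices in the page's own examples did not survive text extraction, so the symmetric 2×2 matrix below is a made-up stand-in:

```python
import numpy as np

# Stand-in matrix; the entries from the original examples were lost in
# extraction, so these values are illustrative only.
A = np.array([[2.0, 1.0], [1.0, 2.0]])

# For a 2x2 matrix, det(A - t*I) = t^2 - trace(A)*t + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]

# The eigenvalues are the roots of the characteristic equation.
eigenvalues = np.roots(coeffs)
print(sorted(eigenvalues.real))  # approximately [1.0, 3.0]
```

Here `np.roots` plays the role of factoring the characteristic polynomial by hand.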
Let . Compute the eigenvalues of this matrix using the characteristic equation.
The characteristic equation has solutions and . These are the eigenvalues of .
Let . Compute the eigenvalues of using the characteristic equation.
and
Let . Compute the eigenvalues of using the characteristic equation.
Matrix has eigenvalues and .
In Example [ex:3x3eig](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#ex:3x3eig), the factor appears twice in the characteristic polynomial. This repeated factor gives rise to the eigenvalue . We say that has *algebraic multiplicity* .
The three examples above are a bit contrived. It is not always possible to completely factor the characteristic polynomial. However, a fundamental fact from algebra is that every degree polynomial has roots (counting multiplicity) provided that we allow complex numbers. This is why sometimes eigenvalues and their corresponding eigenvectors involve complex numbers. The next example illustrates this point.
Let . Compute the eigenvalues of this matrix.
So one of the eigenvalues of is . To get the other eigenvalues we must solve . Using the quadratic formula, we compute that and are also eigenvalues of .
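The complex case can be checked the same way. A rotation-style matrix (again a made-up stand-in, since the original example's entries were stripped during extraction) has no real eigenvalues:

```python
import numpy as np

# Made-up real matrix with complex eigenvalues: a 90-degree rotation.
A = np.array([[0.0, -1.0], [1.0, 0.0]])

# Characteristic equation: t^2 + 1 = 0, so t = +/- i.
eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)  # two purely imaginary eigenvalues, +i and -i
```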
Let . Compute the eigenvalues of this matrix.
What do you observe about the eigenvalues?
The eigenvalues are the diagonal entries of the matrix.
What property of the matrix makes this “coincidence” possible?
is a triangular matrix.
The matrix in Exploration Problem [init:3x3tri](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#init:3x3tri) is a triangular matrix, and the property you observed holds in general.
Let be a triangular matrix. Then the eigenvalues of are the entries on the main diagonal.
Proof
See Practice Problem [prob:eigtri](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#prob:eigtri).
Let be a diagonal matrix. Then the eigenvalues of are the entries on the main diagonal.
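The theorem and its corollary are easy to check numerically; the entries of the triangular matrix below are made up for the sketch:

```python
import numpy as np

# An upper triangular matrix: everything below the main diagonal is zero.
T = np.array([[2.0, 7.0, -1.0],
              [0.0, 5.0,  3.0],
              [0.0, 0.0, -4.0]])

# The computed eigenvalues agree with the diagonal entries
# (up to floating-point error).
eigenvalues = np.linalg.eigvals(T)
print(sorted(eigenvalues), sorted(np.diag(T)))
```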
One final note about eigenvalues. We began this section with the sentence, “In theory, then, to find the eigenvalues of $A$, one can solve Equation ([eqn:chareqn](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#eqn:chareqn)) for $\lambda$.” In general, one does not attempt to compute eigenvalues by solving the characteristic equation of a matrix, as there is no simple way to solve such an equation for $\lambda$. Instead, one can often approximate the eigenvalues using *iterative methods*.
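Power iteration is the simplest of the iterative methods alluded to above: repeatedly multiplying a vector by the matrix makes it line up with an eigenvector of the largest-magnitude eigenvalue. This is a generic sketch with a made-up matrix, not an algorithm from this page:

```python
import numpy as np

def power_iteration(A, iters=200):
    """Approximate the dominant (largest-magnitude) eigenvalue of A."""
    x = np.ones(A.shape[1])
    for _ in range(iters):
        x = A @ x                  # push x toward the dominant eigenvector
        x = x / np.linalg.norm(x)  # renormalize to avoid overflow
    # The Rayleigh quotient of the limiting vector estimates the eigenvalue.
    return x @ A @ x

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # made-up test matrix
print(power_iteration(A))  # approximately 3.0
```

Production eigensolvers use more robust relatives of this idea (e.g. the QR algorithm), but the principle is the same.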
#### Eigenvectors
Once we have computed an eigenvalue $\lambda$ of an $n \times n$ matrix $A$, the next step is to compute the associated eigenvectors. In other words, we seek vectors $\vec{x}$ such that $A\vec{x} = \lambda\vec{x}$, or equivalently, $(A - \lambda I)\vec{x} = \vec{0}$.
For any given eigenvalue there are infinitely many eigenvectors associated with it. In fact, the eigenvectors associated with form a subspace of . (See Practice Problems [prob:eigenspace1](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#prob:eigenspace1) and [prob:eigenspace2](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#prob:eigenspace2).) This motivates the following definition.
The set of all eigenvectors associated with a given eigenvalue of a matrix is known as the *eigenspace* associated with that eigenvalue.
So given an eigenvalue , there is an associated eigenspace , and our goal is to find a basis of , for then any eigenvector will be a linear combination of the vectors in that basis. Moreover, we are trying to find a basis for the set of vectors that satisfy Equation [eqn:nullspace](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#eqn:nullspace), which means we seek a basis for . We have already learned how to compute a basis of a null space; see Module VSP-0040.
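The defining property (eigenvectors for an eigenvalue lie in the null space of the shifted matrix) can be verified numerically. The matrix and eigenvalue below are made-up stand-ins for the examples whose entries were stripped:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # made-up stand-in matrix
lam = 3.0                               # one of its eigenvalues

# numpy returns one unit eigenvector per eigenvalue; any nonzero
# scalar multiple of it is also an eigenvector.
values, vectors = np.linalg.eig(A)
v = vectors[:, np.argmin(np.abs(values - lam))]

# Defining property: (A - lam*I) v = 0, i.e. v is in the null space.
residual = (A - lam * np.eye(2)) @ v
print(np.allclose(residual, 0))  # True
```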
Let’s return to the examples we did in the first section of this module.
(Finding eigenvectors for Example [ex:2x2eig](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#ex:2x2eig))
Recall that has eigenvalues and . Compute a basis for the eigenspace associated with each of these eigenvalues.
Eigenvectors associated with the eigenvalue are in the null space of . So we seek a basis for . We compute:
From this we see that the eigenspace associated with consists of vectors of the form . This means that is one possible basis for .
In a similar way, we compute a basis for , the subspace of all eigenvectors associated with the eigenvalue . Now we compute:
Vectors in the null space have the form This means that is one possible basis for the eigenspace .
(Finding eigenvectors for Example [ex:2x2eig2](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#ex:2x2eig2)) We know from Example [ex:2x2eig2](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#ex:2x2eig2) that has eigenvalues and . Compute a basis for the eigenspace associated with each of these eigenvalues.
Let’s begin by finding a basis for the eigenspace , which is the subspace of consisting of eigenvectors corresponding to the eigenvalue . We need to compute a basis for . We compute:
From this we see that an eigenvector in has the form . This means that is one possible basis for the eigenspace . By letting , we obtain an arguably nicer-looking basis: .
To compute a basis for , the subspace of all eigenvectors associated to the eigenvalue , we compute:
From this we find that is one possible basis for the eigenspace .
(Finding eigenvectors for Example [ex:3x3eig](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#ex:3x3eig)) We know from Example [ex:3x3eig](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#ex:3x3eig) that has eigenvalues and . Compute a basis for the eigenspace associated to each of these eigenvalues.
We first find a basis for the eigenspace . We need to compute a basis for . We compute:
Notice that there are two free variables. The eigenvectors in have the form
So one possible basis for the eigenspace is given by .
Next we find a basis for the eigenspace . We need to compute a basis for . We compute:
This time there is one free variable. The eigenvectors in have the form , so a possible basis for the eigenspace is given by .
(Finding eigenvectors for Example [ex:3x3\_complex\_eig](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#ex:3x3_complex_eig)) We know from Example [ex:3x3\_complex\_eig](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#ex:3x3_complex_eig) that has eigenvalues , , and . Compute a basis for the eigenspace associated with each eigenvalue.
We first find a basis for the eigenspace . We need to compute a basis for . We compute:
From this we see that for any eigenvector in we have and , but is a free variable. So one possible basis for the eigenspace is given by .
Next we find a basis for the eigenspace . We need to compute a basis for . We compute:
There is one free variable. Setting , we get and . From this we see that eigenvectors in have the form , so a possible basis for the eigenspace is given by . We ask you in Practice Problem [prob:3x3\_complex\_ev](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#prob:3x3_complex_ev) to show that is a basis for .
### Practice Problems
In this exercise we will prove that the eigenvectors associated with an eigenvalue of an matrix form a subspace of .
Let and be eigenvectors of associated with . Show that is also an eigenvector of associated with . (This shows that the set of eigenvectors of associated with is closed under addition).
Show that the set of eigenvectors of associated with is closed under scalar multiplication.
Compute the eigenvalues of the given matrix and find the corresponding eigenspaces.
Answer: (List the eigenvalues in an increasing order.)
A basis for is . A basis for is .
Answer:
A basis for is . A basis for is .
Let . Compute a basis for each of the eigenspaces of this matrix, , , and .
Let .
Compute the eigenvalues of this matrix.
One of the eigenvalues of is -3.
Answer:
(List your answers in an increasing order.)
Compute a basis for each of the eigenspaces of this matrix, , , and .
Answer: A basis for is , a basis for is ,
and a basis for is .
Complete Example [ex:3x3\_complex\_ev](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#ex:3x3_complex_ev) by showing that a basis for is given by , where is the eigenspace associated with the eigenvalue of the matrix .
Prove Theorem [th:eigtri](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#th:eigtri). (HINT: Proceed by induction on the dimension n. For the inductive step, compute by expanding along the first column (or row) if is upper (lower) triangular.)
The following set of problems deals with geometric interpretation of eigenvalues and eigenvectors, as well as linear transformations of the plane. Please use EIG-0010 and LTR-0070 for reference.
Recall that a vertical stretch/compression of the plane is a linear transformation whose standard matrix is
Find the eigenvalues of . Find a basis for the eigenspace corresponding to each eigenvalue.
Answer: A basis for is and a basis for is
Sketch several vectors in each eigenspace and use geometry to explain why the eigenvectors you sketched make sense.
Recall that a horizontal shear of the plane is a linear transformation whose standard matrix is
Find the eigenvalue of .
Answer:
Find a basis for the eigenspace corresponding to .
Answer: A basis for is
Sketch several vectors in the eigenspace and use geometry to explain why the eigenvectors you sketched make sense.
Recall that a counterclockwise rotation of the plane through angle is a linear transformation whose standard matrix is
Verify that the eigenvalues of are
Explain why is a real number if and only if is a multiple of . (Compare this to Practice Problem [prob:rotmatrixrealeig1](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#prob:rotmatrixrealeig1) of EIG-0010.)
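The claim about rotation matrices is easy to verify numerically for a sample angle (chosen arbitrarily here):

```python
import numpy as np

t = np.pi / 3  # arbitrary sample angle
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

# Eigenvalues of a rotation are cos(t) +/- i*sin(t); they are real
# only when sin(t) = 0, i.e. when t is a multiple of pi.
eigenvalues = np.linalg.eigvals(R)
expected = {complex(np.cos(t), np.sin(t)), complex(np.cos(t), -np.sin(t))}
print(all(any(abs(e - x) < 1e-9 for x in expected) for e in eigenvalues))  # True
```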
Suppose is a multiple of . Then the eigenspaces corresponding to the two eigenvalues are the same. Which of the following describes the eigenspace?
All vectors in . All vectors along the -axis. All vectors along the -axis. All vectors along the line .
Recall that a reflection of the plane about the line is a linear transformation whose standard matrix is
Verify that the eigenvalues of are
Find a basis for eigenspaces and . (For simplicity, assume that .)
Answer: A basis for is and a basis for is
Choose the best description of .
All vectors in . All vectors with “slope” . All vectors with “slope” . All vectors with “slope” .
Choose the best description of .
All vectors along the line . All vectors parallel to the -axis. All vectors parallel to the -axis. All vectors perpendicular to the line .
Use geometry to explain why the eigenspaces you found make sense.
### Exercise Source
Practice Problem [prob:3x3fromKuttler1](https://ximera.osu.edu/la/LinearAlgebra/EIG-M-0020/main#prob:3x3fromKuttler1) is adapted from Problem 7.1.11 of Ken Kuttler’s [A First Course in Linear Algebra](https://open.umn.edu/opentextbooks/textbooks/a-first-course-in-linear-algebra-2017). (CC-BY)
Ken Kuttler, A First Course in Linear Algebra, Lyryx 2017, Open Edition, p. 361.
© 2013–2026, The Ohio State University — [Ximera team](https://ximera.osu.edu/about/team) |
| Readable Markdown | null |
| Shard | 160 (laksa) |
| Root Hash | 4014355313726641160 |
| Unparsed URL | edu,osu!ximera,/la/LinearAlgebra/EIG-M-0020/main s443 |