ℹ️ Skipped - page is already crawled
| Filter | Status | Condition | Details |
|---|---|---|---|
| HTTP status | PASS | download_http_code = 200 | HTTP 200 |
| Age cutoff | PASS | download_stamp > now() - 6 MONTH | 0.4 months ago |
| History drop | PASS | isNull(history_drop_reason) | No drop reason |
| Spam/ban | PASS | fh_dont_index != 1 AND ml_spam_score = 0 | ml_spam_score=0 |
| Canonical | PASS | meta_canonical IS NULL OR = '' OR = src_unparsed | Not set |

| Property | Value |
|---|---|
| URL | https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors |
| Last Crawled | 2026-03-30 11:05:39 (12 days ago) |
| First Indexed | 2019-02-23 07:44:30 (7 years ago) |
| HTTP Status Code | 200 |
| Meta Title | Eigenvalues and eigenvectors |
| Meta Description | Introduction to eigenvalues and eigenvectors. Developing intuition. Definition, spectrum, eigenspace. |
| Meta Canonical | null |
| Markdown | 
# Eigenvalues and eigenvectors
by [Marco Taboga](https://www.statlect.com/about/#author), PhD
This lecture introduces the concepts of eigenvalues and eigenvectors of a square matrix. These are amongst the most useful concepts in linear algebra: the eigenvalues and eigenvectors of a square matrix come up very frequently in applied work.

Table of contents
1. [Intuition](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid2)
2. [Definition](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid3)
3. [Characteristic equation](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid4)
4. [Spectrum](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid5)
5. [Eigenspace](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid6)
6. [Solved exercises](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid7)
1. [Exercise 1](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid8)
2. [Exercise 2](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid9)
## Intuition
Let us first develop some intuition about eigenvalues and eigenvectors. To do so, we start from some concepts we explained in the lecture on the [Determinant of a matrix](https://www.statlect.com/matrix-algebra/determinant-of-a-matrix).
Consider the [linear space](https://www.statlect.com/matrix-algebra/linear-spaces) of all $2\times 1$ real vectors, which can be represented as a Cartesian plane. A vector $x$ is a point in the plane, and the first and second entries of $x$ are the coordinates of the point.
Now, consider a $2\times 2$ matrix $A$. The matrix transforms any set of points $S$ (e.g., a rectangle, a circle) into another set of points $T$: ![\[eq1\]](https://www.statlect.com/images/eigenvalues-and-eigenvectors__9.png)
If the points of $S$ form a region whose area is equal to $\operatorname{area}(S)$, and the area of the transformed region $T$ is $\operatorname{area}(T)$, then $$\operatorname{area}(T)=\left|\det(A)\right|\operatorname{area}(S)$$
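The area-scaling property is easy to check numerically. The sketch below is an illustration added here (the matrix is an arbitrary example, not taken from the lecture): it maps the unit square through a $2\times 2$ matrix and compares the area of the image, computed with the shoelace formula, to the absolute value of the determinant.

```python
import numpy as np

def shoelace_area(pts):
    """Area of a polygon given its vertices in order (shoelace formula)."""
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])                           # example matrix, det(A) = 6

square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]])  # unit square, area 1
image = square @ A.T                                 # transform each vertex: x -> A x

print(shoelace_area(image))                          # 6.0, equal to |det(A)|
```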
In other words, the determinant tells us by how much the linear transformation associated with the matrix $A$ scales up or down the area of shapes. Eigenvalues and eigenvectors provide us with another useful piece of information. **They tell us by how much the linear transformation scales up or down the sides of certain parallelograms**.
Consider the parallelograms that have one vertex at the origin of the Cartesian plane. The four vertices are $$0,\quad x_1,\quad x_2,\quad x_1+x_2$$ where $x_1$ and $x_2$ are $2\times 1$ vectors and $0$ is the zero vector.
There are two vectors $x_1$ and $x_2$, called the eigenvectors of $A$, such that the associated parallelogram is transformed by $A$ into a new parallelogram having vertices $$0,\quad \lambda_1 x_1,\quad \lambda_2 x_2,\quad \lambda_1 x_1+\lambda_2 x_2$$ where $\lambda_1$ and $\lambda_2$ are two scalars called the eigenvalues of $A$. In other words, the linear transformation multiplies the length of one pair of parallel sides by $\lambda_1$ and the length of the other pair by $\lambda_2$, but it keeps the angles of the parallelogram unchanged. The next figure provides an illustration of this kind of transformation: the original parallelogram (in blue) is transformed into another parallelogram (in red) by a matrix $A$ whose eigenvalues are $\lambda_1$ and $\lambda_2$.

Since a pair of parallel sides is scaled by $\lambda_1$ and the other pair by $\lambda_2$, the area of the parallelogram is scaled by a factor of $\lambda_1\lambda_2$. But we also know that the area of the parallelogram is scaled by $\left|\det(A)\right|$. As a consequence, ![\[eq7\]](https://www.statlect.com/images/eigenvalues-and-eigenvectors__38.png) that is, the determinant of a matrix is equal to the product of its eigenvalues, a fact that holds in general.
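This fact can be verified numerically for any square matrix; the sketch below (the matrix is chosen arbitrarily for illustration) compares the determinant with the product of the eigenvalues returned by NumPy.

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])    # arbitrary 3x3 example

det = np.linalg.det(A)
eigenvalues = np.linalg.eigvals(A)

# The determinant equals the product of the eigenvalues.
print(np.isclose(det, np.prod(eigenvalues)))   # True
```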
The definition of eigenvalues and eigenvectors we are going to provide below generalizes these concepts to linear spaces that can have more than two dimensions.
## Definition
We are now ready to define eigenvalues and eigenvectors.
Definition Let $A$ be a $K\times K$ [matrix](https://www.statlect.com/matrix-algebra/vectors-and-matrices). If there exist a $K\times 1$ non-zero vector $x$ and a scalar $\lambda$ such that $$Ax=\lambda x$$ then $\lambda$ is called an eigenvalue of $A$ and $x$ an eigenvector corresponding to $\lambda$.
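Numerically, the defining equation $Ax=\lambda x$ can be checked for every eigenpair returned by `numpy.linalg.eig` (the matrix below is an arbitrary example, not from the lecture):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])         # arbitrary symmetric example

eigenvalues, eigenvectors = np.linalg.eig(A)  # columns of `eigenvectors` are the x's

for i in range(len(eigenvalues)):
    lam = eigenvalues[i]
    x = eigenvectors[:, i]
    # A x should equal lambda * x, up to floating-point error
    assert np.allclose(A @ x, lam * x)

print(sorted(eigenvalues))   # the eigenvalues of [[2,1],[1,2]] are 1 and 3
```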
This definition fits with the example above about the vertices of the parallelogram. The two vertices $x_1$ and $x_2$ are eigenvectors corresponding to the eigenvalues $\lambda_1$ and $\lambda_2$ because $$Ax_1=\lambda_1 x_1,\qquad Ax_2=\lambda_2 x_2$$ Furthermore, these two equations can be added so as to obtain the transformation of the vertex $x_1+x_2$: ![\[eq10\]](https://www.statlect.com/images/eigenvalues-and-eigenvectors__55.png)
## Characteristic equation
Note that the eigenvalue equation $$Ax=\lambda x$$ can be written as $$(A-\lambda I)x=0$$ where $I$ is the $K\times K$ [identity matrix](https://www.statlect.com/matrix-algebra/identity-matrix). The latter equation has a non-zero solution only if the columns of the matrix $A-\lambda I$ are [linearly dependent](https://www.statlect.com/matrix-algebra/linear-independence), that is, if the matrix is [singular](https://www.statlect.com/matrix-algebra/inverse-matrix). But [a matrix is singular if and only if its determinant is zero](https://www.statlect.com/matrix-algebra/determinant-properties). As a consequence, any eigenvalue of $A$ must satisfy the equation $$\det(A-\lambda I)=0$$ which is called the **characteristic equation**.
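Each eigenvalue should make $A-\lambda I$ singular, i.e. drive $\det(A-\lambda I)$ to zero. A quick numerical check (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

I = np.eye(2)
for lam in np.linalg.eigvals(A):
    # det(A - lambda*I) vanishes at every eigenvalue
    print(lam, np.linalg.det(A - lam * I))   # determinants are ~0
```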
The expression $\det(A-\lambda I)$ is a polynomial of degree $K$ in $\lambda$ (monic up to a sign of $(-1)^K$), known as the characteristic polynomial.
By using the [fundamental theorem of algebra](https://www.statlect.com/matrix-algebra/polynomials-in-linear-algebra), it is possible to write the characteristic equation as ![\[eq15\]](https://www.statlect.com/images/eigenvalues-and-eigenvectors__66.png) where $\lambda_1,\ldots,\lambda_K$ are the $K$ solutions of the equation (i.e., the roots of the characteristic polynomial).
The fundamental theorem of algebra guarantees that exactly $K$ solutions exist, but these solutions are not guaranteed to be real (i.e., they can be complex numbers), even when the entries of $A$ are all real. They are also not guaranteed to be distinct, that is, two solutions could be equal.
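A real matrix with complex eigenvalues is easy to exhibit: a 90-degree rotation matrix maps no direction to a multiple of itself, so it has no real eigenvector, and its eigenvalues are $\pm i$. This example is an illustration added here, not part of the original lecture:

```python
import numpy as np

# Rotation by 90 degrees: a real matrix with purely imaginary eigenvalues
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

print(np.linalg.eigvals(R))   # the eigenvalues are +i and -i
```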
## Spectrum
In the previous section we have explained that a $K\times K$ matrix $A$ has $K$ not necessarily distinct and possibly complex eigenvalues. The set of all eigenvalues of $A$ is called the spectrum of $A$.
## Eigenspace
Note that if $$Ax=\lambda x$$ then you can multiply both sides of the equation by a non-zero scalar $c$ and get $$A(cx)=\lambda(cx)$$
In other words, if $\lambda$ is an eigenvalue of $A$ and $x$ is an eigenvector corresponding to $\lambda$, then any non-zero multiple of $x$ is an eigenvector corresponding to $\lambda$. Thus, the eigenvector corresponding to a given eigenvalue is not unique. In this section we prove that the set of all eigenvectors corresponding to a given eigenvalue, together with the zero vector, is a linear space.
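The rescaling argument is straightforward to verify in code (the matrix is an arbitrary example; `c` ranges over a few non-zero scalars):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)

lam = eigenvalues[0]
x = eigenvectors[:, 0]

for c in (2.0, -0.5, 100.0):          # non-zero scalars
    # c*x is still an eigenvector for the same eigenvalue
    assert np.allclose(A @ (c * x), lam * (c * x))
print("all multiples verified")
```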
Definition Let $A$ be a $K\times K$ matrix and $\lambda$ one of its eigenvalues. The union of the zero vector and the set of all the eigenvectors corresponding to the eigenvalue $\lambda$ is called the eigenspace of $\lambda$.
Note that we include the zero vector in the eigenspace because eigenvectors are required to be non-zero.
The next proposition shows that an eigenspace is closed with respect to [linear combinations](https://www.statlect.com/matrix-algebra/linear-combinations), that is, it is a linear space.
Proposition The eigenspace corresponding to an eigenvalue is a [linear space](https://www.statlect.com/matrix-algebra/linear-spaces).
Proof
Suppose that $\lambda$ is an eigenvalue of a square matrix $A$ and take any two vectors $x_1$ and $x_2$ belonging to the eigenspace of $\lambda$. Then, $$Ax_1=\lambda x_1,\qquad Ax_2=\lambda x_2$$ Now take a linear combination $y$ of the two eigenvectors ![\[eq20\]](https://www.statlect.com/images/eigenvalues-and-eigenvectors__97.png) where $c_1$ and $c_2$ are two scalars. Then, ![\[eq21\]](https://www.statlect.com/images/eigenvalues-and-eigenvectors__100.png) Thus, $y$ satisfies $Ay=\lambda y$, so it belongs to the eigenspace of $\lambda$. In other words, any linear combination of the vectors of the eigenspace belongs to the eigenspace.
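The closure property in the proof can be illustrated with a matrix that has a two-dimensional eigenspace. The diagonal matrix below (an example chosen for this purpose, not from the lecture) has eigenvalue 2 with eigenspace spanned by the first two coordinate vectors:

```python
import numpy as np

A = np.diag([2.0, 2.0, 5.0])     # eigenvalue 2 has a 2-dimensional eigenspace

x1 = np.array([1.0, 0.0, 0.0])   # eigenvector for eigenvalue 2
x2 = np.array([0.0, 1.0, 0.0])   # another eigenvector for eigenvalue 2

y = 3.0 * x1 - 4.0 * x2          # a linear combination of the two
# y stays in the eigenspace: A y = 2 y
print(A @ y, 2.0 * y)            # identical vectors
```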
Clearly, once an eigenvalue $\lambda$ has been found (e.g., by solving the characteristic equation), the eigenspace of $\lambda$ can be found by [solving the linear system](https://www.statlect.com/matrix-algebra/systems-of-linear-equations-and-matrices) $$(A-\lambda I)x=0$$
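In code, the eigenspace can be computed as the null space of $A-\lambda I$, for instance from the SVD: the right singular vectors associated with (numerically) zero singular values span the solution set. A minimal sketch, with an arbitrary example matrix whose eigenvalue 2 has multiplicity 2:

```python
import numpy as np

def eigenspace_basis(A, lam, tol=1e-10):
    """Orthonormal basis of the null space of (A - lam*I), via the SVD."""
    M = A - lam * np.eye(A.shape[0])
    _, s, vt = np.linalg.svd(M)
    rank = np.sum(s > tol)
    return vt[rank:].T        # rows of vt beyond the rank span the null space

A = np.diag([2.0, 2.0, 5.0])  # example: eigenvalue 2 has multiplicity 2
B = eigenspace_basis(A, 2.0)

print(B.shape[1])             # 2: the eigenspace of 2 is two-dimensional
print(np.allclose(A @ B, 2.0 * B))   # True: every basis vector is an eigenvector
```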
## Solved exercises
Below you can find some exercises with explained solutions.
### Exercise 1
Consider the matrix $A$ shown in \[eq23\]. Show that the vector shown in \[eq24\] is an eigenvector of $A$ and find its corresponding eigenvalue.
Solution
We have that ![\[eq25\]](https://www.statlect.com/images/eigenvalues-and-eigenvectors__109.png) Thus, the given vector is an eigenvector of $A$; the corresponding eigenvalue is the scalar that multiplies the vector in the equality above.
### Exercise 2
Define the matrix $A$ shown in \[eq26\]. Find the eigenvalues of $A$ by solving the characteristic equation.
Solution
The characteristic equation is ![\[eq27\]](https://www.statlect.com/images/eigenvalues-and-eigenvectors__115.png) Therefore, the eigenvalues of $A$ are the two solutions of this equation.
## How to cite
Please cite as:
Taboga, Marco (2021). "Eigenvalues and eigenvectors", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors. |
| Readable Markdown | null |
| Shard | 126 (laksa) |
| Root Hash | 3586688910177265926 |
| Unparsed URL | com,statlect!www,/matrix-algebra/eigenvalues-and-eigenvectors s443 |