đŸ•ˇī¸ Crawler Inspector

URL Lookup

Direct Parameter Lookup

Raw Queries and Responses

1. Shard Calculation

Query:
Response:
Calculated Shard: 126 (from laksa146)
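The actual shard function is internal to the crawler, but shard assignment of this kind is typically a stable hash of the canonical URL key taken modulo the shard count. A minimal illustrative sketch (the hash choice, `NUM_SHARDS`, and the key format are assumptions, not the inspector's real implementation):

```python
import hashlib

NUM_SHARDS = 512  # assumed shard count; the real value is not shown above

def shard_for(key: str, num_shards: int = NUM_SHARDS) -> int:
    """Map a canonical URL key to a shard id via a stable hash (illustrative)."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    # Take the first 8 bytes as an integer and reduce modulo the shard count.
    return int.from_bytes(digest[:8], "big") % num_shards

shard = shard_for("com,statlect!www,/matrix-algebra/eigenvalues-and-eigenvectors s443")
```

Because the hash is deterministic, the same key always maps to the same shard, which is what lets the inspector route the status queries below to a single shard host.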

2. Crawled Status Check

Query:
Response:

3. Robots.txt Check

Query:
Response:

4. Spam/Ban Check

Query:
Response:

5. Seen Status Check

â„šī¸ Skipped - page is already crawled

📄 INDEXABLE
✅ CRAWLED (12 days ago)
🤖 ROBOTS ALLOWED

Page Info Filters

| Filter | Status | Condition | Details |
|---|---|---|---|
| HTTP status | PASS | `download_http_code = 200` | HTTP 200 |
| Age cutoff | PASS | `download_stamp > now() - 6 MONTH` | 0.4 months ago |
| History drop | PASS | `isNull(history_drop_reason)` | No drop reason |
| Spam/ban | PASS | `fh_dont_index != 1 AND ml_spam_score = 0` | ml_spam_score = 0 |
| Canonical | PASS | `meta_canonical IS NULL OR = '' OR = src_unparsed` | Not set |
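Taken together, the five page-info filters decide whether the page is INDEXABLE: every condition must pass. A minimal sketch of how such a filter chain might be evaluated in application code (the field names follow the conditions above; the defaults, the 183-day approximation of "6 MONTH", and the dict-based record are assumptions):

```python
from datetime import datetime, timedelta

def is_indexable(page: dict, now: datetime) -> bool:
    """Apply the five page-info filters; all must pass for INDEXABLE."""
    six_months = timedelta(days=183)  # rough stand-in for "6 MONTH"
    canonical = page.get("meta_canonical")
    checks = {
        "http_status": page.get("download_http_code") == 200,
        "age_cutoff": page.get("download_stamp") > now - six_months,
        "history_drop": page.get("history_drop_reason") is None,
        "spam_ban": page.get("fh_dont_index") != 1 and page.get("ml_spam_score") == 0,
        # canonical passes when unset, empty, or self-referential
        "canonical": canonical in (None, "") or canonical == page.get("src_unparsed"),
    }
    return all(checks.values())

# Record mirroring the page-details values shown below
page = {
    "download_http_code": 200,
    "download_stamp": datetime(2026, 3, 30, 11, 5, 39),
    "history_drop_reason": None,
    "fh_dont_index": 0,
    "ml_spam_score": 0,
    "meta_canonical": None,
    "src_unparsed": "com,statlect!www,/matrix-algebra/eigenvalues-and-eigenvectors s443",
}
verdict = is_indexable(page, now=datetime(2026, 4, 11))
```

Keeping the checks in a named dict also makes it cheap to report which filter failed, which is what the Status column above surfaces per row.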

Page Details

| Property | Value |
|---|---|
| URL | https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors |
| Last Crawled | 2026-03-30 11:05:39 (12 days ago) |
| First Indexed | 2019-02-23 07:44:30 (7 years ago) |
| HTTP Status Code | 200 |
| Meta Title | Eigenvalues and eigenvectors |
| Meta Description | Introduction to eigenvalues and eigenvectors. Developing intuition. Definition, spectrum, eigenspace. |
| Meta Canonical | null |
Boilerpipe Text
This lecture introduces the concepts of eigenvalues and eigenvectors of a square matrix. These are amongst the most useful concepts in linear algebra: studying the eigenvalues and eigenvectors of a square matrix is very frequent in applied work.

Table of contents: Intuition; Definition; Characteristic equation; Spectrum; Eigenspace; Solved exercises (Exercise 1, Exercise 2).

Intuition. Let us first develop some intuition about eigenvalues and eigenvectors. To do so, we start from some concepts we explained in the lecture on the Determinant of a matrix. Consider the linear space S of all 2x1 real vectors, which can be represented as a Cartesian plane. A vector s in S is a point in the plane, and the first and second entries of s are the coordinates of the point. Now, consider a 2x2 matrix A. The matrix transforms any set of points T ⊂ S (e.g., a rectangle, a circle) into another set of points T_A. If the points of T form a region whose area is equal to α, and the area of the transformed region is α_A, then det(A) = α_A / α. In other words, the determinant tells us by how much the linear transformation associated with the matrix A scales up or down the area of shapes.

Eigenvalues and eigenvectors provide us with another useful piece of information. They tell us by how much the linear transformation scales up or down the sides of certain parallelograms. Consider the parallelograms that have one vertex at the origin of the Cartesian plane. The four vertices are 0, x1, x2, x1 + x2, where x1 and x2 are 2x1 vectors and 0 is the zero vector. There are two vectors x1 and x2, called the eigenvectors of A, such that the associated parallelogram is transformed by A into a new parallelogram having vertices 0, λ1 x1, λ2 x2, λ1 x1 + λ2 x2, where λ1 and λ2 are two scalars called the eigenvalues of A. In other words, the linear transformation multiplies the length of one pair of parallel sides by λ1 and the length of the other pair by λ2, but it keeps the angles of the parallelogram unchanged.

The next figure provides an illustration of this kind of transformation: the original parallelogram (in blue) is transformed into another parallelogram (in red) by a matrix A whose eigenvalues are equal to λ1 = 3/2 and λ2 = 2. Since a pair of parallel sides is scaled by λ1 and the other pair by λ2, the area of the parallelogram is scaled by a factor of λ1 λ2 = 3. But we also know that the area of the parallelogram is scaled by det(A). As a consequence, det(A) = λ1 λ2, that is, the determinant of a matrix is equal to the product of its eigenvalues, a fact that holds in general. The definition of eigenvalues and eigenvectors we are going to provide below generalizes these concepts to linear spaces that can have more than two dimensions.

Definition. We are now ready to define eigenvalues and eigenvectors. Definition: Let A be a K x K matrix. If there exist a K x 1 vector x ≠ 0 and a scalar λ such that Ax = λx, then λ is called an eigenvalue of A and x an eigenvector corresponding to λ. This definition fits with the example above about the vertices of the parallelogram. The two vertices x1 and x2 are eigenvectors corresponding to the eigenvalues λ1 and λ2 because Ax1 = λ1 x1 and Ax2 = λ2 x2. Furthermore, these two equations can be added so as to obtain the transformation of the vertex x1 + x2: A(x1 + x2) = λ1 x1 + λ2 x2.

Characteristic equation. Note that the eigenvalue equation Ax = λx can be written as (λI − A)x = 0, where I is the K x K identity matrix. The latter equation has a non-zero solution only if the columns of the matrix λI − A are linearly dependent, that is, if the matrix is singular. But a matrix is singular if and only if its determinant is zero. As a consequence, any eigenvalue of A must satisfy the equation det(λI − A) = 0, which is called the characteristic equation. The expression det(λI − A) is a monic polynomial of degree K in λ, known as the characteristic polynomial. By using the fundamental theorem of algebra, it is possible to write the characteristic equation as (λ − λ1)(λ − λ2)⋯(λ − λK) = 0, where λ1, ..., λK are the K solutions of the equation (i.e., the roots of the characteristic polynomial). The fundamental theorem of algebra guarantees that exactly K solutions exist, but these solutions are not guaranteed to be real (i.e., they can be complex numbers), even when the entries of A are all real. They are also not guaranteed to be distinct, that is, two solutions could be equal.

Spectrum. In the previous section we have explained that a K x K matrix A has K not necessarily distinct and possibly complex eigenvalues. The set of all eigenvalues of A is called the spectrum of A.

Eigenspace. Note that if Ax = λx, then you can multiply both sides of the equation by a non-zero scalar α and get A(αx) = λ(αx). In other words, if λ is an eigenvalue of A and x is an eigenvector corresponding to λ, then any multiple of x is an eigenvector corresponding to λ. Thus, the eigenvector corresponding to a given eigenvalue is not unique. In this section we prove that the set of all eigenvectors corresponding to a given eigenvalue is a linear space. Definition: Let A be a K x K matrix and λ one of its eigenvalues. The union of the zero vector and the set of all the eigenvectors corresponding to the eigenvalue λ is called the eigenspace of λ. Note that we include the zero vector in the eigenspace because eigenvectors are required to be non-zero. The next proposition shows that an eigenspace is closed with respect to linear combinations, that is, it is a linear space. Proposition: The eigenspace corresponding to an eigenvalue is a linear space. Clearly, once an eigenvalue λ has been found (e.g., by solving the characteristic equation), the eigenspace of λ can be found by solving the linear system (λI − A)x = 0.

Solved exercises. Below you can find some exercises with explained solutions. Exercise 1: Consider the matrix A. Show that the given x is an eigenvector of A and find its corresponding eigenvalue. Solution: We have that Ax = x. Thus, x is an eigenvector of A corresponding to the eigenvalue λ = 1. Exercise 2: Define A. Find the eigenvalues of A by solving the characteristic equation. Solution: Solving the characteristic equation det(λI − A) = 0 shows that the eigenvalues of A are λ1 = 2 and λ2 = 1.
How to cite Please cite as: Taboga, Marco (2021). "Eigenvalues and eigenvectors", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors.
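The extracted lecture's claim that the determinant equals the product of the eigenvalues is easy to check numerically for a 2x2 matrix. This sketch is illustrative only and not part of the crawler output; the sample matrix is made up:

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Roots of the characteristic polynomial l^2 - (a+d) l + (ad - bc)."""
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

# Made-up upper-triangular example; its determinant is 2*1 - 1*0 = 2.
l1, l2 = eigenvalues_2x2(2.0, 1.0, 0.0, 1.0)
product = l1 * l2  # should equal the determinant
```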
Markdown
# Eigenvalues and eigenvectors

by [Marco Taboga](https://www.statlect.com/about/#author), PhD

This lecture introduces the concepts of eigenvalues and eigenvectors of a square matrix. These are amongst the most useful concepts in linear algebra: studying the eigenvalues and eigenvectors of a square matrix is very frequent in applied work.

Table of contents

1. [Intuition](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid2)
2. [Definition](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid3)
3. [Characteristic equation](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid4)
4. [Spectrum](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid5)
5. [Eigenspace](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid6)
6. [Solved exercises](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid7)
   1. [Exercise 1](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid8)
   2. [Exercise 2](https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors#hid9)

## Intuition

Let us first develop some intuition about eigenvalues and eigenvectors. To do so, we start from some concepts we explained in the lecture on the [Determinant of a matrix](https://www.statlect.com/matrix-algebra/determinant-of-a-matrix). Consider the [linear space](https://www.statlect.com/matrix-algebra/linear-spaces) $S$ of all $2\times 1$ real vectors, which can be represented as a Cartesian plane. A vector $s\in S$ is a point in the plane, and the first and second entries of $s$ are the coordinates of the point.

Now, consider a $2\times 2$ matrix $A$. The matrix transforms any set of points $T\subset S$ (e.g., a rectangle, a circle) into another set of points $T_{A}$: ![eq1](https://www.statlect.com/images/eigenvalues-and-eigenvectors__9.png) If the points of $T$ form a region whose area is equal to $\alpha$, and the area of the transformed region $T_{A}$ is $\alpha_{A}$, then $$\det(A)=\frac{\alpha_{A}}{\alpha}.$$ In other words, the determinant tells us by how much the linear transformation associated with the matrix $A$ scales up or down the area of shapes.

Eigenvalues and eigenvectors provide us with another useful piece of information. **They tell us by how much the linear transformation scales up or down the sides of certain parallelograms**. Consider the parallelograms that have one vertex at the origin of the Cartesian plane. The four vertices are $$0,\quad x_{1},\quad x_{2},\quad x_{1}+x_{2},$$ where $x_{1}$ and $x_{2}$ are $2\times 1$ vectors and $0$ is the zero vector. There are two vectors $x_{1}$ and $x_{2}$, called the eigenvectors of $A$, such that the associated parallelogram is transformed by $A$ into a new parallelogram having vertices $$0,\quad \lambda_{1}x_{1},\quad \lambda_{2}x_{2},\quad \lambda_{1}x_{1}+\lambda_{2}x_{2},$$ where $\lambda_{1}$ and $\lambda_{2}$ are two scalars called the eigenvalues of $A$. In other words, the linear transformation multiplies the length of one pair of parallel sides by $\lambda_{1}$ and the length of the other pair by $\lambda_{2}$, but it keeps the angles of the parallelogram unchanged.

The next figure provides an illustration of this kind of transformation: the original parallelogram (in blue) is transformed into another parallelogram (in red) by a matrix $A$ whose eigenvalues are equal to $\lambda_{1}=3/2$ and $\lambda_{2}=2$.

![Plot of a parallelogram formed by the eigenvectors of a matrix](https://www.statlect.com/images/linear-transformation-with-eigenvectors.png)

Since a pair of parallel sides is scaled by $\lambda_{1}$ and the other pair by $\lambda_{2}$, the area of the parallelogram is scaled by a factor of $\lambda_{1}\lambda_{2}$. But we also know that the area of the parallelogram is scaled by $\det(A)$. As a consequence, $$\det(A)=\lambda_{1}\lambda_{2},$$ that is, the determinant of a matrix is equal to the product of its eigenvalues, a fact that holds in general. The definition of eigenvalues and eigenvectors we are going to provide below generalizes these concepts to linear spaces that can have more than two dimensions.

## Definition

We are now ready to define eigenvalues and eigenvectors.

Definition Let $A$ be a $K\times K$ [matrix](https://www.statlect.com/matrix-algebra/vectors-and-matrices). If there exist a $K\times 1$ vector $x\neq 0$ and a scalar $\lambda$ such that $$Ax=\lambda x,$$ then $\lambda$ is called an eigenvalue of $A$ and $x$ an eigenvector corresponding to $\lambda$.

This definition fits with the example above about the vertices of the parallelogram. The two vertices $x_{1}$ and $x_{2}$ are eigenvectors corresponding to the eigenvalues $\lambda_{1}$ and $\lambda_{2}$ because $$Ax_{1}=\lambda_{1}x_{1},\qquad Ax_{2}=\lambda_{2}x_{2}.$$ Furthermore, these two equations can be added so as to obtain the transformation of the vertex $x_{1}+x_{2}$: $$A(x_{1}+x_{2})=\lambda_{1}x_{1}+\lambda_{2}x_{2}.$$

## Characteristic equation

Note that the eigenvalue equation $$Ax=\lambda x$$ can be written as $$(\lambda I-A)x=0,$$ where $I$ is the $K\times K$ [identity matrix](https://www.statlect.com/matrix-algebra/identity-matrix). The latter equation has a non-zero solution only if the columns of the matrix $\lambda I-A$ are [linearly dependent](https://www.statlect.com/matrix-algebra/linear-independence), that is, if the matrix is [singular](https://www.statlect.com/matrix-algebra/inverse-matrix). But [a matrix is singular if and only if its determinant is zero](https://www.statlect.com/matrix-algebra/determinant-properties). As a consequence, any eigenvalue of $A$ must satisfy the equation $$\det(\lambda I-A)=0,$$ which is called the **characteristic equation**. The expression $\det(\lambda I-A)$ is a monic polynomial of degree $K$ in $\lambda$, known as the characteristic polynomial. By using the [fundamental theorem of algebra](https://www.statlect.com/matrix-algebra/polynomials-in-linear-algebra), it is possible to write the characteristic equation as $$(\lambda-\lambda_{1})(\lambda-\lambda_{2})\cdots(\lambda-\lambda_{K})=0,$$ where $\lambda_{1},\ldots,\lambda_{K}$ are the $K$ solutions of the equation (i.e., the roots of the characteristic polynomial). The fundamental theorem of algebra guarantees that exactly $K$ solutions exist, but these solutions are not guaranteed to be real (i.e., they can be complex numbers), even when the entries of $A$ are all real. They are also not guaranteed to be distinct, that is, two solutions could be equal.

## Spectrum

In the previous section we have explained that a $K\times K$ matrix $A$ has $K$ not necessarily distinct and possibly complex eigenvalues. The set of all eigenvalues of $A$ is called the spectrum of $A$.

## Eigenspace

Note that if $$Ax=\lambda x,$$ then you can multiply both sides of the equation by a non-zero scalar $\alpha$ and get $$A(\alpha x)=\lambda(\alpha x).$$ In other words, if $\lambda$ is an eigenvalue of $A$ and $x$ is an eigenvector corresponding to $\lambda$, then any multiple of $x$ is an eigenvector corresponding to $\lambda$. Thus, the eigenvector corresponding to a given eigenvalue is not unique. In this section we prove that the set of all eigenvectors corresponding to a given eigenvalue is a linear space.

Definition Let $A$ be a $K\times K$ matrix and $\lambda$ one of its eigenvalues. The union of the zero vector and the set of all the eigenvectors corresponding to the eigenvalue $\lambda$ is called the eigenspace of $\lambda$.

Note that we include the zero vector in the eigenspace because eigenvectors are required to be non-zero. The next proposition shows that an eigenspace is closed with respect to [linear combinations](https://www.statlect.com/matrix-algebra/linear-combinations), that is, it is a linear space.

Proposition The eigenspace corresponding to an eigenvalue is a [linear space](https://www.statlect.com/matrix-algebra/linear-spaces).

Proof Suppose that $\lambda$ is an eigenvalue of a square matrix $A$ and take any two vectors $x_{1}$ and $x_{2}$ belonging to the eigenspace of $\lambda$. Then, $$Ax_{1}=\lambda x_{1},\qquad Ax_{2}=\lambda x_{2}.$$ Now take a linear combination $y$ of the two eigenvectors $$y=\alpha_{1}x_{1}+\alpha_{2}x_{2},$$ where $\alpha_{1}$ and $\alpha_{2}$ are two scalars. Then, $$Ay=\alpha_{1}Ax_{1}+\alpha_{2}Ax_{2}=\alpha_{1}\lambda x_{1}+\alpha_{2}\lambda x_{2}=\lambda y.$$ Thus, $y$ is an eigenvector corresponding to $\lambda$.

In other words, any linear combination of the vectors of the eigenspace belongs to the eigenspace. Clearly, once an eigenvalue $\lambda$ has been found (e.g., by solving the characteristic equation), the eigenspace of $\lambda$ can be found by [solving the linear system](https://www.statlect.com/matrix-algebra/systems-of-linear-equations-and-matrices) $$(\lambda I-A)x=0.$$

## Solved exercises

Below you can find some exercises with explained solutions.

### Exercise 1

Consider the matrix [eq23]. Show that [eq24] is an eigenvector of $A$ and find its corresponding eigenvalue.

Solution We have that ![eq25](https://www.statlect.com/images/eigenvalues-and-eigenvectors__109.png) Thus, $x$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda=1$.

### Exercise 2

Define [eq26]. Find the eigenvalues of $A$ by solving the characteristic equation.

Solution The characteristic equation is ![eq27](https://www.statlect.com/images/eigenvalues-and-eigenvectors__115.png) Therefore, the eigenvalues of $A$ are $\lambda_{1}=2$ and $\lambda_{2}=1$.

## How to cite

Please cite as: Taboga, Marco (2021). "Eigenvalues and eigenvectors", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors.
Readable Markdown: null
Shard: 126 (laksa)
Root Hash: 3586688910177265926
Unparsed URL: com,statlect!www,/matrix-algebra/eigenvalues-and-eigenvectors s443
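The unparsed URL is a SURT-style sort key: host labels reversed and comma-joined, with `!` separating the registered domain from any subdomains, followed by the path and an `s` + port suffix. A sketch of the transformation, inferred from the single example above (the `!` placement and port marker are assumptions, and taking the last two labels as the registered domain ignores public-suffix rules like `co.uk`):

```python
from urllib.parse import urlsplit

def unparsed_key(url: str) -> str:
    """Build a SURT-style key like 'com,statlect!www,/path s443' (illustrative)."""
    parts = urlsplit(url)
    labels = parts.hostname.split(".")
    rev = ",".join(reversed(labels[-2:]))   # registered domain, reversed
    subs = ",".join(reversed(labels[:-2]))  # remaining subdomain labels
    key = rev + ("!" + subs if subs else "") + ","
    key += parts.path or "/"
    # Port suffix; default ports inferred from the scheme.
    port = parts.port or (443 if parts.scheme == "https" else 80)
    return key + f" s{port}"

key = unparsed_key("https://www.statlect.com/matrix-algebra/eigenvalues-and-eigenvectors")
```

Sorting by such keys groups all pages of a domain (and its subdomains) together, which is the usual reason crawlers store URLs in this reversed form.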