🕷️ Crawler Inspector

URL Lookup

Direct Parameter Lookup

Raw Queries and Responses

1. Shard Calculation

Query:
Response:
Calculated Shard: 79 (from laksa092)
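The shard is presumably derived deterministically from the URL's host, so that every lookup for the same site lands on the same shard. A minimal sketch of that idea, assuming a hash-of-hostname-modulo-shard-count scheme; the actual hash function and shard count used by the crawler are not shown in this dump:

```python
import hashlib

NUM_SHARDS = 128  # assumed; the real shard count is not visible above

def shard_for_host(host: str, num_shards: int = NUM_SHARDS) -> int:
    """Map a hostname to a stable shard id.

    Python's built-in hash() is randomised per process, so a
    content-derived digest (here MD5) is used to keep the mapping
    stable across runs.
    """
    digest = hashlib.md5(host.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_shards

shard = shard_for_host("mathforquantum.quantumtinkerer.tudelft.nl")
```

The same host always maps to the same shard, which is what makes the per-shard status queries below meaningful.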

2. Crawled Status Check

Query:
Response:

3. Robots.txt Check

Query:
Response:

4. Spam/Ban Check

Query:
Response:

5. Seen Status Check

ℹ️ Skipped - page is already crawled

📄
INDEXABLE
CRAWLED
23 hours ago
🤖
ROBOTS SERVER UNREACHABLE
Failed to connect to robots server: Operation timed out after 2002 milliseconds with 0 bytes received
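The failure above looks like a hard ~2-second timeout on the robots fetch. A hedged sketch of such a check; the endpoint, scheme, and exact timeout used by the crawler's robots server are assumptions:

```python
import urllib.request

def fetch_robots(host: str, timeout_s: float = 2.0):
    """Fetch https://<host>/robots.txt with a hard timeout.

    Returns the body text, or None if the server is unreachable or
    does not answer within timeout_s - roughly the behaviour the
    inspector reports above.
    """
    url = f"https://{host}/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=timeout_s) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except OSError:  # covers URLError, DNS failure, and socket timeouts
        return None
```

A `None` result here corresponds to the "ROBOTS SERVER UNREACHABLE" state rather than to a missing robots.txt (which would come back as an HTTP error from a reachable server).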

Page Info Filters

| Filter | Status | Condition | Details |
|---|---|---|---|
| HTTP status | PASS | `download_http_code = 200` | HTTP 200 |
| Age cutoff | PASS | `download_stamp > now() - 6 MONTH` | 0 months ago |
| History drop | PASS | `isNull(history_drop_reason)` | No drop reason |
| Spam/ban | PASS | `fh_dont_index != 1 AND ml_spam_score = 0` | ml_spam_score=0 |
| Canonical | PASS | `meta_canonical IS NULL OR = '' OR = src_unparsed` | Not set |
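Read together, the filter conditions amount to a single indexability predicate. A sketch of that logic in Python, with field names taken from the table; the real system presumably evaluates these as SQL expressions rather than in application code, and the field types (datetime stamps, nullable strings) are assumptions:

```python
from datetime import datetime, timedelta

def passes_page_info_filters(page: dict, now: datetime) -> bool:
    """Re-implement the five page-info filters as one predicate."""
    if page["download_http_code"] != 200:                    # HTTP status
        return False
    if page["download_stamp"] <= now - timedelta(days=183):  # age cutoff, ~6 months
        return False
    if page["history_drop_reason"] is not None:              # history drop
        return False
    if page["fh_dont_index"] == 1 or page["ml_spam_score"] != 0:  # spam/ban
        return False
    canonical = page["meta_canonical"]                       # canonical unset,
    if canonical not in (None, "") and canonical != page["src_unparsed"]:  # empty, or self
        return False
    return True

page = {
    "download_http_code": 200,
    "download_stamp": datetime(2026, 4, 7, 9, 13, 41),
    "history_drop_reason": None,
    "fh_dont_index": 0,
    "ml_spam_score": 0,
    "meta_canonical": None,
    "src_unparsed": "https://mathforquantum.quantumtinkerer.tudelft.nl/6_eigenvectors_QM/",
}
indexable = passes_page_info_filters(page, now=datetime(2026, 4, 8, 8, 0, 0))
```

With the values shown in the tables, every branch passes and the page comes out INDEXABLE, matching the status badge above.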

Page Details

| Property | Value |
|---|---|
| URL | https://mathforquantum.quantumtinkerer.tudelft.nl/6_eigenvectors_QM/ |
| Last Crawled | 2026-04-07 09:13:41 (23 hours ago) |
| First Indexed | not set |
| HTTP Status Code | 200 |
| Meta Title | Eigenvalues and eigenvectors - Mathematics for Quantum Physics |
| Meta Description | Lecture notes for the TU Delft course TN3105 - Mathematics for Quantum Physics |
| Meta Canonical | null |
Boilerpipe Text
The lecture on eigenvalues and eigenvectors consists of the following parts:

6.1. Eigenvalue equations in linear algebra
6.2. Eigenvalue equations in quantum mechanics

and at the end of the lecture notes, there is a set of corresponding exercises:

6.3. Problems

The contents of this lecture are summarised in the following video: Eigenvalues and eigenvectors (total length of the videos: ~3 minutes 30 seconds).

In the previous lecture, we discussed a number of operator equations, which have the form $\hat{A}|\psi\rangle = |\varphi\rangle$, where $|\psi\rangle$ and $|\varphi\rangle$ are state vectors belonging to the Hilbert space of the system $\mathcal{H}$.

Eigenvalue equation: a specific class of operator equations, which appears frequently in quantum mechanics, consists of equations of the form $\hat{A}|\psi\rangle = \lambda_\psi|\psi\rangle$, where $\lambda_\psi$ is a scalar (in general complex). In these equations, the action of the operator $\hat{A}$ on the state vector $|\psi\rangle$ returns the same state vector multiplied by the scalar $\lambda_\psi$. Equations of this type are known as eigenvalue equations and are of great importance for the description of quantum systems. In this lecture, we present the main ingredients of these equations and show how to apply them to quantum systems.

6.1. Eigenvalue equations in linear algebra

First of all, let us review eigenvalue equations in linear algebra. Assume that we have a square matrix $A$ of dimensions $n \times n$ and that $\vec{v}$ is a column vector in $n$ dimensions. The corresponding eigenvalue equation is of the form
$$A\vec{v} = \lambda\vec{v},$$
with $\lambda$ a scalar (real or complex, depending on the type of vector space). Assuming, as usual, some specific choice of basis, we can express this equation in terms of its components by using the rules of matrix multiplication:
$$\sum_{j=1}^{n} A_{ij}\,v_j = \lambda\,v_i.$$
The scalar $\lambda$ is known as the eigenvalue of the equation, while the vector $\vec{v}$ is known as the associated eigenvector.
The key feature of such equations is that applying the matrix $A$ to the vector $\vec{v}$ returns the original vector up to an overall rescaling, $\lambda\vec{v}$.

Number of solutions: in general, there will be multiple solutions to the eigenvalue equation $A\vec{v} = \lambda\vec{v}$, each characterised by a specific eigenvalue and eigenvector. Note that in some cases one has degenerate solutions, whereby two or more linearly independent eigenvectors of a given matrix share the same eigenvalue.

Characteristic equation: in order to determine the eigenvalues of the matrix $A$, we need to find the solutions of the so-called characteristic equation of the matrix $A$, defined as
$$\det(A - \lambda I) = 0,$$
where $I$ is the identity matrix of dimensions $n \times n$ and $\det$ is the determinant. This relation follows from the eigenvalue equation in terms of components:
$$\sum_{j=1}^{n} A_{ij}\,v_j = \lambda\,v_i \;\Rightarrow\; \sum_{j=1}^{n} A_{ij}\,v_j - \sum_{j=1}^{n} \lambda\,\delta_{ij}\,v_j = 0 \;\Rightarrow\; \sum_{j=1}^{n} \left(A_{ij} - \lambda\,\delta_{ij}\right) v_j = 0.$$
Therefore, the eigenvalue condition can be written as a set of coupled linear equations
$$\sum_{j=1}^{n} \left(A_{ij} - \lambda\,\delta_{ij}\right) v_j = 0, \qquad i = 1, 2, \ldots, n,$$
which only admits non-trivial solutions if the determinant of the matrix $A - \lambda I$ vanishes (the so-called Cramer's condition), thus leading to the characteristic equation. Once we have solved the characteristic equation, we end up with $n$ eigenvalues $\lambda_k$, $k = 1, \ldots, n$.
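The recipe just described, build the characteristic polynomial, find its roots, then solve for each eigenvector, can be checked numerically. A minimal sketch with NumPy, using an illustrative 2×2 symmetric matrix (not one from the lecture):

```python
import numpy as np

# Illustrative matrix: det(A - lambda*I) = lambda^2 - 4*lambda + 3
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of the characteristic polynomial det(lambda*I - A),
# highest power first: [1, -4, 3]
coeffs = np.poly(A)

# The eigenvalues are the roots of that polynomial...
eigenvalues_from_roots = np.roots(coeffs)

# ...and agree with a direct eigendecomposition.
eigenvalues, eigenvectors = np.linalg.eig(A)
```

`np.poly` builds the characteristic polynomial from the matrix, so its roots reproduce exactly the eigenvalues that `np.linalg.eig` returns; the columns of `eigenvectors` are the corresponding normalised eigenvectors.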
We can then determine the corresponding eigenvector
$$\vec{v}_k = \begin{pmatrix} v_{k,1} \\ v_{k,2} \\ \vdots \\ v_{k,n} \end{pmatrix}$$
by solving the corresponding system of linear equations
$$\sum_{j=1}^{n} \left(A_{ij} - \lambda_k\,\delta_{ij}\right) v_{k,j} = 0, \qquad i = 1, 2, \ldots, n.$$
Let us remind ourselves that in $n = 2$ dimensions the determinant of a matrix is evaluated as
$$\det(A) = \begin{vmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{vmatrix} = A_{11}A_{22} - A_{12}A_{21},$$
while the corresponding expression in $n = 3$ dimensions, in terms of the previous expression, is given by
$$\det(A) = \begin{vmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{vmatrix} = A_{11}\begin{vmatrix} A_{22} & A_{23} \\ A_{32} & A_{33} \end{vmatrix} - A_{12}\begin{vmatrix} A_{21} & A_{23} \\ A_{31} & A_{33} \end{vmatrix} + A_{13}\begin{vmatrix} A_{21} & A_{22} \\ A_{31} & A_{32} \end{vmatrix}.$$

Example: let us illustrate how to compute eigenvalues and eigenvectors by considering an $n = 2$ vector space. Consider the matrix
$$A = \begin{pmatrix} 1 & 2 \\ -1 & 4 \end{pmatrix},$$
whose characteristic equation is
$$\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & 2 \\ -1 & 4-\lambda \end{vmatrix} = (1-\lambda)(4-\lambda) + 2 = \lambda^2 - 5\lambda + 6 = 0.$$
This is a quadratic equation which we know how to solve exactly; the two eigenvalues are $\lambda_1 = 3$ and $\lambda_2 = 2$. Next, we can determine the associated eigenvectors $\vec{v}_1$ and $\vec{v}_2$. For the first one, the equation to solve is
$$\begin{pmatrix} 1 & 2 \\ -1 & 4 \end{pmatrix}\begin{pmatrix} v_{1,1} \\ v_{1,2} \end{pmatrix} = \lambda_1\begin{pmatrix} v_{1,1} \\ v_{1,2} \end{pmatrix} = 3\begin{pmatrix} v_{1,1} \\ v_{1,2} \end{pmatrix},$$
from which we find the condition $v_{1,1} = v_{1,2}$. An important property of eigenvalue equations is that the eigenvectors are only fixed up to an overall normalisation condition. This should be clear from the definition: if a vector $\vec{v}$ satisfies $A\vec{v} = \lambda\vec{v}$, then the vector $\vec{v}\,' = c\,\vec{v}$, with $c$ some constant, will also satisfy the same equation. We thus find that the eigenvalue $\lambda_1$ has an associated eigenvector
$$\vec{v}_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix},$$
and indeed one can check that
$$A\vec{v}_1 = \begin{pmatrix} 1 & 2 \\ -1 & 4 \end{pmatrix}\begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3\vec{v}_1,$$
as we intended to demonstrate.

Exercise: as an exercise, try to obtain the expression for the eigenvector corresponding to the second eigenvalue, $\lambda_2 = 2$.

6.2.
Eigenvalue equations in quantum mechanics

We can now extend the ideas of eigenvalue equations from linear algebra to quantum mechanics. The starting point is the eigenvalue equation for the operator $\hat{A}$,
$$\hat{A}|\psi\rangle = \lambda_\psi|\psi\rangle,$$
where the state vector $|\psi\rangle$ is the eigenvector of the equation and $\lambda_\psi$ is the corresponding eigenvalue, in general a complex scalar. In general this equation will have multiple solutions, which for a Hilbert space $\mathcal{H}$ of $n$ dimensions can be labelled as
$$\hat{A}|\psi_k\rangle = \lambda_{\psi_k}|\psi_k\rangle, \qquad k = 1, \ldots, n.$$
In order to determine the eigenvalues and eigenvectors of a given operator $\hat{A}$, we have to solve the corresponding eigenvalue problem for this operator, i.e. the characteristic equation introduced above. This is most efficiently done in the matrix representation of the operator, where the operator equation can be expressed in terms of its components as
$$\begin{pmatrix} A_{11} & A_{12} & A_{13} & \ldots \\ A_{21} & A_{22} & A_{23} & \ldots \\ A_{31} & A_{32} & A_{33} & \ldots \\ \vdots & \vdots & \vdots & \end{pmatrix}\begin{pmatrix} \psi_{k,1} \\ \psi_{k,2} \\ \psi_{k,3} \\ \vdots \end{pmatrix} = \lambda_{\psi_k}\begin{pmatrix} \psi_{k,1} \\ \psi_{k,2} \\ \psi_{k,3} \\ \vdots \end{pmatrix}.$$
As discussed above, this condition is identical to solving a set of linear equations of the form
$$\begin{pmatrix} A_{11}-\lambda_{\psi_k} & A_{12} & A_{13} & \ldots \\ A_{21} & A_{22}-\lambda_{\psi_k} & A_{23} & \ldots \\ A_{31} & A_{32} & A_{33}-\lambda_{\psi_k} & \ldots \\ \vdots & \vdots & \vdots & \end{pmatrix}\begin{pmatrix} \psi_{k,1} \\ \psi_{k,2} \\ \psi_{k,3} \\ \vdots \end{pmatrix} = 0.$$
Cramer's rule: this set of linear equations only has a non-trivial set of solutions provided that the determinant of the matrix vanishes, as follows from Cramer's condition:
$$\det\begin{pmatrix} A_{11}-\lambda_\psi & A_{12} & A_{13} & \ldots \\ A_{21} & A_{22}-\lambda_\psi & A_{23} & \ldots \\ A_{31} & A_{32} & A_{33}-\lambda_\psi & \ldots \\ \vdots & \vdots & \vdots & \end{pmatrix} = 0,$$
which in general will have $n$ independent solutions, which we label $\lambda_{\psi,k}$.
Once we have solved for the $n$ eigenvalues $\{\lambda_{\psi,k}\}$, we can insert each of them in the original eigenvalue equation and determine the components of each of the eigenvectors, which we can express as column vectors
$$|\psi_1\rangle = \begin{pmatrix} \psi_{1,1} \\ \psi_{1,2} \\ \psi_{1,3} \\ \vdots \end{pmatrix}, \quad |\psi_2\rangle = \begin{pmatrix} \psi_{2,1} \\ \psi_{2,2} \\ \psi_{2,3} \\ \vdots \end{pmatrix}, \quad \ldots, \quad |\psi_n\rangle = \begin{pmatrix} \psi_{n,1} \\ \psi_{n,2} \\ \psi_{n,3} \\ \vdots \end{pmatrix}.$$

Orthogonality of eigenvectors: an important property of eigenvalue equations is that if two eigenvectors $|\psi_i\rangle$ and $|\psi_j\rangle$ have associated different eigenvalues, $\lambda_{\psi_i} \neq \lambda_{\psi_j}$, then these two eigenvectors are orthogonal to each other, that is,
$$\langle\psi_j|\psi_i\rangle = 0 \quad \text{for } i \neq j.$$
This property is extremely important, since it suggests that we could use the eigenvectors of an eigenvalue equation as a set of basis elements for the Hilbert space.

Recall from the discussion of eigenvalue equations in linear algebra that the eigenvectors $|\psi_i\rangle$ are defined up to an overall normalisation constant. Clearly, if $|\psi_i\rangle$ is a solution of $\hat{A}|\psi_i\rangle = \lambda_{\psi_i}|\psi_i\rangle$, then $c\,|\psi_i\rangle$ will also be a solution, with $c$ a constant. In the context of quantum mechanics, we choose this overall rescaling constant to ensure that the eigenvectors are normalised, so that they satisfy
$$\langle\psi_i|\psi_i\rangle = 1 \quad \text{for all } i.$$
With such a choice of normalisation, the eigenvectors in the set are said to be orthonormal.

Eigenvalue spectrum and degeneracy: the set of all eigenvalues of an operator is called the eigenvalue spectrum of the operator. Note that different eigenvectors can also have the same eigenvalue; if this is the case, the eigenvalue is said to be degenerate.

6.3.
Problems

1. Eigenvalues and eigenvectors I: find the characteristic polynomial and eigenvalues for each of the following matrices:
$$A = \begin{pmatrix} 5 & 3 \\ 2 & 10 \end{pmatrix}, \quad B = \begin{pmatrix} 7i & -1 \\ 2 & 6i \end{pmatrix}, \quad C = \begin{pmatrix} 2 & 0 & -1 \\ 0 & 3 & 1 \\ 1 & 0 & 4 \end{pmatrix}.$$

2. Hamiltonian: the Hamiltonian for a two-state system is given by
$$H = \begin{pmatrix} \omega_1 & \omega_2 \\ \omega_2 & \omega_1 \end{pmatrix}.$$
A basis for this system is
$$|0\rangle = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad |1\rangle = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$
Find the eigenvalues and eigenvectors of the Hamiltonian $H$, and express the eigenvectors in terms of $\{|0\rangle, |1\rangle\}$.

3. Eigenvalues and eigenvectors II: find the eigenvalues and eigenvectors of the matrices
$$A = \begin{pmatrix} -2 & -1 & -1 \\ 6 & 3 & 2 \\ 0 & 0 & 1 \end{pmatrix}, \quad B = \begin{pmatrix} 1 & 1 & 2 \\ 2 & 2 & 2 \\ -1 & -1 & -1 \end{pmatrix}.$$

4. The Hadamard gate: in one of the problems of the previous section we discussed that an important operator used in quantum computation is the Hadamard gate, which is represented by the matrix
$$\hat{H} = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}.$$
Determine the eigenvalues and eigenvectors of this operator.

5. Hermitian matrix: show that the Hermitian matrix
$$\begin{pmatrix} 0 & 0 & i \\ 0 & 1 & 0 \\ -i & 0 & 0 \end{pmatrix}$$
has only two distinct real eigenvalues, and find an orthonormal set of three eigenvectors.

6. Orthogonality of eigenvectors: confirm, by explicit calculation, that the eigenvalues of the real, symmetric matrix
$$\begin{pmatrix} 2 & 1 & 2 \\ 1 & 2 & 2 \\ 2 & 2 & 1 \end{pmatrix}$$
are real, and that its eigenvectors are orthogonal.
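Several of these problems can be sanity-checked numerically. A sketch using NumPy's Hermitian eigensolver for problems 4 and 5; it confirms the answers but does not replace the analytic work:

```python
import numpy as np

# Problem 4: the Hadamard gate (real symmetric, hence Hermitian).
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)
hadamard_eigenvalues, _ = np.linalg.eigh(H)  # eigh: Hermitian eigenproblem

# Problem 5: the Hermitian matrix with entries +/- i.
M = np.array([[0, 0, 1j],
              [0, 1, 0],
              [-1j, 0, 0]])
assert np.allclose(M, M.conj().T)  # confirm M is indeed Hermitian

w, V = np.linalg.eigh(M)  # eigenvalues in ascending order, eigenvectors as columns
# eigh orthonormalises the eigenvectors, including within any
# degenerate subspace, so the Gram matrix V†V should be the identity.
gram = V.conj().T @ V
```

Because `eigh` returns an orthonormal set of eigenvectors even for degenerate eigenvalues, the columns of `V` are exactly the kind of orthonormal set problem 5 asks for.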
Readable Markdown
The lecture on eigenvalues and eigenvectors consists of the following parts: - [6\.1. Eigenvalue equations in linear algebra](https://mathforquantum.quantumtinkerer.tudelft.nl/6_eigenvectors_QM/#61-eigenvalue-equations-in-linear-algebra) - [6\.2. Eigenvalue equations in quantum mechanics](https://mathforquantum.quantumtinkerer.tudelft.nl/6_eigenvectors_QM/#62-eigenvalue-equations-in-quantum-mechanics) and at the end of the lecture notes, there is a set of corresponding exercises: - [6\.3. Problems](https://mathforquantum.quantumtinkerer.tudelft.nl/6_eigenvectors_QM/#63-problems) *** The contents of this lecture are summarised in the following **video**: - [Eigenvalues and eigenvectors](https://www.dropbox.com/s/n6hb5cu2iy8i8x4/linear_algebra_09.mov?dl=0) *The total length of the videos: ~3 minutes 30 seconds* *** In the previous lecture, we discussed a number of *operator equations*, which have the form A ^ \| ψ ⟩ \= \| φ ⟩ , where \| ψ ⟩ and \| φ ⟩ are state vectors belonging to the Hilbert space of the system H. Eigenvalue equation: A specific class of operator equations, which appear frequently in quantum mechanics, consists of equations in the form A ^ \| ψ ⟩ \= λ ψ \| ψ ⟩ , where λ ψ is a scalar (in general complex). These are equations where the action of the operator A ^ on the state vector \| ψ ⟩ returns *the same state vector* multiplied by the scalar λ ψ. This type of operator equations are known as *eigenvalue equations* and are of great importance for the description of quantum systems. In this lecture, we present the main ingredients of these equations and how we can apply them to quantum systems. ## 6\.1. Eigenvalue equations in linear algebra[¶](https://mathforquantum.quantumtinkerer.tudelft.nl/6_eigenvectors_QM/#61-eigenvalue-equations-in-linear-algebra "Permanent link") First of all, let us review eigenvalue equations in linear algebra. Assume that we have a (square) matrix A with dimensions n × n and v → is a column vector in n dimensions. 
The corresponding eigenvalue equation will be of form A v → \= λ v → . with λ being a scalar number (real or complex, depending on the type of vector space). We can express the previous equation in terms of its components, assuming as usual some specific choice of basis, by using the rules of matrix multiplication: Eigenvalue equation: Eigenvalue and Eigenvector ∑ j \= 1 n A i j v j \= λ v i . The scalar λ is known as the *eigenvalue* of the equation, while the vector v → is known as the associated *eigenvector*. The key feature of such equations is that applying a matrix A to the vector v → returns *the original vector* up to an overall rescaling, λ v →. Number of solutions In general, there will be multiple solutions to the eigenvalue equation A v → \= λ v →, each one characterised by an specific eigenvalue and eigenvectors. Note that in some cases one has *degenerate solutions*, whereby a given matrix has two or more eigenvectors that are equal. Characteristic equation: In order to determine the eigenvalues of the matrix A, we need to evaluate the solutions of the so-called *characteristic equation* of the matrix A, defined as d e t ( A − λ I ) \= 0 , where I is the identity matrix of dimensions n × n, and d e t is the determinant. This relation follows from the eigenvalue equation in terms of components ∑ j \= 1 n A i j v j \= λ v i , → ∑ j \= 1 n A i j v j − ∑ j \= 1 n λ δ i j v j \= 0 , → ∑ j \= 1 n ( A i j − λ δ i j ) v j \= 0 . Therefore, the eigenvalue condition can be written as a set of coupled linear equations ∑ j \= 1 n ( A i j − λ δ i j ) v j \= 0 , i \= 1 , 2 , … , n , which only admit non-trivial solutions if the determinant of the matrix A − λ I vanishes (the so-called Cramer's condition), thus leading to the characteristic equation. Once we have solved the characteristic equation, we end up with n eigenvalues λ k, k \= 1 , … , n. 
We can then determine the corresponding eigenvector v → k \= ( v k , 1 v k , 2 ⋮ v k , n ) , by solving the corresponding system of linear equations ∑ j \= 1 n ( A i j − λ k δ i j ) v k , j \= 0 , i \= 1 , 2 , … , n , Let us remind ourselves that in n \= 2 dimensions the determinant of a matrix is evaluated as d e t ( A ) \= \| A 11 A 12 A 21 A 22 \| \= A 11 A 22 − A 12 A 21 , while the corresponding expression for a matrix belonging to a vector space in n \= 3 dimensions in terms of the previous expression will be given as d e t ( A ) \= \| A 11 A 12 A 13 A 21 A 22 A 23 A 31 A 32 A 33 \| \= \+ A 11 \| A 22 A 23 A 32 A 33 \| − A 12 \| A 21 A 23 A 31 A 33 \| \+ A 13 \| A 21 A 22 A 31 A 32 \| Example Let us illustrate how to compute eigenvalues and eigenvectors by considering a n \= 2 vector space. Consider the following matrix A \= ( 1 2 − 1 4 ) , which has associated the following characteristic equation d e t ( A − λ ⋅ I ) \= \| 1 − λ 2 − 1 4 − λ \| \= ( 1 − λ ) ( 4 − λ ) \+ 2 \= λ 2 − 5 λ \+ 6 \= 0 . This is a quadratic equation which we know how to solve exactly; the two eigenvalues are λ 1 \= 3 and λ 2 \= 2. Next, we can determine the associated eigenvectors v → 1 and v → 2. For the first one, the equation to solve is ( 1 2 − 1 4 ) ( v 1 , 1 v 1 , 2 ) \= λ 1 ( v 1 , 1 v 1 , 2 ) \= 3 ( v 1 , 1 v 1 , 2 ) from where we find the condition that v 1 , 1 \= v 1 , 2. An important property of eigenvalue equations is that the eigenvectors are only fixed up to an *overall normalisation condition*. This should be clear from its definition: if a vector v → satisfies A v → \= λ v →, then the vector v → ′ \= c v → with c some constant will also satisfy the same equation. So then we find that the eigenvalue λ 1 has an associated eigenvector v → 1 \= ( 1 1 ) , and indeed one can check that A v → 1 \= ( 1 2 − 1 4 ) ( 1 1 ) \= ( 3 3 ) \= 3 v → 1 , as we intended to demonstrate. 
Exercise As an exercise, try to obtain the expression of the eigenvector corresponding to the second eigenvalue λ 2 \= 2. ## 6\.2. Eigenvalue equations in quantum mechanics[¶](https://mathforquantum.quantumtinkerer.tudelft.nl/6_eigenvectors_QM/#62-eigenvalue-equations-in-quantum-mechanics "Permanent link") We can now extend the ideas of eigenvalue equations from linear algebra to the case of quantum mechanics. The starting point is the eigenvalue equation for the operator A ^, A ^ \| ψ ⟩ \= λ ψ \| ψ ⟩ , where the vector state \| ψ ⟩ is the eigenvector of the equation and λ ψ is the corresponding eigenvalue, in general a complex scalar. In general this equation will have multiple solutions, which for a Hilbert space H with n dimensions can be labelled as A ^ \| ψ k ⟩ \= λ ψ k \| ψ k ⟩ , k \= 1 , … , n . In order to determine the eigenvalues and eigenvectors of a given operator A ^, we will have to solve the corresponding eigenvalue problem for this operator, what we called above as the *characteristic equation*. This is most efficiently done in the matrix representation of this operation, where we have that the above operator equation can be expressed in terms of its components as ( A 11 A 12 A 13 … A 21 A 22 A 23 … A 31 A 32 A 33 … ⋮ ⋮ ⋮ ) ( ψ k , 1 ψ k , 2 ψ k , 3 ⋮ ) \= λ ψ k ( ψ k , 1 ψ k , 2 ψ k , 3 ⋮ ) . As discussed above, this condition is identical to solving a set of linear equations for the form ( A 11 − λ ψ k A 12 A 13 … A 21 A 22 − λ ψ k A 23 … A 31 A 32 A 33 − λ ψ k … ⋮ ⋮ ⋮ ) ( ψ k , 1 ψ k , 2 ψ k , 3 ⋮ ) \= 0 . Cramer's rule This set of linear equations only has a non-trivial set of solutions provided that the determinant of the matrix vanishes, as follows from the Cramer's condition: d e t ( A 11 − λ ψ A 12 A 13 … A 21 A 22 − λ ψ A 23 … A 31 A 32 A 33 − λ ψ … ⋮ ⋮ ⋮ ) \= \| A 11 − λ ψ A 12 A 13 … A 21 A 22 − λ ψ A 23 … A 31 A 32 A 33 − λ ψ … ⋮ ⋮ ⋮ \| \= 0 which in general will have n independent solutions, which we label as λ ψ , k. 
Once we have solved for the $n$ eigenvalues $\{\lambda_{\psi,k}\}$, we can insert each of them into the original eigenvalue equation and determine the components of each of the eigenvectors, which we can express as column vectors

$$|\psi_1\rangle = \begin{pmatrix} \psi_{1,1} \\ \psi_{1,2} \\ \psi_{1,3} \\ \vdots \end{pmatrix} \, , \quad |\psi_2\rangle = \begin{pmatrix} \psi_{2,1} \\ \psi_{2,2} \\ \psi_{2,3} \\ \vdots \end{pmatrix} \, , \quad \ldots \, , \quad |\psi_n\rangle = \begin{pmatrix} \psi_{n,1} \\ \psi_{n,2} \\ \psi_{n,3} \\ \vdots \end{pmatrix} \, .$$

Orthogonality of eigenvectors

An important property of eigenvalue equations is that if two eigenvectors $|\psi_i\rangle$ and $|\psi_j\rangle$ have associated *different* eigenvalues, $\lambda_{\psi_i} \neq \lambda_{\psi_j}$, then these two eigenvectors are orthogonal to each other, that is,

$$\langle \psi_j | \psi_i \rangle = 0 \quad \text{for} \quad i \neq j \, .$$

This property is extremely important, since it suggests that we can use the eigenvectors of an eigenvalue equation as a *set of basis elements* for this Hilbert space.

Recall from the discussion of eigenvalue equations in linear algebra that the eigenvectors $|\psi_i\rangle$ are defined *up to an overall normalisation constant*. Clearly, if $|\psi_i\rangle$ is a solution of $\hat{A} |\psi_i\rangle = \lambda_{\psi_i} |\psi_i\rangle$, then $c\,|\psi_i\rangle$ will also be a solution, with $c$ being a constant. In the context of quantum mechanics, we choose this overall rescaling constant to ensure that the eigenvectors are normalised, so that they satisfy

$$\langle \psi_i | \psi_i \rangle = 1 \quad \text{for all } i \, .$$

With such a choice of normalisation, one says that the eigenvectors in the set are *orthonormal*.

Eigenvalue spectrum and degeneracy

The set of all eigenvalues of an operator is called the *eigenvalue spectrum* of that operator. Note that different eigenvectors can also have the same eigenvalue; if this is the case, the eigenvalue is said to be *degenerate*.

***

## 6\.3. Problems[¶](https://mathforquantum.quantumtinkerer.tudelft.nl/6_eigenvectors_QM/#63-problems "Permanent link")

1. *Eigenvalues and eigenvectors I*

Find the characteristic polynomial and eigenvalues for each of the following matrices:

$$A = \begin{pmatrix} 5 & 3 \\ 2 & 10 \end{pmatrix} \, , \qquad B = \begin{pmatrix} 7i & -1 \\ 2 & 6i \end{pmatrix} \, , \qquad C = \begin{pmatrix} 2 & 0 & -1 \\ 0 & 3 & 1 \\ 1 & 0 & 4 \end{pmatrix} \, .$$

2.
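The orthonormality just described can also be checked numerically. For a Hermitian matrix, `numpy.linalg.eigh` returns real eigenvalues together with an orthonormal set of eigenvectors, so the matrix $V$ whose columns are the eigenvectors satisfies $V^\dagger V = I$, i.e. $\langle \psi_i | \psi_j \rangle = \delta_{ij}$. The matrix below is an arbitrary Hermitian example, not one from the text:

```python
import numpy as np

# An arbitrary Hermitian matrix (equal to its own conjugate transpose).
H = np.array([[1.0, 2.0 - 1.0j],
              [2.0 + 1.0j, 3.0]])
assert np.allclose(H, H.conj().T)

# eigh is specialised to Hermitian matrices: it returns real eigenvalues
# and an orthonormal set of eigenvectors as the columns of V.
eigenvalues, V = np.linalg.eigh(H)

# <psi_i | psi_j> = delta_ij: the eigenvectors form an orthonormal basis.
assert np.allclose(V.conj().T @ V, np.eye(2))
```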
*Hamiltonian*

The Hamiltonian for a two-state system is given by

$$H = \begin{pmatrix} \omega_1 & \omega_2 \\ \omega_2 & \omega_1 \end{pmatrix} \, .$$

A basis for this system is

$$|0\rangle = \begin{pmatrix} 1 \\ 0 \end{pmatrix} \, , \qquad |1\rangle = \begin{pmatrix} 0 \\ 1 \end{pmatrix} \, .$$

Find the eigenvalues and eigenvectors of the Hamiltonian $H$, and express the eigenvectors in terms of $\{|0\rangle, |1\rangle\}$.

3. *Eigenvalues and eigenvectors II*

Find the eigenvalues and eigenvectors of the matrices

$$A = \begin{pmatrix} -2 & -1 & -1 \\ 6 & 3 & 2 \\ 0 & 0 & 1 \end{pmatrix} \, , \qquad B = \begin{pmatrix} 1 & 1 & 2 \\ 2 & 2 & 2 \\ -1 & -1 & -1 \end{pmatrix} \, .$$

4. *The Hadamard gate*

In one of the problems of the previous section we discussed that an important operator used in quantum computation is the *Hadamard gate*, which is represented by the matrix

$$\hat{H} = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \, .$$

Determine the eigenvalues and eigenvectors of this operator.

5. *Hermitian matrix*

Show that the Hermitian matrix

$$\begin{pmatrix} 0 & 0 & i \\ 0 & 1 & 0 \\ -i & 0 & 0 \end{pmatrix}$$

has only two real eigenvalues, and find an orthonormal set of three eigenvectors.

6. *Orthogonality of eigenvectors*

Confirm, by explicit calculation, that the eigenvalues of the real, symmetric matrix

$$\begin{pmatrix} 2 & 1 & 2 \\ 1 & 2 & 2 \\ 2 & 2 & 1 \end{pmatrix}$$

are real, and that its eigenvectors are orthogonal.
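After solving these problems by hand, `numpy` is a convenient way to check your answers. As one illustration (note that this partially reveals the answer to problem 4): the Hadamard gate is both Hermitian and unitary, so its eigenvalues must be real numbers of unit modulus, i.e. $\pm 1$.

```python
import numpy as np

# The Hadamard gate from problem 4.
H = (1 / np.sqrt(2)) * np.array([[1.0, 1.0],
                                 [1.0, -1.0]])

eigenvalues, eigenvectors = np.linalg.eig(H)

# Hermitian + unitary: the eigenvalues are +1 and -1.
assert np.allclose(sorted(eigenvalues.real), [-1.0, 1.0])
assert np.allclose(eigenvalues.imag, 0.0)

# Each eigenpair satisfies H v = lambda v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(H @ v, lam * v)
```

The same pattern (`np.linalg.eig` plus an `A @ v == lam * v` check) applies to every matrix in this problem set.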