🕷️ Crawler Inspector

URL Lookup

Direct Parameter Lookup

Raw Queries and Responses

1. Shard Calculation

Query:
Response:
Calculated Shard: 18 (from laksa095)
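The raw query and response for this step aren't shown, and the actual shard-assignment logic is not part of this record (`laksa095` appears to be the serving host). As a purely illustrative sketch, assuming a stable hash of the URL's host picks the shard, the shape of such a mapping might be:

```python
from zlib import crc32

NUM_SHARDS = 64  # hypothetical shard count; the real value is not shown in this record


def shard_for_url(url: str, num_shards: int = NUM_SHARDS) -> int:
    """Map a URL's host to a shard id with a stable hash (illustrative only)."""
    host = url.split("//", 1)[-1].split("/", 1)[0]
    return crc32(host.encode("utf-8")) % num_shards


# The inspector reports shard 18 for this URL; this sketch only shows the shape
# of a host-hash mapping, not the production algorithm.
print(shard_for_url("https://math.stackexchange.com/questions/1520832/real-life-examples-for-eigenvalues-eigenvectors"))
```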

2. Crawled Status Check

Query:
Response:

3. Robots.txt Check

Query:
Response:
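The raw robots.txt query and response aren't shown. A minimal sketch of the kind of check this step performs, using Python's stdlib `urllib.robotparser` (the rules below are made up for illustration; the real fetched file and crawler user-agent are not part of this record):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- the file actually fetched by the crawler
# is not shown in this record.
robots_txt = """\
User-agent: *
Disallow: /users/login
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

url = "https://math.stackexchange.com/questions/1520832/real-life-examples-for-eigenvalues-eigenvectors"
print(rp.can_fetch("*", url))  # True -> consistent with the ROBOTS ALLOWED status reported below
```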

4. Spam/Ban Check

Query:
Response:

5. Seen Status Check

ℹ️ Skipped - page is already crawled

📄 INDEXABLE · CRAWLED (1 month ago) · 🤖 ROBOTS ALLOWED

Page Info Filters

| Filter | Status | Condition | Details |
| --- | --- | --- | --- |
| HTTP status | PASS | `download_http_code = 200` | HTTP 200 |
| Age cutoff | PASS | `download_stamp > now() - 6 MONTH` | 1.9 months ago |
| History drop | PASS | `isNull(history_drop_reason)` | No drop reason |
| Spam/ban | PASS | `fh_dont_index != 1 AND ml_spam_score = 0` | ml_spam_score=0 |
| Canonical | PASS | `meta_canonical IS NULL OR = '' OR = src_unparsed` | Not set |
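The five conditions in the table can be read as one conjunctive indexability predicate. A minimal sketch in Python, using the field names from the table (the helper `is_indexable`, the 182-day stand-in for the 6-month cutoff, and the sample record values are illustrative assumptions, not the production code):

```python
from datetime import datetime, timedelta


def is_indexable(rec: dict, now: datetime) -> bool:
    """Conjunction of the five Page Info filters shown in the table (sketch)."""
    return (
        rec["download_http_code"] == 200
        and rec["download_stamp"] > now - timedelta(days=182)  # ~6 months
        and rec["history_drop_reason"] is None
        and rec["fh_dont_index"] != 1
        and rec["ml_spam_score"] == 0
        and rec["meta_canonical"] in (None, "", rec["src_unparsed"])
    )


# Sample record mirroring the Page Details section of this report.
record = {
    "download_http_code": 200,
    "download_stamp": datetime(2026, 2, 10, 11, 26, 2),
    "history_drop_reason": None,
    "fh_dont_index": 0,
    "ml_spam_score": 0,
    "meta_canonical": None,
    "src_unparsed": "https://math.stackexchange.com/questions/1520832/real-life-examples-for-eigenvalues-eigenvectors",
}
print(is_indexable(record, now=datetime(2026, 3, 15)))  # True -> all five filters PASS
```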

Page Details

| Property | Value |
| --- | --- |
| URL | https://math.stackexchange.com/questions/1520832/real-life-examples-for-eigenvalues-eigenvectors |
| Last Crawled | 2026-02-10 11:26:02 (1 month ago) |
| First Indexed | 2017-04-27 22:23:06 (8 years ago) |
| HTTP Status Code | 200 |
| Meta Title | Real life examples for eigenvalues / eigenvectors - Mathematics Stack Exchange |
| Meta Description | null |
| Meta Canonical | null |
Boilerpipe Text
Eigenvectors are axes, eigenvalues are distances along these axes.

Eigenvectors are the base of dimensional reduction techniques like PCA (principal component analysis), extremely useful in situations where we want to reduce the number of dimensions to a more practical one. Concrete example: We want to find similar pictures in a large set and show the relationships in 2D (we don't know what the similarity criteria are, nor their number). The result above is obtained using a non-guided simple dimension reduction technique. Probably it isn't very useful, but it illustrates the creation of partial clusters, e.g. the direction the person is looking to, or the color of the skin, or whether the person opens the mouth, or who the person is. While there are plenty of criteria measured, we are able to use a 2D system to reveal their effects combined; this is actually the expected benefit. The algorithm here found two super abstract criteria which are eigenvectors, and returned the corresponding pair of eigenvalues for each picture, used as individual coordinates to arrange the set.

Face features as eigenvector: Eigenface

Using eigenvectors is a base technique in face recognition, where we want to associate a name to a person's picture. The eigenvectors in this case are eigenfaces. Imagine we got black and white images of 47x62 pixels which can have some gray attribute; we actually have data with a value in 1348 dimensions. Not all pixels are critical in a given image. We want to reduce the number of dimensions to the "useful" ones, without losing the main features of the images. Say we want to move from 1348 to 150 dimensions. This is the usual way of pre-processing images before doing some image classification, like face recognition, in order to decrease CPU workload. The reduction is done by finding eigenvectors of the input images; these eigenvectors can be seen as basis images, from which the complete (actually nearly complete) images can be reconstructed.

Below are the first 32 eigenvectors, out of the 150 which were computed by some PCA, in order of usefulness, that is, dimensions along which the original images have the highest variance. These images are somehow like the major harmonics of a sound (obtained using a Fourier transform). Note the three first eigenvectors are luminosity related, the main variance in images not shot in controlled conditions. To reconstruct the images from these eigenvectors, we only need to know the associated eigenvalues. The information to store and/or process for each image is now a vector of 150 eigenvalues, instead of the original vector of 1348 pixel values. A large gain, and still not much information has been lost.

Here is a subset of original images, and images reconstructed using the eigenvectors shown above. The number of eigenvectors used (and their choice) determines how much variance from the pixels is lost. This number is labeled below as "number of components", from the name of the reduction technique used, the principal component analysis (PCA). For this graph, we see: with the 32 eigenvectors shown above, we can reconstruct images and get back more than 80% of the original pictures; with 150 eigenvectors, more than 95% of the information can be retrieved. This is somehow similar to image compression, but much more too, because the eigenvalues can be used directly to make a relation between the image and the name of the person; we just need to train the classifier with a set of known images associated with the correct category (here the name to associate). (I took my examples from this excellent book on machine learning. Faces pictures are from the LFW dataset.)
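The eigenface procedure described in the extracted text (project centered images onto the top eigenvectors of the data covariance, keep only the per-image coordinates, reconstruct from them) can be sketched with plain numpy; random data stands in for the 47x62-pixel face images, and the dimensions are scaled down for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))        # 200 samples, 50 "pixels" (stand-in for 47x62 images)
Xc = X - X.mean(axis=0)               # center the data

# Eigenvectors of the covariance matrix are the principal components.
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]     # sort by decreasing eigenvalue ("order of usefulness")
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 10                                 # keep the top-k eigenvectors (e.g. 150 of 1348 in the text)
W = eigvecs[:, :k]
coords = Xc @ W                        # k coordinates per image, all that needs to be stored
X_rec = coords @ W.T + X.mean(axis=0)  # (nearly complete) reconstruction from k components

explained = eigvals[:k].sum() / eigvals.sum()
print(f"variance retained with {k} components: {explained:.1%}")
```

The "more than 80% with 32 components" figures in the text come from exactly this kind of retained-variance ratio, computed on real face data rather than random noise.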
Markdown
# [Real life examples for eigenvalues / eigenvectors](https://math.stackexchange.com/questions/1520832/real-life-examples-for-eigenvalues-eigenvectors)

Asked 10 years, 3 months ago · Modified 3 years, 11 months ago (2022-03-06 20:03:41Z) · Viewed 105k times · Score 40

There are already good answers about importance of eigenvalues / eigenvectors, such as [this question](https://math.stackexchange.com/questions/23312/what-is-the-importance-of-eigenvalues-eigenvectors) and some others, as well as this [Wikipedia article](https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors). I know the theory and these examples, but now, in order to do my best to prepare a course I'm teaching, I'm looking for ideas about **good real life examples** of usage of these concepts. **Do you know some good *simple* real-life examples** (in economics or data analysis or anything else) in which **the usage of eigenvalues/eigenvectors** is a crucial tool?
Tagged: [eigenvalues-eigenvectors](https://math.stackexchange.com/questions/tagged/eigenvalues-eigenvectors)

*asked Nov 9, 2015 at 14:30 by [Basj](https://math.stackexchange.com/users/109021/basj); [edited Apr 13, 2017 at 12:20](https://math.stackexchange.com/posts/1520832/revisions) by [Community](https://math.stackexchange.com/users/-1/community)*

Comments:

- Related: [matheducators.stackexchange.com/questions/520/…](http://matheducators.stackexchange.com/questions/520/what-is-a-good-motivation-showcase-for-a-student-for-the-study-of-eigenvalues) and (to a lesser extent) [matheducators.stackexchange.com/questions/3983/…](http://matheducators.stackexchange.com/questions/3983/what-is-the-best-way-to-intuitively-explain-what-eigenvectors-and-eigenvalues-ar) ([mweiss](https://math.stackexchange.com/users/124095/mweiss), Nov 9, 2015 at 14:45)
- Stability, e.g. in mechanical engineering and architecture, is a classic application of eigenvalue analysis (so much so that I hesitate to offer this chestnut as an Answer). ([hardmath](https://math.stackexchange.com/users/3111/hardmath), Nov 9, 2015 at 14:52)
- The other well known example is Google's patented PageRank algorithm. ([hardmath](https://math.stackexchange.com/users/3111/hardmath), Nov 9, 2015 at 14:54)
- @mweiss : Thanks for these links. The first question you mentioned is interesting indeed and has really good answers (such as Fibonacci, positive definite matrices, etc.), but these are only examples of application of eigenvalues *for some other maths problems*. It's like "B2B" whereas I'd like "B2C" for my students :) I agree, the 2nd answer (PageRank) is a good real-life motivation, too. ([Basj](https://math.stackexchange.com/users/109021/basj), Nov 9, 2015 at 16:01)
- Are you interested in eigenvalues and eigenvectors in a finite dimensional linear algebra sense? Or are infinite dimensional concepts acceptable? If so, the solutions of partial differential equations (e.g., the physics of Maxwell's equations or Schrodinger's equations, etc.) are often thought of as superpositions of eigenvectors in the appropriate function space. See here: [en.wikipedia.org/wiki/Eigenfunction](https://en.wikipedia.org/wiki/Eigenfunction) This is a really concrete example of the "real world", because you can bang a drum head and the eigenvalues and eigenvectors of the wave operator determine what you hear. ([rajb245](https://math.stackexchange.com/users/72919/rajb245), Nov 22, 2015 at 16:36)

## 6 Answers

**Answer (score 40; awarded a +100 bounty by Basj):**

Here are just some of the many uses of eigenvectors and eigenvalues:

- [Using singular value decomposition for image compression.](https://docs.google.com/viewer?a=v&pid=sites&srcid=ZGVmYXVsdGRvbWFpbnxuYXNsdW5kZXJpY3xneDpkMTI4OTI1NTc4YjRlOGE) This is a note explaining how you can compress an image by throwing away the small eigenvalues of AA^T. It takes an 8 megapixel image of an Allosaurus and shows how the image looks after compressing by selecting the 1, 10, 25, 50, 100 and 200 largest singular values.
- [Deriving Special Relativity is more natural in the language of linear algebra.](https://docs.google.com/viewer?a=v&pid=sites&srcid=ZGVmYXVsdGRvbWFpbnxuYXNsdW5kZXJpY3xneDo2ZTAyNzA4NTZmOGZmNmU4) In fact, Einstein's second postulate really states that "Light is an eigenvector of the Lorentz transform." This document goes over the full derivation in detail.
- [Spectral Clustering.](https://en.wikipedia.org/wiki/Spectral_clustering) Whether it's in plants and biology, medical imaging, business and marketing, understanding the connections between fields on Facebook, or even criminology, [clustering](https://en.wikipedia.org/wiki/Cluster_analysis#Applications) is an extremely important part of modern data analysis. It allows people to find important subsystems or patterns inside noisy data sets. One such method is spectral clustering, which uses the eigenvalues of the graph of a network. Even the eigenvector of the second smallest eigenvalue of the Laplacian matrix allows us to find the two largest clusters in a network.
- [Dimensionality Reduction/PCA.](https://en.wikipedia.org/wiki/Principal_component_analysis) The principal components correspond to the largest eigenvalues of A^T A, and this yields the least-squares projection onto a smaller dimensional hyperplane, with the eigenvectors becoming the axes of the hyperplane. Dimensionality reduction is extremely useful in machine learning and data analysis as it allows one to understand where most of the variation in the data comes from.
- [Low rank factorization for collaborative prediction.](http://cs229.stanford.edu/proj2006/KleemanDenuitHenderson-MatrixFactorizationForCollaborativePrediction.pdf) This is what Netflix does (or once did) to predict what rating you'll have for a movie you have not yet watched. It uses the SVD and throws away the smallest eigenvalues of A^T A.
- [The Google Page Rank algorithm.](https://en.wikipedia.org/wiki/PageRank) The largest eigenvector of the graph of the internet is how the pages are ranked.
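The first bullet's compression scheme (keep only the largest singular values of the image matrix) is easy to demonstrate with numpy; a small random matrix stands in for the 8-megapixel Allosaurus image:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(120, 80))          # stand-in for an image (rows x cols of pixel values)

U, s, Vt = np.linalg.svd(A, full_matrices=False)


def compress(k: int) -> np.ndarray:
    """Rank-k approximation: keep the k largest singular values, discard the rest."""
    return (U[:, :k] * s[:k]) @ Vt[:k]


# Reconstruction error shrinks as more singular values are kept
# (the linked note keeps 1, 10, 25, 50, 100 and 200 of them).
for k in (1, 10, 25, 50):
    err = np.linalg.norm(A - compress(k)) / np.linalg.norm(A)
    print(f"rank {k:3d}: relative error {err:.3f}")
```

By the Eckart-Young theorem, the rank-k truncation is the best rank-k approximation in this norm, which is why the error decreases monotonically as k grows.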
*answered Nov 17, 2015 at 13:53 by [Eric Naslund](https://math.stackexchange.com/users/6075/eric-naslund); [edited Nov 17, 2015 at 16:43](https://math.stackexchange.com/posts/1533514/revisions)*

**Answer (score 8):**

**Eigenvectors are axes, eigenvalues are distances along these axes**

Eigenvectors are the base of dimensional reduction techniques like PCA ([principal component analysis](https://en.wikipedia.org/wiki/Principal_component_analysis)), extremely useful in situations where we want to reduce the number of dimensions to a more practical one. Concrete example: We want to find similar pictures in a large set and show the relationships in 2D (we don't know what the similarity criteria are, nor their number):

[![enter image description here](https://i.sstatic.net/OU0Pg.png)](https://i.sstatic.net/OU0Pg.png)

The result above is obtained using a non-guided [simple dimension reduction technique](https://en.wikipedia.org/wiki/Isomap). Probably it isn't very useful, but it illustrates the creation of partial clusters, e.g. the direction the person is looking to, or the color of the skin, or whether the person opens the mouth, or who the person is.
While there are plenty of criteria measured, we are able to use a 2D system to reveal their effects combined; this is actually the expected benefit. The algorithm here found two super abstract criteria which are eigenvectors, and returned the corresponding pair of eigenvalues for each picture, used as individual coordinates to arrange the set.

**Face features as eigenvector: Eigenface**

Using eigenvectors is a base technique in face recognition, where we want to associate a name to a person's picture. The eigenvectors in this case are [**eigenfaces**](https://en.wikipedia.org/wiki/Eigenface). Imagine we got black and white images of 47x62 pixels which can have some gray attribute; we actually have data with a value in 1348 dimensions:

[![enter image description here](https://i.sstatic.net/tiMgC.png)](https://i.sstatic.net/tiMgC.png)

Not all pixels are critical in a given image. We want to [reduce the number of dimensions](https://en.wikipedia.org/wiki/Dimensionality_reduction) to the "useful" ones, without losing the main features of the images. Say we want to move from 1348 to 150 dimensions. This is the usual way of pre-processing images before doing some image classification, like face recognition, in order to decrease CPU workload. The reduction is done by finding eigenvectors of the input images; these eigenvectors can be seen as basis images, from which the complete (actually *nearly* complete) images can be reconstructed.

Below are the first 32 eigenvectors, out of the 150 which were computed by some PCA, in order of usefulness, that is, dimensions along which the original images have the highest variance. These images are somehow like the major harmonics of a sound (obtained using a [Fourier transform](https://en.wikipedia.org/wiki/Fourier_analysis)):

[![enter image description here](https://i.sstatic.net/hdaSW.png)](https://i.sstatic.net/hdaSW.png)

Note the three first eigenvectors are luminosity related, the main variance in images not shot in controlled conditions. To reconstruct the images from these eigenvectors, we only need to know the associated eigenvalues. The information to store and/or process for each image is now a vector of 150 eigenvalues, instead of the original vector of 1348 pixel values. A large gain, and still not much information has been lost. Here is a subset of original images, and images reconstructed using the eigenvectors shown above:

[![enter image description here](https://i.sstatic.net/tI0Bp.png)](https://i.sstatic.net/tI0Bp.png)

The number of eigenvectors used (and their choice) determines how much variance from the pixels is lost. This number is labeled below as "number of components", from the name of the reduction technique used, the principal component analysis (PCA):

[![enter image description here](https://i.sstatic.net/ad1iG.png)](https://i.sstatic.net/ad1iG.png)

For this graph, we see:

- With the 32 eigenvectors shown above, we can reconstruct images and get back more than 80% of the original pictures.
- With 150 eigenvectors, more than 95% of the information can be retrieved.

This is somehow similar to image compression, but much more too, because the eigenvalues can be used directly to make a relation between the image and the name of the person; we just need to train the classifier with a set of known images associated with the correct category (here the name to associate).

(I took my examples from this excellent [book on machine learning](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/05.09-Principal-Component-Analysis.ipynb).
Faces pictures are from the [LFW dataset](http://vis-www.cs.umass.edu/lfw/).)

*answered Jan 14, 2021 at 14:08 by [mins](https://math.stackexchange.com/users/874001/mins); [edited Apr 5, 2021 at 17:12](https://math.stackexchange.com/posts/3985012/revisions)*

**Answer (score 7):**

In control theory and dynamical systems you have **modal decomposition**, which is a very useful tool to quickly create the dynamic equation for a given (real life) system.

Given a system of differential equations ẋ(t) = A x(t), x(0) = x_0, where A has *distinct eigenvalues*, the solution to this equation is given as:

x(t) = ∑_{i=1}^{n} c_i e^{λ_i t} v_i

where c_i are the coefficients corresponding to the initial condition x(0), v_i is the i-th eigenvector, and λ_i is the i-th eigenvalue; needless to say, (v_i, λ_i) forms a pair.

The physical interpretation is that the solution corresponds to the unforced/natural response of the system, and it is used to analyze bridge models, RC circuits, mass-spring-dampers, magnetic suspension, fluid dynamics, acoustics, neuron models... Further, we can look at the eigenvalues of the A matrix to determine the stability of the system.
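The modal-decomposition formula can be checked numerically; the 2x2 system A below is a sample chosen for illustration (it has distinct real eigenvalues -1 and -2):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])            # eigenvalues -1 and -2: distinct, in the open left half plane
x0 = np.array([1.0, 0.0])

lam, V = np.linalg.eig(A)               # eigenvalue/eigenvector pairs (lambda_i, v_i)
c = np.linalg.solve(V, x0)              # coefficients c_i from the initial condition x(0) = x0


def x(t: float) -> np.ndarray:
    """Unforced response x(t) = sum_i c_i * exp(lambda_i * t) * v_i."""
    return (V * np.exp(lam * t)) @ c


# A is Hurwitz (all eigenvalues have negative real part), so the response decays to zero.
print(np.real_if_close(x(0.0)))   # ~ [1, 0], the initial condition
print(np.real_if_close(x(5.0)))   # ~ [0, 0], asymptotic stability
```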
If all eigenvalues lie in the open left half plane, then the matrix A is known simply as Hurwitz (a linear algebra result completely detached from dynamical systems), and the system is asymptotically stable. Otherwise it will either have a state that never goes to zero, or blow up as time goes to infinity.

This result is extremely well known but goes by different names; in some fields it is simply known as the eigenvector-eigenvalue problem:

- <http://jupiter.math.nctu.edu.tw/~tshieh/teaching/Math254_summerI2009/MAth254_summer_note/lecture16.pdf>
- <http://tutorial.math.lamar.edu/Classes/DE/RealEigenvalues.aspx>
- <https://see.stanford.edu/materials/lsoeldsee263/11-eig.pdf>

You can also consult basic references on ODEs, such as Boyce and DiPrima.

*answered Nov 21, 2015 at 1:57 by [Fraïssé](https://math.stackexchange.com/users/105951/fra%C3%AFss%C3%A9); [edited Nov 21, 2015 at 2:06](https://math.stackexchange.com/posts/1539167/revisions)*

Comment:

- Thanks for this interesting topic, and for these informations. But really, by real-life example, I mean something that I could show from A to Z (where Z is really real life, and not "something that could be useful in real life") to my students (2nd university year). ([Basj](https://math.stackexchange.com/users/109021/basj), Nov 21, 2015 at 19:57)

**Answer (score 5):**

In real life, we effectively use eigenvectors and eigenvalues on a daily basis, though sub-consciously most of the time.

Example 1: When you watch a movie on a screen (TV/movie theater, ...), though the picture(s)/movie you see is actually 2D, you do not lose much information from the 3D real world it is capturing. That is because the principal eigenvector is more towards the 2D plane the picture is being captured in, and any small loss of information (depth) is inferred automatically by our brain (the reason why we most of the time take photos with the camera facing directly at us, not from the top of the head). Each scene requires certain aspects of the image to be enhanced; that is the reason the camera man/woman chooses his/her camera angle to capture most of those visual aspects (apart from colour of costume, background scene and background music).

Example 2: If you eat pizza, french fries, ... or any food ... you are typically translating their taste into sour, sweet, bitter, salty, hot, etc. ... principal components of taste -- though in reality the way a food is prepared is formulated in terms of ingredients' ratios (sugar, flour, butter, etc. ... 10's of 100's of things that go into making a specific food) ...
However, our mind will transform all such information into the principal components of taste (an eigenvector having sour, bitter, sweet, hot, ...) automatically, along with the food texture and smell. So we use eigenvectors every day in many situations without realizing that's how we learn about a system more effectively. Our brain simply transforms all the ingredients, cooking methods, and final food product into some very effective eigenvector whose elements are taste sub-parts, smell and visual appearance internally. (All the ingredients and their quantities, along with the cooking procedure, represent some transformation matrix A, and we can find some principal eigenvector(s) V, with elements as taste+smell+appearance+touch, directly related by a linear transformation: AV = wV, where w represents an eigenvalue scalar and V an eigenvector.) (Top wine tasters probably have a bigger taste+smell+appearance eigenvector, with much bigger eigenvalues in each dimension. This concept can be extended to any field of study.)

Example 3: If we take pictures of a person from many angles (front, back, top, side, ...) on a daily basis and would like to measure the changes in the entire body as one grows, ... we can get the most information from the front angle, with the axis of the camera perpendicular to the line passing from the crown of the head to a point between one's feet. This axis/camera angle captures the most useful information to measure a person's outer physical body changes as age progresses. This axis becomes a principal eigenvector with the largest eigenvalues. (Note: the data/images that we capture directly from the top of the person may give much less useful information compared to the camera directly facing him/her in this situation. That is the reason why we use the PCA (Principal Component Analysis) technique to determine the most effective eigenvectors and related eigenvalues, to capture most of the needed information without bothering about all the remaining axes of data capture.)

Hope this helps in understanding why and how we use eigenvectors and eigenvalues for better perception in whatever we do day to day. Eigenvectors represent those axes of perception/learning along which we can know/understand/perceive things around us in very effective way(s). Finally, it boils down to the differences from person to person, in consciously/sub-consciously building/refining such principal eigenvectors and related eigenvalues, in each field of learning, that differentiate one person from the other (e.g. musicians, artists, scientists, mathematicians, camera men, directors, teachers, doctors, engineers, parents, stock market brokers, weather prediction, ...).

*answered Jan 24, 2018 at 4:36 by [TrinadhBtB](https://math.stackexchange.com/users/524353/trinadhbtb)*

Comment:

- Explained nicely.
Thanks. ([Ajitesh](https://math.stackexchange.com/users/814237/ajitesh), Aug 6, 2020 at 8:34)

**Answer (score 0):**

Eigenvalues can be applied in many ways; their significance and purpose depend on your decision, or an accepted decision, to use the eigenvalues as you see fit. In a real-world example, you can have the eigenvalues determined for almost any graph: you form the matrix, whose vectors are translations of electrical systems or mechanical movements, then compute the eigenvalues and obtain the eigenvectors for those values. You will then have to interpret the results, or reapply the data back to the graph you started with and/or a list of the values and vectors, and interpret those. So robot electrical systems are a great place to investigate these values: they help determine the system's electrical responses, such as voltages, and mechanical responses, such as movements. Hope it helps.

*answered Apr 21, 2017 at 13:22 by [jarse](https://math.stackexchange.com/users/439030/jarse)*

Comment:

- Would you have an actual example of a precise situation? i.e.
which matrix / linear function is involved? How does the definition $AX = \lambda X$ translate to the real-life situation? This would be needed to elaborate a pedagogical example. Thanks in advance. – [Basj](https://math.stackexchange.com/users/109021/basj) [Commented Apr 21, 2017 at 14:41](https://math.stackexchange.com/questions/1520832/real-life-examples-for-eigenvalues-eigenvectors#comment4616827_2244999)

Let me give you a direct answer. In applications, eigenvalues can be:

1. Control field: the eigenvalues are the poles of the closed-loop system. An analogue (continuous-time) system is stable if their real parts are all negative; a digital (discrete-time) system is stable if they all lie inside the unit circle.
2. Mechanical systems: the eigenvalues are the natural frequencies and the eigenvectors are the mode shapes.

answered Nov 6, 2016 at 12:35 by [George Iskander](https://math.stackexchange.com/users/386610/george-iskander) ([CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/ "The current license for this post: CC BY-SA 3.0"))

- Thank you for your answer, but this is not "a direct answer" for me at all :) `Control Field: eigen values are the pole of the closed loop systems` — what is a pole in this context? What is a closed-loop system?
In order to speak of an eigenvalue, we need a function $f$ such that $f(x) = \lambda x$, or a matrix $A$ such that $AX = \lambda X$. What is $A$ in this context? I have a math-only background and "closed-loop system" is like a hieroglyph to me ;) Likewise for `the natural frequency and the mode shapes`: what is the relevant linear function $f$ or matrix $A$? – [Basj](https://math.stackexchange.com/users/109021/basj) [Commented Nov 6, 2016 at 21:00](https://math.stackexchange.com/questions/1520832/real-life-examples-for-eigenvalues-eigenvectors#comment4111894_2001860)
- Mathematics without physics has no meaning and has no use. – [George Iskander](https://math.stackexchange.com/users/386610/george-iskander) [Commented May 27, 2017 at 15:23](https://math.stackexchange.com/questions/1520832/real-life-examples-for-eigenvalues-eigenvectors#comment4729888_2001860)
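To make the control-field point concrete (and to address the comment asking what $A$ is here): for a linear state-space system $\dot{x} = Ax$, the matrix $A$ is the state matrix and its eigenvalues are the system's poles. A minimal NumPy sketch, using a damped harmonic oscillator as an illustrative system — the matrix and parameter values below are assumptions for the example, not taken from the answer:

```python
import numpy as np

# Illustrative system: a damped harmonic oscillator
# x'' + 2*zeta*w*x' + w^2*x = 0, rewritten in state-space form
# x_dot = A @ x with state (position, velocity).
w, zeta = 2.0, 0.1          # natural frequency and damping ratio (assumed values)
A = np.array([[0.0,    1.0],
              [-w**2, -2.0 * zeta * w]])

eigvals = np.linalg.eigvals(A)

# A continuous-time system is stable iff every eigenvalue (pole)
# has a negative real part.
stable = np.all(eigvals.real < 0)
print(eigvals, stable)
```

Here both eigenvalues have real part $-\zeta\omega = -0.2 < 0$, so the oscillation decays to rest; a pole in the right half-plane would make the response grow without bound.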
**Eigenvectors are axes, eigenvalues are distances along these axes**

Eigenvectors are the basis of dimensionality reduction techniques like PCA ([principal component analysis](https://en.wikipedia.org/wiki/Principal_component_analysis)), extremely useful when we want to reduce the number of dimensions to a more practical one. Concrete example: we want to find similar pictures in a large set and show the relationships in 2D (we don't know what the similarity criteria are, nor how many there are):

[![enter image description here](https://i.sstatic.net/OU0Pg.png)](https://i.sstatic.net/OU0Pg.png)

The result above is obtained using an unguided [simple dimension reduction technique](https://en.wikipedia.org/wiki/Isomap). It probably isn't very useful by itself, but it illustrates the creation of partial clusters, e.g. by the direction the person is looking in, the color of the skin, whether the mouth is open, or who the person is. While plenty of criteria are measured, we are able to use a 2D system to reveal their combined effects, which is exactly the expected benefit. The algorithm found two very abstract criteria, which are eigenvectors, and returned the corresponding pair of eigenvalues for each picture, used as individual coordinates to arrange the set.

**Face features as eigenvectors: Eigenfaces**

Using eigenvectors is a basic technique in face recognition, where we want to associate a name with a person's picture. The eigenvectors in this case are [**eigenfaces**](https://en.wikipedia.org/wiki/Eigenface). Imagine we have black-and-white images of 47x62 pixels, each pixel carrying a gray level; we actually have data with a value in 1348 dimensions:

[![enter image description here](https://i.sstatic.net/tiMgC.png)](https://i.sstatic.net/tiMgC.png)

Not all pixels are critical in a given image.
We want to [reduce the number of dimensions](https://en.wikipedia.org/wiki/Dimensionality_reduction) to the "useful" ones, without losing the main features of the images. Say we want to move from 1348 to 150 dimensions. This is the usual way of pre-processing images before an image classification task, like face recognition, in order to decrease the CPU workload. The reduction is done by finding eigenvectors of the input images; these eigenvectors can be seen as basis images, from which the complete (actually *nearly* complete) images can be reconstructed. Below are the first 32 of the 150 eigenvectors computed by PCA, in order of usefulness, i.e. they are the dimensions along which the original images have the highest variance. These images are somewhat like the major harmonics of a sound (obtained using a [Fourier transform](https://en.wikipedia.org/wiki/Fourier_analysis)):

[![enter image description here](https://i.sstatic.net/hdaSW.png)](https://i.sstatic.net/hdaSW.png)

Note that the first three eigenvectors are luminosity-related: luminosity is the main source of variance in images not shot under controlled conditions. To reconstruct the images from these eigenvectors, we only need to know the associated eigenvalues. The information to store and/or process for each image is now a vector of 150 eigenvalues, instead of the original vector of 1348 pixel values — a large gain, yet not much information has been lost. Here is a subset of original images, and images reconstructed using the eigenvectors shown above:

[![enter image description here](https://i.sstatic.net/tI0Bp.png)](https://i.sstatic.net/tI0Bp.png)

The number of eigenvectors used (and their choice) determines how much variance from the pixels is lost.
This number is labeled below as the "number of components", after the reduction technique used, principal component analysis (PCA):

[![enter image description here](https://i.sstatic.net/ad1iG.png)](https://i.sstatic.net/ad1iG.png)

From this graph, we see:
- With the 32 eigenvectors shown above, we can reconstruct images that recover more than 80% of the original pictures' variance.
- With 150 eigenvectors, more than 95% of the information can be retrieved.

This is somewhat similar to image compression, but it goes further: the eigenvalues can be used directly to relate an image to the name of a person; we just need to train a classifier with a set of known images associated with the correct category (here, the name).

(I took my examples from this excellent [book on machine learning](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/05.09-Principal-Component-Analysis.ipynb). Face pictures are from the [LFW dataset](http://vis-www.cs.umass.edu/lfw/).)
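The eigenface pipeline described above can be sketched with plain NumPy by eigendecomposing the covariance matrix (scikit-learn's `PCA`, used in the linked handbook, computes the same subspace via SVD). Random data stands in for the LFW faces here, so the array sizes and the choice of `k` are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in data: 200 "images" of 50 pixels each (the faces in the
# answer would instead be rows of pixel values).
X = rng.normal(size=(200, 50))

# Center the data and eigendecompose the covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order

# Keep the k eigenvectors with the largest eigenvalues (the "eigenfaces").
k = 10
components = eigvecs[:, ::-1][:, :k]       # shape (50, k)

# Project each image onto the components: k coordinates per image,
# instead of 50 pixel values.
coords = Xc @ components                   # shape (200, k)

# Reconstruct approximate images from only k numbers each.
X_rec = coords @ components.T + X.mean(axis=0)

# Fraction of total variance retained by the k components
# (the quantity plotted in the "number of components" graph).
explained = eigvals[::-1][:k].sum() / eigvals.sum()
```

Strictly speaking, the per-image numbers stored in `coords` are projection coefficients onto the eigenvectors; the eigenvalues of the covariance matrix give the variance captured along each component, which is what the variance-retained curve above plots.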