Talk:Eigenvalues and eigenvectors

From Wikipedia, the free encyclopedia

Eigenvalues and eigenvectors is a former featured article. Please see the links under Article milestones below for its original nomination page (for older articles, check the nomination archive) and why it was removed.
This article appeared on Wikipedia's Main Page as Today's featured article on November 1, 2005.
Article milestones
Date                Process                     Result
October 5, 2005     Peer review                 Reviewed
October 14, 2005    Featured article candidate  Promoted
September 19, 2006  Featured article review     Demoted
Current status: Former featured article

Suggestion for a new lede

How about the following (meanwhile withdrawn):

______________________________________________

In linear algebra, an eigenvector or characteristic vector of a linear transformation is a non-zero vector that does not change its direction under application of this linear transformation. In other words, if $v$ is a non-zero vector, then it is an eigenvector of a linear transformation $T$ exactly if $T(v)$ is a scalar multiple of $v$. This condition can be written as the mapping

$v \mapsto \lambda v,$

where $\lambda$ is a scaling factor, known as the eigenvalue, characteristic value, or characteristic root associated with the eigenvector $v$.

There is a one-to-one correspondence between n by n square matrices and linear transformations from an n-dimensional vector space to itself. So for finite-dimensional vector spaces, it is equivalent to define eigenvalues and eigenvectors using either the language of linear transformations or the language of matrices. Such a linear transformation can be uniquely represented as an n by n square matrix $A$, and the vector by a column vector, which is an n by 1 matrix. The above mapping is then rendered as a matrix multiplication on the left hand side and as a scaling of the column vector on the right hand side in the defining equation

$A v = \lambda v,$

which holds for eigenvectors $v$ and corresponding eigenvalues $\lambda$ belonging to the linear transformation represented by the matrix $A$. Therefore these are usually called the eigenvectors and eigenvalues of the matrix.

Geometrically, an eigenvector corresponding to a real, nonzero eigenvalue points in a direction that is stretched by the transformation and the eigenvalue is the factor of this stretching. If the eigenvalue is negative, the direction is reversed.

__________________________________
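
For readers who want to see the defining equation above in action, here is a minimal numerical sketch (my own illustration, not part of the proposed lede; it assumes Python with NumPy and an arbitrarily chosen symmetric matrix):

    import numpy as np

    # A 2 by 2 matrix representing a linear transformation of the plane.
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # np.linalg.eig returns the eigenvalues and a matrix whose columns
    # are the corresponding eigenvectors.
    eigenvalues, eigenvectors = np.linalg.eig(A)

    for lam, v in zip(eigenvalues, eigenvectors.T):
        # Verify the defining equation A v = lambda v for each pair.
        assert np.allclose(A @ v, lam * v)
        print(lam, v)

    # Up to normalization and sign, this prints eigenvalue 3 with an
    # eigenvector along (1, 1) and eigenvalue 1 with an eigenvector
    # along (1, -1).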


Please, comment. Purgy (talk) 16:45, 21 August 2016 (UTC)[reply]

I'm not sure it is better than the current lead, and there are several things that are worse. Firstly, the notation is arguably not used correctly, or at least is misleading. When we write $x \mapsto \lambda x$ we usually interpret this as a lambda expression, not something only true for a particular value of x. Secondly, I don't see how mentioning a one-to-one correspondence is helpful. What we actually mean (and the current lead says) is that the linear transformation is represented by the matrix. This is much stronger than "one-to-one correspondence". I'm not clear what else is different about the proposed lead. Sławomir Biały (talk) 18:10, 21 August 2016 (UTC)[reply]
While I can appreciate your reservations to the \mapsto, I lack understanding for "representing" being stronger than "one-to-one" in this here context, if one wants to avoid "isomorphisms" or the like. In my effort I tried to collect and compress the current content under the premise of minimized changes. I am not shy to confess that I am eager to repress the general use of language of matrices in math articles wherever they impose their, partly mentioned, native restrictions. Would not leaving out the one-to-one connection weaken the matrix position in even finite dimensional environments still more? In trying to contribute to improvement of this article, I certainly will never fight for some specific content. Purgy (talk) 05:43, 22 August 2016 (UTC)[reply]
What is meant is that the linear transformation is represented as a matrix, not that there is a one-to-one correspondence between the set of linear transformations and the set of matrices. Representation is the mathematically correct term here. Sławomir Biały (talk) 09:48, 22 August 2016 (UTC)[reply]

little text error

In the section "Algebraic multiplicity", second paragraph: "Whereas Equation (4) factors the characteristic polynomial of A into the product of n linear terms with some terms potentially repeating, the characteristic polynomial can instead be written as the product of d terms each corresponding to a distinct eigenvalue and raised to the power of the algebraic multiplicity". I had to read it a couple of times to realize the word "of" was missing.
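
For reference, the factorization described in that sentence can be written out as follows (my own reconstruction in LaTeX, with $\lambda_1, \dots, \lambda_d$ the distinct eigenvalues and $\mu_A(\lambda_i)$ their algebraic multiplicities):

$\det(A - \lambda I) = (\lambda_1 - \lambda)^{\mu_A(\lambda_1)} (\lambda_2 - \lambda)^{\mu_A(\lambda_2)} \cdots (\lambda_d - \lambda)^{\mu_A(\lambda_d)}$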

I did a little linear math 40 years ago and I enjoy reading this text! --Caretta.nl (talk) 10:18, 6 July 2017 (UTC)[reply]

Thanks. :) Purgy (talk) 12:15, 6 July 2017 (UTC)[reply]

Eigenvalues and eigenfunctions of differential operators

At the beginning of this section it is mentioned that "eigenvectors and eigenvalues make sense also in infinite-dimensional Hilbert or Banach vector spaces". However, it seems to me that this is misleading: they make sense in any (finite or infinite) vector space regardless of the topology. — Preceding unsigned comment added by 147.122.31.17 (talk) 13:31, 18 July 2017 (UTC)[reply]
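
As a concrete illustration of the point (my own example, not taken from the article): the derivative operator on the space of smooth functions (an infinite-dimensional vector space, with no topology needed) already has eigenfunctions, since

$\frac{d}{dx} e^{\lambda x} = \lambda \, e^{\lambda x}$

for every scalar $\lambda$, so each exponential $e^{\lambda x}$ is an eigenfunction (eigenvector) of $d/dx$ with eigenvalue $\lambda$.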

External links modified

Hello fellow Wikipedians,

I have just modified 2 external links on Eigenvalues and eigenvectors. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 11:05, 18 September 2017 (UTC)[reply]

Colloquial definition in lead

The recently modified interpretation of the definition in the lead, "In the graphic setting of real vector spaces the direction of an eigenvector does not change, or is exactly reversed, under this transformation, just its length may be arbitrarily affected" is improper for several reasons. No matter how this is reworded, it will still have problems. I recommend removing it altogether. The first sentence, "In linear algebra, an eigenvector or characteristic vector of a linear transformation is a non-zero vector that only changes by an overall scale when that linear transformation is applied to it" is enough to give the reader a feel for what an eigenvector is, without any inconsistencies, before reading the technical definition.—Anita5192 (talk) 16:52, 16 October 2017 (UTC)[reply]

The two edits before mine triggered my action: I agreed with the perception that scaling by (-1) does not leave the direction of a vector unchanged, and I wanted to add a visually supporting formulation to the formally fully correct version of scaling the vector. Not sure about the effects of my newbie support, I added, fully intentionally, a "?" to my efforts. Since there are recommendations of removal, I will undo my edit. Perhaps someone finds a better way to support the elementary graphic geometric aspects of real EVs. Purgy (talk) 17:27, 16 October 2017 (UTC)[reply]
I was not faulting you. I think all the recent edits were performed in good faith. However, regardless of everyone's good intentions, I think the first and third sentence are enough. I don't believe we need to refer to specific vector spaces to give an intuitive feel to something very general. Regards.—Anita5192 (talk) 17:40, 16 October 2017 (UTC)[reply]
I think something like this would work better as an image caption, much like the image in the overview section (although I also think a better illustration should be found for this purpose). But, as text, "graphic setting of real vector spaces" strikes me as likely to be confusing to the target audience. Sławomir Biały (talk) 22:49, 16 October 2017 (UTC)[reply]
I never felt faulted in the slightest way, and I certainly see the confusing aspects in my weak formulations, but I have no grasp on their improper-ness, and I still think that eliminating the previous remarks about direction (which still appear later on) makes the learning curve required for the lede steeper. Maybe an animation would be best. Purgy (talk) 07:42, 17 October 2017 (UTC)[reply]
What I thought were improper were several concepts either not yet defined, or not well defined, e.g., 1. graphic setting, 2. direction, 3. reversed, 4. length. I also thought it unrealistic to restrict the topic to real vector spaces. E.g., in Z2, the only vector that could be an eigenvector is 1, and the only scalar multiples of 1 are 1 and 0. The terms direction and reverse are only trivially represented, at best.—Anita5192 (talk) 17:52, 17 October 2017 (UTC)[reply]

Adding Application to maxima-minima of multivariable functions

Eigenvalues are used in determining if a point is a local maximum, minimum, or saddle point by calculating the eigenvalues of the Hessian matrix; the full article on this is Second partial derivative test. — Preceding unsigned comment added by Loneather (talkcontribs) 10:58, 6 December 2017 (UTC)[reply]
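
A minimal sketch of what this would look like in practice (my own illustration, assuming Python with NumPy; the function f(x, y) = x^2 - y^2 and its Hessian are arbitrary examples, not taken from the article):

    import numpy as np

    # Hessian of f(x, y) = x**2 - y**2, which happens to be constant.
    H = np.array([[2.0,  0.0],
                  [0.0, -2.0]])

    # The Hessian is symmetric, so eigvalsh is appropriate.
    eigenvalues = np.linalg.eigvalsh(H)

    if np.all(eigenvalues > 0):
        print("local minimum")
    elif np.all(eigenvalues < 0):
        print("local maximum")
    elif np.any(eigenvalues > 0) and np.any(eigenvalues < 0):
        print("saddle point")
    else:
        print("test inconclusive (some eigenvalue is zero)")

    # The eigenvalues here are -2 and 2, so the critical point at the
    # origin is classified as a saddle point.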

Etymology

The article claims that the German prefix "eigen-" means "proper" or "characteristic". I don't want to simply edit that statement, because it is linked to a source, but actually the main meaning of "eigen" is "own" (as in "my own", not as a verb): "mein eigenes Haus" = "my own house". So "Eigenwert" (eigenvalue) means something like "its very own value". Unless I'm missing a special meaning of "proper" (I'm German), this translation appears inappropriate to me. "Characteristic" fits better, but the main meaning "own" should be mentioned first in my opinion. 217.248.11.10 (talk) 21:24, 15 March 2018 (UTC)[reply]

The source cited reads:

eigen, adj. (Dat.) proper, inherent; own, individual, special; specific, peculiar, characteristic; spontaneous; nice, delicate, particular, exact; odd, strange, curious; ticklish;…

Eigenvectors are also called characteristic vectors in some textbooks. The "own" meaning is not the most relevant here.—Anita5192 (talk) 21:52, 15 March 2018 (UTC)[reply]
If you look beyond an English–German dictionary (Grimm or Adelung) you will find the Greek root ἔχειν, confirming the meanings of "property" and "ownership" as the core meaning of "eigen", used as a prefix, as adjective, or even as verb ("eignen") in a field with these notions as its center. Former ages, where personal property determined the perceived personality to a greater extent, already coined the view that "property, one owns" makes up (to a good deal) the "character" of a person. As I perceive it, the translations of "eigen" to "own", "characteristic", "specific", ... immediately hit the spot. Maybe, the intended meaning of "proper" in this context, prefixed to the noun ("proper value") is less immediate to a non-native speaker, compared to the postfixed use ("value proper"). Other translations, given in the source, result from a (factual) slight shift in meaning to the pejorative side ("peculiar", "strange" (="curious"), ..., but still "characteristic"), others are -say- rare, if not curious ("spontaneous", "exact", "ticklish").
As a native German speaker (I'm Austrian) I would -unauthorized, but spontaneously- prefer to use "own" wrt material goods, but abstract conceptions would be "proper" to me. Honestly, I do think that linear maps "own" their "eigen"values, in the same sense as I "own" (without any rights) my mental conceptions of meanings. So maybe, changing the order of "own" and "proper" is merited, or it is not. :D Cheers, Purgy (talk) 08:28, 16 March 2018 (UTC)[reply]
The source cited supports the possible meanings of the prefix, but not the specific choices of "proper" and "characteristic." The editor who first inserted this evidently left no source or explanation for his or her choice of meanings. Most textbooks refer to "eigenvectors" as "characteristic vectors," but do not use other meanings. I would like to know the history of the term "eigenvector" and why the prefix "eigen–" was chosen, but none of my sources address this. The German article at [[1]] indicates that "eigen–" means "characteristic quantities," and dates to a publication by David Hilbert in 1904. Perhaps we should remove the prefix "proper" from the article.—Anita5192 (talk) 18:08, 16 March 2018 (UTC)[reply]

Not too technical

I removed the Technical template. Yes, the subject matter is technical mathematics, but I do not see how anyone can understand the subject matter without first understanding concepts like field, scalar, vector space, vector, linear algebra, and linear transformation, all of which are linked in the lead. Explaining them in depth here would be tedious and redundant. Several knowledgeable editors have been painstakingly refining this article for months to make it more readable, and I believe its present state is very readable for readers who already understand the aforementioned concepts.—Anita5192 (talk) 21:48, 8 February 2019 (UTC)[reply]

Maybe the less technically prepared would get a less steep introduction by replacing the current first sentence with something like:

Eigenvalues and eigenvectors belong in a characteristic way to a linear transformation, as dealt with in linear algebra. The transformation of an eigenvector results in just scaling it by the factor given by the eigenvalue belonging to this eigenvector. More formally ...

Just a suggestion. Purgy (talk) 08:58, 9 February 2019 (UTC)[reply]
i could not agree more! after reading the lede i came to the talkpage to say that this is an utterly confusing way to explain, for the reader (only) knowing what 'vector' means, that the addition of 'eigen' to the expression simply means that the vector changes only in its length but not in its direction. so yes, the lede is way too technical. it should explicitly say in the very beginning that an eigenvector of v is any v' that only differs in length from v but is not rotated to point in another direction. (okay, add that flipping direction 180 degrees by multiplying with a negative value does not count as rotation.)
the introduction of all other technical terms BEFORE getting to this simple point is making it too technical. 89.134.199.32 (talk) 20:53, 3 September 2019 (UTC).[reply]
I have moved the formal definition from the lead into its own section in the body of the article. Hopefully this will resolve the aforementioned issues.—Anita5192 (talk) 23:35, 3 September 2019 (UTC)[reply]

Historical origin of the use of lambda for eigenvalues?

My guess is that it comes from the early work on linear algebra, where eigenvalues and eigenvectors arose from analyzing wave equations, lambda being used for wavelength, and the different modes (eigenvectors) corresponding to various special solutions that can be linearly combined? Then it was set in stone when essentially the same was done to Schrödinger's equation in the Hamiltonian formulation of QM. Unfortunately it is hard to find sources on when the lambda symbol became popular for eigenvalues and what the real origin of this popularity is. 2A02:168:2000:5B:94CB:836:78C3:226E (talk) 12:17, 24 June 2020 (UTC)[reply]

Lead is now ineffective, and possibly wrong

"an eigenvector or characteristic vector of a linear transformation is a nonzero vector that changes by a scalar factor when that linear transformation is applied to it. "

This fails to make clear that the salient feature of an eigenvector is that it is a vector in the direction in which the linear transformation applies no rotation. As it stands:

  1. the description is incorrect in that it doesn't exclude all the directions in which the linear transformation applies a scalar factor and a rotation.
  2. It does rule out genuine eigenvectors whose eigenvalue happens to be one.

I suggest some rewording that eliminates these incorrect aspects, and makes clear that eigenvector is about the direction of non-rotation, rather than whether or not there is scaling. Gwideman (talk) 14:11, 22 February 2021 (UTC)[reply]

I don't see anything wrong with the definition above. In other directions a linear transformation need not be a rotation; it could, for example, be a shear. The definition need not exclude other directions; the definition is about what happens to an eigenvector—not what happens to other vectors. It does not rule out eigenvalues of one; one is a valid eigenvalue and is encompassed by the definition above.—Anita5192 (talk) 17:03, 22 February 2021 (UTC)[reply]
From the definition section:
"If T is a linear transformation from a vector space V over a field F into itself and v is a nonzero vector in V, then v is an eigenvector of T if T(v) is a scalar multiple of v. This can be written as
where λ is a scalar in F, known as the eigenvalue "
This is not the same as "changes by a scalar factor". It is the same as "changes only by a scalar factor, or remains unchanged".
To answer your points:
"the definition is about what happens to an eigenvector—not what happens to other vectors." Of course it's also about other vectors! We're trying to state criteria by which all those other vectors fail to qualify as eigenvectors.
"In other directions a linear transformation need not be a rotation; it could, for example, be a sheer." Shear describes the transformation of the plane (for 2D), not the transformation of an individual vector. When a shear is applied, most vectors rotate. Eigenvector identifies the ones that do not. There is an excellent visualization on YouTube channel 3Blue1Brown titled "Eigenvectors and eigenvalues | Essence of linear algebra, chapter 14" starting at 2:59, and at 3:12 "any other vector is going to get rotated". (Sorry, Wikipedia blocked the URL.)
"It does not rule out eigenvalues of one". The word changes rules out the scalar being 1. Gwideman (talk) 11:09, 1 March 2021 (UTC)[reply]