Talk:Basis (linear algebra)/Archive 1


Proving that a finite set is a basis

This section confuses me... It says:

>To prove that a finite set B is a basis for a finite-dimensional vector space V, it is necessary to show that the number of elements in B equals the dimension of V, and both of the following:
>
> * B is linearly independent,
> * span(B) = V.
>
>It should be noted that this technique does not work for infinite-dimensional vector spaces.

This last sentence is not clear to me. The two conditions above seem to be ENOUGH to prove that ANY set B (be it finite or infinite) is a basis of V. Being linearly independent just means that if we take any finite subset B_0 of B we must have only trivial linear combinations leading to zero. span(B)= V just means that any vector v belonging to V must be a linear combination of a (finite) number of elements of B. These two definitions are just the two conditions mentioned above and they must work for infinite-dimensional vector spaces as well! Someone please tell me if I'm missing something here. Arestes (talk) 10:31, 7 December 2010 (UTC)
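
For reference, here are the two conditions written out in full generality; this is just the standard definition restated (nothing beyond what the comment above already says), so the finite/infinite comparison can be made precisely:

 B is linearly independent:  for every finite subset {v1, ..., vn} of B, a1v1 + ... + anvn = 0 implies a1 = ... = an = 0.
 span(B) = V:  every v in V can be written as v = a1v1 + ... + anvn for some finite subset {v1, ..., vn} of B and scalars a1, ..., an.

Together these two conditions define a basis of any vector space, finite- or infinite-dimensional; the sentence being objected to presumably refers only to the counting shortcut (checking that the number of elements of B equals dim V and then verifying one of the two conditions), which indeed has no infinite-dimensional analogue.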

Ok, since nobody seems to reply to this, and I'm convinced that this section is just WRONG (not only confusing), I'm deleting it.

I believe the proof that even infinite-dimensional vector spaces have a basis is a bit different. The generalization involves proving that B is a basis for V only if B is a maximal linearly independent subset of S, where S generates V. I think, though, that the wording in the article may not be entirely accurate, so it's fine to delete it. Nikolaih☎️📖 22:55, 6 June 2020 (UTC)
What are you talking about? This discussion, dated 2010, is about a section that does not exist anymore. So what do you suggest deleting? If you think that the wording of the article is not accurate, you must be accurate yourself and specify exactly what the issue is. In any case, as many textbooks contain a proof of the existence of a basis, if the proof given here is not correct, it must not be deleted, it must be fixed. D.Lazard (talk) 09:15, 7 June 2020 (UTC)
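
For readers following this thread, the characterization alluded to just above can be stated precisely; this is a standard textbook fact, quoted here only as a pointer:

 B is a basis of V  ⟺  B is a maximal linearly independent subset of V  ⟺  B is a minimal spanning subset of V.

The usual existence proof (compare the proof discussed further down this page) applies Zorn's lemma to the collection of linearly independent subsets, ordered by inclusion, to produce such a maximal set.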

Hamel bases

The page Hamel basis redirects here, and I'm not sure that is appropriate. If it is appropriate, important information about Hamel bases is missing from this article. Hamel bases are discussed most frequently as bases for the real numbers considered as a vector space over the rationals, and the article here omits that important idea. I'm going to remove the redirection and add "Hamel basis" to the list of requested mathematics article, and possibly supply a new "Hamel basis" article myself when I can. -- Dominus 18:46, 20 May 2004 (UTC)

You have a point. But in the scheme of things I'd say the point made at length about orthonormal bases not being vector space bases is a more central topic.

A dedicated article about Hamel bases for R over Q would be fine, IMO. There is enough to say.

By the way, the introduction of the idea of basis by the four conditions that can be proved equivalent is a bad old Bourbakiste trick, unsuitable for WP. Basis of a vector space is something very fundamental in algebra. It needs a more gentle and readable introduction. Charles Matthews 12:40, 6 Mar 2005 (UTC)

I'm curious about the following statement: Every Hamel basis of this space is much bigger than this merely countably infinite set of functions. Hamel bases of spaces of this kind are of little if any interest. I have the following problem in mind: let V be the space of infinite sequences of reals with finitely many nonzero components. The standard Hamel basis of this space is countably infinite. To prove that the dual of V, V*, is not isomorphic to V, one might show that any Hamel basis of V* is uncountable, no? 66.235.51.96 03:51, 12 November 2005 (UTC)
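
One standard way to carry out the suggestion in the previous comment, sketched here for later readers (not tied to any particular source): identify V* with the space of all real sequences, acting coordinatewise on the standard basis of V. For distinct real numbers t1, ..., tn the geometric sequences (1, ti, ti², ti³, ...) are linearly independent, since

 c1(1, t1, t1², ...) + ... + cn(1, tn, tn², ...) = 0

gives a homogeneous Vandermonde system with distinct nodes, forcing c1 = ... = cn = 0. Hence V* contains a continuum-sized linearly independent family, so every Hamel basis of V* is uncountable, and V* cannot be isomorphic to V, whose standard basis is countable.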

Request for technical explanation

This article is crying out for a picture and an example using R3 or R2 with analogy to a simple XYZ or XY coordinate system. --Beland 16:17, 18 December 2005 (UTC)

Too Technical

I am trying to understand Dirac notation by reading Wikipedia, and I am finding that all the articles are very technical. Now, "basis" is an idea I can wrap my head around, so I'll add an informal thingo. If you find it inaccurate, tweak it by all means, but don't just remove it purely on the basis that "this is so basic that everyone already knows it".

New To Advanced Math

Hi; I'm trying desperately to understand many of these advanced principles of mathematics, such as basis, but no matter how many times I review the material, it doesn't sink in. Could someone please provide examples, problems to solve (with their solutions) and/or ways to visualize this? beno 26 Jan 2006

Hmmm, in my experience learning university-level mathematics was a fairly involved project that would have been difficult to do off the web :). I highly recommend finding a university and at least sitting in on their classes (or better, enrolling), if you possibly can. -- pde 23:19, 8 March 2006 (UTC)
Also, MIT has video lectures in linear algebra that you might find helpful. They can be downloaded here: http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/VideoLectures/index.htm. I'm still trying to learn this stuff myself! —The preceding unsigned comment was added by Fastfilm (talkcontribs) 16:44, 26 April 2007 (UTC).
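
Since a couple of threads above ask for concrete, hands-on examples: here is a small numerical illustration (Python with NumPy; the code and names are written for this talk page only, not taken from the article) of checking that three vectors form a basis of R3 and of computing the coordinates of a vector in that basis.

 import numpy as np
 
 # Candidate basis of R^3, stored as the columns of the matrix B
 # (deliberately not orthonormal; a basis does not need to be).
 B = np.column_stack([[1, 1, 0], [0, 1, 1], [0, 0, 1]])
 
 # The columns form a basis exactly when they are linearly independent,
 # i.e. when the matrix has full rank (equivalently, nonzero determinant).
 print(np.linalg.matrix_rank(B) == 3)   # True: the three columns are a basis
 
 # Coordinates of a vector v in this basis: solve B @ c = v for c.
 v = np.array([2.0, 3.0, 4.0])
 c = np.linalg.solve(B, v)
 print(c)        # coordinate vector of v relative to the basis
 print(B @ c)    # recombining the basis with these coordinates gives back v

The two checks (independence and spanning) are exactly the two defining conditions of a basis discussed elsewhere on this page; in R3, three independent vectors automatically span.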

Co-ordinate vs. Coordinate

Assuming there's a difference, can someone explain why there would be inconsistent references in this article? It would only make sense logically to use one and use the same throughout (e.g. don't call apples oranges, and vice versa--just call it what you normally call it). -therearenospoons 16:30, 12 April 2006 (UTC)

Both are acceptable but 'coordinate' seems to be more standard in current usage. The change will be made.--RDBury (talk) 12:29, 13 July 2008 (UTC)

Definition

Recently, the definition was changed from:

"Let B be a subset of a vector space V. A linear combination is a finite sum of the form
where the vk are different vectors from B and the ak are scalars. The vectors in B are linearly independent if the only linear combinations adding up to the zero vector have a1 = ... = an = 0. The set B is a generating set if every vector in V is a linear combination of vectors in B. Finally, B is a basis if it is a generating set of linearly independent vectors."

to:

"A basis B of a vector space V is a linearly independant subset of V that spans (or generates) V.
If B is endowed with a specific order (i.e. B has a finite sequence of linearly independant vectors that generate V), then B is called an ordered basis."

The editor said that this simplifies the definition. While the second definition is definitely shorter, it does not explain what "linearly independent" and "spans" means. For that reason, I prefer the first definition. I expect that most people who know what "linearly independent" and "spans" means, also know what "basis" means and do not need to read the definition. -- Jitse Niesen (talk) 04:57, 28 April 2006 (UTC)

I added some more detail in the definition, however, the definition of a basis is strictly what is in the first line (reference: Linear Algebra, 4th edition, Friedberg, Insel, Spence). The extra info is just an explanation of what the definition means. --Spindled 05:41, 28 April 2006 (UTC)
I like that the new section starts with the most important fact, what a basis is. However, I did make some edits to it (covered the infinite case, there is no A in independent in English, only variables should be italics but no parentheses, numbers, or other symbols); I hope you don't mind. By the way, have you already found Wikipedia:WikiProject Mathematics? -- Jitse Niesen (talk) 12:18, 28 April 2006 (UTC)

Is "By brute force" a colloquialism?

The example of alternative proofs is terrific. However, the phrase "By brute force" sounds like a colloquialism, so I have changed it to "By algebra". I considered "By algebraic manipulation", which sounds like something a con artist would do, and "By calculation", which sounds like something a calculator could do. :-)

I have removed the characterization of alternative proof methods because it implied that anyone who uses the first method is unsophisticated. That is POV. Perhaps someone can rewrite it. --Jtir 18:58, 26 September 2006 (UTC)

Personally, I don't think 'brute force' is too colloquial, but on the other hand I don't think it is the right phrase to use in this case. The proof of independence is simply going back to the definition of independence, so maybe ... 'from the definition' might be appropriate? Madmath789 19:33, 26 September 2006 (UTC)
That's much better! I would never have thought of that because I was hung up on maintaining the "By ..." pattern. I have made the edit, which includes a smoother intro. --Jtir 20:16, 26 September 2006 (UTC)

Unclear wording about Hamel basis

The section titled Hamel Basis currently begins with the sentence:

The phrase Hamel basis is sometimes used to refer to a basis as defined above, in which the fact that all linear combinations are finite is crucial.

But what lies above is the definition for the ordered basis. Is this sentence implying the Hamel basis is a kind of ordered basis? If so, then this should be stated explicitly, instead of the vague "defined above" referent. The second part of the sentence, ...is crucial is also confusing: the sections above don't really discuss finiteness, and so insisting that finiteness is now crucial is even more confusing. Can someone please fix this up? linas 23:20, 14 October 2006 (UTC)

Is it better now or just more confusing? -- Jitse Niesen (talk) 03:26, 15 October 2006 (UTC)

In a Wiki piece about the very fundamental algebraic notion of a "basis", I would avoid emphasizing weird-looking technicalities like R as a vector space over Q. Hilbert basis and the other variants, okay. LDH (talk) 17:57, 19 November 2008 (UTC)

Schauder dimension

I'm not sure, but from what I know it seems to me that Schauder bases do not have to be of the same cardinality. Consequently, Schauder dimension should be defined more clearly. —The preceding unsigned comment was added by 195.220.60.4 (talk) 12:08, 15 December 2006 (UTC).

overly math-y wording

The definition currently contains the following sentence: "A vector space that admits a finite basis is called finite-dimensional." I found the word "admits" in this context to be unfamiliar and it took me a while to figure out exactly what was being said. I'm sure people who know the topic well or who are more math-oriented do not have a problem with it. However, why not "A vector space with a finite basis is called finite-dimensional." OR "A vector space that can be generated by a finite basis is called finite-dimensional."

Any objections? Am I misunderstanding the meaning of "admits"? -pg

You're probably right that the use of "admits" is a bit of jargon, though I never realized this. I changed it to "A vector space that has a finite basis is called finite-dimensional." Is that better?
Your first suggestion might be taken to require that the basis is specified. It doesn't need to be specified, it only needs to exist. Your second suggestion is okay, but I like "that has a finite basis" better because it's shorter. -- Jitse Niesen (talk) 03:32, 27 April 2007 (UTC)
I see what you mean, and your new phrasing sounds great to me. Thanks! -pg —The preceding unsigned comment was added by 71.247.186.233 (talk) 06:08, 2 May 2007 (UTC).

The picture alongside "Definition" does not show up clearly in my browser window. Can something be done about this?

Thanks

Banach Section

In the subsection called "Banach Spaces" there is a statement that any Banach space with cardinality "strictly greater than the continuum (2^ℵ0)" has an uncountable Hamel basis. Isn't the statement 2^ℵ0 > ℵ0 the continuum hypothesis? So, if that is necessary to state, shouldn't we just say so, or say "assuming CH"? Gandalf (talk) 16:00, 13 February 2008 (UTC)

No, 2^ℵ0 > ℵ0 is provable without CH. There's a proof in the Cardinality of the continuum article. (The Continuum Hypothesis says that there is no cardinal κ such that ℵ0 < κ < 2^ℵ0. This is equivalent (in ZFC) to 2^ℵ0 = ℵ1, or, if you prefer, 𝔠 = ℵ1.) But that statement from the Banach space section is just silly, even though it's correct. --Zundark (talk) 16:54, 13 February 2008 (UTC)
I knew that. I'm so silly. Sorry. Gandalf (talk) 15:47, 19 February 2008 (UTC)
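
For anyone reading the archive later, the reason no form of CH is needed is elementary cardinal arithmetic (a sketch only): if a real vector space X had a countable Hamel basis B, then every element of X would be a finite linear combination of elements of B, so

 |X| ≤ |⋃n (ℝ × B)^n| = 2^ℵ0,

and therefore any X with cardinality strictly greater than 2^ℵ0 can only have uncountable Hamel bases. Completeness of the norm plays no role here, which is presumably why the statement in the Banach space section was called silly above.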

I reorganized (partially) the Related notions section since there was too much talking. Orthogonal basis and Schauder basis have their own pages, for one. Further - the statements concerning cardinality of the space and cardinality of the Hamel basis were false (indeed, the reals have cardinality continuum yet their Hamel basis is far from being uncountable). The distinction between Schauder dimension and Hamel dimension should be referenced and better explained, I took the liberty of deleting it for now. —Preceding unsigned comment added by Protony (talkcontribs) 22:40, 14 August 2008 (UTC)

I am looking for examples illustrating the difference between a standard basis, a Schauder basis, an ordered basis and a Hamel basis. Any link or any hint? Please help me. Ahsanulhaqpk (talk) 14:07, 15 March 2009 (UTC)

Pronunciation

Is the plural "bases" pronounced "base-eez" or "base-iz"? —Preceding unsigned comment added by 24.2.48.202 (talk) 22:23, 14 November 2008 (UTC)

I say base-eez. I don't care if it's legitimate. I just want to be understood. :) LDH (talk) 17:48, 19 November 2008 (UTC)

hamel basis

In the section on "Analysis", it is implied that a Hamel basis is simply some basis for an infinite-dimensional vector space, which involves "taking infinite linear combinations of the basic vectors in order to generate the space".

This is totally wrong AFAIK; the whole point of a Hamel basis is that you only need finite linear combinations. The most famous example of a Hamel basis (see e.g. Hewitt and Stromberg, Real and Complex Analysis) is a basis for the real numbers over the rationals. There, the Hamel basis is a set of real numbers such that any real number can be written as a finite linear combination of these real numbers multiplied by rational coefficients.

See H&S for the more general definition; the other key point is that the existence of such a Hamel basis for uncountable sets requires the axiom of choice. This makes them rather odd as practical constructs, as they are only useful for non-constructive proofs.

— Steven G. Johnson (talk) 15:22, 19 March 2010 (UTC)

The section says that the Hamel basis is the notion discussed earlier in the article, and that the other notions allow "infinite linear combinations". Perhaps the phrasing is ambiguous (rather than the content being dubious). RobHar (talk) 15:59, 19 March 2010 (UTC)
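
A related cardinality fact that may help readers skimming this section (standard, and it hinges on exactly the finiteness stressed above): if H is a Hamel basis of the real numbers over the rationals, then every real number is a finite rational combination of elements of H, so

 |ℝ| ≤ |⋃n (ℚ × H)^n| = max(ℵ0, |H|),

which, together with H ⊆ ℝ, forces |H| = 2^ℵ0. In particular H is uncountable; as noted above, its existence is obtained from the axiom of choice, and no explicit example is known.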

Dimension theorem needs only the ultrafilter lemma?

The article mentions that it can be proven from the ultrafilter lemma alone (without full choice) that any two bases are equipotent (the dimension theorem), and this claim is repeated on the page Dimension theorem for vector spaces. The proof on that page, however, isn't at all clear and seems to require the full axiom of choice (see [1]). Can anyone shed some light on this? --81.206.99.122 (talk) 11:20, 14 May 2010 (UTC)

Indeed, the dimension theorem (also known as Löwig's theorem) follows from the ultrafilter lemma. I have checked the primary source on this kind of issue: "Consequences of the axiom of choice" by Howard & Rubin. You can also check it online at http://consequences.emich.edu/file-source/htdocs/conseq.htm . Godelian (talk) 05:04, 11 October 2011 (UTC)

Unexplained introduction of a new symbol

The article says "Let S be a subset of a vector space V. To extend S to a basis means to find a basis B that contains S as a subset. This can be done if and only if S is linearly independent. Almost always, there is more than one such B, except in rather special circumstances (i.e. L is already a basis, or L is empty and V has two elements)."

What is L? Is it just supposed to be S? Chesemonkyloma (talk) 20:32, 25 July 2012 (UTC)

Yes, it clearly is meant to be S, and I've fixed this. However, I think this section still need reviewing. — Quondum 09:30, 28 July 2012 (UTC)
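
The general statement needs Zorn's lemma, but in the finite-dimensional case the extension described in this section is completely constructive, which may be worth illustrating. A small sketch in Python/NumPy (the function is written for this talk page only, under the assumption that S is given as a linearly independent list of length-n vectors): keep adjoining standard basis vectors whenever they preserve independence; when the loop ends, the set spans R^n, hence is a basis containing S.

 import numpy as np
 
 def extend_to_basis(S, n):
     """Extend a linearly independent set S of vectors in R^n to a basis of R^n."""
     basis = [np.asarray(v, dtype=float) for v in S]
     for i in range(n):
         e = np.zeros(n)
         e[i] = 1.0                      # i-th standard basis vector
         candidate = basis + [e]
         # keep e only if the enlarged set is still linearly independent
         if np.linalg.matrix_rank(np.array(candidate)) == len(candidate):
             basis.append(e)
     return basis
 
 print(extend_to_basis([[1, 1, 0]], 3))   # e.g. returns (1,1,0), (1,0,0), (0,0,1)

The greedy loop mirrors the abstract argument: one keeps adding vectors that stay independent until nothing more can be added; in the infinite-dimensional case Zorn's lemma replaces the loop.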

Basis and Bases

The article contains two spellings (Basis and Bases) for what I believe is the same thing. Please standardize the article by using the spelling, "basis". If indeed "basis" and "bases" ARE meant to represent different concepts, then explicit mention of the similar spellings is in order.

Since I am not 100% sure about this I will leave the edits to experts. Also please see the article on matroids which has a similar issue.

Churchill17 (talk) 21:39, 21 April 2013 (UTC)

"bases" is plural. I edited the entry to clarify this. AmirOnWiki (talk) 12:21, 7 November 2014 (UTC)

Diagrams

Here are two images which show the main ideas of basis vectors, namely that taking linear combinations of them gives new vectors and that a vector can be represented in more than one basis.


Image captions:
• A linear combination of one basis set of vectors gives new vectors. If these are non-coincident and nonzero, they form a new basis set. The two sets are related by a linear transformation.
• A vector (purple arrow) can be represented in two different bases (green and blue arrows).
• If the basis vectors are each multiplied by a scalar and then added, i.e. if a linear combination of the vectors is taken, the result is a vector in the space spanned by those basis vectors. Here the vector space is just 3d Euclidean space, and two possible bases (not orthonormal) are shown.

Feel free to take/leave, edit the captions, request labeling (which I left out to reduce clutter and see what it would be like without), move them to another article, complain, etc. Best, M∧Ŝc2ħεИτlk 17:20, 25 April 2013 (UTC)

NB: These have been redrawn twice now. If you have comments please state them here instead so feedback is all in one place. Thanks, M∧Ŝc2ħεИτlk 09:29, 27 April 2013 (UTC)

Proof that every vector space has a basis

I was making my way through this otherwise elegant proof and came across the following in the body of the proof:

"we say that L1 ≤ L2 when L1 ⊂ L2"

Elsewhere in the Wikipedia Mathematics collection (namely "Subset", section "The symbols ⊂ and ⊃") it is conspicuously noted that:

" if A ⊆ B, then A may or may not be equal to B, but if A ⊂ B, then A is definitely not equal to B."

Am I correct to note this as an error in symbology? — Preceding unsigned comment added by Hubbligart (talkcontribs) 23:53, 27 December 2014 (UTC)
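
For later readers with the same question: the relation is used to apply Zorn's lemma, so it has to be a partial order on the collection of linearly independent subsets, and a partial order must in particular be reflexive. The intended reading is therefore inclusion with equality allowed,

 L1 ≤ L2  whenever  L1 ⊆ L2,

and the proof goes through verbatim with ⊆. Many textbooks simply write ⊂ for not-necessarily-proper inclusion, so this is a clash of notational conventions rather than a mathematical error, but ⊆ would indeed be the clearer symbol here.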

Basis matrix

The definition of Basis matrix is confusing in the article, are the columns basis vectors or are the rows basis vectors? Smk65536 (talk) 08:09, 19 July 2015 (UTC)

Are you referring to the section Basis (linear algebra) §Expression of a basis? IMO, we should not be referring the ill-defined concept of a "basis matrix" at all. I have removed the section. —Quondum 14:42, 19 July 2015 (UTC)
If you had looked a little closer you might have recognized that it was about describing a new basis in terms of an old one, in other words the change of basis matrix. It was just badly written, using poor terminology. The section should have been fixed up (by you) instead of being drive-by deleted. YohanN7 (talk) 00:12, 20 July 2015 (UTC)

customary terminology not clear in Wikipedia; local editors, heads up

Silly me. I just recently wanted to check on the term 'component of a vector'. I was not careful enough, and let myself be misled by the article Scalar projection, which does not alert the reader to what I now believe to be the case, that 'component' is the customary term, as defined in this present article, Basis (linear algebra). My suggested remedy is to delete the article Scalar projection. Perhaps I am wrong?

It seems to me that there is a need to try to tidy this. Expert local editors, heads up. Chjoaygame (talk) 13:12, 14 February 2016 (UTC)

No need to delete Scalar projection. It suffices to edit it, which I have done. D.Lazard (talk) 14:29, 14 February 2016 (UTC)
With respect, the new edit to Scalar projection is, I think, utterly inadequate. A proper survey of sources is needed if the article is to be acceptable. At present it has no sources; no sources. As it stands, it is misleading.Chjoaygame (talk) 15:17, 14 February 2016 (UTC)
As for the sourcing of this article here, Basis (linear algebra). The source in the lead is Halmos. I have just now glanced through it, and didn't find him using the term 'component'. Perhaps I didn't look hard enough? It is the pride of Halmos that he avoids coordinates, but on my quick glance, that doesn't make him a good source for the term 'component'.Chjoaygame (talk) 15:48, 14 February 2016 (UTC)
To judge from the talk-page comment of the article Scalar projection, the term component applies specifically and only to the case of orthogonal bases? But this article Basis (linear algebra) here gives no hint of that. If my reading is right, I think it should do so.
Perhaps you wonder why I am so uptight about this. It is because my mistake, partly due to the article Scalar projection, led me to waste much of other editors' time elsewhere. Silly me, yes, but I am sore about it, and so is one of them!
Looking at my source (Bloom, D.M. (1979), Linear Algebra and Geometry, Cambridge University Press, Cambridge UK, ISBN 0-521-21959-0, p. 98), I don't seem to see it restricting 'component' to orthogonal bases. This puzzles me. Chjoaygame (talk) 16:19, 14 February 2016 (UTC)
Never trust the internet. This crowd here doesn't seem to restrict to orthogonal bases?Chjoaygame (talk) 16:28, 14 February 2016 (UTC)
In this Wikipedia article, Tangential and normal components, the basis is orthogonal, but the components are vectors, not scalars?Chjoaygame (talk) 16:35, 14 February 2016 (UTC)
This crowd is more explicit. Here they talk of 'Cartesian components'.Chjoaygame (talk) 16:42, 14 February 2016 (UTC)
It seems that you are confused by the existence of two strongly related but slightly different concepts of a vector. A vector may be an element of a vector space, and in this case, as soon as a basis has been fixed, a component is a coefficient of the decomposition of the vector on the basis. In Euclidean geometry, a vector, sometimes called a free vector or a geometric vector, is an entity that has a direction and a magnitude and defines a translation. The free vectors form a vector space, and among the bases of this vector space, the orthonormal bases play a major role, as they are used to define Cartesian coordinates. Historically, geometric vectors were introduced a long time before vector spaces. This means that the terminology may differ, depending on whether one considers Euclidean geometry or linear algebra. Thus both the "scalar component in a direction" of a geometric vector and the "components of a vector over a basis" may be called simply "components", and that does not mean that one of the terms is incorrect, nor that one is more customary than the other. Almost always, the context makes it easy to decide which kind of "component" is considered. The relationship between the various concepts of vectors is explained in Euclidean vector. D.Lazard (talk) 17:14, 14 February 2016 (UTC)
Thank you for this lecture. It looks as if it should be in the articles, not just here on the talk page. My puzzle is not between two concepts of vectors. It is as to how Wikipedia articles should present this material.
Continuing my efforts to find how the words are used. Only a few books are accessible to me right now.
In Gibson, C.G. (2003), Elementary Geometry: An Introduction, Cambridge University Press, Cambridge UK, ISBN 0-521-83448-1, on page 13 I read: "We call the vector λW the component of Z parallel to W, and the vector Z′ = Z − λW the component of Z perpendicular to W."
In Marsden, J.E., Tromba, A.J. (1976/1981), Vector Calculus, W.H. Freeman, San Francisco CA, ISBN 0-7167-1244-X, on page 1 I see that they refer to Cartesian coordinates, and their components are scalars.
Evidently, as you say, respectable sources vary. I think the articles should make this clear. Currently, they don't.Chjoaygame (talk) 18:03, 14 February 2016 (UTC)

Hello fellow Wikipedians,

I have just modified one external link on Basis (linear algebra). Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 03:02, 28 October 2016 (UTC)

Change of basis

I think the above-mentioned section is quite useless, apart from the fact that it redirects to the actual change of basis article. Unfortunately, I don't think I could rewrite it myself, but I wanted to bring it to attention.

Also, I think there is an error in the section: in two consecutive displayed formulas, a symbol that was introduced at the beginning of the section as one of the elements of one of the bases, and thus as an element of V, is used as if it were an element of F^n. BadSaturn 11:13, 30 Oct 2018 (UTC)

This section is not only useless, it is a mess: the definition of coordinates should appear in section "Definition" (IMO, at the beginning of the section, as the first definition of a basis); the isomorphism with F^n should appear elsewhere; and the formula of basis change is not given. If I get time, I'll fix these issues. D.Lazard (talk) 12:30, 30 October 2018 (UTC)
Done. In fact, to fix the mess, I had to rewrite a large part of the article. I hope that the section is now more useful. D.Lazard (talk) 09:29, 2 November 2018 (UTC)

Ordered bases (again)

I would like to find some info about the ordering status of a basis with respect to transformations. I do not know whether there is something to say beyond the existence of two equivalence classes, but I assume that these things are relevant for the concept of orientation and derived notions ("ccw = +"?). May I ask for some clarifications in this direction? Purgy (talk) 12:57, 1 November 2018 (UTC)

It is true that the orientation of a Euclidean space depends on the ordering of a basis. An odd permutation of the basis elements changes the orientation of the space. This is clear from the definition of the cross product. Once an orientation has been chosen, the orientation of any ordered basis is given by the sign of the scalar triple product of the basis elements, and a transformation preserves the orientation if it preserves the sign of the triple product of any (and thus every) ordered basis. However, I am not sure that this article is the right place for explaining this, and I do not see how to explain this here while keeping the same level of technicality.
By the way, in the previous version of the article, I was somehow uncomfortable with the emphasis on the basis ordering. In fact, this is a concept that, maybe, appears in teaching, but does not appear explicitly in higher mathematics. In practice, the basis ordering is hidden behind the concept of coordinate vector. This is my reason for changing the heading, and this was my guide for rewriting the section by emphasizing coordinates.
Also, a difficulty in rewriting this section is that I have not found any WP article with a clear definition of the structure of vector space on the set of n-tuples of elements of the field F. Even for the definition as a set, the notation F^n is not defined in Tuple, nor in Direct product, and it is not easy to find the redirect Exponentiation over sets. Any ideas for improving this aspect of WP? D.Lazard (talk) 14:16, 1 November 2018 (UTC)
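
To partly answer the original question in a way that does not depend on the cross or triple product (standard material, sketched here without any claim that it belongs in the article): if B = (v1, ..., vn) and B′ = (w1, ..., wn) are two ordered bases of a real vector space and A = (aij) is the change-of-basis matrix defined by

 wj = Σi aij vi,

then B and B′ are said to have the same orientation exactly when det A > 0. This splits the ordered bases into the two equivalence classes mentioned above; swapping two vectors of a basis is an odd permutation and multiplies the determinant by −1, so it switches the class. Likewise an invertible linear map preserves orientation exactly when its determinant is positive, which in the plane is the "ccw = +" convention.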

Change of basis section is confusing

Please could someone figure out what the Change of Basis section is trying to say, and rewrite it so that it doesn't use such confusing variable letters?

1. What is the point of introducing Bo and Bn, whose subscripts look like zero and n (where n is already used for the number of dimensions)? The writer realized this was confusing and added a lengthy sentence to explain these subscripts. But then B is never used again (though "n" is); instead the narrative spells out "new basis" and "old basis" (obscuring the connection to vi and wi). It would completely remove the confusion to just write Bold and Bnew, and use them consistently.

2. The vector v is introduced, with components xi (old basis) and yi (new basis). But the components vi are already introduced as the components of the old basis itself. Specifically, the reader must follow that vi are not the components of v! This is gratuitously confusing.

3. "The formula for changing the coordinates with respect to the other basis" is introduced after it was already used in the transformation of v from old coordinates to new. Not sure why it's introduced in that order. Gwideman (talk) 22:14, 28 May 2019 (UTC)

I have changed the subscripts o and n into old and new. I have also changed v into x to clarify that vi (which is a vector) is not a component of v. Finally, I have moved the proof after the statement of the result. On the other hand, it seems that you confuse the elements of a basis, which are vectors (here vi and wi), with the components of a vector on a basis (here xi and yi). This may explain your impression that the formula is used before being stated, when the reality is that it was proved (not used) before being stated. D.Lazard (talk) 08:49, 29 May 2019 (UTC)
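
For readers of this archive, here is the formula the thread is about, written out in the notation used just above (old basis vi, new basis wi, old coordinates xi, new coordinates yi); this is simply the standard change-of-basis result, stated here for convenience:

 wj = Σi aij vi                    (new basis vectors expressed in the old basis)
 x  = Σi xi vi = Σj yj wj          (one vector, two coordinate descriptions)
 xi = Σj aij yj                    (hence: old coordinates in terms of the new ones)

In matrix form, xold = A ynew, where A is the change-of-basis matrix whose j-th column holds the old coordinates of the new basis vector wj.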