Talk:Vector space/Archive 3


Riemann integral and completeness

In the section "Banach spaces" it is written "If one uses the Riemann integral instead, the space is not complete, which may be seen as a justification for Lebesgue's integration theory. Indeed, the Dirichlet function of the rationals, is the limit of Riemann-integrable functions, but not itself Riemann-integrable.[57]"

The first phrase is OK with me, but the second is not. The Dirichlet function is not an example of an element of the complete Banach space L^p not belonging to the (linear, non-closed) subspace of Riemann integrable functions. Indeed, the Dirichlet function is equivalent to a constant. Any sequence of Riemann integrable functions that converges to the Dirichlet function IN NORM, converges to a constant in norm. Pointwise convergence is not relevant here. Boris Tsirelson (talk) 07:45, 19 November 2008 (UTC)
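A minimal sketch of this point, assuming the usual identification of elements of L^p with equivalence classes of functions agreeing almost everywhere:

  \mathbf{1}_{\mathbb{Q}\cap[0,1]} = 0 \ \text{a.e.}, \qquad
  \big\|\mathbf{1}_{\mathbb{Q}\cap[0,1]}\big\|_{L^p[0,1]}
    = \Big(\int_0^1 \big|\mathbf{1}_{\mathbb{Q}\cap[0,1]}(x)\big|^p\,dx\Big)^{1/p} = 0,

so in L^p the Dirichlet function is the zero vector, which is already Riemann integrable; it therefore cannot witness the incompleteness of the subspace of Riemann-integrable functions.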

In addition: "One example of such a function is the indicator function of the rational numbers, also known as the Dirichlet function." (quote from Dirichlet function) Either say "Dirichlet function", or "indicator function of the rationals", but not "Dirichlet function of the rationals". Boris Tsirelson (talk) 07:50, 19 November 2008 (UTC)

Thanks, Boris. I'm just a stupid guy :( -- I forgot that identification business at that point. Actually the problem is, I did not find a reference for the fact that the Riemann integral yields an incomplete space. It sounds like you might have one? Could you please help out by putting a precise ref at that place? For the moment I simply removed the wrong statement, which also solves your second point. Jakob.scholbach (talk) 08:58, 19 November 2008 (UTC)
A function is Riemann integrable (on a finite interval, I mean) if and only if it is bounded and continuous almost everywhere. The space L^p evidently contains unbounded functions (unless p = ∞), which makes the statement trivial. However, this is a very cheap argument; usually for an unbounded function one uses the improper Riemann integral. It is much more interesting to see a function that fails to be continuous almost everywhere, and cannot be "repaired" by a change on a null set. The indicator of a dense open set of small measure fits. I'll try to find an appropriate reference. Boris Tsirelson (talk) 19:23, 19 November 2008 (UTC)
See Smith-Volterra-Cantor set; its indicator function fits (and its complement is a dense open set not of full measure).
Also, look at this quote: "Many functions in L^2 of Lebesgue measure, being unbounded, cannot be integrated with the classical Riemann integral. So spaces of Riemann integrable functions would not be complete in the L^2 norm, and the orthogonal decomposition would not apply to them. This shows one of the advantages of Lebesgue integration." Richard M. Dudley, "Real analysis and probability", 1989 (see Sect. 5.3, page 125).
For now I do not have anything better; maybe tomorrow... Boris Tsirelson (talk) 20:02, 19 November 2008 (UTC)
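A condensed version of the indicator example above, assuming Lebesgue's criterion (a bounded function on [0,1] is Riemann integrable iff its set of discontinuities is Lebesgue-null):

  A \subset [0,1] \ \text{a Smith-Volterra-Cantor set: closed, nowhere dense, } \mu(A) > 0,
  \qquad U = [0,1]\setminus A \ \text{open and dense},
  f = \mathbf{1}_U \ \text{is discontinuous exactly on } \partial U = A, \ \text{so } f \ \text{is not Riemann integrable}.

Moreover, if g = f almost everywhere, then every nondegenerate interval I satisfies sup_I g ≥ 1 (since μ(I ∩ U) > 0) and inf_I g ≤ 0 whenever μ(I ∩ A) > 0, so the upper and lower Darboux integrals of g differ by at least μ(A); no change on a null set repairs f, although f lies in L^p[0,1] for every p.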

Distributions

"Distributions" section starts with "A distribution (or generalized function) is a map assigning a number to functions in a given vector space, in a continuous way". First of all the map is linear (and second, continuous). Also, to be continuous (or not), it needs to be defined on a space with a topology, not just a vector space. Also, a continuous linear functional on SOME linear topological (or Hilbert, etc) space is not at all a distribution (or generalized function). Boris Tsirelson (talk) 08:05, 19 November 2008 (UTC)

That should be OK now? As you see, I'm not at all into analysis, so I'd be grateful if you could give the article a thorough scan (in these respects or any other, too). Jakob.scholbach (talk) 09:51, 19 November 2008 (UTC)
Yes, it is OK with me now. Yes, I did scan (maybe, not quite thoroughly). Really, I like the article. Boris Tsirelson (talk) 19:25, 19 November 2008 (UTC)

Complex and real vector spaces

Maybe it would be nice to add the following (easy) example illustrating how the dimension of a space depends also on the field over which the vector space is defined:

The complex numbers over the complex field and R^2 over the field of real numbers have dimensions 1 and 2 respectively.

I can add this example tomorrow (but not now; it's past my bedtime).

Topology Expert (talk) 13:18, 21 November 2008 (UTC)

I have now written a word about C over R. Jakob.scholbach (talk) 21:44, 26 November 2008 (UTC)
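A worked version of the dimension count proposed above:

  \dim_{\mathbb{C}}\mathbb{C} = 1 \ (\text{basis } \{1\}), \qquad
  \dim_{\mathbb{R}}\mathbb{C} = \dim_{\mathbb{R}}\mathbb{R}^2 = 2 \ (\text{bases } \{1, i\} \text{ and } \{(1,0),(0,1)\}),

since 1 and i are linearly independent over R (a + bi = 0 with real a, b forces a = b = 0), while over C they are dependent (i \cdot 1 + (-1)\cdot i = 0).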

Topological aspects of the article

The article is great but there are some problems with the topological part of the article. For instance, 'more generally, the Grassmannian manifold consists of linear subspaces of higher (fixed) dimension n' is mathematically incorrect. In general, the collection of all such subspaces need not be a manifold (Banach manifold perhaps if restrictions on the vector space are imposed but not a Euclidean manifold). I have added a bit of information on tangent bundles but a little more could be added.

Also, if the article discusses applications of vector spaces to topology, why not include something on Banach manifolds? They are very important (in my opinion) and since they are related to 'Grassmannians for arbitrary Banach spaces', it may be useful to include something about it.

Topology Expert (talk) 18:40, 3 December 2008 (UTC)

I have added "finite-dimensional" to the projective space discussion (which also sets the stage for the Grassmannian). As for your other additions: I think the discussion of parallelizability leads us astray, so I have trimmed it down a bit. (The material would be an addition to tangent bundle or tangent space, but we have to stay very focussed here). Banach manifolds? Hm, currently we don't really talk about "usual" manifolds (and I don't yet see why we should). What particular application do you have in mind? Jakob.scholbach (talk) 19:00, 3 December 2008 (UTC)

Category of vector spaces

Perhaps a brief summary of this?

Topology Expert (talk) 19:23, 3 December 2008 (UTC)

Well, what particularly do you mean? The category of vector spaces is mentioned. Of the particular properties, we could mention semisimplicity perhaps. Jakob.scholbach (talk) 21:04, 3 December 2008 (UTC)
Maybe not; I did not know that there was an article on that. I will add that to the 'see also' section.

Topology Expert (talk) 07:31, 4 December 2008 (UTC)

Minor details

The article is coming along but there are still a few minor facts that should be added here and there. For instance, there was no mention of what an eigenspace is or what it means for a map between vector spaces to be 'orientation preserving'. I will try to add as much as I can (minor facts) but since I can't spot all minor facts it would be helpful if other editors helped. As I mentioned, the 'topology sections' could be improved but I can do that.

Topology Expert (talk) 09:28, 4 December 2008 (UTC)

Should be a good article

In my opinion, the article should be a good article (I don't understand why it is not a featured article but I can take User:Jakob.scholbach's word on that). It has over 100 references (even for the trivial statements) and basically anything I can think of related to vector spaces is included in the article (in all branches of mathematics). Maybe there are a few minor details that the article is missing out on, but those would probably be required at the featured article nomination.

Topology Expert (talk) 09:45, 4 December 2008 (UTC)

Manifolds and tangent spaces

The section on manifolds contains the following sentence:

"It (the tangent space) can be calculated by the partial derivatives of equations defining M inside the ambiant space."

There are many things wrong with this sentence (besides the misspelling of ambient). First of all, it suggests that all manifolds have some ambient space in which they are embedded. This is a popular intuitive misconception that we definitely don't want to advertise on Wikipedia. (This misconception is a great obstruction to people understanding the curving of spacetime in general relativity.) Of course, defining the tangent space in an intrinsic way is notoriously abstract, and I see that we don't want to talk about germ_(mathematics) in this article. But even if you accept an embedding space for the manifold, this sentence makes very little sense. You can either take partial derivatives of the embedding function to find the tangent space (although that seems awkward in this context, because for an embedding you'd first need to define an abstract manifold), or you can linearize the equations defining the manifold (i.e. x^2 + y^2 = 1 for a circle) around a point on the manifold to find the tangent space at that point. The latter clearly involves partial derivatives, but I certainly wouldn't describe it as calculating by the partial derivatives of the equations. I'm not sure how to fix this in an easy way, since most of the solutions involve going into detail on a subject that is increasingly off-topic for this article. Does anybody see an elegant, concise way to rephrase this, so that it remains understandable for most people? (TimothyRias (talk) 10:33, 4 December 2008 (UTC))
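For concreteness, a minimal worked instance of the linearization described above, using the circle example x^2 + y^2 = 1 and assuming 0 is a regular value of the defining function:

  F(x,y) = x^2 + y^2 - 1, \qquad M = F^{-1}(0), \qquad p = (x_0, y_0) \in M,
  T_pM = \ker dF_p = \{(u,v) \in \mathbb{R}^2 : 2x_0 u + 2y_0 v = 0\},

a one-dimensional subspace of R^2; at p = (1,0) it is spanned by (0,1). This is the "linearize the defining equations" reading of the disputed sentence, and it only makes sense for manifolds presented inside an ambient space.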

Yeah. We definitely don't want to talk about non-embedded manifolds! This is just to give some idea what this is good for. But we could just remove that sentence. Jakob.scholbach (talk) 13:05, 4 December 2008 (UTC)
I agree that that was an error (I hate restrictive mathematics which only considers subspaces of Euclidean space, but I am sure that in the next century maths will not be like that anymore). The topology part of the article does need some work before it can go to GA (everything else is fine except for the occasional mistake, such as allowing the zero vector to be an eigenvector, which someone corrected recently).

Topology Expert (talk) 08:16, 5 December 2008 (UTC)

By the way, it should be mentioned that differential topology is not about "calculating" partial derivatives; it is more about checking relations between differential manifolds (as Cantor once said: mathematics is not about studying objects but rather the relations between them).

Topology Expert (talk) 08:18, 5 December 2008 (UTC)

Recent edits

As the collaborative aspect of WP gains speed, which is cool, I take the opportunity to point out some ideas I have about writing a good article, exemplified by some recent edits. My ideas have been shaped by FAC discussions, the manual of style and so on. I don't want to be imposing, but am just trying to save time for all of us.

  • Typesetting is something which requires care, e.g. '''R<sup>2</sup>''' (which bolds the exponent too) should be '''R'''<sup>2</sup> (bold R, non-bold exponent).
  • Italics are used only for top-level notions, or to emphasize things: "The determinant of a square matrix can also tell us whether the corresponding linear transformation is orientation preserving or not." I feel orientation preserving is neither of the two.
  • Talking about "us" and "we" should be avoided.
  • Please try to keep the structure of sections etc., if you agree with it. E.g. adding another example (cross product to the Lie algebra thread) should be close to the examples already given. If you want to reorganize things (in this case, put examples up first), look at what other changes this necessitates (in this case, moving the other example up).
  • Wherever {{Main}} templates are used, the corresponding subarticle should not be wikilinked again, to avoid overlinking. Also, main templates should link to only very important related articles (which cross product is not, IMO).
  • The "see also" section should not repeat items already covered in the text. Jakob.scholbach (talk) 13:09, 4 December 2008 (UTC)
Sorry if I was not following some of those conventions; I will try to in future. But if I ever miss a convention, feel free to either correct it or revert (I will try my best not to miss one; I know it is hard to follow my edits like that, but hopefully at least 90% of them should be alright).

Topology Expert (talk) 08:12, 5 December 2008 (UTC)

Thanks for cleaning it up.

Topology Expert (talk) 13:10, 5 December 2008 (UTC)

The "see also" section should not repeat items already covered in the text.
Personally, I do not agree with this convention for several reasons:
  1. A reader may not read a particular section where a topic is Wikilinked. Often only the first occurrence of a topic is Wikilinked, so a reader of a later section will be unaware that there exists a Wikilink.
  2. I personally find it very handy to be able to scroll down to See also just to see what is out there. If significant subjects are not there, it's a problem.
  3. As an editor, when changing items in an article I often wish to refer to other related articles to be sure of compatibility and not missing items of importance. It is nice to use See also for this purpose, rather than scanning through a long article to find all the embedded links.
For all these reasons, I believe all significant articles should appear in the See also section or be referred to using a {{seealso}} template. Brews ohare (talk) 15:54, 6 December 2008 (UTC)

I disagree with you. We have to distinguish between an article which is under development, i.e. a stub or start class article, and an article that is reasonably complete, such as this one here. When writing a stub, it is very good to put related notions into the s.a. section, just as a replacement for a more thorough treatment.

If interpreted literally, your arguments (all of which are pretty much parallel) would lead to including pretty much every linked notion in the s.a. section, which is useful neither for editors nor readers. It is, sad or not, a fact that a reader will have to read an article if she/he wants to know about it. If you have little time, a glance at the TOC should give you the big points of the topic in question. See also ;) the relevant MOS section. The s.a. section is just for putting peripheral notions whose (minor) importance (to the article in question) does not give them the "right" to deserve a subsection or even a phrase. Jakob.scholbach (talk) 16:07, 6 December 2008 (UTC)

I think the truth is somewhere in between. If there is a prominent "main article" link, it makes no sense to repeat the link in the "see also" section. If there is only an obscure link to a section of another article hidden somewhere in a footnote, this is obviously no reason not to put a link to the article into the "see also" section (if it should be there otherwise). I think we really need to use judgement, weighing the relevance of a link against the prominence with which it already occurs in the article. But I agree that in finished articles the "see also" section is often not needed. --Hans Adler (talk) 16:26, 6 December 2008 (UTC)
Jakob exaggerated my suggestion, which actually states:
I believe all significant articles should appear in the See also section or be referred to using a {{seealso}} template.
That means I'd object to putting "an obscure link to a section" in See also, but favor including "significant articles", unless already in a {{seealso}} or {{main}} template. Of course, who can argue against using judgment? Brews ohare (talk) 17:55, 6 December 2008 (UTC)
I'm sorry, Brews. Somehow I did indeed not see your last line above. Do we agree that category of vector spaces (just as an example) should not reappear in the see also section, or do you think it is significant enough to make it show up again? I guess it's also not that important of an issue. Much more annoying (to me) is that despite my repeated postings at WT:WPM nobody seems inclined to review the article. What can we do about that? Jakob.scholbach (talk) 18:14, 6 December 2008 (UTC)

Hi Jacob: I have no experience with such things. Try asking User_talk:Dicklyon, who I have found to be very helpful. Brews ohare (talk) 20:23, 6 December 2008 (UTC)

Format of See also

It can be a useful discipline in the See also section to use headings to classify various links by subject. Doing this helps the reader and also leads to some useful scrutiny of what is linked, to avoid it becoming a "gee, this is interesting" section. Here is an example from k·p perturbation theory:

See also

Multiple column format is implemented using {{Col-begin}} rather than a myriad of alternatives because many of the alternatives work fine in Firefox, but not in Internet Explorer. Brews ohare (talk) 15:59, 6 December 2008 (UTC)

notes from GeometryGirl (talk)

  • I don't really know how to deal with this sentence: "Another conceptually important point is that elements of vector spaces are not usually expressed as linear combinations of a particular set of vectors". "point" sounds informal and something is wrong with "usually expressed"
  • In the section "linear equations" I would add a small note about annihilators being 'dual' to linear equations
    • What do you mean by that? Jakob.scholbach (talk) 12:55, 7 December 2008 (UTC)
      • Well, taking the example given, if e1, e2, e3 is the standard basis for (R^3)* (the dual of R^3) then the space of solutions is simply the annihilator of W = <e1 + 3e2 + e3, 4e1 + 2e2 + 2e3>. The dual perspective makes it clear why the set of solutions to a set of linear equations is naturally a vector space. GeometryGirl (talk) 14:50, 7 December 2008 (UTC)
        • Hm. I'm not so convinced. That the solutions form a vector space is clear(er) by seeing it as the kernel, right? I personally believe talking about annihilators would lead us a bit astray. We would have to talk about dual space, dual basis, pairing at that point, which is a bit too much. What do others think? Jakob.scholbach (talk) 15:54, 7 December 2008 (UTC)
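For comparison, a sketch of the two descriptions being weighed here, assuming the coefficient rows are read off from the functionals listed above:

  A = \begin{pmatrix} 1 & 3 & 1 \\ 4 & 2 & 2 \end{pmatrix}, \qquad
  \{x \in \mathbb{R}^3 : Ax = 0\} = \ker A
    = \{x : \varphi(x) = 0 \ \text{for all } \varphi \in W\} = W^{\circ},
  \quad W = \langle e^1 + 3e^2 + e^3,\ 4e^1 + 2e^2 + 2e^3\rangle \subset (\mathbb{R}^3)^*,

so the solution set is a subspace either because kernels of linear maps are subspaces or because annihilators of subsets of the dual are subspaces; the two viewpoints describe the same space.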

GA Review

This review is transcluded from Talk:Vector space/GA1. The edit link for this section can be used to add comments to the review.
GA review (see here for criteria)
  1. It is reasonably well written.
    a (prose): b (MoS):
  2. It is factually accurate and verifiable.
    a (references): b (citations to reliable sources): c (OR):
  3. It is broad in its coverage.
    a (major aspects): b (focused):
  4. It follows the neutral point of view policy.
    a (fair representation): b (all significant views):
  5. It is stable.
  6. It contains images, where possible, to illustrate the topic.
    a (tagged and captioned): b lack of images (does not in itself exclude GA): c (non-free images have fair use rationales):
  7. Overall:
    a Pass/Fail:
    • You pass. Congratulations! Ozob (talk) 03:33, 12 December 2008 (UTC)

Here are some specific issues that I'd like fixed before this reaches GA:

  • The lead says "much of their [vector spaces'] theory is of a linear nature"; but I don't think the meaning of "linear nature" will be apparent to people unfamiliar with vector spaces. E.g., someone might not know what a linear combination or linear transformation is.
    • OK. Better now?
      • Yes.
  • In the "Motivation and definition" and definition section, "linear combination" has not yet been defined. It might be better to say that there is no preferred set of numbers for a vector, and to say no more until bases have been introduced.
    • OK.
      • Also good.
  • In the subheading "Field extensions", the description of Q(z) is odd: It sounds like you mean for z to be a transcendental, but you say that z is complex. If z=1 then the field extension is trivial; even if the field extension is non-trivial, it's not unique (square root of 2 vs. cube root of 2). I see below that you do really mean for z to be complex, but perhaps there's a better way to say what you mean.
    • I'm not sure I understand your points. I do mean z to be complex, just for concreteness. ("Another example is Q(z), the smallest field containing the rationals and some complex number z.") What is the problem with z=1 and a trivial extension? What do you mean by "it's not unique"? (I think, for simplicity, the subfield-of-C-definition I'm giving is appropriate at this stage, and yields something unique).
      • I think what bothers me is that you say you are about to give another example (singular) and then proceed to give a family of examples (plural). I've changed the text to try to make this better; is this OK for you? (BTW, I used an α instead of a z because z looks like a transcendental to me. This might have been part of my confusion, too. But change it back if you think having a z is better.)
        • α is fine, but I will have to eliminate the redundancy with the section on dimension later. Jakob.scholbach (talk) 08:07, 10 December 2008 (UTC)
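One concrete member of the family of examples discussed in this item, taking an algebraic α for definiteness:

  \mathbb{Q}(\sqrt{2}) = \{a + b\sqrt{2} : a, b \in \mathbb{Q}\}, \qquad
  \dim_{\mathbb{Q}} \mathbb{Q}(\sqrt{2}) = 2 \ (\text{basis } \{1, \sqrt{2}\}),

while Q(√3) is a different extension of the same degree (√3 ∉ Q(√2)), and Q(α) is infinite-dimensional over Q when α is transcendental; this is why "Q(α)" is really a family of examples rather than a single one.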
  • The article should say very early that abstract vector spaces don't have a notion of an angle or of distance or of nearness. This is confusing for most people.
    • OK. (In the definition section).
  • The bolded expression 〈x | y〉 does not display properly on Safari 3.0.4; the left and right hand angle brackets show up as squares, Safari's usual notation for "I don't have this character". (It works when unbolded, as I found out when I previewed this page.)
    • Yeah, it was weird, there were two types of angle brackets. Can you read them now (there are three occurrences in that section)?
      • Yes.
  • The natural map V → V** is only discussed in the topological setting. It should be discussed in general. (Note that the map is always injective if one considers the algebraic dual (for each v, use the linear functional "project onto v".))
    • Done. (will provide a ref. later) Jakob.scholbach (talk) 12:50, 7 December 2008 (UTC)
      • I now had a look at most of the algebra books listed in the article and none of them, actually, talks about algebraic biduals. So I wonder if this is so important. (I wondered already before). Jakob.scholbach (talk) 21:40, 9 December 2008 (UTC)
        • Hmm! I know that they appear in Halmos's Finite dimensional vector spaces (p. 28, exercise 9). But it seems to me that the best reason for discussing them is the finite-dimensional case: Right now, the article doesn't discuss reflexivity of finite-dimensional vector spaces, a real gap!
  • JPEG uses a discrete cosine transform, not a discrete Fourier transform.
    • OK. Jakob.scholbach (talk) 12:04, 7 December 2008 (UTC)
      • I'm almost ready to say this is a GA; my only outstanding issue is reflexivity of finite dimensional vector spaces. Ozob (talk) 00:36, 10 December 2008 (UTC)
        • What else do you want ("... This [i.e. reflexivity for top. v.sp.] is in contrast to the linear-algebraic bidual, i.e. where no continuity requirements are imposed:... ")? Jakob.scholbach (talk) 15:53, 10 December 2008 (UTC)
          • Well, I'm not sure what's the best way to state it. But I feel like the fact that all finite-dimensional spaces are reflexive is really, really important and needs to be mentioned somewhere. The way you have it now is fine. Ozob (talk) 03:36, 12 December 2008 (UTC)

Here are some other issues which aren't as pressing but which I think you should handle before FA:

  • I'm not sure that likening a basis for a vector space to generators for a group or a basis for a topology will help most readers. Most people who use and need linear algebra have never heard of these.
    • I'm not sure either! I removed it.
  • Since you mention the determinant, it's worth mentioning that it's a construction from multilinear algebra. A sentence or two should suffice.
    • Except for det (f: V → V) being related to Λ^n f: Λ^n V → Λ^n V (which I think should not be touched here), I don't see why the determinant belongs to multilinear algebra. What specifically are you thinking of?
      • That's exactly what I was thinking of. I don't want to make a big deal about that construction, but I do think it's good to mention—it's the right way to think about the determinant, and the only way I can think of which admits generalizations (e.g. to vector bundles). I put a sentence in the article about this.
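A compact statement of the exterior-power description mentioned here, with n = dim V finite:

  \Lambda^n f : \Lambda^n V \to \Lambda^n V, \qquad \dim \Lambda^n V = 1,
  \Lambda^n f\,(v_1 \wedge \cdots \wedge v_n) = f(v_1) \wedge \cdots \wedge f(v_n) = (\det f)\,(v_1 \wedge \cdots \wedge v_n),

so det f is the unique scalar by which f acts on the top exterior power; this is the multilinear-algebra formulation that generalizes, for instance, to determinant line bundles of vector bundles.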
  • It seems that for most of the article, whenever you need an example of a non-abstract vector space, you use solutions to differential equations. I agree wholeheartedly that these are important, but there are probably other good examples out there which shouldn't be slighted.
    • Hm, I also talk about (non-differential) equations, but what else do you have in mind?
      • I'm not sure! I was hoping someone else here would have good ideas.
  • It also seems that you rely on convergence to justify the introduction of other structures such as inner products; but inner products can be (and should be, I think) justified on geometric terms, because they're necessary to define the notion of an angle.
      • OK, you did this.
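The geometric justification referred to in this item, in formula form (for a real inner product space and nonzero x, y):

  \cos\theta = \frac{\langle x, y\rangle}{\|x\|\,\|y\|}, \qquad \|x\| = \sqrt{\langle x, x\rangle},

which is well defined by the Cauchy-Schwarz inequality |⟨x, y⟩| ≤ ‖x‖ ‖y‖; an inner product is exactly the extra structure needed to speak of lengths and angles in an abstract vector space.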
  • It's also worth mentioning the use of vector spaces in representation theory.
    • Done. (Very briefly).
      • Good, that's as much on representation theory as this article needs. Ozob (talk) 00:47, 10 December 2008 (UTC)
  • When writing an integral such as \int f(x)dx, the output looks better if you put a thinspace (a \,) between f and dx: \int f(x)\,dx.
    • OK.
  • Image:Moebiusstrip.png should be an SVG.
    • I tried to convert the image into an svg (Image:Moebiusstrip.svg), but somehow the strip (which was taken from a photo, so png previously) is invisible to me?! Any ideas about that? Jakob.scholbach (talk) 12:50, 7 December 2008 (UTC)
I'll try to make a SVG picture of a moebius strip later today. (TimothyRias (talk) 10:58, 8 December 2008 (UTC))

Ozob (talk) 02:40, 7 December 2008 (UTC)

Thanks very much, Ozob, for your review! Jakob.scholbach (talk) 12:04, 7 December 2008 (UTC)

I concur with the comments above; I have the following comment to make on tensor products, which I would like to see addressed before GA status:

The description of the tensor product as it stands is too vague (such as "mimicking bilinearity"). It would be better to first give the universal property of the tensor product of V and W as the unique vector space E together with a bilinear map V × W → E, with the universal property of expressing all bilinear maps from V × W to a vector space F as linear maps from E to F. Then one could state that a space with these properties does exist, and outline the construction. Similarly the adjoint property of the tensor product with respect to Hom is too vague. To control article size, one could consider leaving that out as the tensor product article is wikilinked; otherwise one should definitely point out that the tensor product is a (bi-) functor. As for extension (and restriction) of scalars (tensoring with an extension field of the base field), that could be treated, but then again functoriality of the tensor product would be natural to include. Perhaps effective use of summary style could help keep the amount of material here still manageable. Stca74 (talk) 10:45, 7 December 2008 (UTC)
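For reference, the universal property being described, in its usual formulation:

  \otimes : V \times W \to V \otimes W \ \text{bilinear}, \qquad
  \text{for every bilinear } b : V \times W \to F \ \text{there is a unique linear } \tilde{b} : V \otimes W \to F
  \ \text{with } b(v, w) = \tilde{b}(v \otimes w),

and this property determines V ⊗ W up to unique isomorphism; the explicit construction (the free vector space on V × W modulo the bilinearity relations) then only needs to be exhibited once to show existence.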

Thank you too, Stca, for your review: I have trimmed down the tensor product discussion a bit, but also made it more concrete. I think doing the universal property thing properly (i.e. with explanation) is too long and also a bit too complicated (even uninteresting?) for general folks, so should be deferred to the subpage. As for the isomorphism: I don't know why I called this adjunction isomorphism, since it is effectively both adjunction and reflexivity of f.d. spaces. Anyhow, this comment was just to put tensors in line with scalars, vectors and matrices, but I would not go into functoriality etc. Jakob.scholbach (talk) 15:12, 7 December 2008 (UTC)
Looks more precise now. However, I would still consider adding the universal property (perhaps somewhat informally, at least) - at least for me it is the only way to make sense of the construction, which otherwise risks being just a tangle of formulas. As for the last few lines after the representation of Hom as the tensor product of the dual of the domain with the target, I'm not sure if I can follow (or expect others to follow). Actually, the canonical map goes in general from the tensor product into the Hom space and is injective. It is bijective if one of the spaces is finite dimensional. Thus, if you insist, you get an interpretation of a tensor (element of the tensor product) as a matrix, but not really a tensor as a generalisation of a matrix (following the scalar, vector, matrix list). Stca74 (talk) 20:06, 7 December 2008 (UTC)
OK, I scrapped the sketched ladder of "tensority". Also the universal property should be fine now. Jakob.scholbach (talk) 21:40, 9 December 2008 (UTC)
Unfortunately, the tensor product section now has a problem: It doesn't define "bilinear", so it doesn't make a lot of sense. The previous version was better in this respect because it was only hand-waving, so the reader didn't expect to understand; but now that the article is more precise, the lack of definition of "bilinear" is a problem. I'm not really sure what to do here; if one defines "bilinear" then one should give an example, but the simplest example is the dot product, which is later in the article. And being vague, as Stca74 noted, is no solution either. It might be good to introduce the dot product here and then reintroduce it later in the inner product section; the second time you'd point out that it's positive definite. (Also, the inner product section currently calls the Minkowski form an "inner product" even though it's not positive definite. I know that in physics, "inner product" doesn't mean positive definite, but it certainly does in math. This deserves a remark somewhere, I think.) Ozob (talk) 00:55, 10 December 2008 (UTC)
(<-) Well, the bilinearity is certainly no problem. I mentioned this now. The problem is more: how to create a little subsection that is inviting enough to guide the reader to the subarticle. When I learnt this, I kept wondering "what is this u.pr. all about?" I only got it after learning about fiber products of (affine) schemes, but we certainly cannot put that up! Jakob.scholbach (talk) 08:07, 10 December 2008 (UTC)
Oof, that's a tough way of figuring it out! (Not that I did better!) I agree, this is a tough thing to work out. It'll have to be done before FA, though (if that's where you want to take the article). The only really elementary context I can think of where they turn up is bilinear forms. It might be best to have a section on bilinear forms first (which would mention inner products and the Minkowski metric and link to the article on signature) and then use those to justify tensor products: "Tensor products let us talk about bilinear maps, which you now know to be wonderful, in terms of linear maps, which you also know to be wonderful." That would require reorganizing the article a little, but I don't see a good other solution. Ozob (talk) 03:45, 12 December 2008 (UTC)
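A minimal version of the bilinear-forms route sketched above, taking the dot product as the running example:

  b : V \times V \to \mathbb{R} \ \text{is bilinear if it is linear in each argument separately;}
  \text{e.g. } b(x, y) = x \cdot y = \textstyle\sum_i x_i y_i \ \text{on } \mathbb{R}^n \ (\text{positive definite}),
  \text{or, in one sign convention, the Minkowski form } \eta(x, y) = x_1y_1 + x_2y_2 + x_3y_3 - x_4y_4 \ (\text{not positive definite}),

which would let the article introduce the dot product early, flag the positive-definiteness distinction, and then present the tensor product as the device turning bilinear maps into linear ones.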

I will try to review each section one by one and add comments. But just something User:Ozob said:

  • The article should say very early that abstract vector spaces don't have a notion of an angle or of distance or of nearness. This is confusing for most people.

Maybe you should not emphasize this (nor should you write that they do have these structures), because you can equip vector spaces with a norm (for distance and nearness) or an inner product (for angles), and I am quite sure that most of the mathematics done on vector spaces studies these structures on them (such as Banach space theory or Riemannian geometry). So perhaps, if the sentence is kept, it should be accompanied by an explanation that vector spaces can still be equipped with these structures, since they are indeed very important in mathematics.

Topology Expert (talk) 17:17, 7 December 2008 (UTC)

Careful with generalisations: any absolute value on a finite field is improper (|x| = 1 for all non-zero x) and thus there are no interesting norms to put on vector spaces over finite fields. And while all norms on finite-dimensional real vector spaces are equivalent, there are still no canonical norms nor inner products. I do agree with Ozob's view that it makes sense to warn readers about this potentially counterintuitive fact. Stca74 (talk) 20:06, 7 December 2008 (UTC)
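A one-line check of the finite-field claim, assuming only the multiplicativity |xy| = |x||y| and |1| = 1 from the definition of an absolute value:

  x \in \mathbb{F}_q^{\times} \ \Rightarrow\ x^{q-1} = 1 \ \Rightarrow\ |x|^{q-1} = |x^{q-1}| = 1 \ \Rightarrow\ |x| = 1,

since |x| is a positive real number; hence every absolute value on a finite field is trivial, and there are no interesting norms on vector spaces over finite fields.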

Topological vector spaces and biduality

I'm afraid the discussion on biduals (discussed already above during GA nomination) in the topological context is a bit too inaccurate as it stands, and contains claims which only hold with additional hypotheses.

First, the definition of the bidual is incomplete unless the topology of the dual is specified - there is in general no preferred topology on the (topological) dual. The bidual E′′ is the (topological) dual of the strong dual of E. The theory is normally developed only in the context of locally convex spaces, for which it indeed follows from Hahn-Banach that the canonical mapping of E into its bidual is injective precisely when E is Hausdorff. Next, it is possible to define semireflexive spaces as the ones for which this canonical map is bijective, without specifying a topology on the bidual. Reflexivity refers to the canonical map being an isomorphism when the bidual is given the strong topology (with respect to the strong topology of the dual of E). For normed spaces semireflexive and reflexive are equivalent.
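A compact restatement of the maps and definitions in the preceding paragraph, assuming E is a Hausdorff locally convex space:

  E' = \text{continuous dual of } E, \qquad E'_b = E' \ \text{with the strong topology}, \qquad E'' = (E'_b)',
  J_E : E \to E'', \qquad J_E(x)(f) = f(x) \quad (f \in E'),

with E semireflexive when J_E is bijective, and reflexive when J_E is moreover an isomorphism of topological vector spaces onto E'' carrying its strong topology. In the purely algebraic setting the analogous evaluation map V → V** is always injective, and bijective exactly when V is finite-dimensional, which is the simpler statement suggested for the article.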

As the above suggests, the concept of duality for general (locally convex) spaces is not entirely straightforward, and it would be better to avoid introducing too much of the theory in this article, which is primarily about the algebraic theory of vector spaces. While it could in principle be feasible to discuss biduals more easily for normed vector spaces, I would rather agree with Ozob's comments in the GA discussion and treat biduals and reflexivity in the purely algebraic setting here. It would then be possible to point the reader to the relevant articles on topological vector spaces for the related concept in that context. For a very clear discussion on biduals in the algebraic context (also for general modules, not only vector spaces) see e.g. Bourbaki, Algebra, Ch II. Stca74 (talk) 20:39, 12 December 2008 (UTC)

OK. First, what is written certainly has to be correct and precise, so this has to be amended. From what I know, though, I cannot see why algebraic biduality is so important, or even more important than topological biduality. The algebraic statement is a triviality, whereas the topological one is not at all. Also, I think, the article should not give more weight to algebraic aspects than to functional analysis etc. So, some concrete questions to everybody:
  • What makes algebraic biduality so important? (I don't have the Bourbaki at hand right now, but I'm suspecting it does not tell too much about its importance).
  • Is it right to think of the strong topology on the dual as the "most natural one"? Jakob.scholbach (talk) 09:44, 13 December 2008 (UTC)
I don't know enough functional analysis to answer your questions, in particular I wouldn't know what the strong topology on the dual is in the general setting. I do think though that (everything else being equal) algebraic concepts should be stressed in this article, because "vector space" is an algebraic concept. But why do we talk about biduals at all? Isn't it a bit far removed?
The paragraph does make a nice point that for topological vector spaces, you want to talk about the continuous dual instead of the algebraic dual. That's worth keeping. But I'm not so sure about biduals. In what context do you want to introduce them? If you talk about locally convex vector spaces, you'll have to add yet more definitions. As mentioned before, it's easier to talk about it in the context of Banach spaces. But still I think it's a bit too far removed to be in this article. There is no Banach space theory in here, nor should there be.
What do you think about adding instead an example of a topological vector space that is not a Banach space? The easiest example I know is C^∞. -- Jitse Niesen (talk) 16:33, 13 December 2008 (UTC)
Strong operator topology says "It (the s.o.t.) is more natural too, since it is simply the topology of pointwise convergence for an operator." I'm not experienced enough either to say whether biduals are crucial, but reflexivity seems to be somewhat important. What say the functional analysts in the house? Jakob.scholbach (talk) 16:49, 13 December 2008 (UTC)
As for the non-Banach example, we mention the noncompleteness of Riemann integrable functions. I prefer this over C^∞ since it shows the superiority of Lebesgue, whose influence is all over the place in these matters, right? Jakob.scholbach (talk) 16:51, 13 December 2008 (UTC)

(←) On the importance of the algebraic bidual: while the proofs of the statements about biduality are indeed quite trivial for vector spaces, one is dealing with a special case of a much deeper algebraic concept (and one which is important even where the proofs are easy). The precisely same concept is already non-trivial for modules over rings more general than fields. In that context the canonical map to the bidual is injective for projective modules and bijective for finitely generated projective modules. Via the well-known correspondence, (finitely generated) projective modules have an interpretation as (finite-rank) vector bundles (rightly introduced as generalisations of vector spaces in the article), both in differential geometry over the rings of smooth functions, and in algebraic geometry for general commutative rings. In the somewhat more general set-up of coherent sheaves, biduality and reflexivity are common issues to consider in the practice of algebraic geometry. Not surprisingly, the same occurs in homological algebra, where double duals of cohomology spaces, modules and sheaves are a very common occurrence. Eventually this leads to biduality in the context of derived categories as a crucial component of Grothendieck's "six operations" formalism for the very important generalisations of Poincaré / Serre -type dualities. The theory of D-modules deserves a related mention here too. Finally, one can mention the role the canonical mapping into the bidual played in the formulation of natural transformations in category theory. Hence, all considered, I do think the (algebraic) bidual deserves to be introduced briefly here, as a simple incarnation of a truly important and fruitful concept.

On "natural" topologies to define on the topological dual E ' of a topological vector space E, I suppose one could argue that for normed spaces the strong topology is the most "natural": it is just the familiar norm topology where the norm of a functional f is the supremum of the absolute values of f(x) where x ranges over the unit ball (or sphere) in E. The canonical mapping EE ' ' is then always injection. However, from another viewpoint a "natural" topology T on the dual E ' would have the property that the natural map φ: EE '* to the algebraic dual of the topological dual (defined by the duality pairing E × E ' → R) were a bijection to the topological dual of E ' equipped with the topology T (i.e., T compatible with the duality). Now (under the necessary assumption that φ is injective) all topologies between the weak topology (weakest) and the Mackey topology (strongest) satisfy that condition. However, in general the strong topology of the dual is stronger than the Mackey topology; for the strong topology to be compatible with the duality (and hence equal to the Mackey topology) is precisely the condition of E being semi-reflexive. For normed spaces reflexive is equivalent to semireflexive, which shows that there is a clear viewpoint from which (for example) the weak topology of the dual of a normed space is "more natural" than the strong topology, at least when E is not reflexive. This would be the case for example for L1-spaces. But again, all of the above digression I think mainly helps to show why the topological reflexivity is best left to articles on topological vector spaces and functional analysis. Stca74 (talk) 18:53, 13 December 2008 (UTC)

Huh! Since I cannot cite this talk page ;-( I decided to trim down the presentation somewhat and moved the algebraic biduality statement up to the algebraic dual. I left the Hahn-Banach theorem but without referring to the bidual. Jakob.scholbach (talk) 19:50, 13 December 2008 (UTC)

Tangent space edits

this edit removed some content as per "removing inaccuracies". What exactly did you mean by that, Silly rabit? I'm inclined to revert that change (it removed references, a rough description what the tangent space is, Lie algebras vs. Lie groups). Jakob.scholbach (talk) 11:27, 3 January 2009 (UTC)

The parts of the paragraph I removed were a bit confused and overstated the importance of the tangent space itself. In particular, the vector flow does not go from the tangent space to the manifold: perhaps what was meant was the exponential map? I don't know. Also, the tangent space of a Lie group is quite ordinary, and doesn't "reveal" anything special about the Lie group. It is given the structure of a Lie algebra in a natural manner, but that is something extra. Another example in the same vein: does the tangent space of a Riemannian manifold reveal something special about the manifold? No, it is the metric which does that. siℓℓy rabbit (talk) 14:42, 3 January 2009 (UTC)
I sort of agree with you, but I like to play devil's advocate. Doesn't the Lie algebra determine the Lie group? (up to the connected component of the identity?) It has been a long time since I studied Lie groups and their algebras, but I do seem to remember something along these lines. Thenub314 (talk) 14:51, 3 January 2009 (UTC)
More or less yes. Anyway, I have added the statement about Lie groups and their Lie algebras to a more conceptually appropriate place in the article that will hopefully satisfy Jacob's objection. siℓℓy rabbit (talk) 15:06, 3 January 2009 (UTC)
OK. Probably I was indeed too sloppy. Just one point: the statement "The tangent space is the best approximation to a surface" is unclear, to me and probably more so for a reader who does not yet know about the t.sp. What exactly does "best" mean? Jakob.scholbach (talk) 22:44, 3 January 2009 (UTC)
A good point. I have added an additional link to the sentence in question to linearization, and an additional content note defining precisely what is meant by "best" in this context, together with a reference. siℓℓy rabbit (talk) 02:52, 4 January 2009 (UTC)
Good, thanks. Jakob.scholbach (talk) 19:19, 4 January 2009 (UTC)

Minor changes

I would like to make the following minor changes to the lead sentence and paragraph.

I would like to change the lead sentence to

A vector space is a mathematical structure formed by a collection of objects, called vectors, that may be added, subtracted, and scaled.
This may not be necessary. I hadn't noticed it was put in the first section until now; I kind of like it better in the lead, but it would not make me unhappy if it stays the way it is. Sorry, I should read more carefully before I write. Thenub314 (talk) 08:53, 8 January 2009 (UTC)

Also, in the second sentence I think "Euclidean vectors" was better as just "vectors", because we follow it shortly after with the phrase "Euclidean space," and it seems like one too many Euclideans for this sentence. I suggest we replace Euclidean vectors with vectors (this is still linked to the same article), and link plane to an appropriate article to make clear we mean the Euclidean plane. (Perhaps Plane (geometry)?)

What do people think about this? Thenub314 (talk) 08:49, 8 January 2009 (UTC)

Lead section image

Does anybody have a good idea what image could illustrate the vector space idea? The current image is pretty crappy, I think, for it conveys basically nothing. Jakob.scholbach (talk) 18:12, 11 January 2009 (UTC)

Added a better image with a better description. Please have a look. PST
Well, I think that one is only a little better than the previous one. BTW, please sign your posts at talk pages! Jakob.scholbach (talk) 19:28, 12 January 2009 (UTC)
Also, the new caption is not great, since in many vector spaces there is no, or at least no preferred, inner product; therefore "closer to" some vector is meaningless. Jakob.scholbach (talk) 19:37, 12 January 2009 (UTC)
Yes, I just intended that to be a rough idea (I kind of felt uncomfortable when writing that caption). Hopefully someone will get a better image soon (the current images at commons are not very good, so someone will probably have to upload one). I am quite happy to improve this article in the near future (but perhaps I should discuss here before I edit, because I want to make sure that my edits are appropriate). --Point-set topologist (talk) 20:37, 12 January 2009 (UTC)
How about a drawing of the parallelogram law for adding and subtracting vectors? That's the cover illustration for Sheldon Axler's Linear Algebra Done Right. --Uncia (talk) 16:01, 15 January 2009 (UTC)
That's an idea. I'll try merging this illustration with a flag (0-, 1-, and 2-diml subspace of R^3) tonight, unless somebody else is up to it... Jakob.scholbach (talk) 16:34, 15 January 2009 (UTC)
How about this one? Jakob.scholbach (talk) 21:47, 15 January 2009 (UTC)
I like the picture. There are a couple of points about the caption that I thought were not clear: (1) the gray square is not actually the vector space, because the vector space extends to infinity in all directions; (2) The label 0 is used but not explained; maybe we could add "the zero vector 0 belongs to all subspaces". --Uncia (talk) 22:45, 15 January 2009 (UTC)

Although the image is much better than before, I am not perfectly satisfied. It has one error (mentioned above) not to mention that it looks a bit messy (and hard to follow). But I think that the image is temporarily good enough. PST 09:16, 16 January 2009 (UTC)

This is certainly the second best article I have seen in Wikipedia

If this goes for FA, I would be quite pleased to support. However, I am a little worried regarding the issue of 'range of content'. Vector spaces have so many applications everywhere (heaps in mathematics) and I don't think that the current article describes all of these applications. This may be because it is not supposed to; if so, this is just a suggestion. More importantly, the sections regarding topology have to be cleaned up. I see that we should not go off topic, so that has to be done carefully (but note that the section on tangent bundles here is, in my view, better than the article :)). PST (--Point-set topologist (talk) 09:57, 15 January 2009 (UTC))