Wikipedia:Peer review/Vector space/archive1

Vector space

This peer review discussion has been closed.
This article has recently passed its Good Article nomination. I think it is in reasonable shape, and I'd like to get broader input, especially on the accessibility, balance, and completeness of the article ahead of a possible FA nomination.

Thanks for the review, Jakob.scholbach (talk) 14:36, 31 December 2008 (UTC)

Lead section

We should mention subtraction (or negation) among the operations that can be performed on vectors. While it may be mathematically covered by scaling, that is not the intuitive use of the word. The history really could use some inline references. For example, being an analyst I like the idea that "Later enhancements of the theory are due to the widespread presence of vector spaces in mathematical analysis, mainly in the guise of function spaces." But without a reference this sounds like POV. Thenub314 (talk) 14:03, 2 January 2009 (UTC)
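A minimal sketch of the point above, assuming only the vector addition and scalar multiplication already discussed in the article:

\[
  -w := (-1)\cdot w, \qquad v - w := v + (-1)\cdot w,
\]

so subtraction is indeed covered by scaling and addition, even though most readers would not think of it that way.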

In the first section on examples, where we mention F and Fn, we should mention that the connection between them comes from taking n = 1, and maybe mention dimension. We talk about dimension more formally later, but we have already mentioned it when looking at R2. Thenub314 (talk) 14:13, 2 January 2009 (UTC)

Thank you! Your points, except for the reference (I'll do that soon), are now covered. (I chose to put the subtraction in the first section rather than the lead, just for space considerations.) Jakob.scholbach (talk) 22:31, 7 January 2009 (UTC)
Vector spaces with additional structure - comments by Uncia (talk · contribs)
  • order: comparing the vectors componentwise - there are several ways to do this (see Ordered vector space); a little more detail would be helpful.
  • Hm. In view of the limited length of the article, and given how little importance this has for the topic, I would rather not include lexicographic order etc.
  • Normed vector spaces and inner product spaces: We never explain that 〈 · | · 〉 means the inner product
  • Fixed.
  • Normed vector spaces and inner product spaces: The Minkowski (or Lorentz) inner product is not an inner product in the usual sense because it is not positive definite (see the sketch after this list) - do you want to bring up this inner product in this discussion?
  • Good point. I'll ponder that.
  • Banach spaces - the Lp definition is not quite correct - the elements of the space are equivalence classes of functions, not individual functions (see the sketch after this list). It's probably not worth making a big deal over; just mention that "functions that agree except on a set of measure zero are considered to be the same".
  • This is in a footnote (nb 9). Do you mean it should be moved up?
  • Hilbert spaces - The given L2 inner product is the "mathematical physics" definition; the "mathematics" definition has the conjugates flipped, that is, the complex conjugate is taken on the second argument rather than the first (see the sketch after this list). See Lp_space#Special_cases.
  • Right.
  • Hilbert spaces - the intro is confusing because it mixes together several different kinds of approximation. I suggest that the message be split into two parts: (1) it is handy to be able to approximate a general function by members of a set of more special (and often much nicer) functions; (2) in Hilbert space we can extend the idea of a basis to approximate general vectors by members of a set of more special (and often much nicer) vectors.
  • I've reworded that a bit.
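Regarding the Minkowski form mentioned above, a minimal sketch, assuming the (−,+,+,+) signature convention (the opposite sign convention is equally common):

\[
  \langle x, y\rangle_M = -x_0 y_0 + x_1 y_1 + x_2 y_2 + x_3 y_3,
\]

so, for example, \(\langle x, x\rangle_M = -1\) for \(x = (1,0,0,0)\), which is why the form is not positive definite and hence not an inner product in the usual sense.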
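On the Lp point above, a minimal sketch of the equivalence-class construction, assuming the standard measure-theoretic setting:

\[
  \|f\|_{L^p} = \left(\int |f|^p \, d\mu\right)^{1/p}, \qquad f \sim g \iff \mu\{x : f(x) \neq g(x)\} = 0,
\]

so the elements of \(L^p\) are the classes under \(\sim\); without this identification \(\|\cdot\|_{L^p}\) is only a seminorm, since \(\|f\|_{L^p} = 0\) does not force \(f = 0\) pointwise.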
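And on the two L2 conventions above, a sketch of the difference (the exact formula used in the article at the time is not reproduced here):

\[
  \langle f, g\rangle_{\mathrm{math}} = \int f\,\overline{g}\,dx, \qquad \langle f \mid g\rangle_{\mathrm{phys}} = \int \overline{f}\,g\,dx,
\]

i.e. the two definitions differ only in which argument carries the complex conjugate.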
Miscellaneous comments by Uncia (talk · contribs)
  • Lede - I think it would be useful to mention the Euclidean vector, which is the vector of physics and engineering. Probably more readers are familiar with these vectors than with the mathematical vector, and these vectors were a historical step in developing vector spaces. You would probably also want to mention William Rowan Hamilton, Oliver Heaviside, and Josiah Willard Gibbs, who developed this theory.
  • Euclidean vectors done. As for the history, I have to confess my knowledge is terribly sparse. Do you know a bit about that? I'd appreciate it if we could collaborate on that. Or perhaps you can recommend a book?
  • I would argue against over-emphasizing the vectors of Hamilton, Heaviside, and Gibbs. Modern vector spaces were developed almost entirely independently of the notions of vectors rooted in Euclidean geometry, first by Hermann Grassmann and then by Giuseppe Peano and Alfred North Whitehead. See, for instance, the historical note to Bourbaki's Algebra. siℓℓy rabbit (talk) 03:40, 8 January 2009 (UTC)
  • Motivation and definition - The introduction seems awkward in a couple of aspects:
    • We mention (without defining) dimension - dimension actually plays a fairly small role in vector spaces (other than the distinction between the finite and infinite dimensional cases), so I'm not sure it should appear so early.
    • We seem to be hinting at the idea of a basis, without defining it and without explaining its significance. This "motivation" is unmotivated! Maybe we really want to talk about projections onto vectors in preferred directions, which is what the figure seems to be about.
  • I don't want to write about bases or dimension at that point, simply because its purpose is to motivate, and allude to, what comes next. I think the word "dimension" is familiar enough to a general reader (even one who does not know the definition at all) that we can use it there. Does this make sense?
  • History - "The founding leg of the definition of vectors" - I don't understand the term "founding leg" - I don't think this is an expression in English. Can it be explained in other terms?
  • OK. I'm not a native speaker...
  • Applications - Fourier expansion - The Stone–Weierstrass theorem is applied incorrectly. That theorem implies that a continuous function on an interval is a uniform limit of trigonometric polynomials. However, it does not guarantee that you can simply add more terms, or that the coefficients stay fixed, as you approach the limit. The Fourier expansion comes from approximating in the L2 norm, approximates any L2 function (not just continuous ones), and does guarantee that the coefficients are fixed (see the sketch after this list).
  • OK, I have reworded that.
  • Applications - Fourier expansion - the term Discrete Fourier Transform is probably better here than Fast Fourier Transform, as the latter is a technique for calculating the former (see the sketch after this list). Also, fast integer multiplication is not the main application of the DFT or FFT, but just an interesting sideline; the wide-ranging field of Digital Signal Processing contains most of the applications of DFTs.
  • OK. Do you want to add some words about the applications of the DFT? Otherwise I'll try later. I would not remove the fast integer multiplication; that's pretty important from a computational point of view, AFAIK.
  • I have expanded the section a good bit (and changed its name to Fourier analysis). It may be too long now and wander too far from vector spaces, but I think it now gives a good view of the importance of this application. --Uncia (talk) 17:42, 11 January 2009 (UTC)
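To make the fixed-coefficient point above concrete, a minimal sketch of the L2 expansion on [0, 2π], assuming the standard normalization:

\[
  c_n = \frac{1}{2\pi}\int_0^{2\pi} f(x)\,e^{-inx}\,dx, \qquad S_N f = \sum_{n=-N}^{N} c_n e^{inx},
\]

and \(\|f - S_N f\|_{L^2} \to 0\) for every \(f\) in \(L^2\); each \(c_n\) is computed once and never changes as \(N\) grows, in contrast with the uniform approximations supplied by Stone–Weierstrass.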
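And a sketch of the DFT/FFT distinction above, assuming the standard definition of the transform:

\[
  X_k = \sum_{n=0}^{N-1} x_n\, e^{-2\pi i\, kn/N}, \qquad k = 0, \dots, N-1.
\]

Evaluating these sums directly costs O(N^2) operations, whereas the FFT is an algorithm that computes the same N values in O(N log N); that is why "DFT" names the transform itself and "FFT" a method for calculating it.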
Thanks a lot for your review! Jakob.scholbach (talk) 22:31, 7 January 2009 (UTC)

I like the recent expansion of the Fourier section. Maybe we will have to trim it down a bit, but more importantly, the statements have to be referenced. I guess that's easy for you? I think I'll nominate the article for FA then. Jakob.scholbach (talk) 18:02, 11 January 2009 (UTC)

I have added references for all statements in Fourier analysis. --Uncia (talk) 01:06, 13 January 2009 (UTC)

I think there's a place in there for reciprocal spaces, as a subsection of the Fourier analysis section, due to their notability. Headbomb {ταλκκοντριβςWP Physics} 22:40, 13 January 2009 (UTC)