Talk:Linear algebra

From Wikipedia, the free encyclopedia
WikiProject Mathematics (Rated C-class, Top-importance)
This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of Mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
Mathematics rating: C-Class, Top-importance. Field: Algebra
A vital article.
One of the 500 most frequently viewed mathematics articles.
Wikipedia Version 1.0 Editorial Team / v0.5 / Vital
This article has been reviewed by the Version 1.0 Editorial Team and has been selected for Version 0.5 and subsequent release versions of Wikipedia.
Former good article Linear algebra was one of the Mathematics good articles, but it has been removed from the list. There are suggestions below for improving the article to meet the good article criteria. Once these issues have been addressed, the article can be renominated. Editors may also seek a reassessment of the decision if they believe there was a mistake.
Good article reassessment, June 16, 2006: Delisted

Misleading statement[edit]

The statement from the article:

"Since vectors, as n-tuples, are ordered lists of n components, it is possible to summarize and manipulate data efficiently in this framework. For example, in economics, one can create and use, say, 8-dimensional vectors or 8-tuples to represent the Gross National Product of 8 countries. One can decide to display the GNP of 8 countries for a particular year, where the countries' order is specified, for example, (United States, United Kingdom, France, Germany, Spain, India, Japan, Australia), by using a vector (v1, v2, v3, v4, v5, v6, v7, v8) where each country's GNP is in its respective position."

is misleading and incorrect. There is a big difference between a tuple and a vector. The tuple of GNP values of 8 countries does not behave like a vector. For example, how would it behave under a linear transformation? What are its basis vectors?
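For what it's worth, the objection can be made concrete in a few lines of NumPy (the GNP figures below are made up purely for illustration): componentwise operations are harmless bookkeeping, but applying a generic linear map, which is the defining operation of linear algebra, produces entries that mix all the countries' GNPs, and that operation has no economic meaning.

```python
import numpy as np

# Hypothetical GNP figures for 8 countries (made-up numbers, in the
# country order quoted from the article).
gnp = np.array([21.0, 3.1, 2.9, 4.2, 1.4, 3.4, 5.0, 1.7])

# Componentwise operations are unproblematic bookkeeping:
doubled = 2 * gnp                   # every GNP doubles
assert np.allclose(doubled, gnp * 2)

# But a generic (almost surely invertible) linear map mixes the components:
rng = np.random.default_rng(0)
T = rng.standard_normal((8, 8))
mixed = T @ gnp
# Each entry of `mixed` is a weighted sum of *all* countries' GNPs,
# which has no obvious economic meaning -- the point of the objection.
```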

It would improve this article if this statement were removed.

206.169.236.122 20:02, 6 February 2007 (UTC)

What else?[edit]

So what other stuff has the structure of a linear space but has elements that are not real or complex numbers?

You can have a space consisting of, say, all continuous functions or all polynomials. In the polynomial case, however, P_n is isomorphic to R^(n+1) (P_n being the space of all polynomials of degree at most n).

Veddan (talk) 10:25, 24 March 2008 (UTC)

Consider the space C([0,1],R) of all continuous real-valued functions on the closed interval [0,1]. This is a vector space, since linear combinations of continuous functions are continuous. The vectors in C([0,1],R) do not have "elements" in the same way n-tuples of real or complex numbers do. Also, the fact that polynomials of arbitrarily high degree exist in the space means that it is not finite-dimensional.

However, one way to unify the two ideas is to think about n-tuples as functions from the finite set {1,2,...,n} to R. Now the finite-dimensional vector space R^n can be seen as a space of functions whose domain is finite, whereas the infinite-dimensional vector space C([0,1],R) is a space of functions whose domain is infinite.
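The "vectors are functions" viewpoint described above can be sketched directly in code (a toy illustration only): an n-tuple is a function on {0,...,n-1}, a member of C([0,1],R) is a function on [0,1], and pointwise operations give both exactly the same linear structure.

```python
import math

# Pointwise addition and scalar multiplication: the linear structure
# shared by tuples-as-functions and continuous functions.
def add(f, g):
    return lambda x: f(x) + g(x)

def scale(c, f):
    return lambda x: c * f(x)

# The "tuple" (1.0, 2.0, 3.0) as a function on the finite set {0, 1, 2}:
t = {0: 1.0, 1: 2.0, 2: 3.0}.get

# Continuous functions on [0, 1]:
f = math.sin
g = math.exp

h = add(scale(2.0, f), g)     # the linear combination 2*sin + exp
assert abs(h(0.5) - (2.0 * math.sin(0.5) + math.exp(0.5))) < 1e-12

s = add(t, t)                 # tuple addition, componentwise
assert s(1) == 4.0
```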

tobilehman (talk) 19:58, 21 December 2011 (UTC)

Wrong statement[edit]

"However, it has few, if any, applications in the natural sciences and the social sciences, and is rarely used except in esoteric mathematical disciplines."

This is just plain wrong. Linear algebra is used in both the natural and social sciences. Physics and Chemistry are obvious. Biology uses matrices and all that malarkey when looking at coupled ODEs. Social sciences use them in some stats work and in ODEs/PDEs. Anywho, the above statement is misleading and should be removed.--137.205.132.41 10:20, 16 January 2007 (UTC)

Finite fields[edit]

In computational number theory you sometimes get people doing linear algebra on matrices made out of integers modulo a prime. Often the prime is 2, but larger ones are also used.

My guess is the elements have to be from a ring or maybe a field. Anyway, something with a group operation on the whole set, another group operation on the set minus the identity of the first group, and a distributive law relating the two operations.

Anything to do with finite fields? --Damian Yerrick
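As a sketch of what this looks like in practice (illustrative code, not from any particular library), here is Gaussian elimination over the integers modulo a prime p, the setting described above; the pivot inverses exist precisely because p is prime.

```python
# A minimal sketch of linear algebra over a finite field: row reduction
# modulo a prime p, as used in computational number theory.
def solve_mod_p(A, b, p):
    """Solve A x = b over Z/pZ (p prime); A must be square and
    invertible mod p for this sketch."""
    n = len(A)
    # Augmented matrix with entries reduced mod p.
    M = [[A[i][j] % p for j in range(n)] + [b[i] % p] for i in range(n)]
    for col in range(n):
        # Find a pivot (nonzero mod p) and swap it up.
        piv = next(r for r in range(col, n) if M[r][col] % p != 0)
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], -1, p)   # inverse exists since p is prime
        M[col] = [(inv * v) % p for v in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(M[r][j] - f * M[col][j]) % p for j in range(n + 1)]
    return [M[i][n] for i in range(n)]

# Over GF(2): x + y = 1, y = 1  gives  x = 0, y = 1.
assert solve_mod_p([[1, 1], [0, 1]], [1, 1], 2) == [0, 1]
```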

Fields and rings[edit]

You can do linear algebra over any field. If you work over rings instead, the analogous structures are called modules. Modules share many of the properties of vector spaces, but certain important basic facts are no longer true (the term dimension doesn't make much sense anymore, as bases may not have the same cardinality). --Seb

Striking (wrong) example[edit]

Quoted from the main page:

A vector space, as a purely abstract concept about which we prove theorems, is part of abstract algebra, and well integrated into this field. Some striking examples of this are the group of invertible linear maps or matrices,

This is truly a striking example :-)

Toby Bartels and I are going to correct this and I think we're also going to write about linear algebra over a rig (algebra) (this is not a typo!). -- Miguel

Linear algebraists, please help[edit]

The derivation of the maximum-likelihood estimator of the covariance matrix of a multivariate normal distribution is perhaps surprisingly subtle and elegant, involving the spectral theorem of linear algebra and the fact that it is sometimes better to view a scalar as the trace of a 1×1 matrix than as a mere scalar. See estimation of covariance matrices. Please help contribute a "linear algebraists' POV" to that article. Michael Hardy 20:20, 10 Sep 2004 (UTC)

A similar request (this time from a non-mathematician). I plan to introduce a proof from linear algebra into the arbitrage pricing theory article. Firstly, I would like to tighten up the wording such that it is acceptable; secondly, I would like to link the argument to the appropriate linear algebra theorem. Hope that's do-able - and thanks if it is. Basically, this is how the derivation there usually goes (where the generic vectors below have a financial meaning): "If the fact that (1) a vector is orthogonal to n-1 vectors implies that (2) it is also orthogonal to an nth vector, then (3) this nth vector can be formed as a linear combination of the other n-1 vectors." Fintor 13:38, 23 October 2006 (UTC)
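The linear-algebra fact behind the quoted derivation can be checked numerically (a toy example in R^3 with NumPy; the vectors and coefficients below are made up for illustration): w lies in the span of v1, v2 exactly when w is orthogonal to the orthogonal complement of that span.

```python
import numpy as np

# Claim: if every vector orthogonal to v1, ..., v_{n-1} is also orthogonal
# to w, then w is a linear combination of v1, ..., v_{n-1}.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
w = 2.0 * v1 - 3.0 * v2            # in span(v1, v2) by construction

# In R^3 the orthogonal complement of span(v1, v2) is spanned by v1 x v2:
n = np.cross(v1, v2)
assert abs(n @ v1) < 1e-12 and abs(n @ v2) < 1e-12

# w is orthogonal to the complement, so it must lie in the span...
assert abs(n @ w) < 1e-12

# ...and least squares indeed recovers the coefficients (2, -3):
coef, *_ = np.linalg.lstsq(np.column_stack([v1, v2]), w, rcond=None)
assert np.allclose(coef, [2.0, -3.0])
```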

How did Hamilton name vectors?[edit]

Quote from the article: "In 1843, William Rowan Hamilton (from whom the term vector stems) discovered the quaternions."

Huh? I didn't find the answer on a quick perusal of the William Rowan Hamilton article either. I didn't see it in quaternions either. It sounds like an interesting story, but what (or where) is the story? Spalding 18:25, Oct 4, 2004 (UTC)

Well, it's there now. Lowellian (talk)[[]] 23:05, Oct 7, 2004 (UTC)

Useful Theorems of linear algebra[edit]

The statement about definite and semi-definite matrices is not correct as stated. Matrices should be assumed to be symmetric. Moreover, this is slightly off-topic: it is part of bilinear algebra rather than linear algebra.

The statement "A non-zero matrix A with n rows and n columns is non-singular if there exists a matrix B that satisfies AB = BA = I, where I is the identity matrix" is much more a definition than a theorem.

In my opinion, the main non-trivial result of linear algebra says that the Dimension of a vector space is well defined: Theorem: If a vector space has two bases, then they have the same cardinality.

A matrix is orthogonally diagonalizable if and only if it is normal (please check and edit!!!) —Preceding unsigned comment added by Niv.sarig (talkcontribs) 23:06, 15 December 2009 (UTC)

Clearly, a matrix with only positive eigenvalues is not necessarily positive definite (nor positive semi-definite for non-negative eigenvalues). These should be erased. (A counterexample is M = [2 0; 5 1], with eigenvalues 1, 2 > 0, and the vector v = [1; -1], for which the quadratic form is v^T M v = -2 < 0.) —Preceding unsigned comment added by Niv.sarig (talkcontribs) 20:46, 2 October 2010 (UTC)
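The counterexample above is easy to verify numerically (the catch is that M is not symmetric, so positive eigenvalues do not force positive definiteness):

```python
import numpy as np

# Check the counterexample: M has only positive eigenvalues but is
# not positive definite, because M is not symmetric.
M = np.array([[2.0, 0.0],
              [5.0, 1.0]])
v = np.array([1.0, -1.0])

eigvals = np.linalg.eigvals(M)
assert np.all(eigvals > 0)      # eigenvalues are 1 and 2

quad = v @ M @ v                # the quadratic form v^T M v
assert quad == -2.0             # negative, so M is not positive definite
```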

Equivalent statements for square matrices[edit]

This is not an elegant section: it feels slightly like a dumping ground for a bunch of facts. Did anyone else have the same feeling? (Yiliu60)

Yes. Jitse Niesen (talk) 6 July 2005 09:21 (UTC)

Vote for new external link[edit]

Here is my site with linear algebra example problems. Someone please put this link in the external links section if you think it's helpful and relevant. Tbsmith

http://www.exampleproblems.com/wiki/index.php/Linear_Algebra

GA Promotion[edit]

Hi everyone,

I promoted this article, but I do feel that this is a borderline good article, as it is an extremely brief article for such a large branch of mathematics.

Cedars 07:50, 23 April 2006 (UTC)

I've delisted as a good article, because
  • It is not broad in its coverage: for such an important branch of mathematics it is too brief
  • No examples
--Salix alba (talk) 10:57, 16 June 2006 (UTC)

Finite dimensions[edit]

I assume the part about "systems of linear equations in finite dimensions" was intended to distinguish the subject of linear algebra from, say, functional analysis. However, the distinction lies not in the number of dimensions, but in whether the linear structure is studied as a thing in itself (as opposed to being studied in the context of a topology). In other words, pure vector spaces are the province of linear algebra, while topological vector spaces are the province of functional analysis. Thus, even infinite-dimensional linear phenomena, if studied from a purely algebraic standpoint, are technically part of linear algebra.--Komponisto 21:36, 26 July 2006 (UTC)

Clarify 'over a field'[edit]

I think it would be helpful if someone clarified the meaning of "over a field" from the first sentence of the fourth paragraph in the 'Elementary Introduction' section. The sentence reads as follows: 'A vector space is defined over a field, such as the field of real numbers or the field of complex numbers.' -- —The preceding unsigned comment was added by DrEricH (talkcontribs) .

That's a fair point. I reformulated it so that it does not use the phrase "over a field" anymore. -- Jitse Niesen (talk) 02:12, 16 August 2006 (UTC)

Plagiarism?[edit]

The passage:

For small systems, ad hoc methods are sufficient. Larger systems require one to have more systematic methods. The modern day approach can be seen 2,000 years ago in a Chinese text, the Nine Chapters on the Mathematical Art (simplified Chinese: 九章算术; traditional Chinese: 九章算術; pinyin: Jiǔzhāng Suànshù).

Is very similar to:

For small systems, ad hoc methods certainly suffice. Larger systems, however, require more systematic methods. The approach generally used today was beautifully explained 2,000 years ago in a Chinese text, the Nine Chapters on the Mathematical Art (Jiuzhang Suanshu, 九章算術).

Which is taken from Linear Algebra with Applications Third Edition by Otto Bretscher.

To me, it sounded a bit too similar to the original text.

Just something I noticed.

Thanks for your message. I agree that the similarity is too much to be coincidence. The first two paragraphs in the "History" section were added in a single edit, so they are both suspect. I thus deleted them. -- Jitse Niesen (talk) 12:05, 23 January 2007 (UTC)

Choice about choice[edit]

A number of editors of linear algebra/vector space articles are uncomfortable about making statements which rely on the axiom of choice without mentioning it. To some extent, I share their unease (or to put it more light-heartedly, "You say every vector space has a basis? Great! Now give me a well-ordering of the real numbers - it might come in handy..."). On the other hand, these are articles about linear algebra, so it is a pity to constantly distract the reader with digressions into logic and set theory. So we have a choice (fortunately a finite choice!): do we mention choice or not? I know this has been discussed on a few talk pages before, but my recent edit of this article suggests a compromise: footnoting references to choice. I imagine the use of footnotes may polarize opinion, but it might be a sensible way forward in this case, so let me know what you think. And I'll make a few other similar edits to stimulate the discussion :) Geometry guy 21:42, 13 February 2007 (UTC)

(Hmmm... I'm quite proud of that split infinitive.) I've footnoted the choices in Dual space. One obvious question that arises is whether it would be better to have just one choice-related note with all the relevant caveats, or several. I'd be inclined to put them together to avoid repetition. Geometry guy 22:18, 13 February 2007 (UTC)

seems silly to avoid altogether making statements that require the axiom of choice in linear algebra articles. however, so long as the topic belongs in linear algebra proper, probably good to discuss the finite dimensional case first if possible. it should be explicitly stated when the axiom of choice is needed. i agree that doing so via footnotes, as you've done in the dual space article, is a good idea. (even better if they are expanded a bit). Mct mht 15:17, 14 February 2007 (UTC)
I actually agree entirely, but wanted to invite opinion. (Also I have a slight preference for avoiding choice when it is not needed. For example, a lot of the claims about dual spaces do hold without choice for duals of vector spaces with bases. For another example, I prefer the statement "does not have a countable basis" to the statement "has an uncountable basis".) Anyway, I'm glad you like the footnotes idea, and would be happy for them to be expanded. Ultimately there might be a place for an article on "choice in linear algebra". Geometry guy 23:22, 14 February 2007 (UTC)
a separate article that collects relevant linear algebraic results and delineates when the axiom of choice is needed and not needed looks like the best solution. let's hope someone will take that up. :-) Mct mht 02:52, 15 February 2007 (UTC)

Some unfavorable impressions[edit]

I have to (regretfully) state that for a "top importance" article, this one is remarkably incoherent. Problems are manifold, but just for starters, there is the issue of consistency within the article itself and in wikipedia in general.

  • If linear algebra studies systems of linear equations, as the introduction states, then how come Gauss is not even mentioned in the history section?
  • Moreover, since according to the article on abstract algebra, linear algebra is its proper part, it would seem circular to state that linear algebra is widely used in both abstract algebra and ... On the other hand, applications of linear algebra to differential equations are not even mentioned (and no, it's not covered by a reference to functional analysis).
  • Why is matrix theory not referenced at all? One would hope it's not because we cannot explain the difference between it and linear algebra!
  • The History section seems particularly weak. As the article on matrices discusses, they were introduced in ancient times and used throughout the Middle Ages. Of course, Gauss's work at the beginning of the 19th century is very relevant for the development of linear algebra, but so is, for example, Laplace's work before, and Cauchy's after, neither of which is mentioned. Arthur Cayley only introduced notation for determinants and abstract matrices; it's hardly proper to credit him with the invention of linear algebra! In fact, there are two articles ([1], [2]) on the history of linear algebra in the MacTutor History of Mathematics Archive, which, while not complete, nonetheless make me think that it's better to scrap the current history section altogether as simplistic and factually wrong, and rewrite it anew.
  • The section Elementary introduction is a weird mixture, an ad hoc explanation of vector spaces (and as someone has already commented above, a tuple is not at all a representative object for linear algebra as a discipline), with matrices, determinants, and the general idea of linearity interspersed.
  • All but the very first of the Useful theorems deal with matrices; would it not be more natural to put them into the article on matrix theory?

And the list goes on, and on, and on. Arcfrk 15:01, 19 March 2007 (UTC)

Although the commentary here is a bit harsh (and some of these issues are easily fixed, for example by referring to applications of linear algebra in other areas of abstract algebra, for instance using representation theory), I do agree with the substance of the criticisms. This really is one of the most fundamental articles in pure mathematics, and I think we have a real opportunity here to expand and enhance it. Geometry guy 22:24, 19 March 2007 (UTC)

Rewrite?[edit]

How about rewriting it? I mean, really. Readable articles for basic mathematics topics shouldn't be _too_ hard for us, should they? I tried turning the intro into grammatically correct English; as for content, however, I came here to learn, and my linear algebra, history thereof, etc., is still weak. Please help! User:x14n 10th-ish Oct. 2007


I'm hopefully gonna spend some time in the next few days reworking some pieces of this article. As is, it really is a complete mess. A couple of thoughts that spring to mind:

-There should be a definition of vector space, some substantial mention of module theory and a couple of comments about why vector spaces and modules are different and why they are the same.

-The example given about the GNP of 8 countries is misleading in its triviality. Vectors are much more than just lists of numbers. Towards the end of explaining what linear algebra is and what a vector space really is, this article should have some well developed heuristic explanations of the concept of "linearity".

-I think the section that just lists important theorems should be trashed. It is completely unenlightening to just list off a bunch of results that all involve technical concepts, none of which have been defined.

It's late for me and these comments might be a little bit vague, but please respond. I'll try to realize some of this stuff when I've had some sleep. Jrdodge 08:31, 11 November 2007 (UTC)

A matrix is invertible if and only if its determinant is nonzero[edit]

This is inaccurate; for instance, a matrix over the integers modulo 4 with a determinant of 2 would not be invertible.

Was I being too pedantic for a Wikipedia article?

Maybe it should read "A matrix is invertible if and only if its determinant is nonzero (but see Invertible)" ?

I was taught this exact statement in school, and it cost me time and effort when I started trying to work with matrices over rings other than the integers or the reals. I'd rather not see anyone else misled by this implicit assumption. 91.84.221.238 (talk) 02:24, 15 January 2008 (UTC)

From the structure of the article, I interpreted the assumption of the previous section (that scalars come from a field) as carrying over to the subsequent section ("Some useful theorems"). The next section ("Generalisations and related topics") discusses matrices over other algebraic objects. Myasuda (talk) 02:52, 15 January 2008 (UTC)
I tried to fix this in a way that would not reduce the accessibility of the page, by replacing "non-zero" with "has an inverse", and then pointing out what this implies for real/complex or for integer entries, without mentioning that this holds over any commutative ring. MvH (talk) 18:19, 9 April 2014 (UTC)MvH
PS. There are many students taking abstract algebra that make the mistake non-zero det --> invertible, long after the point where they should know better! MvH (talk) 18:22, 9 April 2014 (UTC)MvH
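The point under discussion can be checked by brute force (illustrative code; the matrices below are chosen just for the example). Over a commutative ring a matrix is invertible iff its determinant is a unit of the ring, so determinant 2 fails mod 4, while determinant 3 works since 3 * 3 = 9 = 1 (mod 4).

```python
from itertools import product

def mat_mul_mod(A, B, m):
    """Multiply two 2x2 matrices with entries reduced mod m."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) % m for j in range(2)]
            for i in range(2)]

def has_inverse_mod(A, m):
    """Brute-force search for a two-sided inverse of a 2x2 matrix mod m."""
    I = [[1, 0], [0, 1]]
    for entries in product(range(m), repeat=4):
        B = [list(entries[:2]), list(entries[2:])]
        if mat_mul_mod(A, B, m) == I and mat_mul_mod(B, A, m) == I:
            return True
    return False

A = [[2, 0], [0, 1]]      # det = 2, not a unit mod 4: no inverse exists
assert not has_inverse_mod(A, 4)

C = [[3, 0], [0, 1]]      # det = 3, a unit mod 4: an inverse exists
assert has_inverse_mod(C, 4)
```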

Chinese Linear Algebra side note[edit]

I was reading my linear algebra book for class (Otto Bretscher's Linear Algebra with Applications, 3rd edition. Upper Saddle River, New Jersey: Pearson Education, 2005) when I came across something interesting on page 8: "When mathematicians in ancient China had to solve a system of simultaneous linear equations such as

 3x + 21y - 3z =  0
-6x -  2y -  z = 62
 2x -  3y + 8z = 32

they took all the numbers involved in this system and arranged them in a rectangular pattern (Fang Cheng in Chinese), as follows:

 3  21  -3   0
-6  -2  -1  62
 4   9   2  32

All the information about this system is conveniently stored in this array of numbers. The entries were represented by counting rods; [...] the equations were then solved in a hands-on fashion, by manipulating the rods." I did some googling and found out that Fang Cheng is Chapter 8 of a book called Nine Chapters on the Mathematical Art, which shows how almost 2000 years ago the Chinese had a method similar to Gaussian elimination for solving linear equations, even though they didn't call it linear algebra. I thought it would be an interesting side note to add to the history section. The link about the book is here Nine Chapters on wiki and here Nine Chapters on google books —Preceding unsigned comment added by 128.61.43.160 (talk) 18:05, 12 June 2008 (UTC)
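For the curious, the rod manipulation being described is just modern Gaussian elimination, and NumPy's solver does the same thing internally (a sketch, using the coefficients of the system of equations as transcribed above):

```python
import numpy as np

# The Fang Cheng method: store the coefficients in a rectangular array
# and eliminate.  NumPy's solve() performs LU (Gaussian) elimination.
A = np.array([[ 3.0, 21.0, -3.0],
              [-6.0, -2.0, -1.0],
              [ 2.0, -3.0,  8.0]])
b = np.array([0.0, 62.0, 32.0])

x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)    # the computed (x, y, z) satisfies all rows
```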

Boy, does this page need changes[edit]

I agree with people elsewhere on this page who feel that this entry is in need of major rewriting and changing. I have just made a dent in this. I began by tweaking the "history" section slightly (to treat the history of linear algebra as synonymous with the history of "abstract", i.e. post-1850s, linear algebra is inaccurate) and removing a good chunk of stuff about quaternions. (Although it is certainly related to linear algebra, so are vectors (which predate quaternions by centuries), and it seems unusual to give so much attention to quaternions here.)

The statement that the use of Cramer's rule (which dates to the 1700s, mind you, not the 1850s) to solve partial differential equations led to the introduction of linear algebra to the math curriculum is, to this reader, laughable, and as it was not sourced in a way that makes sense to me I took it out. Copson's quote seems to have much more bearing on the difference in education between two universities, one English and one Scottish, in the beginning of the 19th century, than it does on anything specific to linear algebra. (In any case, the quoted portion of Copson does not say anything about the role of Cramer's rule in partial differential equations.)

Frankly, the article in its current form reads like a mish-mash of submissions by beginning students of linear algebra, well-intentioned people relating the observations of third parties (e.g. footnotes in general science works, or introductory paragraphs in introductory textbooks), and people too inclined to include abstract technical detail that is probably better left to more specific entries than "linear algebra". In my very humble opinion. 75.167.204.90 (talk) 05:08, 7 July 2009 (UTC)

Changes[edit]

I've started a revamping of the article, which was long overdue. In the first installment, I've replaced a rambling and incoherent section Elementary introduction with a synopsis of the first few chapters in a standard linear algebra text. The section on history needs to go, too: there is no excuse for having such poor-quality material, especially since a very good historical account is contained in the article on vector spaces (and, perhaps, elsewhere on wikipedia). It seems unwise to fork the content, especially from the maintenance point of view. I also feel that the list of results doesn't add much, but if anyone has ideas about how to incorporate some of them into the narrative, please share them here or implement them yourself. Arcfrk (talk) 05:07, 9 February 2010 (UTC)

The article[edit]

This article must speak to the non-mathematical reader and to students. Mathematicians already know linear algebra and do not need to read about it on Wikipedia. Every line of the article should pass two tests. It should be mathematically accurate. And it should be readable by someone who is not a mathematician. The later parts of this article can address students who already know the material in the earlier parts of the article, but there should not be anything in the article that you need a Ph.D. in mathematics to read.

I've watched this article over the years swing back and forth between extremes. Sometimes it is oversimplified, sometimes too technical. I would like to see it at least move out of the start class, and I think there are currently some editors here working toward the same goal.

Rick Norwood (talk) 16:48, 9 February 2010 (UTC)

Turning now to the picture, I find both the picture and the caption confusing. Were I a beginning reader, I might think that all subspaces were lines through the origin. And the three-dimensional effect is not clear: I'm not sure where the colored planes intersect. Also, vector subspaces are fundamental to the study of vector spaces, but more important in linear algebra, I think, is the use of a matrix to transform one vector into another. Does anyone have a picture showing this? Rick Norwood (talk) 17:06, 9 February 2010 (UTC)
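Short of a picture, the transformation viewpoint is easy to illustrate in code (a toy example with NumPy; the rotation and vectors are chosen just for the demonstration): a single matrix sends each vector to another, and linearity is what lets one matrix describe the whole map.

```python
import numpy as np

# A matrix as a transformation: a 90-degree rotation of the plane.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

e1 = np.array([1.0, 0.0])
assert np.allclose(R @ e1, [0.0, 1.0])   # e1 rotates onto e2

# Linearity is why the single matrix R describes the entire map:
u, v = np.array([2.0, 1.0]), np.array([-1.0, 3.0])
assert np.allclose(R @ (u + v), R @ u + R @ v)
```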

Main Structures + Matrices[edit]

A "Matrix" isn't listed in the section on main structures, yet references to matrices are all over this page, especially in the "most useful theorems" section. —Preceding unsigned comment added by 65.50.39.118 (talk) 05:39, 7 September 2010 (UTC)

anachronism[edit]

The statement "any claim that the concepts of linear algebra were known to mathematicians prior to the end of the nineteenth century is inaccurate, an instance of the historical error of anachronism." seems strange. Hermann Grassmann's The theory of Linear Extension (1834) seems to deal with linear algebra. 128.240.229.7 (talk) 07:35, 21 January 2011 (UTC) Niko

If we have a source that confirms that this work indeed deals with linear algebra, then we can add the sentence "... although Hermann Grassmann's The theory of Linear Extension (1834) deals with linear algebra." But perhaps we need to be wp:BOLD and just remove that sentence, since (1) it sounds like a declaration (i.e. unencyclopedic), (2) it is not sourced, and (3) I think it is very unlikely that a source will ever be found. So I have removed the assertion. If someone has a good source for it, it can of course gladly be reinserted. DVdm (talk) 19:18, 21 January 2011 (UTC)

Feb 2011/ Introduction Re-write[edit]

Hi, I tried to re-write the introduction but someone reversed my changes. I think defining linear algebra as a branch of mathematics that studies vectors is not quite correct and also kind of circular. I'll re-write again if some people have suggestions where what I wrote wasn't clear, but I don't feel like writing everything out again only to have it deleted. Loadedsalt (talk) 22:35, 18 February 2011 (UTC)

Clearly, I agree that linear algebra does not study vectors, and I have just changed it. Your re-write, however, was a bit too radical for a single change and, perhaps, hard for non-experts. Let's see if my revision survives longer :) 2andrewknyazev (talk) 00:55, 19 February 2011 (UTC)

I don't have a problem with the recent edits. A larger concern is that the article needs a substantial expansion. Someone needs to write a section on solving linear systems and a section on applications, for starters. As the article improves, the lead will generally need to be rewritten anyway. Sławomir Biały (talk) 13:06, 19 February 2011 (UTC)

Agenda for editing page[edit]

Hi all,

I have been editing this page lately, and here are some of my ideas on how to improve the page. Please feel free to act on these ideas, comment on them, reject them, and add your own thoughts!

1) In the scopes of study section, elaborate on determinants and inner product spaces, since they are important concepts. I am not sure if we should, but we could add some information on Hermitian and normal operators and the fact that they are diagonalizable and have an orthonormal basis of eigenvectors.

2) In the applications section, elaborate on the solution of linear equations. I do not know if it is best to introduce this application through an example or through theory. I also don't know if we should use the augmented matrix notation or the equation notation that we are currently using; my inclination is that the augmented matrix notation might be cleaner and might connect better with the rest of the article.

3) Add the section on best-fit lines. Personally, I am not too familiar with this subject, so it might be better if someone else writes that section, but if no one will do it, I could relearn that material and write it up.

4) Add more applications? There are so many, so it is debatable how many we should include.

5) Expand the history section. The history of mathematics is really interesting, and when I read Wikipedia articles on math concepts, I like to read the history section.

6) Flesh out the generalization section. Maybe we should elaborate on which linear algebra theorems remain true in module theory and which become false, or put into symbols the concept of multilinear algebra. We should not write a whole exposé of the subject, but link the subject to linear algebra.

7) Maybe mention its role in a mathematical education? That it is often used as a bridge to abstract math?

Best,

Majesty of Knowledge (talk) 22:46, 26 January 2012 (UTC)


I think that:

  • I agree with #1 above; canonical forms should also be mentioned.
  • History should wait until the mathematics is described - there are already lots of details available, and the history is not as important here for understanding as it is in analysis.
  • The section on quantum mechanics should be dropped, since it requires more advanced mathematics (e.g. probability theory, partial differential equations, Hilbert spaces).
  • A straightforward unifying theme for the examples should involve a calculation from analytic geometry (say, the intersection of two planes).
  • We should avoid piling up applications and rather try to offer a more intuitive explanation of what "linear" means.
  • The module theory invariant theorems should be outlined under module theory.
  • I think that some results on multilinear algebra should go in this article.

OrenBochman (talk) 00:26, 19 March 2012 (UTC)

Non-linear algebra[edit]

Non-linear algebra redirects to this page! Really!? That's absurd... someone should write a new page on the general results of non-linear algebra. Here's a good text: http://arxiv.org/pdf/hep-th/0609022v4.pdf — Preceding unsigned comment added by 99.149.190.128 (talk) 19:44, 6 May 2012 (UTC)

I thought that was strange, too. Maybe it should redirect to Nonlinear system for the time being? I don't want to change it myself, as I know nothing about the topic and am not sure which page is more closely related. 138.16.18.24 (talk) 16:10, 17 April 2014 (UTC)

Dense diagonalizables[edit]

The following comment was removed from the eigenvectors and eigenvalues section today:

but diagonalizable matrices form a dense subset of all matrices.

Perhaps true, but without a reference, and inserted into introductory material, the comment is out of place. The topic requires a topology on matrices and an indication of density, beyond the scope of this article. Rgdboer (talk) 22:16, 7 October 2012 (UTC)
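For the record, the removed claim is easy to illustrate numerically (a sketch only; as usually stated, density of the diagonalizable matrices holds over the complex numbers): a non-diagonalizable Jordan block becomes diagonalizable under an arbitrarily small perturbation, since a matrix with distinct eigenvalues is diagonalizable.

```python
import numpy as np

# A Jordan block: NOT diagonalizable (only one eigenvector for eigenvalue 0).
J = np.array([[0.0, 1.0],
              [0.0, 0.0]])

for eps in [1e-1, 1e-3, 1e-6]:
    Jp = J + np.diag([eps, 0.0])   # perturb one diagonal entry by eps
    w = np.linalg.eigvals(Jp)
    assert abs(w[0] - w[1]) > 0    # distinct eigenvalues => diagonalizable
    assert np.linalg.norm(Jp - J) == eps   # and Jp is within eps of J
```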