# Talk:Symmetric matrix

WikiProject Mathematics (Rated Start-class, Mid-importance)
This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of Mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
Mathematics rating:
 Start Class
 Mid Importance
Field: Algebra

Wouldn't it be better to create a distinct entry for 'skew-symmetric matrix'?

## Inverse Matrix

Does the inverse of a square symmetrical matrix have any special properties? Does being symmetrical provide any shortcut to finding an inverse? 58.107.136.85 (talk) 03:56, 11 April 2008 (UTC)

If the inverse of a symmetrical matrix is also a symmetrical matrix it should be stated under properties. —Preceding unsigned comment added by 77.13.24.86 (talk) 18:25, 25 January 2011 (UTC)

Yes! Of course the inverse of a symmetric matrix is symmetric; it's very easy to show, too.

Proof:

Suppose $A = A^T$ and $A$ is non-singular; then there exists $A^{-1}$ such that $AA^{-1} = I$. Applying the transpose to both sides of the equation gives

$(AA^{-1})^T = I^T$, i.e. $(A^{-1})^T A^T = I$. Since $A = A^T$, it follows that $(A^{-1})^T A = I$. But the inverse is unique, therefore $(A^{-1})^T = A^{-1}$. This proves that the inverse is symmetric. QED — Preceding unsigned comment added by Brydustin (talkcontribs) 00:36, 1 January 2012 (UTC)
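The argument above is easy to check numerically; a minimal NumPy sketch (the random matrix and variable names are ad hoc, and the example assumes the randomly generated matrix is non-singular, which holds almost surely):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T                      # symmetric by construction
Ainv = np.linalg.inv(A)          # assumes A is non-singular

# The uniqueness-of-inverse argument predicts (A^-1)^T = A^-1
symmetric_inverse = np.allclose(Ainv, Ainv.T)
```

Running this confirms that the computed inverse equals its own transpose to machine precision.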

## Basis, Eigenvectors

It's easy to identify a symmetric matrix when it's written in terms of an orthogonal basis, but what about when it's not? Is a real-valued matrix symmetric iff its eigenvectors are orthogonal? —Ben FrantzDale 00:31, 11 September 2006 (UTC)

Reading more carefully answers my question: "Every symmetric matrix is thus, up to choice of an orthonormal basis, a diagonal matrix." So apparently the answer is yes. —Ben FrantzDale 15:27, 11 September 2006 (UTC)

I believe you're confusing a couple of concepts here. A matrix is a rectangular array of numbers, and it's symmetric if it's, well, symmetric. Of course, a linear map can be represented as a matrix when a choice of basis has been fixed. On the other hand, the concept of symmetry for a linear operator is basis independent. Greg Woodhouse 01:34, 30 November 2006 (UTC)

being symmetric with real entries implies unitarily diagonalizable; the converse need not be true. anti-symmetric matrices with real entries are normal, therefore unitarily diagonalizable. but the eigenvalues are no longer real, so one must speak of unitary matrices rather than orthogonal ones. Mct mht 04:07, 12 September 2006 (UTC)

It's been a while since I followed up on this. I still feel like there is something missing in this article. Back in 2006, I was confused about the importance of symmetry of a matrix because matrices are "just" rectangular arrays of numbers. As such, symmetry seems like a superficial property that can be undone by simple things like swapping rows. Furthermore, we could have a matrix that is symmetric but meaninglessly so. For example, consider a data matrix of participants with age and weight as columns. If Alice is 80 and weighs 90 pounds and Bob is 90 and weighs 80 pounds, then you get a symmetric table, but that symmetry doesn't mean anything (for starters, the units don't match, but we could construct something for which they did). That left me wondering "when does symmetry mean something?" I now think I understand. Consider the moment matrix of a bunch of points in R^3. That is a symmetric 3×3 matrix. As I've come to understand things, that matrix is contravariant (in the tensor sense) in its rows and columns.

I think matched variance of rows and columns is a necessary (but not sufficient) condition for a matrix to be symmetric in any meaningful sense. That implies that a meaningfully symmetric matrix is strictly-speaking the matrix representation of a tensor. Does that sound right? (I don't mean to say that [80 90; 90 80] isn't symmetric, I am just saying that for that symmetry to be anything other than coincidence, the matrix has to have matched variance in rows and columns.) —Ben FrantzDale (talk) 13:46, 14 December 2010 (UTC)

"More precisely, a matrix is symmetric if and only if it has an orthonormal basis of eigenvectors" This statement is just wrong. See 'Normal Matrix'. Normal matrices need not be symmetric (in fact they can be anti-symmetric), but they do have an orthonormal basis of eigenvectors. However, it IS true that if a matrix is symmetric, then it has an orthonormal basis of eigenvectors (in fact this is trivially true, since all 'symmetric matrices' are 'normal matrices', and normal matrices have an orthonormal basis of eigenvectors). Please correct. —Preceding unsigned comment added by 128.122.20.210 (talk) 03:17, 30 December 2010 (UTC)
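The one-way implication discussed above can be illustrated numerically; a small NumPy sketch (example matrices chosen arbitrarily): a real symmetric matrix has an orthonormal eigenbasis, but so does a normal, non-symmetric matrix such as a rotation, so the "if and only if" fails.

```python
import numpy as np

S = np.array([[2.0, 1.0], [1.0, 3.0]])     # symmetric
w, V = np.linalg.eigh(S)
orthonormal_sym = np.allclose(V.T @ V, np.eye(2))

R = np.array([[0.0, -1.0], [1.0, 0.0]])    # 90-degree rotation: normal, not symmetric
is_normal = np.allclose(R @ R.T, R.T @ R)
wr, Vr = np.linalg.eig(R)                  # eigenvalues are +/- i, eigenvectors complex
orthonormal_rot = np.allclose(Vr.conj().T @ Vr, np.eye(2))
```

Both `orthonormal_sym` and `orthonormal_rot` come out true, while `R` is clearly not symmetric; note that the rotation's eigenbasis is unitary rather than orthogonal, matching the earlier comment about complex eigenvalues.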

## Symmetric matrices are usually considered to be real valued

I've made several changes to indicate that symmetric matrices are generally assumed to be real valued. With this, the real spectral theorem can be stated properly. VectorPosse 05:03, 12 September 2006 (UTC)

Would it be better to have a little more detailed discussion of Hermitian? --TedPavlic 16:21, 19 February 2007 (UTC)

It may be worthwhile to add a section on complex symmetric matrices, or matrices that are (complex) symmetric w/r/t an orthonormal basis. They are not as useful as self-adjoint operators, but the category includes Toeplitz matrices, Hankel matrices, and any normal matrix. 140.247.23.104 04:43, 12 January 2007 (UTC)

I agree. We just need to make sure it's in a different section so that it doesn't get mixed up with the stuff about the spectral theorem. VectorPosse 19:28, 19 February 2007 (UTC)

## Products of Symmetric Matrices: Eigenspaces Closed Under Transformation

As the article states, products of symmetric matrices are symmetric if and only if the matrices commute. However, it also says, "Two real symmetric matrices commute if and only if they have the same eigenspaces." This makes no sense. Consider an arbitrary symmetric matrix $A$ and the identity matrix $I$. Certainly, $AI = IA$, so these matrices commute. However, in general $A$ and $I$ will not have the same eigenspaces! I think this statement was supposed to be, "Two real symmetric matrices commute if and only if they are simultaneously diagonalizable," or, "Two real symmetric matrices commute if and only if the eigenspace for one matrix is closed under the other matrix." Both of these statements sound complicated compared to the original statement. I'm not sure if it's worthwhile to even mention it. However, I'm going to make a change. I'm okay with someone removing the statement entirely. --TedPavlic 17:34, 19 February 2007 (UTC)

the previous version was correct. two real symmetric matrices commute iff they can be simultaneously diagonalized iff they have the same eigenspaces. please undo your change. Mct mht 10:24, 21 February 2007 (UTC)
As far as I can see, Ted's counterexample (identity matrix and arbitrary symmetric matrix) shows that two symmetric matrices can commute without having the same eigenspaces. Please tell me where we go wrong. -- Jitse Niesen (talk) 11:25, 21 February 2007 (UTC)
hm, that depends on what's meant by "having the same eigenspaces", no? if that means "the collection of eigenspaces coincide", then you would be right. (however, it seems to me the wording of the comment I removed, about the "closure" of eigenspaces, can be improved.) perhaps it's more precise to say two real symmetric matrices commute iff there exists a basis consisting of common eigenvectors. Mct mht 12:17, 21 February 2007 (UTC)
also, the identity matrix is really a degenerate case, since it and its multiples are the only matrices that are diagonal irrespective of the basis chosen. excluding such cases (if A restricted to a subspace V is a · I, remove V), it seems to me that the general claim is true: real symmetric matrices {Ai} commute pairwise iff the family of eigenspaces of Ai and the family of eigenspaces of Aj are the same for all i and j. Mct mht 15:42, 21 February 2007 (UTC)
I agree with "two real symmetric matrices commute iff there exists a basis consisting of common eigenvectors". I think the more common formulation is "two real symmetric matrices commute iff they are simultaneously diagonalizable", so I'd prefer that. I agree that the formulation "the eigenspace for one matrix is closed under the other matrix" is rather unfortunate as I had to read that sentence a couple of times before I understood what is meant.
I don't understand what you mean with "if A restricted to a subspace V is a · I, remove V". Every matrix is a multiple of the identity when restricted to an eigenspace, and after removing the eigenspaces of a symmetric matrix there's nothing left. -- Jitse Niesen (talk) 04:04, 22 February 2007 (UTC)
shoot, you're right. well, remove V if dimension V is > 1. that better? Mct mht 04:10, 22 February 2007 (UTC)
hm, forget it Jitse, that did not make it better. you're right there. Mct mht 12:20, 22 February 2007 (UTC)
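The counterexample and the resolution discussed in this thread are easy to verify numerically; a short NumPy sketch (example matrices chosen arbitrarily): the identity commutes with every symmetric matrix even though their eigenspaces differ, and the commuting pair is nonetheless simultaneously diagonalizable.

```python
import numpy as np

A = np.diag([1.0, 2.0, 3.0])   # symmetric, three distinct 1-D eigenspaces
I = np.eye(3)                  # a single eigenspace: all of R^3

commute = np.allclose(A @ I, I @ A)

# Any orthonormal eigenbasis of A also (trivially) diagonalizes I,
# so the two matrices are simultaneously diagonalizable.
_, V = np.linalg.eigh(A)
D = V.T @ A @ V
both_diagonal = (np.allclose(D, np.diag(np.diag(D))) and
                 np.allclose(V.T @ I @ V, np.eye(3)))
```

Here `commute` and `both_diagonal` are both true, while the eigenspace families of `A` and `I` clearly differ, matching the conclusion of the thread.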

Hey, the definition of symmetrizable matrices is not complete. A symmetrizable matrix is a product of a symmetric matrix and a positive definite matrix. The positive definite matrix need not be an invertible diagonal matrix as in the section. Please check... Naik.a.s —Preceding unsigned comment added by Naik.a.s (talkcontribs) 10:08, 27 July 2009 (UTC)

## eigenvalues

are the eigenvalues of a symmetric n×n matrix A (A = A^T) always {0,…,0,tr(A)}?
this holds for the matrix B^T B with B = [1,2,3,4]
--Saippuakauppias 10:48, 31 December 2007 (UTC)

No. For instance, the identity matrix is symmetric, but has eigenvalues {1,1,…,1}. However, every matrix of the form $A=B^TB$ does have {0,…,0,tr(A)} as its eigenvalues. Such matrices are called rank-one matrices, because their rank is one. -- Jitse Niesen (talk) 15:24, 31 December 2007 (UTC)

In the article the statement "Two real symmetric matrices commute if and only if they have the same eigenspaces." is wrong. For a counterexample consider the identity matrix and any diagonal matrix with more than one eigenvalue. The statement should read: "If two real symmetric matrices of dimension n commute then a basis for R^n can be chosen so that every element of the basis is an eigenvector for both matrices."

Incidentally, the answer above is assuming that B is itself a rank-one matrix (as in the example given with B=[1,2,3,4]). It's not true for B an arbitrary matrix.

137.222.137.107 (talk) 15:16, 15 June 2012 (UTC)Nick Gill
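Both points in this thread can be checked numerically; a small NumPy sketch (the second example matrix is ad hoc): a row vector B gives a rank-one B^T B with eigenvalues {0,…,0,tr(A)}, while a general B does not.

```python
import numpy as np

B = np.array([[1.0, 2.0, 3.0, 4.0]])      # 1x4, rank one
A = B.T @ B                                # 4x4, symmetric, rank one, tr(A) = 30
w = np.sort(np.linalg.eigvalsh(A))
rank_one_case = np.allclose(w, [0, 0, 0, np.trace(A)])

B2 = np.array([[1.0, 0.0], [1.0, 1.0]])   # rank two
A2 = B2.T @ B2                             # eigenvalues (3 +/- sqrt(5))/2, neither is 0
w2 = np.sort(np.linalg.eigvalsh(A2))
general_case_fails = not np.allclose(w2, [0, np.trace(A2)])
```

As expected, `rank_one_case` is true and `general_case_fails` is true, confirming that the {0,…,0,tr(A)} pattern depends on B having rank one.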

## The spectral theorem...

...is conspicuous by the absence of any mention of it in this article!

Maybe I'll be back. Michael Hardy (talk) 02:18, 10 August 2008 (UTC)

It's at the start of the "Properties" section. -- Jitse Niesen (talk) 10:57, 10 August 2008 (UTC)

## trace of the product of three matrices

Hi,

there's a mistake in the article. It's claimed that the trace of the product of three symmetric (or Hermitian) matrices is invariant under arbitrary permutations. The proof uses (CBA)^T = CBA, which is simply not true, because the product of symmetric (Hermitian) matrices is symmetric (Hermitian) if and only if they commute. —Preceding unsigned comment added by 192.33.103.47 (talk) 09:27, 28 June 2010 (UTC)
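For what it's worth, the invariance itself still holds even though CBA need not be symmetric, because tr(M) = tr(M^T) for any square M, so tr(CBA) = tr((CBA)^T) = tr(ABC). A quick NumPy check (random example matrices, ad hoc names):

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(1)

def random_symmetric(n=3):
    M = rng.standard_normal((n, n))
    return M + M.T

A, B, C = random_symmetric(), random_symmetric(), random_symmetric()

P = C @ B @ A
# (CBA)^T = ABC, which generally differs from CBA, as the comment says
product_not_symmetric = not np.allclose(P, P.T)

# yet the trace is the same for all six orderings
traces = [float(np.trace(X @ Y @ Z)) for X, Y, Z in permutations([A, B, C])]
trace_invariant = max(traces) - min(traces) < 1e-8
```

So the faulty step in the article's proof can be repaired by invoking tr(M) = tr(M^T) instead of symmetry of the product.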