Spectrum of a matrix

From Wikipedia, the free encyclopedia

In mathematics, the spectrum of a (finite) matrix is the multiset of its eigenvalues.[1][2][3] This notion can be extended to the spectrum of an operator in the infinite-dimensional case.

The determinant equals the product of the eigenvalues. Similarly, the trace equals the sum of the eigenvalues.[4][5][6] From this point of view, we can define the pseudo-determinant of a singular matrix to be the product of its nonzero eigenvalues (this quantity appears, for example, in the density of the degenerate multivariate normal distribution).
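The identities above are easy to check numerically. The following is a minimal sketch using NumPy; the matrices are made-up examples chosen so the spectrum is easy to read off.

```python
import numpy as np

# An upper-triangular 3x3 matrix, so its spectrum is its diagonal: {2, 3, 5}.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])
eigvals = np.linalg.eigvals(A)

# Trace equals the sum of the eigenvalues; determinant equals their product.
assert np.isclose(np.trace(A), eigvals.sum())
assert np.isclose(np.linalg.det(A), eigvals.prod())

# Pseudo-determinant of a singular matrix: product of the nonzero eigenvalues.
B = np.diag([4.0, 0.0, 7.0])            # singular: one zero eigenvalue
ev = np.linalg.eigvals(B)
pseudo_det = ev[~np.isclose(ev, 0.0)].prod()
print(pseudo_det)                       # approximately 28.0
```

Note that the ordinary determinant of B is 0, so the pseudo-determinant genuinely carries extra information in the singular case.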


Definition

Let V be a finite-dimensional vector space over some field K and suppose T : V → V is a linear map. The spectrum of T, denoted σ(T), is the multiset of roots of the characteristic polynomial of T. Thus the elements of the spectrum are precisely the eigenvalues of T, and the multiplicity of an eigenvalue λ in the spectrum equals the dimension of the generalized eigenspace of T for λ (also called the algebraic multiplicity of λ).
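The distinction between the dimension of the ordinary eigenspace (geometric multiplicity) and that of the generalized eigenspace (algebraic multiplicity) can be seen on a small made-up example: the matrix below has eigenvalue 4 with algebraic multiplicity 2 but only a one-dimensional eigenspace. A sketch using NumPy's rank computation:

```python
import numpy as np

# A defective matrix: eigenvalue 4 appears twice in the spectrum,
# but (A - 4I) annihilates only a one-dimensional subspace.
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 4.0, 0.0],
              [0.0, 0.0, 1.0]])
n = A.shape[0]
lam = 4.0
E = A - lam * np.eye(n)

# Geometric multiplicity: dim ker(A - 4I) = n - rank(A - 4I).
geom = n - np.linalg.matrix_rank(E)

# Algebraic multiplicity: dimension of the generalized eigenspace,
# ker((A - 4I)^n), which stabilizes once the power reaches n = dim V.
alg = n - np.linalg.matrix_rank(np.linalg.matrix_power(E, n))

print(geom, alg)  # 1 2
```

So the multiplicity of 4 in the spectrum σ(A) is 2, even though only one independent eigenvector exists for it.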

Now, fix a basis B of V over K and suppose M ∈ Mat_K(V) is a matrix. Define the linear map T : V → V pointwise by T(x) = Mx, where on the right-hand side x is interpreted as a column vector and M acts on x by matrix multiplication. We now say that x ∈ V is an eigenvector of M if x is an eigenvector of T. Similarly, λ ∈ K is an eigenvalue of M if it is an eigenvalue of T, with the same multiplicity, and the spectrum of M, written σ(M), is the multiset of all such eigenvalues.
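The identification of eigenvectors of the matrix M with eigenvectors of the map T(x) = Mx can be checked directly: each eigenpair returned by a numerical solver satisfies Mx = λx. A small sketch with an arbitrary symmetric example matrix:

```python
import numpy as np

# The linear map T(x) = Mx; eigenvectors of M are exactly eigenvectors of T.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(M)   # spectrum sigma(M) and eigenvectors

for lam, x in zip(eigvals, eigvecs.T):
    # Verify the defining relation T(x) = Mx = lam * x for each pair.
    assert np.allclose(M @ x, lam * x)

print(np.sort(eigvals).round(6))      # [1. 3.]
```

Here σ(M) = {1, 3}, consistent with trace(M) = 4 = 1 + 3 and det(M) = 3 = 1 · 3.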


Notes

  1. ^ Golub & Van Loan (1996, p. 310)
  2. ^ Kreyszig (1972, p. 273)
  3. ^ Nering (1970, p. 270)
  4. ^ Golub & Van Loan (1996, p. 310)
  5. ^ Herstein (1964, pp. 271–272)
  6. ^ Nering (1970, pp. 115–116)