"Rank theorem" redirects here. For the rank theorem of multivariable calculus, see constant rank theorem.
The rank–nullity theorem is a fundamental theorem in linear algebra which relates the dimensions of a linear map's kernel and image with the dimension of its domain.
Stating the theorem
Let $V$, $W$ be vector spaces, where $V$ is finite-dimensional. Let $T \colon V \to W$ be a linear transformation. Then[1]
$$\operatorname{rank}(T) + \operatorname{nullity}(T) = \dim V,$$
where
$$\operatorname{rank}(T) = \dim \operatorname{Im}(T)$$
and
$$\operatorname{nullity}(T) = \dim \operatorname{Ker}(T).$$
One can refine this theorem via the splitting lemma to be a statement about an isomorphism of spaces, not just dimensions. Explicitly, since $T$ induces an isomorphism from $V / \operatorname{Ker}(T)$ to $\operatorname{Im}(T)$, the existence of a basis for $V$ that extends any given basis of $\operatorname{Ker}(T)$ implies, via the splitting lemma, that $V \cong \operatorname{Ker}(T) \oplus \operatorname{Im}(T)$. Taking dimensions, the rank–nullity theorem follows immediately.
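Concretely, the dimension count reads
$$\dim V = \dim\big(\operatorname{Ker}(T) \oplus \operatorname{Im}(T)\big) = \dim \operatorname{Ker}(T) + \dim \operatorname{Im}(T) = \operatorname{nullity}(T) + \operatorname{rank}(T).$$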
Matrices
Since $\operatorname{Mat}_{m \times n}(\mathbb{F}) \cong \operatorname{Hom}\left(\mathbb{F}^n, \mathbb{F}^m\right)$,[2] matrices immediately come to mind when discussing linear maps. In the case of an $m \times n$ matrix, the dimension of the domain is $n$, the number of columns in the matrix. Thus the rank–nullity theorem for a given matrix $M \in \operatorname{Mat}_{m \times n}(\mathbb{F})$ immediately becomes
$$\operatorname{rank}(M) + \operatorname{nullity}(M) = n.$$
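This identity is easy to verify numerically. The following is a minimal sketch in Python with SymPy; the matrix `M` is an arbitrary invented example, not one from the references:

```python
import sympy as sp

# An arbitrary 3x4 example matrix M, so the domain is F^4 and n = 4.
M = sp.Matrix([[1, 2, 3, 4],
               [2, 4, 6, 8],   # a multiple of the first row, so the rank drops
               [0, 1, 1, 1]])

rank = M.rank()                  # dimension of the image (column space)
nullity = len(M.nullspace())     # dimension of the kernel, computed directly

assert rank + nullity == M.cols  # 2 + 2 == 4
```

Note that the nullity is computed from an explicit basis of the kernel rather than as $n - \operatorname{rank}$, so the assertion genuinely exercises the theorem rather than restating it.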
Proofs
Here we provide two proofs. The first[3] operates in the general case, using linear maps. The second proof[4] looks at the homogeneous system $\mathbf{A}\mathbf{x} = \mathbf{0}$ for $\mathbf{A} \in \operatorname{Mat}_{m \times n}(\mathbb{F})$ with rank $r$ and shows explicitly that there exists a set of $n - r$ linearly independent solutions that span the kernel of $\mathbf{A}$.
While the theorem requires that the domain of the linear map be finite-dimensional, there is no such assumption on the codomain. This means that there are linear maps not given by matrices for which the theorem applies. Despite this, the first proof is not actually more general than the second: since the image of the linear map is finite-dimensional, we can represent the map from its domain to its image by a matrix, prove the theorem for that matrix, then compose with the inclusion of the image into the full codomain.
First proof
Let $V, W$ be vector spaces over some field $\mathbb{F}$, and let $T$ be defined as in the statement of the theorem with $\dim V = n$.
As $\operatorname{Ker}(T) \subset V$ is a subspace, there exists a basis for it. Suppose $\dim \operatorname{Ker}(T) = k$ and let
$$\mathcal{K} := \{v_1, \ldots, v_k\} \subset \operatorname{Ker}(T)$$
be such a basis.
We may now, by the Steinitz exchange lemma, extend $\mathcal{K}$ with $n - k$ linearly independent vectors $w_1, \ldots, w_{n-k}$ to form a full basis of $V$.
Let
$$\mathcal{S} := \{w_1, \ldots, w_{n-k}\} \subset V \setminus \operatorname{Ker}(T)$$
such that
$$\mathcal{B} := \mathcal{K} \cup \mathcal{S} = \{v_1, \ldots, v_k, w_1, \ldots, w_{n-k}\} \subset V$$
is a basis for $V$.
From this, and since $T(v_i) = 0_W$ for each $v_i \in \mathcal{K}$, we know that
$$\operatorname{Im}(T) = \operatorname{Span}(T(\mathcal{B})) = \operatorname{Span}\{T(v_1), \ldots, T(v_k), T(w_1), \ldots, T(w_{n-k})\} = \operatorname{Span}\{T(w_1), \ldots, T(w_{n-k})\} = \operatorname{Span}(T(\mathcal{S})).$$
We now claim that $T(\mathcal{S})$ is a basis for $\operatorname{Im}(T)$. The computation above shows that $T(\mathcal{S})$ is generating; it remains to be shown that it is also linearly independent to conclude that it is a basis.
Suppose $T(\mathcal{S})$ is not linearly independent, and let
$$\sum_{j=1}^{n-k} \alpha_j T(w_j) = 0_W$$
for some $\alpha_j \in \mathbb{F}$.
Thus, owing to the linearity of $T$, it follows that
$$T\left(\sum_{j=1}^{n-k} \alpha_j w_j\right) = 0_W \implies \sum_{j=1}^{n-k} \alpha_j w_j \in \operatorname{Ker}(T) = \operatorname{Span}(\mathcal{K}) \subset V.$$
This is a contradiction to $\mathcal{B}$ being a basis, unless all $\alpha_j$ are equal to zero. This shows that $T(\mathcal{S})$ is linearly independent, and more specifically that it is a basis for $\operatorname{Im}(T)$.
To summarise, we have $\mathcal{K}$, a basis for $\operatorname{Ker}(T)$, and $T(\mathcal{S})$, a basis for $\operatorname{Im}(T)$. Finally we may state that
$$\operatorname{rank}(T) + \operatorname{nullity}(T) = \dim \operatorname{Im}(T) + \dim \operatorname{Ker}(T) = |T(\mathcal{S})| + |\mathcal{K}| = (n - k) + k = n = \dim V.$$
This concludes our proof.
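The construction in this proof can be imitated computationally. The sketch below (in Python with SymPy; the matrix defining $T$ is an invented example) picks a basis of the kernel, extends it greedily to a basis of the domain, and checks that the images of the added vectors form a basis of the image:

```python
import sympy as sp

# An invented linear map T : Q^4 -> Q^3, acting as multiplication by A.
A = sp.Matrix([[1, 0, 1, 0],
               [0, 1, 1, 0],
               [1, 1, 2, 0]])
n = A.cols

kernel = A.nullspace()          # a basis K of Ker(T); here k = 2
basis = list(kernel)
# Extend K to a basis B of the domain (the Steinitz exchange step),
# keeping each standard basis vector that preserves linear independence.
for j in range(n):
    e = sp.eye(n)[:, j]
    if sp.Matrix.hstack(*basis, e).rank() == len(basis) + 1:
        basis.append(e)

extension = basis[len(kernel):]  # the added vectors w_1, ..., w_{n-k}
images = sp.Matrix.hstack(*[A * w for w in extension])
# T(S) is linearly independent and spans Im(T), as the proof claims:
assert images.rank() == len(extension) == A.rank()
```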
Second proof
Let $\mathbf{A}$ be an $m \times n$ matrix with rank $r$; without loss of generality, assume its first $r$ columns are linearly independent, so that $\mathbf{A} = \begin{bmatrix} \mathbf{A}_1 & \mathbf{A}_2 \end{bmatrix}$, where the columns of the $m \times r$ block $\mathbf{A}_1$ are linearly independent and each column of the $m \times (n-r)$ block $\mathbf{A}_2$ is a linear combination of the columns of $\mathbf{A}_1$. This means that $\mathbf{A}_2 = \mathbf{A}_1 \mathbf{B}$ for some $r \times (n-r)$ matrix $\mathbf{B}$. Define the $n \times (n-r)$ matrix
$$\mathbf{X} = \begin{bmatrix} -\mathbf{B} \\ \mathbf{I}_{n-r} \end{bmatrix},$$
where $\mathbf{I}_{n-r}$ is the $(n-r) \times (n-r)$ identity matrix. Then $\mathbf{A}\mathbf{X} = \mathbf{A}_1(-\mathbf{B}) + \mathbf{A}_2 = \mathbf{0}$, and the columns of $\mathbf{X}$ are linearly independent because the identity block forces $\mathbf{X}\mathbf{u} = \mathbf{0}$ to imply $\mathbf{u} = \mathbf{0}$. Therefore, the column vectors of $\mathbf{X}$ constitute a set of $n - r$ linearly independent solutions for $\mathbf{A}\mathbf{x} = \mathbf{0}$.
We next prove that any solution of $\mathbf{A}\mathbf{x} = \mathbf{0}$ must be a linear combination of the columns of $\mathbf{X}$.
For this, let
$$\mathbf{u} = \begin{bmatrix} \mathbf{u}_1 \\ \mathbf{u}_2 \end{bmatrix}$$
be any vector such that $\mathbf{A}\mathbf{u} = \mathbf{0}$, partitioned so that $\mathbf{u}_1$ and $\mathbf{u}_2$ have $r$ and $n - r$ entries, respectively. Note that since the columns of $\mathbf{A}_1$ are linearly independent, $\mathbf{A}_1\mathbf{x} = \mathbf{0}$ implies $\mathbf{x} = \mathbf{0}$. Therefore,
$$\begin{aligned}
\mathbf{A}\mathbf{u} = \mathbf{0} &\implies \mathbf{A}_1\mathbf{u}_1 + \mathbf{A}_2\mathbf{u}_2 = \mathbf{0} \\
&\implies \mathbf{A}_1\mathbf{u}_1 + \mathbf{A}_1\mathbf{B}\mathbf{u}_2 = \mathbf{0} \\
&\implies \mathbf{A}_1\left(\mathbf{u}_1 + \mathbf{B}\mathbf{u}_2\right) = \mathbf{0} \\
&\implies \mathbf{u}_1 + \mathbf{B}\mathbf{u}_2 = \mathbf{0} \\
&\implies \mathbf{u}_1 = -\mathbf{B}\mathbf{u}_2,
\end{aligned}$$
and hence
$$\mathbf{u} = \begin{bmatrix} \mathbf{u}_1 \\ \mathbf{u}_2 \end{bmatrix} = \begin{bmatrix} -\mathbf{B} \\ \mathbf{I}_{n-r} \end{bmatrix}\mathbf{u}_2 = \mathbf{X}\mathbf{u}_2.$$
This proves that any vector $\mathbf{u}$ that is a solution of $\mathbf{A}\mathbf{x} = \mathbf{0}$ must be a linear combination of the $n - r$ special solutions given by the columns of $\mathbf{X}$. And we have already seen that the columns of $\mathbf{X}$ are linearly independent. Hence, the columns of $\mathbf{X}$ constitute a basis for the null space of $\mathbf{A}$. Therefore, the nullity of $\mathbf{A}$ is $n - r$. Since $r$ equals the rank of $\mathbf{A}$, it follows that $\operatorname{rank}(\mathbf{A}) + \operatorname{nullity}(\mathbf{A}) = n$. This concludes our proof.
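The block construction of $\mathbf{X}$ is easy to reproduce numerically. Below is a minimal sketch in Python with SymPy; the matrix `A` is an invented example whose first $r$ columns happen to be linearly independent, as the proof assumes:

```python
import sympy as sp

# An invented 3x5 example whose first r = 2 columns are linearly independent.
A = sp.Matrix([[1, 0, 2, 1, 3],
               [0, 1, 1, 1, 0],
               [1, 1, 3, 2, 3]])
n, r = A.cols, A.rank()

A1, A2 = A[:, :r], A[:, r:]          # partition A = [A1 | A2]
# A1 has full column rank, so A1.T * A1 is invertible and B solves A1 * B = A2.
B = (A1.T * A1).inv() * A1.T * A2

# Stack -B on top of an (n - r) x (n - r) identity block to form X.
X = sp.Matrix.vstack(-B, sp.eye(n - r))

assert A * X == sp.zeros(A.rows, n - r)  # every column of X solves Ax = 0
assert X.rank() == n - r                 # and the columns are linearly independent
```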
Reformulations and generalizations
The rank–nullity theorem for finite-dimensional vector spaces may also be formulated in terms of the index of a linear map. The index of a linear map $T \in \operatorname{Hom}(V, W)$, where $V$ and $W$ are finite-dimensional, is defined by
$$\operatorname{index}(T) = \dim \operatorname{Ker}(T) - \dim \operatorname{Coker}(T).$$
Intuitively, $\dim \operatorname{Ker}(T)$ is the number of independent solutions $v$ of the equation $Tv = 0$, and $\dim \operatorname{Coker}(T)$ is the number of independent restrictions that have to be put on $w$ to make the equation $Tv = w$ solvable. The rank–nullity theorem for finite-dimensional vector spaces is equivalent to the statement
$$\operatorname{index}(T) = \dim V - \dim W.$$
We see that we can easily read off the index of the linear map $T$ from the involved spaces, without any need to analyze $T$ in detail. This effect also occurs in a much deeper result: the Atiyah–Singer index theorem states that the index of certain differential operators can be read off the geometry of the involved spaces.
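For maps between finite-dimensional spaces the identity can be checked directly: the cokernel of an $m \times n$ matrix has dimension $m - \operatorname{rank}$, so the index works out to $(n - \operatorname{rank}) - (m - \operatorname{rank}) = n - m$. A minimal sketch in Python with SymPy (the matrix is an arbitrary invented example):

```python
import sympy as sp

# An arbitrary 2x4 example: T maps a 4-dimensional space to a 2-dimensional one.
A = sp.Matrix([[1, 2, 0, 1],
               [0, 1, 1, 1]])
m, n = A.shape

dim_ker = len(A.nullspace())     # independent solutions of T(v) = 0
dim_coker = m - A.rank()         # independent restrictions on w for T(v) = w

assert dim_ker - dim_coker == n - m   # index T = dim V - dim W
```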
Notes
^ Friedberg; Insel; Spence. Linear Algebra. Pearson. p. 70. ISBN 9780321998897.
^ Friedberg; Insel; Spence. Linear Algebra. Pearson. pp. 103–104. ISBN 9780321998897.
^ Friedberg; Insel; Spence. Linear Algebra. Pearson. p. 70. ISBN 9780321998897.
^ Banerjee, Sudipto; Roy, Anindya (2014), Linear Algebra and Matrix Analysis for Statistics, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388.