# Rank factorization

Given an ${\textstyle m\times n}$ matrix ${\textstyle A}$ of rank ${\textstyle r}$, a rank decomposition or rank factorization of ${\textstyle A}$ is a product ${\textstyle A=CF}$, where ${\textstyle C}$ is an ${\textstyle m\times r}$ matrix and ${\textstyle F}$ is an ${\textstyle r\times n}$ matrix.

## Existence

Every finite-dimensional matrix has a rank decomposition: Let ${\textstyle A}$ be an ${\textstyle m\times n}$ matrix whose column rank is ${\textstyle r}$. Therefore, there are ${\textstyle r}$ linearly independent columns in ${\textstyle A}$; equivalently, the dimension of the column space of ${\textstyle A}$ is ${\textstyle r}$. Let ${\textstyle c_{1},c_{2},\ldots ,c_{r}}$ be any basis for the column space of ${\textstyle A}$ and place them as column vectors to form the ${\textstyle m\times r}$ matrix ${\textstyle C=[c_{1}:c_{2}:\ldots :c_{r}]}$. Therefore, every column vector of ${\textstyle A}$ is a linear combination of the columns of ${\textstyle C}$. To be precise, if ${\textstyle A=[a_{1}:a_{2}:\ldots :a_{n}]}$ is an ${\textstyle m\times n}$ matrix with ${\textstyle a_{j}}$ as the ${\textstyle j}$-th column, then

${\displaystyle a_{j}=f_{1j}c_{1}+f_{2j}c_{2}+\cdots +f_{rj}c_{r},}$

where ${\textstyle f_{ij}}$'s are the scalar coefficients of ${\textstyle a_{j}}$ in terms of the basis ${\textstyle c_{1},c_{2},\ldots ,c_{r}}$. This implies that ${\textstyle A=CF}$, where ${\textstyle f_{ij}}$ is the ${\textstyle (i,j)}$-th element of ${\textstyle F}$.
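The existence argument above can be sketched numerically. The following is an illustrative example (the matrix is made up for demonstration): take any basis of the column space as ${\textstyle C}$, then solve for the coefficient matrix ${\textstyle F}$ column by column, which a single least-squares call handles.

```python
import numpy as np

# A hypothetical 4x3 matrix of rank 2: its third column equals
# the sum of the first two.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [2.0, 0.0, 2.0],
              [1.0, 1.0, 2.0]])

r = np.linalg.matrix_rank(A)   # r = 2

# The first r columns happen to be linearly independent here,
# so they form a basis of the column space.
C = A[:, :r]                   # shape (m, r)

# Each column a_j of A satisfies a_j = C f_j; solving all of these
# least-squares problems at once recovers the r x n matrix F.
F, *_ = np.linalg.lstsq(C, A, rcond=None)

assert np.allclose(C @ F, A)
```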

## Non-uniqueness

If ${\textstyle A=C_{1}F_{1}}$ is a rank factorization, then taking ${\textstyle C_{2}=C_{1}R}$ and ${\textstyle F_{2}=R^{-1}F_{1}}$ gives another rank factorization for any invertible matrix ${\textstyle R}$ of compatible dimensions.

Conversely, if ${\textstyle A=C_{1}F_{1}=C_{2}F_{2}}$ are two rank factorizations of ${\textstyle A}$, then there exists an invertible matrix ${\textstyle R}$ such that ${\textstyle C_{1}=C_{2}R}$ and ${\textstyle F_{1}=R^{-1}F_{2}}$.[1]
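The non-uniqueness can be checked directly: multiplying the factors by ${\textstyle R}$ and ${\textstyle R^{-1}}$ leaves the product unchanged. A minimal sketch with made-up random factors:

```python
import numpy as np

rng = np.random.default_rng(0)

# An illustrative rank-2 factorization of a random 4x3 matrix.
C1 = rng.standard_normal((4, 2))
F1 = rng.standard_normal((2, 3))
A = C1 @ F1

# Any invertible 2x2 matrix R produces another rank factorization.
R = np.array([[2.0, 1.0],
              [1.0, 1.0]])       # det = 1, so R is invertible
C2 = C1 @ R
F2 = np.linalg.solve(R, F1)      # R^{-1} F1 without forming the inverse

assert np.allclose(C2 @ F2, A)
```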

## Construction

### Rank factorization from row echelon forms

In practice, one specific rank factorization can be constructed as follows: compute ${\textstyle B}$, the reduced row echelon form of ${\textstyle A}$. Then ${\textstyle C}$ is obtained by removing from ${\textstyle A}$ all non-pivot columns (those columns of ${\textstyle B}$ that do not contain a pivot), and ${\textstyle F}$ is obtained by eliminating all zero rows of ${\textstyle B}$.

#### Example

Consider the matrix

${\displaystyle A={\begin{bmatrix}1&3&1&4\\2&7&3&9\\1&5&3&1\\1&2&0&8\end{bmatrix}}\sim {\begin{bmatrix}1&0&-2&0\\0&1&1&0\\0&0&0&1\\0&0&0&0\end{bmatrix}}=B{\text{.}}}$

${\textstyle B}$ is in reduced row echelon form.

Then ${\textstyle C}$ is obtained by removing the third column of ${\textstyle A}$, the only one which is not a pivot column, and ${\textstyle F}$ by getting rid of the last row of zeroes, so

${\displaystyle C={\begin{bmatrix}1&3&4\\2&7&9\\1&5&1\\1&2&8\end{bmatrix}}{\text{,}}\qquad F={\begin{bmatrix}1&0&-2&0\\0&1&1&0\\0&0&0&1\end{bmatrix}}{\text{.}}}$

It is straightforward to check that

${\displaystyle A={\begin{bmatrix}1&3&1&4\\2&7&3&9\\1&5&3&1\\1&2&0&8\end{bmatrix}}={\begin{bmatrix}1&3&4\\2&7&9\\1&5&1\\1&2&8\end{bmatrix}}{\begin{bmatrix}1&0&-2&0\\0&1&1&0\\0&0&0&1\end{bmatrix}}=CF{\text{.}}}$
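This computation can be reproduced with SymPy, whose `Matrix.rref` method returns both the reduced row echelon form and the pivot column indices. A sketch using the matrix from the example above:

```python
import sympy as sp

# The matrix from the worked example.
A = sp.Matrix([[1, 3, 1, 4],
               [2, 7, 3, 9],
               [1, 5, 3, 1],
               [1, 2, 0, 8]])

B, pivots = A.rref()         # reduced row echelon form and pivot columns

C = A[:, list(pivots)]       # keep only the pivot columns of A
F = B[:len(pivots), :]       # keep only the nonzero rows of B

assert pivots == (0, 1, 3)   # the third column (index 2) is not a pivot
assert C * F == A
```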

#### Proof

Let ${\textstyle P}$ be an ${\textstyle n\times n}$ permutation matrix such that ${\textstyle AP=(C,D)}$ in block partitioned form, where the columns of ${\textstyle C}$ are the ${\textstyle r}$ pivot columns of ${\textstyle A}$. Every column of ${\textstyle D}$ is a linear combination of the columns of ${\textstyle C}$, so there is a matrix ${\textstyle G}$ such that ${\textstyle D=CG}$, where the columns of ${\textstyle G}$ contain the coefficients of each of those linear combinations. So ${\textstyle AP=(C,CG)=C(I_{r},G)}$, ${\textstyle I_{r}}$ being the ${\textstyle r\times r}$ identity matrix. We will show now that ${\textstyle (I_{r},G)=FP}$.

Transforming ${\textstyle AP}$ into its reduced row echelon form amounts to left-multiplying by a matrix ${\textstyle E}$ which is a product of elementary matrices, so ${\textstyle EAP=BP=EC(I_{r},G)}$, where ${\textstyle EC={\begin{pmatrix}I_{r}\\0\end{pmatrix}}}$. We can then write ${\textstyle BP={\begin{pmatrix}I_{r}&G\\0&0\end{pmatrix}}}$, which allows us to identify ${\textstyle (I_{r},G)=FP}$, i.e. the ${\textstyle r}$ nonzero rows of the reduced row echelon form, with the same permutation applied to the columns as was applied to ${\textstyle A}$. We thus have ${\textstyle AP=CFP}$, and since ${\textstyle P}$ is invertible this implies ${\textstyle A=CF}$, which completes the proof.

### Singular value decomposition

One can also construct a full rank factorization of ${\textstyle A}$ by using its singular value decomposition

${\displaystyle A=U\Sigma V^{*}={\begin{bmatrix}U_{1}&U_{2}\end{bmatrix}}{\begin{bmatrix}\Sigma _{r}&0\\0&0\end{bmatrix}}{\begin{bmatrix}V_{1}^{*}\\V_{2}^{*}\end{bmatrix}}=U_{1}\left(\Sigma _{r}V_{1}^{*}\right).}$

Since ${\textstyle U_{1}}$ is a full column rank matrix and ${\textstyle \Sigma _{r}V_{1}^{*}}$ is a full row rank matrix, we can take ${\textstyle C=U_{1}}$ and ${\textstyle F=\Sigma _{r}V_{1}^{*}}$.
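The SVD-based construction can be sketched with NumPy. The matrix below is an illustrative rank-2 example; the numerical rank is determined by thresholding the singular values, since floating-point arithmetic rarely produces exact zeros.

```python
import numpy as np

# A hypothetical rank-2 matrix: its second row is twice the first.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])

U, s, Vt = np.linalg.svd(A)
tol = max(A.shape) * np.finfo(float).eps * s[0]
r = int(np.sum(s > tol))             # numerical rank

C = U[:, :r]                         # U_1: full column rank
F = s[:r, None] * Vt[:r, :]          # Sigma_r V_1^*: full row rank

assert np.allclose(C @ F, A)
```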

## Consequences

### rank(A) = rank(A^T)

An immediate consequence of rank factorization is that the rank of ${\textstyle A}$ is equal to the rank of its transpose ${\textstyle A^{\textsf {T}}}$. Since the columns of ${\textstyle A^{\textsf {T}}}$ are the rows of ${\textstyle A}$, this says that the column rank of ${\textstyle A}$ equals its row rank.

Proof: To see why this is true, let us first define rank to mean column rank. Since ${\textstyle A=CF}$, it follows that ${\textstyle A^{\textsf {T}}=F^{\textsf {T}}C^{\textsf {T}}}$. From the definition of matrix multiplication, this means that each column of ${\textstyle A^{\textsf {T}}}$ is a linear combination of the columns of ${\textstyle F^{\textsf {T}}}$. Therefore, the column space of ${\textstyle A^{\textsf {T}}}$ is contained within the column space of ${\textstyle F^{\textsf {T}}}$ and, hence, rank${\textstyle \left(A^{\textsf {T}}\right)}$ ≤ rank${\textstyle \left(F^{\textsf {T}}\right)}$.

Now, ${\textstyle F^{\textsf {T}}}$ is ${\textstyle n\times r}$, so there are only ${\textstyle r}$ columns in ${\textstyle F^{\textsf {T}}}$ and, hence, rank${\textstyle \left(F^{\textsf {T}}\right)}$ ≤ ${\textstyle r}$ = rank${\textstyle \left(A\right)}$. Combining the two inequalities proves that rank${\textstyle \left(A^{\textsf {T}}\right)}$ ≤ rank${\textstyle \left(A\right)}$.

Now apply the result to ${\textstyle A^{\textsf {T}}}$ to obtain the reverse inequality: since ${\textstyle \left(A^{\textsf {T}}\right)^{\textsf {T}}}$ = ${\textstyle A}$, we can write rank${\textstyle \left(A\right)}$ = rank${\textstyle \left(\left(A^{\textsf {T}}\right)^{\textsf {T}}\right)}$ ≤ rank${\textstyle \left(A^{\textsf {T}}\right)}$. This proves rank${\textstyle \left(A\right)}$ ≤ rank${\textstyle \left(A^{\textsf {T}}\right)}$.

We have, therefore, proved rank${\textstyle \left(A^{\textsf {T}}\right)}$ ≤ rank${\textstyle \left(A\right)}$ and rank${\textstyle \left(A\right)}$ ≤ rank${\textstyle \left(A^{\textsf {T}}\right)}$, so rank${\textstyle \left(A\right)}$ = rank${\textstyle \left(A^{\textsf {T}}\right)}$. (Also see the first proof of column rank = row rank under rank).
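The equality can be spot-checked numerically. A minimal sketch, using a made-up rank-deficient matrix built as a product of thin random factors:

```python
import numpy as np

rng = np.random.default_rng(1)

# A 5x7 matrix of rank (at most) 2, built as a product of thin factors.
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 7))

# Column rank of A equals column rank of A^T, i.e. the row rank of A.
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T) == 2
```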

## Notes

1. ^ Piziak, R.; Odell, P. L. (1 June 1999). "Full Rank Factorization of Matrices". Mathematics Magazine. 72 (3): 193. doi:10.2307/2690882.

## References

• Banerjee, Sudipto; Roy, Anindya (2014), Linear Algebra and Matrix Analysis for Statistics, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388
• Lay, David C. (2005), Linear Algebra and its Applications (3rd ed.), Addison Wesley, ISBN 978-0-201-70970-4
• Golub, Gene H.; Van Loan, Charles F. (1996), Matrix Computations, Johns Hopkins Studies in Mathematical Sciences (3rd ed.), The Johns Hopkins University Press, ISBN 978-0-8018-5414-9
• Stewart, Gilbert W. (1998), Matrix Algorithms. I. Basic Decompositions, SIAM, ISBN 978-0-89871-414-2