# Two-dimensional singular-value decomposition

Two-dimensional singular-value decomposition (2DSVD) computes a low-rank approximation of a set of matrices, such as 2D images or weather maps, in a manner almost identical to the singular-value decomposition (SVD), which computes a low-rank approximation of a single matrix (or a set of 1D vectors).

## SVD

Let the matrix ${\displaystyle X=(x_{1},\ldots ,x_{n})}$ contain a set of 1D vectors which have been centered. In PCA/SVD, we construct the covariance matrix ${\displaystyle F}$ and the Gram matrix ${\displaystyle G}$

${\displaystyle F=XX^{T}}$ , ${\displaystyle G=X^{T}X,}$

and compute their eigenvectors ${\displaystyle U=(u_{1},\ldots ,u_{n})}$ of ${\displaystyle F}$ and ${\displaystyle V=(v_{1},\ldots ,v_{n})}$ of ${\displaystyle G}$. Since ${\displaystyle VV^{T}=I}$ and ${\displaystyle UU^{T}=I}$, we have

${\displaystyle X=UU^{T}XVV^{T}=U(U^{T}XV)V^{T}=U\Sigma V^{T}.}$

If we retain only the ${\displaystyle K}$ principal eigenvectors in ${\displaystyle U}$ and ${\displaystyle V}$, this gives a rank-${\displaystyle K}$ approximation of ${\displaystyle X}$.
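The construction above can be sketched numerically. The following is an illustrative example (not from the article): it builds ${\displaystyle F}$ and ${\displaystyle G}$ for a small random centered matrix, keeps the top-${\displaystyle K}$ eigenvectors of each, and forms the projection ${\displaystyle U(U^{T}XV)V^{T}}$, which coincides with the optimal rank-${\displaystyle K}$ truncation from a standard SVD routine.

```python
import numpy as np

# Illustrative sketch: rank-K approximation of a centered matrix X via
# eigenvectors of the covariance matrix F = X X^T and Gram matrix G = X^T X.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 10))
X -= X.mean(axis=1, keepdims=True)   # center the set of column vectors

K = 3
F = X @ X.T                          # covariance matrix (6 x 6)
G = X.T @ X                          # Gram matrix (10 x 10)

# eigh returns eigenvalues in ascending order; keep the K largest
_, U = np.linalg.eigh(F)
U = U[:, ::-1][:, :K]                # top-K eigenvectors of F
_, V = np.linalg.eigh(G)
V = V[:, ::-1][:, :K]                # top-K eigenvectors of G

Sigma = U.T @ X @ V                  # K x K core matrix
X_approx = U @ Sigma @ V.T           # rank-K approximation of X

# Compare with the rank-K truncation of a direct SVD
u, s, vt = np.linalg.svd(X, full_matrices=False)
X_trunc = (u[:, :K] * s[:K]) @ vt[:K]
print(np.max(np.abs(X_approx - X_trunc)))
```

Both routes project ${\displaystyle X}$ onto the same top-${\displaystyle K}$ singular subspaces, so the two approximations agree up to floating-point error.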

## 2DSVD

Here we deal with a set of 2D matrices ${\displaystyle (X_{1},\ldots ,X_{n})}$. Suppose they are centered, i.e. ${\displaystyle \sum _{i}X_{i}=0}$. We construct the row–row and column–column covariance matrices

${\displaystyle F=\sum _{i}X_{i}X_{i}^{T}}$ , ${\displaystyle G=\sum _{i}X_{i}^{T}X_{i}}$

in exactly the same manner as in SVD, and compute their eigenvectors ${\displaystyle U}$ and ${\displaystyle V}$. We approximate ${\displaystyle X_{i}}$ as

${\displaystyle X_{i}\approx UU^{T}X_{i}VV^{T}=U(U^{T}X_{i}V)V^{T}=UM_{i}V^{T}}$

in a fashion identical to SVD. This gives a near-optimal low-rank approximation of ${\displaystyle (X_{1},\ldots ,X_{n})}$ with respect to the objective function

${\displaystyle J=\sum _{i}|X_{i}-UM_{i}V^{T}|^{2}.}$

Error bounds similar to the Eckart–Young theorem also exist.

2DSVD is mostly used in image compression and representation.
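The 2DSVD procedure described above can be sketched as follows; this is an assumed minimal implementation, with shapes and data chosen for illustration. Each ${\displaystyle X_{i}}$ is compressed to a small ${\displaystyle K\times K}$ core matrix ${\displaystyle M_{i}}$, while the factors ${\displaystyle U}$ and ${\displaystyle V}$ are shared across the whole set.

```python
import numpy as np

# Hypothetical sketch of 2DSVD on a set of n centered r x c matrices.
rng = np.random.default_rng(1)
n, r, c, K = 20, 8, 12, 3
Xs = rng.standard_normal((n, r, c))
Xs -= Xs.mean(axis=0)                 # center: sum_i X_i = 0

F = sum(Xi @ Xi.T for Xi in Xs)       # row-row covariance (r x r)
G = sum(Xi.T @ Xi for Xi in Xs)       # column-column covariance (c x c)

# Keep the K principal eigenvectors of each covariance matrix
_, U = np.linalg.eigh(F)
U = U[:, ::-1][:, :K]
_, V = np.linalg.eigh(G)
V = V[:, ::-1][:, :K]

Ms = np.array([U.T @ Xi @ V for Xi in Xs])         # K x K core matrices
Xs_approx = np.array([U @ Mi @ V.T for Mi in Ms])  # low-rank reconstructions

# Objective J = sum_i |X_i - U M_i V^T|^2
J = sum(np.linalg.norm(Xi - Xa, 'fro')**2 for Xi, Xa in zip(Xs, Xs_approx))
print(f"reconstruction error J = {J:.3f}")
```

For compression, the ${\displaystyle n}$ matrices of ${\displaystyle r\times c}$ entries are replaced by ${\displaystyle n}$ core matrices of ${\displaystyle K\times K}$ entries plus the shared factors ${\displaystyle U}$ (${\displaystyle r\times K}$) and ${\displaystyle V}$ (${\displaystyle c\times K}$).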

## References

- Chris Ding and Jieping Ye. "Two-dimensional Singular Value Decomposition (2DSVD) for 2D Maps and Images". Proc. SIAM Int'l Conf. Data Mining (SDM'05), pp. 32–43, April 2005. http://ranger.uta.edu/~chqding/papers/2dsvdSDM05.pdf
- Jieping Ye. "Generalized Low Rank Approximations of Matrices". Machine Learning Journal. Vol. 61, pp. 167–191, 2005.