Higher-order singular value decomposition

From Wikipedia, the free encyclopedia
Revision as of 01:59, 19 May 2011

In multilinear algebra, there does not exist a general decomposition method for multi-way arrays (or "data-tensors") with all the properties of the matrix singular value decomposition (SVD). A matrix SVD simultaneously computes the orthonormal row/column matrices and a rank-R decomposition. These two properties can be captured by two different decompositions for multi-way arrays: one represents a tensor as a sum of rank-1 tensors (Candecomp–Parafac), while the other computes the orthonormal spaces associated with the different axes (or modes) of a tensor. The latter decomposition is known as Tucker3, N-mode SVD, and N-mode principal component analysis (PCA); it is also used in multilinear subspace learning as multilinear principal component analysis. This terminology was coined by P. Kroonenberg in the 1980s; L. De Lathauwer later renamed the decomposition the multilinear SVD or HOSVD (higher-order SVD).

History

In 1966, L. Tucker proposed a decomposition method for 3-way arrays (inappropriately referred to as 3-mode "tensors") as a multidimensional extension of factor analysis.[1] This decomposition was further developed in the 1980s by P. Kroonenberg, who coined the terms Tucker3, Tucker3ALS (an alternating least squares dimensionality reduction algorithm), 3-Mode SVD, and 3-Mode PCA.[2] In the intervening years, several authors developed the decomposition for N-way arrays. Most recently, this work was treated in an elegant fashion and introduced to the SIAM community by L. De Lathauwer et al., who referred to the decomposition as the N-way SVD, multilinear SVD, and HOSVD.[3]

Definitions

Let the SVD of a real matrix be $A = U \Sigma V^T$; then it can be written in elementwise form as

$$a_{i_1 i_2} = \sum_{j} \sigma_j \, u_{i_1 j} \, v_{i_2 j}.$$

$U$ and $V$ give, in a certain sense optimal, orthonormal bases for the column and row space, and $\Sigma$ is diagonal with decreasing nonnegative elements. The N-mode SVD can be defined as the multidimensional generalization of this concept:

$$\mathcal{A} = \mathcal{S} \times_1 U^{(1)} \times_2 U^{(2)} \cdots \times_N U^{(N)},$$

where the matrices $U^{(n)}$ and the core tensor $\mathcal{S}$ should satisfy certain requirements (similar to those of the matrix SVD), namely

  • Each $U^{(n)}$ is an orthogonal matrix (called the n-mode singular matrix).
  • Two subtensors of the core tensor $\mathcal{S}$ are orthogonal, i.e., $\langle \mathcal{S}_{i_n = a}, \mathcal{S}_{i_n = b} \rangle = 0$ if $a \neq b$.
  • The subtensors of the core tensor are ordered according to their Frobenius norm, i.e. $\|\mathcal{S}_{i_n = 1}\| \geq \|\mathcal{S}_{i_n = 2}\| \geq \cdots \geq \|\mathcal{S}_{i_n = I_n}\| \geq 0$ for n = 1, ..., N.

Notation: $\sigma^{(n)}_i = \|\mathcal{S}_{i_n = i}\|$ denotes the $i$-th $n$-mode singular value of $\mathcal{A}$.
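The elementwise form of the matrix SVD and the decreasing ordering of the singular values can be checked numerically; a minimal sketch in Python with NumPy (the matrix and its size are arbitrary choices for illustration):

```python
import numpy as np

# SVD of a small real matrix: A = U @ diag(s) @ Vt
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Elementwise form: a_ij = sum_k s_k * u_ik * v_jk, with V = Vt.T
A_rebuilt = np.einsum('k,ik,jk->ij', s, U, Vt.T)

print(np.allclose(A, A_rebuilt))   # the two forms agree
print(np.all(s[:-1] >= s[1:]))     # singular values come out in decreasing order
```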

Algorithm

Tensor SVD can be built on the matrix SVD as follows:[4]

Given a tensor $T$ of shape $I_1 \times I_2 \times \cdots \times I_N$, the $i$-th singular matrix $U^{(i)}$ is obtained by unfolding $T$ along its $i$-th mode into the matrix $T_{(i)}$, of shape $I_i \times (I_1 \cdots I_{i-1} I_{i+1} \cdots I_N)$, and taking its left singular vectors, $U^{(i)}$.

The core tensor $\mathcal{S}$ is then obtained by multiplying $T$ along each mode by the transposes of those $U^{(i)}$'s:

$$\mathcal{S} = T \times_1 U^{(1)T} \times_2 U^{(2)T} \cdots \times_N U^{(N)T}.$$
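The unfolding-based procedure above can be sketched in Python with NumPy; the helper names `unfold`, `mode_multiply`, and `hosvd` are illustrative, not from the cited sources:

```python
import numpy as np

def unfold(T, n):
    """Mode-n unfolding: move axis n to the front and flatten the remaining axes."""
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

def mode_multiply(T, M, n):
    """n-mode product: contract mode n of tensor T with the columns of matrix M."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, n, 0), axes=1), 0, n)

def hosvd(T):
    """Left singular vectors of each unfolding, then core via transposed mode products."""
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0] for n in range(T.ndim)]
    S = T
    for n, Un in enumerate(U):
        S = mode_multiply(S, Un.T, n)
    return S, U

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 4, 5))
S, U = hosvd(T)

# Reconstruction: T = S x_1 U(1) x_2 U(2) x_3 U(3)
R = S
for n, Un in enumerate(U):
    R = mode_multiply(R, Un, n)
print(np.allclose(R, T))                 # exact recovery with full-size factors

# All-orthogonality of the core: subtensors along a mode have zero inner product
print(abs(np.sum(S[0] * S[1])) < 1e-8)
```

Note that this computes the full (untruncated) HOSVD; keeping only the leading columns of each $U^{(n)}$ gives the usual truncated, dimensionality-reducing variant.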

Applications

The main application is the extraction of relevant information from multi-way arrays. It is used in factor analysis, face recognition (TensorFaces), human motion analysis and synthesis, and many other signal processing endeavours, such as genomic signal processing.[5][6][7]

It is also used in tensor product model transformation-based controller design. In multilinear subspace learning,[8] it is modified to multilinear principal component analysis[9] for gait recognition.

References

  1. ^ Ledyard R. Tucker (1966). "Some mathematical notes on three-mode factor analysis". Psychometrika. 31 (3): 279–311. doi:10.1007/BF02289464.
  2. ^ P. M. Kroonenberg (1983). Three-mode principal component analysis: Theory and applications. Leiden: DSWO Press.
  3. ^ Lieven De Lathauwer, Bart De Moor and Joos Vandewalle (2000). "A multilinear Singular Value Decomposition" (PDF). SIAM Journal on Matrix Analysis and Applications. 21 (4).
  4. ^ Vasilescu et al. "TensorTextures: Multilinear Image-Based Rendering", SIGGRAPH 2004.
  5. ^ L. Omberg, G. H. Golub and O. Alter (2007). "A Tensor Higher-Order Singular Value Decomposition for Integrative Analysis of DNA Microarray Data From Different Studies". PNAS. 104 (47): 18371–18376. doi:10.1073/pnas.0709146104.
  6. ^ L. Omberg, J. R. Meyerson, K. Kobayashi, L. S. Drury, J. F. X. Diffley and O. Alter (2009). "Global Effects of DNA Replication and DNA Replication Origin Activity on Eukaryotic Gene Expression". Molecular Systems Biology. 5: 312. doi:10.1038/msb.2009.70.
  7. ^ C. Muralidhara, A. M. Gross, R. R. Gutell and O. Alter (2011). "Tensor Decomposition Reveals Concurrent Evolutionary Convergences and Divergences and Correlations with Structural Motifs in Ribosomal RNA". Public Library of Science (PLoS) One. 6 (4): e18768. doi:10.1371/journal.pone.0018768.
  8. ^ Haiping Lu, K.N. Plataniotis and A.N. Venetsanopoulos, "A Survey of Multilinear Subspace Learning for Tensor Data", Pattern Recognition, Vol. 44, No. 7, pp. 1540–1551, Jul. 2011.
  9. ^ H. Lu, K. N. Plataniotis, and A. N. Venetsanopoulos, "MPCA: Multilinear principal component analysis of tensor objects," IEEE Trans. Neural Netw., vol. 19, no. 1, pp. 18–39, Jan. 2008.