FastICA

FastICA is an efficient and popular algorithm for independent component analysis invented by Aapo Hyvärinen at Helsinki University of Technology. The algorithm is based on a fixed-point iteration scheme maximizing non-Gaussianity as a measure of statistical independence. It can also be derived as an approximate Newton iteration.

Algorithm

Preprocess the data

Before the FastICA algorithm can be applied, the input vector data <math>\mathbf{x}</math> should be centered and whitened.

Centering the data

The input data <math>\mathbf{x}</math> is centered by computing the mean of each component of <math>\mathbf{x}</math> and subtracting that mean. This has the effect of making each component have zero mean. Thus:

    <math>\mathbf{x} \leftarrow \mathbf{x} - E\{\mathbf{x}\}</math>
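
In code, centering is a one-line operation. A minimal NumPy sketch (the function name center and the one-sample-per-column layout are illustrative assumptions, not part of the original algorithm statement):

import numpy as np

def center(X):
    # Subtract the mean of each component (row) of X, so that every
    # component of the result has zero mean. X is assumed to hold one
    # N-dimensional sample per column, so the mean is taken across columns.
    return X - X.mean(axis=1, keepdims=True)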

Whitening the data

Whitening the data involves linearly transforming the data so that the new components are uncorrelated and have variance one. If <math>\tilde{\mathbf{x}}</math> is the whitened data, then the covariance matrix of the whitened data is the identity matrix:

    <math>E\{\tilde{\mathbf{x}} \tilde{\mathbf{x}}^T\} = \mathbf{I}</math>

This can be done using the eigenvalue decomposition of the covariance matrix of the data, <math>E\{\mathbf{x}\mathbf{x}^T\} = \mathbf{E}\mathbf{D}\mathbf{E}^T</math>, where <math>\mathbf{E}</math> is the matrix of eigenvectors and <math>\mathbf{D}</math> is the diagonal matrix of eigenvalues. Once the eigenvalue decomposition is done, the whitened data is:

    <math>\tilde{\mathbf{x}} = \mathbf{E}\mathbf{D}^{-1/2}\mathbf{E}^T\mathbf{x}</math>
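
A sketch of this whitening step in NumPy, following the eigenvalue decomposition above (the helper name whiten is hypothetical; it expects already-centered data with one sample per column and assumes the covariance matrix is full rank):

import numpy as np

def whiten(X):
    # Linearly transform centered data X so that the covariance of the
    # result is the identity matrix.
    cov = np.cov(X)                           # N x N sample covariance E{x x^T}
    d, E = np.linalg.eigh(cov)                # eigenvalues d, eigenvector matrix E
    return E @ np.diag(d ** -0.5) @ E.T @ X   # E D^{-1/2} E^T x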

Single Component Extraction

The iterative algorithm finds the direction for the weight vector <math>\mathbf{w}</math> maximizing the non-Gaussianity of the projection <math>\mathbf{w}^T \mathbf{x}</math> for the data <math>\mathbf{x}</math>. The function <math>g(u)</math> is the derivative of a nonquadratic nonlinearity <math>f(u)</math>. Hyvärinen states that good values for <math>f</math> (shown with their derivatives <math>g</math> and second derivatives <math>g'</math>) are:

    <math>f(u) = \log \cosh(u), \quad g(u) = \tanh(u), \quad g'(u) = 1 - \tanh^2(u)</math>
    <math>f(u) = -e^{-u^2/2}, \quad g(u) = u\, e^{-u^2/2}, \quad g'(u) = (1 - u^2)\, e^{-u^2/2}</math>

The first function is a good general-purpose choice, while the second is highly robust.
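
Both nonlinearities translate directly into code. A short NumPy sketch (the names g_logcosh and g_gauss are mine; each returns the pair g(u), g'(u) used by the iteration below):

import numpy as np

def g_logcosh(u):
    # g and g' for f(u) = log cosh(u), the general-purpose choice.
    gu = np.tanh(u)
    return gu, 1.0 - gu ** 2

def g_gauss(u):
    # g and g' for f(u) = -exp(-u^2 / 2), the highly robust choice.
    e = np.exp(-u ** 2 / 2.0)
    return u * e, (1.0 - u ** 2) * e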

  1. Randomize the initial weight vector <math>\mathbf{w}</math>
  2. Let <math>\mathbf{w}^+ = E\{\mathbf{x}\, g(\mathbf{w}^T \mathbf{x})\} - E\{g'(\mathbf{w}^T \mathbf{x})\}\mathbf{w}</math>
  3. Let <math>\mathbf{w} = \mathbf{w}^+ / \|\mathbf{w}^+\|</math>
  4. If not converged, go back to 2
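
Assuming the expectations are approximated by sample means over whitened data with one sample per column, the single-unit iteration might be sketched as follows (the function name and the stopping parameters max_iter and tol are illustrative choices; the log cosh nonlinearity is used):

import numpy as np

def fastica_one_unit(X, max_iter=200, tol=1e-6):
    # One-unit FastICA fixed-point iteration on whitened data X (N x M).
    N, M = X.shape
    w = np.random.randn(N)
    w /= np.linalg.norm(w)
    for _ in range(max_iter):
        u = w @ X                                # projections w^T x, one per sample
        gu, gpu = np.tanh(u), 1.0 - np.tanh(u) ** 2
        w_new = (X @ gu) / M - gpu.mean() * w    # E{x g(w^T x)} - E{g'(w^T x)} w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:      # w and -w span the same direction
            return w_new
        w = w_new
    return w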

Multiple Component Extraction

The single unit iterative algorithm estimates only one of the independent components; to estimate more, the algorithm must be repeated and the projection vectors decorrelated. Although Hyvärinen provides several ways of decorrelating results, the simplest multiple unit algorithm follows. <math>\mathbf{1}</math> indicates a column vector of 1's with dimension M.

Algorithm FastICA

Input: <math>C</math>, the number of desired components
Input: <math>\mathbf{X} \in \mathbb{R}^{N \times M}</math>, a matrix where each column represents an N-dimensional sample, where <math>C \le N</math>
Output: <math>\mathbf{W} \in \mathbb{R}^{C \times N}</math>, an un-mixing matrix where each row projects <math>\mathbf{X}</math> onto an independent component.
Output: <math>\mathbf{S} \in \mathbb{R}^{C \times M}</math>, an independent components matrix, with M columns each representing a sample with C dimensions.

for p in 1 to C:
    <math>\mathbf{w}_p \leftarrow</math> random vector of length N
    while <math>\mathbf{w}_p</math> changes
        <math>\mathbf{w}_p \leftarrow \frac{1}{M} \mathbf{X}\, g(\mathbf{w}_p^T \mathbf{X})^T - \frac{1}{M} g'(\mathbf{w}_p^T \mathbf{X})\, \mathbf{1}\, \mathbf{w}_p</math>
        <math>\mathbf{w}_p \leftarrow \mathbf{w}_p - \sum_{j=1}^{p-1} (\mathbf{w}_p^T \mathbf{w}_j)\, \mathbf{w}_j</math>
        <math>\mathbf{w}_p \leftarrow \mathbf{w}_p / \|\mathbf{w}_p\|</math>

Output: <math>\mathbf{W} = \begin{bmatrix} \mathbf{w}_1^T \\ \vdots \\ \mathbf{w}_C^T \end{bmatrix}</math>
Output: <math>\mathbf{S} = \mathbf{W} \mathbf{X}</math>
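
Putting the preprocessing and the loop above together, an end-to-end NumPy sketch of the multiple component algorithm (all names are illustrative; decorrelation is done by the Gram-Schmidt-style projection from the pseudocode, and the covariance matrix is assumed full rank):

import numpy as np

def fastica(X, C, max_iter=200, tol=1e-6):
    # Center, whiten, then extract C components. X holds one N-dimensional
    # sample per column; returns the un-mixing matrix W (C x N) and the
    # independent components S (C x M).
    N, M = X.shape
    X = X - X.mean(axis=1, keepdims=True)        # centering
    d, E = np.linalg.eigh(np.cov(X))             # whitening via eigendecomposition
    X = E @ np.diag(d ** -0.5) @ E.T @ X
    W = np.zeros((C, N))
    for p in range(C):
        w = np.random.randn(N)
        w /= np.linalg.norm(w)
        for _ in range(max_iter):
            u = w @ X
            gu, gpu = np.tanh(u), 1.0 - np.tanh(u) ** 2
            w_new = (X @ gu) / M - gpu.mean() * w
            w_new -= W[:p].T @ (W[:p] @ w_new)   # decorrelate against w_1 .. w_{p-1}
            w_new /= np.linalg.norm(w_new)
            if abs(abs(w_new @ w) - 1.0) < tol:
                break
            w = w_new
        W[p] = w_new
    return W, W @ X                              # un-mixing matrix W and sources S

As with any ICA method, the recovered components come back in no particular order and with arbitrary sign, so estimated sources should be compared to references only up to permutation and scaling.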
