# Krylov subspace

In linear algebra, the order-r Krylov subspace generated by an n-by-n matrix A and a vector b of dimension n is the linear subspace spanned by the images of b under the first r powers of A (starting from $A^{0}=I$ ), that is,

${\mathcal {K}}_{r}(A,b)=\operatorname {span} \,\{b,Ab,A^{2}b,\ldots ,A^{r-1}b\}.$

## Background

The concept is named after Russian applied mathematician and naval engineer Alexei Krylov, who published a paper about it in 1931.

## Properties

• ${\mathcal {K}}_{r}(A,b),A{\mathcal {K}}_{r}(A,b)\subset {\mathcal {K}}_{r+1}(A,b)$ .
• Vectors $\{b,Ab,A^{2}b,\ldots ,A^{r-1}b\}$ are linearly independent as long as $r\leq r_{0}$, and ${\mathcal {K}}_{r}(A,b)\subseteq {\mathcal {K}}_{r_{0}}(A,b)$ for all $r$; thus $r_{0}$ is the maximal dimension of a Krylov subspace generated by $A$ and $b$.
• For such $r_{0}$ we have $r_{0}\leq 1+\operatorname {rank} A$ and $r_{0}\leq n$; more exactly, $r_{0}\leq \partial p(A)$, where $\partial p(A)$ denotes the degree of the minimal polynomial $p$ of $A$.
• There exists a $b$ such that $r_{0}=\partial p(A)$ .
• ${\mathcal {K}}_{r_{0}}(A,b)$ is a cyclic submodule generated by $b$ of the torsion $k[x]$-module $(k^{n})^{A}$, where $(k^{n})^{A}$ denotes the linear space $k^{n}$ made into a $k[x]$-module by letting $x$ act as $A$.
• $k^{n}$ can be decomposed as the direct sum of Krylov subspaces.
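The bounds above can be checked numerically: the dimension of ${\mathcal {K}}_{r}(A,b)$ grows by one with each power of $A$ until it saturates at $r_{0}$, which is bounded by the degree of the minimal polynomial. A minimal sketch, using a small hypothetical matrix whose minimal polynomial $(x-2)^{2}(x-3)$ has degree 3:

```python
import numpy as np

# Hypothetical 4x4 example: one 2x2 Jordan block for eigenvalue 2, a second
# copy of eigenvalue 2, and eigenvalue 3, so the minimal polynomial is
# (x - 2)^2 (x - 3), of degree 3.
A = np.array([[2., 1., 0., 0.],
              [0., 2., 0., 0.],
              [0., 0., 2., 0.],
              [0., 0., 0., 3.]])
b = np.array([1., 1., 1., 1.])

def krylov_dim(A, b, r):
    """Rank of the Krylov matrix [b, Ab, ..., A^{r-1} b], i.e. dim K_r(A, b)."""
    cols = [b]
    for _ in range(r - 1):
        cols.append(A @ cols[-1])
    return np.linalg.matrix_rank(np.column_stack(cols))

dims = [krylov_dim(A, b, r) for r in range(1, 6)]
print(dims)  # the dimension saturates at r0 = 3 = deg p(A)
```

For this choice of $b$ the maximal dimension $r_{0}=3$ is attained, matching the bound $r_{0}\leq \partial p(A)$.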

## Use

Krylov subspaces are used in algorithms for finding approximate solutions to high-dimensional linear algebra problems.

Modern iterative methods for finding one (or a few) eigenvalues of large sparse matrices, or for solving large systems of linear equations, avoid matrix-matrix operations; instead, they multiply vectors by the matrix and work with the resulting vectors. Starting with a vector $b$, one computes $Ab$, then multiplies that vector by $A$ to find $A^{2}b$, and so on. All algorithms that work this way are referred to as Krylov subspace methods; they are among the most successful methods currently available in numerical linear algebra.
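The process just described can be sketched as follows. Only a matrix-vector product routine is needed; the dense random matrix here is a hypothetical stand-in for the large sparse operator used in practice:

```python
import numpy as np

def krylov_sequence(matvec, b, r):
    """Return the Krylov vectors [b, Ab, ..., A^{r-1} b] using r - 1 calls
    to matvec, never forming any matrix-matrix product."""
    vecs = [b]
    for _ in range(r - 1):
        vecs.append(matvec(vecs[-1]))
    return vecs

# Hypothetical small test problem; in applications A is large and sparse,
# and matvec would exploit that sparsity.
rng = np.random.default_rng(0)
n, r = 6, 4
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

V = krylov_sequence(lambda x: A @ x, b, r)
```

The r vectors in `V` span ${\mathcal {K}}_{r}(A,b)$ (when they are linearly independent), which is all a Krylov subspace method ever touches of the operator $A$.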

## Issues

Because the vectors usually soon become almost linearly dependent due to the properties of power iteration, methods relying on Krylov subspaces frequently involve some orthogonalization scheme, such as Lanczos iteration for Hermitian matrices or Arnoldi iteration for more general matrices.
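A minimal sketch of Arnoldi iteration with modified Gram-Schmidt orthogonalization illustrates the idea; the function name and test matrix are illustrative, not from any particular library:

```python
import numpy as np

def arnoldi(matvec, b, r):
    """Arnoldi iteration: build an orthonormal basis Q of K_{r+1}(A, b) and
    an (r+1) x r upper Hessenberg matrix H satisfying A Q[:, :r] = Q @ H."""
    n = b.shape[0]
    Q = np.zeros((n, r + 1))
    H = np.zeros((r + 1, r))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(r):
        w = matvec(Q[:, j])
        for i in range(j + 1):           # modified Gram-Schmidt sweep
            H[i, j] = Q[:, i] @ w
            w = w - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:          # breakdown: invariant subspace found
            return Q[:, :j + 1], H[:j + 1, :j + 1]
        Q[:, j + 1] = w / H[j + 1, j]
    return Q, H

# Hypothetical test problem.
rng = np.random.default_rng(1)
n, r = 8, 4
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)
Q, H = arnoldi(lambda x: A @ x, b, r)
```

Unlike the raw power sequence, the columns of $Q$ stay orthonormal, so they remain a numerically usable basis of the Krylov subspace; the Lanczos iteration is the special case where $A$ is Hermitian and $H$ collapses to a tridiagonal matrix.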

## Existing methods

The best known Krylov subspace methods are the Arnoldi, Lanczos, Conjugate gradient, IDR(s) (Induced dimension reduction), GMRES (generalized minimum residual), BiCGSTAB (biconjugate gradient stabilized), QMR (quasi minimal residual), TFQMR (transpose-free QMR), and MINRES (minimal residual) methods.
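As one concrete instance of these methods, the conjugate gradient iteration for a symmetric positive definite system $Ax=b$ can be sketched as below; its $k$-th iterate lies in ${\mathcal {K}}_{k}(A,b)$. This is a textbook sketch, not a production implementation (real solvers add preconditioning and more careful stopping criteria):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, maxiter=100):
    """Plain conjugate gradient for symmetric positive definite A.
    After k steps the iterate x lies in the Krylov subspace K_k(A, b)."""
    x = np.zeros_like(b)
    r = b.copy()                  # residual b - A x for x = 0
    p = r.copy()                  # search direction
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p                # the only use of A: one matvec per step
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Hypothetical test problem: a random SPD matrix by construction.
rng = np.random.default_rng(2)
M = rng.standard_normal((10, 10))
A = M @ M.T + 10 * np.eye(10)
b = rng.standard_normal(10)
x = conjugate_gradient(A, b)
```

The other methods in the list follow the same pattern of one matrix-vector product per iteration, differing in which vector from the growing Krylov subspace they select as the approximate solution.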