Biconjugate gradient method
- Choose initial guess $x_0$, two other vectors $x_0^*$ and $b^*$ and a preconditioner $M$
- $r_0 \leftarrow b - A x_0$
- $r_0^* \leftarrow b^* - x_0^* A$
- $p_0 \leftarrow M^{-1} r_0$
- $p_0^* \leftarrow r_0^* M^{-1}$
- for $k = 0, 1, 2, \ldots$ do
  - $\alpha_k \leftarrow \dfrac{r_k^* M^{-1} r_k}{p_k^* A p_k}$
  - $x_{k+1} \leftarrow x_k + \alpha_k \cdot p_k$
  - $x_{k+1}^* \leftarrow x_k^* + \overline{\alpha_k} \cdot p_k^*$
  - $r_{k+1} \leftarrow r_k - \alpha_k \cdot A p_k$
  - $r_{k+1}^* \leftarrow r_k^* - \overline{\alpha_k} \cdot p_k^* A$
  - $\beta_k \leftarrow \dfrac{r_{k+1}^* M^{-1} r_{k+1}}{r_k^* M^{-1} r_k}$
  - $p_{k+1} \leftarrow M^{-1} r_{k+1} + \beta_k \cdot p_k$
  - $p_{k+1}^* \leftarrow r_{k+1}^* M^{-1} + \overline{\beta_k} \cdot p_k^*$
In the above formulation, the computed $r_k$ and $r_k^*$ satisfy
$$r_k = b - A x_k, \qquad r_k^* = b^* - x_k^* A$$
and thus are the respective residuals corresponding to $x_k$ and $x_k^*$, as approximate solutions to the systems
$$A x = b, \qquad x^* A = b^*;$$
here $x^*$ denotes the adjoint, and $\overline{\alpha}$ the complex conjugate.
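The recurrences above translate directly into a dense NumPy sketch. The function name `bicg`, the explicit inverse of $M$, and the termination test are illustrative choices, not part of the method; real arithmetic is assumed, so the conjugates $\overline{\alpha_k}$ and $\overline{\beta_k}$ reduce to $\alpha_k$ and $\beta_k$:

```python
import numpy as np

def bicg(A, b, b_star, x0, x0_star, M, max_iter=50, tol=1e-10):
    """Sketch of preconditioned BiCG for A x = b and the row system x* A = b*.

    Real arithmetic is assumed; 1-D arrays stand in for both column vectors
    (x, r, p) and row vectors (x*, r*, p*).
    """
    Minv = np.linalg.inv(M)          # explicit inverse only for illustration
    x, xs = x0.astype(float), x0_star.astype(float)
    r = b - A @ x                    # residual of A x = b
    rs = b_star - xs @ A             # residual of x* A = b*
    p, ps = Minv @ r, rs @ Minv
    for _ in range(max_iter):
        alpha = (rs @ Minv @ r) / (ps @ A @ p)
        x = x + alpha * p
        xs = xs + alpha * ps         # real case: conj(alpha) == alpha
        r_next = r - alpha * (A @ p)
        rs_next = rs - alpha * (ps @ A)
        if np.linalg.norm(r_next) < tol and np.linalg.norm(rs_next) < tol:
            return x, xs
        beta = (rs_next @ Minv @ r_next) / (rs @ Minv @ r)
        p = Minv @ r_next + beta * p
        ps = rs_next @ Minv + beta * ps
        r, rs = r_next, rs_next
    return x, xs
```

Note that the loop divides by $r_k^* M^{-1} r_k$ and $p_k^* A p_k$; if either quantity vanishes the iteration breaks down, which is the instability discussed below.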
Unpreconditioned version of the algorithm
- Choose initial guess $x_0$, two other vectors $\hat{x}_0$ and $\hat{b}$
- $r_0 \leftarrow b - A x_0$
- $\hat{r}_0 \leftarrow \hat{b} - \hat{x}_0 A$
- $p_0 \leftarrow r_0$
- $\hat{p}_0 \leftarrow \hat{r}_0$
- for $k = 0, 1, 2, \ldots$ do
  - $\alpha_k \leftarrow \dfrac{\hat{r}_k r_k}{\hat{p}_k A p_k}$
  - $x_{k+1} \leftarrow x_k + \alpha_k \cdot p_k$
  - $\hat{x}_{k+1} \leftarrow \hat{x}_k + \alpha_k \cdot \hat{p}_k$
  - $r_{k+1} \leftarrow r_k - \alpha_k \cdot A p_k$
  - $\hat{r}_{k+1} \leftarrow \hat{r}_k - \alpha_k \cdot \hat{p}_k A$
  - $\beta_k \leftarrow \dfrac{\hat{r}_{k+1} r_{k+1}}{\hat{r}_k r_k}$
  - $p_{k+1} \leftarrow r_{k+1} + \beta_k \cdot p_k$
  - $\hat{p}_{k+1} \leftarrow \hat{r}_{k+1} + \beta_k \cdot \hat{p}_k$
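When only $Ax = b$ is wanted, the hatted quantities serve purely as a shadow sequence. Below is a hedged NumPy sketch of this unpreconditioned version (the $M = I$ case), using the common practical choice $\hat{r}_0 = r_0$; the helper name `bicg_unprec` is made up for this example:

```python
import numpy as np

def bicg_unprec(A, b, x0, max_iter=100, tol=1e-10):
    """Unpreconditioned BiCG sketch (the M = I case of the method above).

    1-D arrays serve as both column vectors (r, p) and row vectors
    (r_hat, p_hat); real arithmetic is assumed.
    """
    x = x0.astype(float)
    r = b - A @ x
    r_hat = r.copy()                 # common choice for the shadow residual
    p, p_hat = r.copy(), r_hat.copy()
    rho = r_hat @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rho / (p_hat @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        r_hat = r_hat - alpha * (p_hat @ A)   # row-vector shadow update
        if np.linalg.norm(r) < tol:
            break
        rho_new = r_hat @ r
        beta = rho_new / rho
        p = r + beta * p
        p_hat = r_hat + beta * p_hat
        rho = rho_new
    return x
```

In exact arithmetic the iteration terminates in at most $n$ steps for an $n \times n$ system, barring breakdown of the scalar divisions.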
The biconjugate gradient method is numerically unstable (compare to the biconjugate gradient stabilized method), but very important from a theoretical point of view. Define the iteration steps by
$$x_k := x_j + P_k A^{-1} \left(b - A x_j\right), \qquad x_k^* := x_j^* + \left(b^* - x_j^* A\right) P_k A^{-1},$$
where $j < k$, using the related projection
$$P_k := \mathbf{u}_k \left(\mathbf{v}_k^* A \mathbf{u}_k\right)^{-1} \mathbf{v}_k^* A,$$
with $\mathbf{u}_k = \left[u_0, u_1, \dots, u_{k-1}\right]$ and $\mathbf{v}_k = \left[v_0, v_1, \dots, v_{k-1}\right]$. These related projections may be iterated themselves as
$$P_{k+1} = P_k + \left(1 - P_k\right) u_k \otimes \frac{v_k^* A \left(1 - P_k\right)}{v_k^* A \left(1 - P_k\right) u_k}.$$
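This bordering identity can be spot-checked numerically: for generic (here randomly chosen, purely illustrative) bases and a new column pair $(u, v)$, the rank-one update of $P_k$ must agree with the projection built directly from the enlarged bases:

```python
import numpy as np

# Check the projection recursion: starting from P_k built out of k = 2 basis
# columns, the rank-one bordered update with a new pair (u, v) should
# reproduce P_{k+1} built directly from the enlarged bases.
rng = np.random.default_rng(1)
n, k = 5, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)   # shift for safe invertibility
U, V = rng.standard_normal((n, k)), rng.standard_normal((n, k))
u, v = rng.standard_normal(n), rng.standard_normal(n)

def proj(U, V):
    # P = u_k (v_k^* A u_k)^{-1} v_k^* A, with columns stacked in U and V
    return U @ np.linalg.solve(V.T @ A @ U, V.T @ A)

P = proj(U, V)
I = np.eye(n)
den = v @ A @ (I - P) @ u
P_next = P + np.outer((I - P) @ u, v @ A @ (I - P)) / den   # iterated form
P_direct = proj(np.column_stack([U, u]), np.column_stack([V, v]))

err = np.abs(P_next - P_direct).max()
print(err)   # expected to be rounding-level small
```

The agreement follows from the Schur-complement formula for the bordered matrix $\mathbf{v}_{k+1}^* A \mathbf{u}_{k+1}$; the denominator $v_k^* A (1 - P_k) u_k$ is exactly that Schur complement.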
A relation to Quasi-Newton methods is given by $P_k = A_k^{-1} A$ and $x_{k+1} = x_k - A_{k+1}^{-1} \left(A x_k - b\right)$, where
$$A_{k+1}^{-1} = A_k^{-1} + \left(1 - A_k^{-1} A\right) u_k \otimes \frac{v_k^* \left(1 - A A_k^{-1}\right)}{v_k^* A \left(1 - A_k^{-1} A\right) u_k}.$$
The new directions
$$p_k = \left(1 - P_k\right) u_k, \qquad p_k^* = v_k^* A \left(1 - P_k\right) A^{-1}$$
are then orthogonal to the residuals:
$$v_i^* r_k = p_i^* r_k = 0, \qquad r_k^* u_j = r_k^* p_j = 0,$$
which themselves satisfy
$$r_k = A \left(1 - P_k\right) A^{-1} r_j, \qquad r_k^* = r_j^* \left(1 - P_k\right),$$
where $i, j < k$.
The biconjugate gradient method now makes a special choice and uses the setting
$$u_k = M^{-1} r_k, \qquad v_k^* = r_k^* M^{-1}.$$
With this particular choice, explicit evaluations of $P_k$ and $A^{-1}$ are avoided, and the algorithm takes the form stated above.
- If $A = A^*$ is self-adjoint, $\hat{b} = b$ and $\hat{x}_0 = x_0$, then $r_k = \hat{r}_k$, $p_k = \hat{p}_k$, and the conjugate gradient method produces the same sequence $x_k = \hat{x}_k$ at half the computational cost.
- The sequences produced by the algorithm are biorthogonal, i.e., $p_i^* A p_j = r_i^* M^{-1} r_j = 0$ for $i \neq j$.
- If $P_{j'}$ is a polynomial with $\deg\left(P_{j'}\right) + j < k$, then $r_k^* P_{j'}\left(M^{-1} A\right) u_j = 0$. The algorithm thus produces projections onto the Krylov subspace.
- If $P_{i'}$ is a polynomial with $i + \deg\left(P_{i'}\right) < k$, then $v_i^* P_{i'}\left(A M^{-1}\right) r_k = 0$.
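The biorthogonality property can be verified numerically with the unpreconditioned recurrences ($M = I$). Everything specific here, the random well-conditioned test matrix and the choice $\hat{r}_0 = r_0$, is an illustrative assumption:

```python
import numpy as np

# Spot-check biorthogonality of the BiCG sequences (unpreconditioned, M = I):
# p_hat_i A p_j and r_hat_i r_j should vanish for i != j, up to rounding.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned test matrix
b = rng.standard_normal(n)

x = np.zeros(n)
r = b - A @ x
r_hat = r.copy()                                  # illustrative choice r_hat_0 = r_0
p, p_hat = r.copy(), r_hat.copy()
R, R_hat, P, P_hat = [r.copy()], [r_hat.copy()], [p.copy()], [p_hat.copy()]
for _ in range(n - 1):
    Ap = A @ p
    alpha = (r_hat @ r) / (p_hat @ Ap)
    x = x + alpha * p
    r_new = r - alpha * Ap
    r_hat_new = r_hat - alpha * (p_hat @ A)       # row-vector shadow update
    beta = (r_hat_new @ r_new) / (r_hat @ r)
    p = r_new + beta * p
    p_hat = r_hat_new + beta * p_hat
    r, r_hat = r_new, r_hat_new
    R.append(r.copy()); R_hat.append(r_hat.copy())
    P.append(p.copy()); P_hat.append(p_hat.copy())

max_cross_p = max(abs(P_hat[i] @ A @ P[j]) for i in range(n) for j in range(n) if i != j)
max_cross_r = max(abs(R_hat[i] @ R[j]) for i in range(n) for j in range(n) if i != j)
print(max_cross_p, max_cross_r)
```

Over many iterations these cross terms drift away from zero in finite precision, which is one symptom of the instability noted above.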