# Sherman–Morrison formula


In mathematics, in particular linear algebra, the Sherman–Morrison formula, named after Jack Sherman and Winifred J. Morrison, computes the inverse of the sum of an invertible matrix $A$ and the outer product, $uv^{T}$ , of vectors $u$ and $v$ . The Sherman–Morrison formula is a special case of the Woodbury formula. Though named after Sherman and Morrison, it had already appeared in earlier publications.

## Statement

Suppose $A\in \mathbb {R} ^{n\times n}$ is an invertible square matrix and $u$ , $v\in \mathbb {R} ^{n}$ are column vectors. Then $A+uv^{T}$ is invertible if and only if $1+v^{T}A^{-1}u\neq 0$ . In this case,

$(A+uv^{T})^{-1}=A^{-1}-{A^{-1}uv^{T}A^{-1} \over 1+v^{T}A^{-1}u}.$

Here, $uv^{T}$ is the outer product of the two vectors $u$ and $v$ . The general form shown here is the one published by Bartlett.
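The statement is easy to check numerically. A minimal sketch using NumPy (the matrix, vectors, and random seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Shift by n*I so the random matrix is well-conditioned and invertible.
A = rng.standard_normal((n, n)) + n * np.eye(n)
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

A_inv = np.linalg.inv(A)
denom = 1.0 + (v.T @ A_inv @ u).item()
assert denom != 0  # the invertibility condition of the statement

# Sherman–Morrison update of the inverse
sm_inv = A_inv - (A_inv @ u @ v.T @ A_inv) / denom

# Direct inverse of the rank-1-updated matrix, for comparison
direct_inv = np.linalg.inv(A + u @ v.T)

print(np.allclose(sm_inv, direct_inv))  # True
```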

## Proof

($\Leftarrow$ ) To prove the backward direction ($1+v^{T}A^{-1}u\neq 0\Rightarrow A+uv^{T}$ is invertible with the inverse given above), we verify the defining property of the inverse. A square matrix $Y$ (in this case the right-hand side of the Sherman–Morrison formula) is the inverse of a square matrix $X$ (in this case $A+uv^{T}$ ) if and only if $XY=I$ .

We first verify that the right hand side ($Y$ ) satisfies $XY=I$ .

${\begin{aligned}XY&=(A+uv^{T})\left(A^{-1}-{A^{-1}uv^{T}A^{-1} \over 1+v^{T}A^{-1}u}\right)\\[6pt]&=AA^{-1}+uv^{T}A^{-1}-{AA^{-1}uv^{T}A^{-1}+uv^{T}A^{-1}uv^{T}A^{-1} \over 1+v^{T}A^{-1}u}\\[6pt]&=I+uv^{T}A^{-1}-{uv^{T}A^{-1}+uv^{T}A^{-1}uv^{T}A^{-1} \over 1+v^{T}A^{-1}u}\\[6pt]&=I+uv^{T}A^{-1}-{u(1+v^{T}A^{-1}u)v^{T}A^{-1} \over 1+v^{T}A^{-1}u}\\[6pt]&=I+uv^{T}A^{-1}-uv^{T}A^{-1}&&1+v^{T}A^{-1}u{\text{ is a scalar}}\\[6pt]&=I\end{aligned}}$

($\Rightarrow$ ) Conversely, if $1+v^{T}A^{-1}u=0$ , let $x=A^{-1}u$ . Then $(A+uv^{T})x=u+u(v^{T}A^{-1}u)=u(1+v^{T}A^{-1}u)=0$ . Moreover, $x\neq 0$ : since $v^{T}A^{-1}u=-1$ , the vector $u$ , and hence $x=A^{-1}u$ , cannot be zero. Thus $A+uv^{T}$ has a non-trivial kernel and is therefore not invertible.
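The forward direction can be illustrated concretely. A minimal sketch with $A=I$ and $u$, $v$ chosen so that $1+v^{T}A^{-1}u=0$ (the specific choice $v=-u=-e_{1}$ is ours, for illustration only):

```python
import numpy as np

n = 3
A = np.eye(n)
u = np.zeros((n, 1))
u[0, 0] = 1.0
v = -u.copy()                  # then 1 + v.T @ u = 1 - 1 = 0

x = np.linalg.solve(A, u)      # x = A^{-1}u, the kernel vector from the proof
B = A + u @ v.T                # B = I - e1 e1^T, visibly singular

print(np.linalg.det(B))        # 0.0
print(np.linalg.norm(B @ x))   # 0.0  (x lies in the kernel of B)
```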

## Application

If the inverse of $A$ is already known, the formula provides a numerically cheap way to compute the inverse of $A$ corrected by the matrix $uv^{T}$ (depending on the point of view, the correction may be seen as a perturbation or as a rank-1 update). The computation is relatively cheap because the inverse of $A+uv^{T}$ does not have to be computed from scratch (which in general is expensive), but can be computed by correcting (or perturbing) $A^{-1}$ .

Using unit columns (columns from the identity matrix) for $u$ or $v$ , individual columns or rows of $A$ may be manipulated and a correspondingly updated inverse computed relatively cheaply in this way. In the general case, where $A^{-1}$ is an $n\times n$ matrix and $u$ and $v$ are arbitrary vectors of dimension $n$ , the whole matrix is updated and the computation takes $3n^{2}$ scalar multiplications. If $u$ is a unit column, the computation takes only $2n^{2}$ scalar multiplications; the same holds if $v$ is a unit column. If both $u$ and $v$ are unit columns, the computation takes only $n^{2}$ scalar multiplications.
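The $3n^{2}$ operation count is visible in a direct implementation: one matrix–vector product from each side and one outer product. A sketch (the function name `rank1_inverse_update` is ours, not from any library):

```python
import numpy as np

def rank1_inverse_update(A_inv, u, v):
    """Return (A + u v^T)^{-1} given A^{-1} and 1-D vectors u, v."""
    Au = A_inv @ u                      # n^2 multiplications
    vA = v @ A_inv                      # n^2 multiplications
    denom = 1.0 + v @ Au
    if np.isclose(denom, 0.0):
        raise np.linalg.LinAlgError("A + uv^T is singular")
    return A_inv - np.outer(Au, vA) / denom   # n^2 multiplications

# Usage: compare against inverting the updated matrix from scratch.
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5)) + 5 * np.eye(5)
u, v = rng.standard_normal(5), rng.standard_normal(5)
updated = rank1_inverse_update(np.linalg.inv(A), u, v)
print(np.allclose(updated, np.linalg.inv(A + np.outer(u, v))))  # True
```

If $u$ (or $v$) is a unit column, the corresponding product simply selects a column (or row) of $A^{-1}$, which is where the savings down to $2n^{2}$ or $n^{2}$ multiplications come from.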

This formula also has applications in theoretical physics. In quantum field theory, it is used to calculate the propagator of a spin-1 field. The inverse propagator (as it appears in the Lagrangian) has the form $(A+uv^{T})$ . One uses the Sherman–Morrison formula to calculate the inverse (satisfying certain time-ordering boundary conditions) of the inverse propagator, that is, the (Feynman) propagator, which is needed to perform any perturbative calculation involving the spin-1 field.

## Alternative verification

The following is an alternative verification of the Sherman–Morrison formula using the easily verifiable identity

$(I+wv^{T})^{-1}=I-{\frac {wv^{T}}{1+v^{T}w}}$ .

Let

$u=Aw,\quad {\text{and}}\quad A+uv^{T}=A\left(I+wv^{T}\right),$ then

$(A+uv^{T})^{-1}=(I+wv^{T})^{-1}{A^{-1}}=\left(I-{\frac {wv^{T}}{1+v^{T}w}}\right)A^{-1}$ .

Substituting $w={{A}^{-1}}u$ gives

$(A+uv^{T})^{-1}=\left(I-{\frac {A^{-1}uv^{T}}{1+v^{T}A^{-1}u}}\right)A^{-1}={A^{-1}}-{\frac {A^{-1}uv^{T}A^{-1}}{1+v^{T}A^{-1}u}}.$

## Generalization (Woodbury matrix identity)

Given a square invertible $n\times n$ matrix $A$ , an $n\times k$ matrix $U$ , and a $k\times n$ matrix $V$ , let $B$ be an $n\times n$ matrix such that $B=A+UV$ . Then, assuming $\left(I_{k}+VA^{-1}U\right)$ is invertible, we have

$B^{-1}=A^{-1}-A^{-1}U\left(I_{k}+VA^{-1}U\right)^{-1}VA^{-1}.$ 
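The generalization can also be checked numerically. A minimal sketch for a rank-$k$ update with $k=2$ (dimensions and random seed are arbitrary choices); note that only a $k\times k$ matrix is inverted in the update:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 5, 2
# Shift by n*I so the random matrix is well-conditioned and invertible.
A = rng.standard_normal((n, n)) + n * np.eye(n)
U = rng.standard_normal((n, k))
V = rng.standard_normal((k, n))

A_inv = np.linalg.inv(A)
# Invert only the small k-by-k matrix I_k + V A^{-1} U.
middle = np.linalg.inv(np.eye(k) + V @ A_inv @ U)
B_inv = A_inv - A_inv @ U @ middle @ V @ A_inv

print(np.allclose(B_inv, np.linalg.inv(A + U @ V)))  # True
```

For $k=1$, with $U=u$ and $V=v^{T}$, the $k\times k$ inverse reduces to division by the scalar $1+v^{T}A^{-1}u$, recovering the Sherman–Morrison formula.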