# Outer product

For "outer product" in geometric algebra, see Exterior algebra.

In linear algebra, the term outer product typically refers to the tensor product of two vectors. The result of applying the outer product to a pair of coordinate vectors is a matrix. The name contrasts with the inner product, which takes as input a pair of vectors and produces a scalar.

The outer product of vectors can also be regarded as a special case of the Kronecker product of matrices.

Some authors use the expression "outer product of tensors" as a synonym of "tensor product". The outer product is also a higher-order function in some computer programming languages such as R, APL, and Mathematica.

## Definition (matrix multiplication)

Main article: matrix multiplication

The outer product $\mathbf{u} \otimes \mathbf{v}$ is equivalent to a matrix multiplication $\mathbf{u} \mathbf{v}^\mathrm{T}$, provided that u is represented as an m × 1 column vector and v as an n × 1 column vector (which makes $\mathbf{v}^\mathrm{T}$ a row vector).[1] For instance, if m = 4 and n = 3, then

$\mathbf{u} \otimes \mathbf{v} = \mathbf{u} \mathbf{v}^\mathrm{T} = \begin{bmatrix}u_1 \\ u_2 \\ u_3 \\ u_4\end{bmatrix} \begin{bmatrix}v_1 & v_2 & v_3\end{bmatrix} = \begin{bmatrix}u_1v_1 & u_1v_2 & u_1v_3 \\ u_2v_1 & u_2v_2 & u_2v_3 \\ u_3v_1 & u_3v_2 & u_3v_3 \\ u_4v_1 & u_4v_2 & u_4v_3\end{bmatrix}.$

Or in index notation:

$(\mathbf{u} \mathbf{v}^\mathrm{T})_{ij}=u_iv_j$
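The 4 × 3 example above can be checked numerically; the following sketch uses NumPy (purely for illustration, with arbitrary example entries) to form the outer product both as an explicit matrix product and via `np.outer`:

```python
import numpy as np

u = np.array([1, 2, 3, 4])   # u as an m = 4 vector
v = np.array([5, 6, 7])      # v as an n = 3 vector

# Matrix-multiplication form: a (4, 1) column times a (1, 3) row
A = u.reshape(-1, 1) @ v.reshape(1, -1)

# np.outer fills in the same entries (u v^T)_ij = u_i v_j
assert np.array_equal(A, np.outer(u, v))
assert A.shape == (4, 3)
```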

For complex vectors, it is customary to use the conjugate transpose of v (denoted $\mathbf{v}^\mathrm{H}$):

$\mathbf{u} \otimes \mathbf{v} = \mathbf{u} \mathbf{v}^\mathrm{H}.$

### Contrast with inner product

If m = n, then one can take the matrix product the other way, yielding a scalar (or 1 × 1 matrix):

$\left\langle \mathbf{u}, \mathbf{v}\right\rangle = \mathbf{u}^\mathrm{T} \mathbf{v}$

which is the standard inner product for Euclidean vector spaces, better known as the dot product. The inner product is the trace of the outer product.
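The trace relation can be verified directly; a small NumPy sketch (illustrative, with arbitrary example vectors):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

inner = u @ v            # u^T v, a scalar
outer = np.outer(u, v)   # u v^T, a 3 x 3 matrix

# tr(u v^T) = sum_i u_i v_i = <u, v>
assert np.isclose(np.trace(outer), inner)
```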

### Rank of an outer product

If u and v are both nonzero then the outer product matrix $\mathbf{u} \mathbf{v}^\mathrm{T}$ always has matrix rank 1, as can be easily seen by multiplying it with a vector x:

$(\mathbf{u} \mathbf{v}^\mathrm{T}) \mathbf{x} = \mathbf{u} (\mathbf{v}^\mathrm{T} \mathbf{x})$

which is just a scalar $\mathbf{v}^\mathrm{T} \mathbf{x}$ multiplied by a vector u.
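Both facts, the rank-1 property and the collapse of $(\mathbf{u} \mathbf{v}^\mathrm{T}) \mathbf{x}$ to a scaled copy of u, can be illustrated with NumPy (example values are arbitrary):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0, 4.0])
v = np.array([5.0, 6.0, 7.0])
A = np.outer(u, v)   # the 4 x 3 matrix u v^T

# Every column of A is a multiple of u, so the rank is 1
assert np.linalg.matrix_rank(A) == 1

# (u v^T) x = (v^T x) u: a scalar times u
x = np.array([1.0, -1.0, 2.0])
assert np.allclose(A @ x, (v @ x) * u)
```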

("Matrix rank" should not be confused with "tensor order", or "tensor degree", which is sometimes referred to as "rank".)

## Definition (vectors and tensors)

### Vector multiplication

Given the vectors

$\mathbf{u} =(u_1, u_2, \dots, u_m)$
$\mathbf{v} = (v_1, v_2, \dots, v_n)$

their outer product $\mathbf{u} \otimes \mathbf{v}$ is defined as the m × n matrix A obtained by multiplying each element of u by each element of v:[2][3]

$\mathbf{u} \otimes \mathbf{v} = \mathbf{A} = \begin{bmatrix}u_1v_1 & u_1v_2 & \dots & u_1v_n \\ u_2v_1 & u_2v_2 & \dots & u_2v_n \\ \vdots & \vdots & \ddots & \vdots\\ u_mv_1 & u_mv_2 & \dots & u_mv_n \end{bmatrix}.$

For complex vectors, the outer product can be defined as above, or with the complex conjugate of v (denoted $\overline{\mathbf{v}}$ or $\mathbf{v}^*$). Namely, the matrix A is obtained by multiplying each element of u by the complex conjugate of each element of v.
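One practical caveat, shown here as a NumPy sketch (example entries are arbitrary): `np.outer` multiplies entries without conjugating, so the conjugate must be applied explicitly to obtain $\mathbf{u} \mathbf{v}^\mathrm{H}$:

```python
import numpy as np

u = np.array([1 + 2j, 3 - 1j])
v = np.array([2 - 1j, 1j])

# A_ij = u_i * conj(v_j); np.outer alone would skip the conjugate
A = np.outer(u, np.conj(v))

# Same matrix as the product of the column u with the row v^H
assert np.allclose(A, u.reshape(-1, 1) @ v.conj().reshape(1, -1))
```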

### Tensor multiplication

The outer product on tensors is typically referred to as the tensor product. Given a tensor a of order q with dimensions (i1, ..., iq), and a tensor b of order r with dimensions (j1, ..., jr), their outer product c is a tensor of order q + r with dimensions (k1, ..., kq+r), which are the i dimensions followed by the j dimensions. It is denoted in coordinate-free notation using ⊗, and its components are defined in index notation by:[4]

$\mathbf{c}=\mathbf{a}\otimes\mathbf{b}, \quad c_{ij}=a_ib_j$

similarly for higher order tensors:

$\mathbf{T}=\mathbf{a}\otimes\mathbf{b}\otimes\mathbf{c}, \quad T_{ijk}=a_ib_jc_k$

For example, if A is of order 3 with dimensions (3, 5, 7) and B is of order 2 with dimensions (10, 100), their outer product C is of order 5 with dimensions (3, 5, 7, 10, 100). If A has a component A[2, 2, 4] = 11 and B has a component B[8, 88] = 13, then the component of C formed by the outer product is C[2, 2, 4, 8, 88] = 11 × 13 = 143.
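The worked example can be reproduced with `np.tensordot`, where `axes=0` requests a pure tensor (outer) product with no contraction (a sketch; the tensors are zero except for the two components from the text):

```python
import numpy as np

A = np.zeros((3, 5, 7))     # order-3 tensor
B = np.zeros((10, 100))     # order-2 tensor

# The 1-based components A[2, 2, 4] = 11 and B[8, 88] = 13
# from the text, written with 0-based NumPy indices
A[1, 1, 3] = 11.0
B[7, 87] = 13.0

C = np.tensordot(A, B, axes=0)   # outer product, no contraction
assert C.shape == (3, 5, 7, 10, 100)
assert np.isclose(C[1, 1, 3, 7, 87], 143.0)
```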

To understand the matrix definition of outer product in terms of the definition of tensor product:

1. The vector u can be interpreted as an order-1 tensor with dimension m, and the vector v as an order-1 tensor with dimension n. The result is an order-2 tensor with dimension (m, n).
2. The order of the result of an inner product between two tensors of order q and r is the greater of q + r − 2 and 0. Thus, the inner product of two matrices has the same order as the outer product (or tensor product) of two vectors.
3. It is possible to add arbitrarily many leading or trailing 1 dimensions to a tensor without fundamentally altering its structure. These 1 dimensions would alter the character of operations on these tensors, so any resulting equivalences should be expressed explicitly.
4. The inner product of two matrices V with dimensions (d, e) and U with dimensions (e, f) is $\sum_{j = 1}^e V_{ij} U_{jk}$, where i = 1, 2, ..., d and k = 1, 2, ..., f. For the case where e = 1, the summation is trivial (involving only a single term).
5. The outer product of two matrices V with dimensions (m, n) and U with dimensions (p, q) is the (mp) × (nq) matrix $C_{st} = V_{ij} U_{hk}$, where s = p(i − 1) + h and t = q(j − 1) + k, so that s ranges over 1, 2, ..., mp and t over 1, 2, ..., nq.
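The outer product of matrices described in item 5, arranged as a block matrix, is the Kronecker product; a NumPy sketch (example matrices are arbitrary):

```python
import numpy as np

V = np.array([[1, 2],
              [3, 4]])           # dimensions (m, n) = (2, 2)
U = np.array([[0, 5, 1],
              [6, 7, 2]])        # dimensions (p, q) = (2, 3)

# The outer product of matrices, flattened into an (mp) x (nq)
# block matrix, is the Kronecker product
K = np.kron(V, U)
assert K.shape == (4, 6)

# Block (i, j) of K equals V[i, j] * U
assert np.array_equal(K[0:2, 0:3], V[0, 0] * U)
assert np.array_equal(K[2:4, 3:6], V[1, 1] * U)
```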

## Definition (abstract)

Let V and W be two vector spaces, and let W* be the dual space of W. Given a vector x ∈ V and y* ∈ W*, then the tensor product y* ⊗ x corresponds to the map A : W → V given by

$w \mapsto y^*(w)x.$

Here y*(w) denotes the value of the linear functional y* (which is an element of the dual space of W) when evaluated at the element w ∈ W. This scalar in turn is multiplied by x to give as the final result an element of the space V.

If V and W are finite-dimensional, then the space of all linear transformations from W to V, denoted Hom(W, V), is generated by such outer products; in fact, the rank of a matrix is the minimal number of such outer products needed to express it as a sum (this is the tensor rank of a matrix). In this case Hom(W, V) is isomorphic to W* ⊗ V.
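The statement that rank counts the minimal number of outer products in a sum can be illustrated via the singular value decomposition, which writes any matrix as such a sum, one rank-1 term per singular value (a NumPy sketch with a random example matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((4, 3))

# SVD: A = sum_k s_k * outer(u_k, v_k)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
B = sum(s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(len(s)))
assert np.allclose(A, B)
```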

### Contrast with duality pairing

If W = V, then one can pair the covector w* ∈ V* with the vector v ∈ V via the map (w*, v) ↦ w*(v), which is the duality pairing between V and its dual.

## Applications

The outer product is useful in computing physical quantities (e.g., the tensor of inertia), and performing transform operations in digital signal processing and digital image processing. It is also useful in statistical analysis for computing the covariance and auto-covariance matrices for two random variables.
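For the covariance application, the sample covariance matrix is an average of outer products of centered samples; a NumPy sketch (random example data, with `np.cov` as the reference):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.random((100, 3))   # 100 samples of a 3-dimensional variable

# Average of outer products of deviations from the mean,
# with the usual unbiased 1/(N - 1) normalization
d = X - X.mean(axis=0)
cov = sum(np.outer(row, row) for row in d) / (len(X) - 1)

assert np.allclose(cov, np.cov(X, rowvar=False))
```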

## References

1. ^ Linear Algebra (4th Edition), S. Lipschutz, M. Lipson, Schaum's Outlines, McGraw Hill (USA), 2009, ISBN 978-0-07-154352-1
2. ^ "Kronecker Product", MathWorld. http://mathworld.wolfram.com/KroneckerProduct.html
3. ^ Encyclopaedia of Physics (2nd Edition), R.G. Lerner, G.L. Trigg, VHC Publishers, 1991, ISBN (Verlagsgesellschaft) 3-527-26954-1, ISBN (VHC Inc.) 0-89573-752-3
4. ^ Mathematical methods for physics and engineering, K.F. Riley, M.P. Hobson, S.J. Bence, Cambridge University Press, 2010, ISBN 978-0-521-86153-3