Binet–Cauchy identity
In algebra, the Binet–Cauchy identity, named after Jacques Philippe Marie Binet and Augustin-Louis Cauchy, states that[1]
$$\left(\sum _{i=1}^{n}a_{i}c_{i}\right)\left(\sum _{j=1}^{n}b_{j}d_{j}\right)=\left(\sum _{i=1}^{n}a_{i}d_{i}\right)\left(\sum _{j=1}^{n}b_{j}c_{j}\right)+\sum _{1\leq i<j\leq n}(a_{i}b_{j}-a_{j}b_{i})(c_{i}d_{j}-c_{j}d_{i})$$
for every choice of real or complex numbers (or more generally, elements of a commutative ring).
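Since the identity holds over any commutative ring, it can be checked with exact integer arithmetic; the sequences below are illustrative values chosen for this sketch, not taken from the source:

```python
# Numerical sanity check of the Binet–Cauchy identity
# (illustrative integer sequences; any commutative ring works).
a = [2, -1, 4, 3]
b = [1, 5, -2, 0]
c = [3, 2, 1, -4]
d = [-1, 0, 2, 6]
n = len(a)

# Left-hand side: (sum a_i c_i)(sum b_j d_j)
lhs = sum(a[i] * c[i] for i in range(n)) * sum(b[j] * d[j] for j in range(n))

# Right-hand side: (sum a_i d_i)(sum b_j c_j) + sum of products of 2x2 minors
rhs = sum(a[i] * d[i] for i in range(n)) * sum(b[j] * c[j] for j in range(n))
rhs += sum(
    (a[i] * b[j] - a[j] * b[i]) * (c[i] * d[j] - c[j] * d[i])
    for i in range(n) for j in range(i + 1, n)
)

assert lhs == rhs
```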
Setting $a_i = c_i$ and $b_j = d_j$, it gives Lagrange's identity, which is a stronger version of the Cauchy–Schwarz inequality for the Euclidean space $\mathbb{R}^n$. The Binet–Cauchy identity is a special case of the Cauchy–Binet formula for matrix determinants.
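Under this substitution the last sum becomes a sum of squares, so Cauchy–Schwarz follows at once. A small sketch with made-up vectors:

```python
# Lagrange's identity, obtained from Binet–Cauchy with c = a and d = b:
#   |a|^2 |b|^2 = (a·b)^2 + sum over i<j of (a_i b_j - a_j b_i)^2
a = [1.0, 2.0, -3.0]
b = [4.0, 0.5, 2.0]
n = len(a)

norm_sq = lambda v: sum(x * x for x in v)
dot = sum(a[i] * b[i] for i in range(n))

# Sum of squared 2x2 minors; each term is nonnegative.
cross_terms = sum(
    (a[i] * b[j] - a[j] * b[i]) ** 2
    for i in range(n) for j in range(i + 1, n)
)

assert abs(norm_sq(a) * norm_sq(b) - (dot ** 2 + cross_terms)) < 1e-9
# Cauchy–Schwarz follows because cross_terms >= 0:
assert dot ** 2 <= norm_sq(a) * norm_sq(b)
```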
The Binet–Cauchy identity and exterior algebra
When n = 3, the first and second terms on the right-hand side become the squared magnitudes of dot and cross products respectively; in n dimensions these become the magnitudes of the dot and wedge products. We may write it
$$(a\cdot c)(b\cdot d)=(a\cdot d)(b\cdot c)+(a\wedge b)\cdot (c\wedge d)$$
where a, b, c, and d are vectors. It may also be written as a formula giving the dot product of two wedge products, as
$$(a\wedge b)\cdot (c\wedge d)=(a\cdot c)(b\cdot d)-(a\cdot d)(b\cdot c)\,,$$
which can be written as
$$(a\times b)\cdot (c\times d)=(a\cdot c)(b\cdot d)-(a\cdot d)(b\cdot c)$$
in the n = 3 case.
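For n = 3 the wedge product can be identified with the cross product, and this form of the identity can be checked directly (illustrative integer vectors):

```python
# Check of the n = 3 (cross product) form:
#   (a x b) · (c x d) = (a·c)(b·d) - (a·d)(b·c)
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def cross(u, v):
    # Standard 3D cross product.
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

a, b = (1, 2, 3), (4, -1, 0)
c, d = (2, 0, 5), (-3, 1, 2)

assert dot(cross(a, b), cross(c, d)) == dot(a, c) * dot(b, d) - dot(a, d) * dot(b, c)
```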
In the special case a = c and b = d, the formula yields
$$|a\wedge b|^{2}=|a|^{2}|b|^{2}-|a\cdot b|^{2}.$$
When both a and b are unit vectors, we obtain the usual relation
$$\sin ^{2}\phi =1-\cos ^{2}\phi$$
where φ is the angle between the vectors.
This is a special case of the inner product on the exterior algebra of a vector space, which is defined on wedge-decomposable elements as the Gram determinant of their components.
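For two decomposable 2-vectors this Gram-determinant definition reduces to a 2×2 determinant of dot products; a short check with made-up vectors in n = 4, using the 2×2 minors as coordinates of the wedge product:

```python
# The wedge-product inner product as a 2x2 Gram determinant:
#   <a ^ b, c ^ d> = det [[a·c, a·d], [b·c, b·d]]
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

a = (1, 0, 2, -1)
b = (3, 1, 0, 2)
c = (0, 2, 1, 1)
d = (1, -1, 1, 0)

gram_det = dot(a, c) * dot(b, d) - dot(a, d) * dot(b, c)

# Same value from the coordinates of the wedge products (the 2x2 minors):
wedge = lambda u, v: [u[i] * v[j] - u[j] * v[i]
                      for i in range(4) for j in range(i + 1, 4)]

assert dot(wedge(a, b), wedge(c, d)) == gram_det
```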
A relationship between the Levi-Civita symbols and the generalized Kronecker delta is
$${\frac {1}{k!}}\varepsilon ^{\lambda _{1}\cdots \lambda _{k}\mu _{k+1}\cdots \mu _{n}}\varepsilon _{\lambda _{1}\cdots \lambda _{k}\nu _{k+1}\cdots \nu _{n}}=\delta _{\nu _{k+1}\cdots \nu _{n}}^{\mu _{k+1}\cdots \mu _{n}}\,.$$
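In the n = 3, k = 1 case this contraction reduces to the familiar ε–δ relation, which can be verified by brute force (the helper `levi_civita` below is written here purely for illustration):

```python
# Verify the n = 3, k = 1 contraction identity over all index values:
#   sum_l eps(l,m,p) * eps(l,s,t) = delta(m,s)delta(p,t) - delta(m,t)delta(p,s)
from itertools import product

def levi_civita(idx):
    """Sign of idx as a permutation of (0, 1, 2); 0 on repeated indices."""
    if sorted(idx) != [0, 1, 2]:
        return 0
    sign, idx = 1, list(idx)
    for i in range(3):
        if idx[i] != i:
            j = idx.index(i)       # bring value i into position i
            idx[i], idx[j] = idx[j], idx[i]
            sign = -sign           # each transposition flips the sign
    return sign

for m, p, s, t in product(range(3), repeat=4):
    contracted = sum(levi_civita((l, m, p)) * levi_civita((l, s, t))
                     for l in range(3))
    delta = (m == s) * (p == t) - (m == t) * (p == s)
    assert contracted == delta
```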
The $(a\wedge b)\cdot (c\wedge d)=(a\cdot c)(b\cdot d)-(a\cdot d)(b\cdot c)$ form of the Binet–Cauchy identity can be written as
$${\frac {1}{(n-2)!}}\left(\varepsilon ^{\mu _{1}\cdots \mu _{n-2}\alpha \beta }~a_{\alpha }~b_{\beta }\right)\left(\varepsilon _{\mu _{1}\cdots \mu _{n-2}\gamma \delta }~c^{\gamma }~d^{\delta }\right)=\delta _{\gamma \delta }^{\alpha \beta }~a_{\alpha }~b_{\beta }~c^{\gamma }~d^{\delta }\,.$$
Expanding the last term,
$${\begin{aligned}&\sum _{1\leq i<j\leq n}(a_{i}b_{j}-a_{j}b_{i})(c_{i}d_{j}-c_{j}d_{i})\\={}&\sum _{1\leq i<j\leq n}(a_{i}c_{i}b_{j}d_{j}+a_{j}c_{j}b_{i}d_{i})+\sum _{i=1}^{n}a_{i}c_{i}b_{i}d_{i}-\sum _{1\leq i<j\leq n}(a_{i}d_{i}b_{j}c_{j}+a_{j}d_{j}b_{i}c_{i})-\sum _{i=1}^{n}a_{i}d_{i}b_{i}c_{i}\end{aligned}}$$
where the second and fourth terms are equal and have been artificially added to complete the sums as follows:
$$=\sum _{i=1}^{n}\sum _{j=1}^{n}a_{i}c_{i}b_{j}d_{j}-\sum _{i=1}^{n}\sum _{j=1}^{n}a_{i}d_{i}b_{j}c_{j}.$$
This completes the proof after factoring out the terms indexed by i .
A general form, also known as the Cauchy–Binet formula, states the following:
Suppose A is an m×n matrix and B is an n×m matrix. If S is a subset of {1, ..., n} with m elements, we write $A_S$ for the m×m matrix whose columns are those columns of A that have indices from S. Similarly, we write $B_S$ for the m×m matrix whose rows are those rows of B that have indices from S.
Then the determinant of the matrix product of A and B satisfies the identity
$$\det(AB)=\sum _{S\subset \{1,\ldots ,n\} \atop |S|=m}\det(A_{S})\det(B_{S}),$$
where the sum extends over all possible subsets S of {1, ..., n} with m elements.
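A direct check of the formula on a made-up 2×4 example, with a naive Laplace-expansion determinant (illustrative values only):

```python
# Cauchy–Binet: det(AB) as a sum over m-element column subsets S.
from itertools import combinations

def det(M):
    """Recursive Laplace expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

m, n = 2, 4
A = [[1, 2, 0, -1],
     [3, 1, 2, 0]]          # m x n
B = [[2, 1],
     [0, -1],
     [1, 3],
     [4, 2]]                # n x m

# The m x m product matrix AB.
AB = [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(m)]
      for i in range(m)]

rhs = sum(
    det([[A[i][s] for s in S] for i in range(m)])   # det(A_S): columns S of A
    * det([[B[s][j] for j in range(m)] for s in S]) # det(B_S): rows S of B
    for S in combinations(range(n), m)
)

assert det(AB) == rhs
```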
We get the original identity as a special case by setting
$$A={\begin{pmatrix}a_{1}&\dots &a_{n}\\b_{1}&\dots &b_{n}\end{pmatrix}},\quad B={\begin{pmatrix}c_{1}&d_{1}\\\vdots &\vdots \\c_{n}&d_{n}\end{pmatrix}}.$$
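With these choices AB is the 2×2 matrix of pairwise dot products, so det(AB) reproduces both sides of the original identity; a quick numeric check with illustrative values:

```python
# Specializing Cauchy–Binet to the 2 x n matrix A = [a; b] and the
# n x 2 matrix B = [c | d] recovers the Binet–Cauchy identity.
a = [1, -2, 3]
b = [0, 4, 1]
c = [2, 1, -1]
d = [3, 0, 2]
n = len(a)

s = lambda u, v: sum(x * y for x, y in zip(u, v))

# det(AB) for the 2 x 2 product matrix [[a·c, a·d], [b·c, b·d]]:
det_ab = s(a, c) * s(b, d) - s(a, d) * s(b, c)

# Cauchy–Binet right-hand side: 2x2 minors of A times matching minors of B.
minors = sum((a[i] * b[j] - a[j] * b[i]) * (c[i] * d[j] - c[j] * d[i])
             for i in range(n) for j in range(i + 1, n))

assert det_ab == minors
```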
References
Aitken, Alexander Craig (1944), Determinants and Matrices, Oliver and Boyd
Harville, David A. (2008), Matrix Algebra from a Statistician's Perspective, Springer