
Tensor product: Difference between revisions

From Wikipedia, the free encyclopedia
Magmalex (talk | contribs)
m →‎Prerequisite: the free vector space: changed: x in F to x in K
Line 11:
Here the coefficients <math>a_1, \dots, a_n</math> are elements of the ground field ''K'' and the <math>s_1, \dots, s_n</math> are arbitrary elements of ''S''. The plus symbol and the dots are purely formal notation. Addition of such formal linear sums does ''not'' mean that elements of ''S'' are added, nor is any element of ''K'' actually multiplied with one of ''S''. Instead, for example
:<math>(a_1 \cdot s_1 + \dots + a_n \cdot s_n) + (a'_1 \cdot s_1 + b' \cdot s') = (a_1 + a'_1) \cdot s_1 + a_2\cdot s_2 + \dots + a_n \cdot s_n + b' \cdot s',</math>
if s' is different from all the elements appearing in the first summand. Moreover, the product ([[scalar multiplication]]) of the above formal linear sum with some element ''x'' in ''F'' is defined as
if s' is different from all the elements appearing in the first summand. Moreover, the product ([[scalar multiplication]]) of the above formal linear sum with some element ''x'' in ''K'' is defined as
:<math>(x a_1) \cdot s_1 + (x a_2) \cdot s_2 + \dots + (x a_n) \cdot s_n.</math>
This concludes the definition of the vector space ''F''(''S''). For example, if ''S'' has just 3 elements, then ''F''(''S'') is a 3-[[dimension (vector space)|dimensional]] vector space.

Revision as of 15:15, 13 July 2013

In mathematics, the tensor product, denoted by ⊗, may be applied in different contexts to vectors, matrices, tensors, vector spaces, algebras, topological vector spaces, and modules, among many other structures or objects. In each case the significance of the symbol is the same: the most general bilinear operation. In some contexts, this product is also referred to as outer product. The term "tensor product" is also used in relation to monoidal categories. A variant of ⊗ is used in control theory to express that the elements of the tensor product are vectors, matrices, or tensors that define the vertices of a given polytopic model, as in TP model transformation.

Tensor product of vector spaces

The tensor product of two vector spaces V and W over a field K is another vector space over K. It is denoted V ⊗_K W, or V ⊗ W when the underlying field K is understood.

Prerequisite: the free vector space

The construction of V ⊗ W requires the notion of the free vector space F(S) on some set S. The elements of the vector space F(S) are expressions of the form
:<math>a_1 \cdot s_1 + a_2 \cdot s_2 + \dots + a_n \cdot s_n.</math>

Here the coefficients a_1, ..., a_n are elements of the ground field K and the s_1, ..., s_n are arbitrary elements of S. The plus symbol and the dots are purely formal notation. Addition of such formal linear sums does not mean that elements of S are added, nor is any element of K actually multiplied with one of S. Instead, for example
:<math>(a_1 \cdot s_1 + \dots + a_n \cdot s_n) + (a'_1 \cdot s_1 + b' \cdot s') = (a_1 + a'_1) \cdot s_1 + a_2 \cdot s_2 + \dots + a_n \cdot s_n + b' \cdot s',</math>

if s' is different from all the elements appearing in the first summand. Moreover, the product (scalar multiplication) of the above formal linear sum with some element x in K is defined as
:<math>(x a_1) \cdot s_1 + (x a_2) \cdot s_2 + \dots + (x a_n) \cdot s_n.</math>

This concludes the definition of the vector space F(S). For example, if S has just 3 elements, then F(S) is a 3-dimensional vector space.
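
As a concrete illustration (the labels a, b, c are chosen just for this example), take S = {a, b, c}. A general element of F(S) is a formal sum
:<math>\lambda_1 \cdot a + \lambda_2 \cdot b + \lambda_3 \cdot c, \qquad \lambda_1, \lambda_2, \lambda_3 \in K,</math>
and sending such a sum to its coordinate vector (λ_1, λ_2, λ_3) identifies F(S) with K^3.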

Definition

Given two vector spaces V and W, the Cartesian product V × W is the set consisting of pairs (v, w) with v in V and w in W. (This Cartesian product is also a vector space in its own right, but at this point it is regarded only as a set.) The tensor product is defined as a certain quotient vector space of F(V × W), the free K-vector space on the Cartesian product.

The free vector space is so called since different elements of the set V × W are not at all related in F(V × W). For example, given two different elements (v_1, w) and (v_2, w) in the free vector space F(V × W), the equation
:<math>(v_1, w) + (v_2, w) = (v_1 + v_2, w)</math>
does not hold. Likewise, for x in K, the equation
:<math>x \cdot (v, w) = (x \cdot v, w)</math>
does not hold in the free vector space. In this sense, even though V × W is a vector space itself, that structure is not reflected in F(V × W). The idea of the tensor product is to enforce these two relations and similar ones for the second variable. To do so, consider the subspace R of F(V × W) generated by the following elements. To simplify notation, it is customary to drop the coefficient if it is one, i.e., (v, w) stands for 1 · (v, w).
:<math>(v_1 + v_2, w) - (v_1, w) - (v_2, w),</math>
:<math>(v, w_1 + w_2) - (v, w_1) - (v, w_2),</math>
:<math>(c \cdot v, w) - c \cdot (v, w),</math>
:<math>(v, c \cdot w) - c \cdot (v, w),</math>

where v, v_1 and v_2 are arbitrary elements of V, while w, w_1, and w_2 are vectors from W, and c is from the underlying field K.

The tensor product is defined as the vector space
:<math>V \otimes W := F(V \times W) / R.</math>

The tensor product of two vectors v and w is the equivalence class ((v, w) + R) of (v, w) in V ⊗ W. It is denoted v ⊗ w. The effect of dividing out R in the free vector space is that the following equations hold in V ⊗ W:
:<math>(v_1 + v_2) \otimes w = v_1 \otimes w + v_2 \otimes w,</math>
:<math>v \otimes (w_1 + w_2) = v \otimes w_1 + v \otimes w_2,</math>
:<math>(c \cdot v) \otimes w = v \otimes (c \cdot w) = c \cdot (v \otimes w).</math>
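
As a quick worked illustration of these relations (with arbitrarily chosen vectors), bilinearity lets one expand
:<math>(2 v_1 + v_2) \otimes (w_1 - w_2) = 2\, v_1 \otimes w_1 - 2\, v_1 \otimes w_2 + v_2 \otimes w_1 - v_2 \otimes w_2.</math>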

Notation and examples

Given bases {v_i} and {w_j} for V and W respectively, the tensors {v_i ⊗ w_j} form a basis for V ⊗ W. The dimension of the tensor product therefore is the product of dimensions of the original spaces; for instance R^m ⊗ R^n will have dimension mn.
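
For instance, with the standard bases {e_1, e_2} of R^2 and {f_1, f_2, f_3} of R^3, every element of R^2 ⊗ R^3 can be written uniquely as
:<math>\sum_{i=1}^{2} \sum_{j=1}^{3} c_{ij}\, e_i \otimes f_j,</math>
so R^2 ⊗ R^3 has dimension 6 and its elements may be identified with 2 × 3 arrays of coefficients c_{ij}.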

Elements of V ⊗ W are sometimes referred to as tensors, although this term refers to many other related concepts as well.[1] An element of V ⊗ W of the form v ⊗ w is called a pure or simple tensor. In general, an element of the tensor product space is not a pure tensor, but rather a finite linear combination of pure tensors. That is to say, if v1 and v2 are linearly independent, and w1 and w2 are also linearly independent, then v1 ⊗ w1 + v2 ⊗ w2 cannot be written as a pure tensor. The number of simple tensors required to express an element of a tensor product is called the tensor rank (not to be confused with tensor order, which is the number of spaces one has taken the product of, in this case 2; in notation, the number of indices), and for linear operators or matrices, thought of as (1,1) tensors (elements of the space V ⊗ V*), it agrees with matrix rank.
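
A concrete illustration (using the standard basis of R^2, chosen for this example): the element
:<math>e_1 \otimes e_1 + e_2 \otimes e_2</math>
corresponds, under the identification of R^2 ⊗ R^2 with 2 × 2 matrices, to the identity matrix, which has matrix rank 2; it therefore has tensor rank 2 and cannot be written as a single pure tensor v ⊗ w.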

Tensor product of linear maps

The tensor product also operates on linear maps between vector spaces. Specifically, given two linear maps S : V → X and T : W → Y between vector spaces, the tensor product of the two linear maps S and T is a linear map
:<math>S \otimes T : V \otimes W \to X \otimes Y</math>

defined by
:<math>(S \otimes T)(v \otimes w) = S(v) \otimes T(w).</math>

In this way, the tensor product becomes a bifunctor from the category of vector spaces to itself, covariant in both arguments.[2]

By choosing bases of all vector spaces involved, the linear maps S and T can be represented by matrices. Then, the matrix describing the tensor product S ⊗ T is the Kronecker product of the two matrices. For example, if V, X, W, and Y above are all two-dimensional and bases have been fixed for all of them, and S and T are given by the matrices
:<math>S = \begin{pmatrix} a_{1,1} & a_{1,2} \\ a_{2,1} & a_{2,2} \end{pmatrix}, \qquad T = \begin{pmatrix} b_{1,1} & b_{1,2} \\ b_{2,1} & b_{2,2} \end{pmatrix},</math>
respectively, then the tensor product of these two matrices is
:<math>S \otimes T = \begin{pmatrix} a_{1,1} b_{1,1} & a_{1,1} b_{1,2} & a_{1,2} b_{1,1} & a_{1,2} b_{1,2} \\ a_{1,1} b_{2,1} & a_{1,1} b_{2,2} & a_{1,2} b_{2,1} & a_{1,2} b_{2,2} \\ a_{2,1} b_{1,1} & a_{2,1} b_{1,2} & a_{2,2} b_{1,1} & a_{2,2} b_{1,2} \\ a_{2,1} b_{2,1} & a_{2,1} b_{2,2} & a_{2,2} b_{2,1} & a_{2,2} b_{2,2} \end{pmatrix}.</math>

The resultant rank is at most 4, and the resultant dimension 16. Here rank denotes the tensor rank (number of requisite indices), while the matrix rank counts the number of degrees of freedom in the resulting array.
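
As a numerical illustration (the entries are chosen arbitrarily for this example),
:<math>\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \otimes \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 1 & 0 & 2 \\ 1 & 0 & 2 & 0 \\ 0 & 3 & 0 & 4 \\ 3 & 0 & 4 & 0 \end{pmatrix}.</math>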

A dyadic product is the special case of the tensor product between two vectors of the same dimension.

Universal property

The tensor product as defined above has a universal property. In general, a universal property means that some mathematical object is characterized by the maps with target (or, alternatively, domain) this object. In the context of linear algebra and vector spaces, the maps in question are required to be linear maps. The tensor product of vector spaces, as defined above, satisfies the following universal property: there is a bilinear (i.e., linear in each variable v and w) map <math>\varphi : V \times W \to V \otimes W</math> such that, given any other vector space Z together with a bilinear map <math>h : V \times W \to Z</math>, there is a unique linear map <math>\tilde h : V \otimes W \to Z</math> verifying <math>h = \tilde h \circ \varphi</math>. In this sense, φ is the most general bilinear map that can be built from V × W.
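
As an illustration, the dot product R^n × R^n → R, (v, w) ↦ v · w, is bilinear; by the universal property it therefore factors through a unique linear map on the tensor product:
:<math>\mathbf{R}^n \otimes \mathbf{R}^n \to \mathbf{R}, \qquad v \otimes w \mapsto v \cdot w.</math>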

This characterization can simplify proving statements about the tensor product. For example, the tensor product is symmetric: that is, there is a canonical isomorphism:
:<math>V \otimes W \cong W \otimes V.</math>

To construct, say, a map from left to right, it suffices, by the universal property, to give a bilinear map V × W → W ⊗ V. This is done by mapping (v, w) to w ⊗ v. Constructing a map in the opposite direction is done similarly, as is checking that the two linear maps V ⊗ W → W ⊗ V and W ⊗ V → V ⊗ W are inverse to one another.

A similar reasoning can be used to show that the tensor product is associative, that is, there are natural isomorphisms
:<math>V_1 \otimes (V_2 \otimes V_3) \cong (V_1 \otimes V_2) \otimes V_3.</math>

Therefore, it is customary to omit the parentheses and write V_1 ⊗ V_2 ⊗ V_3.

Tensor powers and braiding

Let n be a non-negative integer. The nth tensor power of the vector space V is the n-fold tensor product of V with itself. That is
:<math>V^{\otimes n} = \underbrace{V \otimes \cdots \otimes V}_{n\text{ factors}}.</math>

A permutation σ of the set {1, 2, ..., n} determines a mapping of the nth Cartesian power of V
:<math>\sigma : V^n \to V^n</math>
defined by
:<math>\sigma(v_1, v_2, \dots, v_n) = (v_{\sigma(1)}, v_{\sigma(2)}, \dots, v_{\sigma(n)}).</math>

Let
:<math>\varphi : V^n \to V^{\otimes n}</math>
be the natural multilinear embedding of the Cartesian power of V into the tensor power of V. Then, by the universal property, there is a unique isomorphism
:<math>\tau_\sigma : V^{\otimes n} \to V^{\otimes n}</math>
such that
:<math>\varphi \circ \sigma = \tau_\sigma \circ \varphi.</math>

The isomorphism τ_σ is called the braiding map associated to the permutation σ.
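
For example, when n = 2 and σ is the transposition exchanging 1 and 2, the braiding map acts on pure tensors by
:<math>\tau_\sigma(v_1 \otimes v_2) = v_2 \otimes v_1,</math>
which is exactly the symmetry isomorphism V ⊗ V → V ⊗ V described above.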

Product of tensors

For non-negative integers r and s, an (r,s)-tensor on a vector space V is an element of
:<math>T^r_s(V) = \underbrace{V \otimes \cdots \otimes V}_{r} \otimes \underbrace{V^* \otimes \cdots \otimes V^*}_{s}.</math>

Here V* is the dual vector space (which consists of all linear maps f from V to the ground field K).

There is a product map, called the (tensor) product of tensors,
:<math>T^r_s(V) \otimes_K T^{r'}_{s'}(V) \to T^{r+r'}_{s+s'}(V).</math>

It is defined by grouping all occurring "factors" V together: writing v_i for an element of V and f_i for elements of the dual space,
:<math>(v_1 \otimes f_1) \otimes (v'_1) = v_1 \otimes v'_1 \otimes f_1.</math>

Picking a basis of V and the corresponding dual basis of V*, T^r_s(V) is endowed with a natural basis (this basis is described in the article on Kronecker products). In terms of these bases, the components of a (tensor) product of two (or more) tensors can be computed. For example, if F and G are two covariant tensors of rank m and n (respectively) (i.e. F ∈ T^0_m and G ∈ T^0_n), then the components of their tensor product are given by
:<math>(F \otimes G)_{i_1 i_2 \cdots i_{m+n}} = F_{i_1 i_2 \cdots i_m}\, G_{i_{m+1} i_{m+2} \cdots i_{m+n}}.</math>[3]
Thus, the components of the tensor product of two tensors are the ordinary product of the components of each tensor. Another example: let U be a tensor of type (1,1) with components U^α_β, and let V be a tensor of type (1,0) with components V^γ. Then
:<math>\left(U \otimes V\right)^\alpha {}_\beta {}^\gamma = U^\alpha {}_\beta\, V^\gamma</math>
and
:<math>\left(V \otimes U\right)^{\mu\nu} {}_\sigma = V^\mu\, U^\nu {}_\sigma.</math>

Relation to dual space

A particular example is the tensor product of some vector space V with its dual vector space V* (which consists of all linear maps f from V to the ground field K). In this case, there is a natural "evaluation" map
:<math>V \otimes V^* \to K,</math>

which on elementary tensors is defined by
:<math>v \otimes f \mapsto f(v).</math>

The resulting map
:<math>T^r_s(V) \to T^{r-1}_{s-1}(V)</math>

is called tensor contraction (for r, s > 0).

On the other hand, if V is finite-dimensional, there is a map in the other direction (called coevaluation)
:<math>K \to V \otimes V^*, \qquad \lambda \mapsto \lambda \sum_i v_i \otimes v_i^*,</math>

where v_1, ..., v_n is a basis of V, and v_i^* is its dual basis. The interplay of evaluation and coevaluation map can be used to characterize finite-dimensional vector spaces without referring to bases.[4]
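
For instance (taking V = K^2 with basis {e_1, e_2} and dual basis {e_1^*, e_2^*} purely for illustration), coevaluation sends 1 ∈ K to e_1 ⊗ e_1^* + e_2 ⊗ e_2^*, and applying the evaluation map to this element gives
:<math>e_1^*(e_1) + e_2^*(e_2) = 1 + 1 = 2 = \dim V.</math>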

Tensor product vs. Hom

Given three vector spaces U, V, W the tensor product is linked to the vector space of all linear maps, as follows:
:<math>\mathrm{Hom}(U \otimes V, W) \cong \mathrm{Hom}(U, \mathrm{Hom}(V, W)).</math>

Here Hom(−, −) denotes the K-vector space of all linear maps. This is an example of adjoint functors: the tensor product is "left adjoint" to Hom.
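
Concretely, this isomorphism is a linear-algebra analogue of currying: a linear map f : U ⊗ V → W corresponds to the map sending u ∈ U to the linear map v ↦ f(u ⊗ v),
:<math>f \mapsto \bigl(u \mapsto (v \mapsto f(u \otimes v))\bigr),</math>
and conversely.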

Adjoint representation

The tensor product T^r_s(V) may be naturally viewed as a module for the Lie algebra End(V) by means of the diagonal action: for simplicity let us assume r = s = 1; then, for each u ∈ End(V),
:<math>u(a \otimes b) = u(a) \otimes b - a \otimes u^*(b),</math>

where u* in End(V*) is the transpose of u, that is, in terms of the obvious pairing on V ⊗ V*,
:<math>\langle u(a), b \rangle = \langle a, u^*(b) \rangle.</math>

There is a canonical isomorphism T^1_1(V) → End(V) given by
:<math>(a \otimes b)(x) = \langle x, b \rangle\, a.</math>

Under this isomorphism, every u in End(V) may be first viewed as an endomorphism of T^1_1(V) and then viewed as an endomorphism of End(V). In fact it is the adjoint representation ad(u) of End(V).

Tensor products of modules over a ring

The tensor product of two modules A and B over a commutative ring R is defined in exactly the same way as the tensor product of vector spaces over a field:
:<math>A \otimes_R B := F(A \times B) / G,</math>

where now F(A × B) is the free R-module generated by the Cartesian product and G is the R-module generated by the same relations as above.

More generally, the tensor product can be defined even if the ring is non-commutative (ab ≠ ba). In this case M has to be a right R-module and N is a left R-module, and instead of the last two relations above, the relation
:<math>(mr, n) \sim (m, rn)</math>

is imposed. If R is non-commutative, this is no longer an R-module, but just an abelian group.

The universal property also carries over, slightly modified: the map φ : M × N → M ⊗_R N defined by (m, n) ↦ m ⊗ n is a middle linear map (referred to as the "canonical middle linear map"[5]); that is,[6] it satisfies:
:<math>\varphi(m + m', n) = \varphi(m, n) + \varphi(m', n),</math>
:<math>\varphi(m, n + n') = \varphi(m, n) + \varphi(m, n'),</math>
:<math>\varphi(mr, n) = \varphi(m, rn).</math>

The first two properties make φ a biadditive map of the abelian group M × N. For any middle linear map ψ of M × N, a unique group homomorphism f of M ⊗_R N satisfies ψ = f ∘ φ, and this property determines φ up to group isomorphism. See the main article for details.

Computing the tensor product

For vector spaces, the tensor product V ⊗ W is quickly computed since bases of V and W immediately determine a basis of V ⊗ W, as was mentioned above. For modules over a general (commutative) ring, not every module is free. For example, Z/n is not a free abelian group (= Z-module). The tensor product with Z/n is given by
:<math>M \otimes_{\mathbf{Z}} \mathbf{Z}/n = M/nM.</math>

More generally, given a presentation of some R-module M, that is, a number of generators m_i ∈ M, i ∈ I, together with relations
:<math>\sum_{i \in I} a_{ji} m_i = 0, \qquad j \in J,</math>
with a_{ji} ∈ R, the tensor product can be computed as the following cokernel:
:<math>M \otimes_R N = \operatorname{coker}\left(N^J \to N^I\right).</math>

Here <math>N^J = \oplus_{j \in J} N</math> and the map N^J → N^I is determined by sending some n ∈ N in the j-th copy of N^J to (a_{ji} n)_{i ∈ I} (in N^I). Colloquially, this may be rephrased by saying that a presentation of M gives rise to a presentation of M ⊗_R N. This is referred to by saying that the tensor product is a right exact functor. It is not in general left exact, that is, given an injective map of R-modules M_1 → M_2, the tensor product
:<math>M_1 \otimes_R N \to M_2 \otimes_R N</math>

is not usually injective. For example, tensoring the (injective) map Z → Z given by multiplication with n with Z/n yields the zero map Z/n → Z/n, which is not injective. Higher Tor functors measure the defect of the tensor product being not left exact.
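
A standard worked example: applying the formula M ⊗_Z Z/n = M/nM above to M = Z/m gives
:<math>\mathbf{Z}/m \otimes_{\mathbf{Z}} \mathbf{Z}/n \cong \mathbf{Z}/\gcd(m, n),</math>
so, for instance, Z/2 ⊗_Z Z/3 = 0 since gcd(2, 3) = 1.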

Tensor product of algebras

Let R be a commutative ring. The tensor product of R-modules applies, in particular, if A and B are R-algebras. In this case, the tensor product A ⊗_R B is an R-algebra itself by putting
:<math>(a_1 \otimes b_1) \cdot (a_2 \otimes b_2) := (a_1 \cdot a_2) \otimes (b_1 \cdot b_2).</math>

For example,
:<math>R[x] \otimes_R R[y] = R[x, y].</math>

A particular example is when A and B are fields containing a common subfield R. The tensor product of fields is closely related to Galois theory: if, say, A = R[x] / f(x), where f is some irreducible polynomial with coefficients in R, the tensor product can be calculated as
:<math>A \otimes_R B \cong B[x] / f(x),</math>

where now f is interpreted as the same polynomial, but with its coefficients regarded as elements of B. In the larger field B, the polynomial may become reducible, which brings in Galois theory. For example, if A = B is a Galois extension of R, then
:<math>A \otimes_R A \cong A[x] / f(x)</math>

is isomorphic (as an A-algebra) to <math>A^{\deg(f)}</math>.
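
A concrete instance: take R to be the field of real numbers and A = B the complex numbers, viewed as R[x]/(x^2 + 1). Over the complex numbers the polynomial x^2 + 1 factors as (x − i)(x + i), so
:<math>\mathbf{C} \otimes_{\mathbf{R}} \mathbf{C} \cong \mathbf{C}[x]/(x^2+1) \cong \mathbf{C}[x]/(x-i) \times \mathbf{C}[x]/(x+i) \cong \mathbf{C} \times \mathbf{C},</math>
in agreement with the statement above, since the complex numbers form a Galois extension of the reals of degree 2.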

Other examples of tensor products

Tensor product of Hilbert spaces

Topological tensor product

Tensor product of graded vector spaces

Tensor product of quadratic forms

Tensor product of multilinear maps

Given multilinear maps f(x_1, ..., x_k) and g(x_1, ..., x_m), their tensor product is the multilinear function
:<math>(f \otimes g)(x_1, \dots, x_{k+m}) = f(x_1, \dots, x_k)\, g(x_{k+1}, \dots, x_{k+m}).</math>

Tensor product of graphs

Applications

Exterior and symmetric algebra

Two notable constructions in linear algebra can be constructed as quotients of the tensor product: the exterior algebra and the symmetric algebra. For example, given a vector space V, the exterior product
:<math>V \wedge V</math>

is defined as
:<math>V \wedge V := V \otimes V / (v \otimes v,\ v \in V).</math>

The image of v_1 ⊗ v_2 in the exterior product is usually denoted v_1 ∧ v_2 and satisfies, by construction, v_1 ∧ v_2 = −v_2 ∧ v_1. Similar constructions are possible for V ⊗ ⋯ ⊗ V (n factors), giving rise to Λ^n V, the n-th exterior power of V. The latter notion is the basis of differential n-forms.
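
For example (using the standard basis of R^2 for illustration), in the second exterior power of R^2 the relations force e_1 ∧ e_1 = e_2 ∧ e_2 = 0 and e_2 ∧ e_1 = −e_1 ∧ e_2, so
:<math>(a_1 e_1 + a_2 e_2) \wedge (b_1 e_1 + b_2 e_2) = (a_1 b_2 - a_2 b_1)\, e_1 \wedge e_2;</math>
thus Λ^2 R^2 is one-dimensional, spanned by e_1 ∧ e_2, and the coefficient is the familiar 2 × 2 determinant.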

The symmetric algebra is constructed in a similar manner:
:<math>\operatorname{Sym}^n V := V \otimes \cdots \otimes V / (\cdots \otimes v_i \otimes v_{i+1} \otimes \cdots - \cdots \otimes v_{i+1} \otimes v_i \otimes \cdots).</math>

That is, in the symmetric algebra two adjacent vectors (and therefore all of them) can be interchanged. The resulting objects are called symmetric tensors.

Tensor product of line bundles

Tensor product for computer programmers

Array programming languages

Array programming languages may have this pattern built in. For example, in APL the tensor product is expressed as ∘.× (for example A ∘.× B or A ∘.× B ∘.× C). In J the tensor product is the dyadic form of */ (for example a */ b or a */ b */ c).

Note that J's treatment also allows the representation of some tensor fields, as a and b may be functions instead of constants. This product of two functions is a derived function, and if a and b are differentiable, then a*/b is differentiable.

However, these kinds of notation are not universally present in array languages. Other array languages may require explicit treatment of indices (for example, MATLAB), and/or may not support higher-order functions such as the Jacobian derivative (for example, Fortran/APL).
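
For comparison in a general-purpose language, the following short sketch uses Python with NumPy (a library not discussed above; the variable names are illustrative) to compute the outer product of two vectors and the Kronecker product of two matrices:

<syntaxhighlight lang="python">
import numpy as np

a = np.array([1, 2])         # a vector in R^2
b = np.array([3, 4, 5])      # a vector in R^3

# Tensor (outer) product of two vectors: a 2 x 3 array with entries a_i * b_j.
outer = np.tensordot(a, b, axes=0)   # equivalently: np.outer(a, b)

# Kronecker product of two matrices, i.e. the matrix representing S tensor T.
S = np.array([[1, 2], [3, 4]])
T = np.array([[0, 1], [1, 0]])
K = np.kron(S, T)                    # a 4 x 4 matrix

print(outer)
print(K)
</syntaxhighlight>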

See also

Notes

  1. ^ See Tensor or Tensor (intrinsic definition).
  2. ^ Hazewinkel, Michiel; Gubareni, Nadezhda Mikhaĭlovna; Gubareni, Nadiya; Kirichenko, Vladimir V. (2004). Algebras, rings and modules. Springer. p. 100. ISBN 978-1-4020-2690-4.
  3. ^ Analogous formulas also hold for contravariant tensors, as well as tensors of mixed variance. Although in many cases such as when there is an inner product defined, the distinction is irrelevant.
  4. ^ See Compact closed category.
  5. ^ Hungerford, Thomas W. (1974). Algebra. Springer. ISBN 0-387-90518-9.
  6. ^ Chen, Jungkai Alfred (2004), "Tensor product" (PDF), Advanced Algebra II (lecture notes), National Taiwan University.

References