Introduction to Mathematical Physics/Dual of a vectorial space


Dual of a vector space



Let $E$ be a vector space over a commutative field $K$. The vector space of the linear forms on $E$ is called the dual of $E$ and is denoted $E^*$.

When $E$ has finite dimension, $E^*$ also has finite dimension, equal to the dimension of $E$. If $E$ has infinite dimension, $E^*$ also has infinite dimension, but the two spaces are not isomorphic.
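For instance (an illustration of our own, not from the original text), on $E = \mathbb{R}^3$ with $x = (x^1, x^2, x^3)$, the mapping $l(x) = 3x^1 - x^2$ is a linear form and hence an element of $E^*$.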



In this appendix, we introduce the fundamental notion of tensor in physics. More information can be found in the references, for instance. Let $E$ be a finite-dimensional vector space and let $(e_i)$ be a basis of $E$. A vector $x$ of $E$ can be referenced by its components $x^i$ in this basis:

$x = x^i e_i$

In this chapter the repeated index convention (or Einstein summation convention) will be used. It consists in considering that a product of two quantities carrying the same index corresponds to a sum over this index. For instance:

$x^i e_i \equiv \sum_i x^i e_i$


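As a side note (not in the original text), this is exactly the convention implemented by numpy's einsum; the arrays below are arbitrary example values, assuming numpy is available:

    import numpy as np

    # Arbitrary example values (our choice, not from the text)
    l = np.array([1.0, 2.0, 3.0])    # covariant components l_i
    x = np.array([4.0, 5.0, 6.0])    # contravariant components x^i

    # The repeated index i denotes a sum: l_i x^i
    print(np.einsum('i,i->', l, x))  # 32.0, i.e. 1*4 + 2*5 + 3*6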
To the vector space $E$ corresponds a space $E^*$ called the dual of $E$. An element $l$ of $E^*$ is a linear form on $E$: it is a linear mapping that maps any vector of $E$ to a real number. A linear form $l$ is defined by a set of numbers $l_i$, because the most general form of a linear form on $E$ is:

$l(x) = l_i x^i$

A basis $(e^{*i})$ of $E^*$ can be defined by the following linear forms:

$e^{*i}(e_j) = \delta^i_j$

where $\delta^i_j$ is one if $i = j$ and zero otherwise. Thus to each vector $x$ of $E$ of components $x^i$ can be associated a dual vector $x^*$ in $E^*$ of components $x_i$:

$x^* = x_i e^{*i}$

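Note (a one-line check added for clarity): acting with a dual basis form on $x$ extracts the corresponding component,

$e^{*i}(x) = e^{*i}(x^j e_j) = x^j\, e^{*i}(e_j) = x^j \delta^i_j = x^i$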
The quantity

$l(x) = l_i x^i$

is an invariant: it is independent of the chosen basis. On the other hand, the components of a vector do depend on the chosen basis. If $a$ defines a transformation that maps the basis $(e_i)$ to another basis $(e'_i)$:

$e'_j = a^i_j e_i$
we have the following relation between the components $x^i$ of $x$ in $(e_i)$ and the components $x'^i$ of $x$ in $(e'_i)$:

$x^i = a^i_j x'^j$


This comes from the identification of

$x = x^i e_i = x'^j e'_j = x'^j a^i_j e_i$


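A minimal numerical sketch of these transformation rules (the matrix and the component values below are arbitrary choices of ours, assuming numpy is available):

    import numpy as np

    rng = np.random.default_rng(0)
    a = rng.normal(size=(3, 3))     # basis-change matrix a^i_j (assumed invertible)
    x_old = rng.normal(size=3)      # contravariant components x^i in the old basis
    l_old = rng.normal(size=3)      # covariant components l_i in the old basis

    a_inv = np.linalg.inv(a)
    x_new = a_inv @ x_old           # contravariant: x'^i = (a^{-1})^i_j x^j
    l_new = a.T @ l_old             # covariant:     l'_j = a^i_j l_i

    # The contraction l_i x^i is the same in both bases
    print(np.allclose(l_old @ x_old, l_new @ x_new))   # True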
These two transformation relations define two types of variables: covariant variables, which are transformed like the basis vectors (the components $l_i$ of a linear form are such variables), and contravariant variables, which are transformed like the components of a vector in this basis. In the physicist's vocabulary, $x_i$ is called a covariant vector and $x^i$ a contravariant vector.

Figure: Covariant and contravariant components of a vector $x$.

Let $x$ and $y$ be two vectors of two vector spaces $E$ and $F$. The tensor product space $E \otimes F$ is the vector space such that there exists a unique isomorphism between the space of bilinear forms on $E \times F$ and the space of linear forms on $E \otimes F$. A bilinear form on $E \times F$ is:

$b(x, y) = b_{ij}\, x^i y^j$

It can be considered as a linear form on $E \otimes F$ by using the mapping $\otimes$ from $E \times F$ to $E \otimes F$, which is linear and distributive with respect to addition. If $(e_i)$ is a basis of $E$ and $(f_j)$ a basis of $F$, then

$(e_i \otimes f_j)$

is a basis of $E \otimes F$. A tensor $T = T^{ij}\, e_i \otimes f_j$ is thus an element of $E \otimes F$. A second-order covariant tensor is thus an element of $E^* \otimes E^*$. In a change of basis, its components are transformed according to the following relation:

$T'_{ij} = a^k_i\, a^l_j\, T_{kl}$

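A short numerical sketch of this transformation rule (the matrix $a$ and the tensor $T$ below are arbitrary values of our own choosing, assuming numpy is available):

    import numpy as np

    rng = np.random.default_rng(1)
    a = rng.normal(size=(3, 3))      # basis-change matrix a^k_i
    T = rng.normal(size=(3, 3))      # covariant components T_kl

    # T'_{ij} = a^k_i a^l_j T_{kl}
    T_new = np.einsum('ki,lj,kl->ij', a, a, T)

    # Equivalent matrix form: T' = a^T T a
    print(np.allclose(T_new, a.T @ T @ a))   # True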
Now we can define tensors of any rank and any variance. For instance, a third-order tensor, twice covariant and once contravariant, is an element of $E^* \otimes E^* \otimes E$ and is denoted $T_{ij}^{\;k}$.

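For such a mixed tensor, the transformation rule combines the two behaviours seen above (spelled out here for convenience, under the basis-change convention used in this chapter): covariant indices transform with $a$, the contravariant index with $a^{-1}$,

$T'^{\;k}_{ij} = a^l_i\, a^m_j\, (a^{-1})^k_n\, T_{lm}^{\;n}$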
A second-order tensor $T_{ij}$ is called symmetric if $T_{ij} = T_{ji}$. It is called antisymmetric if $T_{ij} = -T_{ji}$.

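As a remark (added here, not in the original text), any second-order tensor can be split into a symmetric and an antisymmetric part:

$T_{ij} = \tfrac{1}{2}(T_{ij} + T_{ji}) + \tfrac{1}{2}(T_{ij} - T_{ji})$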
Pseudo-tensors are transformed slightly differently from ordinary tensors. For instance, a second-order covariant pseudo-tensor is transformed according to:

$T'_{ij} = \det(a)\, a^k_i\, a^l_j\, T_{kl}$

where $\det(a)$ is the determinant of the transformation $a$.

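For example (our illustration): under the inversion $e'_i = -e_i$ of $\mathbb{R}^3$ one has $a^k_i = -\delta^k_i$ and $\det(a) = -1$, so a second-order covariant pseudo-tensor picks up an extra sign change, whereas an ordinary second-order covariant tensor is left unchanged.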

Let us introduce two particular tensors.

  • The Kronecker symbol $\delta_{ij}$ is defined by:

$\delta_{ij} = 1$ if $i = j$, and $\delta_{ij} = 0$ otherwise.

It is the only second-order tensor in $\mathbb{R}^3$ that is invariant under rotations.

  • The signature-of-permutations tensor (Levi-Civita tensor) $\epsilon_{ijk}$ is defined by:

$\epsilon_{ijk} = 1$ if $(i, j, k)$ is an even permutation of $(1, 2, 3)$, $\epsilon_{ijk} = -1$ if it is an odd permutation, and $\epsilon_{ijk} = 0$ otherwise.

It is the only rank-3 pseudo-tensor in $\mathbb{R}^3$ that is invariant under rotations. It satisfies the equality:

$\epsilon_{ijk}\, \epsilon_{ilm} = \delta_{jl}\, \delta_{km} - \delta_{jm}\, \delta_{kl}$

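As a quick check of one component (our own verification, not from the text): for $j = l = 1$ and $k = m = 2$, the left-hand side is $\epsilon_{i12}\,\epsilon_{i12} = \epsilon_{312}\,\epsilon_{312} = 1$, and the right-hand side is $\delta_{11}\delta_{22} - \delta_{12}\delta_{21} = 1 - 0 = 1$.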
Let us introduce two tensor operations: the scalar product and the vector product.

  • The scalar product is the contraction of the vectors $a$ and $b$:

$a \cdot b = a_i b_i$

  • The vector product of two vectors $a$ and $b$ is:

$(a \wedge b)_i = \epsilon_{ijk}\, a_j b_k$

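A small numerical sketch (the vectors below are arbitrary choices of ours, assuming numpy is available): building $\epsilon_{ijk}$ explicitly and contracting it reproduces numpy's built-in cross product.

    import numpy as np

    # Levi-Civita symbol epsilon_{ijk} in R^3
    eps = np.zeros((3, 3, 3))
    eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
    eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

    a = np.array([1.0, 2.0, 3.0])
    b = np.array([4.0, 5.0, 6.0])

    dot = np.einsum('i,i->', a, b)               # a . b = a_i b_i
    cross = np.einsum('ijk,j,k->i', eps, a, b)   # (a ^ b)_i = eps_{ijk} a_j b_k

    print(dot, np.allclose(cross, np.cross(a, b)))   # 32.0 True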
From these definitions, the following formulas can be shown:

Here is a useful formula:

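As one example of the kind of identity that follows from the $\epsilon$-$\delta$ relation above (this particular derivation is ours, not necessarily the formula intended in the original), using the cyclic property $\epsilon_{ijk} = \epsilon_{kij}$:

$(a \wedge (b \wedge c))_i = \epsilon_{ijk}\, a_j\, \epsilon_{klm}\, b_l c_m = (\delta_{il}\delta_{jm} - \delta_{im}\delta_{jl})\, a_j b_l c_m = b_i\,(a_j c_j) - c_i\,(a_j b_j)$

that is, $a \wedge (b \wedge c) = (a \cdot c)\, b - (a \cdot b)\, c$.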

Green's theorem

Green's theorem allows one to transform a volume integral into a surface integral.


Let $\Omega$ be a bounded domain of $\mathbb{R}^n$ with a regular boundary $\partial\Omega$. Let $n_i$ be the unit vector normal to the hypersurface $\partial\Omega$ (oriented towards the exterior of $\Omega$). Let $T$ be a tensor, continuously differentiable in $\Omega$; then:

$\int_\Omega \partial_i T\, dV = \int_{\partial\Omega} T\, n_i\, dS$

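For instance (a standard special case, written out here for illustration), choosing $T$ to be the components $v_i$ of a vector field and summing over the index $i$ gives the divergence theorem:

$\int_\Omega \partial_i v_i\, dV = \int_{\partial\Omega} v_i\, n_i\, dS$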
Here are some important Green's formulas obtained by applying Green's theorem: