Linear Algebra/Inner product spaces

From Wikibooks, open books for an open world

Recall that in your study of vectors, we looked at an operation known as the dot product: given two vectors in $\mathbb{R}^n$, we simply multiply corresponding components and sum them up. With the dot product, it becomes possible to introduce important new ideas like length and angle. The length of a vector $\mathbf{v}$ is just $\|\mathbf{v}\| = \sqrt{\mathbf{v}\cdot\mathbf{v}}$. The angle $\theta$ between two vectors $\mathbf{u}$ and $\mathbf{v}$ is related to the dot product by

$$\cos\theta = \frac{\mathbf{u}\cdot\mathbf{v}}{\|\mathbf{u}\|\,\|\mathbf{v}\|}$$
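As a concrete check of these two formulas, the length and angle of vectors in $\mathbb{R}^n$ can be computed numerically. This is a minimal sketch; the helper names (`dot`, `length`, `angle`) are ours, not part of the text:

```python
import math

def dot(u, v):
    """Dot product: multiply corresponding components and sum them up."""
    return sum(ui * vi for ui, vi in zip(u, v))

def length(v):
    """The length of v is the square root of v . v."""
    return math.sqrt(dot(v, v))

def angle(u, v):
    """Angle between u and v, from cos(theta) = (u . v) / (|u| |v|)."""
    return math.acos(dot(u, v) / (length(u) * length(v)))

print(length([3.0, 4.0]))                          # 5.0
print(math.degrees(angle([1.0, 0.0], [1.0, 1.0]))) # 45 degrees
```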

It turns out that only a few properties of the dot product are necessary to define similar ideas in vector spaces other than Rn, such as the spaces of matrices, or polynomials. The more general operation that will take the place of the dot product in these other spaces is called the "inner product".

The inner product

Say we have two vectors in $\mathbb{R}^3$:

$$\mathbf{a} = (a_1, a_2, a_3), \qquad \mathbf{b} = (b_1, b_2, b_3)$$

If we want to take their dot product, we would work as follows:

$$\mathbf{a}\cdot\mathbf{b} = a_1 b_1 + a_2 b_2 + a_3 b_3$$

Because multiplication of real numbers is commutative, we then have $\mathbf{a}\cdot\mathbf{b} = \mathbf{b}\cdot\mathbf{a}$.

But then, we observe that

$$\mathbf{v}\cdot(\alpha\mathbf{a} + \beta\mathbf{b}) = \alpha\,(\mathbf{v}\cdot\mathbf{a}) + \beta\,(\mathbf{v}\cdot\mathbf{b})$$

much like the regular algebraic equality $v(\alpha A + \beta B) = \alpha vA + \beta vB$. For regular dot products this is true since, in $\mathbb{R}^3$ for example, one can expand both sides out to obtain

$$\sum_{i=1}^{3} v_i(\alpha a_i + \beta b_i) = \alpha\sum_{i=1}^{3} v_i a_i + \beta\sum_{i=1}^{3} v_i b_i$$

Finally, we can notice that $\mathbf{v}\cdot\mathbf{v}$ is always non-negative. Checking this for $\mathbb{R}^3$ gives

$$\mathbf{v}\cdot\mathbf{v} = v_1^2 + v_2^2 + v_3^2$$

which can never be less than zero, since the square of a real number is non-negative. Note that $\mathbf{v}\cdot\mathbf{v} = 0$ if and only if $\mathbf{v} = \mathbf{0}$.

In generalizing this sort of behaviour, we want to keep these three properties. We can then move on to a definition of a generalization of the dot product, which we call the inner product. An inner product of two vectors in some vector space $V$, written $\langle x, y\rangle$, is a function that maps $V \times V$ to $\mathbb{R}$ and obeys the properties:

  • $\langle x, y\rangle = \langle y, x\rangle$
  • $\langle v, \alpha a + \beta b\rangle = \alpha\langle v, a\rangle + \beta\langle v, b\rangle$
  • $\langle a, a\rangle \ge 0$, and $\langle a, a\rangle = 0$ if and only if $a = 0$.

The vector space V and some inner product together are known as an inner product space.
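To see that something other than the standard dot product can satisfy the three properties, the sketch below checks them numerically for a hypothetical weighted inner product on $\mathbb{R}^2$, $\langle x, y\rangle = 2x_1y_1 + 3x_2y_2$ (the weights 2 and 3 are our own illustrative choice, not from the text):

```python
def inner(x, y):
    # A hypothetical inner product on R^2 with positive weights 2 and 3.
    return 2 * x[0] * y[0] + 3 * x[1] * y[1]

a, b, v = [1.0, -2.0], [3.0, 0.5], [-1.0, 4.0]
alpha, beta = 2.5, -1.5

# Property 1: symmetry
assert inner(a, b) == inner(b, a)

# Property 2: linearity in the second argument
lhs = inner(v, [alpha * a[i] + beta * b[i] for i in range(2)])
rhs = alpha * inner(v, a) + beta * inner(v, b)
assert abs(lhs - rhs) < 1e-9

# Property 3: positive-definiteness
assert inner(a, a) > 0
assert inner([0.0, 0.0], [0.0, 0.0]) == 0
print("all three inner product properties hold")
```

Any choice of strictly positive weights works here; a zero or negative weight would break the third property.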


The dot product in $\mathbb{C}^n$

Given two vectors $\mathbf{u}, \mathbf{v} \in \mathbb{C}^n$, the dot product generalized to complex numbers is:

$$\mathbf{u}\cdot\mathbf{v} = \sum_{i=1}^{n} u_i \overline{v_i}$$

where for an arbitrary complex number $z = a + bi$, $\overline{z}$ is the complex conjugate: $\overline{z} = a - bi$.

The dot product is "conjugate commutative": $\mathbf{u}\cdot\mathbf{v} = \overline{\mathbf{v}\cdot\mathbf{u}}$. One immediate consequence of the definition of the dot product is that the dot product of a vector with itself is always a non-negative real number:

$$\mathbf{v}\cdot\mathbf{v} = \sum_{i=1}^{n} |v_i|^2 \ge 0$$

Moreover, $\mathbf{v}\cdot\mathbf{v} = 0$ if and only if $\mathbf{v} = \mathbf{0}$.
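The complex dot product and its two properties can be verified directly with Python's built-in complex numbers; the helper name `cdot` is ours:

```python
def cdot(u, v):
    """Complex dot product: sum of u_i times the conjugate of v_i."""
    return sum(ui * vi.conjugate() for ui, vi in zip(u, v))

u = [1 + 2j, 3 - 1j]
v = [2 - 1j, 1j]

# Conjugate commutativity: u . v equals the conjugate of v . u
assert cdot(u, v) == cdot(v, u).conjugate()

# v . v is a non-negative real number: the sum of |v_i|^2
s = cdot(v, v)
assert s.imag == 0 and s.real >= 0
print(s)  # (6+0j): |2 - i|^2 + |i|^2 = 5 + 1
```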

The Cauchy-Schwarz Inequality for $\mathbb{C}^n$

Cauchy-Schwarz Inequality

Given two vectors $\mathbf{u}, \mathbf{v} \in \mathbb{C}^n$, it is the case that

$$|\mathbf{u}\cdot\mathbf{v}| \le \|\mathbf{u}\|\,\|\mathbf{v}\|$$

In $\mathbb{R}^n$, the Cauchy-Schwarz inequality can be proven from the triangle inequality. Here, the Cauchy-Schwarz inequality will be proven algebraically.

To make the proof more intuitive, the algebraic proof for $\mathbb{R}^n$ will be given first.

Proof for $\mathbb{R}^n$

$|\mathbf{u}\cdot\mathbf{v}| \le \|\mathbf{u}\|\,\|\mathbf{v}\|$ follows from $(\mathbf{u}\cdot\mathbf{v})^2 \le (\mathbf{u}\cdot\mathbf{u})(\mathbf{v}\cdot\mathbf{v})$, which is equivalent to

$$\left(\sum_{i=1}^{n} u_i v_i\right)^2 \le \left(\sum_{i=1}^{n} u_i^2\right)\left(\sum_{j=1}^{n} v_j^2\right)$$

Expanding both sides gives:

$$\sum_{i=1}^{n}\sum_{j=1}^{n} u_i v_i u_j v_j \le \sum_{i=1}^{n}\sum_{j=1}^{n} u_i^2 v_j^2$$

"Folding" the double sums along the diagonal, and cancelling out the diagonal terms which are equivalent on both sides, gives:

$$\sum_{i < j} 2 u_i v_i u_j v_j \le \sum_{i < j} \left(u_i^2 v_j^2 + u_j^2 v_i^2\right)$$

which rearranges to

$$0 \le \sum_{i < j} (u_i v_j - u_j v_i)^2$$

The above inequality is clearly true, since each term is the square of a real number; therefore the Cauchy-Schwarz inequality holds for $\mathbb{R}^n$.
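The gap between the two sides of the real Cauchy-Schwarz inequality is exactly the folded sum of squares from the proof (this is Lagrange's identity), which can be confirmed numerically. The function name below is our own illustration:

```python
def cauchy_schwarz_gap(u, v):
    """Return (lhs, rhs, gap) where lhs <= rhs is Cauchy-Schwarz in R^n
    and gap is the folded sum of squares from the algebraic proof."""
    n = len(u)
    lhs = sum(u[i] * v[i] for i in range(n)) ** 2
    rhs = sum(ui ** 2 for ui in u) * sum(vj ** 2 for vj in v)
    gap = sum((u[i] * v[j] - u[j] * v[i]) ** 2
              for i in range(n) for j in range(i + 1, n))
    return lhs, rhs, gap

lhs, rhs, gap = cauchy_schwarz_gap([1.0, 2.0, 3.0], [-2.0, 0.5, 4.0])
assert lhs <= rhs
# The difference of the two sides equals the sum of squares, so it is >= 0.
assert abs((rhs - lhs) - gap) < 1e-9
```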

Proof for $\mathbb{C}^n$

Note that $|\mathbf{u}\cdot\mathbf{v}| \le \|\mathbf{u}\|\,\|\mathbf{v}\|$ follows from $|\mathbf{u}\cdot\mathbf{v}|^2 \le (\mathbf{u}\cdot\mathbf{u})(\mathbf{v}\cdot\mathbf{v})$, which is equivalent to

$$\left|\sum_{i=1}^{n} u_i \overline{v_i}\right|^2 \le \left(\sum_{i=1}^{n} |u_i|^2\right)\left(\sum_{j=1}^{n} |v_j|^2\right)$$

Expanding both sides yields:

$$\sum_{i=1}^{n}\sum_{j=1}^{n} u_i \overline{v_i}\,\overline{u_j} v_j \le \sum_{i=1}^{n}\sum_{j=1}^{n} |u_i|^2 |v_j|^2$$

"Folding" the double sums along the diagonal, and cancelling out the diagonal terms which are equivalent on both sides, gives:

$$\sum_{i < j} \left(u_i \overline{v_i}\,\overline{u_j} v_j + \overline{u_i} v_i u_j \overline{v_j}\right) \le \sum_{i < j} \left(|u_i|^2 |v_j|^2 + |u_j|^2 |v_i|^2\right)$$

Given complex numbers $z$ and $w$, it can be proven that $z\overline{w} + \overline{z}w \le |z|^2 + |w|^2$ (this is similar to $2ab \le a^2 + b^2$ for real numbers). Taking $z = u_i v_j$ and $w = u_j v_i$ bounds each term on the left by the corresponding term on the right, so the above inequality holds, and therefore the Cauchy-Schwarz inequality holds for $\mathbb{C}^n$.
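The complex inequality can likewise be spot-checked numerically. This sketch reuses the conjugate-on-the-second-argument convention from above; the helper name `cdot` is ours:

```python
def cdot(u, v):
    # Complex dot product with the conjugate on the second argument.
    return sum(ui * vi.conjugate() for ui, vi in zip(u, v))

u = [1 + 2j, 3 - 1j, 0.5j]
v = [2 - 1j, 1 + 1j, -3 + 0j]

# |u . v|^2 <= (u . u)(v . v); both self-products are real and non-negative.
lhs = abs(cdot(u, v)) ** 2
rhs = cdot(u, u).real * cdot(v, v).real
assert lhs <= rhs
print(lhs, rhs)
```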