# Linear Algebra/Inner Product Spaces

Recall that in your study of vectors, we looked at an operation known as the dot product: given two vectors in ${\displaystyle \mathbb {R} ^{n}}$, we multiply their components pairwise and sum the results. With the dot product, it becomes possible to introduce important new ideas like length and angle. The length of a vector ${\displaystyle \mathbf {a} }$ is just ${\displaystyle ||\mathbf {a} ||={\sqrt {\mathbf {a} \cdot \mathbf {a} }}}$. The angle between two vectors ${\displaystyle \mathbf {a} }$ and ${\displaystyle \mathbf {b} }$ is related to the dot product by

${\displaystyle \cos {\theta }={\frac {\mathbf {a} \cdot \mathbf {b} }{||\mathbf {a} ||||\mathbf {b} ||}}}$
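The two formulas above can be sketched directly in code. This is a minimal illustration; the example vectors are our own, not from the text.

```python
import math

def dot(u, v):
    """Dot product: multiply components pairwise and sum them."""
    return sum(ui * vi for ui, vi in zip(u, v))

def length(v):
    """||v|| = sqrt(v . v)"""
    return math.sqrt(dot(v, v))

def angle(u, v):
    """The angle theta with cos(theta) = (u . v) / (||u|| ||v||)."""
    return math.acos(dot(u, v) / (length(u) * length(v)))

a = [3.0, 4.0]
print(length(a))             # 5.0
print(angle(a, [1.0, 0.0]))  # angle between a and the x-axis
```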

It turns out that only a few properties of the dot product are necessary to define similar ideas in vector spaces other than ${\displaystyle \mathbb {R} ^{n}}$, such as the space of ${\displaystyle m\times n}$ matrices or the space of polynomials. The more general operation that takes the place of the dot product in these other spaces is called the "inner product".

## The inner product

Say we have two vectors:

${\displaystyle \mathbf {a} ={\begin{pmatrix}2\\1\\4\end{pmatrix}},\mathbf {b} ={\begin{pmatrix}6\\3\\0\end{pmatrix}}}$

If we want to take their dot product, we work as follows:

${\displaystyle \mathbf {a} \cdot \mathbf {b} =a_{1}b_{1}+a_{2}b_{2}+a_{3}b_{3}=(2)(6)+(1)(3)+(4)(0)=15}$
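The same computation can be written as a short function, applied here to the vectors a and b from the text:

```python
def dot(u, v):
    """Multiply components pairwise and sum, as in a1*b1 + a2*b2 + a3*b3."""
    return sum(ui * vi for ui, vi in zip(u, v))

a = [2, 1, 4]
b = [6, 3, 0]
print(dot(a, b))  # 15
```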

Because multiplication of real numbers is commutative, we then have a · b = b · a.

But then, we observe that

${\displaystyle \mathbf {v} \cdot (\alpha \mathbf {a} +\beta \mathbf {b} )=\alpha \mathbf {v} \cdot \mathbf {a} +\beta \mathbf {v} \cdot \mathbf {b} }$

much like the regular algebraic equality ${\displaystyle v(\alpha A+\beta B)=\alpha vA+\beta vB}$. For regular dot products this is true since, in ${\displaystyle \mathbb {R} ^{3}}$ for example, one can expand both sides out to obtain

${\displaystyle {\begin{matrix}(\alpha v_{1}a_{1}+\beta v_{1}b_{1})+(\alpha v_{2}a_{2}+\beta v_{2}b_{2})+(\alpha v_{3}a_{3}+\beta v_{3}b_{3})=\\(\alpha v_{1}a_{1}+\alpha v_{2}a_{2}+\alpha v_{3}a_{3})+(\beta v_{1}b_{1}+\beta v_{2}b_{2}+\beta v_{3}b_{3})\end{matrix}}}$
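This linearity property is easy to check numerically with the ordinary dot product on ${\displaystyle \mathbb {R} ^{3}}$; the vectors and scalars below are arbitrary example values.

```python
# Check <v, alpha*a + beta*b> = alpha*<v, a> + beta*<v, b>
# for the ordinary dot product on R^3.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

v = [1.0, -2.0, 3.0]
a = [2.0, 1.0, 4.0]
b = [6.0, 3.0, 0.0]
alpha, beta = 2.5, -1.5

combo = [alpha * ai + beta * bi for ai, bi in zip(a, b)]
lhs = dot(v, combo)
rhs = alpha * dot(v, a) + beta * dot(v, b)
print(lhs, rhs)  # the two sides agree
```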

Finally, we can notice that v · v is always greater than or equal to zero. Checking this for ${\displaystyle \mathbb {R} ^{3}}$ gives

${\displaystyle \mathbf {v} \cdot \mathbf {v} =v_{1}^{2}+v_{2}^{2}+v_{3}^{2}}$

which can never be less than zero, since the square of a real number is nonnegative. Note that v · v = 0 if and only if v = 0.
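The positivity property can likewise be spot-checked on a few vectors in ${\displaystyle \mathbb {R} ^{3}}$ (the sample vectors are arbitrary):

```python
# v . v >= 0 for every vector, and v . v == 0 only for the zero vector.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

for v in ([1.0, -2.0, 3.0], [0.0, 0.0, 0.0], [-5.0, 0.0, 0.5]):
    assert dot(v, v) >= 0

print(dot([1.0, -2.0, 3.0], [1.0, -2.0, 3.0]))  # 14.0
print(dot([0.0, 0.0, 0.0], [0.0, 0.0, 0.0]))    # 0.0
```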

In generalizing this sort of behaviour, we want to keep these three properties. We can then move on to a definition of a generalization of the dot product, which we call the inner product. An inner product of two vectors in some vector space V, written < x, y >, is a function that maps V×V to R and obeys the following properties:

- < x, y > = < y, x >
- < v, αa + βb > = α < v, a > + β < v, b >
- < a, a > ≥ 0, with < a, a > = 0 if and only if a = 0.

A vector space V together with an inner product on it is known as an inner product space.
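As a sketch of an inner product on a space other than ${\displaystyle \mathbb {R} ^{n}}$, one standard choice on matrices is the entrywise (Frobenius) inner product, ${\displaystyle \langle A,B\rangle =\sum _{i,j}A_{ij}B_{ij}}$. The function name and example matrices below are our own.

```python
# Frobenius inner product on 2x2 matrices: sum of entrywise products.

def frob(A, B):
    return sum(A[i][j] * B[i][j]
               for i in range(len(A)) for j in range(len(A[0])))

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[0.0, 1.0], [1.0, 0.0]]

assert frob(A, B) == frob(B, A)  # symmetry
assert frob(A, A) >= 0           # positivity
print(frob(A, B))  # 5.0
```

The linearity axiom holds as well, since each entry enters the sum linearly, exactly as in the component expansion for ${\displaystyle \mathbb {R} ^{3}}$ above.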