# Engineering Analysis/Linear Independence and Basis

## Linear Independence

A set of vectors ${\displaystyle V=\{v_{1},v_{2},\cdots ,v_{n}\}}$ is said to be linearly dependent if any vector v in the set can be constructed as a linear combination of the other vectors in the set. Consider the following linear equation:

${\displaystyle a_{1}v_{1}+a_{2}v_{2}+\cdots +a_{n}v_{n}=0}$

The set of vectors V is linearly independent if and only if this equation is satisfied only when all the a coefficients are zero. If we collect the v vectors as the columns of a single matrix:

${\displaystyle {\hat {V}}=[v_{1}\;v_{2}\;\cdots \;v_{n}]}$

And we combine all the a coefficients into a single column vector:

${\displaystyle {\hat {a}}=[a_{1}\;a_{2}\;\cdots \;a_{n}]^{T}}$

We have the following linear equation:

${\displaystyle {\hat {V}}{\hat {a}}=0}$

To show that this equation can be satisfied only by ${\displaystyle {\hat {a}}=0}$, the matrix ${\displaystyle {\hat {V}}}$ must be invertible:

${\displaystyle {\hat {V}}^{-1}{\hat {V}}{\hat {a}}={\hat {V}}^{-1}0}$
${\displaystyle {\hat {a}}=0}$

Remember that for the matrix to be invertible, the determinant must be non-zero.
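As a quick numerical check, the determinant test can be sketched with numpy (the example vectors below are illustrative):

```python
import numpy as np

# Candidate vectors v1, v2, v3 stored as the columns of V (illustrative values).
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# A non-zero determinant means V is invertible,
# so its columns are linearly independent.
independent = np.linalg.det(V) != 0
print(independent)  # True for these vectors
```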

### Non-Square Matrix V

If the matrix ${\displaystyle {\hat {V}}}$ is not square, then the determinant cannot be taken, and therefore the matrix is not invertible. To work around this problem, we can premultiply by the transpose matrix:

${\displaystyle {\hat {V}}^{T}{\hat {V}}{\hat {a}}=0}$

And then the square matrix ${\displaystyle {\hat {V}}^{T}{\hat {V}}}$ must be invertible:

${\displaystyle ({\hat {V}}^{T}{\hat {V}})^{-1}{\hat {V}}^{T}{\hat {V}}{\hat {a}}=0}$
${\displaystyle {\hat {a}}=0}$
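A minimal numpy sketch of the non-square case (the 4×3 matrix below is illustrative): the product ${\displaystyle {\hat {V}}^{T}{\hat {V}}}$ is square even when ${\displaystyle {\hat {V}}}$ is not.

```python
import numpy as np

# Three independent vectors in R^4, so V is 4x3 and has no determinant.
V = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])

G = V.T @ V  # the square 3x3 matrix V^T V

# G is invertible exactly when the columns of V are linearly independent.
print(np.linalg.det(G) != 0)
```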

### Rank

The rank of a matrix is the largest number of linearly independent rows or columns in the matrix.

To determine the rank, the matrix is typically reduced to row-echelon form. The number of non-zero rows in the reduced form is the rank of the matrix.

If we multiply two matrices A and B, and the result is C:

${\displaystyle AB=C}$

Then the rank of C is at most the minimum of the ranks of A and B:

${\displaystyle \operatorname {Rank} (C)\leq \operatorname {min} [\operatorname {Rank} (A),\operatorname {Rank} (B)]}$
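This rank bound can be checked numerically; a minimal numpy sketch with illustrative matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second row is twice the first: rank 1
B = np.eye(2)                # identity: rank 2
C = A @ B

r = np.linalg.matrix_rank
# The rank of the product never exceeds the smaller of the two ranks.
print(r(C) <= min(r(A), r(B)))  # True: rank(C) is 1
```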

## Span

The span of a set of vectors V is the set of all vectors that can be created by a linear combination of the vectors in V.

## Basis

A basis is a set of linearly-independent vectors that span the entire vector space.

### Basis Expansion

If we have a vector ${\displaystyle y\in V}$, and V has basis vectors ${\displaystyle \{v_{1},v_{2},\cdots ,v_{n}\}}$, then by definition we can write y as a linear combination of the basis vectors:

${\displaystyle a_{1}v_{1}+a_{2}v_{2}+\cdots +a_{n}v_{n}=y}$

or

${\displaystyle {\hat {V}}{\hat {a}}=y}$

If ${\displaystyle {\hat {V}}}$ is invertible, the answer is simply ${\displaystyle {\hat {a}}={\hat {V}}^{-1}y}$, but if ${\displaystyle {\hat {V}}}$ is not square (though its columns are linearly independent), we can perform the following technique:

${\displaystyle {\hat {V}}^{T}{\hat {V}}{\hat {a}}={\hat {V}}^{T}y}$
${\displaystyle {\hat {a}}=({\hat {V}}^{T}{\hat {V}})^{-1}{\hat {V}}^{T}y}$

And we call the quantity ${\displaystyle ({\hat {V}}^{T}{\hat {V}})^{-1}{\hat {V}}^{T}}$ the left-pseudoinverse of ${\displaystyle {\hat {V}}}$.
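The left-pseudoinverse computation can be sketched in numpy; the basis vectors and coefficients below are illustrative:

```python
import numpy as np

# Two independent basis vectors spanning a plane in R^3 (illustrative values).
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

y = V @ np.array([2.0, -3.0])  # a vector known to lie in the span

# Left-pseudoinverse: (V^T V)^{-1} V^T, applied to y.
a = np.linalg.inv(V.T @ V) @ V.T @ y
print(a)  # recovers the coefficients [2, -3]
```

For full-column-rank matrices, `np.linalg.pinv(V) @ y` computes the same quantity with better numerical behavior.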

### Change of Basis

Frequently, it is useful to change the basis vectors to a different set of vectors that span the space but have different properties. If we have a space V with basis vectors ${\displaystyle {\hat {V}}}$, and a vector in V called x, we can use a new set of basis vectors ${\displaystyle {\hat {W}}}$ to represent x:

${\displaystyle x=\sum _{i=1}^{n}a_{i}v_{i}=\sum _{j=1}^{n}b_{j}w_{j}}$

or,

${\displaystyle x={\hat {V}}{\hat {a}}={\hat {W}}{\hat {b}}}$

If ${\displaystyle {\hat {W}}}$ is invertible, then the solution is simply ${\displaystyle {\hat {b}}={\hat {W}}^{-1}{\hat {V}}{\hat {a}}}$.
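A minimal numpy sketch of a change of basis, assuming both bases are invertible (the matrices below are illustrative):

```python
import numpy as np

V = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # old basis vectors as columns
W = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # new basis vectors as columns

a = np.array([1.0, 2.0])     # coordinates of x in the V basis
x = V @ a

# Solve W b = x for the coordinates of the same vector in the W basis.
b = np.linalg.solve(W, x)
print(np.allclose(W @ b, V @ a))  # True: same vector in both bases
```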

## Gram-Schmidt Orthogonalization

If we have a set of basis vectors that are not orthogonal, we can use a process known as orthogonalization to produce a new set of basis vectors for the same space that are orthogonal:

Given: ${\displaystyle {\hat {V}}=\{v_{1},v_{2},\cdots ,v_{n}\}}$
Find the new basis ${\displaystyle {\hat {W}}=\{w_{1},w_{2},\cdots ,w_{n}\}}$
Such that ${\displaystyle \langle w_{i},w_{j}\rangle =0\quad \forall i\neq j}$

We can define the vectors as follows:

1. ${\displaystyle w_{1}=v_{1}}$
2. ${\displaystyle w_{m}=v_{m}-\sum _{i=1}^{m-1}{\frac {\langle v_{m},w_{i}\rangle }{\langle w_{i},w_{i}\rangle }}w_{i}}$

Notice that the vectors produced by this technique are orthogonal to each other, but they are not necessarily orthonormal. To make the w vectors orthonormal, you must divide each one by its norm:

${\displaystyle {\bar {w}}={\frac {w}{\|w\|}}}$
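The two-step definition above can be sketched as a short Python function (classical Gram-Schmidt; the input vectors are illustrative and assumed linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a list of linearly independent vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = v.astype(float).copy()
        for u in basis:
            # Subtract the projection of v onto each previously built w vector.
            w -= (np.dot(v, u) / np.dot(u, u)) * u
        basis.append(w)
    return basis

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
ws = gram_schmidt(vs)

# Every pair of distinct output vectors is orthogonal.
print(all(abs(np.dot(ws[i], ws[j])) < 1e-12
          for i in range(3) for j in range(i + 1, 3)))

# Dividing each w by its norm makes the basis orthonormal.
qs = [w / np.linalg.norm(w) for w in ws]
```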

## Reciprocal Basis

A reciprocal basis is a special type of basis that is related to the original basis. The reciprocal basis ${\displaystyle {\hat {W}}}$ can be defined as:

${\displaystyle {\hat {W}}=[{\hat {V}}^{T}]^{-1}}$

so that each reciprocal basis vector satisfies ${\displaystyle \langle w_{i},v_{j}\rangle =\delta _{ij}}$.
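A numpy sketch of this definition, using an illustrative basis matrix:

```python
import numpy as np

V = np.array([[2.0, 1.0],
              [0.0, 1.0]])      # original basis vectors as columns

W = np.linalg.inv(V.T)          # reciprocal basis: W = (V^T)^{-1}

# Each reciprocal vector pairs to 1 with its own original
# vector and to 0 with the others: W^T V = I.
print(np.allclose(W.T @ V, np.eye(2)))
```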