# Linear Algebra/Linear Transformations

A **linear transformation** is an important concept in mathematics because many real-world phenomena can be approximated by linear models.

Unlike a linear function of a single variable, a linear transformation acts on vectors as well as numbers.

## Motivations and definitions

Say we have the vector (x, y) in **R**^{2}, and we rotate it through 90 degrees counterclockwise, to obtain the vector (-y, x).

Another example: instead of rotating a vector, we stretch it, so a vector **v** becomes 2**v**, for example; (1, 2) becomes (2, 4).

Or, we can look at the *projection* of one vector onto the *x*-axis - extracting its *x* component - so, e.g., from (x, y) we get (x, 0).
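The three mappings just described - rotation, stretching, and projection - can be sketched in code. This is an illustrative sketch, not part of the original text; the function names are my own choices, and vectors in **R**^{2} are represented as plain tuples.

```python
# Three simple mappings on vectors in R^2, represented as (x, y) tuples.
# Function names are illustrative choices, not standard library calls.

def rotate90(v):
    """Rotate a vector 90 degrees counterclockwise: (x, y) -> (-y, x)."""
    x, y = v
    return (-y, x)

def stretch(v, k=2):
    """Scale a vector by a factor k: (x, y) -> (kx, ky)."""
    x, y = v
    return (k * x, k * y)

def project_x(v):
    """Project a vector onto the x-axis: (x, y) -> (x, 0)."""
    x, y = v
    return (x, 0)

print(rotate90((1, 0)))   # (0, 1)
print(stretch((1, 2)))    # (2, 4)
print(project_x((3, 4)))  # (3, 0)
```

Each of these turns out to be a linear transformation, as defined below.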

These examples are all instances of a *mapping* between two vectors, and are all linear transformations. If the rule transforming the vector is called T, we often write T**v** for the mapping of the vector **v** by the rule T. T is often called the transformation.

Note that we do not always write brackets like T(**v**), as we do when we write functions. However, we *should* write brackets, especially when we want to express the mapping of a sum or a product or a combination of many vectors.

## Definitions

### Linear Operators

Suppose one has a field K, and let x be an element of that field. Let O be a function defined on K where O(x) is an element of a field J. Define O to be a linear operator if and only if:

- O(x+y)=O(x)+O(y)
- O(λx)=λO(x)

### Linear Forms

Suppose one has a vector space V, and let x be an element of that vector space. Let F be a function defined on V where F(x) is an element of a field K. Define F to be a linear form if and only if:

- F(x+y)=F(x)+F(y)
- F(λx)=λF(x)
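As an illustration (not from the original text), the dot product with a fixed vector is a linear form on **R**^{2}. The names F, a, and the helpers below are my own; a small sketch checking the two defining properties on sample vectors:

```python
# A linear form on R^2: F(v) = a . v for a fixed vector a.
# The names F and a are illustrative choices.

a = (3, -1)

def F(v):
    """Dot product of v with the fixed vector a; maps R^2 into R."""
    return a[0] * v[0] + a[1] * v[1]

def add(u, v):
    return (u[0] + v[0], u[1] + v[1])

def scale(lam, v):
    return (lam * v[0], lam * v[1])

u, v, lam = (1, 2), (4, 5), 7

# Check the two defining properties of a linear form on this sample.
print(F(add(u, v)) == F(u) + F(v))     # True
print(F(scale(lam, u)) == lam * F(u))  # True
```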

### Linear Transformation

This time, instead of a field, let us consider functions from one vector space into another vector space. Let T be a function defined on a vector space V whose values T(**v**) are elements of another vector space W. Define T to be a linear transformation when it:

- *preserves scalar multiplication*: T(λ**x**) = λT(**x**)
- *preserves addition*: T(**x** + **y**) = T(**x**) + T(**y**)

Note that not all transformations are linear. Many simple transformations found in the real world are non-linear. Their study is more difficult, and will not be done here. For example, a transformation *S* (whose input and output are both vectors in **R**^{2}) that squares each component of its input is not linear.

We can learn about nonlinear transformations by studying easier, linear ones.
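One way to get a feel for the two defining conditions is to test them on sample vectors. Below is a minimal sketch; the helper `looks_linear` and the sample maps are my own illustrations. Note that passing such a finite check only suggests linearity, while a single failure disproves it.

```python
def looks_linear(T, samples, scalars):
    """Check preservation of addition and scalar multiplication on samples.

    Passing this check does not prove linearity; a failure disproves it.
    Vectors are (x, y) tuples. This helper is an illustrative sketch.
    """
    def add(u, v):
        return (u[0] + v[0], u[1] + v[1])
    def scale(lam, v):
        return (lam * v[0], lam * v[1])
    for u in samples:
        for v in samples:
            if T(add(u, v)) != add(T(u), T(v)):
                return False
        for lam in scalars:
            if T(scale(lam, u)) != scale(lam, T(u)):
                return False
    return True

samples = [(0, 0), (1, 2), (-3, 5)]
scalars = [0, 1, 2, -4]

# The projection onto the x-axis passes the check ...
print(looks_linear(lambda v: (v[0], 0), samples, scalars))         # True
# ... while squaring the first component fails it.
print(looks_linear(lambda v: (v[0] ** 2, v[1]), samples, scalars)) # False
```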

We often *describe* a transformation T in the following way:

- T : V -> W

This means that T, whatever transformation it may be, maps vectors in the vector space V to a vector in the vector space W.

The actual transformation *could* be written, for instance, as T(x, y) = (2x, y), with V = W = **R**^{2}.

## Examples and proofs

Here are some examples of some linear transformations. At the same time, let's look at how we can prove that a transformation we may find is linear or not.

### Projection

Let us take the projection of vectors in **R**^{2} to vectors on the *x*-axis. Let's call this transformation T.

We know that T maps vectors from **R**^{2} to **R**^{2}, so we can say

- T : **R**^{2} -> **R**^{2}

and we can then write the transformation itself as

- T(x, y) = (x, 0).

Clearly this is linear. (*Can you see why, without looking below?*)

Let's go through a proof that the conditions in the definitions are established.

#### Scalar multiplication is preserved

We wish to show that for all vectors **v** and all scalars λ, T(λ**v**) = λT(**v**).

Let

- **v** = (x, y).

Then

- λ**v** = (λx, λy).

Now

- T(λ**v**) = T(λx, λy) = (λx, 0).

If we work out λT(**v**) and find it is the same vector, we have proved our result. Indeed,

- λT(**v**) = λT(x, y) = λ(x, 0) = (λx, 0).

This is the same vector as above, so under the transformation T, *scalar multiplication is preserved*.

#### Addition is preserved

We wish to show that for all vectors **x** and **y**, T(**x** + **y**) = T(**x**) + T(**y**).

Let

- **x** = (x₁, x₂)

and

- **y** = (y₁, y₂).

Now

- T(**x** + **y**) = T(x₁ + y₁, x₂ + y₂) = (x₁ + y₁, 0).

Now if we can show T(**x**) + T(**y**) is this vector above, we have proved this result. Proceed, then:

- T(**x**) + T(**y**) = (x₁, 0) + (y₁, 0) = (x₁ + y₁, 0).

So we have that the transformation T *preserves addition*.

#### Zero vector is preserved

Clearly we have

- T(**0**) = T(0, 0) = (0, 0) = **0**.

#### Conclusion

We have shown T preserves addition, scalar multiplication and the zero vector. So T must be linear.
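A linear transformation of **R**^{2} can also be represented by a matrix; for the projection onto the *x*-axis that matrix is ((1, 0), (0, 0)). The sketch below (using plain tuples rather than any matrix library) checks that multiplying by this matrix agrees with T on a few sample vectors:

```python
# The projection T(x, y) = (x, 0) as a 2x2 matrix.
# Matrix-vector products of the matrix agree with the map itself.

P = ((1, 0),
     (0, 0))

def matvec(A, v):
    """Multiply a 2x2 matrix (given as a tuple of rows) by a vector (x, y)."""
    return (A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1])

def T(v):
    return (v[0], 0)

for v in [(3, 4), (-1, 7), (0, 0)]:
    assert matvec(P, v) == T(v)

print(matvec(P, (3, 4)))  # (3, 0)
```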

## Disproof of linearity

When we want to *disprove* linearity - that is, to *prove* that a transformation is *not* linear, we need only find one counter-example.

If we can find just one case in which the transformation does not preserve addition, scalar multiplication, or the zero vector, we can conclude that the transformation is not linear.

For example, consider the transformation

- T(x, y) = (x^{2}, y).

We suspect it is not linear. To prove it is not linear, take the vector

- **v** = (1, 0),

then

- T(2**v**) = T(2, 0) = (4, 0),

but

- 2T(**v**) = 2(1, 0) = (2, 0),

so we can immediately say T is not linear because it does not preserve scalar multiplication.
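A counterexample of this kind can be checked in code. The map S(x, y) = (x^{2}, y) below is my own illustrative choice of a nonlinear transformation:

```python
# An illustrative nonlinear map: S(x, y) = (x^2, y).
def S(v):
    return (v[0] ** 2, v[1])

v = (1, 0)
lam = 2

left = S((lam * v[0], lam * v[1]))      # S(2v) = (4, 0)
right = (lam * S(v)[0], lam * S(v)[1])  # 2 S(v) = (2, 0)

# One failing case is enough to disprove linearity.
print(left, right, left == right)  # (4, 0) (2, 0) False
```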

### Problem set

Given the above, determine whether the following transformations are in fact linear or not. Write down each transformation in the form T:V -> W, and identify V and W. (Answers follow to even-numbered questions):

#### Answers

- 2. No. A check whether the zero vector is preserved readily confirms this fact. T : **R**^{2} -> **R**^{2}.
- 4. Yes. T : **R**^{3} -> **R**^{2}.

## Images and kernels

There are some fundamental concepts underlying linear transformations, such as the *kernel* and the *image* of a linear transformation, which are analogous to the *zeros* and *range* of a function.

### Kernel

The *kernel* of a linear transformation T: V -> W is the set of all vectors in V which are mapped to the zero vector in W, i.e.,

- ker(T) = {**v** in V : T(**v**) = **0**}.

When T is represented by a matrix A, the kernel of T corresponds to the solution set of the matrix equation A**x** = **0**.

The kernel of a transformation T: V -> W is always a subspace of V. The dimension of the kernel of a transformation or a matrix is called the *nullity*.

### Image

The *image* of a linear transformation T:V -> W is the set of all vectors in W which were mapped from vectors in V. For example, with the trivial mapping T:V -> W such that T**x** = **0**, the image would be {**0**}. (*What would the kernel be?*)

More formally, we say that the image of a transformation T:V -> W is the set

- im(T) = {T(**v**) : **v** in V}.
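For the projection onto the *x*-axis discussed above, T(x, y) = (x, 0), the kernel and image can be sampled on a small grid of vectors. This is an illustrative sketch, not a general algorithm:

```python
# Kernel and image of the projection T(x, y) = (x, 0),
# sampled on a small grid of integer vectors.
def T(v):
    return (v[0], 0)

grid = [(x, y) for x in range(-2, 3) for y in range(-2, 3)]

# Kernel: vectors that T sends to the zero vector -- here the y-axis.
kernel_sample = [v for v in grid if T(v) == (0, 0)]
# Image: all outputs of T -- here points on the x-axis.
image_sample = sorted(set(T(v) for v in grid))

print(kernel_sample)  # [(0, -2), (0, -1), (0, 0), (0, 1), (0, 2)]
print(image_sample)   # [(-2, 0), (-1, 0), (0, 0), (1, 0), (2, 0)]
```

The sample agrees with the theory: the kernel of this projection is the *y*-axis, and its image is the *x*-axis.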

## Isomorphism

A linear transformation T:V -> W is an isomorphism if it is one-to-one and onto. Equivalently, T is an isomorphism when:

- kernel(T) = {**0**} and range(T) = W, or
- an inverse of T exists.

If T:V -> W is an isomorphism, then dim(V) = dim(W).
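The 90-degree rotation from the opening example is an isomorphism of **R**^{2}: rotating back by 90 degrees in the other direction inverts it. The projection, by contrast, is not one-to-one. A small illustrative sketch:

```python
# The 90-degree rotation is an isomorphism of R^2: it has an inverse,
# namely rotation by 90 degrees in the opposite direction.
def rotate90(v):
    return (-v[1], v[0])

def rotate_back(v):
    return (v[1], -v[0])

for v in [(1, 0), (3, 4), (-2, 5)]:
    assert rotate_back(rotate90(v)) == v
    assert rotate90(rotate_back(v)) == v

# The projection (x, y) -> (x, 0) is not one-to-one, so it cannot be
# an isomorphism: two different vectors share the same output.
def project_x(v):
    return (v[0], 0)

print(project_x((1, 2)) == project_x((1, 9)))  # True
```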