Linear Algebra/Changing Map Representations


The first subsection shows how to convert the representation of a vector with respect to one basis into the representation of that same vector with respect to another basis. Here we will see how to convert the representation of a map with respect to one pair of bases into the representation of that map with respect to a different pair. That is, we want the relationship between the matrices $\operatorname{Rep}_{B,D}(h)$ and $\operatorname{Rep}_{\hat{B},\hat{D}}(h)$ in the arrow diagram whose top row carries the map $h$ between the spaces represented with respect to $B$ and $D$, whose bottom row carries the same map between the spaces represented with respect to $\hat{B}$ and $\hat{D}$, and whose sides carry the identity map.

To move from the lower-left of this diagram to the lower-right we can either go straight across, or else go up, then across, and then down. Restated in terms of the matrices, we can calculate $\operatorname{Rep}_{\hat{B},\hat{D}}(h)$ either directly, or else by first changing bases with $\operatorname{Rep}_{\hat{B},B}(\mathrm{id})$, then multiplying by $\operatorname{Rep}_{B,D}(h)$, and then changing bases with $\operatorname{Rep}_{D,\hat{D}}(\mathrm{id})$. The equation below summarizes.
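In the notation of the prior subsection, with $B$ and $\hat{B}$ bases for the domain and $D$ and $\hat{D}$ bases for the codomain, the relationship is

\[
\operatorname{Rep}_{\hat{B},\hat{D}}(h)
  \;=\;
  \operatorname{Rep}_{D,\hat{D}}(\mathrm{id})
  \;\operatorname{Rep}_{B,D}(h)
  \;\operatorname{Rep}_{\hat{B},B}(\mathrm{id})
\]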

(To compare this equation with the sentence before it, remember that the equation is read from right to left, because function composition is read right to left and matrix multiplication represents composition.)

Example 2.1

The matrix

represents, with respect to the standard basis, the transformation of the plane that rotates vectors counterclockwise through a fixed angle.

We can translate that representation with respect to the standard basis to one with respect to another pair of bases by using the arrow diagram and the formula above.


From this we can apply the formula to compute the new representation of the rotation. Note that the change of basis matrix in one direction can be calculated as the matrix inverse of the change of basis matrix in the other direction, since the two represent the identity map with the roles of the bases exchanged.

Although the new matrix is messier-appearing, the map that it represents is the same. For instance, to replicate the effect of the rotation in the picture, start with a vector represented with respect to the new starting basis, apply the new matrix, and then convert the result from the new ending basis back to the standard basis to see that it is the same result as before.
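To check this numerically, here is a small sketch; the angle $\pi/6$ and the second basis used below are hypothetical choices for illustration, not the ones from this example.

```python
import numpy as np

theta = np.pi / 6                       # hypothetical angle, for illustration only
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation w.r.t. the standard basis

# hypothetical second basis; its vectors are the columns, written in standard coordinates,
# so multiplying by D converts coordinates w.r.t. the new basis into standard coordinates
D = np.array([[1.0, 0.0],
              [1.0, 2.0]])
D_inv = np.linalg.inv(D)                # converts standard coordinates to new-basis coordinates

T_hat = D_inv @ T @ D                   # the rotation represented w.r.t. the new basis

v = np.array([1.0, 3.0])                # any vector, in standard coordinates
direct = T @ v                          # rotate directly
via_new_basis = D @ (T_hat @ (D_inv @ v))   # change basis, apply T_hat, change back
print(np.allclose(direct, via_new_basis))   # True: the two representations agree
```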

Example 2.2

A map may be represented with respect to the standard basis by one matrix and, with respect to a suitably chosen second basis used as both the starting and ending basis, by a diagonal matrix. The diagonal representation is simpler, in that the action of a diagonal matrix is easy to understand.

Naturally, we usually prefer basis changes that make the representation easier to understand. When the representation with respect to equal starting and ending bases is a diagonal matrix we say the map or matrix has been diagonalized. In Chapter Five we shall see which maps and matrices are diagonalizable, and where one is not, we shall see how to get a representation that is nearly diagonal.
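As a numeric sketch of diagonalizing, with a hypothetical matrix rather than the one from Example 2.2, the eigenvectors returned by `np.linalg.eig` can serve as the new basis when they are linearly independent.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])              # hypothetical matrix with eigenvalues 5 and 2

eigvals, V = np.linalg.eig(A)           # columns of V are eigenvectors
# With the eigenvectors as the starting and ending basis, the representation is diagonal.
D = np.linalg.inv(V) @ A @ V
print(np.round(D, 10))                  # diagonal matrix with the eigenvalues on the diagonal
```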

We finish this subsection by considering the easier case where representations are with respect to possibly different starting and ending bases. Recall that the prior subsection shows that a matrix changes bases if and only if it is nonsingular. That gives us another version of the above arrow diagram and equation.

Definition 2.3

Same-sized matrices $H$ and $\hat{H}$ are matrix equivalent if there are nonsingular matrices $P$ and $Q$ such that $\hat{H} = PHQ$.

Corollary 2.4

Matrix equivalent matrices represent the same map, with respect to appropriate pairs of bases.
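One direction of the correspondence is immediate from the change of basis equation above: with generic basis names we may take $H = \operatorname{Rep}_{B,D}(h)$ and

\[
P = \operatorname{Rep}_{D,\hat{D}}(\mathrm{id}),
\qquad
Q = \operatorname{Rep}_{\hat{B},B}(\mathrm{id}),
\qquad
PHQ = \operatorname{Rep}_{\hat{B},\hat{D}}(h).
\]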

Problem 10 checks that matrix equivalence is an equivalence relation. Thus it partitions the set of matrices into matrix equivalence classes.

(Diagram: the collection of all matrices, partitioned into classes of mutually matrix equivalent matrices.)

We can get some insight into the classes by comparing matrix equivalence with row equivalence (recall that matrices are row equivalent when they can be reduced to each other by row operations). In the definition of matrix equivalence, the matrices $P$ and $Q$ are nonsingular and thus each can be written as a product of elementary reduction matrices (see Lemma 4.8 in the previous subsection). Left-multiplication by the reduction matrices making up $P$ has the effect of performing row operations. Right-multiplication by the reduction matrices making up $Q$ performs column operations. Therefore, matrix equivalence is a generalization of row equivalence: two matrices are row equivalent if one can be converted to the other by a sequence of row reduction steps, while two matrices are matrix equivalent if one can be converted to the other by a sequence of row reduction steps followed by a sequence of column reduction steps.

Thus, if matrices are row equivalent then they are also matrix equivalent (since we can take $Q$ to be the identity matrix and so perform no column operations). The converse, however, does not hold.
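The row-versus-column effect can be checked numerically; the matrix and the elementary matrix in this sketch are made up for illustration.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])              # hypothetical matrix

# elementary reduction matrix: "add -3 times row 1 to row 2"
E = np.array([[ 1.0, 0.0],
              [-3.0, 1.0]])

print(E @ A)   # left-multiplication performs that row operation:  [[1, 2], [0, -2]]
print(A @ E)   # right-multiplication performs a column operation
               # ("add -3 times column 2 to column 1"):            [[-5, 2], [-9, 4]]
```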

Example 2.5

These two

are matrix equivalent because the second can be reduced to the first by a column operation: taking a multiple of the first column and adding it to the second. They are not row equivalent because they have different reduced echelon forms (in fact, both are already in reduced echelon form).
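For instance, a hypothetical pair with the same behavior is

\[
\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}
\qquad\text{and}\qquad
\begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}
\]

Adding $-1$ times the first column of the second matrix to its second column produces the first matrix, so the two are matrix equivalent, yet each already equals its own reduced echelon form and those forms differ, so the two are not row equivalent.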

We will close this section by finding a set of representatives for the matrix equivalence classes.[1]

Theorem 2.6

Any $m \times n$ matrix of rank $k$ is matrix equivalent to the $m \times n$ matrix that is all zeros except that the first $k$ diagonal entries are ones.

Sometimes this is described as a block partial-identity form.
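Written in block form, the matrix in the theorem is

\[
\left(\begin{array}{c|c} I_{k} & Z \\ \hline Z & Z \end{array}\right)
\]

where $I_k$ is the $k \times k$ identity matrix and each $Z$ is a zero block of the appropriate size (possibly empty).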

Proof

As discussed above, Gauss-Jordan reduce the given matrix and combine all the reduction matrices used there to make $P$. Then use the leading entries to do column reduction and finish by swapping columns to put the leading ones on the diagonal. Combine the reduction matrices used for those column operations into $Q$.

Example 2.7

We illustrate the proof by finding the $P$ and $Q$ for this matrix.

First Gauss-Jordan row-reduce.

Then column-reduce, which involves right-multiplication.

Finish by swapping columns.

Finally, combine the left-multipliers together as $P$ and the right-multipliers together as $Q$; the resulting equation exhibits the given matrix as matrix equivalent to the block partial-identity form.
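Here is a sketch of the same row-then-column reduction route, carried out with SymPy on a hypothetical matrix (the names `P`, `Q`, `R`, and `B` only suggest the roles played in the proof).

```python
from sympy import Matrix, eye

A = Matrix([[1, 1, 0],
            [2, 2, 2]])                 # hypothetical matrix, of rank 2
m, n = A.shape

# Row-reduce [A | I]; the right block records the row operations,
# so the result is [rref(A) | P] with P nonsingular and P*A = rref(A).
left = A.row_join(eye(m)).rref()[0]
R, P = left[:, :n], left[:, n:]

# Column operations on R are row operations on R's transpose, so reduce
# [R^T | I] the same way and transpose back.
right = R.T.row_join(eye(n)).rref()[0]
B, Q = right[:, :m].T, right[:, m:].T

print(B)            # the block partial-identity matrix of A's rank
print(P * A * Q)    # equals B
```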

Corollary 2.8

Two same-sized matrices are matrix equivalent if and only if they have the same rank. That is, the matrix equivalence classes are characterized by rank.

Proof

If two same-sized matrices have the same rank then they are each matrix equivalent to the same block partial-identity matrix, and hence, since matrix equivalence is an equivalence relation, to each other. Conversely, matrix equivalent matrices have the same rank, because multiplying by a nonsingular matrix does not change the rank.
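The corollary gives an immediate computational test for matrix equivalence; a minimal sketch, with made-up matrices, is below.

```python
import numpy as np

def matrix_equivalent(A, B):
    """Same-sized matrices are matrix equivalent exactly when their ranks agree."""
    A, B = np.asarray(A), np.asarray(B)
    return A.shape == B.shape and np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B)

print(matrix_equivalent([[1, 0], [0, 0]], [[1, 1], [0, 0]]))   # True: both have rank 1
print(matrix_equivalent([[1, 0], [0, 1]], [[1, 1], [0, 0]]))   # False: ranks 2 and 1
```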

Example 2.9

The $2 \times 2$ matrices have only three possible ranks: zero, one, or two. Thus there are three matrix-equivalence classes.


(Diagram: the collection of all $2 \times 2$ matrices, partitioned into three matrix equivalence classes.)

Each class consists of all of the $2 \times 2$ matrices with the same rank. There is only one rank zero matrix, so that class has only one member, but the other two classes each have infinitely many members.

In this subsection we have seen how to change the representation of a map with respect to a first pair of bases to one with respect to a second pair. That led to a definition describing when matrices are equivalent in this way. Finally we noted that, with the proper choice of (possibly different) starting and ending bases, any map can be represented in block partial-identity form.

One of the nice things about this representation is that, in some sense, we can completely understand the map when it is expressed in this way: if the bases are $B = \langle \vec{\beta}_1, \ldots, \vec{\beta}_n \rangle$ and $D = \langle \vec{\delta}_1, \ldots, \vec{\delta}_m \rangle$ then the map sends

\[
\vec{\beta}_1 \mapsto \vec{\delta}_1,
\quad \ldots, \quad
\vec{\beta}_k \mapsto \vec{\delta}_k,
\qquad
\vec{\beta}_{k+1} \mapsto \vec{0},
\quad \ldots, \quad
\vec{\beta}_n \mapsto \vec{0}
\]

where $k$ is the map's rank. Thus, we can understand any linear map as a kind of projection.

Of course, "understanding" a map expressed in this way requires that we understand the relationship between the bases $B$ and $D$. However, despite that difficulty, this is a good classification of linear maps.

Exercises

This exercise is recommended for all readers.
Problem 1

Decide if these matrices are matrix equivalent.

  1. ,
  2. ,
  3. ,
This exercise is recommended for all readers.
Problem 2

Find the canonical representative of the matrix-equivalence class of each matrix.

Problem 3

Suppose that, with respect to

the transformation is represented by this matrix.

Use change of basis matrices to represent the transformation with respect to each pair.

  1. ,
  2. ,
This exercise is recommended for all readers.
Problem 4

What sizes are $P$ and $Q$ in the equation $\hat{H} = PHQ$?

This exercise is recommended for all readers.
Problem 5

Use Theorem 2.6 to show that a square matrix is nonsingular if and only if it is matrix equivalent to an identity matrix.

This exercise is recommended for all readers.
Problem 6

Show that, where $A$ is a nonsingular square matrix, if $P$ and $Q$ are nonsingular square matrices such that $PAQ = I$ then $QP = A^{-1}$.

This exercise is recommended for all readers.
Problem 7

Why does Theorem 2.6 not show that every matrix is diagonalizable (see Example 2.2)?

Problem 8

Must matrix equivalent matrices have matrix equivalent transposes?

Problem 9

What happens in Theorem 2.6 if the rank $k$ is zero?

This exercise is recommended for all readers.
Problem 10

Show that matrix-equivalence is an equivalence relation.

This exercise is recommended for all readers.
Problem 11

Show that a zero matrix is alone in its matrix equivalence class. Are there other matrices like that?

Problem 12

What are the matrix equivalence classes of matrices of transformations on ? ?

Problem 13

How many matrix equivalence classes are there?

Problem 14

Are matrix equivalence classes closed under scalar multiplication? Addition?

Problem 15

Let represented by with respect to .

  1. Find in this specific case.
  2. Describe in the general case where .
Problem 16
  1. Let $V$ have bases $B_1$ and $B_2$ and suppose that $W$ has the basis $D$. Where $h: V \to W$, find the formula that computes $\operatorname{Rep}_{B_2,D}(h)$ from $\operatorname{Rep}_{B_1,D}(h)$.
  2. Repeat the prior question with one basis for $V$ and two bases for $W$.
Problem 17
  1. If two matrices are matrix-equivalent and invertible, must their inverses be matrix-equivalent?
  2. If two matrices have matrix-equivalent inverses, must the two be matrix-equivalent?
  3. If two matrices are square and matrix-equivalent, must their squares be matrix-equivalent?
  4. If two matrices are square and have matrix-equivalent squares, must they be matrix-equivalent?
This exercise is recommended for all readers.
Problem 18

Square matrices are similar if they represent the same transformation, but each with respect to a single basis used as both the starting and the ending basis. That is, $\operatorname{Rep}_{B,B}(t)$ is similar to $\operatorname{Rep}_{D,D}(t)$.

  1. Give a definition of matrix similarity like that of Definition 2.3.
  2. Prove that similar matrices are matrix equivalent.
  3. Show that similarity is an equivalence relation.
  4. Show that if $T$ is similar to $S$ then $T^2$ is similar to $S^2$, the cubes are similar, etc. Contrast with the prior exercise.
  5. Prove that there are matrix equivalent matrices that are not similar.

Solutions

Footnotes

  1. More information on class representatives is in the appendix.