Linear Algebra/Definition and Examples of Similarity/Solutions

Solutions

Problem 1

For


S=
\begin{pmatrix}
1  &3  \\
-2  &-6
\end{pmatrix}
\quad
T=
\begin{pmatrix}
0    &0  \\
-11/2 &-5
\end{pmatrix}
\quad
P=
\begin{pmatrix}
4  &2  \\
-3  &2
\end{pmatrix}

check that T=PSP^{-1}.

Answer

One way to proceed is left to right.


PSP^{-1}=
\begin{pmatrix}
4  &2  \\
-3  &2
\end{pmatrix}
\begin{pmatrix}
1  &3  \\
-2  &-6
\end{pmatrix}
\begin{pmatrix}
2/14  &-2/14  \\
3/14  &4/14
\end{pmatrix}
=
\begin{pmatrix}
0  &0  \\
-7  &-21
\end{pmatrix}
\begin{pmatrix}
2/14  &-2/14  \\
3/14  &4/14
\end{pmatrix}
=
\begin{pmatrix}
0    &0  \\
-11/2 &-5
\end{pmatrix}
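
As an added check (not part of the original solution), the product can be verified with a computer algebra system; the sketch below assumes Python with the SymPy library.

from sympy import Matrix, Rational

S = Matrix([[1, 3], [-2, -6]])
T = Matrix([[0, 0], [Rational(-11, 2), -5]])
P = Matrix([[4, 2], [-3, 2]])

# P S P^{-1} should reproduce T exactly
print(P * S * P.inv() == T)   # True
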
This exercise is recommended for all readers.
Problem 2

Example 1.3 shows that the only matrix similar to a zero matrix is itself and that the only matrix similar to the identity is itself.

  1. Show that the 1 \! \times \! 1 matrix (2), also, is similar only to itself.
  2. Is a matrix of the form cI for some scalar c similar only to itself?
  3. Is a diagonal matrix similar only to itself?
Answer
  1. Because the matrix (2) is 1 \! \times \! 1, the matrices P and P^{-1} are also 1 \! \times \! 1 and so where P=(p) the inverse is P^{-1}=(1/p). Thus P(2)P^{-1}=(p)(2)(1/p)=(2).
  2. Yes: recall that scalar multiples can be brought out of a matrix product:  P(cI)P^{-1}=cPIP^{-1}=cI . By the way, the zero and identity matrices are the special cases c=0 and c=1.
  3. No, as this example shows.
    
\begin{pmatrix}
1  &-2  \\
-1  &1
\end{pmatrix}
\begin{pmatrix}
-1  &0   \\
0  &-3
\end{pmatrix}
\begin{pmatrix}
-1  &-2   \\
-1  &-1
\end{pmatrix}
=
\begin{pmatrix}
-5  &-4   \\
2   &1
\end{pmatrix}
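
An added SymPy check, again not part of the original solution, confirms that the third factor is the inverse of the first and that the product is not diagonal.

from sympy import Matrix

P = Matrix([[1, -2], [-1, 1]])
D = Matrix([[-1, 0], [0, -3]])

print(P.inv())           # Matrix([[-1, -2], [-1, -1]]), the third factor above
print(P * D * P.inv())   # Matrix([[-5, -4], [2, 1]]), not diagonal
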
Problem 3

Show that these matrices are not similar.


\begin{pmatrix}
1  &0  &4  \\
1  &1  &3  \\
2  &1  &7
\end{pmatrix}
\qquad
\begin{pmatrix}
1  &0  &1  \\
0  &1  &1  \\
3  &1  &2
\end{pmatrix}
Answer

Gauss' method shows that the first matrix represents maps of rank two while the second matrix represents maps of rank three.
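
The two ranks can also be confirmed mechanically; an added SymPy check:

from sympy import Matrix

A = Matrix([[1, 0, 4], [1, 1, 3], [2, 1, 7]])
B = Matrix([[1, 0, 1], [0, 1, 1], [3, 1, 2]])

print(A.rank(), B.rank())   # 2 3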

Problem 4

Consider the transformation t:\mathcal{P}_2\to \mathcal{P}_2 described by x^2\mapsto x+1, x\mapsto x^2-1, and 1\mapsto 3.

  1. Find T={\rm Rep}_{B,B}(t) where B=\langle x^2,x,1 \rangle .
  2. Find S={\rm Rep}_{D,D}(t) where D=\langle 1,1+x,1+x+x^2 \rangle .
  3. Find the matrix P such that T=PSP^{-1}.
Answer
  1. Because t is described with the members of B, finding the matrix representation is easy:
    
{\rm Rep}_{B}(t(x^2))=\begin{pmatrix} 0 \\ 1 \\ 1 \end{pmatrix}_B
\quad
{\rm Rep}_{B}(t(x))=\begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}_B
\quad
{\rm Rep}_{B}(t(1))=\begin{pmatrix} 0 \\ 0 \\ 3 \end{pmatrix}_B
    gives this.
    
{\rm Rep}_{B,B}(t)
=
\begin{pmatrix}
0  &1  &0  \\
1  &0  &0  \\
1  &-1 &3
\end{pmatrix}
  2. We will find t(1), t(1+x), and t(1+x+x^2), to find how each is represented with respect to D. We are given that t(1)=3, and the other two are easy to see: t(1+x)=x^2+2 and t(1+x+x^2)=x^2+x+3. By eye, we get the representation of each vector
    
{\rm Rep}_{D}(t(1))=\begin{pmatrix} 3 \\ 0 \\ 0 \end{pmatrix}_D
\quad
{\rm Rep}_{D}(t(1+x))=\begin{pmatrix} 2  \\ -1 \\  1 \end{pmatrix}_D
\quad
{\rm Rep}_{D}(t(1+x+x^2))=\begin{pmatrix} 2 \\ 0 \\ 1 \end{pmatrix}_D
    and thus the representation of the map.
    
{\rm Rep}_{D,D}(t)
=
\begin{pmatrix}
3  &2  &2  \\
0  &-1 &0  \\
0  &1  &1
\end{pmatrix}
    The arrow diagram, adapted for this T and S (image omitted here), shows that P={\rm Rep}_{D,B}(\mbox{id}).
    
P=
\begin{pmatrix}
0  &0  &1  \\
0  &1  &1  \\
1  &1  &1
\end{pmatrix}
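
An added SymPy check that these three matrices satisfy T=PSP^{-1}:

from sympy import Matrix

T = Matrix([[0, 1, 0], [1, 0, 0], [1, -1, 3]])   # Rep_{B,B}(t)
S = Matrix([[3, 2, 2], [0, -1, 0], [0, 1, 1]])   # Rep_{D,D}(t)
P = Matrix([[0, 0, 1], [0, 1, 1], [1, 1, 1]])    # Rep_{D,B}(id)

print(P * S * P.inv() == T)   # True
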
This exercise is recommended for all readers.
Problem 5

Exhibit a nontrivial similarity relationship in this way: let  t:\mathbb{C}^2\to \mathbb{C}^2 act by


\begin{pmatrix} 1 \\ 2 \end{pmatrix}\mapsto\begin{pmatrix} 3 \\ 0 \end{pmatrix}
\qquad
\begin{pmatrix} -1 \\ 1 \end{pmatrix}\mapsto\begin{pmatrix} -1 \\ 2 \end{pmatrix}

and pick two bases, and represent  t with respect to them to get  T={\rm Rep}_{B,B}(t) and  S={\rm Rep}_{D,D}(t) . Then compute  P and  P^{-1} , the matrices to change bases from  B to  D and back again.

Answer

One possible choice of the bases is


B=\langle \begin{pmatrix} 1 \\ 2 \end{pmatrix},\begin{pmatrix} -1 \\ 1 \end{pmatrix} \rangle
\qquad
D=\mathcal{E}_2=\langle \begin{pmatrix} 1 \\ 0 \end{pmatrix},\begin{pmatrix} 0 \\ 1 \end{pmatrix} \rangle

(this B is suggested by the map description). To find the matrix T={\rm Rep}_{B,B}(t), solve the relations


c_1\begin{pmatrix} 1 \\ 2 \end{pmatrix}+c_2\begin{pmatrix} -1 \\ 1 \end{pmatrix}=\begin{pmatrix} 3 \\ 0 \end{pmatrix}
\qquad
\hat{c}_1\begin{pmatrix} 1 \\ 2 \end{pmatrix}+\hat{c}_2\begin{pmatrix} -1 \\ 1 \end{pmatrix}=\begin{pmatrix} -1 \\ 2 \end{pmatrix}

to get  c_1=1 ,  c_2=-2 ,  \hat{c}_1=1/3 and  \hat{c}_2=4/3 .


{\rm Rep}_{B,B}(t)=
\begin{pmatrix}
1  &1/3 \\
-2  &4/3
\end{pmatrix}

Finding  {\rm Rep}_{D,D}(t) involves a bit more computation. We first find  t(\vec{e}_1) . The relation


c_1\begin{pmatrix} 1 \\ 2 \end{pmatrix}+c_2\begin{pmatrix} -1 \\ 1 \end{pmatrix}=\begin{pmatrix} 1 \\ 0 \end{pmatrix}

gives  c_1=1/3 and  c_2=-2/3 , and so


{\rm Rep}_{B}(\vec{e}_1)=\begin{pmatrix} 1/3 \\ -2/3 \end{pmatrix}_B

making


{\rm Rep}_{B}(t(\vec{e}_1))=
\begin{pmatrix}
1  &1/3  \\
-2  &4/3
\end{pmatrix}_{B,B}
\begin{pmatrix} 1/3 \\ -2/3 \end{pmatrix}_B
=
\begin{pmatrix} 1/9 \\ -14/9 \end{pmatrix}_B

and hence t acts on the first basis vector \vec{e}_1 in this way.


t(\vec{e}_1)
=(1/9)\cdot\begin{pmatrix} 1 \\ 2 \end{pmatrix}-(14/9)\cdot\begin{pmatrix} -1 \\ 1 \end{pmatrix}
=\begin{pmatrix} 5/3 \\ -4/3 \end{pmatrix}

The computation for  t(\vec{e}_2) is similar. The relation


c_1\begin{pmatrix} 1 \\ 2 \end{pmatrix}+c_2\begin{pmatrix} -1 \\ 1 \end{pmatrix}=\begin{pmatrix} 0 \\ 1 \end{pmatrix}

gives  c_1=1/3 and  c_2=1/3 , so


{\rm Rep}_{B}(\vec{e}_2)=\begin{pmatrix} 1/3 \\ 1/3 \end{pmatrix}_B

making


{\rm Rep}_{B}(t(\vec{e}_2))=
\begin{pmatrix}
1  &1/3  \\
-2  &4/3
\end{pmatrix}_{B,B}
\begin{pmatrix} 1/3 \\ 1/3 \end{pmatrix}_B
=
\begin{pmatrix} 4/9 \\ -2/9 \end{pmatrix}_B

and hence t acts on the second basis vector \vec{e}_2 in this way.


t(\vec{e}_2)
=(4/9)\cdot\begin{pmatrix} 1 \\ 2 \end{pmatrix}-(2/9)\cdot\begin{pmatrix} -1 \\ 1 \end{pmatrix}
=\begin{pmatrix} 2/3 \\ 2/3 \end{pmatrix}

Therefore


{\rm Rep}_{D,D}(t)=
\begin{pmatrix}
5/3  &2/3  \\
-4/3  &2/3
\end{pmatrix}

and these are the change of basis matrices.


P={\rm Rep}_{B,D}(\mbox{id})
=\begin{pmatrix}
1  &-1  \\
2  &1
\end{pmatrix}
\qquad
P^{-1}=\bigl({\rm Rep}_{B,D}(\mbox{id})\bigr)^{-1}
=\begin{pmatrix}
1  &-1 \\
2  &1
\end{pmatrix}^{-1}
=
\begin{pmatrix}
1/3  &1/3  \\
-2/3 &1/3
\end{pmatrix}

The check of these computations is routine.


\begin{pmatrix}
1  &-1 \\
2  &1
\end{pmatrix}
\begin{pmatrix}
1  &1/3 \\
-2  &4/3
\end{pmatrix}
\begin{pmatrix}
1/3 &1/3 \\
-2/3 &1/3
\end{pmatrix}
=
\begin{pmatrix}
5/3 &2/3 \\
-4/3 &2/3
\end{pmatrix}
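
The whole computation can be double-checked with exact rational arithmetic; an added SymPy sketch:

from sympy import Matrix, Rational

T = Matrix([[1, Rational(1, 3)], [-2, Rational(4, 3)]])   # Rep_{B,B}(t)
S = Matrix([[Rational(5, 3), Rational(2, 3)],
            [Rational(-4, 3), Rational(2, 3)]])           # Rep_{D,D}(t)
P = Matrix([[1, -1], [2, 1]])                             # Rep_{B,D}(id)

print(P * T * P.inv() == S)   # True
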
Problem 6

Explain Example 1.3 in terms of maps.

Answer

The only representation of a zero map is a zero matrix, no matter what the pair of bases: {\rm Rep}_{B,D}(z)=Z, and so in particular for any single basis B we have {\rm Rep}_{B,B}(z)=Z. The case of the identity is related, but slightly different: the only representation of the identity map, with respect to any B,B, is the identity {\rm Rep}_{B,B}(\mbox{id})=I. (Remark: of course, we have seen examples where B\neq D and {\rm Rep}_{B,D}(\mbox{id})\neq I; in fact, we have seen that any nonsingular matrix is a representation of the identity map with respect to some pair of bases B,D.)

This exercise is recommended for all readers.
Problem 7

Are there two matrices  A and  B that are similar while  A^2 and  B^2 are not similar? (Halmos 1958)

Answer

No. If  A=PBP^{-1} then  A^2=(PBP^{-1})(PBP^{-1})=PB^2P^{-1} .
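
The identity can be tried on any concrete instance, for example the matrices of Problem 1 (an added check, not in the original):

from sympy import Matrix

B = Matrix([[1, 3], [-2, -6]])
P = Matrix([[4, 2], [-3, 2]])
A = P * B * P.inv()

print(A**2 == P * B**2 * P.inv())   # True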

This exercise is recommended for all readers.
Problem 8

Prove that if two matrices are similar and one is invertible then so is the other.

Answer

Matrix similarity is a special case of matrix equivalence (if matrices are similar then they are matrix equivalent), and matrix equivalence preserves nonsingularity. (Alternatively, similar matrices have equal determinants, and a matrix is invertible if and only if its determinant is nonzero.)

This exercise is recommended for all readers.
Problem 9

Show that similarity is an equivalence relation.

Answer

A matrix  T is similar to itself; take  P to be the identity matrix:  T=ITI^{-1}=ITI .

If  T is similar to  S then  T=PSP^{-1} and so  P^{-1}TP=S . Rewrite this as  S=(P^{-1})T(P^{-1})^{-1} to conclude that S is similar to  T .

If  T is similar to  S and  S is similar to  U then  T=PSP^{-1} and  S=QUQ^{-1} . Then  T=PQUQ^{-1}P^{-1}=(PQ)U(PQ)^{-1} , showing that  T is similar to  U .

Problem 10

Consider a matrix representing, with respect to some B,B, reflection across the  x -axis in  \mathbb{R}^2 . Consider also a matrix representing, with respect to some D,D, reflection across the  y -axis. Must they be similar?

Answer

Let f_x and f_y be the reflection maps (sometimes called "flips"). For any bases  B and  D , the matrices  {\rm Rep}_{B,B}(f_x)  and  {\rm Rep}_{D,D}(f_y) are similar. First note that


S={\rm Rep}_{\mathcal{E}_2,\mathcal{E}_2}(f_x)=
\begin{pmatrix}
1  &0  \\
0  &-1
\end{pmatrix}
\qquad
T={\rm Rep}_{\mathcal{E}_2,\mathcal{E}_2}(f_y)=
\begin{pmatrix}
-1  &0  \\
0  &1
\end{pmatrix}

are similar because the second matrix is the representation of f_x with respect to the basis  A=\langle \vec{e}_2,\vec{e}_1 \rangle  :


\begin{pmatrix}
1  &0  \\
0  &-1
\end{pmatrix}
=
P
\begin{pmatrix}
-1  &0  \\
0  &1
\end{pmatrix}
P^{-1}

where P={\rm Rep}_{A,\mathcal{E}_2}(\mbox{id}).
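
An added check: P here is the matrix that swaps the two standard basis vectors, so the claimed similarity can be verified directly (sketch in SymPy).

from sympy import Matrix

S = Matrix([[1, 0], [0, -1]])   # reflection across the x-axis
T = Matrix([[-1, 0], [0, 1]])   # reflection across the y-axis
P = Matrix([[0, 1], [1, 0]])    # Rep_{A,E_2}(id) for A = <e_2, e_1>

print(P * T * P.inv() == S)     # True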

Now the conclusion follows from the transitivity part of Problem 9.

To finish without relying on that exercise, write {\rm Rep}_{B,B}(f_x)=QSQ^{-1}=Q\,{\rm Rep}_{\mathcal{E}_2,\mathcal{E}_2}(f_x)\,Q^{-1} and {\rm Rep}_{D,D}(f_y)=RTR^{-1}=R\,{\rm Rep}_{\mathcal{E}_2,\mathcal{E}_2}(f_y)\,R^{-1}. Using the equation in the first paragraph, the first of these becomes {\rm Rep}_{B,B}(f_x)=QP\,{\rm Rep}_{\mathcal{E}_2,\mathcal{E}_2}(f_y)\,P^{-1}Q^{-1}. Rewriting the second as R^{-1}\cdot{\rm Rep}_{D,D}(f_y)\cdot R={\rm Rep}_{\mathcal{E}_2,\mathcal{E}_2}(f_y) and substituting gives the desired relationship.

\begin{array}{rl}
{\rm Rep}_{B,B}(f_x)
&=QP\,{\rm Rep}_{\mathcal{E}_2,\mathcal{E}_2}(f_y)\,P^{-1}Q^{-1}   \\
&=QPR^{-1}\cdot{\rm Rep}_{D,D}(f_y)\cdot RP^{-1}Q^{-1}   \\
&=(QPR^{-1})\cdot{\rm Rep}_{D,D}(f_y)\cdot (QPR^{-1})^{-1}
\end{array}

Thus the matrices  {\rm Rep}_{B,B}(f_x)  and  {\rm Rep}_{D,D}(f_y) are similar.

Problem 11

Prove that similarity preserves determinants and rank. Does the converse hold?

Answer

We must show that if two matrices are similar then they have the same determinant and the same rank. Both determinant and rank are properties of matrices that we have already shown to be preserved by matrix equivalence. They are therefore preserved by similarity (which is a special case of matrix equivalence: if two matrices are similar then they are matrix equivalent).

To prove the statement without quoting the results about matrix equivalence, note first that rank is a property of the map (it is the dimension of the rangespace) and since we've shown that the rank of a map is the rank of a representation, it must be the same for all representations. As for determinants,  \left|PSP^{-1}\right|=\left|P\right|\cdot\left|S\right|\cdot\left|P^{-1}\right|
=\left|P\right|\cdot\left|S\right|\cdot\left|P\right|^{-1}=\left|S\right| .

The converse of the statement does not hold; for instance, there are matrices with the same determinant that are not similar. To check this, consider a nonzero matrix with a determinant of zero. It is not similar to the zero matrix (the zero matrix is similar only to itself), but they have the same determinant. The argument for rank is much the same.

Problem 12

Is there a matrix equivalence class with only one matrix similarity class inside? One with infinitely many similarity classes?

Answer

The matrix equivalence class containing all  n \! \times \! n rank zero matrices contains only a single matrix, the zero matrix. Therefore it has as a subset only one similarity class.

In contrast, the matrix equivalence class of  1 \! \times \! 1 matrices of rank one consists of those 1 \! \times \! 1 matrices  (k) where  k\neq 0 . For any basis  B , the representation of multiplication by the scalar  k is  {\rm Rep}_{B,B}(t_k)=(k) , so each such matrix is alone in its similarity class. So this is a case where a matrix equivalence class splits into infinitely many similarity classes.

Problem 13

Can two different diagonal matrices be in the same similarity class?

Answer

Yes, these are similar


\begin{pmatrix}
1  &0  \\
0  &3
\end{pmatrix}
\qquad
\begin{pmatrix}
3  &0  \\
0  &1
\end{pmatrix}

since, where the first matrix is {\rm Rep}_{B,B}(t) for B=\langle \vec{\beta}_1,\vec{\beta}_2 \rangle , the second matrix is {\rm Rep}_{D,D}(t) for D=\langle \vec{\beta}_2,\vec{\beta}_1 \rangle .
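
An added check, taking the change of basis matrix to be the one that swaps the two basis vectors:

from sympy import Matrix

A = Matrix([[1, 0], [0, 3]])
B = Matrix([[3, 0], [0, 1]])
P = Matrix([[0, 1], [1, 0]])   # swaps beta_1 and beta_2

print(P * B * P.inv() == A)    # True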

This exercise is recommended for all readers.
Problem 14

Prove that if two matrices are similar then their  k -th powers are similar when  k>0 . What if  k\leq 0 ?

Answer

The  k -th powers are similar because, where each matrix represents the map t, the k-th powers represent  t^k , the composition of k-many t's. (For instance, if T={\rm Rep}_{B,B}(t) then T^2={\rm Rep}_{B,B}(t\circ t).)

Restated more computationally, if  T=PSP^{-1} then  T^2=(PSP^{-1})(PSP^{-1})=PS^2P^{-1} . Induction extends that to all powers.

For the k\leq 0 case, suppose that  S is invertible and that  T=PSP^{-1} . Note that  T is invertible:  T^{-1}=(PSP^{-1})^{-1}=PS^{-1}P^{-1} , and that same equation shows that  T^{-1} is similar to  S^{-1} . Other negative powers are now given by the first paragraph.
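
An added check on a concrete invertible instance (the matrices here are illustrative choices, not from the text):

from sympy import Matrix

S = Matrix([[1, 0], [0, 3]])                # invertible, so negative powers exist
P = Matrix([[1, -1], [2, 1]])
T = P * S * P.inv()

print(T**3 == P * S**3 * P.inv())           # True
print(T.inv() == P * S.inv() * P.inv())     # True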

This exercise is recommended for all readers.
Problem 15

Let  p(x) be the polynomial  c_nx^n+\cdots+c_1x+c_0 . Show that if  T is similar to  S then  p(T)=c_nT^n+\cdots+c_1T+c_0I is similar to  p(S)=c_nS^n+\cdots+c_1S+c_0I .

Answer

In conceptual terms, both represent  p(t) for some transformation  t . In computational terms, we have this.

\begin{array}{rl}
p(T)
&=c_n(PSP^{-1})^n+\dots+c_1(PSP^{-1})+c_0I   \\
&=c_nPS^nP^{-1}+\dots+c_1PSP^{-1}+c_0I   \\
&=Pc_nS^nP^{-1}+\dots+Pc_1SP^{-1}+Pc_0P^{-1}   \\
&=P(c_nS^n+\dots+c_1S+c_0I)P^{-1}   \\
&=P\,p(S)\,P^{-1}
\end{array}
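
An added check with a sample polynomial (the polynomial and matrices are illustrative choices):

from sympy import Matrix, eye

S = Matrix([[1, 3], [-2, -6]])
P = Matrix([[4, 2], [-3, 2]])
T = P * S * P.inv()

def p(M):                            # sample polynomial p(x) = 2x^2 - x + 5
    return 2*M**2 - M + 5*eye(2)

print(p(T) == P * p(S) * P.inv())    # True
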
Problem 16

List all of the matrix equivalence classes of  1 \! \times \! 1 matrices. Also list the similarity classes, and describe which similarity classes are contained inside of each matrix equivalence class.

Answer

There are two equivalence classes, (i) the class of rank zero matrices, of which there is one: \mathcal{C}_1=\{(0)\}, and (ii) the class of rank one matrices, of which there are infinitely many:  \mathcal{C}_2=\{(k)\,\big|\, k\neq0\} .

Each  1 \! \times \! 1 matrix is alone in its similarity class. That's because any transformation of a one-dimensional space is multiplication by a scalar t_k:V\to V given by \vec{v}\mapsto k\cdot\vec{v}. Thus, for any basis  B=\langle \vec{\beta} \rangle  , the matrix representing a transformation  t_k with respect to  B,B is  ({\rm Rep}_{B}(t_k(\vec{\beta})))=(k) .

So, contained in the matrix equivalence class \mathcal{C}_1 is (obviously) the single similarity class consisting of the matrix (0). And, contained in the matrix equivalence class \mathcal{C}_2 are the infinitely many, one-member-each, similarity classes consisting of (k) for k\neq0.

Problem 17

Does similarity preserve sums?

Answer

No. Here is an example that has two pairs, each of two similar matrices:


\begin{pmatrix}
1  &-1  \\
1  &2
\end{pmatrix}
\begin{pmatrix}
1  &0   \\
0  &3
\end{pmatrix}
\begin{pmatrix}
2/3  &1/3   \\
-1/3  &1/3
\end{pmatrix}
=
\begin{pmatrix}
5/3  &-2/3  \\
-4/3  &7/3
\end{pmatrix}

and


\begin{pmatrix}
1  &-2  \\
-1  &1
\end{pmatrix}
\begin{pmatrix}
-1  &0   \\
0  &-3
\end{pmatrix}
\begin{pmatrix}
-1  &-2   \\
-1  &-1
\end{pmatrix}
=
\begin{pmatrix}
-5  &-4   \\
2  &1
\end{pmatrix}

(this example is mostly arbitrary, but not entirely, because the center matrices on the two left sides add to the zero matrix). Note that the sums of these similar matrices are not similar


\begin{pmatrix}
1  &0   \\
0  &3
\end{pmatrix}
+
\begin{pmatrix}
-1  &0   \\
0  &-3
\end{pmatrix}
=
\begin{pmatrix}
0  &0  \\
0  &0
\end{pmatrix}
\qquad
\begin{pmatrix}
5/3  &-2/3   \\
-4/3 &7/3
\end{pmatrix}
+
\begin{pmatrix}
-5  &-4   \\
2  &1
\end{pmatrix}
\neq
\begin{pmatrix}
0  &0  \\
0  &0
\end{pmatrix}

since the zero matrix is similar only to itself.
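
An added check that the first sum is the zero matrix while the second is not:

from sympy import Matrix, zeros

D1 = Matrix([[1, 0], [0, 3]])
D2 = Matrix([[-1, 0], [0, -3]])
P1 = Matrix([[1, -1], [1, 2]])
P2 = Matrix([[1, -2], [-1, 1]])

S1 = P1 * D1 * P1.inv()          # similar to D1
S2 = P2 * D2 * P2.inv()          # similar to D2

print(D1 + D2 == zeros(2, 2))    # True
print(S1 + S2 == zeros(2, 2))    # False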

Problem 18

Show that if  T-\lambda I and  N are similar matrices then  T and  N+\lambda I are also similar.

Answer

If  N=P(T-\lambda I)P^{-1} then  N=PTP^{-1}-P(\lambda I)P^{-1} . The diagonal matrix  \lambda I commutes with anything, so  P(\lambda I)P^{-1}=PP^{-1}(\lambda I)=\lambda I . Thus  N=PTP^{-1}-\lambda I and consequently  N+\lambda I=PTP^{-1} . (So not only are they similar, in fact they are similar via the same  P .)
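
An added check on a concrete instance (T, P, and lambda here are illustrative choices):

from sympy import Matrix, eye

T = Matrix([[1, 3], [-2, -6]])
P = Matrix([[4, 2], [-3, 2]])
lam = 2

N = P * (T - lam*eye(2)) * P.inv()
print(N + lam*eye(2) == P * T * P.inv())   # True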

References

  • Halmos, Paul R. (1958), Finite Dimensional Vector Spaces (Second ed.), Van Nostrand.