# Linear Algebra/Definition and Examples of Similarity/Solutions

## Solutions

Problem 1

For

${\displaystyle S={\begin{pmatrix}1&3\\-2&-6\end{pmatrix}}\quad T={\begin{pmatrix}0&0\\-11/2&-5\end{pmatrix}}\quad P={\begin{pmatrix}4&2\\-3&2\end{pmatrix}}}$

check that ${\displaystyle T=PSP^{-1}}$.

One way to proceed is left to right.

${\displaystyle PSP^{-1}={\begin{pmatrix}4&2\\-3&2\end{pmatrix}}{\begin{pmatrix}1&3\\-2&-6\end{pmatrix}}{\begin{pmatrix}2/14&-2/14\\3/14&4/14\end{pmatrix}}={\begin{pmatrix}0&0\\-7&-21\end{pmatrix}}{\begin{pmatrix}2/14&-2/14\\3/14&4/14\end{pmatrix}}={\begin{pmatrix}0&0\\-11/2&-5\end{pmatrix}}}$
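This check can also be carried out mechanically. The sketch below redoes it with exact rational arithmetic (Python's `fractions` module) so no rounding obscures the result; the helper names `matmul` and `inv2` are ours, not from the text.

```python
from fractions import Fraction as F

def matmul(A, B):
    # product of two matrices given as lists of rows
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    # inverse of a 2x2 matrix by the adjugate formula
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

P = [[F(4), F(2)], [F(-3), F(2)]]
S = [[F(1), F(3)], [F(-2), F(-6)]]
T = matmul(matmul(P, S), inv2(P))
print(T == [[F(0), F(0)], [F(-11, 2), F(-5)]])  # True
```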
This exercise is recommended for all readers.
Problem 2

Example 1.3 shows that the only matrix similar to a zero matrix is itself and that the only matrix similar to the identity is itself.

1. Show that the ${\displaystyle 1\!\times \!1}$ matrix ${\displaystyle (2)}$, also, is similar only to itself.
2. Is a matrix of the form ${\displaystyle cI}$ for some scalar ${\displaystyle c}$ similar only to itself?
3. Is a diagonal matrix similar only to itself?
1. Because the matrix ${\displaystyle (2)}$ is ${\displaystyle 1\!\times \!1}$, the matrices ${\displaystyle P}$ and ${\displaystyle P^{-1}}$ are also ${\displaystyle 1\!\times \!1}$ and so where ${\displaystyle P=(p)}$ the inverse is ${\displaystyle P^{-1}=(1/p)}$. Thus ${\displaystyle P(2)P^{-1}=(p)(2)(1/p)=(2)}$.
2. Yes: recall that scalar multiples can be brought out of a matrix ${\displaystyle P(cI)P^{-1}=cPIP^{-1}=cI}$. By the way, the zero and identity matrices are the special cases ${\displaystyle c=0}$ and ${\displaystyle c=1}$.
3. No, as this example shows.
${\displaystyle {\begin{pmatrix}1&-2\\-1&1\end{pmatrix}}{\begin{pmatrix}-1&0\\0&-3\end{pmatrix}}{\begin{pmatrix}-1&-2\\-1&-1\end{pmatrix}}={\begin{pmatrix}-5&-4\\2&1\end{pmatrix}}}$
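That this conjugate of a diagonal matrix fails to be diagonal can be confirmed in the same mechanical way; in this sketch the helper names are ours and the matrices are the ones displayed above.

```python
from fractions import Fraction as F

def matmul(A, B):
    # product of two matrices given as lists of rows
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    # inverse of a 2x2 matrix by the adjugate formula
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

Q = [[F(1), F(-2)], [F(-1), F(1)]]
D = [[F(-1), F(0)], [F(0), F(-3)]]
result = matmul(matmul(Q, D), inv2(Q))
print(result == [[F(-5), F(-4)], [F(2), F(1)]])  # True
```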
Problem 3

Show that these matrices are not similar.

${\displaystyle {\begin{pmatrix}1&0&4\\1&1&3\\2&1&7\end{pmatrix}}\qquad {\begin{pmatrix}1&0&1\\0&1&1\\3&1&2\end{pmatrix}}}$

Gauss' method shows that the first matrix represents maps of rank two while the second matrix represents maps of rank three.
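The two ranks can be verified by running Gauss' method mechanically; the `rank` helper below is our own sketch, using exact rational arithmetic to avoid rounding.

```python
from fractions import Fraction as F

def rank(M):
    # row-reduce a rational copy of M and count the pivot rows
    M = [[F(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 0, 4], [1, 1, 3], [2, 1, 7]]
B = [[1, 0, 1], [0, 1, 1], [3, 1, 2]]
print(rank(A), rank(B))  # 2 3
```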

Problem 4

Consider the transformation ${\displaystyle t:{\mathcal {P}}_{2}\to {\mathcal {P}}_{2}}$ described by ${\displaystyle x^{2}\mapsto x+1}$, ${\displaystyle x\mapsto x^{2}-1}$, and ${\displaystyle 1\mapsto 3}$.

1. Find ${\displaystyle T={\rm {Rep}}_{B,B}(t)}$ where ${\displaystyle B=\langle x^{2},x,1\rangle }$.
2. Find ${\displaystyle S={\rm {Rep}}_{D,D}(t)}$ where ${\displaystyle D=\langle 1,1+x,1+x+x^{2}\rangle }$.
3. Find the matrix ${\displaystyle P}$ such that ${\displaystyle T=PSP^{-1}}$.
1. Because ${\displaystyle t}$ is described with the members of ${\displaystyle B}$, finding the matrix representation is easy:
${\displaystyle {\rm {Rep}}_{B}(t(x^{2}))={\begin{pmatrix}0\\1\\1\end{pmatrix}}_{B}\quad {\rm {Rep}}_{B}(t(x))={\begin{pmatrix}1\\0\\-1\end{pmatrix}}_{B}\quad {\rm {Rep}}_{B}(t(1))={\begin{pmatrix}0\\0\\3\end{pmatrix}}_{B}}$
gives this.
${\displaystyle {\rm {Rep}}_{B,B}(t)={\begin{pmatrix}0&1&0\\1&0&0\\1&-1&3\end{pmatrix}}}$
2. We will find ${\displaystyle t(1)}$, ${\displaystyle t(1+x)}$, and ${\displaystyle t(1+x+x^{2})}$, to find how each is represented with respect to ${\displaystyle D}$. We are given that ${\displaystyle t(1)=3}$, and the other two are easy to see: ${\displaystyle t(1+x)=x^{2}+2}$ and ${\displaystyle t(1+x+x^{2})=x^{2}+x+3}$. By eye, we get the representation of each vector
${\displaystyle {\rm {Rep}}_{D}(t(1))={\begin{pmatrix}3\\0\\0\end{pmatrix}}_{D}\quad {\rm {Rep}}_{D}(t(1+x))={\begin{pmatrix}2\\-1\\1\end{pmatrix}}_{D}\quad {\rm {Rep}}_{D}(t(1+x+x^{2}))={\begin{pmatrix}2\\0\\1\end{pmatrix}}_{D}}$
and thus the representation of the map.
${\displaystyle {\rm {Rep}}_{D,D}(t)={\begin{pmatrix}3&2&2\\0&-1&0\\0&1&1\end{pmatrix}}}$
3. The change of basis diagram, adapted for this ${\displaystyle T}$ and ${\displaystyle S}$, shows that ${\displaystyle P={\rm {Rep}}_{D,B}({\mbox{id}})}$: its columns represent the members of ${\displaystyle D}$ with respect to ${\displaystyle B}$.
${\displaystyle P={\begin{pmatrix}0&0&1\\0&1&1\\1&1&1\end{pmatrix}}}$
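Since inverting a ${\displaystyle 3\!\times \!3}$ matrix by hand is tedious, a quick way to confirm this answer is to verify the equivalent identity ${\displaystyle TP=PS}$ (multiply ${\displaystyle T=PSP^{-1}}$ on the right by ${\displaystyle P}$); the sketch below does exactly that, with `matmul` as our own helper.

```python
def matmul(A, B):
    # product of two matrices given as lists of rows
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

T = [[0, 1, 0], [1, 0, 0], [1, -1, 3]]
S = [[3, 2, 2], [0, -1, 0], [0, 1, 1]]
P = [[0, 0, 1], [0, 1, 1], [1, 1, 1]]
print(matmul(T, P) == matmul(P, S))  # True
```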
This exercise is recommended for all readers.
Problem 5

Exhibit a nontrivial similarity relationship in this way: let ${\displaystyle t:\mathbb {C} ^{2}\to \mathbb {C} ^{2}}$ act by

${\displaystyle {\begin{pmatrix}1\\2\end{pmatrix}}\mapsto {\begin{pmatrix}3\\0\end{pmatrix}}\qquad {\begin{pmatrix}-1\\1\end{pmatrix}}\mapsto {\begin{pmatrix}-1\\2\end{pmatrix}}}$

and pick two bases, and represent ${\displaystyle t}$ with respect to them, giving ${\displaystyle T={\rm {Rep}}_{B,B}(t)}$ and ${\displaystyle S={\rm {Rep}}_{D,D}(t)}$. Then compute ${\displaystyle P}$ and ${\displaystyle P^{-1}}$ to change bases from ${\displaystyle B}$ to ${\displaystyle D}$ and back again.

One possible choice of the bases is

${\displaystyle B=\langle {\begin{pmatrix}1\\2\end{pmatrix}},{\begin{pmatrix}-1\\1\end{pmatrix}}\rangle \qquad D={\mathcal {E}}_{2}=\langle {\begin{pmatrix}1\\0\end{pmatrix}},{\begin{pmatrix}0\\1\end{pmatrix}}\rangle }$

(this ${\displaystyle B}$ is suggested by the map description). To find the matrix ${\displaystyle T={\rm {Rep}}_{B,B}(t)}$, solve the relations

${\displaystyle c_{1}{\begin{pmatrix}1\\2\end{pmatrix}}+c_{2}{\begin{pmatrix}-1\\1\end{pmatrix}}={\begin{pmatrix}3\\0\end{pmatrix}}\qquad {\hat {c}}_{1}{\begin{pmatrix}1\\2\end{pmatrix}}+{\hat {c}}_{2}{\begin{pmatrix}-1\\1\end{pmatrix}}={\begin{pmatrix}-1\\2\end{pmatrix}}}$

to get ${\displaystyle c_{1}=1}$, ${\displaystyle c_{2}=-2}$, ${\displaystyle {\hat {c}}_{1}=1/3}$ and ${\displaystyle {\hat {c}}_{2}=4/3}$.

${\displaystyle {\rm {Rep}}_{B,B}(t)={\begin{pmatrix}1&1/3\\-2&4/3\end{pmatrix}}}$

Finding ${\displaystyle {\rm {Rep}}_{D,D}(t)}$ involves a bit more computation. We first find ${\displaystyle t({\vec {e}}_{1})}$. The relation

${\displaystyle c_{1}{\begin{pmatrix}1\\2\end{pmatrix}}+c_{2}{\begin{pmatrix}-1\\1\end{pmatrix}}={\begin{pmatrix}1\\0\end{pmatrix}}}$

gives ${\displaystyle c_{1}=1/3}$ and ${\displaystyle c_{2}=-2/3}$, and so

${\displaystyle {\rm {Rep}}_{B}({\vec {e}}_{1})={\begin{pmatrix}1/3\\-2/3\end{pmatrix}}_{B}}$

making

${\displaystyle {\rm {Rep}}_{B}(t({\vec {e}}_{1}))={\begin{pmatrix}1&1/3\\-2&4/3\end{pmatrix}}_{B,B}{\begin{pmatrix}1/3\\-2/3\end{pmatrix}}_{B}={\begin{pmatrix}1/9\\-14/9\end{pmatrix}}_{B}}$

and hence ${\displaystyle t}$ acts on the first basis vector ${\displaystyle {\vec {e}}_{1}}$ in this way.

${\displaystyle t({\vec {e}}_{1})=(1/9)\cdot {\begin{pmatrix}1\\2\end{pmatrix}}-(14/9)\cdot {\begin{pmatrix}-1\\1\end{pmatrix}}={\begin{pmatrix}5/3\\-4/3\end{pmatrix}}}$

The computation for ${\displaystyle t({\vec {e}}_{2})}$ is similar. The relation

${\displaystyle c_{1}{\begin{pmatrix}1\\2\end{pmatrix}}+c_{2}{\begin{pmatrix}-1\\1\end{pmatrix}}={\begin{pmatrix}0\\1\end{pmatrix}}}$

gives ${\displaystyle c_{1}=1/3}$ and ${\displaystyle c_{2}=1/3}$, so

${\displaystyle {\rm {Rep}}_{B}({\vec {e}}_{2})={\begin{pmatrix}1/3\\1/3\end{pmatrix}}_{B}}$

making

${\displaystyle {\rm {Rep}}_{B}(t({\vec {e}}_{2}))={\begin{pmatrix}1&1/3\\-2&4/3\end{pmatrix}}_{B,B}{\begin{pmatrix}1/3\\1/3\end{pmatrix}}_{B}={\begin{pmatrix}4/9\\-2/9\end{pmatrix}}_{B}}$

and hence ${\displaystyle t}$ acts on the second basis vector ${\displaystyle {\vec {e}}_{2}}$ in this way.

${\displaystyle t({\vec {e}}_{2})=(4/9)\cdot {\begin{pmatrix}1\\2\end{pmatrix}}-(2/9)\cdot {\begin{pmatrix}-1\\1\end{pmatrix}}={\begin{pmatrix}2/3\\2/3\end{pmatrix}}}$

Therefore

${\displaystyle {\rm {Rep}}_{D,D}(t)={\begin{pmatrix}5/3&2/3\\-4/3&2/3\end{pmatrix}}}$

and these are the change of basis matrices.

${\displaystyle P={\rm {Rep}}_{B,D}({\mbox{id}})={\begin{pmatrix}1&-1\\2&1\end{pmatrix}}\qquad P^{-1}={\bigl (}{\rm {Rep}}_{B,D}({\mbox{id}}){\bigr )}^{-1}={\begin{pmatrix}1&-1\\2&1\end{pmatrix}}^{-1}={\begin{pmatrix}1/3&1/3\\-2/3&1/3\end{pmatrix}}}$

The check of these computations is routine.

${\displaystyle {\begin{pmatrix}1&-1\\2&1\end{pmatrix}}{\begin{pmatrix}1&1/3\\-2&4/3\end{pmatrix}}{\begin{pmatrix}1/3&1/3\\-2/3&1/3\end{pmatrix}}={\begin{pmatrix}5/3&2/3\\-4/3&2/3\end{pmatrix}}}$
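The routine check can likewise be done by machine; exact rationals keep the thirds exact, and the helper names are ours.

```python
from fractions import Fraction as F

def matmul(A, B):
    # product of two matrices given as lists of rows
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    # inverse of a 2x2 matrix by the adjugate formula
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

T = [[F(1), F(1, 3)], [F(-2), F(4, 3)]]
P = [[F(1), F(-1)], [F(2), F(1)]]
S = matmul(matmul(P, T), inv2(P))
print(S == [[F(5, 3), F(2, 3)], [F(-4, 3), F(2, 3)]])  # True
```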
Problem 6

Explain Example 1.3 in terms of maps.

The only representation of a zero map is a zero matrix, no matter what the pair of bases: ${\displaystyle {\rm {Rep}}_{B,D}(z)=Z}$, and so in particular for any single basis ${\displaystyle B}$ we have ${\displaystyle {\rm {Rep}}_{B,B}(z)=Z}$. The case of the identity is related, but slightly different: the only representation of the identity map, with respect to any ${\displaystyle B,B}$, is the identity ${\displaystyle {\rm {Rep}}_{B,B}({\mbox{id}})=I}$. (Remark: of course, we have seen examples where ${\displaystyle B\neq D}$ and ${\displaystyle {\rm {Rep}}_{B,D}({\mbox{id}})\neq I}$— in fact, we have seen that any nonsingular matrix is a representation of the identity map with respect to some ${\displaystyle B,D}$.)

This exercise is recommended for all readers.
Problem 7

Are there two matrices ${\displaystyle A}$ and ${\displaystyle B}$ that are similar while ${\displaystyle A^{2}}$ and ${\displaystyle B^{2}}$ are not similar? (Halmos 1958)

No. If ${\displaystyle A=PBP^{-1}}$ then ${\displaystyle A^{2}=(PBP^{-1})(PBP^{-1})=PB^{2}P^{-1}}$.

This exercise is recommended for all readers.
Problem 8

Prove that if two matrices are similar and one is invertible then so is the other.

Matrix similarity is a special case of matrix equivalence (if matrices are similar then they are matrix equivalent) and matrix equivalence preserves nonsingularity. (Alternatively: similar matrices have equal determinants, and a matrix is invertible exactly when its determinant is nonzero.)

This exercise is recommended for all readers.
Problem 9

Show that similarity is an equivalence relation.

A matrix ${\displaystyle T}$ is similar to itself; take ${\displaystyle P}$ to be the identity matrix: ${\displaystyle ITI^{-1}=ITI=T}$.

If ${\displaystyle T}$ is similar to ${\displaystyle S}$ then ${\displaystyle T=PSP^{-1}}$ and so ${\displaystyle P^{-1}TP=S}$. Rewrite this as ${\displaystyle S=(P^{-1})T(P^{-1})^{-1}}$ to conclude that ${\displaystyle S}$ is similar to ${\displaystyle T}$.

If ${\displaystyle T}$ is similar to ${\displaystyle S}$ and ${\displaystyle S}$ is similar to ${\displaystyle U}$ then ${\displaystyle T=PSP^{-1}}$ and ${\displaystyle S=QUQ^{-1}}$. Then ${\displaystyle T=PQUQ^{-1}P^{-1}=(PQ)U(PQ)^{-1}}$, showing that ${\displaystyle T}$ is similar to ${\displaystyle U}$.

Problem 10

Consider a matrix representing, with respect to some ${\displaystyle B,B}$, reflection across the ${\displaystyle x}$-axis in ${\displaystyle \mathbb {R} ^{2}}$. Consider also a matrix representing, with respect to some ${\displaystyle D,D}$, reflection across the ${\displaystyle y}$-axis. Must they be similar?

Let ${\displaystyle f_{x}}$ and ${\displaystyle f_{y}}$ be the reflection maps (sometimes called "flips"). For any bases ${\displaystyle B}$ and ${\displaystyle D}$, the matrices ${\displaystyle {\rm {Rep}}_{B,B}(f_{x})}$ and ${\displaystyle {\rm {Rep}}_{D,D}(f_{y})}$ are similar. First note that

${\displaystyle S={\rm {Rep}}_{{\mathcal {E}}_{2},{\mathcal {E}}_{2}}(f_{x})={\begin{pmatrix}1&0\\0&-1\end{pmatrix}}\qquad T={\rm {Rep}}_{{\mathcal {E}}_{2},{\mathcal {E}}_{2}}(f_{y})={\begin{pmatrix}-1&0\\0&1\end{pmatrix}}}$

are similar because the second matrix is the representation of ${\displaystyle f_{x}}$ with respect to the basis ${\displaystyle A=\langle {\vec {e}}_{2},{\vec {e}}_{1}\rangle }$:

${\displaystyle {\begin{pmatrix}1&0\\0&-1\end{pmatrix}}=P{\begin{pmatrix}-1&0\\0&1\end{pmatrix}}P^{-1}}$

where ${\displaystyle P={\rm {Rep}}_{A,{\mathcal {E}}_{2}}({\mbox{id}})}$.

Now the conclusion follows from the transitivity part of Problem 9.

To finish without relying on that exercise, write ${\displaystyle {\rm {Rep}}_{B,B}(f_{x})=QSQ^{-1}=Q{\rm {Rep}}_{{\mathcal {E}}_{2},{\mathcal {E}}_{2}}(f_{x})Q^{-1}}$ and ${\displaystyle {\rm {Rep}}_{D,D}(f_{y})=RTR^{-1}=R{\rm {Rep}}_{{\mathcal {E}}_{2},{\mathcal {E}}_{2}}(f_{y})R^{-1}}$. By the equation in the first paragraph, the first of these becomes ${\displaystyle {\rm {Rep}}_{B,B}(f_{x})=QP{\rm {Rep}}_{{\mathcal {E}}_{2},{\mathcal {E}}_{2}}(f_{y})P^{-1}Q^{-1}}$. Rewriting the second as ${\displaystyle R^{-1}\cdot {\rm {Rep}}_{D,D}(f_{y})\cdot R={\rm {Rep}}_{{\mathcal {E}}_{2},{\mathcal {E}}_{2}}(f_{y})}$ and substituting gives the desired relationship

${\displaystyle {\rm {Rep}}_{B,B}(f_{x})=QP{\rm {Rep}}_{{\mathcal {E}}_{2},{\mathcal {E}}_{2}}(f_{y})P^{-1}Q^{-1}}$
${\displaystyle =QPR^{-1}\cdot {\rm {Rep}}_{D,D}(f_{y})\cdot RP^{-1}Q^{-1}=(QPR^{-1})\cdot {\rm {Rep}}_{D,D}(f_{y})\cdot (QPR^{-1})^{-1}}$

Thus the matrices ${\displaystyle {\rm {Rep}}_{B,B}(f_{x})}$ and ${\displaystyle {\rm {Rep}}_{D,D}(f_{y})}$ are similar.

Problem 11

Prove that similarity preserves determinants and rank. Does the converse hold?

We must show that if two matrices are similar then they have the same determinant and the same rank. Both determinant and rank are properties of matrices that we have already shown to be preserved by matrix equivalence. They are therefore preserved by similarity (which is a special case of matrix equivalence: if two matrices are similar then they are matrix equivalent).

To prove the statement without quoting the results about matrix equivalence, note first that rank is a property of the map (it is the dimension of the rangespace) and since we've shown that the rank of a map is the rank of a representation, it must be the same for all representations. As for determinants, ${\displaystyle \left|PSP^{-1}\right|=\left|P\right|\cdot \left|S\right|\cdot \left|P^{-1}\right|=\left|P\right|\cdot \left|S\right|\cdot \left|P\right|^{-1}=\left|S\right|}$.

The converse of the statement does not hold; for instance, there are matrices with the same determinant that are not similar. To check this, consider a nonzero matrix with a determinant of zero. It is not similar to the zero matrix (the zero matrix is similar only to itself), but the two have the same determinant. The argument for rank is much the same.

Problem 12

Is there a matrix equivalence class with only one matrix similarity class inside? One with infinitely many similarity classes?

The matrix equivalence class containing all ${\displaystyle n\!\times \!n}$ rank zero matrices contains only a single matrix, the zero matrix. Therefore it has as a subset only one similarity class.

In contrast, the matrix equivalence class of ${\displaystyle 1\!\times \!1}$ matrices of rank one consists of those ${\displaystyle 1\!\times \!1}$ matrices ${\displaystyle (k)}$ where ${\displaystyle k\neq 0}$. For any basis ${\displaystyle B}$, the representation of multiplication by the scalar ${\displaystyle k}$ is ${\displaystyle {\rm {Rep}}_{B,B}(t_{k})=(k)}$, so each such matrix is alone in its similarity class. So this is a case where a matrix equivalence class splits into infinitely many similarity classes.

Problem 13

Can two different diagonal matrices be in the same similarity class?

Yes, these are similar

${\displaystyle {\begin{pmatrix}1&0\\0&3\end{pmatrix}}\qquad {\begin{pmatrix}3&0\\0&1\end{pmatrix}}}$

since, where the first matrix is ${\displaystyle {\rm {Rep}}_{B,B}(t)}$ for ${\displaystyle B=\langle {\vec {\beta }}_{1},{\vec {\beta }}_{2}\rangle }$, the second matrix is ${\displaystyle {\rm {Rep}}_{D,D}(t)}$ for ${\displaystyle D=\langle {\vec {\beta }}_{2},{\vec {\beta }}_{1}\rangle }$.

This exercise is recommended for all readers.
Problem 14

Prove that if two matrices are similar then their ${\displaystyle k}$-th powers are similar when ${\displaystyle k>0}$. What if ${\displaystyle k\leq 0}$?

The ${\displaystyle k}$-th powers are similar because, where each matrix represents the map ${\displaystyle t}$, the ${\displaystyle k}$-th powers represent ${\displaystyle t^{k}}$, the composition of ${\displaystyle k}$-many ${\displaystyle t}$'s. (For instance, if ${\displaystyle T={\rm {Rep}}_{B,B}(t)}$ then ${\displaystyle T^{2}={\rm {Rep}}_{B,B}(t\circ t)}$.)

Restated more computationally, if ${\displaystyle T=PSP^{-1}}$ then ${\displaystyle T^{2}=(PSP^{-1})(PSP^{-1})=PS^{2}P^{-1}}$. Induction extends that to all powers.

For the ${\displaystyle k\leq 0}$ case, suppose that ${\displaystyle S}$ is invertible and that ${\displaystyle T=PSP^{-1}}$. Note that ${\displaystyle T}$ is invertible: ${\displaystyle T^{-1}=(PSP^{-1})^{-1}=PS^{-1}P^{-1}}$, and that same equation shows that ${\displaystyle T^{-1}}$ is similar to ${\displaystyle S^{-1}}$. Other negative powers are now given by the first paragraph.
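For a concrete instance of the ${\displaystyle k\leq 0}$ case, the sketch below (with matrices ${\displaystyle S}$ and ${\displaystyle P}$ that are our own arbitrary choices, not from the text) checks that ${\displaystyle T^{-1}=PS^{-1}P^{-1}}$ exactly.

```python
from fractions import Fraction as F

def matmul(A, B):
    # product of two matrices given as lists of rows
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    # inverse of a 2x2 matrix by the adjugate formula
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

S = [[F(1), F(0)], [F(0), F(3)]]
P = [[F(1), F(-1)], [F(2), F(1)]]
T = matmul(matmul(P, S), inv2(P))
print(inv2(T) == matmul(matmul(P, inv2(S)), inv2(P)))  # True
```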

This exercise is recommended for all readers.
Problem 15

Let ${\displaystyle p(x)}$ be the polynomial ${\displaystyle c_{n}x^{n}+\cdots +c_{1}x+c_{0}}$. Show that if ${\displaystyle T}$ is similar to ${\displaystyle S}$ then ${\displaystyle p(T)=c_{n}T^{n}+\cdots +c_{1}T+c_{0}I}$ is similar to ${\displaystyle p(S)=c_{n}S^{n}+\cdots +c_{1}S+c_{0}I}$.

In conceptual terms, both represent ${\displaystyle p(t)}$ for some transformation ${\displaystyle t}$. In computational terms, we have this.

${\displaystyle {\begin{array}{rl}p(T)&=c_{n}(PSP^{-1})^{n}+\dots +c_{1}(PSP^{-1})+c_{0}I\\&=c_{n}PS^{n}P^{-1}+\dots +c_{1}PSP^{-1}+c_{0}I\\&=Pc_{n}S^{n}P^{-1}+\dots +Pc_{1}SP^{-1}+Pc_{0}P^{-1}\\&=P(c_{n}S^{n}+\dots +c_{1}S+c_{0}I)P^{-1}\end{array}}}$
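This computation can be spot-checked numerically; in the sketch below the polynomial ${\displaystyle p(x)=x^{2}+2x+1}$ and the matrices ${\displaystyle S}$, ${\displaystyle P}$ are our own arbitrary choices, not from the text.

```python
from fractions import Fraction as F

def matmul(A, B):
    # product of two matrices given as lists of rows
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    # inverse of a 2x2 matrix by the adjugate formula
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def poly2(M, c2, c1, c0):
    # p(M) = c2*M^2 + c1*M + c0*I for a 2x2 matrix M
    M2 = matmul(M, M)
    I = [[F(1), F(0)], [F(0), F(1)]]
    return [[c2 * M2[i][j] + c1 * M[i][j] + c0 * I[i][j] for j in range(2)]
            for i in range(2)]

S = [[F(1), F(0)], [F(0), F(3)]]
P = [[F(1), F(-1)], [F(2), F(1)]]
T = matmul(matmul(P, S), inv2(P))
lhs = poly2(T, F(1), F(2), F(1))
rhs = matmul(matmul(P, poly2(S, F(1), F(2), F(1))), inv2(P))
print(lhs == rhs)  # True
```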
Problem 16

List all of the matrix equivalence classes of ${\displaystyle 1\!\times \!1}$ matrices. Also list the similarity classes, and describe which similarity classes are contained inside of each matrix equivalence class.

There are two equivalence classes: (1) the class of rank zero matrices, of which there is one, ${\displaystyle {\mathcal {C}}_{1}=\{(0)\}}$, and (2) the class of rank one matrices, of which there are infinitely many, ${\displaystyle {\mathcal {C}}_{2}=\{(k)\,{\big |}\,k\neq 0\}}$.

Each ${\displaystyle 1\!\times \!1}$ matrix is alone in its similarity class. That's because any transformation of a one-dimensional space is multiplication by a scalar ${\displaystyle t_{k}:V\to V}$ given by ${\displaystyle {\vec {v}}\mapsto k\cdot {\vec {v}}}$. Thus, for any basis ${\displaystyle B=\langle {\vec {\beta }}\rangle }$, the matrix representing a transformation ${\displaystyle t_{k}}$ with respect to ${\displaystyle B,B}$ is ${\displaystyle ({\rm {Rep}}_{B}(t_{k}({\vec {\beta }})))=(k)}$.

So, contained in the matrix equivalence class ${\displaystyle {\mathcal {C}}_{1}}$ is (obviously) the single similarity class consisting of the matrix ${\displaystyle (0)}$. And, contained in the matrix equivalence class ${\displaystyle {\mathcal {C}}_{2}}$ are the infinitely many, one-member-each, similarity classes consisting of ${\displaystyle (k)}$ for ${\displaystyle k\neq 0}$.

Problem 17

Does similarity preserve sums?

No. Here is an example with two pairs of similar matrices:

${\displaystyle {\begin{pmatrix}1&-1\\1&2\end{pmatrix}}{\begin{pmatrix}1&0\\0&3\end{pmatrix}}{\begin{pmatrix}2/3&1/3\\-1/3&1/3\end{pmatrix}}={\begin{pmatrix}5/3&-2/3\\-4/3&7/3\end{pmatrix}}}$

and

${\displaystyle {\begin{pmatrix}1&-2\\-1&1\end{pmatrix}}{\begin{pmatrix}-1&0\\0&-3\end{pmatrix}}{\begin{pmatrix}-1&-2\\-1&-1\end{pmatrix}}={\begin{pmatrix}-5&-4\\2&1\end{pmatrix}}}$

(this example is mostly arbitrary, but not entirely, because the center matrices on the two left sides add to the zero matrix). Note that the sums of these similar matrices are not similar

${\displaystyle {\begin{pmatrix}1&0\\0&3\end{pmatrix}}+{\begin{pmatrix}-1&0\\0&-3\end{pmatrix}}={\begin{pmatrix}0&0\\0&0\end{pmatrix}}\qquad {\begin{pmatrix}5/3&-2/3\\-4/3&7/3\end{pmatrix}}+{\begin{pmatrix}-5&-4\\2&1\end{pmatrix}}\neq {\begin{pmatrix}0&0\\0&0\end{pmatrix}}}$

since the zero matrix is similar only to itself.

Problem 18

Show that if ${\displaystyle T-\lambda I}$ and ${\displaystyle N}$ are similar matrices then ${\displaystyle T}$ and ${\displaystyle N+\lambda I}$ are also similar.

If ${\displaystyle N=P(T-\lambda I)P^{-1}}$ then ${\displaystyle N=PTP^{-1}-P(\lambda I)P^{-1}}$. The diagonal matrix ${\displaystyle \lambda I}$ commutes with anything, so ${\displaystyle P(\lambda I)P^{-1}=PP^{-1}(\lambda I)=\lambda I}$. Thus ${\displaystyle N=PTP^{-1}-\lambda I}$ and consequently ${\displaystyle N+\lambda I=PTP^{-1}}$. (So not only are they similar, in fact they are similar via the same ${\displaystyle P}$.)