Linear Algebra/Eigenvalues and Eigenvectors


In this subsection we will focus on the property of Corollary 2.4.

Definition 3.1

A transformation ${\displaystyle t:V\to V}$ has a scalar eigenvalue ${\displaystyle \lambda }$ if there is a nonzero eigenvector ${\displaystyle {\vec {\zeta }}\in V}$ such that ${\displaystyle t({\vec {\zeta }})=\lambda \cdot {\vec {\zeta }}}$.

("Eigen" is German for "characteristic of" or "peculiar to"; some authors call these characteristic values and vectors. No authors call them "peculiar".)

Example 3.2

The projection map

${\displaystyle {\begin{pmatrix}x\\y\\z\end{pmatrix}}{\stackrel {\pi }{\longmapsto }}{\begin{pmatrix}x\\y\\0\end{pmatrix}}\qquad x,y,z\in \mathbb {C} }$

has an eigenvalue of ${\displaystyle 1}$ associated with any eigenvector of the form

${\displaystyle {\begin{pmatrix}x\\y\\0\end{pmatrix}}}$

where ${\displaystyle x}$ and ${\displaystyle y}$ are scalars at least one of which is non-${\displaystyle 0}$. On the other hand, ${\displaystyle 2}$ is not an eigenvalue of ${\displaystyle \pi }$ since no non-${\displaystyle {\vec {0}}}$ vector is doubled.
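The claims in this example are easy to check numerically. Here is a sketch using NumPy (not part of the text's development); the matrix shown represents ${\displaystyle \pi }$ with respect to the standard basis, and the floating-point comparisons are approximate.

```python
import numpy as np

# Matrix representing the projection pi with respect to the standard basis.
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

# Any vector of the form (x, y, 0) is fixed: an eigenvector for eigenvalue 1.
v = np.array([3.0, -5.0, 0.0])
assert np.allclose(P @ v, 1 * v)

# 2 is not an eigenvalue: the eigenvalues of P are 1, 1, and 0.
assert np.allclose(sorted(np.linalg.eigvals(P)), [0.0, 1.0, 1.0])
```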

That example shows why the "non-${\displaystyle {\vec {0}}}$" appears in the definition. Disallowing ${\displaystyle {\vec {0}}}$ as an eigenvector rules out a trivial case: since ${\displaystyle t({\vec {0}})=\lambda \cdot {\vec {0}}}$ holds for every scalar ${\displaystyle \lambda }$, allowing ${\displaystyle {\vec {0}}}$ would make every scalar an eigenvalue of every transformation.

Example 3.3

The only transformation on the trivial space ${\displaystyle \{{\vec {0}}\,\}}$ is

${\displaystyle {\vec {0}}\mapsto {\vec {0}}}$.

This map has no eigenvalues because there are no non-${\displaystyle {\vec {0}}}$ vectors ${\displaystyle {\vec {v}}}$ mapped to a scalar multiple ${\displaystyle \lambda \cdot {\vec {v}}}$ of themselves.

Example 3.4

Consider the homomorphism ${\displaystyle t:{\mathcal {P}}_{1}\to {\mathcal {P}}_{1}}$ given by ${\displaystyle c_{0}+c_{1}x\mapsto (c_{0}+c_{1})+(c_{0}+c_{1})x}$. The range of ${\displaystyle t}$ is one-dimensional. Thus an application of ${\displaystyle t}$ to a vector in the range will simply rescale that vector: ${\displaystyle c+cx\mapsto (2c)+(2c)x}$. That is, ${\displaystyle t}$ has an eigenvalue of ${\displaystyle 2}$ associated with eigenvectors of the form ${\displaystyle c+cx}$ where ${\displaystyle c\neq 0}$.

This map also has an eigenvalue of ${\displaystyle 0}$ associated with eigenvectors of the form ${\displaystyle c-cx}$ where ${\displaystyle c\neq 0}$.
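Both eigenvalues can be checked by machine; a NumPy sketch, where the matrix represents ${\displaystyle t}$ with respect to the basis ${\displaystyle \langle 1,x\rangle }$ (an assumed choice; the text has not fixed one here):

```python
import numpy as np

# Matrix of t: c0 + c1*x  |->  (c0+c1) + (c0+c1)*x, with respect to <1, x>.
T = np.array([[1.0, 1.0],
              [1.0, 1.0]])

v2 = np.array([1.0, 1.0])    # represents 1 + 1x, of the form c + cx
v0 = np.array([1.0, -1.0])   # represents 1 - 1x, of the form c - cx
assert np.allclose(T @ v2, 2 * v2)   # eigenvalue 2
assert np.allclose(T @ v0, 0 * v0)   # eigenvalue 0
```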

Definition 3.5

A square matrix ${\displaystyle T}$ has a scalar eigenvalue ${\displaystyle \lambda }$ associated with the non-${\displaystyle {\vec {0}}}$ eigenvector ${\displaystyle {\vec {\zeta }}}$ if ${\displaystyle T{\vec {\zeta }}=\lambda \cdot {\vec {\zeta }}}$.

Remark 3.6

Although this extension from maps to matrices is obvious, there is a point that must be made. Eigenvalues of a map are also the eigenvalues of matrices representing that map, and so similar matrices have the same eigenvalues. But the eigenvectors are different— similar matrices need not have the same eigenvectors.

For instance, consider again the transformation ${\displaystyle t:{\mathcal {P}}_{1}\to {\mathcal {P}}_{1}}$ given by ${\displaystyle c_{0}+c_{1}x\mapsto (c_{0}+c_{1})+(c_{0}+c_{1})x}$. It has an eigenvalue of ${\displaystyle 2}$ associated with eigenvectors of the form ${\displaystyle c+cx}$ where ${\displaystyle c\neq 0}$. If we represent ${\displaystyle t}$ with respect to ${\displaystyle B=\langle 1+1x,1-1x\rangle }$

${\displaystyle T={\rm {Rep}}_{B,B}(t)={\begin{pmatrix}2&0\\0&0\end{pmatrix}}}$

then ${\displaystyle 2}$ is an eigenvalue of ${\displaystyle T}$, associated with these eigenvectors.

${\displaystyle \{{\begin{pmatrix}c_{0}\\c_{1}\end{pmatrix}}\,{\big |}\,{\begin{pmatrix}2&0\\0&0\end{pmatrix}}{\begin{pmatrix}c_{0}\\c_{1}\end{pmatrix}}={\begin{pmatrix}2c_{0}\\2c_{1}\end{pmatrix}}\}=\{{\begin{pmatrix}c_{0}\\0\end{pmatrix}}\,{\big |}\,c_{0}\in \mathbb {C} ,\,c_{0}\neq 0\}}$

On the other hand, representing ${\displaystyle t}$ with respect to ${\displaystyle D=\langle 2+1x,1+0x\rangle }$ gives

${\displaystyle S={\rm {Rep}}_{D,D}(t)={\begin{pmatrix}3&1\\-3&-1\end{pmatrix}}}$

and the eigenvectors of ${\displaystyle S}$ associated with the eigenvalue ${\displaystyle 2}$ are these.

${\displaystyle \{{\begin{pmatrix}c_{0}\\c_{1}\end{pmatrix}}\,{\big |}\,{\begin{pmatrix}3&1\\-3&-1\end{pmatrix}}{\begin{pmatrix}c_{0}\\c_{1}\end{pmatrix}}={\begin{pmatrix}2c_{0}\\2c_{1}\end{pmatrix}}\}=\{{\begin{pmatrix}c_{0}\\-c_{0}\end{pmatrix}}\,{\big |}\,c_{0}\in \mathbb {C} ,\,c_{0}\neq 0\}}$

Thus similar matrices can have different eigenvectors.

Here is an informal description of what's happening. The underlying transformation doubles the eigenvectors ${\displaystyle {\vec {v}}\mapsto 2\cdot {\vec {v}}}$. But when the matrix representing the transformation is ${\displaystyle T={\rm {Rep}}_{B,B}(t)}$ then it "assumes" that column vectors are representations with respect to ${\displaystyle B}$. In contrast, ${\displaystyle S={\rm {Rep}}_{D,D}(t)}$ "assumes" that column vectors are representations with respect to ${\displaystyle D}$. So the vectors that get doubled by each matrix look different.
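The remark's point, that similar matrices share eigenvalues but not eigenvectors, can be confirmed numerically; a NumPy sketch using the two representations above:

```python
import numpy as np

T = np.array([[2.0, 0.0],
              [0.0, 0.0]])     # Rep_{B,B}(t)
S = np.array([[3.0, 1.0],
              [-3.0, -1.0]])   # Rep_{D,D}(t), similar to T

# Same eigenvalues: 2 and 0 for both matrices.
assert np.allclose(sorted(np.linalg.eigvals(T)), sorted(np.linalg.eigvals(S)))

# Different eigenvectors: T doubles (1, 0) but S does not ...
v = np.array([1.0, 0.0])
assert np.allclose(T @ v, 2 * v)
assert not np.allclose(S @ v, 2 * v)

# ... while S doubles (1, -1), the representation of 1+1x with respect to D.
w = np.array([1.0, -1.0])
assert np.allclose(S @ w, 2 * w)
```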

The next example illustrates the basic tool for finding eigenvectors and eigenvalues.

Example 3.7

What are the eigenvalues and eigenvectors of this matrix?

${\displaystyle T={\begin{pmatrix}1&2&1\\2&0&-2\\-1&2&3\end{pmatrix}}}$

To find the scalars ${\displaystyle x}$ such that ${\displaystyle T{\vec {\zeta }}=x{\vec {\zeta }}}$ for non-${\displaystyle {\vec {0}}}$ eigenvectors ${\displaystyle {\vec {\zeta }}}$, bring everything to the left-hand side

${\displaystyle {\begin{pmatrix}1&2&1\\2&0&-2\\-1&2&3\end{pmatrix}}{\begin{pmatrix}z_{1}\\z_{2}\\z_{3}\end{pmatrix}}-x{\begin{pmatrix}z_{1}\\z_{2}\\z_{3}\end{pmatrix}}={\vec {0}}}$

and factor ${\displaystyle (T-xI){\vec {\zeta }}={\vec {0}}}$. (Note that it says ${\displaystyle T-xI}$; the expression ${\displaystyle T-x}$ doesn't make sense because ${\displaystyle T}$ is a matrix while ${\displaystyle x}$ is a scalar.) This homogeneous linear system

${\displaystyle {\begin{pmatrix}1-x&2&1\\2&0-x&-2\\-1&2&3-x\end{pmatrix}}{\begin{pmatrix}z_{1}\\z_{2}\\z_{3}\end{pmatrix}}={\begin{pmatrix}0\\0\\0\end{pmatrix}}}$

has a non-${\displaystyle {\vec {0}}}$ solution if and only if the matrix is singular. We can determine when that happens.

${\displaystyle {\begin{array}{rl}0&=\left|T-xI\right|\\&={\begin{vmatrix}1-x&2&1\\2&0-x&-2\\-1&2&3-x\end{vmatrix}}\\&=-x^{3}+4x^{2}-4x\\&=-x(x-2)^{2}\end{array}}}$

The eigenvalues are ${\displaystyle \lambda _{1}=0}$ and ${\displaystyle \lambda _{2}=2}$. To find the associated eigenvectors, plug in each eigenvalue. Plugging in ${\displaystyle \lambda _{1}=0}$ gives

${\displaystyle {\begin{pmatrix}1-0&2&1\\2&0-0&-2\\-1&2&3-0\end{pmatrix}}{\begin{pmatrix}z_{1}\\z_{2}\\z_{3}\end{pmatrix}}={\begin{pmatrix}0\\0\\0\end{pmatrix}}\qquad \Longrightarrow \qquad {\begin{pmatrix}z_{1}\\z_{2}\\z_{3}\end{pmatrix}}={\begin{pmatrix}a\\-a\\a\end{pmatrix}}}$

for a scalar parameter ${\displaystyle a\neq 0}$ (${\displaystyle a}$ is non-${\displaystyle 0}$ because eigenvectors must be non-${\displaystyle {\vec {0}}}$). In the same way, plugging in ${\displaystyle \lambda _{2}=2}$ gives

${\displaystyle {\begin{pmatrix}1-2&2&1\\2&0-2&-2\\-1&2&3-2\end{pmatrix}}{\begin{pmatrix}z_{1}\\z_{2}\\z_{3}\end{pmatrix}}={\begin{pmatrix}0\\0\\0\end{pmatrix}}\qquad \Longrightarrow \qquad {\begin{pmatrix}z_{1}\\z_{2}\\z_{3}\end{pmatrix}}={\begin{pmatrix}b\\0\\b\end{pmatrix}}}$

with ${\displaystyle b\neq 0}$.
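The steps of this example can be verified numerically; a NumPy sketch (floating-point, so singularity is tested against a small tolerance):

```python
import numpy as np

T = np.array([[1.0, 2.0, 1.0],
              [2.0, 0.0, -2.0],
              [-1.0, 2.0, 3.0]])

# T - xI is singular exactly at the eigenvalues x = 0 and x = 2.
assert abs(np.linalg.det(T - 0 * np.eye(3))) < 1e-9
assert abs(np.linalg.det(T - 2 * np.eye(3))) < 1e-9

# np.poly returns the monic characteristic polynomial det(xI - T),
# here x^3 - 4x^2 + 4x = x(x-2)^2, whose roots are 0 and 2.
assert np.allclose(np.poly(T), [1.0, -4.0, 4.0, 0.0], atol=1e-6)

# The eigenvector families found by back-substitution.
a, b = 5.0, -3.0
assert np.allclose(T @ np.array([a, -a, a]), 0.0)
assert np.allclose(T @ np.array([b, 0.0, b]), 2 * np.array([b, 0.0, b]))
```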

Example 3.8

If

${\displaystyle S={\begin{pmatrix}\pi &1\\0&3\end{pmatrix}}}$

(here ${\displaystyle \pi }$ is not a projection map, it is the number ${\displaystyle 3.14\ldots }$) then

${\displaystyle \left|{\begin{pmatrix}\pi -x&1\\0&3-x\end{pmatrix}}\right|=(x-\pi )(x-3)}$

so ${\displaystyle S}$ has eigenvalues of ${\displaystyle \lambda _{1}=\pi }$ and ${\displaystyle \lambda _{2}=3}$. To find associated eigenvectors, first plug in ${\displaystyle \lambda _{1}}$ for ${\displaystyle x}$:

${\displaystyle {\begin{pmatrix}\pi -\pi &1\\0&3-\pi \end{pmatrix}}{\begin{pmatrix}z_{1}\\z_{2}\end{pmatrix}}={\begin{pmatrix}0\\0\end{pmatrix}}\qquad \Longrightarrow \qquad {\begin{pmatrix}z_{1}\\z_{2}\end{pmatrix}}={\begin{pmatrix}a\\0\end{pmatrix}}}$

for a scalar ${\displaystyle a\neq 0}$, and then plug in ${\displaystyle \lambda _{2}}$:

${\displaystyle {\begin{pmatrix}\pi -3&1\\0&3-3\end{pmatrix}}{\begin{pmatrix}z_{1}\\z_{2}\end{pmatrix}}={\begin{pmatrix}0\\0\end{pmatrix}}\qquad \Longrightarrow \qquad {\begin{pmatrix}z_{1}\\z_{2}\end{pmatrix}}={\begin{pmatrix}-b/(\pi -3)\\b\end{pmatrix}}}$

where ${\displaystyle b\neq 0}$.
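Because ${\displaystyle S}$ is triangular its eigenvalues sit on the diagonal (Problem 9 asks for a proof); a quick NumPy check of this example:

```python
import numpy as np

S = np.array([[np.pi, 1.0],
              [0.0, 3.0]])

# The eigenvalues of the triangular matrix S are its diagonal entries.
assert np.allclose(sorted(np.linalg.eigvals(S)), sorted([np.pi, 3.0]))

# The eigenvectors found above, with a = 2 and b = 5.
a, b = 2.0, 5.0
assert np.allclose(S @ np.array([a, 0.0]), np.pi * np.array([a, 0.0]))
w = np.array([-b / (np.pi - 3), b])
assert np.allclose(S @ w, 3 * w)
```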

Definition 3.9

The characteristic polynomial of a square matrix ${\displaystyle T}$ is the determinant of the matrix ${\displaystyle T-xI}$, where ${\displaystyle x}$ is a variable. The characteristic equation is ${\displaystyle \left|T-xI\right|=0}$. The characteristic polynomial of a transformation ${\displaystyle t}$ is the polynomial of any ${\displaystyle {\rm {Rep}}_{B,B}(t)}$.

Problem 11 checks that the characteristic polynomial of a transformation is well-defined, that is, any choice of basis yields the same polynomial.
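The definition translates directly into symbolic computation. A SymPy sketch computes ${\displaystyle \left|T-xI\right|}$ for a general ${\displaystyle 2\!\times \!2}$ matrix:

```python
import sympy as sp

a, b, c, d, x = sp.symbols('a b c d x')
T = sp.Matrix([[a, b],
               [c, d]])

# Characteristic polynomial |T - xI| of a general 2x2 matrix.
p = (T - x * sp.eye(2)).det()
assert sp.expand(p - (x**2 - (a + d)*x + (a*d - b*c))) == 0
```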

Lemma 3.10

A linear transformation on a nontrivial vector space has at least one eigenvalue.

Proof

Any root of the characteristic polynomial is an eigenvalue. Over the complex numbers, any polynomial of degree one or greater has a root. (This is the reason that in this chapter we've gone to scalars that are complex.)

Notice the familiar form of the sets of eigenvectors in the above examples.

Definition 3.11

The eigenspace of a transformation ${\displaystyle t}$ associated with the eigenvalue ${\displaystyle \lambda }$ is ${\displaystyle V_{\lambda }=\{{\vec {\zeta }}\,{\big |}\,t({\vec {\zeta }}\,)=\lambda {\vec {\zeta }}\,\}\cup \{{\vec {0}}\,\}}$. The eigenspace of a matrix is defined analogously.

Lemma 3.12

An eigenspace is a subspace.

Proof

An eigenspace must be nonempty— for one thing it contains the zero vector— and so we need only check closure. Take vectors ${\displaystyle {\vec {\zeta }}_{1},\ldots ,{\vec {\zeta }}_{n}}$ from ${\displaystyle V_{\lambda }}$; to show that any linear combination is in ${\displaystyle V_{\lambda }}$, compute

${\displaystyle {\begin{array}{rl}t(c_{1}{\vec {\zeta }}_{1}+c_{2}{\vec {\zeta }}_{2}+\cdots +c_{n}{\vec {\zeta }}_{n})&=c_{1}t({\vec {\zeta }}_{1})+\dots +c_{n}t({\vec {\zeta }}_{n})\\&=c_{1}\lambda {\vec {\zeta }}_{1}+\dots +c_{n}\lambda {\vec {\zeta }}_{n}\\&=\lambda (c_{1}{\vec {\zeta }}_{1}+\dots +c_{n}{\vec {\zeta }}_{n})\end{array}}}$

(the second equality holds even if any ${\displaystyle {\vec {\zeta }}_{i}}$ is ${\displaystyle {\vec {0}}}$ since ${\displaystyle t({\vec {0}})=\lambda \cdot {\vec {0}}={\vec {0}}}$).
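For a matrix, the eigenspace ${\displaystyle V_{\lambda }}$ is the null space of ${\displaystyle T-\lambda I}$ together with ${\displaystyle {\vec {0}}}$, so exact arithmetic recovers it directly. A SymPy sketch, reusing the matrix of Example 3.7:

```python
import sympy as sp

T = sp.Matrix([[1, 2, 1],
               [2, 0, -2],
               [-1, 2, 3]])

# V_2 is the null space of T - 2I; SymPy returns the basis vector (1, 0, 1).
basis = (T - 2 * sp.eye(3)).nullspace()
assert basis == [sp.Matrix([1, 0, 1])]

# Closure: a scalar multiple of an eigenvector is again in V_2.
v = 7 * basis[0]
assert T * v == 2 * v
```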

Example 3.13

In Example 3.8 the eigenspace associated with the eigenvalue ${\displaystyle \pi }$ and the eigenspace associated with the eigenvalue ${\displaystyle 3}$ are these.

${\displaystyle V_{\pi }=\{{\begin{pmatrix}a\\0\end{pmatrix}}\,{\big |}\,a\in \mathbb {C} \}\qquad V_{3}=\{{\begin{pmatrix}-b/(\pi -3)\\b\end{pmatrix}}\,{\big |}\,b\in \mathbb {C} \}}$

Example 3.14

In Example 3.7, these are the eigenspaces associated with the eigenvalues ${\displaystyle 0}$ and ${\displaystyle 2}$.

${\displaystyle V_{0}=\{{\begin{pmatrix}a\\-a\\a\end{pmatrix}}\,{\big |}\,a\in \mathbb {C} \},\qquad V_{2}=\{{\begin{pmatrix}b\\0\\b\end{pmatrix}}\,{\big |}\,b\in \mathbb {C} \}.}$

Remark 3.15

The characteristic equation is ${\displaystyle 0=x(x-2)^{2}}$ so in some sense ${\displaystyle 2}$ is an eigenvalue "twice". However there are not "twice" as many eigenvectors, in that the dimension of the eigenspace is one, not two. The next example shows a case where a number, ${\displaystyle 1}$, is a double root of the characteristic equation and the dimension of the associated eigenspace is two.
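This count of roots versus eigenspace dimension can be checked in exact arithmetic. A SymPy sketch with the matrix ${\displaystyle T}$ of Example 3.7:

```python
import sympy as sp

T = sp.Matrix([[1, 2, 1],
               [2, 0, -2],
               [-1, 2, 3]])

# eigenvects gives (eigenvalue, multiplicity as a root, eigenspace basis).
info = {val: (mult, len(vecs)) for val, mult, vecs in T.eigenvects()}

assert info[0] == (1, 1)   # 0 is a simple root with a 1-dimensional eigenspace
assert info[2] == (2, 1)   # 2 is a double root, yet dim V_2 is only 1
```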

Example 3.16

With respect to the standard bases, this matrix

${\displaystyle {\begin{pmatrix}1&0&0\\0&1&0\\0&0&0\end{pmatrix}}}$

represents projection.

${\displaystyle {\begin{pmatrix}x\\y\\z\end{pmatrix}}{\stackrel {\pi }{\longmapsto }}{\begin{pmatrix}x\\y\\0\end{pmatrix}}\qquad x,y,z\in \mathbb {C} }$

Its eigenspace associated with the eigenvalue ${\displaystyle 0}$ and its eigenspace associated with the eigenvalue ${\displaystyle 1}$ are easy to find.

${\displaystyle V_{0}=\{{\begin{pmatrix}0\\0\\c_{3}\end{pmatrix}}\,{\big |}\,c_{3}\in \mathbb {C} \}\qquad V_{1}=\{{\begin{pmatrix}c_{1}\\c_{2}\\0\end{pmatrix}}\,{\big |}\,c_{1},c_{2}\in \mathbb {C} \}}$
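Here, by contrast, the double root ${\displaystyle 1}$ has a two-dimensional eigenspace; a SymPy check:

```python
import sympy as sp

P = sp.Matrix([[1, 0, 0],
               [0, 1, 0],
               [0, 0, 0]])

# The double root 1 has a two-dimensional eigenspace ...
assert len((P - 1 * sp.eye(3)).nullspace()) == 2
# ... while the simple root 0 has a one-dimensional one.
assert len((P - 0 * sp.eye(3)).nullspace()) == 1
```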

By the lemma, if two eigenvectors ${\displaystyle {\vec {v}}_{1}}$ and ${\displaystyle {\vec {v}}_{2}}$ are associated with the same eigenvalue then any linear combination of those two is also an eigenvector associated with that same eigenvalue. But, if two eigenvectors ${\displaystyle {\vec {v}}_{1}}$ and ${\displaystyle {\vec {v}}_{2}}$ are associated with different eigenvalues then the sum ${\displaystyle {\vec {v}}_{1}+{\vec {v}}_{2}}$ need not be related to the eigenvalue of either one. In fact, just the opposite. If the eigenvalues are different then the eigenvectors are not linearly related.

Theorem 3.17

For any set of distinct eigenvalues of a map or matrix, a set of associated eigenvectors, one per eigenvalue, is linearly independent.

Proof

We will use induction on the number of eigenvalues. If there is no eigenvalue or only one eigenvalue then the set of associated eigenvectors is empty or is a singleton set with a non-${\displaystyle {\vec {0}}}$ member, and in either case is linearly independent.

For induction, assume that the theorem is true for any set of ${\displaystyle k}$ distinct eigenvalues, suppose that ${\displaystyle \lambda _{1},\dots ,\lambda _{k+1}}$ are distinct eigenvalues, and let ${\displaystyle {\vec {v}}_{1},\dots ,{\vec {v}}_{k+1}}$ be associated eigenvectors. If ${\displaystyle c_{1}{\vec {v}}_{1}+\dots +c_{k}{\vec {v}}_{k}+c_{k+1}{\vec {v}}_{k+1}={\vec {0}}}$ then after multiplying both sides of the displayed equation by ${\displaystyle \lambda _{k+1}}$, applying the map or matrix to both sides of the displayed equation, and subtracting the second result from the first, we have this.

${\displaystyle c_{1}(\lambda _{k+1}-\lambda _{1}){\vec {v}}_{1}+\dots +c_{k}(\lambda _{k+1}-\lambda _{k}){\vec {v}}_{k}+c_{k+1}(\lambda _{k+1}-\lambda _{k+1}){\vec {v}}_{k+1}={\vec {0}}}$

The induction hypothesis now applies: ${\displaystyle c_{1}(\lambda _{k+1}-\lambda _{1})=0,\dots ,c_{k}(\lambda _{k+1}-\lambda _{k})=0}$. Thus, as all the eigenvalues are distinct, ${\displaystyle c_{1},\,\dots ,\,c_{k}}$ are all ${\displaystyle 0}$. Finally, ${\displaystyle c_{k+1}}$ must also be ${\displaystyle 0}$ because we are left with the equation ${\displaystyle c_{k+1}{\vec {v}}_{k+1}={\vec {0}}}$ and ${\displaystyle {\vec {v}}_{k+1}\neq {\vec {0}}}$.

Example 3.18

The eigenvalues of

${\displaystyle {\begin{pmatrix}2&-2&2\\0&1&1\\-4&8&3\end{pmatrix}}}$

are distinct: ${\displaystyle \lambda _{1}=1}$, ${\displaystyle \lambda _{2}=2}$, and ${\displaystyle \lambda _{3}=3}$. A set of associated eigenvectors like

${\displaystyle \{{\begin{pmatrix}2\\1\\0\end{pmatrix}},{\begin{pmatrix}9\\4\\4\end{pmatrix}},{\begin{pmatrix}2\\1\\2\end{pmatrix}}\}}$

is linearly independent.
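Independence can be confirmed by checking that the matrix with these eigenvectors as columns is nonsingular; a NumPy sketch:

```python
import numpy as np

A = np.array([[2.0, -2.0, 2.0],
              [0.0, 1.0, 1.0],
              [-4.0, 8.0, 3.0]])

V = np.array([[2.0, 9.0, 2.0],
              [1.0, 4.0, 1.0],
              [0.0, 4.0, 2.0]])   # the eigenvectors, as columns

# Column i is an eigenvector associated with eigenvalue i+1.
for lam, v in zip([1.0, 2.0, 3.0], V.T):
    assert np.allclose(A @ v, lam * v)

# Distinct eigenvalues, so the columns are linearly independent:
# the matrix of eigenvectors has nonzero determinant.
assert abs(np.linalg.det(V)) > 1e-9
```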

Corollary 3.19

An ${\displaystyle n\!\times \!n}$ matrix with ${\displaystyle n}$ distinct eigenvalues is diagonalizable.

Proof

By Theorem 3.17, a set of ${\displaystyle n}$ associated eigenvectors, one per eigenvalue, is linearly independent and so forms a basis of eigenvectors for the space. Apply Corollary 2.4.
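In matrix terms, collecting the eigenvectors into the columns of a nonsingular matrix ${\displaystyle P}$ gives ${\displaystyle P^{-1}TP}$ diagonal. A NumPy sketch with the matrix of Example 3.18:

```python
import numpy as np

A = np.array([[2.0, -2.0, 2.0],
              [0.0, 1.0, 1.0],
              [-4.0, 8.0, 3.0]])   # eigenvalues 1, 2, 3 (Example 3.18)

P = np.array([[2.0, 9.0, 2.0],
              [1.0, 4.0, 1.0],
              [0.0, 4.0, 2.0]])   # associated eigenvectors, as columns

# Changing to the basis of eigenvectors diagonalizes the matrix.
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag([1.0, 2.0, 3.0]))
```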

Exercises

Problem 1

For each, find the characteristic polynomial and the eigenvalues.

1. ${\displaystyle {\begin{pmatrix}10&-9\\4&-2\end{pmatrix}}}$
2. ${\displaystyle {\begin{pmatrix}1&2\\4&3\end{pmatrix}}}$
3. ${\displaystyle {\begin{pmatrix}0&3\\7&0\end{pmatrix}}}$
4. ${\displaystyle {\begin{pmatrix}0&0\\0&0\end{pmatrix}}}$
5. ${\displaystyle {\begin{pmatrix}1&0\\0&1\end{pmatrix}}}$
This exercise is recommended for all readers.
Problem 2

For each matrix, find the characteristic equation, and the eigenvalues and associated eigenvectors.

1. ${\displaystyle {\begin{pmatrix}3&0\\8&-1\end{pmatrix}}}$
2. ${\displaystyle {\begin{pmatrix}3&2\\-1&0\end{pmatrix}}}$
Problem 3

Find the characteristic equation, and the eigenvalues and associated eigenvectors for this matrix. Hint. The eigenvalues are complex.

${\displaystyle {\begin{pmatrix}-2&-1\\5&2\end{pmatrix}}}$
Problem 4

Find the characteristic polynomial, the eigenvalues, and the associated eigenvectors of this matrix.

${\displaystyle {\begin{pmatrix}1&1&1\\0&0&1\\0&0&1\end{pmatrix}}}$
This exercise is recommended for all readers.
Problem 5

For each matrix, find the characteristic equation, and the eigenvalues and associated eigenvectors.

1. ${\displaystyle {\begin{pmatrix}3&-2&0\\-2&3&0\\0&0&5\end{pmatrix}}}$
2. ${\displaystyle {\begin{pmatrix}0&1&0\\0&0&1\\4&-17&8\end{pmatrix}}}$
This exercise is recommended for all readers.
Problem 6

Let ${\displaystyle t:{\mathcal {P}}_{2}\to {\mathcal {P}}_{2}}$ be

${\displaystyle a_{0}+a_{1}x+a_{2}x^{2}\mapsto (5a_{0}+6a_{1}+2a_{2})-(a_{1}+8a_{2})x+(a_{0}-2a_{2})x^{2}.}$

Find its eigenvalues and the associated eigenvectors.

Problem 7

Find the eigenvalues and eigenvectors of this map ${\displaystyle t:{\mathcal {M}}_{2}\to {\mathcal {M}}_{2}}$.

${\displaystyle {\begin{pmatrix}a&b\\c&d\end{pmatrix}}\mapsto {\begin{pmatrix}2c&a+c\\b-2c&d\end{pmatrix}}}$
This exercise is recommended for all readers.
Problem 8

Find the eigenvalues and associated eigenvectors of the differentiation operator ${\displaystyle d/dx:{\mathcal {P}}_{3}\to {\mathcal {P}}_{3}}$.

Problem 9

Prove that the eigenvalues of a triangular matrix (upper or lower triangular) are the entries on the diagonal.

This exercise is recommended for all readers.
Problem 10

Find the formula for the characteristic polynomial of a ${\displaystyle 2\!\times \!2}$ matrix.

Problem 11

Prove that the characteristic polynomial of a transformation is well-defined.

This exercise is recommended for all readers.
Problem 12
1. Can any non-${\displaystyle {\vec {0}}}$ vector in any nontrivial vector space be an eigenvector? That is, given a ${\displaystyle {\vec {v}}\neq {\vec {0}}}$ from a nontrivial ${\displaystyle V}$, is there a transformation ${\displaystyle t:V\to V}$ and a scalar ${\displaystyle \lambda \in \mathbb {R} }$ such that ${\displaystyle t({\vec {v}})=\lambda {\vec {v}}}$?
2. Given a scalar ${\displaystyle \lambda }$, can any non-${\displaystyle {\vec {0}}}$ vector in any nontrivial vector space be an eigenvector associated with the eigenvalue ${\displaystyle \lambda }$?
This exercise is recommended for all readers.
Problem 13

Suppose that ${\displaystyle t:V\to V}$ and ${\displaystyle T={\rm {Rep}}_{B,B}(t)}$. Prove that the eigenvectors of ${\displaystyle T}$ associated with ${\displaystyle \lambda }$ are the non-${\displaystyle {\vec {0}}}$ vectors in the kernel of the map represented (with respect to the same bases) by ${\displaystyle T-\lambda I}$.

Problem 14

Prove that if ${\displaystyle a,\ldots ,\,d}$ are all integers and ${\displaystyle a+b=c+d}$ then

${\displaystyle {\begin{pmatrix}a&b\\c&d\end{pmatrix}}}$

has integral eigenvalues, namely ${\displaystyle a+b}$ and ${\displaystyle a-c}$.

This exercise is recommended for all readers.
Problem 15

Prove that if ${\displaystyle T}$ is nonsingular and has eigenvalues ${\displaystyle \lambda _{1},\dots ,\lambda _{n}}$ then ${\displaystyle T^{-1}}$ has eigenvalues ${\displaystyle 1/\lambda _{1},\dots ,1/\lambda _{n}}$. Is the converse true?

This exercise is recommended for all readers.
Problem 16

Suppose that ${\displaystyle T}$ is ${\displaystyle n\!\times \!n}$ and ${\displaystyle c,d}$ are scalars.

1. Prove that if ${\displaystyle T}$ has the eigenvalue ${\displaystyle \lambda }$ with an associated eigenvector ${\displaystyle {\vec {v}}}$ then ${\displaystyle {\vec {v}}}$ is an eigenvector of ${\displaystyle cT+dI}$ associated with eigenvalue ${\displaystyle c\lambda +d}$.
2. Prove that if ${\displaystyle T}$ is diagonalizable then so is ${\displaystyle cT+dI}$.
This exercise is recommended for all readers.
Problem 17

Show that ${\displaystyle \lambda }$ is an eigenvalue of ${\displaystyle T}$ if and only if the map represented by ${\displaystyle T-\lambda I}$ is not an isomorphism.

Problem 18
1. Show that if ${\displaystyle \lambda }$ is an eigenvalue of ${\displaystyle A}$ then ${\displaystyle \lambda ^{k}}$ is an eigenvalue of ${\displaystyle A^{k}}$.
2. What is wrong with this proof generalizing that? "If ${\displaystyle \lambda }$ is an eigenvalue of ${\displaystyle A}$ and ${\displaystyle \mu }$ is an eigenvalue for ${\displaystyle B}$, then ${\displaystyle \lambda \mu }$ is an eigenvalue for ${\displaystyle AB}$, for, if ${\displaystyle A{\vec {x}}=\lambda {\vec {x}}}$ and ${\displaystyle B{\vec {x}}=\mu {\vec {x}}}$ then ${\displaystyle AB{\vec {x}}=A\mu {\vec {x}}=\mu A{\vec {x}}=\mu \lambda {\vec {x}}}$"?
(Strang 1980)
Problem 19

Do matrix-equivalent matrices have the same eigenvalues?

Problem 20

Show that a square matrix with real entries and an odd number of rows has at least one real eigenvalue.

Problem 21

Diagonalize.

${\displaystyle {\begin{pmatrix}-1&2&2\\2&2&2\\-3&-6&-6\end{pmatrix}}}$
Problem 22

Suppose that ${\displaystyle P}$ is a nonsingular ${\displaystyle n\!\times \!n}$ matrix. Show that the similarity transformation map ${\displaystyle t_{P}:{\mathcal {M}}_{n\!\times \!n}\to {\mathcal {M}}_{n\!\times \!n}}$ sending ${\displaystyle T\mapsto PTP^{-1}}$ is an isomorphism.

? Problem 23

Show that if ${\displaystyle A}$ is an ${\displaystyle n}$ square matrix and each row (column) sums to ${\displaystyle c}$ then ${\displaystyle c}$ is a characteristic root of ${\displaystyle A}$. (Morrison 1967)


References

• Morrison, Clarence C. (proposer) (1967), "Quickie", Mathematics Magazine 40 (4): 232.
• Strang, Gilbert (1980), Linear Algebra and its Applications (Second ed.), Harcourt Brace Jovanovich.