# Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/August 2009

## Problem 1

 Let ${\displaystyle A\!\,}$ be a real symmetric matrix of order ${\displaystyle n\!\,}$ with ${\displaystyle n\!\,}$ distinct eigenvalues, and let ${\displaystyle v_{1}\in R^{n}\!\,}$ be such that ${\displaystyle \|v_{1}\|_{2}=1\!\,}$ and the inner product ${\displaystyle (v_{1},u)\neq 0\!\,}$ for every eigenvector ${\displaystyle u\!\,}$ of ${\displaystyle A\!\,}$.

## Problem 1a

 Let ${\displaystyle P_{n}\!\,}$ denote the space of polynomials of degree at most ${\displaystyle n-1\!\,}$. Show that ${\displaystyle \langle p,q\rangle \equiv (p(A)v_{1},q(A)v_{1})\!\,}$ defines an inner product on ${\displaystyle P_{n}\!\,}$, where the expression on the right above is the Euclidean inner product in ${\displaystyle R^{n}\!\,}$.

## Solution 1a

### Symmetry

{\displaystyle {\begin{aligned}\langle p,q\rangle &=(p(A)v_{1},q(A)v_{1})\\&=(q(A)v_{1},p(A)v_{1})\\&=\langle q,p\rangle \end{aligned}}\!\,}

### Linearity of 1st Argument

Let ${\displaystyle \alpha \in \mathbb {R} \!\,}$

{\displaystyle {\begin{aligned}\langle \alpha p,q\rangle &=(\alpha p(A)v_{1},q(A)v_{1})\\&=\alpha (p(A)v_{1},q(A)v_{1})\\&=\alpha \langle p,q\rangle \end{aligned}}\!\,}

{\displaystyle {\begin{aligned}\langle p+r,q\rangle &=((p(A)+r(A))v_{1},q(A)v_{1})\\&=(p(A)v_{1},q(A)v_{1})+(r(A)v_{1},q(A)v_{1})\\&=\langle p,q\rangle +\langle r,q\rangle \end{aligned}}\!\,}

### Positive Definiteness

{\displaystyle {\begin{aligned}\langle p,p\rangle &=(\underbrace {p(A)v_{1}} _{\hat {v}},\underbrace {p(A)v_{1}} _{\hat {v}}){\mbox{ where }}{\hat {v}}\in R^{n}\\&=({\hat {v}},{\hat {v}})\\&=\sum _{i=1}^{n}({\hat {v}}_{i})^{2}\geq 0\end{aligned}}\!\,}

### "Zeroness"

We also need to show that ${\displaystyle \langle p,p\rangle =0\!\,}$ if and only if ${\displaystyle p=0\!\,}$.

#### Forward Direction (alt)

Suppose ${\displaystyle p\neq 0}$. It suffices to show ${\displaystyle \langle p,p\rangle \neq 0}$. This is a consequence of the fact that ${\displaystyle p(A)v_{1}\neq 0}$, which follows because ${\displaystyle p(A)\neq 0}$ for ${\displaystyle p\neq 0}$ of degree less than ${\displaystyle n}$ (a nonzero polynomial of degree less than ${\displaystyle n}$ cannot vanish at all ${\displaystyle n}$ distinct eigenvalues) and because ${\displaystyle v_{1}}$ does not lie in the orthogonal complement of any of the ${\displaystyle n}$ distinct eigenvectors of ${\displaystyle A}$.

#### Forward Direction

Claim: If ${\displaystyle \langle p,p\rangle =0\!\,}$, then ${\displaystyle p=0\!\,}$.

By hypothesis, we may write

${\displaystyle v_{1}=\sum _{i=1}^{n}\alpha _{i}u_{i}\!\,}$

where the ${\displaystyle u_{i}\!\,}$ are orthonormal eigenvectors of ${\displaystyle A\!\,}$ and all ${\displaystyle \alpha _{i}\!\,}$ are non-zero.

{\displaystyle {\begin{aligned}\langle p,p\rangle &=(p(A)v_{1},p(A)v_{1})\\&=\left(p(A)\sum _{i=1}^{n}\alpha _{i}u_{i},p(A)\sum _{i=1}^{n}\alpha _{i}u_{i}\right)\\&=\left(\sum _{i=1}^{n}\beta _{i}u_{i},\sum _{i=1}^{n}\beta _{i}u_{i}\right){\mbox{ (since the }}u_{i}{\mbox{ are eigenvectors)}}\\&=\sum _{i=1}^{n}\beta _{i}^{2}{\mbox{ (since the }}u_{i}{\mbox{ are orthonormal)}}\\&=0{\mbox{ (by hypothesis)}}\end{aligned}}\!\,}

Notice that ${\displaystyle \beta _{i}=\alpha _{i}p(m_{i})\!\,}$, where ${\displaystyle m_{i}\!\,}$ is the eigenvalue associated with ${\displaystyle u_{i}\!\,}$, i.e. ${\displaystyle Au_{i}=m_{i}u_{i}\!\,}$.

Since all ${\displaystyle \alpha _{i}\neq 0\!\,}$, each ${\displaystyle \beta _{i}=0\!\,}$ forces ${\displaystyle p(m_{i})=0\!\,}$ at the ${\displaystyle n\!\,}$ distinct eigenvalues ${\displaystyle m_{i}\!\,}$. A polynomial of degree at most ${\displaystyle n-1\!\,}$ with ${\displaystyle n\!\,}$ distinct roots is identically zero, so ${\displaystyle p=0\!\,}$.

#### Reverse Direction

If ${\displaystyle p=0\!\,}$, then ${\displaystyle \langle p,p\rangle =(\underbrace {p(A)} _{0}v_{1},\underbrace {p(A)} _{0}v_{1})=0\!\,}$
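The inner-product axioms above can also be checked numerically. The following sketch is purely illustrative (the symmetric ${\displaystyle A\!\,}$ with distinct eigenvalues and the vector ${\displaystyle v_{1}\!\,}$ with nonzero components along every eigenvector are arbitrary choices of mine): it forms the Gram matrix of the monomial basis ${\displaystyle \{1,t,\ldots ,t^{n-1}\}\!\,}$ under ${\displaystyle \langle p,q\rangle =(p(A)v_{1},q(A)v_{1})\!\,}$ and verifies that it is symmetric positive definite, as an inner product's Gram matrix must be.

```python
import numpy as np

# Illustrative check: build a symmetric A with distinct eigenvalues 1..n and a
# v1 with nonzero component along every eigenvector, then test that the Gram
# matrix of the monomial basis under <p,q> = (p(A)v1, q(A)v1) is SPD.
rng = np.random.default_rng(0)
n = 5
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal matrix
A = Q @ np.diag(np.arange(1.0, n + 1.0)) @ Q.T     # symmetric, eigenvalues 1..n
v1 = Q @ (np.ones(n) / np.sqrt(n))                 # equal weight on every eigenvector

# Column k of K is A^k v1, so K.T @ K is the Gram matrix of {1, t, ..., t^{n-1}}.
K = np.column_stack([np.linalg.matrix_power(A, k) @ v1 for k in range(n)])
G = K.T @ K
assert np.allclose(G, G.T)                 # symmetry
assert np.linalg.eigvalsh(G).min() > 0     # positive definiteness
```

Positive definiteness of the Gram matrix is exactly the statement that ${\displaystyle \langle p,p\rangle >0\!\,}$ for every nonzero ${\displaystyle p\in P_{n}\!\,}$.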

## Problem 1b

 Consider the recurrence ${\displaystyle \beta _{j+1}v_{j+1}=Av_{j}-\alpha _{j}v_{j}-\beta _{j}v_{j-1}\!\,}$ where the ${\displaystyle \alpha _{j}\!\,}$ and ${\displaystyle \beta _{j}\!\,}$ are scalars and ${\displaystyle v_{0}\equiv 0\!\,}$. Show that ${\displaystyle v_{j}=p_{j-1}(A)v_{1}\!\,}$, where ${\displaystyle p_{j-1}(t)\!\,}$ is a polynomial of degree ${\displaystyle j-1\!\,}$.

## Solution 1b

By induction.

### Base Case

{\displaystyle {\begin{aligned}\beta _{2}v_{2}&=Av_{1}-\alpha _{1}v_{1}-\beta _{1}\underbrace {v_{0}} _{0}\\v_{2}&={\frac {1}{\beta _{2}}}[Av_{1}-\alpha _{1}v_{1}]\\v_{2}&=p_{1}(A)v_{1}\quad p_{1}(t)={\frac {1}{\beta _{2}}}[t-\alpha _{1}]\end{aligned}}\!\,}

{\displaystyle {\begin{aligned}\beta _{3}v_{3}&=Av_{2}-\alpha _{2}v_{2}-\beta _{2}v_{1}\\v_{3}&={\frac {1}{\beta _{3}}}[\underbrace {Av_{2}} _{Ap_{1}(A)v_{1}}-\underbrace {\alpha _{2}v_{2}} _{\alpha _{2}p_{1}(A)v_{1}}-\underbrace {\beta _{2}v_{1}} _{\beta _{2}p_{0}(A)v_{1}}]\\&=p_{2}(A)v_{1}{\mbox{ where }}p_{2}(t)={\frac {1}{\beta _{3}}}[tp_{1}(t)-\alpha _{2}p_{1}(t)-\beta _{2}]{\mbox{ has degree }}2\end{aligned}}\!\,}

(with ${\displaystyle p_{0}(t)\equiv 1\!\,}$, so that ${\displaystyle v_{1}=p_{0}(A)v_{1}\!\,}$).

### Induction Step

Claim: ${\displaystyle v_{j+1}=p_{j}(A)v_{1}\!\,}$ where ${\displaystyle p_{j}\!\,}$ has degree ${\displaystyle j\!\,}$

Hypothesis:

Suppose

${\displaystyle v_{j}=p_{j-1}(A)v_{1}\!\,}$

${\displaystyle v_{j-1}=p_{j-2}(A)v_{1}\!\,}$

where ${\displaystyle p_{j-1}}$ (respectively ${\displaystyle p_{j-2}}$) has degree ${\displaystyle j-1}$ (respectively ${\displaystyle j-2}$). Then for ${\displaystyle \beta _{j+1}\neq 0}$

${\displaystyle v_{j+1}={\frac {Ap_{j-1}(A)-\alpha _{j}p_{j-1}(A)-\beta _{j}p_{j-2}(A)}{\beta _{j+1}}}v_{1}}$

which is of the form ${\displaystyle p_{j}(A)v_{1}\!\,}$ for a polynomial ${\displaystyle p_{j}\!\,}$ of degree ${\displaystyle j\!\,}$ (the degree-${\displaystyle j\!\,}$ leading term comes from ${\displaystyle Ap_{j-1}(A)\!\,}$), as desired.
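The induction can be mirrored in code: run the recurrence on the vectors while applying the identical update to polynomial coefficients, then check ${\displaystyle v_{j}=p_{j-1}(A)v_{1}\!\,}$ directly. The sketch below is illustrative only; it uses the part (c) choices of ${\displaystyle \alpha _{j}\!\,}$ and ${\displaystyle \beta _{j+1}\!\,}$ (my own choice here, since part (b) leaves the scalars free) so that the recurrence is well defined, and a random symmetric ${\displaystyle A\!\,}$.

```python
import numpy as np
from numpy.polynomial import polynomial as P

rng = np.random.default_rng(1)
n = 6
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                          # real symmetric matrix
v1 = rng.standard_normal(n)
v1 /= np.linalg.norm(v1)

def poly_of_A_times_v(c, A, v):
    """Evaluate (c[0] I + c[1] A + ... + c[d] A^d) v."""
    out, Akv = np.zeros_like(v), v.copy()
    for ck in c:
        out += ck * Akv
        Akv = A @ Akv
    return out

v_prev, v = np.zeros(n), v1
p_prev, p = np.array([0.0]), np.array([1.0])    # coefficients of p_{-1}=0, p_0=1
beta = 0.0
for j in range(1, n):
    alpha = v @ (A @ v)
    w = A @ v - alpha * v - beta * v_prev
    beta_next = np.linalg.norm(w)
    # identical update on coefficients: p_j = (t p_{j-1} - alpha p_{j-1} - beta p_{j-2}) / beta_next
    tp = np.concatenate(([0.0], p))             # multiply p_{j-1}(t) by t
    p_next = P.polysub(P.polysub(tp, alpha * p), beta * p_prev) / beta_next
    v_prev, v = v, w / beta_next
    p_prev, p, beta = p, p_next, beta_next
    assert len(p) == j + 1                      # p_j has degree j
    assert np.allclose(v, poly_of_A_times_v(p, A, v1))   # v_{j+1} = p_j(A) v1
```

The key point the code makes concrete is that the vector recurrence and the polynomial recurrence are the *same* three-term update, applied to different objects.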

## Problem 1c

 Suppose the scalars above are such that ${\displaystyle \alpha _{j}=(Av_{j},v_{j})\!\,}$ and ${\displaystyle \beta _{j+1}\!\,}$ is chosen so that ${\displaystyle \|v_{j+1}\|_{2}=1\!\,}$. Use this to show that the polynomials in part (b) are orthogonal with respect to the inner product from part (a).

## Solution 1c

Since ${\displaystyle v_{j}=p_{j-1}(A)v_{1}\!\,}$ and ${\displaystyle \langle p,q\rangle \equiv (p(A)v_{1},q(A)v_{1})\!\,}$, it is equivalent to show that ${\displaystyle (v_{i},v_{j})=0\!\,}$ for ${\displaystyle i\neq j\!\,}$.

Since

${\displaystyle v_{j+1}={\frac {1}{\beta _{j+1}}}\left[Av_{j}-\alpha _{j}v_{j}-\beta _{j}v_{j-1}\right]\!\,}$,

it is then sufficient to show that

${\displaystyle (v_{j+1},v_{j})=(v_{j+1},v_{j-1})=0\!\,}$

For ${\displaystyle i<j-1\!\,}$, orthogonality follows by induction together with the symmetry of ${\displaystyle A\!\,}$: ${\displaystyle (v_{j+1},v_{i})\!\,}$ reduces to ${\displaystyle (Av_{j},v_{i})=(v_{j},Av_{i})\!\,}$, and ${\displaystyle Av_{i}\!\,}$ is a linear combination of ${\displaystyle v_{i-1},v_{i},v_{i+1}\!\,}$ by the recurrence.

### Claim ${\displaystyle (v_{j+1},v_{j})=0\quad \forall j\!\,}$

By induction.

#### Base Case

{\displaystyle {\begin{aligned}v_{2}&={\frac {1}{\beta _{2}}}(Av_{1}-\alpha _{1}v_{1})\\(v_{2},v_{1})&={\frac {1}{\beta _{2}}}[(Av_{1},v_{1})-\underbrace {\alpha _{1}} _{(Av_{1},v_{1})}\underbrace {(v_{1},v_{1})} _{1}]\\&=0\end{aligned}}\!\,}

#### Induction Step

Assume: ${\displaystyle (v_{j},v_{j-1})=0\!\,}$

Claim: ${\displaystyle (v_{j+1},v_{j})=0\!\,}$

{\displaystyle {\begin{aligned}v_{j+1}&={\frac {1}{\beta _{j+1}}}[Av_{j}-\alpha _{j}v_{j}-\beta _{j}v_{j-1}]\\(v_{j+1},v_{j})&={\frac {1}{\beta _{j+1}}}[(Av_{j},v_{j})-\underbrace {\alpha _{j}} _{(Av_{j},v_{j})}\underbrace {(v_{j},v_{j})} _{1}-\beta _{j}\underbrace {(v_{j-1},v_{j})} _{0}]\\&=0\end{aligned}}\!\,}

### Claim ${\displaystyle (v_{j+1},v_{j-1})=0\!\,}$

By induction.

#### Base Case

{\displaystyle {\begin{aligned}v_{3}&={\frac {1}{\beta _{3}}}[Av_{2}-\alpha _{2}v_{2}-\beta _{2}v_{1}]\\(v_{3},v_{1})&={\frac {1}{\beta _{3}}}[(Av_{2},v_{1})-\alpha _{2}\underbrace {(v_{2},v_{1})} _{0}-\beta _{2}\underbrace {(v_{1},v_{1})} _{1}]\\&={\frac {1}{\beta _{3}}}[(Av_{2},v_{1})-\beta _{2}]\\&={\frac {1}{\beta _{3}}}[(v_{2},Av_{1})-\beta _{2}]{\mbox{ (because A symmetric) }}\\&={\frac {1}{\beta _{3}}}[\beta _{2}-\beta _{2}]=0{\mbox{ (see below) }}\end{aligned}}\!\,}

{\displaystyle {\begin{aligned}\beta _{2}v_{2}&=Av_{1}-\alpha _{1}v_{1}\\\beta _{2}\underbrace {(v_{2},v_{2})} _{1}&=(Av_{1},v_{2})-\alpha _{1}\underbrace {(v_{1},v_{2})} _{0}\\\beta _{2}&=(Av_{1},v_{2})\end{aligned}}\!\,}

#### Induction Step

Assume: ${\displaystyle (v_{j},v_{j-2})=0\!\,}$

Claim: ${\displaystyle (v_{j+1},v_{j-1})=0\!\,}$

{\displaystyle {\begin{aligned}(v_{j+1},v_{j-1})&={\frac {1}{\beta _{j+1}}}[(Av_{j},v_{j-1})-\alpha _{j}\underbrace {(v_{j},v_{j-1})} _{0}-\beta _{j}\underbrace {(v_{j-1},v_{j-1})} _{1}]\\&={\frac {1}{\beta _{j+1}}}[(Av_{j},v_{j-1})-\beta _{j}]\\&={\frac {1}{\beta _{j+1}}}[(v_{j},Av_{j-1})-\beta _{j}]{\mbox{ (because }}A{\mbox{ is symmetric)}}\\&={\frac {1}{\beta _{j+1}}}[\beta _{j}-\beta _{j}]=0{\mbox{ (see below)}}\end{aligned}}\!\,}

{\displaystyle {\begin{aligned}\beta _{j}v_{j}&=Av_{j-1}-\alpha _{j-1}v_{j-1}-\beta _{j-1}v_{j-2}\\\beta _{j}\underbrace {(v_{j},v_{j})} _{1}&=(Av_{j-1},v_{j})-\alpha _{j-1}\underbrace {(v_{j-1},v_{j})} _{0}-\beta _{j-1}\underbrace {(v_{j-2},v_{j})} _{0}\\\beta _{j}&=(Av_{j-1},v_{j})\end{aligned}}\!\,}
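The recurrence with these choices of ${\displaystyle \alpha _{j}\!\,}$ and ${\displaystyle \beta _{j+1}\!\,}$ is precisely the Lanczos iteration, and the orthogonality just proved can be observed in floating point. The sketch below (an arbitrary random symmetric matrix of my choosing, and small ${\displaystyle n\!\,}$; in long runs Lanczos famously loses orthogonality to rounding) stacks the vectors ${\displaystyle v_{1},\ldots ,v_{n}\!\,}$ and checks that they are orthonormal.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                  # real symmetric matrix
v = rng.standard_normal(n)
v /= np.linalg.norm(v)             # ||v_1|| = 1

V = [v]
v_prev, beta = np.zeros(n), 0.0    # v_0 = 0, beta_1 unused
for _ in range(n - 1):
    alpha = v @ (A @ v)            # alpha_j = (A v_j, v_j)
    w = A @ v - alpha * v - beta * v_prev
    beta = np.linalg.norm(w)       # beta_{j+1} normalizes v_{j+1}
    v_prev, v = v, w / beta
    V.append(v)

V = np.column_stack(V)
# The Lanczos vectors should be (numerically) orthonormal.
assert np.allclose(V.T @ V, np.eye(n), atol=1e-6)
```

Through the identification ${\displaystyle v_{j}=p_{j-1}(A)v_{1}\!\,}$, this orthonormality is exactly the orthogonality of the polynomials ${\displaystyle p_{j}\!\,}$ in the part (a) inner product.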

## Problem 2

 Consider the n-panel trapezoid rule ${\displaystyle I_{n}(f)\!\,}$ for calculating the integral ${\displaystyle \int _{0}^{1}f(x)dx\!\,}$ of a continuous function ${\displaystyle f\in C[0,1]\!\,}$, ${\displaystyle I_{n}(f)=\sum _{k=0}^{n-1}\left({\frac {t_{k+1}-t_{k}}{2}}\right)(f(t_{k})+f(t_{k+1}))\!\,}$ where ${\displaystyle 0=t_{0}<t_{1}<\cdots <t_{n}=1\!\,}$.

## Problem 2a

 Find a piecewise linear function ${\displaystyle G\!\,}$ such that ${\displaystyle I_{n}(f)-\int _{0}^{1}f(t)dt=\int _{0}^{1}G(t)f'(t)dt\!\,}$ for any continuous function ${\displaystyle f\!\,}$ such that ${\displaystyle f'\!\,}$ is integrable over [0,1]. Hint: Find ${\displaystyle G\!\,}$ by applying the fundamental theorem of calculus to ${\displaystyle \int _{t_{k}}^{t_{k+1}}(f(t)G(t))'dt\!\,}$.

## Solution 2a

### Rewrite given equation on specific interval

On a single interval ${\displaystyle [t_{k},t_{k+1}]\!\,}$, it suffices to find ${\displaystyle G\!\,}$ such that

${\displaystyle \left({\frac {t_{k+1}-t_{k}}{2}}\right)(f(t_{k})+f(t_{k+1}))-\int _{t_{k}}^{t_{k+1}}f(t)dt=\int _{t_{k}}^{t_{k+1}}G(t)f'(t)dt\!\,}$,

since summing over ${\displaystyle k=0,\ldots ,n-1\!\,}$ then gives the stated identity.

Distributing and rearranging terms gives

${\displaystyle (1)\qquad \left({\frac {t_{k+1}-t_{k}}{2}}\right)f(t_{k})+\left({\frac {t_{k+1}-t_{k}}{2}}\right)f(t_{k+1})-\int _{t_{k}}^{t_{k+1}}f(t)dt=\int _{t_{k}}^{t_{k+1}}G(t)f'(t)dt\!\,}$

### Apply Hint

Starting with the hint and applying product rule, we get

${\displaystyle \int _{t_{k}}^{t_{k+1}}(f(t)G(t))'dt=\int _{t_{k}}^{t_{k+1}}f'(t)G(t)dt+\int _{t_{k}}^{t_{k+1}}f(t)G'(t)dt\!\,}$.

Also, we know from the Fundamental Theorem of Calculus

${\displaystyle \int _{t_{k}}^{t_{k+1}}(f(t)G(t))'dt=f(t_{k+1})G(t_{k+1})-f(t_{k})G(t_{k})\!\,}$.

Setting the above two equations equal to each other and solving for ${\displaystyle \int _{t_{k}}^{t_{k+1}}f'(t)G(t)dt\!\,}$ yields

${\displaystyle (2)\qquad -G(t_{k})f(t_{k})+G(t_{k+1})f(t_{k+1})-\int _{t_{k}}^{t_{k+1}}G'(t)f(t)dt=\int _{t_{k}}^{t_{k+1}}G(t)f'(t)dt\!\,}$

### Choose G'(t)

Matching the integral term ${\displaystyle \int _{t_{k}}^{t_{k+1}}f(t)dt\!\,}$ in (1) with ${\displaystyle \int _{t_{k}}^{t_{k+1}}G'(t)f(t)dt\!\,}$ in (2) forces ${\displaystyle G'(t)=1\!\,}$, so ${\displaystyle G\!\,}$ is linear on each panel:

${\displaystyle (3)\qquad G(t)=t+b\!\,}$

By comparing equations (1) and (2) we see that

${\displaystyle G(t_{k+1})={\frac {t_{k+1}-t_{k}}{2}}\!\,}$ and

${\displaystyle G(t_{k})=-{\frac {t_{k+1}-t_{k}}{2}}\!\,}$.

Plugging in either ${\displaystyle G(t_{k})\!\,}$ or ${\displaystyle G(t_{k+1})\!\,}$ into equation (3), we get that

${\displaystyle b=-{\frac {t_{k+1}+t_{k}}{2}}\!\,}$

Hence

${\displaystyle G(t)=t-{\frac {t_{k+1}+t_{k}}{2}}\quad {\mbox{ on }}[t_{k},t_{k+1}]\!\,}$,

which defines a piecewise linear function on ${\displaystyle [0,1]\!\,}$ vanishing at each panel midpoint.
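The identity can be sanity-checked in closed form. In the sketch below (illustrative only; the choice ${\displaystyle f(t)=e^{t}\!\,}$ and the nonuniform mesh are my own) both sides are computed exactly, using the antiderivative ${\displaystyle \int (t-m)e^{t}dt=(t-1-m)e^{t}\!\,}$ on each panel, where ${\displaystyle m\!\,}$ is the panel midpoint.

```python
import math

# Check I_n(f) - int_0^1 f = int_0^1 G f' with f(t) = e^t on a nonuniform mesh,
# where G(t) = t - (t_k + t_{k+1})/2 on [t_k, t_{k+1}].
t = [0.0, 0.1, 0.3, 0.35, 0.6, 0.8, 1.0]   # arbitrary mesh points
f = math.exp

I_n = sum((t[k+1] - t[k]) / 2 * (f(t[k]) + f(t[k+1])) for k in range(len(t) - 1))
lhs = I_n - (math.e - 1.0)                  # I_n(f) - int_0^1 e^t dt

rhs = 0.0
for k in range(len(t) - 1):
    m = (t[k] + t[k+1]) / 2                 # midpoint, so G(t) = t - m on this panel
    F = lambda s, m=m: (s - 1.0 - m) * math.exp(s)   # antiderivative of (s - m) e^s
    rhs += F(t[k+1]) - F(t[k])

assert abs(lhs - rhs) < 1e-12
```

Since ${\displaystyle e^{t}\!\,}$ is convex, the trapezoid rule overestimates and both sides come out positive.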

## Problem 2b

 Apply the previous result to ${\displaystyle f(x)=x^{\alpha }\!\,}$, ${\displaystyle 0<\alpha <1\!\,}$, to obtain a rate of convergence.
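For a uniform mesh, the classical convergence rate for ${\displaystyle f(x)=x^{\alpha }\!\,}$ with ${\displaystyle 0<\alpha <1\!\,}$ is ${\displaystyle O(n^{-(1+\alpha )})\!\,}$: the panel containing the singularity of ${\displaystyle f'\!\,}$ at ${\displaystyle 0\!\,}$ contributes ${\displaystyle O(h^{1+\alpha })\!\,}$, and the remaining panels sum to the same order. The sketch below (illustrative, with ${\displaystyle \alpha =1/2\!\,}$, my own choice) estimates the observed order from successive errors; it should approach ${\displaystyle 1+\alpha =3/2\!\,}$.

```python
import math

a = 0.5
exact = 1.0 / (a + 1.0)                 # int_0^1 x^a dx

def trap(n):
    """n-panel trapezoid rule for x**a on [0, 1] with a uniform mesh."""
    h = 1.0 / n
    interior = sum((k * h) ** a for k in range(1, n))
    return h * (interior + 0.5 * (0.0 ** a + 1.0 ** a))

errs = [abs(trap(n) - exact) for n in (64, 128, 256)]
rates = [math.log2(errs[i] / errs[i + 1]) for i in range(2)]
# observed orders should be close to 1 + a = 1.5
assert all(1.3 < r < 1.7 for r in rates)
```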

## Problem 3

 Let ${\displaystyle C[a,b]\!\,}$ denote the set of all real-valued continuous functions defined on the closed interval ${\displaystyle [a,b]\!\,}$, and let ${\displaystyle \rho \in C[a,b]\!\,}$ be positive everywhere in ${\displaystyle [a,b]\!\,}$. Let ${\displaystyle \{Q_{n}\}_{n=0}^{\infty }\!\,}$ be a system of polynomials with ${\displaystyle deg\,Q_{n}=n\!\,}$ for each ${\displaystyle n\!\,}$, orthogonal with respect to the inner product ${\displaystyle \langle g,h\rangle =\int _{a}^{b}\rho (x)g(x)h(x)dx,\quad \forall g,h\in C[a,b]\!\,}$ For a fixed integer ${\displaystyle n\geq 2\!\,}$, let ${\displaystyle x_{1},\ldots ,x_{n}\!\,}$ be the ${\displaystyle n\!\,}$ distinct roots of ${\displaystyle Q_{n}\!\,}$ in ${\displaystyle (a,b)\!\,}$. Let ${\displaystyle r_{k}(x)={\frac {(x-x_{1})\cdots (x-x_{k-1})(x-x_{k+1})\cdots (x-x_{n})}{(x_{k}-x_{1})\cdots (x_{k}-x_{k-1})(x_{k}-x_{k+1})\cdots (x_{k}-x_{n})}},\quad k=1,2,\ldots ,n\!\,}$ be polynomials of degree ${\displaystyle n-1\!\,}$. Show that ${\displaystyle \int _{a}^{b}\rho (x)r_{j}(x)r_{k}(x)dx=0,\quad \forall j\neq k\!\,}$ and that ${\displaystyle \sum _{k=1}^{n}\int _{a}^{b}\rho (x)(r_{k}(x))^{2}dx=\int _{a}^{b}\rho (x)dx\!\,}$ Hint: Use orthogonality to simplify ${\displaystyle \int _{a}^{b}\rho (x)(\sum _{k=1}^{n}r_{k}(x))^{2}dx\!\,}$

## Solution 3a

{\displaystyle {\begin{aligned}&\quad \int _{a}^{b}\rho (x)r_{j}(x)r_{k}(x)dx\\&=\int _{a}^{b}\rho (x)\prod _{i\neq j}{\frac {x-x_{i}}{x_{j}-x_{i}}}\prod _{i\neq k}{\frac {x-x_{i}}{x_{k}-x_{i}}}dx\\&=\int _{a}^{b}\rho (x)\overbrace {\frac {\prod _{i\neq j,k}(x-x_{i})}{\prod _{i\neq j}(x_{j}-x_{i})}} ^{{\mbox{degree }}n-2}\cdot \overbrace {\frac {\prod _{i=1}^{n}(x-x_{i})}{\prod _{i\neq k}(x_{k}-x_{i})}} ^{{\mbox{multiple of }}Q_{n}}dx\\&=0\end{aligned}}\!\,}

The last equality holds because ${\displaystyle \prod _{i=1}^{n}(x-x_{i})\!\,}$ is a constant multiple of ${\displaystyle Q_{n}\!\,}$, which is orthogonal (with weight ${\displaystyle \rho \!\,}$) to every polynomial of degree at most ${\displaystyle n-1\!\,}$.

## Solution 3b

{\displaystyle {\begin{aligned}&\quad \sum _{k=1}^{n}\int _{a}^{b}\rho (x)(r_{k}(x))^{2}dx\\&=\int _{a}^{b}\rho (x)\sum _{k=1}^{n}(r_{k}(x))^{2}dx\\&=\int _{a}^{b}\rho (x)(\sum _{k=1}^{n}r_{k}(x))^{2}dx{\mbox{ (from part a) }}\\&=\int _{a}^{b}\rho (x)(1)^{2}dx{\mbox{ (from claim) }}\\&=\int _{a}^{b}\rho (x)dx\end{aligned}}\!\,}

### Claim

${\displaystyle \sum _{k=1}^{n}r_{k}(x)=1\!\,}$

### Proof

Since each ${\displaystyle r_{k}\!\,}$ is a polynomial of degree ${\displaystyle n-1\!\,}$, the sum ${\displaystyle \sum _{k=1}^{n}r_{k}(x)\!\,}$ is a polynomial of degree at most ${\displaystyle n-1\!\,}$.

Notice that ${\displaystyle r_{k}(x_{i})=\delta _{ki}\!\,}$, so ${\displaystyle \sum _{k=1}^{n}r_{k}(x_{i})=1\!\,}$ for ${\displaystyle i=1,2,\ldots ,n\!\,}$, where the ${\displaystyle x_{i}\!\,}$ are the ${\displaystyle n\!\,}$ distinct roots of ${\displaystyle Q_{n}\!\,}$. Hence ${\displaystyle \sum _{k=1}^{n}r_{k}(x)-1\!\,}$ is a polynomial of degree at most ${\displaystyle n-1\!\,}$ that vanishes at ${\displaystyle n\!\,}$ distinct points, so it is identically zero:

${\displaystyle \sum _{k=1}^{n}r_{k}(x)=1\!\,}$
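Both conclusions can be checked numerically in a classical special case (my own choice of setting): ${\displaystyle \rho \equiv 1\!\,}$ on ${\displaystyle [-1,1]\!\,}$, where ${\displaystyle Q_{n}\!\,}$ is the Legendre polynomial and the ${\displaystyle x_{k}\!\,}$ are the Gauss–Legendre nodes, so the second identity reads ${\displaystyle \sum _{k}\int _{-1}^{1}r_{k}(x)^{2}dx=\int _{-1}^{1}dx=2\!\,}$. Each ${\displaystyle r_{j}r_{k}\!\,}$ has degree ${\displaystyle 2n-2\!\,}$, so a ${\displaystyle 2n\!\,}$-point Gauss rule integrates it exactly.

```python
import numpy as np

n = 6
nodes, _ = np.polynomial.legendre.leggauss(n)   # roots of the Legendre Q_n
x, w = np.polynomial.legendre.leggauss(2 * n)   # finer rule: exact up to degree 4n-1

def r(k, x):
    """Lagrange basis polynomial r_k (built on the roots of Q_n) evaluated at x."""
    others = np.delete(nodes, k)
    return np.prod(x[:, None] - others[None, :], axis=1) / np.prod(nodes[k] - others)

# sum_k int_{-1}^{1} r_k(x)^2 dx should equal int_{-1}^{1} 1 dx = 2
total = sum(w @ r(k, x) ** 2 for k in range(n))
assert abs(total - 2.0) < 1e-10

# pairwise orthogonality: int_{-1}^{1} r_0(x) r_1(x) dx = 0
assert abs(w @ (r(0, x) * r(1, x))) < 1e-10
```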