Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/January 2005


Problem 1

Given $f \in C[a,b]$, let $p_n^*$ denote the best uniform approximation to $f$ among polynomials of degree $\le n$, i.e.

$$\max_{x \in [a,b]} |f(x) - p(x)|$$

is minimized over polynomials $p$ of degree $\le n$ by the choice $p = p_n^*$. Let $e = f - p_n^*$. Prove that there are at least two points $x_1, x_2 \in [a,b]$ such that

$$|e(x_1)| = |e(x_2)| = \max_{x \in [a,b]} |e(x)|$$

and

$$e(x_1) = -e(x_2).$$

Solution 1

Since both $f$ and $p_n^*$ are continuous, so is the function

$$e(x) = f(x) - p_n^*(x),$$

and therefore $e$ attains both a maximum and a minimum on $[a,b]$.

Let $M = \max_{x \in [a,b]} e(x) = e(x_1)$ and $m = \min_{x \in [a,b]} e(x) = e(x_2)$.

If $M \neq -m$, then there exists a polynomial $q = p_n^* + \frac{M+m}{2}$ (a vertical shift of the supposed best approximation) which is a better approximation to $f$, since

$$\max_{x \in [a,b]} |f(x) - q(x)| = \frac{M - m}{2}.$$

Therefore,

$$\frac{M - m}{2} \le \max(|M|, |m|) = \max_{x \in [a,b]} |e(x)|.$$

Equality holds if and only if $M = -m$. Since $p_n^*$ is by assumption the best approximation, equality must hold, so $e(x_1) = -e(x_2)$ and $|e(x_1)| = |e(x_2)| = \max_{x \in [a,b]} |e(x)|$.
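As an illustration (not part of the original proof), the degree-0 case can be checked numerically: the best constant approximation on a grid is $(\max f + \min f)/2$, and the error then attains both $+\|e\|$ and $-\|e\|$. The test function $f(x) = x^2$ and the grid below are arbitrary choices.

```python
# Degree-0 equioscillation check: the best constant approximation to f is
# (max f + min f)/2, and the error attains both +E and -E, where E is the
# uniform error norm. f(x) = x^2 on [0, 1] is an arbitrary example.
import numpy as np

xs = np.linspace(0.0, 1.0, 1001)
fvals = xs**2
p_star = (fvals.max() + fvals.min()) / 2.0   # best constant: 1/2
e = fvals - p_star
E = np.abs(e).max()                          # uniform error norm: 1/2
# the error equioscillates: both +E and -E are attained
assert np.isclose(e.max(), E) and np.isclose(e.min(), -E)
```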

Problem 2a

Find the node $x_0$ and the weight $w_0$ so that the integration rule

$$\int_0^1 f(x)\,dx \approx w_0 f(x_0)$$

is exact for linear functions. (No knowledge of orthogonal polynomials is required.)

Solution 2a

Let $f(x) = 1$. Then

$$\int_0^1 1\,dx = w_0 \cdot 1,$$

which implies after computation

$$w_0 = 1.$$

Let $f(x) = x$. Then

$$\int_0^1 x\,dx = w_0 x_0,$$

which implies after computation

$$x_0 = \frac{1}{2}.$$
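The computed rule can be sanity-checked numerically; this sketch assumes the values $w_0 = 1$ and $x_0 = \frac{1}{2}$ derived above (the midpoint rule on $[0,1]$).

```python
# Check: the one-point rule w0*f(x0) with w0 = 1, x0 = 1/2 (the midpoint
# rule) integrates any linear f(x) = a*x + b exactly on [0, 1].
w0, x0 = 1.0, 0.5

def one_point_rule(f):
    return w0 * f(x0)

# exact integral of a*x + b over [0, 1] is a/2 + b
for a, b in [(1.0, 0.0), (2.0, -3.0), (0.5, 4.0)]:
    assert abs(one_point_rule(lambda x: a * x + b) - (a / 2 + b)) < 1e-14
```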

Problem 2b

Show that no one-point rule for approximating $\int_0^1 f(x)\,dx$ can be exact for quadratics.

Solution 2b

Suppose there is a one-point rule $w_0 f(x_0)$ for approximating $\int_0^1 f(x)\,dx$ that is exact for quadratics.

Let $f(x) = (x - x_0)^2$.

Then,

$$\int_0^1 (x - x_0)^2\,dx > 0,$$

but

$$w_0 f(x_0) = w_0 \cdot 0 = 0,$$

a contradiction.
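The contradiction can be seen concretely; this sketch evaluates the closed-form integral for several node positions.

```python
# For any node x0 in [0, 1], f(x) = (x - x0)^2 is a quadratic with
# strictly positive integral, while f(x0) = 0, so w0*f(x0) = 0 for
# every choice of weight w0.
for x0 in [0.0, 0.3, 0.5, 1.0]:
    integral = ((1 - x0)**3 + x0**3) / 3.0   # closed form of ∫_0^1 (x-x0)^2 dx
    assert integral > 0
    assert (x0 - x0)**2 == 0.0               # the rule returns w0 * 0 = 0
```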

Problem 2c

In fact,

$$\int_0^1 f(x)\,dx - w_0 f(x_0) = c\, f''(\xi), \qquad \xi \in (0,1).$$

Find $c$.

Solution 2c

Let $f(x) = x^2$. Then $f''(x) = 2$ for all $x$.

Using the values $w_0 = 1$ and $x_0 = \frac{1}{2}$ computed in part (a), we have

$$\int_0^1 x^2\,dx - f\!\left(\tfrac{1}{2}\right) = \frac{1}{3} - \frac{1}{4} = \frac{1}{12} = 2c,$$

which implies

$$c = \frac{1}{24}.$$
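The constant $c = \frac{1}{24}$ matches a direct numerical computation with $f(x) = x^2$:

```python
# Check: for f(x) = x^2 we have f'' ≡ 2, so the error of the midpoint
# rule on [0, 1] should equal c * f'' = 2/24 = 1/12.
exact = 1.0 / 3.0        # ∫_0^1 x^2 dx
approx = 0.5**2          # w0 * f(x0) with w0 = 1, x0 = 1/2
error = exact - approx
assert abs(error - 1.0 / 12.0) < 1e-14
assert abs(error - 2.0 / 24.0) < 1e-14
```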
Problem 2d

Let $p$ and $q$ be two polynomials of degree $\le 2$. Suppose $p$ and $q$ agree at three distinct points $x_0$, $x_1$, and $x_2$ in $[0,1]$. Show

$$\int_0^1 p(x)\,dx = \int_0^1 q(x)\,dx.$$

Solution 2d

There exist weights $w_0, w_1, w_2$ (those of the interpolatory quadrature rule at the nodes $x_0, x_1, x_2$) such that if $r$ is a polynomial of degree $\le 2$,

$$\int_0^1 r(x)\,dx = w_0 r(x_0) + w_1 r(x_1) + w_2 r(x_2).$$

Hence,

$$\int_0^1 p(x)\,dx = \sum_{i=0}^2 w_i\, p(x_i) = \sum_{i=0}^2 w_i\, q(x_i) = \int_0^1 q(x)\,dx.$$
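As an illustration, with the nodes $0, \frac{1}{2}, 1$ the interpolatory weights are those of Simpson's rule, $\frac{1}{6}, \frac{4}{6}, \frac{1}{6}$; the quadratic below is an arbitrary example.

```python
# Simpson's rule on [0, 1] recovers the integral of any polynomial of
# degree <= 2 from its values at the nodes 0, 1/2, 1; hence two quadratics
# agreeing at those nodes must have equal integrals.
def simpson01(f):
    return (f(0.0) + 4.0 * f(0.5) + f(1.0)) / 6.0

p = lambda x: 3 * x**2 - x + 2      # ∫_0^1 p dx = 1 - 1/2 + 2 = 5/2
assert abs(simpson01(p) - 2.5) < 1e-14
```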
Problem 3

Let $A \in \mathbb{R}^{n \times n}$ be nonsingular and $b \in \mathbb{R}^n$. Consider the following iteration for the solution $x$ of $Ax = b$:

$$x_{k+1} = x_k + \alpha (b - A x_k), \qquad k = 0, 1, 2, \ldots$$
Problem 3a

Show that if all eigenvalues of $A$ have positive real part, then there will be some real $\alpha$ such that the iterates $x_k$ converge for any starting vector $x_0$. Discuss how to choose $\alpha$ optimally in case $A$ is symmetric, and determine the rate of convergence.

Solution 3a

Convergence for any starting vector

Using the iteration $x_{k+1} = x_k + \alpha(b - A x_k)$ and the fact that the solution satisfies $x = x + \alpha(b - Ax)$, we have for the error $e_k = x_k - x$ that

$$e_{k+1} = (I - \alpha A)\, e_k.$$

Then, computing norms we obtain

$$\|e_{k+1}\| \le \|I - \alpha A\|\, \|e_k\|.$$

In particular, after $k$ steps we have that

$$\|e_k\| \le \|I - \alpha A\|^k\, \|e_0\|.$$

Now, in order to study the convergence of the method, we need to analyze the spectral radius $\rho(I - \alpha A)$. We use the following characterization: $e_k \to 0$ for every starting vector $x_0$ if and only if $\rho(I - \alpha A) < 1$.

Let's denote by $\lambda = a + ic$ an eigenvalue of the matrix $A$ (with $a, c$ real), so that $1 - \alpha\lambda$ is an eigenvalue of $I - \alpha A$. Then $|1 - \alpha\lambda| < 1$ implies

$$(1 - \alpha a)^2 + \alpha^2 c^2 < 1,$$

i.e.

$$\alpha \left( (a^2 + c^2)\,\alpha - 2a \right) < 0.$$

The above expression is quadratic with respect to $\alpha$ and opens upward. Hence using the quadratic formula we can find the roots of the equation to be

$$\alpha = 0 \quad \text{and} \quad \alpha = \frac{2a}{a^2 + c^2}.$$

Then, if $a = \operatorname{Re}(\lambda) > 0$ for all the eigenvalues $\lambda$ of the matrix $A$, any

$$0 < \alpha < \min_{\lambda} \frac{2\operatorname{Re}(\lambda)}{|\lambda|^2}$$

gives $\rho(I - \alpha A) < 1$, i.e., the method converges.

Choosing alpha if A is symmetric

On the other hand, if the matrix $A$ is symmetric, we have that its eigenvalues $\lambda_1 \le \lambda_2 \le \cdots \le \lambda_n$ are real (and positive by hypothesis). Then using the Schur decomposition, we can find a unitary matrix $Q$ and a diagonal matrix $\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$ such that $A = Q \Lambda Q^T$, and in consequence,

$$\|I - \alpha A\|_2 = \|Q (I - \alpha \Lambda) Q^T\|_2 = \max_{1 \le i \le n} |1 - \alpha\lambda_i| = \max\left(|1 - \alpha\lambda_1|,\; |1 - \alpha\lambda_n|\right).$$

This last expression is minimized if $1 - \alpha\lambda_1 = -(1 - \alpha\lambda_n)$, i.e., if

$$\alpha_{\text{opt}} = \frac{2}{\lambda_1 + \lambda_n}.$$

With this optimal value of $\alpha$, we obtain that

$$\|I - \alpha_{\text{opt}} A\|_2 = \frac{\lambda_n - \lambda_1}{\lambda_n + \lambda_1} = \frac{\kappa - 1}{\kappa + 1}, \qquad \kappa = \frac{\lambda_n}{\lambda_1},$$

which implies that

$$\|e_k\|_2 \le \left( \frac{\kappa - 1}{\kappa + 1} \right)^k \|e_0\|_2.$$

Finally, we've obtained that the method converges linearly with rate $\frac{\kappa - 1}{\kappa + 1}$, where $\kappa$ is the condition number of $A$.
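A minimal sketch of the iteration with the optimal $\alpha$; the $2 \times 2$ symmetric positive definite system below is an arbitrary example, not taken from the exam.

```python
# Richardson iteration x_{k+1} = x_k + alpha*(b - A x_k) with the optimal
# alpha = 2/(lambda_min + lambda_max) for a small symmetric system.
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 3.0]])   # eigenvalues 2 and 4
b = np.array([1.0, 2.0])
alpha = 2.0 / (2.0 + 4.0)                # optimal alpha = 1/3

x = np.zeros(2)
for _ in range(100):                     # error shrinks by (4-2)/(4+2) = 1/3 per step
    x = x + alpha * (b - A @ x)

assert np.allclose(A @ x, b)
```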

Problem 3b

Show that if some eigenvalues of $A$ have negative real part and some have positive real part, then there is no real $\alpha$ for which the iterations converge.

Solution 3b

From part (a), convergence for every starting vector requires $|1 - \alpha\lambda| < 1$ for each eigenvalue $\lambda = a + ic$ of $A$, i.e. $\alpha\left((a^2 + c^2)\alpha - 2a\right) < 0$.

If $a_i = \operatorname{Re}(\lambda_i) > 0$ for some eigenvalue $\lambda_i$, then convergence requires

$$0 < \alpha < \frac{2 a_i}{|\lambda_i|^2}.$$

Hence $\alpha > 0$.

Similarly, if $a_j = \operatorname{Re}(\lambda_j) < 0$ for some eigenvalue $\lambda_j$, then convergence requires

$$\frac{2 a_j}{|\lambda_j|^2} < \alpha < 0.$$

Hence $\alpha < 0$.

If some eigenvalues have positive real part and some have negative real part, there does not exist a real $\alpha$ for which the iterations converge, since $\alpha$ cannot be both positive and negative simultaneously.

Problem 3c

Let $\beta = \|I - \alpha A\| < 1$ for a matrix norm associated to a vector norm. Show that the error can be expressed in terms of the difference between consecutive iterates, namely

$$\|x - x_k\| \le \frac{\beta}{1 - \beta}\, \|x_k - x_{k-1}\|.$$

(The proof of this is short but a little tricky.)

Solution 3c

Since $x = x + \alpha(b - Ax)$, subtracting the iteration gives $x - x_k = (I - \alpha A)(x - x_{k-1})$, hence

$$\|x - x_k\| \le \beta\, \|x - x_{k-1}\|.$$

Writing $x - x_{k-1} = (x - x_k) + (x_k - x_{k-1})$ and applying the triangle inequality,

$$\|x - x_k\| \le \beta\, \|x - x_k\| + \beta\, \|x_k - x_{k-1}\|.$$

Since $\beta < 1$, rearranging yields

$$\|x - x_k\| \le \frac{\beta}{1 - \beta}\, \|x_k - x_{k-1}\|.$$
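The bound of Problem 3c can be checked numerically; the system, the value of $\alpha$, and the iteration count below are arbitrary choices for illustration.

```python
# Check the a posteriori bound ||x - x_k|| <= beta/(1-beta) * ||x_k - x_{k-1}||
# for Richardson iteration, with beta = ||I - alpha*A||_2 < 1.
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_true = np.linalg.solve(A, b)
alpha = 0.25
beta = np.linalg.norm(np.eye(2) - alpha * A, 2)   # spectral norm; 0.5 here
assert beta < 1

x_prev = np.zeros(2)
for _ in range(10):
    x_next = x_prev + alpha * (b - A @ x_prev)
    err = np.linalg.norm(x_true - x_next)
    bound = beta / (1 - beta) * np.linalg.norm(x_next - x_prev)
    assert err <= bound + 1e-12                   # the bound holds at every step
    x_prev = x_next
```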
Problem 3d

Let $A$ be the tridiagonal matrix

$$A = \begin{pmatrix} 3 & 1 & & \\ 1 & 3 & \ddots & \\ & \ddots & \ddots & 1 \\ & & 1 & 3 \end{pmatrix}.$$

Find a value of $\alpha$ that guarantees convergence.

Solution 3d

By Gerschgorin's theorem, the magnitudes of all the eigenvalues lie between 1 and 5, i.e.

$$|\lambda - 3| \le 1 + 1 = 2 \quad \Longrightarrow \quad 1 \le \lambda \le 5$$

(the eigenvalues are real since $A$ is symmetric).

Therefore,

$$\lambda_{\min} \ge 1, \qquad \lambda_{\max} \le 5.$$

We want

$$|1 - \alpha\lambda| < 1 \quad \text{for all } \lambda \in [1, 5], \quad \text{i.e.} \quad 0 < \alpha < \frac{2}{5}.$$

Therefore, any $\alpha \in \left(0, \tfrac{2}{5}\right)$ guarantees convergence; for example, $\alpha = \tfrac{1}{3}$, the optimal choice $\tfrac{2}{\lambda_{\min} + \lambda_{\max}}$ from part (a) for the bounds $\lambda_{\min} = 1$, $\lambda_{\max} = 5$.
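A numerical check, assuming the tridiagonal matrix has 3 on the diagonal and 1 on the sub- and superdiagonals (the pattern consistent with the Gerschgorin bounds above); the size $n = 6$ is an arbitrary choice.

```python
# Verify the Gerschgorin bounds 1 <= lambda <= 5 and the convergence of
# Richardson iteration with alpha = 1/3, which lies in (0, 2/5).
import numpy as np

n = 6
A = 3.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
eigs = np.linalg.eigvalsh(A)
assert eigs.min() >= 1.0 - 1e-12 and eigs.max() <= 5.0 + 1e-12

alpha = 1.0 / 3.0
b = np.ones(n)
x = np.zeros(n)
for _ in range(200):
    x = x + alpha * (b - A @ x)
assert np.allclose(A @ x, b)
```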