UMD Probability Qualifying Exams/Jan2011Probability


Problem 1

A person plays an infinite sequence of games. He wins the $n$th game with probability $p_n = 1/\sqrt{n}$, independently of the other games.

(i) Prove that for any $N$, the probability is one that the player will accumulate $N$ dollars if he gets a dollar each time he wins two games in a row.

(ii) Does the claim in part (i) hold true if the player gets a dollar only if he wins three games in a row? Prove or disprove it.


(i): Define the person's game as the infinite sequence $(W_1, W_2, W_3, \dots)$ where each $W_k$ equals either 1 (corresponding to a win) or 0 (corresponding to a loss), with $\mathbb{P}(W_k = 1) = 1/\sqrt{k}$.

Define the random variable $X_n$ by
$$X_n = \sum_{k=1}^{n-1} W_k W_{k+1},$$
that is, $X_n$ counts how many times the player received two consecutive wins in his first $n$ games. Thus, the player will win $X_n$ dollars in the first $n$ games. Clearly, $X_n$ is measurable. Moreover, we can compute the expectation:
$$\mathbb{E}[X_n] = \sum_{k=1}^{n-1} \mathbb{P}(W_k = 1,\, W_{k+1} = 1) = \sum_{k=1}^{n-1} \frac{1}{\sqrt{k(k+1)}} \ge \sum_{k=1}^{n-1} \frac{1}{k+1}.$$

Now observe what happens as we send $n \to \infty$:
$$\mathbb{E}[X_n] \ge \sum_{k=2}^{n} \frac{1}{k} \longrightarrow \infty.$$
Hence the expected winnings of the infinite game are infinite. By itself an infinite expectation does not yet give almost sure divergence, so we sharpen the argument: the events $A_k = \{W_{2k} = W_{2k+1} = 1\}$, $k \ge 1$, are independent (they involve disjoint pairs of games) and
$$\sum_{k=1}^{\infty} \mathbb{P}(A_k) = \sum_{k=1}^{\infty} \frac{1}{\sqrt{2k(2k+1)}} = \infty,$$
so by the second Borel–Cantelli lemma infinitely many $A_k$ occur almost surely. Hence the player's winnings tend to infinity, and in particular he will surpass $N$ dollars in winnings almost surely.

(ii): Define everything as before except this time
$$Y_n = \sum_{k=1}^{n-2} W_k W_{k+1} W_{k+2}.$$
Then
$$\mathbb{E}[Y_n] = \sum_{k=1}^{n-2} \frac{1}{\sqrt{k(k+1)(k+2)}} \le \sum_{k=1}^{\infty} k^{-3/2} < \infty,$$
which gives, by monotone convergence, $\mathbb{E}[\lim_n Y_n] < \infty$ and hence $\lim_n Y_n < \infty$ almost surely. (Equivalently, $\sum_k \mathbb{P}(W_k = W_{k+1} = W_{k+2} = 1) < \infty$, so by the first Borel–Cantelli lemma only finitely many triple wins occur almost surely.) The total winnings are therefore finite with probability one, so we cannot assert that the probability of surpassing any given winnings will equal 1: the claim of part (i) fails.
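As an informal sanity check (illustration only, not part of the proof, and assuming the win probabilities $p_n = 1/\sqrt{n}$ as above), a short simulation shows double wins continuing to accumulate while triple wins stall:

```python
import random

def simulate(n_games, seed):
    """Simulate games where game k is won with probability 1/sqrt(k); count
    overlapping runs of two and of three consecutive wins."""
    rng = random.Random(seed)
    wins = [rng.random() < 1 / k ** 0.5 for k in range(1, n_games + 1)]
    doubles = sum(1 for k in range(n_games - 1) if wins[k] and wins[k + 1])
    triples = sum(1 for k in range(n_games - 2)
                  if wins[k] and wins[k + 1] and wins[k + 2])
    return doubles, triples

d_small, t_small = simulate(10_000, seed=0)
d_large, t_large = simulate(1_000_000, seed=0)
# Double wins keep accumulating (expected count grows like log n),
# while triple wins stop appearing (expected count is bounded).
print(d_small, t_small, d_large, t_large)
```

Because the two runs share a seed, the longer run extends the shorter one, so the double-win count can only grow between them.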

Problem 2

There are 10 coins in a bag. Five of them are normal coins, one coin has two heads and four coins have two tails. You pull one coin out, look at one of its sides and see that it is a tail. What is the probability that it is a normal coin?


This is just a direct application of Bayes' theorem. Let $N$ denote the event that you pulled a normal coin. Let $T$ denote the event that the side you see is a tail.

By Bayes,
$$\mathbb{P}(N \mid T) = \frac{\mathbb{P}(T \mid N)\,\mathbb{P}(N)}{\mathbb{P}(T)} = \frac{\mathbb{P}(T \cap N)}{\mathbb{P}(T)} = \frac{5/20}{13/20} = \frac{5}{13}.$$

The probability $\mathbb{P}(T \cap N)$ of pulling a normal coin and seeing a tail is $5/20$, since there are five tails on normal coins out of all 20 faces. The probability of seeing a tail is $\mathbb{P}(T) = 13/20$: 13 of the 20 faces are tails (5 on the normal coins plus $2 \times 4 = 8$ on the double-tail coins).
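The answer $5/13$ can be confirmed by exact enumeration of the 20 equally likely (coin, face) outcomes; this is a check of the arithmetic, not a new argument:

```python
from fractions import Fraction

# Enumerate all 20 equally likely (coin, face) outcomes:
# 5 normal coins (H, T), 1 double-head coin (H, H), 4 double-tail coins (T, T).
coins = ([("normal", ["H", "T"])] * 5
         + [("double-head", ["H", "H"])]
         + [("double-tail", ["T", "T"])] * 4)
outcomes = [(kind, face) for kind, faces in coins for face in faces]

# Condition on seeing a tail and count the normal-coin outcomes among them.
tails = [o for o in outcomes if o[1] == "T"]
normal_given_tail = Fraction(sum(o[0] == "normal" for o in tails), len(tails))
print(normal_given_tail)  # 5/13
```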

Problem 3

Let $X_n$ be a Markov chain with state space $\mathbb{Z}_+ = \{0, 1, 2, \dots\}$, with transition probabilities $p_{i,i+1} = p$ for $i \ge 0$, $p_{i,i-1} = q = 1 - p$ for $i \ge 1$, and $p_{0,0} = q$, where $1/2 < p < 1$.

(i) Find a strictly monotonically decreasing non-negative function $f$ such that $f(X_n)$ is a supermartingale.

(ii) Prove that for each initial distribution,
$$\mathbb{P}\left(\lim_{n \to \infty} X_n = +\infty\right) = 1.$$


(i) Let $P$ be the Markov transition matrix and take $f(i) = (q/p)^i$, which is non-negative and strictly monotonically decreasing since $q/p < 1$. I claim that $(Pf)(i) \le f(i)$ for every state $i$; consequently, for any initial probability distribution $\mu$, $\mathbb{E}_\mu[f(X_{n+1})] \le \mathbb{E}_\mu[f(X_n)]$.

Proof of claim: It is sufficient to consider the case where the initial distribution is singular, i.e. $\mu = \delta_i$. If $i = 0$, clearly we can see that $(Pf)(0) = p f(1) + q f(0) \le f(0)$, since $f(1) \le f(0)$. For $i \ge 1$ we have
$$(Pf)(i) = p f(i+1) + q f(i-1) = p\left(\frac{q}{p}\right)^{i+1} + q\left(\frac{q}{p}\right)^{i-1} = \left(\frac{q}{p}\right)^{i}(q + p) = f(i).$$

Now let $Y_n = f(X_n)$ and $\mathcal{F}_n = \sigma(X_0, \dots, X_n)$. We want to compute $\mathbb{E}[Y_{n+1} \mid \mathcal{F}_n]$. By the Markov property,
$$\mathbb{E}[Y_{n+1} \mid \mathcal{F}_n] = (Pf)(X_n) \le f(X_n) = Y_n,$$
where the last inequality comes from our claim above. This shows that $Y_n = f(X_n)$ is a supermartingale.

(ii) $Y_n$ is a non-negative supermartingale, so it converges almost surely to a finite limit. Suppose $X_n$ does not tend to $+\infty$; then some state $i$ is visited infinitely often, and at each such visit the chain steps to $i + 1$ with probability $p > 0$, so almost surely $Y_n$ takes each of the two distinct values $f(i)$ and $f(i+1)$ infinitely often (here we use that $f$ is strictly decreasing), contradicting convergence. Hence almost surely $X_n$ visits every state only finitely often, which on the state space $\mathbb{Z}_+$ forces $X_n \to +\infty$. Thus $\mathbb{P}(\lim_{n\to\infty} X_n = +\infty) = 1$ for every initial distribution.
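The claim in part (i) can be checked numerically. The values $p = 2/3$, $q = 1/3$ below are illustrative, and the rule that the chain stays at 0 with probability $q$ is an assumption; the inequality at 0 holds for any boundary rule that keeps the chain in $\{0, 1\}$:

```python
# Numerical check of (Pf)(i) <= f(i) for f(i) = (q/p)^i, with the
# illustrative values p = 2/3, q = 1/3 (any 1/2 < p < 1 behaves the same).
p, q = 2 / 3, 1 / 3

def f(i):
    return (q / p) ** i

def Pf(i):
    """Expected value of f after one step from state i
    (assumed boundary rule: stay at 0 with probability q)."""
    if i == 0:
        return p * f(1) + q * f(0)
    return p * f(i + 1) + q * f(i - 1)

for i in range(50):
    assert Pf(i) <= f(i) + 1e-12          # supermartingale inequality
    if i >= 1:
        assert abs(Pf(i) - f(i)) < 1e-12  # equality away from the boundary
print("supermartingale inequality holds for states 0..49")
```

Away from 0 the inequality is an equality, i.e. $f(X_n)$ is actually a martingale there, which matches the computation above.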

Problem 4

Let $X_1, X_2, \dots$ be i.i.d. random variables with $\mathbb{P}(X_n = 0) = \mathbb{P}(X_n = 1) = 1/2$.

(i) Prove that the series $\sum_{n=1}^{\infty} X_n 4^{-n}$ converges with probability one.

(ii) Prove that the distribution of $X = \sum_{n=1}^{\infty} X_n 4^{-n}$ is singular, i.e., concentrated on a set of Lebesgue measure zero.


(i) Notice that for every $n$,
$$0 \le \sum_{k=1}^{n} X_k 4^{-k} \le \sum_{k=1}^{\infty} 4^{-k} = \frac{1}{3}.$$
So the sequence of partial sums is bounded. Moreover, it must be Cauchy. Indeed, for any $\varepsilon > 0$ we can select $N$ sufficiently large so that for every $m > n \ge N$,
$$\left|\sum_{k=n+1}^{m} X_k 4^{-k}\right| \le \sum_{k=N+1}^{\infty} 4^{-k} = \frac{4^{-N}}{3} < \varepsilon.$$
Hence, the series converges almost surely (in fact, for every outcome).

(ii) To show that the distribution of $X$ is supported on a set of Lebesgue measure zero, first recall some facts about the Cantor set.

The Cantor set is the set of all $x \in [0,1]$ with a ternary expansion (in base 3) using only the digits 0 and 2. This corresponds to the usual Cantor set, which can be thought of as the perfect symmetric set with contraction ratio 1/3.

Instead, consider the set $C$ consisting of all $x \in [0,1]$ with an expansion in base 4 using only the digits 0 and 1. By construction $X \in C$ with probability one, since the base-4 digits of $X$ are the $X_n \in \{0, 1\}$. There exists an obvious bijection between the elements of $C$ and the Cantor set (replace each digit 1 by a 2 and reinterpret the expansion in base 3). The points whose first $n$ base-4 digits lie in $\{0, 1\}$ form $2^n$ intervals of length $4^{-n}$ each, so the Lebesgue measure of $C$ is at most $\lim_{n\to\infty} 2^n \cdot 4^{-n} = 0$. Hence the distribution of $X$ has support on a set of Lebesgue measure zero, i.e. it is singular.
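Both computations above admit exact arithmetic checks. This sketch assumes the series $\sum_n X_n 4^{-n}$ with digits $X_n \in \{0, 1\}$; it verifies the geometric tail bound from (i) and the shrinking covering measure $2^n \cdot 4^{-n}$ from (ii):

```python
from fractions import Fraction

# (ii): the values of X with the first n base-4 digits fixed in {0, 1}
# lie in 2^n intervals of length 4^-n, so the covering measure is 2^-n.
for n in range(1, 30):
    assert 2 ** n * Fraction(1, 4) ** n == Fraction(1, 2) ** n

# (i): tail of the series, sum_{k=n+1}^{m} 4^-k = (4^-n - 4^-m) / 3 < 4^-n / 3.
n, m = 10, 60
partial_tail = sum(Fraction(1, 4) ** k for k in range(n + 1, m + 1))
assert partial_tail == (Fraction(1, 4) ** n - Fraction(1, 4) ** m) / 3
print(float(Fraction(1, 2) ** 20))  # covering measure after 20 digits
```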

Problem 5

Let $X_n$, $n \ge 1$, be a sequence of independent random variables with $X_n$ uniformly distributed on $[-n, n]$. Find $a_n$ and $b_n$ such that $(X_1 + \cdots + X_n - a_n)/b_n$ converges in distribution to a nondegenerate limit and identify the limit.


This is a direct application of the Central Limit Theorem under the Lindeberg condition.

We know that each random variable has mean $\mathbb{E}X_k = 0$ and variance $\operatorname{Var}(X_k) = k^2/3$.

Then $a_n = 0$ and $b_n = B_n$, where
$$B_n^2 = \sum_{k=1}^{n} \frac{k^2}{3} = \frac{n(n+1)(2n+1)}{18} \sim \frac{n^3}{9}.$$
Then $S_n / B_n = (X_1 + \cdots + X_n)/B_n$ converges in distribution to the standard normal provided the Lindeberg condition holds.

Hence we want to check that for every $\varepsilon > 0$,
$$\lim_{n \to \infty} \frac{1}{B_n^2} \sum_{k=1}^{n} \mathbb{E}\left[X_k^2 \,\mathbf{1}_{\{|X_k| > \varepsilon B_n\}}\right] = 0.$$
Since $B_n \sim n^{3/2}/3$ grows faster than $n$, while $|X_k| \le k \le n$, for sufficiently large $n$ we have $\varepsilon B_n > n$ and the domain of each integral is empty. Hence the sum above vanishes for large $n$. Thus the Lindeberg condition is satisfied, the CLT holds, and $(X_1 + \cdots + X_n)/B_n$ converges in distribution to the standard normal $N(0,1)$.
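The normalization can be sanity-checked as follows (assuming $X_k$ uniform on $[-k, k]$ as above): an exact verification of the closed form for $B_n^2$, plus a small Monte Carlo run showing $S_n/B_n$ has mean near 0 and variance near 1. This is an illustration, not a proof:

```python
import random
from fractions import Fraction

# Exact check of B_n^2 = sum_{k=1}^n k^2/3 = n(n+1)(2n+1)/18.
for n in range(1, 100):
    assert (sum(Fraction(k * k, 3) for k in range(1, n + 1))
            == Fraction(n * (n + 1) * (2 * n + 1), 18))

# Monte Carlo illustration: S_n / B_n should look standard normal.
rng = random.Random(1)
n = 200
B_n = (n * (n + 1) * (2 * n + 1) / 18) ** 0.5
samples = [sum(rng.uniform(-k, k) for k in range(1, n + 1)) / B_n
           for _ in range(2000)]
mean = sum(samples) / len(samples)
var = sum(x * x for x in samples) / len(samples)
print(round(mean, 3), round(var, 3))  # close to 0 and 1
```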

Problem 6

(i) Let $X, X_1, X_2, \dots$ be random variables defined on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$. Assuming that $|X_n| \le C$ almost surely for all $n$ (for some constant $C < \infty$), prove that $X_n \to X$ almost surely implies $\mathbb{E}|X_n - X|^2 \to 0$, i.e. under the above assumptions, almost sure convergence implies convergence in mean square.

(ii) Let $X_t$ be a random process with the property that $\mathbb{E}X_t$ and $\mathbb{E}X_t^2$ are finite and do not depend on $t$, and the correlation function $r(t) = \mathbb{E}[X_s X_{s+t}]$ does not depend on $s$ (such a process is called wide-sense stationary). Prove that the correlation function $r(t)$ is continuous if the trajectories of $X_t$ are continuous.


(i) Fix $\varepsilon > 0$ and let $A_n = \{|X_n - X| > \varepsilon\}$. By assumption $X_n \to X$ almost surely, hence also in probability, so $\mathbb{P}(A_n) \to 0$. Now we compute the norm:
$$\mathbb{E}|X_n - X|^2 = \int_{A_n} |X_n - X|^2 \, d\mathbb{P} + \int_{\Omega \setminus A_n} |X_n - X|^2 \, d\mathbb{P}.$$

Let us evaluate the first integral on the right-hand side. First note that $|X| \le C$ almost surely: on the event where $X_n \to X$, which has probability one, $|X| = \lim_n |X_n| \le C$. (Alternatively, by Fatou's lemma $\mathbb{E}X^2 \le \liminf_n \mathbb{E}X_n^2 \le C^2$, so $X$ has a finite second moment.) By the triangle inequality, $|X_n - X| \le |X_n| + |X| \le 2C$ almost surely, so
$$\int_{A_n} |X_n - X|^2 \, d\mathbb{P} \le 4C^2 \, \mathbb{P}(A_n) \longrightarrow 0.$$

Now the second term: on $\Omega \setminus A_n$ we have $|X_n - X| \le \varepsilon$, hence
$$\int_{\Omega \setminus A_n} |X_n - X|^2 \, d\mathbb{P} \le \varepsilon^2.$$

Combining the two estimates, $\limsup_n \mathbb{E}|X_n - X|^2 \le \varepsilon^2$ for every $\varepsilon > 0$, so $\mathbb{E}|X_n - X|^2 \to 0$. Thus we have just shown that under the above assumptions, almost sure convergence implies convergence in mean square.
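A concrete instance of part (i) can be computed exactly. This sketch uses the hypothetical example $X_n(\omega) = \omega^n$ on $\Omega = [0, 1]$ with Lebesgue measure, which is uniformly bounded by 1 and converges to 0 almost surely:

```python
from fractions import Fraction

# Example on Omega = [0, 1] with Lebesgue measure: X_n(w) = w^n.
# Then X_n -> 0 for every w < 1 (hence almost surely), |X_n| <= 1, and
# E|X_n - 0|^2 = integral_0^1 w^(2n) dw = 1/(2n + 1) -> 0 (L^2 convergence).
def second_moment(n):
    return Fraction(1, 2 * n + 1)

moments = [second_moment(n) for n in range(1, 200)]
assert all(a > b for a, b in zip(moments, moments[1:]))  # strictly decreasing
assert second_moment(10_000) < Fraction(1, 1000)         # tends to 0
print(float(second_moment(100)))  # 1/201
```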