UMD Probability Qualifying Exams/Aug2009Probability


Problem 1

Let $X_1, X_2, \ldots$ be i.i.d. random variables with moment generating function $\varphi(t) = E\left[e^{tX_1}\right]$, which is finite for all $t \in \mathbb{R}$. Let $S_n = X_1 + \cdots + X_n$.

(a) Prove that

$P(S_n \geq na) \leq e^{-n I(a)},$

where

$I(a) = \sup_{t \geq 0} \psi_a(t)$

and

$\psi_a(t) = at - \log \varphi(t).$

(b) Prove that

$E\left[e^{t S_n}\right] = \varphi(t)^n.$

(c) Assume $a > E[X_1]$. Use the result of (b) to establish that $\limsup_{n \to \infty} S_n / n \leq a$ almost surely.


Solution

(a) Fix $t \geq 0$. By Markov's inequality applied to the nonnegative random variable $e^{t S_n}$,

$P(S_n \geq na) \leq P\left(e^{t S_n} \geq e^{tna}\right) \leq e^{-tna}\, E\left[e^{t S_n}\right] = e^{-tna}\, \varphi(t)^n = e^{-n\left(at - \log \varphi(t)\right)} = e^{-n \psi_a(t)}.$

Thus far, we have not imposed any conditions on $t \geq 0$. So the above inequality will hold for all $t \geq 0$, hence for the supremum as well, which gives us the desired result.


(b) $E\left[e^{t S_n}\right] = E\left[\prod_{i=1}^{n} e^{t X_i}\right] = \prod_{i=1}^{n} E\left[e^{t X_i}\right] = \varphi(t)^n$, where the second equality follows from independence and the last equality from the fact that the $X_i$ are identically distributed.

(c) Since $\varphi(0) = 1$ and $\varphi'(0) = E[X_1]$, the function $\psi_a(t) = at - \log \varphi(t)$ satisfies $\psi_a(0) = 0$ and $\psi_a'(0) = a - E[X_1] > 0$, so $\psi_a(t) > 0$ for all sufficiently small $t > 0$ and hence $I(a) > 0$. Combining (a) and (b), $\sum_{n=1}^{\infty} P(S_n \geq na) \leq \sum_{n=1}^{\infty} e^{-n I(a)} < \infty$, so by the Borel-Cantelli lemma $P(S_n \geq na \text{ infinitely often}) = 0$; that is, $\limsup_{n \to \infty} S_n / n \leq a$ almost surely.
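For illustration (an assumption made only for this example, not part of the problem), take the $X_i$ to be standard normal, so that $\varphi(t) = e^{t^2/2}$. Then the bound in (a) becomes explicit:

\begin{align*}
\psi_a(t) &= at - \tfrac{t^2}{2}, \\
I(a) &= \sup_{t \geq 0}\left(at - \tfrac{t^2}{2}\right) = \tfrac{a^2}{2} \quad \text{for } a \geq 0 \text{ (attained at } t = a\text{)}, \\
P(S_n \geq na) &\leq e^{-n a^2 / 2}.
\end{align*}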

Problem 2

Let $(\Omega, \mathcal{F}, P)$ be a probability space; let $X$ be a random variable with finite second moment and let $\mathcal{G}_1 \subseteq \mathcal{G}_2 \subseteq \mathcal{F}$ be sub-$\sigma$-fields. Prove that

$E\left[\left(X - E[X \mid \mathcal{G}_2]\right)^2\right] \leq E\left[\left(X - E[X \mid \mathcal{G}_1]\right)^2\right].$


Solution
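A sketch of one standard argument for an inequality of this type, using orthogonality of conditional expectation in $L^2$ (the decomposition below is introduced only for this sketch): write $X - E[X \mid \mathcal{G}_1]$ as the sum of $X - E[X \mid \mathcal{G}_2]$ and $E[X \mid \mathcal{G}_2] - E[X \mid \mathcal{G}_1]$ and expand the square.

\begin{align*}
E\left[\left(X - E[X \mid \mathcal{G}_1]\right)^2\right]
&= E\left[\left(X - E[X \mid \mathcal{G}_2]\right)^2\right]
 + E\left[\left(E[X \mid \mathcal{G}_2] - E[X \mid \mathcal{G}_1]\right)^2\right] \\
&\quad + 2\, E\left[\left(X - E[X \mid \mathcal{G}_2]\right)\left(E[X \mid \mathcal{G}_2] - E[X \mid \mathcal{G}_1]\right)\right].
\end{align*}

The cross term vanishes: $E[X \mid \mathcal{G}_2] - E[X \mid \mathcal{G}_1]$ is $\mathcal{G}_2$-measurable (because $\mathcal{G}_1 \subseteq \mathcal{G}_2$), so conditioning on $\mathcal{G}_2$ gives

\[
E\left[\left(X - E[X \mid \mathcal{G}_2]\right)\left(E[X \mid \mathcal{G}_2] - E[X \mid \mathcal{G}_1]\right)\right]
= E\left[\left(E[X \mid \mathcal{G}_2] - E[X \mid \mathcal{G}_1]\right) E\left[X - E[X \mid \mathcal{G}_2] \mid \mathcal{G}_2\right]\right] = 0.
\]

Since $E\left[\left(E[X \mid \mathcal{G}_2] - E[X \mid \mathcal{G}_1]\right)^2\right] \geq 0$, the claimed inequality follows.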

Problem 3

Let $N_1(t), \ldots, N_k(t)$ be independent homogeneous Poisson processes with rates $\lambda_1, \ldots, \lambda_k$, respectively. Let $J$ be the time of the first jump of the superposition process $N(t) = N_1(t) + \cdots + N_k(t)$, and let $Z$ be the random index of the component process that made the first jump. Find the joint distribution of $(J, Z)$. In particular, establish that $J$ and $Z$ are independent and that $J$ is exponentially distributed.



Solution

Show $J$ is exponentially distributed

Let $J$ be the first time that the Poisson process $N(t) = N_1(t) + \cdots + N_k(t)$ jumps.
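Granting the claim verified in the next subsection, namely that $N$ is a Poisson process with rate $\lambda = \lambda_1 + \cdots + \lambda_k$, the distribution of $J$ follows from a one-line computation:

\[
P(J > t) = P\left(N(t) = 0\right) = e^{-\lambda t}, \qquad t \geq 0,
\]

so $J$ is exponentially distributed with parameter $\lambda$.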

$N(t) = N_1(t) + \cdots + N_k(t)$ is a Poisson process with parameter $\lambda = \lambda_1 + \cdots + \lambda_k$

Proof: There are three conditions to check:

(i) $N(0) = 0$ almost surely. This holds since $N_i(0) = 0$ almost surely for each $i$.

(ii) For $0 \leq s < t$, is $N(t) - N(s)$ independent of $\sigma\left(N(u) : u \leq s\right)$? This is true since each $N_i$ has independent increments and the processes $N_1, \ldots, N_k$ are independent of each other, so the increment $N(t) - N(s) = \sum_{i=1}^{k}\left(N_i(t) - N_i(s)\right)$ is independent of the joint past of all the component processes, and in particular of $\sigma\left(N(u) : u \leq s\right)$.

(iii) For $0 \leq s < t$, is $N(t) - N(s)$ distributed Poisson with parameter $\lambda(t - s)$? This is true since $N(t) - N(s)$ is a sum of independent Poisson random variables with parameters $\lambda_i (t - s)$, and such a sum is Poisson with parameter $\sum_{i=1}^{k} \lambda_i (t - s) = \lambda (t - s)$.
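As a supporting computation for (iii) (a standard fact, sketched here with probability generating functions), for $0 \leq s < t$ and $|z| \leq 1$:

\[
E\left[z^{\,N(t) - N(s)}\right]
= \prod_{i=1}^{k} E\left[z^{\,N_i(t) - N_i(s)}\right]
= \prod_{i=1}^{k} e^{\lambda_i (t - s)(z - 1)}
= e^{\lambda (t - s)(z - 1)},
\]

which is the generating function of a Poisson random variable with mean $\lambda(t - s)$.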

Joint distribution of $(J, Z)$
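One way to compute the joint distribution, writing $J_i$ for the first jump time of $N_i$ (the $J_i$ are introduced here only for this sketch; they are independent with $J_i \sim \mathrm{Exp}(\lambda_i)$, $J = \min_i J_i$, and $Z$ is the index attaining the minimum, which is almost surely unique):

\begin{align*}
P(Z = i,\ J > t)
&= P\left(J_i > t \text{ and } J_i < J_j \text{ for all } j \neq i\right)
= \int_t^{\infty} \lambda_i e^{-\lambda_i s} \prod_{j \neq i} P(J_j > s)\, ds \\
&= \int_t^{\infty} \lambda_i e^{-\lambda s}\, ds
= \frac{\lambda_i}{\lambda}\, e^{-\lambda t},
\qquad \lambda = \lambda_1 + \cdots + \lambda_k.
\end{align*}

Setting $t = 0$ gives $P(Z = i) = \lambda_i / \lambda$; summing over $i$ gives $P(J > t) = e^{-\lambda t}$. Therefore $P(Z = i,\ J > t) = P(Z = i)\, P(J > t)$ for every $i$ and $t \geq 0$, so $J$ and $Z$ are independent, with $J \sim \mathrm{Exp}(\lambda)$ and $P(Z = i) = \lambda_i / \lambda$.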

Problem 4

Let $(X_n, \mathcal{F}_n)_{n \geq 0}$ be a martingale sequence and for each $n \geq 1$ let $H_n$ be an $\mathcal{F}_{n-1}$-measurable random variable. Define

$Y_0 = 0, \qquad Y_n = \sum_{k=1}^{n} H_k \left(X_k - X_{k-1}\right).$

Assuming that $Y_n$ is integrable for each $n$, show that $(Y_n, \mathcal{F}_n)_{n \geq 0}$ is a martingale.


Solution
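A sketch of the usual martingale-transform argument, using the notation above: adaptedness is clear since $H_k$ and $X_k - X_{k-1}$ are $\mathcal{F}_n$-measurable for every $k \leq n$, and integrability of $Y_n$ is assumed, so it remains to verify the martingale property of the increments. For $n \geq 0$,

\begin{align*}
E\left[Y_{n+1} - Y_n \mid \mathcal{F}_n\right]
&= E\left[H_{n+1}\left(X_{n+1} - X_n\right) \mid \mathcal{F}_n\right] \\
&= H_{n+1}\, E\left[X_{n+1} - X_n \mid \mathcal{F}_n\right] = 0,
\end{align*}

where $H_{n+1}$ can be pulled out of the conditional expectation because it is $\mathcal{F}_n$-measurable and both $X_{n+1} - X_n$ and $H_{n+1}\left(X_{n+1} - X_n\right) = Y_{n+1} - Y_n$ are integrable. Hence $E[Y_{n+1} \mid \mathcal{F}_n] = Y_n$, so $(Y_n, \mathcal{F}_n)$ is a martingale.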

Problem 5

Let $X_1, X_2, \ldots$ be an i.i.d. sequence with $E[X_1] = 0$ and $E[X_1^2] = \sigma^2 < \infty$. Prove that for any $\alpha > 1/2$, the series $\sum_{n=1}^{\infty} \frac{X_n}{n^{\alpha}}$ converges almost surely.

Solution

Define $Y_n = X_n / n^{\alpha}$. Then $E[Y_n] = 0$ and $\operatorname{Var}(Y_n) = \sigma^2 / n^{2\alpha}$. We check the three conditions of Kolmogorov's three-series theorem (with truncation level $A = 1$) to conclude that $\sum_{n=1}^{\infty} Y_n$ converges almost surely.
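For reference, the form of the theorem used here (a standard statement; the truncation level $A = 1$ is a choice made in this solution): if $Y_1, Y_2, \ldots$ are independent and, writing $Y_n^{(A)} = Y_n \mathbf{1}_{\{|Y_n| \leq A\}}$ for some $A > 0$, the three series

\[
\sum_{n} P\left(|Y_n| > A\right), \qquad
\sum_{n} E\left[Y_n^{(A)}\right], \qquad
\sum_{n} \operatorname{Var}\left(Y_n^{(A)}\right)
\]

all converge, then $\sum_n Y_n$ converges almost surely.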


$\sum_n P(|Y_n| > 1) < \infty$

By Chebyshev's inequality, $P(|Y_n| > 1) \leq E[Y_n^2] = \sigma^2 / n^{2\alpha}$, and $\sum_n n^{-2\alpha} < \infty$ since $2\alpha > 1$.

$\sum_n E\left[Y_n \mathbf{1}_{\{|Y_n| \leq 1\}}\right]$ converges

Since $E[Y_n] = 0$, we have $\left|E\left[Y_n \mathbf{1}_{\{|Y_n| \leq 1\}}\right]\right| = \left|E\left[Y_n \mathbf{1}_{\{|Y_n| > 1\}}\right]\right| \leq E\left[|Y_n| \mathbf{1}_{\{|Y_n| > 1\}}\right] \leq E[Y_n^2] = \sigma^2 / n^{2\alpha}$, so the series converges absolutely.

$\sum_n \operatorname{Var}\left(Y_n \mathbf{1}_{\{|Y_n| \leq 1\}}\right) < \infty$

$\operatorname{Var}\left(Y_n \mathbf{1}_{\{|Y_n| \leq 1\}}\right) \leq E\left[Y_n^2 \mathbf{1}_{\{|Y_n| \leq 1\}}\right] \leq E[Y_n^2] = \sigma^2 / n^{2\alpha}$, which is summable. All three conditions hold, so $\sum_n Y_n = \sum_n X_n / n^{\alpha}$ converges almost surely.
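Alternatively (a different route than the three-series check above, sketched under the same assumptions): the $Y_n$ are independent with mean zero and summable variances, so the partial sums form an $L^2$-bounded martingale and therefore converge almost surely:

\[
S_N = \sum_{n=1}^{N} Y_n, \qquad
E\left[S_N^2\right] = \sum_{n=1}^{N} \operatorname{Var}(Y_n)
= \sigma^2 \sum_{n=1}^{N} n^{-2\alpha}
\leq \sigma^2 \sum_{n=1}^{\infty} n^{-2\alpha} < \infty,
\]

so $\sup_N E[S_N^2] < \infty$, and an $L^2$-bounded martingale converges almost surely.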

Problem 6

Consider the following process $(X_n)_{n \geq 0}$ taking values in the positive integers. Assume $\xi_1, \xi_2, \ldots$ is an i.i.d. sequence of positive integer valued random variables and let $X_0$ be independent of the $\xi_i$. Then

Solution