# UMD Probability Qualifying Exams/Aug2009Probability

## Problem 1

 Let ${\displaystyle X_{1},X_{2},...}$ be i.i.d. random variables with moment generating function ${\displaystyle M(t)=E[\exp(tX_{1})]}$ which is finite for all ${\displaystyle t}$. Let ${\displaystyle {\tilde {X}}_{n}=(X_{1}+\cdots +X_{n})/n}$. (a) Prove that ${\displaystyle P[X_{1}>a]\leq \exp[-h(a)]}$ where ${\displaystyle h(a)=\sup _{t\geq 0}[at-\psi (t)]}$ and ${\displaystyle \psi (t)=\log M(t)}$. (b) Prove that ${\displaystyle P[{\tilde {X}}_{n}\geq a]\leq \exp[-nh(a)]}$. (c) Assume ${\displaystyle E[X_{1}]=0}$. Use the result of (b) to establish that ${\displaystyle {\tilde {X}}_{n}\to 0}$ almost surely.

### Solution

(a) Fix ${\displaystyle t\geq 0}$. Then

{\displaystyle {\begin{aligned}P[X_{1}>a]=&\int _{X_{1}>a}1\,dF=\int _{X_{1}>a}{\frac {\exp(ta)}{\exp(ta)}}\,dF\\=&e^{-at}\int _{X_{1}>a}e^{at}\,dF\leq e^{-at}\int _{X_{1}>a}e^{tX_{1}}\,dF\\\leq &e^{-at}\int _{\Omega }e^{tX_{1}}\,dF=e^{-at}E[\exp(tX_{1})]=\exp[-(at-\psi (t))]\end{aligned}}}

The first inequality uses ${\displaystyle t\geq 0}$: on the event ${\displaystyle \{X_{1}>a\}}$ we have ${\displaystyle e^{ta}\leq e^{tX_{1}}}$ only when ${\displaystyle t}$ is nonnegative. Since the resulting bound holds for every ${\displaystyle t\geq 0}$, it also holds for the infimum of the right-hand side over ${\displaystyle t\geq 0}$, which gives the desired result.
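Spelling out the last step:

{\displaystyle P[X_{1}>a]\leq \inf _{t\geq 0}\exp[-(at-\psi (t))]=\exp \left[-\sup _{t\geq 0}(at-\psi (t))\right]=\exp[-h(a)].}

For example, if ${\displaystyle X_{1}\sim N(0,1)}$, then ${\displaystyle \psi (t)=t^{2}/2}$ and ${\displaystyle h(a)=\sup _{t\geq 0}(at-t^{2}/2)=a^{2}/2}$ for ${\displaystyle a\geq 0}$, so (a) recovers the Gaussian tail bound ${\displaystyle P[X_{1}>a]\leq e^{-a^{2}/2}}$.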

(b) Again fix ${\displaystyle t\geq 0}$. Since ${\displaystyle \{{\tilde {X}}_{n}\geq a\}=\{\sum _{i=1}^{n}X_{i}\geq an\}}$,

{\displaystyle {\begin{aligned}P[{\tilde {X}}_{n}\geq a]=&\int _{{\tilde {X}}_{n}\geq a}1\,dF=\int _{{\tilde {X}}_{n}\geq a}{\frac {\exp(nta)}{\exp(nta)}}\,dF\\=&e^{-nat}\int _{\sum _{i=1}^{n}X_{i}\geq an}e^{nat}\,dF\leq e^{-nat}\int _{\sum _{i=1}^{n}X_{i}\geq an}e^{t\sum _{i=1}^{n}X_{i}}\,dF\\\leq &e^{-nat}\int _{\Omega }e^{t\sum _{i=1}^{n}X_{i}}\,dF=e^{-nat}\left(\int _{\Omega }e^{tX_{1}}\,dF\right)^{n}=e^{-nat}M(t)^{n}\end{aligned}}}

where the second-to-last equality follows from the fact that the ${\displaystyle X_{i}}$ are independent and identically distributed. Hence ${\displaystyle P[{\tilde {X}}_{n}\geq a]\leq \exp[-n(at-\psi (t))]}$ for every ${\displaystyle t\geq 0}$, and taking the infimum over ${\displaystyle t\geq 0}$ as in (a) yields ${\displaystyle P[{\tilde {X}}_{n}\geq a]\leq \exp[-nh(a)]}$.

(c) Assume ${\displaystyle E[X_{1}]=0}$ and fix ${\displaystyle \epsilon >0}$. Since ${\displaystyle M}$ is finite for all ${\displaystyle t}$, ${\displaystyle \psi }$ is differentiable at ${\displaystyle 0}$ with ${\displaystyle \psi (0)=0}$ and ${\displaystyle \psi '(0)=E[X_{1}]=0}$, so ${\displaystyle \epsilon t-\psi (t)>0}$ for all sufficiently small ${\displaystyle t>0}$ and hence ${\displaystyle h(\epsilon )>0}$. By (b),

{\displaystyle P[{\tilde {X}}_{n}\geq \epsilon ]\leq e^{-nh(\epsilon )},}

which is summable in ${\displaystyle n}$. The same argument applied to ${\displaystyle -X_{1},-X_{2},...}$ (whose moment generating function ${\displaystyle t\mapsto M(-t)}$ is also finite everywhere and whose mean is ${\displaystyle 0}$) gives a summable bound on ${\displaystyle P[{\tilde {X}}_{n}\leq -\epsilon ]}$. Therefore ${\displaystyle \sum _{n}P[|{\tilde {X}}_{n}|\geq \epsilon ]<\infty }$, and by the Borel–Cantelli lemma ${\displaystyle |{\tilde {X}}_{n}|\geq \epsilon }$ occurs for only finitely many ${\displaystyle n}$ almost surely. Intersecting these almost-sure events over ${\displaystyle \epsilon =1/m}$, ${\displaystyle m\in \mathbb {N} }$, shows that ${\displaystyle {\tilde {X}}_{n}\to 0}$ almost surely.

## Problem 2

 Let ${\displaystyle (\Omega ,{\mathcal {F}},P)}$ be a probability space; let ${\displaystyle X}$ be a random variable with finite second moment and let ${\displaystyle {\mathcal {G}}_{1}\subset {\mathcal {G}}_{2}}$ be sub-${\displaystyle \sigma }$-fields. Prove that ${\displaystyle E[(X-E(X|{\mathcal {G}}_{2}))^{2}]\leq E[(X-E(X|{\mathcal {G}}_{1}))^{2}].}$
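
### Solution

One standard argument, in outline: decompose

{\displaystyle X-E(X|{\mathcal {G}}_{1})=\left(X-E(X|{\mathcal {G}}_{2})\right)+\left(E(X|{\mathcal {G}}_{2})-E(X|{\mathcal {G}}_{1})\right)}

and write ${\displaystyle Y=E(X|{\mathcal {G}}_{2})-E(X|{\mathcal {G}}_{1})}$, which is ${\displaystyle {\mathcal {G}}_{2}}$-measurable since ${\displaystyle {\mathcal {G}}_{1}\subset {\mathcal {G}}_{2}}$. The cross term vanishes by conditioning on ${\displaystyle {\mathcal {G}}_{2}}$:

{\displaystyle E[(X-E(X|{\mathcal {G}}_{2}))Y]=E{\big [}Y\,E[X-E(X|{\mathcal {G}}_{2})|{\mathcal {G}}_{2}]{\big ]}=0.}

Therefore

{\displaystyle E[(X-E(X|{\mathcal {G}}_{1}))^{2}]=E[(X-E(X|{\mathcal {G}}_{2}))^{2}]+E[Y^{2}]\geq E[(X-E(X|{\mathcal {G}}_{2}))^{2}].}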

## Problem 3

 Let ${\displaystyle N_{1}(t),N_{2}(t)}$ be independent homogeneous Poisson processes with rates ${\displaystyle \lambda _{1},\lambda _{2}}$, respectively. Let ${\displaystyle Z}$ be the time of the first jump for the process ${\displaystyle N_{1}(t)+N_{2}(t)}$ and let ${\displaystyle J}$ be the random index of the component process that made the first jump. Find the joint distribution of ${\displaystyle (J,Z)}$. In particular, establish that ${\displaystyle J,Z}$ are independent and that ${\displaystyle Z}$ is exponentially distributed.

### Solution

#### Show ${\displaystyle Z}$ is exponentially distributed

Let ${\displaystyle \tau }$ be the first jump time of a Poisson process ${\displaystyle N(t)}$ with rate ${\displaystyle \lambda }$.

{\displaystyle {\begin{aligned}p_{\tau }(x)=\lim _{\epsilon \to 0}{\frac {F_{\tau }(x)-F_{\tau }(x-\epsilon )}{\epsilon }}&=\lim _{\epsilon \to 0}{\frac {P(N(x)>0,\,N(x-\epsilon )=0)}{\epsilon }}\\=&\lim _{\epsilon \to 0}1/\epsilon \,P(N(x-\epsilon )=0)\cdot P(N(x)-N(x-\epsilon )>0)\\=&\lim _{\epsilon \to 0}1/\epsilon \,e^{-\lambda (x-\epsilon )}\left(1-e^{-\lambda \epsilon }\right)=\lambda e^{-\lambda x}\end{aligned}}}

Here the factorization uses the independent increments of ${\displaystyle N}$, and ${\displaystyle 1-e^{-\lambda \epsilon }=\lambda \epsilon +o(\epsilon )}$. Thus ${\displaystyle \tau }$ is exponentially distributed with rate ${\displaystyle \lambda }$. Once we know that ${\displaystyle N_{1}+N_{2}}$ is a Poisson process with rate ${\displaystyle \lambda _{1}+\lambda _{2}}$ (next subsection), it follows that ${\displaystyle Z}$ is exponential with rate ${\displaystyle \lambda _{1}+\lambda _{2}}$.
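Equivalently, the density computation can be bypassed: ${\displaystyle \{\tau >x\}=\{N(x)=0\}}$, so

{\displaystyle F_{\tau }(x)=1-P(N(x)=0)=1-e^{-\lambda x},\qquad x\geq 0,}

which is the exponential distribution function with rate ${\displaystyle \lambda }$.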

#### ${\displaystyle N_{1}(t)+N_{2}(t)}$ is a Poisson Process with parameter ${\displaystyle \lambda _{1}+\lambda _{2}}$

Proof: There are three conditions to check:

(i) ${\displaystyle N_{1}(0)+N_{2}(0)=0}$ almost surely. This holds since ${\displaystyle N_{1}(0)=N_{2}(0)=0}$ almost surely.

(ii) For ${\displaystyle t>s}$, is ${\displaystyle (N_{1}(t)+N_{2}(t))-(N_{1}(s)+N_{2}(s))}$ independent of ${\displaystyle N_{1}(s)+N_{2}(s)}$? This holds because ${\displaystyle N_{1}}$ and ${\displaystyle N_{2}}$ each have independent increments and the two processes are independent of each other.

(iii) For ${\displaystyle t>s}$, is ${\displaystyle (N_{1}(t)+N_{2}(t))-(N_{1}(s)+N_{2}(s))}$ Poisson distributed with parameter ${\displaystyle (\lambda _{1}+\lambda _{2})(t-s)}$? This is true since the sum of independent Poisson random variables is again Poisson, with parameters adding (see the computation below).
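
The fact used in (iii) can be checked with probability generating functions: if ${\displaystyle U\sim \operatorname {Poisson} (\mu )}$ and ${\displaystyle V\sim \operatorname {Poisson} (\nu )}$ are independent, then

{\displaystyle E[r^{U+V}]=E[r^{U}]\,E[r^{V}]=e^{\mu (r-1)}e^{\nu (r-1)}=e^{(\mu +\nu )(r-1)},}

which is the generating function of a ${\displaystyle \operatorname {Poisson} (\mu +\nu )}$ variable; here take ${\displaystyle \mu =\lambda _{1}(t-s)}$ and ${\displaystyle \nu =\lambda _{2}(t-s)}$.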

#### Joint distribution of ${\displaystyle (J,Z)}$

For ${\displaystyle x>0}$, the event ${\displaystyle \{J=1,\,Z\in dx\}}$ requires that the first jump of ${\displaystyle N_{1}}$ occur in ${\displaystyle dx}$ and that ${\displaystyle N_{2}}$ have no jump in ${\displaystyle [0,x]}$. By independence of the two processes,

{\displaystyle f_{J,Z}(1,x)=\lambda _{1}e^{-\lambda _{1}x}\cdot e^{-\lambda _{2}x}=\lambda _{1}e^{-(\lambda _{1}+\lambda _{2})x},\qquad f_{J,Z}(2,x)=\lambda _{2}e^{-(\lambda _{1}+\lambda _{2})x}.}

Summing over ${\displaystyle j}$ gives ${\displaystyle f_{Z}(x)=(\lambda _{1}+\lambda _{2})e^{-(\lambda _{1}+\lambda _{2})x}}$, so ${\displaystyle Z}$ is exponential with rate ${\displaystyle \lambda _{1}+\lambda _{2}}$, consistent with the first part. Integrating over ${\displaystyle x}$ gives ${\displaystyle P(J=j)=\lambda _{j}/(\lambda _{1}+\lambda _{2})}$. Since ${\displaystyle f_{J,Z}(j,x)=P(J=j)\,f_{Z}(x)}$, the variables ${\displaystyle J}$ and ${\displaystyle Z}$ are independent.
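For instance (numbers chosen purely for illustration): with ${\displaystyle \lambda _{1}=1}$ and ${\displaystyle \lambda _{2}=2}$, ${\displaystyle Z}$ is exponential with rate ${\displaystyle 3}$ and ${\displaystyle P(J=1)=1/3}$, ${\displaystyle P(J=2)=2/3}$, with ${\displaystyle J}$ independent of ${\displaystyle Z}$.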

## Problem 4

 Let ${\displaystyle (X_{n},{\mathcal {F}}_{n})}$ be a martingale sequence and for each ${\displaystyle n}$ let ${\displaystyle \epsilon _{n}}$ be an ${\displaystyle {\mathcal {F}}_{n-1}}$-measurable random variable. Define ${\displaystyle Y_{n}=\sum _{i=1}^{n}\epsilon _{i}(X_{i}-X_{i-1}),\quad Y_{0}=0}$ Assuming that ${\displaystyle Y_{n}}$ is integrable for each ${\displaystyle n}$, show that ${\displaystyle Y_{n}}$ is a martingale.
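
### Solution

In outline: ${\displaystyle Y_{n}}$ is ${\displaystyle {\mathcal {F}}_{n}}$-measurable (each ${\displaystyle \epsilon _{i}}$ is ${\displaystyle {\mathcal {F}}_{i-1}}$-measurable and each ${\displaystyle X_{i}}$ is ${\displaystyle {\mathcal {F}}_{i}}$-measurable) and integrable by assumption, so only the martingale property needs checking. Since ${\displaystyle \epsilon _{n}}$ is ${\displaystyle {\mathcal {F}}_{n-1}}$-measurable, it pulls out of the conditional expectation:

{\displaystyle E[Y_{n}-Y_{n-1}|{\mathcal {F}}_{n-1}]=E[\epsilon _{n}(X_{n}-X_{n-1})|{\mathcal {F}}_{n-1}]=\epsilon _{n}\,E[X_{n}-X_{n-1}|{\mathcal {F}}_{n-1}]=0,}

because ${\displaystyle (X_{n},{\mathcal {F}}_{n})}$ is a martingale. Hence ${\displaystyle E[Y_{n}|{\mathcal {F}}_{n-1}]=Y_{n-1}}$, so ${\displaystyle (Y_{n},{\mathcal {F}}_{n})}$ is a martingale (this is the martingale transform).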

## Problem 5

 Let ${\displaystyle X_{1},X_{2},...}$ be an i.i.d. sequence with ${\displaystyle E[X_{i}]=0}$ and ${\displaystyle V[X_{i}]=\sigma ^{2}<\infty }$. Prove that for any ${\displaystyle \gamma >1/2}$, the series ${\displaystyle \sum _{k=1}^{\infty }X_{k}/k^{\gamma }}$ converges almost surely.

### Solution

Define ${\displaystyle Z_{k}:=X_{k}/k^{\gamma }}$. The ${\displaystyle Z_{k}}$ are independent with ${\displaystyle E[Z_{k}]=0}$ and ${\displaystyle V[Z_{k}]={\frac {\sigma ^{2}}{k^{2\gamma }}}}$, and ${\displaystyle \sum _{k}V[Z_{k}]=\sigma ^{2}\sum _{k}k^{-2\gamma }<\infty }$ because ${\displaystyle 2\gamma >1}$. We check the three series in Kolmogorov's three-series theorem to conclude that ${\displaystyle \sum _{k=1}^{\infty }Z_{k}}$ converges almost surely; the checks are carried out below.
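Carrying out the checks in outline, with truncation level ${\displaystyle 1}$ and ${\displaystyle Z_{k}^{(1)}:=Z_{k}\mathbf {1} _{\{|Z_{k}|\leq 1\}}}$:

(i) ${\displaystyle \sum _{k}P(|Z_{k}|>1)\leq \sum _{k}E[Z_{k}^{2}]=\sigma ^{2}\sum _{k}k^{-2\gamma }<\infty }$ by Chebyshev's inequality.

(ii) ${\displaystyle |E[Z_{k}^{(1)}]|=|E[Z_{k}\mathbf {1} _{\{|Z_{k}|>1\}}]|\leq E[|Z_{k}|\mathbf {1} _{\{|Z_{k}|>1\}}]\leq E[Z_{k}^{2}]}$ (using ${\displaystyle E[Z_{k}]=0}$ and ${\displaystyle |Z_{k}|\leq Z_{k}^{2}}$ on ${\displaystyle \{|Z_{k}|>1\}}$), so ${\displaystyle \sum _{k}E[Z_{k}^{(1)}]}$ converges absolutely.

(iii) ${\displaystyle \sum _{k}V[Z_{k}^{(1)}]\leq \sum _{k}E[(Z_{k}^{(1)})^{2}]\leq \sum _{k}E[Z_{k}^{2}]<\infty }$.

Alternatively, since the ${\displaystyle Z_{k}}$ are independent and mean-zero with ${\displaystyle \sum _{k}V[Z_{k}]<\infty }$, the Khintchine–Kolmogorov convergence theorem gives the conclusion directly.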

## Problem 6

 Consider the following process ${\displaystyle \{X_{n}\}}$ taking values in ${\displaystyle \{0,1,...\}}$. Assume ${\displaystyle U_{n},n=1,2,...}$ is an i.i.d. sequence of positive integer valued random variables and let ${\displaystyle X_{0}}$ be independent of the ${\displaystyle U_{n}}$. Then ${\displaystyle X_{n}=\left\{{\begin{array}{l l}X_{n-1}-1&{\text{if }}X_{n-1}\neq 0\\U_{k}-1&{\text{if }}X_{n-1}=0{\text{ for the kth time}}\end{array}}\right.}$