# UMD Probability Qualifying Exams/Jan2010Probability

## Problem 1

 Let $\{X_{nk}\},k=1,...,r_{n},n=1,2,...$ be a triangular array of Bernoulli random variables with $p_{nk}=P[X_{nk}=1]$ . Suppose that $\sum _{k=1}^{r_{n}}p_{nk}\to \lambda \,{\text{ and }}\,\max _{k\leq r_{n}}p_{nk}\to 0.$ Find the limiting distribution of $\sum _{k=1}^{r_{n}}X_{nk}$ .

### Solution

We will show that the sum converges in distribution to a Poisson distribution with parameter $\lambda$ (the variables within each row are assumed independent). The characteristic function of the Poisson($\lambda$) distribution is $e^{\lambda (e^{it}-1)}$ . We show that the characteristic function $E[\exp(it\sum _{k=1}^{r_{n}}X_{nk})]$ converges to $e^{\lambda (e^{it}-1)}$ for every $t$, which implies the result by Lévy's continuity theorem.

$\log E[\exp(it\sum _{k=1}^{r_{n}}X_{nk})]=\sum _{k=1}^{r_{n}}\log((1-p_{nk})+p_{nk}e^{it})=\sum _{k=1}^{r_{n}}\log(1-p_{nk}(1-e^{it}))=\sum _{k=1}^{r_{n}}(-p_{nk}(1-e^{it})+O(p_{nk}^{2}))$ , using $\log(1+z)=z+O(z^{2})$, which is valid for large $n$ because $\max _{k\leq r_{n}}p_{nk}\to 0$. The first term sums to $-(1-e^{it})\sum _{k}p_{nk}\to \lambda (e^{it}-1)$, and the error terms satisfy $\sum _{k}p_{nk}^{2}\leq (\max _{k\leq r_{n}}p_{nk})\sum _{k}p_{nk}\to 0$. Hence the characteristic function converges to $e^{\lambda (e^{it}-1)}$.
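The Poisson limit can be sanity-checked by simulation. The sketch below (not part of the proof; numpy, the choice $p_{nk}=\lambda /r_{n}$, and the seed are assumptions for illustration) takes equal probabilities so that $\sum _{k}p_{nk}=\lambda$ and $\max _{k}p_{nk}\to 0$, and compares the sample mean and variance of the row sums with the Poisson($\lambda$) values:

```python
import numpy as np

# Monte Carlo check: p_nk = lam / r_n, so sum_k p_nk = lam and max_k p_nk -> 0.
rng = np.random.default_rng(0)  # fixed seed for reproducibility
lam, r_n, trials = 2.0, 500, 20000

# Each row is one realization of (X_n1, ..., X_n r_n); sum rows to get S_n.
samples = rng.binomial(1, lam / r_n, size=(trials, r_n)).sum(axis=1)

# A Poisson(lam) limit has mean lam and variance lam.
print(samples.mean(), samples.var())  # both close to 2
```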

## Problem 2

 Let $X_{1},X_{2},...$ be a sequence of i.i.d. random variables with uniform distribution on $[0,1]$ . Prove that $\lim _{n\to \infty }(X_{1}X_{2}\cdots X_{n})^{1/n}$ exists with probability one and compute its value.

### Solution

Let $Y_{n}=(X_{1}X_{2}\cdots X_{n})^{1/n}$ .

$\log(Y_{n})={\frac {1}{n}}\sum _{j=1}^{n}\log(X_{j})$ .

The random variables $\log(X_{j})$ are i.i.d. with finite mean,

$E[\log(X_{j})]=\int _{0}^{1}\log(t)dt=-1$ .

Therefore, the strong law of large numbers implies ${\frac {1}{n}}\sum _{j=1}^{n}\log(X_{j})$ converges with probability one to $-1$ .

So almost surely, $\log(Y_{n})$ converges to $-1$ and $Y_{n}$ converges to $e^{-1}$ .
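The almost-sure limit $e^{-1}$ is easy to observe numerically. A minimal sketch (numpy and the seed are illustration-only assumptions), working in logs exactly as the proof does:

```python
import numpy as np

# Sanity check of the a.s. limit (X_1 * ... * X_n)^(1/n) -> 1/e.
rng = np.random.default_rng(1)
n = 200_000
x = rng.uniform(0.0, 1.0, size=n)

# Work in logs to avoid underflow: (prod x)^(1/n) = exp(mean(log x)).
geo_mean = np.exp(np.log(x).mean())
print(geo_mean)  # close to exp(-1) ≈ 0.3679
```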

## Problem 3

 Let $\{X_{n}|n=0,1,2,...\}$ be a square integrable martingale with respect to a nested sequence of $\sigma$ -fields $\{{\mathcal {F}}_{n}\}$ . Assume $E[X_{n}]=0$ . Prove that $P[\max _{1\leq k\leq n}|X_{k}|>\epsilon ]\leq E[X_{n}^{2}]/\epsilon ^{2}$ .

### Solution

Since $X_{n}$ is a square-integrable martingale, $X_{n}^{2}$ is a nonnegative submartingale by the conditional Jensen inequality. Applying Doob's maximal inequality to the submartingale $X_{n}^{2}$ gives $P[\max _{1\leq k\leq n}|X_{k}|>\epsilon ]\leq P[\max _{1\leq k\leq n}X_{k}^{2}\geq \epsilon ^{2}]\leq E[X_{n}^{2}]/\epsilon ^{2}$ .
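The inequality can be illustrated on a concrete martingale. The sketch below uses a simple symmetric random walk (an assumed toy example, not part of the solution), for which $E[S_{n}^{2}]=n$, and compares the empirical maximal probability with the bound $n/\epsilon ^{2}$:

```python
import numpy as np

# Estimate P[max_{k<=n} |S_k| > eps] for a simple symmetric random walk
# and compare with the Kolmogorov/Doob bound E[S_n^2] / eps^2 = n / eps^2.
rng = np.random.default_rng(2)
n, trials, eps = 50, 20000, 15.0

steps = rng.choice([-1, 1], size=(trials, n))
paths = steps.cumsum(axis=1)             # S_1, ..., S_n for each trial
max_abs = np.abs(paths).max(axis=1)

empirical = (max_abs > eps).mean()
bound = n / eps**2                       # E[S_n^2] = n for the simple walk
print(empirical, bound)                  # empirical probability stays below the bound
```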

## Problem 4

 The random variable $X$ is defined on a probability space $(\Omega ,{\mathcal {F}},P)$ . Let ${\mathcal {G}}_{1}\subset {\mathcal {G}}_{2}\subset {\mathcal {F}}$ and assume $X$ has finite variance. Prove that $E[(X-E[X|{\mathcal {G}}_{2}])^{2}]\leq E[(X-E[X|{\mathcal {G}}_{1}])^{2}].$ In words, the dispersion of $X$ about its conditional mean becomes smaller as the $\sigma$ -field grows.

### Solution

We expand:

$E[(X-E[X|{\mathcal {G}}_{1}])^{2}]=E[((X-E[X|{\mathcal {G}}_{2}])+(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}]))^{2}]$

$=E[(X-E[X|{\mathcal {G}}_{2}])^{2}]+E[(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])^{2}]+2E[(X-E[X|{\mathcal {G}}_{2}])(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])]$

We will show that the third term vanishes. Since the second term is nonnegative, the result then follows.
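As a quick numerical illustration of the inequality (independent of the proof), consider a toy model, an assumption for illustration only: $X=Z_{1}+Z_{2}+Z_{3}$ with i.i.d. standard normals, ${\mathcal {G}}_{1}=\sigma (Z_{1})$, ${\mathcal {G}}_{2}=\sigma (Z_{1},Z_{2})$, so $E[X|{\mathcal {G}}_{1}]=Z_{1}$ and $E[X|{\mathcal {G}}_{2}]=Z_{1}+Z_{2}$:

```python
import numpy as np

# Toy model: X = Z1 + Z2 + Z3, G1 = sigma(Z1), G2 = sigma(Z1, Z2).
# Then E[X|G1] = Z1 and E[X|G2] = Z1 + Z2, so the two mean-squared
# dispersions should be near Var(Z2 + Z3) = 2 and Var(Z3) = 1.
rng = np.random.default_rng(3)
z = rng.standard_normal(size=(100_000, 3))
x = z.sum(axis=1)

mse_g1 = np.mean((x - z[:, 0]) ** 2)             # near 2
mse_g2 = np.mean((x - z[:, 0] - z[:, 1]) ** 2)   # near 1
print(mse_g2, mse_g1)                            # mse_g2 < mse_g1
```

Returning to the proof, it remains to check the cross term.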

$E[(X-E[X|{\mathcal {G}}_{2}])(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])]=E[E[(X-E[X|{\mathcal {G}}_{2}])(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])|{\mathcal {G}}_{2}]]$ by the tower property of conditional expectation.

$E[(X-E[X|{\mathcal {G}}_{2}])(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])|{\mathcal {G}}_{2}]=(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])E[(X-E[X|{\mathcal {G}}_{2}])|{\mathcal {G}}_{2}]$ , since $(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])$ is ${\mathcal {G}}_{2}$ -measurable.

Finally, $E[(X-E[X|{\mathcal {G}}_{2}])|{\mathcal {G}}_{2}]=E[X|{\mathcal {G}}_{2}]-E[E[X|{\mathcal {G}}_{2}]|{\mathcal {G}}_{2}]=E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{2}]=0$ , which completes the proof.

## Problem 5

 Consider a sequence of random variables $X_{1},X_{2},\ldots$ such that $X_{n}=1{\text{ or }}0$ . Assume $P[X_{1}=1]\geq \alpha$ and $P[X_{n}=1|X_{1},\ldots ,X_{n-1}]\geq \alpha >0{\text{ for }}n=2,3,\ldots$ Prove that

(a) $P[X_{n}=1{\text{ for some }}n]=1.$

(b) $P[X_{n}=1{\text{ infinitely often}}]=1.$

### Solution

Since (b) implies (a), it suffices to prove (b); we show $P[X_{n}=1{\text{ only finitely often}}]=0$ . If $X_{n}=1$ for only finitely many $n$ (or for no $n$ at all), then there is an index $T$ such that $X_{n}=0$ for all $n\geq T$ . So it suffices to show that for every $T$ , $P[X_{n}=0{\text{ for all }}n\geq T]=0$ , since a countable union of null events is null.

First notice $P[X_{1}=0]\leq 1-\alpha$ and, for $T>1$ , $P[X_{T}=0]=E[P[X_{T}=0|X_{1},X_{2},\ldots ,X_{T-1}]]\leq 1-\alpha$ .

Now fix $T$ and let $A_{n}^{(T)}$ be the event $[X_{T}=X_{T+1}=\cdots =X_{T+n-1}=0]$ . Since these events decrease in $n$ , continuity of measure gives $P[X_{n}=0{\text{ for all }}n\geq T]=P[A_{n}^{(T)}{\text{ occurs for all }}n]=\lim _{n\to \infty }P[A_{n}^{(T)}]$ .

Notice $P[A_{n}^{(T)}]=P[X_{T+n-1}=0|A_{n-1}^{(T)}]P[A_{n-1}^{(T)}]\leq (1-\alpha )P[A_{n-1}^{(T)}]{\text{ for }}n=2,3,\ldots$ (condition on $X_{1},\ldots ,X_{T+n-2}$ and use the hypothesis), and $P[A_{1}^{(T)}]=P[X_{T}=0]\leq 1-\alpha$ . Therefore $P[A_{n}^{(T)}]\leq (1-\alpha )^{n}$ and $\lim _{n\rightarrow \infty }P[A_{n}^{(T)}]=0$ . So $P[X_{n}=0{\text{ for all }}n\geq T]=0$ , and we reach the desired conclusion.
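The geometric bound $P[A_{n}^{(T)}]\leq (1-\alpha )^{n}$ can be checked in the simplest case satisfying the hypotheses, namely i.i.d. Bernoulli($\alpha$) variables, where the bound holds with equality. A minimal sketch (numpy, the value of $\alpha$, and the seed are illustration-only assumptions):

```python
import numpy as np

# i.i.d. Bernoulli(alpha) satisfies the hypotheses, and then
# P[A_n] = P[X_1 = ... = X_n = 0] = (1 - alpha)^n exactly.
rng = np.random.default_rng(4)
alpha, n, trials = 0.3, 10, 50_000

x = rng.binomial(1, alpha, size=(trials, n))
all_zero = (x.sum(axis=1) == 0).mean()   # empirical P[A_n]
print(all_zero, (1 - alpha) ** n)        # the two values agree closely
```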

## Problem 6

 Let $\{N(t):t\geq 0\}$ be a nonhomogeneous Poisson process. That is, $N(0)=0$ a.s., $N(t)$ has independent increments, and $N(t)-N(s)$ has a Poisson distribution with parameter $\int _{s}^{t}\lambda (u)du$ where $0\leq s\leq t$ and the rate function $\lambda (u)$ is a continuous positive function. (a.) Find a continuous strictly increasing function $h(t)$ such that the time-transformed process ${\tilde {N}}(t)=N(h(t))$ is a homogeneous Poisson process with rate parameter 1. (b.) Let $T$ be the time until the first event in the nonhomogeneous process $N(t)$ . Compute $P[T>t]$ and $P[T>t|N(s)=n]$ , where $s>t$ .
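A process of this type can be simulated by thinning (the Lewis-Shedler method): generate a homogeneous Poisson process whose rate $\lambda _{\max }$ dominates $\lambda (u)$, then keep each point at time $u$ with probability $\lambda (u)/\lambda _{\max }$. The sketch below is an illustration of the definition, with a hypothetical rate function and seed; it checks that $E[N(T)]=\int _{0}^{T}\lambda (u)du$:

```python
import numpy as np

rng = np.random.default_rng(5)

def lam(u):
    # Example rate function (an assumption), bounded above by lam_max = 2.
    return 1.0 + np.sin(u) ** 2

def simulate_N(T, lam_max=2.0):
    # Thinning: homogeneous rate-lam_max points on [0, T], kept with
    # probability lam(u) / lam_max at each point u.
    n = rng.poisson(lam_max * T)
    times = np.sort(rng.uniform(0.0, T, size=n))
    keep = rng.uniform(size=n) < lam(times) / lam_max
    return times[keep]                   # event times of N on [0, T]

# E[N(T)] should equal the integral of lambda over [0, T],
# here 7.5 - sin(10)/4 for T = 5.
T, trials = 5.0, 5000
counts = [len(simulate_N(T)) for _ in range(trials)]
print(np.mean(counts))
```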