Probability with Measure
7.6 Exercises on Chapter 7
On the Borel-Cantelli lemmas
-
7.1 Let \(k\in \N \). Prove that in a sequence of independent coin tosses, infinitely many runs of \(k\) consecutive heads will occur.
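As a numerical companion (not a proof), the following sketch counts disjoint length-\(k\) blocks of heads in a simulated toss sequence; the helper name `count_head_runs` and the disjoint-counting convention are choices of illustration, not from the text.

```python
import random

def count_head_runs(tosses, k):
    """Count disjoint runs of k consecutive heads (1 = heads, 0 = tails)."""
    count, run = 0, 0
    for t in tosses:
        run = run + 1 if t == 1 else 0
        if run == k:
            count += 1
            run = 0  # restart so runs are counted disjointly, as in the usual proof
    return count

# Disjoint blocks of length k are all-heads independently with probability 2^{-k};
# these probabilities sum to infinity, so the second Borel-Cantelli lemma gives
# infinitely many such runs almost surely.
random.seed(0)
print(count_head_runs([random.randint(0, 1) for _ in range(100_000)], 3))
```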
-
7.2 Let \((\Omega , \mc {F}, \P )\) be a probability space. Let \((A_n)\) be a sequence of events.
-
(a) Show that \(\{ A_{n}\text { e.v.}\} \subseteq \{A_{n}\text { i.o.}\}\).
-
(b) Show that \(\{A_n\text { i.o.}\}^{c} = \{A_n^c\text { e.v.}\}\) and deduce that \(\P [A_n\text { i.o.}] = 1 - \P [A_n^c\text { e.v.}].\)
-
(c) Show that
\[\P [A_n\text { e.v.}]\;\leq \; \li \P [A_n]\;\leq \; \ls \P [A_{n}] \;\leq \; \P \l [A_{n}\text { i.o.}\r ].\]
-
On convergence of random variables and laws of large numbers
-
7.3 Let \((X_n)\) be a sequence of i.i.d. random variables such that \(\P [X_n=1]=\P [X_n=0]=\frac 12\). Show that \(X_n\stackrel {d}{\to }X_1\) as \(n\to \infty \), but that this convergence does not hold in probability.
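A simulation sketch of the phenomenon (assumed setup: two independent streams of fair bits standing in for \(X_1\) and a later \(X_n\)):

```python
import random

random.seed(1)
N = 100_000
# X_1 and X_n are independent fair {0,1} bits: they share one distribution,
# so convergence in distribution holds trivially...
x1 = [random.randint(0, 1) for _ in range(N)]
xn = [random.randint(0, 1) for _ in range(N)]
# ...but P[|X_n - X_1| >= 1/2] = 1/2 for every n, which rules out
# convergence in probability.
freq = sum(abs(a - b) >= 0.5 for a, b in zip(x1, xn)) / N
print(freq)  # stays near 1/2, no matter how large n is
```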
-
-
7.4 (a) Let \((X_n)\) be a sequence of random variables such that \(\P [X_n=n]=\frac 1{n^2}\) and \(\P [X_n=0]=1-\frac 1{n^2}\). Show that \(X_n\stackrel {a.s.}{\to } 0\) and \(X_n\stackrel {L^1}{\to } 0\).
-
(b) Let \((X_n)\) be a sequence of independent random variables such that \(\P [X_n=n]=\frac 1{n}\) and \(\P [X_n=0]=1-\frac 1{n}\). Show that \(X_n\) does not converge to zero almost surely or in \(L^1\).
-
(c) Let \((X_n)\) be a sequence of random variables such that \(\P [X_n=n^2]=\frac 1{n^2}\) and \(\P [X_n=0]=1-\frac 1{n^2}\). Show that \(X_n\stackrel {a.s.}{\to } 0\), and that \(X_n\) does not converge to zero in \(L^1\).
-
(d) Let \((X_n)\) be a sequence of independent random variables such that \(\P [X_n=\sqrt {n}]=\frac 1{n}\) and \(\P [X_n=0]=1-\frac 1{n}\). Show that \(X_n\stackrel {L^1}{\to } 0\), and that \(X_n\) does not converge almost surely to zero.
-
(e) Deduce that \(X_n\stackrel {\P }{\to } 0\) in all of the above cases.
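A quick numeric tabulation of the quantities that drive parts (a)-(d) (the tabulation is an illustration, not from the text): \(\E [X_n]\) governs \(L^1\) convergence, while summability of \(\P [X_n\neq 0]\) governs, via the Borel-Cantelli lemmas, the almost-sure behaviour.

```python
N = 10_000
# For each case: (E[X_n] as a function of n, P[X_n != 0] as a function of n).
cases = {
    "a": (lambda n: n * n**-2,    lambda n: n**-2),
    "b": (lambda n: n * n**-1,    lambda n: n**-1),
    "c": (lambda n: n**2 * n**-2, lambda n: n**-2),
    "d": (lambda n: n**0.5 / n,   lambda n: n**-1),
}
results = {}
for name, (mean, tail) in cases.items():
    # E[X_N], and the partial sum of P[X_n != 0] up to N
    results[name] = (mean(N), sum(tail(n) for n in range(1, N + 1)))
for name, (e, s) in results.items():
    print(name, e, round(s, 2))
```

Cases (a) and (c) have summable tails (first Borel-Cantelli: almost-sure convergence), while (b) and (d) have divergent tail sums (second Borel-Cantelli, using independence: no almost-sure convergence); the first coordinate tending to zero matches the \(L^1\) conclusions.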
-
-
7.5 Show that the following sequence of random variables \((X_n)\) converges to the candidate limit \(X\) in probability, but not almost surely.
Take \(\Omega = [0,1],{\cal F} = {\cal B}([0,1])\) and \(\P \) to be Lebesgue measure. Take \(X = 0\) and define \(X_{n} = {\1}_{A_{n}}\) where \(A_{1} = [0, 1/2], A_{2} = [1/2, 1], A_{3} = [0, 1/4], A_{4} = [1/4, 1/2], A_{5} = [1/2, 3/4], A_{6} = [3/4, 1], A_{7} = [0, 1/8], A_{8} = [1/8, 1/4]\) etc.
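The interval pattern can be generated mechanically; the sketch below (helper names are illustrative) checks the two competing facts: the lengths \(\P [X_n\neq 0]\) shrink to zero, while any fixed \(\omega\) keeps landing in some \(A_n\) at every dyadic level.

```python
from fractions import Fraction

def typewriter_intervals(levels):
    """Level j = 1, 2, ... contributes the 2^j dyadic subintervals of [0, 1]
    of width 2^{-j}, matching A_1 = [0, 1/2], A_2 = [1/2, 1], A_3 = [0, 1/4], ..."""
    out = []
    for j in range(1, levels + 1):
        w = Fraction(1, 2**j)
        out.extend((k * w, (k + 1) * w) for k in range(2**j))
    return out

intervals = typewriter_intervals(4)
lengths = [b - a for a, b in intervals]   # P[X_n != 0] -> 0: convergence in probability
omega = Fraction(1, 3)                    # any fixed sample point
hits = sum(a <= omega <= b for a, b in intervals)
print(lengths[-1], hits)                  # omega is hit once per level: X_n(omega) = 1 i.o.
```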
-
7.6 Let \((X_n)\) be a sequence of random variables, and let \(X\) and \(Y\) be random variables.
-
(a) Show that if \(X_n\stackrel {d}{\to }X\) and \(X_n\stackrel {d}{\to }Y\) then \(X\) and \(Y\) have the same distribution function, i.e. \(F_X=F_Y\).
Hint: If two right-continuous functions are equal almost everywhere, they are equal everywhere.
-
(b) Show that if \(X_n\stackrel {\P }{\to }X\) and \(X_n\stackrel {\P }{\to }Y\) then \(X=Y\) almost surely.
-
-
7.7 Examine the proof of the weak law of large numbers (Theorem 7.3.1) carefully. Show that the conclusion continues to hold if the requirement that the random variables \((X_{n})\) are i.i.d. is replaced by the weaker condition that they are identically distributed and uncorrelated, that is \(\E [X_{m}X_{n}]= \E [X_{m}]\E [X_{n}]\) whenever \(m \neq n\).
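One way to see why the proof survives this weakening (a sketch of the key variance computation, using only the identically-distributed and uncorrelated assumptions):

```latex
\mathrm{Var}\!\left(\frac{S_n}{n}\right)
 = \frac{1}{n^2}\sum_{m=1}^{n}\sum_{m'=1}^{n}
     \big(\E [X_m X_{m'}] - \E [X_m]\E [X_{m'}]\big)
 = \frac{1}{n^2}\sum_{m=1}^{n}\mathrm{Var}(X_m)
 = \frac{\mathrm{Var}(X_1)}{n} \longrightarrow 0,
```

since every cross term vanishes by the uncorrelatedness condition; Chebyshev's inequality then gives convergence in probability exactly as in the i.i.d. case.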
-
-
7.8 (a) Let \(X\) be a random variable and suppose that \(X\geq 0\). Show that for any \(a\in (0,1]\) we have \(\E [\min (1,X)] \leq a + \P [X\geq a].\)
-
(b) Let \((X_n)\) be a sequence of random variables. Suppose that \(X_n\geq 0\) for all \(n\in \N \). Show that as \(n\to \infty \),
\[X_n\stackrel {\P }{\to } 0 \quad \text {if and only if}\quad \E [\min (1,X_n)]\to 0.\]
Hint: Recall Markov’s inequality (Lemma 4.2.3).
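For intuition (a numeric illustration with hypothetical helper names, not part of the exercise), the sequences of Exercise 7.4 make the equivalence concrete: there \(X_n\) has a single nonzero value \(v_n\), so \(\E [\min (1,X_n)] = \min (1,v_n)\,\P [X_n=v_n]\).

```python
def e_min1(value, prob):
    """E[min(1, X)] for X taking `value` with probability `prob`, else 0."""
    return min(1, value) * prob

# Case (b) of Exercise 7.4: X_n = n with probability 1/n.
seq_b = [e_min1(n, 1 / n) for n in range(1, 1001)]
# Case (d): X_n = sqrt(n) with probability 1/n.
seq_d = [e_min1(n**0.5, 1 / n) for n in range(1, 1001)]
print(seq_b[-1], seq_d[-1])  # both tend to 0, matching X_n -> 0 in probability
```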
-
On characteristic functions and central limit theorem \((\Delta )\)
-
-
7.9 (a) Let \(X\) be a random variable with the \(N(0,1)\) distribution. Let \(\phi \) be the characteristic function of \(X\). Show that \(\phi '(u) = -u\phi (u)\) and hence show that \(\phi (u)=e^{-u^2/2}\).
Hint: Use the result of Exercise 4.17 to show that the real and imaginary parts of \(u \mapsto \phi (u)\) are differentiable.
-
(b) Extend part (a) to cover \(X\sim N(\mu ,\sigma ^2)\).
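A numerical sanity check of the target identity (a sketch; the midpoint-rule integrator and its tolerances are choices of illustration): the imaginary part of \(\E [e^{iuX}]\) vanishes by symmetry, so only the cosine integral is computed.

```python
import math

def phi_numeric(u, h=0.001, L=10.0):
    """Midpoint-rule approximation of E[cos(uX)] for X ~ N(0,1),
    truncating the integral to [-L, L]."""
    n = int(2 * L / h)
    total = 0.0
    for k in range(n):
        x = -L + (k + 0.5) * h
        total += math.cos(u * x) * math.exp(-x * x / 2)
    return total * h / math.sqrt(2 * math.pi)

for u in (0.0, 0.5, 1.0, 2.0):
    print(u, phi_numeric(u), math.exp(-u * u / 2))  # the two columns agree
```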
-
-
7.10 The first central limit theorem to be established was due to de Moivre and Laplace. In this case each \(X_{n}\) takes only two values, \(\P [X_n=1]=p\) and \(\P [X_n=-1]=1-p\), where \(p\in [0,1]\). Write down the theorem in this special case, in terms of \(S_{n} = X_{1} + X_{2} + \cdots + X_{n}\), and explain how it can be used to justify normal approximations to the binomial distribution.
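As a sketch of the approximation in use (the enumeration helper and the continuity correction are choices of illustration, not from the text): writing \(S_n = 2H - n\) with \(H\sim \mathrm{Bin}(n,p)\), exact probabilities for the \(\pm 1\) walk can be compared against the limiting normal.

```python
import math

def walk_cdf(n, p, t):
    """Exact P[S_n <= t] for S_n a sum of n iid ±1 steps with P[step = 1] = p;
    here S_n = 2H - n where H ~ Binomial(n, p)."""
    hmax = math.floor((t + n) / 2)
    return sum(math.comb(n, h) * p**h * (1 - p)**(n - h)
               for h in range(0, min(hmax, n) + 1))

def normal_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

n, p = 400, 0.5
mu, sigma = n * (2 * p - 1), math.sqrt(4 * n * p * (1 - p))
for t in (-20, 0, 20):
    # S_n lives on even integers here, so t + 1 serves as a continuity correction
    print(t, walk_cdf(n, p, t), normal_cdf((t + 1 - mu) / sigma))
```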
-
7.11 The following result, which you do not need to prove, comes from real analysis.
Dini’s Theorem. Let \(a<b\). For each \(n\in \N \) let \(f_n:[a,b]\to \R \) be a continuous function and suppose that \(f_n\leq f_{n+1}\) for all \(n\). If the pointwise limit \(f_n\to f\) exists, and if \(f:[a,b]\to \R \) is continuous, then \(f_n\to f\) uniformly.
Let \(f_n(x)=(1+\frac {x}{n})^n\). Use the AM-GM inequality from Exercise 6.4 to show that \(f_n(x)\leq f_{n+1}(x)\) for \(x\geq 0\) and all \(n\in \N \). Hence, use Dini’s theorem to prove Lemma 7.5.2.
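A numerical spot-check (not a proof, and no substitute for the AM-GM argument) that the \(f_n\) are indeed nondecreasing in \(n\) for \(x\geq 0\):

```python
def f(n, x):
    return (1 + x / n) ** n

# Monotonicity in n at a few sample points; Dini's theorem then upgrades the
# pointwise convergence f_n -> exp to uniform convergence on compact intervals.
ok = all(f(n, x) <= f(n + 1, x) + 1e-12
         for n in range(1, 50) for x in (0.0, 0.5, 1.0, 2.0, 5.0))
print(ok)
```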
Challenge questions
-
-
7.12 (a) Let \((X_n)\) be a sequence of random variables and let \(c\in \R \) be deterministic. Suppose that \(X_n\stackrel {d}{\to } c\). Show that \(X_n\stackrel {\P }{\to } c\).
-
(b) Let \((X_n)\) be a sequence of independent random variables and suppose that \(X_n\stackrel {\P }{\to } X\). Show that there exists deterministic \(c\in \R \) such that \(\P [X=c]=1\).
-
-
7.13 A sequence \((X_n)\) of random variables is said to converge completely to the random variable \(X\) if
\[ \sum _{n} \P [|X_n-X|\geq \eps ]<\infty \quad \text { for all }\eps >0. \]
-
(a) Show that for sequences of independent random variables, complete convergence is equivalent to almost sure convergence.
-
(b) Find a sequence of (dependent) random variables \((X_n)\) that converges almost surely but not completely.
-