last updated: May 9, 2024

Probability with Measure

5.5 Exercises on Chapter 5

On probability as measure
  • 5.1 Write down probabilistic versions of the following results, using the notation of probability theory that was introduced in Section 5.1. You should use probability in place of measure, random variables in place of measurable functions, expectation in place of integration, etc.

    • (a) The monotone and dominated convergence theorems (Theorems 4.3.1 and 4.6.2).

    • (b) Markov’s and Chebyshev’s inequalities (Lemma 4.2.3 and Exercise 4.3).

    • (c) Theorem 4.4.1.

    • (d) \((\star )\) Fatou’s lemma (Lemma 4.6.3).

  • 5.2 Using the version of Chebyshev’s inequality that you found in Exercise 5.1, show that if \(X\) is a random variable satisfying \(\var (X)<\infty \) then

    \[\P [|X-\E [X]|\geq c]\leq \frac {\var (X)}{c^2}.\]

    Within probability, this is the most common form in which to apply Chebyshev’s inequality.
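    The bound in Exercise 5.2 can be sanity-checked numerically. The sketch below (an illustration, not part of the exercise) draws uniform samples and compares the empirical tail probability \(\P [|X-\E [X]|\geq c]\) with the Chebyshev bound \(\var (X)/c^2\); the choice of Uniform\([0,1]\) and the values of \(c\) are arbitrary.

    ```python
    import random

    # Monte Carlo sanity check of Chebyshev's inequality for X ~ Uniform[0,1]:
    # the empirical tail P[|X - E[X]| >= c] should not exceed Var(X)/c^2.
    random.seed(0)
    n = 100_000
    xs = [random.uniform(0.0, 1.0) for _ in range(n)]

    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n  # empirical variance, ~ 1/12

    for c in (0.2, 0.3, 0.4):
        tail = sum(1 for x in xs if abs(x - mean) >= c) / n
        bound = var / c ** 2
        print(f"c={c}: empirical tail={tail:.4f}, Chebyshev bound={bound:.4f}")
        assert tail <= bound  # holds comfortably for Uniform[0,1]
    ```

    For Uniform\([0,1]\) the bound is quite loose (e.g. at \(c=0.3\) the true tail is \(0.4\) while the bound is about \(0.93\)); Chebyshev trades sharpness for complete generality.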

  • 5.3 Let \(a,b\in \R \) with \(a<b\) and let \(U\) be a continuous uniform random variable on \([a,b]\), which means that the p.d.f. of \(U\) is the function \(f(u)=\1_{(a,b)}(u)\frac {1}{b-a}\). Let \(A\in \mc {B}([a,b])\). Find \(\P [U\in A]\) in terms of the Lebesgue measure of \(A\).
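    As a plausibility check on the answer you should find for Exercise 5.3, the sketch below estimates \(\P [U\in A]\) by simulation for an interval \(A\subseteq [a,b]\) and compares it with \(\lambda (A)/(b-a)\), where \(\lambda \) is Lebesgue measure. The specific interval and endpoints are arbitrary choices for illustration.

    ```python
    import random

    # Estimate P[U in A] for U ~ Uniform[a,b] and A = [lo, hi], and compare
    # with leb(A)/(b-a), the answer suggested by the p.d.f. of U.
    random.seed(1)
    a, b = 0.0, 2.0
    lo, hi = 0.5, 1.25          # A is the interval [lo, hi] inside [a, b]
    n = 200_000
    hits = sum(1 for _ in range(n) if lo <= random.uniform(a, b) <= hi)
    estimate = hits / n
    exact = (hi - lo) / (b - a)  # leb(A)/(b-a) = 0.375
    print(estimate, exact)
    ```

    Of course the exercise asks for general Borel \(A\), which the simulation cannot cover; intervals only make the formula plausible.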

  • 5.4 Let \(X:\Omega \to \R \) be a random variable with cumulative distribution function \(F\).

    • (a) Deduce that \(\P [X > x] = 1 - F(x)\) and \(\P [x < X \leq y] = F(y) - F(x)\) for all \(x < y\).

    • (b) Prove the last part of Lemma 5.2.1: show that \(F(x)\to 0\) as \(x\to -\infty \) and \(F(x)\to 1\) as \(x\to \infty \).

      Hint: Use the same method as for the first two parts of the lemma.

  • 5.5 Let \(X\) be a random variable. Show that there are at most countably many \(x\in \R \) such that \(\P [X=x]>0\).

    Hint: What happens to \(F_X(x)\) at \(x\) such that \(\P [X=x]>0\)?

  • 5.6 Let \((\Omega ,\mc {F},m)\) be a measure space, where \(m(\Omega )<\infty \). From Problem 1.5, recall that \(\P [A]=\frac {m(A)}{m(\Omega )}\) defines a probability measure on \((\Omega ,\mc {F})\). Show that \(\E [X]=\frac {1}{m(\Omega )}\int _\Omega X\,dm\) for all \(X\in \mc {L}^1(\Omega ,\mc {F},m)\).

    Hint: Try simple functions first.

On independence
  • 5.7

    • (a) Let \((A_n)\) be a sequence of independent events. Show that

      \begin{equation} \label {eq:inf_indep} \P \l [\bigcap _{n\in \N } A_n\r ]=\prod _{n=1}^\infty \P [A_n]. \end{equation}

    • (b) Recall that we define independence of a sequence of events \((A_{n})\) in terms of finite subsequences (e.g. as in Section 5.4). An ‘obvious’ alternative definition might be to use (5.5) instead. Why is this not a sensible idea?

  • 5.8

    • (a) Let \(A\) and \(B\) be independent events. Show that their complements \(A^c\) and \(B^c\) are also independent.

    • (b) Let \(X\) and \(Y\) be independent random variables and \(f,g:\R \rightarrow \R \) be Borel measurable. Deduce that \(f(X)\) and \(g(Y)\) are also independent.

  • 5.9

    • (a) Let \(U\) be a random variable such that \(\P [U=-1]=\P [U=1]=\frac 12\) and let \(V\) be a random variable such that \(\P [V=0]=\P [V=1]=\frac 12\), independent of \(U\). Let \(X=UV\) and \(Y=U(1-V)\). Show that \(\E [XY]=\E [X]\E [Y]\) but that \(X\) and \(Y\) are not independent.

    • (b) Let \(X,Y,Z\) be random variables, where \(X\) and \(Y\) are independent of each other with \(\P [X=1]=\P [X=-1]=\P [Y=1]=\P [Y=-1]=\frac 12\), and \(Z=XY\). Show that any pair within \(\{X,Y,Z\}\) is independent, but that \(\{X,Y,Z\}\) is not a set of independent random variables.
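    The phenomenon in part (a) is easy to see by simulation. The sketch below (illustrative only) samples \(U\) and \(V\) as in the exercise and checks that the empirical moments satisfy \(\E [XY]=\E [X]\E [Y]=0\), while also exposing the dependence: \(X\) and \(Y\) are never simultaneously zero.

    ```python
    import random

    # X = UV and Y = U(1-V) are uncorrelated but dependent: XY = U^2 V(1-V) = 0
    # identically, yet exactly one of X, Y is nonzero in every sample.
    random.seed(2)
    n = 100_000
    ex = ey = exy = both_zero = 0
    for _ in range(n):
        u = random.choice([-1, 1])   # P[U=-1] = P[U=1] = 1/2
        v = random.choice([0, 1])    # P[V=0] = P[V=1] = 1/2, independent of U
        x, y = u * v, u * (1 - v)
        ex += x
        ey += y
        exy += x * y
        both_zero += (x == 0 and y == 0)

    print(ex / n, ey / n, exy / n)  # all near (or exactly) 0
    print(both_zero)                # 0: knowing X = 0 forces Y != 0
    ```

    If \(X\) and \(Y\) were independent we would have \(\P [X=0,\,Y=0]=\P [X=0]\P [Y=0]=\frac 14>0\), so the count of zero pairs being exactly zero reflects the dependence.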

On properties of random variables
  • 5.10 Let \(M\in [0,\infty )\). Suppose that \((X_n)\) is a sequence of random variables such that for each \(n\) we have \(|X_n|\leq M\), and suppose that \(X_n\toas X\). Show that \(\E [X_n]\to \E [X]\).

  • 5.11

    • (a) Let \(X\) be a random variable that takes values in \(\N \cup \{0\}\). Explain why \(X = \sum _{i=1}^{\infty }{\1}_{\{X \geq i\}}\) and hence show that

      \[\E [X] = \sum _{i=1}^{\infty }\P [X \geq i].\]

    • (b) Let \(Y\) be a random variable taking values in \([0,\infty )\). Use part (a) to deduce that \(\sum _{k=1}^{\infty }\P [Y \geq k] \;\leq \; \E [Y] \;\leq \; 1 + \sum _{k=1}^{\infty }\P [Y \geq k].\)
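    The tail-sum formula of Exercise 5.11(a) can be verified on simulated data. The sketch below uses a Geometric\((p)\) variable on \(\{1,2,\dots \}\) as a hypothetical worked example (its mean is \(1/p\)); the identity \(X=\sum _i \1_{\{X\geq i\}}\) in fact holds samplewise, so the two empirical quantities agree exactly.

    ```python
    import random

    # Check E[X] = sum_{i>=1} P[X >= i] empirically for X ~ Geometric(p).
    random.seed(3)
    p, n = 0.3, 200_000
    samples = []
    for _ in range(n):
        k = 1
        while random.random() >= p:  # count trials until first success
            k += 1
        samples.append(k)

    mean = sum(samples) / n
    tail_sum = sum(sum(1 for s in samples if s >= i) / n
                   for i in range(1, max(samples) + 1))
    print(mean, tail_sum)  # both close to 1/p = 3.333...
    ```

    Note that `tail_sum` recomputes `mean` in disguise: summing the empirical tails \(\P [X\geq i]\) over \(i\) just counts each sample \(s\) exactly \(s\) times, which is the content of part (a).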

  • 5.12

    • (a) Suppose that \(X\) and \(Y\) are random variables and both \(X^{2}\) and \(Y^{2}\) are in \(L^1\). Prove the Cauchy-Schwarz inequality:

      \[ |\E [XY]| \leq \l (\E [X^{2}]\r )^{\frac {1}{2}}\l (\E [Y^{2}]\r )^{\frac {1}{2}}.\]

      Hint: Consider \(g(t) = \E [(X + tY)^{2}]\) as a quadratic function of \(t \in \R \). Note that a quadratic function \(ax^2+bx+c\) with at most one real root must satisfy \(b^2-4ac\leq 0\).

    • (b) Deduce that if \(X^{2}\in L^1\) then also \(X\in L^1\), and in fact \(|\E [X]|^{2} \leq \E [X^{2}]\).

    • (c) Let \(X\) be any random variable with a finite mean \(\E [X]=\mu \). Show that \(\E [X^{2}] < \infty \) if and only if \(\var (X) < \infty \).
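    A quick numerical illustration of the Cauchy-Schwarz inequality from Exercise 5.12(a): the sketch below takes strongly correlated Gaussian samples (an arbitrary choice, \(Y=X+\) noise) and checks \(|\E [XY]|\leq \E [X^2]^{1/2}\,\E [Y^2]^{1/2}\) for the empirical moments. The inequality holds exactly for the empirical measure, since it is itself a probability measure.

    ```python
    import random

    # Empirical check of |E[XY]| <= sqrt(E[X^2]) * sqrt(E[Y^2]).
    random.seed(4)
    n = 100_000
    pairs = []
    for _ in range(n):
        x = random.gauss(0, 1)
        y = x + 0.5 * random.gauss(0, 1)  # correlated with x by construction
        pairs.append((x, y))

    exy = sum(x * y for x, y in pairs) / n
    ex2 = sum(x * x for x, _ in pairs) / n
    ey2 = sum(y * y for _, y in pairs) / n
    print(abs(exy), (ex2 * ey2) ** 0.5)
    assert abs(exy) <= (ex2 * ey2) ** 0.5  # Cauchy-Schwarz, empirically
    ```

    Pushing the noise level to \(0\) makes \(Y=X\) and turns the inequality into an equality, matching the degenerate case of the quadratic in the hint having a double root.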

  • 5.13 A random variable \(X\) is said to have an \(a\)th exponential moment if \(\E [e^{a|X|}] < \infty \), where \(a>0\).

    • (a) Let \(X\) be a non-negative random variable and \(a > 0\). Show that \(\E [e^{-aX} ] \leq 1\).

    • (b) Let \(X\) be a random variable with an \(a\)th exponential moment for some \(a>0\). Show that \(\E [|X|^{n}] < \infty \) for all \(n\in \N \).

  • 5.14 Let \(X\) be a real-valued random variable with law \(p_{X}\) defined on a probability space \((\Omega , {\cal F}, \P )\). Show that for all bounded measurable functions \(f:\R \rightarrow \R \),

    \[ \int _{\Omega }f(X(\omega ))\,d\P (\omega ) = \int _{\R }f(x)\,dp_{X}(x).\]

    What can you say about these integrals when \(f\) is non-negative but not necessarily bounded?

    Hint: Begin with \(f\) an indicator function, then extend to simple, bounded non-negative and general bounded measurable functions.
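    The identity in Exercise 5.14 (often called the transfer or change-of-variables formula) can be illustrated numerically. In the sketch below, a hypothetical example takes \(X\sim N(0,1)\) and \(f(x)=\cos x\): the left side is a Monte Carlo average over \(\Omega \), while the right side is computed by numerically integrating \(f\) against the density of the law \(p_X\); both should approximate \(e^{-1/2}\).

    ```python
    import math
    import random

    # Left side: E[f(X)] as a Monte Carlo average of cos(X) for X ~ N(0,1).
    random.seed(5)
    n = 200_000
    lhs = sum(math.cos(random.gauss(0, 1)) for _ in range(n)) / n

    # Right side: midpoint Riemann sum of cos(x) * standard normal density
    # over [-8, 8] (the tails beyond are negligible).
    m, a, b = 20_000, -8.0, 8.0
    dx = (b - a) / m
    rhs = sum(math.cos(a + (i + 0.5) * dx)
              * math.exp(-((a + (i + 0.5) * dx) ** 2) / 2)
              / math.sqrt(2 * math.pi)
              for i in range(m)) * dx

    print(lhs, rhs, math.exp(-0.5))  # all three approximately 0.6065
    ```

    The point of the exercise, of course, is that the equality holds for every bounded measurable \(f\), which the hint's indicator-to-simple-to-general approach establishes without any computation.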

Challenge questions
  • 5.15

    • (a) Let \(\eps >0\). Let \((E_n)\) be a sequence of independent events such that \(\P [E_n]\geq \eps \) for all \(n\in \N \). Show that \(\P [\cup _{n\in \N } E_n]=1\).

    • (b) Let \((\Omega ,\mc {F},\P )\) be a probability space and let \(\eps >0\). Suppose that \((E_n)_{n\in \N }\) is a sequence of independent events, with \(\P [E_n]\in (\eps ,1-\eps )\) for all \(n\in \N \). Show that \(\P [\{\omega \}]=0\) for all \(\omega \in \Omega \) and, hence, deduce that \(\Omega \) is uncountable.