Stochastic Processes and Financial Mathematics
(part one)
7.5 Exercises on Chapter 7
On the martingale transform
-
7.1 Let \(S_n=\sum _{i=1}^n X_i\) be the symmetric random walk, from Section 7.4. In each of the following cases, establish the given formula for \((C\circ S)_n\).
-
(a) If \(C_n=0\), show that \((C\circ S)_n=0\).
-
(b) If \(C_n=1\), show that \((C\circ S)_n=S_n\).
-
(c) If \(C_n=S_n\), show that \((C\circ S)_n=\frac {S_n^2}{2}-\frac {n}{2}\).
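The identity in part (c) holds pathwise, so it can be checked numerically. The sketch below (Python, using the convention \((C\circ S)_n=\sum _{i=1}^n C_{i-1}(S_i-S_{i-1})\) with \(C_0=S_0=0\); the function name is my own) simulates one sample path of the walk and verifies the formula along it.

```python
import random

def transform(C, S):
    # Martingale transform: (C ∘ S)_n = sum_{i=1}^n C_{i-1} * (S_i - S_{i-1}).
    return [sum(C[i - 1] * (S[i] - S[i - 1]) for i in range(1, n + 1))
            for n in range(len(S))]

random.seed(0)
S = [0]
for _ in range(1000):
    S.append(S[-1] + random.choice([-1, 1]))  # symmetric random walk

CS = transform(S, S)  # part (c): take C_n = S_n
# Pathwise identity (C ∘ S)_n = S_n^2 / 2 - n / 2:
assert all(CS[n] == S[n] ** 2 / 2 - n / 2 for n in range(len(S)))
```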
-
-
7.2 Let \(M_n\) be a stochastic process. Let \(\alpha ,\beta \in \R \) and let \(C_n\) and \(D_n\) be adapted stochastic processes. Let \(X_n=\alpha C_n+\beta D_n\). Show that
\[(X \circ M)_n=\alpha (C\circ M)_n+\beta (D\circ M)_n\]
for all \(n\).
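Linearity here holds path by path, directly from the definition of the transform; a quick numerical sanity check (a sketch, with all names my own choices):

```python
import random

def transform(C, M):
    # (C ∘ M)_n = sum_{i=1}^n C_{i-1} * (M_i - M_{i-1})
    return [sum(C[i - 1] * (M[i] - M[i - 1]) for i in range(1, n + 1))
            for n in range(len(M))]

random.seed(1)
N = 200
M = [0]
for _ in range(N):
    M.append(M[-1] + random.choice([-1, 1]))
C = [random.randint(-3, 3) for _ in range(N + 1)]
D = [random.randint(-3, 3) for _ in range(N + 1)]
alpha, beta = 2, -5

X = [alpha * c + beta * d for c, d in zip(C, D)]  # X_n = αC_n + βD_n
lhs = transform(X, M)
rhs = [alpha * a + beta * b for a, b in zip(transform(C, M), transform(D, M))]
assert lhs == rhs  # (X ∘ M)_n = α(C ∘ M)_n + β(D ∘ M)_n for every n
```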
On long-term behaviour of stochastic processes
-
7.3 Let \((X_n)\) be a sequence of independent random variables, with distribution
\[\P [X_n=1]=\P [X_n=-1]=\frac {1}{2n^2}\]
and \(\P [X_n=0]=1-\frac {1}{n^2}\). Define
\[S_n=\sum \limits _{i=1}^n X_i\]
where we take \(S_0=0\).
-
(a) Show that \(S_n\) is a martingale, and deduce that there exists a real-valued random variable \(S_\infty \) such that \(S_n\stackrel {a.s.}{\to }S_\infty \) as \(n\to \infty \).
-
(b) Show that, almost surely, there exists some \(N\in \N \) such that \(X_n=0\) for all \(n\geq N\).
-
-
7.4 Write simulations of the symmetric and asymmetric random walks (in a language of your choice). Add functionality to draw the random walk as a graph, with time on the horizontal axis and the value of the walk on the vertical axis.
Look at several samples from your simulations, with e.g. \(1000\) steps of time, and check that they support the claims made in Section 7.4, about the long-term behaviour of random walks.
Modify your simulation to simulate the random walks in Exercise 7.3 and Question 2 of Assignment 3. Check that your graphs support the result that, in both cases, \(S_n\stackrel {a.s.}{\to }S_\infty \). From your graphs, do you notice a difference in behaviour between these two cases?
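A minimal starting point, in Python (function and parameter names are my own choices; drawing the graph is left to a plotting library such as matplotlib):

```python
import random

def random_walk(n_steps, p=0.5, seed=None):
    # Simple random walk started from 0: up-step +1 w.p. p, down-step -1 w.p. 1-p.
    rng = random.Random(seed)
    path = [0]
    for _ in range(n_steps):
        path.append(path[-1] + (1 if rng.random() < p else -1))
    return path

sym = random_walk(1000, p=0.5, seed=42)   # symmetric case
asym = random_walk(1000, p=0.3, seed=42)  # asymmetric case with q > p: drifts downwards
# To draw the graph: import matplotlib.pyplot as plt; plt.plot(sym); plt.show()
```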
-
7.5 Let \(M_n=S_n-L_n\) be the martingale defined in Exercise 4.5. Show that \((M_n)\) is not uniformly bounded in \(L^1\).
-
7.6 Recall the Galton-Watson process \((Z_n)\) from Section 7.4, and recall that it is parametrized by its offspring distribution \(G\).
-
(a) Give an example of an offspring distribution \(G\) for which \(\P [Z_n\text { dies out}]=1\).
-
(b) Give an example of an offspring distribution \(G\) for which \(\P [Z_n\text { dies out}]=0\).
-
-
7.7 Consider the following modification of the Pólya urn process. At time \(n=0\), the urn contains one red ball and one black ball. Then, at each time \(n=1,2,\ldots \), we draw a ball from the urn. We place this ball back into the urn, and add one ball of the opposite colour to the urn; so if we drew a red ball, we would add a black ball, and vice versa.
Therefore, at time \(n\) (which means: after the \(n^{th}\) draw is complete) the urn contains \(n+2\) balls. Let \(B_n\) denote the number of red balls in the urn at time \(n\), and let \(M_n=\frac {B_n}{n+2}\) denote the fraction of red balls in the urn at time \(n\).
-
(a) Calculate \(\E [M_{n+1}\mid \mc {F}_n]\) and hence show that \(M_n\) is not a martingale, with respect to the filtration \(\mc {F}_n=\sigma (B_1,\ldots ,B_n)\).
-
(b) Write a simulation of the urn (in a language of your choice) and use your simulation to make a conjecture about the value of the almost sure limit of \(M_n\) as \(n\to \infty \). Does this limit depend on the initial state of the urn?
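One possible simulation sketch for part (b), in Python (names are my own choices). The samples cluster near \(1/2\) whatever the initial composition, which suggests a conjecture for the almost sure limit:

```python
import random

def opposite_colour_urn(n_draws, red=1, black=1, seed=None):
    # Draw a ball, replace it, and add one ball of the OPPOSITE colour.
    rng = random.Random(seed)
    for _ in range(n_draws):
        if rng.random() < red / (red + black):
            black += 1          # drew red -> add a black ball
        else:
            red += 1            # drew black -> add a red ball
    return red / (red + black)  # fraction of red balls, M_n

# Two different initial states; the long-run fraction looks the same.
fractions = ([opposite_colour_urn(10_000, seed=s) for s in range(10)]
             + [opposite_colour_urn(10_000, red=5, black=1, seed=s) for s in range(10)])
# Every sample ends close to 1/2.
```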
-
-
7.8 Consider an urn that, at time \(n=0\), contains \(K\geq 1\) balls, each of which is either black or red. At each time \(n=1,2,\ldots \), we do the following, in order:
-
1. Draw a ball \(X_1\) from the urn, and record its colour. Place \(X_1\) back into the urn.
-
2. Draw a ball \(X_2\) from the urn, and discard it.
-
3. Place a new ball, with the same colour as \(X_1\), into the urn.
Thus, for all time, the urn contains exactly \(K\) balls. We write \(M_n\) for the fraction of red balls in the urn, after the \(n^{th}\) iteration of the above steps is complete.
-
(a) Show that \(M_n\) is a martingale.
-
(b) Show that there exists a random variable \(M_\infty \) such that \(M_n\stackrel {a.s.}{\to }M_\infty \) as \(n\to \infty \), and deduce that \(\P [M_\infty =0\text { or }M_\infty =1]=1\).
This process is known as the discrete time ‘Moran model’, and is a model for the evolution of a population that contains a fixed number \(K\) of individual organisms, represented as \(K\) balls. At each time \(n\), an individual \(X_2\) is chosen and dies, while an individual \(X_1\) is chosen and reproduces.
Although this model is a highly simplified version of reality, with careful enough application it turns out to be very useful. For example, it is the basis for current methods of reconstructing genealogical trees from data obtained by genome sequencing.
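A possible simulation of the Moran model (a sketch; names are my own choices). Runs of moderate length already illustrate the absorption at \(0\) or \(1\) predicted by part (b):

```python
import random

def moran(K, red, n_steps, seed=None):
    # Discrete-time Moran model: X1 reproduces (drawn with replacement),
    # X2 dies (drawn independently); the urn always holds K balls.
    rng = random.Random(seed)
    for _ in range(n_steps):
        if red in (0, K):
            break  # absorbed: the urn is all one colour and never changes again
        x1_red = rng.random() < red / K   # colour of the reproducing ball
        x2_red = rng.random() < red / K   # colour of the dying ball
        red += int(x1_red) - int(x2_red)
    return red / K  # fraction of red balls, M_n

limits = [moran(K=20, red=10, n_steps=20_000, seed=s) for s in range(20)]
# Almost every run ends at exactly 0.0 or 1.0.
```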
-
-
7.9 Consider the Pólya urn process from Section 7.4. Suppose that we begin our urn, at time \(n=0\), with two red balls and one black ball. Let \(M_n\) denote the resulting fraction of red balls in the urn at time \(n\).
-
(a) Show that \(M_n\) does not converge almost surely to \(0\).
-
(b) Write a simulation of the Pólya urn process (in a language of your choice) and compare the effect of different initial conditions on \(M_\infty \).
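A sketch of the simulation for part (b) (Python; names are my own choices). Averaging many independent runs suggests that the law of \(M_\infty \) does depend on the initial state of the urn:

```python
import random

def polya(n_draws, red, black, seed=None):
    # Classical Pólya urn: draw a ball, replace it, add one ball of the SAME colour.
    rng = random.Random(seed)
    for _ in range(n_draws):
        if rng.random() < red / (red + black):
            red += 1
        else:
            black += 1
    return red / (red + black)

# Compare the empirical limit for two initial conditions.
sample_11 = [polya(2000, 1, 1, seed=s) for s in range(500)]
sample_21 = [polya(2000, 2, 1, seed=1000 + s) for s in range(500)]
mean_11 = sum(sample_11) / len(sample_11)  # close to 1/2
mean_21 = sum(sample_21) / len(sample_21)  # noticeably larger, close to 2/3
```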
-
-
7.10 Let \((S_n)\) denote the simple asymmetric random walk, with \(q>p\). In Exercise 4.2 we showed that
\[M_n = (q/p)^{S_n}\]
is a martingale.
-
(a) Show that there exists a real-valued random variable \(M_\infty \) such that \(M_n\stackrel {a.s.}{\to } M_\infty \).
-
(b) Deduce that \(\P [M_\infty =0]=1\) and that \((M_n)\) is not uniformly bounded in \(L^2\).
-
(c) Use (a) and (b) to show that \(S_n\stackrel {a.s.}{\to } -\infty \). Explain briefly why this means that \(S_n\stackrel {a.s.}{\to }\infty \) for asymmetric random walks with \(p>q\).
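The claims in parts (b) and (c) can be illustrated by simulation; a sketch (names are my own choices), taking \(p=0.4\), \(q=0.6\) as an example:

```python
import random

def asymmetric_walk(n_steps, p, seed=None):
    # P[step = +1] = p, P[step = -1] = q = 1 - p.
    rng = random.Random(seed)
    s = 0
    for _ in range(n_steps):
        s += 1 if rng.random() < p else -1
    return s

p, q = 0.4, 0.6  # q > p: the walk drifts downwards
finals = [asymmetric_walk(10_000, p, seed=s) for s in range(10)]
# S_n -> -infinity along each path, so M_n = (q/p)^{S_n} -> 0.
martingale_values = [(q / p) ** s for s in finals]
```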
-
-
7.11 Let \(S_n\) denote the symmetric random walk, from Section 7.4. Recall that \(S_0=0\).
-
(a) Show that \(S_n\) is even when \(n\) is even, and odd when \(n\) is odd.
-
(b) Define \(p_{n}=\P [S_{n}=0]\). Show that \(p_{2n}=\binom {2n}{n}2^{-2n}\) and \(p_{2n+1}=0\).
(Hint: Count the number of ways to return to zero after precisely \(2n\) steps.)
-
(c) Show that \(p_{2(n+1)}=\big (1-\frac {1}{2(n+1)}\big )p_{2n}\) for all \(n\), and hence show that \(p_{2n}\to 0\) as \(n\to \infty \).
(Hint: Use the inequality \(1-x\leq e^{-x}\), which holds for all \(x\geq 0\).)
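The formula in part (b) and the recursion in part (c) can be verified by exact rational arithmetic; a short check in Python:

```python
from fractions import Fraction
from math import comb

def p_return(two_n):
    # Exact value of p_{2n} = P[S_{2n} = 0] = C(2n, n) * 2^{-2n}.
    n = two_n // 2
    return Fraction(comb(2 * n, n), 4 ** n)

# Check the recursion p_{2(n+1)} = (1 - 1/(2(n+1))) * p_{2n} for n = 0, ..., 49.
for n in range(50):
    assert p_return(2 * (n + 1)) == (1 - Fraction(1, 2 * (n + 1))) * p_return(2 * n)

# p_{2n} -> 0, but slowly: for example p_100 is still about 0.08.
print(float(p_return(100)))
```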
-
-
7.12 Let \(S_n\) denote the symmetric random walk, from Section 7.4. Let \(f:\N \to \N \) be a deterministic function.
-
(a) Show that if \(\frac {S_n}{f(n)}\) is a martingale (for \(n\geq 1\)), then \(f\) is constant.
-
(b) Show that if \(\frac {S_n}{f(n)}\) is a supermartingale (for \(n\geq 1\)), then \(f\) is constant.
-
Challenge questions
-
7.13 In this question we establish the formula (7.10), which we used in the proof of Lemma 7.4.8.
Let \((Z_n)\) be the Galton-Watson process, with the offspring distribution \(G\). Suppose that \(\E [G]=\mu \) and \(\var (G)=\sigma ^2<\infty \). Set \(M_n=\frac {Z_n}{\mu ^n}\).
-
(a) Show that
\[\E [(M_{n+1}-M_n)^2\mid \mc {F}_n]=\frac {Z_n\sigma ^2}{\mu ^{2(n+1)}}.\]
-
(b) Deduce from part (a) and Exercise 3.6 that \(\E [M_{n+1}^2]=\E [M_n^2]+\frac {\sigma ^2}{\mu ^{n+2}}.\)
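The recursion in part (b) can be checked by exact computation in a small example. The sketch below uses a hypothetical offspring distribution on \(\{0,1,2\}\) (my own choice, not from the text) and computes the law of \(Z_n\) by convolution, in exact rational arithmetic:

```python
from fractions import Fraction

# Hypothetical offspring distribution G: P[G = k] = offspring[k].
offspring = {0: Fraction(1, 4), 1: Fraction(1, 4), 2: Fraction(1, 2)}
mu = sum(k * p for k, p in offspring.items())                   # E[G] = 5/4
sigma2 = sum(k * k * p for k, p in offspring.items()) - mu ** 2  # var(G) = 11/16

def convolve(d1, d2):
    # Distribution of the sum of two independent variables with laws d1, d2.
    out = {}
    for a, pa in d1.items():
        for b, pb in d2.items():
            out[a + b] = out.get(a + b, 0) + pa * pb
    return out

def next_generation(dist):
    # Law of Z_{n+1} = sum of Z_n i.i.d. offspring counts, given the law of Z_n.
    out = {}
    for z, pz in dist.items():
        d = {0: Fraction(1)}
        for _ in range(z):
            d = convolve(d, offspring)
        for k, pk in d.items():
            out[k] = out.get(k, 0) + pz * pk
    return out

def second_moment_M(dist, n):
    # E[M_n^2] where M_n = Z_n / mu^n.
    return sum((Fraction(z) / mu ** n) ** 2 * p for z, p in dist.items())

dist = {1: Fraction(1)}  # Z_0 = 1
for n in range(3):
    nxt = next_generation(dist)
    lhs = second_moment_M(nxt, n + 1)
    rhs = second_moment_M(dist, n) + sigma2 / mu ** (n + 2)
    assert lhs == rhs  # exact equality, for n = 0, 1, 2
    dist = nxt
```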
-
-
7.14 In this question we prove the inequality (7.4), which we used in the proof of the martingale convergence theorem.
Let \(M_n\) be a sequence of random variables such that \(M_n\stackrel {a.s.}{\to }M_\infty \). Define
\[X_n=\inf _{k\geq n} |M_k|.\]
-
(a) Explain why \((X_n)\) is an increasing sequence and, hence, why there is a random variable \(X_\infty \) such that \(X_n\stackrel {a.s.}{\to }X_\infty \).
-
(b) Show that, for all \(\epsilon >0\) and all \(n\in \N \) there exists some \(n'\geq n\) such that
\[|M_{n'}|-\epsilon \leq X_n\leq |M_n|.\]
-
(c) Deduce that \(X_\infty =|M_\infty |\).
-
(d) Check that the monotone convergence theorem applies to \((X_n)\).
-
(e) Deduce that \(\E [|M_\infty |]\leq \sup _{n\in \N }\E [|M_n|]\).
-
-
7.15 In this question we give a rigorous proof of Lemma 7.4.7.
Let \(Z_n\) and \(X^{n+1}_i\) be as in (7.9) (i.e. \(Z_n\) is a Galton-Watson process) and suppose that \(\E [X^{n+1}_i]=\mu <1\).
Let \(\alpha \in [0,1]\). For each \(i,n\) we define an independent random variable \(C^{n+1}_i\), with the same distribution as \(C\) where \(\P \l [C=1\r ]=\alpha \) and \(\P \l [C=0\r ]=1-\alpha \). We define
\(\seteqnumber{0}{7.}{10}\)\begin{equation} \label {eq:gw_coupling} \wt {X}^{n+1}_i= \begin{cases} 0 & \text { if }X^{n+1}_i=0\text { and }C^{n+1}_i=0\\ 1 & \text { if }X^{n+1}_i=0\text { and }C^{n+1}_i=1\\ X^{n+1}_i & \text { if }X^{n+1}_i\geq 1. \end {cases} \end{equation}
Define \(\wt {Z}_n\) by setting \(\wt {Z}_0=1\), and then using (7.9) with \(\wt {Z}_n\) in place of \(Z_n\) and \(\wt {X}^{n+1}_i\) in place of \(X^{n+1}_i\). Define \(f:[0,1]\to \R \) by \(f(\alpha )=\E [\wt {X}^{n+1}_i]\).
-
(a) Convince yourself that \(\wt {Z}_n\) is a Galton-Watson process, with offspring distribution given by (7.11).
-
(b) Explain briefly why \(0\leq Z_n\leq \wt {Z}_n\) for all \(n\).
-
(c) Show that \(f(0)<1\) and \(f(1)\geq 1\). Deduce that there exists a value \(\alpha \in [0,1]\) such that \(f(\alpha )=1\).
-
(d) Show that \(\P [Z_n\text { dies out}]=1\).
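The coupling can also be explored by simulation. The sketch below (Python; the offspring law and the value of \(\alpha \) are my own illustrative choices, with \(\alpha \) small enough that \(\wt {Z}_n\) stays subcritical) generates both processes from shared randomness and checks the domination in part (b) along the path:

```python
import random

rng = random.Random(7)

# Subcritical offspring law (hypothetical example):
# P[X = 0] = 1/2, P[X = 1] = 1/4, P[X = 2] = 1/4, so mu = 3/4 < 1.
def offspring():
    u = rng.random()
    return 0 if u < 0.5 else (1 if u < 0.75 else 2)

alpha = 0.4  # then E[X~] = 3/4 + alpha/2 = 0.95 < 1, still subcritical

def coupled_step(z, z_tilde):
    # One offspring variable X (and coin C) per potential parent.
    # The first z parents are shared by both processes, so Z <= Z~ is preserved.
    new_z, new_zt = 0, 0
    for i in range(z_tilde):
        x = offspring()
        c = 1 if rng.random() < alpha else 0
        x_tilde = c if x == 0 else x   # the coupling defined in the exercise
        if i < z:
            new_z += x
        new_zt += x_tilde
    return new_z, new_zt

z, zt = 1, 1  # Z_0 = Z~_0 = 1
for _ in range(100):
    z, zt = coupled_step(z, zt)
    assert 0 <= z <= zt  # domination holds along the whole path
```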
-