9.5 In higher dimensions \(\offsyl \)
We will look briefly at the behaviour of simple symmetric random walks in higher dimensions. This section is off-syllabus and is included for interest. In particular, we will consider versions of Lemma 9.3.1 in dimension \(d=2\) and \(d\geq 3\).
We must first explain what the simple symmetric random walk is, in higher dimensions. It will suffice to describe \(d=2\) to make the principle clear. Then, for each \(n\in \N \), \(S_n\) is a random element of \(\Z ^2\), which we write as \(S_n=(S_{n}^{(1)},S_n^{(2)})\). The movements are given by \(S_n=\sum _{i=1}^n X_i\) where the steps \(X_i\) are i.i.d. with \(\P [X_i=(0,1)]=\P [X_i=(0,-1)]=\P [X_i=(1,0)]=\P [X_i=(-1,0)]=\frac 14\). These four possibilities correspond to movements by precisely one unit of space in the four possible directions: north, south, east and west.
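For those who like to experiment, the definition is easy to simulate. The Python sketch below is purely illustrative; the function name \texttt{simulate\_walk\_2d}, the seed and the horizon are our own choices. It samples a path of the two dimensional walk and records the first time, if any, at which the walk returns to the origin within the simulated horizon.
\begin{verbatim}
# Minimal illustration (not part of the notes): simulate the two dimensional
# simple symmetric random walk S_n = X_1 + ... + X_n, where each step X_i is
# one of (0,1), (0,-1), (1,0), (-1,0) with probability 1/4.
import random

def simulate_walk_2d(n_steps, seed=0):
    rng = random.Random(seed)
    steps = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # north, south, east, west
    position = (0, 0)
    path = [position]
    for _ in range(n_steps):
        dx, dy = rng.choice(steps)
        position = (position[0] + dx, position[1] + dy)
        path.append(position)
    return path

# First return time to the origin within the simulated horizon (None if the
# walk has not yet returned -- a longer run may be needed).
path = simulate_walk_2d(10000)
R = next((n for n, p in enumerate(path[1:], start=1) if p == (0, 0)), None)
print("first return within 10000 steps:", R)
\end{verbatim}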
Let \(R=\inf \{n\geq 1 : S_n=\v {0}\}\) denote the time taken to return to the origin \(\v {0}=(0,0)\). In two dimensions we have:
Lemma 9.5.1 Let \(d=2\). Then \(\P [R<\infty ]=1\).
Proof: We may use much the same argument as in Lemma 9.3.1. The only difference is that we must replace the estimate (9.9) with a two dimensional version, as follows. We have
\(\seteqnumber{0}{9.}{11}\)\begin{align} \P [S_{2n}=(0,0)] &= \sum _{i=0}^n\binom {2n}{2i}\binom {2i}{i}\binom {2n-2i}{n-i} 4^{-2n} \notag \\ &= \binom {2n}{n}\sum _{k=0}^n\binom {n}{k}^2 4^{-2n} \notag \\ &= \l [\binom {2n}{n}2^{-2n}\r ]^2 \notag \\ &\sim \frac {1}{\pi n}. \label {eq:rw_return_time_d_2_eq_1} \end{align} In the above, the first line comes from counting the number of possible ways in which the two dimensional walk can start at \(0\) and return to \(0\) after exactly \(2n\) steps (we did the one dimensional version in Exercise 7.11). To do so, the walk must have made the same number of steps north as it did south, and the same number of steps east as it did west. We write \(2i\) for the number of east-west steps and \(2j=2n-2i\) for the number of north-south steps. The first binomial factor counts which steps are east-west and which are north-south. The second and third, respectively, count which of the east-west steps are to the east, and which of the north-south steps are to the north, and that fixes everything else about the path.
The second line of the above calculation follows easily from the first, and you should write the factorials out in full to check this for yourself. The third line follows from the second by the combinatorial identity \(\binom {2n}{n}=\sum _{k=0}^n \binom {n}{k}^2\). The final line follows from the same calculation as in the right hand side of (9.9).
With (9.12) in place of (9.9), the rest of the argument for this lemma follows in the same way as that of Lemma 9.3.1. We’ll omit writing out the details again. ∎
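If you want to check (9.12) for yourself, both the identity and the asymptotic \(\frac {1}{\pi n}\) can be verified numerically. The Python sketch below is illustrative only; the helper name \texttt{p\_return\_2d} is our own. It compares the direct count from the first line of (9.12) with the closed form from the third line, and then with \(\frac {1}{\pi n}\).
\begin{verbatim}
# Numerical check (illustration only) of (9.12): the direct count equals
# (binom(2n,n) 2^{-2n})^2, and this is approximately 1/(pi*n) for large n.
from math import comb, pi

def p_return_2d(n):
    direct = sum(comb(2*n, 2*i) * comb(2*i, i) * comb(2*n - 2*i, n - i)
                 for i in range(n + 1)) / 4**(2*n)
    closed_form = (comb(2*n, n) / 2**(2*n))**2
    assert abs(direct - closed_form) < 1e-12
    return closed_form

for n in (10, 100, 1000):
    print(n, p_return_2d(n), 1 / (pi * n))
\end{verbatim}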
We won’t prove it here, but the simple symmetric random walk will take much longer to return to the origin in \(d=2\) than in \(d=1\). This is because in order to reach \((0,0)\) the east-west component \(S^{(1)}_n\) and the north-south component \(S^{(2)}_n\) must return to zero simultaneously, which is unlikely. Perhaps surprisingly, in two dimensions the random walk still manages to return. The picture changes if we move to three dimensions, with the obvious notation: now \(S_n\in \Z ^3\) and there are 6 different directions to move in, each with probability \(\frac 16\). In three dimensions there is now so much space to get lost in that the random walk might not return to the origin.
Lemma 9.5.2 Let \(d=3\). Then \(\P [R<\infty ]<1\).
Sketch of Proof: We will only give a sketch of the argument. We need to make some substantial modifications to the argument used in Lemmas 9.3.1 and 9.5.1, because we are now trying to prove the opposite result, but there are still parts that overlap. In similar style to (9.12) we have
\(\seteqnumber{0}{9.}{12}\)\begin{align*} \P [S_{2n}=(0,0,0)] &= \sum _{i+j+k=n}\frac {(2n)!}{(i!)^2(j!)^2(k!)^2}6^{-2n} \\ &= \binom {2n}{n} 2^{-2n} \sum _{i+j+k=n} \l (\frac {n!}{i!j!k!}3^{-n}\r )^2 \\ &\leq \binom {2n}{n} 2^{-2n} \l (\max _{i+j+k=n} \frac {n!}{i!j!k!}3^{-n}\r )\l (\sum _{i+j+k=n} \frac {n!}{i!j!k!}3^{-n}\r ). \end{align*} The first line again concerns counting possible ways to return to the origin in exactly \(2n\) steps (we’ll omit the details of this) and the next two lines follow by elementary calculations. We have that \(\sum _{i+j+k=n} \frac {n!}{i!j!k!}3^{-n}=1\), because this sums up all the probabilities of outcomes \((i,j,k)\) that sum to \(n\) according to the trinomial distribution (which you might have to look up, if you haven’t seen it before). A bit of calculus with Lagrange multipliers shows that the term \(\frac {n!}{i!j!k!}3^{-n}\), subject to the condition \(i+j+k=n\), is maximised when \(i,j,k\) are all approximately equal to \(\frac {n}{3}\). Approximating \(\frac {n}{3}\) by the nearest integer, we can find that \(\max _{i+j+k=n} \frac {n!}{i!j!k!}3^{-n}\leq \frac {C}{n}\), where \(C\in (0,\infty )\) is a constant. Combining this with the calculation from (9.9) we obtain that \(\P [S_{2n}=\v {0}]\leq \frac {C'}{n^{3/2}}\), where \(C'\in (0,\infty )\) is also constant.
Setting \(G=\sum _{n=1}^\infty \1\{S_n=\v {0}\}\), and noting that \(S_n=\v {0}\) is only possible when \(n\) is even, we now have
\[\E [G]=\sum _{n=1}^\infty \P [S_{2n}=\v {0}]\leq \sum _{n=1}^\infty \frac {C'}{n^{3/2}}<\infty .\]
Hence the expected number of visits to the origin is finite. However, if \(\P [R<\infty ]=1\) then, by the strong Markov property, we would almost surely have infinitely many returns to the origin, which would force \(\E [G]=\infty \). Hence \(\P [R<\infty ]<1.\) ∎
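The bound \(\P [S_{2n}=\v {0}]\leq \frac {C'}{n^{3/2}}\) used above can also be checked numerically, by computing the exact trinomial count from the first line of the display in the sketch of proof. The Python snippet below is illustrative only; the helper name \texttt{p\_return\_3d} is our own.
\begin{verbatim}
# Illustration only: exact values of P[S_{2n}=(0,0,0)] from the trinomial
# count above, together with n^{3/2} * P[S_{2n}=(0,0,0)], which stays bounded
# (consistent with the C'/n^{3/2} bound used in the proof).
from math import factorial

def p_return_3d(n):
    total = 0
    for i in range(n + 1):
        for j in range(n + 1 - i):
            k = n - i - j
            total += factorial(2*n) // (factorial(i)**2 * factorial(j)**2 * factorial(k)**2)
    return total / 6**(2*n)

for n in (5, 10, 20, 40):
    p = p_return_3d(n)
    print(n, p, p * n**1.5)
\end{verbatim}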
Lemma 9.5.2 also holds in all dimensions \(d\geq 3\). To see this, note that the first three coordinates of a \(d\)-dimensional random walk, viewed only at the times when one of them moves, give us a three dimensional random walk. If the \(d\)-dimensional walk returns to the origin then so do its first three coordinates, but Lemma 9.5.2 gives that this has probability less than one.
It is natural to ask what the random walk does do, in three dimensions and higher, as \(n\to \infty \). The answer is easy to guess: it disappears off to infinity and does not come back; that is, \(|S_n|\stackrel {a.s.}{\to }\infty \). This is left for you to prove, as Exercise 9.9.
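A simulation cannot prove this, of course, but it can make it plausible. The Python sketch below is illustrative only; all names and parameter choices are our own. It estimates the fraction of simulated three dimensional walks that return to the origin within \(N\) steps, together with the typical size of \(|S_N|\); you should see the former stay well below one while the latter grows with \(N\).
\begin{verbatim}
# Illustration only: a simulation cannot prove |S_n| -> infinity, but it can
# suggest it.  We estimate the fraction of three dimensional walks that have
# returned to the origin within N steps, and the typical size of |S_N|.
import random

STEPS_3D = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def run_walk_3d(n_steps, rng):
    """Return (returned, final_distance_squared) for one simulated walk."""
    x = y = z = 0
    returned = False
    for _ in range(n_steps):
        dx, dy, dz = rng.choice(STEPS_3D)
        x, y, z = x + dx, y + dy, z + dz
        if (x, y, z) == (0, 0, 0):
            returned = True
    return returned, x*x + y*y + z*z

rng = random.Random(1)
trials = 500
for N in (100, 1000, 5000):
    results = [run_walk_3d(N, rng) for _ in range(trials)]
    frac_returned = sum(r for r, _ in results) / trials
    mean_dist = sum(d**0.5 for _, d in results) / trials
    print(N, "returned:", frac_returned, "mean |S_N|:", round(mean_dist, 1))
\end{verbatim}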