Stochastic Processes and Financial Mathematics
(part one)
3.3 Martingales
In this section we introduce martingales, which are the mathematical representation of a ‘fair game’. As usual, let \((\Omega ,\mc {F} ,\P )\) be a probability space. We refer to a sequence of random variables \((S_n)_{n=0}^\infty \) as a stochastic process. In this section of the course we only deal with discrete time stochastic processes.
We have previously discussed the idea of gradually learning more and more information about the outcome of some experiment, through seeing the information visible from gradually larger \(\sigma \)-fields. We formalize this concept as follows.
We should think of the filtration \(\mc {F}_n\) as telling us which information we have access to at time \(n=1,2,\ldots \). Thus, an adapted process is a process whose (random) value we know at all times \(n\in \N \).
We are now ready to give the definition of a martingale.
Definition 3.3.3 A process \(M=(M_n)_{n=0}^\infty \) is a martingale if

1. \((M_n)\) is adapted,

2. \(M_n\in L^1\) for all \(n\),

3. \(\E [M_{n+1}|\mc {F}_{n}]=M_{n}\) almost surely, for all \(n\).

We say that \(M\) is a submartingale if, instead of 3, we have \(\E [M_{n+1}|\mc {F}_{n}]\geq M_{n}\) almost surely.
We say that \(M\) is a supermartingale if, instead of 3, we have \(\E [M_{n+1}|\mc {F}_{n}]\leq M_{n}\) almost surely.
Remark 3.3.4 The second condition in Definition 3.3.3 is needed for the third to make sense.
A martingale is the mathematical idealization of a fair game. It is best to understand what we mean by this through an example.
Let \((X_n)\) be a sequence of i.i.d. random variables such that
\[\P [X_i=1]=\P [X_i=-1]=\frac {1}{2}.\]
Define \(\mc {F}_n=\sigma (X_1,\ldots ,X_n)\) and \(\mc {F}_0=\{\emptyset ,\Omega \}\). Then \((\mc {F}_n)\) is a filtration. Define
\[S_n=\sum \limits _{i=1}^n X_i\]
(and \(S_0=0\)). We can think of \(S_n\) as a game in the following way. At each time \(n=1,2,\ldots \) we toss a coin. We win the \(n^{th}\) round if the coin shows heads, and lose if it shows tails. Each time we win we score \(1\); each time we lose we score \(-1\). Thus, \(S_n\) is our score after \(n\) rounds. The process \(S_n\) is often called a simple random walk.
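The simple random walk is easy to simulate. The following Python sketch (my own illustration; the function name is not from the notes) generates one path \(S_0,S_1,\ldots,S_n\).

```python
import random

def simple_random_walk(n_steps):
    """Generate the path S_0, S_1, ..., S_n of a simple random walk.

    Each increment X_i is +1 or -1, each with probability 1/2.
    """
    path = [0]  # S_0 = 0
    for _ in range(n_steps):
        x = random.choice([1, -1])   # fair coin: heads -> +1, tails -> -1
        path.append(path[-1] + x)    # S_n = S_{n-1} + X_n
    return path

random.seed(0)
print(simple_random_walk(10))
```

Each run produces a different path; only the distribution of the path is fixed.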
We claim that \(S_n\) is a martingale. To see this, we check the three properties in the definition. (1) Since \(X_1,X_2,\ldots ,X_n\in m\sigma (X_1,\ldots ,X_n)\) we have that \(S_n\in m\mc {F}_n\) for all \(n\in \N \), so \((S_n)\) is adapted. (2) Since \(|S_n|\leq n\) for all \(n\in \N \), we have \(\E [|S_n|]\leq n\) for all \(n\), so \(S_n\in L^1\) for all \(n\). (3) We have
\begin{align*} \E [S_{n+1}|\mc {F}_{n}]&=\E [X_{n+1}|\mc {F}_n]+\E [S_n|\mc {F}_n]\\ &=\E [X_{n+1}]+S_n\\ &=S_n. \end{align*} Here, in the first line we used the linearity of conditional expectation. To deduce the second line we used the relationship between independence and conditional expectation (for the first term) and the measurability rule (for the second term). To deduce the final line we used that \(\E [X_{n+1}]=(1)\frac 12+(-1)\frac 12=0\).
At time \(n\) we have seen the result of rounds \(1,2,\ldots ,n\), so the information we currently have access to is given by \(\mc {F}_n\). This means that at time \(n\) we know \(S_1,\ldots ,S_n\). But we don’t know \(S_{n+1}\), because \(S_{n+1}\) is not \(\mc {F}_n\)-measurable. However, using our current information we can make our best guess at what \(S_{n+1}\) will be, which naturally is \(\E [S_{n+1}|\mc {F}_n]\). Since the game is fair, in the future, on average we do not expect to win more than we lose, that is \(\E [S_{n+1}|\mc {F}_n]=S_n\).
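This fairness property can be checked numerically: fix the outcome of the first \(n\) rounds (that is, condition on \(\mc{F}_n\)) and average \(S_{n+1}\) over many independent continuations. A minimal sketch, not part of the notes, using a hypothetical observed prefix:

```python
import random

random.seed(1)

# Fix an outcome of the first five rounds, so that F_5 (and hence S_5) is known.
prefix = [1, -1, 1, 1, -1]   # observed values of X_1, ..., X_5
s_n = sum(prefix)            # here S_5 = 1

# Conditionally on this prefix, S_6 = S_5 + X_6, so averaging S_6 over many
# independent draws of X_6 estimates E[S_6 | F_5].
trials = 100_000
estimate = sum(s_n + random.choice([1, -1]) for _ in range(trials)) / trials

print(s_n, round(estimate, 3))  # the estimate should be close to S_5 = 1
```

By the law of large numbers the Monte Carlo average converges to \(\E[S_{n+1}|\mc{F}_n]=S_n\) as the number of trials grows.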
In this course we will see many examples of martingales, and we will gradually build up an intuition for how to recognize a martingale. There is, however, one easy sufficient (but not necessary) condition under which we can recognize that a stochastic process is not a martingale.
Lemma 3.3.5 Let \((M_n)\) be a martingale. Then \(\E [M_n]=\E [M_0]\) for all \(n\).

Proof: We have \(\E [M_{n+1}|\mc {F}_n]=M_n\). Taking expectations and using the ‘taking \(\E \)’ property from Proposition 3.2.1, we have \(\E [M_{n+1}]=\E [M_n]\). The result follows by a trivial induction. ∎
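For the simple random walk this constancy of the mean, \(\E[S_n]=\E[S_0]=0\), is easy to check by simulation (again an illustrative sketch, not from the notes):

```python
import random

random.seed(2)

def walk_endpoint(n):
    """Return S_n for one simulated simple random walk."""
    return sum(random.choice([1, -1]) for _ in range(n))

trials = 100_000
for n in (1, 5, 20):
    mean = sum(walk_endpoint(n) for _ in range(trials)) / trials
    print(n, round(mean, 3))  # each mean should be near E[S_n] = E[S_0] = 0
```

The empirical means hover near zero for every \(n\), as the lemma predicts.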
Suppose, now, that \((X_n)\) is an i.i.d. sequence of random variables such that \(\P [X_i=2]=\P [X_i=-1]=\frac {1}{2}\). Note that \(\E [X_i]=\frac 12>0\). Define \(S_n\) and \(\mc {F}_n\) as before. Now, \(\E [S_n]=\sum _{i=1}^n\E [X_i]=\frac n2\), which is not constant, so \(S_n\) is not a martingale. However, as before, \(S_n\) is \(\mc {F}_n\)-measurable, and \(|S_n|\leq 2n\), so \(S_n\in L^1\), essentially as before. We have
\begin{align*} \E [S_{n+1}|\mc {F}_n]&=\E [X_{n+1}|\mc {F}_n]+\E [S_n|\mc {F}_n]\\ &=\E [X_{n+1}]+S_n\\ &\geq S_n. \end{align*} Hence \(S_n\) is a submartingale.
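A quick simulation illustrates the upward drift (an illustrative sketch under the setup above, not part of the notes): since \(\E[X_i]=\frac12\), the mean score grows like \(\E[S_n]=\frac n2\).

```python
import random

random.seed(3)

def biased_endpoint(n):
    """Return S_n where each increment is +2 or -1, each with probability 1/2."""
    return sum(random.choice([2, -1]) for _ in range(n))

trials = 100_000
n = 10
mean = sum(biased_endpoint(n) for _ in range(trials)) / trials
print(round(mean, 3))  # should be near E[S_10] = 10/2 = 5
```

In contrast to the fair game, here the empirical mean increases with \(n\), which is exactly the submartingale property on average.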
In general, if \((M_n)\) is a submartingale, then by definition \(\E [M_{n+1}|\mc {F}_n]\geq M_n\), so taking expectations gives us \(\E [M_{n+1}]\geq \E [M_n]\). For supermartingales we get \(\E [M_{n+1}]\leq \E [M_n]\). In words: submartingales, on average, increase, whereas supermartingales, on average, decrease. The use of super- and sub- is counterintuitive in this respect.
Remark 3.3.7 Sometimes we will want to make it clear which filtration is being used in the definition of a martingale. To do so we might say that ‘\((M_n)\) is an \(\mc {F}_n\)-martingale’, or that ‘\((M_n)\) is a martingale with respect to \(\mc {F}_n\)’. We use the same notation for super/sub-martingales.
Our definition of a filtration and a martingale both make sense if we look at only a finite set of times \(n=1,\ldots , N\). We sometimes also use the terms filtration and martingale in this situation.
We end this section with two important general examples of martingales. You should check the conditions yourself, as exercise 3.3.