

3.2 Properties of conditional expectation

In all but the easiest cases, calculating conditional expectations explicitly from Theorem 3.1.1 is not feasible. Instead, we are able to work with them via a set of useful properties, provided by the following proposition.

  • Proposition 3.2.1 Let \(\mc {G},\mc {H}\) be sub-\(\sigma \)-fields of \(\mc {F}\), let \(a_1,a_2\in \R \), and let \(X,Y,Z,X_1,X_2\in L^1\). Then, almost surely,

    (Linearity) \(\E [a_1 X_1+ a_2 X_2\|\G ]= a_1\E [X_1\|\G ]+a_2\E [X_2\|\G ]\).

    (Absolute values) \(|\E [X\|\G ]|\leq \E [|X|\|\G ]\).

    (Monotonicity) If \(X\le Y\), then \(\E [X\|\G ]\le \E [Y\|\G ]\).

    (Constants) If \(a\in \R \) (deterministic) then \(\E [a\|\G ]=a\).

    (Measurability) If \(X\) is \(\G \)-measurable, then \(\E [X\|\G ]= X\).

    (Independence) If \(X\) is independent of \(\mc {G}\) then \(\E [X\|\G ]=\E [X]\).

    (Taking out what is known) If \(Z\) is \(\G \)-measurable, then \(\E [ZX\|\G ]= Z\E [X\|\G ]\).

    (Tower) If \(\H \subset \G \) then \(\E [\E [X\|\G ]\|\H ]= \E [X\|\H ]\).

    (Taking \(\E \)) It holds that \(\E [\E [X\|\G ]]=\E [X]\).

    (No information) It holds that \(\E [X\|\{\emptyset ,\Omega \}]=\E [X]\).

The proofs of these properties are beyond the scope of our course. Note that the first five properties above are common to both \(\E [\cdot ]\) and \(\E [\cdot \|\mc {G}]\).

We’ll use these properties extensively throughout the remainder of the course. They are not on the formula sheet – you should remember them and become familiar with applying them.
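For example (purely as an illustration of how the properties combine), suppose that \(Z\) is \(\G \)-measurable, that \(X\) is independent of \(\G \), and that \(Z,ZX\in L^1\). Then, combining linearity, measurability, the taking out what is known rule and independence,

\[\E [Z+ZX\|\G ]=\E [Z\|\G ]+\E [ZX\|\G ]=Z+Z\,\E [X\|\G ]=Z+Z\,\E [X].\]

Each equality uses one or two of the properties above; no explicit calculation with Theorem 3.1.1 is needed.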

  • Remark 3.2.2 (off-syllabus) Although we have not proved the properties in Proposition 3.2.1, they are intuitive properties for conditional expectation to have.

    For example, in the taking out what is known property, we can think of \(Z\) as already being simple enough to be \(\mc {G}\)-measurable, so we’d expect that taking conditional expectation with respect to \(\mc {G}\) should leave it unchanged.

    In the independence property, we can think of \(\mc {G}\) as giving us no information about the value that \(X\) takes, so our best guess at the value of \(X\) is simply \(\E [X]\).

    In the tower property for \(\E [\E [X|\mc {G}]|\mc {H}]\), we start with \(X\), simplify it to be \(\mc {G}\)-measurable, and then simplify it further to be \(\mc {H}\)-measurable. But since \(\mc {H}\sw \mc {G}\), we might as well have simplified \(X\) to be \(\mc {H}\)-measurable in a single step, which would give \(\E [X|\mc {H}]\).

    It is a useful exercise to try to think of similar ‘intuitive’ arguments for the other properties too, so that you can remember them easily; one such argument is sketched below.
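    For instance, the tower property together with the no information property recovers the ‘taking \(\E \)’ property: taking \(\mc {H}=\{\emptyset ,\Omega \}\sw \mc {G}\),

    \[\E \big [\E [X\|\G ]\big ] = \E \big [\E [X\|\G ]\|\{\emptyset ,\Omega \}\big ] = \E [X\|\{\emptyset ,\Omega \}] = \E [X],\]

    where the first and last equalities use the no information property and the middle equality is the tower property.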

Conditional expectation as an estimator

The conditional expectation \(Y=\E [X\|\G ]\) is the ‘best least-squares estimator’ of \(X\), based on the information available in \(\G \). We can state this rigorously and use our toolkit from Proposition 3.2.1 to prove it. It demonstrates another way in which \(Y\) is ‘the best’ \(\mc {G}\)-measurable approximation to \(X\), and provides our first example of using the properties of \(\E [X\|\mc {G}]\).

  • Lemma 3.2.3 Let \(\mc {G}\) be a sub-\(\sigma \)-field of \(\mc {F}\). Let \(X\) be an \(\mc {F}\)-measurable random variable and let \(Y=\E [X|\mc {G}]\). Suppose that \(Y'\) is a \(\G \)-measurable random variable. Then

    \[\E [(X-Y)^2]\leq \E [(X-Y')^2].\]

Proof: We note that

\begin{align} \E [(X-Y')^2] &= \E [(X-Y+Y-Y')^2]\notag \\ &= \E [(X-Y)^2] + 2\E [(X-Y)(Y-Y')] + \E [(Y-Y')^2].\label {eq:cond_exp_least_sqs_req} \end{align} In the middle term above, we can write

\begin{align*} \E [(X-Y)(Y-Y')] &= \E [\E [(X-Y)(Y-Y')|\mc {G}]]\\ &=\E [(Y-Y')\E [X-Y|\mc {G}]]\\ &=\E [(Y-Y')(\E [X|\mc {G}]-Y)]\\ &=\E [(Y-Y')(0)]\\ &=0. \end{align*} Here, in the first step we used the ‘taking \(\E \)’ property. In the second step we used Proposition 2.2.6, which tells us that \(Y-Y'\) is \(\mc {G}\)-measurable, followed by the ‘taking out what is known’ rule. In the third step we used the linearity and measurability properties, and in the fourth step we used that \(\E [X|\mc {G}]=Y\) almost surely, which gives \(\E [(X-Y)(Y-Y')]=0\). Hence, since \(\E [(Y-Y')^2]\geq 0\), from (3.3) we obtain \(\E [(X-Y')^2] \ge \E [(X-Y)^2]\).   ∎
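For a concrete illustration of Lemma 3.2.3, suppose that \(X_1,X_2\) are independent with \(\P [X_i=0]=\P [X_i=1]=\frac 12\), and take \(X=X_1+X_2\) and \(\G =\sigma (X_1)\). Using linearity, measurability and independence, \(Y=\E [X\|\G ]=X_1+\frac 12\), so that

\[\E [(X-Y)^2]=\E \big [(X_2-\tfrac 12)^2\big ]=\tfrac 14,\]

whereas the \(\G \)-measurable estimator \(Y'=X_1\) gives \(\E [(X-Y')^2]=\E [X_2^2]=\frac 12\), which is larger, as the lemma guarantees.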