\( \def \offsyl {(\oslash )} \def \msconly {(\Delta )} \)
\( \DeclareMathOperator {\var }{var} \DeclareMathOperator {\cov }{cov} \DeclareMathOperator {\Bin }{Bin} \DeclareMathOperator {\Geo }{Geometric} \DeclareMathOperator {\Beta }{Beta}
\DeclareMathOperator {\Unif }{Uniform} \DeclareMathOperator {\Gam }{Gamma} \DeclareMathOperator {\Normal }{N} \DeclareMathOperator {\Exp }{Exp} \DeclareMathOperator {\Cauchy }{Cauchy}
\DeclareMathOperator {\Bern }{Bernoulli} \DeclareMathOperator {\Poisson }{Poisson} \DeclareMathOperator {\Weibull }{Weibull} \DeclareMathOperator {\IGam }{IGamma}
\DeclareMathOperator {\NGam }{NGamma} \DeclareMathOperator {\ChiSquared }{ChiSquared} \DeclareMathOperator {\Pareto }{Pareto} \DeclareMathOperator {\NBin }{NegBin}
\DeclareMathOperator {\Studentt }{Student-t} \DeclareMathOperator *{\argmax }{arg\,max} \DeclareMathOperator *{\argmin }{arg\,min} \)
\( \def \to {\rightarrow } \def \iff {\Leftrightarrow } \def \ra {\Rightarrow } \def \sw {\subseteq } \def \mc {\mathcal } \def \mb {\mathbb } \def \sc {\setminus } \def \wt {\widetilde }
\def \v {\textbf } \def \E {\mb {E}} \def \P {\mb {P}} \def \R {\mb {R}} \def \C {\mb {C}} \def \N {\mb {N}} \def \Q {\mb {Q}} \def \Z {\mb {Z}} \def \B {\mb {B}}
\def \~{\sim } \def \-{\,;\,} \def \qed {$\blacksquare $} \CustomizeMathJax {\def \1{\unicode {x1D7D9}}} \def \cadlag {c\`{a}dl\`{a}g} \def \p {\partial }
\def \l {\left } \def \r {\right } \def \Om {\Omega } \def \om {\omega } \def \eps {\epsilon } \def \de {\delta } \def \ov {\overline } \def \sr {\stackrel }
\def \Lp {\mc {L}^p} \def \Lq {\mc {L}^q} \def \Lone {\mc {L}^1} \def \Ltwo {\mc {L}^2}
\def \toae {\sr {\rm a.e.}{\to }} \def \toas {\sr {\rm a.s.}{\to }} \def \top {\sr {\mb {\P }}{\to }} \def \tod {\sr {\rm d}{\to }} \def \toLp {\sr {\Lp }{\to }} \def \toLq {\sr {\Lq }{\to }}
\def \eqae {\sr {\rm a.e.}{=}} \def \eqas {\sr {\rm a.s.}{=}} \def \eqd {\sr {\rm d}{=}} \def \approxd {\sr {\rm d}{\approx }}
\def \Sa {(S1)\xspace } \def \Sb {(S2)\xspace } \def \Sc {(S3)\xspace } \)
-
3.1 \(\color {blue}\star \) This exercise continues Exercise 2.1. It provides template code for drawing several sketches of distributions, which you will find helpful
in many later exercises.
Inside the files 2_dist_sketching.ipynb and 2_dist_sketching.Rmd, below the parts corresponding to Exercise 2.1, you will find the code for sketching
\[f_{X}(x_1)=\int _0^\infty f_{\Exp (\lambda )}(x_1)\,f_{\Gam (2,60)}(\lambda ) \,d\lambda ,\]
which is the p.d.f. of the sampling distribution (for a single item of data) in Example 3.2.3.
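If you want to experiment outside the provided notebooks, the following Python sketch (illustrative only, not the code in 2_dist_sketching.ipynb or 2_dist_sketching.Rmd) shows one way to evaluate and plot this integral numerically. It assumes NumPy, SciPy and Matplotlib are available, and that \(\Gam (\alpha ,\beta )\) denotes shape \(\alpha \) and rate \(\beta \), so the SciPy scale parameter is \(1/\beta \).
\begin{verbatim}
# Illustrative sketch only -- not the contents of 2_dist_sketching.ipynb.
# Gam(alpha, beta) is taken to be shape alpha, rate beta (SciPy scale = 1/beta).
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats
from scipy.integrate import quad

def f_X(x1):
    # Integrate f_{Exp(lam)}(x1) * f_{Gam(2,60)}(lam) over lam in (0, infinity).
    integrand = lambda lam: stats.expon.pdf(x1, scale=1/lam) * stats.gamma.pdf(lam, a=2, scale=1/60)
    return quad(integrand, 0, np.inf)[0]

xs = np.linspace(0.1, 150, 200)
plt.plot(xs, [f_X(x) for x in xs])
plt.xlabel("$x_1$")
plt.ylabel("$f_X(x_1)$")
plt.show()
\end{verbatim}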
-
(a) Modify this code to sketch the p.d.f. of the sampling distribution of the continuous Bayesian model \((X,\Theta )\) with
model family \(M_\theta =\Gam (2,\theta )\) and prior \(\Theta \sim \Exp (1)\).
-
(b) Do the same as in (a), for the continuous Bayesian model \((X,\Theta )\) with model family \(M_\theta =\Normal (\theta ,1)\)
and prior \(\Theta \sim \Normal (0,1)\).
-
3.2 \(\color {blue}\star \,\star \) Let \(M_\theta \sim \Exp (\theta )\), where \(\theta \) takes values in the parameter
space \(\Pi =(0,\infty )\). Let \((X,\Theta )\) be the Bayesian model with this model family and prior \(\Theta \sim \Gam (2,3)\).
-
(a) Given the single data point \(x=2\), show that the posterior \(\Theta |_{\{X=2\}}\) has the \(\Gam (3,5)\) distribution.
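The identity can also be checked numerically. The Python sketch below (illustrative only) compares the unnormalised posterior, likelihood times prior evaluated on a grid, with the \(\Gam (3,5)\) density; it again assumes the shape–rate parameterisation.
\begin{verbatim}
# Numerical sanity check for 3.2(a): unnormalised posterior vs Gam(3,5).
import numpy as np
from scipy import stats

theta = np.linspace(0.01, 5, 500)
likelihood = stats.expon.pdf(2, scale=1/theta)        # f_{Exp(theta)}(x) at x = 2
prior = stats.gamma.pdf(theta, a=2, scale=1/3)        # Gam(2,3) prior density
unnorm = likelihood * prior

posterior = unnorm / np.trapz(unnorm, theta)          # normalise on the grid
target = stats.gamma.pdf(theta, a=3, scale=1/5)       # Gam(3,5) density
print(np.max(np.abs(posterior - target)))             # should be close to zero
\end{verbatim}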
-
(b)
-
(i) Show that the sampling distribution of the model has p.d.f.
\(\seteqnumber{0}{3.}{7}\)
\begin{equation}
\label {eq:lomax_1} f_X(x)=\begin{cases} \frac {18}{(x+3)^3} & \text { for }x>0 \\ 0 & \text { otherwise}. \end {cases}
\end{equation}
Hint: Use that \(\int _0^\infty f_{\Gam (3,x+3)}(\theta )\,d\theta =1\) to help with the integral.
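A quick numerical check of (3.8) is possible along the same lines; the Python sketch below (illustrative only, shape–rate parameterisation assumed) compares the mixture integral with \(18/(x+3)^3\) at a few values of \(x\).
\begin{verbatim}
# Numerical check of (3.8): the mixture integral should equal 18/(x+3)^3 for x > 0.
import numpy as np
from scipy import stats
from scipy.integrate import quad

def f_X(x):
    integrand = lambda t: stats.expon.pdf(x, scale=1/t) * stats.gamma.pdf(t, a=2, scale=1/3)
    return quad(integrand, 0, np.inf)[0]

for x in [0.5, 1.0, 2.0, 5.0]:
    print(x, f_X(x), 18 / (x + 3)**3)   # the last two columns should agree
\end{verbatim}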
-
(ii) Find the predictive distribution, giving its p.d.f. in a form similar to (3.8).
-
(c) Now consider the model \(M_\theta \sim (X_1,\ldots ,X_n)\), where \(n\in \N \) and the \(X_i\) are independent \(\Exp
(\theta )\) random variables. Use the same prior \(\Theta \sim \Gam (2,3)\) and the data \(x=(x_1,\ldots ,x_n)\), where \(x_i\in (0,\infty )\) for all \(i=1,\ldots ,n\). Show that the posterior
\(\Theta |_{\{X=x\}}\) has the \(\Gam (n+2,\,3+z)\) distribution, where \(z=\sum _{i=1}^n x_i\).
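As with part (a), the claimed posterior can be checked numerically for a particular data set. The Python sketch below uses made-up data and compares the grid-normalised product of likelihood and prior with the \(\Gam (n+2,\,3+z)\) density (shape–rate parameterisation assumed).
\begin{verbatim}
# Numerical sanity check for 3.2(c) with illustrative (made-up) data.
import numpy as np
from scipy import stats

x = np.array([0.4, 1.3, 0.7, 2.2, 0.9])        # made-up data, n = 5
n, z = len(x), x.sum()

theta = np.linspace(0.01, 10, 1000)
log_lik = np.sum(stats.expon.logpdf(x[:, None], scale=1/theta), axis=0)
log_prior = stats.gamma.logpdf(theta, a=2, scale=1/3)
unnorm = np.exp(log_lik + log_prior)

posterior = unnorm / np.trapz(unnorm, theta)
target = stats.gamma.pdf(theta, a=n + 2, scale=1/(3 + z))
print(np.max(np.abs(posterior - target)))       # should be close to zero
\end{verbatim}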
-
3.3 \(\color {blue}\star \) With a computer package of your choice, sketch the prior and posterior probability density
functions from Exercise 3.2(a) on the same graph.
On a separate graph, use the explicit formulae you found in Exercise 3.2(b) to sketch the sampling and
predictive distributions. Modify your code from Exercise 3.1 to sketch the same functions, but
without using your explicit formulae. Check that the results agree.
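One possible starting point, in Python (illustrative only; any package is fine), is to plot the \(\Gam (2,3)\) prior and the \(\Gam (3,5)\) posterior from Exercise 3.2(a) on the same axes; the shape–rate parameterisation is assumed.
\begin{verbatim}
# Prior Gam(2,3) and posterior Gam(3,5) from Exercise 3.2(a) on the same axes.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

theta = np.linspace(0, 4, 400)
plt.plot(theta, stats.gamma.pdf(theta, a=2, scale=1/3), label="prior Gam(2,3)")
plt.plot(theta, stats.gamma.pdf(theta, a=3, scale=1/5), label="posterior Gam(3,5)")
plt.xlabel("$\\theta$")
plt.ylabel("density")
plt.legend()
plt.show()
\end{verbatim}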
-
3.4
-
(a) \(\color {blue}\star \) Look at the left-hand column of the reference sheet ‘Conditional Probability and Related Formulae’ in
Appendix A. For each item listed there, identify which Section, Lemma, equation, or other part of Chapter 1 it comes from.
-
(b) Do the same for the left-hand column of the reference sheet ‘Bayesian Models and Related Formulae’ (excluding the last item), with
Chapters 2 and 3.
-
3.5 Let \(M_\theta \sim \Unif ([0,\theta ])\), the continuous uniform distribution on \([0,\theta ]\). Let \((X,\Theta
)\) be a Bayesian model with model family \((M_\theta )_{\theta \in \Pi }\) and parameter space \(\Pi =(0,\infty )\). Take the prior to be \(\Theta \sim \Pareto (3,1)\).
-
(a) \(\color {blue}\star \,\star \) Suppose that we have the data point \(x=\frac 12\). Show that the posterior \(\Theta
|_{\{X=\frac 12\}}\) has the \(\Pareto (4,1)\) distribution.
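This too can be verified numerically. The Python sketch below compares the grid-normalised product of likelihood and prior with the \(\Pareto (4,1)\) density; it assumes \(\Pareto (\alpha ,m)\) has density \(\alpha m^\alpha /\theta ^{\alpha +1}\) for \(\theta \ge m\).
\begin{verbatim}
# Numerical sanity check for 3.5(a): unnormalised posterior vs Pareto(4,1).
import numpy as np
from scipy import stats

theta = np.linspace(1.0001, 50, 20000)
likelihood = stats.uniform.pdf(0.5, loc=0, scale=theta)   # f_{Unif([0,theta])}(1/2)
prior = stats.pareto.pdf(theta, b=3, scale=1)             # Pareto(3,1) prior
unnorm = likelihood * prior

posterior = unnorm / np.trapz(unnorm, theta)
target = stats.pareto.pdf(theta, b=4, scale=1)            # Pareto(4,1)
print(np.max(np.abs(posterior - target)))                 # should be close to zero
\end{verbatim}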
-
(b) \(\color {blue}\star \star \star \) Suppose instead that we had the data point \(x=5\). Find the posterior distribution
\(\Theta |_{\{X=5\}}\) and calculate the p.d.f. of the resulting predictive distribution.
-
3.6 \(\color {blue}\star \star \star \) Let \((X,\Theta )\) be a continuous Bayesian model with
parameter space \(\Pi \). Suppose that \(A\sw \Pi \) with \(\P [\Theta \in A]>0\). Show that \(X|_{\{\Theta \in A\}}\) is a continuous random variable with p.d.f.
\[f_{X|_{\{\Theta \in A\}}}(x)=\int _{A}f_{M_\theta }(x)f_{\Theta |_{\{\Theta \in A\}}}(\theta )\,d\theta .\]
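The formula is illustrated below for one concrete choice (this is not the proof): model family \(M_\theta =\Exp (\theta )\), prior \(\Theta \sim \Exp (1)\) and \(A=(1,2)\). The Python sketch simulates the Bayesian model, keeps the samples with \(\Theta \in A\), and compares the histogram of the corresponding \(X\) values with the right-hand side above.
\begin{verbatim}
# Illustration of Exercise 3.6 for M_theta = Exp(theta), Theta ~ Exp(1), A = (1, 2).
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats
from scipy.integrate import quad

rng = np.random.default_rng(0)
theta = rng.exponential(1.0, size=500_000)     # Theta ~ Exp(1)
x = rng.exponential(1.0 / theta)               # X | Theta = theta  ~  Exp(theta)
keep = (theta > 1) & (theta < 2)               # condition on Theta in A

def rhs(x0):
    # integral over A of f_{M_theta}(x0) * f_{Theta | Theta in A}(theta) d theta
    pA = np.exp(-1) - np.exp(-2)               # P[Theta in A] under the Exp(1) prior
    integrand = lambda t: stats.expon.pdf(x0, scale=1/t) * np.exp(-t) / pA
    return quad(integrand, 1, 2)[0]

grid = np.linspace(0.01, 4, 100)
plt.hist(x[keep], bins=100, density=True, alpha=0.4)
plt.plot(grid, [rhs(g) for g in grid])
plt.xlabel("$x$")
plt.ylabel("density")
plt.show()
\end{verbatim}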
-
3.7 \(\color {blue}\star \star \star \) Let \((X,\Theta )\) be a continuous Bayesian model, with range
\(R_X\sw \R \) and parameter space \(\Pi \sw \R \), and with model family \((M_\theta )\). Let \(f_{M_\theta }\) denote the p.d.f. of \(M_\theta \) and let \(f_\Theta \) denote the p.d.f. of
\(\Theta \).
Consider a second continuous Bayesian model \((X', \Theta )\) with the same prior, range, and parameter space, but with model family \((M'_\theta )\) given by
\(\seteqnumber{0}{3.}{8}\)
\begin{equation}
\label {eq:model_convolution} f_{M'_\theta }(x)=\int _\R f_{M_\theta }(x-y)\kappa (y)\,dy.
\end{equation}
We require that \(\kappa :\R \to [0,\infty )\) and \(\int _\R \kappa (y)\,dy=1\).
-
(a) Check that \(\int _\R f_{M'_\theta }(x)\,dx=1\).
-
(b) Show that the posterior density of \((X',\Theta )\) satisfies
\(\seteqnumber{0}{3.}{9}\)
\begin{equation}
\label {eq:model_convolution_posterior} f_{\Theta |_{\{X'=x\}}}(\theta ) \propto \int _\R f_{\Theta |_{\{X=x-y\}}}(\theta )\kappa (y)\,dy.
\end{equation}
-
(c) The operation in (3.9) is
known as the convolution of \(f_{M_\theta }\) with \(\kappa \), and the function \(\kappa \) is known as the kernel of the convolution.
Consider the case \(\kappa (y)=\frac {1}{\sqrt {2\pi }}e^{-y^2/2}\). Investigate the connection between convolutions and sums of random variables. Use what you discover to write down (in words) a
heuristic interpretation of the connection between the models \((X,\Theta )\) and \((X',\Theta )\), and also of equation (3.10).
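One way to begin the investigation, sketched in Python below (illustrative only), is to simulate \(X+Y\), where \(X\sim \Normal (\theta ,1)\) and \(Y\) has density \(\kappa \) independently of \(X\), and to compare its histogram with the density (3.9) computed numerically; the choices \(M_\theta =\Normal (\theta ,1)\) and \(\theta =2\) are arbitrary.
\begin{verbatim}
# Compare the density (3.9) with a histogram of X + Y, where X ~ M_theta = N(theta, 1)
# and Y ~ kappa (standard normal), independently.  theta = 2 is an arbitrary choice.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats
from scipy.integrate import quad

theta = 2.0
kappa = lambda y: stats.norm.pdf(y)            # standard normal kernel

def f_M_prime(x):
    # right-hand side of (3.9)
    integrand = lambda y: stats.norm.pdf(x - y, loc=theta, scale=1) * kappa(y)
    return quad(integrand, -np.inf, np.inf)[0]

rng = np.random.default_rng(1)
x_plus_y = rng.normal(theta, 1, 200_000) + rng.normal(0, 1, 200_000)

grid = np.linspace(theta - 5, theta + 5, 100)
plt.hist(x_plus_y, bins=100, density=True, alpha=0.4)
plt.plot(grid, [f_M_prime(g) for g in grid])
plt.show()
\end{verbatim}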