Paper 3, Section II, J

Principles of Statistics
Part II, 2020

Let $\Theta=\mathbb{R}^{p}$, let $\mu>0$ be a probability density function on $\Theta$, and suppose we are given a further auxiliary conditional probability density function $q(\cdot \mid t)>0$, $t \in \Theta$, on $\Theta$ from which we can generate random draws. Consider a sequence of random variables $\{\vartheta_{m}: m \in \mathbb{N}\}$ generated as follows:

  • For $m \in \mathbb{N}$ and given $\vartheta_{m}$, generate a new draw $s_{m} \sim q(\cdot \mid \vartheta_{m})$.

  • Define

$$\vartheta_{m+1}= \begin{cases} s_{m}, & \text{with probability } \rho(\vartheta_{m}, s_{m}), \\ \vartheta_{m}, & \text{with probability } 1-\rho(\vartheta_{m}, s_{m}), \end{cases}$$

where $\rho(t, s)=\min \left\{\frac{\mu(s)}{\mu(t)}\, \frac{q(t \mid s)}{q(s \mid t)},\ 1\right\}$.
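This is the Metropolis–Hastings sampler. For orientation only, here is a minimal runnable sketch in Python; the names `mu_pdf`, `q_sample` and `q_pdf`, and the random-walk example at the end, are illustrative placeholders and not part of the question:

```python
import numpy as np

def metropolis_hastings(mu_pdf, q_sample, q_pdf, theta0, n_steps, seed=0):
    """Run the chain (vartheta_m) described above.

    mu_pdf(t)   -- target density mu at t (an unnormalised version suffices)
    q_sample(t) -- one draw s ~ q(. | t)
    q_pdf(s, t) -- proposal density q(s | t) (unnormalised suffices if the
                   normalising constant does not depend on t)
    """
    rng = np.random.default_rng(seed)
    chain = [np.asarray(theta0, dtype=float)]
    for _ in range(n_steps):
        t = chain[-1]
        s = q_sample(t)
        # rho(t, s) = min{ (mu(s)/mu(t)) * (q(t|s)/q(s|t)), 1 }
        rho = min(mu_pdf(s) * q_pdf(t, s) / (mu_pdf(t) * q_pdf(s, t)), 1.0)
        chain.append(s if rng.uniform() < rho else t)  # accept s, else stay at t
    return np.array(chain)

# Illustrative use: standard normal target on R^2, Gaussian random-walk proposal.
p, delta = 2, 0.1
prop_rng = np.random.default_rng(1)
mu_pdf = lambda t: np.exp(-0.5 * (t @ t))
q_sample = lambda t: t + np.sqrt(2 * delta) * prop_rng.standard_normal(p)
q_pdf = lambda s, t: np.exp(-((s - t) @ (s - t)) / (4 * delta))  # symmetric
draws = metropolis_hastings(mu_pdf, q_sample, q_pdf, np.zeros(p), n_steps=5000)
```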

(i) Show that the Markov chain $(\vartheta_{m})$ has invariant measure $\mu$, that is, show that for all (measurable) subsets $B \subset \Theta$ and all $m \in \mathbb{N}$ we have

$$\int_{\Theta} \Pr\left(\vartheta_{m+1} \in B \mid \vartheta_{m}=t\right) \mu(t)\, dt=\int_{B} \mu(\theta)\, d\theta.$$
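A sketch of one standard route, offered as orientation rather than the model solution: verify detailed balance. For $s \neq t$ the chain moves from $t$ with density $q(s \mid t)\, \rho(t, s)$, and

$$\mu(t)\, q(s \mid t)\, \rho(t, s)=\min \{\mu(s)\, q(t \mid s),\ \mu(t)\, q(s \mid t)\}=\mu(s)\, q(t \mid s)\, \rho(s, t),$$

which is symmetric in $(t, s)$; integrating $t$ over $\Theta$ and $s$ over $B$, and adding the mass of rejected proposals started from points of $B$, recovers the displayed identity.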

(ii) Now suppose that $\mu$ is the posterior probability density function arising in a statistical model $\{f(\cdot, \theta): \theta \in \Theta\}$ with observations $x$ and a $N(0, I_{p})$ prior distribution on $\theta$. Derive a family $\{q(\cdot \mid t): t \in \Theta\}$ such that in the above algorithm the acceptance probability $\rho(t, s)$ is a function of the likelihood ratio $f(x, s) / f(x, t)$, and for which the probability density function $q(\cdot \mid t)$ has covariance matrix $2 \delta I_{p}$ for all $t \in \Theta$.
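For orientation, a sketch of one family meeting both requirements, under the assumption $0<\delta<1/2$ (a pCN-type proposal; not necessarily the only valid answer): write $\pi$ for the $N(0, I_{p})$ prior density, so $\mu(\theta) \propto f(x, \theta)\, \pi(\theta)$ and

$$\frac{\mu(s)}{\mu(t)}\, \frac{q(t \mid s)}{q(s \mid t)}=\frac{f(x, s)}{f(x, t)} \cdot \frac{\pi(s)\, q(t \mid s)}{\pi(t)\, q(s \mid t)},$$

so $\rho(t, s)$ depends only on the likelihood ratio precisely when $\pi(t)\, q(s \mid t)$ is symmetric in $(s, t)$. Taking $q(\cdot \mid t)=N\left(\sqrt{1-2 \delta}\, t,\ 2 \delta I_{p}\right)$ achieves this symmetry (expanding the exponent gives $-\left(|s|^{2}+|t|^{2}-2 \sqrt{1-2 \delta}\, s \cdot t\right) / 4 \delta$) and has the required covariance $2 \delta I_{p}$.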