Random variables $X_1,\dots,X_n$ are independent and identically distributed from the normal distribution with unknown mean $M$ and unknown precision (inverse variance) $H$. Show that the likelihood function, for data $X_1=x_1,\dots,X_n=x_n$, is
$$L_n(\mu,h) \propto h^{n/2}\exp\!\left(-\tfrac{1}{2}h\{n(\bar{x}-\mu)^2+S\}\right),$$
where $\bar{x} := n^{-1}\sum_i x_i$ and $S := \sum_i (x_i-\bar{x})^2$.
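As a quick sanity check of the claimed factorisation (not part of the required answer), the exact log-likelihood should differ from the log of the kernel above only by a constant not involving $(\mu,h)$. A minimal NumPy sketch, with arbitrary synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=10)          # arbitrary synthetic data
n, xbar = len(x), x.mean()
S = ((x - xbar) ** 2).sum()

def loglik(mu, h):
    # exact log-likelihood of i.i.d. N(mu, 1/h) observations
    return 0.5 * n * np.log(h / (2 * np.pi)) - 0.5 * h * ((x - mu) ** 2).sum()

def log_kernel(mu, h):
    # log of the claimed kernel h^{n/2} exp(-(h/2){n(xbar-mu)^2 + S})
    return 0.5 * n * np.log(h) - 0.5 * h * (n * (xbar - mu) ** 2 + S)

# The difference should be the same constant for every (mu, h),
# since sum_i (x_i - mu)^2 = n(xbar - mu)^2 + S exactly.
diffs = [loglik(m, h) - log_kernel(m, h)
         for m, h in [(0.0, 1.0), (2.0, 0.5), (-1.0, 3.0)]]
```

The check rests on the decomposition $\sum_i(x_i-\mu)^2 = n(\bar{x}-\mu)^2 + S$, which is the key step of the derivation.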
A bivariate prior distribution for $(M,H)$ is specified, in terms of hyperparameters $(\alpha_0,\beta_0,m_0,\lambda_0)$, as follows. The marginal distribution of $H$ is $\Gamma(\alpha_0,\beta_0)$, with density
$$\pi(h) \propto h^{\alpha_0-1}e^{-\beta_0 h} \qquad (h>0),$$
and the conditional distribution of $M$, given $H=h$, is normal with mean $m_0$ and precision $\lambda_0 h$.
Show that the conditional prior distribution of $H$, given $M=\mu$, is
$$H \mid M=\mu \;\sim\; \Gamma\!\left(\alpha_0+\tfrac{1}{2},\; \beta_0+\tfrac{1}{2}\lambda_0(\mu-m_0)^2\right).$$
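One way to see this (a sketch, using only the densities defined above): write down the joint prior and read off its $h$-kernel with $\mu$ held fixed,
$$\pi(\mu,h) = \pi(h)\,\pi(\mu\mid h) \propto h^{\alpha_0-1}e^{-\beta_0 h}\cdot h^{1/2}\exp\!\left(-\tfrac{1}{2}\lambda_0 h(\mu-m_0)^2\right) = h^{\left(\alpha_0+\frac{1}{2}\right)-1}\exp\!\left(-\left[\beta_0+\tfrac{1}{2}\lambda_0(\mu-m_0)^2\right]h\right),$$
which, as a function of $h$, is the kernel of the stated Gamma distribution.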
Show that the posterior joint distribution of $(M,H)$, given $X_1=x_1,\dots,X_n=x_n$, has the same form as the prior, with updated hyperparameters $(\alpha_n,\beta_n,m_n,\lambda_n)$, which you should express in terms of the prior hyperparameters and the data.
[You may use the identity
$$p(t-a)^2 + q(t-b)^2 = (t-\delta)^2 + pq(a-b)^2,$$
where $p+q=1$ and $\delta = pa+qb$.]
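A quick numerical check of the bracketed identity, with arbitrarily chosen values satisfying $p+q=1$:

```python
p, q = 0.3, 0.7                  # must satisfy p + q = 1
a, b, t = 1.2, -0.4, 2.5         # arbitrary values
delta = p * a + q * b

lhs = p * (t - a) ** 2 + q * (t - b) ** 2
rhs = (t - delta) ** 2 + p * q * (a - b) ** 2
```

Both sides agree to machine precision; in the posterior calculation the identity is applied with $t=\mu$, $a=m_0$, $b=\bar{x}$, and weights proportional to $\lambda_0$ and $n$.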
Explain how you could implement Gibbs sampling to generate a random sample from the posterior joint distribution.
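As an illustrative sketch (not the required written answer), a Gibbs sampler alternates draws from the two full conditionals implied by the model: $M \mid H=h,\text{data}$ is normal, and $H \mid M=\mu,\text{data}$ is Gamma. The NumPy code below assumes hypothetical hyperparameter values and synthetic data; note that NumPy's `gamma` is parameterised by shape and *scale* (the reciprocal of the rate used above).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical prior hyperparameters and synthetic data (for illustration only).
a0, b0, m0, lam0 = 2.0, 1.0, 0.0, 1.0
x = rng.normal(1.5, 0.5, size=50)
n, xbar = len(x), x.mean()
S = ((x - xbar) ** 2).sum()

def gibbs(n_iter=5000, burn=500):
    mu, h = xbar, 1.0                      # initial state
    mus, hs = [], []
    for _ in range(n_iter):
        # H | M=mu, data ~ Gamma(a0 + (n+1)/2,
        #                        rate = b0 + lam0*(mu-m0)^2/2 + {n*(xbar-mu)^2 + S}/2)
        shape = a0 + (n + 1) / 2
        rate = (b0 + 0.5 * lam0 * (mu - m0) ** 2
                   + 0.5 * (n * (xbar - mu) ** 2 + S))
        h = rng.gamma(shape, 1.0 / rate)   # NumPy uses scale = 1/rate

        # M | H=h, data ~ Normal(mn, precision (lam0 + n) * h)
        mn = (lam0 * m0 + n * xbar) / (lam0 + n)
        mu = rng.normal(mn, 1.0 / np.sqrt((lam0 + n) * h))

        mus.append(mu)
        hs.append(h)
    # discard burn-in; the remaining draws approximate the posterior
    return np.array(mus[burn:]), np.array(hs[burn:])

mus, hs = gibbs()
```

After discarding the burn-in, the retained pairs $(\mu^{(t)}, h^{(t)})$ form an (approximate, dependent) sample from the posterior joint distribution of $(M,H)$.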