Paper 1, Section II, 28K

Principles of Statistics
Part II, 2012

Prove that, if $T$ is complete sufficient for $\Theta$, and $S$ is a function of $T$, then $S$ is the minimum variance unbiased estimator of $\mathbb{E}(S \mid \Theta)$.

When the parameter $\Theta$ takes a value $\theta>0$, observables $(X_{1}, \ldots, X_{n})$ arise independently from the exponential distribution $\mathcal{E}(\theta)$, having probability density function

$$p(x \mid \theta)=\theta e^{-\theta x} \quad (x>0).$$

Show that the family of distributions

$$\Theta \sim \operatorname{Gamma}(\alpha, \beta) \quad (\alpha>0,\ \beta>0), \tag{1}$$

with probability density function

$$\pi(\theta)=\frac{\beta^{\alpha}}{\Gamma(\alpha)}\, \theta^{\alpha-1} e^{-\beta \theta} \quad (\theta>0),$$

is a conjugate family for Bayesian inference about $\Theta$ (where $\Gamma(\alpha)$ is the Gamma function).
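A brief sketch of the conjugacy calculation (an editorial check, not part of the question as set): multiplying the likelihood of the $n$ independent observations by the Gamma prior density gives

```latex
\pi(\theta \mid x_1,\ldots,x_n)
  \;\propto\; \Bigl(\prod_{i=1}^{n} \theta e^{-\theta x_i}\Bigr)\,
  \theta^{\alpha-1} e^{-\beta\theta}
  \;=\; \theta^{\alpha+n-1}\, e^{-\left(\beta+\sum_{i=1}^{n} x_i\right)\theta},
```

so the posterior is again of the form (1), namely $\operatorname{Gamma}\bigl(\alpha+n,\ \beta+\sum_{i=1}^{n} x_i\bigr)$, confirming conjugacy.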

Show that the expectation of $\Lambda := \log \Theta$, under the prior distribution (1), is $\psi(\alpha)-\log \beta$, where $\psi(\alpha) := (\mathrm{d}/\mathrm{d}\alpha) \log \Gamma(\alpha)$. What is the prior variance of $\Lambda$? Deduce the posterior expectation and variance of $\Lambda$, given $(X_{1}, \ldots, X_{n})$.
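The identity $\mathbb{E}(\log\Theta)=\psi(\alpha)-\log\beta$ can be checked numerically. The sketch below (an illustration, not part of the question) draws Monte Carlo samples from the Gamma prior using only Python's standard library, and compares the sample mean of $\log\Theta$ with the closed form, approximating the digamma function $\psi$ by a central finite difference of `math.lgamma`.

```python
import math
import random

def digamma(a, h=1e-5):
    # Numerical digamma: psi(a) = d/da log Gamma(a),
    # approximated by a central difference of math.lgamma.
    return (math.lgamma(a + h) - math.lgamma(a - h)) / (2 * h)

def mc_mean_log_theta(alpha, beta, n_samples=200_000, seed=0):
    # Monte Carlo estimate of E[log Theta] for Theta ~ Gamma(alpha, beta)
    # with shape alpha and RATE beta; random.gammavariate takes
    # shape and SCALE, so the scale argument is 1/beta.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        total += math.log(rng.gammavariate(alpha, 1.0 / beta))
    return total / n_samples

alpha, beta = 3.0, 2.0          # illustrative prior parameters
mc = mc_mean_log_theta(alpha, beta)
exact = digamma(alpha) - math.log(beta)   # psi(alpha) - log(beta)
print(abs(mc - exact) < 0.01)
```

With these parameters the Monte Carlo standard error is well below the $0.01$ tolerance, so the two values agree.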

Let $\tilde{\Lambda}$ denote the limiting form of the posterior expectation of $\Lambda$ as $\alpha, \beta \downarrow 0$. Show that $\tilde{\Lambda}$ is the minimum variance unbiased estimator of $\Lambda$. What is its variance?
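One natural route, sketched here as an editorial check rather than a supplied solution: with the posterior $\operatorname{Gamma}\bigl(\alpha+n,\ \beta+\sum_i X_i\bigr)$, letting $\alpha, \beta \downarrow 0$ in the posterior expectation of $\Lambda$ gives

```latex
\tilde{\Lambda} \;=\; \psi(n) \;-\; \log\!\Bigl(\sum_{i=1}^{n} X_i\Bigr),
```

which is a function of the complete sufficient statistic $\sum_{i=1}^{n} X_i$, so the first part of the question applies; its variance is $\psi'(n)$, since $\theta \sum_i X_i \sim \operatorname{Gamma}(n, 1)$.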