Prove that, if $T$ is complete sufficient for $\Theta$, and $S$ is a function of $T$, then $S$ is the minimum variance unbiased estimator of $\mathrm{E}(S \mid \Theta)$.
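One possible line of argument (a sketch via Rao--Blackwell and completeness, assuming all estimators in play have finite variance): let $U$ be any unbiased estimator of $g(\theta) := \mathrm{E}(S \mid \Theta = \theta)$ and set $U^* := \mathrm{E}(U \mid T)$. By the Rao--Blackwell theorem $U^*$ is unbiased and $\mathrm{Var}(U^* \mid \theta) \le \mathrm{Var}(U \mid \theta)$. Both $U^*$ and $S$ are functions of $T$ with $\mathrm{E}(U^* - S \mid \theta) = 0$ for all $\theta$, so completeness of $T$ forces $U^* = S$ almost surely, whence
\[
\mathrm{Var}(S \mid \theta) = \mathrm{Var}(U^* \mid \theta) \le \mathrm{Var}(U \mid \theta) \quad \text{for every unbiased } U.
\]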
When the parameter $\Theta$ takes a value $\theta > 0$, observables $(X_1, \ldots, X_n)$ arise independently from the exponential distribution $E(\theta)$, having probability density function
\[
p(x \mid \theta) = \theta e^{-\theta x} \qquad (x > 0).
\]
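Since the $X_i$ are independent given $\Theta = \theta$, the joint density factorizes (writing $t := \sum_{i=1}^n x_i$, a notation introduced here for later use):
\[
p(x_1, \ldots, x_n \mid \theta) = \prod_{i=1}^n \theta e^{-\theta x_i} = \theta^n e^{-\theta t}.
\]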
Show that the family of distributions
\[
\Theta \sim \mathrm{Gamma}(\alpha, \beta) \qquad (\alpha > 0,\ \beta > 0), \tag{1}
\]
with probability density function
\[
\pi(\theta) = \frac{\beta^\alpha}{\Gamma(\alpha)}\, \theta^{\alpha - 1} e^{-\beta \theta} \qquad (\theta > 0),
\]
is a conjugate family for Bayesian inference about $\Theta$ (where $\Gamma(\alpha)$ is the Gamma function).
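A minimal sketch of the conjugacy calculation, combining the likelihood above with the prior (1):
\[
\pi(\theta \mid x_1, \ldots, x_n) \propto \theta^n e^{-\theta t} \cdot \theta^{\alpha - 1} e^{-\beta \theta} = \theta^{(\alpha + n) - 1} e^{-(\beta + t)\theta},
\]
so the posterior again lies in the family (1): $\Theta \mid (X_1, \ldots, X_n) \sim \mathrm{Gamma}(\alpha + n,\ \beta + t)$.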
Show that the expectation of $\Lambda := \log \Theta$, under the prior distribution (1), is $\psi(\alpha) - \log \beta$, where $\psi(\alpha) := (d/d\alpha) \log \Gamma(\alpha)$. What is the prior variance of $\Lambda$? Deduce the posterior expectation and variance of $\Lambda$, given $(X_1, \ldots, X_n)$.
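One route (a sketch; interchanging differentiation and integration is assumed valid): differentiate the normalization $\int_0^\infty \theta^{\alpha - 1} e^{-\beta \theta}\, d\theta = \Gamma(\alpha)\beta^{-\alpha}$ with respect to $\alpha$ to obtain
\[
\mathrm{E}(\Lambda) = \frac{1}{\Gamma(\alpha)\beta^{-\alpha}} \frac{d}{d\alpha}\bigl[\Gamma(\alpha)\beta^{-\alpha}\bigr] = \psi(\alpha) - \log \beta;
\]
a second differentiation yields $\mathrm{Var}(\Lambda) = \psi'(\alpha)$, the trigamma function. Because the posterior is $\mathrm{Gamma}(\alpha + n, \beta + t)$, the same formulas apply with the updated parameters: posterior mean $\psi(\alpha + n) - \log(\beta + t)$ and posterior variance $\psi'(\alpha + n)$.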
Let $\tilde{\Lambda}$ denote the limiting form of the posterior expectation of $\Lambda$ as $\alpha, \beta \downarrow 0$. Show that $\tilde{\Lambda}$ is the minimum variance unbiased estimator of $\Lambda$. What is its variance?
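A sketch of the concluding step, writing $T := \sum_{i=1}^n X_i$ (which is complete sufficient for $\Theta$, being the natural statistic of a full-rank exponential family): letting $\alpha, \beta \downarrow 0$ in the posterior mean gives
\[
\tilde{\Lambda} = \psi(n) - \log T.
\]
Given $\Theta = \theta$ we have $T \sim \mathrm{Gamma}(n, \theta)$, so the expectation formula above gives $\mathrm{E}(\log T \mid \theta) = \psi(n) - \log \theta$, hence $\mathrm{E}(\tilde{\Lambda} \mid \theta) = \log \theta$: $\tilde{\Lambda}$ is unbiased for $\Lambda$. Being a function of the complete sufficient statistic $T$, it is the minimum variance unbiased estimator by the first part, with variance $\mathrm{Var}(\tilde{\Lambda} \mid \theta) = \psi'(n)$.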