Paper 2, Section II, J

Principles of Statistics
Part II, 2014

In a general decision problem, define the concepts of a Bayes rule and of admissibility. Show that a unique Bayes rule is admissible.
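[One standard way to phrase these notions, stated here only as a sketch in the usual lecture notation with loss $L$, risk $R(\theta, \delta) = E_\theta L(\theta, \delta(X))$ and prior $\pi$ on $\Theta$:
$$R_\pi(\delta) = \int_\Theta R(\theta, \delta)\, d\pi(\theta), \qquad \delta_\pi \text{ is a Bayes rule if } R_\pi(\delta_\pi) = \inf_\delta R_\pi(\delta),$$
and $\delta$ is admissible if no rule $\delta'$ satisfies $R(\theta, \delta') \le R(\theta, \delta)$ for all $\theta$ with strict inequality for some $\theta$. If the Bayes rule $\delta_\pi$ is unique and some $\delta'$ dominated it, integrating the pointwise inequality against $\pi$ would give $R_\pi(\delta') \le R_\pi(\delta_\pi)$, so $\delta'$ would also be a Bayes rule, contradicting uniqueness.]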

Consider i.i.d. observations $X_1, \ldots, X_n$ from a $\operatorname{Poisson}(\theta)$ model, $\theta \in \Theta = (0, \infty)$. Can the maximum likelihood estimator $\hat{\theta}_{MLE}$ of $\theta$ be a Bayes rule for estimating $\theta$ in quadratic risk for any prior distribution on $\theta$ that has a continuous probability density on $(0, \infty)$? Justify your answer.
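[A sketch of one possible line of reasoning, assuming quadratic loss $L(\theta, a) = (\theta - a)^2$: here $\hat{\theta}_{MLE} = \bar{X}_n$, which is unbiased, and a classical argument shows that an estimator with strictly positive risk cannot be both unbiased and Bayes under quadratic loss. Indeed, if $\bar{X}_n = E(\theta \mid X_1, \ldots, X_n)$ were a Bayes rule, then
$$E[\theta \bar{X}_n] = E\big[\theta\, E(\bar{X}_n \mid \theta)\big] = E[\theta^2] \quad \text{and} \quad E[\theta \bar{X}_n] = E\big[\bar{X}_n\, E(\theta \mid X_1, \ldots, X_n)\big] = E[\bar{X}_n^2],$$
so the Bayes risk $E[(\bar{X}_n - \theta)^2] = E[\bar{X}_n^2] - 2E[\theta \bar{X}_n] + E[\theta^2]$ would equal zero, whereas $E_\theta[(\bar{X}_n - \theta)^2] = \theta/n > 0$ for every $\theta$.]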

Now model the $X_i$ as i.i.d. copies of $X \mid \theta \sim \operatorname{Poisson}(\theta)$, where $\theta$ is drawn from a prior that is a Gamma distribution with parameters $\alpha > 0$ and $\lambda > 0$ (given below). Show that the posterior distribution of $\theta \mid X_1, \ldots, X_n$ is a Gamma distribution and find its parameters. Find the Bayes rule $\hat{\theta}_{BAYES}$ for estimating $\theta$ in quadratic risk for this prior. [The Gamma probability density function with parameters $\alpha > 0, \lambda > 0$ is given by

$$f(\theta) = \frac{\lambda^{\alpha} \theta^{\alpha-1} e^{-\lambda \theta}}{\Gamma(\alpha)}, \quad \theta > 0,$$

where $\Gamma(\alpha)$ is the usual Gamma function.]
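[A sketch of the conjugacy computation, combining the Poisson likelihood with the density above:
$$\pi(\theta \mid X_1, \ldots, X_n) \;\propto\; \Big(\prod_{i=1}^n e^{-\theta} \frac{\theta^{X_i}}{X_i!}\Big)\, \theta^{\alpha-1} e^{-\lambda\theta} \;\propto\; \theta^{\alpha + \sum_{i=1}^n X_i - 1} e^{-(\lambda+n)\theta},$$
a Gamma distribution with parameters $\alpha + \sum_{i=1}^n X_i$ and $\lambda + n$; under quadratic risk the Bayes rule is the posterior mean,
$$\hat{\theta}_{BAYES} = \frac{\alpha + \sum_{i=1}^n X_i}{\lambda + n}.]$$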

Finally assume that the $X_i$ have actually been generated from a fixed $\operatorname{Poisson}(\theta_0)$ distribution, where $\theta_0 > 0$. Show that $\sqrt{n}\,(\hat{\theta}_{BAYES} - \hat{\theta}_{MLE})$ converges to zero in probability and deduce the asymptotic distribution of $\sqrt{n}\,(\hat{\theta}_{BAYES} - \theta_0)$ under the joint law $P_{\theta_0}^{\mathbb{N}}$ of the random variables $X_1, X_2, \ldots$. [You may use standard results from lectures without proof provided they are clearly stated.]
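[A sketch of one possible argument, using the expressions for the two estimators above together with the law of large numbers, the central limit theorem and Slutsky's lemma:
$$\sqrt{n}\,(\hat{\theta}_{BAYES} - \hat{\theta}_{MLE}) = \sqrt{n}\Big(\frac{\alpha + n\bar{X}_n}{\lambda + n} - \bar{X}_n\Big) = \frac{\sqrt{n}}{\lambda + n}\,(\alpha - \lambda \bar{X}_n) \;\xrightarrow{P_{\theta_0}^{\mathbb{N}}}\; 0,$$
since $\alpha - \lambda \bar{X}_n \to \alpha - \lambda \theta_0$ in probability while $\sqrt{n}/(\lambda + n) \to 0$. Since $\sqrt{n}\,(\hat{\theta}_{MLE} - \theta_0) = \sqrt{n}\,(\bar{X}_n - \theta_0) \to N(0, \theta_0)$ in distribution by the central limit theorem (the Poisson variance being $\theta_0$), Slutsky's lemma gives $\sqrt{n}\,(\hat{\theta}_{BAYES} - \theta_0) \to N(0, \theta_0)$ in distribution under $P_{\theta_0}^{\mathbb{N}}$.]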