Paper 1, Section I, 7H

Statistics
Part IB, 2014

Consider an estimator $\hat{\theta}$ of an unknown parameter $\theta$, and assume that $\mathbb{E}_{\theta}\left(\hat{\theta}^{2}\right)<\infty$ for all $\theta$. Define the bias and mean squared error of $\hat{\theta}$.
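(For reference, a sketch of the standard definitions:)

$$
\operatorname{bias}(\hat{\theta}) = \mathbb{E}_{\theta}(\hat{\theta}) - \theta,
\qquad
\operatorname{MSE}(\hat{\theta}) = \mathbb{E}_{\theta}\!\big[(\hat{\theta} - \theta)^{2}\big].
$$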

Show that the mean squared error of $\hat{\theta}$ is the sum of its variance and the square of its bias.
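(A sketch of the standard decomposition, writing $\mu := \mathbb{E}_{\theta}(\hat{\theta})$ for brevity:)

$$
\mathbb{E}_{\theta}\!\big[(\hat{\theta}-\theta)^{2}\big]
= \mathbb{E}_{\theta}\!\big[(\hat{\theta}-\mu)^{2}\big]
+ 2(\mu-\theta)\,\mathbb{E}_{\theta}(\hat{\theta}-\mu)
+ (\mu-\theta)^{2}
= \operatorname{Var}_{\theta}(\hat{\theta}) + \operatorname{bias}(\hat{\theta})^{2},
$$

since the cross term vanishes: $\mathbb{E}_{\theta}(\hat{\theta}-\mu) = 0$.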

Suppose that $X_{1}, \ldots, X_{n}$ are independent identically distributed random variables with mean $\theta$ and variance $\theta^{2}$, and consider estimators of $\theta$ of the form $k \bar{X}$ where $\bar{X}=\frac{1}{n} \sum_{i=1}^{n} X_{i}$.
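(As a preliminary sketch used in both parts below: by linearity and independence,)

$$
\mathbb{E}_{\theta}(k\bar{X}) = k\theta,
\qquad
\operatorname{Var}_{\theta}(k\bar{X}) = \frac{k^{2}\theta^{2}}{n}.
$$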

(i) Find the value of $k$ that gives an unbiased estimator, and show that the mean squared error of this unbiased estimator is $\theta^{2} / n$.
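(A sketch: unbiasedness requires $k\theta = \theta$ for all $\theta$, so $k=1$, and then the bias is zero, giving)

$$
\operatorname{MSE}(\bar{X}) = \operatorname{Var}_{\theta}(\bar{X}) + 0^{2} = \frac{\theta^{2}}{n}.
$$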

(ii) Find the range of values of $k$ for which the mean squared error of $k \bar{X}$ is smaller than $\theta^{2} / n$.
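(A sketch, using the bias-variance decomposition above:)

$$
\operatorname{MSE}(k\bar{X}) = \frac{k^{2}\theta^{2}}{n} + (k-1)^{2}\theta^{2},
$$

which is smaller than $\theta^{2}/n$ if and only if $k^{2} + n(k-1)^{2} < 1$, i.e. $(n+1)k^{2} - 2nk + (n-1) < 0$. This quadratic in $k$ has discriminant $4n^{2} - 4(n+1)(n-1) = 4$ and hence roots $k = \frac{n-1}{n+1}$ and $k = 1$, so the condition holds exactly when $\frac{n-1}{n+1} < k < 1$.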