Paper 1, Section I,
Part IB, 2014
Consider an estimator $\hat{\theta}$ of an unknown parameter $\theta$, and assume that $\mathbb{E}_{\theta}\bigl(\hat{\theta}^{2}\bigr) < \infty$ for all $\theta$. Define the bias and mean squared error of $\hat{\theta}$.
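For reference, the standard definitions (with $\mathbb{E}_{\theta}$ denoting expectation when the true parameter is $\theta$) are
\[
\operatorname{bias}(\hat{\theta}) = \mathbb{E}_{\theta}(\hat{\theta}) - \theta,
\qquad
\operatorname{MSE}(\hat{\theta}) = \mathbb{E}_{\theta}\bigl[(\hat{\theta} - \theta)^{2}\bigr].
\]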
Show that the mean squared error of $\hat{\theta}$ is the sum of its variance and the square of its bias.
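A short route to this identity: write $\hat{\theta} - \theta = (\hat{\theta} - \mathbb{E}_{\theta}\hat{\theta}) + (\mathbb{E}_{\theta}\hat{\theta} - \theta)$ and expand the square; the cross term vanishes because $\mathbb{E}_{\theta}\bigl[\hat{\theta} - \mathbb{E}_{\theta}\hat{\theta}\bigr] = 0$, leaving
\[
\mathbb{E}_{\theta}\bigl[(\hat{\theta} - \theta)^{2}\bigr]
= \mathbb{E}_{\theta}\bigl[(\hat{\theta} - \mathbb{E}_{\theta}\hat{\theta})^{2}\bigr]
+ (\mathbb{E}_{\theta}\hat{\theta} - \theta)^{2}
= \operatorname{Var}_{\theta}(\hat{\theta}) + \operatorname{bias}(\hat{\theta})^{2}.
\]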
Suppose that $X_1, \ldots, X_n$ are independent identically distributed random variables with mean $\theta$ and variance $\theta^{2}$, and consider estimators of $\theta$ of the form $k\bar{X}$, where $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$.
(i) Find the value of $k$ that gives an unbiased estimator, and show that the mean squared error of this unbiased estimator is $\theta^{2}/n$.
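A sketch, assuming the $X_i$ have mean $\theta$ and variance $\theta^{2}$ as in the setup above: since $\mathbb{E}_{\theta}(k\bar{X}) = k\theta$, unbiasedness for every $\theta$ forces $k = 1$, and then
\[
\operatorname{MSE}(\bar{X}) = \operatorname{Var}_{\theta}(\bar{X}) = \frac{\theta^{2}}{n}.
\]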
(ii) Find the range of values of $k$ for which the mean squared error of $k\bar{X}$ is smaller than $\theta^{2}/n$.
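A sketch under the same assumptions: for general $k$, the bias is $(k-1)\theta$ and the variance is $k^{2}\theta^{2}/n$, so
\[
\operatorname{MSE}(k\bar{X}) = \frac{k^{2}\theta^{2}}{n} + (k-1)^{2}\theta^{2},
\]
and requiring this to be less than $\theta^{2}/n$ reduces to $(k-1)\bigl[\tfrac{k+1}{n} + (k-1)\bigr] < 0$, i.e. $\frac{n-1}{n+1} < k < 1$.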