When the real parameter $\Theta$ takes the value $\theta$, the variables $X_1, X_2, \dots$ arise independently from a distribution $P_\theta$ having density function $p_\theta(x)$ with respect to an underlying measure $\mu$. Define the score variable $U_n(\theta)$ and the information function $I_n(\theta)$ for estimation of $\Theta$ based on $X^n := (X_1, \dots, X_n)$, and relate $I_n(\theta)$ to $i(\theta) := I_1(\theta)$.
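The intended definitions, sketched here for reference under the stated i.i.d. model (the log-likelihood notation $\ell_n$ is introduced for convenience and is not part of the question as set):
\[
\ell_n(\theta) := \sum_{j=1}^{n} \log p_\theta(X_j), \qquad
U_n(\theta) := \frac{\partial \ell_n(\theta)}{\partial \theta}, \qquad
I_n(\theta) := \mathrm{var}_\theta\{U_n(\theta)\}.
\]
Since the $X_j$ are independent and identically distributed under $P_\theta$, the score $U_n(\theta)$ is a sum of $n$ independent copies of $U_1(\theta)$, whence $I_n(\theta) = n\, i(\theta)$.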
State and prove the Cramér-Rao inequality for the variance of an unbiased estimator of $\Theta$. Under what conditions does this inequality become an equality? What is the form of the estimator in this case? [You may assume $E_\theta\{U_n(\theta)\} = 0$, $\mathrm{var}_\theta\{U_n(\theta)\} = I_n(\theta)$, and any further required regularity conditions, without comment.]
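A sketch of the standard Cauchy-Schwarz argument, writing $T = T(X^n)$ for an unbiased estimator (the symbol $T$ is introduced here): differentiating $E_\theta(T) = \theta$ under the integral sign and using $E_\theta\{U_n(\theta)\} = 0$ gives $\mathrm{cov}_\theta\{T, U_n(\theta)\} = 1$, so by the Cauchy-Schwarz inequality
\[
1 = \left[\mathrm{cov}_\theta\{T, U_n(\theta)\}\right]^2 \;\le\; \mathrm{var}_\theta(T)\,\mathrm{var}_\theta\{U_n(\theta)\} = \mathrm{var}_\theta(T)\, I_n(\theta),
\]
that is, $\mathrm{var}_\theta(T) \ge 1/I_n(\theta)$. Equality holds if and only if $U_n(\theta)$ is ($P_\theta$-almost surely) a linear function of $T$, which forces $U_n(\theta) = I_n(\theta)\,(T - \theta)$; integrating this relation in $\theta$ shows that the density must then be of one-parameter exponential-family form, with $T$ as its natural statistic.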
Let $\Theta_n$ be the maximum likelihood estimator of $\Theta$ based on $X^n$. What is the asymptotic distribution of $n^{1/2}(\Theta_n - \Theta)$ when $\Theta = \theta$?
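For reference, under the usual regularity conditions the standard answer is
\[
n^{1/2}(\Theta_n - \theta) \xrightarrow{\ d\ } N\!\left(0,\; i(\theta)^{-1}\right) \qquad \text{as } n \to \infty,
\]
so that the MLE is asymptotically efficient: its limiting variance attains the Cramér-Rao bound $\{n\, i(\theta)\}^{-1}$.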
Suppose that, for each $n$, $\Theta_n$ is unbiased for $\Theta$, and the variance of $n^{1/2}(\Theta_n - \Theta)$ is exactly equal to its asymptotic variance. By considering the estimator $\alpha\Theta_k + (1-\alpha)\Theta_n$, or otherwise, show that, for $k < n$, $\mathrm{cov}_\theta(\Theta_k, \Theta_n) = \mathrm{var}_\theta(\Theta_n)$.
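One way the suggested route runs, sketched (the abbreviations $v_k$, $v_n$ and $c$ are introduced here): write $v_k := \mathrm{var}_\theta(\Theta_k)$, $v_n := \mathrm{var}_\theta(\Theta_n)$ and $c := \mathrm{cov}_\theta(\Theta_k, \Theta_n)$. By hypothesis $v_n = \{n\, i(\theta)\}^{-1}$, the Cramér-Rao bound for unbiased estimators based on $X^n$. For each real $\alpha$ the estimator $T_\alpha := \alpha\Theta_k + (1-\alpha)\Theta_n$ is unbiased and, since $k < n$, is a function of $X^n$, so the bound applies to it as well:
\[
v(\alpha) := \mathrm{var}_\theta(T_\alpha) = \alpha^2 v_k + (1-\alpha)^2 v_n + 2\alpha(1-\alpha)\, c \;\ge\; \{n\, i(\theta)\}^{-1} = v(0).
\]
Thus the quadratic $v(\alpha)$ attains its minimum at $\alpha = 0$, so $v'(0) = 2(c - v_n) = 0$, giving $\mathrm{cov}_\theta(\Theta_k, \Theta_n) = \mathrm{var}_\theta(\Theta_n)$ as required.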