Paper 1, Section II, I

Principles of Statistics
Part II, 2009

(i) Let $X_1, \ldots, X_n$ be independent and identically distributed random variables, having the exponential distribution $\mathcal{E}(\lambda)$ with density $p(x \mid \lambda)=\lambda \exp(-\lambda x)$ for $x, \lambda>0$. Show that $T_n=\sum_{i=1}^{n} X_i$ is minimal sufficient and complete for $\lambda$.

[You may assume uniqueness of Laplace transforms.]
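A sketch of the intended argument for part (i), using the factorization criterion and the stated uniqueness of Laplace transforms (not part of the question as set):

```latex
% The joint density factorizes through T_n alone:
\[
  p(x_1,\ldots,x_n \mid \lambda)
  = \lambda^n \exp\Bigl(-\lambda \sum_{i=1}^{n} x_i\Bigr)
  = \lambda^n e^{-\lambda T_n},
\]
% so T_n is sufficient; the likelihood ratio at two sample points depends on
% the data only through the value of T_n, giving minimal sufficiency.
% For completeness: T_n has the Gamma(n, lambda) distribution, so for any
% measurable g with E_lambda g(T_n) = 0 for all lambda > 0,
\[
  \int_0^\infty g(t)\,\frac{\lambda^n t^{n-1}}{(n-1)!}\,e^{-\lambda t}\,dt = 0
  \quad \text{for all } \lambda > 0,
\]
% i.e. the Laplace transform of t \mapsto g(t)\,t^{n-1} vanishes identically,
% which by the assumed uniqueness forces g = 0 almost everywhere.
```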

(ii) For given $x>0$, it is desired to estimate the quantity $\phi=\operatorname{Prob}\left(X_1>x \mid \lambda\right)$. Compute the Fisher information for $\phi$.
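For part (ii), one route to the answer: since $\phi = e^{-\lambda x}$, the information for $\phi$ follows from the per-observation information for $\lambda$ via the reparametrization rule. A sketch:

```latex
% phi = e^{-lambda x}, so lambda = -(log phi)/x and dlambda/dphi = -1/(x phi).
% For E(lambda), the per-observation Fisher information is i(lambda) = 1/lambda^2.
\[
  i(\phi) = i(\lambda)\left(\frac{d\lambda}{d\phi}\right)^{2}
  = \frac{1}{\lambda^{2}}\cdot\frac{1}{x^{2}\phi^{2}}
  = \frac{1}{\phi^{2}(\log\phi)^{2}},
\]
% using lambda x = -log phi; the information from the full sample is n i(phi).
```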

(iii) State the Lehmann–Scheffé theorem. Show that the estimator $\tilde{\phi}_n$ of $\phi$ defined by

\[
\tilde{\phi}_n = \begin{cases} 0, & \text{if } T_n < x, \\[4pt] \left(1-\dfrac{x}{T_n}\right)^{n-1}, & \text{if } T_n \geqslant x \end{cases}
\]

is the minimum variance unbiased estimator of $\phi$ based on $\left(X_1, \ldots, X_n\right)$. Without doing any computations, state whether or not the variance of $\tilde{\phi}_n$ achieves the Cramér–Rao lower bound, justifying your answer briefly.
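On the Cramér–Rao question, the standard reasoning (a hint, not a full solution):

```latex
% Equality in the Cramer-Rao inequality holds only when the estimator is an
% affine function of the score, i.e.
\[
  \tilde{\phi}_n - \phi = c(\phi)\,\frac{\partial}{\partial\phi}\log p(X_1,\ldots,X_n \mid \phi)
\]
% for some function c(phi).  Here the score is affine in T_n, while
% tilde-phi_n is a nonlinear function of T_n, so the bound is not attained.
```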

Let $k \leqslant n$. Show that $\mathbb{E}\left(\tilde{\phi}_k \mid T_n, \lambda\right)=\tilde{\phi}_n$.
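A sketch of the final identity, resting on completeness of $T_n$ from part (i):

```latex
% tilde-phi_k is unbiased for phi (it is the Rao-Blackwellization of the
% indicator 1{X_1 > x} given T_k), so h(T_n) := E(tilde-phi_k | T_n) is an
% unbiased estimator of phi that is a function of the complete sufficient
% statistic T_n.  Since tilde-phi_n shares both properties, completeness gives
\[
  \mathbb{E}_\lambda\bigl[h(T_n)-\tilde{\phi}_n\bigr]=0 \ \text{for all } \lambda>0
  \ \Longrightarrow\ h(T_n)=\tilde{\phi}_n \ \text{almost surely.}
\]
```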