(i) Let $X_1,\ldots,X_n$ be independent, identically-distributed $N(\mu,\mu^2)$ random variables, $\mu>0$.
Find a minimal sufficient statistic for $\mu$.
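(A sketch of one standard route, using the likelihood-ratio characterisation of minimal sufficiency: the likelihood is
$$L(\mu;x) = (2\pi\mu^2)^{-n/2}\exp\Bigl\{-\frac{1}{2\mu^2}\sum_{i=1}^{n}x_i^2 + \frac{1}{\mu}\sum_{i=1}^{n}x_i - \frac{n}{2}\Bigr\},$$
and the ratio $L(\mu;x)/L(\mu;y)$ is free of $\mu$ precisely when $\sum_i x_i = \sum_i y_i$ and $\sum_i x_i^2 = \sum_i y_i^2$, so $\bigl(\sum_{i=1}^{n}X_i,\ \sum_{i=1}^{n}X_i^2\bigr)$ is minimal sufficient.)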
Let $T_1 = n^{-1}\sum_{i=1}^{n} X_i$ and $T_2 = \bigl(n^{-1}\sum_{i=1}^{n} X_i^2\bigr)^{1/2}$. Write down the distribution of $X_i/\mu$, and hence show that $Z = T_1/T_2$ is ancillary. Explain briefly why the Conditionality Principle would lead to inference about $\mu$ being drawn from the conditional distribution of $T_2$ given $Z$.
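(A sketch: writing $Y_i = X_i/\mu$, the $Y_i$ are independent $N(1,1)$, and since $T_1 = \mu\, n^{-1}\sum_i Y_i$ and $T_2 = \mu\bigl(n^{-1}\sum_i Y_i^2\bigr)^{1/2}$ for $\mu>0$, the factors of $\mu$ cancel in
$$Z = \frac{T_1}{T_2} = \frac{n^{-1}\sum_{i=1}^{n}Y_i}{\bigl(n^{-1}\sum_{i=1}^{n}Y_i^2\bigr)^{1/2}},$$
so the distribution of $Z$ is free of $\mu$: $Z$ is ancillary.)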
What is the maximum likelihood estimator of $\mu$?
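(As a sketch, with constants suppressed: the log-likelihood is $\ell(\mu) = -n\log\mu - \frac{1}{2\mu^2}\sum_i x_i^2 + \frac{1}{\mu}\sum_i x_i$, and setting $\ell'(\mu)=0$ and multiplying through by $\mu^3/n$ gives the quadratic $\mu^2 + T_1\mu - T_2^2 = 0$, whose unique positive root is
$$\hat{\mu} = \tfrac{1}{2}\bigl(-T_1 + \sqrt{T_1^2 + 4T_2^2}\bigr).$$)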
(ii) Describe briefly the Bayesian approach to predictive inference.
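(In outline, for a generic parameter $\theta$ with prior density $\pi(\theta)$: the Bayesian predictive density of a future observation $z_0$ given data $z = (z_1,\ldots,z_n)$ averages the sampling density over the posterior,
$$p(z_0 \mid z) = \int p(z_0 \mid \theta)\,\pi(\theta \mid z)\,d\theta,$$
and predictive statements, such as intervals for $z_0$, are based on this density.)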
Let $Z_1,\ldots,Z_n$ be independent, identically-distributed $N(\mu,\sigma^2)$ random variables, with $\mu,\sigma^2$ both unknown. Derive the maximum likelihood estimators $\hat{\mu},\hat{\sigma}^2$ of $\mu,\sigma^2$ based on $Z_1,\ldots,Z_n$, and state, without proof, their joint distribution.
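(For reference, a sketch of the standard results: maximising the normal log-likelihood gives
$$\hat{\mu} = \bar{Z} = n^{-1}\sum_{i=1}^{n}Z_i, \qquad \hat{\sigma}^2 = n^{-1}\sum_{i=1}^{n}(Z_i - \bar{Z})^2,$$
with $\hat{\mu}$ and $\hat{\sigma}^2$ independent, $\hat{\mu} \sim N(\mu,\sigma^2/n)$ and $n\hat{\sigma}^2/\sigma^2 \sim \chi^2_{n-1}$.)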
Suppose that it is required to construct a prediction interval
$I_{1-\alpha} \equiv I_{1-\alpha}(Z_1,\ldots,Z_n)$ for a future, independent random variable $Z_0$ with the same $N(\mu,\sigma^2)$ distribution, such that
$$P(Z_0 \in I_{1-\alpha}) = 1 - \alpha,$$
with the probability taken over the joint distribution of $Z_0, Z_1,\ldots,Z_n$. Let