(i) Assume that the $n$-dimensional vector $Y$ may be written as $Y = X\beta + \epsilon$, where $X$ is a given $n \times p$ matrix of rank $p$, $\beta$ is an unknown vector, and
$$\epsilon \sim N_n(0, \sigma^2 I).$$
Let $Q(\beta) = (Y - X\beta)^T (Y - X\beta)$. Find $\hat{\beta}$, the least-squares estimator of $\beta$, and state without proof the joint distribution of $\hat{\beta}$ and $Q(\hat{\beta})$.
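As a numerical sanity check (not part of the question), the closed-form solution $\hat{\beta} = (X^T X)^{-1} X^T Y$ and the residual sum of squares $Q(\hat{\beta})$ can be verified against a generic least-squares solver; the dimensions, seed, and true $\beta$ below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.standard_normal((n, p))        # full column rank with probability 1
beta = np.array([1.0, -2.0, 0.5])      # arbitrary "true" parameter
Y = X @ beta + rng.standard_normal(n)  # Y = X beta + eps, eps ~ N_n(0, I)

# Closed-form least-squares estimator: beta_hat = (X^T X)^{-1} X^T Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# Residual sum of squares Q(beta_hat) = (Y - X beta_hat)^T (Y - X beta_hat)
resid = Y - X @ beta_hat
Q = resid @ resid

# Agrees with a generic least-squares solver
beta_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)
assert np.allclose(beta_hat, beta_lstsq)
print(beta_hat, Q)
```

Recall that $Q(\hat{\beta})/\sigma^2$ carries the $n - p$ residual degrees of freedom, which is what the quoted joint distribution in part (i) rests on.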
(ii) Now suppose that we have observations $(Y_{ij} : 1 \leqslant i \leqslant I,\ 1 \leqslant j \leqslant J)$ and consider the model
$$\Omega: \quad Y_{ij} = \mu + \alpha_i + \beta_j + \epsilon_{ij},$$
where $(\alpha_i)$, $(\beta_j)$ are fixed parameters with $\sum_i \alpha_i = 0$, $\sum_j \beta_j = 0$, and $(\epsilon_{ij})$ may be assumed independent normal variables, with $\epsilon_{ij} \sim N(0, \sigma^2)$, where $\sigma^2$ is unknown.
(a) Find $(\hat{\alpha}_i)$, $(\hat{\beta}_j)$, the least-squares estimators of $(\alpha_i)$, $(\beta_j)$.
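For part (a), the standard estimators are $\hat{\mu} = \bar{Y}_{\cdot\cdot}$, $\hat{\alpha}_i = \bar{Y}_{i\cdot} - \bar{Y}_{\cdot\cdot}$, $\hat{\beta}_j = \bar{Y}_{\cdot j} - \bar{Y}_{\cdot\cdot}$. A minimal numerical sketch (simulated data, arbitrary dimensions) checks that these satisfy the sum-to-zero constraints and the normal equations (residuals with zero row and column sums):

```python
import numpy as np

rng = np.random.default_rng(1)
I, J = 4, 5
# Simulated two-way layout: Y_ij = mu + alpha_i + beta_j + eps_ij
alpha = rng.standard_normal(I); alpha -= alpha.mean()   # sum to zero
beta = rng.standard_normal(J); beta -= beta.mean()      # sum to zero
Y = 2.0 + alpha[:, None] + beta[None, :] + 0.1 * rng.standard_normal((I, J))

# Row/column-mean estimators under the sum-to-zero constraints
mu_hat = Y.mean()
alpha_hat = Y.mean(axis=1) - mu_hat   # row mean minus grand mean
beta_hat = Y.mean(axis=0) - mu_hat    # column mean minus grand mean

# Constraints hold, and the residuals satisfy the normal equations:
resid = Y - (mu_hat + alpha_hat[:, None] + beta_hat[None, :])
assert abs(alpha_hat.sum()) < 1e-10 and abs(beta_hat.sum()) < 1e-10
assert np.allclose(resid.sum(axis=0), 0) and np.allclose(resid.sum(axis=1), 0)
```

The zero row and column sums of the residuals are exactly the first-order conditions obtained by differentiating $Q$ with respect to $\alpha_i$ and $\beta_j$.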
(b) Find the least-squares estimators of $(\alpha_i)$ under the hypothesis $H_0: \beta_j = 0$ for all $j$.
(c) Quoting any general theorems required, explain carefully how to test $H_0$, assuming $\Omega$ is true.
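The test in (c) compares residual sums of squares under $H_0$ and $\Omega$ via the statistic $F = \frac{(\mathrm{RSS}_0 - \mathrm{RSS}_\Omega)/(J-1)}{\mathrm{RSS}_\Omega/((I-1)(J-1))} \sim F_{J-1,\,(I-1)(J-1)}$ under $H_0$. A sketch on simulated data (generated under $H_0$, so large $F$ values should be rare; scipy is used only for the tail probability):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
I, J = 6, 4
# Data simulated under H0 (row effects only, no column effects)
Y = 1.0 + rng.standard_normal(I)[:, None] + 0.5 * rng.standard_normal((I, J))

mu_hat = Y.mean()
alpha_hat = Y.mean(axis=1) - mu_hat
beta_hat = Y.mean(axis=0) - mu_hat

# RSS under the full model Omega and under H0: beta_j = 0 for all j
rss_full = ((Y - (mu_hat + alpha_hat[:, None] + beta_hat[None, :])) ** 2).sum()
rss_h0 = ((Y - (mu_hat + alpha_hat[:, None])) ** 2).sum()

# F-statistic: F ~ F_{J-1, (I-1)(J-1)} under H0
df1, df2 = J - 1, (I - 1) * (J - 1)
F = ((rss_h0 - rss_full) / df1) / (rss_full / df2)
p_value = stats.f.sf(F, df1, df2)   # upper-tail probability
print(F, p_value)
```

The nesting of the models guarantees $\mathrm{RSS}_0 \geqslant \mathrm{RSS}_\Omega$, and the difference equals $J \sum_j \hat{\beta}_j^2$, the quantity the test measures.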
(d) What would be the effect of fitting the model $\Omega_1: Y_{ij} = \mu + \alpha_i + \beta_j + \gamma_{ij} + \epsilon_{ij}$, where now $(\alpha_i)$, $(\beta_j)$, $(\gamma_{ij})$ are all fixed unknown parameters, and $(\epsilon_{ij})$ has the distribution given above?
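For intuition on (d): with a free $\gamma_{ij}$ in every cell, $\Omega_1$ has at least as many free parameters as observations, so the least-squares fit interpolates the data exactly and leaves no residual degrees of freedom for estimating $\sigma^2$. A minimal sketch (the overparameterized design layout below is one illustrative choice; the minimum-norm least-squares solution handles the rank deficiency):

```python
import numpy as np

rng = np.random.default_rng(3)
I, J = 3, 4
Y = rng.standard_normal((I, J)).ravel()

# Overparameterized design for Omega_1: columns for mu, each alpha_i,
# each beta_j, and each gamma_ij (illustrative column ordering)
rows = []
for i in range(I):
    for j in range(J):
        row = np.zeros(1 + I + J + I * J)
        row[0] = 1.0                      # mu
        row[1 + i] = 1.0                  # alpha_i
        row[1 + I + j] = 1.0              # beta_j
        row[1 + I + J + i * J + j] = 1.0  # gamma_ij
        rows.append(row)
X = np.array(rows)

theta, *_ = np.linalg.lstsq(X, Y, rcond=None)  # minimum-norm LS solution
rss = ((Y - X @ theta) ** 2).sum()
print(rss)  # essentially zero: the fit is exact, so sigma^2 cannot be estimated
```

Since the $\gamma_{ij}$ indicator columns alone already span all $IJ$ observations, the fitted values equal $Y_{ij}$ and the residual sum of squares vanishes.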