A1.13

Computational Statistics and Statistical Modelling
Part II, 2004

(i) Assume that the $n$-dimensional vector $Y$ may be written as $Y = X\beta + \epsilon$, where $X$ is a given $n \times p$ matrix of rank $p$, $\beta$ is an unknown vector, and

$$\epsilon \sim N_n\left(0, \sigma^2 I\right).$$

Let $Q(\beta) = (Y - X\beta)^T (Y - X\beta)$. Find $\hat{\beta}$, the least-squares estimator of $\beta$, and state without proof the joint distribution of $\hat{\beta}$ and $Q(\hat{\beta})$.
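A sketch of the standard derivation: setting $\nabla_\beta Q = -2X^T(Y - X\beta) = 0$ gives the normal equations $X^T X \hat{\beta} = X^T Y$, so, since $X$ has rank $p$,

$$\hat{\beta} = \left(X^T X\right)^{-1} X^T Y.$$

The usual normal-theory results then give, jointly,

$$\hat{\beta} \sim N_p\left(\beta, \sigma^2 \left(X^T X\right)^{-1}\right), \qquad \frac{Q(\hat{\beta})}{\sigma^2} \sim \chi^2_{n-p},$$

with $\hat{\beta}$ and $Q(\hat{\beta})$ independent.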

(ii) Now suppose that we have observations $(Y_{ij},\ 1 \leqslant i \leqslant I,\ 1 \leqslant j \leqslant J)$ and consider the model

$$\Omega: \quad Y_{ij} = \mu + \alpha_i + \beta_j + \epsilon_{ij},$$

where $(\alpha_i), (\beta_j)$ are fixed parameters with $\sum \alpha_i = 0$, $\sum \beta_j = 0$, and $(\epsilon_{ij})$ may be assumed to be independent normal variables, with $\epsilon_{ij} \sim N(0, \sigma^2)$, where $\sigma^2$ is unknown.
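Note that $\Omega$ is a case of the model in (i) with $n = IJ$ and, once the constraints $\sum \alpha_i = \sum \beta_j = 0$ are imposed, $p = 1 + (I-1) + (J-1) = I + J - 1$ free parameters; the constraints make the design matrix of full rank.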

(a) Find $(\hat{\alpha}_i), (\hat{\beta}_j)$, the least-squares estimators of $(\alpha_i), (\beta_j)$.
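A sketch, writing $\bar{Y}_{i.} = J^{-1} \sum_j Y_{ij}$, $\bar{Y}_{.j} = I^{-1} \sum_i Y_{ij}$ and $\bar{Y}_{..} = (IJ)^{-1} \sum_{i,j} Y_{ij}$: minimising $\sum_{i,j} (Y_{ij} - \mu - \alpha_i - \beta_j)^2$ subject to the constraints gives

$$\hat{\mu} = \bar{Y}_{..}, \qquad \hat{\alpha}_i = \bar{Y}_{i.} - \bar{Y}_{..}, \qquad \hat{\beta}_j = \bar{Y}_{.j} - \bar{Y}_{..}.$$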

(b) Find the least-squares estimators of $(\alpha_i)$ under the hypothesis $H_0: \beta_j = 0$ for all $j$.
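A sketch: under $H_0$ the model reduces to $Y_{ij} = \mu + \alpha_i + \epsilon_{ij}$, and the same minimisation gives

$$\hat{\mu} = \bar{Y}_{..}, \qquad \hat{\alpha}_i = \bar{Y}_{i.} - \bar{Y}_{..},$$

unchanged from (a): in this balanced design the row and column contrasts are orthogonal.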

(c) Quoting any general theorems required, explain carefully how to test $H_0$, assuming $\Omega$ is true.
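A sketch of the standard test: by the general theorem on nested normal linear models, $RSS_\Omega / \sigma^2 \sim \chi^2_{(I-1)(J-1)}$ independently of the extra sum of squares explained by the column effects, and under $H_0$

$$F = \frac{\left(RSS_{H_0} - RSS_\Omega\right) / (J-1)}{RSS_\Omega / \big((I-1)(J-1)\big)} \sim F_{J-1,\,(I-1)(J-1)},$$

where $RSS_\Omega = \sum_{i,j} \left(Y_{ij} - \bar{Y}_{i.} - \bar{Y}_{.j} + \bar{Y}_{..}\right)^2$ and $RSS_{H_0} - RSS_\Omega = I \sum_j \left(\bar{Y}_{.j} - \bar{Y}_{..}\right)^2$; one rejects $H_0$ when $F$ is improbably large.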

(d) What would be the effect of fitting the model $\Omega_1: Y_{ij} = \mu + \alpha_i + \beta_j + \gamma_{ij} + \epsilon_{ij}$, where now $(\alpha_i), (\beta_j), (\gamma_{ij})$ are all fixed unknown parameters, and $(\epsilon_{ij})$ has the distribution given above?
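A sketch: with only one observation per $(i,j)$ cell, $\Omega_1$ is saturated, so the least-squares fit reproduces the data exactly ($\hat{Y}_{ij} = Y_{ij}$), the residual sum of squares is $0$ on $0$ degrees of freedom, and $\sigma^2$ cannot be estimated; in particular no test of $H_0$ is then possible.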