Paper 3, Section II, 18H

Statistics
Part IB, 2021

Consider the normal linear model $Y = X\beta + \varepsilon$, where $X$ is a known $n \times p$ design matrix with $n - 2 > p \geqslant 1$, $\beta \in \mathbb{R}^p$ is an unknown vector of parameters, and $\varepsilon \sim N_n(0, \sigma^2 I)$ is a vector of normal errors with each component having variance $\sigma^2 > 0$. Suppose $X$ has full column rank.

(i) Write down the maximum likelihood estimators, $\hat{\beta}$ and $\hat{\sigma}^2$, for $\beta$ and $\sigma^2$ respectively. [You need not derive these.]
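For reference, the standard answers in the full-rank Gaussian model are $\hat{\beta} = (X^TX)^{-1}X^TY$ and $\hat{\sigma}^2 = \frac{1}{n}\|Y - X\hat{\beta}\|^2$. A minimal numerical sketch (the design matrix, true parameters, and seed below are hypothetical; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 50, 3, 2.0
X = rng.standard_normal((n, p))    # hypothetical full-rank design
beta = np.array([1.0, -2.0, 0.5])  # hypothetical true parameters
Y = X @ beta + sigma * rng.standard_normal(n)

# MLE of beta: the least-squares solution (X^T X)^{-1} X^T Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
# MLE of sigma^2: residual sum of squares divided by n
sigma2_hat = np.sum((Y - X @ beta_hat) ** 2) / n
```

Note that the maximum-likelihood divisor is $n$; dividing the residual sum of squares by $n - p$ instead would give the unbiased variance estimator.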

(ii) Show that $\hat{\beta}$ is independent of $\hat{\sigma}^2$.

(iii) Find the distributions of $\hat{\beta}$ and $n\hat{\sigma}^2/\sigma^2$.
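A proposed answer to (iii) can be sanity-checked by simulation: for instance, $n\hat{\sigma}^2/\sigma^2$ should have the mean $n - p$ and variance $2(n - p)$ of a $\chi^2_{n-p}$ distribution, whatever the true $\beta$. A sketch with an arbitrary design, parameters, and seed (NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, sigma = 40, 4, 1.5
X = rng.standard_normal((n, p))          # arbitrary fixed full-rank design
beta = np.array([0.7, -1.2, 0.0, 2.0])   # arbitrary true parameters
reps = 20_000

stats = np.empty(reps)
for i in range(reps):
    Y = X @ beta + sigma * rng.standard_normal(n)
    beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
    # residual sum of squares over sigma^2 equals n * sigma2_hat / sigma^2
    stats[i] = np.sum((Y - X @ beta_hat) ** 2) / sigma**2

# A chi^2_{n-p} variable has mean n - p = 36 and variance 2(n - p) = 72
print(stats.mean(), stats.var())
```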

(iv) Consider the following test statistic for testing the null hypothesis $H_0 : \beta = 0$ against the alternative $\beta \neq 0$:

$$T := \frac{\|\hat{\beta}\|^{2}}{n \hat{\sigma}^{2}}.$$

Let $\lambda_1 \geqslant \lambda_2 \geqslant \cdots \geqslant \lambda_p > 0$ be the eigenvalues of $X^T X$. Show that under $H_0$, $T$ has the same distribution as

$$\frac{\sum_{j=1}^{p} \lambda_j^{-1} W_j}{Z},$$

where $Z \sim \chi^2_{n-p}$ and $W_1, \ldots, W_p$ are independent $\chi^2_1$ random variables, independent of $Z$.

[Hint: You may use the fact that $X = UDV^T$, where $U \in \mathbb{R}^{n \times p}$ has orthonormal columns, $V \in \mathbb{R}^{p \times p}$ is an orthogonal matrix, and $D \in \mathbb{R}^{p \times p}$ is a diagonal matrix with $D_{ii} = \sqrt{\lambda_i}$.]
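The distributional identity in (iv) lends itself to a Monte Carlo check: simulate $T$ directly from the model under $H_0$, simulate the claimed representation $\sum_j \lambda_j^{-1} W_j / Z$, and compare summary statistics (both means should approach $\sum_j \lambda_j^{-1} / (n - p - 2)$, by independence of numerator and denominator). A sketch with an arbitrary design and seed (NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, sigma = 30, 3, 1.0
X = rng.standard_normal((n, p))          # arbitrary fixed full-rank design
lam = np.linalg.eigvalsh(X.T @ X)        # eigenvalues of X^T X
reps = 20_000

T = np.empty(reps)
ratio = np.empty(reps)
for i in range(reps):
    # T computed directly from the model under H0 (beta = 0)
    Y = sigma * rng.standard_normal(n)
    beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
    rss = np.sum((Y - X @ beta_hat) ** 2)   # = n * sigma2_hat
    T[i] = np.sum(beta_hat**2) / rss
    # the claimed representation: weighted chi^2_1's over an independent chi^2_{n-p}
    W = rng.chisquare(1, size=p)
    Z = rng.chisquare(n - p)
    ratio[i] = np.sum(W / lam) / Z

# Both empirical means should be close to sum(1/lam) / (n - p - 2)
print(T.mean(), ratio.mean())
```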

(v) Find $\mathbb{E}\, T$ when $\beta \neq 0$. [Hint: If $R \sim \chi^2_\nu$ with $\nu > 2$, then $\mathbb{E}(1/R) = 1/(\nu - 2)$.]
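The hint's inverse-moment fact is itself easy to verify numerically; for example with $\nu = 10$ the formula gives $\mathbb{E}(1/R) = 1/8 = 0.125$:

```python
import numpy as np

rng = np.random.default_rng(3)
nu = 10  # any nu > 2 works; the hint gives E(1/R) = 1/(nu - 2)
R = rng.chisquare(nu, size=200_000)
print(np.mean(1 / R))   # ≈ 0.125 = 1/(nu - 2)
```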