Consider the general linear model $Y = X\beta_0 + \varepsilon$, where $X$ is a known $n \times p$ design matrix with $p \geqslant 2$, $\beta_0 \in \mathbb{R}^p$ is an unknown vector of parameters, and $\varepsilon \in \mathbb{R}^n$ is a vector of stochastic errors with $\mathbb{E}(\varepsilon_i) = 0$, $\operatorname{var}(\varepsilon_i) = \sigma^2 > 0$ and $\operatorname{cov}(\varepsilon_i, \varepsilon_j) = 0$ for all $i, j = 1, \dots, n$ with $i \neq j$. Suppose $X$ has full column rank.
(a) Write down the least squares estimate $\hat\beta$ of $\beta_0$ and show that it minimises the least squares objective $S(\beta) = \|Y - X\beta\|^2$ over $\beta \in \mathbb{R}^p$.
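As a numerical sanity check (not part of what the question asks you to write), the closed-form estimate $\hat\beta = (X^TX)^{-1}X^TY$ can be tested against direct evaluation of $S$. A minimal sketch with NumPy, assuming a small simulated design and error scale chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3

# Simulated design and response (illustrative values, not from the question).
X = rng.normal(size=(n, p))
beta0 = np.array([1.0, -2.0, 0.5])
Y = X @ beta0 + rng.normal(scale=0.3, size=n)

# Closed-form least squares estimate: beta_hat = (X^T X)^{-1} X^T Y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

def S(beta):
    """Least squares objective S(beta) = ||Y - X beta||^2."""
    r = Y - X @ beta
    return r @ r

# S should never decrease under perturbations of beta_hat.
for _ in range(100):
    pert = beta_hat + rng.normal(scale=0.1, size=p)
    assert S(pert) >= S(beta_hat)

# Agreement with NumPy's own least squares solver.
lstsq_beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(np.allclose(beta_hat, lstsq_beta))  # → True
```

The perturbation loop is only a spot check; the actual proof proceeds by expanding $S(\beta) = S(\hat\beta) + \|X(\beta - \hat\beta)\|^2$ using the normal equations.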
(b) Write down the variance–covariance matrix $\operatorname{cov}(\hat\beta)$.
(c) Let $\tilde\beta \in \mathbb{R}^p$ minimise $S(\beta)$ over $\beta \in \mathbb{R}^p$ subject to $\beta_p = 0$. Let $Z$ be the $n \times (p-1)$ submatrix of $X$ that excludes the final column. Write down $\operatorname{cov}(\tilde\beta)$.
(d) Let $P$ and $P_0$ be the $n \times n$ orthogonal projections onto the column spaces of $X$ and $Z$ respectively. Show that $u^T P u \geqslant u^T P_0 u$ for all $u \in \mathbb{R}^n$.
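The inequality in (d) rests on the column-space nesting $\operatorname{col}(Z) \subseteq \operatorname{col}(X)$, which gives $PP_0 = P_0$ and hence $u^TPu - u^TP_0u = \|(P - P_0)u\|^2 \geqslant 0$. A numerical sketch of this, again with a simulated design (the `proj` helper and all numbers are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 4

# Simulated full-column-rank design; Z drops the final column of X.
X = rng.normal(size=(n, p))
Z = X[:, :-1]

def proj(A):
    """Orthogonal projection onto the column space of A: A (A^T A)^{-1} A^T."""
    return A @ np.linalg.solve(A.T @ A, A.T)

P, P0 = proj(X), proj(Z)

# col(Z) ⊆ col(X) implies P P0 = P0, so P - P0 is itself a projection.
assert np.allclose(P @ P0, P0)

# u^T P u >= u^T P0 u for every sampled u (small tolerance for roundoff).
for _ in range(1000):
    u = rng.normal(size=n)
    assert u @ P @ u >= u @ P0 @ u - 1e-10
print("inequality holds on all sampled u")
```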
(e) Show that, for all $x \in \mathbb{R}^p$,
\[
\operatorname{var}(x^T\tilde\beta) \leqslant \operatorname{var}(x^T\hat\beta).
\]
[Hint: Argue that $x = X^T u$ for some $u \in \mathbb{R}^n$.]
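Part (e) can also be checked numerically. The sketch below takes as assumptions the standard answers to (b) and (c), namely $\operatorname{cov}(\hat\beta) = \sigma^2(X^TX)^{-1}$ and $\operatorname{cov}(\tilde\beta) = \sigma^2(Z^TZ)^{-1}$ padded with a zero final row and column; treat these as inputs to the check rather than derivations, and note the design and $\sigma^2$ are simulated:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, sigma2 = 30, 4, 1.0

X = rng.normal(size=(n, p))  # simulated design (illustrative only)
Z = X[:, :-1]

# Assumed covariances: cov(beta_hat) = sigma^2 (X^T X)^{-1};
# cov(beta_tilde) is sigma^2 (Z^T Z)^{-1} with a zero final row/column
# (the constrained coordinate beta_p = 0 has zero variance).
C_hat = sigma2 * np.linalg.inv(X.T @ X)
C_til = np.zeros((p, p))
C_til[:-1, :-1] = sigma2 * np.linalg.inv(Z.T @ Z)

# var(x^T beta_tilde) <= var(x^T beta_hat) for every sampled x.
for _ in range(1000):
    x = rng.normal(size=p)
    assert x @ C_til @ x <= x @ C_hat @ x + 1e-10
print("variance inequality holds on all sampled x")
```

The hint explains why this holds in general: since $X^T$ has full row rank, any $x \in \mathbb{R}^p$ can be written as $x = X^Tu$, whence $\operatorname{var}(x^T\hat\beta) = \sigma^2 u^TPu$ and $\operatorname{var}(x^T\tilde\beta) = \sigma^2 u^TP_0u$, and (d) finishes the argument.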