Consider the linear model
$$Y = X\beta + \varepsilon, \tag{$\dagger$}$$
where $X$ is a known $n \times p$ matrix, $\beta$ is a $p \times 1$ ($p < n$) vector of unknown parameters, and $\varepsilon$ is an $n \times 1$ vector of independent $N(0, \sigma^2)$ random variables with $\sigma^2$ unknown. Assume that $X$ has full rank $p$. Find the least squares estimator $\hat\beta$ of $\beta$ and derive its distribution. Define the residual sum of squares $\mathrm{RSS}$ and write down an unbiased estimator $\hat\sigma^2$ of $\sigma^2$.
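As a numerical check on the quantities the question asks for, here is a minimal NumPy sketch on synthetic data (the design matrix, true $\beta$ and $\sigma$ below are illustrative choices, not part of the question). It computes the least squares estimator $\hat\beta = (X^T X)^{-1} X^T Y$, the residual sum of squares $\mathrm{RSS} = \|Y - X\hat\beta\|^2$, and the unbiased estimator $\hat\sigma^2 = \mathrm{RSS}/(n-p)$.

```python
import numpy as np

# Illustrative synthetic data: a full-rank n x p design with n > p
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
sigma = 1.5
Y = X @ beta_true + rng.normal(scale=sigma, size=n)

# Least squares estimator: solves the normal equations X^T X beta = X^T Y
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Residual sum of squares and the unbiased variance estimator RSS / (n - p)
residuals = Y - X @ beta_hat
RSS = np.sum(residuals**2)
sigma2_hat = RSS / (n - p)
```

The fitted residuals are orthogonal to the column space of $X$, so $X^T(Y - X\hat\beta) = 0$ up to floating-point error; this is a quick way to verify the computation.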
Suppose that $V_i = a + b u_i + \delta_i$ and $Z_i = c + d w_i + \eta_i$, for $i = 1, \ldots, m$, where the $u_i$ and $w_i$ are known with $\sum_{i=1}^m u_i = \sum_{i=1}^m w_i = 0$, and $\delta_1, \ldots, \delta_m, \eta_1, \ldots, \eta_m$ are independent $N(0, \sigma^2)$ random variables. Assume that at least two of the $u_i$ are distinct and at least two of the $w_i$ are distinct. Show that $Y = (V_1, \ldots, V_m, Z_1, \ldots, Z_m)^T$ (where $T$ denotes transpose) may be written as in $(\dagger)$ and identify $X$ and $\beta$. Find $\hat\beta$ in terms of the $V_i$, $Z_i$, $u_i$ and $w_i$. Find the distribution of $\hat b - \hat d$ and derive a $95\%$ confidence interval for $b - d$.
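The two-regression setup can be checked numerically. Because $\sum u_i = \sum w_i = 0$, the design matrix with $\beta = (a, b, c, d)^T$ has a diagonal $X^T X$, so the components of $\hat\beta$ take closed forms, e.g. $\hat b = \sum u_i V_i / \sum u_i^2$. The sketch below (illustrative parameter values; the hard-coded $t$ quantile is an approximation for the chosen $m$) builds the $95\%$ confidence interval for $b - d$ using $\hat\sigma^2 = \mathrm{RSS}/(2m-4)$, since here $n = 2m$ and $p = 4$.

```python
import numpy as np

# Illustrative synthetic data satisfying sum(u) = sum(w) = 0
rng = np.random.default_rng(1)
m = 20
u = rng.normal(size=m); u -= u.mean()
w = rng.normal(size=m); w -= w.mean()
a, b, c, d, sigma = 0.5, 2.0, -1.0, 1.2, 1.0
V = a + b * u + rng.normal(scale=sigma, size=m)
Z = c + d * w + rng.normal(scale=sigma, size=m)

# Closed-form least squares estimates (X^T X is diagonal since sum u = sum w = 0)
a_hat, c_hat = V.mean(), Z.mean()
b_hat = np.sum(u * V) / np.sum(u**2)
d_hat = np.sum(w * Z) / np.sum(w**2)

# RSS with n = 2m observations and p = 4 parameters
RSS = np.sum((V - a_hat - b_hat * u)**2) + np.sum((Z - c_hat - d_hat * w)**2)
sigma2_hat = RSS / (2 * m - 4)

# Var(b_hat - d_hat) = sigma^2 (1/sum u^2 + 1/sum w^2); studentise with sigma2_hat
se = np.sqrt(sigma2_hat * (1 / np.sum(u**2) + 1 / np.sum(w**2)))
t_crit = 2.028  # approx. 97.5% point of t_{36} for m = 20; in general use the t_{2m-4} quantile
ci = (b_hat - d_hat - t_crit * se, b_hat - d_hat + t_crit * se)
```

The interval is centred at $\hat b - \hat d$, reflecting that $(\hat b - \hat d - (b - d))/\mathrm{se}$ has a $t_{2m-4}$ distribution by the hint (normal numerator independent of the $\chi^2_{2m-4}$ variance estimate).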
[Hint: You may assume that $\mathrm{RSS}/\sigma^2$ has a $\chi^2_{n-p}$ distribution, and that $\hat\beta$ and the residual sum of squares are independent. Properties of $\chi^2$ distributions may be used without proof.]