Consider a linear model $Y = X\beta + \varepsilon$, where $Y$ is an $n \times 1$ vector of observations, $X$ is a known $n \times p$ matrix, $\beta$ is a $p \times 1$ ($p < n$) vector of unknown parameters, and $\varepsilon$ is an $n \times 1$ vector of independent normally distributed random variables, each with mean zero and unknown variance $\sigma^2$. Write down the log-likelihood and show that the maximum likelihood estimators $\hat\beta$ and $\hat\sigma^2$ of $\beta$ and $\sigma^2$ respectively satisfy
$$X^T X \hat\beta = X^T Y, \qquad \frac{1}{\hat\sigma^4}\,(Y - X\hat\beta)^T (Y - X\hat\beta) = \frac{n}{\hat\sigma^2}$$
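For orientation (this is a sketch of the standard derivation, not part of the question as set), the two displayed equations are the stationarity conditions of the Gaussian log-likelihood

```latex
\ell(\beta, \sigma^2)
  = -\frac{n}{2}\log\bigl(2\pi\sigma^2\bigr)
    - \frac{1}{2\sigma^2}\,(Y - X\beta)^T (Y - X\beta),
\qquad
\frac{\partial \ell}{\partial \beta}
  = \frac{1}{\sigma^2}\,X^T (Y - X\beta),
\qquad
\frac{\partial \ell}{\partial \sigma^2}
  = -\frac{n}{2\sigma^2}
    + \frac{1}{2\sigma^4}\,(Y - X\beta)^T (Y - X\beta);
```

setting both partial derivatives to zero at $(\hat\beta, \hat\sigma^2)$ gives exactly the normal equations $X^T X \hat\beta = X^T Y$ and the $\hat\sigma^2$ equation above.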
($^T$ denotes the transpose). Assuming that $X^T X$ is invertible, find the solutions $\hat\beta$ and $\hat\sigma^2$ of these equations and write down their distributions.
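The closed-form solutions can be checked numerically. The following sketch (with hypothetical data; the sizes `n = 50`, `p = 3` and the parameter values are assumptions for illustration) solves the normal equations and compares the result with NumPy's least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example data: n = 50 observations, p = 3 parameters.
n, p = 50, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
Y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Solve the normal equations X^T X beta_hat = X^T Y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# MLE of sigma^2: residual sum of squares divided by n (not n - p).
resid = Y - X @ beta_hat
sigma2_hat = resid @ resid / n

# Cross-check beta_hat against NumPy's least-squares solver,
# which minimises the same residual sum of squares.
beta_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)
```

Note that the maximum likelihood estimator $\hat\sigma^2$ divides the residual sum of squares by $n$; the unbiased estimator would divide by $n - p$.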
Prove that $\hat\beta$ and $\hat\sigma^2$ are independent.
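Independence can be illustrated (though of course not proved) by simulation: over repeated samples from a fixed design, components of $\hat\beta$ should be uncorrelated with $\hat\sigma^2$. A minimal Monte Carlo sketch, with an assumed design and parameter values chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical fixed design: n = 20 observations, p = 2 parameters
# (intercept and slope).
n = 20
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
beta = np.array([1.0, 0.2])
sigma = 0.5

reps = 10000
slope_hat = np.empty(reps)   # second component of beta_hat
sigma2_hat = np.empty(reps)  # MLE of sigma^2
XtX_inv_Xt = np.linalg.solve(X.T @ X, X.T)  # (X^T X)^{-1} X^T
for r in range(reps):
    Y = X @ beta + rng.normal(scale=sigma, size=n)
    bh = XtX_inv_Xt @ Y
    res = Y - X @ bh
    slope_hat[r] = bh[1]
    sigma2_hat[r] = res @ res / n

# If beta_hat and sigma^2_hat are independent, their sample
# correlation over many replications should be close to zero.
corr = np.corrcoef(slope_hat, sigma2_hat)[0, 1]
```

The proof itself follows the route suggested by the hint in brackets below: $\hat\beta$ is a function of $X^T Y$ and $\hat\sigma^2$ is a function of the residuals, and the two are jointly normal with zero covariance.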
Consider the model $Y_{ij} = \mu_i + \gamma x_{ij} + \varepsilon_{ij}$, $i = 1, 2, 3$ and $j = 1, 2, 3$. Suppose that, for all $i$, $x_{i1} = -1$, $x_{i2} = 0$ and $x_{i3} = 1$, and that $\varepsilon_{ij}$, $i, j = 1, 2, 3$, are independent $N(0, \sigma^2)$ random variables, where $\sigma^2$ is unknown. Show how this model may be written as a linear model, and write down $Y$, $X$, $\beta$ and $\varepsilon$. Find the maximum likelihood estimators of $\mu_i$ ($i = 1, 2, 3$), $\gamma$ and $\sigma^2$ in terms of the $Y_{ij}$. Derive a $100(1-\alpha)\%$ confidence interval for $\sigma^2$ and for $\mu_2 - \mu_1$.
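A numerical sketch of this part (the data values are simulated and purely hypothetical): it builds the $9 \times 4$ design matrix, checks that the general-solution $\hat\beta = (X^TX)^{-1}X^TY$ agrees with the closed forms $\hat\mu_i = \bar Y_{i\cdot}$ and $\hat\gamma = \sum_i (Y_{i3} - Y_{i1})/6$ (which hold because the $x$ column is orthogonal to the group indicators), and computes the two confidence intervals using SciPy quantiles:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Simulated data Y[i, j], i, j = 0, 1, 2 (hypothetical parameter values).
mu = np.array([2.0, 3.0, 4.0])
gamma = 1.5
x = np.array([-1.0, 0.0, 1.0])       # x_{i1}, x_{i2}, x_{i3}, same for all i
Y = mu[:, None] + gamma * x[None, :] + rng.normal(scale=0.4, size=(3, 3))

# Linear-model form: y = X beta + eps, beta = (mu_1, mu_2, mu_3, gamma).
y = Y.ravel()                        # rows stacked: (Y_11, Y_12, ..., Y_33)
Xmat = np.zeros((9, 4))
for i in range(3):
    Xmat[3 * i:3 * i + 3, i] = 1.0   # indicator column for group i
Xmat[:, 3] = np.tile(x, 3)           # covariate column

beta_hat = np.linalg.solve(Xmat.T @ Xmat, Xmat.T @ y)

# Closed forms: each x row sums to 0, so X^T X = diag(3, 3, 3, 6), giving
# mu_i-hat = row mean of Y and gamma-hat = sum_i (Y_i3 - Y_i1) / 6.
mu_hat = Y.mean(axis=1)
gamma_hat = (Y[:, 2] - Y[:, 0]).sum() / 6.0

rss = np.sum((y - Xmat @ beta_hat) ** 2)
sigma2_mle = rss / 9                 # MLE divides by n = 9

# 100(1 - alpha)% interval for sigma^2: RSS / sigma^2 ~ chi^2 with
# n - p = 9 - 4 = 5 degrees of freedom.
alpha = 0.05
lo, hi = rss / stats.chi2.ppf([1 - alpha / 2, alpha / 2], df=5)

# Interval for mu_2 - mu_1: var(mu_2-hat - mu_1-hat) = 2 sigma^2 / 3,
# estimated by s^2 = RSS / 5, giving a t interval with 5 df.
s2 = rss / 5
half = stats.t.ppf(1 - alpha / 2, df=5) * np.sqrt(2 * s2 / 3)
diff = mu_hat[1] - mu_hat[0]
ci_diff = (diff - half, diff + half)
```

The key structural point is that the covariate column is orthogonal to the three group-indicator columns, so $X^T X$ is diagonal and the estimators decouple.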
[You may assume that, if $W = (W_1^T, W_2^T)^T$ is multivariate normal with $\operatorname{cov}(W_1, W_2) = 0$, then $W_1$ and $W_2$ are independent.]