Paper 4, Section II, H

Statistics
Part IB, 2015

Consider a linear model $\mathbf{Y}=X\boldsymbol{\beta}+\boldsymbol{\varepsilon}$ where $\mathbf{Y}$ is an $n\times 1$ vector of observations, $X$ is a known $n\times p$ matrix, $\boldsymbol{\beta}$ is a $p\times 1$ ($p<n$) vector of unknown parameters and $\boldsymbol{\varepsilon}$ is an $n\times 1$ vector of independent normally distributed random variables each with mean zero and unknown variance $\sigma^{2}$. Write down the log-likelihood and show that the maximum likelihood estimators $\hat{\boldsymbol{\beta}}$ and $\hat{\sigma}^{2}$ of $\boldsymbol{\beta}$ and $\sigma^{2}$ respectively satisfy

$$X^{T} X \hat{\boldsymbol{\beta}} = X^{T} \mathbf{Y}, \qquad \frac{1}{\hat{\sigma}^{4}}(\mathbf{Y}-X \hat{\boldsymbol{\beta}})^{T}(\mathbf{Y}-X \hat{\boldsymbol{\beta}}) = \frac{n}{\hat{\sigma}^{2}}$$

($T$ denotes the transpose). Assuming that $X^{T}X$ is invertible, find the solutions $\hat{\boldsymbol{\beta}}$ and $\hat{\sigma}^{2}$ of these equations and write down their distributions.
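As a quick numerical sketch (not part of the question, and assuming NumPy), the closed-form solution of the first likelihood equation, $\hat{\boldsymbol{\beta}}=(X^{T}X)^{-1}X^{T}\mathbf{Y}$, and the residual-based estimator $\hat{\sigma}^{2}=\frac{1}{n}(\mathbf{Y}-X\hat{\boldsymbol{\beta}})^{T}(\mathbf{Y}-X\hat{\boldsymbol{\beta}})$ can be checked against both equations on simulated data; the dimensions and true parameters below are arbitrary choices for illustration.

```python
import numpy as np

# Simulate an arbitrary linear model Y = X beta + eps (n = 20, p = 3 are illustrative).
rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.standard_normal((n, p))
beta = np.array([1.0, -2.0, 0.5])
Y = X @ beta + rng.standard_normal(n)

# Solve the normal equations X^T X beta_hat = X^T Y directly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
resid = Y - X @ beta_hat
sigma2_hat = resid @ resid / n  # MLE of sigma^2: divides by n, not n - p

# Both likelihood equations from the question hold numerically.
assert np.allclose(X.T @ X @ beta_hat, X.T @ Y)
assert np.isclose((resid @ resid) / sigma2_hat**2, n / sigma2_hat)
```

Note that the maximum likelihood estimator $\hat{\sigma}^{2}$ divides the residual sum of squares by $n$; the unbiased estimator would divide by $n-p$ instead.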

Prove that $\hat{\boldsymbol{\beta}}$ and $\hat{\sigma}^{2}$ are independent.

Consider the model $Y_{ij}=\mu_{i}+\gamma x_{ij}+\varepsilon_{ij}$, $i=1,2,3$ and $j=1,2,3$. Suppose that, for all $i$, $x_{i1}=-1$, $x_{i2}=0$ and $x_{i3}=1$, and that $\varepsilon_{ij}$, $i,j=1,2,3$, are independent $N(0,\sigma^{2})$ random variables where $\sigma^{2}$ is unknown. Show how this model may be written as a linear model and write down $\mathbf{Y}$, $X$, $\boldsymbol{\beta}$ and $\boldsymbol{\varepsilon}$. Find the maximum likelihood estimators of $\mu_{i}$ ($i=1,2,3$), $\gamma$ and $\sigma^{2}$ in terms of the $Y_{ij}$. Derive a $100(1-\alpha)\%$ confidence interval for $\sigma^{2}$ and for $\mu_{2}-\mu_{1}$.
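The linear-model form of this three-group model can be sketched numerically (again assuming NumPy, with simulated data; the true parameter values below are arbitrary). Stacking the $Y_{ij}$ as a $9\times 1$ vector and taking $\boldsymbol{\beta}=(\mu_{1},\mu_{2},\mu_{3},\gamma)^{T}$, the design matrix has three group-indicator columns and one covariate column. Since each group's $x$ values sum to zero, $X^{T}X=\operatorname{diag}(3,3,3,6)$ and the estimators decouple: $\hat{\mu}_{i}$ is the group mean and $\hat{\gamma}=\frac{1}{6}\sum_{i}(Y_{i3}-Y_{i1})$.

```python
import numpy as np

# Build X so that Y = X beta + eps reproduces Y_ij = mu_i + gamma * x_ij
# with x_{i1}, x_{i2}, x_{i3} = -1, 0, 1 for every group i.
x = np.array([-1.0, 0.0, 1.0])
X = np.zeros((9, 4))
for i in range(3):
    X[3 * i:3 * i + 3, i] = 1.0  # indicator column for mu_i
    X[3 * i:3 * i + 3, 3] = x    # shared covariate column for gamma

# Simulated data with arbitrary illustrative parameters (mu = 2, 3, 4; gamma = 0.7).
rng = np.random.default_rng(1)
Y = X @ np.array([2.0, 3.0, 4.0, 0.7]) + 0.1 * rng.standard_normal(9)
Yij = Y.reshape(3, 3)  # Yij[i, j] corresponds to Y_{i+1, j+1}

# Since the x column is orthogonal to each indicator column, X^T X = diag(3, 3, 3, 6)
# and the least-squares MLEs separate into group means and a simple contrast.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
mu_hat = Yij.mean(axis=1)
gamma_hat = (Yij[:, 2] - Yij[:, 0]).sum() / 6.0
assert np.allclose(beta_hat, np.r_[mu_hat, gamma_hat])

sigma2_hat = ((Y - X @ beta_hat) ** 2).sum() / 9  # MLE divides by n = 9
```

The block-diagonal structure of $X^{T}X$ is what makes the closed-form answers clean here; with $n=9$ observations and $p=4$ parameters, the residual sum of squares has $n-p=5$ degrees of freedom, which is the relevant count for the confidence intervals.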

[You may assume that, if $\mathbf{W}=\left(\mathbf{W}_{1}^{T}, \mathbf{W}_{2}^{T}\right)^{T}$ is multivariate normal with $\operatorname{cov}\left(\mathbf{W}_{1}, \mathbf{W}_{2}\right)=0$, then $\mathbf{W}_{1}$ and $\mathbf{W}_{2}$ are independent.]