Consider a linear model $Y = X\beta + \epsilon$, where $Y$ and $\epsilon$ are $(n \times 1)$ with $\epsilon \sim N_n(0, \sigma^2 I)$, $\beta$ is $(p \times 1)$, and $X$ is $(n \times p)$ of full rank $p < n$. Let $\gamma$ and $\delta$ be sub-vectors of $\beta$. What is meant by orthogonality between $\gamma$ and $\delta$?
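As a reminder, a minimal sketch of the usual definition, assuming the columns of $X$ are partitioned as $X = (X_\gamma \;\; X_\delta)$ to match the two sub-vectors (the partition notation $X_\gamma$, $X_\delta$ is introduced here only for illustration):
\[
\gamma \text{ and } \delta \text{ are orthogonal} \quad\Longleftrightarrow\quad X_\gamma^{T} X_\delta = 0 ,
\]
in which case $X^{T}X$ is block diagonal, $\operatorname{Cov}(\hat\gamma, \hat\delta) = 0$, and each sub-vector can be estimated by least squares as though the other block of terms were absent from the model.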
Now suppose
\[
Y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + \beta_3 P_3(x_i) + \epsilon_i \qquad (i = 1, \dots, n),
\]
where $\epsilon_1, \dots, \epsilon_n$ are independent $N(0, \sigma^2)$ random variables, $x_1, \dots, x_n$ are real-valued known explanatory variables, and $P_3(x)$ is a cubic polynomial chosen so that $\beta_3$ is orthogonal to $(\beta_0, \beta_1, \beta_2)^T$ and $\beta_1$ is orthogonal to $(\beta_0, \beta_2)^T$.
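In terms of the design, these two conditions amount to the following column orthogonalities (a sketch, writing the relevant columns as the vectors with entries $1$, $x_i$, $x_i^2$ and $P_3(x_i)$):
\[
\sum_{i=1}^n x_i = 0, \qquad \sum_{i=1}^n x_i^3 = 0
\]
for $\beta_1$ orthogonal to $(\beta_0, \beta_2)^T$, and
\[
\sum_{i=1}^n P_3(x_i) = 0, \qquad \sum_{i=1}^n x_i P_3(x_i) = 0, \qquad \sum_{i=1}^n x_i^2 P_3(x_i) = 0
\]
for $\beta_3$ orthogonal to $(\beta_0, \beta_1, \beta_2)^T$.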
Let $\beta = (\beta_0, \beta_2, \beta_1, \beta_3)^T$. Describe the matrix $X$ such that $Y = X\beta + \epsilon$. Show that $X^T X$ is block diagonal. Assuming further that this matrix is non-singular, show that the least-squares estimators of $\beta_1$ and $\beta_3$ are, respectively,
\[
\hat\beta_1 = \frac{\sum_{i=1}^n x_i Y_i}{\sum_{i=1}^n x_i^2}
\qquad\text{and}\qquad
\hat\beta_3 = \frac{\sum_{i=1}^n P_3(x_i) Y_i}{\sum_{i=1}^n P_3(x_i)^2} .
\]
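For orientation, a brief sketch of how the block structure leads to these expressions, under the column-orthogonality conditions noted above. With $\beta = (\beta_0, \beta_2, \beta_1, \beta_3)^T$, the $i$-th row of $X$ is $(1, \; x_i^2, \; x_i, \; P_3(x_i))$, so
\[
X^T X =
\begin{pmatrix}
n & \sum x_i^2 & 0 & 0 \\
\sum x_i^2 & \sum x_i^4 & 0 & 0 \\
0 & 0 & \sum x_i^2 & 0 \\
0 & 0 & 0 & \sum P_3(x_i)^2
\end{pmatrix},
\]
and the normal equations $X^T X \hat\beta = X^T Y$ decouple: the rows for $\beta_1$ and $\beta_3$ read $\left(\sum x_i^2\right) \hat\beta_1 = \sum x_i Y_i$ and $\left(\sum P_3(x_i)^2\right) \hat\beta_3 = \sum P_3(x_i) Y_i$, which give the displayed estimators.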