Paper 2, Section II, F

Probability
Part IA, 2019

Recall that a random variable $X$ in $\mathbb{R}^{2}$ is bivariate normal or Gaussian if $u^{T} X$ is normal for all $u \in \mathbb{R}^{2}$. Let $X=\left(\begin{array}{c}X_{1} \\ X_{2}\end{array}\right)$ be bivariate normal.

(a) (i) Show that if $A$ is a $2 \times 2$ real matrix then $A X$ is bivariate normal.

(ii) Let $\mu=\mathbb{E}(X)$ and $V=\operatorname{Var}(X)=\mathbb{E}\left[(X-\mu)(X-\mu)^{T}\right]$. Find the moment generating function $M_{X}(\lambda)=\mathbb{E}\left(e^{\lambda^{T} X}\right)$ of $X$ and deduce that the distribution of a bivariate normal random variable $X$ is uniquely determined by $\mu$ and $V$.
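[For reference only, and not part of the question: since $\lambda^{T} X$ is normal with mean $\lambda^{T} \mu$ and variance $\lambda^{T} V \lambda$, the univariate normal MGF gives the standard form
$$M_{X}(\lambda)=\mathbb{E}\left(e^{\lambda^{T} X}\right)=\exp \left(\lambda^{T} \mu+\tfrac{1}{2} \lambda^{T} V \lambda\right),$$
which may be used as a check on the answer to (ii).]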

(iii) Let $\mu_{i}=\mathbb{E}\left(X_{i}\right)$ and $\sigma_{i}^{2}=\operatorname{Var}\left(X_{i}\right)$ for $i=1,2$. Let $\rho=\frac{\operatorname{Cov}\left(X_{1}, X_{2}\right)}{\sigma_{1} \sigma_{2}}$ be the correlation of $X_{1}$ and $X_{2}$. Write down $V$ in terms of some or all of $\mu_{1}, \mu_{2}, \sigma_{1}, \sigma_{2}$ and $\rho$. If $\operatorname{Cov}\left(X_{1}, X_{2}\right)=0$, why must $X_{1}$ and $X_{2}$ be independent?
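[As a check, not part of the question: in this notation the variance matrix has the standard form
$$V=\begin{pmatrix}\sigma_{1}^{2} & \rho \sigma_{1} \sigma_{2} \\ \rho \sigma_{1} \sigma_{2} & \sigma_{2}^{2}\end{pmatrix},$$
with off-diagonal entries $\operatorname{Cov}\left(X_{1}, X_{2}\right)=\rho \sigma_{1} \sigma_{2}$.]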

For each $a \in \mathbb{R}$, find $\operatorname{Cov}\left(X_{1}, X_{2}-a X_{1}\right)$. Hence show that $X_{2}=a X_{1}+Y$ for some normal random variable $Y$ in $\mathbb{R}$ that is independent of $X_{1}$ and some $a \in \mathbb{R}$ that should be specified.
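[Sketch of the covariance computation, for orientation rather than as a model solution: by bilinearity,
$$\operatorname{Cov}\left(X_{1}, X_{2}-a X_{1}\right)=\operatorname{Cov}\left(X_{1}, X_{2}\right)-a \operatorname{Var}\left(X_{1}\right)=\rho \sigma_{1} \sigma_{2}-a \sigma_{1}^{2},$$
which vanishes for $a=\rho \sigma_{2} / \sigma_{1}$ (assuming $\sigma_{1}>0$).]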

(b) A certain species of East Anglian goblin has left arm of mean length $100 \mathrm{~cm}$ with standard deviation $1 \mathrm{~cm}$, and right arm of mean length $102 \mathrm{~cm}$ with standard deviation $2 \mathrm{~cm}$. The correlation of left- and right-arm-length of a goblin is $\frac{1}{2}$. You may assume that the distribution of left- and right-arm-lengths can be modelled by a bivariate normal distribution. What is the probability that a randomly selected goblin has a longer right arm than left arm?

[You may give your answer in terms of the distribution function $\Phi$ of a $N(0,1)$ random variable $Z$. That is, $\Phi(t)=\mathbb{P}(Z \leqslant t)$.]
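[A hedged numerical check of part (b), assuming the stated model with $L$ and $R$ denoting left- and right-arm length in cm: $R-L$ is normal (take $u^{T}=(-1,1)$ in the definition), with mean $102-100=2$ and variance $\operatorname{Var}(R-L)=\sigma_{R}^{2}+\sigma_{L}^{2}-2 \rho \sigma_{L} \sigma_{R}=4+1-2=3$, so one expects $\mathbb{P}(R>L)=\mathbb{P}(R-L>0)=\Phi\left(2 / \sqrt{3}\right) \approx 0.876$.]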