2.II.19H

Statistics
Part IB, 2008

Suppose that the joint distribution of random variables $X, Y$ taking values in $\mathbb{Z}^{+}=\{0,1,2,\ldots\}$ is given by the joint probability generating function

$$\varphi(s, t) \equiv E\left[s^{X} t^{Y}\right]=\frac{1-\alpha-\beta}{1-\alpha s-\beta t}$$

where the unknown parameters $\alpha$ and $\beta$ are positive, and satisfy the inequality $\alpha+\beta<1$. Find $E(X)$. Prove that the probability mass function of $(X, Y)$ is

$$f(x, y \mid \alpha, \beta)=(1-\alpha-\beta)\binom{x+y}{x}\alpha^{x}\beta^{y} \quad (x, y \in \mathbb{Z}^{+})$$

and prove that the maximum-likelihood estimators of $\alpha$ and $\beta$ based on a sample of size $n$ drawn from the distribution are

$$\hat{\alpha}=\frac{\bar{X}}{1+\bar{X}+\bar{Y}}, \quad \hat{\beta}=\frac{\bar{Y}}{1+\bar{X}+\bar{Y}},$$

where $\bar{X}$ (respectively, $\bar{Y}$) is the sample mean of $X_{1}, \ldots, X_{n}$ (respectively, $Y_{1}, \ldots, Y_{n}$).
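A minimal sketch of one possible route through these parts, assuming the pgf may be differentiated at $s=t=1$ and expanded as a geometric series (valid since $0 \le \alpha s + \beta t < 1$ on $[0,1]^{2}$):

% Mean: differentiate the pgf in s and set s = t = 1.
$$E(X) = \left.\frac{\partial \varphi}{\partial s}\right|_{s=t=1} = \left.\frac{\alpha(1-\alpha-\beta)}{(1-\alpha s-\beta t)^{2}}\right|_{s=t=1} = \frac{\alpha}{1-\alpha-\beta}.$$
% Pmf: expand the geometric series and read off the coefficient of s^x t^y.
$$\varphi(s, t) = (1-\alpha-\beta)\sum_{n \ge 0}(\alpha s+\beta t)^{n} = (1-\alpha-\beta)\sum_{x, y \ge 0}\binom{x+y}{x}\alpha^{x}\beta^{y}\, s^{x} t^{y}.$$
% MLEs: up to an additive constant, the log-likelihood of a sample of size n is
$$\ell(\alpha, \beta) = n\log(1-\alpha-\beta) + n\bar{x}\log\alpha + n\bar{y}\log\beta;$$
% setting \partial\ell/\partial\alpha = \partial\ell/\partial\beta = 0 gives
% \bar{x}(1-\alpha-\beta) = \alpha and \bar{y}(1-\alpha-\beta) = \beta; adding these
% yields 1-\alpha-\beta = 1/(1+\bar{x}+\bar{y}), hence the stated estimators.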

By considering $\hat{\alpha}+\hat{\beta}$ or otherwise, prove that the maximum-likelihood estimator is biased. Stating clearly any results to which you appeal, prove that $\hat{\alpha} \rightarrow \alpha$ as $n \rightarrow \infty$, making clear the sense in which this convergence happens.
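A minimal sketch of one standard argument, using Jensen's inequality for the bias and the strong law of large numbers together with continuity for the convergence:

% Bias: \hat{\alpha}+\hat{\beta} = g(\bar{X}+\bar{Y}) with g(u) = u/(1+u) strictly
% concave, so strict Jensen (the sample means are non-degenerate) gives
$$E\bigl[\hat{\alpha}+\hat{\beta}\bigr] < g\bigl(E[\bar{X}+\bar{Y}]\bigr) = g\!\left(\frac{\alpha+\beta}{1-\alpha-\beta}\right) = \alpha+\beta,$$
% so \hat{\alpha} and \hat{\beta} cannot both be unbiased.
% Convergence: by the strong law of large numbers, \bar{X} \to \alpha/(1-\alpha-\beta)
% and \bar{Y} \to \beta/(1-\alpha-\beta) almost surely; the map
% (x, y) \mapsto x/(1+x+y) is continuous, so
$$\hat{\alpha} = \frac{\bar{X}}{1+\bar{X}+\bar{Y}} \longrightarrow \frac{\alpha/(1-\alpha-\beta)}{1/(1-\alpha-\beta)} = \alpha \quad \text{almost surely as } n \to \infty.$$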