A1.12 B1.15

Principles of Statistics
Part II, 2002

(i) Explain in detail the minimax and Bayes principles of decision theory.

Show that if $d(X)$ is a Bayes decision rule for a prior density $\pi(\theta)$ and has constant risk function, then $d(X)$ is minimax.
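[A sketch of the standard argument, assuming all risks below are finite: writing $r(\pi, d') = \int R(\theta, d')\, \pi(\theta)\, d\theta$ for the Bayes risk of a rule $d'$, if $d$ is Bayes for $\pi$ with constant risk $c$ then, for any competing rule $d'$,

$$\sup_{\theta} R(\theta, d) = c = r(\pi, d) \leqslant r(\pi, d') \leqslant \sup_{\theta} R(\theta, d'),$$

so no rule has smaller maximum risk than $d$; that is, $d$ is minimax.]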

(ii) Let $X_1, \ldots, X_p$ be independent random variables, with $X_i \sim N(\mu_i, 1)$, $i = 1, \ldots, p$.

Consider estimating $\mu = (\mu_1, \ldots, \mu_p)^T$ by $d = (d_1, \ldots, d_p)^T$, with loss function

$$L(\mu, d) = \sum_{i=1}^{p} (\mu_i - d_i)^2 .$$

What is the risk function of $X = (X_1, \ldots, X_p)^T$?
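[As a check, a one-line computation under the squared-error loss above: since each $X_i$ is unbiased for $\mu_i$ with unit variance,

$$R(\mu, X) = E\, L(\mu, X) = \sum_{i=1}^{p} E(X_i - \mu_i)^2 = p,$$

a constant risk, free of $\mu$.]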

Consider the class of estimators of $\mu$ of the form

$$d^{a}(X) = \left(1 - \frac{a}{X^T X}\right) X,$$

indexed by $a \geqslant 0$. Find the risk function of $d^{a}(X)$ in terms of $E\left(1 / X^T X\right)$, which you should not attempt to evaluate, and deduce that $X$ is inadmissible. What is the optimal value of $a$?

[You may assume Stein's Lemma, that for suitably behaved real-valued functions $h$,

$$E\left\{(X_i - \mu_i)\, h(X)\right\} = E\left\{\frac{\partial h(X)}{\partial X_i}\right\}. \quad \bigg]$$
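[A sketch of the risk calculation, assuming $p \geqslant 3$ so that $E\left(1/X^T X\right)$ is finite: expanding the loss,

$$R(\mu, d^{a}) = E\|X - \mu\|^2 - 2a\, E\left\{\frac{(X - \mu)^T X}{X^T X}\right\} + a^2\, E\left(\frac{1}{X^T X}\right),$$

and applying Stein's Lemma with $h(X) = X_i / X^T X$ for each $i$, for which $\sum_{i=1}^{p} \partial h / \partial X_i = (p-2)/X^T X$, gives

$$R(\mu, d^{a}) = p - \left\{2a(p-2) - a^2\right\} E\left(\frac{1}{X^T X}\right).$$

The quadratic $2a(p-2) - a^2$ is positive for $0 < a < 2(p-2)$ and maximised at $a = p-2$, so $d^{p-2}(X)$ has strictly smaller risk than $X$ at every $\mu$: hence $X$ is inadmissible, and the optimal value is $a = p - 2$.]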