(i) Explain in detail the minimax and Bayes principles of decision theory.
Show that if $d(X)$ is a Bayes decision rule for a prior density $\pi(\theta)$ and has constant risk function, then $d(X)$ is minimax.
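[A minimal sketch of the standard argument, not part of the original question: suppose $d$ is Bayes with respect to $\pi$ and has constant risk $R(\theta, d) \equiv c$. For any other rule $d'$,
\[ \sup_{\theta} R(\theta, d') \;\geq\; \int R(\theta, d')\, \pi(\theta)\, d\theta \;\geq\; \int R(\theta, d)\, \pi(\theta)\, d\theta \;=\; c \;=\; \sup_{\theta} R(\theta, d), \]
the second inequality holding because $d$ minimizes the Bayes risk under $\pi$; hence $d$ is minimax.]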
(ii) Let $X_1, \ldots, X_p$ be independent random variables, with $X_i \sim N(\mu_i, 1)$, $i = 1, \ldots, p$.
Consider estimating $\mu = (\mu_1, \ldots, \mu_p)^T$ by $d = (d_1, \ldots, d_p)^T$, with loss function
\[ L(\mu, d) = \sum_{i=1}^{p} (\mu_i - d_i)^2. \]
What is the risk function of $X = (X_1, \ldots, X_p)^T$?
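[For reference, a short calculation under the loss above: since the $X_i$ are independent with $X_i \sim N(\mu_i, 1)$,
\[ R(\mu, X) = \sum_{i=1}^{p} E(X_i - \mu_i)^2 = \sum_{i=1}^{p} \operatorname{Var}(X_i) = p, \]
so $X$ has constant risk $p$.]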
Consider the class of estimators of $\mu$ of the form
\[ d_a(X) = \left(1 - \frac{a}{X^T X}\right) X, \]
indexed by $a \geq 0$. Find the risk function of $d_a(X)$ in terms of $E(1/X^T X)$, which you should not attempt to evaluate, and deduce that $X$ is inadmissible. What is the optimal value of $a$?
[You may assume Stein's Lemma, that for suitably behaved real-valued functions $h$,
\[ E\{(X_i - \mu_i)\, h(X)\} = E\left\{\frac{\partial h(X)}{\partial X_i}\right\}. \] ]
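[A sketch of the intended calculation, using Stein's Lemma as stated: writing $d_a(X) - \mu = (X - \mu) - a X / (X^T X)$ and expanding,
\[ R(\mu, d_a) = E\|X - \mu\|^2 \;-\; 2a \sum_{i=1}^{p} E\left\{ (X_i - \mu_i)\, \frac{X_i}{X^T X} \right\} \;+\; a^2\, E\!\left( \frac{1}{X^T X} \right). \]
Taking $h(X) = X_i / (X^T X)$ in the lemma gives $\partial h / \partial X_i = 1/(X^T X) - 2 X_i^2/(X^T X)^2$, and summing over $i$ the middle term becomes $2a(p-2)\, E(1/X^T X)$, so
\[ R(\mu, d_a) = p - \left\{ 2a(p-2) - a^2 \right\} E\!\left( \frac{1}{X^T X} \right). \]
This is minimized at $a = p - 2$, where (for $p \geq 3$, so that the expectation is finite and positive) the risk is strictly less than $p$ for every $\mu$; hence $X$ is inadmissible.]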