Paper 4, Section II, I

Principles of Statistics
Part II, 2009

Consider the double dichotomy, where the loss is 0 for a correct decision and 1 for an incorrect decision. Describe the form of a Bayes decision rule. Assuming the equivalence of normal and extensive form analyses, deduce the Neyman-Pearson lemma.
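
(For orientation, a minimal sketch of the standard form, with $\pi_{0}, \pi_{1}$ denoting the prior masses on the two hypotheses and $f(\cdot\,; \theta)$ the density; these symbols are our assumptions, not part of the question. Under $0$-$1$ loss the Bayes rule decides for $\theta_{1}$ exactly when the posterior probability of $\theta_{1}$ exceeds $\tfrac{1}{2}$, that is, when

$$
\pi_{1} f(x ; \theta_{1})>\pi_{0} f(x ; \theta_{0}) \quad \Longleftrightarrow \quad \frac{f(x ; \theta_{1})}{f(x ; \theta_{0})}>\frac{\pi_{0}}{\pi_{1}},
$$

a likelihood-ratio cutoff, which is the shape the Neyman-Pearson lemma asserts for a most powerful test.)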

For a problem with random variable $X$ and real parameter $\theta$, define monotone likelihood ratio (MLR) and monotone test.
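
(As a reminder of the standard definitions, in our notation, with $f(\cdot\,; \theta)$ the density and $p$ a randomisation probability: the family has MLR in $T=t(X)$ if, for all $\theta_{1}>\theta_{0}$,

$$
\frac{f(x ; \theta_{1})}{f(x ; \theta_{0})} \text { is a non-decreasing function of } t(x),
$$

and a monotone test has the form

$$
\phi(x)= \begin{cases}1, & t(x)>c, \\ p, & t(x)=c, \\ 0, & t(x)<c,\end{cases}
$$

for some cutoff $c$ and $p \in[0,1]$.)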

Suppose the problem has MLR in a real statistic $T=t(X)$. Let $\phi$ be a monotone test, with power function $\gamma(\cdot)$, and let $\phi^{\prime}$ be any other test, with power function $\gamma^{\prime}(\cdot)$. Show that if $\theta_{1}>\theta_{0}$ and $\gamma(\theta_{0})>\gamma^{\prime}(\theta_{0})$, then $\gamma(\theta_{1})>\gamma^{\prime}(\theta_{1})$. Deduce that there exists $\theta^{*} \in[-\infty, \infty]$ such that $\gamma(\theta) \leqslant \gamma^{\prime}(\theta)$ for $\theta<\theta^{*}$, and $\gamma(\theta) \geqslant \gamma^{\prime}(\theta)$ for $\theta>\theta^{*}$.
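
(The single crossing of the two power functions can be seen numerically. Below is a minimal sketch, assuming the $N(\theta, 1)$ location model, which has MLR in $t(x)=x$, and comparing the monotone one-sided test with an equal-size two-sided test; the model and all names are our choices, not the question's.)

```python
# Illustrative check: in the N(theta, 1) model, which has MLR in t(x) = x,
# compare the power of the monotone one-sided test with that of an
# equal-size two-sided test and locate the single crossing point theta*.
import numpy as np
from scipy.stats import norm

alpha = 0.05
c1 = norm.ppf(1 - alpha)       # monotone test: reject when X > c1
c2 = norm.ppf(1 - alpha / 2)   # two-sided test: reject when |X| > c2

theta = np.linspace(-3, 3, 13)
power_mono = norm.sf(c1 - theta)                         # gamma(theta)
power_two = norm.sf(c2 - theta) + norm.cdf(-c2 - theta)  # gamma'(theta)

for th, g, gp in zip(theta, power_mono, power_two):
    print(f"theta = {th:+.1f}   gamma = {g:.3f}   gamma' = {gp:.3f}")
# gamma - gamma' changes sign exactly once (here at theta* = 0):
# below theta* the monotone test is no more powerful, above it no less.
```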

For an arbitrary prior distribution $\Pi$ with density $\pi(\cdot)$, and an arbitrary value $\theta^{*}$, show that the posterior odds

$$
\frac{\Pi\left(\theta>\theta^{*} \mid X=x\right)}{\Pi\left(\theta \leqslant \theta^{*} \mid X=x\right)}
$$

is a non-decreasing function of $t(x)$.
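
(A sketch of the first step, not the required proof; the density notation $f(x ; \theta)$ is our assumption. By Bayes' theorem the posterior odds equal

$$
\frac{\Pi\left(\theta>\theta^{*} \mid X=x\right)}{\Pi\left(\theta \leqslant \theta^{*} \mid X=x\right)}=\frac{\int_{\theta^{*}}^{\infty} f(x ; \theta)\, \pi(\theta)\, d\theta}{\int_{-\infty}^{\theta^{*}} f(x ; \theta)\, \pi(\theta)\, d\theta},
$$

and the MLR property, which makes $f(x ; \theta_{1}) / f(x ; \theta_{0})$ non-decreasing in $t(x)$ for every pair $\theta_{1}>\theta_{0}$, is what forces this ratio of integrals to be non-decreasing in $t(x)$.)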