Suppose that $X_1, \dots, X_n$ are independent, identically distributed random variables with
$$P(X_i = x) = \binom{k}{x}\,\theta^x (1-\theta)^{k-x}, \qquad x = 0, \dots, k,$$
where $k$ is known and $\theta$ $(0 < \theta < 1)$ is an unknown parameter. Find the maximum likelihood estimator $\hat{\theta}$ of $\theta$.
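Not part of the question, but useful for checking one's answer: a sketch of the usual log-likelihood argument (writing $s = \sum_i x_i$ for the sufficient statistic).

```latex
% Log-likelihood, dropping the binomial coefficients (free of theta):
\ell(\theta) = s \log\theta + (nk - s)\log(1-\theta) + \text{const}

% Setting the score to zero:
\ell'(\theta) = \frac{s}{\theta} - \frac{nk - s}{1-\theta} = 0
\quad\Longrightarrow\quad
\hat{\theta} = \frac{s}{nk} = \frac{\bar{x}}{k}
```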
Statistician 1 has prior density for $\theta$ given by $\pi_1(\theta) = \alpha\theta^{\alpha-1}$, $0 < \theta < 1$, where $\alpha > 1$. Find the posterior distribution for $\theta$ after observing data $X_1 = x_1, \dots, X_n = x_n$. Write down the posterior mean $\hat{\theta}_1^{(B)}$, and show that
$$\hat{\theta}_1^{(B)} = c\,\hat{\theta} + (1-c)\,\tilde{\theta}_1,$$
where $\tilde{\theta}_1$ depends only on the prior distribution and $c$ is a constant in $(0,1)$ that is to be specified.
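A numerical sanity check of the claimed decomposition, assuming the standard conjugate update: $\pi_1$ is the Beta$(\alpha,1)$ density, so the posterior is Beta$(\alpha + \sum x_i,\; nk - \sum x_i + 1)$, giving $c = nk/(nk+\alpha+1)$ and $\tilde{\theta}_1 = \alpha/(\alpha+1)$ (the prior mean). The values of $n$, $k$, $\alpha$ and the data below are made up for illustration.

```python
from fractions import Fraction

# Hypothetical values: 5 observations, each Binomial(k = 10, theta).
n, k, alpha = 5, 10, 3
xs = [4, 7, 2, 6, 5]            # made-up observed counts, each in 0..k
s = sum(xs)                     # sufficient statistic sum(x_i)

theta_hat = Fraction(s, n * k)                       # MLE: x_bar / k
posterior_mean = Fraction(alpha + s, n * k + alpha + 1)

c = Fraction(n * k, n * k + alpha + 1)               # weight on the MLE
theta_tilde_1 = Fraction(alpha, alpha + 1)           # prior mean of Beta(alpha, 1)

# Posterior mean is a convex combination of the MLE and the prior mean.
assert posterior_mean == c * theta_hat + (1 - c) * theta_tilde_1
print(float(posterior_mean))    # → 0.5
```

Exact rational arithmetic via `fractions.Fraction` makes the identity an exact equality rather than a floating-point approximation.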
Statistician 2 has prior density for $\theta$ given by $\pi_2(\theta) = \alpha(1-\theta)^{\alpha-1}$, $0 < \theta < 1$. Briefly describe the prior beliefs that the two statisticians hold about $\theta$. Find the posterior mean $\hat{\theta}_2^{(B)}$ and show that $\hat{\theta}_2^{(B)} < \hat{\theta}_1^{(B)}$.
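For reference, a sketch of one route to the comparison, again via the conjugate Beta update with $s = \sum_i x_i$:

```latex
% pi_2 is the Beta(1, alpha) density, so
\pi_2(\theta \mid x) \propto \theta^{s}\,(1-\theta)^{\alpha + nk - s - 1},
% i.e. Beta(1 + s, alpha + nk - s), with mean
\hat{\theta}_2^{(B)} = \frac{1 + s}{nk + \alpha + 1}.
% The same update applied to pi_1 = Beta(alpha, 1) gives
\hat{\theta}_1^{(B)} = \frac{\alpha + s}{nk + \alpha + 1},
% and since alpha > 1, the numerators satisfy 1 + s < alpha + s, hence
% \hat{\theta}_2^{(B)} < \hat{\theta}_1^{(B)}.
```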
Suppose that $\alpha$ increases (but $n$, $k$ and the $x_i$ remain unchanged). How do the prior beliefs of the two statisticians change? How does $c$ vary? Explain briefly what happens to $\hat{\theta}_1^{(B)}$ and $\hat{\theta}_2^{(B)}$.
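A small illustration of the limiting behaviour, with hypothetical numbers: assuming the conjugate posterior means $\hat{\theta}_1^{(B)} = (\alpha+s)/(nk+\alpha+1)$ and $\hat{\theta}_2^{(B)} = (1+s)/(nk+\alpha+1)$, increasing $\alpha$ shrinks $c = nk/(nk+\alpha+1)$, so the data carry less weight and each posterior mean is pulled towards its own prior mean.

```python
from fractions import Fraction

# Made-up fixed data: n = 5 observations, k = 10, sum of the x_i = 24.
n, k, s = 5, 10, 24

for alpha in (2, 10, 100, 1000):
    c = Fraction(n * k, n * k + alpha + 1)            # weight on the MLE
    theta1 = Fraction(alpha + s, n * k + alpha + 1)   # Statistician 1's posterior mean
    theta2 = Fraction(1 + s, n * k + alpha + 1)       # Statistician 2's posterior mean
    print(alpha, float(c), float(theta1), float(theta2))
```

As $\alpha$ grows, the printout shows $c \to 0$, $\hat{\theta}_1^{(B)} \to 1$ and $\hat{\theta}_2^{(B)} \to 0$: each statistician's posterior mean approaches the point their increasingly dogmatic prior concentrates on.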
[Hint: The Beta$(\alpha, \beta)$ ($\alpha > 0$, $\beta > 0$) distribution has density
$$f(x) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\, x^{\alpha-1}(1-x)^{\beta-1}, \qquad 0 < x < 1,$$
with expectation $\dfrac{\alpha}{\alpha+\beta}$ and variance $\dfrac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}$. Here, $\Gamma(\alpha) = \int_0^\infty x^{\alpha-1} e^{-x}\,dx$, $\alpha > 0$, is the Gamma function.]
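The quoted Beta facts can be checked numerically with the normalising constant $\Gamma(\alpha+\beta)/(\Gamma(\alpha)\Gamma(\beta))$ and a simple midpoint rule on $(0,1)$; the parameter values below are arbitrary.

```python
import math

def beta_pdf(x, a, b):
    """Beta(a, b) density at x in (0, 1)."""
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * x ** (a - 1) * (1 - x) ** (b - 1)

a, b, m = 3.0, 5.0, 100_000
grid = [(j + 0.5) / m for j in range(m)]      # midpoints of a uniform grid

total = sum(beta_pdf(x, a, b) for x in grid) / m
mean = sum(x * beta_pdf(x, a, b) for x in grid) / m
var = sum(x * x * beta_pdf(x, a, b) for x in grid) / m - mean ** 2

assert abs(total - 1) < 1e-6                               # integrates to 1
assert abs(mean - a / (a + b)) < 1e-6                      # alpha/(alpha+beta)
assert abs(var - a * b / ((a + b) ** 2 * (a + b + 1))) < 1e-6
```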