Paper 2, Section II, I
Suppose that the random vector $X = (X_1, \ldots, X_n)$ has a distribution over $\mathbb{R}^n$ depending on a real parameter $\theta$, with everywhere positive density function $f(x \mid \theta)$. Define the maximum likelihood estimator $\hat\theta$, the score variable $U(\theta)$, the observed information $j(\theta)$ and the expected (Fisher) information $I(\theta)$ for the problem of estimating $\theta$ from $X$.
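For reference, a sketch of the standard definitions; the precise notation is an editorial assumption, and equivalent formulations are acceptable:

```latex
% Standard definitions (sketch; the notation \hat\theta, U, j, I is assumed):
\begin{aligned}
\hat\theta &= \hat\theta(X) \in \arg\max_{\theta}\, f(X \mid \theta), &
U(\theta) &= \frac{\partial}{\partial\theta} \log f(X \mid \theta), \\
j(\theta) &= -\frac{\partial^{2}}{\partial\theta^{2}} \log f(X \mid \theta), &
I(\theta) &= \mathbb{E}_{\theta}\, j(\theta).
\end{aligned}
```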
For the case where the $X_i$ are independent and identically distributed, show that, as $n \to \infty$, $j(\theta)/I(\theta) \to 1$ in probability. [You may assume sufficient conditions to allow interchange of integration over the sample space and differentiation with respect to the parameter.] State the asymptotic distribution of $\hat\theta$.
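A sketch of the expected argument, under the stated regularity assumptions (writing $f_1$ for the common density of a single observation, an editorial notation):

```latex
% Log-likelihood is a sum of i.i.d. terms, so by the strong law of large numbers:
j(\theta) = \sum_{i=1}^{n}\Bigl(-\frac{\partial^{2}}{\partial\theta^{2}}\log f_{1}(X_i \mid \theta)\Bigr),
\qquad
\frac{j(\theta)}{n} \xrightarrow{\ \text{a.s.}\ }
i_{1}(\theta) := \mathbb{E}_{\theta}\Bigl[-\frac{\partial^{2}}{\partial\theta^{2}}\log f_{1}(X_1 \mid \theta)\Bigr].
```

Interchanging integration and differentiation gives $I(\theta) = n\, i_1(\theta)$, whence $j(\theta)/I(\theta) \to 1$; the corresponding asymptotic statement for the estimator is $\sqrt{n}\,(\hat\theta - \theta) \xrightarrow{d} N\bigl(0,\, i_1(\theta)^{-1}\bigr)$.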
The random vector $X = (X_1, \ldots, X_n)$ is generated according to the rule
$$X_{i+1} = \theta X_i + \varepsilon_{i+1}, \qquad i = 0, 1, \ldots, n-1,$$
where $X_0 = 0$ and the $\varepsilon_i$ are independent and identically distributed from the standard normal distribution $N(0,1)$. Write down the likelihood function for $\theta$ based on data $x = (x_1, \ldots, x_n)$, find the maximum likelihood estimator $\hat\theta$ and the observed information $j(\theta)$, and show that the pair $\bigl(\hat\theta, j(\theta)\bigr)$ forms a minimal sufficient statistic.
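As a numerical sanity check (not part of the question), one can simulate this autoregressive model and verify that the maximum likelihood estimator, which works out to $\hat\theta = \sum_{i=1}^{n} x_{i-1}x_i \big/ \sum_{i=1}^{n} x_{i-1}^2$, recovers the true parameter. All function names below are illustrative:

```python
import random

def simulate_ar1(theta, n, seed=0):
    # Generate X_1, ..., X_n from X_{i+1} = theta * X_i + eps_{i+1},
    # with X_0 = 0 and eps_i i.i.d. standard normal.
    rng = random.Random(seed)
    x = [0.0]  # X_0 = 0
    for _ in range(n):
        x.append(theta * x[-1] + rng.gauss(0.0, 1.0))
    return x  # length n + 1, including X_0

def mle_and_observed_info(x):
    # MLE: theta_hat = sum_{i=1}^n x_{i-1} x_i / sum_{i=1}^n x_{i-1}^2
    # Observed information: j = sum_{i=1}^n x_{i-1}^2 (free of theta here)
    num = sum(x[i - 1] * x[i] for i in range(1, len(x)))
    den = sum(x[i - 1] ** 2 for i in range(1, len(x)))
    return num / den, den

x = simulate_ar1(theta=0.5, n=5000)
theta_hat, j = mle_and_observed_info(x)
print(theta_hat, j)  # theta_hat should be close to 0.5
```

The posterior standard deviation in the Bayesian part is $j^{-1/2}$, so with $n = 5000$ the estimate should land within a few hundredths of the true value.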
A Bayesian uses the improper prior density $\pi(\theta) \propto 1$. Show that, in the posterior distribution given $x$, $s(\theta - \hat\theta)$ (where $s$ is a statistic that you should identify) has the same distribution as $\varepsilon_1$.
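A sketch of the completing-the-square step, writing $j = \sum_{i=1}^{n} x_{i-1}^2$ for the observed information of the model above (an editorial assumption consistent with the likelihood):

```latex
% Posterior under the flat prior, completing the square in \theta:
\pi(\theta \mid x) \;\propto\; \exp\Bigl(-\tfrac12 \sum_{i=1}^{n} (x_i - \theta x_{i-1})^{2}\Bigr)
\;\propto\; \exp\Bigl(-\tfrac{j}{2}\,(\theta - \hat\theta)^{2}\Bigr),
```

so the posterior is $N(\hat\theta,\, j^{-1})$, and taking $s = j^{1/2}$ makes $s(\theta - \hat\theta) \sim N(0,1)$, the distribution of $\varepsilon_1$.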