1.II.19C
Part IB, 2006
Explain what is meant by a stopping time of a Markov chain $(X_n)_{n \ge 0}$. State the strong Markov property.
Show that, for any state $i$, the probability, starting from $i$, that $(X_n)_{n \ge 0}$ makes infinitely many visits to $i$ can take only the values 0 or 1.
Show moreover that, if
$$\sum_{n=0}^{\infty} p_{ii}(n) = \infty,$$
then $(X_n)_{n \ge 0}$ makes infinitely many visits to $i$ with probability 1.
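A sketch of the standard argument (an outline, not an official solution; here $T_i$ and $f_i$ are notation introduced for the sketch):

```latex
% Let T_i := \inf\{n \ge 1 : X_n = i\} be the first return time to i,
% and f_i := \Pr_i(T_i < \infty) the return probability.
% Applying the strong Markov property at successive return times,
\[
  \Pr_i(\text{at least } k \text{ returns to } i) = f_i^{\,k},
\]
% so, letting k \to \infty, the probability of infinitely many visits is
% \lim_{k \to \infty} f_i^{\,k}, which equals 1 if f_i = 1 and 0 if f_i < 1.
% For the second part, the expected total number of visits to i satisfies
\[
  \mathbb{E}_i \#\{n \ge 0 : X_n = i\}
  = \sum_{n=0}^{\infty} p_{ii}(n)
  = \sum_{k=0}^{\infty} f_i^{\,k},
\]
% and the right-hand side is finite (equal to 1/(1-f_i)) whenever f_i < 1.
% Hence divergence of \sum_n p_{ii}(n) forces f_i = 1, and so infinitely
% many visits occur with probability 1.
```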