Observed frequencies of transitions in a Markov Chain
I came across the following remark in P. Billingsley, "Statistical Inference for Markov Processes", and I wonder whether someone can point me to further information:
"It is possible to show that if (the transition probability) p_ij > 0, then P (n_ij = 0) goes to 0 exponentially as n --> infinity."
Remark: We assume that we have observed n transitions of the Markov chain, and n_ij denotes the number of times we have observed the transition i --> j.
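For context, here is my own heuristic for why exponential decay is plausible (this is my attempt, not an argument from Billingsley, and it assumes the chain is finite and irreducible). Write N_i(n) for the number of visits to state i among the first n steps. Conditioning on those visits, each exit from i avoids j with probability 1 - p_ij, so for any c > 0,

P(n_ij = 0) <= P(N_i(n) < c*n) + (1 - p_ij)^(c*n).

The second term is exponentially small since p_ij > 0, and for an irreducible chain N_i(n)/n concentrates around the stationary probability pi_i, so choosing c < pi_i should make the first term exponentially small as well (by a large-deviation bound for occupation times). I have not checked the details, which is why I am looking for a reference.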
I tried to find more information about this online, but without success. Has anyone seen this or a related statement, either online or in a book? Any help would be much appreciated.
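Edit: the claim is at least easy to check empirically. Here is a quick Monte Carlo sketch (the 3-state transition matrix and the choice i = 0, j = 1 are made up purely for illustration): it estimates P(n_01 = 0) for growing n, and the estimate drops roughly geometrically, consistent with exponential decay.

```python
import random

def step(P, state):
    """Sample the next state from row `state` of transition matrix P."""
    r = random.random()
    acc = 0.0
    for k, p in enumerate(P[state]):
        acc += p
        if r < acc:
            return k
    return len(P) - 1  # guard against floating-point rounding

def count_transitions(P, start, n, i, j):
    """Run n transitions from `start` and count how often i -> j occurs."""
    state, n_ij = start, 0
    for _ in range(n):
        nxt = step(P, state)
        if state == i and nxt == j:
            n_ij += 1
        state = nxt
    return n_ij

# A made-up 3-state irreducible chain; we watch the transition 0 -> 1,
# which has probability p_01 = 0.25 > 0.
P = [[0.50, 0.25, 0.25],
     [0.30, 0.40, 0.30],
     [0.20, 0.30, 0.50]]

trials = 20000
for n in (5, 10, 20, 40, 80):
    misses = sum(count_transitions(P, 0, n, 0, 1) == 0 for _ in range(trials))
    print(f"n = {n:3d}   estimated P(n_01 = 0) = {misses / trials:.4f}")
```

The printed estimates shrink by a roughly constant factor each time n doubles, which is what exponential decay in n would predict; of course this is only a sanity check, not a proof.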