Markov Chains - Recurrence
Hi, I was reading about Markov chains on Wikipedia and I have a question about this topic: Markov chain - Wikipedia, the free encyclopedia
More specifically, I wanted examples of transient and null recurrent states.
Suppose I have a chain with 3 states that all communicate with each other. Starting at state 1, I could follow a path like 1->2->3->2->3->2->3->2->3... forever and never come back. Does this mean P(T1 = infinity) > 0? I don't think so, since any single infinite path like that is a product of infinitely many probabilities less than 1 and so has probability 0, am I correct?
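To sanity-check this numerically I tried a quick simulation. The transition probabilities below are my own assumption (uniform over the other two states), since the only thing that matters is that all three states communicate:

```python
import random

# 3-state chain in which every state communicates with the others.
# The transition probabilities are assumed (uniform over the other
# two states); any strictly positive choice keeps the states communicating.
NEXT = {1: [2, 3], 2: [1, 3], 3: [1, 2]}

def returns_to_1(rng, max_steps=10_000):
    """Start at state 1 and report whether the chain revisits 1."""
    state = 1
    for _ in range(max_steps):
        state = rng.choice(NEXT[state])
        if state == 1:
            return True
    return False  # no return observed within the step cap

rng = random.Random(0)
trials = 10_000
hits = sum(returns_to_1(rng) for _ in range(trials))
print(hits / trials)  # every run returns, consistent with P(T1 = infinity) = 0
```

Every simulated run comes back to state 1, which matches the intuition that paths like 1->2->3->2->3->... individually have probability 0.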
But the same chain with a 4th state that is accessible only from state 1 and that transitions only to itself would make state 1 a transient state, right?
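I tried simulating this variant too. The specific probabilities are again my own assumption (from state 1, a 50/50 split between the communicating block and the absorbing state 4):

```python
import random

def first_return_happens(rng, max_steps=10_000):
    """Assumed chain: from 1 go to 2 or to the absorbing state 4 (50/50);
    states 2 and 3 move uniformly among {1, 2, 3}; 4 only loops on itself."""
    state = 1
    for _ in range(max_steps):
        if state == 1:
            state = rng.choice([2, 4])
        elif state == 4:
            return False  # absorbed: the chain can never revisit state 1
        else:  # states 2 and 3
            state = rng.choice([s for s in (1, 2, 3) if s != state])
        if state == 1:
            return True
    return False

rng = random.Random(1)
trials = 20_000
frac = sum(first_return_happens(rng) for _ in range(trials)) / trials
print(frac)  # clearly below 1, so P(T1 = infinity) > 0: state 1 is transient
```

With this split, roughly half the runs get absorbed at state 4 before ever returning to 1, so the empirical return probability sits well below 1.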
What about null recurrent states?
I can imagine a very simple case with 2 states, where state 1 goes to state 2 with probability 1, and state 2 stays at state 2 with probability P and goes to state 1 with probability 1-P, with 0 < P < 1. It obviously has P(T1 = infinity) = 0, but E[T1] = Sum_{n=2}^{infinity} n*(1-P)*P^(n-2), which seems to converge for any P < 1...
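To check the convergence claim I summed the series numerically and compared it against a simulated mean return time (this is just my own sanity check, with the chain set up exactly as above):

```python
import random

def series_ET1(P, terms=10_000):
    """Partial sum of Sum_{n=2}^{infinity} n*(1-P)*P^(n-2)."""
    return sum(n * (1 - P) * P ** (n - 2) for n in range(2, terms + 2))

def simulated_ET1(P, rng, trials=100_000):
    """Average return time to state 1: one forced step 1 -> 2, then a
    geometric number of steps at state 2 before jumping back to 1."""
    total = 0
    for _ in range(trials):
        t = 1                      # the forced step 1 -> 2
        while rng.random() < P:    # stay at state 2 with probability P
            t += 1
        total += t + 1             # the final jump 2 -> 1
    return total / trials

for P in (0.3, 0.5, 0.9):
    print(P, series_ET1(P), simulated_ET1(P, random.Random(0)))
# the two columns agree and stay finite for every P < 1, so E[T1] < infinity:
# this chain is positive recurrent, not null recurrent
```

So this 2-state chain can't be the example I'm looking for: its expected return time is finite for every P < 1.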
So, could anyone give me an example of a chain with null recurrent states? Thanks