1. ## Markov chain

Is it possible to have a Markov chain with an infinite number of transient states and an infinite number of positive recurrent states?

2. Hello,

Let $\mathbb{Z}$ (the integers) be the state space.
Perhaps you can define the transition probabilities $p_{2n,2n\pm 2}=\frac 12$ and $p_{2n+1,2n}=1$?
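A quick sanity check of this construction (the simulation below is my own sketch, not from the post): every odd state $2n+1$ drops to $2n$, and from then on the walk lives on the even integers forever, which is why the odd states are all transient.

```python
import random

def step(state):
    # Transition rules from the post above:
    #   p_{2n+1, 2n} = 1       (odd states drop to the even state below)
    #   p_{2n, 2n±2} = 1/2     (even states do a symmetric ±2 walk)
    if state % 2:
        return state - 1
    return state + random.choice((-2, 2))

random.seed(0)
state = 7                      # start in an odd state
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)

print(path)
# After the first step the walk is even and stays even forever:
# it can never return to 7, so every odd state is transient.
assert all(s % 2 == 0 for s in path[1:])
```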

3. Originally Posted by Moo
Hello,

Let $\mathbb{Z}$ (the integers) be the state space.
Perhaps you can define the transition probabilities $p_{2n,2n\pm 2}=\frac 12$ and $p_{2n+1,2n}=1$?

Thanks for the help.
I can see where the infinite transient states come from, but how do you know it gives an infinite number of positive recurrent states? I don't really understand the definition I have of positive recurrence...

a recurrent state i is said to be positive recurrent if the expected time of first return (starting from i) is finite

4. Originally Posted by NYC
I don't really understand the definition I have of positive recurrence...
(1) If i is transient, this must hold for some j:
P(i --> j) > 0 (it's possible to get to j)
P(j --> i) = 0 (it's impossible to get back)

(2) i is positive recurrent if for all j:
P(i --> j) > 0 (it's possible to get to j)
P(j --> i) > 0 (it's ALWAYS possible to get back)

(3) If it's always possible to get back, then E(time of first return to i) $< \infty$.

Don't get caught up with the expectation, just think of it in terms of (2).
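For a *finite* chain, (1) and (2) really do reduce to a reachability test: i is recurrent exactly when every state it can reach leads back. A small sketch of that test (the 3-state chain is my own toy example, not from the thread):

```python
# Toy 3-state chain: state 0 can leak into the closed pair {1, 2},
# so 0 is transient while 1 and 2 are recurrent.
P = [
    [0.5, 0.5, 0.0],   # from 0: stay at 0, or leak to 1
    [0.0, 0.0, 1.0],   # from 1: go to 2
    [0.0, 1.0, 0.0],   # from 2: go back to 1
]

def reaches(i, j):
    """True if j is reachable from i along positive-probability edges."""
    seen, stack = {i}, [i]
    while stack:
        k = stack.pop()
        if k == j:
            return True
        for m, p in enumerate(P[k]):
            if p > 0 and m not in seen:
                seen.add(m)
                stack.append(m)
    return False

# Criterion (1): i is transient iff some reachable j never leads back.
transient = [i for i in range(3)
             if any(reaches(i, j) and not reaches(j, i) for j in range(3))]
print(transient)   # -> [0]
```

Note this shortcut only decides transient vs recurrent in finite chains; for infinite chains (like the one on $\mathbb{Z}$ above) two-way reachability is not enough, which is where the expected-return-time definition earns its keep.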

5. Originally Posted by Anonymous1
(1) If i is transient, this must hold for some j:
P(i --> j) > 0 (it's possible to get to j)
P(j --> i) = 0 (it's impossible to get back)

(2) i is positive recurrent if for all j:
P(i --> j) > 0 (it's possible to get to j)
P(j --> i) > 0 (it's ALWAYS possible to get back)

(3) If it's always possible to get back, then E(time of first return to i) $< \infty$.

Don't get caught up with the expectation, just think of it in terms of (2).
Thanks! I now understand positive recurrence. How about null recurrence? The definition I have is....

state i is said to be null recurrent if the expected time of first return (starting from i) is infinite

6. Originally Posted by NYC
Thanks! I now understand positive recurrence. How about null recurrence? The definition I have is....

state i is said to be null recurrent if the expected time of first return (starting from i) is infinite
Getting back in a finite time a.s. does not guarantee that the expectation is finite. Suppose the first return time T is distributed something like
$\mathbb{P}(T=n)=c\,\frac{1}{n^2}$ for $n \ge 1$, where c is a normalizing constant. Taking the expectation gives

$\mathbb{E}[T]=\sum_{n\ge 1} n\cdot c\,\frac{1}{n^2}=c\sum_{n\ge 1}\frac{1}{n}$

which is not finite. However, T is still finite a.s., since the probabilities sum to 1.

The intuition is that you will get back eventually, just possibly after an arbitrarily large time.
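To make this concrete, the constant here would be $c = 6/\pi^2$, so that the probabilities $c/n^2$ sum to 1. A numerical sketch of the dichotomy above:

```python
import math

c = 6 / math.pi**2          # normalizer: sum_{n>=1} c/n^2 = 1

# P(T < infinity): the probabilities sum to 1, so T is finite a.s.
total_prob = sum(c / n**2 for n in range(1, 10**6))
print(total_prob)           # very close to 1

# Partial sums of E[T] = sum_n n * c/n^2 = c * sum_n 1/n keep growing
# like c*log(N): the expectation is infinite.
def partial_mean(N):
    return sum(n * c / n**2 for n in range(1, N + 1))

for N in (10**2, 10**4, 10**6):
    print(N, partial_mean(N))
```

So the return time is finite with probability 1, yet its mean diverges — exactly the null recurrent situation.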

I'm having a similar sort of problem in understanding null recurrence. What I want is a finite, irreducible Markov chain in which ALL states are null recurrent.

But isn't this just a simple (symmetric) random walk with reflecting barriers?

If anyone can help I would be really grateful!

8. Originally Posted by raheel88
I'm having a similar sort of problem in understanding null recurrence. What I want is a finite, irreducible Markov chain in which ALL states are null recurrent.

But isn't this just a simple (symmetric) random walk with reflecting barriers?

If anyone can help I would be really grateful!

Hi,
For a finite Markov chain no state can be null recurrent: in a finite chain every recurrent state is automatically positive recurrent, so the symmetric walk with reflecting barriers is positive recurrent, not null recurrent.
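This can be checked concretely for raheel88's candidate chain (the instance on $\{0,\dots,4\}$ below is my own small example): detailed balance gives the stationary distribution, and the expected return time to state i is $1/\pi_i$, which is finite for every state.

```python
# Reflecting symmetric walk on {0, ..., N}:
#   p_{0,1} = p_{N,N-1} = 1, interior states go +/-1 with prob 1/2.
N = 4

# Detailed balance pi_i * p_{i,i+1} = pi_{i+1} * p_{i+1,i}
# gives pi proportional to (1, 2, 2, ..., 2, 1).
weights = [1] + [2] * (N - 1) + [1]
total = sum(weights)
pi = [w / total for w in weights]

def one_step(d):
    """Push a distribution d through one step of the chain."""
    out = [0.0] * (N + 1)
    for i, p in enumerate(d):
        if i == 0:
            out[1] += p
        elif i == N:
            out[N - 1] += p
        else:
            out[i - 1] += p / 2
            out[i + 1] += p / 2
    return out

# pi really is stationary...
assert all(abs(a - b) < 1e-12 for a, b in zip(one_step(pi), pi))

# ...and the expected return times 1/pi_i are all finite,
# so every state is positive recurrent, none null recurrent.
print([1 / p for p in pi])   # -> [8.0, 4.0, 4.0, 4.0, 8.0]
```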

9. thanks nyc,

makes sense!