# Markov chain

• Apr 7th 2010, 06:08 AM
NYC
Markov chain
Is it possible to have a Markov chain with an infinite number of transient states and an infinite number of positive recurrent states?
• Apr 7th 2010, 01:05 PM
Moo
Hello,

Let $\displaystyle \mathbb{Z}$ (the integers) be the state space.
Perhaps you can define the probability transitions : $\displaystyle p_{2n,2n\pm 2}=\frac 12$ and $\displaystyle p_{2n+1,2n}=1$ ?
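Moo's suggested transitions are easy to experiment with. The sketch below (the function name and simulation length are mine, not part of the thread) implements one step of that chain and checks the structural point: nothing ever transitions *into* an odd state, so once the chain leaves an odd state it can never return, making every odd state transient.

```python
import random

def step(state):
    """One transition of the suggested chain on the integers.
    From an even state 2n, move to 2n-2 or 2n+2 with probability 1/2 each;
    from an odd state 2n+1, move deterministically down to 2n."""
    if state % 2 == 0:
        return state + random.choice((-2, 2))
    return state - 1

random.seed(0)
# Start at an odd state; after one step the chain is even and stays even
# forever, since no transition enters an odd state.
path = [5]
for _ in range(1000):
    path.append(step(path[-1]))
assert all(x % 2 == 0 for x in path[1:])
```

Note that the even states then evolve as a simple symmetric random walk on the even integers, so whether they are *positive* recurrent (rather than merely recurrent) is exactly the question the rest of the thread turns on.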
• Apr 8th 2010, 01:51 AM
NYC
Quote:

Originally Posted by Moo
Hello,

Let $\displaystyle \mathbb{Z}$ (the integers) be the state space.
Perhaps you can define the probability transitions : $\displaystyle p_{2n,2n\pm 2}=\frac 12$ and $\displaystyle p_{2n+1,2n}=1$ ?

Thanks for the help.
I can see where the infinite transient states come from, but how do you know it gives an infinite number of positive recurrent states? I don't really understand the definition I have of positive recurrent...

A recurrent state i is said to be positive recurrent if the expected time of first return (starting from i) is finite.
• Apr 8th 2010, 05:10 AM
Anonymous1
Quote:

Originally Posted by NYC
I don't really understand the definition I have of positive recurrent.....

(1) If i is transient, this must hold for some j:
P(i-->j)>0 (it's possible to get to j)
P(j-->i)=0 (it's impossible to get back)

(2) i is positive recurrent if for all j:
P(i-->j)>0 (it's possible to get to j)
P(j-->i)>0 (it's ALWAYS possible to get back)

(3) If it's always possible to get back, then E(time of first return to i) $\displaystyle < \infty$.

Don't get caught up with the expectation; just think of it in terms of (2).
• Apr 12th 2010, 08:40 AM
NYC
Quote:

Originally Posted by Anonymous1
(1) If i is transient, this must hold for some j:
P(i-->j)>0 (it's possible to get to j)
P(j-->i)=0 (it's impossible to get back)

(2) i is positive recurrent if for all j:
P(i-->j)>0 (it's possible to get to j)
P(j-->i)>0 (it's ALWAYS possible to get back)

(3) If it's always possible to get back, then E(time of first return to i) $\displaystyle < \infty$.

Don't get caught up with the expectation; just think of it in terms of (2).

Thanks! I now understand positive recurrence. How about null recurrence? The definition I have is...

A state i is said to be null recurrent if the expected time of first return (starting from i) is infinite.
• Apr 12th 2010, 11:06 AM
Focus
Quote:

Originally Posted by NYC
Thanks! I now understand positive recurrence. How about null recurrence? The definition I have is...

A state i is said to be null recurrent if the expected time of first return (starting from i) is infinite.

Getting back in finite time a.s. does not guarantee that the expectation is finite. Suppose the first return time T is distributed something like
$\displaystyle \mathbb{P}(T=n)=c \frac{1}{n^2}$, where c is a normalising constant; then taking the expectation you get

$\displaystyle \mathbb{E}[T]=c\sum\frac{1}{n}$

which is not finite. However, you will still get back in finite time a.s.

The intuition is that you will get there eventually, just in an arbitrarily large time.
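Focus's example can be checked numerically. A sketch (the truncation level N is an arbitrary choice of mine) using the normalising constant $\displaystyle c = 6/\pi^2$, which makes the probabilities $\displaystyle c/n^2$ sum to 1 since $\displaystyle \sum 1/n^2 = \pi^2/6$:

```python
import math

c = 6 / math.pi**2   # normalises P(T = n) = c/n^2 to a probability distribution
N = 10**6

# The probability mass converges to 1: the return time is a.s. finite...
prob_mass = c * sum(1 / n**2 for n in range(1, N + 1))

# ...but the partial sums of the expectation grow like c * ln(N), without bound.
partial_expectation = c * sum(1 / n for n in range(1, N + 1))

assert prob_mass > 0.999
assert partial_expectation > 8   # and it keeps growing as N increases
```

This is the null-recurrent picture in miniature: "you will get there eventually" (total probability 1) while the expected waiting time diverges.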
• Apr 13th 2010, 01:49 PM
raheel88
I'm having a similar sort of problem in understanding null-recurrence. What I want is a finite, irreducible Markov chain whereby ALL states are null-recurrent.

But isn't this just a simple (symmetric) random walk with reflecting barriers?

If anyone can help I would be really grateful!
• Apr 14th 2010, 06:01 AM
NYC
Quote:

Originally Posted by raheel88
I'm having a similar sort of problem in understanding null-recurrence. What I want is a finite, irreducible Markov chain whereby ALL states are null-recurrent.

But isn't this just a simple (symmetric) random walk with reflecting barriers?

If anyone can help I would be really grateful!

Hi,
For a finite Markov chain no state can be null-recurrent: in a finite chain every recurrent state is positive recurrent, so a null-recurrent example (like the symmetric random walk on $\displaystyle \mathbb{Z}$ with no barriers) needs infinitely many states. Your walk with reflecting barriers is a finite irreducible chain, so all of its states are positive recurrent.
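The reflecting-barrier walk raheel88 mentions can be checked directly. A sketch (the state space {0,...,4} is my arbitrary choice): detailed balance gives the stationary distribution in closed form, and Kac's formula $\displaystyle \mathbb{E}_i[T_i] = 1/\pi_i$ then shows every mean return time is finite, i.e. every state is positive recurrent.

```python
from fractions import Fraction

half = Fraction(1, 2)
# Symmetric random walk on {0, 1, 2, 3, 4} with reflecting barriers:
# the barriers 0 and 4 bounce back with probability 1.
P = [
    [0,    1,    0,    0,    0],
    [half, 0,    half, 0,    0],
    [0,    half, 0,    half, 0],
    [0,    0,    half, 0,    half],
    [0,    0,    0,    1,    0],
]

# Detailed balance gives pi proportional to (1, 2, 2, 2, 1).
pi = [Fraction(w, 8) for w in (1, 2, 2, 2, 1)]

# Verify stationarity exactly: (pi P)_j = pi_j for every j.
for j in range(5):
    assert sum(pi[i] * P[i][j] for i in range(5)) == pi[j]

# Kac's formula: mean return time to state i is 1/pi_i -- all finite.
mean_returns = [1 / p for p in pi]
assert mean_returns == [8, 4, 4, 4, 8]
```

So no finite chain can serve as a null-recurrence example; the textbook one is the barrier-free symmetric walk on all of $\displaystyle \mathbb{Z}$.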
• Apr 14th 2010, 06:19 AM
raheel88
thanks nyc,

makes sense!