Thread: Markov Chain Positive Recurrent

1. Markov Chain Positive Recurrent

Suppose we are told that a chain is recurrent, where p(x,x-1)=1 for x>=1 and p(0,x)=p_x

In other words, whenever the chain is at a state other than 0 it proceeds deterministically, one step at a time toward 0, and whenever the chain reaches state 0, it chooses a new state.

Under what conditions on the Px is the chain positive recurrent?

I've found the limiting distribution of x is 1/XPx.

Is it correct?

Thanks in advance!!

2. Originally Posted by exia
Suppose we are told that a chain is recurrent, where p(x,x-1)=1 for x>=1 and p(0,x)=p_x

In other words, whenever the chain is at a state other than 0 it proceeds deterministically, one step at a time toward 0, and whenever the chain reaches state 0, it chooses a new state.

Under what conditions on the Px is the chain positive recurrent?

I've found the limiting distribution of x is 1/XPx.

Is it correct?

Thanks in advance!!
I don't understand what you mean by "1/XPx". As for me, I've found the chain is positive recurrent iff $\sum_x x p_x<\infty$, and that in this case the limiting distribution is $\pi(x)=\frac{1-(p_1+p_2+\cdots + p_{x-1})}{1+\sum_x x p_x} = \frac{P(X\geq x)}{1+E[X]}$ if $X$ has distribution given by $P(X=x)=p_x$. (The denominator $1+E[X]$ is the mean return time to $0$: one jump from $0$ to $X$, then $X$ steps back down.) I got this from the definition of the invariant distribution $\pi$: it says $\pi(x)=\pi(0)p_x+\pi(x-1)$ for $x\geq 1$.
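Not part of the thread, but a quick numerical sketch can check this answer. For a made-up finite choice of $p_x$, build the transition matrix and test that $\pi(x) = P(X\geq x)$, normalized, is invariant. Note that with $\pi(0)=1$ the unnormalized weights sum to $1+E[X]$, since the $x=0$ term contributes $P(X\geq 0)=1$ on top of $\sum_{x\geq 1}P(X\geq x)=E[X]$; that is the normalizer used below.

```python
import numpy as np

# Hypothetical example: p_1 = p_2 = 1/2, so E[X] = 3/2.
N = 3                            # states 0, 1, 2
p = np.array([0.0, 0.5, 0.5])    # p[x] = P(X = x)

# Transition matrix: p(0, x) = p_x, and p(x, x-1) = 1 for x >= 1.
P = np.zeros((N, N))
P[0, :] = p
for x in range(1, N):
    P[x, x - 1] = 1.0

# Candidate stationary distribution: pi(x) = P(X >= x) / (1 + E[X]).
tail = np.array([p[x:].sum() for x in range(N)])   # tail[x] = P(X >= x)
EX = float((np.arange(N) * p).sum())               # E[X] = 1.5
pi = tail / (1.0 + EX)

print(np.allclose(pi @ P, pi))    # invariance: pi P = pi
print(np.isclose(pi.sum(), 1.0))  # and pi is a probability distribution
```

Here $\pi = (0.4, 0.4, 0.2)$, and one can verify the balance at each state by hand (e.g. $\pi(1)=\pi(2)\cdot 1+\pi(0)p_1=0.2+0.2$).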

3. Originally Posted by Laurent
I don't understand what you mean by "1/XPx". As for me, I've found the chain is positive recurrent iff $\sum_x x p_x<\infty$, and that in this case the limiting distribution is $\pi(x)=\frac{1-(p_1+p_2+\cdots + p_{x-1})}{1+\sum_x x p_x} = \frac{P(X\geq x)}{1+E[X]}$ if $X$ has distribution given by $P(X=x)=p_x$. (The denominator $1+E[X]$ is the mean return time to $0$: one jump from $0$ to $X$, then $X$ steps back down.) I got this from the definition of the invariant distribution $\pi$: it says $\pi(x)=\pi(0)p_x+\pi(x-1)$ for $x\geq 1$.
Thank you so much for the reply!

I kinda get it now, but how could you get the numerator of the limiting distribution? In other words, how could you solve the difference equation for pi(x)?

4. Originally Posted by exia
Thank you so much for the reply!

I kinda get it now, but how could you get the numerator of the limiting distribution? In other words, how could you solve the difference equation for pi(x)?
First there was a typo, the equation is $\pi(x)=p_x \pi(0)+\pi(x+1)$: state $x$ is entered from $x+1$ (a deterministic step down) and from $0$ (a jump with probability $p_x$).

Then, I chose $\pi(0)=1$ for convenience; rearranged, the equation reads $\pi(x+1)=\pi(x)-p_x\pi(0)$, so (taking $p_0=0$) $\pi(1)=\pi(0)-p_0=1$, $\pi(2)=\pi(1)-p_1=1-p_1$, $\pi(3)=1-p_1-p_2$, etc.
In terms of $X$ (with $P(X=x)=p_x$), this says $\pi(x)=P(X\geq x)$ for every $x\geq 0$.
Then $\sum_{x\geq 0} \pi(x)=P(X\geq 0)+\sum_{x\geq 1} P(X\geq x)=1+E[X]$, which is the mean return time to $0$.
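To make the unrolling concrete, here is a small sketch (with a hypothetical, finitely supported choice of the $p_x$) that runs the recursion $\pi(x+1)=\pi(x)-p_x\,\pi(0)$ from $\pi(0)=1$ and checks that the resulting weights are exactly the tail probabilities $P(X\geq x)$, with total mass $1+E[X]$.

```python
# Unroll the recursion pi(x+1) = pi(x) - p_x * pi(0), with pi(0) = 1,
# for a hypothetical finitely supported choice of the p_x.
p = {1: 0.5, 2: 0.3, 3: 0.2}          # p[x] = P(X = x), so E[X] = 1.7
N = max(p)

pi = [1.0]                            # pi(0) = 1 by convention
for x in range(N):
    pi.append(pi[x] - p.get(x, 0.0))  # pi(x+1) = pi(x) - p_x

# The recursion reproduces the tail probabilities P(X >= x) ...
tails = [sum(v for k, v in p.items() if k >= x) for x in range(N + 1)]
assert all(abs(a - b) < 1e-12 for a, b in zip(pi, tails))

# ... and the total mass is 1 + E[X], the mean return time to 0, so the
# normalized stationary distribution is pi(x) = P(X >= x) / (1 + E[X]).
EX = sum(k * v for k, v in p.items())
print(abs(sum(pi) - (1.0 + EX)) < 1e-12)  # True
```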
