Math Help - Markov Chain Positive Recurrent

  1. #1
    Newbie
    Joined
    Mar 2009
    Posts
    4

    Markov Chain Positive Recurrent

    Suppose a chain on the nonnegative integers has transition probabilities p(x,x-1)=1 for x\geq 1 and p(0,x)=p_x, and it is said that the chain is recurrent.

    In other words, whenever the chain is at a state other than 0 it proceeds deterministically, one step at a time toward 0, and whenever the chain reaches state 0, it chooses a new state.
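    The dynamics above can be sketched in a few lines of Python. The jump distribution p_x = (1/2)^x for x \geq 1 used here is a hypothetical choice for illustration, not one given in the problem:

    ```python
    import random

    def step(x):
        """One step of the chain: deterministic drift toward 0, with a
        random jump from 0 drawn from the hypothetical p_x = (1/2)^x."""
        if x > 0:
            return x - 1                 # p(x, x-1) = 1
        u = random.random()              # sample the new state from 0
        x, acc = 1, 0.5                  # P(X = 1) = 1/2
        while u > acc:                   # invert the CDF of X
            x += 1
            acc += 0.5 ** x
        return x

    state = 0
    for _ in range(1000):
        state = step(state)              # the chain keeps returning to 0
    ```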

    Under what conditions on the Px is the chain positive recurrent?

    I've found the limiting distribution of x is 1/XPx.

    Is it correct?

    Thanks in advance!!

  2. #2
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by exia View Post
    Suppose a chain on the nonnegative integers has transition probabilities p(x,x-1)=1 for x\geq 1 and p(0,x)=p_x, and it is said that the chain is recurrent.

    In other words, whenever the chain is at a state other than 0 it proceeds deterministically, one step at a time toward 0, and whenever the chain reaches state 0, it chooses a new state.

    Under what conditions on the Px is the chain positive recurrent?

    I've found the limiting distribution of x is 1/XPx.

    Is it correct?

    Thanks in advance!!
    I don't understand what you mean by "1/XPx". As for me, I've found the chain is positive recurrent iff \sum_x x p_x<\infty, and that in this case the limiting distribution is \frac{1-(p_1+p_2+\cdots + p_{x-1})}{1+\sum_x x p_x} = \frac{P(X\geq x)}{1+E[X]} if X has distribution given by P(X=x)=p_x (the expected return time to 0 is 1+E[X]: one step for the jump from 0 to X, then X steps back down). I got this from the definition of the invariant distribution \pi: it says \pi(x)=\pi(0)p_x+\pi(x-1) for x\geq 1.
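    As a numerical sanity check (not part of the original post), one can verify that \pi(x)=P(X\geq x) satisfies the balance equation \pi(x)=p_x\pi(0)+\pi(x+1), again with the hypothetical choice p_x=(1/2)^x and a truncated state space:

    ```python
    # Truncate the state space; the neglected tail mass is about 2^-49.
    N = 50
    p = [0.0] + [0.5 ** x for x in range(1, N)]        # p_0 = 0, p_x = (1/2)^x
    EX = sum(x * p[x] for x in range(N))               # E[X], here about 2
    tail = [sum(p[x:]) for x in range(N)]              # tail[x] = P(X >= x)

    # Positive recurrence condition: E[X] finite.
    print(EX)                                          # about 2.0

    # Balance equation pi(x) = p_x * pi(0) + pi(x+1), with pi = tail:
    balanced = all(abs(tail[x] - (p[x] * tail[0] + tail[x + 1])) < 1e-12
                   for x in range(1, N - 1))
    print(balanced)                                    # True
    ```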

  3. #3
    Newbie
    Joined
    Mar 2009
    Posts
    4
    Quote Originally Posted by Laurent View Post
    I don't understand what you mean by "1/XPx". As for me, I've found the chain is positive recurrent iff \sum_x x p_x<\infty, and that in this case the limiting distribution is \frac{1-(p_1+p_2+\cdots + p_{x-1})}{1+\sum_x x p_x} = \frac{P(X\geq x)}{1+E[X]} if X has distribution given by P(X=x)=p_x (the expected return time to 0 is 1+E[X]: one step for the jump from 0 to X, then X steps back down). I got this from the definition of the invariant distribution \pi: it says \pi(x)=\pi(0)p_x+\pi(x-1) for x\geq 1.
    Thank you so much for the reply!

    I kind of get it now, but how did you get the numerator of the limiting distribution? In other words, how do you solve the difference equation for \pi(x)?

  4. #4
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by exia View Post
    Thank you so much for the reply!

    I kind of get it now, but how did you get the numerator of the limiting distribution? In other words, how do you solve the difference equation for \pi(x)?
    First there was a typo: the equation is \pi(x)=p_x \pi(0)+\pi(x+1).

    Then, I chose \pi(0)=1 for convenience. At x=0 the equation gives \pi(1)=\pi(0)=1 (since p_0=0), and then \pi(2)=\pi(1)-p_1=1-p_1, \pi(3)=\pi(2)-p_2=1-p_1-p_2, etc.
    In terms of X (with P(X=x)=p_x), \pi(x)=P(X\geq x) for every x\geq 0.
    Then \sum_{x\geq 0}\pi(x)=1+\sum_{x\geq 1} P(X\geq x)=1+E[X], which is the normalizing constant.
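    The identity \sum_{x\geq 1} P(X\geq x)=E[X] can be checked numerically as well; the p_x below is the same hypothetical geometric choice, not one from the thread:

    ```python
    N = 50
    p = [0.0] + [0.5 ** x for x in range(1, N)]   # hypothetical p_x = (1/2)^x
    lhs = sum(sum(p[x:]) for x in range(1, N))    # sum over x >= 1 of P(X >= x)
    rhs = sum(x * p[x] for x in range(N))         # E[X]
    print(abs(lhs - rhs) < 1e-9)                  # True: the two sums agree
    ```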

