
Math Help - Markov chain

  1. #1 NYC

    Markov chain

    Is it possible to have a Markov chain with an infinite number of transient states and an infinite number of positive recurrent states?

  2. #2 Moo

    Hello,

    Let \mathbb{Z} (the integers) be the state space.
    Perhaps you can define the transition probabilities p_{2n,2n+2}=p_{2n,2n-2}=\frac{1}{2} and p_{2n+1,2n}=1 ?
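
    If it helps to see what this chain does, here is a quick Python sketch (just an illustration; the function name is mine, not anything standard):

    import random

    def step(state):
        """One transition of the chain sketched above on the integers:
        an odd state 2n+1 jumps to 2n with probability 1;
        an even state 2n jumps to 2n+2 or 2n-2 with probability 1/2 each."""
        if state % 2 != 0:
            return state - 1                    # 2n+1 -> 2n, so odd states are never revisited
        return state + random.choice([-2, 2])   # symmetric step among the even states

    # short trajectory started from an odd state
    x = 5
    path = [x]
    for _ in range(20):
        x = step(x)
        path.append(x)
    print(path)   # after the first step the walk stays on the even integers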

  3. #3 NYC

    Quote Originally Posted by Moo
    Hello,

    Let \mathbb{Z} (the integers) be the state space.
    Perhaps you can define the transition probabilities p_{2n,2n+2}=p_{2n,2n-2}=\frac{1}{2} and p_{2n+1,2n}=1 ?

    Thanks for the help.
    I can see where the infinite number of transient states comes from, but how do you know it gives an infinite number of positive recurrent states? I don't really understand the definition I have of positive recurrence...

    A recurrent state i is said to be positive recurrent if the expected time of first return to i (starting from i) is finite.
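
    In symbols (writing T_i for the first return time to i), the definition reads:

    T_i=\inf\{n\ge 1 : X_n=i\}, \qquad i \text{ positive recurrent} \iff \mathbb{P}_i(T_i<\infty)=1 \ \text{ and } \ \mathbb{E}_i[T_i]<\infty.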

  4. #4
    Super Member Anonymous1's Avatar
    Joined
    Nov 2009
    From
    Big Red, NY
    Posts
    517
    Thanks
    1
    Quote Originally Posted by NYC
    I don't really understand the definition I have of positive recurrence...
    (1) If i is transient, this must hold for some j:
    P(i-->j) > 0 (it's possible to get to j)
    P(j-->i) = 0 (it's impossible to get back)

    (2) i is positive recurrent if for all j:
    P(i-->j) > 0 (it's possible to get to j)
    P(j-->i) > 0 (it's ALWAYS possible to get back)

    (3) If it's always possible to get back, then E(time of first return to i) < oo.

    Don't get caught up with the expectation; just think of it in terms of (2).

  5. #5 NYC

    Quote Originally Posted by Anonymous1
    (1) If i is transient, this must hold for some j:
    P(i-->j) > 0 (it's possible to get to j)
    P(j-->i) = 0 (it's impossible to get back)

    (2) i is positive recurrent if for all j:
    P(i-->j) > 0 (it's possible to get to j)
    P(j-->i) > 0 (it's ALWAYS possible to get back)

    (3) If it's always possible to get back, then E(time of first return to i) < oo.

    Don't get caught up with the expectation; just think of it in terms of (2).
    Thanks! I now understand positive recurrence. How about null recurrence? The definition I have is....

    A recurrent state i is said to be null recurrent if the expected time of first return to i (starting from i) is infinite.
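
    In the same notation as above, this is:

    i \text{ null recurrent} \iff \mathbb{P}_i(T_i<\infty)=1 \ \text{ and } \ \mathbb{E}_i[T_i]=\infty.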

  6. #6 Focus

    Quote Originally Posted by NYC
    Thanks! I now understand positive recurrence. How about null recurrence? The definition I have is....

    A recurrent state i is said to be null recurrent if the expected time of first return to i (starting from i) is infinite.
    Getting back in a finite time a.s. does not guarantee that the expectation is finite. Suppose the first return time T is distributed something like
    \mathbb{P}(T=n)=c\,\frac{1}{n^2}, where c is a normalizing constant. Then when you take the expectation you get

    \mathbb{E}[T]=c\sum_{n\ge 1}\frac{1}{n},

    which is not finite. However, you will still get back in finite time a.s.

    The intuition is that you will get there eventually, just possibly after an arbitrarily large time.
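
    If you want a quick numerical sanity check of this example, here is a small Python sketch (I take c = 6/\pi^2 so that the probabilities sum to 1):

    import math

    c = 6 / math.pi**2          # normalizing constant: sum over n>=1 of c/n^2 equals 1

    total_prob = 0.0
    partial_mean = 0.0
    for n in range(1, 1_000_001):
        p = c / n**2            # P(T = n)
        total_prob += p
        partial_mean += n * p   # n * P(T = n) = c / n

    print(total_prob)    # close to 1: the chain returns in finite time a.s.
    print(partial_mean)  # roughly c * ln(10^6) and still growing: E[T] is infinite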

  7. #7
    Junior Member raheel88's Avatar
    Joined
    Apr 2010
    Posts
    31
    I'm having a similar sort of problem understanding null recurrence. What I want is a finite, irreducible Markov chain in which ALL states are null recurrent.

    But isn't this just a simple (symmetric) random walk with reflecting barriers?

    If anyone can help, I would be really grateful!

  8. #8 NYC

    Quote Originally Posted by raheel88
    I'm having a similar sort of problem understanding null recurrence. What I want is a finite, irreducible Markov chain in which ALL states are null recurrent.

    But isn't this just a simple (symmetric) random walk with reflecting barriers?

    If anyone can help, I would be really grateful!

    Hi,
    For a finite Markov chain, no state can be null recurrent.
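
    One way to see it (just a sketch, using the standard relation between stationary probabilities and mean return times): a recurrent state i of a finite chain lies in a finite closed class, the chain restricted to that class is irreducible and has a stationary distribution \pi with \pi(i)>0, and

    \pi(i)=\frac{1}{\mathbb{E}_i[T_i]} \quad\Longrightarrow\quad \mathbb{E}_i[T_i]=\frac{1}{\pi(i)}<\infty,

    so every recurrent state of a finite chain is positive recurrent (and a finite chain always has at least one recurrent state).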

  9. #9 raheel88

    thanks nyc,

    makes sense!
