
Markov Chain Steady State Probability

  1. #1 wxyj

    Markov Chain Steady State Probability

    Hi, all

    Just a quick question about Markov chains. If we have (x,y,z)*A = (x,y,z), where A is the transition matrix, and find a solution (x,y,z), which gives the probabilities for the steady state, does it mean that as n approaches infinity the steady state probability is the final equilibrium, and not some factor or fraction of the steady state probability?
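    To be concrete, writing A = (a_{ij}) for a generic 3x3 transition matrix (I don't have a specific one in mind), the condition (x,y,z)*A = (x,y,z) means

    \begin{aligned}
    a_{11}x + a_{21}y + a_{31}z &= x \\
    a_{12}x + a_{22}y + a_{32}z &= y \\
    a_{13}x + a_{23}y + a_{33}z &= z
    \end{aligned}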

    Thanks

  2. #2 Moo
    Hello,
    Quote Originally Posted by wxyj
    Hi, all

    Just a quick question about Markov chains. If we have (x,y,z)*A = (x,y,z), where A is the transition matrix, and find a solution (x,y,z), which gives the probabilities for the steady state, does it mean that as n approaches infinity the steady state probability is the final equilibrium?
    For the steady state probability to be the final equilibrium (as you call it), you need the chain to be irreducible, aperiodic and positive recurrent.

    It's written here: Markov chain - Wikipedia, the free encyclopedia (the 2nd and 3rd formulas)


    Quote Originally Posted by wxyj
    ...and not some factor or fraction of the steady state probability?
    You're working with stationary probabilities, so the sum of the components has to be 1. If you were working with stationary measures, the solution would only be unique up to a multiplicative factor (see the sketch below).


    NB: in case of vocabulary confusion, stationary = steady.
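
    A minimal numerical sketch of both points above (the 3-state transition matrix below is made up for illustration; it is not from this thread): solve pi*A = pi with the components summing to 1, then check that every row of A^n approaches that same vector as n grows, as it should for an irreducible, aperiodic, positive recurrent chain.

    [code]
    import numpy as np

    # Hypothetical 3-state transition matrix (each row sums to 1), chosen only for illustration.
    A = np.array([[0.5, 0.3, 0.2],
                  [0.2, 0.6, 0.2],
                  [0.3, 0.3, 0.4]])

    # Stationary distribution: pi * A = pi  <=>  (A^T - I) pi^T = 0, together with sum(pi) = 1.
    # Stack the normalization equation onto the singular system and solve by least squares.
    n = A.shape[0]
    M = np.vstack([A.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi = np.linalg.lstsq(M, b, rcond=None)[0]
    print("stationary distribution:", pi)

    # For an irreducible, aperiodic chain, every row of A^n converges to pi.
    print("rows of A^50:\n", np.linalg.matrix_power(A, 50))
    [/code]

    Running it, each row of A^50 should match the stationary vector to several decimal places.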

  3. #3 Drexel28
    Quote Originally Posted by Moo
    NB: in case of vocabulary confusion, stationary = steady.
    I'm sorry. What does "NB" mean? I see it everywhere!

  4. #4 Moo
    Quote Originally Posted by Drexel28
    I'm sorry. What does "NB" mean? I see it everywhere!
    Nota bene. No wonder you guys don't know it, because you usually don't have the Latin touch!

    Nota bene - Wikipedia, the free encyclopedia


    PS (post scriptum): when I saw you had replied to this thread, I was surprised, lol! I thought you were done with these probabilities you like so much.

  5. #5 Drexel28
    Haha! Thanks. Actually, my ignorant, unfounded dislike of probability was dispelled by you and Laurent with your measure-theoretic probability theory! So, while I'm still as ignorant as ever about the subject, I really appreciate it now!

  6. #6 Moo
    'Serious' probability uses and overuses measure theory.
    Probability is not reduced to combinatorics (very fortunately!)

    Glad to see you say such a thing

