
Math Help - Markov Chain probabilities

#1 by Member (joined Feb 2008, 184 posts)

    Markov Chain probabilities

    A Markov chain on state space {1,2,3,4,5} has transition matrix

\begin{pmatrix}
0 & 0 & 1 & 0 & 0\\
0 & 0 & 4/5 & 1/5 & 0\\
0 & 1/6 & 2/3 & 0 & 1/6\\
0 & 0 & 0 & 1 & 0\\
0 & 0 & 0 & 0 & 1
\end{pmatrix}

    The process starts in state 1.
    a) Which states are absorbing?
    b) Calculate the probability that the process is absorbed at state 4 (ends up at state 4).
    c) Calculate the expectation of the time of absorption.

    I missed my lectures due to illness, so I don't have much understanding of this topic. I would be grateful for some help with this revision question.

    Thanks

#2 by Moo (joined Mar 2008, from "P(I'm here)=1/3, P(I'm there)=t+1/3", 5,618 posts)
    Hello,

    A state is said to be absorbing if, once you are in that state, you never leave it: x is absorbing if P(X_{n+1}=x|X_n=x)=1.
    When you're given the transition matrix, the absorbing states are the ones with a 1 on the diagonal.
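That diagonal check is easy to do programmatically. A minimal sketch (my addition, not from the thread, assuming numpy is available):

```python
import numpy as np

# Transition matrix from the original post (rows/columns are states 1..5).
P = np.array([
    [0, 0,   1,   0,   0  ],
    [0, 0,   4/5, 1/5, 0  ],
    [0, 1/6, 2/3, 0,   1/6],
    [0, 0,   0,   1,   0  ],
    [0, 0,   0,   0,   1  ],
])

# State i is absorbing exactly when P[i, i] == 1: all of the row's
# probability mass stays on state i, so the chain can never leave it.
absorbing = [i + 1 for i in range(5) if P[i, i] == 1.0]
print(absorbing)  # [4, 5]
```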

    For question b), find all the possible paths that go from 1 to 4 (in any number of steps) and add their probabilities. Drawing a sketch of the chain may help.
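One way to sanity-check the path-summing answer is to simulate the chain and estimate the absorption probability empirically. A sketch (my addition, standard library only):

```python
import random

# Transition matrix from the post; index i here is state i+1.
P = [
    [0, 0,   1,   0,   0  ],
    [0, 0,   4/5, 1/5, 0  ],
    [0, 1/6, 2/3, 0,   1/6],
    [0, 0,   0,   1,   0  ],
    [0, 0,   0,   0,   1  ],
]

def absorb(start, rng):
    """Run one path from `start` until it reaches an absorbing state."""
    state = start
    while P[state][state] != 1:                        # absorbing iff P[i][i] == 1
        state = rng.choices(range(5), weights=P[state])[0]
    return state

rng = random.Random(42)
n = 100_000
hits4 = sum(absorb(0, rng) == 3 for _ in range(n))     # state 4 has index 3
print(hits4 / n)                                       # estimate of P(absorbed at 4)
```

The estimate should land close to 1/6, which is what summing the path probabilities gives.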

    For question c), which formula were you given? You must have some lecture notes!

#3 by Member (joined Feb 2008, 184 posts)
    So states 4 and 5 are absorbing...

#4 by Moo (joined Mar 2008, from "P(I'm here)=1/3, P(I'm there)=t+1/3", 5,618 posts)
    Yes, they are.

#5 by Member (joined Feb 2008, 184 posts)
    For part (c), I think I have to use w_{i}=E(T|X_{0}=i), where T is the time of absorption.

    I know w_{4}=w_{5}=0.

    For part (b), u_{i}=P(X_{A}=4|X_{0}=i), where A is the time of absorption.

    Can you help me answer parts (b) and (c) fully?


    thanks
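Those definitions lead to two linear systems: u_i = Σ_j p_{ij} u_j with u_4 = 1, u_5 = 0, and w_i = 1 + Σ_j p_{ij} w_j with w_4 = w_5 = 0. A sketch of the standard fundamental-matrix way to solve both at once (my addition, not from the thread, assuming numpy):

```python
import numpy as np

# Transition matrix from the original post (states 1..5).
P = np.array([
    [0, 0,   1,   0,   0  ],
    [0, 0,   4/5, 1/5, 0  ],
    [0, 1/6, 2/3, 0,   1/6],
    [0, 0,   0,   1,   0  ],
    [0, 0,   0,   0,   1  ],
])

trans = [0, 1, 2]                    # transient states 1, 2, 3 (0-indexed)
Q = P[np.ix_(trans, trans)]          # transitions among transient states
R = P[np.ix_(trans, [3, 4])]         # transitions into absorbing states 4, 5

N = np.linalg.inv(np.eye(3) - Q)     # fundamental matrix (I - Q)^{-1}

u = N @ R[:, 0]                      # u_i = P(absorbed at 4 | X_0 = i)
w = N @ np.ones(3)                   # w_i = E(time of absorption | X_0 = i)

print(u[0])   # 1/6  ≈ 0.1667: answer to (b), starting from state 1
print(w[0])   # 41/6 ≈ 6.8333: answer to (c)
```

This is the same calculation as eliminating u_2, u_3 (resp. w_2, w_3) by hand from the equations above, just packaged as one matrix inverse.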

