
Math Help - Markov Chain

  1. #1
    Senior Member
    Joined
    Oct 2009
    Posts
    295
    Thanks
    9

    Markov Chain

    A Markov chain X_{n}, n\geq 0, with states 0, 1, 2 has the transition probability matrix

    [the transition probability matrix was posted as an image and is not captured in this text]
    If P(X_{0}=0)=P(X_{0}=1)=\frac{1}{4}, find E[X_{3}]

    How do I go about solving this? It seems like something that would be very basic, yet I can't find any similar examples in my textbook.

    Thanks

  2. #2
    Flow Master
    mr fantastic's Avatar
    Joined
    Dec 2007
    From
    Zeitgeist
    Posts
    16,948
    Thanks
    5

    Re: Markov Chain

    Quote Originally Posted by downthesun01 View Post
    A Markov chain X_{n}, n\geq 0, with states 0, 1, 2 has the transition probability matrix

    [the transition probability matrix was posted as an image and is not captured in this text]
    If P(X_{0}=0)=P(X_{0}=1)=\frac{1}{4}, find E[X_{3}]

    How do I go about solving this? It seems like something that would be very basic, yet I can't find any similar examples in my textbook.

    Thanks
    You know S_n = T^n S_0. So calculate S_3 and use its entries to calculate E(X_3).

  3. #3
    Senior Member
    Joined
    Oct 2009
    Posts
    295
    Thanks
    9

    Re: Markov Chain

    Ok, but what do S and T represent?

  4. #4
    Flow Master
    mr fantastic's Avatar
    Joined
    Dec 2007
    From
    Zeitgeist
    Posts
    16,948
    Thanks
    5

    Re: Markov Chain

    Quote Originally Posted by downthesun01 View Post
    Ok, but what do S and T represent?
    If you're studying Markov chains, then you ought to know ....! T is the transition matrix. S is ......

  5. #5
    Senior Member
    Joined
    Oct 2009
    Posts
    295
    Thanks
    9

    Re: Markov Chain

    Hahaha.. apparently we aren't using the same notation. I was under the impression that P represents the probability transition matrix. It's okay. I'll just find out somewhere else. Thanks anyway.

  6. #6
    Flow Master
    mr fantastic's Avatar
    Joined
    Dec 2007
    From
    Zeitgeist
    Posts
    16,948
    Thanks
    5

    Re: Markov Chain

    Quote Originally Posted by downthesun01 View Post
    Hahaha.. apparently we aren't using the same notation. I was under the impression that P represents the probability transition matrix. It's okay. I'll just find out somewhere else. Thanks anyway.
    I figured as much. But I thought the context would suggest the definitions. S is the state matrix; the subscript tells you the step in the chain.

  7. #7
    Senior Member
    Joined
    Oct 2009
    Posts
    295
    Thanks
    9

    Re: Markov Chain

    Thank you for your help and for taking the time to respond to my questions. However, I think it'll take more than a simple nudge towards the answer to help me with this problem.

    We've never had any discussion of state matrices, so I'm not quite sure what they are, and they don't seem to be mentioned in the textbook. I'm going to go ahead and check YouTube for lectures on Markov chains. Maybe they'll have some good examples that will help me out.

    If anyone knows of any sites with well-written, easy-to-follow examples, a link would be much appreciated. Thanks

  8. #8
    Senior Member
    Joined
    Oct 2009
    Posts
    295
    Thanks
    9

    Re: Markov Chain

    Ok. I think I've figured out how to do this.

    First calculate P^3:

    [P^3 was posted as an image; its entries appear in the calculations below]
    Then
    Pr(X_3=0)=\frac{1}{4}\cdot\frac{39}{108}+\frac{1}{4}\cdot\frac{48}{108}+\frac{1}{2}\cdot\frac{45}{108}=\frac{177}{432}

    Pr(X_3=1)=\frac{1}{4}\cdot\frac{22}{108}+\frac{1}{4}\cdot\frac{16}{108}+\frac{1}{2}\cdot\frac{24}{108}=\frac{86}{432}

    Pr(X_3=2)=\frac{1}{4}\cdot\frac{47}{108}+\frac{1}{4}\cdot\frac{44}{108}+\frac{1}{2}\cdot\frac{39}{108}=\frac{169}{432}

    Then E(X_3)=(0)\frac{177}{432}+(1)\frac{86}{432}+(2)\frac{169}{432}=\frac{424}{432}=\frac{53}{54}

    Can someone confirm whether this is correct or not? Thanks
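    A quick sanity check of the arithmetic above, done with exact fractions in Python. The matrix P3 below is transcribed from the calculations in this post (row i, column j is the three-step probability of going from state i to state j); nothing else is assumed:

```python
from fractions import Fraction as F

# P^3, transcribed from the calculations above:
# row i, column j = Pr(X_3 = j | X_0 = i)
P3 = [
    [F(39, 108), F(22, 108), F(47, 108)],
    [F(48, 108), F(16, 108), F(44, 108)],
    [F(45, 108), F(24, 108), F(39, 108)],
]

# Initial distribution over states 0, 1, 2 from the problem statement
s0 = [F(1, 4), F(1, 4), F(1, 2)]

# Distribution after three steps: s3[j] = sum_i s0[i] * P3[i][j]
s3 = [sum(s0[i] * P3[i][j] for i in range(3)) for j in range(3)]

# Expectation of X_3; equals 424/432 = 53/54
E_X3 = sum(j * s3[j] for j in range(3))
```

    Using exact fractions avoids any floating-point doubt when checking that the three probabilities sum to 1.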

  9. #9
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4

    Re: Markov Chain

    Quote Originally Posted by mr fantastic View Post
    You know S_n = T^n S_0. So calculate S_3 and use its entries to calculate E(X_3).
    From the given transition matrix it is clear that the poster is using the other convention (which, from my observation of posts on MHF, is the more common notation in undergraduate education today), where in your notation S_n is a row vector and:

    S_n=S_{n-1}\text{A}=S_{0}\text{A}^n

    where \text{A} is the transition matrix in that convention, with each row summing to 1.

  10. #10
    Moo
    A Cute Angle Moo's Avatar
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6

    Re: Markov Chain

    Hello,

    Quote Originally Posted by downthesun01 View Post
    Ok. I think I've figured out how to do this.

    First calculate P^3:

    [P^3 was posted as an image; its entries appear in the calculations below]
    Then
    Pr(X_3=0)=\frac{1}{4}\cdot\frac{39}{108}+\frac{1}{4}\cdot\frac{48}{108}+\frac{1}{2}\cdot\frac{45}{108}=\frac{177}{432}

    Pr(X_3=1)=\frac{1}{4}\cdot\frac{22}{108}+\frac{1}{4}\cdot\frac{16}{108}+\frac{1}{2}\cdot\frac{24}{108}=\frac{86}{432}

    Pr(X_3=2)=\frac{1}{4}\cdot\frac{47}{108}+\frac{1}{4}\cdot\frac{44}{108}+\frac{1}{2}\cdot\frac{39}{108}=\frac{169}{432}

    Then E(X_3)=(0)\frac{177}{432}+(1)\frac{86}{432}+(2)\frac{169}{432}=\frac{424}{432}

    Can someone confirm whether this is correct or not? Thanks
    I haven't checked your calculations, but the reasoning is correct.

    This is basically what Mr F suggested, but I tend to agree that his notation is unusual.

  11. #11
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4

    Re: Markov Chain

    Quote Originally Posted by downthesun01 View Post
    A Markov chain X_{n}, n\geq 0, with states 0, 1, 2 has the transition probability matrix

    [the transition probability matrix was posted as an image and is not captured in this text]
    If P(X_{0}=0)=P(X_{0}=1)=\frac{1}{4}, find E[X_{3}]

    How do I go about solving this? It seems like something that would be very basic, yet I can't find any similar examples in my textbook.

    Thanks
    The initial distribution vector for the states is S_0=[0.25,0.25,0.5]

    So:

    S_3=S_2\text{A}=(S_1\text{A})\text{A}=((S_0\text{A})\text{A})\text{A}=S_0\text{A}^3

    which gives the probability of each of the states, from which you compute the expectation.
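    This recipe can be sketched in a few lines of Python. The thread's actual transition matrix was posted as an image and isn't reproduced here, so the row-stochastic matrix A below is a made-up stand-in (an assumption, not the problem's matrix); only the initial vector S_0 = [0.25, 0.25, 0.5] comes from the problem:

```python
def vec_mat(v, M):
    """Row vector times matrix: (vM)[j] = sum_i v[i] * M[i][j]."""
    return [sum(v[i] * M[i][j] for i in range(len(v)))
            for j in range(len(M[0]))]

# Hypothetical row-stochastic transition matrix (each row sums to 1);
# the problem's actual matrix is not reproduced in this thread.
A = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]

s = [0.25, 0.25, 0.5]  # S_0, from the problem statement

# S_3 = S_0 A^3, applied one step at a time
for _ in range(3):
    s = vec_mat(s, A)

# Expectation of X_3 over states 0, 1, 2
E_X3 = sum(state * p for state, p in enumerate(s))
```

    With the row convention, the state vector multiplies the matrix from the left at each step, which is why the loop updates s = vec_mat(s, A) rather than the other way around.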

    CB

  12. #12
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4

    Re: Markov Chain

    Quote Originally Posted by downthesun01 View Post
    Ok. I think I've figured out how to do this.

    First calculate P^3:

    [P^3 was posted as an image; its entries appear in the calculations below]
    Then
    Pr(X_3=0)=\frac{1}{4}\cdot\frac{39}{108}+\frac{1}{4}\cdot\frac{48}{108}+\frac{1}{2}\cdot\frac{45}{108}=\frac{177}{432}

    Pr(X_3=1)=\frac{1}{4}\cdot\frac{22}{108}+\frac{1}{4}\cdot\frac{16}{108}+\frac{1}{2}\cdot\frac{24}{108}=\frac{86}{432}

    Pr(X_3=2)=\frac{1}{4}\cdot\frac{47}{108}+\frac{1}{4}\cdot\frac{44}{108}+\frac{1}{2}\cdot\frac{39}{108}=\frac{169}{432}

    Then E(X_3)=(0)\frac{177}{432}+(1)\frac{86}{432}+(2)\frac{169}{432}=\frac{424}{432}

    Can someone confirm whether this is correct or not? Thanks
    Well I have checked your arithmetic and it is correct.

    CB

  13. #13
    Senior Member
    Joined
    Oct 2009
    Posts
    295
    Thanks
    9

    Re: Markov Chain

    Thank you for all of your responses. All much appreciated.

