
Can someone explain this Markov Chain to me

  1. #1
    downthesun01 (Senior Member)

    Can someone explain this Markov Chain to me

    It's an example from my textbook

    Suppose that whether or not it rains today depends on previous weather conditions through the last two days. Specifically, suppose that if it has rained for the past two days, then it will rain tomorrow with probability 0.7; if it rained today but not yesterday, then it will rain tomorrow with probability 0.5; if it rained yesterday but not today, then it will rain tomorrow with probability 0.4; if it has not rained in the past two days, then it will rain tomorrow with probability 0.2.

    If we let the state at time n depend only on whether or not it is raining at time n, then the preceding model is not a Markov chain (explain why?). However, we can transform this model into a Markov chain by saying that the state at any time is determined by the weather conditions during both that day and the previous day. In other words, we can say that the process is in

    state 0: if it rained both today and yesterday
    state 1: if it rained today but not yesterday
    state 2: if it rained yesterday but not today
    state 3: if it did not rain either yesterday or today

    The preceding would then represent a four-state Markov Chain having a transition probability matrix

        | 0.7   0   0.3   0  |
        | 0.5   0   0.5   0  |
        |  0   0.4   0   0.6 |
        |  0   0.2   0   0.8 |

    I'm having a hard time understanding the different states and how the probabilities are calculated.

    Conversely, I had a very easy time understanding this one, which is one of my homework problems.


    Three white and three black balls are distributed in two urns in such a way that each contains three balls. We say that the system is in state i, i=0,1,2,3, if the first urn contains i white balls. At each step, we draw one ball from each urn and place the ball drawn from the first urn into the second, and conversely with the ball from the second urn. Let X_n denote the state of the system after the nth step. Explain why {X_n, n=0,1,2,...} is a Markov chain and calculate its transition probability matrix.

    Any help is appreciated.

  2. #2
    MHF Contributor

    Re: Can someone explain this Markov Chain to me

    Have you considered multiplying the matrix by itself to produce the results of two cycles?
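
    For instance, a minimal Python sketch (my own, not from the thread; it assumes numpy and the transition matrix quoted in the first post):

    Code:
    import numpy as np

    # One-step transition matrix of the four-state weather chain
    # (rows and columns are states 0..3 as defined in the first post).
    P = np.array([
        [0.7, 0.0, 0.3, 0.0],
        [0.5, 0.0, 0.5, 0.0],
        [0.0, 0.4, 0.0, 0.6],
        [0.0, 0.2, 0.0, 0.8],
    ])

    # Entry (i, j) of P @ P is the probability of being in state j
    # two days from now, given that the chain is in state i today.
    print(P @ P)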

  3. #3
    HallsofIvy (MHF Contributor)

    Re: Can someone explain this Markov Chain to me

    Quote Originally Posted by downthesun01
    It's an example from my textbook

    Suppose that whether or not it rains today depends on previous weather conditions through the last two days. Specifically, suppose that if it has rained for the past two days, then it will rain tomorrow with probability 0.7; if it rained today but not yesterday, then it will rain tomorrow with probability 0.5; if it rained yesterday but not today, then it will rain tomorrow with probability 0.4; if it has not rained in the past two days, then it will rain tomorrow with probability 0.2.

    If we let the state at time n depend only on whether or not it is raining at time n, then the preceding model is not a Markov chain (explain why?).
    In a Markov chain, the next step depends only on the state at this step (or, equivalently, the state at this step depends only on the state at the previous step). This is not a Markov chain because the weather today (at this step) depends on the weather of the previous two days.

    However, we can transform this model into a Markov chain by saying that the state at any time is determined by the weather conditions during both that day and the previous day. In other words, we can say that the process is in
    But we can make it a Markov chain by thinking of a "state" as being two days, today and yesterday. While the weather today has only two values (rain or no rain), the weather "yesterday and today" has 2^2 = 4 possible values, those given as
    state 0: if it rained both today and yesterday
    state 1: if it rained today but not yesterday
    state 2: if it rained yesterday but not today
    state 3: if it did not rain either yesterday or today

    The preceding would then represent a four-state Markov Chain having a transition probability matrix
    Suppose we are in state 0: it rained both yesterday and today. Because "if it has rained for the past two days, then it will rain tomorrow with probability 0.7", the probability that it will rain tomorrow is 0.7. The probability that it rained today is, of course, 1: it did rain today. Therefore the probability that it rains both today and tomorrow, which is state 0 ("it rained yesterday and today") at the next step, tomorrow, is (1)(0.7) = 0.7; with the remaining probability (1)(0.3) the chain moves instead to state 2, rain today but not tomorrow.

    Do you see now what those eight "0"s mean? Those are the cases where "yesterday's weather", as seen from the next step (tomorrow), is NOT the same as "today's weather". Tomorrow's weather may be either rain or no rain, but today's weather is already determined: the probability that it did rain today is either 1 or 0, so we are always multiplying the probabilities for "tomorrow" by 1 or 0. (The code sketch at the end of this post builds the matrix from exactly this rule.)

        | 0.7   0   0.3   0  |
        | 0.5   0   0.5   0  |
        |  0   0.4   0   0.6 |
        |  0   0.2   0   0.8 |

    I'm having a hard time understanding the different states and how the probabilities are calculated.

    Conversely, I had a very easy time understanding this one, which is one of my homework problems.


    Three white and three black balls are distributed in two urns in such a way that each contains three balls. We say that the system is in state i, i=0,1,2,3, if the first urn contains i white balls. At each step, we draw one ball from each urn and place the ball drawn from the first urn into the second, and conversely with the ball from the second urn. Let X_n denote the state of the system after the nth step. Explain why {X_n, n=0,1,2,...} is a Markov chain and calculate its transition probability matrix.

    Any help is appreciated.
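
    To make the bookkeeping above concrete, here is a short Python sketch (my own encoding, not from the textbook) that builds the matrix directly from the rain rule; the eight zeros fall out of the check that tomorrow's "yesterday" must equal today's "today":

    Code:
    import numpy as np

    # P(rain tomorrow | rained yesterday, rained today), the textbook rule.
    rain_tomorrow = {
        (True,  True):  0.7,  # rained both days
        (False, True):  0.5,  # rained today but not yesterday
        (True,  False): 0.4,  # rained yesterday but not today
        (False, False): 0.2,  # no rain either day
    }

    # State encoding from the post: state i is (rained yesterday, rained today).
    states = [(True, True), (False, True), (True, False), (False, False)]

    P = np.zeros((4, 4))
    for i, (yesterday, today) in enumerate(states):
        p = rain_tomorrow[(yesterday, today)]
        for j, (new_yesterday, new_today) in enumerate(states):
            # After one step, the new "yesterday" must be the old "today";
            # mismatched pairs are impossible transitions and stay 0.
            if new_yesterday != today:
                continue
            P[i, j] = p if new_today else 1.0 - p

    print(P)  # reproduces the matrix above, zeros included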

  4. #4
    downthesun01 (Senior Member)

    Re: Can someone explain this Markov Chain to me

    Thank you. I think I have a bit of a better understanding, but I still don't fully comprehend it.

    Maybe I missed it in your answer, but it seems like rows 1 through 4 represent states 0 through 3. If so, what do columns 1 through 4 represent? Why is row 1 {0.7, 0, 0.3, 0} instead of {0, 0.7, 0.3, 0} or some other combination?

    For example, on my homework problem, the transition probability matrix ended up looking like this.

        | 0    1    0    0   |
        | 1/9  4/9  4/9  0   |
        | 0    4/9  4/9  1/9 |
        | 0    0    1    0   |

    With this matrix, it's quite easy to see that, for example, p(0,0), going from 0 white balls to 0 white balls, has a probability of 0, and p(2,0), going from having two white balls in urn one to having 0, also has a probability of 0.

    Is there a way of viewing this weather example in a similar, easy-to-follow fashion? I apologize for inserting images like that, but I can't seem to get LaTeX to create tables for a matrix anymore.
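
    For what it's worth, the urn matrix can be generated the same way; a minimal sketch (my own code, using exact fractions) that computes each p(i, j) from the two independent draws:

    Code:
    from fractions import Fraction

    # State i: urn 1 holds i white and 3-i black balls, so urn 2
    # holds 3-i white and i black. One ball is drawn from each urn
    # and the two drawn balls are swapped.
    P = [[Fraction(0)] * 4 for _ in range(4)]
    for i in range(4):
        w1 = Fraction(i, 3)      # P(white drawn from urn 1)
        w2 = Fraction(3 - i, 3)  # P(white drawn from urn 2)
        if i > 0:
            P[i][i - 1] = w1 * (1 - w2)  # white leaves urn 1, black enters
        if i < 3:
            P[i][i + 1] = (1 - w1) * w2  # black leaves urn 1, white enters
        P[i][i] = w1 * w2 + (1 - w1) * (1 - w2)  # colors match: count unchanged

    for row in P:
        print([str(x) for x in row])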

