Have you considered multiplying the matrix by itself to produce the results of two cycles?
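For instance, here is a quick sketch in plain Python. The matrix entries are my reading of the textbook's four-state weather example; the entries of P·P are then the two-day transition probabilities:

```python
# One-step transition matrix for the four-state weather chain
# (states 0-3 as in the textbook example; values assumed from the problem).
P = [
    [0.7, 0.0, 0.3, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.4, 0.0, 0.6],
    [0.0, 0.2, 0.0, 0.8],
]

def matmul(A, B):
    """Plain-Python matrix product of two square matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Entry (i, j) of P2: probability of being in state j two days from now,
# given state i now.
P2 = matmul(P, P)
for row in P2:
    print(row)
```

Each row of P2 still sums to 1, as a transition matrix must.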
It's an example from my textbook
Suppose that whether or not it rains today depends on previous weather conditions through the last two days. Specifically, suppose that if it has rained for the past two days, then it will rain tomorrow with probability 0.7; if it rained today but not yesterday, then it will rain tomorrow with probability 0.5; if it rained yesterday but not today, then it will rain tomorrow with probability 0.4; if it has not rained in the past two days, then it will rain tomorrow with probability 0.2.
If we let the state at time n depend only on whether or not it is raining at time n, then the preceding model is not a Markov chain (explain why?). However, we can transform this model into a Markov chain by saying that the state at any time is determined by the weather conditions during both that day and the previous day. In other words, we can say that the process is in
state 0: if it rained both today and yesterday
state 1: if it rained today but not yesterday
state 2: if it rained yesterday but not today
state 3: if it did not rain either yesterday or today
The preceding would then represent a four-state Markov Chain having a transition probability matrix
I'm having a hard time understanding the different states and how the probabilities are calculated.
Conversely, I had a very easy time understanding this one which is one of my homework problems.
Three white and three black balls are distributed in two urns in such a way that each contains three balls. We say that the system is in state i,i=0,1,2,3, if the first urn contains i white balls. At each step, we draw one ball from each urn and place the ball drawn from the first urn into the second, and conversely with the ball from the second urn. Let X_n denote the state of the system after the nth step. Explain why {X_n,n=0,1,2,3,…} is a Markov chain and calculate its transition probability matrix.
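The reasoning for the urn problem can be checked numerically. Here is a sketch, assuming my reading of the setup (state i = number of white balls in urn 1, so urn 1 holds i white and 3−i black, and urn 2 holds 3−i white and i black):

```python
from fractions import Fraction

# Transition matrix for the urn chain: state i = white balls in urn 1.
P = [[Fraction(0)] * 4 for _ in range(4)]
for i in range(4):
    pw1 = Fraction(i, 3)       # P(draw a white ball from urn 1)
    pw2 = Fraction(3 - i, 3)   # P(draw a white ball from urn 2)
    if i > 0:
        P[i][i - 1] = pw1 * (1 - pw2)              # white out, black in
    if i < 3:
        P[i][i + 1] = (1 - pw1) * pw2              # black out, white in
    P[i][i] = pw1 * pw2 + (1 - pw1) * (1 - pw2)    # same color swapped

for row in P:
    print([str(x) for x in row])
```

Using `Fraction` keeps the entries exact (e.g. 1/9 and 4/9 rather than rounded decimals), which makes the matrix easy to compare against a hand calculation.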
Any help is appreciated.
In a Markov chain, the next step depends only on the state at this step (or, equivalently, the state at this step depends only on the state at the previous step). This is not a Markov chain because the weather today (at this step) depends on the weather the previous two days.
But we can make it a Markov chain by thinking of a "state" as being two days, today and yesterday. While the weather today has only two values, whether or not it rains today, the weather "yesterday and today" has four possible values, the ones given in your textbook:
state 0: if it rained both today and yesterday
state 1: if it rained today but not yesterday
state 2: if it rained yesterday but not today
state 3: if it did not rain either yesterday or today
Suppose we are in "state 0": it rained both yesterday and today. The probability it will rain tomorrow is 0.7, because "if it has rained for the past two days, then it will rain tomorrow with probability 0.7". The probability that it rained today is, of course, 1: it did rain today. Therefore, the probability that it will rain today and tomorrow, which is state 0 ("it rained yesterday and today") at the next step, tomorrow, is (1)(0.7) = 0.7.
Putting the four states together this way gives the textbook's four-state Markov chain and its transition probability matrix.
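Written out (this is my reading of the textbook's matrix; the row for state 0 matches what you quote below, and rows and columns are both indexed by the states 0 through 3, so entry (i, j) is the probability of moving from state i to state j):

$$P = \begin{pmatrix} 0.7 & 0 & 0.3 & 0 \\ 0.5 & 0 & 0.5 & 0 \\ 0 & 0.4 & 0 & 0.6 \\ 0 & 0.2 & 0 & 0.8 \end{pmatrix}$$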
Do you see now what those eight "0"s mean? Those are the cases where "yesterday's weather", as seen from the next step (tomorrow), is NOT the same as today's actual weather. Tomorrow's weather may be either rain or no rain, but today's weather is already determined: the probability that it did rain today is either 1 or 0, so we are always multiplying the probabilities for "tomorrow" by 1 or 0.
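One way to see this structure is to build the matrix mechanically from the four rain probabilities. A sketch, where each state records (rained today, rained yesterday) and tomorrow's state is forced except for the rain/no-rain choice:

```python
# Each state is (rained today, rained yesterday):
# 0 = (rain, rain), 1 = (rain, dry), 2 = (dry, rain), 3 = (dry, dry)
states = [(True, True), (True, False), (False, True), (False, False)]

# P(rain tomorrow | current state), as given in the problem statement.
p_rain = {0: 0.7, 1: 0.5, 2: 0.4, 3: 0.2}

P = [[0.0] * 4 for _ in range(4)]
for i, (today, yesterday) in enumerate(states):
    for rain_tomorrow in (True, False):
        # Moving one day forward: tomorrow's weather becomes "today",
        # and today's weather becomes "yesterday".
        j = states.index((rain_tomorrow, today))
        P[i][j] = p_rain[i] if rain_tomorrow else 1 - p_rain[i]

for row in P:
    print(row)
```

Each row gets exactly two nonzero entries (the two possible tomorrows), and the other eight entries stay 0 because those destination states would contradict today's known weather.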
Thank you. I think I have a bit of a better understanding, but still do not fully comprehend.
Maybe I missed it in your answer, but it seems like rows 1 through 4 represent states 0 through 3. If so, what do columns 1 through 4 represent? Why is row 1 {0.7,0,0.3,0} instead of {0,0.7,0.3,0} or some other combination?
For example, on my homework problem, the transition probability matrix ended up looking like this.
With this matrix, it's quite easy to see that, for example, p(0,0) is going from 0 white balls to 0 white balls, which has a probability of 0, and p(2,0) is going from having two white balls in urn one to having 0, which is also 0% probability.
Is there a way of viewing this weather example in a similar, easy-to-follow fashion? I apologize for inserting images like that, but I can't seem to get LaTeX to create tables for a matrix anymore.