I need some help with a Markov chain problem where I have to simulate (predict) the weather given the following information:
a) The states are Rain, Nice, Snow
b) The transition matrix P is:

          Rain   Nice   Snow
   Rain   0.5    0.25   0.25
   Nice   0.5    0      0.5
   Snow   0.25   0.25   0.5
c) At P^6 the steady state is reached, which is (0.4, 0.2, 0.4)
d) Let the initial probability vector be (1/3, 1/3, 1/3), or, if it is easier, let's say it's a rainy day today, so the initial probability vector becomes (1, 0, 0)
e) Now, given this information, how do I predict the weather for the next 100 days? Can someone explain how to go about doing this step by step?
Thanks in advance
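One way to get the day-by-day probabilities from the information above is to repeatedly multiply the current probability vector by P. This is a minimal numpy sketch (the matrix and the rainy-day start vector are taken from the post; the choice of numpy is mine):

```python
import numpy as np

# Transition matrix from the post: rows are today's state
# (Rain, Nice, Snow), columns are tomorrow's state.
P = np.array([
    [0.50, 0.25, 0.25],  # from Rain
    [0.50, 0.00, 0.50],  # from Nice
    [0.25, 0.25, 0.50],  # from Snow
])

pi = np.array([1.0, 0.0, 0.0])  # it is a rainy day today

# One multiplication by P advances the distribution one day.
for day in range(6):
    pi = pi @ P

print(pi)  # after six days this is very close to (0.4, 0.2, 0.4)
```

This gives the probability of each kind of weather on a given future day, which is point (c): after about six multiplications the vector stops changing.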
Hmm... what I want to compute is the sequence of output states. For example,
SSRNRNSSSSSSNRSNSSRNSRN... (for 100 days)
which means snow-snow-rain-nice-rain-nice-snow-snow... etc.
So how is an output like this computed from the information given in the first post?
Let's say you have the first 10 output states available and you need to compute the next 10 states: how is that done?
Sorry if I am being a bit vague...
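A string like SSRN... is not computed deterministically; it is *sampled*: each day you draw the next state at random using the row of P for the current state. A sketch of that sampling loop, assuming a rainy start and a fixed random seed of my choosing (a different seed gives a different, equally valid string):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the run is reproducible

states = ["R", "N", "S"]  # Rain, Nice, Snow
P = np.array([
    [0.50, 0.25, 0.25],  # from Rain
    [0.50, 0.00, 0.50],  # from Nice
    [0.25, 0.25, 0.50],  # from Snow
])

current = 0  # start in Rain, i.e. initial vector (1, 0, 0)
sequence = []
for day in range(100):
    # Draw tomorrow's state using the probabilities in row `current`.
    current = rng.choice(3, p=P[current])
    sequence.append(states[current])

print("".join(sequence))  # a 100-character string of R, N, S
```

Note that if you already have the first 10 states, only the 10th one matters for continuing: the chain is memoryless, so you just resume the loop from that state.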
The third power of P tells us what is going on three days hence.
So if it rains today, then in three days there is a probability of .203 that it will be nice.
If it snows today, the probability that it will snow in three days is .406.
But you see the rows all become the same (the steady state) by the sixth day and thereafter.
Does that help?
Oh... so that means once the steady-state vector is reached, prediction is not possible... is that correct?
Also, can you help me understand how the matrix is constructed in the example given in the link Markov Chains,
Ex. 2: Another Weather Forecast?
thanks once again...