Okay, so maybe it's not impossible, but neither I nor any of my classmates could figure it out.
If on any given day it snows, there is a 0.3 probability that it will snow the following day. If on any given day it doesn't snow, there is a 0.8 probability that it will not snow the following day. What is the probability that it will snow on any given day?
It was a question on our final, and I simply guessed (it was multiple choice). Anyone have any ideas?
This is an example of a two-state Markov chain. Let X(n) be the state of the process, i.e. the weather, on day n. Then we have the state space S = {0, 1}:
0: Not snowing
1: Snowing
The transition probabilities between the states from day to day are

\[
P(0 \to 0) = 0.8, \quad P(0 \to 1) = 0.2, \quad
P(1 \to 0) = 0.7, \quad P(1 \to 1) = 0.3.
\]

That is, we have a transition probability matrix

\[
P = \begin{pmatrix} 0.8 & 0.2 \\ 0.7 & 0.3 \end{pmatrix}.
\]
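As a quick numerical illustration (my own sketch, not part of the original answer), you can repeatedly apply the transition matrix to an arbitrary starting distribution and watch it converge to the long-run distribution:

```python
# Transition matrix: row i gives tomorrow's probabilities given today's state i
# (state 0 = not snowing, state 1 = snowing).
P = [[0.8, 0.2],
     [0.7, 0.3]]

def step(dist, P):
    """One day forward: new_j = sum_i dist_i * P[i][j]."""
    return [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]

dist = [1.0, 0.0]  # arbitrary start: certainly not snowing on day 0
for _ in range(50):
    dist = step(dist, P)

print(dist)  # converges to roughly [0.7778, 0.2222], i.e. [7/9, 2/9]
```

Starting from [0.0, 1.0] instead gives the same limit, which is exactly the point of a limiting distribution.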
What we're interested in is the long-run behaviour of the process. (There are certain conditions that must be satisfied for the process to have a long-run behaviour, i.e. a limiting stationary distribution. We don't need to worry too much about that here, because in this case they're all fulfilled.)
We are looking for a limiting distribution \(\pi\), equivalently a stationary distribution, satisfying \(\pi P = \pi\).
That is, if we act upon the distribution according to the transition probabilities we will still have the same distribution.
Now solving the above equation for \(\pi = (\pi_0, \pi_1)\) with the condition that \(\pi_0 + \pi_1 = 1\) yields \(\pi = (7/9, 2/9)\).
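Spelled out (my notation, writing \(\pi = (\pi_0, \pi_1)\)), the system \(\pi P = \pi\) together with normalisation is:

```latex
% Stationary equations for the two-state chain, \pi P = \pi:
\begin{aligned}
\pi_0 &= 0.8\,\pi_0 + 0.7\,\pi_1 \\
\pi_1 &= 0.2\,\pi_0 + 0.3\,\pi_1 \\
\pi_0 + \pi_1 &= 1
\end{aligned}
% The second equation gives 0.7\,\pi_1 = 0.2\,\pi_0, so \pi_1 = (2/7)\,\pi_0.
% Substituting into the normalisation: \pi_0 (1 + 2/7) = 1,
% hence \pi_0 = 7/9 and \pi_1 = 2/9.
```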
That is, the probability that it snows on a given day is \(\pi_1 = 2/9 \approx 0.22\).
For this example there is perhaps an easier approach, but for the general case with more than two states this one is quite useful.