A Markov chain with states … has the transition probability matrix …
If …, find ….
Is this just as simple as taking …? If not, how should I go about this?
To get the probabilities of the states at time 3, it's rather $P^3$.
And in order to get the distribution at time 3, multiply the result on the right by the column vector of initial probabilities $\pi_0$ (see the LaTeX code for a simpler way to write matrices).
Do you mean $(P^T)^3$? (Notice the transpose.)
Yep, sorry, I forgot the transpose: since $\pi_0$ is a column vector, $(P^T)^3 \pi_0$ is what you have to compute. =)
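The computation discussed above can be sketched numerically. The thread's actual matrix and initial distribution were given as images and are missing, so the $P$ and $\pi_0$ below are hypothetical stand-ins; the sketch shows that the row-vector form $\pi_0 P^3$ and the column-vector form $(P^T)^3 \pi_0$ give the same distribution at time 3.

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1);
# the matrix from the original problem is not recoverable here.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# Hypothetical initial distribution: start in state 1 with certainty.
pi0 = np.array([1.0, 0.0, 0.0])

# Distribution after 3 steps, two equivalent conventions:
#   row vectors:    pi3 = pi0 @ P^3
#   column vectors: pi3 = (P^T)^3 @ pi0   (the transpose from the thread)
pi3_row = pi0 @ np.linalg.matrix_power(P, 3)
pi3_col = np.linalg.matrix_power(P.T, 3) @ pi0

print(pi3_row)
print(np.allclose(pi3_row, pi3_col))  # the two conventions agree
```

The key point is that the matrix is raised to a power (repeated matrix multiplication), not cubed entrywise, and whether the transpose appears depends only on whether you treat the distribution as a row or a column vector.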