- February 13th 2011, 02:30 PM - joestevens - Expected Value Markov Chain

A Markov chain with states 0, 1, 2 has the transition probability matrix P.

If the distribution of X_0 is given, find E[X_3].

Is this just as simple as taking P^3? If not, how should I go about this?
- February 13th 2011, 11:57 PM - Moo
Hello,

To get the distribution of X_3, it's rather α P^3, where α = (P(X_0 = 0), P(X_0 = 1), P(X_0 = 2)) is the row vector of initial probabilities.

And in order to get E[X_3], multiply the result on the right by (0, 1, 2) (see the code to have a simpler way to write matrices)
- February 14th 2011, 06:09 AM - joestevens
Do you mean E[X_3] = α P^3 (0, 1, 2)^T? (Notice the transpose)

- February 14th 2011, 10:07 AM - Moo
Yep, sorry, I forgot the transpose, but that's what you have to compute =)
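The computation the thread settles on can be sketched numerically. The actual transition matrix and initial distribution from the original post are not shown in this archive, so the values below are made up purely for illustration; only the recipe (α P³, then right-multiply by the column of state labels) comes from the thread.

```python
import numpy as np

# Hypothetical 3x3 transition matrix over states 0, 1, 2
# (the matrix from the original post is not preserved here);
# each row must sum to 1.
P = np.array([[0.1, 0.2, 0.7],
              [0.9, 0.1, 0.0],
              [0.1, 0.8, 0.1]])

# Hypothetical initial distribution alpha = (P(X0=0), P(X0=1), P(X0=2)).
alpha = np.array([0.25, 0.25, 0.50])

# Distribution of X_3: the row vector alpha times P^3.
dist_3 = alpha @ np.linalg.matrix_power(P, 3)

# E[X_3]: multiply on the right by the column of state labels (0, 1, 2)^T.
states = np.array([0, 1, 2])
E_X3 = dist_3 @ states

print("P(X_3 = i):", dist_3)
print("E[X_3]:", E_X3)
```

Note that simply taking P^3 is not enough: P^3 gives the three-step transition probabilities from each starting state, and the initial distribution α is what weights them into the unconditional distribution of X_3.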