
Markov chain help: finding the fixed probability vector
Hello,
Not sure where to put this question, so let me know if it belongs somewhere else. I'm learning about Markov chain transition matrices, and I'm a little confused about how to take a transition matrix and find its fixed probability vector. Here is the problem; let me know if you can help me out.
This is the transition matrix:
0.375 0.625 0
0.375 0.375 0.25
0.375 0.5 0.125
Find the fixed probability vector. I'm not really sure how to start this, so any ideas would be appreciated.
The result should be a 1x3 matrix (a row vector).
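From what I've gathered so far, the fixed probability vector π is the row vector satisfying πP = π with its entries summing to 1, so it should be findable by solving a linear system. Here is a sketch in Python with NumPy of how I think that setup would look (this is just my attempt, so corrections welcome):

```python
import numpy as np

# Transition matrix from the problem (rows sum to 1)
P = np.array([[0.375, 0.625, 0.0],
              [0.375, 0.375, 0.25],
              [0.375, 0.5,   0.125]])

# pi P = pi  is the same as  (P^T - I) pi^T = 0.
# That system alone is underdetermined, so append the
# normalization condition pi_1 + pi_2 + pi_3 = 1 as an extra row
# and solve the stacked system in the least-squares sense.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)                      # the fixed probability vector
print(np.allclose(pi @ P, pi)) # sanity check: pi P = pi
```

If I did the algebra right by hand, the first component comes out to exactly 0.375, since every entry in the first column of P is 0.375.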
