A Markov chain with states 0, 1, 2 has the transition probability matrix
If , find
How do I go about solving this? It seems like something that would be very basic, yet I can't find any similar examples in my textbook.
Thanks
Thank you for your help and for taking the time to respond to my questions. However, I think it'll take more than a simple nudge towards the answer to help me with this problem.
We've never had any discussion of state matrices, so I'm not quite sure what they are, and they don't seem to be mentioned in the textbook. I'm going to go ahead and check YouTube for any lectures on Markov chains. Maybe they'll have some good examples that will help me out.
If anyone knows of any sites with well-written, easy-to-follow examples, a link would be much appreciated. Thanks
From the given transition matrix it is clear that the poster is using the other convention (which, from my observation of posts on MHF, is the more common notation in undergraduate education today), where in your notation the state distribution $\pi_n$ is a row vector and:

$$\pi_{n+1} = \pi_n P,$$

where $P$ is the transition matrix in their format, with rows that sum to 1.
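A minimal sketch of this row-vector convention, using made-up entries (the matrix below is illustrative only, not the one from the original question, which did not survive in the post):

```python
# Row-stochastic convention: P[i][j] = Pr(next state = j | current state = i),
# so each row of P sums to 1, and the distribution is a ROW vector updated
# by multiplying on the right: p_{n+1} = p_n * P.
P = [
    [0.5, 0.3, 0.2],   # transitions out of state 0
    [0.1, 0.6, 0.3],   # transitions out of state 1
    [0.2, 0.2, 0.6],   # transitions out of state 2
]

def step(p, P):
    """One step of the chain: multiply the row vector p by P on the right."""
    n = len(P)
    return [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]

# Start in state 0 with certainty and take two steps.
p = [1.0, 0.0, 0.0]
p = step(p, P)   # after one step: the first row of P
p = step(p, P)   # after two steps: the first row of P^2
print(p)
```

In the column-vector convention the same computation would instead be written $P^\top p$, with columns summing to 1; the two conventions carry exactly the same information, just transposed.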