
- Apr 17th 2008, 07:16 PM, kleenex: Hitting probabilities
Please help with hitting probabilities; I can't remember how to do this basic stuff.

A Markov chain with states 1-5 has the one-step transition matrix below. [transition matrix image missing]

For each state i, find h_i = P(the process ever reaches K | it starts in state i), where K = {2, 5}.

Thank you
- Apr 17th 2008, 07:25 PM, colby2152
That is the one-step probability of going from state two to state five: it sits in row two, column five. Matrix entries are always indexed (r, c), row first, then column. The probability here is one in four.
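One thing to watch in code: the thread labels states 1 through 5, but arrays index from 0, so "row two, column five" is index [1, 4]. A minimal sketch, using a made-up 5x5 matrix since the thread's own matrix did not survive:

```python
import numpy as np

# Hypothetical 5x5 one-step transition matrix (the matrix from the
# thread is not shown; this one is invented). Each row sums to 1.
P = np.array([
    [0.50, 0.25, 0.25, 0.00, 0.00],
    [0.25, 0.25, 0.25, 0.00, 0.25],
    [0.00, 0.25, 0.00, 0.50, 0.25],
    [0.00, 0.00, 0.00, 1.00, 0.00],
    [0.00, 0.25, 0.25, 0.25, 0.25],
])

# "Row two, column five" with 1-based state labels becomes [1, 4]
# with 0-based array indexing.
p_25 = P[2 - 1, 5 - 1]
print(p_25)  # 0.25
```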

- Apr 17th 2008, 07:39 PM, kleenex
Thank you, just one more question.

So no matter how big the square matrix is, I just need to look at row two, column five for that question?
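For a single one-step probability, yes. But note the original question asks for hitting probabilities, which are not a single matrix entry: h_i solves h_i = 1 for i in K and h_i = sum_j p_ij h_j for i not in K, taking the minimal nonnegative solution. A sketch of that computation, again with a made-up matrix (the thread's matrix is not shown), using fixed-point iteration from the indicator of K so the minimal solution is reached even when some state outside K is absorbing:

```python
import numpy as np

# Hypothetical 5-state transition matrix standing in for the one lost
# from the thread. State 4 is absorbing and outside K, so some hitting
# probabilities are strictly between 0 and 1.
P = np.array([
    [0.50, 0.25, 0.25, 0.00, 0.00],
    [0.25, 0.25, 0.25, 0.00, 0.25],
    [0.00, 0.25, 0.00, 0.50, 0.25],
    [0.00, 0.00, 0.00, 1.00, 0.00],
    [0.00, 0.25, 0.25, 0.25, 0.25],
])
K = {2, 5}  # target set, 1-based state labels as in the thread


def hitting_probabilities(P, K, iters=10_000):
    """Minimal nonnegative solution of h_i = 1 (i in K),
    h_i = sum_j P[i, j] * h_j (i not in K), by fixed-point iteration
    starting from the indicator of K."""
    n = P.shape[0]
    in_K = np.array([i + 1 in K for i in range(n)])  # 1-based labels
    h = in_K.astype(float)
    for _ in range(iters):
        h = np.where(in_K, 1.0, P @ h)
    return h


h = hitting_probabilities(P, K)
# By hand for this matrix: h2 = h5 = 1 (in K), h4 = 0 (absorbing),
# h3 = 0.25 + 0.25 = 0.5, and h1 = 0.5*h1 + 0.25 + 0.25*h3 => h1 = 0.75.
print(np.round(h, 4))
```

Starting the iteration from the indicator of K and increasing monotonically is what guarantees the minimal nonnegative solution; solving the linear system directly fails here because state 4 makes it singular.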