- Jan 27th 2011, 07:21 PM, icobes: Markov Chain First Step Analysis Question
Hi,

If I am given a 4x4 transition probability matrix for a Markov chain and the state the process starts in, how can I find the probability that the process will never visit a certain state?

Thanks for the help! - Jan 31st 2011, 01:13 PM, Moo
Hello,

By Kolmogorov's 0-1 law, this probability is always 0 or 1. (Or so I guess ^^)

So you just have to check whether the corresponding entry of P^n is 0 for every n, or nonzero for some n.

(To do so, diagonalise the matrix and you'll have an easy computation for P^n: P^n = S D^n S^{-1}, where D is the diagonal matrix of eigenvalues and S the matrix of eigenvectors, etc, etc...)
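Since the thread title mentions first step analysis, here is a minimal sketch of that approach on a made-up 4x4 matrix (the matrix, the target state, and the set of transient states are all hypothetical, chosen for illustration). Conditioning on the first step gives h[i] = sum_k P[i,k] h[k] for the probability h[i] of ever visiting the target from state i, with h = 1 at the target and h = 0 at any other absorbing state; the probability of never visiting is then 1 - h. Note that with more than one absorbing part the answer can lie strictly between 0 and 1.

```python
import numpy as np

# Hypothetical 4x4 chain on states 0..3; states 0 and 3 are absorbing.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.3, 0.4, 0.2, 0.1],
    [0.2, 0.1, 0.4, 0.3],
    [0.0, 0.0, 0.0, 1.0],
])

target = 3          # the state we ask about
transient = [1, 2]  # states that can still move; assumed known for this example

# First-step analysis restricted to the transient states:
# h = Q h + b, i.e. (I - Q) h = b, where Q is P among transient states
# and b[i] is the one-step probability of jumping straight to the target.
Q = P[np.ix_(transient, transient)]
b = P[transient, target]
h = np.linalg.solve(np.eye(len(transient)) - Q, b)

start = 1
p_never = 1.0 - h[transient.index(start)]
print(round(p_never, 4))  # probability of never visiting state 3 from state 1
```

In this example the result is 11/17, roughly 0.647: neither 0 nor 1, because the chain can be absorbed at state 0 (never reaching 3) or at state 3 itself, each with positive probability.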