Hi,
If I am given a transition probability matrix (4x4) for a Markov chain and I am given the state I begin in, how can I find the probability that the process will never visit a certain state?
Thanks for the help!
Hello,
By Kolmogorov's 0-1 law one might guess this probability is always 0 or 1, but that is not quite right: "never visit state j" is not a tail event, and the probability can be strictly between 0 and 1 (think of a chain with two absorbing states, where a transient starting state may end up in either one).
What the matrix does tell you right away is whether the probability is 1: the target state is reachable from your starting state if and only if the corresponding entry of P^n is non-zero for some n. If that entry is 0 for every n, the process can never visit the state at all.
(To check this, diagonalise the matrix and you'll have an easy computation for the powers: P^n = Q D^n Q^{-1}, where D is the diagonal matrix of eigenvalues, Q the matrix of eigenvectors, etc, etc...)
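Here is a small sketch of the diagonalisation idea in NumPy. The 4x4 matrix, the state labels, and the choice of target are all made-up assumptions, not from the original question. One special case where the powers of P answer the question directly: if the target state is absorbing, then the (start, target) entry of P^n is the probability of having visited the target by time n, so its limit as n grows is the probability of ever visiting it, and 1 minus that limit is the never-visit probability.

```python
import numpy as np

# Hypothetical 4x4 transition matrix (rows sum to 1); states 2 and 3 are
# absorbing.  These numbers are made up for illustration only.
P = np.array([
    [0.5, 0.2, 0.2, 0.1],
    [0.3, 0.3, 0.1, 0.3],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])
start, target = 0, 2  # target is an absorbing state here

# Diagonalise P as suggested: P = Q D Q^{-1}, hence P^n = Q D^n Q^{-1}.
eigvals, Q = np.linalg.eig(P)
Qinv = np.linalg.inv(Q)

def P_power(n):
    """n-th power of P computed from the eigendecomposition."""
    return (Q @ np.diag(eigvals ** n) @ Qinv).real

# For an absorbing target, (P^n)[start, target] is the probability of having
# visited `target` by time n; for large n it approaches the probability of
# *ever* visiting it.
p_ever = P_power(1000)[start, target]
p_never = 1.0 - p_ever
print(p_never)
```

For this particular matrix the exact answer (from first-step analysis, solving h0 = 0.2 + 0.5 h0 + 0.2 h1 and h1 = 0.1 + 0.3 h0 + 0.3 h1) is p_never = 13/29 ≈ 0.448, strictly between 0 and 1. If the target is not absorbing, the limit of (P^n)[start, target] no longer has this meaning, and you would solve the first-step equations instead.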