Is it possible that a Markov chain can consist of all absorbing states?
Originally Posted by mathlearn: Is it possible that a Markov chain can consist of all absorbing states? State $\displaystyle j$ is an absorbing state if $\displaystyle P(j,j)=1$ and $\displaystyle P(j,k)=0$ where $\displaystyle j\neq k$. Think of a matrix that satisfies the conditions above.
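As a quick sketch of the hint above (not from the thread itself): the $n\times n$ identity matrix satisfies both conditions for every state, so it is a valid transition matrix in which every state is absorbing.

```python
# A 3-state transition matrix in which every state is absorbing:
# the identity matrix, with P[j][j] = 1 and P[j][k] = 0 for j != k.
n = 3
P = [[1.0 if j == k else 0.0 for k in range(n)] for j in range(n)]

# Verify the absorbing-state condition for each state j.
for j in range(n):
    assert P[j][j] == 1.0
    assert all(P[j][k] == 0.0 for k in range(n) if k != j)

# Each row sums to 1, so P is a valid stochastic matrix.
assert all(sum(row) == 1.0 for row in P)
```

Started in any state, such a chain stays there forever, so every state is absorbing.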
Last edited by harish21; Sep 20th 2012 at 06:21 PM.
So a Markov chain can consist of all absorbing states, right?
Originally Posted by mathlearn: So a Markov chain can consist of all absorbing states, right? Of course.