Is it possible that a Markov chain can consist of all absorbing states ?
Originally Posted by mathlearn: Is it possible that a Markov chain can consist of all absorbing states?

A state $i$ is an absorbing state if it is impossible to leave it, i.e. $p_{ii} = 1$. Think of a transition matrix in which every state satisfies this condition.
So a Markov chain can consist of all absorbing states, right?
Originally Posted by mathlearn: So a Markov chain can consist of all absorbing states, right?

Of course.
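As a quick sanity check, here is a minimal sketch (using NumPy) of such a chain: if every state $i$ has $p_{ii} = 1$, the transition matrix is the identity matrix, each row still sums to 1 (so it is a valid stochastic matrix), and any starting distribution is left unchanged forever. The 3-state size and the starting distribution are arbitrary choices for illustration.

```python
import numpy as np

# A Markov chain in which every state is absorbing: each state i has
# p_ii = 1, so the transition matrix P is the identity matrix.
P = np.eye(3)

# P is a valid stochastic matrix: every row sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Each state i is absorbing: P[i, i] == 1.
assert all(P[i, i] == 1.0 for i in range(3))

# Starting from any distribution pi, the chain never moves: pi P^n = pi.
pi = np.array([0.2, 0.5, 0.3])
for _ in range(10):
    pi = pi @ P

print(pi)  # unchanged: [0.2 0.5 0.3]
```

In other words, the identity matrix is the simplest example answering the question in the affirmative.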