# Thread: Is it possible that a Markov chain can consist of all absorbing states?

1. ## Is it possible that a Markov chain can consist of all absorbing states?

Is it possible that a Markov chain can consist of all absorbing states?

2. ## Re: Is it possible that a Markov chain can consist of all absorbing states?

Originally Posted by mathlearn
Is it possible that a Markov chain can consist of all absorbing states?
State $j$ is an absorbing state if $P(j,j)=1$ and $P(j,k)=0$ for all $k\neq j$.

Think of a matrix that satisfies the conditions above.
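As an illustrative sketch (not from the original thread): the matrix hinted at above is the identity matrix, whose rows each place probability 1 on the diagonal entry. A quick check in Python with NumPy confirms it is a valid transition matrix in which every state is absorbing, assuming a small 3-state chain for concreteness:

```python
import numpy as np

# Transition matrix of a 3-state Markov chain in which every state is
# absorbing: P(j, j) = 1 and P(j, k) = 0 for k != j, i.e. the identity matrix.
P = np.eye(3)

# Every row sums to 1, so P is a valid (stochastic) transition matrix.
assert np.allclose(P.sum(axis=1), 1.0)

# Each state j is absorbing: P[j, j] == 1.
assert all(P[j, j] == 1.0 for j in range(3))

# Starting in any state, the chain never moves: the distribution after
# any number of steps equals the starting distribution.
start = np.array([0.0, 1.0, 0.0])  # start in state 1
after_ten_steps = start @ np.linalg.matrix_power(P, 10)
assert np.allclose(after_ten_steps, start)
```

Such a chain is degenerate (nothing ever happens), but it does satisfy the definition of a Markov chain.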

3. ## Re: Is it possible that a Markov chain can consist of all absorbing states?

So a Markov chain can consist of all absorbing states, right?

4. ## Re: Is it possible that a Markov chain can consist of all absorbing states?

Originally Posted by mathlearn
So a Markov chain can consist of all absorbing states, right?
Of course. The identity transition matrix $P = I$ is exactly such a chain: every state $j$ satisfies $P(j,j)=1$.