Hello,
Have you read this: Markov chain - Wikipedia, the free encyclopedia?
A state is periodic if the probability of returning to it is zero except at regular intervals. Formally, state i has period d = gcd of all n such that the chain can return to i in exactly n steps, and i is periodic when d > 1.
E.g. if you go from 0 to 2 and then back from 2 to 0, that round trip is a return to 0, so its total length must be a multiple of the period of state 0.
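You can check the period numerically. This is just a sketch: the 3-state transition matrix P below is a made-up example, not the chain from the thread. The period of state i is the gcd of all step counts n at which a return to i has positive probability.

```python
from math import gcd
import numpy as np

def period(P, i, max_n=50):
    """Period of state i: gcd of all n <= max_n with P^n(i, i) > 0."""
    d = 0
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P  # Pn is now the n-step transition matrix
        if Pn[i, i] > 1e-12:
            d = gcd(d, n)  # gcd(0, n) == n, so the first hit sets d
    return d

# Hypothetical chain: 0 and 1 swap back and forth, 2 is absorbing.
P = np.array([[0.0, 1.0, 0.0],
              [0.9, 0.0, 0.1],
              [0.0, 0.0, 1.0]])

print(period(P, 0))  # every return to 0 takes an even number of steps
print(period(P, 2))  # 2 returns to itself in one step
```

Here returns to state 0 only happen at even step counts, so its period is 2, while the absorbing state 2 has period 1.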
Hmm, draw a sketch with the arrows.
What do you call open and closed? I know transient, absorbing, and recurrent... (Markov chain - Wikipedia, the free encyclopedia)
And all states are transient except 2, which is absorbing: once you're in 2 you can't leave, since it returns to itself with probability 1.
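To make the "absorbing" claim concrete (again with a hypothetical transition matrix, since the thread's chain isn't written out): state i is absorbing exactly when P[i, i] = 1, i.e. all of row i's probability mass stays on i.

```python
import numpy as np

# Hypothetical 3-state chain; row i holds the transition probabilities out of state i.
P = np.array([[0.0, 1.0, 0.0],
              [0.9, 0.0, 0.1],
              [0.0, 0.0, 1.0]])

# A state is absorbing when it returns to itself with probability 1.
absorbing = [i for i in range(len(P)) if P[i, i] == 1.0]
print(absorbing)  # only state 2 is absorbing in this example
```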
Following the criterion for the period of a state, I'd advise you to consider i = 2. You should be able to conclude easily; it's not very hard.