# Math Help - Markov Chain question

1. ## Markov Chain question

How do you calculate the expected length of time the process stays in a given state $K$, say, before moving to another state?

Can anyone point me in the right direction?

Thank you.

2. Originally Posted by rufusspeaks
How do you calculate the expected length of time the process stays in a given state $K$, say, before moving to another state?

Can anyone point me in the right direction?

Thank you.
Let $p_k$ be the probability that, given the current state is $K$, the next state is also $K$.

Then, given that we are in state $K$:

Prob that we stay in state $K$ for $0$ further epochs is $(1-p_k)$

Prob that we stay in state $K$ for $1$ further epoch is $p_k(1-p_k)$

Prob that we stay in state $K$ for $n$ further epochs is $p_k^n(1-p_k)$

So the expected number of epochs we remain in $K$, given that we are in $K$, is:

$E(n)=\sum_{r=0}^{\infty} r p_k^r (1-p_k) = \frac{p_k}{1-p_k}$
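This is the mean of a geometric distribution, and it can be checked numerically: the partial sum of the series and a Monte Carlo simulation of the holding time should both agree with $p_k/(1-p_k)$. A quick sketch (the choice $p_k = 0.8$ is just an arbitrary example value):

```python
import random

def expected_holding_sum(p_k, terms=10_000):
    # Partial sum of E(n) = sum_{r>=0} r * p_k**r * (1 - p_k).
    # The tail decays geometrically, so a few thousand terms is plenty.
    return sum(r * p_k**r * (1 - p_k) for r in range(terms))

def simulate_holding(p_k, trials=100_000, seed=1):
    # Monte Carlo: each trial counts the number of *further* epochs
    # spent in state K before the chain leaves it.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        n = 0
        while rng.random() < p_k:  # stay in K with probability p_k
            n += 1
        total += n
    return total / trials

p = 0.8
print(expected_holding_sum(p))  # close to p/(1-p) = 4.0
print(simulate_holding(p))      # also close to 4.0
```

Both estimates converge on the closed form $p_k/(1-p_k)$, which for $p_k=0.8$ is $4$.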

RonL

3. Originally Posted by CaptainBlack
Let $p_k$ be the probability that, given the current state is $K$, the next state is also $K$.

Then, given that we are in state $K$:

Prob that we stay in state $K$ for $0$ further epochs is $(1-p_k)$

Prob that we stay in state $K$ for $1$ further epoch is $p_k(1-p_k)$

Prob that we stay in state $K$ for $n$ further epochs is $p_k^n(1-p_k)$

So the expected number of epochs we remain in $K$, given that we are in $K$, is:

$E(n)=\sum_{r=0}^{\infty} r p_k^r (1-p_k) = \frac{p_k}{1-p_k}$

RonL
Would it be correct to use the limiting property of the transition probabilities to obtain the values of $r$ needed to evaluate the summation?

Thank you.