# Markov Chain question

• Mar 30th 2008, 11:38 AM
rufusspeaks
Markov Chain question
how do you calculate the expected time the process stays in a state, $K$ for example, before moving to another state?

can anyone point me in the right direction

thank you
• Mar 31st 2008, 06:13 AM
CaptainBlack
Quote:

Originally Posted by rufusspeaks
how do you calculate the expected time period, the process stays in state K for example, before moving to another state?

can anyone point me in the right direction

thank you

Let $p_k$ be the probability that, given the current state is $K$, the next state is also $K$.

Then, given that we are in state $K$:

Prob that we stay in state $K$ for $0$ epochs is $(1-p_k)$

Prob that we stay in state $K$ for $1$ epoch is $p_k(1-p_k)$

Prob that we stay in state $K$ for $n$ epochs is $p_k^n(1-p_k)$

Expected number of epochs we remain in $K$, given that we are in $K$, is:

$E(n)=\sum_{r=0}^{\infty} r p_k^r (1-p_k)=\frac{p_k}{1-p_k}$
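The summation above is the mean of a geometric distribution and can be checked numerically. A minimal sketch, assuming an illustrative value $p_k = 0.8$ (no concrete transition probability is given in the thread):

```python
import random

# Assumed probability of remaining in state K at each step (not from the thread).
p_k = 0.8

# Closed-form mean of the geometric sum: E(n) = p_k / (1 - p_k)
expected_closed_form = p_k / (1 - p_k)

# Monte Carlo check: count epochs spent in K before the first departure.
random.seed(0)
trials = 200_000
total = 0
for _ in range(trials):
    n = 0
    while random.random() < p_k:  # stay in K with probability p_k
        n += 1
    total += n
simulated = total / trials

print(expected_closed_form)  # 4.0 for p_k = 0.8
print(simulated)             # should be close to 4.0
```

The simulated average converges to $p_k/(1-p_k)$ as the number of trials grows, matching the summation.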

RonL
• Apr 2nd 2008, 11:48 AM
rufusspeaks
Quote:

Originally Posted by CaptainBlack
Let $p_k$ be the probability that, given the current state is $K$, the next state is also $K$.

Then, given that we are in state $K$:

Prob that we stay in state $K$ for $0$ epochs is $(1-p_k)$

Prob that we stay in state $K$ for $1$ epoch is $p_k(1-p_k)$

Prob that we stay in state $K$ for $n$ epochs is $p_k^n(1-p_k)$

Expected number of epochs we remain in $K$, given that we are in $K$, is:

$E(n)=\sum_{r=0}^{\infty} r p_k^r (1-p_k)=\frac{p_k}{1-p_k}$

RonL

Would it be correct to use the limiting property of transition probabilities to supply the $p_k$ values needed to evaluate the summation?

thank you.