# Markov Chains

• February 8th 2010, 06:51 PM
BERRY
Markov Chains
Hello, I have two problems I'm stuck on. Any hints would be appreciated.

$\{X_n : n\geq0\} \mbox{ is a Markov Chain and suppose } P_{ii} > 0$

$\eta_i \mbox{ is the exit time from state i}$

$\eta_i = \min\{n \geq 1 : X_n \neq i\}$

How would I derive the distribution of $\eta_i$?

Secondly, suppose that whether or not a team wins the Super Bowl in a given season depends on the number of Super Bowl wins that team has had in the last three seasons. How would I model this as a Markov Chain? I'm having trouble identifying the state space. Thank you.
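Here is a sketch of what I've tried for the state space, assuming each season's outcome is simply win/loss. A state that only counts wins doesn't seem Markov (you need to know which season drops off next), so I take the state to be the ordered triple of the last three seasons' outcomes. The function `win_prob` below is a made-up placeholder, just some rule depending on the number of recent wins:

```python
from itertools import product

# State = outcomes of the last three seasons, oldest first;
# 1 = won the Super Bowl that season, 0 = did not. This gives 2**3 = 8 states.
states = list(product((0, 1), repeat=3))

def win_prob(state):
    # Hypothetical placeholder: any rule that depends only on the
    # number of wins in the last three seasons would do.
    return 0.5 + 0.1 * sum(state)

def transitions(state):
    """Next state drops the oldest season and appends the new outcome."""
    p = win_prob(state)
    return {state[1:] + (1,): p, state[1:] + (0,): 1 - p}

# Sanity check: each row of the implied 8x8 transition matrix sums to 1.
for s in states:
    assert abs(sum(transitions(s).values()) - 1.0) < 1e-12
```

Does taking the triple (rather than just the win count) as the state look right?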
• February 9th 2010, 04:58 PM
Focus
Quote:

Originally Posted by BERRY
Hello, I have two problems I'm stuck on. Any hints would be appreciated.

$\{X_n : n\geq0\} \mbox{ is a Markov Chain and suppose } P_{ii} > 0$

$\eta_i \mbox{ is the exit time from state i}$

$\eta_i = \min\{n \geq 1 : X_n \neq i\}$

How would I derive the distribution of $\eta_i$?

First, $\mathbb{P}(\eta_i =1)=1-P_{X_0,i}$.

Now, for $k \geq 2$, the event $\{\eta_i=k\}$ means $X_1 = \dots = X_{k-1} = i$ and $X_k \neq i$: the chain first moves into state $i$ (probability $P_{X_0,i}$), then makes $k-2$ transitions from $i$ back to $i$, and at time $k$ moves to another state. Multiplying these probabilities gives $\mathbb{P}(\eta_i = k) = P_{X_0,i}\,P_{ii}^{k-2}(1-P_{ii})$. In particular, if $X_0 = i$, this reduces to $\mathbb{P}(\eta_i = k) = P_{ii}^{k-1}(1-P_{ii})$, i.e. $\eta_i$ is geometric with success probability $1-P_{ii}$.

If that is not clear, do let me know.
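You can also check the hint numerically. Here is a quick simulation, assuming the chain starts in state $i$ and stays there with probability $P_{ii} = 0.6$ at each step (an arbitrary value for illustration); the empirical distribution of $\eta_i$ should match the geometric pmf $P_{ii}^{k-1}(1-P_{ii})$:

```python
import random

def simulate_exit_time(p_stay, rng):
    """Exit time from state i for a chain started at i:
    each step the chain stays in i with probability p_stay."""
    n = 1
    while rng.random() < p_stay:
        n += 1
    return n

rng = random.Random(42)
p_stay = 0.6          # assumed value of P_ii, for illustration
trials = 100_000
counts = {}
for _ in range(trials):
    k = simulate_exit_time(p_stay, rng)
    counts[k] = counts.get(k, 0) + 1

# Compare empirical frequencies with the geometric pmf p_stay**(k-1) * (1 - p_stay)
for k in range(1, 6):
    empirical = counts.get(k, 0) / trials
    theoretical = p_stay ** (k - 1) * (1 - p_stay)
    print(k, round(empirical, 4), round(theoretical, 4))
```

With 100,000 trials the empirical frequencies agree with the geometric pmf to within about one percentage point.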