# Markov chain

• Sep 5th 2009, 04:34 PM
BrooketheChook
Markov chain
Consider a Markov Chain $\displaystyle X_0, X_1,...,$ with state space {1,2,3}, initial distribution $\displaystyle \pi^{0} = (1/3,1/3,1/3)$ and one-step transition matrix P where P is the attached image.

Find

(1) $\displaystyle \mathbb{P} (X_2 = j \mid X_0 = 1)$ for each state $\displaystyle j$ (the distribution of $\displaystyle X_2$ given $\displaystyle X_0 = 1$)
(2) $\displaystyle \mathbb{P}(X_2 = 3)$
(3) $\displaystyle \mathbb{E}(X_3)$
(4) $\displaystyle \mathbb{P}(X_0=1 | X_3 = 3)$
• Sep 5th 2009, 10:56 PM
CaptainBlack
Quote:

Originally Posted by BrooketheChook
Consider a Markov Chain $\displaystyle X_0, X_1,...,$ with state space {1,2,3}, initial distribution $\displaystyle \pi^{0} = (1/3,1/3,1/3)$ and one-step transition matrix P where P is the attached image.

Find

(1) $\displaystyle \mathbb{P} (X_2 = j \mid X_0 = 1)$ for each state $\displaystyle j$ (the distribution of $\displaystyle X_2$ given $\displaystyle X_0 = 1$)
(2) $\displaystyle \mathbb{P}(X_2 = 3)$
(3) $\displaystyle \mathbb{E}(X_3)$
(4) $\displaystyle \mathbb{P}(X_0=1 | X_3 = 3)$

For the first consider:

$\displaystyle [1,0,0]P^2$ (the first row of $\displaystyle P^2$, which is the distribution of $\displaystyle X_2$ given $\displaystyle X_0 = 1$)

For the second consider:

$\displaystyle \pi^{0} P^2$, whose third component is $\displaystyle \mathbb{P}(X_2 = 3)$

CB
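
Since the attached image of P didn't come through in this thread, here is a NumPy sketch of all four parts using a made-up 3×3 stochastic matrix purely as a placeholder — substitute the actual P from the problem. It also fills in the remaining two steps: the distribution of $\displaystyle X_3$ is $\displaystyle \pi^{0} P^3$, so $\displaystyle \mathbb{E}(X_3) = \sum_k k\,\mathbb{P}(X_3 = k)$, and (4) follows from Bayes' rule, $\displaystyle \mathbb{P}(X_0 = 1 \mid X_3 = 3) = \mathbb{P}(X_3 = 3 \mid X_0 = 1)\,\mathbb{P}(X_0 = 1)/\mathbb{P}(X_3 = 3)$.

```python
import numpy as np

# ASSUMED placeholder matrix -- the real P is in the attached image.
# Each row sums to 1 (rows are conditional distributions of the next state).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.4, 0.5],
])
pi0 = np.array([1/3, 1/3, 1/3])   # initial distribution pi^0
states = np.array([1, 2, 3])      # state space

P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)

# (1) distribution of X_2 given X_0 = 1: the first row of P^2, i.e. [1,0,0] P^2
dist_X2_given_X0_1 = np.array([1, 0, 0]) @ P2

# (2) P(X_2 = 3): third component of pi^0 P^2
p_X2_eq_3 = (pi0 @ P2)[2]

# (3) E(X_3): distribution of X_3 is pi^0 P^3; average the states against it
dist_X3 = pi0 @ P3
E_X3 = dist_X3 @ states

# (4) P(X_0 = 1 | X_3 = 3) by Bayes' rule:
#     P(X_3 = 3 | X_0 = 1) * P(X_0 = 1) / P(X_3 = 3)
p_X0_1_given_X3_3 = P3[0, 2] * pi0[0] / dist_X3[2]
```

With the actual P plugged in, each line above becomes the numerical answer to the corresponding part; note that parts (1)–(3) only need powers of P, while (4) is the one place the Bayes inversion is required.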