# Math Help - Markov chain

1. ## Markov chain

Consider a Markov Chain $X_0, X_1,...,$ with state space {1,2,3}, initial distribution $\pi^{0} = (1/3,1/3,1/3)$ and one-step transition matrix P where P is the attached image.

Find

(1) $\mathbb{P} (X_2 = j \mid X_0 = 1)$ for $j = 1, 2, 3$
(2) $\mathbb{P}(X_2 = 3)$
(3) $\mathbb{E}(X_3)$
(4) $\mathbb{P}(X_0=1 \mid X_3 = 3)$

2. Originally Posted by BrooketheChook
Consider a Markov Chain $X_0, X_1,...,$ with state space {1,2,3}, initial distribution $\pi^{0} = (1/3,1/3,1/3)$ and one-step transition matrix P where P is the attached image.

Find

(1) $\mathbb{P} (X_2 = j \mid X_0 = 1)$ for $j = 1, 2, 3$
(2) $\mathbb{P}(X_2 = 3)$
(3) $\mathbb{E}(X_3)$
(4) $\mathbb{P}(X_0=1 \mid X_3 = 3)$
For the first, consider:

$[1,0,0]P^2$

whose $j$-th entry is $\mathbb{P}(X_2 = j \mid X_0 = 1)$, since conditioning on $X_0 = 1$ means starting from the point mass on state 1.

For the second, consider:

$\pi^{0} P^2$

which is the distribution of $X_2$; $\mathbb{P}(X_2 = 3)$ is its third component.

CB
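The recipe above extends to all four parts: multiply a starting distribution by powers of $P$, then read off components, take an expectation, or apply Bayes' rule. A minimal NumPy sketch, using a *hypothetical* transition matrix `P` as a stand-in (the actual matrix from the attached image is not reproduced in the thread):

```python
import numpy as np

# Hypothetical stand-in for the attached transition matrix (rows sum to 1);
# substitute the real P from the image to get the actual answers.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.4, 0.5],
])
pi0 = np.array([1/3, 1/3, 1/3])  # initial distribution pi^0

# (1) Distribution of X_2 given X_0 = 1: start from the point mass on state 1.
dist_X2_given_1 = np.array([1.0, 0.0, 0.0]) @ np.linalg.matrix_power(P, 2)

# (2) P(X_2 = 3): third component of pi^0 P^2.
dist_X2 = pi0 @ np.linalg.matrix_power(P, 2)
p_X2_eq_3 = dist_X2[2]

# (3) E(X_3): weight the state values 1, 2, 3 by the distribution pi^0 P^3.
dist_X3 = pi0 @ np.linalg.matrix_power(P, 3)
E_X3 = dist_X3 @ np.array([1, 2, 3])

# (4) P(X_0 = 1 | X_3 = 3) by Bayes' rule:
#     P(X_0 = 1) * P(X_3 = 3 | X_0 = 1) / P(X_3 = 3).
P3 = np.linalg.matrix_power(P, 3)
p_X0_1_given_X3_3 = pi0[0] * P3[0, 2] / dist_X3[2]

print(dist_X2_given_1, p_X2_eq_3, E_X3, p_X0_1_given_X3_3)
```

The same pattern (a row vector times a matrix power) handles every part; only the starting vector and the post-processing change.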