# Expected Value Markov Chain

• Feb 13th 2011, 03:30 PM
joestevens
Expected Value Markov Chain
A Markov chain $\{X_n , n\geq 0\}$ with states $0, 1, 2$, has the transition probability matrix
$\left[ \begin{array}{ccc}
\frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\
0 & \frac{1}{3} & \frac{2}{3} \\
\frac{1}{2} & 0 & \frac{1}{2} \end{array} \right]
$

If $P(X_0 = 0)=P(X_0 = 1)=\frac{1}{4}$, find $E(X_3)$.

Is this just as simple as taking $\left[ \begin{array}{ccc}
\frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\
0 & \frac{1}{3} & \frac{2}{3} \\
\frac{1}{2} & 0 & \frac{1}{2} \end{array} \right]^3 \left[ \begin{array}{c}
\frac{1}{4} \\
\frac{1}{4} \\
\frac{1}{2} \end{array} \right]
$?
• Feb 14th 2011, 12:57 AM
Moo
Hello,

To get the distribution of $X_3$ (the probabilities at time 3), it's rather

$\left[ \begin{array}{c}
\frac{1}{4} \\
\frac{1}{4} \\
\frac{1}{2} \end{array} \right]
\left[ \begin{array}{ccc}
\frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\
0 & \frac{1}{3} & \frac{2}{3} \\
\frac{1}{2} & 0 & \frac{1}{2} \end{array} \right]^3$

And in order to get $\displaystyle E[X_3]=\sum_{j=0}^2 jP(X_3=j)$, multiply the result on the right by $\begin{bmatrix} 0\\1\\2\end{bmatrix}$ (quote this post to see a simpler way to write matrices).
• Feb 14th 2011, 07:09 AM
joestevens
Do you mean $\left[ \begin{array}{c}
\frac{1}{4} \\
\frac{1}{4} \\
\frac{1}{2} \end{array} \right]^T
\left[ \begin{array}{ccc}
\frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\
0 & \frac{1}{3} & \frac{2}{3} \\
\frac{1}{2} & 0 & \frac{1}{2} \end{array} \right]^3\begin{bmatrix} 0\\1\\2\end{bmatrix}$
? (Note the transpose.)
• Feb 14th 2011, 11:07 AM
Moo
Yep sorry, I forgot the transpose, but that's what you have to compute =)
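For anyone who wants to check the arithmetic, here is a quick sketch (not from the thread; it uses Python's `fractions` module for exact arithmetic) that propagates the initial distribution three steps through the chain and then computes $E[X_3]$ as described above:

```python
from fractions import Fraction as F

# Row-stochastic transition matrix P for states 0, 1, 2.
P = [[F(1, 2), F(1, 3), F(1, 6)],
     [F(0),    F(1, 3), F(2, 3)],
     [F(1, 2), F(0),    F(1, 2)]]

# Initial distribution: P(X0=0) = P(X0=1) = 1/4, so P(X0=2) = 1/2.
pi = [F(1, 4), F(1, 4), F(1, 2)]

# Propagate three steps: pi_{n+1} = pi_n P (row vector on the left).
for _ in range(3):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# E[X3] = sum_j j * P(X3 = j), i.e. multiply on the right by [0, 1, 2]^T.
EX3 = sum(j * pi[j] for j in range(3))
print(EX3)  # 53/54
```

This agrees with multiplying the row vector $(1/4,\,1/4,\,1/2)$ by $P^3$ and then by $\begin{bmatrix} 0\\1\\2\end{bmatrix}$, exactly as in the corrected expression above.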