
Math Help - Expected Value Markov Chain

  1. #1
    Junior Member
    Joined
    Sep 2010
    Posts
    42

    Expected Value Markov Chain

    A Markov chain \{X_n , n\geq 0\} with states 0, 1, 2 has the transition probability matrix
    \begin{bmatrix} \frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\ 0 & \frac{1}{3} & \frac{2}{3} \\ \frac{1}{2} & 0 & \frac{1}{2} \end{bmatrix}
    If P(X_0 = 0)=P(X_0 = 1)=\frac{1}{4}, find E(X_3).


    Is this just as simple as taking \begin{bmatrix} \frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\ 0 & \frac{1}{3} & \frac{2}{3} \\ \frac{1}{2} & 0 & \frac{1}{2} \end{bmatrix}^3 \begin{bmatrix} \frac{1}{4} \\ \frac{1}{4} \\ \frac{1}{2} \end{bmatrix}? If not, how should I go about this?

  2. #2
    Moo
    A Cute Angle
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6
    Hello,

    To get the distribution of X_3, that is the probabilities P(X_3=j), it's rather

    \begin{bmatrix} \frac{1}{4} \\ \frac{1}{4} \\ \frac{1}{2} \end{bmatrix} \begin{bmatrix} \frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\ 0 & \frac{1}{3} & \frac{2}{3} \\ \frac{1}{2} & 0 & \frac{1}{2} \end{bmatrix}^3

    And in order to get \displaystyle E[X_3]=\sum_{j=0}^2 jP(X_3=j), multiply the result on the right by \begin{bmatrix} 0\\1\\2\end{bmatrix} (see the code for a simpler way to write matrices).

  3. #3
    Junior Member
    Joined
    Sep 2010
    Posts
    42


    Do you mean \begin{bmatrix} \frac{1}{4} \\ \frac{1}{4} \\ \frac{1}{2} \end{bmatrix}^T \begin{bmatrix} \frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\ 0 & \frac{1}{3} & \frac{2}{3} \\ \frac{1}{2} & 0 & \frac{1}{2} \end{bmatrix}^3 \begin{bmatrix} 0\\1\\2\end{bmatrix}? (Notice the transpose.)

  4. #4
    Moo
    A Cute Angle
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6
    Yep, sorry, I forgot the transpose, but that's what you have to compute =)
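    The computation described above (initial distribution times the cube of the transition matrix, then a dot product with the state values) can be sketched in Python with exact fractions; this is a minimal check, and the variable names are my own:

    ```python
    from fractions import Fraction as F

    # Transition matrix from the thread (each row sums to 1)
    P = [[F(1, 2), F(1, 3), F(1, 6)],
         [F(0),    F(1, 3), F(2, 3)],
         [F(1, 2), F(0),    F(1, 2)]]

    # Initial distribution: P(X0=0) = P(X0=1) = 1/4, so P(X0=2) = 1/2
    pi = [F(1, 4), F(1, 4), F(1, 2)]

    # Three steps of pi <- pi P (row vector times matrix)
    for _ in range(3):
        pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

    # E[X_3] = sum_j j * P(X_3 = j)
    EX3 = sum(j * p for j, p in enumerate(pi))
    print(EX3)  # -> 53/54
    ```

    Working with Fraction instead of floats keeps the arithmetic exact, so the result comes out as a clean rational number rather than 0.98148...
    
    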

