
Thread: Expected Value Markov Chain

  1. #1
    Junior Member

    Expected Value Markov Chain

    A Markov chain $\displaystyle \{X_n , n\geq 0\}$ with states $\displaystyle 0, 1, 2$, has the transition probability matrix
    $\displaystyle \left[ \begin{array}{ccc}
    \frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\
    0 & \frac{1}{3} & \frac{2}{3} \\
    \frac{1}{2} & 0 & \frac{1}{2} \end{array} \right]
    $
    If $\displaystyle P(X_0 = 0)=P(X_0 = 1)=\frac{1}{4}$, find $\displaystyle E(X_3)$.


    Is this just as simple as taking $\displaystyle \left[ \begin{array}{ccc}
    \frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\
    0 & \frac{1}{3} & \frac{2}{3} \\
    \frac{1}{2} & 0 & \frac{1}{2} \end{array} \right]^3 \left[ \begin{array}{c}
    \frac{1}{4} \\
    \frac{1}{4} \\
    \frac{1}{2} \end{array} \right]
    $? If not, how should I go about this?

  2. #2
    Moo
    Hello,

    To get the distribution of $\displaystyle X_3$, it's rather

    $\displaystyle \left[ \begin{array}{c}
    \frac{1}{4} \\
    \frac{1}{4} \\
    \frac{1}{2} \end{array} \right]
    \left[ \begin{array}{ccc}
    \frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\
    0 & \frac{1}{3} & \frac{2}{3} \\
    \frac{1}{2} & 0 & \frac{1}{2} \end{array} \right]^3$

    And in order to get $\displaystyle E[X_3]=\sum_{j=0}^2 jP(X_3=j)$, multiply the result on the right by $\displaystyle \begin{bmatrix} 0\\1\\2\end{bmatrix}$ (the bmatrix code is a simpler way to write matrices).
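    A quick way to check this computation numerically is with NumPy (a sketch, not part of the hand calculation above):

    ```python
    import numpy as np

    # Transition probability matrix from the thread
    P = np.array([[1/2, 1/3, 1/6],
                  [0,   1/3, 2/3],
                  [1/2, 0,   1/2]])

    # Initial distribution: P(X0=0) = P(X0=1) = 1/4, so P(X0=2) = 1/2
    pi0 = np.array([1/4, 1/4, 1/2])

    # Distribution of X3: the (row) initial distribution times P^3
    pi3 = pi0 @ np.linalg.matrix_power(P, 3)

    # E[X3] = sum_j j * P(X3 = j)
    states = np.array([0, 1, 2])
    E_X3 = pi3 @ states

    print(pi3)   # distribution of X3 (should sum to 1)
    print(E_X3)  # expected value of X3
    ```

    Working the fractions by hand gives the same answer: $\displaystyle E[X_3]=\frac{53}{54}\approx 0.9815$.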

  3. #3
    Junior Member


    Do you mean $\displaystyle \left[ \begin{array}{c}
    \frac{1}{4} \\
    \frac{1}{4} \\
    \frac{1}{2} \end{array} \right]^T
    \left[ \begin{array}{ccc}
    \frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\
    0 & \frac{1}{3} & \frac{2}{3} \\
    \frac{1}{2} & 0 & \frac{1}{2} \end{array} \right]^3\begin{bmatrix} 0\\1\\2\end{bmatrix}$? (Notice transpose)

  4. #4
    Moo
    Yep sorry, I forgot the transpose, but that's what you have to compute =)
