
Math Help - Markov Chain

  1. #1 liedora

    Markov Chain

    Hi guys, I just need a bit of help with this question. Suppose we have P = [0 1/2 1/2; 1 0 0; 1 0 0] and let Q_0 = [x y z], where x, y, z >= 0 and x + y + z = 1.


    How would I find Q_(2n) and Q_(2n-1), that is, the state distribution after an even or odd number of steps? And how would we then use this to explain whether or not the limiting distribution exists?
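
    For concreteness, this is the kind of computation I mean; an illustrative NumPy sketch with an arbitrary choice of starting distribution:

        import numpy as np

        # transition matrix from the question and an arbitrary Q_0 with x + y + z = 1
        P = np.array([[0.0, 0.5, 0.5],
                      [1.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0]])
        Q = np.array([0.2, 0.3, 0.5])

        for n in range(1, 7):
            Q = Q @ P          # state distribution after n steps
            print(n, Q)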

    Thanks in advance

  2. #2 Soroban

    Re: Markov Chain

    Hello, liedora!

    Did you try anything?


    P \:=\:\begin{pmatrix}0&\frac{1}{2}&\frac{1}{2} \\ 1&0&0 \\ 1&0&0\end{pmatrix}\,\text{ and let }Q_0\,=\,[x\:y\:z]
    . . \text{where }x,y,z \ge 0\text{ and }x+y+z \,=\, 1.

    \text{Find }Q_{2n}\text{ and }Q_{2n-1}
    . . \text{the state distribution after an even or odd number of steps.}

    We have: . P \:=\:\begin{pmatrix}0&\frac{1}{2}&\frac{1}{2} \\ 1&0&0 \\ 1&0&0\end{pmatrix}

    P^2 \:=\:\begin{pmatrix}0&\frac{1}{2}& \frac{1}{2} \\ 1&0&0 \\ 1&0&0\end{pmatrix}\begin{pmatrix}0&\frac{1}{2}& \frac{1}{2} \\ 1&0&0 \\ 1&0&0\end{pmatrix} \:=\:\begin{pmatrix}1&0&0\\ 0&0&0 \\ 0&0&0 \end{pmatrix}

    P^3\:=\:\begin{pmatrix}1&0&0\\0&0&0\\0&0&0 \end{pmatrix}\begin{pmatrix}0&\frac{1}{2}&\frac{1}{2} \\ 1&0&0 \\ 1&0&0\end{pmatrix} \:=\:\begin{pmatrix}0&\frac{1}{2} & \frac{1}{2} \\ 0&0&0 \\ 0&0&0 \end{pmatrix}

    P^4 \:=\:\begin{pmatrix}0&\frac{1}{2}& \frac{1}{2} \\ 0&0&0 \\ 0&0&0 \end{pmatrix} \begin{pmatrix}0 & \frac{1}{2} & \frac{1}{2} \\ 1&0&0 \\ 1&0&0\end{pmatrix} \:=\:\begin{pmatrix}1&0&0 \\ 0&0&0 \\ 0&0&0 \end{pmatrix}


    \text{Hence: }\:P^{2n} \:=\:\begin{pmatrix}1&0&0\\0&0&0\\0&0&0 \end{pmatrix} \qquad P^{2n+1} \:=\: \begin{pmatrix}0&\frac{1}{2}& \frac{1}{2} \\ 0&0&0 \\ 0&0&0 \end{pmatrix}


    \text{Therefore: }\:\begin{Bmatrix}Q_{2n} &=& Q_0\!\cdot\!P^{2n} &=& [x\;\;0\;\;0] \\ \\[-3mm] Q_{2n+1} &=& Q_0\!\cdot\!P^{2n+1} &=& \left[0\;\frac{1}{2}x\;\frac{1}{2}x\right] \end{Bmatrix}

  3. #3 liedora

    Re: Markov Chain

    Hi Soroban, I think your matrix multiplication is incorrect. For example, by hand and in MATLAB I got P^(2n) = P^2 = [1 0 0; 0 1/2 1/2; 0 1/2 1/2] and P^(2n+1) = P^3 = [0 1/2 1/2; 1 0 0; 1 0 0]. But I now understand what it means to find the state distribution after an odd or even number of steps, so thanks for that.
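
    For reference, a minimal version of that check; written here as a NumPy sketch purely for illustration, the MATLAB version is the same idea:

        import numpy as np

        # transition matrix from the question
        P = np.array([[0.0, 0.5, 0.5],
                      [1.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0]])

        P2 = P @ P                  # [[1, 0, 0], [0, 1/2, 1/2], [0, 1/2, 1/2]]
        P3 = P2 @ P                 # equal to P again
        print(np.allclose(P3, P))   # True, so P^(2n) = P^2 and P^(2n+1) = P for all n >= 1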

    Also, what is meant by the limiting distribution (using the results we have established above)? I can't seem to decipher the notes, but I have a feeling the limiting distribution does not exist. I'm not sure how to show this, however.

    Thanks for your fast reply Soroban.

  4. #4 Moo

    Re: Markov Chain

    Quote Originally Posted by liedora
    Yes, your calculations are correct.
    For the limiting distribution, just think of it as an ordinary limit: after some index N, Q_n should stay close to the limiting distribution. But here the sequence oscillates, so you never get a single limit.
    If you want to do it formally, look up how the limiting distribution of a Markov chain is defined.
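
    Concretely, using the corrected powers from the previous post (a quick sketch, writing y + z = 1 - x, valid for n >= 1):

    Q_{2n} \:=\: Q_0\cdot P^{2} \:=\: \left[x\;\;\tfrac{1-x}{2}\;\;\tfrac{1-x}{2}\right] \qquad Q_{2n+1} \:=\: Q_0\cdot P \:=\: \left[1-x\;\;\tfrac{x}{2}\;\;\tfrac{x}{2}\right]

    These agree only when x = 1 - x, i.e. x = 1/2, which gives the stationary distribution [1/2 1/4 1/4]. For any other starting distribution the even and odd subsequences converge to two different vectors, so Q_n has no limit; the chain is periodic with period 2.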
