
Math Help - Markov chains/limiting matrix

  1. #1
    Newbie
    Joined
    Jan 2013
    From
    South Africa
    Posts
    14

    Markov chains/limiting matrix

    Hello. Can someone please help me find the limiting matrix of the following transition matrix of an absorbing Markov chain:

    P=
    1 0 0
    .1 .6 .3
    .2 .2 .6

    I have tried finding the limiting matrix myself, but my answer does not seem right. Can you please check whether it's correct?

    The matrix P is in standard form, which is the form

    P =
    [ I  0 ]
    [ R  Q ]

    where I is the identity matrix. And I know that the limiting matrix must be in this form:

    limiting matrix of P =
    [ I   0 ]
    [ FR  0 ]


    F is the fundamental matrix for P. It is equal to (I - Q)^-1.

    I - Q is

    [1 0]   [.6 .3]   [ .4  -.3]
    [0 1] - [.2 .6] = [-.2   .4]

    Its determinant is (.4)(.4) - (-.3)(-.2) = .16 - .06 = .10, so the inverse is

    F = (1/.1) [.4  .3]   [4  3]
               [.2  .4] = [2  4]

    Now, FR is equal to

    [4 3] [.1]   [1]
    [2 4] [.2] = [1]

    This makes the limiting matrix seem like

    [1 0 0]
    [1 0 0]
    [1 0 0]

    This matrix just seems strange O_o.
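    If it helps, the whole computation can be checked numerically. Here is a quick NumPy sketch of the same fundamental-matrix calculation (not part of the original post; the variable names are mine):

    ```python
    import numpy as np

    # Transition matrix in standard form: absorbing state first.
    P = np.array([[1.0, 0.0, 0.0],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])

    R = P[1:, :1]   # transient -> absorbing transitions
    Q = P[1:, 1:]   # transient -> transient transitions

    # Fundamental matrix F = (I - Q)^-1
    F = np.linalg.inv(np.eye(2) - Q)
    FR = F @ R

    # Assemble the limiting matrix [[I, 0], [FR, 0]]
    limit = np.zeros_like(P)
    limit[0, 0] = 1.0
    limit[1:, :1] = FR
    print(F)      # ≈ [[4, 3], [2, 4]]
    print(limit)  # every row ≈ (1, 0, 0)
    ```

    Both entries of FR come out as 1, so each transient state is absorbed into the first state with probability 1 — the matrix of identical (1, 0, 0) rows is exactly what we should expect, strange as it looks.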

  2. #2
    Super Member

    Joined
    May 2006
    From
    Lexington, MA (USA)
    Posts
    11,866
    Thanks
    745

    Re: Markov chains/limiting matrix

    Hello, Yoodle15!

    I arrived at the same result.


    Find the limiting matrix of the following transition matrix of an absorbing Markov chain:

    . . A \;=\;\begin{bmatrix}1&0&0 \\ 0.1&0.6&0.3 \\ 0.2&0.2&0.6 \end{bmatrix}

    I was taught this method . . .

    \text{We want a row matrix: }\,X \:=\:(p,q,r)\,\text{ so that: }\:\begin{Bmatrix}X\cdot A \:=\:X & [1] \\ p+q+r \:=\:1 & [2] \end{Bmatrix}

    From [1], we have: . (p,q,r)\begin{bmatrix}1&0&0 \\ 0.1&0.6&0.3\\0.2&0.2&0.6\end{bmatrix} \;=\;(p,q,r)

    And we have: . \begin{Bmatrix}p+0.1q + 0.2r &=& p & \\ \quad 0.6q+0.2r &=& q \\ \quad 0.3q + 0.6r &=& r \end{Bmatrix}

    These simplify to: . \begin{Bmatrix}\quad\;\; q + 2r &=& 0 \\ \quad\text{-}4q + 2r &=& 0 \\ \quad 3q - 4r &=& 0 \end{Bmatrix}
    . . And we have [2]: . . . p+q+r \:=\:1

    Solve the system of equations: . \begin{Bmatrix}p &=& 1 \\ q&=&0 \\ r&=&0 \end{Bmatrix}

    Therefore: . X \:=\:(1,0,0)
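    For anyone who wants to verify this numerically, conditions [1] and [2] can be stacked into one linear system and solved by least squares. A short NumPy sketch (the setup and variable names are my own):

    ```python
    import numpy as np

    A = np.array([[1.0, 0.0, 0.0],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])

    # X A = X  is  (A^T - I) X^T = 0; append the row (1, 1, 1)
    # with right-hand side 1 to enforce p + q + r = 1.
    M = np.vstack([A.T - np.eye(3), np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])

    X, *_ = np.linalg.lstsq(M, b, rcond=None)
    print(X)  # ≈ (1, 0, 0)
    ```

    The overdetermined system has the exact solution X = (1, 0, 0), so least squares recovers it to machine precision.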


    ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~


    This makes sense if we consider the transition probabilities.

    We have three states: a_1,\,a_2,\,a_3.

    Matrix A gives us these probabilities:

    . . \begin{Bmatrix}P(a_1\!\to\!a_1) \:=\:1.0 & P(a_1\!\to\!a_2) \:=\:0.0 & P(a_1\!\to\!a_3) \:=\:0.0 \\ P(a_2\!\to\!a_1) \:=\:0.1 & P(a_2\!\to\!a_2) \:=\:0.6 & P(a_2\!\to\!a_3) \:=\:0.3 \\ P(a_3\!\to\!a_1) \:=\:0.2 & P(a_3\!\to\!a_2) \:=\:0.2 & P(a_3\!\to\!a_3) \:=\:0.6 \end{Bmatrix}

    \text{If the process reaches }a_1\text{, it }stays\text{ in }a_1\text{ . . . forever.}

    \text{As time passes, the probability of }a_2\text{ and }a_3\text{ avoiding }a_1\text{ gets smaller and smaller.}

    \text{All states will eventually reach }a_1\text{ and be "absorbed".}
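    The absorption argument above can also be seen directly by raising A to a large power — A^n converges to the limiting matrix. A small sketch (my own check, not from the original post):

    ```python
    import numpy as np

    A = np.array([[1.0, 0.0, 0.0],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])

    # After many steps, every starting state has been absorbed into a_1.
    An = np.linalg.matrix_power(A, 200)
    print(np.round(An, 6))  # every row ≈ (1, 0, 0)
    ```

    The transient block Q has spectral radius below 1 (its largest eigenvalue is about 0.84), so Q^n → 0 and every row of A^n tends to (1, 0, 0).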

  3. #3
    Newbie
    Joined
    Jan 2013
    From
    South Africa
    Posts
    14

    Re: Markov chains/limiting matrix

    Thank you for your help, Soroban! I am relieved that the answer is right! I cross-checked it many times and couldn't find a different one, but for some reason I still felt suspicious of it.
