# Markov chains/limiting matrix

• Jan 31st 2013, 06:16 AM
Yoodle15
Markov chains/limiting matrix
Hello. Can someone please help me find the limiting matrix of the following transition matrix of an absorbing Markov chain:

. . $P \;=\;\begin{bmatrix}1&0&0 \\ 0.1&0.6&0.3 \\ 0.2&0.2&0.6 \end{bmatrix}$

I have tried finding the limiting matrix myself but my answer does not seem to be right. Can you please check if it's alright?

The matrix P is in standard form, which is the form

. . $P \;=\;\begin{bmatrix}I&0 \\ R&Q \end{bmatrix}$

where $I$ is the identity matrix, $R$ holds the transient-to-absorbing probabilities, and $Q$ the transient-to-transient probabilities.

And I know that the limiting matrix must be in this form:

. . $\text{Limiting matrix of }P \;=\;\begin{bmatrix}I&0 \\ FR&0 \end{bmatrix}$

F is the fundamental matrix for P: $F \;=\;(I - Q)^{-1}$.

. . $I - Q \;=\;\begin{bmatrix}1&0 \\ 0&1 \end{bmatrix} - \begin{bmatrix}0.6&0.3 \\ 0.2&0.6 \end{bmatrix} \;=\;\begin{bmatrix}0.4&\text{-}0.3 \\ \text{-}0.2&0.4 \end{bmatrix}$

So the inverse is:

. . $F \;=\;\begin{bmatrix}0.4&\text{-}0.3 \\ \text{-}0.2&0.4 \end{bmatrix}^{-1} \;=\;\begin{bmatrix}4&3 \\ 2&4 \end{bmatrix}$
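As a quick sanity check on that inverse, here is a short NumPy sketch (not part of the original post; the block names `Q` and `F` follow the thread's notation):

```python
import numpy as np

# Q is the transient-to-transient block of P (states a2, a3)
Q = np.array([[0.6, 0.3],
              [0.2, 0.6]])

# Fundamental matrix F = (I - Q)^{-1}
F = np.linalg.inv(np.eye(2) - Q)
print(F)  # approximately [[4, 3], [2, 4]]
```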

Now, FR is equal to:

. . $FR \;=\;\begin{bmatrix}4&3 \\ 2&4 \end{bmatrix}\begin{bmatrix}0.1 \\ 0.2 \end{bmatrix} \;=\;\begin{bmatrix}1 \\ 1 \end{bmatrix}$

This makes the limiting matrix seem like:

. . $\begin{bmatrix}1&0&0 \\ 1&0&0 \\ 1&0&0 \end{bmatrix}$

This matrix just seems strange O_o.
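One numerical way to check a limiting matrix is to raise P to a large power and see what it approaches; a NumPy sketch (the choice of 100 steps is arbitrary, but the transient block shrinks geometrically, so it is more than enough here):

```python
import numpy as np

P = np.array([[1.0, 0.0, 0.0],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# P^n converges to the limiting matrix as n grows
limit = np.linalg.matrix_power(P, 100)
print(np.round(limit, 6))  # every row approaches [1, 0, 0]
```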
• Jan 31st 2013, 09:06 AM
Soroban
Re: Markov chains/limiting matrix
Hello, Yoodle15!

I arrived at the same result.

Quote:

Find the limiting matrix of the following transition matrix of an absorbing Markov chain:

. . $A \;=\;\begin{bmatrix}1&0&0 \\ 0.1&0.6&0.3 \\ 0.2&0.2&0.6 \end{bmatrix}$

I was taught this method . . .

$\text{We want a row matrix: }\,X \:=\:(p,q,r)\,\text{ so that: }\:\begin{Bmatrix}X\cdot A \:=\:X & [1] \\ p+q+r \:=\:1 & [2] \end{Bmatrix}$

From [1], we have: . $(p,q,r)\begin{bmatrix}1&0&0 \\ 0.1&0.6&0.3\\0.2&0.2&0.6\end{bmatrix} \;=\;(p,q,r)$

And we have: . $\begin{Bmatrix}p+0.1q + 0.2r &=& p & \\ \quad 0.6q+0.2r &=& q \\ \quad 0.3q + 0.6r &=& r \end{Bmatrix}$

These simplify to: . $\begin{Bmatrix}\quad q + 2r &=& 0 \\ \quad\text{-}4q + 2r &=& 0 \\ \quad 3q - 4r &=& 0 \end{Bmatrix}$
. . And we have [2]: . . . $p+q+r \:=\:1$

Solve the system of equations: . $\begin{Bmatrix}p &=& 1 \\ q&=&0 \\ r&=&0 \end{Bmatrix}$

Therefore: . $X \:=\:(1,0,0)$
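Soroban's system can also be solved numerically. A sketch with NumPy (not from the thread): $X A = X$ is equivalent to $(A^{\mathsf T} - I)X^{\mathsf T} = 0$, and since that system is redundant, one equation is replaced by the normalisation $p + q + r = 1$:

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Build (A^T - I) X^T = 0, then overwrite the last (redundant) row
# with the constraint p + q + r = 1
M = A.T - np.eye(3)
M[2] = 1.0
b = np.array([0.0, 0.0, 1.0])

X = np.linalg.solve(M, b)
print(X)  # approximately [1, 0, 0]
```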

~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~

This makes sense if we consider the transition probabilities.

We have three states: $a_1,\,a_2,\,a_3.$

Matrix $A$ gives us these probabilities:

. . $\begin{Bmatrix}P(a_1\!\to\!a_1) \:=\:1.0 & P(a_1\!\to\!a_2) \:=\:0.0 & P(a_1\!\to\!a_3) \:=\:0.0 \\ P(a_2\!\to\!a_1) \:=\:0.1 & P(a_2\!\to\!a_2) \:=\:0.6 & P(a_2\!\to\!a_3) \:=\:0.3 \\ P(a_3\!\to\!a_1) \:=\:0.2 & P(a_3\!\to\!a_2) \:=\:0.2 & P(a_3\!\to\!a_3) \:=\:0.6 \end{Bmatrix}$

$\text{If the process reaches }a_1\text{, it }stays\text{ in }a_1\text{ . . . forever.}$

$\text{As time passes, the probability of }a_2\text{ and }a_3\text{ avoiding }a_1\text{ gets smaller and smaller.}$

$\text{All states will eventually reach }a_1\text{ and be "absorbed".}$
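The absorption argument can be illustrated with a small Monte Carlo simulation (a sketch; the seed, run count, and step cap are arbitrary choices, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[1.0, 0.0, 0.0],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Simulate many runs starting in a2 and count absorptions into a1
runs, absorbed = 10_000, 0
for _ in range(runs):
    state = 1                     # start in a2 (0-indexed states)
    for _ in range(200):          # plenty of steps for absorption
        state = rng.choice(3, p=P[state])
        if state == 0:            # reached a1: absorbed forever
            absorbed += 1
            break
print(absorbed / runs)  # essentially 1.0
```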
• Jan 31st 2013, 09:21 AM
Yoodle15
Re: Markov chains/limiting matrix
Thank you for your help, Soroban! I am relieved that the answer is alright! I cross-checked it many times and couldn't find another answer, but for some reason I still felt suspicious of it.