Markov chains/limiting matrix

Hello. Can someone please help me find the limiting matrix of the following transition matrix of an absorbing Markov chain:

$\displaystyle P \;=\;\begin{bmatrix}1&0&0 \\ 0.1&0.6&0.3 \\ 0.2&0.2&0.6 \end{bmatrix}$

I have tried finding the limiting matrix myself, but my answer does not seem to be right. Can you please check whether it is correct?

The matrix $P$ is in standard form, which is the form

$\displaystyle P \;=\;\begin{bmatrix}I & 0 \\ R & Q \end{bmatrix} \;=\;\left[\begin{array}{c|cc}1&0&0 \\ \hline 0.1&0.6&0.3 \\ 0.2&0.2&0.6 \end{array}\right]$

And I know that the limiting matrix must be of this form:

$\displaystyle \text{Limiting matrix of }P \;=\;\begin{bmatrix}I & 0 \\ FR & 0 \end{bmatrix}$

$F$ is the fundamental matrix for $P$. It is equal to $(I - Q)^{-1}$.

$\displaystyle I - Q \;=\;\begin{bmatrix}1&0 \\ 0&1 \end{bmatrix} - \begin{bmatrix}0.6&0.3 \\ 0.2&0.6 \end{bmatrix} \;=\;\begin{bmatrix}0.4&\text{-}0.3 \\ \text{-}0.2&0.4 \end{bmatrix}$

This matrix to the power of $-1$ (i.e. its inverse) is

$\displaystyle F \;=\;(I-Q)^{-1} \;=\;\frac{1}{(0.4)(0.4)-(\text{-}0.3)(\text{-}0.2)}\begin{bmatrix}0.4&0.3 \\ 0.2&0.4 \end{bmatrix} \;=\;\frac{1}{0.1}\begin{bmatrix}0.4&0.3 \\ 0.2&0.4 \end{bmatrix} \;=\;\begin{bmatrix}4&3 \\ 2&4 \end{bmatrix}$

Now, $FR$ is equal to

$\displaystyle FR \;=\;\begin{bmatrix}4&3 \\ 2&4 \end{bmatrix}\begin{bmatrix}0.1 \\ 0.2 \end{bmatrix} \;=\;\begin{bmatrix}1 \\ 1 \end{bmatrix}$

This makes the limiting matrix seem like

$\displaystyle \begin{bmatrix}1&0&0 \\ 1&0&0 \\ 1&0&0 \end{bmatrix}$

This matrix just seems strange O_o.
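[Editor's note: the limiting matrix above can be sanity-checked numerically. The following is a small sketch added for illustration, not part of the original post: it raises $P$ to a large power and shows every row converging to $(1, 0, 0)$.]

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[1.0, 0.0, 0.0],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

# Compute a high power of P; for an absorbing chain, P^n approaches
# the limiting matrix as n grows (Q^n -> 0).
Pn = P
for _ in range(200):
    Pn = matmul(Pn, P)

print([[round(x, 6) for x in row] for row in Pn])
# every row is close to [1, 0, 0], matching the FR computation above
```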

Re: Markov chains/limiting matrix

Hello, Yoodle15!

I arrived at the same result.

Quote:

Find the limiting matrix of the following transition matrix of an absorbing Markov chain:

. . $\displaystyle A \;=\;\begin{bmatrix}1&0&0 \\ 0.1&0.6&0.3 \\ 0.2&0.2&0.6 \end{bmatrix} $

I was taught this method . . .

$\displaystyle \text{We want a row matrix: }\,X \:=\:(p,q,r)\,\text{ so that: }\:\begin{Bmatrix}X\cdot A \:=\:X & [1] \\ p+q+r \:=\:1 & [2] \end{Bmatrix}$

From [1], we have: .$\displaystyle (p,q,r)\begin{bmatrix}1&0&0 \\ 0.1&0.6&0.3\\0.2&0.2&0.6\end{bmatrix} \;=\;(p,q,r)$

And we have: .$\displaystyle \begin{Bmatrix}p+0.1q + 0.2r &=& p & \\ \quad 0.6q+0.2r &=& q \\ \quad 0.3q + 0.6r &=& r \end{Bmatrix}$

These simplify to: .$\displaystyle \begin{Bmatrix}\quad q + 2r &=& 0 \\ \quad\text{-}4q + 2r &=& 0 \\ \quad 3q - 4r &=& 0 \end{Bmatrix}$

. . And we have [2]: . . . $\displaystyle p+q+r \:=\:1$

Solve the system of equations: .$\displaystyle \begin{Bmatrix}p &=& 1 \\ q&=&0 \\ r&=&0 \end{Bmatrix}$

Therefore: .$\displaystyle X \:=\:(1,0,0)$
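[Editor's note: the result $X = (1,0,0)$ is easy to verify directly. This short sketch, added for illustration, checks conditions [1] and [2] from above.]

```python
A = [[1.0, 0.0, 0.0],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]
X = [1.0, 0.0, 0.0]

# Condition [1]: the row vector X times A should give X back.
XA = [sum(X[i] * A[i][j] for i in range(3)) for j in range(3)]
print(XA)        # [1.0, 0.0, 0.0]

# Condition [2]: the entries of X sum to 1.
print(sum(X))    # 1.0
```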

~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~

This makes sense if we consider the transition probabilities.

We have three states: $\displaystyle a_1,\,a_2,\,a_3.$

Matrix $\displaystyle A$ gives us these probabilities:

. . $\displaystyle \begin{Bmatrix}P(a_1\!\to\!a_1) \:=\:1.0 & P(a_1\!\to\!a_2) \:=\:0.0 & P(a_1\!\to\!a_3) \:=\:0.0 \\ P(a_2\!\to\!a_1) \:=\:0.1 & P(a_2\!\to\!a_2) \:=\:0.6 & P(a_2\!\to\!a_3) \:=\:0.3 \\ P(a_3\!\to\!a_1) \:=\:0.2 & P(a_3\!\to\!a_2) \:=\:0.2 & P(a_3\!\to\!a_3) \:=\:0.6 \end{Bmatrix}$

$\displaystyle \text{If the process reaches }a_1\text{, it }stays\text{ in }a_1\text{ . . . forever.}$

$\displaystyle \text{As time passes, the probability that the process, starting in }a_2\text{ or }a_3\text{, has still avoided }a_1\text{ gets smaller and smaller.}$

$\displaystyle \text{All states will eventually reach }a_1\text{ and be "absorbed".}$
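[Editor's note: this absorption behaviour can also be seen by simulation. The sketch below, added for illustration, runs the chain many times from state $a_2$ and counts how many runs end in $a_1$; the number of steps (500) is a convenience choice large enough that absorption is effectively certain.]

```python
import random

P = [[1.0, 0.0, 0.0],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

# One deterministic RNG shared across runs so the result is reproducible.
rng = random.Random(0)

def run_chain(start, steps=500):
    """Simulate the chain for a fixed number of steps; return the end state."""
    state = start
    for _ in range(steps):
        state = rng.choices([0, 1, 2], weights=P[state])[0]
    return state

# Start 1000 runs in state a_2 (index 1) and count absorptions into a_1.
absorbed = sum(run_chain(start=1) == 0 for _ in range(1000))
print(absorbed)   # 1000: every run ends absorbed in a_1
```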

Re: Markov chains/limiting matrix

Thank you for your help, Soroban! I am relieved that the answer is alright! I cross-checked it many times and could not get a different result, but for some reason I still felt suspicious of it.