# Math Help - What are the Limiting Distribution and Limiting Probabilities

1. ## What are the Limiting Distribution and Limiting Probabilities

Hi
I have been studying Markov Chains and Queuing Theory. I have completed quite a lot of Markov Chains and am moving on to Queuing Theory, and there is a procedure to convert an M/G/1 queue into an embedded Markov Chain, and suddenly all the pages refer to something called the Limiting Distribution of the Markov Chain.

What ????

I flipped back through the book, but could not find a single explanation of what the term "Limiting Distribution" of a Markov Chain is supposed to mean.
I checked through several books; they all use the term but never explain it. They do not even list it in the index.
I can't find it explained anywhere on the Internet either.

Please help me out, at the earliest, and explain what in the world "Limiting Distribution" of a Markov Chain is supposed to mean.

I know that if the initial (0th) distribution is the row vector $\pi_0$ and the transition probability matrix is P, then the nth distribution is $\pi_0 P^n$, where $P^n$ means P raised to the power n.
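A minimal numerical sketch of that formula, using a made-up two-state chain (the matrix P and the starting vector below are pure illustration, not from any particular queueing model):

```python
# Sketch: distribution after n steps of a Markov chain, pi_n = pi_0 * P^n,
# computed by repeated vector-matrix multiplication.

def step(dist, P):
    """One step: multiply the row vector `dist` by the transition matrix P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def distribution_after(dist, P, steps):
    """Apply `steps` transitions to the initial distribution `dist`."""
    for _ in range(steps):
        dist = step(dist, P)
    return dist

P = [[0.9, 0.1],
     [0.5, 0.5]]          # hypothetical transition matrix (each row sums to 1)
pi0 = [1.0, 0.0]          # start in state 0 with probability 1

print(distribution_after(pi0, P, 1))   # one step: [0.9, 0.1]
print(distribution_after(pi0, P, 50))  # settles near (5/6, 1/6) for this chain
```

For this particular chain the 50-step distribution is already indistinguishable from the limit (5/6, 1/6), whatever starting vector you pick.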

Note: a definition, an explanation and an example are usually the best way of getting an understanding.

Pankaj

2. Originally Posted by pankajthepiper
Please help me out, at the earliest, and explain what in the world "Limiting Distribution" of a Markov Chain is supposed to mean.
Section 2.6 of the Wikipedia article.

CB

3. Capt. Black
However, I had already looked at Section 2.6 of the Wikipedia article; it mentions limiting distributions in the heading, but does not actually explain them inside the section. Instead it explains the stationary distribution and the equilibrium distribution.

Is either of them (or both) identical to the limiting distribution, and if so, which one?

I could not get the main idea behind what a limiting distribution is supposed to be, and would appreciate it if anyone could point it out to me.

Here is what I think:

When n -> infinity, all the row vectors of the transition matrix P^n become identical.
Is the Limiting Distribution identical to one of these row vectors, and is it always?
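That is exactly what happens for a nice (irreducible, aperiodic) chain. A quick sketch, again with a made-up two-state matrix, shows the rows of P^n collapsing to one common vector, and that common row is the limiting distribution:

```python
# Sketch: raise a (hypothetical) transition matrix to a high power and watch
# its rows converge to the same vector -- the limiting distribution.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matrix_power(P, n):
    """Compute P^n by repeated multiplication, starting from the identity."""
    size = len(P)
    result = [[float(i == j) for j in range(size)] for i in range(size)]
    for _ in range(n):
        result = matmul(result, P)
    return result

P = [[0.9, 0.1],
     [0.5, 0.5]]          # hypothetical transition matrix

Pn = matrix_power(P, 50)
print(Pn[0])              # both rows are (numerically) the same vector,
print(Pn[1])              # close to (5/6, 1/6) for this chain
```

It is not always the case, though: for a periodic chain such as P = [[0, 1], [1, 0]], the powers P^n alternate forever and the rows never converge, so no limiting distribution exists.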

4. Originally Posted by pankajthepiper
When n -> infinity, all the row vectors of the transition matrix P^n become identical.
Is the Limiting Distribution identical to one of these row vectors, and is it always?

"The possible values for the Markov chain are called the states of the Markov chain. A stationary distribution $\pi$ for a Markov chain is a distribution over the states such that if we start the Markov chain in $\pi$, we stay in $\pi$. A limiting distribution $\pi$ is a distribution over the states such that whatever the starting distribution $\pi_0$, the Markov chain converges to $\pi$. It is easily seen that if there is a limiting distribution $\pi$, then it is unique, and it is the only stationary distribution. It is easier to find a stationary distribution than a limiting distribution. So to find a limiting distribution, we find a stationary distribution and then argue that it must be the limiting distribution."
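That recipe can be sketched numerically with a made-up two-state chain. For P = [[1-a, a], [b, 1-b]], solving $\pi = \pi P$ with the entries of $\pi$ summing to 1 gives $\pi = (b/(a+b),\, a/(a+b))$ by hand, and iteration confirms it is also the limit:

```python
# Sketch of the quoted recipe: find the stationary distribution first, then
# check it is also the limit. The two-state chain below is pure illustration.

a, b = 0.1, 0.5
P = [[1 - a, a],
     [b, 1 - b]]
pi = [b / (a + b), a / (a + b)]   # stationary distribution, (5/6, 1/6) here

# Check stationarity: starting in pi, one step of the chain keeps us in pi.
pi_next = [pi[0] * P[0][0] + pi[1] * P[1][0],
           pi[0] * P[0][1] + pi[1] * P[1][1]]
print(pi, pi_next)                # the two vectors agree

# Check it is the limit: iterate from a different start and compare.
dist = [0.0, 1.0]
for _ in range(100):
    dist = [dist[0] * P[0][0] + dist[1] * P[1][0],
            dist[0] * P[0][1] + dist[1] * P[1][1]]
print(dist)                       # has converged to pi
```

The "argue that it must be the limiting distribution" step is the part the code cannot do: here it follows because the chain is irreducible and aperiodic, so the unique stationary distribution is the limit.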