# Math Help - Markov Chain

1. ## Markov Chain

A Markov chain $X_{n},n\geq 0$ with states 0, 1, 2 has the transition probability matrix

If $P(X_{0}=0)=P(X_{0}=1)=\frac{1}{4}$, find $E[X_{3}]$

How do I go about solving this? It seems like something that would be very basic, yet I can't find any similar examples in my textbook.

Thanks

2. ## Re: Markov Chain

Originally Posted by downthesun01
A Markov chain $X_{n},n\geq 0$ with states 0, 1, 2 has the transition probability matrix

If $P(X_{0}=0)=P(X_{0}=1)=\frac{1}{4}$, find $E[X_{3}]$

How do I go about solving this? It seems like something that would be very basic, yet I can't find any similar examples in my textbook.

Thanks
You know $S_n = T^n S_0$. So calculate $S_3$ and use its entries to calculate $E(X_3)$.

3. ## Re: Markov Chain

Ok, but what do S and T represent?

4. ## Re: Markov Chain

Originally Posted by downthesun01
Ok, but what do S and T represent?
If you're studying Markov chains, then you ought to know ....! T is the transition matrix. S is ......

5. ## Re: Markov Chain

Hahaha.. apparently we aren't using the same notation. I was under the impression that P represents the probability transition matrix. It's okay. I'll just find out somewhere else. Thanks anyway.

6. ## Re: Markov Chain

Originally Posted by downthesun01
Hahaha.. apparently we aren't using the same notation. I was under the impression that P represents the probability transition matrix. It's okay. I'll just find out somewhere else. Thanks anyway.
I figured as much. But I thought the context would suggest the definitions. S is the state matrix; the subscript tells you which step in the chain.

7. ## Re: Markov Chain

Thank you for your help and taking the time to respond to my questions. However, I think that it'll take more than a simple nudging towards the answer to help me with this problem.

We've never had any discussion of State matrices, so I'm not quite sure of what they are and they do not seem to be mentioned in the textbook. I'm going to go ahead and check YouTube for any lectures on Markov Chains. Maybe they'll have some good examples that will help me out.

If anyone knows of any sites with well-written, easy-to-follow examples, a link would be much appreciated. Thanks

8. ## Re: Markov Chain

Ok. I think I found out how to do this.

First calculate $P^3$

Then
$Pr(X_3=0)=(\frac{1}{4})(\frac{39}{108})+(\frac{1}{4})(\frac{48}{108})+(\frac{1}{2})(\frac{45}{108})=\frac{177}{432}$

$Pr(X_3=1)=(\frac{1}{4})(\frac{22}{108})+(\frac{1}{4})(\frac{16}{108})+(\frac{1}{2})(\frac{24}{108})=\frac{86}{432}$

$Pr(X_3=2)=(\frac{1}{4})(\frac{47}{108})+(\frac{1}{4})(\frac{44}{108})+(\frac{1}{2})(\frac{39}{108})=\frac{169}{432}$

Then $E(X_3)=(0)(\frac{177}{432})+(1)(\frac{86}{432})+(2)(\frac{169}{432})=\frac{424}{432}$

Can someone confirm whether this is correct or not? Thanks
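The arithmetic above can be checked mechanically. A minimal sketch using exact fractions, taking the $P^3$ entries exactly as quoted in the post (the original transition matrix itself was an image that did not survive, so only its cube's entries are used here):

```python
from fractions import Fraction as F

# Entries of P^3 as quoted in the post: P3[i][j] = Pr(X_3 = j | X_0 = i).
P3 = [[F(39, 108), F(22, 108), F(47, 108)],
      [F(48, 108), F(16, 108), F(44, 108)],
      [F(45, 108), F(24, 108), F(39, 108)]]

# Initial distribution: Pr(X_0=0) = Pr(X_0=1) = 1/4, so Pr(X_0=2) = 1/2.
s0 = [F(1, 4), F(1, 4), F(1, 2)]

# Distribution of X_3: Pr(X_3 = j) = sum_i Pr(X_0 = i) * P3[i][j].
dist = [sum(s0[i] * P3[i][j] for i in range(3)) for j in range(3)]

# Expectation over the states 0, 1, 2.
e_x3 = sum(j * dist[j] for j in range(3))
print(e_x3)  # prints 53/54, i.e. 424/432 in lowest terms
```

Note that $\frac{424}{432}$ reduces to $\frac{53}{54}$.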

9. ## Re: Markov Chain

Originally Posted by mr fantastic
You know $S_n = T^n S_0$. So calculate $S_3$ and use its entries to calculate $E(X_3)$.
From the given transition matrix it is clear that the poster is using the other convention (which, from my observation of posts on MHF, is the more common notation in undergraduate education today), where in your notation $S_n$ is a row vector and:

$S_n=S_{n-1}{\text{A}}=S_{0}{\text{A}}^n$

where $\text{A}$ is the transition matrix in their format where the rows sum to 1.
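To make the two conventions concrete: with a row-stochastic matrix (rows sum to 1), the state distribution is a row vector multiplied on the left. A quick sketch, using a made-up 3-state matrix purely for illustration (the thread's actual matrix is not shown):

```python
import numpy as np

# Hypothetical row-stochastic transition matrix (each row sums to 1),
# used only to illustrate the convention -- NOT the matrix from the problem.
A = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

s0 = np.array([0.25, 0.25, 0.5])  # row vector of initial state probabilities

# Propagate one step at a time: S_n = S_{n-1} A ...
s = s0.copy()
for _ in range(3):
    s = s @ A

# ... which agrees with the closed form S_3 = S_0 A^3.
assert np.allclose(s, s0 @ np.linalg.matrix_power(A, 3))
print(s)
```

In the column-vector convention, the transition matrix is the transpose and the vector is multiplied on the right instead.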

10. ## Re: Markov Chain

Hello,

Originally Posted by downthesun01
Ok. I think I found out how to do this.

First calculate $P^3$

Then
$Pr(X_3=0)=(\frac{1}{4})(\frac{39}{108})+(\frac{1}{4})(\frac{48}{108})+(\frac{1}{2})(\frac{45}{108})=\frac{177}{432}$

$Pr(X_3=1)=(\frac{1}{4})(\frac{22}{108})+(\frac{1}{4})(\frac{16}{108})+(\frac{1}{2})(\frac{24}{108})=\frac{86}{432}$

$Pr(X_3=2)=(\frac{1}{4})(\frac{47}{108})+(\frac{1}{4})(\frac{44}{108})+(\frac{1}{2})(\frac{39}{108})=\frac{169}{432}$

Then $E(X_3)=(0)(\frac{177}{432})+(1)(\frac{86}{432})+(2)(\frac{169}{432})=\frac{424}{432}$

Can someone confirm whether this is correct or not? Thanks
I haven't checked your calculations, but the reasoning is correct.

This is basically what Mr F suggested, though I tend to agree that his notation is unusual here.

11. ## Re: Markov Chain

Originally Posted by downthesun01
A Markov chain $X_{n},n\geq 0$ with states 0, 1, 2 has the transition probability matrix

If $P(X_{0}=0)=P(X_{0}=1)=\frac{1}{4}$, find $E[X_{3}]$

How do I go about solving this? It seems like something that would be very basic, yet I can't find any similar examples in my textbook.

Thanks
The initial distribution vector for the states is $S_0=[0.25,0.25,0.5]$

So:

$S_3=S_2{\text{A}}=(S_1{\text{A}}){\text{A}}=((S_0{\text{A}}){\text{A}}){\text{A}}=S_0{\text{A}}^3$

which gives the probability of each of the states, from which you compute the expectation.

CB
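The full pipeline $S_3 = S_0 A^3$ can be sketched in exact arithmetic. The matrix image from the original post did not survive, so the matrix below is an assumption, chosen because its cube reproduces exactly the $P^3$ entries quoted in post 8:

```python
from fractions import Fraction as F

# Assumed transition matrix (rows sum to 1); its cube matches the P^3
# entries quoted earlier in the thread, but the original image is lost.
P = [[F(1, 2), F(1, 3), F(1, 6)],
     [F(0),    F(1, 3), F(2, 3)],
     [F(1, 2), F(0),    F(1, 2)]]

def matmul(X, Y):
    """Multiply two 3x3 matrices of Fractions."""
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

P3 = matmul(matmul(P, P), P)      # P^3
s0 = [F(1, 4), F(1, 4), F(1, 2)]  # S_0, the initial distribution

# S_3 = S_0 P^3 (row-vector convention)
s3 = [sum(s0[i] * P3[i][j] for i in range(3)) for j in range(3)]

# E[X_3] over the states 0, 1, 2.
e_x3 = sum(j * s3[j] for j in range(3))
print(e_x3)  # prints 53/54
```

Using `Fraction` keeps every intermediate value exact, so the result can be compared digit-for-digit with the hand computation.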

12. ## Re: Markov Chain

Originally Posted by downthesun01
Ok. I think I found out how to do this.

First calculate $P^3$

Then
$Pr(X_3=0)=(\frac{1}{4})(\frac{39}{108})+(\frac{1}{4})(\frac{48}{108})+(\frac{1}{2})(\frac{45}{108})=\frac{177}{432}$

$Pr(X_3=1)=(\frac{1}{4})(\frac{22}{108})+(\frac{1}{4})(\frac{16}{108})+(\frac{1}{2})(\frac{24}{108})=\frac{86}{432}$

$Pr(X_3=2)=(\frac{1}{4})(\frac{47}{108})+(\frac{1}{4})(\frac{44}{108})+(\frac{1}{2})(\frac{39}{108})=\frac{169}{432}$

Then $E(X_3)=(0)(\frac{177}{432})+(1)(\frac{86}{432})+(2)(\frac{169}{432})=\frac{424}{432}$

Can someone confirm whether this is correct or not? Thanks
Well, I have checked your arithmetic and it is correct.

CB

13. ## Re: Markov Chain

Thank you for all of your responses. All much appreciated.