# Discrete Time Markov Chain

• Mar 21st 2011, 07:04 PM
kingwinner
Discrete Time Markov Chain
1) "Theorem: Given that the system is in State s at time n, the probability of making the transition from State i at time n + k to State j at time n + k + 1 is
P(X_{n+k} = i | X_n =s) * P(X_{n+k+1}=j | X_{n+k} =i)."

I believe the probability in question is
P(X_{n+k+1} =j ∩ X_{n+k}=i | X_n =s).
The source of the theorem did not include a proof, and I don't see why it must be true. How can we prove it formally, and where does the Markov property come in?

2) For a discrete time Markov Chain, is it always true that
P(X_{n+2}=j | X_n =i ∩ X_{n-1}=s) = P(X_{n+2}=j | X_n =i)?
I know that the Markov property gives this equality when the 2 is replaced by 1, i.e. for P(X_{n+1}=j | X_n =i ∩ X_{n-1}=s). But for a two-step transition, is the equality still true? If so, how can we formally prove it?

• Mar 22nd 2011, 07:56 AM
theodds
Quote:

Originally Posted by kingwinner
1) "Theorem: Given that the system is in State s at time n, the probability of making the transition from State i at time n + k to State j at time n + k + 1 is
P(X_{n+k} = i | X_n =s) * P(X_{n+k+1}=j | X_{n+k} =i)."

I believe the probability in question is
P(X_{n+k+1} =j ∩ X_{n+k}=i | X_n =s).
The source of the theorem did not include a proof, and I don't see why it must be true. How can we prove it formally, and where does the Markov property come in?

Yes, that is the probability in question. I'm not surprised the proof was omitted, since it is immediate from the expression you wrote down; you are just having trouble seeing the step (it happens to the best of us):

$\displaystyle
P(X_{n + k + 1} = j \cap X_{n + k} = i| X_n = s) = P(X_{n + k + 1} = j | X_{n + k} = i, X_n = s) P(X_{n + k} = i | X_n = s).
$

So we are just using the definition of conditional probability. The Markov property enters only at the end: it lets you drop the extra conditioning on X_n = s in the first factor, which reduces it to P(X_{n+k+1} = j | X_{n+k} = i), exactly the factor appearing in the theorem.
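The factorization above can be checked numerically. Here is a minimal sketch using a made-up 3-state transition matrix (the matrix and the choice of s, i, j, k are assumptions for illustration, not from the thread): a Monte Carlo estimate of the joint probability P(X_{n+k+1}=j ∩ X_{n+k}=i | X_n=s) is compared with the product P(X_{n+k}=i | X_n=s) · P(X_{n+k+1}=j | X_{n+k}=i) computed from matrix powers.

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1) -- an
# assumption for illustration, not taken from the original post.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

rng = np.random.default_rng(0)
s, i, j, k = 0, 1, 2, 3   # start in state s; look at times n+k and n+k+1

# Monte Carlo estimate of P(X_{n+k+1}=j, X_{n+k}=i | X_n=s):
# run the chain k steps from s, then one more step.
n_sim = 200_000
hits = 0
for _ in range(n_sim):
    x = s
    for _ in range(k):
        x = rng.choice(3, p=P[x])
    if x == i and rng.choice(3, p=P[x]) == j:
        hits += 1
joint_mc = hits / n_sim

# The theorem's product: (P^k)[s,i] is P(X_{n+k}=i | X_n=s),
# and P[i,j] is the one-step probability P(X_{n+k+1}=j | X_{n+k}=i).
theorem = np.linalg.matrix_power(P, k)[s, i] * P[i, j]

print(joint_mc, theorem)   # the two agree up to Monte Carlo error
```

The matrix-power factor uses the standard fact that k-step transition probabilities are entries of P^k; the simulation never uses the factorization, so the agreement is a genuine check.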

Quote:

2) For a discrete time Markov Chain, is it always true that
P(X_{n+2}=j | X_n =i ∩ X_{n-1}=s) = P(X_{n+2}=j | X_n =i)?
I know that the Markov property gives this equality when the 2 is replaced by 1, i.e. for P(X_{n+1}=j | X_n =i ∩ X_{n-1}=s). But for a two-step transition, is the equality still true? If so, how can we formally prove it?
Yes, it is still true. Condition on the intermediate state and apply the law of total probability:

$\displaystyle
P(X_{n+2}=j \mid X_n=i, X_{n-1}=s) = \sum_m P(X_{n+2}=j \mid X_{n+1}=m, X_n=i, X_{n-1}=s)\, P(X_{n+1}=m \mid X_n=i, X_{n-1}=s).
$

The Markov property lets you drop everything before the most recent state in each factor, so this equals

$\displaystyle
\sum_m P(X_{n+2}=j \mid X_{n+1}=m)\, P(X_{n+1}=m \mid X_n=i) = P(X_{n+2}=j \mid X_n=i).
$

So the equality holds: extra conditioning on the past can always be discarded as long as the most recent state is kept.
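The claim in 2) can also be sanity-checked numerically. A minimal sketch with an assumed 3-state transition matrix (not from the thread): for each possible earlier state s, estimate P(X_{n+2}=j | X_n=i, X_{n-1}=s) by simulation and compare with the two-step probability (P^2)[i,j], which ignores s entirely.

```python
import numpy as np

# Hypothetical 3-state chain -- illustration only, not from the post.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

rng = np.random.default_rng(1)
i, j = 1, 2
two_step = np.linalg.matrix_power(P, 2)[i, j]  # P(X_{n+2}=j | X_n=i)

# Estimate P(X_{n+2}=j | X_n=i, X_{n-1}=s) for each earlier state s.
estimates = []
for s in range(3):
    hits = trials = 0
    for _ in range(200_000):
        x = rng.choice(3, p=P[s])     # X_n given X_{n-1}=s
        if x != i:
            continue                  # condition on X_n = i
        trials += 1
        y = rng.choice(3, p=P[x])     # X_{n+1}
        z = rng.choice(3, p=P[y])     # X_{n+2}
        hits += (z == j)
    estimates.append(hits / trials)

print(estimates, two_step)  # all three estimates agree with two_step
```

That all three conditional estimates match (P^2)[i,j] regardless of s is exactly the statement being proved.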