
Math Help - Discrete Time Markov Chain

  1. #1
    Senior Member
    Joined
    Jan 2009
    Posts
    404

    Question Discrete Time Markov Chain

1) "Theorem: Given that the system is in State s at time n, the probability of making the transition from State i at time n + k to State j at time n + k + 1 is
P(X_{n+k} = i | X_n = s) * P(X_{n+k+1} = j | X_{n+k} = i)."


    I believe the probability in question is
    P(X_{n+k+1} = j ∩ X_{n+k} = i | X_n = s).
    The source of the theorem did not include a proof, so I don't actually understand why this theorem must be true. How can we formally prove it? Where do we need to use the Markov property?



    2) For a discrete time Markov Chain, is it always true that
    P(X_{n+2} = j | X_n = i ∩ X_{n-1} = s) = P(X_{n+2} = j | X_n = i) ?
    I know that the Markov property implies the above equality would always be true if the 2 in X_{n+2} were replaced by 1. But in this case, when we have 2 (instead of 1), would the equality still be true? If so, how can we formally prove it?


    Thanks in advance!

  2. #2
    Senior Member
    Joined
    Oct 2009
    Posts
    340
    Quote Originally Posted by kingwinner
    1) "Theorem: Given that the system is in State s at time n, the probability of making the transition from State i at time n + k to State j at time n + k + 1 is
    P(X_{n+k} = i | X_n = s) * P(X_{n+k+1} = j | X_{n+k} = i)."

    I believe the probability in question is
    P(X_{n+k+1} = j ∩ X_{n+k} = i | X_n = s).
    The source of the theorem did not include a proof, so I don't actually understand why this theorem must be true. How can we formally prove it? Where do we need to use the Markov property?
    Yes, that is what it is asking. I'm not surprised the proof was omitted since it is immediate from the expression you wrote down; you are just having trouble seeing the step (it happens to the best of us):

    \displaystyle P(X_{n+k+1} = j \cap X_{n+k} = i \mid X_n = s) = P(X_{n+k+1} = j \mid X_{n+k} = i, X_n = s)\, P(X_{n+k} = i \mid X_n = s).

    So the first step is just the definition of conditional probability - the conditioning on X_n doesn't change anything in that regard. Then the Markov property is what lets you drop X_n = s from the first factor: P(X_{n+k+1} = j | X_{n+k} = i, X_n = s) = P(X_{n+k+1} = j | X_{n+k} = i), which gives exactly the product in the theorem.
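    If you want to convince yourself numerically, here is a quick sanity check (my own sketch, not from any textbook — the 3-state transition matrix and the choice of s, i, j, k are made up for illustration). It computes the left side by brute-force enumeration of all paths and compares it with the product of the two conditional probabilities:

    ```python
    # Numerical check of
    # P(X_{n+k+1}=j, X_{n+k}=i | X_n=s) = P(X_{n+k}=i | X_n=s) * P(X_{n+k+1}=j | X_{n+k}=i)
    # for an arbitrary (made-up) 3-state transition matrix.
    import itertools
    import numpy as np

    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.4, 0.4, 0.2]])  # rows sum to 1

    s, i, j, k = 0, 2, 1, 3  # start in state s at time n; transition i -> j at time n+k

    # Left side: sum the probabilities of all paths of length k+1 out of s
    # that visit i at step k and j at step k+1.
    lhs = 0.0
    for path in itertools.product(range(3), repeat=k + 1):
        if path[k - 1] == i and path[k] == j:
            prob, prev = 1.0, s
            for state in path:
                prob *= P[prev, state]
                prev = state
            lhs += prob

    # Right side: k-step transition probability times the one-step probability.
    rhs = np.linalg.matrix_power(P, k)[s, i] * P[i, j]

    print(lhs, rhs)
    ```

    The two printed values agree (up to floating-point rounding), for any row-stochastic matrix you plug in.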

    2) For a discrete time Markov Chain, is it always true that
    P(X_{n+2} = j | X_n = i ∩ X_{n-1} = s) = P(X_{n+2} = j | X_n = i) ?
    I know that the Markov property implies the above equality would always be true if the 2 in X_{n+2} were replaced by 1. But in this case, when we have 2 (instead of 1), would the equality still be true? If so, how can we formally prove it?
    Thanks in advance!
    Yes, this is true. A simple way to get at something like this would be

    \displaystyle P(X_3 = x_3 \mid X_1 = x_1, X_0 = x_0) = \sum_{x_2} P(X_3 = x_3, X_2 = x_2 \mid X_1 = x_1, X_0 = x_0)

    then expand each summand with the chain rule and apply the Markov property twice to get rid of the conditioning on X_0; the sum collapses to P(X_3 = x_3 | X_1 = x_1).
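    The same kind of numerical check works here too (again my own sketch with a made-up 3-state matrix): condition on both X_{n-1} = s and X_n = i, carry out the sum over the intermediate state explicitly, and compare with the two-step transition probability (P^2)[i, j], which is the right-hand side.

    ```python
    # Check that the extra condition X_{n-1}=s drops out:
    # P(X_{n+2}=j | X_n=i, X_{n-1}=s) should equal the two-step probability (P^2)[i, j].
    # The transition matrix and the states s, i, j are made up for illustration.
    import numpy as np

    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.4, 0.4, 0.2]])
    s, i, j = 2, 0, 1

    # Start the chain at s (time n-1). Conditional probability = joint / marginal,
    # with the joint summed over the intermediate state x at time n+1.
    num = sum(P[s, i] * P[i, x] * P[x, j] for x in range(3))
    den = P[s, i]
    lhs = num / den  # the factor P[s, i] cancels -- exactly the "fiddling" above

    rhs = np.linalg.matrix_power(P, 2)[i, j]

    print(lhs, rhs)
    ```

    The cancellation of P[s, i] in the code is precisely the conditional-probability manipulation that removes the conditioning on the earlier state.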

