
Observed frequencies of transitions in a Markov Chain

  1. #1 Wolfram (Newbie)

    Observed frequencies of transitions in a Markov Chain

    Dear Forum,

    I came across the following remark in P. Billingsley, "Statistical Inference for Markov Processes", and I wonder whether someone can point me to further information:

    "It is possible to show that if (the transition probability) p_ij > 0, then P (n_ij = 0) goes to 0 exponentially as n --> infinity."

    Remark: We assume that we have observed n transitions of the Markov chain, and n_{ij} denotes the number of times we have observed the transition i \to j.

    I tried to find information about this online, but without success. Has anyone seen this or a related statement, online or in a book? Any help would be very much appreciated.
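
    For concreteness, the statement can be checked numerically on a small example. Here is a minimal Monte Carlo sketch (Python with numpy; the 3-state transition matrix and the choice i=0, j=1 are made up for illustration, not from Billingsley) that estimates P(n_{ij} = 0) for growing n; the estimates shrink roughly geometrically.

    [code]
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state chain; the transition of interest is i=0 -> j=1,
# with p_ij = P[0, 1] = 0.2 > 0.
P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])
i, j = 0, 1

def never_sees_ij(n, start=0):
    """Simulate n transitions; True iff the step i -> j never occurs."""
    x = start
    for _ in range(n):
        y = rng.choice(3, p=P[x])
        if x == i and y == j:
            return False
        x = y
    return True

trials = 20000
for n in (5, 10, 20, 40):
    est = sum(never_sees_ij(n) for _ in range(trials)) / trials
    print(f"n = {n:2d}   estimated P(n_ij = 0) = {est:.4f}")
    [/code]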

  2. #2 Laurent (MHF Contributor)

    Quote Originally Posted by Wolfram:
    "It is possible to show that if (the transition probability) p_ij > 0, then P (n_ij = 0) goes to 0 exponentially as n --> infinity."
    This is not always true. It holds iff the Markov chain is positive recurrent, I believe (and if state i is accessible from the starting point). Maybe your Markov chain has finitely many states?

    Here is a proof (that positive recurrence suffices). Let \tau be the time of the first transition from i to j, so that P(n_{ij}=0)=P(\tau>n). Note that P_x(\tau>1)=1 if x\neq i, and P_i(\tau>1)=1-p_{ij}. By the Markov property we have, for all n, P(\tau>n+1)=E[{\bf 1}_{(\tau>n)}P_{X_n}(\tau>1)]=P(\tau>n)-p_{ij}P(\tau>n, X_n=i), where the last equality follows from the previous remark. Thus,

    P(\tau>n+1)=P(\tau>n) (1-p_{ij}P(X_n=i|\tau>n)).

    For P(\tau>n) to decrease exponentially, it suffices that the last probability be bounded below by a positive constant for a positive proportion of the indices n (because of periodicity, we can't expect it to be positive for all n).
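
    To make the recursion concrete for a finite chain: if Q denotes the transition matrix with the single entry (i,j) set to 0 (a sub-stochastic matrix), then P(\tau>n) is the total mass of \mu Q^n, where \mu is the initial distribution. The following sketch (Python/numpy; the matrix is made up for illustration) computes both sides of the recursion exactly:

    [code]
import numpy as np

P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])
i, j = 0, 1
p_ij = P[i, j]

Q = P.copy()
Q[i, j] = 0.0                    # forbid the transition i -> j (sub-stochastic)
mu = np.array([1.0, 0.0, 0.0])   # start in state i, say

v = mu.copy()                    # v = mu Q^n, so P(tau > n) = v.sum()
for n in range(10):
    surv = v.sum()                     # P(tau > n)
    cond_i = v[i] / surv               # P(X_n = i | tau > n)
    lhs = (v @ Q).sum()                # P(tau > n+1)
    rhs = surv * (1 - p_ij * cond_i)   # right-hand side of the recursion
    print(f"n={n}: P(tau>{n+1}) = {lhs:.6f}, recursion gives {rhs:.6f}")
    v = v @ Q
    [/code]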

    After a short thought, it appears that the conditional distribution of (X_0,\ldots,X_n) given \{\tau>n\} is a Markov chain with the same transitions as the initial one, except that the transition from i to j is removed and the other transitions starting from i are normalized accordingly. Note that this doesn't work if (i,j) is the only possible transition starting from i, i.e. if p_{ij}=1; this special case could be dealt with by modifying the Markov chain at the very beginning so as to "erase" the (useless) state i (it's a bit messy to write out, but I hope you get the picture; anyway it's not the crucial part).
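
    In matrix terms, the conditioned chain described here is obtained by zeroing the (i,j) entry and renormalizing row i; a minimal sketch (assuming p_{ij}<1, i.e. excluding the special case just mentioned):

    [code]
import numpy as np

def conditioned_chain(P, i, j):
    """Remove the transition i -> j and renormalize row i accordingly.
    Assumes P[i, j] < 1, i.e. (i, j) is not the only transition out of i."""
    Pt = np.array(P, dtype=float)
    Pt[i, j] = 0.0
    Pt[i] /= Pt[i].sum()
    return Pt

P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])
print(conditioned_chain(P, 0, 1))
    [/code]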

    So let us consider this new Markov chain (without the transition (i,j)). If the state space was finite, it still is finite (same state space), so the Markov chain still is positive recurrent. In the general case, the Markov chain (restricted to the still accessible states) still is positive recurrent as well, I guess, but I can't think of an argument, so I'll let you figure that out if you want. Therefore, if d denotes the period of the new Markov chain (whose law is denoted by \widetilde{P}), then for some n_0\in\{0,\ldots,d-1\} we have \widetilde{P}(X_{kd+n_0}=i)\to_{k\to\infty}\pi(i) for an invariant probability measure \pi, with \pi(i)>0 because i is accessible from the initial state. So there are a constant c>0 and an integer k_0 such that, for all k>k_0, P(X_{kd+n_0}=i\mid\tau>kd+n_0)>c.
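
    With the renormalized matrix \widetilde{P} from the sketch above (which happens to be aperiodic, so d=1 and n_0=0), the convergence \widetilde{P}(X_n=i)\to\pi(i)>0 can be observed directly:

    [code]
import numpy as np

P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])
i, j = 0, 1
Pt = P.copy()
Pt[i, j] = 0.0
Pt[i] /= Pt[i].sum()             # conditioned chain, \widetilde{P} above

mu = np.array([0.0, 1.0, 0.0])   # an arbitrary starting distribution
for n in (1, 5, 20, 50):
    p_i = (mu @ np.linalg.matrix_power(Pt, n))[i]
    print(f"P~(X_{n} = i) = {p_i:.6f}")   # converges to pi(i) > 0
    [/code]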

    This gives the conclusion: we have P(\tau>n+1)\leq P(\tau>n) for all n, and P(\tau>n+1)\leq (1-p_{ij}c)P(\tau>n) for the indices n of the form kd+n_0 with k>k_0. Hence a geometric (i.e. exponential) rate of decrease (this is where we use p_{ij}>0). If this is not obvious to you, try to prove it.
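
    For a finite chain the geometric rate can even be identified: with Q as in the sketch after the recursion, the ratio P(\tau>n+1)/P(\tau>n) converges to the spectral radius \rho(Q)<1. (This is a standard Perron-Frobenius fact for finite irreducible sub-stochastic matrices; it is an addition for illustration, not part of the proof above.)

    [code]
import numpy as np

P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])
i, j = 0, 1
Q = P.copy()
Q[i, j] = 0.0                    # sub-stochastic: transition i -> j forbidden
mu = np.array([1.0, 0.0, 0.0])   # start in state i

rho = max(abs(np.linalg.eigvals(Q)))   # spectral radius of Q
v = mu.copy()
for n in range(30):
    prev = v.sum()               # P(tau > n)
    v = v @ Q                    # v.sum() is now P(tau > n+1)
print(f"last ratio P(tau>n+1)/P(tau>n): {v.sum() / prev:.6f}   rho(Q): {rho:.6f}")
    [/code]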

    This should answer your question; feel free to ask for additional explanations if you need them.

  3. #3 Wolfram (Newbie)
    Brilliant! Thank you very much for this excellent response.

  4. #4 Wolfram (Newbie)
    For the sake of completeness, I would like to add two remarks to Laurent's proof:

    (1) The case p_{ij} = 1: Unfortunately, in my case the Markov chain is given and I cannot modify it upfront. However, in this case the question whether we observe the transition (i,j) within n+1 steps reduces to the question whether we observe state i within n steps, and the probability of not observing state i within n steps is easily seen to decrease exponentially if \pi(i) > 0, which holds when the Markov chain is finite and irreducible.
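
    This reduction can also be computed exactly for a finite chain: the probability of never visiting state i in n steps is the total mass of \nu T^n, where T is the transition matrix with row and column i deleted and \nu is the initial distribution restricted to the remaining states (a sketch with a made-up matrix; if the chain starts at i the probability is simply 0):

    [code]
import numpy as np

P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])
i = 0
keep = [k for k in range(len(P)) if k != i]
T = P[np.ix_(keep, keep)]              # taboo matrix: row and column i deleted
nu = np.array([0.0, 1.0, 0.0])[keep]   # start in state 1, say (not i)

v = nu.copy()
for n in range(1, 11):
    v = v @ T
    print(f"P(no visit to state {i} in {n} steps) = {v.sum():.6f}")
    [/code]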

    (2) Positive recurrence of the modified Markov chain: Unfortunately, I believe the modified Markov chain is not necessarily positive recurrent. To see this, imagine that the transition (i,j) is the only link from a set of states S_1 to a disjoint set of states S_2 (while there are links in the other direction). However, we do not need positive recurrence of the modified Markov chain to conclude that \mathbb{P}(\widetilde{X}_n = i) \geq c > 0 for a positive proportion of indices n, where the tilde refers to the state evolution in the new Markov chain. Indeed, since the old Markov chain (before removing the transition (i,j)) had a strictly positive probability of reaching state i from any state k within |S| steps (where |S| denotes the number of states of the Markov chain), the same applies to the new Markov chain (this can be verified easily; see the sketch below). It follows that \mathbb{P}(\widetilde{X}_n = i) \geq c > 0 for a positive proportion of indices n.
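
    The easy verification at the end amounts to a graph statement: a path that reaches i for the first time never uses an edge leaving i, so removing the edge (i,j) cannot destroy access to i. A mechanical check of this (made-up matrix again):

    [code]
import numpy as np
from collections import deque

def can_reach(M, target):
    """Return the set of states from which `target` is reachable in the graph of M."""
    n = len(M)
    # Breadth-first search on reversed edges, starting from the target.
    seen = {target}
    queue = deque([target])
    while queue:
        y = queue.popleft()
        for x in range(n):
            if M[x, y] > 0 and x not in seen:
                seen.add(x)
                queue.append(x)
    return seen

P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])
i, j = 0, 1
Pt = P.copy()
Pt[i, j] = 0.0   # modified chain (before renormalizing row i)

print(can_reach(P, i) == can_reach(Pt, i))   # True: removing an edge *out of* i
                                             # cannot destroy access *to* i
    [/code]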

    But, in any case, these are only minor details.

  5. #5 Laurent (MHF Contributor)

    Quote Originally Posted by Laurent:
    In the general case, the Markov chain (restricted to the still accessible states) still is positive recurrent as well, I guess, but I can't think of an argument, so I'll let you figure that out if you want.
    Quote Originally Posted by Wolfram:
    (2) Positive recurrence of the modified Markov chain: Unfortunately, I believe the modified Markov chain is not necessarily positive recurrent. To see this, imagine that the transition (i,j) is the only link from a set of states S_1 to a disjoint set of states S_2 (while there are links in the other direction).
    I can't see how this would be a counter-example to the statement recalled above (cf. what I put between parentheses): since we only consider the subset of "still accessible states", in your case we would drop S_2 from the modified Markov chain, and the question becomes: is the Markov chain on S_1 still positive recurrent?

    Another way to ask the question: given a transient or null recurrent Markov chain on S_1, is it possible to turn it into a positive recurrent Markov chain by glueing it to a new set of states S_2, with only one transition from S_1 to S_2 (the transition (i,j)) and possibly several transitions from S_2 back to S_1? If S_1 is transient, the enlarged Markov chain is still transient (starting from i, the probability of never returning is only decreased by a constant positive factor due to the new transition). If S_1 is null recurrent... I would bet the enlarged Markov chain still is, but I can't find a proof.

    Quote Originally Posted by Wolfram:
    However, we do not need positive recurrence of the modified Markov chain to conclude that \mathbb{P}(\widetilde{X}_n = i) \geq c > 0 for a positive proportion of indices n, where the tilde refers to the state evolution in the new Markov chain. Indeed, since the old Markov chain (before removing the transition (i,j)) had a strictly positive probability of reaching state i from any state k within |S| steps (where |S| denotes the number of states of the Markov chain), the same applies to the new Markov chain (this can be verified easily). It follows that \mathbb{P}(\widetilde{X}_n = i) \geq c > 0 for a positive proportion of indices n.
    Wait, wouldn't you be assuming that there are finitely many states in the Markov chain? If that's the case, then positive recurrence is automatic (as I wrote in the previous post) on the subset of still accessible states. The problem only arises for Markov chains with infinitely many states.

  6. #6 Wolfram (Newbie)
    I was indeed assuming a finite Markov chain (sorry for not saying so at the beginning). I guess I misread your post: because of the sentence "If the state space was finite, it still is finite (same state space), so the Markov chain still is positive recurrent", I thought the restriction to the still accessible states in the next sentence ("In the general case, the Markov chain (restricted to the still accessible states) still is positive recurrent as well...") applied only to infinite Markov chains. Given your clarification, I completely agree with you.

