
Math Help - Random Walk

  1. #1
    Junior Member
    Joined
    Nov 2009
    Posts
    57

    Random Walk

    If w = (w_0, w_1, w_2, \dots) is the trajectory of a simple symmetric random walk on \mathbb{Z}^3 (i.e. a sequence of vectors), how do you prove that for any \varepsilon > 0:

    P\{\|w_n\|\, n^{\varepsilon - 1/6} \to \infty \text{ as } n \to \infty\} = 1,

    where \|w_n\| is the norm of the walk's position at step n?

    My understanding is that the increments w_0, w_1 - w_0, w_2 - w_1, \dots are independent, identically distributed random vectors. Since we are in \mathbb{Z}^3, the probability mass function is: for y \in \mathbb{Z}^3, p(y) = 1/6 if y = \pm e_i, i = 1, 2, 3 (the standard basis vectors), and zero otherwise.
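    For concreteness, this setup is easy to simulate. The short Python sketch below (illustrative only; `walk` is an ad hoc name, not from the book) samples a trajectory from exactly this probability mass function:

    ```python
    import random

    # The six possible increments +/- e_i of the simple symmetric walk on Z^3,
    # each taken with probability 1/6.
    STEPS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

    def walk(n, seed=0):
        """Return the trajectory (w_0, w_1, ..., w_n) started at the origin."""
        rng = random.Random(seed)
        w = [(0, 0, 0)]
        for _ in range(n):
            x, y, z = w[-1]
            dx, dy, dz = rng.choice(STEPS)
            w.append((x + dx, y + dy, z + dz))
        return w

    traj = walk(1000)
    ```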

    Any help would be most appreciated.
    Follow Math Help Forum on Facebook and Google+

  2. #2
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Hi,
    the following is not a true proof, but it shows where the \frac{1}{6} comes from.

    By the CLT, \frac{W_n}{\sqrt{n}} converges in distribution to a (3-dimensional) normal random vector; hence \frac{W_n}{n^{1/6-\varepsilon}}=n^{\frac{1}{3}+\varepsilon}\frac{W_n}{\sqrt{n}} tends to \infty in probability. At this point, it seems that \frac{1}{6} could even be turned into \frac{1}{2}. But you need almost-sure convergence, not convergence in probability... And dimension 3 has not been involved yet. Note that the result would even hold in dimensions 1 and 2 in probability (i.e. \|W_n\|>Cn^{1/6} with probability tending to 1, for any C), but it must be false in the almost-sure sense because of recurrence.
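    The diffusive \sqrt{n} scaling behind this CLT step can even be checked exactly for small n: E\|W_n\|^2 = n, since the increments are mean-zero unit vectors whose cross terms vanish in expectation. A brute-force Python check (illustrative only; the function name is mine):

    ```python
    from fractions import Fraction
    from itertools import product

    # Exact check that E||W_n||^2 = n for the simple symmetric walk on Z^3,
    # by enumerating all 6^n equally likely paths (feasible for small n only).
    STEPS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

    def mean_square_norm(n):
        """E||W_n||^2, computed exactly with rational arithmetic."""
        total = Fraction(0)
        for path in product(STEPS, repeat=n):
            pos = [0, 0, 0]
            for dx, dy, dz in path:
                pos[0] += dx; pos[1] += dy; pos[2] += dz
            total += pos[0]**2 + pos[1]**2 + pos[2]**2
        return total / 6**n
    ```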


    Suppose (this is where the argument becomes heuristic) that the three components of W_n are independent. If \frac{\|W_n\|}{n^{1/6-\varepsilon}}<C, then each of the three components satisfies \frac{|W^{(i)}_n|}{n^{1/6-\varepsilon}}<C, and by the (1-dimensional) CLT the probability of this bound is of order P(|Z|<\frac{C}{n^{1/3+\varepsilon}})\simeq \frac{C'}{n^{1/3+\varepsilon}}, where Z is a standard normal variable. Since the three components are independent, we get a third power: P(\frac{\|W_n\|}{n^{1/6-\varepsilon}}<C)\simeq \frac{C'}{n^{1+3\varepsilon}}.

    And at that point you can apply the Borel-Cantelli lemma since the right-hand side gives a convergent series.

    (Some of the CLT estimates above only hold in a slightly looser sense, and some small \varepsilon' would probably need to be introduced, but I just wanted to give a rough idea of the reason for the 1/6. It comes from 3\left(\frac{1}{2}-\left(\frac{1}{6}-\varepsilon\right)\right)=1+3\varepsilon>1, which leads to a convergent series.)

    The real proof should avoid the independence problem... You probably already know something about 3-dimensional random walks, so perhaps you know a way? If you want more help, could you please specify whether this is a stand-alone question, or whether there were questions before it? (What is the context? What have you already learned about random walks?) And tell us what you tried.

  3. #3
    Junior Member
    Joined
    Nov 2009
    Posts
    57
    Hi,

    Thanks for taking the time to answer. Using your suggestions I was able to find the missing bits:

    - The problem is a single question taken from Chapter 10 of Koralov & Sinai's book "Theory of Probability and Random Processes" (p. 152; see, for example, on Google Books). The chapter deals with the Central Limit Theorem and its applications to random walks, recurrent events and large deviations.

    - One of the previous (unrelated) exercises deals with the generalization of the random walk to the multidimensional case: given a sequence of i.i.d. random vectors (v_n)_n with expectation vector m (taken to be 0 to simplify) and correlation matrix D, the normalized sums \frac{1}{\sqrt{n}}\sum_{k=1}^n v_k converge in distribution to a Gaussian vector with the same correlation matrix D.

    Using this result, in the case of the 3-dimensional random walk, the correlation matrix of the increment vector is diagonal (this is immediate from the probability mass function). That is, the vector components are uncorrelated (yet not independent).

    This ensures that the components of the limiting Gaussian vector are pairwise independent, since for multivariate Gaussian variables independence is implied by being uncorrelated. This makes it possible to pass to the independent case and apply your solution as it stands to the limiting distribution.
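    That the correlation matrix D of one increment is diagonal, in fact D = \frac{1}{3}I, can be verified in a few lines of exact arithmetic (illustrative Python, variable names are mine):

    ```python
    from fractions import Fraction

    # Covariance (here called correlation) matrix of one increment of the walk:
    # each step is +/- e_i with probability 1/6, so D should be (1/3) * I,
    # i.e. the components are uncorrelated with variance 1/3 each.
    steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    p = Fraction(1, 6)

    mean = [sum(p * s[i] for s in steps) for i in range(3)]
    D = [[sum(p * s[i] * s[j] for s in steps) - mean[i] * mean[j]
          for j in range(3)] for i in range(3)]
    ```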

    Hope this helps.
    Thanks again for your help.

  4. #4
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by akbar View Post
    Using this result, in the case of the 3-dimensional random walk, the correlation matrix of the increment vector is diagonal (this is immediate from the probability mass function). That is, the vector components are uncorrelated (yet not independent).

    This ensures that the components of the limiting Gaussian vector are pairwise independent, since for multivariate Gaussian variables independence is implied by being uncorrelated. This makes it possible to pass to the independent case and apply your solution as it stands to the limiting distribution.
    I may be wrong, but I doubt that this argument (independence at the limit only) is enough to turn the sketch into a rigorous proof: the sketch used independence of the components at every finite n, not just in the limiting distribution.

    On the other hand, I could find excerpts from the book on Google Books, and I noticed that the result on p138 can be used for a simpler proof. This result says that P(W_{2n}=0)\sim_n c\, n^{-d/2}.

    Since the walk is symmetric, it is very tempting to say that P(W_{2n}=x) is maximal at x=0, i.e. that there are at least as many paths of length 2n from 0 back to 0 as from 0 to x when x\neq 0. There is probably a simple proof of that; more about this below.

    Then (assuming this), we have P(|W_{2n}|<n^{\frac{1}{6}-\varepsilon})\leq \#B(0,n^{\frac{1}{6}-\varepsilon}) P(W_{2n}=0) (where \#B(0,n^{\frac{1}{6}-\varepsilon}) is the number of vertices inside the ball of radius n^{\frac{1}{6}-\varepsilon} centered at 0). In dimension 3, \#B(0,n^{\frac{1}{6}-\varepsilon})\sim_n C (n^{\frac{1}{6}-\varepsilon})^3 = C n^{\frac{1}{2}-3\varepsilon}, hence (using the result p138):

    P(|W_{2n}|<n^{\frac{1}{6}-\varepsilon})\leq C' n^{\frac{1}{2}-3\varepsilon}n^{-3/2}=\frac{C'}{n^{1+3\varepsilon}}

    and you can apply Borel-Cantelli. That's it.
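    For small n these return probabilities can be computed exactly by convolving the step distribution, which gives a concrete feel for the n^{-3/2} decay from p138 (illustrative Python, names are mine):

    ```python
    from collections import defaultdict
    from fractions import Fraction

    # Exact return probabilities P(W_{2n} = 0) for the simple symmetric walk on
    # Z^3, obtained by repeated convolution with the uniform step distribution.
    STEPS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

    def distribution(n):
        """Exact distribution of W_n as a dict {site: probability}."""
        dist = {(0, 0, 0): Fraction(1)}
        for _ in range(n):
            new = defaultdict(Fraction)
            for (x, y, z), pr in dist.items():
                for dx, dy, dz in STEPS:
                    new[(x + dx, y + dy, z + dz)] += pr / 6
            dist = new
        return dict(dist)

    returns = {n: distribution(2 * n).get((0, 0, 0), Fraction(0)) for n in (1, 2, 3)}
    ```

    The successive values 1/6, 5/72, 155/3888 already decrease roughly like n^{-3/2}.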

    Coming back to the inequality P(W_{2n}=x)\leq P(W_{2n}=0), one way to prove it is again based on p138. You have (cf. the book, with d=3):

    P(W_{2n}=0)=\frac{1}{(2\pi)^d}\frac{1}{d^{2n}}\int_{-\pi}^\pi\cdots\int_{-\pi}^\pi (\cos\lambda_1+\cdots+\cos\lambda_d)^{2n}\,d\lambda_1\cdots d\lambda_d.

    Using the same proof but multiplying by e^{-i(x,\lambda)} before integrating, we get (the result is real, so we can take the real part):

    P(W_{2n}=x)=\frac{1}{(2\pi)^d}\frac{1}{d^{2n}}\int_{-\pi}^\pi\cdots\int_{-\pi}^\pi (\cos\lambda_1+\cdots+\cos\lambda_d)^{2n}\cos(\lambda_1 x_1+\cdots+\lambda_d x_d)\,d\lambda_1\cdots d\lambda_d.

    From this formula it is clear (bounding the rightmost cosine by 1, since the other factor is nonnegative) that indeed P(W_{2n}=x)\leq P(W_{2n}=0).
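    This inequality is also easy to verify numerically for small n, by building the exact distribution of W_{2n} through convolution with the step distribution and checking that its maximum sits at the origin (illustrative sketch only):

    ```python
    from collections import defaultdict
    from fractions import Fraction

    # Exact distribution of W_6 (2n = 6 steps) of the simple symmetric walk on
    # Z^3; the maximum probability should be attained at the origin.
    STEPS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

    dist = {(0, 0, 0): Fraction(1)}
    for _ in range(6):
        new = defaultdict(Fraction)
        for (x, y, z), pr in dist.items():
            for dx, dy, dz in STEPS:
                new[(x + dx, y + dy, z + dz)] += pr / 6
        dist = new
    ```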

  5. #5
    Junior Member
    Joined
    Nov 2009
    Posts
    57
    This is most likely the intended answer, since the order of the exercises in this book tends to follow the progression within each chapter, and that exercise corresponds to the section you mentioned. I have also been trying to apply that same result to the problem. Note that your formula for P(W_{2n}=x) can be generalized directly to P(W_n=x) (with P(W_{2p+1}=x)\leq P(W_{2p}=0)).

    On the other hand, the solution using Gaussian random vectors uses the CLT, and it is possible that dimension 3 was also chosen so that one can easily build the correlation matrix from the probability mass function. Once we apply the CLT, we reason only in terms of the limiting distribution. Maybe you could specify in what way you feel the proof cannot be made rigorous with this approach.

    Thanks in any case for both solutions.

  6. #6
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by akbar View Post
    Maybe you could specify in what way you feel the proof cannot be made rigorous with this approach.
    Sorry, on reflection it should indeed work, but it requires more care than the other proof.

    Thanks in any case for both solutions.
    You're welcome, it was a pleasure!
