
Math Help - random walk probability problem

  1. #1
    Newbie
    Joined
    Nov 2007
    From
    Sault Ste. Marie, Ontario
    Posts
    15

    random walk probability problem

    Hi, I'm working through some math ecology stuff and I'm trying to understand how the author comes up with the following:

    The author claims that this Bernoulli distribution:

    p(m,n) = \left({1 \over 2}\right)^n {n! \over ((n + m)/2)! \, ((n - m)/2)!}

    converges to this Gaussian distribution when n approaches infinity:

    \lim_{n \to \infty} p(m,n) = \left({2 \over \pi n}\right)^{1/2} \exp\!\left(-{m^2 \over 2n}\right)

    Does anybody know why that would be? The probability is for a particle arriving at point m on a number line after n moves.
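
    A quick numerical sanity check of that claim (a Python sketch using only the standard library; the helper names p_exact and p_gauss are just illustrative, not from the book):

        import math

        def p_exact(m, n):
            """Exact p(m,n): probability of ending at m after n unit steps (m and n of the same parity)."""
            a = (n + m) // 2                      # steps taken to the right
            return (0.5 ** n) * math.comb(n, a)   # (1/2)^n * n! / (a! b!)

        def p_gauss(m, n):
            """Claimed Gaussian limit: (2/(pi*n))^(1/2) * exp(-m^2/(2n))."""
            return math.sqrt(2.0 / (math.pi * n)) * math.exp(-m * m / (2.0 * n))

        n = 100
        for m in (0, 2, 6, 10, 20):
            print(f"m={m:3d}  exact={p_exact(m, n):.6f}  gaussian={p_gauss(m, n):.6f}")

    Already at n = 100 the two columns agree to two or three decimal places. The factor is 2/(pi n) rather than 1/(2 pi n) because m must have the same parity as n, so the probability mass sits on every other integer, doubling the density at the attainable points.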

    jjmclell

  2. #2
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by jjmclell View Post
    Hi, I'm working through some math ecology stuff and I'm trying to understand how the author comes up with the following:

    The author claims that this Bernoulli distribution:

    p(m,n) = \left({1 \over 2}\right)^n {n! \over ((n + m)/2)! \, ((n - m)/2)!}
    This is not a Bernoulli distribution that I or my references recognize. Please provide some context.

    Does anybody know why that would be? The probability is for a particle arriving at point m on a number line after n moves.
    More information on the random walk is needed.

    In the case of 0,1 steps with probabilities 1/2, 1/2, the distribution of the distance from the start after n steps is ~ Binomial(n, 1/2).
    This is the sum of n RV's, each with known mean and variance, and the central limit theorem then gives the asymptotic distribution of the distance as normal with the appropriate mean and variance. This is a Bernoulli process with p = 1/2.
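
    As a rough illustration of that correspondence (a Python sketch; the setup and variable names are mine, not from any reference): the number of rightward steps S is Binomial(n, 1/2), the position is m = 2S - n, so E[m] = 0 and Var[m] = 4(n/4) = n.

        import random
        import statistics

        n, trials = 50, 20000
        positions = []
        for _ in range(trials):
            s = sum(random.random() < 0.5 for _ in range(n))   # S ~ Binomial(n, 1/2)
            positions.append(2 * s - n)                        # position m = 2S - n

        print(statistics.mean(positions))       # close to 0
        print(statistics.variance(positions))   # close to n = 50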

    RonL

  3. #3
    Newbie
    Joined
    Nov 2007
    From
    Sault Ste. Marie, Ontario
    Posts
    15
    That Bernoulli distribution was derived as follows:

    You start at 0 on a number line, and for each step you can go one interval to the left or to the right. To reach any point, m, in n steps, you have to take a steps to the right and b steps to the left. With that said:

    a + b = n
    a - b = m

    Solving these two equations for a and b:

    a = (n + m)/2
    b = (n - m)/2

    The total number of possible paths by which to arrive at m in n steps is:

     {n! \over a!b!} = {n! \over ((n + m)/2)!((n - m)/2)!}

    If we want to get to any point, m, in n moves, then we know how many moves we must take to the right, a, and how many to the left, b. If a represents successes and b represents failures then:

    p(m,n) = ({1 \over 2})^a ({1 \over 2})^b {n!\over a!b!}

    Which equals:

     ({1 \over 2})^n {n! \over ((n + m)/2)!((n - m)/2)!}

    To reiterate my question: how is it that the distribution above can go from being a Bernoulli distribution to a Gaussian distribution as n approaches infinity? That's assuming the above distribution is in fact a Bernoulli distribution and I'm not completely wrong about everything I just said.
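
    That counting argument can be checked by brute force for a small n (a Python sketch; count_paths is just an illustrative helper): enumerate every length-n sequence of +1/-1 steps, count how many end at m, and compare with n!/(a! b!) and p(m, n).

        import math
        from itertools import product

        def count_paths(n, m):
            """Count +1/-1 paths of length n ending at position m by brute force."""
            return sum(1 for steps in product((-1, 1), repeat=n) if sum(steps) == m)

        n, m = 8, 2
        a, b = (n + m) // 2, (n - m) // 2                # right and left steps
        formula = math.factorial(n) // (math.factorial(a) * math.factorial(b))

        print(count_paths(n, m), formula)                # both give 56
        print(count_paths(n, m) / 2 ** n)                # p(2, 8) = 56/256 = 0.21875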

  4. #4
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by jjmclell View Post
    To reiterate my question: how is it that the distribution above can go from being a Bernoulli distribution to a Gaussian distribution as n approaches infinity?
    The position after n steps is the sum of n independent identically distributed RV's taking the values -1 and +1 with probabilities 1/2 and 1/2.

    Then the central limit theorem does the rest.

    This is essentially what I said before.

    In more detail, the mean for the change on a single step is 0, and the variance is 1. So the sum of n such RV's has asymptotic distribution N(0,n). You are not expected to provide a proof of this special case of the CLT (at least I presume not).
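
    A short simulation of that asymptotic statement (again a Python sketch, not something from RonL's post): if the endpoint is approximately N(0, n), then the endpoint divided by sqrt(n) should be approximately standard normal.

        import random
        import statistics

        n, trials = 200, 20000
        scaled = [sum(random.choice((-1, 1)) for _ in range(n)) / n ** 0.5
                  for _ in range(trials)]

        norm = statistics.NormalDist(0, 1)
        for z in (0.5, 1.0, 2.0):
            empirical = sum(abs(x) <= z for x in scaled) / trials
            theory = norm.cdf(z) - norm.cdf(-z)
            print(f"P(|endpoint|/sqrt(n) <= {z}):  simulated {empirical:.3f}   normal {theory:.3f}")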

    RonL

  5. #5
    Newbie
    Joined
    Nov 2007
    From
    Sault Ste. Marie, Ontario
    Posts
    15
    Thanks a lot... definitely wish I had paid more attention in stats class.
