Math Help - Please help with basic issues

  1. #1
    Senior Member
    Joined
    Feb 2008
    Posts
    410

    Please help with basic issues

    Hi, all.

    I took single-variable calc in high school, and will be moving on to multi-variable calc (calc 3) this summer at the community college. However, it surprises me that in all of this we have not yet discussed, nor will we discuss, probability theory. The only probability I learned was very basic stuff from grade-school algebra. About the only thing I remember is that the probability of n independent events all happening, each with an individual probability of p, is p^n. That's pretty much all I have to go on.

    That said, I came across a random problem which I think I've solved, but which I'd like someone to explain to me in more proper, formal language...

    Suppose you have a die with n sides, and you roll it again and again until you roll a certain single result (e.g. roll a six-sided die until you get a 4). Then you repeat the experiment ad nauseam. Over the many experiments, what is the average number of rolls it takes to get that result?

    My solution is as follows:

    Step one: The probability q of rolling an undesired result r times in a row is (\frac{n-1}{n})^r.

    Step two (*this is the step with which I need the most help*): To get an average number of throws, we must let q=\frac{1}{2}. I know this intuitively, but how do I show it on paper?

    Remaining steps:

    \frac{1}{2} = (\frac{n-1}{n})^r

    \ln{\frac{1}{2}} = \ln{(\frac{n-1}{n})^r}

    \ln{\frac{1}{2}} = r \ln{\frac{n-1}{n}}

    r = \frac{\ln{\frac{n-1}{n}}}{\ln{\frac{1}{2}}}

    Thus, r is the average number of throws.

    First, is my conclusion correct? If so, can someone please help me understand the steps?

    Thanks!

  2. #2
    Flow Master
    mr fantastic's Avatar
    Joined
    Dec 2007
    From
    Zeitgeist
    Posts
    16,948
    Thanks
    5
    Quote Originally Posted by hatsoff View Post
    [snip]
    You want the expected value of a random variable that follows a geometric distribution: Geometric distribution - Wikipedia, the free encyclopedia
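    For reference, the standard facts behind that link, stated here for convenience (X is the number of rolls needed, p the probability of rolling the target on a single roll, so here p = 1/n):

    P(X=r)=(1-p)^{r-1}p, \quad r=1,2,3,\dots

    E(X)=\frac{1}{p}=n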

  3. #3
    Senior Member
    Joined
    Feb 2008
    Posts
    410
    Quote Originally Posted by mr fantastic View Post
    You want the expected value of a random variable that follows a geometric distribution: Geometric distribution - Wikipedia, the free encyclopedia
    That's helpful, but still perplexing. Wikipedia gives this formula:



    Yet my answer is:

    \frac{\ln{\frac{n-1}{n}}}{\ln{\frac{1}{2}}}

    I.e., the numerator and denominator are swapped between my formula and Wikipedia's. I'm not sure what the problem is here, and I still don't know how to prove step #2 in my OP.

  4. #4
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by hatsoff View Post
    [snip]
    The average number of rolls to get a specified result is:

    \bar{N}=\sum_{r=1}^{\infty} r~ p(r)

    where p(r)=(1-p)^{r-1}p is the probability of getting the desired result for the first time on the r-th roll, and p=1/n is the probability of getting the desired result on a single roll.

    That your result is not right can be seen by evaluating it for some value of n.

    RonL
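    To make that concrete for n = 6, here is a minimal simulation sketch (Python, added for illustration only; the helper name rolls_until is arbitrary). The empirical average comes out near 6, while the formula proposed in the opening post evaluates to about 0.26:

    import math
    import random

    def rolls_until(target: int, sides: int = 6) -> int:
        """Roll a fair `sides`-sided die until `target` comes up; return the roll count."""
        count = 1
        while random.randint(1, sides) != target:
            count += 1
        return count

    trials = 100_000
    average = sum(rolls_until(4) for _ in range(trials)) / trials

    # The closed form proposed in the opening post, evaluated at n = 6.
    op_formula = math.log(5 / 6) / math.log(1 / 2)

    print(f"simulated average number of rolls: {average:.3f}")    # close to 6
    print(f"opening post's formula at n = 6  : {op_formula:.3f}")  # about 0.263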

  5. #5
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by hatsoff View Post
    [snip]
    Because step 2 in your post is pulled out of the air, and is not relevant.

    You might know it intuitively, but it's still wrong.

    RonL
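    A worked aside, added for clarity rather than taken from the original reply: solving (\frac{n-1}{n})^r=\frac{1}{2} finds the point where the chance of still not having rolled the target drops to one half, which is the median number of rolls, not the mean. For n = 6,

    r = \frac{\ln\frac{1}{2}}{\ln\frac{5}{6}} \approx 3.8,

    so the median is 4 rolls (the chance of no success is (\frac{5}{6})^3 \approx 0.58 after three rolls and (\frac{5}{6})^4 \approx 0.48 after four), while the mean, worked out below, is 6.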

  6. #6
    Senior Member
    Joined
    Feb 2008
    Posts
    410
    Quote Originally Posted by CaptainBlack View Post
    The average number of rolls to get a specified result is:

    \bar{N}=\sum_{r=1}^{\infty} r~ p(r)

    where p(r)=(1-p)^{r-1}p is the probability of getting the desired result for the first time on the r-th roll, and p=1/n is the probability of getting the desired result on a single roll.

    That your result is not right can be seen by evaluating it for some value of n.

    RonL
    So, let's say we use a six-sided die as an example, such that p=\frac{1}{6}

    Then:

    p(r)=(1-\frac{1}{6})^{r-1}\frac{1}{6}

    p(r)=(\frac{5}{6})^{r-1}\frac{1}{6}

    So:

    \bar{N}=\frac{1}{6}\sum_{r=1}^{\infty} r~ (\frac{5}{6})^{r-1}

    \bar{N}=\frac{1}{6}\sum_{r=1}^{\infty} r~ (\frac{5}{6})^{r}\frac{6}{5}

    \bar{N}=\frac{1}{5}\sum_{r=1}^{\infty} r~ (\frac{5}{6})^{r}

    That's as far as I can go at my current level. Would you mind finishing the solution, so I can compare it to my natural log equation?
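    Picking up where that stops, a quick numerical look (Python, added as a sketch and not part of the thread) shows the partial sums of \frac{1}{5}\sum_{r=1}^{\infty} r (\frac{5}{6})^{r} climbing toward 6:

    # Partial sums of (1/5) * sum_{r=1..N} r * (5/6)**r; they approach 6 as N grows.
    partial = 0.0
    for r in range(1, 201):
        partial += r * (5 / 6) ** r
        if r in (5, 10, 25, 50, 200):
            print(f"N = {r:3d}: {partial / 5:.4f}")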

  7. #7
    Senior Member
    Joined
    Feb 2008
    Posts
    410
    Quote Originally Posted by CaptainBlack View Post
    Because step 2 in your post is pulled out of the air, and is not relevant.

    You might know it intuitively, but it's still wrong.

    RonL
    This isn't rigorous, I know, but here is my thinking...

    If r throws of a die with n sides do not yield the target result, then the probability q of that happening is (\frac{n-1}{n})^r. Now, if we only throw the die r times such that q>\frac{1}{2}, then we should not expect to have thrown the target result. However, if q<\frac{1}{2}, then we should expect to throw the target result, because we have less than a 50% chance of not doing so. If we repeat this an infinite number of times, it seems to me that the average number of throws to reach the desired result should reflect the expected values for each complete trial.

    Or so I have been thinking.

  8. #8
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by CaptainBlack View Post
    [snip]
    And of course we have:

    \bar{N}=\sum_{r=1}^{\infty} r~ (1-p)^{r-1}p = \frac{1}{p}=n

    We could deduce this from more general principles, but it's nice to see the series summed anyway.

    RonL
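    One such general argument, sketched here as an addition the thread leaves unstated, conditions on the first roll: either it succeeds, or one roll has been spent and the situation resets. Writing E for the expected number of rolls,

    E = p\cdot 1 + (1-p)(1+E) = 1 + (1-p)E \implies pE = 1 \implies E = \frac{1}{p} = n.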

  9. #9
    Senior Member
    Joined
    Feb 2008
    Posts
    410
    Quote Originally Posted by CaptainBlack View Post
    [snip]
    Thank you. This helps a great deal. I see clearly now that I was mistaken, and I see where and why. However, though I do not doubt your conclusion, neither do I understand how you reached it. I therefore have a couple more questions...

    Firstly, how did you get from here:

    \sum_{r=1}^{\infty} r~ (1-p)^{r-1}p

    to here:

    \frac{1}{p}

    ? Also, would you be able to explain in newbie terms how you got this initial formula:

    \bar{N}=\sum_{r=1}^{\infty} r~ p(r)

    ?

    Thanks for all your help so far!

  10. #10
    Senior Member
    Joined
    Feb 2008
    Posts
    410
    Okay, guys, after thinking about it a while, I think I understand. The probability p(r) of first rolling the target number on trial r can be multiplied by each value of r and added up over all possible values of r (i.e., \sum_{r=1}^{\infty}). What troubles me is that \sum_{r=1}^{\infty}p(r) does not seem to come out to 1. Consider:

    p(r)=(1-p)^{r-1}p

    p=\frac{1}{n}

    p(r)=(1-\frac{1}{n})^{r-1}\frac{1}{n}

    p(r)=(1-\frac{1}{n})^{r}(\frac{n}{n-1})(\frac{1}{n})

    p(r)=(1-\frac{1}{n})^{r}\frac{1}{n-1}

    \sum_{r=1}^{\infty}p(r)=\sum_{r=1}^{\infty}(1-\frac{1}{n})^{r}\frac{1}{n-1}

    \sum_{r=1}^{\infty}p(r)=\frac{1}{n-1}\sum_{r=1}^{\infty}(1-\frac{1}{n})^{r}

    \sum_{r=1}^{\infty}p(r)=\frac{1}{n-1}\sum_{r=1}^{\infty}(\frac{n-1}{n})^{r}

    \sum_{r=1}^{\infty}p(r)=(\frac{1}{n-1})\frac{1}{1-\frac{n-1}{n}}

    \sum_{r=1}^{\infty}p(r)=\frac{n}{n-1}

    Am I doing something wrong here? Shouldn't this come out to be 1, and not \frac{n}{n-1}?
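    A worked check of that last sum, added as a sketch rather than part of the original post: the geometric series here starts at r = 1, so its closed form is \frac{x}{1-x} rather than \frac{1}{1-x}. With x = \frac{n-1}{n},

    \sum_{r=1}^{\infty}(\frac{n-1}{n})^{r} = \frac{\frac{n-1}{n}}{1-\frac{n-1}{n}} = \frac{\frac{n-1}{n}}{\frac{1}{n}} = n-1,

    so \sum_{r=1}^{\infty}p(r) = \frac{1}{n-1}(n-1) = 1, as expected.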

  11. #11
    Flow Master
    mr fantastic's Avatar
    Joined
    Dec 2007
    From
    Zeitgeist
    Posts
    16,948
    Thanks
    5
    Quote Originally Posted by hatsoff View Post
    Thank you. This helps a great deal. I see clearly now that I was mistaken, and I see where and why. However, though I do not doubt your conclusion, neither do I understand how you reached it. I therefore have a couple more questions...

    Firstly, how did you get from here:

    \sum_{r=1}^{\infty} r~ (1-p)^{r-1}p

    to here:

    \frac{1}{p}
    [snip]
    \sum_{r=1}^{\infty} r~ (1-p)^{r-1}p = p \sum_{r=1}^{\infty} r~ u^{r-1}

    where u = 1 - p. Note that |u| < 1.

    ----------------------------------------------------------------------

    Interlude:

    A standard result (the sum of an infinite geometric series):

    \sum_{r = 0}^{\infty} u^r = \frac{1}{1-u}, \, \, |u| < 1.

    If you differentiate both sides of this result with respect to u:

    \sum_{r = 1}^{\infty} r ~ u^{r-1} = \frac{1}{(1-u)^2}, \, \, |u| < 1.

    End interlude.

    -----------------------------------------------------------------------

    Therefore p \sum_{r=1}^{\infty} r~ u^{r-1} = \frac{p}{(1 - u)^2} = \frac{p}{p^2} = \frac{1}{p}.
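    As a quick numerical spot check of that interlude result (Python, added as a sketch; u = 5/6 matches the six-sided-die example), truncating the series after a couple of thousand terms already agrees with \frac{1}{(1-u)^2}:

    # Spot-check sum_{r>=1} r * u**(r-1) against 1 / (1 - u)**2 for u = 5/6.
    u = 5 / 6
    truncated = sum(r * u ** (r - 1) for r in range(1, 2001))
    closed_form = 1 / (1 - u) ** 2
    print(truncated, closed_form)   # both very close to 36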


    Quote Originally Posted by hatsoff View Post
    [snip]
    Also, would you be able to explain in newbie terms how you got this initial formula:

    \bar{N}=\sum_{r=1}^{\infty} r~ p(r)
    [snip]
    By definition, the expected value of a function f(r) is \sum_{r=1}^{\infty} f(r) ~ p(r). You're interested in the expected value of r, the number of rolls. Therefore ......