Hi, all.

I took single-variable calc in high school, and will be moving on to multi-variable calc (calc 3) this summer at the community college. However, it surprises me that in all of this we have not yet discussed, nor will we discuss, probability theory. The only probability I learned was very basic stuff from grade-school algebra. About the only thing I remember is that the probability of $\displaystyle n$ independent events all happening in $\displaystyle n$ instances, each with individual probability $\displaystyle p$, is $\displaystyle p^n$. That's pretty much all I have to go on.

That said, I came across a random problem which I think I've solved, but which I'd like someone to explain to me in more proper, formal language...

Suppose you have a die with $\displaystyle n$ sides, and you roll it again and again, until you roll a certain single-number result (e.g. roll a six-sided die until you get a 4). Then you repeat the experiment ad nauseam. Over the many experiments, what is the average number of rolls it will take for you to get that certain result?

My solution is as follows:

Step one: The probability $\displaystyle q$ of rolling an undesired result $\displaystyle r$ times in a row is $\displaystyle (\frac{n-1}{n})^r$.

Step two (*this is the step with which I need the most help*): To get an average number of throws, we must let $\displaystyle q=\frac{1}{2}$. I know this intuitively, but how do I show it on paper?

Remaining steps:

$\displaystyle \frac{1}{2} = (\frac{n-1}{n})^r$

$\displaystyle \ln{\frac{1}{2}} = \ln{(\frac{n-1}{n})^r}$

$\displaystyle \ln{\frac{1}{2}} = r \ln{\frac{n-1}{n}}$

$\displaystyle r = \frac{\ln{\frac{n-1}{n}}}{\ln{\frac{1}{2}}}$

Thus, $\displaystyle r$ is the average number of throws.

Thanks!
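[Editorial aside, not part of the thread: a closed-form answer like the one above can be sanity-checked by simulating the experiment directly. A minimal sketch in Python; the function name and parameters are my own.]

```python
import random

def rolls_until(target, n, rng):
    """Roll a fair n-sided die until `target` appears; return the roll count."""
    count = 0
    while True:
        count += 1
        if rng.randint(1, n) == target:
            return count

rng = random.Random(42)  # fixed seed for reproducibility
n = 6
trials = 100_000
average = sum(rolls_until(4, n, rng) for _ in range(trials)) / trials
print(average)  # empirically close to 6 for a six-sided die
```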

2. Originally Posted by hatsoff
Hi, all.

[snip]

Suppose you have a die with $\displaystyle n$ sides, and you roll it again and again, until you roll a certain single-number result (e.g. roll a six-sided die until you get a 4). Then you repeat the experiment ad nauseam. Over the many experiments, what is the average number of rolls it will take for you to get that certain result?

[snip]
You want the expected value of a random variable that follows a geometric distribution: Geometric distribution - Wikipedia, the free encyclopedia

3. Originally Posted by mr fantastic
You want the expected value of a random varibale that follows a geometric distribution: Geometric distribution - Wikipedia, the free encyclopedia
That's helpful, but still perplexing. Wikipedia gives this formula:

$\displaystyle \frac{\ln{\frac{1}{2}}}{\ln{\frac{n-1}{n}}}$

That is, the numerator and denominator are reversed between my formula and Wikipedia's. I'm not sure what the problem is here, and I still don't know how to prove step 2 in my OP.

4. Originally Posted by hatsoff
Hi, all.

[snip]

$\displaystyle r = \frac{\ln{\frac{n-1}{n}}}{\ln{\frac{1}{2}}}$

Thus, $\displaystyle r$ is the average number of throws.

[snip]
The average number of rolls to get a specified result is:

$\displaystyle \bar{N}=\sum_{r=1}^{\infty} r~ p(r)$

where $\displaystyle p(r)=(1-p)^{r-1}p$ is the probability of getting the desired result for the first time on the $\displaystyle r$'th roll, and $\displaystyle p=1/n$ is the probability of getting the desired result on a single roll.

That your result is not right can be seen by evaluating it for some value of $\displaystyle n$.

RonL
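[Editorial aside: CaptainBlack's suggestion, evaluating both expressions for a specific $\displaystyle n$, is easy to carry out numerically. This check is not part of the thread. For $\displaystyle n=6$, the log formula gives roughly a quarter of a roll, while the series $\displaystyle \bar{N}$ evaluates to about 6.]

```python
import math

n = 6
p = 1 / n

# The log formula from the original post:
r_formula = math.log((n - 1) / n) / math.log(1 / 2)

# The series for the mean, truncated far enough out to have converged:
mean_series = sum(r * (1 - p) ** (r - 1) * p for r in range(1, 1000))

print(r_formula)    # about 0.26 -- clearly not an average roll count
print(mean_series)  # about 6.0
```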

5. Originally Posted by hatsoff
That's helpful, but still perplexing. Wikipedia gives this formula:

$\displaystyle \frac{\ln{\frac{1}{2}}}{\ln{\frac{n-1}{n}}}$

That is, the numerator and denominator are reversed between my formula and Wikipedia's. I'm not sure what the problem is here, and I still don't know how to prove step 2 in my OP.
Because step 2 in your post is pulled out of the air, and is not relevant.

You might know it intuitively, but it's still wrong.

RonL

6. Originally Posted by CaptainBlack
The average number of rolls to get a specified result is:

$\displaystyle \bar{N}=\sum_{r=1}^{\infty} r~ p(r)$

where $\displaystyle p(r)=(1-p)^{r-1}p$ is the probability of getting the desired result for the first time on the $\displaystyle r$'th roll, and $\displaystyle p=1/n$ is the probability of getting the desired result on a single roll.

[snip]
So, let's say we use a six-sided die as an example, such that $\displaystyle p=\frac{1}{6}$

Then:

$\displaystyle p(r)=(1-\frac{1}{6})^{r-1}\frac{1}{6}$

$\displaystyle p(r)=(\frac{5}{6})^{r-1}\frac{1}{6}$

So:

$\displaystyle \bar{N}=\frac{1}{6}\sum_{r=1}^{\infty} r~ (\frac{5}{6})^{r-1}$

$\displaystyle \bar{N}=\frac{1}{6}\sum_{r=1}^{\infty} r~ (\frac{5}{6})^{r}\frac{6}{5}$

$\displaystyle \bar{N}=\frac{1}{5}\sum_{r=1}^{\infty} r~ (\frac{5}{6})^{r}$

That's as far as I can go at my current level. Would you mind finishing the solution, so I can compare it to my natural log equation?
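[Editorial aside, not from the thread: the remaining step uses the identity $\displaystyle \sum_{r=1}^{\infty} r x^{r} = \frac{x}{(1-x)^2}$ for $\displaystyle |x|<1$, which follows from differentiating the geometric series, as mr fantastic shows further down the thread. With $\displaystyle x=\frac{5}{6}$:

$\displaystyle \bar{N}=\frac{1}{5}\sum_{r=1}^{\infty} r \left(\frac{5}{6}\right)^{r}=\frac{1}{5}\cdot\frac{5/6}{\left(1-\frac{5}{6}\right)^{2}}=\frac{1}{5}\cdot\frac{5/6}{1/36}=\frac{1}{5}\cdot 30=6$

which agrees with the later answer $\displaystyle \bar{N}=n$.]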

7. Originally Posted by CaptainBlack
Because step 2 in your post is pulled out of the air, and is not relevant.

You might know it intuitively, but it's still wrong.

RonL
This isn't rigorous, I know, but here is my thinking...

If $\displaystyle r$ throws of a die with $\displaystyle n$ sides do not yield the target result, then the probability $\displaystyle q$ of that happening is $\displaystyle (\frac{n-1}{n})^r$. Now, if we throw the die only $\displaystyle r$ times such that $\displaystyle q>\frac{1}{2}$, then we should not expect to have thrown the target result. However, if $\displaystyle q<\frac{1}{2}$, then we should expect to throw the target result, because we have less than a 50% chance of not doing so. If we repeat this an infinite number of times, it seems to me that the average number of throws $\displaystyle r$ needed to reach the desired result should reflect the expected values for each complete trial.

Or so I have been thinking.
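[Editorial aside, not part of the thread: the condition $\displaystyle q=\frac{1}{2}$ does compute something meaningful. Up to rounding, it picks out the *median* number of throws, the $\displaystyle r$ by which half of all trials have already succeeded, and for this distribution the median and the mean differ. A quick Python check for a six-sided die:]

```python
import math

n = 6
p = 1 / n

# Solving (1 - p)^r = 1/2 for r, the condition from step 2 of the OP:
r_half = math.log(1 / 2) / math.log(1 - p)

# The actual median: smallest integer r with P(success within r rolls) >= 1/2.
median = 1
while 1 - (1 - p) ** median < 1 / 2:
    median += 1

# The mean of a geometric distribution is 1/p.
mean = 1 / p

print(r_half, median, mean)  # about 3.8, 4, 6
```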

8. Originally Posted by CaptainBlack
The average number of rolls to get a specified result is:

$\displaystyle \bar{N}=\sum_{r=1}^{\infty} r~ p(r)$

[snip]
And of course we have:

$\displaystyle \bar{N}=\sum_{r=1}^{\infty} r~ (1-p)^{r-1}p = \frac{1}{p}=n$

This we could deduce from more general principles, but it's nice to see the series summed anyway.

RonL

9. Originally Posted by CaptainBlack
And of course we have:

$\displaystyle \bar{N}=\sum_{r=1}^{\infty} r~ (1-p)^{r-1}p = \frac{1}{p}=n$

This we could deduce from more general principles, but it's nice to see the series summed anyway.

RonL
Thank you. This helps a great deal. I see clearly now that I was mistaken, and I see where and why. However, though I do not doubt your conclusion, neither do I understand how you reached it. I therefore have a couple more questions...

Firstly, how did you get from here:

$\displaystyle \sum_{r=1}^{\infty} r~ (1-p)^{r-1}p$

to here:

$\displaystyle \frac{1}{p}$

? Also, would you be able to explain in newbie terms how you got this initial formula:

$\displaystyle \bar{N}=\sum_{r=1}^{\infty} r~ p(r)$

?

Thanks for all your help so far!

10. Okay, guys, after thinking about it a while, I think I understand. The probability $\displaystyle p(r)$ of first rolling a certain number on the $\displaystyle r$'th trial can be multiplied by each value $\displaystyle r$ and added up over all possible values of $\displaystyle r$ (i.e., $\displaystyle \sum_{r=1}^{\infty}$). What troubles me is that $\displaystyle \sum_{r=1}^{\infty}p(r)$ does not seem to calculate out to 1. Consider:

$\displaystyle p(r)=(1-p)^{r-1}p$

$\displaystyle p=\frac{1}{n}$

$\displaystyle p(r)=(1-\frac{1}{n})^{r-1}\frac{1}{n}$

$\displaystyle p(r)=(1-\frac{1}{n})^{r}(\frac{n}{n-1})(\frac{1}{n})$

$\displaystyle p(r)=(1-\frac{1}{n})^{r}\frac{1}{n-1}$

$\displaystyle \sum_{r=1}^{\infty}p(r)=\sum_{r=1}^{\infty}(1-\frac{1}{n})^{r}\frac{1}{n-1}$

$\displaystyle \sum_{r=1}^{\infty}p(r)=\frac{1}{n-1}\sum_{r=1}^{\infty}(1-\frac{1}{n})^{r}$

$\displaystyle \sum_{r=1}^{\infty}p(r)=\frac{1}{n-1}\sum_{r=1}^{\infty}(\frac{n-1}{n})^{r}$

$\displaystyle \sum_{r=1}^{\infty}p(r)=(\frac{1}{n-1})\frac{1}{1-\frac{n-1}{n}}$

$\displaystyle \sum_{r=1}^{\infty}p(r)=\frac{n}{n-1}$

Am I doing something wrong, here? Shouldn't this come out to be $\displaystyle 1$, and not $\displaystyle \frac{n}{n-1}$?
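[Editorial aside, not part of the thread: a numeric check confirms the probabilities do sum to 1, which points to an algebra slip above. For a sum starting at $\displaystyle r=1$ the geometric series sums to $\displaystyle \frac{x}{1-x}$, not $\displaystyle \frac{1}{1-x}$; with $\displaystyle x=\frac{n-1}{n}$ that extra factor of $\displaystyle x$ supplies exactly the missing $\displaystyle \frac{n-1}{n}$.]

```python
n = 6
p = 1 / n

# p(r) = (1 - p)^(r - 1) * p, summed far enough out to have converged.
total = sum((1 - p) ** (r - 1) * p for r in range(1, 1000))
print(total)  # about 1.0
```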

11. Originally Posted by hatsoff
Thank you. This helps a great deal. I see clearly now that I was mistaken, and I see where and why. However, though I do not doubt your conclusion, neither do I understand how you reached it. I therefore have a couple more questions...

Firstly, how did you get from here:

$\displaystyle \sum_{r=1}^{\infty} r~ (1-p)^{r-1}p$

to here:

$\displaystyle \frac{1}{p}$
[snip]
$\displaystyle \sum_{r=1}^{\infty} r~ (1-p)^{r-1}p = p \sum_{r=1}^{\infty} r~ u^{r-1}$

where u = 1 - p. Note that |u| < 1.

----------------------------------------------------------------------

Interlude:

A standard result (the sum of an infinite geometric series):

$\displaystyle \sum_{r = 0}^{\infty} u^r = \frac{1}{1-u}, \, \, |u| < 1$.

If you differentiate both sides of this result with respect to u:

$\displaystyle \sum_{r = 1}^{\infty} r ~ u^{r-1} = \frac{1}{(1-u)^2}, \, \, |u| < 1$.

End interlude.

-----------------------------------------------------------------------

Therefore $\displaystyle p \sum_{r=1}^{\infty} r~ u^{r-1} = \frac{p}{(1 - u)^2} = \frac{p}{p^2} = \frac{1}{p}$.
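[Editorial aside: the interlude's two identities are easy to spot-check numerically; this check is mine, not mr fantastic's.]

```python
u = 0.5  # any |u| < 1 works

# Partial sums of the geometric series and its term-by-term derivative:
geometric = sum(u ** r for r in range(0, 200))
derivative = sum(r * u ** (r - 1) for r in range(1, 200))

print(geometric)   # about 1 / (1 - u)   = 2
print(derivative)  # about 1 / (1 - u)^2 = 4
```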

Originally Posted by hatsoff
[snip]
Also, would you be able to explain in newbie terms how you got this initial formula:

$\displaystyle \bar{N}=\sum_{r=1}^{\infty} r~ p(r)$
[snip]
By definition, the expected value of a function f(r) is $\displaystyle \sum_{r=1}^{\infty} f(r) ~ p(r)$. You're interested in the expected value of r, the number of rolls. Therefore ......