# Thread: Another Expectation through conditioning question

1. ## Another Expectation through conditioning question

(1) A gambler wins each game with probability p. In each of the following cases, determine the expected total number of wins.

(a) The gambler will play n games; if he wins X of these games, then he will play an additional X games before stopping.

(b) The gambler will play until he wins; if it takes him Y games to get this win, then he will play an additional Y games.

I'm having some trouble wrapping my head around this one.

So far, for part A I'm looking at it like this

E[W]=E[W|wins X games]P(wins X games)

That line of thinking gets me

$\displaystyle (xp+np)\binom{n}{x}p^{x}(1-p)^{n-x}$

2. ## Re: Another Expectation through conditioning question

The wording of the question is confusing. Is it saying that if the gambler wins a specific number of games, he'll get to play that many more games? Or is it saying that every time he wins, he'll get to play an additional game?

3. ## Re: Another Expectation through conditioning question

It's the first way you wrote.

The answer for part (a) comes out to be np(1+p).

But I'm not exactly sure on how it's found.

I hope that clarifies it a bit.

4. ## Re: Another Expectation through conditioning question

Are you sure that's the answer?

Let W= the number of wins

Let X = the event that the gambler wins x of his first n games

$\displaystyle E(W) = E(W|X)P(X) + E(W|X^{c})P(X^{c})$

= $\displaystyle (x+xp) \binom{n}{x} p^{x}(1-p)^{n-x} + E(W|X^{c}) (1- \binom{n}{x} p^{x}(1-p)^{n-x})$

Let Z ~ binom(n,p). Then $\displaystyle E(Z) = \sum_{z=0}^{n} z \binom{n}{z} p^{z}(1-p)^{n-z} = np$

therefore, $\displaystyle E(W|X^{c}) = np - x \binom{n}{x} p^{x}(1-p)^{n-x}$

$\displaystyle E(W) = (x+xp) \binom{n}{x} p^{x}(1-p)^{n-x} + \big( np - x\binom{n}{x}p^{x}(1-p)^{n-x} \big) \big(1- \binom{n}{x} p^{x}(1-p)^{n-x}\big)$

Perhaps with some creative manipulation you can show that the above equals np(1+p), but I doubt it.
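As a side check, the binomial mean quoted above can be verified numerically. This is just a quick sketch; the values of n and p are arbitrary choices.

```python
# Numeric sketch (n and p arbitrary): the binomial mean
# sum_z z*C(n,z)*p^z*(1-p)^(n-z) should come out to n*p.
from math import comb

n, p = 10, 0.3
mean = sum(z * comb(n, z) * p**z * (1 - p)**(n - z) for z in range(n + 1))
print(mean)  # ~ 3.0, i.e. n*p, up to floating point
```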

5. ## Re: Another Expectation through conditioning question

I agree that the answer I gave seems a bit simple, but that's the answer that the professor gave us. I'll try to get clarification the next time class meets. It seems like much of the class is struggling right now.

6. ## Re: Another Expectation through conditioning question

I didn't calculate $\displaystyle E(W|X^{c})$ correctly. But I still don't get that answer (although it's a bit closer).

$\displaystyle E(W|X^{c}) = E(W|W \ne x) = \sum w P(W=w|W \ne x)$ where w=x is not included in the sum

$\displaystyle = \sum \frac{w P(W=w \cap W \ne x)}{P(W \ne x)}$

$\displaystyle = \frac{1}{P(W \ne x)} \sum w P(W=w)$

$\displaystyle = \frac{np - x \binom{n}{x}p^{x}(1-p)^{n-x}}{1- \binom{n}{x} p^{x} (1-p)^{n-x}}$

so $\displaystyle E(W) = (x+xp) \binom{n}{x} p^{x}(1-p)^{n-x} + np - x \binom{n}{x} p^{x} (1-p)^{n-x}$

$\displaystyle = xp \binom{n}{x} p^{x}(1-p)^{n-x} + np$

7. ## Re: Another Expectation through conditioning question

The professor solved a similar problem in class, and the procedure seems to work for these problems as well.

This is how the solution was presented in class:

Let N be the total number of wins
Let X be the number of wins in the first N games

(a) The gambler will play n games; if he wins X of these games, then he will play an additional X games before stopping.

$\displaystyle E[N]=E[E[N|X]]$

$\displaystyle E[N|X]=X+Xp$

So we have,
$\displaystyle E[X+Xp]=E[X]+pE[X]=np+np^2=np(1+p)$
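The np(1+p) result can also be sanity-checked with a small Monte Carlo sketch (parameter values and the name `trials` are arbitrary choices, not from the problem):

```python
# Monte Carlo sketch of part (a): play n games, count wins x,
# then play x bonus games; compare the average total wins to n*p*(1+p).
import random

random.seed(0)

def total_wins(n, p):
    x = sum(random.random() < p for _ in range(n))      # wins in first n games
    bonus = sum(random.random() < p for _ in range(x))  # wins in the x extra games
    return x + bonus

n, p, trials = 20, 0.4, 200_000
avg = sum(total_wins(n, p) for _ in range(trials)) / trials
print(avg, n * p * (1 + p))  # the two numbers should be close
```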

8. ## Re: Another Expectation through conditioning question

We weren't treating X as a random variable.

9. ## Re: Another Expectation through conditioning question

$\displaystyle E(W) = 0 \binom{n}{0} p^{0}(1-p)^{n} + (1+p) \binom{n}{1} p^{1}(1-p)^{n-1} + (2+2p) \binom{n}{2} p^{2}(1-p)^{n-2} + \ldots + (n+np) \binom{n}{n} p^{n}(1-p)^{0}$

$\displaystyle = (1+p) \sum_{k=1}^{n} k \binom{n}{k} p^{k}(1-p)^{n-k}$

$\displaystyle = np(1+p)$
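The finite sum above can be checked numerically as well; a minimal sketch, with n and p chosen arbitrarily:

```python
# Numeric check of the finite sum: summing (k + k*p)*C(n,k)*p^k*(1-p)^(n-k)
# over k = 0..n should give n*p*(1+p).
from math import comb

n, p = 12, 0.25
ew = sum((k + k * p) * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
print(ew, n * p * (1 + p))  # the two values agree up to floating point
```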

10. ## Re: Another Expectation through conditioning question

Do you have the answer to part (b)?

$\displaystyle E(W) = (1+p)p + (1+2p)(1-p)p + (1+3p)(1-p)^{2}p + \ldots$

$\displaystyle = \Big( p +(1-p)p + (1-p)^{2}p + \ldots \Big) + p \sum_{k=1}^{\infty}k(1-p)^{k-1} p$

$\displaystyle = 1 + p \cdot \frac{1}{p} = 2$ which is a very strange answer

11. ## Re: Another Expectation through conditioning question

I have $\displaystyle 1+\frac{1}{p}$ written down, but it wasn't explained how to do it.

12. ## Re: Another Expectation through conditioning question

$\displaystyle 1 + \frac{1}{p}$ is the solution I got first when I wrote

$\displaystyle E(W) = (1+p)p + (2+2p)(1-p)p + (3+3p)(1-p)^{2}p + \ldots$

$\displaystyle = (1+p) \sum_{n=1}^{\infty}n (1-p)^{n-1}p = \frac{1+p}{p}$

but $\displaystyle E(W|Y=k) = 1 + kp$ not $\displaystyle k + kp$ like in part (a).
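Under the reading $\displaystyle E(W|Y=k) = 1 + kp$, a small Monte Carlo sketch (parameter values arbitrary) does come out near 2:

```python
# Monte Carlo sketch of part (b): play until the first win (Y games total),
# then play Y more games; total wins = 1 + (wins among the Y extra games).
import random

random.seed(1)

def total_wins(p):
    y = 1
    while random.random() >= p:  # keep playing until the first win
        y += 1
    return 1 + sum(random.random() < p for _ in range(y))  # 1 win + bonus wins

p, trials = 0.3, 200_000
avg = sum(total_wins(p) for _ in range(trials)) / trials
print(avg)  # close to 2, matching E(W) = 1 + p*E(Y) = 1 + p*(1/p) = 2
```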