# Win/loss game

• Jul 24th 2011, 07:45 PM
jsndacruz
Win/loss game
The problem states:

You play a game, and each time you win with probability p. You plan to play 5 games. However, if you win on the 5th game, you must keep playing until you lose.
a) Find the expected number of games that you play.
b) Find the expected number of games that you lose.

My attempt:
a) You must play at least 5 games; to count the games beyond that, you need to find the expected value of the sum of the X's, where X indicates whether or not you win on your ith try, i > 5.

I came up with p * sum (1 - p)^i = p/(1-p) * (1 / (1 - (1 - p))) = 1 / (1 - p). Since you already played 5 games, and you also had to win on the 5th try, the final expected number of games should be:
5 + p / (1-p).

b) Given my horrid attempt at part (a), I have no idea where to turn for part (b). I'll post again tomorrow, after thinking through it some more.
• Jul 25th 2011, 04:30 AM
SpringFan25
Re: Win/loss game
For part (a)

Define Y = the number of turns played after the 4th turn.

$\displaystyle \mathcal{P}(Y=1) = (1-p)$
$\displaystyle \mathcal{P}(Y=2) = p(1-p)$
$\displaystyle \mathcal{P}(Y=3) = p^2(1-p)$
...
$\displaystyle \mathcal{P}(Y=y) = p^{y-1} (1-p)$

Find the expected value:
$\displaystyle E(Y) = \sum_{i=1}^{\infty} i \mathcal{P}(Y=i)$

$\displaystyle E(Y) = \sum_{i=1}^{\infty} i p^{i-1} (1-p)$

$\displaystyle E(Y) = (1-p) \sum_{i=1}^{\infty} i p^{i-1}$

The sum is an arithmetico-geometric progression, which hopefully you have been taught. I got:

$\displaystyle \sum_{i=1}^{\infty} i p^{i-1} = \frac{1}{(1-p)^2}$

If you don't believe that, see the spoiler:
Spoiler:

consider the related but different geometric progression, which has a well known sum:
$\displaystyle \sum_{i=1}^{\infty} p^i = \frac{p}{1-p}$
differentiate both sides with respect to p

$\displaystyle \sum_{i=1}^{\infty} i p^{i-1} = \frac{d \left(\frac{p}{1-p} \right) }{dp}$

$\displaystyle \sum_{i=1}^{\infty} i p^{i-1} = \frac{(1-p) - (p \cdot -1)}{(1-p)^2} = \frac{1}{(1-p)^2}$

This gives us the required sum for our arithmetico-geometric progression.
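As an extra numeric check (a quick Python sketch, not part of the original derivation), the partial sums of the series do approach the closed form $\displaystyle \frac{1}{(1-p)^2}$:

```python
# Compare partial sums of sum_{i>=1} i*p^(i-1) against the closed form 1/(1-p)^2.
def partial_sum(p, terms=10_000):
    """Partial sum of the arithmetico-geometric series sum i*p^(i-1)."""
    return sum(i * p ** (i - 1) for i in range(1, terms + 1))

for p in (0.1, 0.5, 0.9):
    closed_form = 1 / (1 - p) ** 2
    print(f"p={p}: partial sum = {partial_sum(p):.6f}, closed form = {closed_form:.6f}")
```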

and hence
$\displaystyle E(Y) = \frac{(1-p)}{(1-p)^2} = \frac{1}{1-p}$

So the expected number of games:
$\displaystyle E(4+Y) = 4+E(Y) = 4 + \frac{1}{1-p}$
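A Monte Carlo sanity check agrees with this formula (a Python sketch; the session logic encodes the rules as interpreted here):

```python
import random

def session_length(p, rng):
    """Total games in one session: games 1-4 are always played, then
    from game 5 onward each win forces one more game and a loss stops play."""
    games = 5                 # games 1 through 5 are always played
    while rng.random() < p:   # the current game (5, 6, ...) is a win,
        games += 1            # so one more game must be played
    return games

rng = random.Random(42)
p = 0.5
n = 200_000
average = sum(session_length(p, rng) for _ in range(n)) / n
print(f"simulated: {average:.3f}, formula 4 + 1/(1-p): {4 + 1 / (1 - p):.3f}")
```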

Quote:

I have no idea where to turn for part (b)
For (b), note that you are certain to lose exactly 1 game after your 4th turn has finished. So find the expected number of losses in the first 4 turns and add one.
• Jul 30th 2011, 04:31 AM
obd2
Re: Win/loss game
Quote:

Originally Posted by SpringFan25
So the expected number of games:
$\displaystyle E(4+Y) = 4+E(Y) = 4 + \frac{1}{1-p}$

for (b), note that you are certain to lose exactly 1 game after your 4th turn has finished. So find the expected number of losses in the first 4 turns and add one.

I will attempt to disagree with your calculations and logic for part (a). Here is how I see it: we will have to play at least 5 games, so the expected value will look something like $\displaystyle 5+\alpha$, where $\displaystyle \alpha$ is an unknown constant. Let $\displaystyle X$ denote the number of games we play after the fifth turn. Then it follows that $\displaystyle \alpha = E[X]$. I will use a relation that is widely known:

$\displaystyle E(X) = \sum_{k=1}^{+\infty} P(X\ge k)$.

Firstly we should immediately see that $\displaystyle P(X\ge 0)=1$, since we will play at least 0 games after the fifth has finished. What about $\displaystyle P(X \ge 1)$? To play at least one more game after the fifth one, we will have to win game number 5, and that happens with probability $\displaystyle p$. Hence $\displaystyle P(X\ge 1) = p$. By induction one should see that to play at least $\displaystyle k$ games after the fifth, one would have to win game number 5 along with the $\displaystyle k-1$ games that follow it. Therefore $\displaystyle P(X\ge k) = p^k$. This gives:

$\displaystyle E[X] = \sum_{k=1}^{+\infty} P(X\ge k) = \sum_{k=1}^{+\infty} p^k = \frac{p}{1-p}$.

The last equality is commonly known, as we have a simple geometric series. Finally, if we let $\displaystyle Y$ denote the number of games we play in total, this gives us:

$\displaystyle E[Y] = 5 + E[X] = 5 + \frac{p}{1-p}$.
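The tail-sum identity used above can itself be checked numerically (a Python sketch, assuming the tail probabilities $\displaystyle P(X\ge k)=p^k$ derived here): summing $\displaystyle k\,P(X=k)$ and summing the tails give the same value.

```python
# Verify E[X] = sum_{k>=1} P(X >= k) for X with P(X >= k) = p^k,
# i.e. pmf P(X = k) = p^k * (1 - p) for k = 0, 1, 2, ...
p = 0.3
terms = 1_000  # both series converge fast for p < 1

mean_from_pmf = sum(k * p ** k * (1 - p) for k in range(terms))
mean_from_tails = sum(p ** k for k in range(1, terms))
closed_form = p / (1 - p)

print(mean_from_pmf, mean_from_tails, closed_form)
```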

Please let me know if you find any mistakes. I'm working on part (b).

I'm done with (b). First we play four games; we will lose on average $\displaystyle 4(1-p)$ of those games. Thinking about what happens in the fifth game and after, we see that we can lose at most one game there: either we lose game number 5, or we keep on playing until we lose; in either case there is only one loss. Hence the expected number of losses would be $\displaystyle 4(1-p) + 1$.
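The loss count is easy to simulate as well (a Python sketch; the "+ 1" is the guaranteed loss that ends the session):

```python
import random

def session_losses(p, rng):
    """Losses in one session: games 1-4 are independent Bernoulli trials,
    and the phase from game 5 onward always ends with exactly one loss."""
    first_four = sum(1 for _ in range(4) if rng.random() >= p)  # losses in games 1-4
    return first_four + 1

rng = random.Random(7)
p = 0.5
n = 200_000
average = sum(session_losses(p, rng) for _ in range(n)) / n
print(f"simulated: {average:.3f}, formula 4(1-p) + 1: {4 * (1 - p) + 1:.3f}")
```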
• Jul 30th 2011, 04:50 AM
SpringFan25
Re: Win/loss game
Let's make sure we agree on the rules. My interpretation of post #1 was:
• You start by playing 5 games.
• If you win game 5, you play another game. Otherwise you stop.
• If you win game 6, you play another game. Otherwise you stop.
• ...etc.

Suppose p = 0, so you are certain to lose every game. You will lose game 5 and stop, so you will definitely play only 5 games. Your formula says you expect to play 6 games, which is inconsistent. My formula gives the right answer of 5.

Quote:

So the expected value will look something like 5+\alpha, where \alpha is an unknown constant. Let X denote the number of games we play after the fifth turn. Then it follows that \alpha = E[X]. I will use a relation that is widely known:
I would only agree that the lower bound on the value must be 5, but that constraint is satisfied by my function.

Quote:

E(X) = \sum_{k=0}^{+\infty} P(X\ge k).

Firstly we should immediately see that P(X\ge 0)=1. Since we will play at least 0 games after the fifth has finished. What about P(X \ge 1)? To play at least one more game after the fifth one we will have to win game number 5 and that happens with probability p. Hence P(X\ge 1) = p. With induction one should see that to play k number of games one would have to win all the preceding k-1 games along with game number 5. Therefore P(X\ge k) = p^k. This gives:

E[X] = \sum_{k=0}^{+\infty} P(X\ge k) = \sum_{k=0}^{+\infty} p^k = \frac{1}{1-p}.

The last equality is commonly known as we have a simple geometric series. Finally this will give us, if we let Y denote the number of games we play in total:

E[Y] = 5 + E[X] = 5 + \frac{1}{1-p} .

I can't comment on this, as I've never seen the relationship you are trying to use.

Your solution to (b) is fine.
• Jul 30th 2011, 05:24 AM
obd2
Re: Win/loss game
Quote:

Originally Posted by SpringFan25
Suppose p = 0, so you are certain to lose every game. You will lose game 5 and stop, so you will definitely play only 5 games. Your formula says you expect to play 6 games, which is inconsistent. My formula gives the right answer of 5.

Your solution to (b) is fine.

Hmm. You are right. I wonder where my logic fails. When I look at it more closely, I see that if $\displaystyle p$ is nonzero then my answer is always greater than 6, however yours is always greater than 5. I see now what I did wrong, though. The relation that I used applies to random variables that take values in the set $\displaystyle \{0,1,2,3,...\}$, and the formula was supposed to be

$\displaystyle E(X) = \sum_{k=1}^{+\infty} P(X\ge k)$, not $\displaystyle \sum_{k=0}^{+\infty} P(X\ge k)$, which gives the correct answer. My answer will then be

$\displaystyle E(Y) = 5 + \frac{p}{1-p}$,

which works for the $\displaystyle p=0$ case. I will edit my post; I can never rely completely on my memory. The relation, along with its proof, can be found here:

Expected value - Wikipedia, the free encyclopedia

Upon further inspection I can see that our answers are the same since

$\displaystyle 4+ \frac{1}{1-p} = 4 + \frac{1-p+p}{1-p} = 5 + \frac{p}{1-p}$.
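The equivalence also holds numerically at a few sample points (a trivial Python check):

```python
# 4 + 1/(1-p) and 5 + p/(1-p) agree for every p in [0, 1).
for p in (0.0, 0.25, 0.5, 0.9, 0.99):
    a = 4 + 1 / (1 - p)
    b = 5 + p / (1 - p)
    assert abs(a - b) < 1e-9, (p, a, b)
print("the two closed forms agree")
```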

Thanks for clearing that up.
• Jul 30th 2011, 05:31 AM
SpringFan25
Re: Win/loss game
Yes, your answer is the same as mine. Doing partial fractions on $\displaystyle \frac{p}{1-p}$ gives $\displaystyle \frac{1}{1-p} - 1$.