The wording of the question is confusing. Is it saying that if the gambler wins a specific number of games, he'll get to play that many more games? Or is it saying that every time he wins, he'll get to play an additional game?
(1) A gambler wins each game with probability p. In each of the following cases, determine the expected total number of wins.
(a) The gambler will play n games; if he wins X of these games, then he will play an additional X games before stopping.
(b) The gambler will play until he wins; if it takes him Y games to get this win, then he will play an additional Y games.
I'm having some trouble wrapping my head around this one.
So far, for part (a) I'm looking at it like this:
$$E[W] = E[W \mid \text{wins } X \text{ games}]\,P(\text{wins } X \text{ games})$$
That line of thinking gets me $E[W] = np(1+p)$.
Are you sure that's the answer?
Let $W$ = the total number of wins, and let $X$ = the number of wins in the gambler's first $n$ games. Conditioning on $X$,
$$E[W] = \sum_{k=0}^{n} E[W \mid X = k]\,P(X = k).$$
Let $Z \sim \text{Binom}(n,p)$. Then $X$ has the same distribution as $Z$, and given $X = k$ the number of wins in the $k$ additional games is $\text{Binom}(k,p)$; therefore,
$$E[W] = \sum_{k=0}^{n} \sum_{j=0}^{k} (k+j)\binom{n}{k}p^{k}(1-p)^{n-k}\binom{k}{j}p^{j}(1-p)^{k-j}.$$
Perhaps with some creative manipulation you can show that the above equals $np(1+p)$, but I doubt it.
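One way to test the claimed value without doing the algebra: condition on $k$ wins in the first $n$ games and $j$ wins in the $k$ additional games, evaluate the resulting double sum numerically, and compare it with $np(1+p)$. A quick sketch (the function name is mine, and this assumes the "play $X$ additional games" reading of the problem):

```python
from math import comb

def expected_wins(n, p):
    # Sum (k + j) over k wins in the first n games (Binomial(n, p))
    # and j wins in the k additional games (Binomial(k, p)).
    q = 1 - p
    return sum(
        (k + j)
        * comb(n, k) * p**k * q**(n - k)
        * comb(k, j) * p**j * q**(k - j)
        for k in range(n + 1)
        for j in range(k + 1)
    )

for n, p in [(5, 0.3), (10, 0.5), (7, 0.9)]:
    # Compare the double sum with the conjectured closed form n*p*(1+p).
    print(expected_wins(n, p), n * p * (1 + p))
```

For every pair I tried, the two values agree up to floating-point error.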
I agree that the answer I gave seems a bit simple, but that's the answer that the professor gave us. I'll try to get clarification the next time class meets. It seems like much of the class is struggling right now.
The professor solved a similar problem in class, and the same procedure seems to work for these problems as well.
This is how the solution was presented in class:
Let $N$ be the total number of wins.
Let $X$ be the number of wins in the first $n$ games, so that $X \sim \text{Binom}(n, p)$.
(a) The gambler will play n games; if he wins X of these games, then he will play an additional X games before stopping.
So we have,
$$E[N] = E\big[E[N \mid X]\big] = E[X + pX] = (1+p)E[X] = (1+p)np = np(1+p),$$
since, given $X$, the number of wins in the $X$ additional games is $\text{Binom}(X, p)$, with mean $pX$.
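For what it's worth, a short Monte Carlo simulation of part (a) agrees with $np(1+p)$. This is just a sanity check (the function name and trial count are my own choices):

```python
import random

def simulate_part_a(n, p, trials=200_000):
    # Play n games, count the wins x, then play x more games;
    # return the average total number of wins over many trials.
    total = 0
    for _ in range(trials):
        x = sum(random.random() < p for _ in range(n))
        extra = sum(random.random() < p for _ in range(x))
        total += x + extra
    return total / trials

random.seed(0)
print(simulate_part_a(10, 0.5))  # should be close to n*p*(1+p) = 7.5
```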