Originally Posted by **Anonymous1**

Two boys take turns throwing darts at a target. Al throws first and hits with probability $\displaystyle \frac{1}{5}.$ Bob throws second and hits with probability $\displaystyle b.$ What value of $\displaystyle b$ makes each boy have probability $\displaystyle \frac{1}{2}$ of winning?

What I've got:

$\displaystyle P(\text{Al wins}) = \frac{1}{5} \sum_{i=0}^{\infty} \left(\frac{4}{5} (1-b)\right)^i = \frac{1+4b}{5}$ Mr F says: No. This is equal to $\displaystyle {\color{red} \frac{1}{1 + 4b}}$. Recheck your calculation.
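(The series is geometric with ratio $\displaystyle r = \frac{4}{5}(1-b) < 1,$ so it sums to $\displaystyle \frac{1}{1-r}$:

$\displaystyle \frac{1}{5}\sum_{i=0}^{\infty} \left(\frac{4}{5}(1-b)\right)^i = \frac{1/5}{1-\frac{4}{5}(1-b)} = \frac{1/5}{(1+4b)/5} = \frac{1}{1+4b}.$)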

$\displaystyle P(\text{Bob wins}) = b \sum_{i=0}^{\infty} \left(\frac{4}{5}\right)^{i+1} (1-b)^i = \frac{4b}{1+4b}.$
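As a sanity check, here is a quick Monte Carlo sketch of the game (my own illustration; the test value $\displaystyle b = 0.3$ is arbitrary, not from the problem). The estimate agrees with $\displaystyle \frac{1}{1+4b}$ rather than $\displaystyle \frac{1+4b}{5}$:

```python
import random

def simulate(b, trials=200_000, seed=1):
    """Estimate P(Al wins): Al throws first and hits w.p. 1/5,
    then Bob throws and hits w.p. b; first hit wins."""
    random.seed(seed)
    al_wins = 0
    for _ in range(trials):
        while True:
            if random.random() < 1/5:   # Al's throw
                al_wins += 1
                break
            if random.random() < b:     # Bob's throw
                break
    return al_wins / trials

b = 0.3                      # arbitrary test value
est = simulate(b)
print(est, 1/(1 + 4*b), (1 + 4*b)/5)
```

With $\displaystyle b = 0.3$ the estimate comes out near $\displaystyle \frac{1}{2.2} \approx 0.4545,$ not $\displaystyle \frac{2.2}{5} = 0.44.$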

Then I set them equal and got $\displaystyle 16b^2 - 12b + 1 = 0.$

Which makes no sense. What am I doing wrong?
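(If it helps: the quadratic comes from using $\displaystyle \frac{1+4b}{5}$ for the first sum. With the corrected value, setting the two probabilities equal gives

$\displaystyle \frac{1}{1+4b} = \frac{4b}{1+4b} \implies 4b = 1 \implies b = \frac{1}{4},$

and then each boy wins with probability $\displaystyle \frac{1}{2},$ as required.)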