Originally Posted by **Glitch**

**The question:**

Calculate $\displaystyle P(X \le k)$ when $X$ has the probability distribution

$\displaystyle P(X = k) = p(1 - p)^k$, for $k = 0, 1, 2, \ldots$ and $0 < p < 1$.

**My attempt:**

$\displaystyle P(X \le k) = \sum\limits_{i = 0}^{k} p(1 - p)^i$

Then by the finite geometric series formula:

= $\displaystyle p(1 - p)\frac{1 - (1 - p)^k }{1 - (1 - p)}$

= $\displaystyle (1 - p)-(1 - p)^{k + 1}$

The answer is supposed to be $\displaystyle 1 - (1 - p)^{k+1}$

What have I screwed up this time? :/

EDIT: It has occurred to me that I've done the geometric series wrong. >_< The finite sum is $\displaystyle \sum\limits_{i = 0}^{k} r^i = \frac{1 - r^{k+1}}{1 - r}$ with $r = 1 - p$, so there should be no extra factor of $(1 - p)$ in front; the leading $p$ then cancels the denominator $1 - (1 - p) = p$, leaving $1 - (1 - p)^{k+1}$.
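For anyone who wants to sanity-check the closed form, here is a quick numerical comparison of the direct summation against $1 - (1 - p)^{k+1}$ (a sketch in Python; the function names and the sample values of $p$ and $k$ are just illustrative choices):

```python
def geometric_cdf_by_sum(p, k):
    """P(X <= k) computed by summing the pmf terms p * (1 - p)**i for i = 0..k."""
    return sum(p * (1 - p) ** i for i in range(k + 1))

def geometric_cdf_closed_form(p, k):
    """Closed form from the finite geometric series: 1 - (1 - p)**(k + 1)."""
    return 1 - (1 - p) ** (k + 1)

# The two agree (up to floating-point error) for any 0 < p < 1 and k = 0, 1, 2, ...
for p in (0.1, 0.5, 0.9):
    for k in (0, 1, 5, 20):
        assert abs(geometric_cdf_by_sum(p, k) - geometric_cdf_closed_form(p, k)) < 1e-12
print("direct summation matches the closed form")
```

Note that for $k = 0$ both reduce to $P(X = 0) = p$, which is a handy first check.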