Originally Posted by **Mathsdog**

Hi,

I see what you are saying, I think. I should have said the distribution of the sum of two independent random variables. But to find the distribution of the sum X + Y, one has to take the sum over the product of the two marginal distributions, i.e. the convolution (that's where independence comes in), right? That is, for Z = X + Y, canonically

$\displaystyle p_{Z}(z) = \sum_{all x} p_{X}(x) \cdot p_{Y}(z-x)$

so does this translate for our example as

$\displaystyle p_{Z}(z) = \sum_{k=0}^{z} e^{-\mu}\,\frac{\mu^{k}}{k!}\; e^{-\lambda}\,\frac{\lambda^{z-k}}{(z-k)!}$

?

But I can't massage this into the desired form. Am I setting things up incorrectly here?
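For what it's worth, the setup can be checked numerically before doing the algebra. Here is a quick sketch (plain Python, with arbitrarily chosen values for mu and lambda) that compares the convolution sum above against the Poisson(mu + lambda) pmf term by term:

```python
from math import exp, factorial

def poisson_pmf(k, rate):
    """Poisson pmf: e^{-rate} * rate^k / k!"""
    return exp(-rate) * rate**k / factorial(k)

mu, lam = 2.0, 3.0  # arbitrary example rates

def conv_pmf(z):
    """Convolution sum: p_Z(z) = sum_{k=0}^{z} p_X(k) * p_Y(z - k)."""
    return sum(poisson_pmf(k, mu) * poisson_pmf(z - k, lam)
               for k in range(z + 1))

# The convolution should match Poisson(mu + lam) exactly (up to rounding)
for z in range(20):
    assert abs(conv_pmf(z) - poisson_pmf(z, mu + lam)) < 1e-12
```

So numerically the convolution does collapse to a single Poisson with rate mu + lambda; the remaining algebraic step is presumably just rearranging the sum.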

And here the notation implies that k is the same for both initial distributions. Should the notation not more correctly be $\displaystyle k_X = 0, 1, 2, \ldots$ and $\displaystyle k_Y = 0, 1, 2, \ldots$

so that $\displaystyle z = k_X + k_Y$?

Or am I missing something (as I guess is much more likely) ?

Thanks again.