Convolution in statistics

Dear all,

I learned that to derive the probability distribution function of the sum of, e.g., two independent random variables X and Y, one can use the convolution.

The second way, from my point of view, could be the following: suppose the two distributions each have two states with given probabilities.

For X: probability 0.6, for instance, for an electrical current IX1 to occur and 0.4 for an electrical current IX2 to occur.

For Y: probability 0.3, for instance, for an electrical current IY1 to occur and 0.7 for an electrical current IY2 to occur.

The new distribution function of Z is then

[tex]f_{Z}(i=0)=0.6 \cdot 0.3=0.18[/tex]

[tex]f_{Z}(i=1)=0.6 \cdot 0.7=0.42[/tex]

[tex]f_{Z}(i=2)=0.4 \cdot 0.3=0.12[/tex]

[tex]f_{Z}(i=3)=0.4 \cdot 0.7=0.28[/tex]

with [tex]f_{Z}(i=1)+f_{Z}(i=2)=0.42+0.12=0.54[/tex]
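The four-outcome bookkeeping can be checked with a short script (a minimal sketch; the dictionary keys are my labels, the probabilities are the ones given above):

```python
# Joint probabilities of two independent two-state variables.
# Probabilities as in the post: 0.6/0.4 for X, 0.3/0.7 for Y.
pX = {"IX1": 0.6, "IX2": 0.4}
pY = {"IY1": 0.3, "IY2": 0.7}

# Independence: P(X=x, Y=y) = P(X=x) * P(Y=y), four outcomes in total.
joint = {(x, y): px * py for x, px in pX.items()
                         for y, py in pY.items()}

for (x, y), p in sorted(joint.items()):
    print(x, y, round(p, 2))

# The two "middle" outcomes carry 0.42 + 0.12 = 0.54 together,
# which is the mass the convolution assigns to its single middle stage.
```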

The convolution would give only three stages:

I) [tex]f_{Z}(i=0)=0.6 \cdot 0.3=0.18[/tex]

II) [tex]f_{Z}(i=1)=0.6 \cdot 0.7+0.4 \cdot 0.3=0.54[/tex]

III) [tex]f_{Z}(i=2)=0.4 \cdot 0.7=0.28[/tex]
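The three-stage result can be reproduced numerically (a minimal sketch, assuming X and Y take the integer values 0 and 1 so that their sum ranges over 0, 1, 2; the function name is mine):

```python
# Distributions as probability lists indexed by the variable's value:
# P(X=0)=0.6, P(X=1)=0.4 and P(Y=0)=0.3, P(Y=1)=0.7.
pX = [0.6, 0.4]
pY = [0.3, 0.7]

def convolve(p, q):
    """Distribution of the sum of two independent integer-valued variables."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[i + j] += pi * qj  # outcomes with the same sum are merged
    return out

pZ = convolve(pX, pY)
print(pZ)  # ≈ [0.18, 0.54, 0.28], the three stages I)-III)
```

Note that the middle entry merges the two outcomes with sum 1, which is exactly where the three-stage and four-stage pictures differ.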

The convolution spits out two results in one, as in II). Basically, that seems fine and right to me, but I need the fourth stage that I get without the convolution.

Is the second way wrong? From my point of view it is even better, as it gives more stages in the result.

So, hopefully one of you guys can help with this.

Which version is correct? From my point of view, both are.

Thanks

HHsts