Let X and Y be two independent random variables.

X has probability density function f1(x).

Y has probability density function f2(y).

Let Z = X + Y be their sum.

What is the conditional probability density function of X given Z, i.e. f1(x|z)?

Here is my attempt.

The probability density function of Z = X + Y is the convolution of the two densities:

f(z) = integral { -inf to +inf } f1(x) f2(z - x) dx
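As a sanity check of the convolution formula, here is a small numerical sketch (my own example, not part of the question): for X, Y independent standard normals, Z = X + Y is N(0, 2), so the convolution integral evaluated at a point z should match the N(0, 2) density there.

```python
import numpy as np

# Density of X ~ N(0, 1); Y has the same density.
def f1(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

f2 = f1

x = np.linspace(-10.0, 10.0, 2001)   # integration grid (tails are negligible)
z = 0.7                              # arbitrary evaluation point

# f(z) = integral f1(x) f2(z - x) dx, via the trapezoidal rule on the grid
y = f1(x) * f2(z - x)
fz = float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Analytic density of Z ~ N(0, 2) at z, for comparison
fz_exact = float(np.exp(-z**2 / 4) / np.sqrt(4 * np.pi))

print(fz, fz_exact)  # the two values should agree closely
```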

Bayes' theorem: P(A|B) = P(A) P(B|A) / P(B)

Applying it to the events {x < X < x+dx} and {z < Z < z+dz}, and using P(z < Z < z+dz | X = x) ≈ f2(z-x) dz (since Z = x + Y given X = x):

f1(x|z) dx = (f1(x) dx) (f2(z-x) dz) / (f(z) dz)

Cancelling dz and dividing both sides by dx:

f1(x|z) = f1(x) f2(z-x) / f(z)
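The final formula can also be checked against a case where the conditional law is known in closed form. Assuming X, Y independent Exp(1) (my own example): then Z is Gamma(2, 1) with density f(z) = z e^{-z}, and X | Z = z is known to be Uniform(0, z), so the formula should return the constant 1/z for every x in (0, z).

```python
import math

# Density of X ~ Exp(1); Y has the same density.
def f1(x):
    return math.exp(-x) if x >= 0 else 0.0

f2 = f1

# Density of Z = X + Y ~ Gamma(2, 1)
def f(z):
    return z * math.exp(-z) if z >= 0 else 0.0

# The proposed formula: f1(x|z) = f1(x) f2(z-x) / f(z)
def f1_given_z(x, z):
    return f1(x) * f2(z - x) / f(z)

z = 3.0
for x in (0.5, 1.0, 2.5):
    print(f1_given_z(x, z))  # ≈ 1/3 for every x in (0, z), i.e. Uniform(0, z)
```

The exponentials cancel algebraically as well: f1(x) f2(z-x) / f(z) = e^{-x} e^{-(z-x)} / (z e^{-z}) = 1/z, which agrees with the Uniform(0, z) density.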

Is this the right result?

Is my derivation sufficiently rigorous?