1. Convolution

Hi all,
I am trying to figure the following out:
Let Z, X be random variables and Y = X + Z.
For known densities f(X) and g(Z), and a given data set of Y, it is required to find the parameters of the functions f and g.
I wonder if this is at all possible. If f and g are Normal then the distribution of Y is also Normal, with mean and variance equal to the sums of the means and variances of f and g. In that case I presume it is not possible to derive the means of X and Z separately. Is this true? And if it is, is it true for any arbitrary pair of functions f and g?

With thanks,
R2

2. Originally Posted by Iskatel
Hi all,
I am trying to figure the following out:
Let Z, X be random variables and Y = X + Z.
For known densities f(X) and g(Z), and a given data set of Y, it is required to find the parameters of the functions f and g.
I wonder if this is at all possible. If f and g are Normal then the distribution of Y is also Normal, with mean and variance equal to the sums of the means and variances of f and g. In that case I presume it is not possible to derive the means of X and Z separately. Is this true? And if it is, is it true for any arbitrary pair of functions f and g?

With thanks,
R2
Are Z and X independent?

3. Yes, X and Z are independent. Say f is lognormal and g is Weibull or gamma. Is it possible to find unique parameters for these two distributions based on the observations of Y, say by using MLE?

4. Originally Posted by Iskatel
Yes, X and Z are independent. Say f is lognormal and g is Weibull or gamma. Is it possible to find unique parameters for these two distributions based on the observations of Y, say by using MLE?
Honestly, I'm not sure...

The only thing I can think of using to do this is:

$M_Y = M_X \times M_Z$

5. OK, thank you for having a look at it.

6. I just noticed the densities of X and Z are known. Then you can definitely use the MGF. Simply factorize $E(e^{Yt})$ into the product of the two known MGFs, and your parameters will be apparent.

Of course I'm assuming the generating functions exist...
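In case it helps, here is a rough numerical sketch of what that factorization looks like. Normal X and Z, and all the parameter values, are made up purely for illustration:

```python
import math
import random

random.seed(0)

# Illustration only: suppose X ~ N(2, 1) and Z ~ N(3, 4), so Y = X + Z ~ N(5, 5).
n = 200_000
y = [random.gauss(2.0, 1.0) + random.gauss(3.0, 2.0) for _ in range(n)]

def empirical_mgf(sample, t):
    """Monte Carlo estimate of E(e^{Yt})."""
    return sum(math.exp(t * v) for v in sample) / len(sample)

def normal_mgf(mu, var, t):
    """MGF of a Normal(mu, var) random variable."""
    return math.exp(mu * t + 0.5 * var * t * t)

# Independence gives M_Y(t) = M_X(t) * M_Z(t); check at a few small t.
for t in (0.1, 0.2, 0.3):
    lhs = empirical_mgf(y, t)
    rhs = normal_mgf(2.0, 1.0, t) * normal_mgf(3.0, 4.0, t)
    print(t, lhs, rhs)
```

Note, though, that in the Normal case the product only pins down $\mu_X + \mu_Z$ and $\sigma_X^2 + \sigma_Z^2$, which turns out to be the sticking point discussed later in the thread.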

7. Originally Posted by Anonymous1
I just noticed the densities of X and Z are known. Then you can definitely use the MGF. Simply factorize $E(e^{Yt})$ into the product of the two known MGFs, and your parameters will be apparent.

Of course I'm assuming the generating functions exist...
Thank you for your suggestion. I tried using a Weibull for both X and Z. It looks like it is not possible to find the parameters of f and g uniquely based on observations of Y. It is more like trying to solve 5 = x + z for x and z.

8. Originally Posted by Iskatel
Thank you for your suggestion. I tried using a Weibull for both X and Z. It looks like it is not possible to find the parameters of f and g uniquely based on observations of Y. It is more like trying to solve 5 = x + z for x and z.
What information do you have about $Y$? Can you fit the data? If so, the above task should be easy enough.

Try it, I don't see why it wouldn't work. You know, as long as $f_Y$ is integrable...

9. Originally Posted by Anonymous1
What information do you have about $Y$? Can you fit the data? If so, the above task should be easy enough.

Try it, I don't see why it wouldn't work. You know, as long as $f_Y$ is integrable...
The information on Y is in the form of a set of observations, say Y = {24, 15, 78 ...}.
To be honest, I am not entirely sure how to fit the data using M[Y] = M[X]M[Z]. My main issue with this is that, since the convolution of two Normal distributions is also Normal, you would fit the data by MLE with f(y) = Normal(m[Y], Var[Y]). Since m[Y] = m[X] + m[Z], there is no way to deduce a unique m[X] or m[Z], is there? The same goes for the variance. Hence a similar problem may hold for other distributions.
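To spell out the Normal case via the MGF identity $M_Y = M_X \times M_Z$ (standard algebra, nothing specific to the data):

$M_X(t) = e^{\mu_X t + \frac{1}{2}\sigma_X^2 t^2}, \qquad M_Z(t) = e^{\mu_Z t + \frac{1}{2}\sigma_Z^2 t^2}$

$M_Y(t) = M_X(t)\, M_Z(t) = e^{(\mu_X + \mu_Z)t + \frac{1}{2}(\sigma_X^2 + \sigma_Z^2)t^2}$

So the distribution of Y depends on the four parameters only through the two sums $\mu_X + \mu_Z$ and $\sigma_X^2 + \sigma_Z^2$: any split of those sums gives exactly the same distribution for Y.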

10. Originally Posted by Iskatel
The information on Y is in the form of a set of observations, say Y = {24, 15, 78 ...}.
To be honest, I am not entirely sure how to fit the data using M[Y] = M[X]M[Z]. My main issue with this is that, since the convolution of two Normal distributions is also Normal, you would fit the data by MLE with f(y) = Normal(m[Y], Var[Y]). Since m[Y] = m[X] + m[Z], there is no way to deduce a unique m[X] or m[Z], is there? The same goes for the variance. Hence a similar problem may hold for other distributions.
You know X and Z to be Normal? Then clearly we can fit Y with a Normal distribution. Determine its mean and variance empirically...
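A minimal sketch of that empirical fit (the sample values below are placeholders, not the actual data set):

```python
import statistics

# Placeholder observations of Y; substitute the real data set here.
y = [24.0, 15.0, 78.0, 41.0, 33.0, 52.0, 19.0, 67.0]

# MLE for a Normal fit: the sample mean and the population-style sample variance.
mu_hat = statistics.fmean(y)
var_hat = statistics.pvariance(y, mu=mu_hat)

print(mu_hat, var_hat)  # estimates of m[Y] and Var[Y]
```

This pins down m[Y] and Var[Y], but only the sums m[X] + m[Z] and Var[X] + Var[Z] are constrained by them.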

11. Originally Posted by Anonymous1
You know X and Z to be Normal? Then clearly we can fit Y with a Normal distribution. Determine its mean and variance empirically...
Surely we can fit the data for Y, as we know it is Normal because X and Z are Normal. And we will get f(y), a Normal distribution with mean mean[Y] and variance var[Y]. However, I want to find the means and variances of X and Z.
Thanks for trying to help.

12. Originally Posted by Iskatel
Surely we can fit the data for Y, as we know it is Normal because X and Z are Normal. And we will get f(y), a Normal distribution with mean mean[Y] and variance var[Y]. However, I want to find the means and variances of X and Z.
Thanks for trying to help.
Okay I see the issue now...

Well, you know the sum of your means, and if you fix (say, standardize) one of the variables you can draw some inference on the other...

Obviously, there are an infinite number of possible combinations for the parameters of X and Z, though.
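A quick simulation makes that non-uniqueness concrete: two different splits with the same sums are indistinguishable from Y alone. All parameter values here are made up:

```python
import random
import statistics

random.seed(1)
n = 100_000

def sample_y(mu_x, sd_x, mu_z, sd_z):
    """Draw n observations of Y = X + Z with X, Z independent Normals."""
    return [random.gauss(mu_x, sd_x) + random.gauss(mu_z, sd_z) for _ in range(n)]

# Two different parameter splits with the same sums:
# means 1 + 4 = 2 + 3 and variances 1 + 4 = 2.5 + 2.5.
y1 = sample_y(1.0, 1.0, 4.0, 2.0)
y2 = sample_y(2.0, 2.5 ** 0.5, 3.0, 2.5 ** 0.5)

# Both samples fit the same Normal(5, 5); the split is unrecoverable from Y.
print(statistics.fmean(y1), statistics.pvariance(y1))
print(statistics.fmean(y2), statistics.pvariance(y2))
```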

13. Originally Posted by Anonymous1
Okay I see the issue now...

Well, you know the sum of your means, and if you fix (say, standardize) one of the variables you can draw some inference on the other...

Obviously, there are an infinite number of possible combinations for the parameters of X and Z, though.
My main question was whether this generalises to all combinations of different forms of f(X) and g(Z). Say the resulting f(Y) is bimodal: can we then find unique parameters for f(X) and g(Z)?
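For what it's worth, one can at least compute the f(Y) that a candidate pair implies, since it is the convolution of the two densities. A dependency-free sketch with arbitrary lognormal and gamma parameters (this shows the shape a given pair implies for Y; it does not by itself settle uniqueness):

```python
import math

def lognormal_pdf(x, mu, sigma):
    """Density of exp(N(mu, sigma^2)); zero for x <= 0."""
    if x <= 0:
        return 0.0
    return math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (x * sigma * math.sqrt(2 * math.pi))

def gamma_pdf(x, k, theta):
    """Gamma density with shape k and scale theta; zero for x <= 0."""
    if x <= 0:
        return 0.0
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

# Discretize both densities on a grid and convolve: f_Y(y) = integral of f(x) g(y - x) dx.
h = 0.05
grid = [i * h for i in range(800)]                          # covers [0, 40)
f = [lognormal_pdf(x, mu=0.5, sigma=0.4) for x in grid]     # arbitrary parameters
g = [gamma_pdf(x, k=3.0, theta=1.5) for x in grid]          # arbitrary parameters

def convolve(f, g, h):
    """Direct discrete convolution on a common grid; O(n^2) but dependency-free."""
    out = []
    for i in range(len(f)):
        out.append(h * sum(f[j] * g[i - j] for j in range(i + 1)))
    return out

f_y = convolve(f, g, h)

# Sanity check: the convolved density should integrate to roughly 1.
total = sum(f_y) * h
print(total)
```

Plotting `f_y` against the grid shows the shape implied for Y, which can then be compared with a histogram of the observed data: a match is necessary, though not sufficient, for a candidate parameter pair.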