Yes, you need independence.
Let X be any normal r.v.
Then Y = X - X is not normal:
P(Y = 0) = 1, and a distribution concentrated at a single point is not a normal distribution.
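A quick numerical illustration of this degenerate case (just a sketch with NumPy; the sample size and variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# X is any normal r.v.; take X ~ Normal(0, 1) for concreteness.
x = rng.normal(size=100_000)

# Y = X - X is a linear combination of normal r.v.'s that are NOT
# independent (X is perfectly dependent on itself), and it is identically 0.
y = x - x

# A point mass at 0 is not a normal distribution.
all_zero = bool(np.all(y == 0.0))
```

Every sample of Y is exactly 0, so Y has no density at all, let alone a Gaussian one.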
"Fact: Any linear combination of independent normal random variables has a normal distribution."
Is the condition "independent" here absolutely necessary? If we remove the word "independent", would the linear combination still be normally distributed? Why or why not?
Thank you!
No!
(This counterexample is wrong; see Laurent's post below.) Counterexample: let's say your r.v. pair (X, Y) has a joint characteristic function . Clearly they are marginally Gaussian, as you require, and the distribution of the sum is not Gaussian.
However, if your question is whether "independence" is necessary, then it's not. Counterexample: what if X = 3Z + Y, where Z and Y are independent normal?
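A quick simulation of this example (a sketch, assuming NumPy): here X and Y are dependent, since Cov(X, Y) = Cov(3Z + Y, Y) = Var(Y) = 1, yet X + Y = 3Z + 2Y is a combination of independent normals, hence normal with variance 9 + 4 = 13.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

z = rng.normal(size=n)   # Z ~ Normal(0, 1)
y = rng.normal(size=n)   # Y ~ Normal(0, 1), independent of Z
x = 3 * z + y            # X ~ Normal(0, 10), dependent on Y

s = x + y                # = 3Z + 2Y: a combination of independent normals

sample_cov = float(np.cov(x, y)[0, 1])  # should be near Cov(X, Y) = 1
sample_var = float(np.var(s))           # should be near Var(X + Y) = 13
```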
Unfortunately, I haven't learnt about joint characteristic functions in my class yet (I have learnt moment generating functions, however), so I don't have the background required to understand your example. I am sorry about that.
Since they are independent, X must be normally distributed as well. Now my question is: if the random variables are NOT independent (see my example below), would a linear combination of those random variables also be normally distributed?

What if X = 3Z + Y, where Z and Y are independent normal?
In short, my question is:
Suppose that X ~ Normal(0, 1) and Y ~ Normal(0, 1), where X and Y are NOT independent.
Can we say that X + Y ~ Normal(0, 2 + 2 Cov(X, Y)) for sure?
Thanks!
You didn't get my point. I wanted to say X + Y is normally distributed even though X and Y are not independent.
My previous post is trying to tell you that there is no general answer: sometimes the sum could be normal, sometimes not.

Now my question is: if the random variables are NOT independent (see my example below), would a linear combination of those random variables also be normally distributed?
In short, my question is:
Suppose that X ~ Normal(0, 1) and Y ~ Normal(0, 1), where X and Y are NOT independent.
Can we say that X + Y ~ Normal(0, 2 + 2 Cov(X, Y)) for sure?
Thanks!
To be precise, what you're dealing with is a moment generating function (a kind of Laplace transform) rather than a characteristic function (a kind of Fourier transform).
By the way, it is not easy to prove that a given function is the moment generating function of some probability distribution (it involves Bochner's theorem, which is hard to check in general), so you should first prove that your function is indeed an m.g.f. Actually, it is an m.g.f., because it is that of a Gaussian vector...
The marginals are indeed standard Gaussian r.v.'s, but the m.g.f. of the sum is (take ), which is the m.g.f. of a centered Gaussian with variance 4... Hence this is no counterexample.
Matheagle gave a working counterexample. Since Dirac measures can be seen as "limit cases" of Gaussian distributions when the variance goes to 0, I tend to prefer the following one: let X and ε be independent r.v.'s, where X is a standard Gaussian r.v. and ε has the distribution given by P(ε = 1) = P(ε = -1) = 1/2. Then let Y = εX, so that Y is a standard Gaussian, while X + Y is 0 with probability 1/2, hence it is not Gaussian, and it is not degenerate.
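A simulation of this kind of counterexample (a sketch, assuming NumPy; take X standard normal, eps = ±1 with probability 1/2 each and independent of X, and Y = eps·X):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400_000

x = rng.normal(size=n)                 # X ~ Normal(0, 1)
eps = rng.choice([-1.0, 1.0], size=n)  # P(eps = 1) = P(eps = -1) = 1/2, independent of X
y = eps * x                            # Y = eps * X is standard Gaussian by symmetry

s = x + y                              # = (1 + eps) * X: 0 when eps = -1, 2X when eps = 1

frac_zero = float(np.mean(s == 0.0))   # mass at exactly 0 should be about 1/2
var_y = float(np.var(y))               # Y still has variance 1
```

Y passes every marginal check for a standard Gaussian, but X + Y puts probability 1/2 on the single point 0, so it is neither Gaussian nor degenerate.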
Where did you get this example from? If my computations are correct, this still doesn't work, because what you gave seems to be the m.g.f. of the "density" on , which is negative sometimes... This illustrates what I said about Bochner's theorem in my last post: not every function, even if it equals 1 at 0 and is "smooth", is an m.g.f. Being an m.g.f. is a strong condition that can't be checked at first sight.
However, if your question is whether "independence" is necessary, then it's not. Counterexample: what if X = 3Z + Y, where Z and Y are independent normal?

Then X = 3Z - Y would be a better example.

You didn't get my point. I wanted to say X + Y is normally distributed even though X and Y are not independent.
However, he talked about independence for "any linear combination" of normal random variables, not for X + Y in particular.
With Gaussian vectors, we were told that normal r.v.'s X_1, ..., X_n form a Gaussian vector (meaning that any linear combination of its components follows a normal distribution) if (and only if?) they're independent.
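On the "only if" part: independence is sufficient but not necessary, since a correlated multivariate normal is still a Gaussian vector, and every linear combination a·X is then normal with variance aᵀΣa. A quick check (a sketch, assuming NumPy; the covariance value 0.5 is an arbitrary choice of mine):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000

# A Gaussian vector with standard-normal components that are NOT independent.
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
samples = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)

# The combination X1 + X2 should be Normal(0, 2 + 2*Cov(X1, X2)) = Normal(0, 3).
a = np.array([1.0, 1.0])
combo = samples @ a

expected_var = float(a @ cov @ a)   # 2 + 2 * 0.5 = 3
sample_var = float(np.var(combo))
```

In this jointly Gaussian case, the formula X + Y ~ Normal(0, 2 + 2 Cov(X, Y)) asked about earlier does hold; the counterexamples in this thread show it can fail when the pair is not jointly Gaussian.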