
Math Help - Linear combination of normal random variables

  1. #1
    Senior Member
    Joined
    Jan 2009
    Posts
    404

    Linear combination of normal random variables

    "Fact: Any linear combination of independent normal random variables has a normal distribution."

    Is the condition "independent" here absolutely necessary? If we remove the word "independent", would the linear combination still be normally distributed? Why or why not?

    Thank you!

  2. #2
    MHF Contributor matheagle's Avatar
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
    Yes, you need independence...

    Let X be any normal r.v.
    Then the linear combination Y = X - X is not normal:
    P(Y = 0) = 1, a point mass, which is not a normal distribution.
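    A minimal NumPy sketch of this counterexample, written as X + (-X); the library choice, seed and sample size are arbitrary, not part of the post:

    Code:
    import numpy as np

    rng = np.random.default_rng(0)    # seed is arbitrary
    X = rng.standard_normal(100_000)  # X ~ N(0,1)
    Y = -X                            # Y is also N(0,1), but completely dependent on X
    W = X + Y                         # the linear combination in question

    print(np.all(W == 0))             # True: P(W = 0) = 1, a point mass
    print(W.var())                    # 0.0 -- no normal distribution has zero variance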

  3. #3
    Senior Member
    Joined
    Jan 2009
    Posts
    404
    Say we have X~Normal(0,1) and Y~Normal(0,1).

    E(X+Y) = E(X)+E(Y) = 0
    V(X+Y) = V(X)+V(Y)+2Cov(X,Y) = 2+2Cov(X,Y)

    Can we now say that X+Y~Normal(0, 2+2Cov(X,Y) ) ?

    Thanks!
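    The mean and variance computations above can be sanity-checked by simulation. Here is a rough NumPy sketch, assuming a jointly normal pair with covariance 0.6 (an arbitrary choice); note it only checks the two moment formulas -- in this particular construction X and Y are jointly normal, so the sum also happens to be normal, which is exactly the question the rest of the thread is about:

    Code:
    import numpy as np

    rng = np.random.default_rng(1)                # seed is arbitrary
    cov_xy = 0.6                                  # arbitrary covariance between the two N(0,1) marginals
    Sigma = [[1.0, cov_xy], [cov_xy, 1.0]]
    X, Y = rng.multivariate_normal([0.0, 0.0], Sigma, size=200_000).T

    S = X + Y
    print(S.mean())   # approx 0   = E(X) + E(Y)
    print(S.var())    # approx 3.2 = V(X) + V(Y) + 2 Cov(X,Y) = 2 + 2*0.6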

  4. #4
    Lord of certain Rings
    Isomorphism's Avatar
    Joined
    Dec 2007
    From
    IISc, Bangalore
    Posts
    1,465
    Thanks
    6
    Quote Originally Posted by kingwinner View Post
    Say we have X~Normal(0,1) and Y~Normal(0,1).

    E(X+Y) = E(X)+E(Y) = 0
    V(X+Y) = V(X)+V(Y)+2Cov(X,Y) = 2+2Cov(X,Y)

    Can we now say that X+Y~Normal(0, 2+2Cov(X,Y) ) ?

    Thanks!
    No!

    (This counterexample is wrong; see Laurent's post below.) Counterexample: let's say your r.v. pair X, Y has joint char function \phi(s_1,s_2) = \mathbb{E}(e^{s_1X+s_2Y}) = \exp\left(\frac{s_1^2 + s_2^2}{2} + s_1 s_2\right). Clearly they are marginally Gaussian, as you require, and the distribution of the sum is not Gaussian.

    However, if your question is whether "independence" is necessary, then it's not. Counterexample: what if X = 3Z + Y, where Z and Y are independent normals?
    Last edited by Isomorphism; May 21st 2009 at 06:23 AM.
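    A rough simulation sketch of the X = 3Z + Y example, assuming NumPy and SciPy (the seed, sample size and quantile grid are arbitrary choices): X and Y are correlated, hence not independent, yet X + Y = 3Z + 2Y is a combination of independent normals and should behave like N(0, 13).

    Code:
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)     # seed is arbitrary
    n = 200_000
    Z = rng.standard_normal(n)
    Y = rng.standard_normal(n)
    X = 3*Z + Y                        # X ~ N(0, 10), dependent on Y

    print(np.cov(X, Y)[0, 1])          # approx 1: X and Y are correlated, hence dependent

    S = X + Y                          # = 3Z + 2Y, a combination of *independent* normals
    print(S.mean(), S.var())           # approx 0 and 13
    qs = [0.01, 0.25, 0.5, 0.75, 0.99]
    print(np.quantile(S, qs))                            # empirical quantiles of S...
    print(stats.norm.ppf(qs, loc=0, scale=np.sqrt(13)))  # ...closely match the N(0, 13) quantiles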

  5. #5
    Senior Member
    Joined
    Jan 2009
    Posts
    404
    Unfortunately, I haven't learnt joint characteristic functions in my class yet (I have learnt moment generating functions, however), so I don't have the required background to understand your example. I am sorry about that.

    What if X = 3Z + Y, where Z and Y are independent normals?
    Since they are independent, X must be normally distributed as well. Now my question is: if the random variables are NOT independent (see my example below), would a linear combination of those random variables also be normally distributed?

    In short, my question is:
    Suppose that X~Normal(0,1), Y~Normal(0,1), where X and Y are NOT independent
    Can we say that X+Y~Normal(0, 2+2Cov(X,Y) ) for sure?


    Thanks!
    Last edited by kingwinner; May 21st 2009 at 03:34 AM.

  6. #6
    Lord of certain Rings
    Isomorphism's Avatar
    Joined
    Dec 2007
    From
    IISc, Bangalore
    Posts
    1,465
    Thanks
    6
    Quote Originally Posted by kingwinner View Post
    Since they are independent, X must be normally distributed as well.
    You didn't get my point. I wanted to say that X + Y is normally distributed even though X and Y are not independent.



    Now my question is: if the random variables are NOT independent (see my example below), would a linear combination of those random variables also be normally distributed?

    In short, my question is:
    Suppose that X~Normal(0,1), Y~Normal(0,1), where X and Y are NOT independent
    Can we say that X+Y~Normal(0, 2+2Cov(X,Y) ) for sure?


    Thanks!
    My previous post is trying to tell you that there is no general answer: sometimes the sum is normal, sometimes not.

  7. #7
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by Isomorphism View Post
    No!

    Counterexample: let's say your r.v. pair X, Y has joint char function \phi(s_1,s_2) = \mathbb{E}(e^{s_1X+s_2Y}) = \exp\left(\frac{s_1^2 + s_2^2}{2} + s_1 s_2\right). Clearly they are marginally Gaussian, as you require, and the distribution of the sum is not Gaussian.
    To be precise, what you're dealing with is a moment generating function (a kind of Laplace transform) rather than a characteristic function (a kind of Fourier transform).
    By the way, it is not easy to prove that a given function is the moment generating function of some probability distribution (it involves Bochner's theorem, whose conditions are hard to check in general), so you should first prove that your function is indeed an m.g.f. Actually it is an m.g.f., because it is that of a Gaussian vector...

    Marginals are indeed standard Gaussian r.v., but the m.g.f. of the sum is \mathbb{E}[e^{s(X+Y)}]=\exp(2s^2) (take s_1=s_2=s), which is the m.g.f. of a centered Gaussian with variance 4... Hence this is no counterexample.

    Matheagle gave a working counterexample. Since Dirac measures can be seen as "limit cases" of Gaussian distribution when the variance goes to 0, I tend to prefer the following one: Let X,\varepsilon be independent r.v. where X is a standard Gaussian r.v., and \varepsilon has a distribution given by P(\varepsilon=+1)=P(\varepsilon=-1)=1/2. Then let Y=\varepsilon X, so that Y is a standard Gaussian, while X+Y=(1+\varepsilon)X is 0 with probability 1/2, hence it is not Gaussian, and it is not degenerate.
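    A minimal NumPy sketch of Laurent's \varepsilon construction; the seed and sample size are arbitrary. Y comes out standard normal and uncorrelated with X, yet X+Y has an atom at 0 of mass roughly 1/2, so it cannot be normal.

    Code:
    import numpy as np

    rng = np.random.default_rng(3)            # seed is arbitrary
    n = 200_000
    X = rng.standard_normal(n)                # X ~ N(0,1)
    eps = rng.choice([-1.0, 1.0], size=n)     # independent sign, P(+1) = P(-1) = 1/2
    Y = eps * X                               # still standard normal (sign flip of a symmetric r.v.)
    S = X + Y                                 # = (1 + eps) * X

    print(Y.mean(), Y.var())                  # approx 0 and 1
    print(np.cov(X, Y)[0, 1])                 # approx 0: X and Y are uncorrelated
    print(np.mean(S == 0.0))                  # approx 0.5: point mass at 0, so S is not normal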

  8. #8
    Lord of certain Rings
    Isomorphism's Avatar
    Joined
    Dec 2007
    From
    IISc, Bangalore
    Posts
    1,465
    Thanks
    6
    Quote Originally Posted by Laurent View Post
    To be precise, what you're dealing with is a moment generating function (a kind of Laplace transform) rather than a characteristic function (a kind of Fourier transform).

    By the way, it is not easy to prove that a given function is the moment generating function of some probability distribution (it involves Bochner's theorem, whose conditions are hard to check in general), so you should first prove that your function is indeed an m.g.f. Actually it is an m.g.f., because it is that of a Gaussian vector...

    Marginals are indeed standard Gaussian r.v., but the m.g.f. of the sum is \mathbb{E}[e^{s(X+Y)}]=\exp(2s^2) (take s_1=s_2=s), which is the m.g.f. of a centered Gaussian with variance 4... Hence this is no counterexample.
    Whoops!

    Perhaps I should have gone with my standard counterexample: \phi(s_1,s_2) = \mathbb{E}(e^{s_1X+s_2Y}) = (1 + s_1 s_2)\exp\left(\frac{s_1^2 + s_2^2}{2}\right)

    In my basic random processes classes these examples were considered good enough. So does this new example conform to the rigorous approach?

    Your counterexample is nice

  9. #9
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by Isomorphism View Post
    Whoops!

    Perhaps I should have gone with my standard counterexample: \phi(s_1,s_2) = \mathbb{E}(e^{s_1X+s_2Y}) = (1 + s_1 s_2)\exp\left(\frac{s_1^2 + s_2^2}{2}\right)
    Where did you get this example from? If my computations are correct, this still doesn't work, because what you gave seems to be the m.g.f. of the "density" f(x,y)=\frac{1}{2\pi}(1+xy)\exp\left(-\frac{x^2+y^2}{2}\right) on \mathbb{R}^2, which is sometimes negative... This illustrates what I said about Bochner's theorem in my last post: not every function, even if it equals 1 at 0 and is "smooth", is an m.g.f. Being an m.g.f. is a strong condition that can't be checked at first sight.
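    A rough numerical check of this, assuming SciPy is available (the test points for (s_1, s_2) and the truncation of the integration range at +/-8 are arbitrary): the proposed \phi really does equal \int\!\!\int e^{s_1 x + s_2 y} f(x,y)\,dx\,dy for the f above, and f takes negative values, so it is not a probability density and \phi is not the m.g.f. of any distribution.

    Code:
    import numpy as np
    from scipy.integrate import dblquad

    def f(x, y):
        # the candidate "density" from the quote above; it can go negative, e.g. at (2, -2)
        return (1 + x*y) * np.exp(-(x**2 + y**2) / 2) / (2*np.pi)

    def mgf_numeric(s1, s2):
        # dblquad integrates func(y, x) over x in [-8, 8], y in [-8, 8]; the tails are negligible there
        val, _ = dblquad(lambda y, x: f(x, y) * np.exp(s1*x + s2*y),
                         -8, 8, lambda x: -8, lambda x: 8)
        return val

    for s1, s2 in [(0.3, 0.5), (1.0, -0.7)]:
        closed_form = (1 + s1*s2) * np.exp((s1**2 + s2**2) / 2)
        print(s1, s2, mgf_numeric(s1, s2), closed_form)   # the two values agree

    print(f(2.0, -2.0))   # negative, so f is not a genuine probability density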

  10. #10
    MHF Contributor matheagle's Avatar
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
    Again, if Y = -X, then W = X + Y is not normal: P(W = 0) = 1.
    The negative of an N(0,1) is an N(0,1), so both X and Y are standard normals, but their sum is not normal; it is not even continuous.

  11. #11
    A Cute Angle Moo's Avatar
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6
    However, if your question is whether "independence" is necessary, then it's not. Counterexample: what if X = 3Z + Y, where Z and Y are independent normals?
    You didn't get my point. I wanted to say that X + Y is normally distributed even though X and Y are not independent.
    Then X=3Z-Y would be a better example

    However, he talked about independence for "any linear combination" of normal random variables, not for X+Y in particular.
    With Gaussian vectors, we were told that X_1,\dots,X_n form a Gaussian vector (meaning that any linear combination of its components follows a normal distribution) if (and only if ?) they're independent.

  12. #12
    MHF Contributor matheagle's Avatar
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
    He's going to keep asking this question until the COWS (MOOO) come home.
    He knows the answer by now.

  13. #13
    Senior Member
    Joined
    Jan 2009
    Posts
    404
    How about this?
    "ANY linear combination of uncorrelated normal random variables has a normal distribution."

    Is this a correct statement? Why or why not?

    Thanks!
    Last edited by kingwinner; May 23rd 2009 at 02:09 PM.

  14. #14
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by kingwinner View Post
    How about this?
    "ANY linear combination of uncorrelated normal random variables has a normal distribution."

    Is this a correct statement? Why or why not?

    Thanks!
    Did you check our counterexamples first? In mine (post #7), the r.v. X and Y are uncorrelated (\text{Cov}(X,Y)=\mathbb{E}[\varepsilon X^2]=\mathbb{E}[\varepsilon]\,\mathbb{E}[X^2]=0), yet X+Y is not normal.

  15. #15
    MHF Contributor matheagle's Avatar
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
    I guess the cows aren't home yet.
