
Two random variables equal in distribution?

  1. #1
    Senior Member
    Joined
    Jan 2009
    Posts
    404

    Two random variables equal in distribution?

    Let X1,X2,X3,Y1,Y2,Y3 be random variables.
    If X1 and Y1 have the same distribution,
    X2 and Y2 have the same distribution,
    X3 and Y3 have the same distribution,
    then is it true that X1+X2+X3 and Y1+Y2+Y3 will have the same distribution? Why or why not?

    Any help is appreciated!

  2. #2
    Moo
    A Cute Angle
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6
    Hello,

    I don't think it is. Maybe you can have a look at Gaussian vectors with covariance matrices whose diagonals are the same but whose off-diagonal entries differ (and with the same expectation vector).

  3. #3
    matheagle
    MHF Contributor
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
    (X1,X2,X3) and (Y1,Y2,Y3) would have to have the same joint distribution.

  4. #4
    Moo
    A Cute Angle
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6
    Let's imagine the Gaussian vector M=(X_1,X_2,X_3) with the expectation vector \mu\in\mathbb{R}^3 and the covariance matrix K=\begin{pmatrix} a_{11}^2 & a_{12} & a_{13} \\ a_{12} & a_{22}^2 & a_{23} \\ a_{13} & a_{23} & a_{33}^2 \end{pmatrix}

    and then the Gaussian vector N=(Y_1,Y_2,Y_3) with the expectation vector \mu and the covariance matrix L=\begin{pmatrix} a_{11}^2 & 3a_{12}+1 & a_{13} \\ 3a_{12}+1 & a_{22}^2 & a_{23} \\ a_{13} & a_{23} & a_{33}^2 \end{pmatrix}

    We obviously have X_i\sim Y_i for all i \in\{1,2,3\}, yet since the covariance matrices are different, the pdf of M is different from the pdf of N...

    Am I wrong somewhere ?
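
    Here is a minimal numpy sketch of this construction; the concrete entries below are my own choice, picked so that both matrices are positive definite (only the covariance between the first two coordinates changes):

    Code:
    import numpy as np

    rng = np.random.default_rng(0)
    mu = np.zeros(3)

    # Same diagonal (same marginal variances), different off-diagonal entry.
    K = np.array([[1.0, 0.2, 0.0],
                  [0.2, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
    L = np.array([[1.0, 0.8, 0.0],
                  [0.8, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])

    M = rng.multivariate_normal(mu, K, size=200_000)  # rows are (X1, X2, X3)
    N = rng.multivariate_normal(mu, L, size=200_000)  # rows are (Y1, Y2, Y3)

    # Marginals match: each X_i and each Y_i is N(0, 1).
    print(M.std(axis=0), N.std(axis=0))   # all close to 1

    # But the sums differ: Var(X1+X2+X3) is the sum of all entries of the matrix.
    print(M.sum(axis=1).var())            # about 3 + 2*0.2 = 3.4
    print(N.sum(axis=1).var())            # about 3 + 2*0.8 = 4.6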

  5. #5
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by kingwinner View Post
    Let X1,X2,X3,Y1,Y2,Y3 be random variables.
    If X1 and Y1 have the same distribution,
    X2 and Y2 have the same distribution,
    X3 and Y3 have the same distribution,
    then is it true that X1+X2+X3 and Y1+Y2+Y3 will have the same distribution? Why or why not?

    Any help is appreciated!
    Let X be a random variable with a symmetric non-trivial distribution, like P(X=1)=P(X=-1)=1/2 or a centered Gaussian.

    Let (X_1,X_2,X_3)=(X,X,X) and (Y_1,Y_2,Y_3)=(X,-X,X). Then the hypothesis is fulfilled, while X_1+X_2+X_3=3X and Y_1+Y_2+Y_3=X don't have the same distribution.
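
    A quick simulation sketch of the coin-flip version (the names are just for illustration):

    Code:
    import numpy as np

    rng = np.random.default_rng(0)

    # X takes the values +1 and -1 with probability 1/2 each.
    X = rng.choice([-1, 1], size=100_000)

    # (X1, X2, X3) = (X, X, X)  versus  (Y1, Y2, Y3) = (X, -X, X):
    # each coordinate is +/-1 with probability 1/2, so the marginals agree.
    sum_X = X + X + X    # = 3X, only takes the values -3 and 3
    sum_Y = X - X + X    # =  X, only takes the values -1 and 1

    print(np.unique(sum_X, return_counts=True))   # values {-3, 3}
    print(np.unique(sum_Y, return_counts=True))   # values {-1, 1}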

  6. #6
    Senior Member
    Joined
    Jan 2009
    Posts
    404
    Quote Originally Posted by Laurent View Post
    Let X be a random variable with a symmetric non-trivial distribution, like P(X=1)=P(X=-1)=1/2 or a centered Gaussian.

    Let (X_1,X_2,X_3)=(X,X,X) and (Y_1,Y_2,Y_3)=(X,-X,X). Then the hypothesis is fulfilled, while X_1+X_2+X_3=3X and Y_1+Y_2+Y_3=X don't have the same distribution.
    I see!
    But as matheagle suggested, if (X1,X2,X3) and (Y1,Y2,Y3) have the same JOINT distribution, then X1+X2+X3 and Y1+Y2+Y3 would have the same distribution, right?

  7. #7
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by kingwinner View Post
    I see!
    But as matheagle suggested, if (X1,X2,X3) and (Y1,Y2,Y3) have the same JOINT distribution, then X1+X2+X3 and Y1+Y2+Y3 would have the same distribution, right?
    Right.
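    A quick sketch of why: the sum X_1+X_2+X_3 is a function of the vector (X_1,X_2,X_3), so its distribution depends only on the joint distribution of that vector. Indeed, P(X_1+X_2+X_3\in A)=P\left((X_1,X_2,X_3)\in S_A\right) with S_A=\{(x_1,x_2,x_3)\in\mathbb{R}^3 : x_1+x_2+x_3\in A\}, and the right-hand side is determined by the joint distribution; the same computation with (Y_1,Y_2,Y_3) gives the same value whenever the joint distributions coincide.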

  8. #8
    Senior Member
    Joined
    Jan 2009
    Posts
    404
    Quote Originally Posted by Laurent View Post
    Right.
    I see.
    I am sorry...the following may seem very obvious to you, but to me it is not

    Statement 1:
    X1 and Y1 have the same distribution, AND
    X2 and Y2 have the same distribution.

    Statement 2:
    (X1,X2) and (Y1,Y2) have the same JOINT distribution.

    Are statements 1 and 2 equivalent? If not, what is the difference between them? (please explain in the simplest terms if possible as I am only a 2nd year stat undergrad student)

    I was never able to understand this, and I would really appreciate if you could clarify this concept.

  9. #9
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by kingwinner View Post
    I see.
    I am sorry...the following may seem very obvious to you, but to me it is not

    Statement 1:
    X1 and Y1 have the same distribution, AND
    X2 and Y2 have the same distribution.

    Statement 2:
    (X1,X2) and (Y1,Y2) have the same JOINT distribution.

    Are statements 1 and 2 equivalent? If not, what is the difference between them? (please explain in the simplest terms if possible as I am only a 2nd year stat undergrad student)

    I was never able to understand this, and I would really appreciate if you could clarify this concept.
    My humble attempt at a clarification:

    Plainly, the joint distribution of (X,Y) tells you P(X\in A,Y\in B) for any subsets A,B.
    Then, if you take B=\mathbb{R}, it gives you P(X\in A) for any subset A, which is the distribution of X. So, at least, when you know the joint distribution of (X,Y), you know the distributions of X and Y.
    But you know much more. For instance, X and Y are independent iff P(X\in A,Y\in B)=P(X\in A)P(Y\in B), and this condition only involves the joint distribution. So the joint distribution tells you if X and Y are independent.
    More generally, it contains the way the values of X and Y relate to each other. The very fact that X=Y (almost surely) can be read from the joint distribution, while it is not readable from the distributions of X and Y. For the same distribution \mu, there are many variables (X,Y) such that X and Y have distribution \mu; extreme cases are X=Y of law \mu, and X,Y independent of law \mu.
    Maybe it will be clearer if you think that the joint distribution not only tells you the distribution of X but also the conditional distribution of X given Y: P(X=k|Y=l)=\frac{P(X=k,Y=l)}{P(Y=l)} (the right-hand side depends only on the joint distribution).
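
    Here is a small numerical sketch of the two extreme cases just mentioned, using the fair \pm 1 coin as the common law \mu (the variable names are my own):

    Code:
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Case 1: X = Y (perfectly coupled), each with the fair +/-1 coin law mu.
    X1 = rng.choice([-1, 1], size=n)
    Y1 = X1

    # Case 2: X and Y independent, each with the same law mu.
    X2 = rng.choice([-1, 1], size=n)
    Y2 = rng.choice([-1, 1], size=n)

    # The marginals agree in both cases (about half +1, half -1)...
    print(np.mean(X1 == 1), np.mean(Y1 == 1), np.mean(X2 == 1), np.mean(Y2 == 1))

    # ...but the joint distributions differ:
    print(np.mean((X1 == 1) & (Y1 == -1)))   # ~ 0    (impossible when X = Y)
    print(np.mean((X2 == 1) & (Y2 == -1)))   # ~ 0.25 (independent case)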


    A "visual" way: The joint distribution is a probability measure on \mathbb{R}^2 that describes how the values of (X,Y) are distributed in the plane. You can think of hot spots (or peaks) where the measure gives more probability, and it gets colder and colder at infinity (nearer to 0). Then for instance you may have some very hot spot near (1,2), which means that (X,Y) has high probability to be near that point, i.e. with high probability X is near 1 and at the same time Y is near 2.
    Now, one can see the distributions of X and Y in this setting: they are the distributions on each axis obtained by summing up the measure over the whole line that projects onto the chosen point of the axis. For instance, P(X=x) is obtained by summing up the previous measure over the (vertical) line of equation "X=x"; like a "projection" of the measure. If there is a hot spot at (1,2), then there will be a hot spot at x=1 as well by projection, and at y=2.
    But if there is, for instance, a hot spot at (1,2) and another at (3,4), you will have two spots at 1 and 3 for X, and at 2 and 4 for Y. In that case, you cannot tell from the distributions of X and Y alone whether (1,4) is a likely spot for (X,Y) or not.
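
    A tiny numerical version of that last remark, writing the two joint distributions as tables of probabilities on the points \{1,3\}\times\{2,4\} (the particular numbers are my own choice):

    Code:
    import numpy as np

    # Rows index the values of X (1 and 3), columns index the values of Y (2 and 4).
    # Joint A: all mass on the "hot spots" (1,2) and (3,4).
    A = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
    # Joint B: mass spread over all four corners, including (1,4).
    B = np.array([[0.25, 0.25],
                  [0.25, 0.25]])

    # The marginals are the row and column sums ("projections" onto the axes).
    print(A.sum(axis=1), A.sum(axis=0))   # [0.5 0.5] [0.5 0.5]
    print(B.sum(axis=1), B.sum(axis=0))   # [0.5 0.5] [0.5 0.5]

    # Same marginals, yet P(X=1, Y=4) is 0 under A and 0.25 under B.
    print(A[0, 1], B[0, 1])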

    I don't know if this has clarified anything... You'll probably get used to it and understand the concept progressively.

  10. #10
    Senior Member
    Joined
    Jan 2009
    Posts
    404
    Quote Originally Posted by Laurent View Post
    My humble attempt at a clarification:

    Plainly, the joint distribution of (X,Y) tells you P(X\in A,Y\in B) for any subsets A,B.
    Then, if you take B=\mathbb{R}, it gives you P(X\in A) for any subset A, which is the distribution of X. So, at least, when you know the joint distribution of (X,Y), you know the distributions of X and Y.
    But you know much more. For instance, X and Y are independent iff P(X\in A,Y\in B)=P(X\in A)P(Y\in B), and this condition only involves the joint distribution. So the joint distribution tells you if X and Y are independent.
    More generally, it contains the way the values of X and Y relate to each other. The very fact that X=Y (almost surely) can be read from the joint distribution, while it is not readable from the distributions of X and Y. For the same distribution \mu, there are many variables (X,Y) such that X and Y have distribution \mu; extreme cases are X=Y of law \mu, and X,Y independent of law \mu.
    Maybe it will be clearer if you think that the joint distribution not only tells you the distribution of X but also the conditional distribution of X given Y: P(X=k|Y=l)=\frac{P(X=k,Y=l)}{P(Y=l)} (the right-hand side depends only on the joint distribution).


    A "visual" way: The joint distribution is a probability measure on \mathbb{R}^2 that describes how the values of (X,Y) are distributed in the plane. You can think of hot spots (or peaks) where the measure gives more probability, and it gets colder and colder at infinity (nearer to 0). Then for instance you may have some very hot spot near (1,2), which means that (X,Y) has high probability to be near that point, i.e. with high probability X is near 1 and at the same time Y is near 2.
    Now, one can see the distributions of X and Y in this setting: they are the distributions on each axis obtained by summing up the measure over the whole line that projects onto the chosen point of the axis. For instance, P(X=x) is obtained by summing up the previous measure over the (vertical) line of equation "X=x"; like a "projection" of the measure. If there is a hot spot at (1,2), then there will be a hot spot at x=1 as well by projection, and at y=2.
    But if there is, for instance, a hot spot at (1,2) and another at (3,4), you will have two spots at 1 and 3 for X, and at 2 and 4 for Y. In that case, you cannot tell from the distributions of X and Y alone whether (1,4) is a likely spot for (X,Y) or not.

    I don't know if this has clarified anything... You'll probably get used to it and understand the concept progressively.
    Yes, that clarifies things.
    So I think the point is that the joint distribution of X1 and X2 tells you MORE than the distributions of X1 and X2 do separately. In a sense, the joint distribution of X1 and X2 gives you more complete information (e.g. whether X1 and X2 are independent or not).

    The joint distribution tells you MORE, so is it correct to say that Statement 2 implies Statement 1? i.e. IF we know that (X1,X2) and (Y1,Y2) have the same JOINT distribution, THEN X1 and Y1 have the same distribution, AND X2 and Y2 have the same distribution. (But the converse is NOT necessarily true.) Am I right?

  11. #11
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by kingwinner View Post
    The joint distribution tells you MORE, so is it correct to say that Statement 2 implies Statement 1? i.e. IF we know that (X1,X2) and (Y1,Y2) have the same JOINT distribution, THEN X1 and Y1 have the same distribution, AND X2 and Y2 have the same distribution. (But the converse is NOT necessarily true.) Am I right?
    Yes, you are. The joint distribution contains the distributions of the marginals (the "coordinates"), and Statement 1 says that the marginals are the same.

    In fact, the joint distribution of (X1,X2) simply tells you everything you need in order to compute anything about X1 and X2.
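
    For instance, here is a small sketch with a made-up joint table: the distribution of X1+X2 can be read straight off the joint distribution, which is exactly the kind of computation the original question is about:

    Code:
    import numpy as np

    # Made-up joint pmf of (X1, X2) on the values {0, 1} x {0, 1}.
    # Rows index X1, columns index X2.
    joint = np.array([[0.1, 0.4],
                      [0.3, 0.2]])

    x1_vals = np.array([0, 1])
    x2_vals = np.array([0, 1])

    # P(X1 + X2 = s) for each possible sum s, computed only from the joint pmf.
    sums = {}
    for i, a in enumerate(x1_vals):
        for j, b in enumerate(x2_vals):
            sums[a + b] = sums.get(a + b, 0.0) + joint[i, j]
    print(sums)   # {0: 0.1, 1: 0.7, 2: 0.2}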

