I don't think it is. Maybe you can have a look at Gaussian vectors, with covariance matrices whose diagonals are the same but whose off-diagonal entries differ.
(and with the same expectation vector)
Let X1,X2,X3,Y1,Y2,Y3 be random variables.
If X1 and Y1 have the same distribution,
X2 and Y2 have the same distribution,
X3 and Y3 have the same distribution,
then is it true that X1+X2+X3 and Y1+Y2+Y3 will have the same distribution? Why or why not?
Any help is appreciated!
Let's imagine the Gaussian vector M with expectation vector (0, 0) and covariance matrix
[[1, 0], [0, 1]],
and then the Gaussian vector N with expectation vector (0, 0) and covariance matrix
[[1, 1/2], [1/2, 1]].
The marginals of M and N obviously agree (each component is N(0,1)), though since the covariance matrices are different, the pdf of M is different from the pdf of N...
Am I wrong somewhere?
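To make this concrete, here's a small sketch (my own toy numbers: unit-variance marginals, with off-diagonal covariance 0 for one vector and 1/2 for the other). The variance of a sum depends on the covariance, not only on the marginals, so the two sums have different distributions:

```python
# Var(X + Y) = Var(X) + Var(Y) + 2*Cov(X, Y)
# Assumes both components are N(0, 1), so each marginal has variance 1.
def var_of_sum(var_x, var_y, cov_xy):
    return var_x + var_y + 2 * cov_xy

# Independent components (off-diagonal 0) vs. correlated ones (off-diagonal 0.5):
print(var_of_sum(1, 1, 0.0))  # 2.0 -> sum is N(0, 2)
print(var_of_sum(1, 1, 0.5))  # 3.0 -> sum is N(0, 3)
```

Same marginals both times, but the sums are N(0, 2) and N(0, 3): different distributions.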
I am sorry...the following may seem very obvious to you, but to me it is not
Statement 1: X1 and Y1 have the same distribution, AND X2 and Y2 have the same distribution.
Statement 2: (X1,X2) and (Y1,Y2) have the same JOINT distribution.
Are statements 1 and 2 equivalent? If not, what is the difference between them? (please explain in the simplest terms if possible as I am only a 2nd year stat undergrad student)
I was never able to understand this, and I would really appreciate if you could clarify this concept.
Plainly, the joint distribution of (X,Y) tells you P(X ∈ A, Y ∈ B) for any subsets A, B.
Then, if you take B = ℝ (the whole real line), it gives you P(X ∈ A) for any subset A, which is the distribution of X. So, at least, when you know the joint distribution of (X,Y), you know the distributions of X and Y.
But you know much more. For instance, X and Y are independent iff P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for all subsets A, B, and this condition only involves the joint distribution. So the joint distribution tells you if X and Y are independent.
More generally, it contains the way the values of X and Y relate to each other. The very fact that X=Y (almost surely) can be read from the joint distribution, while it is not readable from the distributions of X and Y. For the same distribution μ, there are many pairs (X,Y) such that X and Y each have distribution μ; extreme cases are X=Y with law μ, and X,Y independent, each with law μ.
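Here is a discrete toy version of those two extreme cases (my own choice of example: X and Y are fair bits on {0,1}). Both joints have exactly the same marginals, yet P(X = Y) is 1 in one case and 1/2 in the other:

```python
from itertools import product

# Joint 1: Y = X almost surely; Joint 2: X, Y independent.
# In both cases each marginal is a fair bit: P(0) = P(1) = 0.5.
joint_equal = {(x, x): 0.5 for x in (0, 1)}
joint_indep = {(x, y): 0.25 for x, y in product((0, 1), repeat=2)}

def marginal_x(joint):
    """Distribution of X alone, read off the joint."""
    m = {}
    for (x, _y), p in joint.items():
        m[x] = m.get(x, 0.0) + p
    return m

def prob_equal(joint):
    """P(X = Y), which only the joint can tell you."""
    return sum(p for (x, y), p in joint.items() if x == y)

print(marginal_x(joint_equal) == marginal_x(joint_indep))  # True
print(prob_equal(joint_equal), prob_equal(joint_indep))    # 1.0 0.5
```

So the marginals are identical, but the joints disagree about everything involving X and Y together.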
Maybe it will be clearer if you think that the joint distribution not only tells you the distribution of X but also the conditional distribution of X given Y: P(X ∈ A | Y ∈ B) = P(X ∈ A, Y ∈ B) / P(Y ∈ B) (the right-hand side depends only on the joint distribution).
A "visual" way: The joint distribution is a probability measure on ℝ² that describes how the values of (X,Y) are distributed in the plane. You can think of hot spots (or peaks) where the measure gives more probability, and it gets colder and colder (nearer to 0) toward infinity. Then for instance you may have a very hot spot near (1,2), which means that (X,Y) is near that point with high probability, i.e. with high probability X is near 1 and at the same time Y is near 2.
Now, one can see the distributions of X and Y in this setting: they are distributions on each of the axes, obtained by averaging the measure over the whole line that projects to the chosen point of the axis. For instance, P(X=x) is obtained by averaging the measure on the (vertical) line of equation "X=x"; like a "projection" of the measure. If there was a hot spot at (1,2), then there will be a hot spot at x=1 as well by projection, and at y=2.
But if there are, for instance, a hot spot at (1,2) and another at (3,4), you will have two spots at 1 and 3 for X, and at 2 and 4 for Y. In that case, you cannot tell from the distributions of X and Y alone whether (1,4) is a likely spot for (X,Y) or not.
I don't know if this has clarified anything... You'll probably get used to it and understand the concept progressively.
So I think the point is that the joint distribution of X1 and X2 tells you MORE than the distributions of X1 and X2 separately do. In a sense, the joint distribution of X1 and X2 gives you more complete information (e.g. whether X1 and X2 are independent or not).
The joint distribution tells you MORE, so is it correct to say that Statement 2 implies Statement 1? I.e., IF we know that (X1,X2) and (Y1,Y2) have the same JOINT distribution, THEN X1 and Y1 have the same distribution, AND X2 and Y2 have the same distribution. (But the converse is NOT necessarily true.) Am I right?
In fact, the joint distribution of (X1,X2) simply tells you everything you need in order to compute anything about X1 and X2.