# Thread: Proof of independence of random variables

1. ## Proof of independence of random variables

Hi, I have a question.

Two random variables X and Y are independent. How can I prove that X and Y+1 are also independent?

Thanks

2. Originally Posted by noname
Hi, I have a question.

Two random variables X and Y are independent. How can I prove that X and Y+1 are also independent?

Thanks
You can either show that:

$f(x,y+1)=g(x)h(y+1)$

or that:

$f(y+1|x)=h(y+1)$

where $f(x,y+1)$ is the joint pdf of $X$ and $Y+1$, $g(x)$ is the pdf of $X$, $h(y+1)$ is the pdf of $Y+1$, and $f(y+1|x)$ is the conditional pdf of $Y+1$ given $X=x$.

CB
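To see the factorization criterion in action, here is a small numerical sketch (the concrete choice of distributions is my assumption, not from the thread): if X and Y are independent standard normals, the joint pdf of (X, Y+1) is $\varphi(x)\varphi(y-1)$, which already has the product form $g(x)h(y)$.

```python
import math

# Hypothetical concrete case (my assumption): X, Y independent standard
# normals, so the joint pdf of (X, Y+1) is phi(x) * phi(y - 1).

def phi(t):  # standard normal pdf
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def joint_pdf(x, y):  # pdf of (X, Y + 1)
    return phi(x) * phi(y - 1)

def g(x):             # marginal pdf of X
    return phi(x)

def h(y):             # marginal pdf of Y + 1 (normal with mean 1)
    return phi(y - 1)

# Check the factorization f(x, y) = g(x) h(y) on a grid of points.
for x in [-2.0, -0.5, 0.0, 1.3]:
    for y in [-1.0, 0.0, 0.7, 2.5]:
        assert abs(joint_pdf(x, y) - g(x) * h(y)) < 1e-12
print("factorization holds at all grid points")
```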

3. Is this correct?

4. Hello,

Hmmm actually, you have to prove that $f(x,y+1)$, the joint pdf of X and Y+1, can be written in such a form. In what you did, you assumed it was true.

-----
Let $f$ be the joint pdf of X and Y. Since X and Y are independent, we have $f(x,y)=f_1(x)f_2(y)$, where $f_1$ and $f_2$ are the pdfs of X and Y respectively.
For any measurable function h, we have :

\begin{aligned}\mathbb{E}(h(X,Y+1))&=\int_{\mathbb{R}^2} h(x,y+1)f(x,y) ~dxdy \quad (*) \\
&=\int_{\mathbb{R}^2} h(x,y)f(x,y-1) ~dxdy \quad \text{(substitute } u=y+1\text{, then rename } u \text{ as } y\text{)} \\
&=\int_{\mathbb{R}^2} h(x,y)f_1(x)f_2(y-1) ~dxdy
\end{aligned}

So $f_1(x)f_2(y-1)$ is the pdf of $(X,Y+1)$
And this proves that X and Y+1 are independent.

If you can't see (*), consider that you're working on $\mathbb{E}((h\circ p)(X,Y))$, where $p(a,b)=(a,b+1)$, and use the general fact:
$\mathbb{E}(h(X))=\int_{\mathbb{R}} h(x)f(x) ~dx$ iff f is the pdf of X.
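If it helps to see this computation concretely, here is a numerical sanity check of the change-of-variables step (my own sketch, not from the thread): taking X and Y to be independent standard normals, a Monte Carlo estimate of $\mathbb{E}(h(X,Y+1))$ is compared against a numerical integral of $h(x,y)f_1(x)f_2(y-1)$.

```python
import math, random

# Sanity check (my assumption: X, Y independent standard normals):
# E[h(X, Y+1)] should match the integral of h(x, y) f1(x) f2(y-1) dx dy.

def phi(t):  # standard normal pdf, playing the role of f1 and f2
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def h(x, y):  # an arbitrary bounded test function
    return math.exp(-(x * x + y * y))

# Monte Carlo estimate of E[h(X, Y+1)]
random.seed(0)
n = 200_000
mc = sum(h(random.gauss(0, 1), random.gauss(0, 1) + 1) for _ in range(n)) / n

# Double integral of h(x, y) * f1(x) * f2(y - 1) over a truncated grid
step, lo, hi = 0.05, -6.0, 6.0
grid = [lo + i * step for i in range(int((hi - lo) / step) + 1)]
integral = sum(h(x, y) * phi(x) * phi(y - 1)
               for x in grid for y in grid) * step * step

assert abs(mc - integral) < 0.01
print("Monte Carlo and pdf-based integral agree")
```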

And as usual, if there's any mistake, tell me... I'm so self-confident you know...

5. Originally Posted by noname
Hi, I have a question.

Two random variables X and Y are independent. How can I prove that X and Y+1 are also independent?

Thanks
You could use the very definition of independence.

If X and Y are discrete r.v.: for all $x,y$, we have $P(X=x,Y+1=y)=P(X=x,Y=y-1)$ $=P(X=x)P(Y=y-1)=P(X=x)P(Y+1=y)$ (the comma means "and"; I used the independence of X and Y in the second equality). This proves that $X$ and $Y+1$ are independent.

General case: for all measurable subsets $A,B$, we have $P(X\in A,Y+1\in B)=P(X\in A,Y\in B-1)$ $=P(X\in A)P(Y\in B-1)=P(X\in A)P(Y+1\in B)$ (the comma means "and"; $B-1=\{y-1|y\in B\}$; I used the independence of X and Y in the second equality). This proves that $X$ and $Y+1$ are independent.
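The discrete case can be checked exhaustively on a small example (the choice of fair dice is my assumption): enumerate the joint distribution of (X, Y+1) and verify $P(X=x, Y+1=y)=P(X=x)P(Y+1=y)$ for every pair.

```python
from itertools import product
from fractions import Fraction

# Discrete illustration (my example): X and Y are independent fair dice.
# Check P(X = x, Y + 1 = y) == P(X = x) * P(Y + 1 = y) for every (x, y).

faces = range(1, 7)
p = Fraction(1, 36)  # each (X, Y) outcome is equally likely

joint = {}           # distribution of (X, Y + 1)
for x, y in product(faces, faces):
    joint[(x, y + 1)] = joint.get((x, y + 1), 0) + p

px = {x: Fraction(1, 6) for x in faces}        # marginal of X
py1 = {y + 1: Fraction(1, 6) for y in faces}   # marginal of Y + 1

for (x, z), prob in joint.items():
    assert prob == px[x] * py1[z]
print("independence verified for all 36 outcomes")
```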

6. Originally Posted by Laurent
You could use the very definition of independence.

[...snip]

General case: for all measurable subsets $A,B$, we have $P(X\in A,Y+1\in B)=P(X\in A,Y\in B-1)$ $=P(X\in A)P(Y\in B-1)=P(X\in A)P(Y+1\in B)$ (the comma means "and" ; $B-1=\{y-1|y\in B\}$ ; I used the independence of X and Y in the second equality). This proves that $X$ and $Y+1$ are independent.
Hi,

Can I use the same approach to prove the following?

$X_1, X_2, Y_1, Y_2$ are independent random variables; their expectations and variances satisfy $E(X_i)=E(X)$, $E(Y_j)=E(Y)$, $var(X_i)=var(X)$, $var(Y_j)=var(Y)$, where $i=1,2$ and $j=1,2$.

Is $Z_1=X_1 Y_1$ independent of $Z_2=X_2 Y_2$?

Thanks!

7. Originally Posted by Ruby
Can I use the same approach to prove the following?

$X_1, X_2, Y_1, Y_2$ are independent random variables; their expectations and variances satisfy $E(X_i)=E(X)$, $E(Y_j)=E(Y)$, $var(X_i)=var(X)$, $var(Y_j)=var(Y)$, where $i=1,2$ and $j=1,2$.

Is $Z_1=X_1 Y_1$ independent of $Z_2=X_2 Y_2$?
No, you can't just mimic the proof.

In fact, there is a very general (and intuitive) result that answers both the initial question and yours. I only give two examples of statements; the generalization is straightforward:

if $X_1,X_2,X_3,X_4$ are independent random variables, and $f_1:\mathbb{R}^2\to\mathbb{R}$, $f_2:\mathbb{R}^2\to\mathbb{R}$ are (measurable) functions, then $f_1(X_1,X_2)$ and $f_2(X_3,X_4)$ are independent.

if $X_1,X_2,X_3,X_4$ are independent random variables, and $f_1:\mathbb{R}^3\to\mathbb{R}$, $f_2:\mathbb{R}\to\mathbb{R}$ are (measurable) functions, then $f_1(X_1,X_2,X_3)$ and $f_2(X_4)$ are independent.

In the generalized version, there can be any number of random variables, and the functions may depend on variously sized groups of variables provided they are disjoint.
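The grouping result can be illustrated on Ruby's example with small discrete supports (the uniform-on-{1,2} distributions are my assumption): $Z_1=X_1Y_1$ depends only on $(X_1,Y_1)$ and $Z_2=X_2Y_2$ only on $(X_2,Y_2)$, two disjoint groups, so the joint distribution should factor.

```python
from itertools import product
from fractions import Fraction

# Illustration (my assumption: X1, X2, Y1, Y2 independent, each uniform
# on {1, 2}).  Z1 = X1*Y1 and Z2 = X2*Y2 use disjoint groups of
# variables, so they should come out independent.

vals = (1, 2)
q = Fraction(1, 16)  # each of the 2^4 outcomes is equally likely

joint, pz1, pz2 = {}, {}, {}
for x1, x2, y1, y2 in product(vals, repeat=4):
    z1, z2 = x1 * y1, x2 * y2
    joint[(z1, z2)] = joint.get((z1, z2), 0) + q
    pz1[z1] = pz1.get(z1, 0) + q
    pz2[z2] = pz2.get(z2, 0) + q

for (a, b), prob in joint.items():
    assert prob == pz1[a] * pz2[b]
print("Z1 and Z2 are independent in this example")
```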

8. Originally Posted by Laurent
No, you can't just mimic the proof.

In fact, there is a very general (and intuitive) result that answers both the initial question and yours. I only give two examples of statements; the generalization is straightforward:

if $X_1,X_2,X_3,X_4$ are independent random variables, and $f_1:\mathbb{R}^2\to\mathbb{R}$, $f_2:\mathbb{R}^2\to\mathbb{R}$ are (measurable) functions, then $f_1(X_1,X_2)$ and $f_2(X_3,X_4)$ are independent.

[...snip]

This brings us back to Moo's proof, I think. In the generalized version I should have the following:

Two r.v.s (but this should also hold for $n$ r.v.s, pairwise independent) $X_1$ and $X_2$ are independent if and only if, for every pair of (measurable) functions $f_1$ and $f_2$,

$E[f_1(X_1)f_2(X_2)] = E[f_1(X_1)]E[f_2(X_2)]$

provided that the expectations exist.

A special case is of course when two r.v.s $X$ and $Y$ are independent, and you have to check the independence of $f_1(X) = X$ and $f_2(Y) = Y+1$, which is noname's question...

Am I right?

9. Originally Posted by Ruby
This brings us back to Moo's proof, I think. In the generalized version I should have the following:

Two r.v.s (but this should also hold for $n$ r.v.s, pairwise independent) $X_1$ and $X_2$ are independent if and only if, for every pair of (measurable) functions $f_1$ and $f_2$,

$E[f_1(X_1)f_2(X_2)] = E[f_1(X_1)]E[f_2(X_2)]$

provided that the expectations exist.

The problem with Moo's and Captain Black's proofs is that they seemed to assume that the random variables had probability density functions; that's why I posted another elementary and general proof.

The property you're quoting is a (quick) consequence of the definition of independence (the definition is when $f_i=1_{A_i}$, indicator function of set $A_i$); therefore it can be used to prove noname's question, but not yours. And it is not directly related to my previous post, which gave a way to get new independent r.v. from independent r.v. (by applying functions to disjoint groups of r.v.).

If you would like to answer noname's question using the property you give: let $f_1,f_2$ be such that $E[f_1(X)]$ and $E[f_2(Y+1)]$ exist. Remark that $f_2(Y+1)=\widetilde{f_2}(Y)$ where $\widetilde{f_2}(y)=f_2(y+1)$. Therefore, since $X$ and $Y$ are independent, $E[f_1(X)f_2(Y+1)]=E[f_1(X)\widetilde{f_2}(Y)]$ $=E[f_1(X)]E[\widetilde{f_2}(Y)]=E[f_1(X)]E[f_2(Y+1)]$. This proves that $X$ and $Y+1$ are independent. Compare with my previous proof.
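Laurent's computation can be walked through numerically on a discrete case (the fair dice and the particular $f_1, f_2$ are my choices): with exact rational arithmetic, $E[f_1(X)f_2(Y+1)]$ comes out equal to $E[f_1(X)]\,E[f_2(Y+1)]$.

```python
from itertools import product
from fractions import Fraction

# Numeric walk-through of the expectation identity (my choices: X, Y
# independent fair dice, f1(x) = x^2, f2(y) = 1/y).

faces = range(1, 7)
p = Fraction(1, 6)

f1 = lambda x: Fraction(x * x)   # arbitrary test functions
f2 = lambda y: Fraction(1, y)

# E[f1(X) f2(Y+1)] over the joint distribution, vs the product of marginals
lhs = sum(f1(x) * f2(y + 1) * p * p for x, y in product(faces, faces))
e1 = sum(f1(x) * p for x in faces)
e2 = sum(f2(y + 1) * p for y in faces)

assert lhs == e1 * e2
print("E[f1(X) f2(Y+1)] == E[f1(X)] E[f2(Y+1)]")
```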

10. Thanks for your time Laurent, it's all becoming clearer to me now!

11. Today I took my Statistics exam!