# Proof of independence of random variables

• Sep 17th 2009, 12:15 AM
noname
Proof of independence of random variables
Hi, I have a question.

Two random variables X and Y are independent. How can I prove that X and Y+1 are independent?

Thanks
• Sep 17th 2009, 11:00 AM
CaptainBlack
Quote:

Originally Posted by noname
Hi, I have a question.

Two random variables X and Y are independent. How can I prove that X and Y+1 are independent?

Thanks

You can either show that:

$\displaystyle f(x,y+1)=g(x)h(y+1)$

or that:

$\displaystyle f(y+1|x)=h(y+1)$

Where $\displaystyle f(x,y+1)$ is the pdf of the joint distribution, $\displaystyle g(x)$ is the pdf of $\displaystyle X$, $\displaystyle h(y+1)$ is the pdf of $\displaystyle Y+1$, and $\displaystyle f(y+1|x)$ is the conditional distribution of $\displaystyle Y+1$ given $\displaystyle X=x$.
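The factorization criterion can be sanity-checked numerically (this is an illustration, not a proof). A minimal Monte Carlo sketch, assuming for concreteness that X and Y are independent standard normals: if X and Y+1 are independent, the joint CDF of (X, Y+1) factors into the product of the marginals.

```python
# Monte Carlo sanity check (illustration only, assumed X, Y ~ N(0,1) independent):
# P(X <= a, Y+1 <= b) should equal P(X <= a) * P(Y+1 <= b).
import random

random.seed(0)
n = 200_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(0, 1) for _ in range(n)]

a, b = 0.5, 1.2  # arbitrary thresholds
joint = sum(1 for x, y in zip(xs, ys) if x <= a and y + 1 <= b) / n
px = sum(1 for x in xs if x <= a) / n
py = sum(1 for y in ys if y + 1 <= b) / n

print(joint, px * py)  # the two estimates agree up to Monte Carlo error
```

Any other thresholds (a, b) should show the same agreement.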

CB
• Sep 17th 2009, 12:40 PM
noname

Is this correct?

http://img200.imageshack.us/img200/5849/equation.png
• Sep 18th 2009, 07:46 AM
Moo
Hello,

Hmmm actually, you have to prove that $\displaystyle f(x,y+1)$, the joint pdf of X and Y+1, can be written in such a form. In what you did, you assumed it was true.

-----
Let f be the joint pdf of X and Y. So we'd have $\displaystyle f(x,y)=f_1(x)f_2(y)$, where $\displaystyle f_1$ and $\displaystyle f_2$ are the pdfs of X and Y.
For any measurable function h, we have :

$\displaystyle \begin{aligned}\mathbb{E}(h(X,Y+1))&=\int_{\mathbb{R}^2} h(x,y+1)f(x,y) ~dxdy \quad (*) \\ &=\int_{\mathbb{R}^2} h(x,y)f(x,y-1) ~dxdy \\ &=\int_{\mathbb{R}^2} h(x,y)f_1(x)f_2(y-1) ~dxdy \end{aligned}$

So $\displaystyle f_1(x)f_2(y-1)$ is the pdf of $\displaystyle (X,Y+1)$. Since it is the product of a function of x alone and a function of y alone, this proves that X and Y+1 are independent.

If you can't see (*), note that you're working with $\displaystyle \mathbb{E}((h\circ p)(X,Y))$, where $\displaystyle p(a,b)=(a,b+1)$, and use the general fact:
$\displaystyle \mathbb{E}(h(X))=\int_{\mathbb{R}} h(x)f(x) ~dx$ for all h iff f is the pdf of X.
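The identity (*) can also be checked numerically on a toy case (an assumed setup for illustration: X, Y independent standard normals and the test function h(x, y) = x² + y), comparing a direct Monte Carlo estimate of E[h(X, Y+1)] with a quadrature of ∫∫ h(x,y) f₁(x) f₂(y−1) dx dy.

```python
# Numerical check of Moo's identity (illustration only):
# E[h(X, Y+1)] = ∫∫ h(x, y) f1(x) f2(y-1) dx dy, with X, Y ~ N(0,1) independent
# and h(x, y) = x**2 + y, so both sides equal E[X^2] + E[Y] + 1 = 2.
import math, random

random.seed(1)
f = lambda t: math.exp(-t * t / 2) / math.sqrt(2 * math.pi)  # N(0,1) pdf
h = lambda x, y: x * x + y

# Monte Carlo estimate of E[h(X, Y+1)]
n = 400_000
mc = sum(h(random.gauss(0, 1), random.gauss(0, 1) + 1) for _ in range(n)) / n

# Grid quadrature of the double integral over [-8, 8]^2
step = 0.05
grid = [-8 + step * i for i in range(321)]
quad = sum(h(x, y) * f(x) * f(y - 1) * step * step for x in grid for y in grid)

print(mc, quad)  # both close to 2
```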

And as usual, if there's any mistake, tell me... I'm so self-confident you know...
• Sep 18th 2009, 09:20 AM
Laurent
Quote:

Originally Posted by noname
Hi, i have a question

Two random variables X and Y are independent. How can i prove that X,Y+1 are independent?

Thanks

You could use the very definition of independence.

If X and Y are discrete r.v.s: for all $\displaystyle x,y$, we have $\displaystyle P(X=x,Y+1=y)=P(X=x,Y=y-1)$ $\displaystyle =P(X=x)P(Y=y-1)=P(X=x)P(Y+1=y)$ (the comma means "and"; I used the independence of X and Y in the second equality). This proves that $\displaystyle X$ and $\displaystyle Y+1$ are independent.

General case: for all measurable subsets $\displaystyle A,B$, we have $\displaystyle P(X\in A,Y+1\in B)=P(X\in A,Y\in B-1)$ $\displaystyle =P(X\in A)P(Y\in B-1)=P(X\in A)P(Y+1\in B)$ (the comma means "and"; $\displaystyle B-1=\{y-1|y\in B\}$; I used the independence of X and Y in the second equality). This proves that $\displaystyle X$ and $\displaystyle Y+1$ are independent.
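The discrete computation above can be replayed exactly on a toy example (an assumed setup for illustration: X and Y independent, each uniform on {1,…,6}), enumerating the joint law of (X, Y+1) with exact rational arithmetic.

```python
# Exact check of the discrete case (illustration only): X, Y independent,
# each uniform on {1,...,6}. Verify P(X=x, Y+1=y) = P(X=x) * P(Y+1=y)
# for every point of the support, using exact fractions.
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # all 36 equally likely pairs
p = Fraction(1, 36)

for x0 in range(1, 7):
    for y0 in range(2, 8):  # support of Y+1
        joint = sum(p for x, y in outcomes if x == x0 and y + 1 == y0)
        px = sum(p for x, _ in outcomes if x == x0)
        py = sum(p for _, y in outcomes if y + 1 == y0)
        assert joint == px * py  # exact equality, no rounding

print("independence of (X, Y+1) verified exactly")
```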
• Sep 18th 2009, 12:19 PM
Ruby
Quote:

Originally Posted by Laurent
You could use the very definition of independence.

[...snip]

General case: for all measurable subsets $\displaystyle A,B$, we have $\displaystyle P(X\in A,Y+1\in B)=P(X\in A,Y\in B-1)$ $\displaystyle =P(X\in A)P(Y\in B-1)=P(X\in A)P(Y+1\in B)$ (the comma means "and" ; $\displaystyle B-1=\{y-1|y\in B\}$ ; I used the independence of X and Y in the second equality). This proves that $\displaystyle X$ and $\displaystyle Y+1$ are independent.

Hi,

Can I use the same approach to prove the following?

$\displaystyle X_1, X_2, Y_1, Y_2$ are independent random variables, with expectations and variances $\displaystyle E(X_i)=E(X), E(Y_j)=E(Y), var(X_i)=var(X), var(Y_j)=var(Y)$ where $\displaystyle i=1,2$ and $\displaystyle j=1,2$.

Is $\displaystyle Z_1=X_1\cdot Y_1$ independent of $\displaystyle Z_2=X_2\cdot Y_2$?

Thanks!
• Sep 18th 2009, 01:40 PM
Laurent
Quote:

Originally Posted by Ruby
Can I use the same approach to prove the following?

$\displaystyle X_1, X_2, Y_1, Y_2$ are independent random variables, with expectations and variances $\displaystyle E(X_i)=E(X), E(Y_j)=E(Y), var(X_i)=var(X), var(Y_j)=var(Y)$ where $\displaystyle i=1,2$ and $\displaystyle j=1,2$.

Is $\displaystyle Z_1=X_1\cdot Y_1$ independent of $\displaystyle Z_2=X_2\cdot Y_2$?

No, you can't just mimic the proof.

In fact, there is a very general (and intuitive) result that answers both the initial question and yours. I only give two examples of statements; the generalization is straightforward:

if $\displaystyle X_1,X_2,X_3,X_4$ are independent random variables, and $\displaystyle f_1:\mathbb{R}^2\to\mathbb{R}$, $\displaystyle f_2:\mathbb{R}^2\to\mathbb{R}$ are (measurable) functions, then $\displaystyle f_1(X_1,X_2)$ and $\displaystyle f_2(X_3,X_4)$ are independent.

if $\displaystyle X_1,X_2,X_3,X_4$ are independent random variables, and $\displaystyle f_1:\mathbb{R}^3\to\mathbb{R}$, $\displaystyle f_2:\mathbb{R}\to\mathbb{R}$ are (measurable) functions, then $\displaystyle f_1(X_1,X_2,X_3)$ and $\displaystyle f_2(X_4)$ are independent.

In the generalized version, there can be any number of random variables, and the functions may depend on groups of variables of any sizes, provided the groups are disjoint.
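This grouping result can be illustrated numerically on Ruby's example (an assumed setup for illustration: all four variables standard normal), with f₁(a,b) = f₂(a,b) = ab applied to the disjoint groups (X₁, Y₁) and (X₂, Y₂).

```python
# Monte Carlo illustration (not a proof) of the grouping result:
# Z1 = X1*Y1 and Z2 = X2*Y2 are built from disjoint groups of four
# independent N(0,1) variables, so their joint CDF should factorize.
import random

random.seed(2)
n = 200_000
z1, z2 = [], []
for _ in range(n):
    x1, y1, x2, y2 = (random.gauss(0, 1) for _ in range(4))
    z1.append(x1 * y1)
    z2.append(x2 * y2)

a = b = 0.3  # arbitrary thresholds
joint = sum(1 for u, v in zip(z1, z2) if u <= a and v <= b) / n
p1 = sum(1 for u in z1 if u <= a) / n
p2 = sum(1 for v in z2 if v <= b) / n

print(joint, p1 * p2)  # agree up to Monte Carlo error
```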
• Sep 19th 2009, 02:06 AM
Ruby
Quote:

Originally Posted by Laurent
No, you can't just mimic the proof.

In fact, there is a very general (and intuitive) result that answers both the initial question and yours. I only give two examples of statements; the generalization is straightforward:

if $\displaystyle X_1,X_2,X_3,X_4$ are independent random variables, and $\displaystyle f_1:\mathbb{R}^2\to\mathbb{R}$, $\displaystyle f_2:\mathbb{R}^2\to\mathbb{R}$ are (measurable) functions, then $\displaystyle f_1(X_1,X_2)$ and $\displaystyle f_2(X_3,X_4)$ are independent.

[...snip]

This brings us back to Moo's proof, I think. In the generalized version I should have that:

Two r.v.s $\displaystyle X_1$ and $\displaystyle X_2$ (but this should also be true for n r.v.s, pairwise independent) are independent if and only if, for any pair of measurable functions $\displaystyle f_1$ and $\displaystyle f_2$,

$\displaystyle E[f_1(X_1)f_2(X_2)] = E[f_1(X_1)]E[f_2(X_2)]$

provided that the expectations exist.

A special case is of course when two r.v.s $\displaystyle X$ and $\displaystyle Y$ are independent and you take $\displaystyle f_1(x) = x$ and $\displaystyle f_2(y) = y+1$, so that $\displaystyle f_1(X)=X$ and $\displaystyle f_2(Y)=Y+1$, which is noname's question...

Am I right?
• Sep 19th 2009, 04:57 AM
Laurent
Quote:

Originally Posted by Ruby
This brings us back to Moo's proof, I think. In the generalized version I should have that:

Two r.v.s $\displaystyle X_1$ and $\displaystyle X_2$ (but this should also be true for n r.v.s, pairwise independent) are independent if and only if, for any pair of measurable functions $\displaystyle f_1$ and $\displaystyle f_2$,

$\displaystyle E[f_1(X_1)f_2(X_2)] = E[f_1(X_1)]E[f_2(X_2)]$

provided that the expectations exist.

The problem with Moo's and Captain Black's proofs is that they seemed to assume that the random variables had probability density functions; that's why I posted another elementary and general proof.

The property you're quoting is a (quick) consequence of the definition of independence (the definition is the special case $\displaystyle f_i=1_{A_i}$, the indicator function of a set $\displaystyle A_i$); therefore it can be used to prove noname's question, but not yours. And it is not directly related to my previous post, which gave a way to get new independent r.v.s from independent r.v.s (by applying functions to disjoint groups of r.v.s).

If you would like to answer noname's question using the property you give: let $\displaystyle f_1,f_2$ be such that $\displaystyle E[f_1(X)]$ and $\displaystyle E[f_2(Y+1)]$ exist. Remark that $\displaystyle f_2(Y+1)=\widetilde{f_2}(Y)$ where $\displaystyle \widetilde{f_2}(y)=f_2(y+1)$. Therefore, since $\displaystyle X$ and $\displaystyle Y$ are independent, $\displaystyle E[f_1(X)f_2(Y+1)]=E[f_1(X)\widetilde{f_2}(Y)]$ $\displaystyle =E[f_1(X)]E[\widetilde{f_2}(Y)]=E[f_1(X)]E[f_2(Y+1)]$. This proves that $\displaystyle X$ and $\displaystyle Y+1$ are independent. Compare with my previous proof.
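The final computation can be illustrated numerically (an assumed setup for illustration: X, Y independent standard normals and arbitrary test functions f₁(x) = cos x, f₂(y) = e^(−|y|)): the expectation of the product should match the product of expectations.

```python
# Monte Carlo illustration (not a proof) of the expectation criterion:
# E[f1(X) f2(Y+1)] = E[f1(X)] E[f2(Y+1)] for X, Y ~ N(0,1) independent,
# with arbitrary bounded test functions f1(x) = cos(x), f2(y) = exp(-|y|).
import math, random

random.seed(3)
f1 = lambda x: math.cos(x)
f2 = lambda y: math.exp(-abs(y))

n = 300_000
lhs = rhs1 = rhs2 = 0.0
for _ in range(n):
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    lhs += f1(x) * f2(y + 1)
    rhs1 += f1(x)
    rhs2 += f2(y + 1)
lhs, rhs1, rhs2 = lhs / n, rhs1 / n, rhs2 / n

print(lhs, rhs1 * rhs2)  # agree up to Monte Carlo error
```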
• Sep 19th 2009, 06:07 AM
Ruby
Thanks for your time, Laurent; it's all becoming clearer to me now!
• Sep 28th 2009, 12:31 PM
noname
Today I took my statistics exam!