
Proof of independence of random variables

  1. #1
    Newbie noname's Avatar
    Joined
    Sep 2009
    Posts
    16

    Question Proof of independence of random variables

    Hi, I have a question.

    Two random variables X and Y are independent. How can I prove that X and Y+1 are independent?

    Thanks

  2. #2
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by noname View Post
    Hi, I have a question.

    Two random variables X and Y are independent. How can I prove that X and Y+1 are independent?

    Thanks
    You can either show that:

    f(x,y+1)=g(x)h(y+1)

    or that:

    f(y+1|x)=h(y+1)

    where f(x,y+1) is the joint pdf of X and Y+1, g(x) is the pdf of X, h(y+1) is the pdf of Y+1, and f(y+1|x) is the conditional density of Y+1 given X=x.
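
    For concreteness, here is a small worked instance of this factorization criterion (my own illustration, not part of the original reply): suppose X and Y are independent Exp(1) random variables, so g(x)=e^{-x} for x>0 and the pdf of Y+1 is h(t)=e^{-(t-1)} for t>1. Changing variables t=y+1 in the joint pdf e^{-x}e^{-y} of (X,Y) gives the joint pdf of (X,Y+1) as

    e^{-x}e^{-(t-1)} \qquad (x>0,\ t>1),

    which is exactly g(x)h(t), so the criterion is satisfied and X and Y+1 are independent.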

    CB

  3. #3
    Newbie noname's Avatar
    Joined
    Sep 2009
    Posts
    16
    Hi, thank you for your answer.

    Is this correct?

    Last edited by noname; September 18th 2009 at 06:58 AM. Reason: Latex image link updated

  4. #4
    A Cute Angle Moo's Avatar
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6
    Hello,

    Hmmm actually, you have to prove that f(x,y+1), the joint pdf of X and Y+1, can be written in such a form. In what you did, you assumed it was true.

    -----
    Let f be the joint pdf of X and Y. Since X and Y are independent, f(x,y)=f_1(x)f_2(y), where f_1 is the pdf of X and f_2 is the pdf of Y.
    For any measurable function h, we have:

    \begin{aligned}\mathbb{E}(h(X,Y+1))&=\int_{\mathbb{R}^2} h(x,y+1)f(x,y) ~dxdy \quad (*) \\
    &=\int_{\mathbb{R}^2} h(x,y)f(x,y-1) ~dxdy \\
    &=\int_{\mathbb{R}^2} h(x,y)f_1(x)f_2(y-1) ~dxdy
    \end{aligned}

    So f_1(x)f_2(y-1) is the joint pdf of (X,Y+1). Since it factors as the pdf of X times the pdf of Y+1 (namely y\mapsto f_2(y-1)), this proves that X and Y+1 are independent.

    If you can't see why (*) holds, consider that you're computing \mathbb{E}((h\circ p)(X,Y)), where p(a,b)=(a,b+1), and use the general fact:
    \mathbb{E}(h(X))=\int_{\mathbb{R}} h(x)f(x) ~dx whenever f is the pdf of X (applied here to the random vector (X,Y) and its joint pdf f).
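
    A side note (mine, not Moo's): the second equality in the display above is nothing more than the substitution u=y+1 in the y-integral,

    \int_{\mathbb{R}^2} h(x,y+1)f(x,y) ~dxdy=\int_{\mathbb{R}^2} h(x,u)f(x,u-1) ~dxdu,

    after which the dummy variable u is simply renamed y.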


    And as usual, if there's any mistake, tell me... I'm so self-confident you know...

  5. #5
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by noname View Post
    Hi, I have a question.

    Two random variables X and Y are independent. How can I prove that X and Y+1 are independent?

    Thanks
    You could use the very definition of independence.

    If X and Y are discrete r.v.s: for all x,y, we have P(X=x,Y+1=y)=P(X=x,Y=y-1)=P(X=x)P(Y=y-1)=P(X=x)P(Y+1=y) (the comma means "and"; I used the independence of X and Y in the second equality). This proves that X and Y+1 are independent.

    General case: for all measurable subsets A,B, we have P(X\in A,Y+1\in B)=P(X\in A,Y\in B-1)=P(X\in A)P(Y\in B-1)=P(X\in A)P(Y+1\in B) (the comma means "and"; B-1=\{y-1 \mid y\in B\}; I used the independence of X and Y in the second equality). This proves that X and Y+1 are independent.
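
    A tiny concrete instance of the discrete case (my own illustration, not part of the original reply): let X and Y be independent Bernoulli(1/2) variables. Then, for example,

    P(X=1,Y+1=2)=P(X=1,Y=1)=P(X=1)P(Y=1)=\tfrac{1}{4}=P(X=1)P(Y+1=2),

    and the same computation works for every pair of values, so X and Y+1 are independent.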

  6. #6
    Newbie
    Joined
    Sep 2009
    Posts
    3
    Quote Originally Posted by Laurent View Post
    You could use the very definition of independence.

    [...snip]

    General case: for all measurable subsets A,B, we have P(X\in A,Y+1\in B)=P(X\in A,Y\in B-1)=P(X\in A)P(Y\in B-1)=P(X\in A)P(Y+1\in B) (the comma means "and"; B-1=\{y-1 \mid y\in B\}; I used the independence of X and Y in the second equality). This proves that X and Y+1 are independent.
    Hi,

    Can I use the same approach to prove the following?

    X_1, X_2, Y_1, Y_2 are independent random variables, with expectations and variances E(X_i)=E(X), E(Y_j)=E(Y), var(X_i)=var(X), var(Y_j)=var(Y), where i=1,2 and j=1,2.

    Is Z_1=X_1Y_1 independent of Z_2=X_2Y_2?

    Thanks!

  7. #7
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by Ruby View Post
    Can I use the same approach to prove the following?

    X_1, X_2, Y_1, Y_2 are independent random variables, with expectations and variances E(X_i)=E(X), E(Y_j)=E(Y), var(X_i)=var(X), var(Y_j)=var(Y), where i=1,2 and j=1,2.

    Is Z_1=X_1Y_1 independent of Z_2=X_2Y_2?
    No, you can't just mimic the proof.

    In fact, there is a very general (and intuitive) result that answers both the initial question and yours. I will only give two example statements; the generalization is straightforward:

    if X_1,X_2,X_3,X_4 are independent random variables, and f_1:\mathbb{R}^2\to\mathbb{R}, f_2:\mathbb{R}^2\to\mathbb{R} are (measurable) functions, then f_1(X_1,X_2) and f_2(X_3,X_4) are independent.

    if X_1,X_2,X_3,X_4 are independent random variables, and f_1:\mathbb{R}^3\to\mathbb{R}, f_2:\mathbb{R}\to\mathbb{R} are (measurable) functions, then f_1(X_1,X_2,X_3) and f_2(X_4) are independent.

    In the generalized version, there can be any number of independent random variables, and the functions may depend on groups of variables of any size, provided the groups are disjoint.
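
    Applied to the question above (my reading of how the first statement is meant to be used; this is not spelled out in the original reply): take the four independent variables to be X_1, Y_1, X_2, Y_2 and f_1(a,b)=f_2(a,b)=ab. Then

    Z_1=f_1(X_1,Y_1)=X_1Y_1 \qquad\text{and}\qquad Z_2=f_2(X_2,Y_2)=X_2Y_2

    are independent, which answers the question affirmatively.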

  8. #8
    Newbie
    Joined
    Sep 2009
    Posts
    3
    Quote Originally Posted by Laurent View Post
    No, you can't just mimic the proof.

    In fact, there is a very general (and intuitive) result that answers both the initial question and yours. I will only give two example statements; the generalization is straightforward:

    if X_1,X_2,X_3,X_4 are independent random variables, and f_1:\mathbb{R}^2\to\mathbb{R}, f_2:\mathbb{R}^2\to\mathbb{R} are (measurable) functions, then f_1(X_1,X_2) and f_2(X_3,X_4) are independent.

    [...snip]
    Thank you for your answer, Laurent!

    This brings us back to Moo's proof, I think; in the generalized version I should have the following:

    Two r.v.s X_1 and X_2 (but this should also be true for n pairwise independent r.v.s) are independent if and only if, for every pair of functions f_1 and f_2,

    E[f_1(X_1)f_2(X_2)] = E[f_1(X_1)]E[f_2(X_2)]

    provided that the expectations exist.

    A special case is of course when two r.v.s X and Y are independent and you want to check the independence of f_1(X) = X and f_2(Y) = Y+1, which is noname's question...

    Am I right?

  9. #9
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by Ruby View Post
    This brings us back to Moo's proof, I think; in the generalized version I should have the following:

    Two r.v.s X_1 and X_2 (but this should also be true for n pairwise independent r.v.s) are independent if and only if, for every pair of functions f_1 and f_2,

    E[f_1(X_1)f_2(X_2)] = E[f_1(X_1)]E[f_2(X_2)]

    provided that the expectations exist.

    The problem with Moo's and Captain Black's proofs is that they seemed to assume that the random variables had probability density functions; that's why I posted another elementary and general proof.

    The property you're quoting is a (quick) consequence of the definition of independence (the definition corresponds to taking f_i=1_{A_i}, the indicator function of a set A_i); therefore it can be used to settle noname's question, but not yours. And it is not directly related to my previous post, which gave a way to obtain new independent r.v.s from independent r.v.s (by applying functions to disjoint groups of r.v.s).

    If you would like to answer noname's question using the property you quoted: let f_1,f_2 be such that E[f_1(X)] and E[f_2(Y+1)] exist. Note that f_2(Y+1)=\widetilde{f_2}(Y), where \widetilde{f_2}(y)=f_2(y+1). Therefore, since X and Y are independent, E[f_1(X)f_2(Y+1)]=E[f_1(X)\widetilde{f_2}(Y)]=E[f_1(X)]E[\widetilde{f_2}(Y)]=E[f_1(X)]E[f_2(Y+1)]. This proves that X and Y+1 are independent. Compare with my previous proof.
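
    For anyone who wants a quick empirical sanity check of the statement being proved (my own addition, not part of the thread), here is a small Python sketch that estimates P(X\in A, Y+1\in B) and P(X\in A)P(Y+1\in B) by simulation; the particular distributions and test sets A and B are arbitrary choices:

    # Monte Carlo check that P(X in A, Y+1 in B) is approximately
    # P(X in A) * P(Y+1 in B) when X and Y are simulated independently.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    x = rng.normal(size=n)         # X ~ N(0,1), an arbitrary choice
    y = rng.exponential(size=n)    # Y ~ Exp(1), generated independently of X

    in_A = x > 0.5                 # event {X in A},   with A = (0.5, infinity)
    in_B = (y + 1) < 2.0           # event {Y+1 in B}, with B = (-infinity, 2)

    p_joint = np.mean(in_A & in_B)
    p_product = np.mean(in_A) * np.mean(in_B)
    print(p_joint, p_product)      # the two estimates agree up to Monte Carlo error

    Up to simulation noise the two printed numbers coincide, in line with the proofs above.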

  10. #10
    Newbie
    Joined
    Sep 2009
    Posts
    3
    Thanks for your time, Laurent; it's all becoming clearer to me now!

  11. #11
    Newbie noname's Avatar
    Joined
    Sep 2009
    Posts
    16
    Today I took my statistics exam!
    Thank you for your help...
    Only one exercise gave me trouble, and it was about the independence of random variables. But it wasn't like the one above.

    It was:

    X and Y are random variables; check whether the random variables X+Y and 1 are independent.

    The professor told us that we should have used an appropriate example to check the independence of the variables.

    Does anyone know a way to solve this exercise?

    Thanks
