
[SOLVED] convergence in probability

  1. #1
    Junior Member
    Joined
    Feb 2010
    Posts
    43

    [SOLVED] convergence in probability

    Hi,

    I've been banging my head against the wall for too many hours and would love a hint... I'm looking to show that

    \frac{1}{n}\sum_{i=1}^n (x_i - \overline{x}_n)(y_i - \overline{y}_n) \rightarrow {\rm Cov}(X,Y)

    (in probability), where the (X_i, Y_i) are iid random vectors with finite fourth-order moments.

    I'm trying to use the definition of convergence in probability and get

    P\left[\left|\frac{1}{n}\sum_{i=1}^n (x_i - \overline{x}_n)(y_i - \overline{y}_n) - {\rm Cov}(X,Y)\right| > \epsilon\right] \leq E\left[\left(\frac{1}{n}\sum_{i=1}^n (x_i - \overline{x}_n)(y_i - \overline{y}_n) - {\rm Cov}(X,Y)\right)^2\right] / \epsilon^2

    I'd appreciate a hint. I am not getting the {\rm Cov}(X,Y) part, I think, and I don't think it should be that hard!?

    Thank you!
    Last edited by Statistik; February 20th 2010 at 02:24 AM.

  2. #2
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by Statistik View Post
    Hi,

    I've been banging my head against the wall for too many hours and would love a hint... I'm looking to show that

    \frac{1}{n}\sum_{i=1}^n (x_i - \overline{x}_n)(y_i - \overline{y}_n) \rightarrow {\rm Cov}(X,Y)

    (in probability), where the (X_i, Y_i) are iid random vectors with finite fourth-order moments.

    I'm trying to use the definition of convergence in probability and get

    P\left[\left|\frac{1}{n}\sum_{i=1}^n (x_i - \overline{x}_n)(y_i - \overline{y}_n) - {\rm Cov}(X,Y)\right| > \epsilon\right] \leq E\left[\left(\frac{1}{n}\sum_{i=1}^n (x_i - \overline{x}_n)(y_i - \overline{y}_n) - {\rm Cov}(X,Y)\right)^2\right] / \epsilon^2

    I'd appreciate a hint. I am not getting the {\rm Cov}(X,Y) part, I think, and I don't think it should be that hard!?

    Thank you!
    The finite fourth moment indeed suggests applying Chebyshev's inequality, expanding the square, and praying that you don't lose a few terms during the computation.
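
    To see where the fourth moment enters in that approach (written here for the simplified estimator with known means, to keep it short): Chebyshev's inequality needs the variance of the average, and

    \mathrm{Var}\left(\frac{1}{n}\sum_{i=1}^n (X_i-E[X])(Y_i-E[Y])\right)=\frac{1}{n}\,\mathrm{Var}\big((X-E[X])(Y-E[Y])\big),

    which is finite precisely when E\left[(X-E[X])^2(Y-E[Y])^2\right]<\infty, a fourth-order moment condition.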

    Actually, you don't really need that fourth moment; second moment suffices. Together with weak law of large numbers (wLLN).

    Remember {\rm Cov}(X,Y)=E[XY]-E[X]E[Y]. Likewise, (x_i-\overline{x}_n)(y_i-\overline{y}_n)=x_iy_i-x_i\overline{y}_n-y_i\overline{x}_n+\overline{x}_n\overline{y}_n and, after summation (split the sum into four sums), the terms with "bars" factor out and each of the last three sums equals n\,\overline{x}_n\overline{y}_n (with signs -, -, + from the expansion), hence \frac{1}{n}\sum_{i=1}^n (x_i-\overline{x}_n)(y_i-\overline{y}_n) =\frac{1}{n}\sum_{i=1}^n x_iy_i-\left(\frac{1}{n}\sum_{i=1}^n x_i\right)\left(\frac{1}{n}\sum_{i=1}^n y_i\right).
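
    Spelled out, using \sum_{i=1}^n x_i = n\overline{x}_n and \sum_{i=1}^n y_i = n\overline{y}_n:

    \sum_{i=1}^n (x_i-\overline{x}_n)(y_i-\overline{y}_n) = \sum_{i=1}^n x_iy_i - \overline{y}_n\sum_{i=1}^n x_i - \overline{x}_n\sum_{i=1}^n y_i + n\,\overline{x}_n\overline{y}_n = \sum_{i=1}^n x_iy_i - n\,\overline{x}_n\overline{y}_n,

    and dividing by n gives the identity above.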

    This way, we may split \frac{1}{n}\sum_{i=1}^n (x_i - \overline{x}_n)(y_i - \overline{y}_n) - {\rm Cov}(X,Y) into two simple terms. If both of them converge to 0 in probability, then so does their sum (you may need to prove that).

    The first term is \frac{1}{n}\sum_{i=1}^n x_iy_i - E[XY]. Just apply the wLLN.

    The second term is \left(\frac{1}{n}\sum_{i=1}^n x_i\right)\left(\frac{1}{n}\sum_{i=1}^n y_i\right)-E[X]E[Y]. Apply the wLLN and the fact that the product of two sequences converging in probability converges, in probability, to the product of the limits (you may need to prove this; a sketch follows below).
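
    One possible sketch of that product fact: write \overline{x}_n = E[X]+A_n and \overline{y}_n = E[Y]+B_n, where A_n, B_n \to 0 in probability by the wLLN. Then

    \overline{x}_n\overline{y}_n - E[X]E[Y] = E[Y]\,A_n + E[X]\,B_n + A_nB_n,

    and the only non-obvious piece is A_nB_n, which is handled by

    P(|A_nB_n|>\epsilon) \leq P(|A_n|>\sqrt{\epsilon}) + P(|B_n|>\sqrt{\epsilon}) \to 0,

    since |A_nB_n|>\epsilon forces |A_n|>\sqrt{\epsilon} or |B_n|>\sqrt{\epsilon}.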

    In fact, applying the strong law of large numbers, you even get almost-sure convergence, and this eliminates all the problems of justification (it is obvious (do you understand why?) that the sum or the product of two sequences converging a.s. to 0 converges a.s. to 0 as well).
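
    If you want to see the convergence numerically, here is a small Python sketch (the bivariate normal setup is just an illustrative choice, so that {\rm Cov}(X,Y) is known exactly):

    import numpy as np

    rng = np.random.default_rng(0)
    rho = 0.6  # true Cov(X,Y) for standard bivariate normal coordinates
    for n in (100, 10_000, 1_000_000):
        # draw n iid vectors (X_i, Y_i)
        xy = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
        x, y = xy[:, 0], xy[:, 1]
        # the estimator from the thread: (1/n) * sum of (x_i - xbar)(y_i - ybar)
        print(n, np.mean((x - x.mean()) * (y - y.mean())))

    The printed values should settle near 0.6 as n grows.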

  3. #3
    Moo
    A Cute Angle
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6
    Hello,

    The fourth-order moment and the convergence in probability should remind you of the weak law of large numbers.

    But we have to get a sum of iid random variables.

    Doing a little algebra (the same way one proves that {\rm cov}(X,Y)=E[(X-\mu_X)(Y-\mu_Y)]=E[XY]-\mu_X\mu_Y), we get that \sum_{i=1}^n (X_i-\bar X_n)(Y_i-\bar Y_n)=\sum_{i=1}^n X_iY_i-n\bar X_n\bar Y_n

    So \frac 1n\sum_{i=1}^n (X_i-\bar X_n)(Y_i-\bar Y_n)=\frac 1n\sum_{i=1}^n X_iY_i-\bar X_n\bar Y_n

    And hence \frac 1n\sum (X_i-\bar X_n)(Y_i-\bar Y_n)-cov(X,Y)=\left[\frac 1n\sum X_iY_i-E[XY]\right]-\left[\bar X_n\bar Y_n-E[X]E[Y]\right]

    By the weak law of large numbers, the first bracket converges in probability to 0, because the vectors (X_i,Y_i) are iid, and hence so are the products X_iY_i.
    And for the second bracket, we know that \bar X_n\to E[X] almost surely, and \bar Y_n\to E[Y] almost surely.
    So their product converges almost surely to E[X]E[Y] (because multiplication is continuous), and hence in probability.
    So the second bracket converges in probability to 0.

    And so the sum converges in probability to 0, which ends the proof (you just have to use the triangle inequality, made explicit below).
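
    Explicitly: with A_n and B_n denoting the two brackets above, |A_n - B_n| \leq |A_n| + |B_n|, so for every \epsilon>0,

    P(|A_n - B_n| > \epsilon) \leq P(|A_n| > \epsilon/2) + P(|B_n| > \epsilon/2) \to 0.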

    I hope I didn't write too many false things



    Edit: whoaaaa, I'm waaaay too late, and maybe I'm waaaay too wrong with the justifications, since it seems more complicated than I first thought...
    Well, I spent time thinking and typing it, so I won't delete it.

  4. #4
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by Moo View Post
    I hope I didn't write too many false things

    Edit: whoaaaa, I'm waaaay too late, and maybe I'm waaaay too wrong with the justifications, since it seems more complicated than I first thought...
    Well, I spent time thinking and typing it, so I won't delete it.
    Sorry about that! By the way, your justifications are fully correct!

  5. #5
    Junior Member
    Joined
    Feb 2010
    Posts
    43
    Thank you, Laurent and Moo, for your awesome responses! Much clearer now - I wasn't able to see how to group the convergent terms.

    I am stuck on one part, and it may be more of a semantic question:
    For:

    \sum_{i=1}^n x_iy_i = n * x_iy_i

    I keep getting stuck because it seems to me that I cannot separate x_i * y_i, since X and Y are not independent of each other (just the vectors (X_i,Y_i) are). How would I properly write the sum?

    \frac{1}{n}\sum_{i=1}^n x_iy_i = \frac{1}{n} x_n y_n = \overline{x_ny_n}

    which then approaches E(XY) in probability?

    Thanks!

    PS: You asked if I understood why it is obvious that the sum of two sequences converges to 0 if each of them converges to 0, when working with the sLLN vs. the wLLN. If I understand correctly, with the sLLN I am looking at the limits of the variables themselves, while with the wLLN I look at the probability of their being close to something. Then, when each sequence converges under the sLLN, I have a statement about the variables again and can just take them "at face value", while with the wLLN I may need to re-evaluate the probabilities jointly (though using Slutsky makes it easier, as at least one of the limits is a non-random number)?

  6. #6
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by Statistik View Post
    I am stuck on one part, and it may be more of a semantic question:
    For:

    \sum_{i=1}^n x_iy_i = n * x_iy_i

    I keep getting stuck because it seems to me that I cannot separate x_i * y_i, since X and Y are not independent of each other (just the vectors (X_i,Y_i) are). How would I properly write the sum?

    \frac{1}{n}\sum_{i=1}^n x_iy_i = \frac{1}{n} x_n y_n = \overline{x_ny_n}

    which then approaches E(XY) in probability?
    I don't get what you mean (and your formulas are weird, I guess there are typos...); as Moo said, the random variables z_i=x_iy_i, i\geq 1, are independent, and integrable by assumption, with expectation E[z_i]=E[x_iy_i]=E[XY]. Then we may just apply the law of large numbers to the sequence (z_i)_{i\geq 1}.
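
    A tiny numerical illustration of the substitution (the linear model below is hypothetical, chosen only so that E[XY] is easy to compute by hand):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    x = rng.normal(size=n)
    y = 2.0 * x + rng.normal(size=n)  # X and Y are dependent; E[XY] = 2*E[X^2] = 2
    z = x * y                         # the z_i = x_i*y_i are iid, so the LLN applies to them
    print(z.mean())                   # should be close to E[XY] = 2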


    PS: You asked if I understood why it is obvious that the sum of two sequences converges to 0 if each of them converges to 0, when working with the sLLN vs. the wLLN. If I understand correctly, with the sLLN I am looking at the limits of the variables themselves, while with the wLLN I look at the probability of their being close to something. Then, when each sequence converges under the sLLN, I have a statement about the variables again and can just take them "at face value", while with the wLLN I may need to re-evaluate the probabilities jointly (though using Slutsky makes it easier, as at least one of the limits is a non-random number)?
    This looks pretty much like it. Let me express it my way: if we have sequences (on the same probability space (\Omega,\mathcal{F},P)) such that X_n\to X a.s. and Y_n\to Y a.s., then this means that for almost all \omega\in\Omega, we have the limits X_n(\omega)\to X(\omega) and Y_n(\omega)\to Y(\omega); these are limits of sequences of real numbers, so by the usual elementary properties of limits, X_n(\omega)Y_n(\omega)\to_n X(\omega)Y(\omega). In other words, X_nY_n\to_n XY a.s. Thus we only need to apply results about real-valued sequences.

  7. #7
    Junior Member
    Joined
    Feb 2010
    Posts
    43
    Laurent,

    Thanks. No typos ;-(, still learning the ropes. (My 2nd probability/statistics class, trying to get the hang of the symbols.) The z = xy substitution makes sense and makes it "clean", too.

    And thanks for the clarification on a.s. vs. in-probability convergence - I really appreciate this type of info.

