Convergence in probability but not almost surely nor L^p

1. #1 jjacobs (Newbie)

    Convergence in probability but not almost surely nor L^p

    Hi,

    I'm trying to find a single example of a sequence of random variables X_n such that the sequence converges to a random variable X in probability, but not almost surely nor in L^p for any p. Does anyone know of any simple examples, and how to prove the above?


    Regards,

    John.

2. #2 emakarov (MHF Contributor)

    Re: Convergence in probability but not almost surely nor L^p

    Let the probability space be [0, 1] with the standard (Lebesgue) measure. The idea is to have X_n = n on an interval (or set) of measure 1/n and 0 everywhere else. Moreover, the support of X_n (i.e., \{\omega\in[0,1]: X_n(\omega)\ne0\}) should shift, visiting every point over and over again, so that every \omega is mapped to zero and to a non-zero value arbitrarily far out in the sequence. Then X_n converges to 0 in probability because the measure of the support tends to zero. On the other hand, X_n(\omega) does not converge for any \omega, so there is no almost sure convergence. Similarly, E(X_n) = n\cdot\frac{1}{n} = 1 for every n, so E|X_n|^p = n^{p-1} does not tend to 0 when p \ge 1, and there is no convergence to 0 in L^p.

    Can you give a precise definition of such X_n?
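
    For concreteness, here is one possible way to make this precise; it is only a sketch, and the particular choice of intervals is one option among many. Let s_1 = 0 and s_n = \sum_{k=1}^{n-1}\frac{1}{k} for n \ge 2, and let A_n \subseteq [0,1) be the set of measure \frac{1}{n} obtained by starting at s_n modulo 1 and wrapping around the endpoint if necessary. Define

    X_n(\omega) = n\cdot\mathbf{1}_{A_n}(\omega).

    Then P(X_n \ne 0) = \frac{1}{n} \to 0, so X_n \to 0 in probability, while E|X_n|^p = n^p\cdot\frac{1}{n} = n^{p-1}, which does not tend to 0 for any p \ge 1. Because \sum_n\frac{1}{n} diverges and A_{n+1} starts exactly where A_n ends, the intervals sweep across [0,1) infinitely many times, so every \omega belongs to A_n for infinitely many n and X_n(\omega) does not converge to 0 for any \omega.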

3. #3 jjacobs (Newbie)

    Re: Convergence in probability but not almost surely nor L^p

    Quote Originally Posted by emakarov
    Let the probability space be [0, 1] with the standard (Lebesgue) measure. The idea is to have X_n = n on an interval (or set) of measure 1/n and 0 everywhere else. Moreover, the support of X_n (i.e., \{\omega\in[0,1]: X_n(\omega)\ne0\}) should shift, visiting every point over and over again, so that every \omega is mapped to zero and to a non-zero value arbitrarily far out in the sequence. Then X_n converges to 0 in probability because the measure of the support tends to zero. On the other hand, X_n(\omega) does not converge for any \omega, so there is no almost sure convergence. Similarly, E(X_n) = n\cdot\frac{1}{n} = 1 for every n, so E|X_n|^p = n^{p-1} does not tend to 0 when p \ge 1, and there is no convergence to 0 in L^p.

    Can you give a precise definition of such X_n?
    I see. So if I were to define the sequence of random variables X_n by P(X_n=n) = \frac{1}{n} and P(X_n=0) = 1 - \frac{1}{n}, I can understand how such a sequence would converge to 0 in probability, but would not converge almost surely. But I'm just having a little trouble understanding why it does not converge in L^p. Is it due to the fact that E(X_n^r) = \frac{n^r}{n} \rightarrow \infty as n \rightarrow \infty?

4. #4 emakarov (MHF Contributor)

    Re: Convergence in probability but not almost surely nor L^p

    Quote Originally Posted by jjacobs
    I see. So if I were to define the sequence of random variables X_n by P(X_n=n) = \frac{1}{n} and P(X_n=0) = 1 - \frac{1}{n}, I can understand how such a sequence would converge to 0 in probability, but would not converge almost surely.
    Yes, but note that these conditions alone do not define X_n: for example, you can't take

    X_n(\omega)=\begin{cases}n&\omega\le1/n\\0&1/n<\omega\le1\end{cases}

    because this particular choice converges to 0 almost surely (for every \omega>0, X_n(\omega)=0 as soon as n>1/\omega).

    Quote Originally Posted by jjacobs
    But I'm just having a little trouble understanding why it does not converge in L^p. Is it due to the fact that E(X_n^r) = \frac{n^r}{n} \rightarrow \infty as n \rightarrow \infty?
    Yes, at least when r > 1.
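
    Spelling out the computation, assuming X_n takes only the values n (with probability \frac{1}{n}) and 0:

    E|X_n|^r = n^r\cdot\frac{1}{n} = n^{r-1},

    which tends to \infty for r > 1, is identically 1 for r = 1, and tends to 0 for r < 1. So X_n \to 0 in L^r exactly when r < 1; for r = 1 the norm stays at 1 rather than blowing up, but it still does not go to 0.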

5. #5 jjacobs (Newbie)

    Re: Convergence in probability but not almost surely nor L^p

    That's brilliant, I think I'm understanding it a bit better now. So, to ensure that the sequence does not converge in L^r for any r > 0 (thus including r = 1), am I right in saying I could define the sequence of random variables X_n by P(X_n=n) = \frac{1}{n^2} and P(X_n=0) = 1 - \frac{1}{n^2}, and the results that the sequence converges in probability but not almost surely would still hold, but the sequence would also not converge in L^r for any r > 0?

6. #6 emakarov (MHF Contributor)

    Re: Convergence in probability but not almost surely nor L^p

    I don't see how changing the support measure from 1/n to 1/n^2 helps ensure non-convergence in L^r for all r > 0. In fact, such a sequence would fail to converge in L^r only for r \ge 2.
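
    In symbols, assuming X_n takes the value n with probability \frac{1}{n^2} and 0 otherwise:

    E|X_n|^r = n^r\cdot\frac{1}{n^2} = n^{r-2},

    which tends to 0 for r < 2, equals 1 for r = 2, and tends to \infty for r > 2, so convergence to 0 in L^r now fails only for r \ge 2.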

    Quote Originally Posted by jjacobs
    I could define the sequence of random variables X_n by P(X_n=n) = \frac{1}{n^2} and P(X_n=0) = 1 - \frac{1}{n^2} and the results that the sequence converges in probability but not almost surely would still hold
    I want to stress again that the conditions P(X_n=n) = \frac{1}{n^2} and P(X_n=0) = 1 - \frac{1}{n^2} do not define X_n. There are many sequences that satisfy these conditions, and some of them do converge almost surely. Also, if the support measure of X_n is 1/n, some sequences converge almost surely. In fact, my idea for finding a sequence that does not converge almost surely relies on the fact that \sum_{n=1}^\infty\frac{1}{n} diverges, so I am not sure right away whether there are sequences with P(X_n=n) = \frac{1}{n^2} and P(X_n=0) = 1 - \frac{1}{n^2} that don't converge almost surely. For another idea, you may want to see Wikipedia's claim that convergence in probability does not imply almost sure convergence and its proof using the Borel–Cantelli lemma.

7. #7 jjacobs (Newbie)

    Re: Convergence in probability but not almost surely nor L^p

    Apologies, what I had intended to write was P(X_n=n^2) = \frac{1}{n} and P(X_n=0) = 1 - \frac{1}{n}, which I believe retains the property of not converging almost surely and also does not converge in L^1.

8. #8 emakarov (MHF Contributor)

    Re: Convergence in probability but not almost surely nor L^p

    Quote Originally Posted by jjacobs
    Apologies, what I had intended to write was P(X_n=n^2) = \frac{1}{n} and P(X_n=0) = 1 - \frac{1}{n}, which I believe retains the property of not converging almost surely and also does not converge in L^1.
    Yes: such an X_n converges to 0 in L^r only for r < 1/2, so in particular not in L^1. But your conditions do not imply that X_n does not converge almost surely.
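
    The computation behind the 1/2, assuming X_n takes the value n^2 with probability \frac{1}{n} and 0 otherwise:

    E|X_n|^r = n^{2r}\cdot\frac{1}{n} = n^{2r-1},

    which tends to 0 precisely when r < \frac{1}{2} and does not when r \ge \frac{1}{2}; in particular there is no convergence to 0 in L^1.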

9. #9 jjacobs (Newbie)

    Re: Convergence in probability but not almost surely nor L^p

    The example given on Wikipedia of a sequence where X_n assumes the value 1 with probability \frac{1}{n} and zero otherwise can be shown not to converge almost surely by the Borel–Cantelli lemmas. Can this proof not be adapted to the example above, where instead of the event that X_n = 1 infinitely often we consider the event that X_n \geq c infinitely often, for some constant c > 0? Would this not suffice to show the intended result?
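
    A sketch of how that adaptation could go, under the additional assumption that the X_n are independent (the second Borel–Cantelli lemma requires independence, which the distributional conditions alone do not provide): for the example with P(X_n = n^2) = \frac{1}{n} and any constant c > 0,

    \sum_{n=1}^{\infty} P(X_n \ge c) \ge \sum_{n \ge \sqrt{c}} \frac{1}{n} = \infty,

    so the second Borel–Cantelli lemma gives P(X_n \ge c \text{ infinitely often}) = 1, and hence X_n does not converge to 0 almost surely.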

