
Math Help - [SOLVED] Some problems (limsup, liminf, ...)

  1. #1
    Moo
    Moo is offline
    A Cute Angle Moo's Avatar
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6

    [SOLVED] Some problems (limsup, liminf, ...)

    Hi ^^

    Probability again !


    So now we're dealing with Borel-Cantelli's lemma, limsup and liminf, and their associates... And it's dispiriting to see how difficult the concepts are to grasp


    We'll go over these exercises next week and we don't have to solve them beforehand, but I'm not the patient kind. I tried to do them today, but nothing came out of these ones... Just getting hints would really help me get through... ^^


    By definition, \liminf_n A_n=\bigcup_k \bigcap_{n \ge k} A_n
    \limsup_n A_n=\bigcap_k \bigcup_{n \ge k} A_n


    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    1.
    (X_n) is a sequence of iid Bernoulli(p), p \in (0,1), random variables
    - show that there is almost everywhere an infinity of n such that X_n=1
    >>> So basically, I have to show that \mathbb{P}(\cap \cup A_n)=1, that is to say \mathbb{P}(\limsup_n A_n)=1, where A_n=\{X_n=1\}
    I thought I could use the inequality \mathbb{P}(\limsup A_n)\ge \limsup \mathbb{P}(A_n)
    But I don't see how to get the RHS to go to 1... (I told you I have problems with limsup )
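As a sanity check (a sketch, not a proof; the value of p and the sample size are my own illustrative choices), one can simulate the sequence and watch the count of 1's grow roughly like np, which is what "infinitely many n with X_n=1" looks like empirically:

```python
import random

# Sketch: for iid Bernoulli(p) with 0 < p < 1, ones keep occurring --
# the running count of indices n with X_n = 1 grows without bound.
random.seed(0)
p = 0.3
n = 100_000
xs = [1 if random.random() < p else 0 for _ in range(n)]
ones = sum(xs)
print(ones, ones / n)  # empirical frequency of {X_n = 1} is near p
```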

    - show that for any n, if we define B_n=\{X_n=X_{n+1}=\dots=X_{2n-1}=1\}, there is, almost everywhere, only a finite number of B_n that are realised.
    >>> huh ???



    2.
    Let (X_n) be a similar sequence as above.
    Let Y_n=X_nX_{n+1} and V_n=Y_1+\dots+Y_n

    Show that \frac{V_n}{n} converges in probability to p^2

    >>> so I have to show that \forall \epsilon>0,~ \mathbb{P}\left(\left|\tfrac{V_n}{n}-p^2 \right|> \epsilon\right) \longrightarrow 0
    and then... ? XD
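A Monte Carlo sketch of the claim (parameters are illustrative, not from the exercise): for fixed eps, the empirical probability that V_n/n is farther than eps from p^2 should be small once n is large.

```python
import random

# Monte Carlo sketch: V_n/n with Y_i = X_i X_{i+1} concentrates near p^2,
# so P(|V_n/n - p^2| > eps) becomes small as n grows.
random.seed(1)
p, n, trials, eps = 0.5, 2000, 200, 0.05
far = 0
for _ in range(trials):
    xs = [1 if random.random() < p else 0 for _ in range(n + 1)]
    v = sum(xs[i] * xs[i + 1] for i in range(n))
    if abs(v / n - p * p) > eps:
        far += 1
print(far / trials)  # empirical estimate of the deviation probability
```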



    thanks in advance ^^
    Last edited by Moo; March 10th 2009 at 12:01 PM. Reason: math tag missing

  2. #2
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Salut, Moo,

    I wish my students were as excited as you are!

    Quote Originally Posted by Moo View Post
    1.
    (X_n) is a sequence of iid Bernoulli(p), p \in (0,1), random variables
    - show that there is almost everywhere an infinity of n such that X_n=1
    Are you allowed to use Borel-Cantelli lemma? If so, look at what it says about the sequence of the (independent) events A_n=\{X_n=1\}.
    Otherwise, you can decompose the definition of the limsup: try to show that for every M\in\mathbb{N}, there is almost surely an index n\geq M such that X_n=1 (if not,...). Then find a way to deduce that almost surely this is true for all M (look at the complement). (NB: "almost surely" is like "almost everywhere", in the context of probability (both are correct of course))


    - show that for any n, if we define B_n=\{X_n=X_{n+1}=\dots=X_{2n-1}=1\}, there is, almost everywhere, only a finite number of B_n that are realised.
    >>> huh ???
    If you are allowed to use Borel-Cantelli, look at what it says for the sequence (B_n)_n. In other words, does the series \sum_n P(B_n) converge?
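For intuition: B_n asks for n independent variables (indices n through 2n-1) to equal 1, so P(B_n)=p^n and the series is geometric with ratio p<1. A quick numeric check, with an arbitrary illustrative p:

```python
# P(B_n) = p^n, so sum_n P(B_n) is a geometric series with ratio p < 1.
# (p = 0.5 is an arbitrary illustration.)
p = 0.5
partial = sum(p ** n for n in range(1, 200))
closed_form = p / (1 - p)  # closed form of sum_{n>=1} p^n
print(partial, closed_form)  # the partial sum is already essentially equal
```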

    2.
    Let (X_n) be a similar sequence as above.
    Let Y_n=X_nX_{n+1} and V_n=Y_1+\dots+Y_n

    Show that \frac{V_n}{n} converges in probability to p^2

    >>> so I have to show that \forall \epsilon>0,~ \mathbb{P}\left(\left|\tfrac{V_n}{n}-p^2 \right|> \epsilon\right) \longrightarrow 0
    and then... ? XD
    A key word would be "Tchebychev's inequality": compute first the expectation and variance (it needs some care) of V_n in order to apply Tchebychev's inequality to V_n (or \tfrac{V_n}{n}).
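To see Tchebychev's inequality at work on Z = V_n/n, here is a small empirical check (the parameters are illustrative, not from the thread; the bound uses the sample mean and variance, so it holds exactly for the empirical distribution):

```python
import random

# Empirical look at Chebyshev's inequality P(|Z - E[Z]| > eps) <= Var(Z)/eps^2
# applied to Z = V_n/n.
random.seed(4)
p, n, trials, eps = 0.5, 200, 5000, 0.05
vals = []
for _ in range(trials):
    xs = [1 if random.random() < p else 0 for _ in range(n + 1)]
    vals.append(sum(xs[i] * xs[i + 1] for i in range(n)) / n)
mean = sum(vals) / trials
var = sum((v - mean) ** 2 for v in vals) / trials
tail = sum(1 for v in vals if abs(v - mean) > eps) / trials
print(tail, var / eps ** 2)  # empirical tail vs the Chebyshev bound
```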

  3. #3
    Moo
    Quote Originally Posted by Laurent View Post
    Salut, Moo,
    Plop

    I wish my students were as excited as you are!
    haha ! i'm not that excited... it's just that i'm interested xD and my teacher is really nice
    what kind of students do you have ?

    Are you allowed to use Borel-Cantelli lemma? If so, look at what it says about the sequence of the (independent) events A_n=\{X_n=1\}.
    I guess I can use it... lol
    Now that I've looked through my notes again, I saw this version of Borel-Cantelli's lemma (for independent events) :
    \sum \mathbb{P}(A_n)=\infty \Rightarrow \mathbb{P}(\bigcap_{n>0} \bigcup_{k>n} A_k)=1
    I thought we necessarily had to have \sum \mathbb{P}(A_n)<\infty


    So is it correct to translate the "there is almost everywhere an infinity of n such that X_n=1" into \mathbb{P}(\cap \cup A_n)=1 ?
    In that case, the question is quite straightforward... ?

    Otherwise, you can decompose the definition of the limsup: try to show that for every M\in\mathbb{N}, there is almost surely an index n\geq M such that X_n=1 (if not,...). Then find a way to deduce that almost surely this is true for all M (look at the complement).
    that's the kind of thing I don't know how to do... showing that "for any n, there exists..." etc ><

    (NB: "almost surely" is like "almost everywhere", in the context of probability (both are correct of course))
    Yup, that's what was written in French, I guess it's a reflex from the measure theory course XD


    If you are allowed to use Borel-Cantelli, look at what it says for the sequence (B_n)_n. In other words, does the series \sum_n P(B_n) converge?
    geometric series, so it converges.
    hence from BC's lemma, we get \mathbb{P}(\cap \cup B_k)=0
    does this translate to "there is a finite number of B_n that are realised" ?
    I really have a problem understanding the translations of \cap and \cup... If you could clarify this, I would be grateful

    A key word would be "Tchebychev's inequality": compute first the expectation and variance (it needs some care) of V_n in order to apply Tchebychev's inequality to V_n (or \tfrac{V_n}{n}).
    I love calculating expectations and variances
    okay i'll work on it



    thank you, as always !

  4. #4
    Laurent
    Quote Originally Posted by Moo View Post
    what kind of students do you have ?
    This semester, I'm a TA (chargé de td) for a probability course in "L3 Maths". I did it last year already and the students found this course really hard; they're fine when it comes to computing integrals, but most found it challenging to translate their intuitions into formulas, you see what I mean...

    In that case, the question is quite straightforward... ?
    I wouldn't have dared to say it before you did, but yeah, it is...


    that's the kind of thing I don't know how to do... showing that "for any n, there exists..." etc ><
    Maybe you can look at the proof of BC's lemma with the previous examples in mind.

    "For any n" => intersection
    "there exits" => union
    and you can use the equality P(A\cap B)=P(A)P(B) for independent events, and the inequality P(A\cup B)\leq P(A)+P(B) in general.

    Just an example. Translating into symbols, I suggested first showing that, for any M\geq 1, P(\bigcup_{n\geq M}A_n)=1. In order to use independence, I must handle intersections, hence let's write P(\bigcup_{n\geq M}A_n)=1-P(\bigcap_{n\geq M}A_n^c). And we have P(\bigcap_{n\geq M}A_n^c)=\prod_{n\geq M} P(A_n^c)=\prod_{n\geq M}(1-p)=0 by independence.

    (If you aren't used to applying independence to infinitely many events at once, you can say P(\bigcap_{n\geq M}A_n^c)\leq P(\bigcap_{n=M}^{M+N-1} A_n^c)=(1-p)^N for all N, hence P(\bigcap_{n\geq M}A_n^c)=0.)

    Thus P(\bigcup_{n\geq M}A_n)=1. What about P(\bigcap_{M\geq 0}\bigcup_{n\geq M}A_n)? Switching to the complement: P(\bigcup_{M\geq 0}\left(\bigcup_{n\geq M}A_n\right)^c)\leq  \sum_{M\geq 0}P(\left(\bigcup_{n\geq M}A_n\right)^c)=\sum_{M\geq 0} 0=0 (using what we got before). In conclusion, P(\bigcap_{M\geq 0}\bigcup_{n\geq M}A_n)=1. This is what you need.
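The key vanishing product can be checked numerically: with q = P(A_n^c) = 1 - p < 1 (p chosen arbitrarily here), the truncated products q^N shrink to 0, which forces P(\bigcap_{n\geq M}A_n^c)=0.

```python
# Sketch of the step P(∩_{n=M}^{M+N-1} A_n^c) = q^N -> 0 as N grows.
p = 0.3
q = 1 - p  # P(A_n^c) for A_n = {X_n = 1}
vals = [q ** N for N in (1, 10, 100, 1000)]
print(vals)  # strictly decreasing toward 0
```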

    You can remember this (this was the last part): a countable intersection of almost sure events is almost sure.


    I really have a problem understanding the translations of \cap and \cup... If you could clarify this, I would be grateful
    To remember: \limsup_n A_n is the event where infinitely many events in the family (A_n)_n happen: for every n there is k>n such that A_k happens (in other words, there are arbitrarily large k such that A_k happens).

    In particular, the complement is "finitely many events in the family (A_n)_n happen". Or: there is n such that for all k\geq n, A_k^c happens. This is a liminf:

    \liminf_n A_n is much stronger than \limsup_n A_n: it is the event where there is n such that A_k happens for every k\geq n.

    Hence, we have seen (\limsup_n A_n)^c=\liminf_n A_n^c. Take time to get convinced of this.

  5. #5
    Moo
    Quote Originally Posted by Laurent View Post
    This semester, I'm a TA (chargé de td) for a probability course in "L3 Maths". I did it last year already and the students found this course really hard; they're fine when it comes to computing integrals, but most found it challenging to translate their intuitions into formulas, you see what I mean...
    Yes there is no problem with computations, but theory is not easy to grasp (as you can see... :P)
    Our course is a tough one too... A friend told me that if one had 60/100 for this course, it was a very good mark

    I wouldn't have dared to say it before you did, but yeah, it is...
    okay ^^


    Maybe you can look at the proof of BC's lemma with the previous examples in mind.

    "For any n" => intersection
    "there exits" => union
    and you can use the equality P(A\cap B)=P(A)P(B) for independent events, and the inequality P(A\cup B)\leq P(A)+P(B) in general.

    Just an example. Translating into symbols, I suggested first showing that, for any M\geq 1, P(\bigcup_{n\geq M}A_n)=1. In order to use independence, I must handle intersections, hence let's write P(\bigcup_{n\geq M}A_n)=1-P(\bigcap_{n\geq M}A_n^c). And we have P(\bigcap_{n\geq M}A_n^c)=\prod_{n\geq M} P(A_n^c)=\prod_{n\geq M}(1-p)=0 by independence.

    (If you aren't used to applying independence to infinitely many events at once, you can say P(\bigcap_{n\geq M}A_n^c)\leq P(\bigcap_{n=M}^{M+N-1} A_n^c)=(1-p)^N for all N, hence P(\bigcap_{n\geq M}A_n^c)=0.)

    Thus P(\bigcup_{n\geq M}A_n)=1. What about P(\bigcap_{M\geq 0}\bigcup_{n\geq M}A_n)? Switching to the complement: P(\bigcup_{M\geq 0}\left(\bigcup_{n\geq M}A_n\right)^c)\leq  \sum_{M\geq 0}P(\left(\bigcup_{n\geq M}A_n\right)^c)=\sum_{M\geq 0} 0=0 (using what we got before). In conclusion, P(\bigcap_{M\geq 0}\bigcup_{n\geq M}A_n)=1. This is what you need.

    You can remember this (this was the last part): a countable intersection of almost sure events is almost sure.
    Hey... I think I've understood what you said !


    Though there's a little question... :
    P(\bigcap_{n\geq M}A_n^c)=\prod_{n\geq M} P(A_n^c)=\prod_{n\geq M}\frac{1}{2}=0
    this would hold for any independent events A_n (with the condition given below) then ??
    Since P(A_n^c)<1 (condition : let's assume there are only finitely many that are equal to 1), \prod_{n \ge M} P(A_n^c)=0 (we can find \alpha<1 such that P(A_n^c)<\alpha)


    oh... would this be, by any chance, equivalent to \sum P(A_n)=\infty ? lol
    if so, how ? o.O


    To remember: \limsup_n A_n is the event where infinitely many events in the family (A_n)_n happen: for every n there is k>n such that A_k happens (in other words, there are arbitrarily large k such that A_k happens).

    In particular, the complement is "finitely many events in the family (A_n)_n happen". Or: there is n such that for all k\geq n, A_k^c happens. This is a liminf:

    \liminf_n A_n is much stronger than \limsup_n A_n: it is the event where there is n such that A_k happens for every k\geq n.

    Hence, we have seen (\limsup_n A_n)^c=\liminf_n A_n^c. Take time to get convinced of this.
    Yes, but I'd rather use this time for understanding this ^^




    Sorry for all the questions... there are always questions following the previous ones :P
    and thank you very much for taking the time to answer them !

  6. #6
    Laurent
    Quote Originally Posted by Moo View Post
    Though there's a little question... :

    this would hold for any independent events A_n (with the condition given below) then ??
    Since P(A_n^c)<1 (condition : let's assume there are only finitely many that are equal to 1), \prod_{n \ge M} P(A_n^c)=0 (we can find \alpha<1 such that P(A_n^c)<\alpha)


    oh... would this be, by any chance, equivalent to \sum P(A_n)\textcolor{red}{=}\infty ? lol
    if so, how ? o.O
    Good point. If \sum_n P(A_n)=\infty, then we have indeed:

    \prod_{n\ge M} P(A_n^c)=\prod_{n\geq M}(1-P(A_n))\leq  \prod_{n\ge M}e^{-P(A_n)}=e^{-\sum_{n\ge M} P(A_n)}=0

    (using 1-x\leq e^{-x} for every x). This gives a proof for one part of BC's lemma.
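A numeric check of the inequality 1-x \leq e^{-x} and of the product bound it gives. For the divergent series I use P(A_n)=1/n (the harmonic series, an arbitrary illustration): the partial products of (1-P(A_n)) then tend to 0.

```python
import math

# Check 1 - x <= exp(-x) on a few points, then watch the partial products
# prod (1 - 1/n) stay below exp(-sum 1/n) and shrink toward 0.
for x in (-1.0, -0.1, 0.0, 0.1, 0.5, 1.0, 2.0):
    assert 1 - x <= math.exp(-x) + 1e-12

prod, s = 1.0, 0.0
for n in range(2, 10_000):  # start at 2 so that 1 - 1/n > 0
    prod *= 1 - 1 / n
    s += 1 / n
    assert prod <= math.exp(-s) + 1e-12
print(prod)  # the product telescopes to 1/9999
```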

    (As for the other part of BC's lemma, it is called the "easy part". Here's why: we have \sum_n P(A_n)=\sum_n E[{\bf 1}_{A_n}]=E[\sum_n {\bf 1}_{A_n}] using monotone convergence or Fubini. If \sum_n P(A_n)<\infty, then E[\sum_n {\bf 1}_{A_n}]<\infty, so that almost surely \sum_n {\bf 1}_{A_n}<\infty. Since this r.v. counts how many events happen, we have the result.)
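A Monte Carlo sketch of this "easy part" (my own summable choice of probabilities): take independent A_n with P(A_n)=2^{-n}, so \sum_n P(A_n)=1; the count \sum_n {\bf 1}_{A_n} is then finite a.s., and its average over many trials should sit near its expectation 1.

```python
import random

# Average of the count sum_n 1_{A_n} over many trials, with P(A_n) = 2^{-n};
# the expectation of the count is sum_n 2^{-n}, essentially 1.
random.seed(2)
trials, N = 10_000, 50
total = 0
for _ in range(trials):
    total += sum(1 for n in range(1, N + 1) if random.random() < 2.0 ** -n)
avg = total / trials
print(avg)
```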


    This is one of my favorite topics, so I'm truly glad to help!

  7. #7
    MHF Contributor matheagle's Avatar
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
    Number 2 is just the weak law of large numbers or what's called the degenerate convergence theorem.

  8. #8
    Laurent
    Quote Originally Posted by matheagle View Post
    Number 2 is just the weak law of large numbers or what's called the degenerate convergence theorem.
    It is a weak law of large numbers, but I wouldn't say "just the weak LLN", because there's a subtlety in that the random variables Y_n\ (n\geq 1) aren't independent. However, I've never heard of the degenerate convergence theorem, so I can't comment on that part.

  9. #9
    matheagle
    I knew that these rvs were dependent.
    The nice thing about the Weak Law, page 356 of Chow and Teicher, is that it gives necessary and sufficient conditions, just like the Central Limit Theorem.
    But that Weak Law does assume independence, so I was wrong.
    Interestingly there isn't a necessary and sufficient Strong Law.
    Last edited by matheagle; March 13th 2009 at 12:23 AM.

  10. #10
    Moo
    Okay, so for 2., I found the expectation of \frac{V_n}{n} to be p^2 (woaaaah xD)

    used above : \mathbb{E}(X_iX_j)=p^2
    useful for below : \mathbb{E}(X_i^2)=p

    and the variance is messy... so I don't know if it is correct.
    Here is the working :

    \begin{aligned}
    n^2 \text{Var}\left(\tfrac{V_n}{n}\right) &= \text{Var}(V_n) \\
    &= \text{Var}(Y_1+\dots+Y_n) \\
    &= \sum_{i=1}^n \text{Var}(Y_i)+2 \sum_{1\leq i<j\leq n} \text{Cov}(Y_i,Y_j) \quad (*)
    \end{aligned}

    \text{Var}(Y_i)=\mathbb{E}(X_i^2X_{i+1}^2)-[\mathbb{E}(X_iX_{i+1})]^2=p^2-p^4
    and
    Since Y_i \in \sigma(X_i,X_{i+1}) and Y_j \in \sigma(X_j,X_{j+1}), Y_i and Y_j are independent iff j-i>1 (and thus their covariance is 0)

    (*)=n(p^2-p^4)+2 \sum_{i=1}^{n-1} \text{Cov}(Y_i,Y_{i+1})

    Now calculating \text{Cov}(Y_i,Y_{i+1}) :
    \text{Cov}(Y_i,Y_{i+1})=\mathbb{E}(Y_iY_{i+1})-\mathbb{E}(Y_i)\mathbb{E}(Y_{i+1})=\mathbb{E}(X_i)\mathbb{E}(X_{i+1}^2)\mathbb{E}(X_{i+2})-p^4=p^3-p^4


    (*)=n(p^2-p^4)+2(n-1)(p^3-p^4)

    Hence \text{Var}\left(\tfrac{V_n}{n}\right)=\frac{p^2-p^4}{n}+\frac{2(n-1)(p^3-p^4)}{n^2}
    which indeed goes to 0, but is there a mistake somewhere ? I don't like this expression
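One way to reassure yourself about the formula: compare it against a simulation (a sketch; the values of p and n are arbitrary choices of mine).

```python
import random

# Simulation check of Var(V_n/n) = (p^2 - p^4)/n + 2(n-1)(p^3 - p^4)/n^2.
random.seed(3)
p, n, trials = 0.4, 50, 20_000
vals = []
for _ in range(trials):
    xs = [1 if random.random() < p else 0 for _ in range(n + 1)]
    vals.append(sum(xs[i] * xs[i + 1] for i in range(n)) / n)
mean = sum(vals) / trials
var = sum((v - mean) ** 2 for v in vals) / trials
formula = (p**2 - p**4) / n + 2 * (n - 1) * (p**3 - p**4) / n**2
print(mean, var, formula)  # mean near p^2, sample variance near the formula
```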

  11. #11
    Laurent
    Quote Originally Posted by Moo View Post
    and the variance is messy... so I don't know if it is correct.
    It looks correct, and by the way it was very well explained!

    Since you only want to prove {\rm Var}(\tfrac{V_n}{n})\to 0, you can also skip some of the computations using the homogeneity of the variables, like {\rm Var}(V_n)=\sum_{i=1}^n {\rm Var}(Y_i)+2\sum_{i=1}^{n-1} {\rm Cov}(Y_i,Y_{i+1})=n{\rm Var}(Y_1)+2(n-1){\rm Cov}(Y_1,Y_2)\leq C n for some C, and all n\geq 1.

  12. #12
    Moo
    Quote Originally Posted by Laurent View Post
    It looks correct, and by the way it was very well explained!

    Since you only want to prove {\rm Var}(\tfrac{V_n}{n})\to 0, you can also skip some of the computations using the homogeneity of the variables, like {\rm Var}(V_n)=\sum_{i=1}^n {\rm Var}(Y_i)+2\sum_{i=1}^{n-1} {\rm Cov}(Y_i,Y_{i+1})=n{\rm Var}(Y_1)+2(n-1){\rm Cov}(Y_1,Y_2)\leq C n for some C, and all n\geq 1.
    That's cheating... you're skipping the awful part !

  13. #13
    Laurent
    Quote Originally Posted by Moo View Post
    That's cheating... you're skipping the awful part !
    Quote Originally Posted by Max Rosenlicht (1949)
    You know we all became mathematicians for the same reason: we were lazy.
