[SOLVED] Some problems (limsup, liminf, ...)

• Mar 10th 2009, 09:42 AM
Moo
[SOLVED] Some problems (limsup, liminf, ...)
Hi ^^

Probability again ! :D

So now we're dealing with Borel-Cantelli's lemma, limsup and liminf, and company... And it's disheartening to see how difficult these concepts are to grasp :D

We'll correct these exercises next week and we don't have to solve them before then, but I'm not the patient kind (Rofl) I tried to do them today, but nothing came out of these ones... Just getting some hints would really help me get through... ^^

By definition, $\liminf_n A_n=\bigcup_k \bigcap_{n \ge k} A_n$
$\limsup_n A_n=\bigcap_k \bigcup_{n \ge k} A_n$

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

1.
$(X_n)$ is a sequence of iid Bernoulli(p), p $\in$ (0,1), random variables
- show that, almost everywhere, there are infinitely many n such that $X_n=1$
>>> So basically, I have to show that $\mathbb{P}(\cap \cup A_n)=1$, that is to say $\mathbb{P}(\limsup_n A_n)=1$, where $A_n=\{X_n=1\}$
I thought I could use the inequality $\mathbb{P}(\limsup A_n)\ge \limsup \mathbb{P}(A_n)$
But I don't see how to get the RHS go to 1... (I told you I have problems with limsup :D)

- show that if, for every n, we define $B_n=\{X_n=X_{n+1}=\dots=X_{2n-1}=1\}$, then, almost everywhere, only finitely many of the $B_n$ are realised.
>>> huh ???

2.
Let $(X_n)$ be a sequence as above.
Let $Y_n=X_nX_{n+1}$ and $V_n=Y_1+\dots+Y_n$

Show that $\frac{V_n}{n}$ converges in probability to $p^2$

>>> so I have to show that $\forall \epsilon>0,~ \mathbb{P}\left(\left|\tfrac{V_n}{n}-p^2 \right|> \epsilon\right) \longrightarrow 0$
and then... ? XD

• Mar 10th 2009, 10:17 AM
Laurent
Salut, Moo,

I wish my students were as excited as you are!

Quote:

Originally Posted by Moo
1.
$(X_n)$ is a sequence of iid Bernoulli(p), p $\in$ (0,1), random variables
- show that, almost everywhere, there are infinitely many n such that $X_n=1$

Are you allowed to use Borel-Cantelli lemma? If so, look at what it says about the sequence of the (independent) events $A_n=\{X_n=1\}$.
Otherwise, you can decompose the definition of the limsup: try to show that for every $M\in\mathbb{N}$, there is almost surely an index $n\geq M$ such that $X_n=1$ (if not,...). Then find a way to deduce that almost surely this is true for all $M$ (look at the complement). (NB: "almost surely" is like "almost everywhere", in the context of probability (both are correct of course))
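If it helps to see the first step numerically (a quick sketch, where the value p = 0.3 is an arbitrary illustration, not from the exercise): the probability of seeing no success among $N$ consecutive Bernoulli(p) trials is $(1-p)^N$, which vanishes as $N$ grows.

```python
# Probability of no success among N consecutive Bernoulli(p) trials is (1-p)^N;
# it vanishes as N grows, which is why the tail union has probability 1.
p = 0.3  # arbitrary illustrative value in (0, 1)
for N in (10, 100, 1000):
    print(N, (1 - p) ** N)
```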

Quote:

- show that if, for every n, we define $B_n=\{X_n=X_{n+1}=\dots=X_{2n-1}=1\}$, then, almost everywhere, only finitely many of the $B_n$ are realised.
>>> huh ???
If you are allowed to use Borel-Cantelli, look at what it says for the sequence $(B_n)_n$. In other words, does the series $\sum_n P(B_n)$ converge?
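As a numeric aside (p = 0.5 is just an illustrative choice): since $B_n$ asks for $n$ consecutive successes, $P(B_n)=p^n$, and the partial sums of this geometric series settle at $p/(1-p)$, so the series is finite, which is exactly what the easy direction of Borel-Cantelli needs.

```python
# P(B_n) = p^n (n consecutive successes), so sum_n P(B_n) is a geometric
# series converging to p/(1-p) — finite, as Borel-Cantelli requires.
p = 0.5  # illustrative value
partial = sum(p ** n for n in range(1, 200))
print(partial, p / (1 - p))  # the partial sum is already indistinguishable from the limit
```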

Quote:

2.
Let $(X_n)$ be a sequence as above.
Let $Y_n=X_nX_{n+1}$ and $V_n=Y_1+\dots+Y_n$

Show that $\frac{V_n}{n}$ converges in probability to $p^2$

>>> so I have to show that $\forall \epsilon>0,~ \mathbb{P}\left(\left|\tfrac{V_n}{n}-p^2 \right|> \epsilon\right) \longrightarrow 0$
and then... ? XD
A key word would be "Tchebychev's inequality": compute first the expectation and variance (it needs some care) of $V_n$ in order to apply Tchebychev's inequality to $V_n$ (or $\tfrac{V_n}{n}$).
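Here is a small Monte Carlo sketch of the statement itself — not a proof, and the seed, sample sizes and $\epsilon$ below are all arbitrary choices:

```python
import random

# Empirical check that V_n/n concentrates around p^2 (a sketch, not a proof).
random.seed(0)
p, n, trials, eps = 0.5, 2000, 200, 0.05
exceed = 0
for _ in range(trials):
    x = [1 if random.random() < p else 0 for _ in range(n + 1)]
    v = sum(x[i] * x[i + 1] for i in range(n))  # V_n with Y_i = X_i X_{i+1}
    if abs(v / n - p * p) > eps:
        exceed += 1
print(exceed / trials)  # empirical P(|V_n/n - p^2| > eps): close to 0
```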
• Mar 10th 2009, 10:37 AM
Moo
Quote:

Originally Posted by Laurent
Salut, Moo,

Plop :D

Quote:

I wish my students were as excited as you are!
haha ! i'm not that excited... it's just that i'm interested xD and my teacher is really nice (Itwasntme)
what kind of students do you have ?

Quote:

Are you allowed to use Borel-Cantelli lemma? If so, look at what it says about the sequence of the (independent) events $A_n=\{X_n=1\}$.
I guess I can use it... lol
Now that I've looked through my notes again, I saw that version of Borel-Cantelli's lemma :
$\sum \mathbb{P}(A_n)=\infty \Rightarrow \mathbb{P}(\bigcap_{n>0} \bigcup_{k>n} A_k)=1$
I thought we necessarily had to have $\sum \mathbb{P}(A_n)<\infty$ (Tongueout)

So is it correct to translate the "there is almost everywhere an infinity of n such that X_n=1" into $\mathbb{P}(\cap \cup A_n)=1$ ?
In that case, the question is quite straightforward... ?

Quote:

Otherwise, you can decompose the definition of the limsup: try to show that for every $M\in\mathbb{N}$, there is almost surely an index $n\geq M$ such that $X_n=1$ (if not,...). Then find a way deduce that almost surely this is true for all $M$ (look at the complement).
that's the kind of thing I don't know how to do... how to show that "for any n, there exists..." etc ><

Quote:

(NB: "almost surely" is like "almost everywhere", in the context of probability (both are correct of course))
Yup, that's what was written in French, I guess it's a reflex from the measure theory course XD

Quote:

If you are allowed to use Borel-Cantelli, look at what it says for the sequence $(B_n)_n$. In other words, does the series $\sum_n P(B_n)$ converge?
geometric series, so it converges.
hence from BC's lemma, we get $\mathbb{P}(\limsup_n B_n)=\mathbb{P}(\cap \cup B_k)=0$
does this translate to "there is a finite number of Bn that are realised" ? (Surprised)
I really have a problem understanding the translations of $\cap$ & $\cup$... If you could clarify this, I would be grateful :p

Quote:

A key word would be "Tchebychev's inequality": compute first the expectation and variance (it needs some care) of $V_n$ in order to apply Tchebychev's inequality to $V_n$ (or $\tfrac{V_n}{n}$).
I love calculating expectations and variances :D
okay i'll work on it :)

thank you, as always !
• Mar 10th 2009, 11:48 AM
Laurent
Quote:

Originally Posted by Moo
what kind of students do you have ?

This semester, I'm a TA (chargé de td) for a probability course in "L3 Maths". I did it last year already and the students found this course really hard; they're fine when it comes to computing integrals, but most found it challenging to translate their intuitions into formulas, you see what I mean...

Quote:

In that case, the question is quite straightforward... ?
I wouldn't have dared to say it before you did, but yeah, it is... (Happy)

Quote:

that's the kind of thing I don't know how to do... how to show that "for any n, there exists..." etc ><
Maybe you can look at the proof of BC's lemma with the previous examples in mind.

"For any n" => intersection
"there exists" => union
and you can use the equality $P(A\cap B)=P(A)P(B)$ for independent events, and the inequality $P(A\cup B)\leq P(A)+P(B)$ in general.

Just an example. Translating into symbols, I suggested to first show that, for any $M\geq 1$, $P(\bigcup_{n\geq M}A_n)=1$. In order to use independence, I must handle intersections, hence let's write $P(\bigcup_{n\geq M}A_n)=1-P(\bigcap_{n\geq M}A_n^c)$. And we have $P(\bigcap_{n\geq M}A_n^c)=\prod_{n\geq M} P(A_n^c)=\prod_{n\geq M}(1-p)=0$ by independence (an infinite product of a constant factor $1-p<1$).

(If you aren't used to applying independence to infinitely many events at once, you can say $P(\bigcap_{n\geq M}A_n^c)\leq P(\bigcap_{n=M}^{M+N-1} A_n^c)=(1-p)^{N}$ for all $N$, hence $P(\bigcap_{n\geq M}A_n^c)= 0$. )

Thus $P(\bigcup_{n\geq M}A_n)=1$. What about $P(\bigcap_{M\geq 0}\bigcup_{n\geq M}A_n)$? Switching to the complement: $P(\bigcup_{M\geq 0}\left(\bigcup_{n\geq M}A_n\right)^c)\leq$ $\sum_{M\geq 0}P(\left(\bigcup_{n\geq M}A_n\right)^c)=\sum_{M\geq 0} 0=0$ (using what we got before). In conclusion, $P(\bigcap_{M\geq 0}\bigcup_{n\geq M}A_n)=1$. This is what you need.

You can remember this (this was the last part): a countable intersection of almost sure events is almost sure.

Quote:

There's really a problem with understanding the translations of $\cap \& \cup$... If you could clarify this, I would be grateful :p
To remember: $\limsup_n A_n$ is the event where infinitely many events in the family $(A_n)_n$ happen: for every $n$ there is $k>n$ such that $A_k$ happens (in other words, there are arbitrarily large $k$ such that $A_k$ happens).

In particular, the complement is "finitely many events in the family $(A_n)_n$ happen". Or: there is $n$ such that for all $k\geq n$, $A_k^c$ happens. This is a liminf:

$\liminf_n A_n$ is much stronger than $\limsup_n A_n$: it is the event where there is $n$ such that $A_k$ happens for every $k\geq n$.

Hence, we have seen $(\limsup_n A_n)^c=\liminf_n A_n^c$. Take time to get convinced of this.
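This duality can also be checked mechanically on a finite family of sets — the sample space and the $A_n$ below are arbitrary toy choices, but the identity is pure De Morgan:

```python
# De Morgan for limsup/liminf on a finite truncation:
# (intersection_k union_{n>=k} A_n)^c = union_k intersection_{n>=k} (A_n)^c.
omega = set(range(6))                          # toy sample space
A = [{0, 1}, {1, 2}, {1, 3}, {1, 4}, {1, 5}]   # toy events A_0..A_4

def limsup(sets):
    out = set(omega)
    for k in range(len(sets)):
        out &= set().union(*sets[k:])          # intersect the tail unions
    return out

def liminf(sets):
    out = set()
    for k in range(len(sets)):
        tail = set(omega)
        for s in sets[k:]:
            tail &= s                          # tail intersection
        out |= tail
    return out

complements = [omega - s for s in A]
print(omega - limsup(A) == liminf(complements))  # True
```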
• Mar 10th 2009, 12:41 PM
Moo
Quote:

Originally Posted by Laurent
This semester, I'm a TA (chargé de td) for a probability course in "L3 Maths". I did it last year already and the students found this course really hard; they're fine when it comes to computing integrals, but most found it challenging to translate their intuitions into formulas, you see what I mean...

Yes :D there is no problem with computations, but theory is not easy to grasp (as you can see... :P)
Our course is a tough one too... A friend told me that if one had 60/100 for this course, it was a very good mark (Worried)

Quote:

I wouldn't have dared to say it before you did, but yeah, it is... (Happy)
okay ^^

Quote:

Maybe you can look at the proof of BC's lemma with the previous examples in mind.

"For any n" => intersection
"there exists" => union
and you can use the equality $P(A\cap B)=P(A)P(B)$ for independent events, and the inequality $P(A\cup B)\leq P(A)+P(B)$ in general.

Just an example. Translating into symbols, I suggested to first show that, for any $M\geq 1$, $P(\bigcup_{n\geq M}A_n)=1$. In order to use independence, I must handle intersections, hence let's write $P(\bigcup_{n\geq M}A_n)=1-P(\bigcap_{n\geq M}A_n^c)$. And we have $P(\bigcap_{n\geq M}A_n^c)=\prod_{n\geq M} P(A_n^c)=\prod_{n\geq M}(1-p)=0$ by independence (an infinite product of a constant factor $1-p<1$).

(If you aren't used to applying independence to infinitely many events at once, you can say $P(\bigcap_{n\geq M}A_n^c)\leq P(\bigcap_{n=M}^{M+N-1} A_n^c)=(1-p)^{N}$ for all $N$, hence $P(\bigcap_{n\geq M}A_n^c)= 0$. )

Thus $P(\bigcup_{n\geq M}A_n)=1$. What about $P(\bigcap_{M\geq 0}\bigcup_{n\geq M}A_n)$? Switching to the complement: $P(\bigcup_{M\geq 0}\left(\bigcup_{n\geq M}A_n\right)^c)\leq$ $\sum_{M\geq 0}P(\left(\bigcup_{n\geq M}A_n\right)^c)=\sum_{M\geq 0} 0=0$ (using what we got before). In conclusion, $P(\bigcap_{M\geq 0}\bigcup_{n\geq M}A_n)=1$. This is what you need.

You can remember this (this was the last part): a countable intersection of almost sure events is almost sure.
Hey... I think I've understood what you said ! (Rofl)

Though there's a little question... :
Quote:

$P(\bigcap_{n\geq M}A_n^c)=\prod_{n\geq M} P(A_n^c)=\prod_{n\geq M}(1-p)=0$
this would hold for any independent events $A_n$ (with the condition given below) then ??
Since $P(A_n^c)<1$ (condition: let's assume only finitely many are equal to 1), we get $\prod_{n \ge M} P(A_n^c)=0$, provided we can find $\alpha<1$ such that $P(A_n^c)\le \alpha$ for all large $n$

oh... would this be, by any chance, equivalent to $\sum P(A_n)<\infty$ ? lol
if so, how ? o.O

Quote:

To remember: $\limsup_n A_n$ is the event where infinitely many events in the family $(A_n)_n$ happen: for every $n$ there is $k>n$ such that $A_k$ happens (in other words, there are arbitrarily large $k$ such that $A_k$ happens).

In particular, the complement is "finitely many events in the family $(A_n)_n$ happen". Or: there is $n$ such that for all $k\geq n$, $A_k^c$ happens. This is a liminf:

$\liminf_n A_n$ is much stronger than $\limsup_n A_n$: it is the event where there is $n$ such that $A_k$ happens for every $k\geq n$.

Hence, we have seen $(\limsup_n A_n)^c=\liminf_n A_n^c$. Take time to get convinced of this.
Yes, but I'd rather use that time to understand this ^^

Sorry for all the questions... there are always follow-up questions :P
and thank you very much for taking time to answer them !
• Mar 10th 2009, 02:02 PM
Laurent
Quote:

Originally Posted by Moo
Though there's a little question... :

this would hold for any independent events $A_n$ (with the condition given below) then ??
Since $P(A_n^c)<1$ (condition: let's assume only finitely many are equal to 1), we get $\prod_{n \ge M} P(A_n^c)=0$, provided we can find $\alpha<1$ such that $P(A_n^c)\le \alpha$ for all large $n$

oh... would this be, by any chance, equivalent to $\sum P(A_n)\textcolor{red}{=}\infty$ ? lol
if so, how ? o.O

Good point. If $\sum_n P(A_n)=\infty$, then we have indeed:

$\prod_{n\ge M} P(A_n^c)=\prod_{n\geq M}(1-P(A_n))\leq$ $\prod_{n\ge M}e^{-P(A_n)}=e^{-\sum_{n\ge M} P(A_n)}=0$

(using $1-x\leq e^{-x}$ for every $x$). This gives a proof for one part of BC's lemma.
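A numeric sanity check of that inequality and of the divergence argument (taking $P(A_n)=1/n$, an arbitrary divergent example, where the product telescopes):

```python
import math

# 1 - x <= exp(-x) for all x, so a divergent sum sum_n P(A_n) forces the
# tail product prod_n (1 - P(A_n)) down to 0, even if each factor is near 1.
assert all(1 - x <= math.exp(-x) for x in [i / 100 for i in range(-300, 301)])

prod = 1.0
for n in range(2, 100000):
    prod *= 1 - 1 / n   # telescopes: prod_{n=2}^{N} (n-1)/n = 1/N
print(prod)  # ~ 1/99999, tending to 0 as the range grows
```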

(As for the other part of BC's lemma, it is called the "easy part". Here's why: we have $\sum_n P(A_n)=\sum_n E[{\bf 1}_{A_n}]=E[\sum_n {\bf 1}_{A_n}]$ using monotone convergence or Fubini. If $\sum_n P(A_n)<\infty$, then $E[\sum_n {\bf 1}_{A_n}]<\infty$, so that almost surely $\sum_n {\bf 1}_{A_n}<\infty$. Since this r.v. counts how many events happen, we have the result.)
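The counting identity in the easy part can be sampled numerically too — here with $P(A_n)=2^{-n}$; the seed, trial count, and truncation at 30 terms are all illustrative choices. The expected number of events that happen is $\sum_n 2^{-n}=1$, so the count is finite almost surely.

```python
import random

# With P(A_n) = 2^{-n} for n >= 1, E[number of A_n that happen] = sum_n 2^{-n} = 1,
# so almost surely only finitely many of the events occur.
random.seed(2)
trials, total = 10000, 0
for _ in range(trials):
    total += sum(1 for n in range(1, 31) if random.random() < 2 ** -n)
print(total / trials)  # empirical mean count, close to 1
```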

This is one of my favorite topics, so I'm truly glad to help!
• Mar 10th 2009, 11:15 PM
matheagle
Number 2 is just the weak law of large numbers or what's called the degenerate convergence theorem.
• Mar 11th 2009, 01:58 AM
Laurent
Quote:

Originally Posted by matheagle
Number 2 is just the weak law of large numbers or what's called the degenerate convergence theorem.

It is a weak law of large numbers, but I wouldn't say "just the weak LLN", because there's a subtlety in that the random variables $Y_n\ (n\geq 1)$ aren't independent. However, I've never heard of the degenerate convergence theorem, so I can't comment on that part.
• Mar 11th 2009, 10:32 AM
matheagle
I knew that these rvs were dependent.
The nice thing about the Weak Law, page 356 of Chow and Teicher, is that it gives necessary and sufficient conditions, just like the Central Limit Theorem.
But that Weak Law does assume independence, so I was wrong.
Interestingly there isn't a necessary and sufficient Strong Law.
• Mar 15th 2009, 02:49 AM
Moo
Okay, so for 2., I found the expectation of $\frac{V_n}{n}$ to be $p^2$ (woaaaah xD)

used above : $\mathbb{E}(X_iX_j)=p^2$
useful for below : $\mathbb{E}(X_i^2)=p$

and the variance is messy... so I don't know if it is correct.
Here is the working :

\begin{aligned}
n^2 \text{Var}\left(\tfrac{V_n}{n}\right)
&=\text{Var}(V_n) \\
&=\text{Var}(Y_1+\dots+Y_n) \\
&=\sum_{i=1}^n \text{Var}(Y_i)+2 \sum_{1\leq i<j\leq n} \text{Cov}(Y_i,Y_j) \quad (*)
\end{aligned}

$\text{Var}(Y_i)=\mathbb{E}(X_i^2X_{i+1}^2)-[\mathbb{E}(X_iX_{i+1})]^2=p^2-p^4$
and
Since $Y_i$ is $\sigma(X_i,X_{i+1})$-measurable and $Y_j$ is $\sigma(X_j,X_{j+1})$-measurable, $Y_i$ and $Y_j$ are independent iff $j-i>1$ (and thus their covariance is 0)

$(*)=n(p^2-p^4)+2 \sum_{i=1}^{n-1} \text{Cov}(Y_i,Y_{i+1})$

Now calculating $\text{Cov}(Y_i,Y_{i+1})$ :
$\text{Cov}(Y_i,Y_{i+1})=\mathbb{E}(Y_iY_{i+1})-\mathbb{E}(Y_i)\mathbb{E}(Y_{i+1})=\mathbb{E}(X_i) \mathbb{E}(X_{i+1}^2)\mathbb{E}(X_{i+2})-p^4=p^3-p^4$

$(*)=n(p^2-p^4)+2(n-1)(p^3-p^4)$

Hence $\text{Var}\left(\tfrac{V_n}{n}\right)=\frac{p^2-p^4}{n}+\frac{2(n-1)(p^3-p^4)}{n^2}$
which indeed goes to 0, but is there a mistake somewhere ? :( I don't like this expression :D
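One way to double-check, as a brute-force sketch (n = 4 and p = 0.3 are arbitrary small test values): enumerate all outcomes of $(X_1,\dots,X_{n+1})$ exactly and compare $\text{Var}(V_n)$ with the formula above.

```python
from itertools import product as outcomes

# Exact brute-force check of Var(V_n) = n(p^2 - p^4) + 2(n-1)(p^3 - p^4).
p, n = 0.3, 4  # arbitrary small test case
ev = ev2 = 0.0
for xs in outcomes((0, 1), repeat=n + 1):       # all values of (X_1, ..., X_{n+1})
    prob = 1.0
    for x in xs:
        prob *= p if x else 1 - p               # P(X_i = x), independent factors
    v = sum(xs[i] * xs[i + 1] for i in range(n))  # V_n = Y_1 + ... + Y_n
    ev += prob * v
    ev2 += prob * v * v
var = ev2 - ev ** 2
formula = n * (p**2 - p**4) + 2 * (n - 1) * (p**3 - p**4)
print(abs(var - formula) < 1e-12)  # True
```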
• Mar 15th 2009, 04:21 AM
Laurent
Quote:

Originally Posted by Moo
and the variance is messy... so I don't know if it is correct.

It looks correct, and by the way it was very well explained! (Clapping)

Since you only want to prove ${\rm Var}(\tfrac{V_n}{n})\to 0$, you can also skip some of the computations using the homogeneity of the variables, like ${\rm Var}(V_n)=\sum_{i=1}^n {\rm Var}(Y_i)+2\sum_{i=1}^{n-1} {\rm Cov}(Y_i,Y_{i+1})=n{\rm Var }(Y_1)+2(n-1){\rm Cov}(Y_1,Y_2)\leq C n$ for some $C$, and all $n\geq 1$.
• Mar 15th 2009, 05:01 AM
Moo
Quote:

Originally Posted by Laurent
It looks correct, and by the way it was very well explained! (Clapping)

Since you only want to prove ${\rm Var}(\tfrac{V_n}{n})\to 0$, you can also skip some of the computations using the homogeneity of the variables, like ${\rm Var}(V_n)=\sum_{i=1}^n {\rm Var}(Y_i)+2\sum_{i=1}^{n-1} {\rm Cov}(Y_i,Y_{i+1})=n{\rm Var }(Y_1)+2(n-1){\rm Cov}(Y_1,Y_2)\leq C n$ for some $C$, and all $n\geq 1$.

That's cheating... you're skipping the awful part ! :D
• Mar 15th 2009, 12:38 PM
Laurent
Quote:

Originally Posted by Moo
That's cheating... you're skipping the awful part ! :D

Quote:

Originally Posted by Max Rosenlicht (1949)
You know we all became mathematicians for the same reason: we were lazy.

(Giggle)