# Thread: A simple problem of convergence...

1. ## A simple problem of convergence...

...But I can't get it

We have a sequence of iid nonnegative rv's $\displaystyle (X_n)_{n\in\mathbb{N}}$

Assuming that $\displaystyle E[X_1]$ is finite, we have by the sLLN that $\displaystyle \frac{X_1+\dots+X_n}{n}$ converges almost surely to the finite limit $\displaystyle E[X_1]$.

But how can I conclude that $\displaystyle \lim_{n\to\infty} \frac{X_n}{n}=0$ ?

I tried to relate it to the series $\displaystyle \sum \frac{X_n}{n}$, but the inequality goes the other way from the one that would be helpful...

...so since inequalities don't help me, I hope you guys can help me

Thanks !

Note : I also - miserably - tried Cesàro's mean.

2. By subtraction

$\displaystyle {X_n\over n} ={S_n-S_{n-1}\over n}$

$\displaystyle ={S_n\over n}-\left({S_{n-1}\over n-1}\right)\left({n-1\over n}\right)$

Now take your limits
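To see the subtraction trick numerically, here is a quick sketch (using Exponential(1) variables as an illustrative choice, so $\displaystyle E[X_1]=1$ — none of these specifics come from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x = rng.exponential(scale=1.0, size=n)   # iid nonnegative, E[X_1] = 1
s = np.cumsum(x)                         # partial sums S_1, ..., S_n

# S_n/n is close to E[X_1] = 1 by the strong law of large numbers
print(s[-1] / n)

# The identity X_n/n = S_n/n - (S_{n-1}/(n-1)) * ((n-1)/n), checked at the last index
lhs = x[-1] / n
rhs = s[-1] / n - (s[-2] / (n - 1)) * ((n - 1) / n)
print(abs(lhs - rhs))    # zero up to floating-point rounding

# Both factors of the second term converge, so X_n/n is forced toward 0
print(x[-1] / n)
```

Since $\displaystyle S_n/n$ and $\displaystyle S_{n-1}/(n-1)$ converge to the same finite limit and $\displaystyle (n-1)/n\to 1$, the difference must go to 0.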

3. I have a doubt about this:

If $\displaystyle Y_n\to a$ almost surely, then for any continuous function g, $\displaystyle g(Y_n)\to g(a)$ a.s.

But here, doesn't g depend on n?

4. I think you can use Borel-Cantelli too since you have a first moment...

$\displaystyle \sum_{n=1}^{\infty}P\{|X|>\epsilon n\}<\infty$
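For a concrete nonnegative example (an illustrative choice, not from the thread): if $\displaystyle X\sim\mathrm{Exp}(1)$ then $\displaystyle P(|X|>\epsilon n)=e^{-\epsilon n}$, and the sum is a convergent geometric series:

```python
import math

eps = 0.5
# For X ~ Exponential(1): P(|X| > eps * n) = exp(-eps * n)
tail_sum = sum(math.exp(-eps * n) for n in range(1, 10_000))
# Geometric series: sum_{n>=1} r^n = r / (1 - r) with r = exp(-eps)
closed_form = math.exp(-eps) / (1 - math.exp(-eps))
print(tail_sum, closed_form)
```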

5. Originally Posted by matheagle
I think you can use Borel-Cantelli too since you have a first moment...

$\displaystyle \sum_{n=1}^{\infty}P\{|X|>\epsilon n\}<\infty$
Hmm, does this hold whether X is a continuous or a discrete rv? And $\displaystyle P(|X|>\epsilon n)=P(|X_n|/n>\epsilon)$, okay, but how does it actually help?

Sorry, I'm a bit lost (and tired, it's 2am here )

And the hint says to use the LLN...

6. Look at Theorem 2 on page 125 of: Probability theory: independence ... - Google Books

7. Originally Posted by Moo
I have a doubt about this:

If $\displaystyle Y_n\to a$ almost surely, then for any continuous function g, $\displaystyle g(Y_n)\to g(a)$ a.s.

But here, doesn't g depend on n?
MathEagle's proof works fine: almost surely,
- the sequence $\displaystyle \Big(\frac{S_n}{n}\Big)_n$ converges to $\displaystyle E[X_1]$,
- the sequence $\displaystyle \Big(\frac{S_{n-1}}{n-1}\Big)_n$ converges to $\displaystyle E[X_1]$ (because of the first point),
- the sequence $\displaystyle \Big(\frac{n-1}{n}\Big)_n$ converges to 1 (obviously),
and these three facts together (I could also have left the third one aside, since it is deterministic) imply that $\displaystyle \Big(\frac{X_n}{n}\Big)_n$ converges to 0, hence this latter limit holds almost surely.
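Spelled out, the three limits combine as:

$\displaystyle \frac{X_n}{n}=\frac{S_n}{n}-\frac{S_{n-1}}{n-1}\cdot\frac{n-1}{n}\ \longrightarrow\ E[X_1]-E[X_1]\cdot 1=0\quad\text{a.s.}$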
By the way, I'm not sure how to use Borel-Cantelli here.

8. Thank(s to) you, I feel stupid now !

9. Originally Posted by Moo
Thank(s to) you, I feel stupid now !
I'm glad that I could help.

10. If $\displaystyle E|X|<\infty$ then $\displaystyle E\left|{X\over \epsilon}\right|<\infty$ for all $\displaystyle \epsilon>0$

Hence $\displaystyle \sum_{n=1}^{\infty}P(|X_n|>n\epsilon)=\sum_{n=1}^{\infty}P(|X|>n\epsilon)<\infty$

This follows from

$\displaystyle \sum_{n=1}^{\infty}P(|X|>n^{1/r})\le E|X|^r\le \sum_{n=0}^{\infty}P(|X|>n^{1/r})<\infty$

Thus, by Borel-Cantelli, almost surely $\displaystyle |X_n|/n>\epsilon$ only finitely often for each $\displaystyle \epsilon>0$, so $\displaystyle X_n/n\to 0$ almost surely.

I don't understand why the rvs need to be nonnegative, though.
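As a sanity check of the sandwich with $\displaystyle r=1$ and $\displaystyle X\sim\mathrm{Exp}(1)$ (an illustrative choice, not from the thread: $\displaystyle P(|X|>t)=e^{-t}$ and $\displaystyle E|X|=1$):

```python
import math

# X ~ Exponential(1): P(|X| > t) = exp(-t), E|X| = 1, take r = 1
lower = sum(math.exp(-n) for n in range(1, 1000))  # sum_{n>=1} P(|X| > n)
upper = sum(math.exp(-n) for n in range(0, 1000))  # sum_{n>=0} P(|X| > n)
print(lower, 1.0, upper)   # lower = 1/(e-1), and lower <= E|X| <= upper
```

(For a continuous rv the events $\displaystyle \{|X|\ge n\}$ and $\displaystyle \{|X|>n\}$ have the same probability, so the strict/non-strict distinction doesn't matter here.)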

11. Originally Posted by matheagle
If $\displaystyle E|X|<\infty$ then $\displaystyle E\left|{X\over \epsilon}\right|<\infty$ for all $\displaystyle \epsilon>0$

Hence $\displaystyle \sum_{n=1}^{\infty}P(|X_n|>n\epsilon)=\sum_{n=1}^{\infty}P(|X|>n\epsilon)<\infty$

This follows from

$\displaystyle \sum_{n=1}^{\infty}P(|X|>n^{1/r})\le E|X|^r\le \sum_{n=0}^{\infty}P(|X|>n^{1/r})<\infty$

Thus $\displaystyle X_n/n\to 0$

I don't understand why the rvs are nonnegative.
I was wondering the same thing when I saw your second post, but I'm not sure: doesn't this inequality hold for discrete random variables only?

Perhaps they're nonnegative because I'm studying populations, so they can't be negative lol. And it's E[X], not E|X|

12. Originally Posted by Moo
I was wondering the same thing when I saw your second post, but I'm not sure: doesn't this inequality hold for discrete random variables only?

Perhaps they're nonnegative because I'm studying populations, so it can't be negative lol. And it's E[X], not E|X|
Let $\displaystyle a_n$ be any positive, strictly increasing sequence going to infinity. Let $\displaystyle a(x)$ be any continuous, strictly increasing extension of $\displaystyle a_n$ (so that the inverse $\displaystyle a^{-1}$ exists); then for all positive random variables X...

$\displaystyle \sum_{n=1}^{\infty}P(X\ge a_n) \le E(a^{-1}(X))\le \sum_{n=0}^{\infty}P(X> a_n)$

If we let $\displaystyle f(x)=a^{-1}(x)$ and

$\displaystyle W=\sum_{k=1}^{\infty}kI(k\le f(X)<k+1)$

$\displaystyle Y=\sum_{k=0}^{\infty}(k+1)I(k< f(X)\le k+1)$

we have $\displaystyle W\le f(X)\le Y$ so $\displaystyle E(W)\le E(f(X))\le E(Y)$

However

$\displaystyle \sum_{n=1}^{\infty}P(X\ge a_n) \le \sum_{n=1}^{\infty}P(f(X)\ge n)$

$\displaystyle =\sum_{n=1}^{\infty}\sum_{k=n}^{\infty}P(k\le f(X)<k+1)$

$\displaystyle =\sum_{k=1}^{\infty}kP(k\le f(X)<k+1)=E(W)$

while, since $\displaystyle \{X>a_n\}=\{f(X)>n\}$ (as $\displaystyle f$ is strictly increasing) and $\displaystyle f(X)$ is a.s. finite,

$\displaystyle \sum_{n=0}^{\infty}P(X> a_n) = \sum_{n=0}^{\infty}\sum_{k=n}^{\infty}P(k< f(X)\le k+1)$

$\displaystyle =\sum_{k=0}^{\infty}(k+1)P(k< f(X)\le k+1)=E(Y)$

so $\displaystyle E(f(X))\le E(Y)=\sum_{n=0}^{\infty}P(X> a_n)$, which is the upper bound.

So, taking $\displaystyle a_n=n^{1/r}$, for any random variable X and positive number r...

$\displaystyle \sum_{n=1}^{\infty}P(|X|\ge n^{1/r})\le E|X|^r\le \sum_{n=0}^{\infty}P(|X|> n^{1/r})$
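A quick numerical check of the general statement, with illustrative choices (none from the thread) $\displaystyle a_n=n^2$, so $\displaystyle a^{-1}(x)=\sqrt{x}$, and $\displaystyle X\sim\mathrm{Exp}(1)$, for which $\displaystyle E[\sqrt{X}]=\Gamma(3/2)=\sqrt{\pi}/2$:

```python
import math

# a_n = n^2, so a^{-1}(x) = sqrt(x); X ~ Exponential(1): P(X > t) = exp(-t)
lower = sum(math.exp(-n**2) for n in range(1, 100))  # sum_{n>=1} P(X >= a_n)
upper = sum(math.exp(-n**2) for n in range(0, 100))  # sum_{n>=0} P(X > a_n)
e_sqrt_x = math.sqrt(math.pi) / 2                    # E[a^{-1}(X)] = E[sqrt(X)]
print(lower, e_sqrt_x, upper)   # the middle value sits between the two sums
```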