# Thread: A simple problem of convergence...

1. A simple problem of convergence...

...But I can't get it

We have a sequence of iid nonnegative rv's $(X_n)_{n\in\mathbb{N}}$

Assuming that $E[X_1]$ is finite, we have by the SLLN (strong law of large numbers) that $\frac{X_1+\dots+X_n}{n}$ converges to a finite limit.

But how can I conclude that $\lim_{n\to\infty} \frac{X_n}{n}=0$ ?

I tried to relate it to the series $\sum \frac{X_n}{n}$, but the inequality goes the other way round from the one that would be helpful...

...so since inequalities don't help me, I hope you guys can help me

Thanks !

Note : I also - miserably - tried Cesàro's mean.

2. By subtraction

${X_n\over n} ={S_n-S_{n-1}\over n}$

$={S_n\over n}-\left({S_{n-1}\over n-1}\right)\left({n-1\over n}\right)$
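A quick numerical sanity check of this identity (the exponential distribution, seed, and sample size below are my own choices, just for illustration):

```python
import random

random.seed(0)
n_max = 100_000
s = 0.0       # running sum S_n
s_prev = 0.0  # S_{n-1}
for n in range(1, n_max + 1):
    s_prev = s
    x = random.expovariate(1.0)  # X_n ~ Exp(1), so E[X_1] = 1
    s += x
    if n > 1:
        # identity: X_n/n = S_n/n - (S_{n-1}/(n-1)) * ((n-1)/n)
        lhs = x / n
        rhs = s / n - (s_prev / (n - 1)) * ((n - 1) / n)
        assert abs(lhs - rhs) < 1e-9

print(s / n_max)  # close to E[X_1] = 1 by the SLLN
print(x / n_max)  # the last X_n/n, close to 0
```

Since both terms on the right-hand side converge to $E[X_1]$ almost surely, their difference $X_n/n$ goes to 0.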

3. If $Y_n\to a$ almost surely, then for any continuous function g, $g(Y_n)\to g(a)$ a.s.

But here, g depends on n ?

4. I think you can use Borel-Cantelli too since you have a first moment...

$\sum_{n=1}^{\infty}P\{|X|>\epsilon n\}<\infty$
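For a concrete case (my own example, not from the thread): if $X\sim\text{Exp}(1)$, the series is geometric and its convergence can be checked directly:

```python
import math

eps = 0.5  # an arbitrary epsilon > 0
# For X ~ Exp(1): P(|X| > eps*n) = exp(-eps*n), a geometric series.
partial = sum(math.exp(-eps * n) for n in range(1, 1000))
closed_form = math.exp(-eps) / (1 - math.exp(-eps))
print(partial, closed_form)  # partial sum matches the geometric closed form
```

With the sum finite, Borel-Cantelli gives $P(|X_n| > \epsilon n \text{ infinitely often}) = 0$.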

5. Originally Posted by matheagle
I think you can use Borel-Cantelli too since you have a first moment...

$\sum_{n=1}^{\infty}P\{|X|>\epsilon n\}<\infty$
Hmm, does it hold whether X is a continuous or a discrete rv? And $P(|X|>\epsilon n)$ gives the tail of $X_n/n$, okay, but how does it actually help?

Sorry, I'm a bit lost (and tired, it's 2am here)

And the hint says to use the LLN...

6. Look at Theorem 2 on page 125 of..........

Probability theory: independence ... - Google Books

7. Originally Posted by Moo

If $Y_n\to a$ almost surely, then for any continuous function g, $g(Y_n)\to g(a)$ a.s.

But here, g depends on n ?
MathEagle's proof works fine : almost surely,
- the sequence $\Big(\frac{S_n}{n}\Big)_n$ converges to $E[X_1]$
- the sequence $\Big(\frac{S_{n-1}}{n-1}\Big)_n$ converges to $E[X_1]$ (because of the first point)
- the sequence $\Big(\frac{n-1}{n}\Big)_n$ converges to 1 (obviously),
and these three events together (I could also have set the third one aside) imply that $\Big(\frac{X_n}{n}\Big)_n$ converges to 0, hence this latter limit holds almost surely.
By the way, I'm not sure how to use Borel-Cantelli here.

8. Thank(s to) you, I feel stupid now !

9. Originally Posted by Moo
Thank(s to) you, I feel stupid now !
I'm glad that I could help.

10. If $E|X|<\infty$ then $E\left|{X\over \epsilon}\right|<\infty$ for all $\epsilon>0$

Hence $\sum_{n=1}^{\infty}P(|X_n|>n\epsilon)=\sum_{n=1}^{\infty}P(|X|>n\epsilon)<\infty$

This follows from

$\sum_{n=1}^{\infty}P(|X|>n^{1/r})\le E|X|^r\le \sum_{n=0}^{\infty}P(|X|>n^{1/r})<\infty$

Thus $X_n/n\to 0$

I don't understand why the rvs are nonnegative.
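A numerical check of the sandwich inequality above, with $X\sim\text{Exp}(1)$ and $r=1$ (both my own choices):

```python
import math

r = 1
EX_r = 1.0  # E|X|^r for X ~ Exp(1) and r = 1
# sum_{n>=1} P(|X| > n^{1/r}), using P(X > t) = exp(-t) for Exp(1)
lower = sum(math.exp(-n ** (1 / r)) for n in range(1, 2000))
upper = lower + 1.0  # adds the n = 0 term, P(X > 0) = 1
print(lower, EX_r, upper)  # lower <= E|X|^r <= upper
```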

11. Originally Posted by matheagle
If $E|X|<\infty$ then $E\left|{X\over \epsilon}\right|<\infty$ for all $\epsilon>0$

Hence $\sum_{n=1}^{\infty}P(|X_n|>n\epsilon)=\sum_{n=1}^{\infty}P(|X|>n\epsilon)<\infty$

This follows from

$\sum_{n=1}^{\infty}P(|X|>n^{1/r})\le E|X|^r\le \sum_{n=0}^{\infty}P(|X|>n^{1/r})<\infty$

Thus $X_n/n\to 0$

I don't understand why the rvs are nonnegative.
I was wondering the same when I saw your second post, but I'm not sure: doesn't this inequality hold for discrete random variables only?

Perhaps they're nonnegative because I'm studying populations, so they can't be negative lol. And it's E[X], not E|X|

12. Originally Posted by Moo
I was wondering the same when I saw your second post, but I'm not sure: doesn't this inequality hold for discrete random variables only?

Perhaps they're nonnegative because I'm studying populations, so they can't be negative lol. And it's E[X], not E|X|
Let $a_n$ be any positive strictly increasing sequence going to infinity. Let $a(x)$ be any continuous, strictly increasing extension of $a_n$ (so that $a^{-1}$ exists); then for all positive random variables X...

$\sum_{n=1}^{\infty}P(X\ge a_n) \le E(a^{-1}(X))\le \sum_{n=0}^{\infty}P(X> a_n)$

If we let $f(x)=a^{-1}(x)$ and

$W=\sum_{k=1}^{\infty}kI(k\le f(X)< k+1)$

$Y=\sum_{k=0}^{\infty}(k+1)I(k< f(X)\le k+1)$

we have $W\le f(X)\le Y$ so $E(W)\le E(f(X))\le E(Y)$

However

$\sum_{n=1}^{\infty}P(X\ge a_n) \le \sum_{n=1}^{\infty}P(f(X)\ge n)$

$=\sum_{n=1}^{\infty}\sum_{k=n}^{\infty}P(k\le f(X)< k+1)$

$=\sum_{k=1}^{\infty}kP(k\le f(X)< k+1)=E(W)$

while

$\sum_{n=0}^{\infty}P(X> a_n) \le \sum_{n=0}^{\infty}\sum_{k=n}^{\infty}P(k< f(X)\le k+1)$

$=\sum_{k=0}^{\infty}(k+1)P(k< f(X)\le k+1)=E(Y)$

So for any random variable X and positive number r...

$\sum_{n=1}^{\infty}P(|X|\ge n^{1/r})\le E|X|^r\le \sum_{n=0}^{\infty}P(|X|> n^{1/r})$
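As an illustration of the general bound (the choices $a_n = n^2$, hence $a^{-1}(x)=\sqrt{x}$, and $X\sim\text{Exp}(1)$ are mine):

```python
import math

# a_n = n^2, so f(x) = a^{-1}(x) = sqrt(x); X ~ Exp(1)
lower = sum(math.exp(-n * n) for n in range(1, 100))  # sum_{n>=1} P(X >= n^2)
upper = lower + 1.0                                   # n = 0 term: P(X > 0) = 1
E_fX = math.sqrt(math.pi) / 2                         # E[sqrt(X)] = Gamma(3/2)
print(lower, E_fX, upper)
```

As expected, $E[\sqrt{X}]$ sits between the two tail sums.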