# E(∑ |X_k|) < ∞ ? How can we prove it?

• Oct 16th 2009, 02:04 AM
kingwinner
E(∑ |X_k|) < ∞ ? How can we prove it?
Let (X_k) be a sequence of random variables, so that |X_k| ≥ 0 for every k.

If we are given that

$\sum_{k=1}^{\infty} |X_k| < \infty$ and

$E\left(\sum_{k=1}^{\infty} |X_k|\right) < \infty$,

does this imply that

$E\left(\sum_{k=1}^{n} |X_k|\right) < \infty$ ?

An infinite sum is by definition the limit of the sequence of partial sums and intuition seems to suggest that the above is true, but how can we prove it rigorously?

Note: the approach of my book starts with the following axioms for expectation:
1. X ≥ 0 ⇒ E(X) ≥ 0
2. E(cX + dY) = c E(X) + d E(Y)
3. E(1) = 1
4. If X_1 ≤ X_2 ≤ ... and lim X_n(ω) = X(ω), then lim E(X_n) = E(X) [same as the monotone convergence theorem]

• Oct 16th 2009, 01:59 PM
rn443
Quote:

Originally Posted by kingwinner
Let (X_k) be a sequence of random variables, so that |X_k| ≥ 0 for every k.

If we are given that

$\sum_{k=1}^{\infty} |X_k| < \infty$ and

$E\left(\sum_{k=1}^{\infty} |X_k|\right) < \infty$,

does this imply that

$E\left(\sum_{k=1}^{n} |X_k|\right) < \infty$ ?

An infinite sum is by definition the limit of the sequence of partial sums and intuition seems to suggest that the above is true, but how can we prove it rigorously?

Since |X_n| >= 0, the sequence of partial sums is increasing. The result follows by the monotone convergence theorem.
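Spelling this out (notation mine): write $S_n = \sum_{k=1}^n |X_k|$ and $S = \sum_{k=1}^\infty |X_k|$. Then the argument is

```latex
% S_n is nondecreasing in n because each added term is nonnegative,
% and S_n(omega) -> S(omega) by the definition of the infinite sum:
\[
  S_{n+1} - S_n = |X_{n+1}| \ge 0, \qquad S_n(\omega) \uparrow S(\omega).
\]
% Axiom 4 (monotone convergence) then gives E(S_n) -> E(S), and since
% E(S_n) is a nondecreasing sequence with limit E(S), each term is
% bounded by the limit:
\[
  E\Big(\sum_{k=1}^{n} |X_k|\Big) = E(S_n) \le E(S) < \infty
  \quad \text{for every } n.
\]
```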
• Oct 16th 2009, 02:54 PM
Laurent
No need for the monotone convergence theorem here. Simply say $E[\sum_{k=1}^n |X_k|]\leq E[\sum_{k=1}^\infty |X_k|]<\infty$, where the first inequality follows from your first axiom (more explicitly, if $X\leq Y$ then $Y-X\geq 0$, hence $E[Y-X]\geq 0$ by 1., i.e. $E[X]\leq E[Y]$; this is called monotonicity of the expectation).
• Oct 16th 2009, 03:18 PM
kingwinner
Quote:

Originally Posted by Laurent
No need for the monotone convergence theorem here. Simply say $E[\sum_{k=1}^n |X_k|]\leq E[\sum_{k=1}^\infty |X_k|]<\infty$, where the first inequality follows from your first axiom (more explicitly, if $X\leq Y$ then $Y-X\geq 0$, hence $E[Y-X]\geq 0$ by 1., i.e. $E[X]\leq E[Y]$; this is called monotonicity of the expectation).

Hi,

I follow your second point, but I don't understand the step before it.
Why is it true that $\sum_{k=1}^n |X_k|\leq \sum_{k=1}^\infty |X_k|$?
Here the right side is really defined as a LIMIT...(the result SEEMS obvious here, but how can we JUSTIFY it? i.e. how can we show that the left side is less than or equal to the LIMIT of the right side?)

Thanks for explaining! :)
• Oct 16th 2009, 03:29 PM
rn443
Quote:

Originally Posted by kingwinner
Hi,

I follow your second point, but I don't understand the step before it.
Why is it true that $\sum_{k=1}^n |X_k|\leq \sum_{k=1}^\infty |X_k|$?
Here the right side is really defined as a LIMIT...(the result SEEMS obvious here, but how can we JUSTIFY it? i.e. how can we show that the left side is less than or equal to the LIMIT of the right side?)

Thanks for explaining! :)

Since each r.v. is nonnegative, the tail sum |X_{n+1}| + |X_{n+2}| + ... is itself nonnegative. (This is because the partial sums of this tail series are increasing and nonnegative, so their supremum cannot be negative.) Therefore |X_1| + |X_2| + ... + |X_n| <= (|X_1| + |X_2| + ... + |X_n|) + (|X_{n+1}| + |X_{n+2}| + ...) = |X_1| + |X_2| + ...
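Not a proof, of course, but here is a quick numerical sanity check of that inequality. Everything in it is my own illustrative choice (the summands $|X_k| = |Z_k|/2^k$ with $Z_k$ standard normal, the helper `sample_terms`, the cutoffs), picked so that $E[\sum_k |X_k|] = E|Z| \sum_k 2^{-k} < \infty$:

```python
import random

random.seed(0)

def sample_terms(n_terms):
    """One realization of (|X_1|, ..., |X_n|) with |X_k| = |Z_k| / 2^k."""
    return [abs(random.gauss(0, 1)) / 2**k for k in range(1, n_terms + 1)]

n_paths, n_terms, cutoff = 10_000, 50, 5
partial_mean = 0.0  # Monte Carlo estimate of E[ sum_{k<=5} |X_k| ]
total_mean = 0.0    # Monte Carlo estimate of E[ sum_{k<=50} |X_k| ],
                    # a stand-in for the full infinite sum

for _ in range(n_paths):
    terms = sample_terms(n_terms)
    partial_mean += sum(terms[:cutoff]) / n_paths
    total_mean += sum(terms) / n_paths

# Pathwise, dropping nonnegative tail terms can only shrink the sum,
# so the partial-sum mean never exceeds the (truncated) full-sum mean.
assert 0.0 < partial_mean <= total_mean
print(partial_mean, total_mean)
```

The `assert` holds path by path, not just on average: every realization satisfies sum(terms[:5]) <= sum(terms) because the discarded terms are nonnegative, which is exactly the point being argued above.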