Let |X_k| ≥ 0 be a sequence of random variables.
If we are given that

∑_{k=1}^∞ |X_k| < ∞  and  E(∑_{k=1}^∞ |X_k|) < ∞,

does this imply that

E(∑_{k=1}^n |X_k|) < ∞ ?
An infinite sum is by definition the limit of the sequence of partial sums, and intuition suggests that the above is true, but how can we prove it rigorously?
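To spell out the definition I have in mind (written in LaTeX for clarity; the shorthand S_n and S is mine, not the book's):

\[
S_n := \sum_{k=1}^{n} |X_k|, \qquad S(\omega) := \sum_{k=1}^{\infty} |X_k|(\omega) = \lim_{n\to\infty} S_n(\omega) \quad \text{for each } \omega .
\]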
Note: my book's approach starts from the following axioms for expectation:
1. X ≥ 0 => E(X) ≥ 0
2. E(cX + dY) = c E(X) + d E(Y)
3. E(1) = 1
4. If X_1 ≤ X_2 ≤ ... and lim X_n(ω) = X(ω) for every ω, then lim E(X_n) = E(X) [same as the monotone convergence theorem]
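For reference, here is how I suspect axiom 4 is meant to enter (only a sketch in LaTeX, with S_n and S as above; it is exactly the comparison of E(S_n) with E(S) that I cannot justify):

\[
S_1 \le S_2 \le \cdots \ \text{ and } \ \lim_{n\to\infty} S_n(\omega) = S(\omega), \quad \text{so by axiom 4,} \quad \lim_{n\to\infty} E(S_n) = E(S) < \infty .
\]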
I have been thinking about this for an hour now and I am pretty confused...can someone please help? I would really appreciate it!
Hi,
I follow your second point, but I don't understand the step before it.
Why is it true that ∑_{k=1}^n |X_k| ≤ ∑_{k=1}^∞ |X_k| ?
Here the right side is really defined as a LIMIT... (the result SEEMS obvious, but how can we JUSTIFY it? That is, how can we show that the left side is less than or equal to the limit of the partial sums on the right?)
Thanks for explaining!
Since each r.v. is nonnegative, the infinite sum |X_{n+1}| + |X_{n+2}| + ... is itself nonnegative. (This is because the partial sums of this series are increasing and nonnegative, and so their supremum cannot be negative.) Therefore

|X_1| + |X_2| + ... + |X_n| ≤ (|X_1| + |X_2| + ... + |X_n|) + (|X_{n+1}| + |X_{n+2}| + ...) = |X_1| + |X_2| + ...
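Taking expectations then finishes the argument. A sketch of the remaining step in LaTeX, using only axioms 1 and 2 from the original post (monotonicity of E is not itself one of the listed axioms, so it is derived in the second display):

\[
E\Big(\sum_{k=1}^{n} |X_k|\Big) \;\le\; E\Big(\sum_{k=1}^{\infty} |X_k|\Big) \;<\; \infty ,
\]
because whenever X ≤ Y pointwise,
\[
E(Y) - E(X) = E(Y - X) \ge 0
\]
by axiom 2 (linearity) and axiom 1 (applied to Y − X ≥ 0).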