Let (X_k) be a sequence of random variables, so that |X_k| ≥ 0 for every k.
If we are given that
∑ |X_k| < ∞ almost surely and
∑ E(|X_k|) < ∞,
does this imply that
E(∑ |X_k|) < ∞ ?
An infinite sum is by definition the limit of the sequence of partial sums, and intuition seems to suggest that the above is true, but how can we prove it rigorously?
Note: the approach of my book starts from the following axioms for expectation:
1. X ≥ 0 ⇒ E(X) ≥ 0
2. E(cX + dY) = c E(X) + d E(Y)
3. If 0 ≤ X_1 ≤ X_2 ≤ ... and lim X_n(ω) = X(ω) for every ω, then lim E(X_n) = E(X) [this is the monotone convergence theorem]
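To make the last axiom concrete for this problem, here is the partial-sum setup I have in mind (my own notation, not the book's):

```latex
S_n \;=\; \sum_{k=1}^{n} |X_k|,
\qquad 0 \le S_1 \le S_2 \le \cdots,
\qquad \lim_{n\to\infty} S_n(\omega) \;=\; \sum_{k=1}^{\infty} |X_k|(\omega),
```

and by axiom 2, E(S_n) = ∑_{k=1}^{n} E(|X_k|) for each fixed n. What I cannot see is how to justify passing to the limit on both sides.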
I have been thinking about this for an hour now and I am pretty confused. Can someone please help? I would really appreciate it!
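For what it's worth, a quick simulation (my own sanity check in Python, not a proof) with the concrete choice X_k = U_k / 2^k, U_k ~ Uniform(0, 1), does suggest that E(∑ |X_k|) and ∑ E(|X_k|) agree:

```python
import random

random.seed(0)
n_terms, n_samples = 30, 50_000  # truncate the infinite sum at 30 terms

# Monte Carlo estimate of E(sum_k |X_k|) with X_k = U_k / 2^k, U_k ~ Uniform(0,1)
total = 0.0
for _ in range(n_samples):
    total += sum(random.random() / 2**k for k in range(1, n_terms + 1))
mc_E_of_sum = total / n_samples

# Exact sum of expectations: E(X_k) = (1/2) / 2^k, so the series sums to ~1/2
sum_of_E = sum(0.5 / 2**k for k in range(1, n_terms + 1))

print(mc_E_of_sum, sum_of_E)  # the two values come out very close
```

Of course a simulation cannot distinguish "equal" from "merely close", which is why I am after a rigorous argument from the axioms above.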