Let $(X_k)_{k \ge 1}$ be a sequence of random variables, so that $|X_k| \ge 0$ for every $k$.

If we are given that

$$\sum_{k=1}^{\infty} |X_k| < \infty \qquad \text{and} \qquad E\!\left(\sum_{k=1}^{\infty} |X_k|\right) < \infty,$$

does this imply that

$$E\!\left(\sum_{k=1}^{n} |X_k|\right) < \infty \quad \text{for every } n\,?$$

An infinite sum is by definition the limit of the sequence of partial sums, and intuition seems to suggest that the implication above is true, but how can we prove it rigorously?
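This is not a proof, of course, but here is a quick numerical sanity check I tried with a toy example of my own choosing (the specific $X_k = U_k / 2^k$ with $U_k$ i.i.d. Uniform$(0,1)$ is just my assumption for illustration): both hypotheses hold, since $\sum_k |X_k| \le \sum_k 2^{-k} = 1$ surely and $E\bigl(\sum_k |X_k|\bigr) = \sum_k 2^{-(k+1)} = 1/2$, and the partial-sum expectations do stay finite and increase toward $1/2$:

```python
# Monte Carlo sanity check (not a proof) for the toy choice
# X_k = U_k / 2^k with U_k i.i.d. Uniform(0,1):
#   sum_k |X_k| <= sum_k 2^{-k} = 1 surely, and E(sum_k |X_k|) = 1/2.
import random

random.seed(0)
N_SAMPLES = 100_000  # number of simulated sample paths


def partial_sum_expectation(n, n_samples=N_SAMPLES):
    """Monte Carlo estimate of E(sum_{k=1}^n |X_k|) for X_k = U_k / 2^k."""
    total = 0.0
    for _ in range(n_samples):
        total += sum(random.random() / 2**k for k in range(1, n + 1))
    return total / n_samples


# Estimates for increasing truncation levels n; exact values are
# sum_{k=1}^{n} 2^{-(k+1)}, which increase toward 1/2.
estimates = [partial_sum_expectation(n) for n in (1, 2, 5, 10, 30)]
print([round(e, 4) for e in estimates])
```

The estimates climb toward $0.5$, which matches the intuition that $E$ of the partial sums stays bounded by $E$ of the full sum, but obviously this checks only one example.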

Note: the approach of my book starts with the following axioms for expectation:

1. If $X \ge 0$, then $E(X) \ge 0$.

2. $E(cX + dY) = c\,E(X) + d\,E(Y)$.

3. $E(1) = 1$.

4. If $X_1 \le X_2 \le \cdots$ and $\lim_{n\to\infty} X_n(\omega) = X(\omega)$, then $\lim_{n\to\infty} E(X_n) = E(X)$ [same as the monotone convergence theorem].
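If it helps to see where I got stuck: when I try to match axiom 4 to this problem, the monotone sequence seems to be the sequence of partial sums (this identification is my guess, not something from the book):

```latex
S_n(\omega) \;=\; \sum_{k=1}^{n} |X_k(\omega)|,
\qquad
0 \;\le\; S_1(\omega) \;\le\; S_2(\omega) \;\le\; \cdots,
\qquad
\lim_{n\to\infty} S_n(\omega) \;=\; \sum_{k=1}^{\infty} |X_k(\omega)|.
```

So if axiom 4 applies here, it should relate $\lim_{n\to\infty} E(S_n)$ to $E\bigl(\sum_{k=1}^{\infty} |X_k|\bigr)$, but I am not sure how to make this rigorous.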

I have been thinking about this for an hour now and I am pretty confused. Can someone please help? I would really appreciate it!