Convergence of random variables

Given a sequence $(X_n)_{n\ge 1}$ of independent random variables with non-negative values, how do you prove that:

the series $\sum_{n\ge 1} X_n$ converges almost surely if and only if the expectation series $\sum_{n\ge 1} \mathbb{E}\!\left[\frac{X_n}{1+X_n}\right]$ converges.

The implication from left to right is easy, but I'm not certain about the other direction. When $\sum_{n\ge 1} \mathbb{E}\!\left[\frac{X_n}{1+X_n}\right]$ converges, using Markov's inequality and the Borel–Cantelli lemma, one can prove that almost surely $X_n \le 1$ for all $n$ large enough, hence the truncated variables $\min(X_n, 1)$ have finite expectation and variance (this relies on the fact that the function $x \mapsto \frac{x}{1+x}$ is strictly increasing on $[0,+\infty)$). But how do we conclude anything about the behaviour of the series $\sum_{n\ge 1} X_n$ itself?
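To spell out the step I have in mind (the truncation level $1$ is an arbitrary choice; any positive threshold would do):

$$\mathbb{P}(X_n > 1) \;=\; \mathbb{P}\!\left(\frac{X_n}{1+X_n} > \frac{1}{2}\right) \;\le\; 2\,\mathbb{E}\!\left[\frac{X_n}{1+X_n}\right],$$

so $\sum_{n\ge 1} \mathbb{P}(X_n > 1) < \infty$, and by the Borel–Cantelli lemma, almost surely $X_n \le 1$ for all $n$ large enough.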

I thought about using the Cauchy criterion together with Kolmogorov's inequality, in a similar way to the proof of the three-series theorem, but I'm probably on the wrong track. I suspect there is a much simpler way of doing this; I would really appreciate an expert view on this one.
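Not a proof, of course, but here is a quick numerical sanity check of the equivalence. The specific choices $X_n = Y_n/n^2$ (so that $\sum_n \mathbb{E}\!\left[\frac{X_n}{1+X_n}\right] \le \sum_n \mathbb{E}[X_n] = \sum_n 1/n^2 < \infty$) and $X_n = Y_n/n$ (where the expectation series behaves like $\sum_n 1/n$ and diverges), with $Y_n$ i.i.d. exponential, are mine, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def partial_sums(scale_fn, n_terms=100_000):
    """Partial sums of X_n = Y_n * scale_fn(n) with Y_n i.i.d. Exp(1)."""
    n = np.arange(1, n_terms + 1)
    y = rng.exponential(size=n_terms)
    return np.cumsum(y * scale_fn(n))

# Convergent case: E[X_n/(1+X_n)] <= E[X_n] = 1/n^2 is summable.
s_conv = partial_sums(lambda n: 1.0 / n**2)
# Divergent case: E[X_n/(1+X_n)] is of order 1/n, not summable.
s_div = partial_sums(lambda n: 1.0 / n)

# Contribution of terms 1001..100000 to each series.
print("tail, convergent case:", s_conv[-1] - s_conv[999])  # small
print("tail, divergent case: ", s_div[-1] - s_div[999])    # large (~ log 100)
```

The partial sums stabilise in the first case and keep growing like $\log n$ in the second, consistent with the claimed equivalence.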

Thanks in advance for your help.