# Math Help - Convergence of random variables

1. ## Convergence of random variables

Given a sequence of independent random variables $(w_i)_i$ with non-negative values, how do you prove that:

the series $\sum_i w_i$ converges almost surely if and only if the expectation series $\sum_i \mathbb{E}(w_i/(1+w_i))$ converges.

The $\Rightarrow$ implication is easy, but I'm not certain about the other direction. When $\sum_i \mathbb{E}(w_i/(1+w_i))$ converges, Markov's inequality together with the Borel-Cantelli lemma proves that $w_i \rightarrow 0$ almost surely, so for $i$ large enough one can work with the truncated variables $w_i \wedge 1$, which have finite expectation and variance. (This relies on the fact that the function $x \mapsto x/(1+x)$ is strictly increasing on $[0,\infty[$.) But what does that tell us about the behavior of the series itself?
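To spell out the step I'm using: since $x \mapsto x/(1+x)$ is increasing, for any $\varepsilon > 0$ Markov's inequality gives

$$\mathbb{P}(w_i > \varepsilon) = \mathbb{P}\!\left(\frac{w_i}{1+w_i} > \frac{\varepsilon}{1+\varepsilon}\right) \le \frac{1+\varepsilon}{\varepsilon}\,\mathbb{E}\!\left(\frac{w_i}{1+w_i}\right),$$

so $\sum_i \mathbb{P}(w_i > \varepsilon) < \infty$, and Borel-Cantelli yields $\mathbb{P}(w_i > \varepsilon \text{ infinitely often}) = 0$. Taking $\varepsilon = 1/k$ over $k \in \mathbb{N}$ then gives $w_i \to 0$ almost surely.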

I thought about using Cauchy's criterion together with Kolmogorov's inequality, in a way similar to the three-series theorem, but I'm probably wrong. I suspect there is a much simpler way of doing this; I would really appreciate an expert's view on this one.
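Not a proof, but here is a quick numerical sanity check of the equivalence. It is only a sketch under an assumed toy model (my own choice, not from the problem): $w_i = X_i/i^2$ with $X_i$ i.i.d. exponential, where $\sum_i w_i$ converges almost surely and $\mathbb{E}(w_i/(1+w_i)) \le \mathbb{E}(w_i) = 1/i^2$, so both series should be finite; the variable names (`partial`, `exp_series`) are mine.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
i = np.arange(1, n + 1)

# Assumed toy model: w_i = X_i / i^2 with X_i ~ Exp(1) i.i.d.
# E[sum w_i] = pi^2/6 < infinity, so sum w_i converges a.s.
w = rng.exponential(1.0, size=n) / i**2
partial = np.cumsum(w)  # one sample path of the partial sums

# Monte Carlo estimate of the expectation series sum_i E[w_i/(1+w_i)],
# which is bounded by sum_i 1/i^2 since x/(1+x) <= x.
m = 2000
x = rng.exponential(1.0, size=(m, n)) / i**2
exp_series = (x / (1.0 + x)).mean(axis=0).sum()

print(f"partial sum after {n} terms: {partial[-1]:.4f}")
print(f"estimated expectation series: {exp_series:.4f}")
```

Replacing $i^2$ by $i$ (harmonic scaling) should make both quantities blow up as $n$ grows, consistent with the "both diverge" side of the equivalence.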