# Convergence of random variables

Given a sequence of independent random variables $(w_i)_i$ with non-negative values, how do you prove that the series $\sum_i w_i$ converges almost surely iff the expectation series $\sum_i \mathbb{E}\bigl(w_i/(1+w_i)\bigr)$ converges?
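Before the proof, here is a quick numerical sanity check of the claimed equivalence. The choice of exponential $w_i$ is mine, purely for illustration: for $w_i \sim \mathrm{Exp}(\text{mean } a_i)$ one has $\mathbb{E}(w_i/(1+w_i)) \le a_i$ (and the two are comparable when $a_i$ is small), so the criterion predicts almost-sure convergence for $a_i = 1/i^2$ and divergence for $a_i = 1/i$:

```python
import random

def partial_sums(means, seed=0):
    """Sample one path of independent w_i ~ Exponential(mean a_i)
    and return the running partial sums S_N = w_1 + ... + w_N."""
    rng = random.Random(seed)
    sums, s = [], 0.0
    for a in means:
        s += rng.expovariate(1.0 / a)  # expovariate takes the rate 1/a
        sums.append(s)
    return sums

N = 10_000
# sum E(w_i/(1+w_i)) < infinity: the criterion predicts a.s. convergence
conv = partial_sums([1 / i**2 for i in range(1, N + 1)])
# sum E(w_i/(1+w_i)) = infinity: the criterion predicts divergence
div = partial_sums([1 / i for i in range(1, N + 1)])

print(conv[N // 2 - 1], conv[-1])  # tail barely moves between S_5000 and S_10000
print(div[N // 2 - 1], div[-1])    # still growing, roughly like log N
```

On one sample path the convergent series is essentially flat past $N = 5000$, while the divergent one keeps gaining about $\log 2$ over the same range.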
The $\Rightarrow$ implication is easy, but I'm not certain about the other direction. When $\sum_i \mathbb{E}(w_i/(1+w_i))$ converges, Markov's inequality and the Borel–Cantelli lemma show that $w_i \rightarrow 0$ almost surely (this relies on the fact that the function $x \mapsto x/(1+x)$ is strictly increasing on $[0,\infty)$). In particular, almost surely $w_i \le 1$ for $i$ large enough, so after truncation the variables have finite expectation and variance. But how do we deduce the behavior of the series itself?
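For concreteness, the Markov + Borel–Cantelli step I have in mind is the following. For $\varepsilon > 0$, since $x \mapsto x/(1+x)$ is increasing,
$$
\mathbb{P}(w_i > \varepsilon)
= \mathbb{P}\!\left(\frac{w_i}{1+w_i} > \frac{\varepsilon}{1+\varepsilon}\right)
\le \frac{1+\varepsilon}{\varepsilon}\,\mathbb{E}\!\left(\frac{w_i}{1+w_i}\right),
$$
so $\sum_i \mathbb{P}(w_i > \varepsilon) < \infty$, and Borel–Cantelli gives $w_i \le \varepsilon$ for all $i$ large enough, almost surely.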