Convergence of random variables

Given a sequence of independent random variables $\displaystyle (w_i)_i$ with non-negative values, how do you prove that:

the series $\displaystyle \sum w_i$ converges almost surely if and only if the expectation series $\displaystyle \sum \mathbb{E}\left(w_i/(1+w_i)\right)$ converges.
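As a sanity check (not a proof), here is a small Monte Carlo sketch of the convergent case. The choice of $\displaystyle w_i$ exponential with mean $\displaystyle 2^{-i}$ is just a hypothetical example for which both series should converge:

```python
import random

random.seed(0)

N = 40        # number of terms simulated
PATHS = 2000  # Monte Carlo sample paths

# Hypothetical convergent example: w_i ~ Exp(rate = 2^i), so E[w_i] = 2^{-i}.
samples = [[random.expovariate(2.0 ** i) for i in range(1, N + 1)]
           for _ in range(PATHS)]

# Monte Carlo estimate of sum_i E[w_i / (1 + w_i)].
# Since E[w_i / (1 + w_i)] <= E[w_i] = 2^{-i}, this should stay below 1.
expectation_series = sum(
    sum(w[i] / (1.0 + w[i]) for w in samples) / PATHS
    for i in range(N)
)

# Tail of the random series beyond index 20 on each path: it should be tiny,
# consistent with sum_i w_i converging almost surely.
max_tail = max(sum(w[20:]) for w in samples)

print(expectation_series, max_tail)
```

On every simulated path the tail of $\displaystyle \sum w_i$ is negligible while the estimated expectation series stays bounded, which is at least consistent with the claimed equivalence.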

The $\displaystyle \Rightarrow$ implication is easy, but I'm not certain about the converse. When $\displaystyle \sum \mathbb{E}(w_i/(1+w_i))$ converges, using Markov's inequality and the Borel-Cantelli lemma, one can prove that $\displaystyle w_i \rightarrow 0$ almost surely; in particular, almost surely $\displaystyle w_i \le 1$ for $\displaystyle i$ large enough, so the truncated variables have finite expectation and variance. (This relies on the fact that the function $\displaystyle x \mapsto x/(1+x)$ is strictly increasing on $\displaystyle [0,\infty)$.) But how do we know how the series itself behaves?
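To spell out the Markov step I have in mind (for any $\displaystyle \epsilon > 0$, using the monotonicity of $\displaystyle x \mapsto x/(1+x)$):

$\displaystyle \mathbb{P}(w_i > \epsilon) = \mathbb{P}\left(\frac{w_i}{1+w_i} > \frac{\epsilon}{1+\epsilon}\right) \le \frac{1+\epsilon}{\epsilon}\,\mathbb{E}\left(\frac{w_i}{1+w_i}\right),$

so these probabilities are summable, and Borel-Cantelli gives $\displaystyle w_i \le \epsilon$ for all $\displaystyle i$ large enough, almost surely.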

I thought about using Cauchy's criterion together with Kolmogorov's inequality, in a way similar to the proof of the three-series theorem, but I'm probably wrong. I suspect there is a much simpler approach; I would really appreciate an expert view on this one.
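In case it helps to see the inequality I mean in action, here is a quick numerical illustration of Kolmogorov's maximal inequality, $\displaystyle \mathbb{P}\left(\max_{k \le n} |S_k - \mathbb{E}S_k| \ge \lambda\right) \le \mathrm{Var}(S_n)/\lambda^2$, for a hypothetical choice of $\displaystyle w_i$ uniform on $\displaystyle [0, 1/i]$ (bounded, so all moments exist):

```python
import random

random.seed(1)

n = 50
PATHS = 20000
lam = 0.5

# For w_i ~ Uniform[0, 1/i]: E[w_i] = 1/(2i), Var(w_i) = (1/i)^2 / 12.
means = [1.0 / (2 * i) for i in range(1, n + 1)]
var_Sn = sum((1.0 / i) ** 2 / 12 for i in range(1, n + 1))

# Empirical probability that the maximal centered partial sum exceeds lam.
exceed = 0
for _ in range(PATHS):
    s, dev = 0.0, 0.0
    for i in range(1, n + 1):
        s += random.uniform(0.0, 1.0 / i) - means[i - 1]
        dev = max(dev, abs(s))
    if dev >= lam:
        exceed += 1

empirical = exceed / PATHS
bound = var_Sn / lam ** 2  # Kolmogorov's bound
print(empirical, bound)
```

The empirical exceedance probability stays below the variance bound, as the inequality predicts; of course this says nothing yet about how to assemble it into a proof of the converse.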

Thanks in advance for your help.