# Thread: How to show this sequence converges

1. ## How to show this sequence converges

The sequence $b_n=1+2x+3x^2+\cdots+nx^{n-1}$, $|x|<1$, converges to $(1-x)^{-2}$.

I know how to prove that it converges to that value, but I'm unsure of how to prove that the sequence converges, i.e., that the limit exists. I thought maybe proving that the sequence is contractive would work, but I couldn't get a proof using that method. Does anyone know of another way to show that it converges?
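A quick numerical sanity check (a sketch, not part of the original post; the value $x=0.5$ is chosen arbitrarily) shows the partial sums settling at $\frac{1}{(1-x)^2}$:

```python
# Partial sums b_n = 1 + 2x + 3x^2 + ... + n*x^(n-1)
# should approach 1/(1-x)^2 when |x| < 1.
x = 0.5
b_n = sum(k * x**(k - 1) for k in range(1, 201))  # n = 200
target = 1 / (1 - x)**2  # = 4.0 for x = 0.5
print(b_n, target)  # b_n agrees with 4.0 to within floating-point error
```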

2. Originally Posted by paupsers
The sequence $b_n=1+2x+3x^2+\cdots+nx^{n-1}$, $|x|<1$, converges to $(1-x)^{-2}$.

I know how to prove that it converges to that value, but I'm unsure of how to prove that the sequence converges, i.e., that the limit exists. I thought maybe proving that the sequence is contractive would work, but I couldn't get a proof using that method. Does anyone know of another way to show that it converges?
Either the ratio test or the root test should work nicely. They should be the first thing you think of with power series.

3. I can't use the root test, and the ratio test will only tell me if it converges to 0 or diverges to infinity.

I was thinking of this, though... consider $x=0.9$. Then $\frac{1}{(1-x)^2}=100$.
For $x=0.999$, the value increases to $1{,}000{,}000$.

So, does the sequence actually converge? It seems that as x increases it diverges to infinity. But then, I guess the only way it could actually "reach" infinity is if x=1, in which case you'd have 1 + 2 + 3 + ...

I was thinking about trying to show it was Cauchy or contractive, but I didn't get anywhere with that either. Any help?
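One way to see what is going on numerically (a sketch with an arbitrarily chosen $x$, not from the thread): for a fixed $x$ with $|x|<1$, even one fairly close to $1$, the partial sums stabilize rather than growing without bound; the limit is merely large because $\frac{1}{(1-x)^2}$ is large.

```python
# For fixed x with |x| < 1, the partial sums stabilize,
# even when x is close to 1 (the limit is just large).
x = 0.9

def partial_sum(n):
    return sum(k * x**(k - 1) for k in range(1, n + 1))

print(partial_sum(200), partial_sum(400), 1 / (1 - x)**2)  # all near 100.0
```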

4. The Ratio Test does more than just show divergence. If the limit is less than 1, the series converges absolutely.

5. Unfortunately, the ratio test yields a limit of 1, so it is inconclusive.

Any other ideas, anyone?

6. Let $S=1+2x+3x^2+\cdots+nx^{n-1}$. Then $xS=x+2x^2+\cdots+nx^n$.

$S-xS=1+x+x^2+\cdots+x^n$

The RHS becomes $\frac{1}{1-x}$ by geometric sum rules.

Factor the LHS: $S-xS=S(1-x)$.

Finally, divide and solve for $S$. You kept saying "sequence," but this is a statement about the series, which I think is what you meant. I normally don't give full solutions, but this one was fun to work out.
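As a numerical cross-check of the cancellation (a sketch; note that the exact finite identity picks up an extra $-nx^n$ tail term, which vanishes as $n\to\infty$ when $|x|<1$):

```python
# Check S*(1-x) against the telescoped form for a partial sum.
# Exact finite identity: S*(1-x) = (1 + x + ... + x^(n-1)) - n*x^n
x, n = 0.7, 50
S = sum(k * x**(k - 1) for k in range(1, n + 1))
lhs = S * (1 - x)
rhs = sum(x**i for i in range(n)) - n * x**n
print(abs(lhs - rhs))  # agrees to within floating-point error
```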

7. I know how to show that S = $\frac{1}{(1-x)^2}$, I was just trying to show that S does not diverge to infinity, and thus converges. Is that even necessary? Could I argue that S only approaches infinity as x approaches 1, but since x never equals 1, the sum S can never equal infinity?

I don't care so much that S = $\frac{1}{(1-x)^2}$, I'm more concerned about proving that S does, in fact, converge to SOME value (or does not diverge).

8. If the series converges, then the sequence does as well; in fact, your $b_n$ is exactly the $n$-th partial sum of the series, so the two statements amount to the same thing. This only works when $|x|<1$; outside that range the closed form no longer applies.

The sequence of terms $nx^{n-1}$ does not converge to $\frac{1}{(1-x)^2}$; the series does. I don't know what else you are trying to prove.

9. $S = \sum{(n+1)x^n} \Rightarrow R = \frac{1}{\limsup{\sqrt[n]{n+1}}} = 1$

And so the series converges $\forall |x|<1$
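The root-test quantity above can also be watched numerically (a sketch): $\sqrt[n]{n+1}$ tends to $1$, so the radius of convergence is $R=1$.

```python
# (n+1)^(1/n) -> 1 as n grows, so R = 1 / limsup (n+1)^(1/n) = 1.
for n in (10, 100, 1000, 100000):
    print(n, (n + 1) ** (1 / n))  # decreases toward 1
```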

10. Originally Posted by Jameson
If the series converges, then the sequence does as well; in fact, your $b_n$ is exactly the $n$-th partial sum of the series, so the two statements amount to the same thing. This only works when $|x|<1$; outside that range the closed form no longer applies.

The sequence of terms $nx^{n-1}$ does not converge to $\frac{1}{(1-x)^2}$; the series does. I don't know what else you are trying to prove.
I think that's the connection I needed. Since $S = \frac{1}{(1-x)^2}$, and $S$ IS the summation, the sum obviously doesn't diverge to infinity, because $\frac{1}{(1-x)^2}$ is a real value for $|x|<1$. For some reason I wasn't making that obvious observation.

Thanks!

11. Originally Posted by Defunkt
$S = \sum{(n+1)x^n} \Rightarrow R = \frac{1}{\limsup{\sqrt[n]{n+1}}} = 1$

And so the series converges $\forall |x|<1$
Very nice. The terms "sequence" and "series" were used interchangeably in this thread, and I had no idea what the OP really wanted. From the problem, I felt we were supposed to use the fact that $S=\frac{1}{(1-x)^2}$ for $|x|<1$: show that the RHS is finite, and thus $S$ converges. Your proof, however, is concise.

paupsers - The series diverges whenever $x$ is outside the bounds $|x|<1$, not just at $x=1$. Choosing $x=2$ shows this clearly: the sum is not $\frac{1}{(-1)^2}=1$; it in fact diverges. I think now you were saying you want to prove that the sequence (and series) converges only when $|x|<1$. I would have shown that for $x>1$ the partial sums are monotonically increasing without bound, hence diverge. For $x<-1$ the terms alternate in sign, but since $|a_n|$ keeps increasing, the series diverges as well. At $x=1$ the closed form is undefined and you get $1+2+3+\cdots$, which diverges; at $x=-1$ the partial sums alternate without settling. Doing this case by case is obviously the long way, and the above poster has a nice concise solution.
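A quick numeric illustration of that case split (a sketch, with arbitrary sample points): the partial sums settle for $|x|<1$ and blow up for $x=2$.

```python
# Partial sums of 1 + 2x + 3x^2 + ...:
# bounded for |x| < 1, unbounded for x = 2.
def partial_sum(x, n):
    return sum(k * x**(k - 1) for k in range(1, n + 1))

print(partial_sum(0.5, 50), partial_sum(0.5, 100))  # both near 4.0
print(partial_sum(2.0, 10), partial_sum(2.0, 20))   # 9217.0, then far larger
```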

12. I realise now, having finished typing this all up, that your question has been answered, but I am going to post it anyway. I forgot to refresh this window after I went to eat dinner.

It is almost the same as what Jameson said.

For any $x\ne1$

$(1-x)\sum_{i=0}^{n}x^i=\sum_{i=0}^{n}x^i-\sum_{i=0}^{n}x^{i+1}$
$=\sum_{i=0}^{n}(x^i-x^{i+1})$

And now you have a telescoping sum, so all terms cancel except the first and the last:

$(1-x)\sum_{i=0}^{n}x^i=1-x^{n+1}$
And therefore
$\sum_{i=0}^{n}x^i=\frac{1-x^{n+1}}{1-x}$

Now you can see that the sum converges as $n\to\infty$ only if the term $x^{n+1}$ vanishes, and that happens exactly when $\vert x\vert<1$.

So just use the method that Jameson posted, but remember where he said:

The RHS becomes $\frac{1}{1-x}$ by geometric sum rules.
It is supposed to be:

$\frac{1-x^{n+1}}{1-x}$
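The corrected partial-sum formula is easy to verify numerically (a sketch with arbitrarily chosen values):

```python
# Geometric partial sum: sum_{i=0}^{n} x^i = (1 - x^(n+1)) / (1 - x)
x, n = 0.3, 25
lhs = sum(x**i for i in range(n + 1))
rhs = (1 - x**(n + 1)) / (1 - x)
print(abs(lhs - rhs))  # agrees to within floating-point error
```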

13. If you are taking a partial sum, then yes, you are correct; but if you are considering the infinite series, where $n$ approaches infinity, then it simplifies to what I wrote. I just skipped demonstrating why.

14. I was just showing how to derive that $\vert x\vert<1$ must be true for the sequence to converge when $n\to\infty$.