The sequence $\sum_{n=1}^{\infty} n x^{n-1}$ converges to $\frac{1}{(1-x)^2}$.
I know how to prove that it converges to that value, but I'm unsure how to prove that the sequence converges, i.e., that the limit exists. I thought maybe trying to prove that the sequence is contractive would work, but I couldn't get a proof using that method. Does anyone know of another way to show that it converges?
I can't use the root test, and the ratio test will only tell me if it converges to 0 or diverges to infinity.
I was thinking of this, though... consider $x = 0.9$. Then $\frac{1}{(1-x)^2} = \frac{1}{(0.1)^2} = 100$.
For $x = 0.999$, the value increases to $\frac{1}{(0.001)^2} = 1{,}000{,}000$.
So, does the sequence actually converge? It seems that as $x$ increases it diverges to infinity. But then, I guess the only way it could actually "reach" infinity is if $x = 1$, in which case you'd have $1 + 2 + 3 + \cdots$
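As a sanity check (just a quick numerical experiment in Python, not a proof), here is a small snippet that computes the partial sums $S_N = \sum_{n=1}^{N} n x^{n-1}$ and prints them next to $\frac{1}{(1-x)^2}$; for $x = 0.9$ the partial sums level off near 100, while pushing $x$ toward 1 makes the target value blow up, as described above.

def partial_sum(x, N):
    # S_N = 1*x^0 + 2*x^1 + ... + N*x^(N-1), the N-th partial sum of the series
    return sum(n * x ** (n - 1) for n in range(1, N + 1))

for x in (0.9, 0.999):
    print("x =", x, " conjectured limit 1/(1-x)^2 =", 1 / (1 - x) ** 2)
    for N in (10, 100, 1000, 10000):
        print("  N =", N, " S_N =", partial_sum(x, N))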
I was thinking about trying to show it was Cauchy or contractive, but I didn't get anywhere with that either. Any help?
Let $S = \sum_{n=1}^{\infty} n x^{n-1}$. Then $S - xS = \sum_{n=1}^{\infty} n x^{n-1} - \sum_{n=1}^{\infty} n x^{n} = \sum_{n=0}^{\infty} x^{n}$.
The RHS becomes $\frac{1}{1-x}$ by geometric sum rules.
Factor the LHS: $S - xS = S(1-x)$.
Finally divide and solve for S. You kept saying sequence but this is proving the series, which is what I think you meant. I normally don't do full solutions, but it was fun working this out.
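In case the cancellation in the subtraction step isn't obvious, here it is written out term by term (this rearrangement is only legitimate once you know the sums converge, i.e. for $|x| < 1$):

$$S - xS = (1 + 2x + 3x^2 + 4x^3 + \cdots) - (x + 2x^2 + 3x^3 + \cdots) = 1 + x + x^2 + x^3 + \cdots = \frac{1}{1-x},$$

so $S(1-x) = \frac{1}{1-x}$ and hence $S = \frac{1}{(1-x)^2}$.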
I know how to show that $S = \frac{1}{(1-x)^2}$, I was just trying to show that S does not diverge to infinity, and thus converges. Is that even necessary? Could I argue that S only approaches infinity as x approaches 1, but since x never equals 1, the sum S can never equal infinity?
I don't care so much that $S = \frac{1}{(1-x)^2}$, I'm more concerned about proving that S does, in fact, converge to SOME value (or does not diverge).
If the series converges, then the sequence of terms does as well; a divergent sequence could not sum up to a finite number. This only works when $|x| < 1$, though. Outside of that range the relation doesn't hold.
The sequence does not converge to $\frac{1}{(1-x)^2}$; the series does. I don't know what else you are trying to prove.
paupsers - The formula for the sum of the series fails whenever x is outside the bounds $|x| < 1$, not just at $x = 1$. Choosing $x = 2$ shows this clearly: the sum is not $\frac{1}{(-1)^2} = 1$, it in fact diverges. I think now you were saying you want to prove that the sequence (and series) converges only when $|x| < 1$. I would have shown that when $x > 1$ the terms are monotonically increasing, so the series diverges. When $x < -1$ you have an alternating series, but since $|a_n|$ is always increasing it diverges as well. At $x = 1$ the formula is undefined (the series is $1 + 2 + 3 + \cdots$), and at $x = -1$ the series alternates but still diverges. So anyway, obviously doing this case by case is the long way to do it, and the above poster has a nice concise solution.
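For what it's worth, all of the $|x| \ge 1$ cases can also be handled in one shot with the divergence (n-th term) test, since the terms cannot tend to zero there:

$$|a_n| = n\,|x|^{\,n-1} \ge n \longrightarrow \infty \quad \text{whenever } |x| \ge 1,$$

so $\sum_{n \ge 1} n x^{n-1}$ diverges for every $x$ with $|x| \ge 1$, and the closed form only ever applies on $|x| < 1$.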
I realise now, having finished typing this all in, that you have answered the question,
but I am going to post it anyway. Forgot to refresh this window after I went to eat dinner.
It is almost the same as what Jameson said.
Write the partial sum $S_N = \sum_{n=1}^{N} n x^{n-1}$ and multiply by $(1-x)^2$, using the identity
$$n x^{n-1}(1-x)^2 = \bigl(n x^{n-1} - (n-1)x^{n}\bigr) - \bigl((n+1)x^{n} - n x^{n+1}\bigr).$$
And now you have a telescoping series, so all terms cancel out but the first and the last, so:
$$(1-x)^2 S_N = 1 - \bigl((N+1)x^{N} - N x^{N+1}\bigr).$$
Now you see that the only way that the series converges when $N \to \infty$
is that the term $(N+1)x^{N} - N x^{N+1}$ vanishes, and that happens only when $|x| < 1$.
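And for completeness: once $|x| < 1$, both $x^{N}$ and $N x^{N}$ tend to $0$, so letting $N \to \infty$ in the line above gives back the value from the original post:

$$\lim_{N \to \infty} S_N = \frac{1 - \lim_{N \to \infty}\bigl((N+1)x^{N} - N x^{N+1}\bigr)}{(1-x)^2} = \frac{1}{(1-x)^2}.$$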
So just use the method that Jameson posted, but remember where he said:
"The RHS becomes $\frac{1}{1-x}$ by geometric sum rules."
It is supposed to be: the RHS becomes $\frac{1}{1-x}$ by geometric sum rules provided that $|x| < 1$, and that condition is exactly where the convergence comes from.
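Put another way (my reading of that correction, spelled out): the geometric sum rule is really a statement about a limit of finite sums,

$$\sum_{n=0}^{N} x^{n} = \frac{1 - x^{N+1}}{1-x} \quad (x \neq 1), \qquad \text{which tends to } \frac{1}{1-x} \text{ as } N \to \infty \text{ precisely when } |x| < 1,$$

and that is where the restriction on $x$ enters Jameson's argument.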