Convergence of a sequence
I have a question about the sequence defined by a(n) = 1/(1 + a(n-1)), and I am asked to determine whether it converges.
One way I thought of solving it: assume the sequence converges to some value a. Taking the limit on both sides of the recurrence gives a = 1/(1 + a), i.e. the quadratic a^2 + a = 1, which we can solve for a, and so we conclude that the sequence converges.
However, it has been bugging me that we used an assumption to derive the answer. Doesn't this argument only show what the limit must be *in the case that* the sequence converges, rather than proving that it actually converges? If the sequence in fact diverges for some choices of the starting value a(1), wouldn't we be missing that part of the solution entirely?
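For what it's worth, here is a quick numerical sketch I tried (assuming a few positive starting values a(1), which the problem doesn't actually specify). The iterates do seem to settle near the positive root of a^2 + a = 1, but of course this is only evidence, not a proof:

```python
import math

def iterate(a1, n=50):
    """Apply a(n) = 1 / (1 + a(n-1)) n times, starting from a1."""
    a = a1
    for _ in range(n):
        a = 1.0 / (1.0 + a)
    return a

# Positive root of a^2 + a = 1, i.e. a = (sqrt(5) - 1) / 2
golden = (math.sqrt(5) - 1) / 2

# Hypothetical starting values; the problem does not fix a(1)
for start in [0.1, 1.0, 10.0]:
    print(start, iterate(start), abs(iterate(start) - golden))
```

In every case I tried, the distance to (sqrt(5) - 1)/2 shrinks rapidly, but I don't see how this experiment addresses the logical gap I described above.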
Although this may be a lot to read, I would really appreciate it if anyone could enlighten me on this.