Hello all,
How do I show that if the distances between consecutive terms of a sequence are getting smaller, then the sequence is convergent?
In other words, if $|x_{n+2} - x_{n+1}| \le |x_{n+1} - x_n|$ for all $n$, then $(x_n)$ is a convergent sequence.
In fact, I have the following inequality: , and I know that
Thank you
Mohammad
I'm sorry guys. It seems the things I omitted were important.
Well, the problem says: $(x_n)$ is a sequence defined by: and . Find an inequality between $|x_{n+2} - x_{n+1}|$ and $|x_{n+1} - x_n|$, then use the inequality to prove that the sequence is convergent.
First, we can note that for all n. From this, we can find the following inequalities: .
I tried to prove that the sequence is Cauchy, but the bound I'm getting on $|x_m - x_n|$ depends on both $n$ and $m$.
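In case it helps, here is a sketch of how a contraction-type inequality gives a Cauchy bound. I'm assuming the inequality you found has the form $|x_{n+2} - x_{n+1}| \le r\,|x_{n+1} - x_n|$ for some fixed $0 \le r < 1$ (the exact inequality was lost from your post). Iterating it and telescoping:

```latex
% Assumption: |x_{n+2} - x_{n+1}| \le r |x_{n+1} - x_n| with 0 \le r < 1.
\begin{align*}
|x_{k+1} - x_k| &\le r^{\,k-1}\,|x_2 - x_1|
  && \text{(iterate the inequality $k-1$ times)}\\
|x_m - x_n| &\le \sum_{k=n}^{m-1} |x_{k+1} - x_k|
  \le |x_2 - x_1| \sum_{k=n}^{m-1} r^{\,k-1}
  && \text{(triangle inequality, $m > n$)}\\
  &< \frac{r^{\,n-1}}{1-r}\,|x_2 - x_1|
  \;\xrightarrow[n \to \infty]{}\; 0.
  && \text{(geometric tail)}
\end{align*}
```

Note that a bound depending on $n$ is perfectly fine: for the Cauchy condition you only need the bound to go to $0$ as $n \to \infty$ uniformly in $m > n$, and the geometric tail above does exactly that.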
Help please.
Thank you..
Hello Plato,
But the problem is that the sequence is not monotone. That's why we are asked to find the inequality between the differences. We notice that these differences are getting smaller and smaller. How can we conclude that the sequence is convergent?
Thank you,
You cannot conclude that a sequence is convergent simply because the differences between successive terms are getting smaller. For instance, define a sequence recursively by $a_1 = 0$ and $a_{n+1} = a_n + 1 + \frac{1}{n}$. Clearly the difference between successive terms, $1 + \frac{1}{n}$, is strictly decreasing, but it's always greater than 1, so obviously this sequence cannot converge.
But even if the difference between successive terms goes to zero, you still cannot conclude convergence; the sequence of partial sums of the harmonic series is the standard example.
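A quick numeric illustration of that counterexample (not from the thread, just a sanity check): the partial sums $H_n = \sum_{k=1}^{n} \frac{1}{k}$ have successive differences $\frac{1}{n+1} \to 0$, yet $H_n$ grows without bound.

```python
# Partial sums H_n of the harmonic series: the successive differences
# H_{n+1} - H_n = 1/(n+1) shrink to 0, yet H_n itself diverges (~ ln n).
def harmonic_partial_sums(n_terms):
    sums, total = [], 0.0
    for n in range(1, n_terms + 1):
        total += 1.0 / n
        sums.append(total)
    return sums

H = harmonic_partial_sums(100_000)

print(H[10] - H[9])    # = 1/11, differences are already small early on
print(H[-1] - H[-2])   # = 1/100000, and keep shrinking
print(H[-1])           # yet H_n itself is still growing (roughly ln n + 0.577)
```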
I'm sorry guys. I really need help with this one. I will repeat the problem: $(x_n)$ is a sequence defined by: and . Find an inequality between $|x_{n+2} - x_{n+1}|$ and $|x_{n+1} - x_n|$, then use the inequality to prove that the sequence is convergent.
First, we can note that for all n. From this, we can find the following inequalities: .
I tried to prove that the sequence is Cauchy, but the bound I'm getting on $|x_m - x_n|$ depends on both $n$ and $m$.
Note that the sequence is not monotone; that's why we were asked to find the above inequality.
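Since the original definition of the sequence was lost from the post, here is a sketch with an assumed recursion of the kind that typically produces this behaviour: $x_{n+1} = \frac{x_n + x_{n-1}}{2}$ with $x_1 = 0$, $x_2 = 1$ (both the recursion and the starting values are my assumptions, not the thread's). The terms oscillate (so the sequence is not monotone), the successive differences halve in absolute value at each step, and the sequence converges.

```python
# Assumed recursion (the original was lost): x_{n+1} = (x_n + x_{n-1}) / 2.
# Each new difference is -1/2 times the previous one, so the terms
# oscillate around the limit while |x_{n+1} - x_n| shrinks geometrically.
def averaging_sequence(x1, x2, n_terms):
    xs = [x1, x2]
    while len(xs) < n_terms:
        xs.append((xs[-1] + xs[-2]) / 2.0)
    return xs

xs = averaging_sequence(0.0, 1.0, 40)
diffs = [b - a for a, b in zip(xs, xs[1:])]

print(xs[:6])     # [0.0, 1.0, 0.5, 0.75, 0.625, 0.6875] -- oscillating
print(diffs[:4])  # [1.0, -0.5, 0.25, -0.125] -- alternating signs, halving
print(xs[-1])     # close to (x1 + 2*x2)/3 = 2/3 for these starting values
```

For this assumed recursion the inequality is the equality $|x_{n+2} - x_{n+1}| = \frac{1}{2}|x_{n+1} - x_n|$, which feeds directly into the Cauchy argument sketched earlier in the thread.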
Help please.
Thank you..