I'm actually completely lost with convergence of sequences... Let's just say our lecturer isn't the best, lol.
Here's one of the easier questions I need to do.
Given that the sequences $(a_n)$ and $(b_n)$ converge to limits $a$ and $b$ respectively, show that:
(a) The sequence $(a_n + b_n)$ converges to $a + b$;
(b) If $b \neq 0$, the sequence $\left(\frac{a_n}{b_n}\right)$ converges to $\frac{a}{b}$.
How do I go about showing something converges? This is all completely new to me..
When we start to learn about differentiation we say things like:
$$\frac{dy}{dx} = \lim_{\delta x \to 0} \frac{\delta y}{\delta x}$$
So what do all these symbols mean?
Well, they mean that you can make the value of $\frac{\delta y}{\delta x}$ (which represents the gradient of a chord) as close to $\frac{dy}{dx}$ (the gradient of the tangent) as you like by taking $\delta x$ close enough to zero.
It's important that you understand that last sentence, so read it again!
Let me explain it a bit more.
You can't actually put $\delta x$ equal to zero in an expression like $\frac{\delta y}{\delta x}$ to find the gradient of a tangent, because that would violate the First Commandment of Arithmetic: Thou shalt not divide by zero. So you have to sneak up on the value by letting $\delta x$ get ever closer to zero without actually ever getting there.
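If it helps to see actual numbers rather than symbols, here's a little Python sketch (my own illustration, not part of your question, using the made-up function $f(x) = x^2$ at $x = 1$, where the tangent gradient is $2$):

```python
# Chord gradients of f(x) = x^2 at x = 1 creep towards the tangent
# gradient (which is 2) as delta_x shrinks towards zero.
def f(x):
    return x ** 2

x = 1.0
for delta_x in [0.1, 0.01, 0.001, 0.0001]:
    chord_gradient = (f(x + delta_x) - f(x)) / delta_x  # gradient of the chord
    print(delta_x, chord_gradient)

# We never set delta_x = 0 -- that would divide by zero.
# We just watch the chord gradient get as close to 2 as we like.
```

Each chord gradient here works out to $2 + \delta x$, so you can see it getting as close to $2$ as you please without ever reaching it.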
And it's like that with convergence of sequences, except that we're now talking about 'infinity' (whatever that is) rather than zero. So, if we want to say that the sequence $(a_n)$ converges to the limit $a$, we would write:
$$\lim_{n \to \infty} a_n = a$$
and this means something very similar to the differentiation thing, which is: You can make the value of $a_n$ as close to $a$ as you like, by taking $n$ large enough.
Again, that sentence is really important, so read it again!
Let me give you a simple example to show you what I mean. Look at the successive sums of the series:
$$1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots$$
The successive sums are:
$$1, \quad \frac{3}{2}, \quad \frac{7}{4}, \quad \frac{15}{8}, \quad \dots$$
and it's pretty obvious that this sequence of numbers converges to the number 2. If we want to write this in formal notation, we should say:
Let $s_n$ be the sum of the first $n$ terms of the series $1 + \frac{1}{2} + \frac{1}{4} + \cdots$. Then
$$\lim_{n \to \infty} s_n = 2$$
So what does this mean, again? Right: you can make $s_n$ as close to 2 as you like by taking $n$ large enough. Can you actually make $s_n$ equal to 2? No. But you can get as close as you like. In other words, if someone challenges you to get within 0.000...(a million 0's)...01 of the answer 2, you could still do it by taking a value of $n$ big enough.
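In case you want to see exactly how big "big enough" is here (this step isn't part of the question; it's just the standard geometric-sum formula applied to the series above):
$$s_n = 1 + \frac{1}{2} + \frac{1}{4} + \cdots + \frac{1}{2^{n-1}} = 2 - \frac{1}{2^{n-1}},$$
so
$$|s_n - 2| = \frac{1}{2^{n-1}} < \epsilon \quad \text{whenever} \quad n > 1 + \log_2\!\left(\frac{1}{\epsilon}\right).$$
So even the million-zeros challenge is met by any $n$ beyond roughly $3.3$ million.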
That's all it means: no more, no less.
So, how do you do your first question?
You need to show that you can get $a_n + b_n$ as close to $a + b$ as you like by taking $n$ large enough. So how close would you like to get? Within a small amount $\epsilon$, say? OK, then you can argue something like this:
By definition, I know that if I take $n$ large enough I can make the difference between $a_n$ and $a$ less than $\frac{\epsilon}{2}$. Suppose this value is $n_1$. And I can make the difference between $b_n$ and $b$ less than $\frac{\epsilon}{2}$ too, by taking $n$ large enough, say $n_2$. Then if I choose the value of $n$ to be the larger of $n_1$ and $n_2$, the difference between $a_n + b_n$ and $a + b$ will be less than $\epsilon$. And that completes the proof.
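If you want that argument written out in symbols, the step that glues the two halves together is the triangle inequality (it isn't spelled out above, but it's what makes the "larger of $n_1$ and $n_2$" trick work): for all $n > \max(n_1, n_2)$,
$$\left|(a_n + b_n) - (a + b)\right| = \left|(a_n - a) + (b_n - b)\right| \le |a_n - a| + |b_n - b| < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon.$$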
Have a go at the second one in a similar way.
If $(a_n)$ converges to $a$ we say that $\lim_{n \to \infty} a_n = a$. Now for this to be true we must have that there exists an $a$ with the following property: for every $\epsilon > 0$ there exists an $N$ such that if $n > N$ then $|a_n - a| < \epsilon$. Or as Grandad said, you can make $a_n$ as close to $a$ as you want by making $n$ large enough.
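To see that definition in action, here's a quick Python sketch (my own toy example, using the made-up sequence $a_n = 1/n$, which converges to $0$): for each challenge $\epsilon$ it produces an $N$ that works.

```python
import math

# Toy example sequence: a_n = 1/n, which converges to the limit a = 0.
def a(n):
    return 1 / n

limit = 0.0

# The definition: for every epsilon > 0 there is an N such that
# n > N implies |a_n - limit| < epsilon. For a_n = 1/n we can take
# N = ceil(1/epsilon), since n > 1/epsilon forces 1/n < epsilon.
for epsilon in [0.1, 0.01, 0.001]:
    N = math.ceil(1 / epsilon)
    # A finite spot-check (the real claim is for ALL n > N; since 1/n
    # is decreasing, checking the first terms past N is illustrative).
    assert all(abs(a(n) - limit) < epsilon for n in range(N + 1, N + 1000))
    print(f"epsilon = {epsilon}: N = {N} works")
```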