Math Help - definition of convergence of a sequence

1. definition of convergence of a sequence

Hi guys,
I have just started learning elementary analysis and am trying to understand exactly what the precise definition of convergence of a sequence means. Despite doing well on the problems in the book, including the more abstract, general proofs, I still don't really understand exactly what is going on here. So the definition is:

$s_n \to s$ if and only if for any $\epsilon > 0$, there exists $N \in \mathbb{R}$ such that $n > N$ implies that $|s_n - s| < \epsilon$.

I was told to think of it as a challenge. What do they mean by that? I can only think of two ways to think of it that way and do not know which is the most logical.

You give me any $\epsilon$ you want, as small as you want. No matter what, I can always find an $N$ in the natural numbers where, for every number bigger than it, the distance from each of the terms of my sequence from $s$ is less than your $\epsilon$

Or, are you challenging me with an $N$ and asking me to show that, for any number $n$ bigger than your $N$, the difference between $s$ and $s_n$ is less than every $\varepsilon$?

I know this question probably seems ridiculous but if I don't understand the easy stuff I won't understand the rest of the material.

Thanks!
James

2. Re: definition of convergence of a sequence

Originally Posted by james121515
Hi guys,
I have just started learning elementary analysis and am trying to understand exactly what the precise definition of convergence of a sequence means. Despite doing well on the problems in the book, including the more abstract, general proofs, I still don't really understand exactly what is going on here. So the definition is:

$s_n \to s$ if and only if for any $\epsilon > 0$, there exists $\boxed{N \in \mathbb{R}}$ such that $n > N$ implies that $|s_n - s| < \epsilon$.
It is common practice to take N as a natural number in this definition. Even if you take N as a real number the definition is still correct, because by the Archimedean property there exists a natural number (say M) such that M > N.

I was told to think of it as a challenge. What do they mean by that? I can only think of two ways to think of it that way and do not know which is the most logical.

You give me any $\epsilon$ you want, as small as you want. No matter what, I can always find an $N$ in the natural numbers where, for every number bigger than it, the distance from each of the terms of my sequence from $s$ is less than your $\epsilon$
Correct. Another, more compact way of saying it: you can make the distance between $s_n$ and $s$ arbitrarily small by taking $n$ sufficiently large.
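That "challenge" reading can even be sketched as a small script: the skeptic hands over an $\epsilon$, the prover answers with an $N$, and every term past $N$ must lie within $\epsilon$ of the limit. This is only an illustration on a finite stretch of the tail (not a proof), and the sequence $s_n = 1 - 1/n$ with limit $s = 1$ is my own example, not one from the thread:

```python
# Sketch of the epsilon/N "game" (illustrative sequence, not from the thread):
# the skeptic names epsilon; the prover must answer with an N that tames
# the whole tail of the sequence. Here s_n = 1 - 1/n, with limit s = 1.

def prover_response(epsilon):
    # For s_n = 1 - 1/n we have |s_n - 1| = 1/n, so any N >= 1/epsilon works.
    return int(1.0 / epsilon) + 1

limit = 1.0
for epsilon in (0.5, 0.05, 0.005):      # the skeptic's challenges
    N = prover_response(epsilon)        # the prover's answer
    # every term past N must lie within epsilon of the limit
    assert all(abs((1 - 1.0 / n) - limit) < epsilon
               for n in range(N + 1, N + 500))
```

Note the order of moves: $\epsilon$ comes first, and $N$ is allowed to depend on it. That is exactly the quantifier order in the definition.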

Or, are you challenging me with an $N$ and asking me to show that, for any number $n$ bigger than your $N$, the difference between $s$ and $s_n$ is less than every $\varepsilon$?
This is incorrect. If $|s_n - s|$ could be made less than all $\varepsilon$, the only value it could take would be zero, since $\varepsilon$ is an arbitrary positive number.
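One standard way to make that last point precise (my addition, not part of the original reply): for a fixed real number $x \ge 0$,

$$x < \varepsilon \ \text{ for every } \varepsilon > 0 \quad\Longrightarrow\quad x = 0,$$

because if $x > 0$ we could take $\varepsilon = x/2 > 0$, and the hypothesis would give $x < x/2$, which is false. Applied to $x = |s_n - s|$, "less than all $\varepsilon$" would force $s_n = s$ exactly, which is far stronger than convergence.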

I know this question probably seems ridiculous but if I don't understand the easy stuff I won't understand the rest of the material.

Thanks!
James
You are welcome!

3. Re: definition of convergence of a sequence

You seem a little stuck in the mechanics and not so grounded in the concepts.

I like the casual concepts of "hangs around" and "wanders off". This is what epsilon is all about.

If 's' is the limit, then the terms of the sequence will not get away from it. Once the sequence gets close, it stays close. This is what N is all about.

If you promise to take n > N, the terms promise to stay closer than epsilon.

Anyway, sometimes it just helps to say it a couple different ways. It is important that epsilon is arbitrary and not zero (0).

4. Re: definition of convergence of a sequence

here is a very standard example of a convergent sequence:

$s_n = \frac{1}{n}$ (or the sequence 1, 1/2, 1/3, 1/4, 1/5, 1/6,......)

this sequence converges to 0 (that is the "s").

now if we pick a large ε (like 100, for example), N is easy to find: any natural number will do.

if we pick ε = 1, then N = 2 (or any larger integer) will do.

but these ε are not every possible ε; ε could be very small: like 1/1000, for example. if ε = 1/1000, to ensure that $|s_n - 0| < \epsilon$, it is enough to choose N at least 1000 (N = 1000 itself works, since n > 1000 gives 1/n < 1/1000).

in fact, in order to ensure that $|s_n| < \epsilon$ for any ε, it suffices to choose N > 1/ε. since 1/ε is finite (although it could be very large) and the natural numbers increase without bound, we can always find a natural number greater than any given finite number.
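that recipe can be spot-checked mechanically. the helper below is my own sketch (a finite check of the tail, which of course is an illustration and not a proof):

```python
import math

# Sketch: for s_n = 1/n with limit 0, any natural number N > 1/epsilon
# satisfies the definition. We verify a finite stretch of the tail.

def N_for(epsilon):
    """A natural number strictly greater than 1/epsilon."""
    return math.floor(1.0 / epsilon) + 1

for eps in (1.0 / 1000, 1.0 / 12345, 1e-6):
    N = N_for(eps)
    assert N > 1 / eps
    # every checked term past N is within eps of 0
    assert all(1.0 / n < eps for n in range(N + 1, N + 200))
```

the point of the check is the same as the prose: N depends on ε, and smaller ε simply demands a larger N.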

sometimes this is expressed as "1/n tends to 0 for large n", or "1/n is eventually within ε of 0", or "0 is a cluster point of the sequence". intuitively, what is meant is that 1/n is "close to 0" for "large n". how close? "within ε". how large an n? "every single one larger than 1/ε". in other words:

$|s_n - s| < \epsilon$ is a formal way of measuring "closeness" (using a distance function). if one imagines graphing the image set on the real line (perhaps as an animation, where $s_n$ is drawn at the n-th second), you see that as time goes by, the new plot points "cluster" around the origin.

by contrast, the following sequence does not display the desired behavior:

1,0,1,0,1,0,1,0.......

for a candidate limit like s = 1/2, we can find an N as long as ε > 1/2, but for smaller ε NO N will work, no matter which candidate limit we try. and all we have to do to prove "not convergent" is find a single ε for which no appropriate N exists for any candidate limit. for example, in this case ε = 1/2 will serve (i'm sure you can see there are other choices, but we just need to find one).
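that ε = 1/2 defeats every candidate limit can be sketched numerically (the candidate values below are an arbitrary sample of my own; the real argument is the triangle inequality: $|1 - s| < \tfrac{1}{2}$ and $|s - 0| < \tfrac{1}{2}$ together would force $1 = |1 - 0| \le |1 - s| + |s - 0| < 1$, a contradiction):

```python
# Illustration only: for the sequence 1, 0, 1, 0, ... and epsilon = 1/2,
# every tail contains a term at distance >= 1/2 from any candidate limit s,
# so no N can satisfy the definition.

def s(n):
    return n % 2  # s_1 = 1, s_2 = 0, s_3 = 1, ...

epsilon = 0.5
for candidate in (0.0, 0.25, 0.5, 0.75, 1.0):  # a hypothetical sample of limits
    for N in range(1, 50):
        tail = [abs(s(n) - candidate) for n in range(N + 1, N + 11)]
        # the definition would need every entry < epsilon; the largest never is
        assert max(tail) >= epsilon
```

notice the logical shape: convergence needs *every* ε to be answerable, so divergence only needs *one* unanswerable ε.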