# How to prove the limit does not exist at 0

• March 27th 2011, 03:11 PM
alice8675309
How to prove the limit does not exist at 0
I know that for all of the following functions with domain R-{0}, the limit does not exist at 0. However, I'm not sure how to prove this using the Divergence Criterion: f does not have a limit at a if and only if there is an input sequence (xn) with elements in D-{a} such that (xn) converges to a but (f(xn)) diverges.

(1) f(x) = sin(1/x)
(2) f(x) =x+sin(1/x)
(3) f(x)=(1/x)sin(1/x)
• March 27th 2011, 04:31 PM
If you look at the first function and set $x_n = \frac{1}{n\pi/2} = \frac{2}{n\pi}$, you can easily prove that $(x_n)$ converges to 0, since it is just the harmonic sequence multiplied by a constant, and you notice that $f(x_n)=\sin(\frac{n \pi}{2}) = 1, 0, -1, 0, 1, \dots$

It can easily be shown that $(f(x_n))$ diverges: there exists an epsilon greater than zero (just pick any epsilon smaller than 1) such that, no matter what rank N of the sequence $f(x_n)$ you pick, there exists i greater than N with $|f(x_i) - f(x_{i+1})|$ greater than epsilon (two successive terms of $(f(x_n))$ are always exactly a distance 1 apart). Therefore $(f(x_n))$ is not Cauchy, which implies that it does not converge.
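As a numerical sanity check (not a proof), the argument above can be sketched in a few lines: the sequence $x_n = 2/(n\pi)$ shrinks toward 0 while $\sin(1/x_n)$ keeps cycling through 1, 0, -1, 0, with successive terms exactly a distance 1 apart.

```python
import math

# Candidate input sequence x_n = 2/(n*pi), which converges to 0.
# For f(x) = sin(1/x), f(x_n) = sin(n*pi/2) cycles through 1, 0, -1, 0, ...
xs = [2 / (n * math.pi) for n in range(1, 9)]
fs = [math.sin(1 / x) for x in xs]

print([round(x, 4) for x in xs])  # terms shrink toward 0
print([round(v, 4) for v in fs])  # oscillates: 1, 0, -1, 0, ...

# Successive terms are always exactly a distance 1 apart, so no epsilon
# smaller than 1 can work: (f(x_n)) is not Cauchy.
gaps = [abs(fs[i] - fs[i + 1]) for i in range(len(fs) - 1)]
print([round(g, 4) for g in gaps])
```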
• March 27th 2011, 07:46 PM
alice8675309
Quote:

If you look at the first function and set $x_n = \frac{1}{n\pi/2} = \frac{2}{n\pi}$, you can easily prove that $(x_n)$ converges to 0, since it is just the harmonic sequence multiplied by a constant, and you notice that $f(x_n)=\sin(\frac{n \pi}{2}) = 1, 0, -1, 0, 1, \dots$

It can easily be shown that $(f(x_n))$ diverges: there exists an epsilon greater than zero (just pick any epsilon smaller than 1) such that, no matter what rank N of the sequence $f(x_n)$ you pick, there exists i greater than N with $|f(x_i) - f(x_{i+1})|$ greater than epsilon (two successive terms of $(f(x_n))$ are always exactly a distance 1 apart). Therefore $(f(x_n))$ is not Cauchy, which implies that it does not converge.

Ok, so say for the second one, would you use n+1/n and show that that diverges? Sorry, I'm just trying to work in that divergence criterion. For the last one, would I show something along the lines of (1/x)(1/x)?

Also, don't these proofs need epsilon and delta in order to use the Divergence Criterion?
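A numerical sketch (an assumption, not taken from the thread) suggests the second function can reuse the same sequence $x_n = 2/(n\pi)$: since $x_n \to 0$, the values $f(x_n) = x_n + \sin(n\pi/2)$ still cluster near 1, 0, -1, 0, and consecutive terms stay roughly a distance 1 apart, so $(f(x_n))$ again fails to be Cauchy.

```python
import math

# Same input sequence as for sin(1/x): x_n = 2/(n*pi) -> 0.
# For f(x) = x + sin(1/x), f(x_n) = x_n + sin(n*pi/2); the x_n part
# vanishes, so consecutive terms remain nearly a distance 1 apart.
xs = [2 / (n * math.pi) for n in range(1, 200)]
fs = [x + math.sin(1 / x) for x in xs]
gaps = [abs(fs[i] - fs[i + 1]) for i in range(len(fs) - 1)]

# Far out in the tail, the gaps are still bounded away from 0,
# so (f(x_n)) is not Cauchy and hence diverges.
print(min(gaps[50:]))
```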
• March 28th 2011, 06:36 AM
HallsofIvy
YOU stated, in your first post, that the Divergence Criterion says "f(x) does not have a limit at a if and only if there is an input sequence (xn) with elements in D-{a} such that (xn) converges to a but (f(xn)) diverges." It is not always necessary to use "epsilon-delta" proofs to show that a sequence does or does not converge.
• March 28th 2011, 06:44 AM
alice8675309
Quote:

Originally Posted by HallsofIvy
YOU stated, in your first post, that the Divergence Criterion says "f(x) does not have a limit at a if and only if there is an input sequence (xn) with elements in D-{a} such that (xn) converges to a but (f(xn)) diverges." It is not always necessary to use "epsilon-delta" proofs to show that a sequence does or does not converge.

Oh, ok, so basically I just have to exhibit the input sequence, show that it converges, but show that (f(xn)) diverges?
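For the third function, one common choice of input sequence (an assumption here, since the thread never works it out) is $x_n = \frac{2}{(4n+1)\pi}$, so that $\sin(1/x_n) = \sin(\pi/2 + 2n\pi) = 1$ and $f(x_n) = (4n+1)\pi/2$ is unbounded; an unbounded sequence cannot converge, so the Divergence Criterion applies.

```python
import math

# Choose x_n = 2/((4n+1)*pi), so 1/x_n = (4n+1)*pi/2 and
# sin(1/x_n) = sin(pi/2 + 2n*pi) = 1.  Then
# f(x_n) = (1/x_n) * sin(1/x_n) = (4n+1)*pi/2, which grows without bound.
xs = [2 / ((4 * n + 1) * math.pi) for n in range(1, 8)]
fs = [math.sin(1 / x) / x for x in xs]

print([round(v, 3) for v in fs])  # strictly increasing, unbounded
```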
• March 28th 2011, 01:40 PM