
How to prove the limit does not exist at 0

  1. #1 Member (Joined Apr 2010, Posts: 133)


    I know that for each of the following functions with domain R \ {0}, the limit does not exist at 0. However, I'm not sure how to prove this using the Divergence Criterion: f does not have a limit at a if and only if there is an input sequence (x_n) with elements in D \ {a} such that (x_n) converges to a but (f(x_n)) diverges. (The criterion is restated in symbols below the list.)

    (1) f(x) = sin(1/x)
    (2) f(x) = x + sin(1/x)
    (3) f(x) = (1/x) sin(1/x)
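
    In symbols, the criterion (just restating the statement above) reads: f has no limit at a if and only if there is a sequence (x_n) in D \setminus \{a\} with

        \lim_{n \to \infty} x_n = a \qquad \text{but} \qquad (f(x_n)) \text{ divergent}.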

  2. #2 Junior Member RaisinBread (Joined Mar 2011, Posts: 37)
    If you look at the first function and set x_n = \frac{1}{n\pi/2} = \frac{2}{n\pi}, you can easily prove that (x_n) converges to 0, since it is just the harmonic sequence multiplied by a constant, and you notice that f(x_n) = \sin\left(\frac{n\pi}{2}\right) = 1, 0, -1, 0, 1, \ldots

    It can easily be shown that (f(x_n)) diverges: there exists an \varepsilon > 0 (any \varepsilon < 1 works, say \varepsilon = 1/2) such that, no matter what index N you pick, there exists i > N with |f(x_i) - f(x_{i+1})| = 1 > \varepsilon (two successive terms of (f(x_n)) are always exactly distance 1 apart). Therefore (f(x_n)) is not Cauchy, which implies it does not converge.
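
    As a quick numerical sanity check (not part of the proof; just a sketch using only Python's standard math module), you can watch x_n shrink toward 0 while f(x_n) keeps cycling through values near 1, 0, -1, 0:

        import math

        # x_n = 2/(n*pi) converges to 0, but f(x_n) = sin(1/x_n) = sin(n*pi/2)
        # cycles through (approximately) 1, 0, -1, 0 and never settles.
        for n in range(1, 9):
            x_n = 2 / (n * math.pi)
            print(f"n={n}: x_n={x_n:.5f}, f(x_n)={math.sin(1 / x_n):+.5f}")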

  3. #3 Member (Joined Apr 2010, Posts: 133)
    Quote Originally Posted by RaisinBread:
    If you look at the first function and set x_n = \frac{2}{n\pi}, you can easily prove that (x_n) converges to 0 [...] therefore (f(x_n)) is not Cauchy, which implies it does not converge.
    Ok, so for the second one, would you use x_n = n + 1/n and show that that diverges? Sorry, I'm just trying to work the Divergence Criterion in. For the last one, would I show something along the lines of (1/x)(1/x)?

    Also, don't these proofs need epsilon and delta in order to use the Divergence Criterion?

  4. #4 MHF Contributor HallsofIvy (Joined Apr 2005, Posts: 15,775, Thanks: 1514)
    YOU stated, in your first post, that the Divergence Criterion says "f does not have a limit at a if and only if there is an input sequence (x_n) with elements in D \ {a} such that (x_n) converges to a but (f(x_n)) diverges." It is not always necessary to use epsilon-delta proofs to show that a sequence does or does not converge.

  5. #5 Member (Joined Apr 2010, Posts: 133)
    Quote Originally Posted by HallsofIvy:
    [...] It is not always necessary to use epsilon-delta proofs to show that a sequence does or does not converge.
    Oh, ok, so basically I just have to exhibit an input sequence and show that it converges (to 0 here) but that (f(x_n)) diverges?

  6. #6 Junior Member RaisinBread (Joined Mar 2011, Posts: 37)
    Yes, and the input sequence has to converge to zero in your case. Maybe your confusion comes from the fact that the definition of the limit of a function involves epsilon and delta.

    However, this criterion does not. A more intuitive way of explaining it: if there is "some way" of approaching zero (in our case the input sequence (x_n)) such that (f(x_n)) does not converge, then f cannot have a limit at x = 0, because if it did, (f(x_n)) would converge no matter how you approached zero.

    As for using n + 1/n: if you mean that as an input sequence for the second function, I don't think it would work, because your input sequence has to converge to zero; otherwise you can't use the criterion you mentioned in the first post. For the criterion to work, your input sequence MUST converge to zero, and (f(x_n)) must diverge. (See the sketch below for sequences that do work.)
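
    For example, here is one possible choice of input sequences (a sketch; the details are left for you to check). For (2), the same sequence x_n = \frac{2}{n\pi} \to 0 works: f(x_n) = x_n + \sin\left(\frac{n\pi}{2}\right), and since x_n \to 0 while \sin(n\pi/2) keeps cycling through 1, 0, -1, 0, the sequence (f(x_n)) has subsequences converging to 1 and to -1, so it diverges. For (3), take x_n = \frac{1}{\pi/2 + 2n\pi} \to 0: then f(x_n) = \left(\frac{\pi}{2} + 2n\pi\right)\sin\left(\frac{\pi}{2} + 2n\pi\right) = \frac{\pi}{2} + 2n\pi \to \infty, so (f(x_n)) is unbounded and therefore diverges.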

