Math Help - Proof question

  1. #1
     Junior Member (Joined Oct 2010, Posts: 71)

     Proof question

    Let X be a proper subset of R and let f: X to R. Prove (from the definition) that if f(x) tends to L as x tends to infinity then -f(x) tends to -L as x tends to infinity.

    My attempt:

    (For all epsilon > 0) (There exists k > 0) (For all x in X) x > k implies |f(x) - L| < epsilon.

    So for x > k we have -epsilon < f(x) - L < epsilon.

    We must show that -f(x) tends to -L as x tends to infinity, i.e. that

    (For all epsilon > 0) (There exists k > 0) (For all x in X) x > k implies |-f(x) - (-L)| < epsilon.

    But |-f(x) - (-L)| = |-f(x) + L| = |-(f(x) - L)| = |f(x) - L|, so the same k that works for f works for -f: for all x in X with x > k, |-f(x) + L| = |f(x) - L| < epsilon.

    So if f(x) tends to L, -f(x) tends to -L.

    Is this correct?
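
    For reference, the whole argument can be written out compactly in LaTeX. This is a sketch using only the standard epsilon-k definition of a limit at infinity, with the same symbols as the attempt above:

    ```latex
    \documentclass{article}
    \usepackage{amsmath, amssymb}
    \begin{document}
    Suppose $f(x) \to L$ as $x \to \infty$. Fix $\varepsilon > 0$. By hypothesis
    there is a $k > 0$ such that for all $x \in X$,
    \[
      x > k \implies |f(x) - L| < \varepsilon.
    \]
    For the same $k$ and any $x \in X$ with $x > k$,
    \[
      \lvert -f(x) - (-L) \rvert = \lvert -(f(x) - L) \rvert = |f(x) - L| < \varepsilon,
    \]
    so $-f(x) \to -L$ as $x \to \infty$. \qed
    \end{document}
    ```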

  2. #2
     MHF Contributor, Also sprach Zarathustra (Joined Dec 2009, From Russia, Posts: 1,506, Thanks: 1)
    Yes. The key point is that |-f(x) + L| = |-(f(x) - L)| = |f(x) - L|, so the same k that works for f also works for -f.
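
    A quick numerical illustration of this (not a proof, just a sanity check): for the sample function f(x) = 2 + 1/x, which tends to L = 2, the threshold k = 1/epsilon that forces |f(x) - L| < epsilon also forces |-f(x) - (-L)| < epsilon, because the two quantities are equal. The function f, the limit L, and the choice of k here are all illustrative assumptions, not part of the original problem.

    ```python
    def f(x):
        return 2.0 + 1.0 / x   # sample function with limit L = 2 as x -> infinity

    L = 2.0
    epsilon = 1e-3
    k = 1.0 / epsilon          # for this particular f, x > k gives |f(x) - L| < epsilon

    for x in [k + 1, 10 * k, 1000 * k]:
        assert abs(f(x) - L) < epsilon          # definition satisfied for f
        assert abs(-f(x) - (-L)) < epsilon      # same k works for -f
        assert abs(-f(x) + L) == abs(f(x) - L)  # the key identity

    print("same k works for f and -f")
    ```
    
    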