Let X be a proper subset of R and let f: X to R. Prove (from the definition) that if f(x) tends to L as x tends to infinity then -f(x) tends to -L as x tends to infinity.

My attempt:

(For all epsilon > 0) (There exists k > 0) (For all x ∈ X) x > k implies |f(x) - L| < epsilon

Unfolding the absolute value: -epsilon < f(x) - L < epsilon. (Note it is the quantity f(x) - L, not |f(x) - L|, that sits between the two bounds; |f(x) - L| is never negative, so -epsilon < |f(x) - L| says nothing.)

Now, to show that -f(x) tends to -L as x tends to infinity, what must be established is

(For all epsilon > 0) (There exists k > 0) (For all x ∈ X) x > k implies |-f(x) - (-L)| < epsilon, i.e. |-f(x) + L| < epsilon

Since |-f(x) + L| = |-(f(x) - L)| = |f(x) - L|, the condition |-f(x) + L| < epsilon is exactly the condition |f(x) - L| < epsilon. Hence, given epsilon > 0, the same k supplied by the hypothesis works for -f(x) as well.

So if f(x) tends to L as x tends to infinity, then -f(x) tends to -L.
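This is not part of the proof, but the "same k works for both limits" argument can be sanity-checked numerically. Below is a sketch using one assumed example function, f(x) = L + 1/x with L = 2 (so f(x) tends to L as x tends to infinity), and the witness k = 1/epsilon that works for this particular f:

```python
# Numerical sanity check (not a proof): for the sample function
# f(x) = L + 1/x with L = 2, the threshold k that makes
# |f(x) - L| < eps for x > k also makes |-f(x) - (-L)| < eps,
# because the two absolute values are equal.
L = 2.0

def f(x):
    return L + 1.0 / x

def witness_k(eps):
    # For this particular f, x > 1/eps gives |f(x) - L| = 1/x < eps.
    return 1.0 / eps

for eps in (0.5, 0.1, 0.001):
    k = witness_k(eps)
    for x in (k + 1, 2 * k, 10 * k):
        assert abs(f(x) - L) < eps            # hypothesis: f(x) -> L
        assert abs(-f(x) - (-L)) < eps        # conclusion: -f(x) -> -L
        assert abs(-f(x) + L) == abs(f(x) - L)  # the key identity
print("same k works for both limits")
```

The check exercises exactly the identity the proof turns on: |-f(x) + L| = |f(x) - L|, so no new k needs to be found.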

Is this correct?