- April 3rd 2011, 04:57 AM, alice8675309: prove that there exists a delta>0 such that f(x)>0...
Let f : D → R be continuous at a point a ∈ D and assume f(a) > 0. How would I prove that there exists a δ > 0 such that f(x) > 0 for all x ∈ D ∩ (a-δ, a+δ)?

- April 3rd 2011, 05:45 AM, Plato
- April 3rd 2011, 05:53 AM, alice8675309
Oops, sorry, it is supposed to be intersection. But how do I go about starting it?

- April 3rd 2011, 05:59 AM, Plato
- April 3rd 2011, 06:44 AM, alice8675309
Right. My biggest problem is going from what we have to what I need, which is basically writing the proof. Following the format of my notes and other proofs, I've been trying to put one together. I don't know, but is this right:

Assume that f is continuous at a and that f(a) > 0. Now choose ε = f(a)/2 > 0. By the definition of continuity, there exists a δ > 0 such that for any x ∈ D with |x - a| < δ, we have |f(x) - f(a)| < f(a)/2.

I'm not sure about this, but it's what I managed to get out.

- April 3rd 2011, 07:13 AM, TheEmptySet
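The ε = f(a)/2 choice discussed above does finish the proof; here is a hedged sketch of the one remaining step (the inequality that turns the continuity estimate into positivity), written out in LaTeX:

```latex
% Sketch: completing the proof with the choice \varepsilon = f(a)/2.
Choose $\varepsilon = \tfrac{f(a)}{2} > 0$. Since $f$ is continuous at $a$,
there exists $\delta > 0$ such that
\[
  x \in D,\quad |x - a| < \delta
  \;\implies\;
  |f(x) - f(a)| < \tfrac{f(a)}{2}.
\]
For any such $x$, the inequality $|f(x) - f(a)| < \tfrac{f(a)}{2}$ gives in particular
\[
  f(x) > f(a) - \tfrac{f(a)}{2} = \tfrac{f(a)}{2} > 0,
\]
so $f(x) > 0$ for all $x \in D \cap (a - \delta,\, a + \delta)$. \qed
```

The key design choice is taking ε to be half of f(a): any ε strictly between 0 and f(a) would work, but f(a)/2 makes the final bound f(x) > f(a)/2 explicit and symmetric.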