Let f: D -> R be continuous at a point a ∈ D and assume f(a) > 0. How would I prove that there exists a δ > 0 such that f(x) > 0 for all x ∈ D ∩ (a-δ, a+δ)?
Assume that f is continuous at a and that f(a) > 0. Choose ε = f(a)/2 > 0. By the definition of continuity, there exists a δ > 0 such that for any x ∈ D with |x-a| < δ, we have |f(x)-f(a)| < f(a)/2. In particular, for every such x this gives f(x) > f(a) - f(a)/2 = f(a)/2 > 0, so f(x) > 0 for all x ∈ D ∩ (a-δ, a+δ), which is what we wanted.
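To spell out the concluding step, the inequality |f(x)-f(a)| < f(a)/2 can be unpacked (using the fact that f(a) - f(x) ≤ |f(x)-f(a)|) as the chain:

$$|f(x)-f(a)| < \frac{f(a)}{2} \;\implies\; f(a) - f(x) \le |f(x)-f(a)| < \frac{f(a)}{2} \;\implies\; f(x) > f(a) - \frac{f(a)}{2} = \frac{f(a)}{2} > 0.$$

This is why ε = f(a)/2 is a convenient choice: any ε with 0 < ε ≤ f(a) would also work, since f(x) > f(a) - ε ≥ 0 only needs the strict inequality f(x) > f(a) - ε together with ε < f(a), but half of f(a) keeps the arithmetic clean.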
I'm not sure about this, but it's what I managed to get out.