Now (b) has me completely confused about what it is even asking...
1. Suppose f(x) = x^2 sin(1/x) + x/2 if x ≠ 0, and define f(0) = 0.
(a) Show that f'(0) > 0.
(b) Show that, no matter how small the positive number h may be, there are infinitely
many points on both sides of x = 0 and within distance h of x = 0 at which
f'(x) = 3/2, and also infinitely many at which f'(x) = -1/2. This shows that there is
no interval, in particular no neighborhood V_h(0), on which f is increasing, in spite
of the fact that the slope of the curve y = f(x) is positive at 0.
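Here is a quick numerical sanity check of what 1 is claiming (not a proof, and the formula for f' away from 0 is my own product/chain-rule computation): at the points x_n = 1/(nπ), sin(1/x_n) vanishes and cos(1/x_n) = (-1)^n, so f'(x_n) alternates between exactly 3/2 and -1/2, arbitrarily close to 0 on both sides.

```python
import math

def f(x):
    # f(x) = x^2 sin(1/x) + x/2 for x != 0, with f(0) defined to be 0
    return 0.0 if x == 0 else x * x * math.sin(1.0 / x) + x / 2

def fprime(x):
    # For x != 0: f'(x) = 2x sin(1/x) - cos(1/x) + 1/2 (product + chain rule).
    # At 0, the difference quotient f(h)/h = h sin(1/h) + 1/2 -> 1/2, so f'(0) = 1/2.
    return 0.5 if x == 0 else 2 * x * math.sin(1.0 / x) - math.cos(1.0 / x) + 0.5

# At x_n = 1/(n*pi): sin(1/x_n) = 0 and cos(1/x_n) = (-1)^n, so
# f'(x_n) = 1/2 - (-1)^n, i.e. 3/2 for odd n and -1/2 for even n.
for n in range(1, 7):
    x_n = 1.0 / (n * math.pi)
    print(n, round(fprime(x_n), 10))
```

Since x_n → 0 and the same works for negative n, any interval (-h, h) contains infinitely many points of each kind, which is exactly why f'(0) = 1/2 > 0 does not force f to increase on any neighborhood of 0.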
2. Let f be differentiable for a < x < b. Suppose x1 and x2 are distinct points of the
interval, and let P1 and P2 be the corresponding points of the curve y = f(x).
(a) Show that the condition for P1 to lie above the line tangent to the curve at P2 is
f(x1) - f(x2) - (x1 - x2) f'(x2) > 0.
(b) If the condition in (a) holds for every pair of points on the curve, show that f'(x)
increases as x increases. In fact, if x1 < x2, show that
f'(x1) < (f(x2) - f(x1))/(x2 - x1) < f'(x2).
I just get stuck starting.
Like, I don't know if I should use the MVT (Mean Value Theorem) or what?