# f differentiable implies f' continuous.

• Mar 31st 2011, 08:42 AM
Dagger2006
f differentiable implies f' continuous.
Suppose that f is differentiable on (a, b) and that f ' is increasing on (a, b). Prove that f ' is continuous on (a, b).

I'm actually at a loss on how to do this problem. The approach I keep trying is to assume that f ' is not continuous at some point, then derive a contradiction. The problem is that I keep finding myself assuming the rest of f ' is continuous, and I have no idea how f ' being increasing contributes to the proof.

It seems to me that any differentiable function would have f ' continuous; otherwise the original function would have a 'sharp corner', causing it to not be differentiable.

The derivative of a function is not necessarily continuous, but it does satisfy the "intermediate value" (Darboux) property: if f is differentiable on [a, b] and c is a number between f'(a) and f'(b), then there exists an $x_0$ in $[a, b]$ such that $f'(x_0) = c$. A consequence of that is that if $\lim_{x\to x_0^+} f'(x)$ and $\lim_{x\to x_0^-} f'(x)$ both exist, then they both equal $f'(x_0)$ and the derivative is continuous there. So if f' is NOT continuous at a point, at least one of those one-sided limits does not exist there. Use "increasing" to show that cannot happen.
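To make the first claim concrete, here is the standard textbook counterexample (not from this thread) of a derivative that exists everywhere but is not continuous: $f(x) = x^2 \sin(1/x)$ with $f(0) = 0$. A small Python sketch checks both halves of the claim numerically:

```python
import math

# Standard counterexample: f(x) = x^2 sin(1/x), f(0) = 0.
# f is differentiable everywhere, yet f' is not continuous at 0.
# (Note f' is not increasing near 0, so the theorem's hypothesis fails.)
def f(x):
    return x * x * math.sin(1.0 / x) if x != 0 else 0.0

def fprime(x):
    # For x != 0: f'(x) = 2x sin(1/x) - cos(1/x) (product and chain rule).
    # At 0: |f(h)/h| = |h sin(1/h)| <= |h| -> 0, so f'(0) = 0.
    return 2 * x * math.sin(1.0 / x) - math.cos(1.0 / x) if x != 0 else 0.0

# Difference quotients at 0 shrink to f'(0) = 0 ...
for h in (1e-2, 1e-4, 1e-6):
    assert abs(f(h) / h) <= abs(h)

# ... but f' keeps taking values near -1 arbitrarily close to 0:
# at x_n = 1/(2*pi*n), sin(1/x_n) = 0 and cos(1/x_n) = 1, so f'(x_n) = -1.
for n in (10, 100, 1000):
    x_n = 1.0 / (2.0 * math.pi * n)
    print(f"f'({x_n:.2e}) = {fprime(x_n):+.6f}")

# Hence lim_{x->0} f'(x) does not exist, and f' is discontinuous at 0,
# even though f'(0) exists.
```

So differentiability alone does not force f ' to be continuous; the "increasing" hypothesis is what rules out behavior like this.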
As suggested, $f'$ satisfies the intermediate value property. Now fix an arbitrary $c \in (a,b)$ and let $\epsilon > 0$ be given; shrinking $\epsilon$ if necessary, we may assume $(f'(c) - \epsilon , f'(c) + \epsilon) \subset (f'(a) , f'(b))$. (If $f'$ is constant on one side of $c$, continuity from that side is immediate, so assume it is not.) By the intermediate value property there exist $\alpha, \beta \in (a,b)$ with $f'(\alpha) = f'(c) - \epsilon$ and $f'(\beta) = f'(c) + \epsilon$, and since $f'$ is increasing, $\alpha < c < \beta$. Now choose $\delta = \min(c - \alpha,\ \beta - c)$. For every $x$ with $|x - c| < \delta$ we have $\alpha < x < \beta$, so monotonicity gives $f'(c) - \epsilon = f'(\alpha) \le f'(x) \le f'(\beta) = f'(c) + \epsilon$, that is, $|f'(x) - f'(c)| \le \epsilon$. Since $c$ was arbitrary, $f'$ is continuous on $(a,b)$.