Thread: f differentiable implies f' continuous.

1. f differentiable implies f' continuous.

Suppose that f is differentiable on (a, b) and that f ' is increasing on (a, b). Prove that f ' is continuous on (a, b).

I'm actually at a loss on how to do this problem. The method I keep finding myself trying is to assume that f ' is not continuous at some point, then show this leads to a contradiction. The problem is I keep finding myself assuming that the rest of f ' is continuous, and I have no clue how "f ' increasing" contributes to the proof.

It seems to me that any differentiable function would have f ' continuous; otherwise, the original function would have a 'sharp corner' causing it to not be differentiable.

3. The derivative of a function is not necessarily continuous, but it does satisfy the "intermediate value" property (Darboux's theorem): if f is differentiable on [a, b] and c is a number between f'(a) and f'(b), then there exists an $\displaystyle x_0$ in (a, b) such that $\displaystyle f'(x_0) = c$. A consequence of that is that if $\displaystyle \lim_{x\to x_0^+} f'(x)$ and $\displaystyle \lim_{x\to x_0^-} f'(x)$ both exist, then they are equal to $\displaystyle f'(x_0)$ and the derivative is continuous there. So if f' is NOT continuous, at least one of those one-sided limits does not exist. Use "increasing" to show that cannot happen.
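To see that the derivative of a differentiable function really can fail to be continuous, the standard counterexample (not from this thread) is $\displaystyle f(x) = x^2 \sin(1/x)$ with $\displaystyle f(0) = 0$: it is differentiable everywhere, but f' has no limit at 0. Note that this f' is not increasing, which is exactly why the problem's hypothesis matters. A quick numerical sketch:

```python
import math

# f(x) = x^2 sin(1/x) for x != 0, f(0) = 0: differentiable everywhere,
# but f' is discontinuous at 0.

def f(x):
    return x**2 * math.sin(1.0 / x) if x != 0 else 0.0

def f_prime(x):
    # For x != 0, product/chain rule gives f'(x) = 2x sin(1/x) - cos(1/x).
    # At 0, the difference quotient is f(h)/h = h sin(1/h) -> 0, so f'(0) = 0.
    if x == 0:
        return 0.0
    return 2 * x * math.sin(1.0 / x) - math.cos(1.0 / x)

# f'(0) exists and equals 0:
print(f_prime(0.0))  # 0.0

# But near 0, f' keeps oscillating between roughly -1 and 1:
# at x_n = 1/(n*pi) we get f'(x_n) = -cos(n*pi) = -(-1)^n, alternating sign,
# so lim_{x -> 0} f'(x) does not exist and f' is not continuous at 0.
samples = [f_prime(1.0 / (n * math.pi)) for n in range(100, 110)]
print(min(samples), max(samples))  # roughly -1.0 and 1.0
```

So continuity of f' is genuinely an extra hypothesis; the "sharp corner" intuition from the original post rules out non-differentiability, not discontinuity of the derivative.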
4. As suggested, $\displaystyle f'$ satisfies the intermediate value property. Now take any $\displaystyle c \in (a,b)$ and, given $\displaystyle \epsilon > 0$ small enough that $\displaystyle (f'(c) - \epsilon , f'(c)+\epsilon) \subset (f'(a) , f'(b))$, the intermediate value property gives an interval $\displaystyle (\alpha , \beta) \subset (a,b)$ with $\displaystyle \alpha < c < \beta$ such that $\displaystyle f'((\alpha, \beta))=(f'(c) - \epsilon , f'(c)+\epsilon)$. Now choose $\displaystyle \delta = \min(c-\alpha,\ \beta-c)$; since $\displaystyle f'$ is increasing, we always have $\displaystyle |f'(x) -f'(c)| < \epsilon$ whenever $\displaystyle |x-c| < \delta$. Hence $\displaystyle f'$ is continuous on (a,b).
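For completeness, here is a sketch of the one-sided-limit route hinted at in post 3. Since f' is increasing, for any $\displaystyle c \in (a,b)$ both one-sided limits exist by monotonicity: $\displaystyle L^- = \lim_{x\to c^-} f'(x) = \sup_{x<c} f'(x)$ and $\displaystyle L^+ = \lim_{x\to c^+} f'(x) = \inf_{x>c} f'(x)$, with $\displaystyle L^- \le f'(c) \le L^+$. If $\displaystyle L^- < L^+$, then f' takes no value in $\displaystyle (L^-, L^+)$ other than $\displaystyle f'(c)$ itself, contradicting the intermediate value (Darboux) property. Hence $\displaystyle L^- = f'(c) = L^+$, and f' is continuous at c.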