Can somebody please help? I do not know how to do this and have an exam tomorrow!
Suppose that f : [a, b] --> R is a differentiable function, and assume f'(a) < k < f'(b). Show that there exists c in (a, b) with f'(c) = k. Do NOT assume that f' is continuous! Hint: consider g(x) = f(x) - kx.
Here's a slightly different proof that derivative functions, although not necessarily continuous, satisfy the intermediate value property. (This result is usually called Darboux's theorem; I believe it is what Sampras is referring to as "Fermat's theorem".)
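To see that a derivative really can fail to be continuous, the classic example is

```latex
f(x) =
\begin{cases}
x^{2}\sin(1/x), & x \neq 0,\\[2pt]
0, & x = 0,
\end{cases}
\qquad
f'(x) =
\begin{cases}
2x\sin(1/x) - \cos(1/x), & x \neq 0,\\[2pt]
0, & x = 0.
\end{cases}
```

Here f'(0) = 0 comes straight from the difference quotient, since |f(x)/x| <= |x| --> 0, but f'(x) has no limit as x --> 0 because of the cos(1/x) term, so f' is discontinuous at 0 even though it exists everywhere.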
Define h(x) = (f(x) - f(a))/(x - a) for x in (a, b], and set h(a) = f'(a). Then h is continuous on [a, b]: away from a this is clear, and at x = a it holds because the limit of h(x) as x --> a is exactly f'(a), by the definition of the derivative. Therefore, h takes on all values between h(a) = f'(a) and h(b) = (f(b) - f(a))/(b - a) on the interval [a, b]. Further, for each x in (a, b], the mean value theorem applied to the interval [a, x] gives a c in (a, x) such that f'(c) = (f(x) - f(a))/(x - a) = h(x). That is, f' takes on all values between f'(a) and (f(b) - f(a))/(b - a).
Similarly, define j(x) = (f(b) - f(x))/(b - x) for x in [a, b), and set j(b) = f'(b). Then j is continuous on [a, b], since the limit of j(x) as x --> b is f'(b).
Therefore, j takes on all values between j(a) = (f(b) - f(a))/(b - a) and j(b) = f'(b). Further, for each x in [a, b), the mean value theorem applied to the interval [x, b] gives a c in (x, b) such that f'(c) = (f(b) - f(x))/(b - x) = j(x). That is, f' takes on all values between (f(b) - f(a))/(b - a) and f'(b). Since those two ranges together cover every number between f'(a) and f'(b), for any k such that f'(a) < k < f'(b), there exists c in (a, b) with f'(c) = k.
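The covering step at the end can be written out in one line. Writing m for the overall slope of the secant from a to b,

```latex
m = \frac{f(b)-f(a)}{b-a},
\qquad
f'\bigl([a,b]\bigr) \;\supseteq\; \bigl[f'(a),\, m\bigr] \cup \bigl[m,\, f'(b)\bigr] \;=\; \bigl[f'(a),\, f'(b)\bigr],
```

where the intervals are read in the appropriate order if m does not lie between f'(a) and f'(b) in the order written. Either way, every k strictly between f'(a) and f'(b) lands in one of the two pieces, so it is hit by f' at some interior point.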
This shows, in turn, by the way, that if the limit of f'(x) as x approaches a exists, then it must be equal to f'(a). The only way a derivative can fail to be continuous at a point is if that limit does not exist there.
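That last claim also drops out of the mean value theorem directly. Suppose f'(x) --> L as x --> a+. For each x near a,

```latex
\frac{f(x)-f(a)}{x-a} = f'(c_x) \quad \text{for some } c_x \in (a, x),
```

and letting x --> a+ forces c_x --> a, so the difference quotient tends to L. But by definition the difference quotient tends to f'(a), hence f'(a) = L. So a derivative can never have a removable or jump discontinuity, only an oscillating one, as in the x^2 sin(1/x) example.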