Could someone please check this proof for me?
Question: Suppose $f$ is a differentiable function. Prove that if $f(0) = 0$ and $|f'(x)| \le |f(x)|$ for all $x$, then $f(x) = 0$ for $x \in (0,1)$.
Assume there exists a $c \in (0,1)$ such that $f(c) > 0$. Then by the MVT there exists a $c^* \in (0,c)$ such that $f'(c^*) = \frac{f(c) - f(0)}{c - 0} = \frac{f(c)}{c}$. We know that $|f'(c^*)| \le |f(c^*)|$. So $\frac{f(c)}{c} > f(c)$ since $c \in (0,1)$.
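Spelled out, the chain of inequalities I am using here (with the hypothesis $|f'(x)| \le |f(x)|$) is

$$|f(c^*)| \ge |f'(c^*)| = \left|\frac{f(c) - f(0)}{c - 0}\right| = \frac{f(c)}{c} > f(c) > 0,$$

where the last strict inequality holds because $0 < c < 1$.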
It follows that $f(c) < f(c^*)$ whenever $c^* < c$. But this means that the function is decreasing. However, $f(0) = 0$, and from our assumption that $f(c) > 0$, $f$ must be increasing towards $f(c)$, which is a contradiction. The proof for the case $f(c) < 0$ is similar.
I think the proof is a bit shaky in the last paragraph. How does it look?