I think it's right! ($x \neq c$)
My professor has this proof in his notes and I would appreciate some feedback. It appears in the Mean Value Theorem section but I'm not sure if I need to incorporate that or not.
Let $I$ be an interval, and suppose $f$ and $g$ are differentiable on $I$. If $f'(x) > g'(x)$ on $I$ and $c$ is in $I$ with $f(c) = g(c)$, then prove that $f(x) > g(x)$ for all $x \neq c$ in $I$.
Let $h = f - g$, so $h(c) = 0$ as $f(c) = g(c)$. Then, $f'(x) > g'(x)$ implies that $h'(x) > 0$ on $I$. This implies that $h$ is increasing, so $h(x) > 0$, i.e., $f(x) > g(x)$, as desired.
What happens if $x < c$? Surely that would then mean that $f(x) < g(x)$...
I don't think you can omit this case, because derivatives are defined using limits (which, by the way, you need to write out in your proof), which means you need to check what happens as $x \to c$ from both sides...
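A concrete instance (my own example, not from the thread) makes the $x < c$ failure easy to check: take $f(x) = x$ and $g(x) = 0$ with $c = 0$, so $f(c) = g(c)$ and $f'(x) = 1 > 0 = g'(x)$ everywhere, yet $f < g$ to the left of $c$.

```python
# Hypothetical counterexample (not from the thread): f(x) = x, g(x) = 0, c = 0.
# Then f(c) = g(c) and f'(x) = 1 > 0 = g'(x) on all of R.
def f(x):
    return x

def g(x):
    return 0.0

c = 0.0

# To the right of c the claimed inequality f(x) > g(x) does hold...
assert all(f(x) > g(x) for x in [0.5, 1.0, 2.0])

# ...but to the left of c it reverses: f(x) < g(x).
assert all(f(x) < g(x) for x in [-0.5, -1.0, -2.0])

print("f > g fails for x < c")
```

So the statement as written (for all $x \neq c$) is false; only the one-sided version $x > c$ survives.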
So are you saying that it can't be proven at all because it is always false, or do I need to approach it another way?
I was looking at it again and I was wondering if you could use the Mean Value Theorem. We know that on some interval $[a, b]$, if $f'(x) > g'(x)$ then the average rate of change on that interval will be greater for $f$. So I guess what I am asking is: what if we assumed that $c$ was not some random point but the point that makes $f(c) = g(c)$ true for some $c$ in $I$? Then is the statement still false?
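For what it's worth, here is a sketch (my own write-up, not from the thread) of how the MVT argument would go, writing $h = f - g$; it shows exactly why only the $x > c$ direction survives:

```latex
\begin{proof}[Sketch via the Mean Value Theorem]
Let $h = f - g$, so $h(c) = f(c) - g(c) = 0$ and $h' = f' - g' > 0$ on $I$.
For $x > c$, the Mean Value Theorem gives some $\xi \in (c, x)$ with
\[
  h(x) = h(x) - h(c) = h'(\xi)\,(x - c) > 0,
\]
since $h'(\xi) > 0$ and $x - c > 0$; hence $f(x) > g(x)$.
For $x < c$, the same identity holds with $\xi \in (x, c)$, but now the
factor $x - c$ is negative, so $h(x) < 0$, i.e.\ $f(x) < g(x)$:
the inequality reverses to the left of $c$.
\end{proof}
```

So assuming $f(c) = g(c)$ does make the one-sided claim ($x > c$) provable by MVT, but the two-sided claim stays false.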