Does anyone have an idea of what kind of proof my teacher is looking for?
Hi,
I would like to know how to prove the following generalization of the Fundamental Theorem of Calculus using the method below.
Suppose there is a finite set E ⊂ [a,b] and functions f, φ : [a,b] → R (the reals) such that
(a) φ is continuous on [a,b];
(b) φ'(x) = f(x) for all x ∈ [a,b] \ E;
(c) f ∈ R[a,b] (i.e. f is Riemann integrable).
Then
∫_a^b f = φ(b) − φ(a).
Method: Start by assuming E = {a,b}, and remember that the Mean Value Theorem for derivatives, applied to a closed interval [c,d], say, requires continuity on all of [c,d] but differentiability only on (c,d).
Once you have done it for this special case, explain why it is still true if you add another point to E. The general case then follows.
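Here is as far as I can follow the hint for the special case E = {a,b}, written out as a sketch (this is my own reading, not necessarily the argument my teacher intends, so corrections are welcome):

```latex
% Sketch for the special case E = {a,b} (my own reading of the hint).
% Take any partition a = x_0 < x_1 < \dots < x_n = b of [a,b].
% On each subinterval [x_{i-1}, x_i]: \varphi is continuous there, and
% (x_{i-1}, x_i) \subset (a,b) = [a,b] \setminus E, so \varphi' = f on the
% open subinterval. The MVT therefore gives a tag t_i \in (x_{i-1}, x_i) with
\varphi(x_i) - \varphi(x_{i-1}) = f(t_i)\,(x_i - x_{i-1}).
% Summing over i = 1, \dots, n, the left-hand side telescopes:
\varphi(b) - \varphi(a) = \sum_{i=1}^{n} f(t_i)\,(x_i - x_{i-1}).
% The right-hand side is a Riemann sum for f with tags t_i. Since
% f \in R[a,b], these sums converge to \int_a^b f as the mesh tends to 0;
% but the left-hand side is the same constant for every partition, so
\varphi(b) - \varphi(a) = \int_a^b f.
% Adding one more point e \in (a,b) to E: split [a,b] at e, apply the
% special case on [a,e] and [e,b], and add the two resulting identities.
```

Is this roughly the intended argument, or am I missing something in the step where another point is added to E?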
I am a bit confused about the method my teacher specified.
It would be helpful if someone could guide me through it.
Thank you very much.