Thread: Is this proof correct? (about the derivative of the limit of a sequence of functions)

Hi, I am trying to prove the following statement (part of a bigger proof):

Suppose that the sequence of functions $\displaystyle \left\{f_n \right\}$ converges pointwise to $\displaystyle f$ on an interval $\displaystyle [a,b]$, that each $\displaystyle f_n$ is differentiable on $\displaystyle [a,b]$, and that the sequence $\displaystyle \{f_n'\}$ converges uniformly to some $\displaystyle g$ on $\displaystyle [a,b]$. Then $\displaystyle f$ is differentiable on $\displaystyle [a,b]$ and $\displaystyle f'=g$.

My attempt at a proof (which may or may not be correct, but one step seems a little fishy to me):

**********************************************
Since $\displaystyle \lim_{n\rightarrow \infty}{f_n(x)} = f(x)$ and $\displaystyle \lim_{n\rightarrow \infty}{f_n(x+h)} = f(x+h)$ for any given $\displaystyle x$ and $\displaystyle x+h$ in the interval,

then $\displaystyle \lim_{n\rightarrow \infty}{\frac{f_n(x+h)-f_n(x)}{h}} = \frac{f(x+h)-f(x)}{h}$ for these numbers.

Now, for any $\displaystyle n$, $\displaystyle \frac{f_n(x+h)-f_n(x)}{h} = f_n'(y)$ for some $\displaystyle y$ in $\displaystyle (x,x+h)$ (or $\displaystyle (x+h,x)$), by the Mean Value Theorem.

Since $\displaystyle \lim_{n\rightarrow \infty}{f_n'(y)} = g(y)$, it follows that $\displaystyle \frac{f(x+h)-f(x)}{h} = g(y)$ for some $\displaystyle y$ in $\displaystyle (x,x+h)$ (or $\displaystyle (x+h,x)$).

This equation holds for all sufficiently small $\displaystyle |h|>0$ (small enough that $\displaystyle x+h$ lies in the interval).

//now comes the fishy step

Moreover, $\displaystyle \lim_{h\rightarrow 0}{g(y)} = g(x)$, since as $\displaystyle h$ approaches $0$, the interval $\displaystyle [x,x+h]$ shrinks down until it can only contain the number $\displaystyle y = x$.

Therefore, $\displaystyle \lim_{h\rightarrow 0}{\frac{f(x+h)-f(x)}{h}} = g(x)$, which proves the theorem.
**********************************************

This seems simple, but I'm worried about how rigorous the last step is. I tried using a more formal epsilon-delta argument, but I could not construct the necessary inequalities (one reason is that $\displaystyle h$ depends on $\displaystyle n$ and, at the same time, $\displaystyle n$ depends on $\displaystyle h$). Also, I am worried that I never even used the uniform convergence of the sequence $\displaystyle \{f_n'\}$...
Is this flawed, or is my proof actually valid?

Thank you!

(EDIT: I think I found another flaw. The $\displaystyle y$ that works for $\displaystyle f_n'(y)$ might not be the same for all $\displaystyle n$, which makes taking the limit as I did incorrect. I am really stumped, because it doesn't seem like such a hard statement to prove, but I just can't get my head around it... A few days earlier I had thought of a proof using an "epsilon over 3" argument, but I forgot what it was (I lost the paper) and I am not even sure it was correct either.)
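(EDIT 2: For what it's worth, here is my attempt to reconstruct the $\varepsilon/3$ argument. I'm not certain this is the same one I had on paper, so please check it:

```latex
Fix $x$ and $\varepsilon > 0$. For $h \neq 0$ with $x+h$ in the interval,
write the difference quotients
\[
  \varphi_n(h) = \frac{f_n(x+h)-f_n(x)}{h}, \qquad
  \varphi(h) = \frac{f(x+h)-f(x)}{h}.
\]
Applying the Mean Value Theorem to the function $f_n - f_m$ gives, for some
$\xi$ between $x$ and $x+h$,
\[
  \left|\varphi_n(h) - \varphi_m(h)\right| = \left|f_n'(\xi) - f_m'(\xi)\right|
  \le \sup_{t \in [a,b]} \left|f_n'(t) - f_m'(t)\right|,
\]
and the right-hand side is small for all large $m, n$ because $\{f_n'\}$
converges uniformly (hence is uniformly Cauchy). Letting $m \to \infty$
shows that $\varphi_n \to \varphi$ uniformly in $h$. Now choose $N$ large
enough that both
\[
  \left|\varphi(h) - \varphi_N(h)\right| < \tfrac{\varepsilon}{3}
  \text{ for every admissible } h,
  \qquad
  \left|f_N'(x) - g(x)\right| < \tfrac{\varepsilon}{3},
\]
and then choose $\delta > 0$ (using only the differentiability of $f_N$
at $x$) so that $0 < |h| < \delta$ implies
$\left|\varphi_N(h) - f_N'(x)\right| < \tfrac{\varepsilon}{3}$. For such $h$,
\[
  \left|\varphi(h) - g(x)\right|
  \le \left|\varphi(h) - \varphi_N(h)\right|
    + \left|\varphi_N(h) - f_N'(x)\right|
    + \left|f_N'(x) - g(x)\right|
  < \varepsilon,
\]
so $f'(x) = g(x)$.
```

If this is right, it would explain both of my worries: the uniform convergence of $\{f_n'\}$ is exactly what is needed in the MVT step, and applying the MVT to the difference $f_n - f_m$ (rather than to each $f_n$ separately) sidesteps the problem of $y$ depending on $n$.)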