1. ## Weierstrass M test

Hello,

In another forum, someone suggested using an analogue of this test to prove this theorem : Differentiation under the integral sign - Wikipedia, the free encyclopedia

I've read that the Weierstrass M-test is meant for series, so is it really possible to use it here ?

Anyway, can anyone give me an outline of the proof of this theorem ?

If I've been unclear, please tell me.

Thanks.

2. Originally Posted by Moo
Hello,

In another forum, someone suggested using an analogue of this test to prove this theorem : Differentiation under the integral sign - Wikipedia, the free encyclopedia

I've read that the Weierstrass M-test is meant for series, so is it really possible to use it here ?

Anyway, can anyone give me an outline of the proof of this theorem ?

If I've been unclear, please tell me.

Thanks.
Hello Moo! I looked at the Wikipedia page and did not see any use of the Weierstrass M-test. Do you mean the integral analogue of it? That is, roughly, that if:

$\displaystyle f(x,n)\in\mathcal{C}$ and $\displaystyle M_n\in\mathcal{C}$, and

$\displaystyle \forall{x,n}\in[a,b]~|f(x,n)|\leqslant{M_n}$

And if $\displaystyle \int_a^n{M_n}~dn$ converges, then $\displaystyle \int_a^b{f(x,n)}dx$ converges uniformly?

Oh, I'm sorry, I asked for a proof of the theorem: differentiation under the integral sign :s

Originally Posted by Mathstud28
Hello Moo! I looked at the Wikipedia page and did not see any use of the Weierstrass M-test. Do you mean the integral analogue of it? That is, roughly, that if:

$\displaystyle f(x,n)\in\mathcal{C}$ and $\displaystyle M_n\in\mathcal{C}$, and

$\displaystyle \forall{x,n}\in[a,b]~|f(x,n)|\leqslant{M_n}$

And if $\displaystyle \int_a^n{M_n}~dn$ converges, then $\displaystyle \int_a^b{f(x,n)}dx$ converges uniformly?
Where the heck does this analogue come from? And is it correctly stated?
And what is $\displaystyle \mathcal{C}$?

4. Originally Posted by Moo
Oh, I'm sorry, I asked for a proof of the theorem: differentiation under the integral sign :s

Where the heck does this analogue come from? And is it correctly stated?
And what is $\displaystyle \mathcal{C}$?
I'm sorry, I thought you were saying that the analogue was used in the proof you were studying and that you needed to see a proof of it! The improper analogue $\displaystyle \int_a^{\infty}f(x,n)~dn$ is a smooth continuation of the notion of uniform convergence from infinite series; the non-improper integral analogue is a little more complicated, and unless you have a burning desire I won't go into it. And $\displaystyle \mathcal{C}$ refers to continuous; that is what all my books use.

5. Are you asking for the proof of this:

$\displaystyle \frac{d}{dx}\int_{h(x)}^{g(x)}f(t)dt=f(g(x))g'(x)-f(h(x))h'(x)$?

If so, I think I may be able to help. Let's see:

Let's say f is continuous on an open interval we can call I.

If g(x), h(x) and a are in I, then

$\displaystyle \int_{h(x)}^{g(x)}f(t)dt=\int_{h(x)}^{a}f(t)dt+\int_{a}^{g(x)}f(t)dt$

$\displaystyle =-\int_{a}^{h(x)}f(t)dt+\int_{a}^{g(x)}f(t)dt$

So, $\displaystyle \frac{d}{dx}\int_{h(x)}^{g(x)}f(t)dt=-f(h(x))h'(x)+f(g(x))g'(x)$

Is that what you meant? Probably not, but there it is anyway.
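If you want to sanity-check this formula numerically, here is a small Python sketch; the particular choices $\displaystyle f(t)=t^2$, $\displaystyle h(x)=x$, $\displaystyle g(x)=x^2$ and the midpoint-rule helper `integral` are just illustrative, not part of the theorem:

```python
def f(t):
    return t * t

def h(x):          # lower limit, h'(x) = 1
    return x

def g(x):          # upper limit, g'(x) = 2x
    return x * x

def integral(fn, lo, hi, steps=10_000):
    # composite midpoint rule, accurate enough for this check
    dx = (hi - lo) / steps
    return sum(fn(lo + (i + 0.5) * dx) for i in range(steps)) * dx

def F(x):
    return integral(f, h(x), g(x))

x = 1.5
eps = 1e-5
numeric = (F(x + eps) - F(x - eps)) / (2 * eps)   # central difference for F'(x)
formula = f(g(x)) * (2 * x) - f(h(x)) * 1         # f(g(x))g'(x) - f(h(x))h'(x)
print(numeric, formula)  # both close to 2*1.5**5 - 1.5**2 = 12.9375
```

Here $\displaystyle F(x)=\int_x^{x^2}t^2\,dt=(x^6-x^3)/3$, so $\displaystyle F'(x)=2x^5-x^2$, which both computations reproduce.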

6. Originally Posted by galactus
Are you asking for the proof of this:

$\displaystyle \frac{d}{dx}\int_{h(x)}^{g(x)}f(t)dt=f(g(x))g'(x)-f(h(x))h'(x)$?

If so, I think I may be able to help. Let's see:

Let's say f is continuous on an open interval we can call I.

If g(x), h(x) and a are in I, then

$\displaystyle \int_{h(x)}^{g(x)}f(t)dt=\int_{h(x)}^{a}f(t)dt+\int_{a}^{g(x)}f(t)dt$

$\displaystyle =-\int_{a}^{h(x)}f(t)dt+\int_{a}^{g(x)}f(t)dt$

So, $\displaystyle \frac{d}{dx}\int_{h(x)}^{g(x)}f(t)dt=-f(h(x))h'(x)+f(g(x))g'(x)$

Is that what you meant? Probably not, but there it is anyway.
Hm right, that's the version from the English Wikipedia, not the French one, but thanks, it's never bad to learn something.
see below

Originally Posted by Mathstud28
I'm sorry, I thought you were saying that the analogue was used in the proof you were studying and that you needed to see a proof of it! The improper analogue $\displaystyle \int_a^{\infty}f(x,n)~dn$ is a smooth continuation of the notion of uniform convergence from infinite series; the non-improper integral analogue is a little more complicated, and unless you have a burning desire I won't go into it. And $\displaystyle \mathcal{C}$ refers to continuous; that is what all my books use.
That's the problem: what happens for the non-improper integral?
At least I don't know it under this name... So could you clarify?

The theorem I'm interested in is more or less like this one :
$\displaystyle \begin{gathered}A \subseteq \mathbb{R}, ~ I \subseteq \mathbb{R} \hfill \\ f ~:~ A \times I \to \mathbb{R} \hfill \end{gathered}$
$\displaystyle F(x)=\int_I f(x,t) ~dt$

$\displaystyle \begin{gathered} \bullet t \mapsto f(x,t) \text{ is continuous and integrable} \hfill \\ \bullet t \mapsto \frac{\partial f(x,t)}{\partial x} \text{ exists and is continuous} \hfill \\ \bullet x \mapsto \frac{\partial f(x,t)}{\partial x} \text{ is continuous} \hfill \\ \bullet \exists \varphi \geqslant 0 \text{ integrable}, ~ \forall x,t ~ \left|\frac{\partial f(x,t)}{\partial x} \right| \leqslant \varphi(t) \hfill \end{gathered}$

Then $\displaystyle F$ is $\displaystyle \mathcal{C}^1$ (that is, differentiable with a continuous derivative) and we have :
$\displaystyle F'(x)=\int_I \frac{\partial f(x,t)}{\partial x} ~ dt$

And it's for proving this that someone suggested using the Weierstrass M-test.

Hm right, that's the version from the English Wikipedia
Actually, that is what I just wrote out myself. I did not see it on Wikipedia.
I assume then it is similar to what they have.

8. Originally Posted by Moo
Hello,

In another forum, someone suggested using an analogue of this test to prove this theorem : Differentiation under the integral sign - Wikipedia, the free encyclopedia

I've read that the Weierstrass M-test is meant for series, so is it really possible to use it here ?

Anyway, can anyone give me an outline of the proof of this theorem ?
There are essentially two ways to prove this: one with Lebesgue's dominated (or bounded) convergence theorem (the "analogue of the Weierstrass M-test", I guess, even though that's quite different), which gives the more general proof, and an elementary one, requiring stronger hypotheses and only valid for integrals over a segment (finite limits in the integral).

-----------

Let's begin with the "elementary" one. Before differentiation, let's deal with continuity since it is more natural to begin with.

Proposition: Suppose $\displaystyle f:I\times[a,b]\to\mathbb{R}$ is continuous, where $\displaystyle I$ is an open interval (or a metric space, for instance). Then $\displaystyle F:x\mapsto F(x)=\int_a^b f(x,t) dt$ is continuous on $\displaystyle I$.

Proof: Let $\displaystyle x\in I$. Let $\displaystyle \varepsilon>0$. Choose a compact neighbourhood $\displaystyle K$ of $\displaystyle x$ contained in $\displaystyle I$ (like $\displaystyle K=[x-\delta,x+\delta]$ for some small $\displaystyle \delta$). The function $\displaystyle f$ is continuous on $\displaystyle K\times [a,b]$, which is compact, hence $\displaystyle f$ is uniformly continuous on $\displaystyle K\times [a,b]$. In particular, there is $\displaystyle \alpha>0$ such that, if $\displaystyle |h|<\alpha$ (and $\displaystyle x+h\in K$), then for every $\displaystyle t\in[a,b]$, $\displaystyle |f(x+h,t)-f(x,t)|<\varepsilon$. As a consequence, for these values of $\displaystyle h$, $\displaystyle |F(x+h)-F(x)|\leq \int_a^b|f(x+h,t)-f(x,t)|dt\leq (b-a)\varepsilon$. This is it.

Proposition: Suppose $\displaystyle f:I\times[a,b]\to\mathbb{R}$ is differentiable with respect to the first variable and that $\displaystyle (x,t)\mapsto\frac{\partial f}{\partial x}(x,t)$ is continuous on $\displaystyle I\times[a,b]$. Then $\displaystyle F:x\mapsto F(x)=\int_a^b f(x,t)dt$ is differentiable on $\displaystyle I$, and $\displaystyle F'(x)=\int_a^b\frac{\partial f}{\partial x}(x,t)dt$.

Proof: Let $\displaystyle x\in I$ and let $\displaystyle \varepsilon>0$. In the same way as above, we can find a compact $\displaystyle K\subset I$ containing $\displaystyle x$, and we note that the function $\displaystyle (x,t)\mapsto\frac{\partial f}{\partial x}(x,t)$ is uniformly continuous on $\displaystyle K\times[a,b]$, hence there is $\displaystyle \alpha>0$ such that, if $\displaystyle |h|<\alpha$, then for every $\displaystyle t\in[a,b]$, $\displaystyle \left|\frac{\partial f}{\partial x}(x+h,t)-\frac{\partial f}{\partial x}(x,t)\right|<\varepsilon$. Then, the conclusion comes from:
$\displaystyle \left|\frac{1}{h}(F(x+h)-F(x))-\int_a^b\frac{\partial f}{\partial x}(x,t)dt\right|$
.........$\displaystyle =\left| \int_a^b \left(\frac{1}{h}\int_0^h \frac{\partial f}{\partial x}(x+z,t)dz - \frac{\partial f}{\partial x}(x,t)\right)dt\right|$
.........$\displaystyle =\left| \int_a^b \frac{1}{h}\int_0^h \left(\frac{\partial f}{\partial x}(x+z,t)- \frac{\partial f}{\partial x}(x,t)\right)dz\,dt\right|$
.........$\displaystyle \leq \int_a^b \frac{1}{|h|}\left|\int_0^h \left|\frac{\partial f}{\partial x}(x+z,t)- \frac{\partial f}{\partial x}(x,t)\right|dz\right| dt$
.........$\displaystyle \leq (b-a)\varepsilon$.
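To see the differentiation proposition at work on a segment, here is a quick numerical sketch; the choice $\displaystyle f(x,t)=\sin(xt)$ on $\displaystyle [a,b]=[0,1]$ and the midpoint-rule helper `integral` are just illustrative:

```python
import math

def integral(fn, lo, hi, steps=20_000):
    # composite midpoint rule
    dx = (hi - lo) / steps
    return sum(fn(lo + (i + 0.5) * dx) for i in range(steps)) * dx

def F(x):
    # F(x) = int_0^1 sin(x t) dt, which equals (1 - cos x)/x for x != 0
    return integral(lambda t: math.sin(x * t), 0.0, 1.0)

def dF(x):
    # integral of the partial derivative: int_0^1 t cos(x t) dt
    return integral(lambda t: t * math.cos(x * t), 0.0, 1.0)

x = 2.0
eps = 1e-5
numeric = (F(x + eps) - F(x - eps)) / (2 * eps)  # central difference for F'(x)
print(numeric, dF(x))  # the two values agree to several decimal places
```

The central difference of $\displaystyle F$ and the integral of $\displaystyle \partial f/\partial x$ match, as the proposition predicts.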

---------

These proofs are nice because of their simplicity, but they require integration on a compact space to get uniform continuity. For the integration on a more general interval, an additional hypothesis (a "domination") is needed. This is where Lebesgue's dominated convergence theorem comes into play.

Lebesgue's dominated convergence theorem is a bit like what MathStud28 wrote, but his statement is not correct (why would the sequence of integrals converge?). The dominated convergence theorem is a theorem about taking a "limit under the integration sign". It says (you can replace "measurable" by "piecewise continuous", or almost anything that makes the integration make sense for you):

Theorem: Suppose $\displaystyle f_n$, $\displaystyle n\in\mathbb{N}$, $\displaystyle f$ and $\displaystyle \phi$ are (measurable) functions $\displaystyle I\to\mathbb{R}$ (where $\displaystyle I$ is an interval), such that for all $\displaystyle t\in I$ (or almost all), $\displaystyle f_n(t)\to_n f(t)$, for every $\displaystyle n\in\mathbb{N}$ and every $\displaystyle t\in I$, $\displaystyle |f_n(t)|\leq \phi(t)$, and such that $\displaystyle \int_I \phi(t)dt<\infty$ ($\displaystyle \phi$ is integrable on $\displaystyle I$). Then, for every $\displaystyle n$, $\displaystyle f_n$ is integrable on $\displaystyle I$, and $\displaystyle \int_I f_n(t)dt\to_n \int_I f(t)dt$.

I won't prove it because it is definitely not an "elementary" theorem; it is not even clear why $\displaystyle f$ can be integrated (a limit of piecewise continuous functions doesn't have to be piecewise continuous). But it is simple to deduce the continuity and differentiation theorems from this one. The bounded convergence theorem deals with sequences, so we shall use sequences $\displaystyle h_n\to 0$ instead of $\displaystyle h\to 0$, which is equivalent (if we consider any sequence converging to 0).
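As a concrete numerical illustration of the theorem, and of why the domination hypothesis matters, here is a small Python sketch; the sequences $\displaystyle f_n(t)=t^n$ (dominated by $\displaystyle \phi(t)=1$) and $\displaystyle g_n(t)=(n+1)t^n$ (not dominated by any integrable function) on $\displaystyle [0,1]$ are just illustrative choices:

```python
def integral(fn, lo, hi, steps=50_000):
    # composite midpoint rule
    dx = (hi - lo) / steps
    return sum(fn(lo + (i + 0.5) * dx) for i in range(steps)) * dx

for n in (1, 5, 25, 125):
    # |t^n| <= 1, and the constant 1 is integrable on [0,1]
    dominated = integral(lambda t: t ** n, 0.0, 1.0)
    # (n+1) t^n -> 0 for every t < 1, but no integrable bound works for all n
    undominated = integral(lambda t: (n + 1) * t ** n, 0.0, 1.0)
    print(n, dominated, undominated)
# the dominated integrals tend to 0, the integral of the pointwise limit;
# the undominated ones stay at 1 even though the integrands tend to 0 a.e.
```

So the limit passes under the integral sign exactly in the dominated case, as the theorem asserts.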

Proposition: If $\displaystyle f:I\times J\to\mathbb{R}$ is continuous with respect to the first variable (and measurable), and if there is a (measurable) function $\displaystyle \phi:J\to[0,+\infty)$ such that $\displaystyle |f(x,t)|\leq\phi(t)$ for any $\displaystyle (x,t)\in I\times J$ and $\displaystyle \int_J\phi(t)dt<\infty$, then $\displaystyle F:x\mapsto F(x)=\int_J f(x,t)dt$ is continuous on $\displaystyle I$.

Proof: Notice $\displaystyle F$ is well-defined thanks to the domination by $\displaystyle \phi$. Let $\displaystyle x\in I$ and let $\displaystyle (h_n)_n$ be a sequence converging to 0. Then $\displaystyle F(x+h_n)-F(x)=\int_J (f(x+h_n,t)-f(x,t))dt$, and we apply the dominated convergence theorem to $\displaystyle f_n(t)=f(x+h_n,t)-f(x,t)$: we have $\displaystyle |f_n(t)|\leq 2\phi(t)$, and $\displaystyle f_n(t)\to_n 0$, hence $\displaystyle \int_J f_n(t)dt\to \int_J 0dt =0$. This is it.

Proposition: Let $\displaystyle f:I\times J\to\mathbb{R}$ be differentiable with respect to the first variable, and such that there is $\displaystyle \phi:J\to[0,+\infty)$ with $\displaystyle \left|\frac{\partial f}{\partial x}(x,t)\right| \leq \phi(t)$ for all $\displaystyle (x,t)\in I\times J$, and $\displaystyle \int_J \phi(t)dt<\infty$. Then $\displaystyle F:x\mapsto\int_J f(x,t)dt$ is well-defined, differentiable on $\displaystyle I$, and $\displaystyle F'(x)=\int_J \frac{\partial f}{\partial x}(x,t)dt$.

Proof: To see that $\displaystyle F$ is well-defined, you can notice that you can deduce an upper bound for $\displaystyle |f(x,t)|=\left|f(x_0,t)+\int_{x_0}^x \frac{\partial f}{\partial x}(y,t) dy\right|$ from the upper bound given by assumption, and this upper bound does not depend on $\displaystyle x$ when $\displaystyle x$ is in a compact. In practice, though, it is usually easy to see that $\displaystyle F$ is well-defined by other means. Now, to show that it is differentiable at $\displaystyle x$, let $\displaystyle (h_n)_n$ be a sequence converging to 0. Let us write $\displaystyle \frac{1}{h_n}(F(x+h_n)-F(x))=\int_J f_n(t) dt$, where $\displaystyle f_n(t)=\frac{1}{h_n}(f(x+h_n,t)-f(x,t))$, and apply the dominated convergence theorem. We have $\displaystyle f_n(t)\to_n \frac{\partial f}{\partial x}(x,t)$, and the domination is a consequence of the mean value theorem: $\displaystyle |f(x+h_n,t)-f(x,t)|\leq |h_n| \left|\frac{\partial f}{\partial x}(x+c,t)\right|\leq |h_n|\phi(t)$, for some $\displaystyle c$ between $\displaystyle 0$ and $\displaystyle h_n$, so that $\displaystyle |f_n(t)|\leq \phi(t)$. Hence $\displaystyle \int_J f_n(t)dt\to\int_J \frac{\partial f}{\partial x}(x,t)dt$. And this is it.
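A numerical illustration of this dominated version on an unbounded interval: for the illustrative choice $\displaystyle F(x)=\int_0^{\infty}e^{-t}\cos(xt)\,dt=\frac{1}{1+x^2}$, the partial derivative $\displaystyle -t e^{-t}\sin(xt)$ is dominated by the integrable function $\displaystyle \phi(t)=t e^{-t}$, so the proposition applies. The sketch below (with a homemade midpoint rule and a truncation of the improper integral) checks the conclusion:

```python
import math

def integral(fn, lo, hi, steps=40_000):
    # composite midpoint rule
    dx = (hi - lo) / steps
    return sum(fn(lo + (i + 0.5) * dx) for i in range(steps)) * dx

# F(x) = int_0^infty e^{-t} cos(x t) dt = 1/(1 + x^2); the partial derivative
# -t e^{-t} sin(x t) is dominated by phi(t) = t e^{-t}, which is integrable.
T = 40.0   # truncation point: the e^{-t} factor makes the tail negligible
x = 1.3

dF = integral(lambda t: -t * math.exp(-t) * math.sin(x * t), 0.0, T)
closed_form = -2 * x / (1 + x * x) ** 2   # derivative of 1/(1 + x^2)
print(dF, closed_form)  # the two values agree to several decimal places
```

The integral of the partial derivative indeed matches $\displaystyle F'(x)$ computed from the closed form.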

I hope you enjoyed it... "The dominated convergence theorem" translates as "le théorème de convergence dominée" and this is a must-know in higher studies.

9. About the case when the bounds of the integral depend on $\displaystyle x$...

I assume here a little bit of knowledge about functions of several variables, namely that if the partial derivatives of a function exist and are continuous at one point, then the function is differentiable at that point.

Let us consider $\displaystyle F(x)=\int_{a(x)}^{b(x)} f(x,t)dt$, where $\displaystyle f(x,t)$ is continuously differentiable with respect to $\displaystyle x$ and continuous with respect to $\displaystyle t$.

We can write $\displaystyle F(x)=\Psi(x,a(x),b(x))$ where $\displaystyle \Psi(x,a,b)=\int_a^b f(x,t)dt$. The interest in doing this is that we know how to differentiate $\displaystyle \Psi$ with respect to any of its three variables: $\displaystyle \frac{\partial\Psi}{\partial x}(x,a,b)=\int_a^b \frac{\partial f}{\partial x}(x,t)dt$ (this is what we did above), $\displaystyle \frac{\partial\Psi}{\partial a}(x,a,b)=-f(x,a)$ and $\displaystyle \frac{\partial\Psi}{\partial b}(x,a,b)=f(x,b)$ (this is the fundamental theorem of calculus).

These three partial derivatives are continuous (for the first one, because of the continuity theorem, plus a little more work). Hence the function $\displaystyle \Psi$ is differentiable. So, by the chain rule,
$\displaystyle F'(x)=\frac{d}{dx}\left(\Psi(x,a(x),b(x))\right)=$ $\displaystyle \frac{\partial\Psi}{\partial x}(x,a(x),b(x))+a'(x)\frac{\partial \Psi}{\partial a}(x,a(x),b(x))+b'(x)\frac{\partial\Psi}{\partial b}(x,a(x),b(x))$,
and this should give the formula from the Wikipedia article.
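The full formula with variable limits can also be checked numerically; the choices $\displaystyle f(x,t)=\sin(xt)$, $\displaystyle a(x)=x$, $\displaystyle b(x)=x^2$ below are just illustrative:

```python
import math

def integral(fn, lo, hi, steps=20_000):
    # composite midpoint rule
    dx = (hi - lo) / steps
    return sum(fn(lo + (i + 0.5) * dx) for i in range(steps)) * dx

def f(x, t):
    return math.sin(x * t)

def a(x):
    return x        # lower limit, a'(x) = 1

def b(x):
    return x * x    # upper limit, b'(x) = 2x

def F(x):
    return integral(lambda t: f(x, t), a(x), b(x))

x = 1.2
eps = 1e-5
numeric = (F(x + eps) - F(x - eps)) / (2 * eps)  # central difference for F'(x)
leibniz = (integral(lambda t: t * math.cos(x * t), a(x), b(x))  # df/dx term
           + 2 * x * f(x, b(x))                                 # b'(x) f(x, b(x))
           - 1 * f(x, a(x)))                                    # a'(x) f(x, a(x))
print(numeric, leibniz)  # the two values agree to several decimal places
```

The three terms are exactly the chain-rule contributions of $\displaystyle \Psi$ above.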

10. Hello Moo, my approach is going to be different from Laurent's, and I hope this is okay for you. Tell me if you do not like it and maybe we can find something better. I tell you this because it is not as general as what you want.

Theorem 1: Let $\displaystyle R = [c,d]\times [a,b]$ be a rectangle in $\displaystyle \mathbb{R}^2$ and $\displaystyle f$ (if you prefer, $\displaystyle f(x,t)$) be a continuous function on $\displaystyle R$ with $\displaystyle f_t$ a continuous function on $\displaystyle R$. Then we have that, $\displaystyle \frac{d}{dt}\int_c^d f(x,t) dx = \int_c^d \frac{\partial f}{\partial t}(x,t) dx$ for all $\displaystyle t\in (a,b)$.

In order to prove this we use a theorem called Dominated Convergence Theorem.

Theorem 2: For every real number $\displaystyle y$ in an interval $\displaystyle Y$ around $\displaystyle y_0$, let $\displaystyle F(x,y)$ be a piecewise continuous function on a finite interval $\displaystyle I$ (as a function of $\displaystyle x$ - to be a little informal here, but hopefully clearer). Say that $\displaystyle \lim_{y\to y_0} F(x,y)$ defines a piecewise continuous function, call it $\displaystyle f(x)$. If there is a piecewise continuous function $\displaystyle g(x)$ such that $\displaystyle |F(x,y)|\leq g(x)$ on $\displaystyle I$, then $\displaystyle \lim_{y\to y_0}\int_I F(x,y) dx = \int_I \lim_{y\to y_0} F(x,y) dx = \int_I f(x) dx$.

Theorem 2 is basically about passing the limit under the integral sign. We will not prove this result; if you wish, I can try to post a proof of it, but accept it for now. With this we can prove Theorem 1.

Proof: For a particular $\displaystyle t\in (a,b)$ we have by definition:
$\displaystyle \frac{d}{dt} \int_c^d f(x,t) dx = \lim_{y\to 0} \int_c^d \frac{f(x,t+y)-f(x,t)}{y} dx$
Let $\displaystyle M$ be a maximum for $\displaystyle |f_t|$ on $\displaystyle R$ (it exists because $\displaystyle f_t$ is continuous on the compact set $\displaystyle R$).
Then we see that,
$\displaystyle |f(x,t+y) - f(x,t)| = \left| \int_t^{t+y} f_t(x,z)dz \right| \leq \left|\int_t^{t+y} |f_t(x,z)| dz\right| \leq M |y|$
Thus, we see that for $\displaystyle y\not =0$ near $\displaystyle 0$ that $\displaystyle \left| \frac{f(x,t+y)-f(x,t)}{y} \right| \leq M$.
Define, $\displaystyle F(x,y) = \frac{f(x,t+y)-f(x,t)}{y}$ for $\displaystyle y\not = 0$ and $\displaystyle F(x,0) = f_t(x,t)$ (remember $\displaystyle t$ is fixed) for each $\displaystyle x\in [c,d]$.
Let $\displaystyle g(x) = M$ on $\displaystyle [c,d]$.
We see that $\displaystyle |F(x,y)| \leq g(x)$ by the above work.
Therefore, by Theorem 2 we see that,
$\displaystyle \lim_{y\to 0} \int_c^d F(x,y) dx = \int_c^d \lim_{y\to 0} F(x,y) dx = \int_c^d F(x,0) dx = \int_c^d \frac{\partial f}{\partial t}(x,t) dx$

11. Thank you very much to you two !
I'm going to print these messages and read them again... I have some difficulty following such things on a screen...

I hope you enjoyed it... "The dominated convergence theorem" translates as "le théorème de convergence dominée" and this is a must-know in higher studies.
Yes, it is very clear, and yup, I know this theorem. I was more interested in the proofs, hence my questions ^^
It looks like a lecture !
It is possible that some points are unclear; if so, I'll give feedback as soon as possible.

Hello Moo, my approach is going to be different from Laurent's, and I hope this is okay for you. Tell me if you do not like it and maybe we can find something better. I tell you this because it is not as general as what you want.
As long as it uses the dominated convergence theorem, it's fine; these are the types of proofs I was looking for.

Once again, thanks a bunch guys !

12. Originally Posted by Moo
Thank you very much to you two !
I'm going to print these messages and read them again... I have some difficulty following such things on a screen...

- When you consider integration on a segment $\displaystyle [a,b]$, the constant functions are integrable and thus can be used as a domination (i.e. as the function $\displaystyle \phi$ in my post, or $\displaystyle g$ in ThePerfectHacker's). That's why the continuity of the derivative is sufficient to prove the domination in this case: a continuous function on a segment is bounded. In this setting, as I said, however, I prefer to use uniform continuity (my first proof) because it is easier; both approaches are valid, of course.
- To avoid using limits along a sequence $\displaystyle (h_n)_n$ like I did for the continuity and the differentiation, ThePerfectHacker included it once and for all in his version of the dominated convergence theorem. This indeed simplifies the write-up.
