# Thread: Justifying the integration of both sides

1. ## Justifying the integration of both sides

How can you prove, using the definition of anti-differentiation, that if two functions are equal, then their anti-derivatives differ by a constant?

Also, can this be used to prove the constant rule for integration?

2. First, if H'(x) = 0 on an interval, then H(x) = constant (this is a consequence of the Mean Value Theorem, proved in the next post).

If you have F and G s.t. F'=G' = f, then (F - G)' = F' - G' = f - f = 0.
Since (F - G)'(x) = 0, then it must be that (F - G)(x) = constant, or F(x) - G(x) = constant, or F(x) = G(x) + constant.

That is, if both F and G belong to the set of anti-derivatives of f, they differ by a constant.

3. Originally Posted by Skyrim
How can you prove, using the definition of anti-differentiation, that if two functions are equal, then their anti-derivatives differ by a constant?

Also, can this be used to prove the constant rule for integration?
It follows from a simple fact: if $f:\mathbb{R}\to\mathbb{R}$ (or, in fact, $f$ defined on any connected subspace of $\mathbb{R}$) satisfies $f'\equiv 0$, then $f$ is constant. To see this, let $x,y\in\mathbb{R}$ be arbitrary with, say, $x<y$. By the MVT there exists some $\xi\in(x,y)$ such that $f(x)-f(y)=(x-y)f'(\xi)=0$, and thus $f(x)=f(y)$. Since $x,y$ were arbitrary, the conclusion follows.

4. Originally Posted by Skyrim
How can you prove using the definition of anti-differentiation, that if two functions are equal, then their anti-derivatives must differ by a constant?
The statement is not true. Choose for example:

$F,G:(0,1)\cup (2,3)\rightarrow\mathbb{R}$

$F(x)=\begin{cases} 0 & \mbox{if } x\in (0,1)\\ 1 & \mbox{if } x\in (2,3)\end{cases}\quad G(x)=\begin{cases} 1 & \mbox{if } x\in (0,1)\\ 0 & \mbox{if } x\in (2,3)\end{cases}$

Then $F'=G'$; however, $F$ and $G$ do not differ by a constant.

The statement is true if the domain of definition of $F$ and $G$ is an interval of $\mathbb{R}$.

Hint: Use the Mean Value Theorem.

Fernando Revilla

P.S. Edited: Sorry, I didn't see the previous posts. Anyway, perhaps the counter-example could be useful.

5. Originally Posted by FernandoRevilla
The statement is not true. Choose for example:

$F,G:(0,1)\cup (2,3)\rightarrow\mathbb{R}$

$F(x)=\begin{cases} 0 & \mbox{if } x\in (0,1)\\ 1 & \mbox{if } x\in (2,3)\end{cases}\quad G(x)=\begin{cases} 1 & \mbox{if } x\in (0,1)\\ 0 & \mbox{if } x\in (2,3)\end{cases}$

Then $F'=G'$; however, $F$ and $G$ do not differ by a constant.

The statement is true if the domain of definition of $F$ and $G$ is an interval of $\mathbb{R}$.

Hint: Use the Mean Value Theorem.

Fernando Revilla

P.S. Edited: Sorry, I didn't see the previous posts. Anyway, perhaps the counter-example could be useful.
Could you please prove it for me, assuming that the domain of $F$ and $G$ is indeed a real interval?

6. Originally Posted by Skyrim
Could you please prove it for me, assuming that the domain of $F$ and $G$ is indeed a real interval?
I suppose you meant the first part. The second one has already been proved in this thread.

For every $x\in S=(0,1)\cup (2,3)$ we have $F'(x)=G'(x)=0$. That is, $F$ and $G$ are primitives of $f=0$ on $S$. However:

$(F-G)(x)=\begin{cases} -1 & \mbox{if } x\in (0,1)\\ \;\;1 & \mbox{if } x\in (2,3)\end{cases}$

i.e. $F-G$ is a non-constant function.

Fernando Revilla