# Thread: Not Understanding Proof for Derivative.

1. ## Not Understanding Proof for Derivative.

If y' = f'(x),
then y = f(x) is one integral.
Now let g(x) be any other integral of y' = f'(x), that is, g'(x) = f'(x). Show that g(x) can differ from f(x) by at most a constant.
Then let: w' = f'(x) - g'(x) = 0.
Then w, as a function of x, must satisfy w = constant.
Hence, since w = f(x) - g(x), we see that g(x) = f(x) + constant.
Because g(x) is any integral other than f(x), all integrals are given by y = f(x) + c.

How did they get from w = constant to g(x) = f(x) + constant algebraically?
Second, they said "Now let g(x) be any other integral of y' = f'(x), that is, g'(x) = f'(x)." But if they are two different integrals, why did they set g'(x) = f'(x)? Doesn't setting something equal to something mean they are equal? Is there something that I am misunderstanding?

2. Originally Posted by Mike012
If y' = f'(x),
then y = f(x) is one integral.
Now let g(x) be any other integral of y' = f'(x), that is, g'(x) = f'(x). Show that g(x) can differ from f(x) by at most a constant.
Then let: w' = f'(x) - g'(x) = 0.
Then w, as a function of x, must satisfy w = constant.
Hence, since w = f(x) - g(x), we see that g(x) = f(x) + constant.
Because g(x) is any integral other than f(x), all integrals are given by y = f(x) + c.

How did they get from w = constant to g(x) = f(x) + constant algebraically?
Second, they said "Now let g(x) be any other integral of y' = f'(x), that is, g'(x) = f'(x)." But if they are two different integrals, why did they set g'(x) = f'(x)? Doesn't setting something equal to something mean they are equal? Is there something that I am misunderstanding?
g'(x) = f'(x).

Integrate both sides wrt x:

g(x) = f(x) + C ....

3. it's a basic fact about the derivative: if $\displaystyle g'(x)=0$ for all $\displaystyle x,$ then $\displaystyle g(x)=k$ for some constant $\displaystyle k.$

so in your problem, if $\displaystyle f'(x)=g'(x)$ for all $\displaystyle x,$ then $\displaystyle f'(x)-g'(x)=(f(x)-g(x))'=0,$ which implies $\displaystyle f(x)-g(x)$ is constant, hence the result.
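That "basic fact" is itself justified by the Mean Value Theorem; here is a sketch of that step (assuming $\displaystyle g$ is differentiable on an interval $\displaystyle I$), which the posts in this thread take for granted:

```latex
% Sketch: g'(x) = 0 on an interval I implies g is constant on I.
% Pick any a < b in I. By the Mean Value Theorem there is a c in (a, b) with
%   g(b) - g(a) = g'(c)(b - a).
% Since g'(c) = 0, the right-hand side is 0, so g(b) = g(a).
% As a and b were arbitrary points of I, g takes one value on all of I.
g(b) - g(a) = g'(c)\,(b - a) = 0 \cdot (b - a) = 0 \implies g(b) = g(a)
```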

4. How could we assume, though, that g'(x) = f'(x)?

5. This proof reads almost as if the person who wrote it reasoned like this...
If there is a function, say x^3 + c OR x^3,
then by differentiating, both functions would be 3x^2.
Now the question pops up... if we had a differentiated function f'(x) without knowing the original function f(x)...
how could we know that the integrals of f'(x) differ by only a constant?
Well, if we set
x^3 + c = x^3,
then by differentiating,
3x^2 + c' = 3x^2,
which gives
c' = 0....
Am I correct?

6. Originally Posted by mr fantastic
g'(x) = f'(x).

Integrate both sides wrt x:

g(x) = f(x) + C
mr. f,

this is what his textbook was proving!

7. Originally Posted by Mike012
How could we assume though that g'(x) = f'(x) ?
What they're showing, essentially, is that if g'(x) = f'(x), then f(x) - g(x) = constant.

So if g'(x) = f'(x), then f'(x) - g'(x) = 0.

It seems to me that they should have defined w and not w', i.e., w(x)=f(x) - g(x),

So that w'(x) = f'(x) - g'(x) = 0 ... and on with the rest of their proof.

8. This problem has been troubling me recently, but I have found the solution.
Suppose that y' = g' where both y' and g' are functions of x.
It is required to prove that the difference between y and g does not vary with x. This proof is important because it justifies taking the integral of both sides of any equation.

If y' = g', for all x
y' - g' = 0, for all x
(y-g)' = 0, for all x

The functions whose derivative is always 0 are exactly the constant functions (this follows from the Mean Value Theorem). Since (y-g)' = 0, y-g is a constant function. Let it be denoted by C.

y-g = C

Adding g to both sides,

y = g + C, where C is a constant (a function whose derivative with respect to x is always 0).

In this proof we assume that y and g are continuously differentiable on the interval in question.
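As a quick sanity check of this conclusion (my own illustration, not from the thread), the x^3 example from post #5 can be tested numerically in Python: x^3 + 5 and x^3 both have derivative 3x^2, and their difference is the same constant at every point sampled.

```python
# Numerical sanity check (illustration only): two antiderivatives of the
# same function differ by a constant. Here F(x) = x^3 + 5 and G(x) = x^3
# are both antiderivatives of 3x^2, as in the x^3 example from post #5.

def F(x):
    return x**3 + 5

def G(x):
    return x**3

def numeric_derivative(func, x, h=1e-6):
    # Central-difference approximation of func'(x).
    return (func(x + h) - func(x - h)) / (2 * h)

xs = [-2.0, -0.5, 0.0, 1.0, 3.0]

# F' and G' agree (up to floating-point error) at every sample point...
for x in xs:
    assert abs(numeric_derivative(F, x) - numeric_derivative(G, x)) < 1e-4

# ...and F - G is the same constant, 5, at every sample point.
differences = [F(x) - G(x) for x in xs]
print(differences)  # → [5.0, 5.0, 5.0, 5.0, 5.0]
```

This checks the theorem only at a handful of points, of course; the proof above is what guarantees it everywhere on the interval.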

9. Krizalid gave you the exact same proof in post #3!