# Not Understanding Proof for Derivative.

• Jan 13th 2011, 12:35 AM
Mike012
Not Understanding Proof for Derivative.
If y' = f'(x),
then y = f(x) is one integral.
Now let g(x) be any other integral of y' = f'(x), that is, g'(x) = f'(x). Show that g(x) can differ from f(x) by at most a constant.
Then let w' = f'(x) - g'(x) = 0.
Then w, as a function of x, must be w = constant;
hence, since w = f(x) - g(x), we see that g(x) = f(x) + constant.
Because g(x) is any integral other than f(x), all integrals are given by y = f(x) + c.

How did they get from w = constant to g(x) = f(x) + constant algebraically?
Second, they said "Now let g(x) be any other integral of y' = f'(x), that is, g'(x) = f'(x)." But if they are two different integrals, why did they set g'(x) = f'(x)? Doesn't setting something equal to something mean they are equal? Is there something I am misunderstanding?
• Jan 13th 2011, 01:49 AM
mr fantastic
Quote:

Originally Posted by Mike012
If y' = f'(x),
then y = f(x) is one integral.
Now let g(x) be any other integral of y' = f'(x), that is, g'(x) = f'(x). Show that g(x) can differ from f(x) by at most a constant.
Then let w' = f'(x) - g'(x) = 0.
Then w, as a function of x, must be w = constant;
hence, since w = f(x) - g(x), we see that g(x) = f(x) + constant.
Because g(x) is any integral other than f(x), all integrals are given by y = f(x) + c.

How did they get from w = constant to g(x) = f(x) + constant algebraically?
Second, they said "Now let g(x) be any other integral of y' = f'(x), that is, g'(x) = f'(x)." But if they are two different integrals, why did they set g'(x) = f'(x)? Doesn't setting something equal to something mean they are equal? Is there something I am misunderstanding?

g'(x) = f'(x).

Integrate both sides wrt x:

g(x) = f(x) + C ....
• Jan 13th 2011, 05:21 AM
Krizalid
It's a basic fact about the derivative: if $g'(x)=0$ for all $x$, then $g(x)=k$.

So in your problem, if $f'(x)=g'(x)$ for all $x$, then $f'(x)-g'(x)=(f(x)-g(x))'=0$, which implies $f(x)-g(x)$ is constant; hence the result.
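Not part of the original thread, but this argument is easy to check numerically. The sketch below (with illustrative functions, f(x) = x^3 + 5 and g(x) = x^3, chosen for the example) confirms that two functions with equal derivatives differ by the same constant at every point:

```python
# Sketch (not from the thread): numerically check that two functions
# with the same derivative differ by a constant.
# f and g are illustrative choices; any pair with f' = g' works.

def f(x):
    return x**3 + 5.0   # one antiderivative of 3x^2, shifted by 5

def g(x):
    return x**3         # another antiderivative of 3x^2

def deriv(func, x, h=1e-6):
    # central finite-difference approximation of func'(x)
    return (func(x + h) - func(x - h)) / (2 * h)

# the derivatives agree (up to numerical error) ...
for x in [-2.0, 0.5, 3.0]:
    assert abs(deriv(f, x) - deriv(g, x)) < 1e-4

# ... and the difference f(x) - g(x) is the same constant everywhere
diffs = [f(x) - g(x) for x in [-2.0, 0.5, 3.0]]
print(diffs)  # [5.0, 5.0, 5.0]
```

Any other pair with f' = g' would show the same behavior: the pointwise difference never varies with x.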
• Jan 13th 2011, 08:14 AM
Mike012
How could we assume, though, that g'(x) = f'(x)?
• Jan 13th 2011, 08:46 AM
Mike012
This proof reads almost as if the person who made it thought like this:
If there is a function, say x^3 + c, OR x^3,
then by differentiating, both functions would give 3x^2.
Now the question pops up: if we had a differentiated function f'(x) without knowing the original function f(x),
how could we know that the integrals of f'(x) differ by only a constant?
Well, if we set
x^3 + c = x^3,
then by differentiating,
3x^2 + c' = 3x^2,
which gives
c' = 0.
Am I correct?
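One way to see the point of the example above (a sketch added for illustration, not part of the thread): the constant c simply vanishes under differentiation, which is why x^3 + c and x^3 are different functions with the same derivative. Here the power rule is applied exactly to polynomial coefficient lists (`poly_deriv` is a hypothetical helper, not from the thread):

```python
# Sketch (hypothetical helper, not from the thread): differentiate a
# polynomial exactly by the power rule and watch the constant term vanish.

def poly_deriv(coeffs):
    """coeffs[k] is the coefficient of x**k; return coefficients of the derivative."""
    return [k * coeffs[k] for k in range(1, len(coeffs))]

x_cubed = [0, 0, 0, 1]          # x^3
x_cubed_plus_c = [4, 0, 0, 1]   # x^3 + 4 (c = 4 chosen arbitrarily)

print(poly_deriv(x_cubed))         # [0, 0, 3]  -> 3x^2
print(poly_deriv(x_cubed_plus_c))  # [0, 0, 3]  -> same 3x^2; c is gone
```

Since differentiation discards c, going backwards from 3x^2 can only recover x^3 up to an unknown constant.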
• Jan 13th 2011, 08:46 PM
SammyS
Quote:

Originally Posted by mr fantastic
g'(x) = f'(x).

Integrate both sides wrt x:

g(x) = f(x) + C

mr. f,

this is what his textbook was proving!
• Jan 13th 2011, 08:55 PM
SammyS
Quote:

Originally Posted by Mike012
How could we assume though that g'(x) = f'(x) ?

What they're showing, essentially, is that if g'(x) = f'(x), then f(x) - g(x) = constant.

So if g'(x) = f'(x), then f'(x) - g'(x) = 0.

It seems to me that they should have defined w, not w', i.e., w(x) = f(x) - g(x),

so that w'(x) = f'(x) - g'(x) = 0 ... and on with the rest of their proof.
• Jan 13th 2011, 10:10 PM
Skyrim
This problem had been troubling me recently, but I have found the solution.
Suppose that y' = g' where both y' and g' are functions of x.
It is required to prove that the difference between y and g does not vary with x. This proof is important because it justifies taking the integral of both sides of any equation.

If y' = g', for all x
y' - g' = 0, for all x
(y-g)' = 0, for all x

The functions whose derivative is always 0 are exactly the constant functions. Since (y-g)' = 0, y-g is a constant function. Let it be denoted by C.

y-g = C
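This proof can be sanity-checked numerically with a less obvious pair than the x^3 examples above (a sketch with illustrative functions, not part of the thread): sin^2(x) and -cos^2(x) both have derivative 2 sin(x) cos(x), so by the argument above their difference must be a constant, and indeed it is identically 1:

```python
import math

# Sketch (illustrative functions, not from the thread): sin^2(x) and
# -cos^2(x) have the same derivative, 2 sin(x) cos(x), so their
# difference y - g must be constant -- here the constant C is 1.

def y(x):
    return math.sin(x) ** 2

def g(x):
    return -math.cos(x) ** 2

def deriv(func, x, h=1e-6):
    # central finite-difference approximation of func'(x)
    return (func(x + h) - func(x - h)) / (2 * h)

for x in [0.3, 1.7, -2.4]:
    assert abs(deriv(y, x) - deriv(g, x)) < 1e-4   # (y - g)' = 0
    assert abs((y(x) - g(x)) - 1.0) < 1e-12        # y - g = C = 1

print("y - g is the constant C = 1 at every sampled x")
```

The constant here is just the Pythagorean identity sin^2(x) + cos^2(x) = 1, which is a nice concrete instance of "two antiderivatives of the same function differ by a constant."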