∫ sin(x) dx = -cos(x) + C
Is there a proof for this too?
And why does a +C always appear for any indefinite integration?
I just got an answer for one question:
d/dx sin(x) = lim(d->0) ( sin(x+d) - sin(x) ) / d
= lim ( sin(x)cos(d) + cos(x)sin(d) - sin(x) ) / d
= lim ( sin(x)cos(d) - sin(x) )/d + lim cos(x)sin(d)/d
= sin(x) lim ( cos(d) - 1 )/d + cos(x) lim sin(d)/d
= sin(x) lim ( (cos(d)-1)(cos(d)+1) ) / ( d(cos(d)+1) ) + cos(x) lim sin(d)/d
= sin(x) lim ( cos^2(d) - 1 ) / ( d(cos(d)+1) ) + cos(x) lim sin(d)/d
= sin(x) lim -sin^2(d) / ( d(cos(d)+1) ) + cos(x) lim sin(d)/d
= sin(x) lim (-sin(d)) * lim sin(d)/d * lim 1/(cos(d)+1) + cos(x) lim sin(d)/d
= sin(x) * 0 * 1 * 1/2 + cos(x) * 1 = cos(x)
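As a quick numerical sanity check of the conclusion (not part of the proof itself, and the helper name below is my own), you can approximate the derivative of sin with a central difference and compare it to cos at a few sample points:

```python
import math

def numerical_derivative(f, x, h=1e-6):
    """Central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Compare the numerical derivative of sin against cos at several points.
for x in [0.0, 0.5, 1.0, 2.0, -1.3]:
    assert abs(numerical_derivative(math.sin, x) - math.cos(x)) < 1e-8
print("d/dx sin(x) matches cos(x) at the sampled points")
```

This only checks the identity at finitely many points, of course; the limit argument above is what actually proves it.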
Was my proof correct?
That works fine once you have lim(d->0) sin(d)/d = 1 and lim(d->0) ( cos(d) - 1 )/d = 0.
Those are often deduced by a slightly "hand-waving" geometric argument, using the definition of sin(x) and cos(x) in terms of the unit circle.
It is also possible to define sin(x) to be the series sum_{n=0}^inf (-1)^n x^(2n+1)/(2n+1)! and define cos(x) to be sum_{n=0}^inf (-1)^n x^(2n)/(2n)!. It is not difficult to show that those series converge uniformly for all x, and so we can differentiate "term by term". That is, d/dx sin(x) = sum_{n=0}^inf (-1)^n (2n+1) x^(2n)/(2n+1)! = sum_{n=0}^inf (-1)^n x^(2n)/(2n)! = cos(x).
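To illustrate the term-by-term differentiation (the function names below are my own, just for this sketch), here are partial sums of the sine series and of its term-by-term derivative; the latter is exactly the cosine series:

```python
import math

def sin_series(x, terms=20):
    """Partial sum of the Maclaurin series for sin(x)."""
    return sum((-1)**n * x**(2*n + 1) / math.factorial(2*n + 1)
               for n in range(terms))

def sin_series_derivative(x, terms=20):
    """Differentiate each term (-1)^n x^(2n+1)/(2n+1)! to get
    (-1)^n x^(2n)/(2n)! -- the nth term of the cosine series."""
    return sum((-1)**n * x**(2*n) / math.factorial(2*n)
               for n in range(terms))

# With 20 terms the partial sums agree with sin and cos to high precision
# for moderate x.
for x in [0.0, 1.0, -2.5]:
    assert abs(sin_series(x) - math.sin(x)) < 1e-12
    assert abs(sin_series_derivative(x) - math.cos(x)) < 1e-12
```

A finite check like this is not a proof of uniform convergence, but it shows concretely that differentiating the sine series term by term reproduces the cosine series.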
An indefinite integral is an "anti-derivative": it is any function F(x) such that F'(x) = f(x). Since the derivative of a constant is 0, if F'(x) = f(x), then (F(x) + C)' = F'(x) + 0 = f(x), so adding any constant to F(x) gives another anti-derivative. That is why a +C always appears in an indefinite integration.
(The other way can also be proved: if F'(x) = f(x) and G'(x) = f(x), then F(x) - G(x) is a constant, by the mean value theorem, but that's a bit harder.)
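A small numerical illustration of the +C point (the names F, G, and deriv below are my own): F(x) = -cos(x) and G(x) = -cos(x) + 5 are both anti-derivatives of sin(x), and their difference is constant.

```python
import math

def deriv(f, x, h=1e-6):
    """Central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

F = lambda x: -math.cos(x)        # one anti-derivative of sin
G = lambda x: -math.cos(x) + 5.0  # another anti-derivative of sin

for x in [0.3, 1.7, -2.0]:
    # Both differentiate (numerically) to sin(x)...
    assert abs(deriv(F, x) - math.sin(x)) < 1e-8
    assert abs(deriv(G, x) - math.sin(x)) < 1e-8
    # ...and they differ by the same constant everywhere.
    assert abs((F(x) - G(x)) - (-5.0)) < 1e-12
```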