Why not just u = sin(t)?
I am revising for an exam and am stuck on one question.
\int \sin^2(4t)\cos(4t)\,dt
I have done the following:
u = sin(4t), so u^2 = sin^2(4t)
Now I am stuck and not sure where to go. Can anyone explain how to do this, so I can see how they got the answer?
Thanks
M
I would tell them that I know derivatives.
I can recognize a derivative when I see one.
I was trained that way.
I graduated with a degree in mathematics never having seen U-substitution.
We were required to call it what it is: anti-differentiation.
In fact, to get full credit, we had to identify the 'derivative form'.
This way to evaluate an integral is too hard for the mid-level student.
When you give him a function and ask him, THIS IS A DERIVATIVE OF WHAT?, imo you will confuse him.
But I believe the student who learned it this way will be better off than the student who learned it using u-substitutions.
But how on earth did you not see a u-substitution?
Surely you've learned trigonometric substitution, which is a substitution itself, so you must have learned the substitution technique before the trigonometric substitution.
That is a fair point. I do not remember. I guess by the time we needed
a trigonometric substitution, we could see why it made sense to use one.
Anyway, I think all of this is moot. I think that within twenty years calculus textbooks will not have a chapter on Techniques of Integration. Students are already demanding to be allowed to use laptops connected to the internet. Here is why.
Students do get what they demand today.
basically, "using a u-substitution" is THE SAME THING as recognizing that f(x) dx = g'(u) du.
one person sees adding a+b+c as a 1-step operation, another sees it as a 2-step operation. who is right?
in this particular problem, we have (something squared), along with something else that is very close to the derivative of (something).
typically, one uses "u" for "something", and hopefully, what else is left is close enough to "du" that we can fiddle with it, and make it work.
specifically, if we let u = sin(4t), then du = 4cos(4t) dt. well, we don't have 4cos(4t) dt, we just have cos(4t) dt.
but we can write cos(4t) = (1/4)(4cos(4t)) and take the factor of 1/4 outside the integral. so we have (1/4)∫ u^2 du
= u^3/12 + C = sin^3(4t)/12 + C.
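as a sanity check (this uses sympy, which nobody in the thread mentioned, so treat it as an aside), differentiating the answer should reproduce the integrand:

```python
# check that sin^3(4t)/12 + C really is an antiderivative of sin^2(4t)cos(4t)
import sympy as sp

t = sp.symbols('t')
integrand = sp.sin(4*t)**2 * sp.cos(4*t)
claimed = sp.sin(4*t)**3 / 12

# d/dt [sin^3(4t)/12] should equal the integrand exactly
assert sp.simplify(sp.diff(claimed, t) - integrand) == 0
```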
now, let's pretend we never learned a thing about u-substitution. we're looking for some function f, such that:
f'(t) = sin^2(4t)cos(4t). it appears that f will have to be some combination of trig functions. so let's try:
f(t) = sin^a(mt) + cos^b(nt), and differentiate (does this seem like a reasonable guess?).
f'(t) = (am)sin^(a-1)(mt)cos(mt) - (bn)cos^(b-1)(nt)sin(nt).
compare this to sin^2(4t)cos(4t). well, the second term doesn't have a high enough power of sin,
so we might suppose that b = 0. if, in the first term, we let a = 3, m = 4, we get:
f'(t) = 12sin^2(4t)cos(4t). oh, so close! but it appears we are off by a factor of 12. but 12 is a constant, that's no trouble:
simply consider g(t) = f(t)/12, which will fix everything: g'(t) = (1/12)f'(t) = sin^2(4t)cos(4t), which is what we want to integrate.
so ONE anti-derivative of g'(t) is g(t), and any anti-derivative of g'(t) is of the form g(t) + C, that is:
sin^3(4t)/12 + C.
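again as an aside (a sympy sketch, not part of the original hand working), the "guess and differentiate" step above can be checked mechanically:

```python
# trial function f(t) = sin^a(mt) with a = 3, m = 4, and b = 0 (no cos term)
import sympy as sp

t = sp.symbols('t')
f = sp.sin(4*t)**3
fprime = sp.diff(f, t)

# f'(t) = 12 sin^2(4t) cos(4t): off from the target by a factor of 12
assert sp.simplify(fprime - 12*sp.sin(4*t)**2*sp.cos(4*t)) == 0

# dividing by the constant fixes it: g = f/12 has exactly the right derivative
g = f / 12
assert sp.simplify(sp.diff(g, t) - sp.sin(4*t)**2*sp.cos(4*t)) == 0
```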
Just in case a picture helps guide the mental logic on which Plato is curiously unwilling to introspect (glad he attacks the automatic habit of a u-sub, though!)...
We spot that the integrand looks like the result of a chain-rule differentiation, i.e. it might fit the bottom level of this shape...
... where (key in spoiler) ...
[Spoiler: balloon-diagram images, not reproduced here.]
Next, anti-differentiate with respect to the dashed balloon, just like a u-sub...
... and that's why the solution is...
Generally the drift is...
(For a 'trig sub', slightly different - see below.)
_________________________________________
Don't integrate - balloontegrate!
Balloon Calculus; standard integrals, derivatives and methods
Balloon Calculus Drawing with LaTeX and Asymptote!
The difference is that between understanding what is going on and following a set of rules learned by rote. Today the latter is a useless waste of time when it comes to calculus: when money and lives depend on the result, it is more reliable to use machine assistance. Now, when using a machine in this way, it is vitally important that you understand what is going on in principle, so that you can apply a reality check to the results (after all, it is still you who is legally liable for wrong results, not your computer or Stephen Wolfram).
CB
I fear this thread is getting slightly derailed, but I would just summarise by saying that I learned the following rule, which is essentially the same thing as everyone else is applying, albeit in a more general sense:
It needs some slight adaptation here because the constants don't quite match, but you can easily take that into account when applying the rule.
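The rule itself appears to have been an attachment that didn't survive; presumably it is the standard substitution formula, which for this problem would read (my reconstruction, not the original poster's notation):

```latex
\int f(g(t))\,g'(t)\,dt = F(g(t)) + C, \qquad \text{where } F' = f.
% Here f(u) = u^2, g(t) = \sin(4t), and g'(t) = 4\cos(4t);
% the integrand supplies \cos(4t) rather than 4\cos(4t),
% which is the constant mismatch that needs the slight adaptation.
```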
Whatever works for you... I just find that mappings can be a touch more intuitive (for me) than substitutions.
Anyway, I'm really only posting again to point out that CB probably meant to quote 'Plato' and not Deveno. (Hence some sense of 'derailment'?) Tried the 'report' button instead but found a severe warning about usage...
yes. as a matter of practice, applying a u-substitution is sort of a (self-)test on how well you understand WHY they work. because if you do not, you'll have trouble picking the right "u".
as tom@ballooncalculus pointed out, it involves recognizing when you are seeing the result of a differentiation in which the chain rule has been used. this isn't a 100% hard-and-fast rule, but if the integrand is of the form (f(t))(g(t)), it's a good thing to check out first.
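to make that check concrete (a hypothetical sympy sketch, not anything posted in the thread): pick a candidate u, divide the integrand by du/dt, and see whether t disappears once u is replaced by a fresh symbol:

```python
# mechanical version of "does u = sin(4t) work?" for this integrand
import sympy as sp

t, w = sp.symbols('t w')
integrand = sp.sin(4*t)**2 * sp.cos(4*t)

u = sp.sin(4*t)                                  # candidate inner function
ratio = sp.simplify(integrand / sp.diff(u, t))   # integrand divided by du/dt
in_terms_of_u = ratio.subs(u, w)                 # rewrite sin(4t) as w

# no t left over, so the substitution succeeds: integrate w^2/4 instead
assert not in_terms_of_u.has(t)
assert sp.simplify(in_terms_of_u - w**2/4) == 0
```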
often, people feel as if they know how to integrate if they can evaluate most integrals they see. this is a bit misleading, as the problems posed in a class "already have answers", so they aren't a perfect test. and remembering various forms for integrals can be taxing on the memory (no doubt explaining the continuing popularity of lists of common integrals, in print and on the internet).

one is in much better shape (in terms of "knowing" how to integrate) if one knows that the various rote "rules" are actually theorems. i feel rather strongly that no one, not an undergraduate, not a college professor, nor a professional using mathematics for a living, should ever take a theorem on faith. mathematics is not, after all, a religion, but a way of expressing knowledge. if you can't prove something, you have no right to claim you know it is true (although you may suspect it is). i use the word "prove" loosely, in the sense that you could (given enough time and reference materials) prove it, at least to your own satisfaction (perhaps not to the satisfaction of the Inquisitors of the Grand Council of Rigorous Standards).
yes, i am afraid we ARE a bit off-topic. i suspect the OP has already taken his exam and is on to other things, and we are beating a dead horse and preaching to the choir. CaptainBlack, Plato, tom@ballooncalculus, et al., don't need to read this rant; in a perfect world, i wish the original poster would. the reason one goes to school is NOT (or shouldn't be) to pass the courses. when one is a NASA engineer calculating a re-entry trajectory for a spacecraft, no one cares what grade you got on your final, they want (correct) results.
This is a slow time so why not continue this?
Here is a quote from Keith Devlin.
Using, and I mean using properly, calculators and computers does not represent a reduction in skill or the need for accuracy. On the contrary, successful use of today's computational aids requires far greater mathematical skill, and much more mathematical insight, than we old timers had to master to get our sums right. In addition to ensuring that our students can get the right answer using modern technology, we should also try to interest them in mathematics as a human creation, developed over the centuries to improve the quality of our lives. To do that, we need to show them some of the many different ways that mathematics plays a major role in today's society, including some of the mathematics developed during our own lifetime. In my view, those who cry "Back to basics" have got it wrong. The call should be "Forward to (the new) basics."
Who knows, if we answer that call, we might even produce a generation that is not math phobic or paralyzed by math anxiety.
That is exactly what I meant by my post.
In other words: we must adjust to the world that technology has thrust upon us.
We adjust or die.