The integral of the polynomial between 0 and 1 is zero.
That means that either the polynomial is identically zero or the area lying above the x-axis equals the area lying below it.
Which means the polynomial must take both signs on the interval, and so, being continuous, vanishes somewhere in (0, 1).
Here is the question:
Suppose a_0 + a_1/2 + ... + a_n/(n+1) = 0, where all the a's are real.
Show that a_0 + a_1*x + ... + a_n*x^n = 0 has at least one real root between 0 and 1.
I decided to go at this with induction. The n=1 step is easy; just solve a 2×2 system of equations. But I am stuck on the inductive step. I asked a classmate, and he suggested integrating the polynomial and then using the intermediate value theorem. But integrating the polynomial with respect to x doesn't produce anything that seems relevant or usable. What should I do?
Hint: Let F(x) = a_0*x + (a_1/2)*x^2 + ... + (a_n/(n+1))*x^(n+1). Note that F(0) = F(1) = 0. What does Rolle's theorem tell you?
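Spelling the hint out (the inline formulas above were lost in rendering; this is the standard reconstruction from the stated problem):

```latex
% F is the antiderivative of the polynomial, chosen so that F(0) = 0:
\[
  F(x) \;=\; \sum_{k=0}^{n} \frac{a_k}{k+1}\, x^{k+1}, \qquad F(0) = 0.
\]
% The hypothesis is exactly the statement that F(1) = 0:
\[
  F(1) \;=\; a_0 + \frac{a_1}{2} + \cdots + \frac{a_n}{n+1} \;=\; 0.
\]
% Since F(0) = F(1) = 0 and F is differentiable, Rolle's theorem
% gives a point c in (0, 1) with
\[
  F'(c) \;=\; a_0 + a_1 c + \cdots + a_n c^{n} \;=\; 0,
\]
% i.e. c is the required root of the original polynomial.
```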
Unfortunately this does not actually work as stated. Consider, for example, the fact that the integral of sin(x) over [0, 2*pi] is zero, yet sin(x) is not identically zero on that interval (you can use this to cook up a similar example for polynomials). If you knew that the integrand was nonnegative on the interval this would work (since it would then need to be zero a.e., and thus zero everywhere, since it's continuous).
Best,
Alex
Hi Drexel28
Could you expand on your objection to the statement I made in my earlier post?
The example you give doesn't contradict my statement.
My statement says that the integral is equal to zero either because the integrand is identically zero (which clearly it isn't), or because the part of the area lying above the x-axis equals the part lying below it (which is the case here).
The implication of that is that the integrand is zero at at least one point in the interval (in order for the curve to cross the x-axis).
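To make the discussion concrete, here is a small numeric illustration (not a proof): the coefficients below are a hypothetical example I chose so that a_0 + a_1/2 + a_2/3 = 0, and bisection then locates the promised root of the polynomial in (0, 1).

```python
def p(x, coeffs):
    """Evaluate a_0 + a_1*x + ... + a_n*x^n."""
    return sum(a * x**k for k, a in enumerate(coeffs))

# Example coefficients satisfying the hypothesis: 1 + (-1)/2 + (-1.5)/3 = 0.
coeffs = [1.0, -1.0, -1.5]
assert abs(sum(a / (k + 1) for k, a in enumerate(coeffs))) < 1e-12

# p(0) = 1 > 0 and p(1) = -1.5 < 0, so a sign change brackets a root.
lo, hi = 0.0, 1.0
for _ in range(60):  # plain bisection on the bracketing interval
    mid = (lo + hi) / 2
    if p(lo, coeffs) * p(mid, coeffs) <= 0:
        hi = mid
    else:
        lo = mid
root = (lo + hi) / 2
print(root)  # ~0.5486, i.e. (sqrt(7) - 1) / 3 for this polynomial
```

Any choice of coefficients meeting the condition a_0 + a_1/2 + ... + a_n/(n+1) = 0 can be substituted for `coeffs`; the bracketing step only assumes the polynomial changes sign on [0, 1], which it need not do at the endpoints in general, so this is an illustration for this example rather than a general root-finder.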