Calculate the full error series for the midpoint rule, where:

$I = \int f(x)\,dx$ is the exact value and $I^* = \int f^*(x)\,dx$ is the approximate value, and the local error is given by $I - I^*$. Use a generic Taylor series for $f(x)$, i.e. $f(x) = A + Bx + Cx^2 + \cdots$

Then find an expression for $f^*$ and evaluate

$$\int_{x_i}^{x_{i+1}} f(x)\,dx \;-\; \int_{x_i}^{x_{i+1}} f^*(x)\,dx.$$
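For concreteness, here is the setup I believe those pages use. Writing $h = x_{i+1} - x_i$ and centring the generic Taylor series at the midpoint $x_m = \tfrac{1}{2}(x_i + x_{i+1})$ (so that $A = f(x_m)$, $B = f'(x_m)$, $C = \tfrac{1}{2}f''(x_m)$, and so on):

$$f(x) = A + B(x - x_m) + C(x - x_m)^2 + D(x - x_m)^3 + E(x - x_m)^4 + \cdots$$

The midpoint rule takes $f^*(x) = f(x_m) = A$ on the interval, so with the substitution $u = x - x_m$:

$$I - I^* = \int_{x_i}^{x_{i+1}} \bigl(f(x) - A\bigr)\,dx = \int_{-h/2}^{h/2} \bigl(Bu + Cu^2 + Du^3 + Eu^4 + \cdots\bigr)\,du.$$

If I have this right, the odd powers of $u$ vanish over the symmetric interval, leaving only the even terms:

$$I - I^* = \frac{C}{12}h^3 + \frac{E}{80}h^5 + \cdots = \frac{f''(x_m)}{24}h^3 + \frac{f^{(4)}(x_m)}{1920}h^5 + \cdots,$$

which seems to match the error series those pages quote.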

I know that there are many web pages with the answer, but I don't follow how they actually arrive at it.