Given that the integral of e^x dx between a and b exists, evaluate it using the formula 1 + r + r^2 + ... + r^n = (1 - r^(n+1))/(1 - r), where r ≠ 1.
I tried using delta x = (b-a)/n and x_i = a + i*(delta x), but when I get to taking the limit I get stuck. I'm not even sure if that's what I'm supposed to do... but I'm not sure of any other way to go about it. Please help!
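As a quick sanity check on that setup (this is my own sketch, not from the problem), the left-endpoint Riemann sum with delta x = (b-a)/n and x_i = a + i*(delta x) can be computed directly and compared against e^b - e^a:

```python
import math

# A sketch (names are my own): the left-endpoint Riemann sum for e^x on [a, b]
# with n equal subintervals, computed term by term.
def left_riemann_sum(a, b, n):
    dx = (b - a) / n
    # x_i = a + i*dx for i = 0, ..., n-1 (left endpoints)
    return sum(math.exp(a + i * dx) for i in range(n)) * dx

a, b = 0.0, 1.0                       # example interval
exact = math.exp(b) - math.exp(a)     # what the sum should approach
approx = left_riemann_sum(a, b, 10_000)
print(approx, exact)
```

As n grows the printed values agree to more and more digits, which suggests the setup itself is fine and the difficulty is only in the limit.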
The course is Real Analysis. And we didn't evaluate any Riemann sums like that, but I'm guessing that must be what they want me to do. The question just says:
"Given that the integral between a and b of e^x dx exists, evaluate it using the formula: 1 + r + r^2 + ... + r^n = (1 - r^(n+1))/(1 - r)."
I think that yahoo link might be something like what I need to do, but mine is just generally between a and b.
And the definition of the integral I have is: for every E > 0 there's a d > 0 such that ||P|| < d implies |sigma - L| < E ... but I'm not really sure whether I should apply that to this question.
Do you guys know how I should go about solving this?
Divide the interval from a to b into n equal parts, each of length delta x = (b-a)/n. The value of the function at the left end of each sub-interval is e^a, e^(a + delta x), e^(a + 2 delta x), etc.
In other words, the integral is the limit of delta x * (e^a + e^(a + delta x) + ... + e^(a + (n-1) delta x)) = delta x * e^a * (1 + e^(delta x) + (e^(delta x))^2 + ... + (e^(delta x))^(n-1)). Do you see what "r" must be?
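To check that hint numerically (again my own sketch, with r = e^(delta x) as the hint suggests), the same Riemann sum can be evaluated in closed form with the geometric series:

```python
import math

# A sketch: the left-endpoint Riemann sum for e^x on [a, b], written in closed
# form via the geometric series with ratio r = e^(delta x).
def riemann_sum_geometric(a, b, n):
    dx = (b - a) / n
    r = math.exp(dx)                  # the "r" in the hint
    # dx * e^a * (1 + r + ... + r^(n-1)) = dx * e^a * (1 - r^n) / (1 - r)
    return dx * math.exp(a) * (1 - r**n) / (1 - r)

a, b = 0.0, 1.0
print(riemann_sum_geometric(a, b, 10_000))   # approaches e^b - e^a
```

This agrees with summing term by term, so the only step left is taking the limit of the closed form as n -> oo.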
delta x = (b-a)/n -> 0 as n -> oo, and e^0 = 1
so that would just be multiplying the top and bottom by 1?
lim n->oo delta x * e^(a + delta x) * (1 - e^(delta x * n + delta x)) / (1 - e^(delta x))
but then even when I factor e^(delta x) out of the top and bottom, I'm still left with
((1/e^(delta x)) - 1) on the bottom.
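If it helps, here is one way the limit can be finished (a sketch of my own, assuming the Riemann sum carries its factor of delta x, and using n * delta x = b - a):

```latex
% Sketch: finishing the limit, with \Delta x = (b-a)/n so n\Delta x = b-a.
\lim_{n\to\infty} \Delta x\, e^{a+\Delta x}\,
    \frac{1 - e^{(n+1)\Delta x}}{1 - e^{\Delta x}}
  = \lim_{\Delta x \to 0} e^{a+\Delta x}\,
    \bigl(1 - e^{\,b-a+\Delta x}\bigr)\,
    \frac{\Delta x}{1 - e^{\Delta x}}
  = e^{a}\bigl(1 - e^{\,b-a}\bigr)(-1)
  = e^{b} - e^{a}
% using e^{\Delta x} = 1 + \Delta x + O(\Delta x^2),
% so \Delta x / (1 - e^{\Delta x}) \to -1 as \Delta x \to 0.
```

The key step is that the troublesome (1 - e^(delta x)) on the bottom pairs with the factor of delta x on top, and that ratio tends to -1.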