
Error Estimation
Let f in C[a,b] be a function whose derivative exists on (a,b).
Suppose f is to be evaluated at d in (a,b), but instead of computing the actual value f(d), the approximate value f~(d) is the actual value of f at d + epsilon; that is, f~(d) = f(d + epsilon).
Use the mean value theorem to estimate the absolute error f(d) - f~(d) and the relative error (f(d) - f~(d)) / f(d).

Re: Error Estimation
State the mean value theorem, and your solution should be evident. It tells you exactly how to find the mean value of a function between two points. (Why not choose d - epsilon and d + epsilon?) Now you have your range of error. Find the mean value. Multiply by the distance. You have your estimate of the absolute error. Then divide by f(d), and you have the relative error as well.
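For reference, here it is with the interval left generic, since choosing the interval is really the whole exercise: if f is continuous on [a, b] and differentiable on (a, b), the mean value theorem gives some c in (a, b) with

f'(c) = (f(b) - f(a)) / (b - a),

or equivalently f(b) - f(a) = f'(c) * (b - a): the mean value times the distance, which is exactly the recipe above.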

Re: Error Estimation
I guess I don't understand the theorem. How can I use it when I don't know the function?
f'(c) = (f(d - epsilon) - f(d + epsilon)) / ((d - epsilon) - (d + epsilon))

Re: Error Estimation
Do you know of a point in that interval whose derivative will be close to that mean value? What about the midpoint? That ought to be close if you get the distance small enough (as your epsilon approaches zero). So take the derivative at the midpoint of d - epsilon and d + epsilon. That should be a decent estimate of your average value, right? A quick numerical check of this is sketched below.
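Here is that check in Python; f = sin is only a hypothetical stand-in, since the problem never fixes f:

import math

# f = sin and f' = cos are stand-ins; the exercise leaves f unspecified.
f, fprime = math.sin, math.cos

d = 1.0
for eps in (0.1, 0.01, 0.001):
    a, b = d - eps, d + eps                  # the example range from above
    avg_slope = (f(b) - f(a)) / (b - a)      # the mean value f'(c) guaranteed by the theorem
    mid_slope = fprime((a + b) / 2)          # derivative at the midpoint (here, at d)
    print(f"eps={eps}: mean value={avg_slope:.6f}, f'(midpoint)={mid_slope:.6f}")

The two numbers agree to more and more digits as epsilon shrinks, which is the point of the suggestion.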

Re: Error Estimation
I understand that the midpoint approximates the point from the theorem as epsilon gets small, but I don't understand this problem. The midpoint of d - epsilon and d + epsilon is d, but what I need is f(d) - f(d + epsilon). I am confused about whether you are talking about f or f~.

Re: Error Estimation
Adjust your range so the midpoint is whatever you think it should be. I gave you an example of a range, not the answer to your problem. You can clearly see what the range should be. So pick the midpoint of that range and take the derivative of f there.
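To close the loop, a sketch of the finished estimate with the range taken as d to d + epsilon, whose midpoint is d + epsilon/2; f = sin is again only a hypothetical stand-in:

import math

# f = sin is a stand-in for the unspecified f in the problem.
f, fprime = math.sin, math.cos

d, eps = 1.0, 1e-3
true_abs = abs(f(d) - f(d + eps))           # |f(d) - f~(d)|, the actual error
est_abs = eps * abs(fprime(d + eps / 2))    # epsilon * |f'(midpoint)|
print(f"absolute error: true={true_abs:.3e}, estimate={est_abs:.3e}")
print(f"relative error: true={true_abs / abs(f(d)):.3e}, estimate={est_abs / abs(f(d)):.3e}")

As epsilon approaches zero, the midpoint d + epsilon/2 tends to d itself, so the estimates reduce to epsilon * |f'(d)| for the absolute error and epsilon * |f'(d)| / |f(d)| for the relative error.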