I have a question about percent error in a value. If you're estimating some value by a number, what is the formula for the percent error? I always thought it was the absolute value of the difference between the accepted value and the estimated value, divided by the accepted value. But the textbook I'm reading claims that if we estimate a value, the percent error is that same difference divided by the estimate.
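To write the two versions out explicitly (this is my own notation, with $A$ for the accepted value and $E$ for the estimate):

$$\text{error}_1 = \frac{|A - E|}{A} \qquad \text{versus} \qquad \text{error}_2 = \frac{|A - E|}{E}$$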

For example, if we estimate 2000 with 1500, then the first way gives an error of 0.25.

But the other way gives an error of 0.333...
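Writing out the arithmetic (my own working), that's:

$$\frac{|2000 - 1500|}{2000} = \frac{500}{2000} = 0.25 \qquad \text{versus} \qquad \frac{|2000 - 1500|}{1500} = \frac{500}{1500} = 0.333\ldots$$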

Does anybody have any idea why there seem to be two different methods for finding the error? I always thought error was computed the first way, but I came across this "other method" in a textbook that was showing how much error you have when you estimate $\sin x \approx x$. Sorry if this isn't in the right place, but does anybody have any comments?
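P.S. In case it helps, here is how I read the textbook's $\sin x \approx x$ situation, treating $\sin x$ as the accepted value and $x$ as the estimate (my reading, not a quote from the book):

$$\frac{|x - \sin x|}{\sin x} \qquad \text{versus} \qquad \frac{|x - \sin x|}{x}$$

The second one is the "divide by the estimate" version that the textbook seems to be using.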