I'm finding the root of some function in a given interval using linear interpolation. I'm required to find it to 2 d.p. There's an example in the book where they're also finding a root using linear interpolation, and they note that they get 2 successive approximations which are the same to the required degree of accuracy, and so they stop, saying that's the solution.
However, in a later exercise the same thing arose: two successive approximations that, when rounded to the required degree of accuracy, gave the same number. So I stopped, but I got the wrong solution; their worked solution continued for another two iterations without explaining why. I had obtained two successive approximations that both gave 1.09 to 2 decimal places, and concluded that the root correct to 2 d.p. was 1.09. However, had I continued, I would have found that the root was in fact 1.10 to 2 d.p. (as the next two iterations show).
So I guess my question is, when do I stop? When I get 2 successive approximations the same, 3, 4,...?
Nov 9th 2009, 05:26 AM
I would stop when two consecutive iterations gave me the same result to 3 decimal places, i.e. one decimal place more than the accuracy required. That way any remaining error is much less likely to affect the answer once you round it to 2 d.p.
You can be as strict about it as you want, but I don't see any reason for carrying more than one extra decimal.
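As a sketch of that stopping rule in Python (the function and interval below are made up for illustration, since the exercise's function isn't given; the names `false_position`, `target_dp`, and `guard_dp` are mine):

```python
def false_position(f, a, b, target_dp=2, guard_dp=None, max_iter=1000):
    """Find a root of f in [a, b] by linear interpolation (false position).

    Stops when two consecutive approximations agree after rounding to
    guard_dp decimal places (one more than required, per the heuristic
    above), then returns the root rounded to target_dp.
    """
    if guard_dp is None:
        guard_dp = target_dp + 1
    prev = None
    for _ in range(max_iter):
        # The chord through (a, f(a)) and (b, f(b)) crosses the x-axis here.
        x = b - f(b) * (b - a) / (f(b) - f(a))
        if prev is not None and round(x, guard_dp) == round(prev, guard_dp):
            return round(x, target_dp)
        prev = x
        # Keep whichever sub-interval still brackets the sign change.
        if f(a) * f(x) < 0:
            b = x
        else:
            a = x
    raise RuntimeError("no convergence within max_iter iterations")


# Illustrative example: x^3 - 2x - 5 has a root near 2.0946 in [2, 3].
print(false_position(lambda x: x**3 - 2*x - 5, 2, 3))  # 2.09
```

Bear in mind that agreement of successive iterates is only a heuristic, not an error bound; for a guarantee you would iterate until the bracketing interval itself is narrower than half a unit in the last required decimal place.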