Hi,

I'm finding the root of some function in a given interval using linear interpolation. I'm required to find it to 2 d.p. There's an example in the book where they're also finding a root using linear interpolation, and they note that they get 2 successive approximations which are the same to the required degree of accuracy, and so they stop, saying that's the solution.

However, in a later exercise the same thing arose: two successive approximations that, when rounded to the required degree of accuracy, gave the same number. So I stopped, but I got the wrong solution; their worked solution continued for another two iterations without explaining why. I got

$\displaystyle x_3 = 1.086 \approx 1.09$ to 2 decimal places

$\displaystyle x_4 = 1.092 \approx 1.09$ to 2 decimal places

and concluded that the root, correct to 2 d.p., was 1.09. However, had I continued, I would have found that it was in fact 1.10 to 2 d.p. (as the next two iterations show).

So I guess my question is: when do I stop? When I get two successive approximations the same? Three? Four?
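(For context, here's a sketch of what I'm doing. Rather than comparing successive iterates, a check I've seen suggested is to test whether $f$ changes sign across the interval of values that round to the candidate, e.g. to confirm a root is 1.09 to 2 d.p., check that $f(1.085)$ and $f(1.095)$ have opposite signs. The function below is a made-up example, not the one from my exercise.)

```python
def f(x):
    # Hypothetical example function with a root near 0.347
    return x**3 - 3*x + 1

def root_to_2dp(f, a, b, max_iter=100):
    """Linear interpolation (false position) on a bracket [a, b]
    with f(a) and f(b) of opposite sign."""
    for _ in range(max_iter):
        # Linear interpolation between (a, f(a)) and (b, f(b))
        x = a - f(a) * (b - a) / (f(b) - f(a))
        r = round(x, 2)
        # Sign-change check: every number in [r - 0.005, r + 0.005]
        # rounds to r, so a sign change here pins the root to 2 d.p.
        if f(r - 0.005) * f(r + 0.005) < 0:
            return r
        # Otherwise keep the half of the bracket containing the root
        if f(a) * f(x) < 0:
            b = x
        else:
            a = x
    return None
```

With this check, agreement between successive iterates never enters into it; the stopping condition is a property of the function itself, so it can't stop too early the way my attempt did.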

Thanks

Stonehambey