We assume that $f(x)$, $f'(x)$ and $f''(x)$ are continuous on $[a,b]$, and that for some $r \in [a,b]$ we have $f(r) = 0$ and $f'(r) \neq 0$. We show that if $x_0$ is chosen close enough to $r$, the iterates

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$

converge to $r$.

I tried to use Taylor's expansion of $f$ centered at $x_n$ and evaluated at $r$ (using $f(r) = 0$), and I got to this expression

$$r - x_{n+1} = -\frac{f''(c_n)}{2f'(x_n)}\,(r - x_n)^2,$$

where $c_n$ is some point between $x_n$ and $r$. I guess I want the right-hand side to tend to $0$ to get the answer, but I am not sure how to prove this.
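As a sanity check on the expression (not a proof), one can watch the errors numerically: if the expansion is right, the error should roughly square at each step. A minimal Python sketch, where the test function $f(x) = x^2 - 2$ and root $\sqrt{2}$ are my own example, not part of the problem:

```python
import math

def newton(f, fprime, x0, n_steps):
    """Run n_steps of the iteration x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    xs = [x0]
    for _ in range(n_steps):
        x = xs[-1]
        xs.append(x - f(x) / fprime(x))
    return xs

# Example choice: f(x) = x^2 - 2 on [1, 2], root r = sqrt(2).
# f, f', f'' are continuous and f'(r) = 2r != 0, so the hypotheses hold.
f = lambda x: x * x - 2.0
fp = lambda x: 2.0 * x
r = math.sqrt(2.0)

xs = newton(f, fp, x0=2.0, n_steps=5)
for k, x in enumerate(xs):
    print(f"x_{k} = {x:.15f}, error = {abs(x - r):.2e}")
# If the Taylor-based expression is correct, the errors e_k = |x_k - r|
# should behave like e_{k+1} ~ C * e_k^2 with C = |f''(r)/(2 f'(r))|.
```

The error printout shows the number of correct digits roughly doubling per step, which is exactly the quadratic behavior the squared term $(r - x_n)^2$ predicts.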