We assume that $\displaystyle f$, $\displaystyle f'$ and $\displaystyle f''$ are continuous on $\displaystyle [a,b]$, and that for some $\displaystyle \alpha \in (a,b)$ we have $\displaystyle f( \alpha )= 0 $ and $\displaystyle f'( \alpha ) \neq 0 $ . We show that if $\displaystyle x_{0}$ is chosen close enough to $\displaystyle \alpha $ , the iterates

$\displaystyle x_{n+1} = x_{n}- \frac{f(x_{n})}{f'(x_{n})}$

converge to $\displaystyle \alpha$ .
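As a numerical sanity check, the iteration is easy to run and its quadratic convergence is visible in how fast the error shrinks. This is just an illustration with a function and starting point of my own choosing ($f(x) = x^2 - 2$ on $[1,2]$, so $\alpha = \sqrt{2}$), not part of the problem statement:

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until successive
    iterates are within tol of each other."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example (my choice): f(x) = x^2 - 2, root alpha = sqrt(2)
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.5)
```

Starting from $x_0 = 1.5$, the iterates reach machine precision in a handful of steps, consistent with the error being squared at each iteration.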

I tried using Taylor's expansion of $\displaystyle f( \alpha )$ (centered at $\displaystyle x_{n} $ ), and I arrived at this expression

$\displaystyle \lim_{n \rightarrow \infty} (\alpha -x_{n+1})= \lim_{n \rightarrow \infty} \left( - \frac{1}{2}\, f''(c)\, \frac{( \alpha - x_{n} )^{2}}{f'(x_{n})} \right)$

where $\displaystyle c$ lies between $\displaystyle \alpha$ and $\displaystyle x_{n}$ (so $\displaystyle c$ depends on $\displaystyle n$),

and I guess I want the right-hand side to be $\displaystyle 0$ in order to conclude, but I am not sure how to prove this.
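One standard way to finish (a sketch of the usual argument, which may or may not be the one intended here) is to avoid taking limits directly and instead turn the Taylor identity into an error recursion, then show the error contracts. The constants $M$ and the interval $I$ below are introduced for the sketch:

```latex
% Since f'(\alpha) \neq 0 and f' is continuous, pick a closed interval
% I = [\alpha - \delta, \alpha + \delta] \subset [a,b] on which f' does
% not vanish, and set
%   M = \frac{\max_{I} |f''|}{2 \min_{I} |f'|}.
% The Taylor identity then gives, for x_n \in I,
\[
  |\alpha - x_{n+1}| \;\le\; M \, |\alpha - x_n|^{2}.
\]
% If x_0 is chosen close enough to \alpha that x_0 \in I and
% M\,|\alpha - x_0| < 1, induction yields
\[
  M\,|\alpha - x_n| \;\le\; \bigl(M\,|\alpha - x_0|\bigr)^{2^{n}}
  \;\longrightarrow\; 0,
\]
% so x_n stays in I for all n and x_n \to \alpha.
```

In particular, the right-hand side of your limit expression is indeed forced to $0$, since $|\alpha - x_n| \to 0$ while $f''(c)/f'(x_n)$ stays bounded on $I$.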