1. ## Restoring Quadratic Convergence of Newton's Method

I am studying for finals and am looking over previous problems that I missed significant points on. My instructor doesn't comment on why things are incorrect and he doesn't discuss homework, so I am on my own to figure out why I am wrong. Anyway, I would like some help figuring out where I went wrong.

Prove that if $\displaystyle r$ is a zero of multiplicity $\displaystyle k$ of the function $\displaystyle f$, then quadratic convergence in Newton's iteration will be restored by making this modification
$\displaystyle x_{n+1} = x_n - k \frac{f(x_n)}{f'(x_n)}$

Proof:
If $\displaystyle x^*$ is a zero of multiplicity $\displaystyle k$ then $\displaystyle f(x)$ can be rewritten as
$\displaystyle f(x) = (x - x^*)^k q(x)$
where $\displaystyle q(x^*) \ne 0,$ and so we have $\displaystyle f^{(k)}(x^*) \ne 0.$ Writing $\displaystyle f(x)$ and $\displaystyle f'(x)$ via their Taylor expansions about $\displaystyle x^*$ (Lagrange remainder form),
$\displaystyle f(x) = \frac{f^{(k)}(\xi)}{k!}(x - x^*)^k, \quad f'(x) = \frac{f^{(k)}(\eta)}{(k - 1)!}(x - x^*)^{k - 1}$
where $\displaystyle \xi, \eta$ are between $\displaystyle x$ and $\displaystyle x^*.$ Substituting into the iteration function $\displaystyle g(x) = x - k\frac{f(x)}{f'(x)}$ (note the factor $\displaystyle k\,(k-1)!/k! = 1$), we have
$\displaystyle g(x) = x - \frac{f^{(k)}(\xi)}{f^{(k)}(\eta)}(x - x^*)$
$\displaystyle \lim_{x \to x^*} g'(x) = 1 - \lim_{x \to x^*}\frac{d}{dx}\left[ \frac{f^{(k)}(\xi)}{f^{(k)}(\eta)}(x-x^*)\right]$
This is where I went wrong. I originally thought that the derivative was just $\displaystyle 1 - \frac{f^{(k)}(\xi)}{f^{(k)}(\eta)}$, but $\displaystyle \xi, \eta$ depend on $\displaystyle x$ so that is not true.
$\displaystyle \lim_{x \to x^*} g'(x) = 1 - \lim_{x \to x^*}\left[\frac{d}{dx} \left(\frac{f^{(k)}(\xi)}{f^{(k)}(\eta)}\right)(x-x^*) + \frac{f^{(k)}(\xi)}{f^{(k)}(\eta)}\right]$
$\displaystyle = 1 - \lim_{x \to x^*}\left[\frac{d}{dx} \left(\frac{f^{(k)}(\xi)}{f^{(k)}(\eta)}\right)(x-x^*)\right] - \lim_{x \to x^*}\frac{f^{(k)}(\xi)}{f^{(k)}(\eta)}$
$\displaystyle = 1 - \lim_{x \to x^*}\left[\frac{d}{dx} \left(\frac{f^{(k)}(\xi)}{f^{(k)}(\eta)}\right)(x-x^*)\right] - 1$
$\displaystyle = -\lim_{x \to x^*}\left[\frac{d}{dx} \left(\frac{f^{(k)}(\xi)}{f^{(k)}(\eta)}\right)\right](x-x^*)$
$\displaystyle = -\lim_{x \to x^*}\left[\frac{\frac{d}{dx}f^{(k)}(\xi)\, f^{(k)}(\eta) - f^{(k)}(\xi)\,\frac{d}{dx}f^{(k)}(\eta)}{\left[f^{(k)}(\eta)\right]^2}\right](x-x^*).$
This is supposed to go to zero, but I don't really see how. I suppose it has something to do with $\displaystyle \frac{d}{dx}f^{(k)}(\xi) \to \frac{d}{dx}f^{(k)}(\eta)$ and $\displaystyle f^{(k)}(\xi) \to f^{(k)}(\eta)$ as $\displaystyle x \to x^*$, so that the numerator of the bracketed term vanishes in the limit.

Any help is appreciated, also if you have a different way to do this I will be glad to see it. Thank you in advance.
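Not part of the proof, but the key fact can be sanity-checked numerically: with the modified iteration function $\displaystyle g(x) = x - k\frac{f(x)}{f'(x)}$, the difference quotient $\displaystyle \frac{g(x^* + h) - g(x^*)}{h}$ should tend to $\displaystyle g'(x^*) = 0$, while for unmodified Newton it tends to $\displaystyle 1 - 1/k \ne 0$. The test function $f(x) = (x-1)^3 e^x$ (zero of multiplicity $k = 3$ at $x^* = 1$) is my own choice, not from the problem; for it, $f/f'$ simplifies to $(x-1)/(x+2)$.

```python
# Sanity check (my own example, not from the problem): f(x) = (x-1)^3 * exp(x)
# has a zero of multiplicity k = 3 at x* = 1, and f/f' = (x-1)/(x+2).
k = 3

def g_plain(x):            # unmodified Newton iteration function: x - f/f'
    return x - (x - 1) / (x + 2)

def g_mod(x):              # modified iteration function: x - k*f/f'
    return x - k * (x - 1) / (x + 2)

# Both iteration functions fix the root: g(1) = 1, so (g(1+h) - 1)/h is a
# forward difference quotient for g'(1).
for h in (1e-1, 1e-2, 1e-3, 1e-4):
    dq_plain = (g_plain(1 + h) - 1) / h   # tends to g'(1) = 1 - 1/k = 2/3
    dq_mod = (g_mod(1 + h) - 1) / h       # tends to g'(1) = 0
    print(f"h={h:g}  plain={dq_plain:.6f}  modified={dq_mod:.6f}")
```

Since $g'(x^*) = 0$ for the modified iteration, the first-order term of the error expansion vanishes and the error is driven by the $\frac{1}{2}g''(x^*)\epsilon_n^2$ term, which is exactly quadratic convergence.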

2. If the error in the $\displaystyle n^{th}$ iterate is denoted by $\displaystyle \epsilon_{n},$ then Newton's method may be written

$\displaystyle x^{*}+\epsilon_{n+1}=x^{*}+\epsilon_n-\frac{f(x^{*}+\epsilon_n)}{f'(x^{*}+\epsilon_n)}$.

The Taylor expansions for the two functions are

$\displaystyle f(x^{*}+\epsilon_n)=f(x^{*})+\epsilon_nf'(x^{*})+\epsilon^{2}_nf''(x^{*})/2!+\dots\quad(1)$
and
$\displaystyle f'(x^{*}+\epsilon_n)=f'(x^{*})+\epsilon_nf''(x^{*})+\epsilon^{2}_nf'''(x^{*})/2!+\dots\quad(2)$

If the zero is of order 1 then $\displaystyle f(x^{*})=0$ so when you divide (1) by (2) you get $\displaystyle \epsilon_n$ plus higher order terms.
That means that the $\displaystyle \epsilon_n$'s on the RHS of the method cancel and you get second order convergence.
If, though, the zero is of order 2, then $\displaystyle f(x^{*})=f'(x^{*})=0,$ so when the division is carried out the leading term is $\displaystyle \epsilon_n/2$; in that case the $\displaystyle \epsilon_n$'s no longer cancel out and you are left with first order convergence.
To restore the quadratic convergence the fractional term in the method is multiplied by 2.
Now just follow the reasoning 'up the line' to a $\displaystyle k^{th}$ order zero.
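A quick numerical illustration of the argument above (my own example, not from the original post): for $f(x) = (e^x - 1)^2$, which has a zero of multiplicity 2 at $x^* = 0$, plain Newton's error ratio $\epsilon_{n+1}/\epsilon_n$ should settle at the $1/2$ predicted by the $\epsilon_n/2$ leading term, while doubling the correction makes $\epsilon_{n+1}/\epsilon_n^2$ settle at a constant instead.

```python
import math

f = lambda x: (math.exp(x) - 1) ** 2              # double root at x* = 0
fp = lambda x: 2 * math.exp(x) * (math.exp(x) - 1)

def errors(k, x=0.5, steps=8):
    """Iterate x <- x - k*f/f' and record the error |x - 0| after each step."""
    out = [abs(x)]
    for _ in range(steps):
        x -= k * f(x) / fp(x)
        out.append(abs(x))
        if out[-1] < 1e-9:                        # stop before round-off dominates
            break
    return out

plain = errors(k=1)
mod = errors(k=2)
print([b / a for a, b in zip(plain, plain[1:])])       # -> 1/2  (linear)
print([b / a ** 2 for a, b in zip(mod, mod[1:])])      # -> 0.5  (quadratic)
```

That the quadratic ratio also happens to settle near $0.5$ here is specific to this example function, not a general fact.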

3. I guess I am missing something and it is probably obvious, but I don't see where the quadratic convergence comes in. Quadratic convergence is defined as follows: a sequence $\displaystyle x_n$ with limit $\displaystyle x^*$ is said to converge quadratically if
$\displaystyle \lim_{n\to \infty} \frac{\|x_{n+1} - x^*\|}{\|x_n - x^*\|^2} = M$ for some $\displaystyle M > 0$. However, I don't see how that applies here.

4. $\displaystyle x_{n+1}=x^{*}+\epsilon_{n+1},\quad x_n=x^{*}+\epsilon_n,$ so your definition is equivalent to $\displaystyle \epsilon_{n+1}=M\epsilon_n^{2},$ (in the limit as n tends to infinity).
If, in the limit, $\displaystyle \epsilon_{n+1}=M\epsilon_n,$ then the convergence is linear.
It seems that you are looking for a slightly more rigorous analysis !
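To make the two definitions concrete (a toy sequence of my own, not from the thread): take $\displaystyle \epsilon_{n+1} = \epsilon_n^2$ exactly. The linear ratio $\epsilon_{n+1}/\epsilon_n$ keeps shrinking toward zero (there is no finite nonzero linear rate), while the quadratic ratio $\epsilon_{n+1}/\epsilon_n^2$ is pinned at $M = 1$.

```python
# Toy sequence with epsilon_{n+1} = epsilon_n^2 exactly, so M = 1 in the
# quadratic-convergence definition quoted above.
eps = [0.5]
for _ in range(4):
    eps.append(eps[-1] ** 2)

linear_ratios = [b / a for a, b in zip(eps, eps[1:])]     # 0.5, 0.25, ... -> 0
quad_ratios = [b / a ** 2 for a, b in zip(eps, eps[1:])]  # all exactly 1.0
print(linear_ratios)
print(quad_ratios)
```

Each squaring roughly doubles the number of correct digits, which is the practical signature of quadratic convergence.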

5. $\displaystyle \epsilon_{n+1}=\epsilon_n-k\frac{f(x_n)}{f'(x_n)}$
Taking the Taylor series of $\displaystyle f(x_n) \text{ and } f'(x_n) \text{ about } x^*$ (two terms each, since a single term of $\displaystyle f'$ is not enough to capture the $\displaystyle \epsilon_n^2$ contribution) gives
$\displaystyle f(x_n) = \frac{f^{(k)}(x^*)}{k!}\epsilon_n^k + \frac{f^{(k+1)}(\xi)}{(k+1)!}\epsilon_n^{k+1}$
$\displaystyle f'(x_n) = \frac{f^{(k)}(x^*)}{(k-1)!}\epsilon_n^{k-1} + \frac{f^{(k+1)}(\eta)}{k!}\epsilon_n^{k}$.
Substituting these Taylor series into the iteration and cancelling $\displaystyle \epsilon_n^{k-1}$ we have
$\displaystyle \epsilon_{n+1} = \epsilon_n - \frac{\frac{f^{(k)}(x^*)}{(k-1)!}\epsilon_n + \frac{k\,f^{(k+1)}(\xi)}{(k+1)!}\epsilon_n^{2}}{\frac{f^{(k)}(x^*)}{(k-1)!} + \frac{f^{(k+1)}(\eta)}{k!}\epsilon_n} = \frac{\left(\frac{f^{(k+1)}(\eta)}{k!} - \frac{k\,f^{(k+1)}(\xi)}{(k+1)!}\right)\epsilon_n^{2}}{\frac{f^{(k)}(x^*)}{(k-1)!} + \frac{f^{(k+1)}(\eta)}{k!}\epsilon_n}$.
Now taking the limit as $\displaystyle x_n \to x^*$ (so that $\displaystyle \xi, \eta \to x^*$) and using $\displaystyle \frac{1}{k!} - \frac{k}{(k+1)!} = \frac{1}{(k+1)!}$ gives
$\displaystyle \lim_{n \to \infty}\frac{|\epsilon_{n+1}|}{|\epsilon_n|^2} = \frac{(k-1)!}{(k+1)!}\,\frac{|f^{(k+1)}(x^*)|}{|f^{(k)}(x^*)|} = \frac{1}{k(k+1)}\,\frac{|f^{(k+1)}(x^*)|}{|f^{(k)}(x^*)|}.$
(For $\displaystyle k = 1$ this reduces to the familiar Newton constant $\displaystyle \frac{|f''(x^*)|}{2|f'(x^*)|}$.)

So, I believe this is the solution.
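One way to check a candidate limit like this is numerically (my own example, not part of the original post): for $f(x) = (x-2)^2(x+1)$, a zero of multiplicity $k = 2$ at $x^* = 2$ with $f''(2) = 6$ and $f'''(2) = 6$, the observed ratios $|\epsilon_{n+1}|/|\epsilon_n|^2$ settle near $\displaystyle \frac{1}{k(k+1)}\frac{|f'''(2)|}{|f''(2)|} = \frac16$; checking the constant this way is a good guard against an algebra slip.

```python
# Numerical check of the limiting ratio (my own example): f(x) = (x-2)^2 (x+1)
# has a double root (k = 2) at x* = 2, with f''(2) = 6 and f'''(2) = 6,
# so 1/(k(k+1)) * |f'''(2)| / |f''(2)| = 1/6.
def f(x):
    return (x - 2) ** 2 * (x + 1)     # factored form avoids cancellation near 2

def fp(x):
    return 3 * x * (x - 2)            # derivative: 2(x-2)(x+1) + (x-2)^2, factored

k, x = 2, 2.5
errs = [abs(x - 2)]
while errs[-1] > 1e-7:                # stop before floating-point round-off dominates
    x -= k * f(x) / fp(x)             # modified Newton step
    errs.append(abs(x - 2))

ratios = [b / a ** 2 for a, b in zip(errs, errs[1:])]
print(ratios)                         # settles near 1/6 ≈ 0.1667
```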

6. I have decided that I don't like the proof above. Does anyone have something to add? I am studying for my qualifying exams and I want to make sure I have this one down.