# Thread: Newton's method help?

1. ## Newton's method help?

Ok, so Newton's method is stated to have quadratic convergence when $\displaystyle f'(x) \neq 0$ (does not equal zero), along with a couple of other conditions.

My question: how can we improve the convergence of Newton's method when $\displaystyle f'(x)=0$?

Newton's method:
$\displaystyle x_n = x_{n-1} - \frac{f(x_{n-1})}{f'(x_{n-1})}$

As you can see, if the derivative of f is equal to zero then we are screwed. Ha, and I don't remember how to do it. Thank you.
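For concreteness, here is a minimal Python sketch of the iteration (the function names, tolerance, and the explicit zero-derivative check are my own choices, not from the thread):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Plain Newton's method: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        d = fprime(x)
        if d == 0:
            # This is exactly the failure mode discussed above.
            raise ZeroDivisionError("f'(x) = 0; Newton step is undefined")
        x = x - fx / d
    return x

# Example: solve x^2 - 2 = 0 starting from x0 = 1.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```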

2. Use a different method if you have to.

The Bisection Method is guaranteed to converge, though it converges very slowly, and the Secant Method is more reliable, but again, slightly slower than Newton's Method.
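For reference, a bare-bones bisection sketch (my own illustration; it assumes f(a) and f(b) have opposite signs, which is what guarantees convergence):

```python
def bisection(f, a, b, tol=1e-12):
    """Halve the bracketing interval [a, b] until it is smaller than tol."""
    fa = f(a)
    while b - a > tol:
        m = (a + b) / 2
        fm = f(m)
        if fa * fm <= 0:
            b = m          # root lies in [a, m]
        else:
            a, fa = m, fm  # root lies in [m, b]
    return (a + b) / 2

# Example: sqrt(2) as the root of x^2 - 2 on [0, 2].
root = bisection(lambda x: x * x - 2, 0.0, 2.0)
```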

3. Newton's method essentially uses a tangent line at $\displaystyle x_{n-1}$ to approximate where the function would be zero for $\displaystyle x_n$.
When the slope of the tangent line is 0, it is completely horizontal. There is no way to make a guess for where the function is zero based on this information.
For one thing, you don't even know whether the solution should be smaller or larger.

You can either perturb the point a little bit and try Newton's method again, or use another method (secant, mid-point, etc.).

4. Well, basically I'm trying to study for a final, and I'm pretty sure, according to our guide, our prof is specifically looking for our understanding of Newton's method. So the question is more theoretical. I know when you have higher multiplicities you can use the modified Newton's method, which looks something like
$\displaystyle x - \frac{f(x)\,f'(x)}{[f'(x)]^2 - f(x)\,f''(x)}$

However, I am new to all this and I may be way off. What would you do if you had the derivative of f equal to zero?
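As a sanity check (my own sketch, not from the course): the iteration above can be read as plain Newton's method applied to $u(x) = f(x)/f'(x)$, which turns a multiple root of f into a simple root of u. A hypothetical implementation:

```python
def modified_newton(f, f1, f2, x0, tol=1e-10, max_iter=50):
    """Newton applied to u(x) = f(x)/f'(x); handles multiple roots of f."""
    x = x0
    for _ in range(max_iter):
        fx, f1x, f2x = f(x), f1(x), f2(x)
        denom = f1x ** 2 - fx * f2x
        if denom == 0:
            break
        step = fx * f1x / denom
        x = x - step
        if abs(step) < tol:
            break
    return x

# Double root at x = 1: f(x) = (x - 1)^2. Plain Newton is only linear
# here, while this iteration lands on the root in a single step.
root = modified_newton(lambda x: (x - 1) ** 2,
                       lambda x: 2 * (x - 1),
                       lambda x: 2.0,
                       x0=3.0)
```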

5. The question given is as follows:

How can we improve the convergence rate when $\displaystyle f'(x^*)=0$?

Now, am I safe in saying that under the standard Newton's method this does not converge, because of the simple fact that dividing by 0 is undefined? I understand that there are other methods out there, but the question is about Newton's method specifically!

Thanks for your input so far! I have to say I love this forum for the simple fact that the people who view it tend to be very well educated and helpful! Glad I found it.

6. A note about the quadratic convergence: if you look at the proof, it assumes the initial guess is in a sufficiently close neighborhood of the solution (and that f''(x) exists at that point). If it does not converge quickly enough, you can always make another guess. If you guess closely enough (or luckily enough), it will converge quadratically.

7. I did look at the proof, thanks to Wikipedia. However, that only tells me that it would not converge at a quadratic rate. And I am aware that if you make a bad guess then you can have more linear convergence, or even divergence. But again, I'm asking how we can improve the convergence when the first derivative of f is equal to zero.

8. I think you mean something like the higher-order iterations described here: Newton's method and high order iterations

Most of the time, "Newton's method" refers to the first-order version. I think the higher-order generalization is called Householder's method: http://en.wikipedia.org/wiki/Householder%27s_method

This gives better convergence most of the time, but it will never be foolproof if you make a bad guess.

9. Though that seems like what I'm talking about, I don't remember it... And it seems that Householder's method is also susceptible to failure when the derivative of f is equal to zero... unless I'm reading it wrong.

10. If you check the method for d=2 and simplify, you will see that it no longer has f'(x) alone in the denominator.
Notice that $\displaystyle (1/f)^{(d)}$ denotes the d-th derivative of $1/f$.

For d=2, I get something that looks like what you described:

$\displaystyle x_{n+1} = x_n - \frac{2f(x_n)f'(x_n)}{2f'(x_n)^2 - f(x_n)f''(x_n)}$
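For concreteness, here is a small sketch (my own, with hypothetical names) of the d = 2 iteration above, which is Halley's method, applied to a simple root:

```python
def halley(f, f1, f2, x0, tol=1e-12, max_iter=20):
    """Householder's method of order d = 2 (Halley's method):
    x_{n+1} = x_n - 2 f f' / (2 f'^2 - f f'')."""
    x = x0
    for _ in range(max_iter):
        fx, f1x, f2x = f(x), f1(x), f2(x)
        denom = 2 * f1x ** 2 - fx * f2x
        if denom == 0 or abs(fx) < tol:
            break
        x = x - 2 * fx * f1x / denom
    return x

# Example: sqrt(2) as the root of x^2 - 2, starting from x0 = 1.
root = halley(lambda x: x * x - 2, lambda x: 2 * x, lambda x: 2.0, 1.0)
```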

11. But couldn't we assume that if f'(x)=0 then f''(x) would also be zero? Then wouldn't the modified Newton's method not work... or am I missing something? Also, the derivative of f, or f', is on the bottom, and if f' is zero then f'' is also zero, therefore the denominator is zero.

12. $\displaystyle f'(x_n) = 0$ does not mean $\displaystyle f'(x) = 0$ for all x, and $\displaystyle f''(x_n)$ may not be zero.
A simple example: consider $f(x) = -x^2 + 1$; then $f'(0) = 0$, but $f''(0) = -2$.

But you are correct that f'(x) = 0 does not help much for d = 2, because the numerator would then be zero, meaning the iteration gives you back the same value. It will not help unless you use d >= 3. Why? Think about a parabola: when the slope is 0, convexity will still not tell you which direction your solution lies in.
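To see the stall numerically, here is a small check (my own illustration) of the d = 2 step at a point where f'(x_n) = 0, using the parabola f(x) = -x^2 + 1 from the example above:

```python
f  = lambda x: -x * x + 1   # roots at x = +1 and x = -1
f1 = lambda x: -2 * x       # first derivative
f2 = lambda x: -2.0         # second derivative

x = 0.0                           # the point where f'(x) = 0
num = 2 * f(x) * f1(x)            # numerator of the d = 2 step: 2 f f'
den = 2 * f1(x) ** 2 - f(x) * f2(x)  # denominator: 2 f'^2 - f f''
step = num / den                  # denominator is nonzero, but the step is 0,
                                  # so x_{n+1} = x_n: the iteration stalls
```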

13. Well, our prof states it as f(x*)=0 and I don't know what he means by x*... Could he mean all x in X? I think he may mean the actual root!

14. x* is normally used to denote a fixed point: the point such that if $\displaystyle x_n = x^*$, then $\displaystyle x_{n+1} = x^*$.
And yes, all roots are fixed points.

15. Ok, for my own understanding, let me post the whole question.
Assume f(x*)=0 and we have an initial approximation $\displaystyle x_0$ which is close enough to x*. What is the convergence rate for Newton's method?
If $\displaystyle f'(x^*) \neq 0$ (does not equal zero)? .................... I got this part.
If $\displaystyle f'(x^*)=0$? ........................ This is where I'm stuck.
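One standard observation, offered as intuition rather than as the intended answer to the exam question: at a root of multiplicity m, plain Newton's method converges only linearly, with rate (m-1)/m. A quick experiment with f(x) = x^2, where x* = 0 and f'(x*) = 0:

```python
# Newton's method on f(x) = x^2, whose root x* = 0 has multiplicity 2,
# so f'(x*) = 0. Each step is x - x^2/(2x) = x/2: the error only halves,
# i.e. linear convergence with rate 1/2 instead of the usual quadratic rate.
x = 1.0
errors = []
for _ in range(5):
    x = x - (x * x) / (2 * x)
    errors.append(x)
# each entry in errors is exactly half the previous one
```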
