# Can't make simple iteration work

• Aug 3rd 2010, 11:06 PM
Mollier
Hi,

I consider $f(x)=e^x-3x-1=0$ for $x\in [1,2]$ and rewrite the equation as $x=\ln(3x+1)$.
I then rewrite $f(x)=0$ in the equivalent form $x-g(x)=0$, where $g(x)=\ln(3x+1)$.

Since $g'(x)=\frac{3}{3x+1}$ I have that $g'(1)=\frac{3}{4}$ and $g'(2)=\frac{3}{7}$.

By the mean value theorem I know that there exists a number $\eta\in [1,2]$ such that
$|g(x)-g(y)|=|g'(\eta)||x-y|$.

Since $g''(x)=-\frac{9}{(3x+1)^2}$, $g$ is monotonic decreasing and so
$g'(1)\geq g'(\eta) \geq g'(2)$.

$g$ is then a contraction with $L=g'(1)=\frac{3}{4}$.

I then look at the sequence $x_{k+1}=g(x_k)$ and write a small piece of code that prints out the first 20 iterates.
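The code itself isn't in the post; a minimal Python sketch (my own reconstruction, with assumed names) that would print pairs like the ones below:

```python
import math

def g(x):
    # Fixed-point map g(x) = ln(3x + 1), derived from e^x - 3x - 1 = 0
    return math.log(3 * x + 1)

x = 1.0  # starting point x_0 = 1
print((0, x))
for k in range(1, 21):
    x = g(x)  # x_{k+1} = g(x_k)
    print((k, x))
```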

(0, 1)
(1, 1.3862943611198906)
(2, 1.6407200993500939)
(3, 1.778701297541748)
(4, 1.846264051572333)
(5, 1.8777524625894917)
(6, 1.8920959939166664)
(7, 1.8985621417401013)
(8, 1.9014635019263759)
(9, 1.9027626111520601)
(10, 1.9033437520231324)
(11, 1.9036036091052895)
(12, 1.9037197823293694)
(13, 1.9037717150438356)
(14, 1.9037949295626633)
(15, 1.9038053065444669)
(16, 1.9038099450615822)
(17, 1.9038120184745679)
(18, 1.9038129452869204)
(19, 1.9038133595703104)
(20, 1.9038135447541562)

The maximum number of iterations needed to reach an accuracy of $|x_k-\xi|\leq \epsilon$, where $x_k$ is the $k$th iterate and $\xi$ is the fixed point of $g$, is given by:

$k_0(\epsilon) \leq \left[ \frac{\ln|x_1-x_0| - \ln(\epsilon(1-L))}{\ln(1/L)} \right] +1$

If I calculate this with $x_1 = 1.386294$, $x_0=1$, $L=3/4$ and $\epsilon=0.5\cdot 10^{-4}$ (to get an accuracy of 4 decimal digits), I get approximately 36 iterations.
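Evaluating the bound directly in Python (a quick check of my own, not part of the book's derivation):

```python
import math

L = 3 / 4                    # contraction constant
x0 = 1.0
x1 = math.log(3 * x0 + 1)    # x_1 = g(x_0) = ln(4)
eps = 0.5e-4                 # accuracy of 4 decimal digits

# A priori estimate:
# k_0(eps) <= floor( (ln|x1 - x0| - ln(eps * (1 - L))) / ln(1/L) ) + 1
k0 = math.floor(
    (math.log(abs(x1 - x0)) - math.log(eps * (1 - L))) / math.log(1 / L)
) + 1
print(k0)
```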

But from my calculation of the sequence it seems as if the number of iterations needed for an accuracy of 4 decimal digits is not more than about 15.

Funny thing is, in the book I am reading they use $g(x)=\ln(2x+1)$ and that seems to work.

Sorry for the long post, but I hope someone can help me out.
• Aug 4th 2010, 02:48 AM
Ackbeet

Why are you using the second derivative of $g$ to check the monotonicity of $g$? That information is contained in the first derivative. The function $g$ is actually monotone increasing. I think you meant to say that $g'$ is monotone decreasing, which it is.

So you have proven that $g$ is a contraction mapping.

At the end, I think you're wondering why the actual number of iterations needed to converge is smaller than the theoretical number of iterations needed to converge. Is that correct? If so, I would say that it's simply a fact that iteration schemes sometimes converge faster than theory says they will. All your theory tells you is that you won't need to go over a certain number of iterations, which you didn't. So your theoretical bound on the number of iterations isn't very sharp. That's all. It doesn't strike me as a shortcoming. If you want your theoretical bound to be a little sharper, try a few different starting points, and see how that affects iteration numbers. Sometimes if you start farther away from the fixed point, the number of iterations will go up (though I'm guessing not by a whole lot!).
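That suggestion is easy to try. A small sketch of my own (not from the thread) counts the iterations needed from a few starting points, stopping when the a posteriori bound $\frac{L}{1-L}|x_k - x_{k-1}| \leq \epsilon$ is met:

```python
import math

def g(x):
    return math.log(3 * x + 1)

def iterations_needed(x0, eps=0.5e-4, L=3/4):
    """Count iterations until the a posteriori error bound
    L/(1-L) * |x_k - x_{k-1}| drops below eps."""
    x, k = x0, 0
    while True:
        x_new = g(x)
        k += 1
        if L / (1 - L) * abs(x_new - x) <= eps:
            return k
        x = x_new

for x0 in (1.0, 1.5, 2.0):
    print(x0, iterations_needed(x0))
```

All of these counts come in well under the a priori bound, which is exactly the point: the theoretical number is a worst-case guarantee, not a prediction.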

Does this make sense?
• Aug 4th 2010, 02:52 AM
Mollier
Hi,

Yes, I meant that g'(x) is monotonically decreasing.

I did not know that the theoretical maximum number of iterations could deviate this much from the actual one.
I mean, if I want an accuracy of 6 decimal digits I need about 20 iterations. The theory gives me 50-something. Can this really be so?
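To make that concrete (my own quick check, not from the book), redoing both counts for $\epsilon = 0.5\cdot 10^{-6}$:

```python
import math

def g(x):
    return math.log(3 * x + 1)

L, eps = 3 / 4, 0.5e-6       # 6 decimal digits
x0 = 1.0
x1 = g(x0)

# Theoretical a priori bound
k0 = math.floor(
    (math.log(abs(x1 - x0)) - math.log(eps * (1 - L))) / math.log(1 / L)
) + 1

# Actual count, stopping on the a posteriori bound L/(1-L)*|x_k - x_{k-1}| <= eps
x, k = x0, 0
while True:
    x_new = g(x)
    k += 1
    if L / (1 - L) * abs(x_new - x) <= eps:
        break
    x = x_new

print(k0, k)  # theoretical bound vs. actual count
```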

Thanks!
• Aug 4th 2010, 02:59 AM
Ackbeet
Quote:

I mean, if I want an accuracy of 6 decimal digits I need about 20 iterations. The theory gives me 50-something. Can this really be so?
Absolutely. For my Ph.D. dissertation, I proved a ridiculous bound on what's called the "chirp" in a fiber optic cable. I showed that solitons cease to exist when you chirp a signal enough. This result was known experimentally for a long time, but I proved the first theoretical bound on it. The bound I proved was extremely non-sharp. We're talking orders of magnitude here. So, in reality, solitons disappeared long before I said they would.

But this is the way math goes. Hopefully someone will come along and prove a better bound: like the better mousetrap.

Regards.
• Aug 4th 2010, 03:01 AM
Mollier
Kind of surprising that such an unsharp bound is in use. I guess it's better than no bound at all.
Thank you.
• Aug 4th 2010, 03:03 AM
Ackbeet
Quote:

I guess it's better than no bound at all.
Exactly. Any bound at all can limit your search space and save time. Of course better bounds are, well, better.

You're welcome, and have a good one!