# Math Help - Trapezoid method, fixed-point iteration

1. ## Trapezoid method, fixed-point iteration

Hi,

problem:

The trapezoid method for solving ordinary differential equations is,

$y(x_{i+1}) = y(x_i) + h \frac{f(y(x_{i+1})) + f(y(x_i))}{2}$.

Since this method is implicit, we must solve for $y(x_{i+1})$ in every step. If we use fixed-point iteration $x=g(x)$, we will have convergence if $|g'(x)|<1$ for $x \in[y(x_i), y(x_{i+1})]$.

(a) Give the expression for $g$.

(b) Give a condition on $h$ such that the fixed-point iteration converges at each step.

attempt:

(a)

$g(x) = y(x_i) + h \frac{f(x) + f(y(x_i))}{2}$

For each step $i$, we run the fixed-point iteration enough times to reach the desired accuracy.
We take an initial guess $x = y_{i+1}$, compute $g(x)$, and check how large $|x - g(x)|$ is. If it is close enough to $0$, we stop the fixed-point iteration.
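The stepping-and-stopping rule above can be sketched in Python. This is only an illustrative sketch: the function name `trapezoid_step` and the choice of an explicit Euler step as the initial guess are my own, not from the problem.

```python
def trapezoid_step(f, y_i, h, tol=1e-12, max_iter=100):
    """Advance y' = f(y) one step of size h with the implicit trapezoid rule,
    solving for y_{i+1} by the fixed-point iteration x <- g(x)."""
    x = y_i + h * f(y_i)  # initial guess: one explicit Euler step (my choice)
    for _ in range(max_iter):
        g = y_i + h * (f(x) + f(y_i)) / 2.0  # g(x) from part (a)
        if abs(x - g) < tol:                 # stop once |x - g(x)| is small
            return g
        x = g
    return x

# Example: y' = -2y, y(0) = 1, one step of size h = 0.1.
# The trapezoid rule gives y_1 = (1 - h)/(1 + h) exactly for this f.
y1 = trapezoid_step(lambda y: -2.0 * y, 1.0, 0.1)
```

Here $|g'(x)| = h \cdot |{-2}|/2 = 0.1 < 1$, so the inner iteration contracts quickly.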

(b)

I'm not really sure what to make of this question...
Fixed-point methods converge if $|g'(x)|<1$.


$\begin{aligned}
&|g'(x)| = \frac{h}{2}\Big|f'(x) + f'(y(x_i))\Big| < 1\\
&\Rightarrow h < \frac{2}{\Big|f'(x) + f'(y(x_i))\Big|}
\end{aligned}$

I do not see much sense in my last expression.
Thanks!

2. I think your answer for part (a) is correct. However, for part (b), everything except plain ol' x is treated like a constant when you take the derivative, right? I mean, you've already found $y(x_{i}),$ so when you're trying to find $y(x_{i+1}),$ the previous value doesn't change. I think this may simplify your expression a bit.

3. Hi Ackbeet, thanks for replying.

To kick off the fixed-point iteration I guess a value for $x = y_{i+1}$ so that I can calculate $f(x)$. But that means I also know $f'(x)$. I then calculate $g(x)$ and use it as the new $y_{i+1}$.
So it looks like I always know the value of $f'(y_{i+1})$.

But then again, if I think of $x$ just as some unknown I guess I end up with,

$h < \frac{2}{f'(x)}$

This bound still does not make much sense to me though...

4. Well, the fraction on the RHS there, namely, 2/f'(x), is a bound on the step size for your method. What if you always knew that

$h<\dfrac{2}{\max|f'(x)|}?$

Then you would always have a contraction mapping, and the iteration would converge (by the Banach fixed-point theorem). Make sense?
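To see the bound in action, here is a small numerical check (my own illustration, not part of the thread). For $f(y) = -\lambda y$ we have $f'(y) = -\lambda$, so $g'(x) = -h\lambda/2$ and the condition $h < 2/\max|f'(x)|$ becomes $h < 2/\lambda$. The helper name `fixed_point_converges` is hypothetical.

```python
def fixed_point_converges(lam, h, y_i=1.0, n=200):
    """Run x <- g(x) for f(y) = -lam*y; report whether it settles down."""
    f = lambda y: -lam * y
    x = y_i
    for _ in range(n):
        x_new = y_i + h * (f(x) + f(y_i)) / 2.0  # g(x) from part (a)
        if abs(x_new - x) < 1e-12:               # converged
            return True
        if abs(x_new) > 1e12:                    # clearly diverging
            return False
        x = x_new
    return False

# lam = 4, so the bound is h < 2/4 = 0.5.
print(fixed_point_converges(4.0, 0.4))  # h below the bound: |g'| = 0.8 < 1
print(fixed_point_converges(4.0, 0.6))  # h above the bound: |g'| = 1.2 > 1
```

With $h$ below the bound the map is a contraction and the iteration converges; above it, each iterate moves farther from the fixed point.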

5. That does make sense. Thank you!

6. You're welcome. Have a good one!