Hi,
I am having some problems understanding the proof of the following theorem.

Theorem:
Suppose that g is a real-valued function, defined and continuous on a bounded closed interval $\displaystyle [a,b]$ of the real line, and assume that $\displaystyle g(x)\in [a,b]$ for all $\displaystyle x\in [a,b]$.
Let $\displaystyle \xi = g(\xi)\in [a,b]$ be a fixed point of $\displaystyle g$, and assume that $\displaystyle g$ has a continuous derivative in some neighborhood of $\displaystyle \xi$ with $\displaystyle |g'(\xi)|<1$. Then the sequence $\displaystyle (x_k)$ defined by $\displaystyle x_{k+1}=g(x_k)$, $\displaystyle k\geq 0$, converges to $\displaystyle \xi$ as $\displaystyle k\rightarrow\infty$, provided that $\displaystyle x_0$ is sufficiently close to $\displaystyle \xi$.
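Just to check that I understand the setup, here is a quick numerical sketch of the iteration with an example of my own (not from the book): $\displaystyle g(x)=\cos x$ on $\displaystyle [0,1]$, where $\displaystyle |g'(x)|=|\sin x|\leq \sin 1<1$.

import math

def fixed_point_iteration(g, x0, tol=1e-12, max_iter=100):
    # Iterate x_{k+1} = g(x_k) and stop once successive iterates agree to within tol.
    x = x0
    for k in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

# My own toy example, not from the book: g(x) = cos(x) maps [0, 1] into itself
# and |g'(x)| = |sin(x)| <= sin(1) < 1 there, so the theorem should apply.
xi, iters = fixed_point_iteration(math.cos, 0.5)
print(xi, iters)  # approaches the fixed point xi = cos(xi) ~ 0.739085

Numerically this does converge to $\displaystyle \xi\approx 0.739085$, so my trouble is only with the proof below, not with the statement.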

Proof:
By hypothesis, there exists $\displaystyle h>0$ such that $\displaystyle g'$ is continuous in the interval
$\displaystyle [\xi-h, \xi+h]$. Since $\displaystyle |g'(\xi)|<1$ we can find a smaller interval
$\displaystyle I_{\delta}=[\xi-\delta,\xi+\delta]$, where $\displaystyle 0<\delta\leq h$, such that $\displaystyle |g'(x)|\leq L$ in this interval, with $\displaystyle L<1$.
To do so, take $\displaystyle L=\frac{1}{2}(1+|g'(\xi)|)$ and then choose $\displaystyle \delta\leq h$ such that,
$\displaystyle |g'(x)-g'(\xi)|\leq \frac{1}{2}(1-|g'(\xi)|) $
for all $\displaystyle x$ in $\displaystyle I_{\delta}$; this is possible since $\displaystyle g'$ is continuous at $\displaystyle \xi$.

I will stop there, because I am already not sure what is going on.
Why should I take $\displaystyle L=\frac{1}{2}(1+|g'(\xi)|)$?
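The only thing I can see is that, if I combine the two displayed bounds with the triangle inequality, this choice does give, for all $\displaystyle x\in I_{\delta}$,

$\displaystyle |g'(x)|\leq |g'(\xi)|+|g'(x)-g'(\xi)|\leq |g'(\xi)|+\frac{1}{2}(1-|g'(\xi)|)=\frac{1}{2}(1+|g'(\xi)|)=L<1,$

so this particular $\displaystyle L$ does work, but I still don't see why one would pick exactly the midpoint between $\displaystyle |g'(\xi)|$ and $\displaystyle 1$ rather than something else.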
By the way, in this book $\displaystyle L$ is used to denote the "contraction factor", i.e. a constant such that
$\displaystyle |g(x)-g(y)|\leq L|x-y|$ for all $\displaystyle x,y\in[a,b]$.
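I am guessing the derivative bound is meant to give exactly this property on $\displaystyle I_{\delta}$ via the Mean Value Theorem: for any $\displaystyle x,y\in I_{\delta}$ there is some $\displaystyle \eta$ between them with

$\displaystyle |g(x)-g(y)|=|g'(\eta)|\,|x-y|\leq L|x-y|,$

so that $\displaystyle g$ is a contraction on $\displaystyle I_{\delta}$. Is that the intended reading?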

Thanks.