# Thread: Iterative Methods Convergence (Linear Systems)

1. I am studying for my last qualifying exam and came across two related problems that I can't seem to solve; both concern iterative methods for linear systems.

Given that $A$ is a strictly diagonally dominant matrix, show that the Jacobi and Gauss-Seidel methods converge.

Jacobi:
Let $A = D + R$ where
$D = \begin{bmatrix} a_{11} & 0 & \cdots & 0 \\
0 & a_{22} & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & a_{nn} \\\end{bmatrix}$

$R = \begin{bmatrix} 0 & a_{12} & \cdots & a_{1n} \\
a_{21} & 0 & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \cdots & 0 \\\end{bmatrix}$

Thus,
$Ax = b \Rightarrow (D+R)x = b \Rightarrow x = D^{-1}b - D^{-1}R x$
and the iterative method is given by
$x^{(k+1)} = D^{-1}b - D^{-1}R x^{(k)}$.
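For concreteness, here is a minimal NumPy sketch of this Jacobi iteration; the test matrix, tolerance, and function name are my own illustrative choices, not part of the problem:

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-10, max_iter=500):
    """Jacobi iteration: x^{(k+1)} = D^{-1} b - D^{-1} R x^{(k)}."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    d = np.diag(A)          # diagonal entries a_ii (the matrix D)
    R = A - np.diag(d)      # off-diagonal part R
    for _ in range(max_iter):
        x_new = (b - R @ x) / d     # componentwise D^{-1}(b - R x)
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new
        x = x_new
    return x

# A strictly diagonally dominant example: 4 > 1+1, 5 > 1+2, 3 > 0+1
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])
b = np.array([6.0, 8.0, 4.0])
x = jacobi(A, b)
print(np.allclose(A @ x, b))  # True
```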

Gauss-Seidel:
Let $A = L + U$ where
$L = \begin{bmatrix} a_{11} & 0 & \cdots & 0 \\
a_{21} & a_{22} & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \cdots & a_{nn} \\\end{bmatrix}$

$U = \begin{bmatrix} 0 & a_{12} & \cdots & a_{1n} \\
0 & 0 & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & 0 \\\end{bmatrix}$

Thus,
$Ax = b \Rightarrow (L+U)x = b \Rightarrow x = L^{-1}b - L^{-1}U x$
and the iterative method is given by
$x^{(k+1)} = L^{-1}b - L^{-1}U x^{(k)}$.
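A corresponding sketch of Gauss-Seidel: rather than forming $L^{-1}$ explicitly, one solves $L x^{(k+1)} = b - U x^{(k)}$ by forward substitution, which amounts to using the already-updated components within each sweep. Again the example matrix and function name are my own choices:

```python
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=500):
    """Gauss-Seidel: solve L x^{(k+1)} = b - U x^{(k)} by forward substitution."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    for _ in range(max_iter):
        x_new = x.copy()
        for i in range(n):
            # strictly-lower part uses updated entries x_new[:i] (the L sweep),
            # strictly-upper part uses old entries x[i+1:] (the U x^{(k)} term)
            s = A[i, :i] @ x_new[:i] + A[i, i+1:] @ x[i+1:]
            x_new[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new
        x = x_new
    return x

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])
b = np.array([6.0, 8.0, 4.0])
x = gauss_seidel(A, b)
print(np.allclose(A @ x, b))  # True
```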

I have proven that an iterative method
$x^{(k+1)} = Q x^{(k)} + d$
converges if $\|Q\| < 1$, where $\|\cdot\|$ is some matrix norm. Thus, all I would have to do is prove that $\|-D^{-1}R\| < 1$ and $\|-L^{-1}U\| < 1$.

However, I have a theorem that says the following are equivalent:
1. $x^{(k+1)} = Q x^{(k)} + d$ converges for every initial guess $x^{(0)}$
2. $\rho(Q) < 1$

So if I can prove that the spectral radius is less than one I have shown that the method converges. I don't really care which way I prove this, but I can't figure out how to do it.
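As a numerical sanity check (not a proof), one can compute the spectral radii of both iteration matrices for a small strictly diagonally dominant matrix; the example matrix here is an arbitrary choice of mine:

```python
import numpy as np

# Strictly diagonally dominant: 4 > 1+1, 5 > 1+2, 3 > 0+1
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])
D = np.diag(np.diag(A))     # diagonal part
R = A - D                   # off-diagonal part (Jacobi splitting A = D + R)
L = np.tril(A)              # lower triangle incl. diagonal (A = L + U)
U = np.triu(A, 1)           # strictly upper triangle

Q_jacobi = -np.linalg.solve(D, R)   # -D^{-1} R
Q_gs = -np.linalg.solve(L, U)       # -L^{-1} U

def rho(M):
    """Spectral radius: largest eigenvalue magnitude."""
    return max(abs(np.linalg.eigvals(M)))

print(rho(Q_jacobi) < 1, rho(Q_gs) < 1)  # True True
```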

2. Take a look at Kreyszig's Introductory Functional Analysis with Applications, pages 307 to 312. That's one approach (Banach fixed-point theorem).

3. I have that book, so I will check it when I get to my office. Thanks.

4. Well, I was speaking to some other students, and they pointed out that since the theorem only requires some matrix norm, I can use the row-sum (infinity) norm. This makes the proof quite easy.

Since the matrix $A$ is strictly diagonally dominant we have
$\left|a_{ii}\right| > \sum_{\substack{j=1 \\ j\ne i}}^n \left|a_{ij}\right| \quad \text{for all } i,$
so
$\left\|D^{-1}R\right\|_{\infty} = \max_{i=1,\dots,n} \sum_{\substack{j=1 \\ j\ne i}}^n \frac{\left|a_{ij}\right|}{\left|a_{ii}\right|} < 1.$
Thus the Jacobi method is convergent. The Gauss-Seidel case is similar.
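This row-sum bound is easy to check numerically. The matrix below is an arbitrary strictly diagonally dominant example of mine; the infinity norm of $D^{-1}R$ is exactly the largest row sum $\sum_{j\ne i} |a_{ij}|/|a_{ii}|$:

```python
import numpy as np

# Strictly diagonally dominant: 4 > 1+1, 5 > 1+2, 3 > 0+1
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])
D = np.diag(np.diag(A))
R = A - D
Q = np.linalg.solve(D, R)   # D^{-1} R

# Row sums of |Q|: [0.5, 0.6, 1/3]; the infinity norm is their maximum
row_sums = np.sum(np.abs(Q), axis=1)
print(np.linalg.norm(Q, np.inf))  # 0.6, which is < 1 as the bound predicts
```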