
Math Help - Iterative Methods Convergence (Linear Systems)

  1. #1
    Senior Member
    Joined
    Mar 2009
    Posts
    378

    Iterative Methods Convergence (Linear Systems)

    I am studying for my last Qualifying exam and came across two related problems that I can't seem to solve; both concern iterative methods for linear systems.

    Given that A is a strictly diagonally dominant matrix, show that the Gauss-Seidel and Jacobi methods converge.

    Jacobi:
    Let A = D + R where
    D = \begin{bmatrix} a_{11} & 0 & \cdots & 0 \\
0 & a_{22} & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & a_{nn} \end{bmatrix}

    R = \begin{bmatrix} 0 & a_{12} & \cdots & a_{1n} \\
a_{21} & 0 & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \cdots & 0 \end{bmatrix}
    Thus,
     Ax = b \Rightarrow (D+R)x = b \Rightarrow x = D^{-1}b - D^{-1}R x
    and the iterative method is given by
    x^{(k+1)} = D^{-1}b - D^{-1}R x^{(k)}.
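    As a quick sanity check, the Jacobi splitting above can be sketched in NumPy (the function name and test matrix are my own, chosen to be strictly diagonally dominant):

    ```python
    import numpy as np

    def jacobi(A, b, x0=None, tol=1e-10, max_iter=1000):
        """Jacobi iteration: x^(k+1) = D^{-1} b - D^{-1} R x^(k)."""
        D_diag = np.diag(A)               # the diagonal entries a_ii
        R = A - np.diag(D_diag)           # off-diagonal part
        x = np.zeros_like(b) if x0 is None else x0
        for _ in range(max_iter):
            x_new = (b - R @ x) / D_diag  # apply D^{-1} entrywise
            if np.linalg.norm(x_new - x, np.inf) < tol:
                return x_new
            x = x_new
        return x

    # a strictly diagonally dominant test matrix
    A = np.array([[4.0, 1.0, 1.0],
                  [1.0, 5.0, 2.0],
                  [0.0, 1.0, 3.0]])
    b = np.array([6.0, 8.0, 4.0])
    x = jacobi(A, b)
    print(np.allclose(A @ x, b, atol=1e-8))  # True
    ```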

    Gauss-Seidel:
    Let A = L + U where
    L = \begin{bmatrix} a_{11} & 0 & \cdots & 0 \\
a_{21} & a_{22} & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix}

    U = \begin{bmatrix} 0 & a_{12} & \cdots & a_{1n} \\
0 & 0 & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & 0 \end{bmatrix}
    Thus,
     Ax = b \Rightarrow (L+U)x = b \Rightarrow x = L^{-1}b - L^{-1}U x
    and the iterative method is given by
    x^{(k+1)} = L^{-1}b - L^{-1}U x^{(k)}.
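    The same kind of sketch for the Gauss-Seidel splitting (again, names and test data are my own; in practice one would use a forward-substitution triangular solve rather than a general `solve`):

    ```python
    import numpy as np

    def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=1000):
        """Gauss-Seidel: x^(k+1) = L^{-1} b - L^{-1} U x^(k)."""
        L = np.tril(A)        # lower triangle, including the diagonal
        U = np.triu(A, k=1)   # strictly upper triangle
        x = np.zeros_like(b) if x0 is None else x0
        for _ in range(max_iter):
            # solve the lower-triangular system L x_new = b - U x
            x_new = np.linalg.solve(L, b - U @ x)
            if np.linalg.norm(x_new - x, np.inf) < tol:
                return x_new
            x = x_new
        return x

    A = np.array([[4.0, 1.0, 1.0],
                  [1.0, 5.0, 2.0],
                  [0.0, 1.0, 3.0]])
    b = np.array([6.0, 8.0, 4.0])
    x = gauss_seidel(A, b)
    print(np.allclose(A @ x, b, atol=1e-8))  # True
    ```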

    I have proven that an iterative method of the form
    x^{(k+1)} = Q x^{(k)} + d
    converges if \|Q\| < 1, where \|\cdot\| is an induced matrix norm. Thus, all I would have to do is prove that  \|-D^{-1}R\| < 1 and  \|-L^{-1}U\| < 1.

    However, I also have a theorem that says the following are equivalent:
    1. x^{(k+1)} = Q x^{(k)} + d converges for every initial guess
    2. \rho(Q) < 1

    So if I can prove that the spectral radius is less than one I have shown that the method converges. I don't really care which way I prove this, but I can't figure out how to do it.
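    For intuition, the spectral-radius condition can be checked numerically on a random strictly diagonally dominant matrix (my own throwaway construction, not part of a proof):

    ```python
    import numpy as np

    # build a random strictly diagonally dominant matrix
    rng = np.random.default_rng(0)
    n = 5
    A = rng.uniform(-1.0, 1.0, (n, n))
    # make each |a_ii| exceed the sum of the other entries in its row
    A[np.arange(n), np.arange(n)] = np.abs(A).sum(axis=1) + 1.0

    D = np.diag(np.diag(A))
    R = A - D
    L = np.tril(A)
    U = np.triu(A, k=1)

    Q_jacobi = -np.linalg.solve(D, R)   # -D^{-1} R
    Q_gs = -np.linalg.solve(L, U)       # -L^{-1} U

    def rho(M):
        """Spectral radius: largest eigenvalue magnitude."""
        return np.max(np.abs(np.linalg.eigvals(M)))

    print(rho(Q_jacobi) < 1, rho(Q_gs) < 1)  # True True
    ```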

  2. #2
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    Take a look at Kreyszig's Introductory Functional Analysis with Applications, pages 307 to 312. That's one approach (Banach fixed-point theorem).

  3. #3
    Senior Member
    Joined
    Mar 2009
    Posts
    378
    I have that book, so I will check it when I get to my office. Thanks.

  4. #4
    Senior Member
    Joined
    Mar 2009
    Posts
    378
    Well, I was speaking to some other students and they said that, since the theorem only requires some matrix norm, I can use the row-sum (infinity) norm. This makes the proof quite easy.

    Since the matrix A is strictly diagonally dominant we have
    \left|a_{ii}\right| > \sum_{\stackrel{j=1}{j\ne i}}^n \left|a_{ij}\right|,
    so
    \left\|D^{-1}R\right\|_{\infty} = \max_{i=1,\dots,n} \sum_{\stackrel{j=1}{j\ne i}}^n \frac{\left|a_{ij}\right|}{\left|a_{ii}\right|} < 1.
    Thus the Jacobi method is convergent. The Gauss-Seidel case is similar.
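    This bound is easy to verify numerically as well (a NumPy sketch with an arbitrary strictly diagonally dominant matrix of my own making):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 6
    A = rng.uniform(-1.0, 1.0, (n, n))
    # enforce strict diagonal dominance row by row
    A[np.arange(n), np.arange(n)] = np.abs(A).sum(axis=1) + 0.5

    D_diag = np.diag(A)
    R = A - np.diag(D_diag)

    # max_i sum_{j != i} |a_ij| / |a_ii| ...
    row_sums = np.abs(R).sum(axis=1) / np.abs(D_diag)
    # ... equals the infinity norm of D^{-1} R (max absolute row sum)
    norm_inf = np.linalg.norm(np.diag(1.0 / D_diag) @ R, np.inf)

    print(np.isclose(norm_inf, row_sums.max()))  # True
    print(norm_inf < 1)                          # True
    ```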

