
Jacobi and Gauss-Seidel Methods



Consider a system of \(n\) equations in \(n\) variables \(x_1,\ldots,x_n\): \(A\overrightarrow{x}=\overrightarrow{b}.\)

Throughout this section assume
(1) there is a unique solution and
(2) \(a_{ii}\neq 0\) for \(i=1,\ldots, n\).

The Jacobi method constructs a sequence \(\{\overrightarrow{x}^{(k)}\}\) to approximate the unique solution of \(A\overrightarrow{x}=\overrightarrow{b}\). First decompose \(A\) into \[A=D+R,\] where \(D=\text{diag}(a_{11},\ldots,a_{nn})\) is the diagonal part of \(A\) and \(R=A-D\) is the remainder part of \(A\). Since \(a_{ii}\neq 0\) for \(i=1,\ldots, n\), \(\det D=a_{11}\cdots a_{nn}\neq 0\) and hence \(D^{-1}\) exists. \[\begin{align*} A\overrightarrow{x}=\overrightarrow{b} &\implies (D+R)\overrightarrow{x}=\overrightarrow{b}\\ &\implies D\overrightarrow{x}=\overrightarrow{b}-R\overrightarrow{x}\\ &\implies \overrightarrow{x}=D^{-1}\big(\overrightarrow{b}-R\overrightarrow{x}\big) \end{align*}\] The iteration formula for the Jacobi method is \[\overrightarrow{x}^{(k+1)}=D^{-1}\big(\overrightarrow{b}-R\overrightarrow{x}^{(k)}\big),\; k=0,1,2,\ldots\] If we write \(\overrightarrow{x}^{(k)}=[x^{(k)}_1,\ldots, x^{(k)}_n]^T\), then the above iteration formula gives \[\boxed{x^{(k+1)}_i=\frac{1}{a_{ii}}\Bigg[b_i-\sum^n_{\substack{j=1\\j\neq i}}a_{ij} x^{(k)}_j \Bigg],\; i=1,\ldots,n.}\]
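As a sketch, the boxed Jacobi formula translates almost line for line into code. The function name `jacobi` and the fixed iteration count are illustrative choices, not part of the method itself.

```python
def jacobi(A, b, x0, iterations):
    """Jacobi sweeps: x_i <- (b_i - sum_{j != i} a_ij * x_j) / a_ii,
    where every update reads only the previous iterate."""
    n = len(b)
    x = list(x0)
    for _ in range(iterations):
        # build the new iterate entirely from the old one
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x
```

Because each sweep reads only the previous iterate, the \(n\) component updates are independent of one another and could even be computed in parallel.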


Example. Do three iterations by the Jacobi method with \(\overrightarrow{x}^{(0)}=[0,0,0]^T\) to approximate the unique solution of the following system of equations. \[\begin{array}{rcrcrcr} 7x_1 &+ &x_2 &- &3x_3 &=&6\\ x_1 &- &11x_2 &+ &6x_3 &=&3\\ 2x_1 &+ &x_2 &+ &9x_3 &=&5 \end{array}\] Solution. Note that \(a_{11}\), \(a_{22}\), and \(a_{33}\) are all nonzero. \[\begin{array}{rcrcrcr} x_1^{(k+1)} &=&\displaystyle\frac{1}{7}\Big(6 &-& x_2^{(k)} &+& 3x_3^{(k)} \Big)\\ x_2^{(k+1)} &=&\displaystyle\frac{1}{-11}\Big(3 &-&x_1^{(k)}&-&6x_3^{(k)} \Big)\\ x_3^{(k+1)} &=&\displaystyle\frac{1}{9}\Big(5 &-&2x_1^{(k)} &-&x_2^{(k)}\Big) \end{array}\] \[\begin{array}{|c|r|r|r|} \hline k & x_1^{(k)} & x_2^{(k)} &x_3^{(k)}\\ \hline 0 & 0 & 0 &0 \\ \hline 1 & 0.85 & -0.27 &0.55 \\ \hline 2 & 1.13 & 0.10 &0.39 \\ \hline 3 & 1.01 & 0.04 &0.29 \\ \hline \end{array} \] The actual solution is \((x_1,x_2,x_3)=\left(1,0,\frac{1}{3}\right)\).
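For this particular system, the three Jacobi sweeps can be reproduced with a simultaneous tuple assignment: all three right-hand sides are evaluated before any variable is updated, which is exactly the simultaneous update Jacobi requires. The full-precision iterates agree with the two-decimal table entries only approximately.

```python
# Jacobi iterates for the example system, starting from x^(0) = (0, 0, 0)
x1 = x2 = x3 = 0.0
history = []
for k in range(3):
    # tuple assignment: every right-hand side uses the OLD values
    x1, x2, x3 = ((6 - x2 + 3 * x3) / 7,
                  (3 - x1 - 6 * x3) / (-11),
                  (5 - 2 * x1 - x2) / 9)
    history.append((x1, x2, x3))
```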

Note that in the Jacobi iteration, by the time we compute \(x^{(k+1)}_i\) for \(i>1\), we have already calculated \(x^{(k+1)}_1,\ldots, x^{(k+1)}_{i-1}\). So we can replace \(x^{(k)}_1,\ldots, x^{(k)}_{i-1}\) by these updated values \(x^{(k+1)}_1,\ldots, x^{(k+1)}_{i-1}\), respectively. That is how we get the iteration formula of the Gauss-Seidel method: \[\boxed{x^{(k+1)}_i=\frac{1}{a_{ii}}\Bigg[b_i -\sum_{j=1}^{i-1}a_{ij} x^{(k+1)}_j -\sum_{j=i+1}^{n}a_{ij} x^{(k)}_j\Bigg],\; i=1,\ldots,n.}\]
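A sketch of the boxed Gauss-Seidel formula: updating the vector in place means that when \(x_i\) is recomputed, positions \(1,\ldots,i-1\) already hold the \((k+1)\)-st values, so a single line inside the loop realizes both sums. The function name `gauss_seidel` is an illustrative choice.

```python
def gauss_seidel(A, b, x0, iterations):
    """Gauss-Seidel sweeps: identical to Jacobi except that x is
    overwritten in place, so new values are used as soon as they exist."""
    n = len(b)
    x = list(x0)
    for _ in range(iterations):
        for i in range(n):
            # x[j] for j < i is already the (k+1)-st value
            x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
    return x
```

The in-place update also halves the storage relative to Jacobi: no second vector is needed.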


Example. Do three iterations by the Gauss-Seidel method with \(\overrightarrow{x}^{(0)}=[0,0,0]^T\) to approximate the unique solution of the following system of equations. \[\begin{array}{rcrcrcr} 7x_1 &+ &x_2 &- &3x_3 &=&6\\ x_1 &- &11x_2 &+ &6x_3 &=&3\\ 2x_1 &+ &x_2 &+ &9x_3 &=&5 \end{array}\] Solution. Note that \(a_{11}\), \(a_{22}\), and \(a_{33}\) are all nonzero. \[\begin{array}{rcrcrcr} x_1^{(k+1)} &=&\displaystyle\frac{1}{7}\Big(6 &-& x_2^{(k)} &+& 3x_3^{(k)} \Big)\\ x_2^{(k+1)} &=&\displaystyle\frac{1}{-11}\Big(3 &-&x_1^{(k+1)}&-&6x_3^{(k)} \Big)\\ x_3^{(k+1)} &=&\displaystyle\frac{1}{9}\Big(5 &-&2x_1^{(k+1)} &-&x_2^{(k+1)}\Big) \end{array}\] \[\begin{array}{|c|r|r|r|} \hline k & x_1^{(k)} & x_2^{(k)} &x_3^{(k)}\\ \hline 0 & 0 & 0 &0 \\ \hline 1 & 0.85 & -0.19 &0.38 \\ \hline 2 & 1.04 & 0.02 &0.32 \\ \hline 3 & 0.99 & 0.00 &0.33 \\ \hline \end{array} \] The actual solution is \((x_1,x_2,x_3)=\left(1,0,\frac{1}{3}\right)\).
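On this system both methods converge, and running the two updates side by side shows Gauss-Seidel pulling ahead: after three sweeps its worst componentwise error against the known solution \(\left(1,0,\frac{1}{3}\right)\) is already several times smaller than Jacobi's. A sketch of the comparison:

```python
A = [[7, 1, -3], [1, -11, 6], [2, 1, 9]]
b = [6, 3, 5]
exact = [1.0, 0.0, 1 / 3]
n = 3

xj = [0.0, 0.0, 0.0]  # Jacobi iterate
xg = [0.0, 0.0, 0.0]  # Gauss-Seidel iterate
for _ in range(3):
    # Jacobi: the whole new vector is built from the old one
    xj = [(b[i] - sum(A[i][j] * xj[j] for j in range(n) if j != i)) / A[i][i]
          for i in range(n)]
    # Gauss-Seidel: the in-place update reuses fresh values immediately
    for i in range(n):
        xg[i] = (b[i] - sum(A[i][j] * xg[j] for j in range(n) if j != i)) / A[i][i]

err_jacobi = max(abs(v - e) for v, e in zip(xj, exact))
err_gs = max(abs(v - e) for v, e in zip(xg, exact))
```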

Convergence:
The sequence \(\{\overrightarrow{x}^{(k)}\}\) constructed by the Jacobi or Gauss-Seidel method does not always converge to the unique solution of \(A\overrightarrow{x}=\overrightarrow{b}\). However, convergence is guaranteed for any \(\overrightarrow{x}^{(0)}\) if \(A\) is strictly diagonally dominant (i.e., \(|a_{ii}|>\displaystyle\sum^n_{\substack{j=1\\j\neq i}} |a_{ij}|\) for all \(i\)).

Example. For each of the following matrices \(A\), determine whether convergence of the Jacobi and Gauss-Seidel iterations is guaranteed.
(a) \(A=\left[\begin{array}{rrr}2&1&0\\-3&4&1\\0&1&3\end{array} \right]\), (b) \(A=\left[\begin{array}{rrr}2&1&0\\-3&-5&1\\0&1&3\end{array} \right]\).
Solution. (a) \(|2|>|1|+|0|\), but \(|4|\ngtr |-3|+|1|\). So \(A\) is not strictly diagonally dominant, and this test does not guarantee convergence (the iterations may still converge).
(b) \(|2|>|1|+|0|\), \(|-5|>|-3|+|1|\), \(|3|>|0|+|1|\). So \(A\) is strictly diagonally dominant and convergence is guaranteed.
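The row-by-row check in this example is mechanical and easy to automate. A sketch (the helper name `is_sdd` is made up):

```python
def is_sdd(A):
    """True if every diagonal entry strictly dominates the rest of its row."""
    n = len(A)
    return all(abs(A[i][i]) > sum(abs(A[i][j]) for j in range(n) if j != i)
               for i in range(n))

A_a = [[2, 1, 0], [-3, 4, 1], [0, 1, 3]]   # row 2: |4| = |-3| + |1|, not strict
A_b = [[2, 1, 0], [-3, -5, 1], [0, 1, 3]]  # strict inequality in every row
```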


The spectral radius \(\rho(A)\) of a matrix \(A\) is the largest modulus of an eigenvalue of \(A\). Write \(A=L+D+U\), where \(L\) is the strictly lower triangular part of \(A\) and \(U\) is the strictly upper triangular part, so that \(R=L+U\). There are necessary and sufficient spectral conditions for convergence:
(1) the Jacobi method converges for any \(\overrightarrow{x}^{(0)}\) if and only if \(\rho(D^{-1}R)<1\), and
(2) the Gauss-Seidel method converges for any \(\overrightarrow{x}^{(0)}\) if and only if \(\rho\big((D+L)^{-1}U\big)<1\).

Remark. A strictly diagonally dominant matrix satisfies the above two conditions. But the converse is not true. For example, \(A=\left[\begin{array}{rr}4&1\\3&2\end{array}\right]\) is not strictly diagonally dominant but \(\rho(D^{-1}R)=\sqrt{3/8}<1\).
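The remark's value can be checked by hand: for a \(2\times 2\) matrix, the eigenvalues come straight from the characteristic polynomial \(\lambda^2-(\operatorname{tr}M)\lambda+\det M=0\). A sketch (the helper name is made up; the sign of \(D^{-1}R\) versus \(-D^{-1}R\) does not affect the spectral radius):

```python
import cmath  # eigenvalues of a real 2x2 matrix may be complex

def spectral_radius_2x2(M):
    """Largest eigenvalue modulus, via the characteristic polynomial."""
    (a, b), (c, d) = M
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return max(abs((tr + disc) / 2), abs((tr - disc) / 2))

A = [[4, 1], [3, 2]]
# D^{-1}R: divide each off-diagonal entry of row i by the diagonal entry a_ii
DinvR = [[0, A[0][1] / A[0][0]],
         [A[1][0] / A[1][1], 0]]
rho = spectral_radius_2x2(DinvR)  # sqrt(3/8), approximately 0.612
```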

The above necessary and sufficient spectral conditions for convergence of the Jacobi and Gauss-Seidel methods follow from the following theorem:

Theorem. For any \(\overrightarrow{x}^{(0)}\in \mathbb R^n\), the sequence \(\{\overrightarrow{x}^{(k)}\}\) defined by \[\overrightarrow{x}^{(k)}=T\overrightarrow{x}^{(k-1)}+\overrightarrow{c},\; k\geq 1,\] converges to the unique solution of \(\overrightarrow{x}=T\overrightarrow{x}+\overrightarrow{c}\) if and only if \(\rho(T)<1\). For the Jacobi method, \(T=-D^{-1}R\) and \(\overrightarrow{c}=D^{-1}\overrightarrow{b}\).
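The theorem is easy to observe numerically. Below, the matrix and vector are made-up values with spectral radius about \(0.57<1\), and the iteration is compared against the fixed point obtained by solving \((I-T)\overrightarrow{x}=\overrightarrow{c}\) by hand.

```python
# fixed-point iteration x^(k) = T x^(k-1) + c  (made-up T and c, rho(T) < 1)
T = [[0.5, 0.1],
     [0.2, 0.3]]
c = [1.0, 1.0]

x = [0.0, 0.0]
for _ in range(200):
    x = [sum(T[i][j] * x[j] for j in range(2)) + c[i] for i in range(2)]

# fixed point of x = Tx + c, i.e. (I - T)x = c, solved by hand (det(I - T) = 0.33)
fixed_point = [0.8 / 0.33, 0.7 / 0.33]
```

Starting from any other \(\overrightarrow{x}^{(0)}\) gives the same limit, exactly as the theorem predicts.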

