=Paper= {{Paper |id=Vol-2147/p07 |storemode=property |title=Comparison of Two Stationary Iterative Methods |pdfUrl=https://ceur-ws.org/Vol-2147/p07.pdf |volume=Vol-2147 |authors=Oleksandra Osadcha |dblpUrl=https://dblp.org/rec/conf/system/Osadcha18 }} ==Comparison of Two Stationary Iterative Methods== https://ceur-ws.org/Vol-2147/p07.pdf
    Comparison of Two Stationary Iterative Methods
                                                       Oleksandra Osadcha
                                                 Faculty of Applied Mathematics
                                                Silesian University of Technology
                                                         Gliwice, Poland
                                                 Email: olekosa338@student.polsl.pl




   Abstract—This paper presents a comparison of two stationary iterative methods for solving systems of linear equations. The aim of the research is to analyze which method solves these equations faster and how many iterations each method requires. The paper also presents some ways to deal with this problem.

   Index Terms—comparison, analysis, stationary methods, Jacobi method, Gauss-Seidel method

                         I. INTRODUCTION

   The term "iterative method" refers to a wide range of techniques that use successive approximations to obtain more accurate solutions to a linear system at each step. In this publication we cover two types of iterative methods. Stationary methods are older and simpler to understand and implement, but usually not as effective. Nonstationary methods are a relatively recent development; their analysis is usually harder to understand, but they can be highly effective. The nonstationary methods are based on the idea of sequences of orthogonal vectors. (An exception is the Chebyshev iteration method, which is based on orthogonal polynomials.)

   The rate at which an iterative method converges depends greatly on the spectrum of the coefficient matrix. Hence, iterative methods usually involve a second matrix that transforms the coefficient matrix into one with a more favorable spectrum. The transformation matrix is called a preconditioner. A good preconditioner improves the convergence of the iterative method sufficiently to overcome the extra cost of constructing and applying it. Indeed, without a preconditioner the iterative method may even fail to converge [1].

   There are many publications on stationary iterative methods. As is well known, a real symmetric matrix can be transformed iteratively into diagonal form through a sequence of appropriately chosen elementary orthogonal transformations (in the following called Jacobi rotations): A_k → A_{k+1} = U_k^T A_k U_k (A_0 = given matrix); these transformations are presented in [2]. The special cyclic Jacobi method for computing the eigenvalues and eigenvectors of a real symmetric matrix annihilates the off-diagonal elements of the matrix successively, in the natural order, by rows or columns; the cyclic Jacobi method is presented in [4]. A convergence result for the block Gauss-Seidel method, for problems where the feasible set is the Cartesian product of m closed convex sets, under the assumption that the sequence generated by the method has limit points, is presented in [5]. In [9] an improvement of the modified Gauss-Seidel method for Z-matrices is presented. In [11] the convergence of the Gauss-Seidel iterative method is shown for the case when the matrix of the linear system is positive definite; this does not by itself afford a conclusion on the convergence of the Jacobi method, but it is also proved there that the Jacobi method converges. Both methods are also presented and described in [12].

   This publication presents a comparison of the Jacobi and Gauss-Seidel methods, which are used for solving systems of linear equations. We analyze how many iterations each method requires and which method is faster.

                         II. JACOBI METHOD

   We consider the system of linear equations

                              Ax = b

where A ∈ M_{n×n} and det A ≠ 0. We write the matrix in the form of a sum of three matrices:

                          A = L + D + U

where:
   • L - the strictly lower-triangular matrix
   • D - the diagonal matrix
   • U - the strictly upper-triangular matrix

If the matrix has the form

   A = \begin{pmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22} & \dots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \dots & a_{nn} \end{pmatrix}

then

   L = \begin{pmatrix} 0 & 0 & \dots & 0 \\ a_{21} & 0 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \dots & 0 \end{pmatrix}
  Copyright held by the author(s).                                                                   an1    an2      ...     0



                                                     
   D = \begin{pmatrix} a_{11} & 0 & \dots & 0 \\ 0 & a_{22} & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & a_{nn} \end{pmatrix}

   U = \begin{pmatrix} 0 & a_{12} & \dots & a_{1n} \\ 0 & 0 & \dots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & 0 \end{pmatrix}

   If the matrix A is nonsingular (det A ≠ 0), then we can rearrange its rows so that there are no zeros on the diagonal, i.e. a_{ii} ≠ 0 for all i ∈ {1, 2, ..., n}.

   If the diagonal matrix D is nonsingular (so a_{ii} ≠ 0, i ∈ 1, ..., n), then its inverse has the form

   D^{-1} = \begin{pmatrix} \frac{1}{a_{11}} & 0 & \dots & 0 \\ 0 & \frac{1}{a_{22}} & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \frac{1}{a_{nn}} \end{pmatrix}

By substituting the above splitting into the system of equations, we obtain

                       (L + D + U)x = b

After further transformations, we have

                     Dx = -(L + U)x + b

                 x = -D^{-1}(L + U)x + D^{-1}b

Using this, we get the iterative scheme of the Jacobi method:

   x^{i+1} = -D^{-1}(L + U)x^i + D^{-1}b,   i = 0, 1, 2, ...
   x^0 - initial approximation

The matrix -D^{-1}(L + U) and the vector D^{-1}b do not change during the calculations.

   Algorithm of the Jacobi method:
   1) Data:
         • matrix A
         • vector b
         • vector x^0
         • accuracy ε (ε > 0)
   2) We set the matrices L + U and D^{-1}
   3) We compute the matrix M = -D^{-1}(L + U) and the vector w = D^{-1}b
   4) We compute x^{i+1} = Mx^i + w as long as ||Ax^{i+1} - b|| > ε
   5) The result is the last computed x^{i+1}

   Example. Use the Jacobi method to solve the system of equations; the zero vector is assumed as the initial approximation:

      4x - y = 2
      -x + 4y - z = 6
      -y - 4z = 2,        v^0 = (0, 0, 0)^T

We have:

   A = \begin{pmatrix} 4 & -1 & 0 \\ -1 & 4 & -1 \\ 0 & -1 & -4 \end{pmatrix},   b = \begin{pmatrix} 2 \\ 6 \\ 2 \end{pmatrix}

   L + U = \begin{pmatrix} 0 & -1 & 0 \\ -1 & 0 & -1 \\ 0 & -1 & 0 \end{pmatrix},   D^{-1} = \begin{pmatrix} \frac{1}{4} & 0 & 0 \\ 0 & \frac{1}{4} & 0 \\ 0 & 0 & -\frac{1}{4} \end{pmatrix}

We compute

   M = -D^{-1}(L + U) = \begin{pmatrix} -\frac{1}{4} & 0 & 0 \\ 0 & -\frac{1}{4} & 0 \\ 0 & 0 & \frac{1}{4} \end{pmatrix} \cdot \begin{pmatrix} 0 & -1 & 0 \\ -1 & 0 & -1 \\ 0 & -1 & 0 \end{pmatrix} = \begin{pmatrix} 0 & \frac{1}{4} & 0 \\ \frac{1}{4} & 0 & \frac{1}{4} \\ 0 & -\frac{1}{4} & 0 \end{pmatrix}

   w = D^{-1}b = \begin{pmatrix} \frac{1}{4} & 0 & 0 \\ 0 & \frac{1}{4} & 0 \\ 0 & 0 & -\frac{1}{4} \end{pmatrix} \cdot \begin{pmatrix} 2 \\ 6 \\ 2 \end{pmatrix} = \begin{pmatrix} \frac{1}{2} \\ \frac{3}{2} \\ -\frac{1}{2} \end{pmatrix}

The iterative scheme has the form

   v^{i+1} = Mv^i + w = \begin{pmatrix} 0 & \frac{1}{4} & 0 \\ \frac{1}{4} & 0 & \frac{1}{4} \\ 0 & -\frac{1}{4} & 0 \end{pmatrix} \cdot \begin{pmatrix} x^i \\ y^i \\ z^i \end{pmatrix} + \begin{pmatrix} \frac{1}{2} \\ \frac{3}{2} \\ -\frac{1}{2} \end{pmatrix}

i = 0:

   v^1 = Mv^0 + w = \left(\frac{1}{2}, \frac{3}{2}, -\frac{1}{2}\right)^T

   Av^1 - b = \left(\frac{1}{2} - 2, 6 - 6, \frac{1}{2} - 2\right)^T = \left(-\frac{3}{2}, 0, -\frac{3}{2}\right)^T

   \|Av^1 - b\| = \sqrt{\left(-\frac{3}{2}\right)^2 + 0^2 + \left(-\frac{3}{2}\right)^2} = \frac{3}{2}\sqrt{2}

i = 1:

   v^2 = Mv^1 + w = \left(\frac{3}{8}, 0, -\frac{3}{8}\right)^T + \left(\frac{1}{2}, \frac{3}{2}, -\frac{1}{2}\right)^T = \left(\frac{7}{8}, \frac{3}{2}, -\frac{7}{8}\right)^T

   Av^2 - b = (2 - 2, 6 - 6, 2 - 2)^T = 0

so \|Av^2 - b\| = 0 and v^2 = (7/8, 3/2, -7/8)^T solves the system exactly.
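The five algorithm steps above map directly onto a few lines of code. Below is a minimal sketch in Python (assuming NumPy; the function name `jacobi` and the tolerance are illustrative, not from the paper), run on the worked example:

```python
import numpy as np

def jacobi(A, b, x0=None, eps=1e-10, max_iter=1000):
    """Jacobi iteration x^{i+1} = -D^{-1}(L+U) x^i + D^{-1} b."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    D_inv = 1.0 / np.diag(A)           # diagonal entries must be nonzero
    LU = A - np.diag(np.diag(A))       # L + U (the off-diagonal part of A)
    M = -D_inv[:, None] * LU           # iteration matrix, fixed during the run
    w = D_inv * b                      # fixed vector D^{-1} b
    for i in range(max_iter):
        x = M @ x + w
        if np.linalg.norm(A @ x - b) <= eps:
            return x, i + 1
    return x, max_iter

# the worked example above (note a_33 = -4, matching the equation -y - 4z = 2)
A = [[4, -1, 0], [-1, 4, -1], [0, -1, -4]]
b = [2, 6, 2]
x, iters = jacobi(A, b)
print(x, iters)   # x = [0.875, 1.5, -0.875] after 2 iterations, as in the example
```

As in the hand computation, the residual vanishes exactly at the second iterate, so the loop stops with v^2 = (7/8, 3/2, -7/8)^T.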
       




                III. GAUSS-SEIDEL METHOD

   Analogously to the Jacobi method, in the Gauss-Seidel method we write the matrix A in the form of a sum

                          A = L + D + U

where the matrices L, D and U are defined in the same way as in the Jacobi method. Next we get

                      Dx = -Lx - Ux + b

whence

               x = -D^{-1}Lx - D^{-1}Ux + D^{-1}b

Based on the above formula, we can write the iterative scheme of the Gauss-Seidel method:

   x^{i+1} = -D^{-1}Lx^{i+1} - D^{-1}Ux^i + D^{-1}b,   i = 0, 1, 2, ...
   x^0 - initial approximation

In the Jacobi method, the corrected values of the coordinates of the solution are used only in the next step of the iteration. In the Gauss-Seidel method, these corrected values are used immediately, as soon as the next coordinates are computed. The Gauss-Seidel formula for the coordinates has the form

   x_k^{i+1} = -\frac{1}{a_{kk}} \sum_{j=1}^{k-1} a_{kj} x_j^{i+1} - \frac{1}{a_{kk}} \sum_{j=k+1}^{n} a_{kj} x_j^{i} + \frac{1}{a_{kk}} b_k,   k = 1, 2, \dots, n

   Algorithm of the Gauss-Seidel method:
   1) Data:
         • matrix A
         • vector b
         • vector x^0
         • accuracy ε (ε > 0)
   2) We compute, as long as ||Ax^{i+1} - b|| > ε, for k = 1, 2, ..., n:

      x_k^{i+1} = -\frac{1}{a_{kk}} \sum_{j=1}^{k-1} a_{kj} x_j^{i+1} - \frac{1}{a_{kk}} \sum_{j=k+1}^{n} a_{kj} x_j^{i} + \frac{1}{a_{kk}} b_k

   3) Result: the last computed x^{i+1}

Example. Calculate the third approximation of Ax = b, where

   A = \begin{pmatrix} 3 & -1 & 1 \\ 1 & -4 & 1 \\ 0 & -1 & -2 \end{pmatrix},   b = \begin{pmatrix} 3 \\ -2 \\ -3 \end{pmatrix},   x^0 = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}

   L = \begin{pmatrix} 0 & 0 & 0 \\ 1 & 0 & 0 \\ 0 & -1 & 0 \end{pmatrix},   D^{-1} = \begin{pmatrix} \frac{1}{3} & 0 & 0 \\ 0 & -\frac{1}{4} & 0 \\ 0 & 0 & -\frac{1}{2} \end{pmatrix},   U = \begin{pmatrix} 0 & -1 & 1 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix}

   M_1 = -D^{-1}L = \begin{pmatrix} 0 & 0 & 0 \\ \frac{1}{4} & 0 & 0 \\ 0 & -\frac{1}{2} & 0 \end{pmatrix},   M_2 = -D^{-1}U = \begin{pmatrix} 0 & \frac{1}{3} & -\frac{1}{3} \\ 0 & 0 & \frac{1}{4} \\ 0 & 0 & 0 \end{pmatrix}

   w = D^{-1}b = \begin{pmatrix} \frac{1}{3} & 0 & 0 \\ 0 & -\frac{1}{4} & 0 \\ 0 & 0 & -\frac{1}{2} \end{pmatrix} \cdot \begin{pmatrix} 3 \\ -2 \\ -3 \end{pmatrix} = \begin{pmatrix} 1 \\ \frac{1}{2} \\ \frac{3}{2} \end{pmatrix}

The iterative scheme has the form

   x^{i+1} = M_1 x^{i+1} + M_2 x^i + w

first step (i = 0):

   x_1^1 = \frac{1}{3}x_2^0 - \frac{1}{3}x_3^0 + 1 = 1
   x_2^1 = \frac{1}{4}x_1^1 + \frac{1}{4}x_3^0 + \frac{1}{2} = \frac{1}{4} + 0 + \frac{1}{2} = \frac{3}{4}
   x_3^1 = -\frac{1}{2}x_2^1 + \frac{3}{2} = \frac{9}{8}

   x^1 = \left(1, \frac{3}{4}, \frac{9}{8}\right)^T,   \|Ax^1 - b\| = \frac{3}{4}\sqrt{\frac{5}{2}}

second step (i = 1):

   x_1^2 = \frac{1}{3}x_2^1 - \frac{1}{3}x_3^1 + 1 = \frac{1}{3}\cdot\frac{3}{4} - \frac{1}{3}\cdot\frac{9}{8} + 1 = \frac{7}{8}
   x_2^2 = \frac{1}{4}x_1^2 + \frac{1}{4}x_3^1 + \frac{1}{2} = \frac{1}{4}\cdot\frac{7}{8} + \frac{1}{4}\cdot\frac{9}{8} + \frac{1}{2} = 1
   x_3^2 = -\frac{1}{2}x_2^2 + \frac{3}{2} = -\frac{1}{2}\cdot 1 + \frac{3}{2} = 1

   x^2 = \left(\frac{7}{8}, 1, 1\right)^T,   \|Ax^2 - b\| = \frac{1}{4}\sqrt{\frac{5}{2}}

third step (i = 2):

   x_1^3 = \frac{1}{3}x_2^2 - \frac{1}{3}x_3^2 + 1 = 1



                                                                   Tab. I
                                                          COMPARISON OF METHODS

                                          Gauss-Seidel method                      Jacobi method
                     N     n    Iterations  Time in ms  Time in ticks   Iterations  Time in ms  Time in ticks
                      5   25        28            5          10639          54          12          22360
                     10   100      103           53          98264         205          65         121525
                     15   225      229          504         933331         451         528         978357
                     20   400      405         2370        4388630         808        3769        6977776
                     25   625      636         9147       16933626        1266        9506       17597628
                     30   900      918        27755       51379812        1839       30047       55623008
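A comparison in the spirit of Table I can be reproduced with a small harness that counts the iterations both methods need on strictly diagonally dominant random systems (a Python/NumPy sketch; the helper names, matrix sizes, and tolerance are illustrative, and the resulting counts are not the paper's measurements):

```python
import numpy as np

def jacobi_iters(A, b, eps=1e-8, max_iter=100_000):
    """Number of Jacobi sweeps until ||Ax - b|| <= eps."""
    D_inv = 1.0 / np.diag(A)
    x = np.zeros(len(b))
    for i in range(max_iter):
        x = x + D_inv * (b - A @ x)   # same as x = -D^{-1}(L+U)x + D^{-1}b
        if np.linalg.norm(A @ x - b) <= eps:
            return i + 1
    return max_iter

def gauss_seidel_iters(A, b, eps=1e-8, max_iter=100_000):
    """Number of Gauss-Seidel sweeps until ||Ax - b|| <= eps."""
    n = len(b)
    x = np.zeros(n)
    for i in range(max_iter):
        for k in range(n):
            # overwrite x[k] in place, so later rows see the new value
            x[k] += (b[k] - A[k] @ x) / A[k, k]
        if np.linalg.norm(A @ x - b) <= eps:
            return i + 1
    return max_iter

rng = np.random.default_rng(0)
for n in (25, 100, 225):              # matrix sizes echoing the first rows of Table I
    A = rng.uniform(-1, 1, (n, n)) + np.diag(np.full(n, float(n)))  # dominant diagonal
    b = rng.uniform(-1, 1, n)
    print(n, jacobi_iters(A, b), gauss_seidel_iters(A, b))
```

On such systems both methods converge, and Gauss-Seidel typically needs noticeably fewer sweeps than Jacobi, which is the qualitative pattern the table reports.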




   x_2^3 = \frac{1}{4}x_1^3 + \frac{1}{4}x_3^2 + \frac{1}{2} = \frac{1}{4} + \frac{1}{4} + \frac{1}{2} = 1
   x_3^3 = -\frac{1}{2}x_2^3 + \frac{3}{2} = 1

   x^3 = (1, 1, 1)^T

   \|Ax^3 - b\| = 0
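The coordinate formula above can likewise be sketched as a short routine (a Python/NumPy sketch; `gauss_seidel` and its parameters are illustrative, not from the paper). Run on the worked example, it reproduces the third approximation x^3 = (1, 1, 1)^T:

```python
import numpy as np

def gauss_seidel(A, b, x0=None, eps=1e-10, max_iter=1000):
    """Gauss-Seidel iteration: new coordinate values are used as soon as computed."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(b)
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    for i in range(max_iter):
        for k in range(n):
            # s = sum of a_kj * x_j over j != k; x_j for j < k is already updated
            s = A[k] @ x - A[k, k] * x[k]
            x[k] = (b[k] - s) / A[k, k]
        if np.linalg.norm(A @ x - b) <= eps:
            return x, i + 1
    return x, max_iter

# the worked example above (note a_22 = -4, consistent with the computed D^{-1} and w)
A = [[3, -1, 1], [1, -4, 1], [0, -1, -2]]
b = [3, -2, -3]
x, iters = gauss_seidel(A, b)
print(x, iters)   # x = [1, 1, 1] after 3 sweeps, matching the hand computation
```

The intermediate sweeps agree with the example as well: after the first sweep x = (1, 3/4, 9/8), after the second x = (7/8, 1, 1), and the third sweep reaches the exact solution.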
                     IV. COMPARISON OF METHODS
   Here we present the differences between the two methods. We carried
out a series of experiments in which we measured how much time each
method requires and compared the number of iterations each method
needs.

A. Analysis of time
   First, we analyze the speed of each method. The measured times, both
in milliseconds and in CPU ticks, are presented in Table I; the
comparison of times in milliseconds is also plotted in Fig. 1. From
these data we can conclude that the Gauss-Seidel method is faster than
the Jacobi method.

                  Fig. 1. Graph of time comparison

B. Analysis of efficiency
   Next, we analyze the efficiency of the two methods. We ran the
experiments for different values of n, where n is the size of the
matrix A. Comparing the iteration counts, plotted in Fig. 2, we can
see that the Gauss-Seidel method requires fewer iterations than the
Jacobi method, so we can conclude that the Gauss-Seidel method is more
efficient.

                 Fig. 2. Graph of iterations number

     Fig. 3. Jacobi algorithm (left) and Gauss-Seidel algorithm (right).

                         V. CONCLUSIONS
   We have compared two methods implemented to solve systems of linear
equations. This paper can help the reader decide which method is
better. From all our experiments we can conclude that the Gauss-Seidel
method is better than the Jacobi method, because it is faster and
requires fewer iterations. So, if we want to obtain better results and
obtain them faster, we should use the Gauss-Seidel method. An advantage
of our presentation is the ease and clarity of the description of each
method, with the algorithms shown explicitly. Our paper also presents
examples of each method, which show how these methods can be used.

                           REFERENCES
[1] Richard Barrett, Michael Berry, Tony F. Chan, James Demmel, June M.
    Donato, Jack Dongarra, Victor Eijkhout, Roldan Pozo, Charles Romine
    and Henk Van der Vorst, Templates for the Solution of Linear
    Systems: Building Blocks for Iterative Methods. 1994, ISBN:
    978-0-89871-328-2
[2] H. Rutishauser, The Jacobi Method for Real Symmetric Matrices,
    Numerische Mathematik 9, 1-10 (1966)
[3] Marszalek, Z., Marcin, W., Grzegorz, B., Raniyah, W., Napoli, C.,
    Pappalardo, G., and Tramontana, E. Benchmark tests on improved
    merge for big data processing. In Asia-Pacific Conference on
    Computer Aided System Engineering (APCASE), pp. 96-101, 2015.
[4] H. P. M. Van Kempen, On the Quadratic Convergence of the Special
    Cyclic Jacobi Method, Numerische Mathematik 9, 19-22 (1966)
[5] L. Grippo, M. Sciandrone, On the convergence of the block nonlinear
    Gauss-Seidel method under convex constraints, Operations Research
    Letters 26 (2000) 127-136
[6] Woźniak, M., Połap, D., Napoli, C., Tramontana, E. Graphic object
    feature extraction system based on cuckoo search algorithm. Expert
    Systems with Applications, Volume 66, pp. 20-31, 2016.



[7] Woźniak, M., Połap, D., Napoli, C., Tramontana, E. Graphic object
    feature extraction system based on cuckoo search algorithm. Expert
    Systems with Applications, Volume 66, pp. 20-31, 2016.
[8] Capizzi, G., Sciuto, G. L., Napoli, C., Tramontana, E. and Woźniak,
    M. Automatic classification of fruit defects based on co-occurrence
    matrix and neural networks. In Federated Conference on Computer
    Science and Information Systems (FedCSIS), pp. 861-867, 2015.
[9] Toshiyuki Kohno, Hisashi Kotakemori, Hiroshi Niki and Masataka
    Usui, Improving the Modified Gauss-Seidel Method for Z-Matrices,
    3 January 1997
[10] Woźniak, M., Połap, D., Borowik, G., and Napoli, C. A first
    attempt to cloud-based user verification in distributed system. In
    Asia-Pacific Conference on Computer Aided System Engineering
    (APCASE), pp. 226-231, 2015.
[11] A. Mitiche, A.-R. Mansouri, On convergence of the Horn and Schunck
    optical-flow estimation method, DOI: 10.1109/TIP.2004.827235,
    18 May 2004.
[12] Feng Ding, Tongwen Chen, Iterative least-squares solutions of
    coupled Sylvester matrix equations, Systems & Control Letters,
    Volume 54, Issue 2, February 2005, Pages 95-107



