Second Refinement of Accelerated Over Relaxation Method for the Solution of Linear Systems

Abstract: This paper describes a method for the numerical solution of linear systems of equations. The main interest of the refinement of accelerated over relaxation (RAOR) method is to minimize the spectral radius of the iteration matrix, and thereby increase the rate of convergence, compared with the accelerated over relaxation (AOR) method. This motivates us to refine the refinement of accelerated over relaxation method into what we call the second refinement of accelerated over relaxation (SRAOR) method. The proposed method decreases the spectral radius of the iteration matrix significantly compared with that of the RAOR method. The method is a two-parameter generalization of the refinement of accelerated over relaxation method, and the optimal value of each parameter is derived. The third, fourth, and in general the k-th refinement of accelerated over relaxation methods are also derived. The spectral radius of the iteration matrix and the convergence criteria of the SRAOR method are discussed. Finally, a numerical example is given in order to assess the efficiency of the proposed method compared with the existing methods.


Introduction
A demand for new means of solving systems of linear equations appeared as soon as computing technology emerged, which promoted a rapid development of numerical methods for modelling physical processes by sampling (sub-dividing) the calculation range and replacing the differential operations by analogous algebraic operations. To meet the requirements of the finite difference and finite element methods and their modifications, direct and iterative methods were developed for sparsely filled matrices with a dominant main diagonal. Methods for efficient storage of the equation system were also developed, taking into account the symmetry of the matrix about the main diagonal, for both direct and iterative methods. In recent years, with the introduction of new numerical methods (super-elements, the boundary element method), it has become necessary to solve systems of linear equations with a completely filled matrix, one which does not possess main-diagonal dominance. Iterative methods, descended from the Gauss-Seidel method, are often used for such tasks. Solving systems of linear equations by iterative methods (such as the Jacobi, Gauss-Seidel, successive over relaxation (SOR), and accelerated over relaxation (AOR) methods) involves correcting one searched-for unknown value in every step by reducing the residual of a single individual equation; the other equations are not used in that step. In order to accelerate the convergence of the iterative process, the methods are complemented by relaxation principles which optimize the rate of the variable change in the iterative process [14].
Linear iteration coincides with multiplication by successive powers of a matrix; convergence of the iterates depends on the magnitude of its eigenvalues. We discuss in some detail a variety of convergence criteria based on the spectral radius, on matrix norms, and on eigenvalue estimates provided by the Gerschgorin Circle Theorem. We will then turn our attention to the three most important iterative schemes used to accurately approximate the solutions to linear algebraic systems. The classical Jacobi method is the simplest, while an evident serialization leads to the popular Gauss-Seidel method. Completely general convergence criteria are hard to formulate, although convergence is assured for the important class of diagonally dominant matrices that arise in many applications. A simple modification of the Gauss-Seidel scheme, known as Successive Over-Relaxation (SOR), can dramatically speed up the convergence rate, and is the method of choice in many modern applications.
In this paper, we describe one such iterative technique, which requires an initial approximation.
The rate of convergence of an iterative technique depends on the spectral radius of the iteration matrix associated with the method. One way to select a procedure that accelerates convergence is to choose a method whose iteration matrix has minimal spectral radius. Before describing a procedure for selecting such a method, we need to introduce a means of measuring the amount by which an approximation to the solution of a linear system differs from the true solution of the system [8].
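As a minimal illustration of these two notions (the matrix, vectors, and function names below are ours, not the paper's example), one can compute the spectral radius of an iteration matrix and the error of an approximation directly:

```python
# Sketch (hypothetical data): the iteration x_{n+1} = T x_n + c converges
# for every starting vector iff rho(T) = max |lambda_i(T)| < 1, and the
# error is measured against the true solution of A x = b.
import numpy as np

def spectral_radius(T: np.ndarray) -> float:
    """Largest eigenvalue of T in magnitude."""
    return float(max(abs(np.linalg.eigvals(T))))

A = np.array([[4.0, 1.0],
              [2.0, 5.0]])
b = np.array([1.0, 2.0])
x_true = np.linalg.solve(A, b)            # reference solution
x_approx = np.array([0.1, 0.4])           # some approximate solution
print(np.linalg.norm(x_true - x_approx))  # Euclidean error norm
```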
One of the most important problems in numerical analysis is to solve the linear system

$Ax = b$, (1)

where $A$ is a non-singular $n \times n$ matrix with non-vanishing diagonal elements, and $x$ and $b$ are the unknown and known $n$-dimensional vectors, respectively. Systems of the form (1) appear in many applications such as linear elasticity, fluid dynamics, and constrained quadratic programming. When the coefficient matrix of the linear system (1) is large and sparse, iterative methods are recommended over direct methods. In order to solve (1) more effectively by iterative methods, an efficient splitting of the coefficient matrix is usually required. For example, the classical Jacobi and Gauss-Seidel iterations are obtained by splitting the matrix into its diagonal and off-diagonal parts.
For the numerical solution of (1), the accelerated over relaxation (AOR) method was introduced by Hadjidimos in [1] and is a two-parameter generalization of the successive over relaxation (SOR) method. In certain cases the AOR method has a better convergence rate than the Jacobi, JOR, Gauss-Seidel, or SOR methods. Many authors have considered sufficient conditions for the convergence of the AOR method. To improve the convergence rate of the AOR method, the preconditioned AOR (PAOR) method has been considered by many authors [3,4,9,13].
The purpose of this paper is to present a new version of the refined accelerated over relaxation (RAOR) method for the linear system (1), called the second refinement of accelerated over relaxation (SRAOR) method. We discuss some sufficient conditions for the convergence of the SRAOR method when the coefficient matrix $A$ is irreducible with weak diagonal dominance.
The matrix $A$ can be split as

$A = D - L - U$, (2)

where $D$ is the diagonal of $A$, and $-L$ and $-U$ are the strictly lower and strictly upper triangular parts of $A$, respectively.
Then the system (1) can be put in the form

$x = (\tilde{L} + \tilde{U})x + \tilde{b}$, (3)

where $\tilde{L} = D^{-1}L$, $\tilde{U} = D^{-1}U$ and $\tilde{b} = D^{-1}b$. Now the system (1) takes the form

$\tilde{A}x = \tilde{b}$, where $\tilde{A} = D^{-1}A = I - \tilde{L} - \tilde{U}$. (4)

The Jacobi method for the solution of (4) is given by

$x^{(n+1)} = J\,x^{(n)} + \tilde{b}$, (5)

where $J = \tilde{L} + \tilde{U}$ is the Jacobi iteration matrix, whose spectral radius is $\rho(J)$. The Gauss-Seidel method for the solution of (4) is

$x^{(n+1)} = G\,x^{(n)} + (I - \tilde{L})^{-1}\tilde{b}$, (6)

where $G = (I - \tilde{L})^{-1}\tilde{U}$ is the Gauss-Seidel iteration matrix, whose spectral radius is $\rho(G)$. The successive over relaxation (SOR) iterative method for solving (4) is given by

$x^{(n+1)} = S_{\omega}\,x^{(n)} + \omega(I - \omega\tilde{L})^{-1}\tilde{b}$, (7)

where $S_{\omega} = (I - \omega\tilde{L})^{-1}\bigl[(1 - \omega)I + \omega\tilde{U}\bigr]$ is the SOR iteration matrix [15,16], whose spectral radius $\rho(S_{\omega})$ is minimized for the optimal choice of $\omega$, i.e., $\omega = 2/\bigl(1 + \sqrt{1 - \rho(J)^{2}}\bigr)$. The accelerated over relaxation (AOR) iterative method for solving (4) is defined by [1,2,11]

$x^{(n+1)} = \mathcal{L}_{r,\omega}\,x^{(n)} + \omega(I - r\tilde{L})^{-1}\tilde{b}$, (8)

where $\mathcal{L}_{r,\omega} = (I - r\tilde{L})^{-1}\bigl[(1 - \omega)I + (\omega - r)\tilde{L} + \omega\tilde{U}\bigr]$ is the AOR iteration matrix, with acceleration parameter $r$ and relaxation parameter $\omega$.
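The four iteration matrices above can be assembled and compared numerically. The following sketch (the test matrix and the parameter values $r$ and $\omega$ are illustrative, not the paper's example) builds $J$, $G$, $S_{\omega}$ and $\mathcal{L}_{r,\omega}$ from the splitting and prints their spectral radii:

```python
# Sketch: build the Jacobi, Gauss-Seidel, SOR and AOR iteration matrices
# from the splitting A = D - L - U and compare their spectral radii.
# The test matrix and the parameters (r, w) are illustrative only.
import numpy as np

A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
D = np.diag(np.diag(A))
L = -np.tril(A, -1)          # -L is the strictly lower part of A
U = -np.triu(A, 1)           # -U is the strictly upper part of A
Lt = np.linalg.solve(D, L)   # L~ = D^{-1} L
Ut = np.linalg.solve(D, U)   # U~ = D^{-1} U
I = np.eye(A.shape[0])

rho = lambda T: max(abs(np.linalg.eigvals(T)))

J = Lt + Ut                                      # Jacobi, eq. (5)
G = np.linalg.solve(I - Lt, Ut)                  # Gauss-Seidel, eq. (6)
w, r = 1.1, 0.9                                  # illustrative parameters
S = np.linalg.solve(I - w * Lt,
                    (1 - w) * I + w * Ut)        # SOR, eq. (7)
Lrw = np.linalg.solve(I - r * Lt,
                      (1 - w) * I + (w - r) * Lt + w * Ut)  # AOR, eq. (8)

for name, T in [("Jacobi", J), ("Gauss-Seidel", G), ("SOR", S), ("AOR", Lrw)]:
    print(f"{name:12s} rho = {rho(T):.4f}")
```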
Hadjidimos noticed that the Jacobi, Gauss-Seidel, SOR and AOR methods converge if $A$ is an irreducible matrix with weak diagonal dominance [1,2].
V. B. Kumar Vatti, Ramadevi Sri, and M. S. Kumar Mylapalli noticed that the refinement of accelerated over relaxation (RAOR) method converges if $A$ is an irreducible matrix with weak diagonal dominance [1].
The remainder of the paper is organized as follows. The second refinement of the AOR method is derived in section 2, and convergence theorems for the SRAOR method with an irreducible, weakly diagonally dominant coefficient matrix are proved in section 3 [12]. A numerical example is studied in section 4. Finally, a conclusion is drawn in section 5.

Second Refinement of Accelerated over Relaxation (SRAOR) Method
Multiplying both sides of equation (4) by $\omega(I - r\tilde{L})^{-1}$ and rearranging yields the AOR scheme (8). Each refinement re-substitutes the iterate just produced into one further AOR sweep: the refinement of AOR (RAOR) scheme (16) composes two successive sweeps per step, so its iteration matrix is $\mathcal{L}_{r,\omega}^{2}$, and composing one more sweep gives the second refinement of AOR (SRAOR) method

$x^{(n+1)} = \mathcal{L}_{r,\omega}^{3}\,x^{(n)} + \bigl(I + \mathcal{L}_{r,\omega} + \mathcal{L}_{r,\omega}^{2}\bigr)\,\omega(I - r\tilde{L})^{-1}\tilde{b}$, (20)

whose iteration matrix is $\mathcal{L}_{r,\omega}^{3}$ with spectral radius $\rho(\mathcal{L}_{r,\omega})^{3}$. In the same manner, the $k$-th refinement of AOR has iteration matrix $\mathcal{L}_{r,\omega}^{k+1}$ and spectral radius $\rho(\mathcal{L}_{r,\omega})^{k+1}$.
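Under this reading (one SRAOR step equals three successive AOR sweeps), the method is straightforward to sketch. The code below is a minimal illustration, not the paper's implementation; the function names, test matrix and parameter values are ours:

```python
# Minimal SRAOR sketch, assuming one SRAOR step equals three successive
# AOR sweeps, so the iteration matrix is Lrw^3. Names and (r, w) are
# illustrative; this is not the paper's implementation.
import numpy as np

def aor_sweep(x, Lt, Ut, c, r, w):
    """One AOR sweep: x <- (I - r Lt)^{-1} [((1-w)I + (w-r)Lt + w Ut) x + w c]."""
    I = np.eye(len(x))
    rhs = ((1 - w) * I + (w - r) * Lt + w * Ut) @ x + w * c
    return np.linalg.solve(I - r * Lt, rhs)

def sraor_step(x, Lt, Ut, c, r, w):
    """One SRAOR step: three AOR sweeps composed (second refinement)."""
    for _ in range(3):
        x = aor_sweep(x, Lt, Ut, c, r, w)
    return x

# Demo on the tridiagonal test matrix used earlier.
A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
b = np.array([3.0, 2.0, 3.0])
D = np.diag(np.diag(A))
Lt = np.linalg.solve(D, -np.tril(A, -1))   # L~ = D^{-1} L
Ut = np.linalg.solve(D, -np.triu(A, 1))    # U~ = D^{-1} U
c = np.linalg.solve(D, b)                  # b~ = D^{-1} b
x = np.zeros(3)
for _ in range(10):
    x = sraor_step(x, Lt, Ut, c, r=0.9, w=1.1)
print(x, np.linalg.solve(A, b))            # SRAOR iterate vs direct solve
```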

Convergence of SRAOR Method
Theorem 1: Let $A$ be an irreducible matrix with weak diagonal dominance. Then the SRAOR method converges for any arbitrary choice of the initial approximation.
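The hypothesis of Theorem 1 is easy to verify mechanically. The helper below (our own sketch, not part of the paper) checks weak diagonal dominance row by row and tests irreducibility as strong connectivity of the directed graph of $A$:

```python
# Hedged helper (ours, not the paper's): check that A is irreducible
# (directed graph of its nonzero entries is strongly connected) and
# weakly diagonally dominant (|a_ii| >= off-diagonal row sum for every
# row, with strict inequality in at least one row).
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def satisfies_theorem_1(A: np.ndarray) -> bool:
    diag = np.abs(np.diag(A))
    offdiag = np.abs(A).sum(axis=1) - diag
    weakly_dominant = np.all(diag >= offdiag) and np.any(diag > offdiag)
    graph = csr_matrix((np.abs(A) > 0).astype(int))
    n_comp, _ = connected_components(graph, directed=True, connection="strong")
    return bool(weakly_dominant and n_comp == 1)

A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
print(satisfies_theorem_1(A))  # True for this tridiagonal test matrix
```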

Theorem 4:
The second refinement of AOR method converges faster than the refinement of AOR method, provided the refinement of AOR method is convergent.
Proof: Let $x^{(n+1)}$ be the approximation to the solution of (4) obtained by the SRAOR scheme (20), and let $x^{*(n+1)}$ be the approximation obtained by the RAOR scheme (16).
Since the corresponding iteration matrices are $\mathcal{L}_{r,\omega}^{3}$ and $\mathcal{L}_{r,\omega}^{2}$, and $0 < \rho(\mathcal{L}_{r,\omega}) < 1$ whenever the RAOR method is convergent, it follows that $\rho(\mathcal{L}_{r,\omega}^{3}) = \rho(\mathcal{L}_{r,\omega})^{3} < \rho(\mathcal{L}_{r,\omega})^{2} = \rho(\mathcal{L}_{r,\omega}^{2})$. Hence the second refinement of AOR method converges faster than the refinement of AOR method.
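For a concrete instance (the value $\rho(\mathcal{L}_{r,\omega}) = 0.5$ is hypothetical):

```latex
\rho\bigl(\mathcal{L}_{r,\omega}^{3}\bigr)
  = \rho(\mathcal{L}_{r,\omega})^{3} = (0.5)^{3} = 0.125
  \;<\; (0.5)^{2} = 0.25
  = \rho(\mathcal{L}_{r,\omega})^{2}
  = \rho\bigl(\mathcal{L}_{r,\omega}^{2}\bigr),
```

so in this instance each SRAOR step contracts the error by a factor twice as small as each RAOR step.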

Conclusion
The spectral radius of the iteration matrix is decisive for the convergence and stability of the method: provided it is smaller than 1, the smaller it is, the faster the method converges [5].
Minimizing the spectral radius of the iteration matrix is tantamount to increasing the rate of convergence of the method for the numerical solution of the system of linear equations.
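This relationship can be made quantitative with the standard notion of asymptotic rate of convergence (a textbook fact, not stated explicitly in this paper):

```latex
R_{\infty}(T) \;=\; -\ln \rho(T), \qquad 0 < \rho(T) < 1,
```

so the number of iterations needed to reduce the error by a fixed factor is inversely proportional to $R_{\infty}(T)$, and a smaller spectral radius directly yields a larger rate.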
The main objective of Hadjidimos and of V. B. Kumar Vatti, Ramadevi Sri, and M. S. Kumar Mylapalli is to increase the rate of convergence of the methods by minimizing the spectral radius of the iteration matrix [1,2]. The aim of the refinement of accelerated over relaxation (RAOR) method is to minimize the largest eigenvalue in magnitude of the iteration matrix of the method ($0 < \rho(\mathcal{L}_{r,\omega}) < 1$). When we refine the accelerated over relaxation (AOR) method again and again, the exponent of $\mathcal{L}_{r,\omega}$ becomes larger and larger.
Since $0 < \rho(\mathcal{L}_{r,\omega}) < 1$, the spectral radius of a more refined AOR method becomes smaller and smaller. Our proposed method, the second refinement of accelerated over relaxation (SRAOR) method, is more refined than the existing methods; as a result, the largest eigenvalue in magnitude of its iteration matrix is very small compared with those of the previous methods.
The given numerical example confirms that the second and higher refinements of the accelerated over relaxation method make the spectral radius of the iteration matrix small enough and increase the rate of convergence.