# if the convergence criterion (rms/init_rms <= tolerance) is satisfied, stop the iteration
if (rms/init_rms) <= 1e-12
break
end
end
```

**Figure 20.** Comparison of exact solution and numerical solution computed using Gauss-Seidel iterative method for the Poisson equation.
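The stopping test in the listing above can be seen in context in the following compact sketch of a Gauss-Seidel solver for the Poisson equation. This is an illustrative NumPy version, not the paper's Julia listing; the grid, boundary conditions (homogeneous Dirichlet), and default tolerance are assumptions chosen to mirror the fragment:

```python
import numpy as np

def gauss_seidel_poisson(f, dx, dy, max_iter=100_000, tol=1e-12):
    """Solve d2u/dx2 + d2u/dy2 = f with homogeneous Dirichlet BCs by
    Gauss-Seidel sweeps; stop when rms/init_rms <= tol (illustrative sketch)."""
    nx, ny = f.shape
    u = np.zeros_like(f)
    den = 2.0 / dx**2 + 2.0 / dy**2

    def rms_residual(u):
        # root-mean-square of the residual f - Laplacian(u) at interior points
        r = f[1:-1, 1:-1] - (
            (u[2:, 1:-1] - 2 * u[1:-1, 1:-1] + u[:-2, 1:-1]) / dx**2
            + (u[1:-1, 2:] - 2 * u[1:-1, 1:-1] + u[1:-1, :-2]) / dy**2)
        return np.sqrt(np.mean(r**2))

    init_rms = rms_residual(u)
    for _ in range(max_iter):
        # in-place sweep: freshly updated neighbours are used immediately,
        # which is what distinguishes Gauss-Seidel from Jacobi
        for i in range(1, nx - 1):
            for j in range(1, ny - 1):
                u[i, j] = ((u[i + 1, j] + u[i - 1, j]) / dx**2
                           + (u[i, j + 1] + u[i, j - 1]) / dy**2
                           - f[i, j]) / den
        # the convergence criterion from the listing: relative rms residual
        if rms_residual(u) / init_rms <= tol:
            break
    return u
```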

#### 6.2.2. Non-Stationary Methods: Conjugate Gradient Algorithm

Non-stationary methods differ from stationary methods in that the iteration matrix changes at every iteration. These methods work by forming a basis from the sequence of matrix powers applied to the initial residual. This basis is called the Krylov subspace and is given by $\mathcal{K}_n(A, b) = \text{span}\{b, Ab, A^2b, \ldots, A^{n-1}b\}$. The approximate solution to the linear system is found by minimizing the residual over this subspace. In this paper, we discuss the conjugate gradient method, which is one of the most effective methods for symmetric positive definite systems.
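The idea of minimizing the residual over a Krylov subspace can be made concrete with a short NumPy sketch. The small SPD matrix below is hypothetical, chosen only for illustration; each new basis vector is simply $A$ applied to the previous one:

```python
import numpy as np

# Hypothetical 4x4 symmetric positive definite system (illustration only).
A = np.array([[4., 1., 0., 0.],
              [1., 4., 1., 0.],
              [0., 1., 4., 1.],
              [0., 0., 1., 4.]])
b = np.ones(4)

# Krylov subspace K_n(A, b) = span{b, Ab, A^2 b, ..., A^(n-1) b}:
# each basis vector is A applied to the previous one.
n = 3
K = [b]
for _ in range(n - 1):
    K.append(A @ K[-1])
K = np.column_stack(K)          # columns span K_3(A, b)

# Minimizing the residual over this subspace (least squares on A K y = b)
# already yields a good approximation u3 = K y to the true solution.
y, *_ = np.linalg.lstsq(A @ K, b, rcond=None)
u3 = K @ y
```

Practical Krylov methods such as conjugate gradient never form this basis explicitly; they build orthogonal directions one at a time with short recurrences.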

The conjugate gradient method proceeds by generating sequences of approximate solutions, residuals corresponding to those approximations, and search directions used to update both. The approximate solution $u^{(k)}$ is updated at every iteration by a scalar multiple $\alpha_k$ of the search direction vector $p^{(k)}$:

$$
u^{(k+1)} = u^{(k)} + \alpha_k p^{(k)}.\tag{108}
$$

Correspondingly, the residuals $r^{(k)}$ are updated as

$$r^{(k+1)} = r^{(k)} - \alpha_k q^{(k)} \quad \text{where} \quad q^{(k)} = A p^{(k)}.\tag{109}$$

The search directions are then updated using the residuals

$$
p^{(k+1)} = r^{(k+1)} + \beta_k p^{(k)},\tag{110}
$$

where the choice $\beta_k = r^{(k)T} r^{(k)} / r^{(k-1)T} r^{(k-1)}$ ensures that $p^{(k+1)}$ and $r^{(k+1)}$ are orthogonal to all previous $Ap^{(k)}$ and $r^{(k)}$, respectively. A detailed derivation of the conjugate gradient method can be found in [42]. The complete procedure for the conjugate gradient method is given in Algorithm 4, and its implementation in Julia is given in Listing 22.
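The update formulas (108)-(110) translate almost line for line into code. The following is a compact NumPy sketch of the method for a general SPD matrix, not the paper's Julia Listing 22; the tolerance and iteration cap are illustrative defaults:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-12, max_iter=None):
    """Conjugate gradient for a symmetric positive definite A,
    following the update formulas (108)-(110). Illustrative sketch."""
    n = len(b)
    max_iter = max_iter or n          # CG converges in at most n steps in exact arithmetic
    u = np.zeros(n)
    r = b - A @ u                     # initial residual
    p = r.copy()                      # initial search direction
    rho = r @ r
    for _ in range(max_iter):
        q = A @ p                     # q^(k) = A p^(k)
        alpha = rho / (p @ q)         # step length alpha_k
        u += alpha * p                # Eq. (108): update the solution
        r -= alpha * q                # Eq. (109): update the residual
        rho_new = r @ r
        if np.sqrt(rho_new) <= tol:
            break
        beta = rho_new / rho          # beta_k = r^(k)T r^(k) / r^(k-1)T r^(k-1)
        p = r + beta * p              # Eq. (110): update the search direction
        rho = rho_new
    return u
```

Note that each iteration needs only one matrix-vector product ($Ap^{(k)}$) and a handful of inner products, which is what makes the method attractive for large sparse systems.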

The comparison of the exact and numerical solutions is shown in Figure 21. The residual history for the conjugate gradient method is displayed in Figure 24. It can be seen that the rate of convergence is slower at the start of the iterations; as the conjugate gradient algorithm finds better search directions for reducing the residual, the rate of convergence increases. There are several types of iterative methods that are not discussed in this paper; for further reading we refer to the book *Iterative Methods for Sparse Linear Systems* [43].

**Listing 22.** Implementation of conjugate gradient method for Poisson equation in Julia.

```