Article

A Parameterized Multi-Splitting Iterative Method for Solving the PageRank Problem

Yajun Xie, Lihua Hu and Changfeng Ma

1 College of Economics and Management, Nanchang Normal College of Applied Technology, Nanchang 330108, China
2 School of Big Data, Fuzhou University of International Studies and Trade, Fuzhou 350202, China
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Mathematics 2023, 11(15), 3320; https://doi.org/10.3390/math11153320
Submission received: 26 June 2023 / Revised: 19 July 2023 / Accepted: 26 July 2023 / Published: 28 July 2023

Abstract

In this paper, a new multi-parameter iterative algorithm is proposed for the PageRank problem, based on the multi-splitting iteration method. By splitting the coefficient matrix, the proposed method solves two linear subsystems at each step, employing inner and outer iterations to find approximate solutions of these subsystems. The iterative sequence generated by the algorithm converges to the PageRank vector provided the parameters satisfy certain conditions. Numerical experiments show that the proposed algorithm has better convergence and numerical stability than existing algorithms.

1. Introduction

Consider the following linear system of equations:

$$Ax = x,$$

where A is the Google matrix, a convex combination of the matrices P and E given by $A = \alpha P + (1-\alpha)E$. Here $\alpha \in (0,1)$ denotes the damping factor that determines the weight given to the web link graph, $E = ve^{T}$ with $e = (1,1,\ldots,1)^{T} \in \mathbb{R}^{n}$, and $v = e/n$ is a personalization (or teleportation) vector. n is the dimension of P, and x is the desired eigenvector.
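For readers who want to experiment, the following minimal Python sketch builds the Google matrix for a made-up 4-page web graph and recovers the PageRank vector by repeated multiplication; the toy matrix and helper names are ours, not part of any cited method.

```python
import numpy as np

# Toy column-stochastic link matrix P of a hypothetical 4-page web graph;
# column j holds the out-link probabilities of page j (so e^T P = e^T).
P = np.array([[0.0, 0.5, 0.0, 0.0],
              [1.0, 0.0, 0.0, 0.5],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0, 0.0]])

n = P.shape[0]
alpha = 0.85                                   # damping factor in (0, 1)
e = np.ones(n)
v = e / n                                      # teleportation vector v = e/n
A = alpha * P + (1 - alpha) * np.outer(v, e)   # Google matrix A = aP + (1-a)ve^T

x = v.copy()                                   # power iteration x <- Ax;
for _ in range(200):                           # e^T x = 1 is preserved, so the
    x = A @ x                                  # limit solves Ax = x in (1)
print(x, np.allclose(A @ x, x))
```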
The system of linear equations (1) above is what we refer to as the PageRank problem. Thanks to the rapid development of the internet, Google’s PageRank algorithm has become one of the most well-known algorithms in web search engines. PageRank is a link analysis method used to rank web pages and assess their significance from the link structure of the Web by computing the principal eigenvector of the Google matrix; this computation forms the basis of the PageRank algorithm. Although Google’s exact ranking technology and calculation approaches have gradually evolved, the PageRank problem remains a major concern and has recently received a lot of attention in the scientific and engineering fields in the era of intelligence.
To solve the PageRank problem, the power method, the most classical algorithm, is easy to implement. However, every eigenvalue of the matrix A other than the principal eigenvalue is simply $\alpha$ times a corresponding eigenvalue of the matrix P. As a result, the power method converges very slowly when the subdominant eigenvalue of A is close in modulus to the principal eigenvalue, that is, when the damping factor is close to 1. The power method is therefore not the ideal way to solve this problem, and a quicker way to compute the principal eigenvector of the Google matrix is required to speed up the PageRank calculation: the network graph is extremely large, with billions or even tens of billions of web page nodes, and a good search algorithm should minimize the lag time from the moment a search query is issued to the moment results are returned to the web browser. In recent years, numerous researchers have proposed various methods to speed up the calculation of PageRank. For instance, Gleich et al. [1] proposed an inner–outer iteration method combined with Richardson iteration, in which each iteration solves a linear system whose algebraic structure is similar to that of the original system; Gu and Xie [2] proposed the PIO iteration algorithm, which combines the power method and the inner–outer iteration method; after that, Xie and Ma [3] suggested a relaxed two-step splitting iteration strategy for the PageRank problem based on [1,2], adding a new relaxation parameter; Gu et al. [4] introduced a two-parameter iteration approach based on multiplicative splitting iteration in order to increase the possibility of optimizing the iterative process. Based on the iteration framework in [5] and the relaxed two-step splitting (RTSS) iteration method [3], two relaxed iteration techniques were presented by Tian et al. [6] for resolving the PageRank issue. In [7], Mendes et al. provided a novel approach combining the lumping method and the Matrix Analogue of the Accelerated Overrelaxation (MAAOR) method for the solution of the linear system, which showed superior performance. Additionally, the PageRank problem can be solved by Krylov subspace methods, an important class of methods for solving linear equations. For instance, Wu and Wei proposed the hybrid power-Arnoldi algorithm [8], which combines the power method and the thick-restart Arnoldi algorithm, as well as the Arnoldi-extrapolation method [9]; see also the accelerated Arnoldi-type algorithm [21] and the preconditioned, extrapolation-accelerated GMRES method [10]. We refer to [3,4,5,6,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26] for more in-depth theoretical research.
Since the method of [4] is strongly competitive in solving the PageRank problem and has received wide attention, this paper considers further improving its convergence performance. This inspires us to take advantage of parameterized techniques to improve performance, which will be shown to give convincing results in both theoretical analysis and numerical experiments.
The structure of this paper is as follows: the inner–outer iterative techniques for the PageRank problem are briefly introduced in Section 2. In Section 3, we first examine the theoretical foundations of the multiplicative splitting iteration method and then introduce our new approach, the parameterized MSI (PMSI) iteration method. In Section 4, numerical results are illustrated in detail. Finally, Section 5 provides a few succinct closing remarks.

2. The Inner–Outer Method

First, we briefly review the inner–outer iteration procedure proposed by Gleich et al. [1] for computing the PageRank problem. The eigenvector problem (1) can be rewritten as the following linear system:

$$(I - \alpha P)x = (1-\alpha)v,$$

since $e^{T}x = 1$.
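Indeed, expanding $Ax = x$ with the normalization $e^{T}x = 1$ gives in one line

$$x = Ax = \alpha Px + (1-\alpha)ve^{T}x = \alpha Px + (1-\alpha)v,$$

which rearranges to (2).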
Clearly, the PageRank problem is relatively simple to solve when the damping factor is small. Rather than solving Equation (2) directly, Gleich et al. defined an outer iteration with a smaller damping factor β (0 < β < α). The linear system (2) is therefore rearranged as the equation
$$(I - \beta P)x = (\alpha - \beta)Px + (1-\alpha)v.$$
So the following stationary outer iteration scheme is generated:
$$(I - \beta P)x^{(k+1)} = (\alpha - \beta)Px^{(k)} + (1-\alpha)v, \quad k = 0,1,2,\ldots.$$
For computing $x^{(k+1)}$, define the inner linear system as
$$(I - \beta P)y = f,$$
where $f = (\alpha - \beta)Px^{(k)} + (1-\alpha)v$, and compute $x^{(k+1)}$ via the Richardson inner iteration
$$y^{(j+1)} = \beta P y^{(j)} + (\alpha - \beta)Px^{(k)} + (1-\alpha)v, \quad j = 0,1,2,\ldots,l-1,$$
where $y^{(0)} = x^{(k)}$. The l-th step inner solution $y^{(l)}$ is assigned to be the next iterate $x^{(k+1)}$. The stopping criteria are given as follows: the outer iteration terminates when the residual of (2) satisfies
$$\|(1-\alpha)v - (I - \alpha P)x^{(k+1)}\|_{2} < \tau,$$
while the inner iteration (6) terminates if
$$\|f - (I - \beta P)y^{(j+1)}\| < \eta, \quad j = 0,1,2,\ldots,l-1,$$
where η and τ are the inner and outer tolerances, respectively.
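The following Python sketch summarizes the scheme (4)–(8); the norm choices and the max_inner safeguard are implementation choices of this sketch rather than prescriptions of [1].

```python
import numpy as np

def inner_outer(P, alpha=0.99, beta=0.5, tau=1e-8, eta=1e-2, max_inner=100):
    """Inner-outer iteration of Gleich et al. for (I - alpha*P)x = (1-alpha)v.

    P must be column-stochastic (a dense array or a scipy sparse matrix).
    Outer: (I - beta*P) x_{k+1} = (alpha - beta) P x_k + (1-alpha) v.
    Inner: Richardson sweeps y <- beta*P*y + f approximate each outer solve.
    """
    n = P.shape[0]
    v = np.ones(n) / n
    x = v.copy()
    # Outer loop: residual of (2) tested against tau, as in (7).
    while np.linalg.norm((1 - alpha) * v - (x - alpha * (P @ x))) >= tau:
        f = (alpha - beta) * (P @ x) + (1 - alpha) * v
        y = x.copy()                    # y^(0) = x^(k)
        for _ in range(max_inner):      # Richardson inner iteration (6)
            y = beta * (P @ y) + f
            if np.linalg.norm(f - (y - beta * (P @ y))) < eta:  # inner test (8)
                break
        x = y                           # y^(l) becomes x^(k+1)
    return x
```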
Gu et al. suggested the multi-splitting iteration (MSI) approach in [4] to expedite the PageRank vector calculation. We now give a quick overview of the MSI approach, which entails writing $I - \alpha P$ as

$$I - \alpha P = (I - \beta_1 P) - (\alpha - \beta_1)P = (I - \beta_2 P) - (\alpha - \beta_2)P,$$

where $0 < \beta_1 < \alpha$ and $0 < \beta_2 < \alpha$.
The MSI iteration method can be summarized as follows: given an initial vector $x^{(0)}$, the two-step iteration

$$\begin{cases} (I - \beta_1 P)u^{(k+1)} = (\alpha - \beta_1)Px^{(k)} + (1-\alpha)v, \\ (I - \beta_2 P)x^{(k+1)} = (\alpha - \beta_2)Pu^{(k+1)} + (1-\alpha)v \end{cases}$$

is performed until the sequence $\{x^{(k)}\}$ converges to the exact solution $x^{*}$, where $k = 0,1,2,\ldots$.
Theorem 1 ([4]). Let $M_i = I - \beta_i P$, $N_i = (\alpha - \beta_i)P$ be the two splittings of the matrix $I - \alpha P$ $(i = 1,2)$, and let $\alpha$ be the damping factor in the PageRank linear system. Then the iterative matrix $\tilde H_{MSI}(\beta_1,\beta_2)$ of the MSI method for PageRank computation is given by

$$\tilde H_{MSI}(\beta_1,\beta_2) = (I - \beta_2 P)^{-1}(\alpha - \beta_2)P\,(I - \beta_1 P)^{-1}(\alpha - \beta_1)P,$$

and its spectral radius $\rho(\tilde H_{MSI}(\beta_1,\beta_2))$ is bounded by

$$\sigma(\beta_1,\beta_2) \triangleq \frac{(\alpha - \beta_2)(\alpha - \beta_1)}{(1 - \beta_2)(1 - \beta_1)};$$

therefore, it holds that

$$\rho(\tilde H_{MSI}(\beta_1,\beta_2)) \le \sigma(\beta_1,\beta_2) < 1, \quad 0 \le \beta_1 < \alpha,\; 0 \le \beta_2 < \alpha.$$

The multiplicative splitting iteration method for PageRank computation converges to the unique solution $x^{*} \in \mathbb{C}^{n}$ of the linear system of equations (2), i.e., the PageRank vector.
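As a quick numerical illustration, with the parameter choices $\beta_1 = 0.9$, $\beta_2 = 0.8$ adopted later in Section 4 and $\alpha = 0.99$, the bound evaluates to

$$\sigma(0.9,\,0.8) = \frac{(0.99-0.8)(0.99-0.9)}{(1-0.8)(1-0.9)} = \frac{0.19 \times 0.09}{0.02} = 0.855 < 1.$$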

3. The Parameterized MSI Iteration Method

As noted above, the PageRank problem can be solved more easily when the damping factor $\alpha$ is smaller. We therefore introduce a parameter $\omega$ into the MSI method in order to further control the influence of $\alpha$, reduce the spectral radius, and speed up convergence. The resulting method, denoted the parameterized MSI (PMSI) method below, is described as follows.
The PMSI iteration method reads

$$\begin{cases} (I - \beta_1 P)u^{(k+1)} = (\omega\alpha - \beta_1)Px^{(k)} + (1-\omega)x^{(k)} + \omega(1-\alpha)v, \\ (I - \beta_2 P)x^{(k+1)} = (\omega\alpha - \beta_2)Pu^{(k+1)} + (1-\omega)u^{(k+1)} + \omega(1-\alpha)v, \end{cases}$$

with $\omega > 0$, $0 < \beta_1 < \alpha < 1$, and $0 < \beta_2 < \alpha < 1$. If $\omega = 1$, the PMSI iteration method reduces to the MSI iteration method. The procedure is summarized in Algorithm 1.
Algorithm 1 PMSI method
Input: P, v, α, β1, β2, ω, τ, η;
Output: x.
1:  x ← v, y ← Px
2:  while ‖(1 − α)v + αy − x‖₁ ≥ τ do
3:      f1 ← (ωα − β1)y + (1 − ω)x + ω(1 − α)v
4:      repeat
5:          x ← β1·y + f1
6:          y ← Px
7:      until ‖f1 + β1·y − x‖₁ < η
8:      f2 ← (ωα − β2)y + (1 − ω)x + ω(1 − α)v
9:      repeat
10:         x ← β2·y + f2
11:         y ← Px
12:     until ‖f2 + β2·y − x‖₁ < η
13:     x ← β2·y + f2
14:     y ← Px
15: end while
16: x ← αy + (1 − α)v
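A compact Python sketch of Algorithm 1 follows; it mirrors the pseudocode line by line (including the 1-norm tests), and setting omega = 1 recovers the MSI iteration. The max_inner safeguard and the dense/sparse matrix type are choices of this sketch, not part of the algorithm as stated.

```python
import numpy as np

def pmsi(P, alpha=0.99, beta1=0.9, beta2=0.8, omega=0.9,
         tau=1e-8, eta=1e-2, max_inner=100):
    """Parameterized multi-splitting iteration (Algorithm 1) for PageRank.

    Solves (I - alpha*P)x = (1-alpha)v with v = e/n by alternating two
    relaxed inner Richardson solves, one per splitting M_i = I - beta_i*P.
    """
    n = P.shape[0]
    v = np.ones(n) / n
    x = v.copy()
    y = P @ x
    while np.linalg.norm((1 - alpha) * v + alpha * y - x, 1) >= tau:
        for beta in (beta1, beta2):     # the two half-steps of one PMSI sweep
            f = (omega * alpha - beta) * y + (1 - omega) * x \
                + omega * (1 - alpha) * v
            for _ in range(max_inner):  # inner Richardson sweeps
                x = beta * y + f
                y = P @ x
                if np.linalg.norm(f + beta * y - x, 1) < eta:
                    break
        x = beta2 * y + f               # extra smoothing step (f holds f2 here)
        y = P @ x
    return alpha * y + (1 - alpha) * v  # final update of Algorithm 1
```

Called on the toy matrix P from the sketch in Section 1 with alpha=0.85, pmsi(P, alpha=0.85) should reproduce the same PageRank vector as the power iteration there, since both solve the same problem.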
Remark 1.
The computational cost of the PMSI iteration is only somewhat higher than that of the MSI iteration (9), since each half-step merely requires one additional saxpy-type operation, i.e., the vector addition $(1-\omega)u^{(k+1)}$ when forming $(\omega\alpha - \beta_2)Pu^{(k+1)} + (1-\omega)u^{(k+1)}$, at a cost of $O(n)$ flops per iteration.
In the sequel, we will analyze the convergence property of the parameterized MSI iteration method.
Lemma 1
([4]). Let $A \in \mathbb{C}^{n\times n}$, let $A = M_i - N_i$ $(i = 1,2)$ be two splittings of the matrix A, and let $x^{(0)} \in \mathbb{C}^{n}$ be a given initial vector. If $\{x^{(k)}\}$ is the sequence generated by the two-step iteration

$$\begin{cases} M_1 x^{(k+\frac{1}{2})} = N_1 x^{(k)} + b, \\ M_2 x^{(k+1)} = N_2 x^{(k+\frac{1}{2})} + b, \end{cases}$$

then

$$x^{(k+1)} = M_2^{-1}N_2M_1^{-1}N_1 x^{(k)} + M_2^{-1}\bigl(I + N_2M_1^{-1}\bigr)b, \quad k = 0,1,2,\ldots.$$

Moreover, if the spectral radius $\rho(M_2^{-1}N_2M_1^{-1}N_1)$ is less than 1, then the iteration sequence $\{x^{(k)}\}$ converges to the unique solution $x^{*} \in \mathbb{C}^{n}$ of the system of linear equations (2) for all initial vectors $x^{(0)} \in \mathbb{C}^{n}$.
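Indeed, substituting the half-step into the second equation verifies the one-step form:

$$x^{(k+1)} = M_2^{-1}\bigl(N_2x^{(k+\frac{1}{2})} + b\bigr) = M_2^{-1}N_2M_1^{-1}\bigl(N_1x^{(k)} + b\bigr) + M_2^{-1}b = M_2^{-1}N_2M_1^{-1}N_1x^{(k)} + M_2^{-1}\bigl(I + N_2M_1^{-1}\bigr)b.$$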
The multiplicative splitting iteration method for (2) is naturally related to the splitting of the coefficient matrix $I - \alpha P$, and we will subsequently demonstrate that there exists a plausible convergence domain of the two parameters for the parameterized method. Consider $I - \alpha P = M_i - N_i$ $(i = 1,2)$ with

$$M_1 = I - \beta_1 P, \quad N_1 = (\omega\alpha - \beta_1)P + (1-\omega)I, \qquad M_2 = I - \beta_2 P, \quad N_2 = (\omega\alpha - \beta_2)P + (1-\omega)I.$$
According to (13), the two-step iterative matrix corresponding to the multiplicative splitting iterative method is

$$\tilde G_{PMSI}(\beta_1,\beta_2) = M_2^{-1}N_2M_1^{-1}N_1 = (I-\beta_2 P)^{-1}\bigl[(\omega\alpha-\beta_2)P + (1-\omega)I\bigr](I-\beta_1 P)^{-1}\bigl[(\omega\alpha-\beta_1)P + (1-\omega)I\bigr].$$
Now we examine the convergence property of the multiplicative splitting iterative method. By applying Lemma 1, we can obtain the following main theorem.
Theorem 2.
Let $\alpha$ be the damping factor in the PageRank linear system, and let $M_i = I - \beta_i P$, $N_i = (\omega\alpha - \beta_i)P + (1-\omega)I$ $(i = 1,2)$ be the splitting schemes of the matrix $I - \alpha P$. Then the iterative matrix $\tilde G_{PMSI}(\beta_1,\beta_2)$ of the PMSI method for PageRank computation is given by

$$\tilde G_{PMSI}(\beta_1,\beta_2) = (I-\beta_2 P)^{-1}\bigl[(\omega\alpha-\beta_2)P + (1-\omega)I\bigr](I-\beta_1 P)^{-1}\bigl[(\omega\alpha-\beta_1)P + (1-\omega)I\bigr],$$

and its spectral radius $\rho(\tilde G_{PMSI}(\beta_1,\beta_2))$ is bounded by

$$\psi(\beta_1,\beta_2) \triangleq 1 - \frac{(1-\alpha)\omega\,[2 - \beta_1 - \beta_2 - \omega(1-\alpha)]}{(1-\beta_1)(1-\beta_2)}.$$

Therefore, it holds that

$$\rho(\tilde G_{PMSI}(\beta_1,\beta_2)) \le \psi(\beta_1,\beta_2) < 1, \quad 0 \le \beta_1 < \alpha,\; 0 \le \beta_2 < \alpha.$$

That is, the multiplicative splitting iteration method for PageRank computation converges to the unique solution $x^{*} \in \mathbb{C}^{n}$ of the linear system of equations.
Proof.
From Lemma 1, we obtain the iterative matrix (17) of the PMSI method for PageRank computation. Let $\beta = \min\{\beta_1,\beta_2\}$. Since $e^{T}P = e^{T}$ and $\beta/\alpha \le \omega \le 1$, the matrices $(\omega\alpha - \beta_i)P + (1-\omega)I$ $(i = 1,2)$ are nonnegative, and hence $\tilde G_{PMSI}(\beta_1,\beta_2)$ is also nonnegative.

In addition, from (17) and the fact that $e^{T}(I - \beta_i P)^{-1} = \frac{1}{1-\beta_i}e^{T}$, it follows that

$$e^{T}\tilde G_{PMSI}(\beta_1,\beta_2) = e^{T}(I-\beta_2P)^{-1}\bigl[(\omega\alpha-\beta_2)P+(1-\omega)I\bigr](I-\beta_1P)^{-1}\bigl[(\omega\alpha-\beta_1)P+(1-\omega)I\bigr] = \frac{[(\omega\alpha-\beta_2)+(1-\omega)][(\omega\alpha-\beta_1)+(1-\omega)]}{(1-\beta_2)(1-\beta_1)}\,e^{T}.$$

Hence the spectral radius of $\tilde G_{PMSI}(\beta_1,\beta_2)$ is

$$\rho(\tilde G_{PMSI}(\beta_1,\beta_2)) = \frac{[(\omega\alpha-\beta_2)+(1-\omega)][(\omega\alpha-\beta_1)+(1-\omega)]}{(1-\beta_1)(1-\beta_2)}.$$

Obviously, the numerator of the above formula can be transformed as

$$[(\omega\alpha-\beta_2)+(1-\omega)][(\omega\alpha-\beta_1)+(1-\omega)] = [\omega(\alpha-1)+1-\beta_2][\omega(\alpha-1)+1-\beta_1] = (\omega(\alpha-1))^{2} + \omega(\alpha-1)(2-\beta_2-\beta_1) + (1-\beta_2)(1-\beta_1) = (1-\beta_2)(1-\beta_1) - (1-\alpha)\omega\,[2-\beta_2-\beta_1-\omega(1-\alpha)].$$

Since $2-\beta_2-\beta_1-\omega(1-\alpha) > 0$ (indeed, it is at least $1+\alpha-\beta_1-\beta_2 > 1-\alpha > 0$), combining the above relations (21) and (22), we have

$$\rho(\tilde G_{PMSI}(\beta_1,\beta_2)) = 1 - \frac{(1-\alpha)\omega\,[2-\beta_2-\beta_1-\omega(1-\alpha)]}{(1-\beta_1)(1-\beta_2)} < 1.$$

So for any given constants $\beta_1$ and $\beta_2$ with $0 \le \beta_1 < \alpha$, $0 \le \beta_2 < \alpha$, the PMSI method converges to the unique solution of the linear system (2). □
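As a numerical sanity check with the sample parameters $\alpha = 0.99$, $\beta_1 = 0.9$, $\beta_2 = 0.8$, and $\omega = 0.9$ used later in Section 4,

$$\psi(0.9,\,0.8) = 1 - \frac{0.01 \times 0.9 \times [2 - 0.9 - 0.8 - 0.9 \times 0.01]}{(1-0.9)(1-0.8)} = 1 - \frac{0.009 \times 0.291}{0.02} \approx 0.869 < 1,$$

which agrees with evaluating (21) directly: $(0.191 \times 0.091)/0.02 \approx 0.869$.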
Since $0 < \beta_i < \alpha < 1$ $(i = 1,2)$, we consider the range $\beta/\alpha < \omega < 1$ with $\beta = \min\{\beta_1,\beta_2\}$. A comparison result for the parameterized MSI iteration method against the MSI iteration method is then immediately obtained.
Theorem 3.
Let $0 < \beta_i < \alpha < 1$ $(i = 1,2)$ and $\beta = \min\{\beta_1,\beta_2\}$. If $\beta/\alpha < \omega < 1$, then the parameterized MSI iteration method converges faster than the MSI iteration method.
Proof.
From Equation (21), it follows that the spectral radius of the parameterized MSI iteration method is

$$\rho(\tilde G_{PMSI}(\beta_1,\beta_2)) = \frac{[(\omega\alpha-\beta_2)+(1-\omega)][(\omega\alpha-\beta_1)+(1-\omega)]}{(1-\beta_1)(1-\beta_2)}.$$

Letting $\omega = 1$ in (24), we obtain the spectral radius of the MSI iteration method:

$$\rho(\tilde H_{MSI}(\beta_1,\beta_2)) = \frac{(\alpha-\beta_2)(\alpha-\beta_1)}{(1-\beta_1)(1-\beta_2)}.$$

For $0 < \beta_i < \alpha < 1$ and $\beta_i/\alpha < \omega < 1$ $(i = 1,2)$, from (24) and (25) it follows that

$$\rho(\tilde G_{PMSI}(\beta_1,\beta_2)) = \frac{[(\omega\alpha-\beta_2)+(1-\omega)][(\omega\alpha-\beta_1)+(1-\omega)]}{(1-\beta_1)(1-\beta_2)} = \frac{(\omega\alpha-\beta_2)(\omega\alpha-\beta_1)-(1-\omega)^{2}}{(1-\beta_1)(1-\beta_2)} < \frac{(\omega\alpha-\beta_2)(\omega\alpha-\beta_1)}{(1-\beta_1)(1-\beta_2)} < \frac{(\alpha-\beta_2)(\alpha-\beta_1)}{(1-\beta_1)(1-\beta_2)} = \rho(\tilde H_{MSI}(\beta_1,\beta_2)).$$

So $\rho(\tilde G_{PMSI}(\beta_1,\beta_2)) < \rho(\tilde H_{MSI}(\beta_1,\beta_2))$, which completes the proof. □
Corollary 1.
Let $\beta = \min\{\beta_1,\beta_2\}$, $0 < \beta_i < \alpha < 1$ $(i = 1,2)$, and $\beta/\alpha < \omega < 1$. As ω increases within this range, the iterative spectral radius of the PMSI algorithm becomes smaller and the convergence becomes faster.
Proof.
According to Equation (21), we know that

$$\rho(\tilde G_{PMSI}(\beta_1,\beta_2)) = \frac{[(\omega\alpha-\beta_2)+(1-\omega)][(\omega\alpha-\beta_1)+(1-\omega)]}{(1-\beta_1)(1-\beta_2)}.$$

Let

$$\tilde f(\omega) = \frac{[(\omega\alpha-\beta_2)+(1-\omega)][(\omega\alpha-\beta_1)+(1-\omega)]}{(1-\beta_1)(1-\beta_2)} = \frac{[\omega(\alpha-1)+1-\beta_2][\omega(\alpha-1)+1-\beta_1]}{(1-\beta_1)(1-\beta_2)} = \frac{(\omega(\alpha-1))^{2} + \omega(\alpha-1)(2-\beta_1-\beta_2) + (1-\beta_1)(1-\beta_2)}{(1-\beta_1)(1-\beta_2)}.$$

Thus

$$\tilde f'(\omega) = \frac{2\omega(\alpha-1)^{2} + (\alpha-1)(2-\beta_1-\beta_2)}{(1-\beta_1)(1-\beta_2)} = \frac{(\alpha-1)\,[2\omega(\alpha-1)+2-\beta_1-\beta_2]}{(1-\beta_1)(1-\beta_2)}.$$

Since $\beta/\alpha < \omega < 1$ and $0 < \beta_i < \alpha < 1$ $(i = 1,2)$, we have $2\omega(\alpha-1)+2-\beta_1-\beta_2 > 2(\alpha-1)+2-\beta_1-\beta_2 = 2\alpha-\beta_1-\beta_2 > 0$, while $\alpha - 1 < 0$; hence $\tilde f'(\omega) < 0$. Therefore, the PMSI method will be more efficient when ω is larger. □
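For illustration, with $\alpha = 0.99$, $\beta_1 = 0.9$, and $\beta_2 = 0.8$, increasing ω from 0.4 to 0.9 indeed shrinks the spectral radius:

$$\tilde f(0.4) = \frac{0.196 \times 0.096}{0.02} \approx 0.941, \qquad \tilde f(0.9) = \frac{0.191 \times 0.091}{0.02} \approx 0.869.$$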

4. Numerical Results

In this section, we compare the performance of the parameterized multi-splitting (PMSI) iteration method with that of the inner–outer (IO) and multi-splitting (MSI) iteration methods, respectively. Numerical experiments are carried out in MATLAB R2018a on a dual-core processor (2.30 GHz, 8 GB RAM). Four indicators are used to evaluate these iterative approaches: the number of matrix–vector products (denoted MV), the number of iteration steps (denoted IT), the computing time in seconds (denoted CPU), and the relative residual (denoted res(k)). Further, we define
$$res(k) = \frac{\|r_k\|_{2}}{\|(1-\alpha)v\|_{2}}, \quad k = 0,1,\ldots,$$

where $r_k = (1-\alpha)v - (I-\alpha P)x_k$.
Table 1 lists the properties of the test matrices P, where nnz denotes the number of nonzero elements and the density is defined by

$$\rho_{den} = \frac{nnz}{n \times n} \times 100.$$
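For example, for the amazon0312 matrix in Table 1,

$$\rho_{den} = \frac{3{,}200{,}440}{400{,}727 \times 400{,}727} \times 100 \approx 1.993 \times 10^{-3}.$$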
All test matrices can be downloaded from https://www.cise.ufl.edu/research/sparse/matrices/listbyid.html (accessed on 20 June 2023). In the interest of fairness, we take the teleportation vector as the initial guess for each test matrix, i.e., $x^{(0)} = v = e/n$ with $e = (1,1,\ldots,1)^{T}$. In all numerical tests, the damping factor α is set to 0.98, 0.99, 0.995, 0.997, and 0.998, respectively. All algorithms terminate when the residual tolerances are met, with inner tolerance η = 0.01 and outer tolerance τ = 10⁻⁸.
Example 1.
In this example, we compare the PMSI iteration method with the MSI iteration method. The test matrices are the wb-cs-stanford and amazon0312 matrices, respectively. In order to quantify the efficiency of the PMSI iteration method, we use

$$S_{PMSI} = \frac{CPU_{MSI} - CPU_{PMSI}}{CPU_{MSI}}$$

to describe the speedup of the PMSI iteration over the MSI iteration in terms of CPU time.
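For instance, for the wb-cs-stanford matrix with α = 0.98 in Table 2,

$$S_{PMSI} = \frac{0.1268 - 0.1079}{0.1268} \approx 14.90\%.$$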
The numerical results of the MSI and PMSI iterative procedures, where ω = 0.9, β1 = 0.9, and β2 = 0.8, are displayed in Table 2 and Table 3. These tables show that the PMSI iterative method outperforms the MSI iterative method in terms of IT, MV, and CPU time, especially for larger α, such as α = 0.998 in Table 2 and Table 3. As can be observed, most $S_{PMSI}$ values exceed 20%, sometimes even reaching 50%.
Example 2.
With the wb-cs-stanford and amazon0312 matrices as test matrices, we further examine the convergence performance of the PMSI iterative method for various values of the parameter ω. We set the range of ω to 0.4–0.9. The numerical results are shown in Figure 1, where β1 = 0.9 and β2 = 0.8. According to the findings, the number of iterations decreases consistently as ω increases. For this reason, we used ω = 0.9 in our experiments, which is consistent with the finding in Corollary 1.
Example 3.
Theorem 2 states that the PMSI method converges for any values of β1 and β2 satisfying 0 ≤ β1 < α and 0 ≤ β2 < α. This is what we examine in this example. For the two matrices wb-cs-stanford and amazon0312, Tables 4 and 5 display the number of iterations of the PMSI approach as β1 and β2 each vary from 0.1 to 0.9 with α = 0.99. From Tables 4 and 5, it can be seen that, once one of the parameters β1 and β2 is fixed, the number of iteration steps typically first decreases and then increases as the other parameter grows. Finding an explicit relationship between β1 and β2, or the optimal β1 and β2 for a general PageRank matrix, is quite difficult. Our considerable experience has shown that selecting β1 = 0.9 and β2 = 0.8 usually results in good performance. For this reason, we used β1 = 0.9 and β2 = 0.8 in the PMSI approach in our experiments.
Indeed, the selection of optimal parameters is difficult, so in Figure 2 the two parameters are simply sampled on a uniform grid; in other words, the parameters are tentatively selected from two-dimensional grid points.
All in all, whether from the decline rate of the residuals in Figures 3 and 4 or from the changes in the number of iterations under different parameter values in Figures 1 and 2, we can clearly see the convergence, feasibility, and superiority of the PMSI method.

5. Conclusions

In this paper, in order to further improve the two-step splitting iterative method, we propose a parameterized multi-splitting (PMSI) iterative method for the PageRank problem by introducing a relaxation parameter ω. When ω = 1, the PMSI method reduces to the MSI method. Theory and numerical experiments show that the sequence of iterates generated by the PMSI method converges to the PageRank vector when the parameters ω, β1, and β2 satisfy certain conditions, and that the proposed method has better convergence performance than the IO and MSI methods. Since the new algorithm is parameter-dependent, how to obtain the optimal parameters in the general case remains an open problem.

Author Contributions

Methodology, Y.X., L.H. and C.M.; validation, Y.X., L.H. and C.M.; investigation, L.H.; writing—review & editing, C.M.; visualization, Y.X.; funding acquisition, Y.X. and L.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Not Applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Gleich, D.-F.; Gray, A.-P.; Greif, C.; Lau, T. An inner-outer iteration for computing PageRank. SIAM J. Sci. Comput. 2010, 32, 349–371.
2. Gu, C.-Q.; Xie, F.; Zhang, K. A two-step matrix splitting iteration for computing PageRank. J. Comput. Appl. Math. 2015, 278, 19–28.
3. Xie, Y.-J.; Ma, C.-F. A relaxed two-step splitting iteration method for computing PageRank. J. Comput. Appl. Math. 2018, 37, 221–233.
4. Gu, C.-Q.; Wang, L. On the multi-splitting iteration method for computing PageRank. J. Comput. Appl. Math. 2013, 42, 479–490.
5. Tian, Z.-L.; Li, X.-J.; Liu, Z.-Y. A general multi-step matrix splitting iteration method for computing PageRank. Filomat 2021, 35, 679–706.
6. Tian, Z.-L.; Zhang, Y.; Wang, J.-X.; Gu, C.-Q. Several relaxed iteration methods for computing PageRank. J. Comput. Appl. Math. 2021, 388, 21.
7. Mendes, I.; Vasconcelos, P. PageRank computation with MAAOR and lumping methods. Math. Comput. Sci. 2018, 12, 129–141.
8. Wu, G.; Wei, Y.-M. A power-Arnoldi algorithm for computing PageRank. Numer. Linear Algebra Appl. 2007, 14, 521–546.
9. Wu, G.; Wei, Y.-M. An Arnoldi-extrapolation algorithm for computing PageRank. J. Comput. Appl. Math. 2010, 234, 3196–3212.
10. Pu, B.-Y.; Huang, T.-Z.; Wen, C. A preconditioned and extrapolation-accelerated GMRES method for PageRank. Appl. Math. Lett. 2014, 37, 95–100.
11. Tian, M.-Y.; Zhang, Y.; Wang, Y.-D.; Tian, Z.-L. A general multi-splitting iteration method for computing PageRank. Comput. Appl. Math. 2019, 38, 60.
12. Chen, X.-D.; Li, S.-Y. A generalized two-step splitting iterative method modified with the multi-step power method for computing PageRank. J. Numer. Methods Comput. Appl. 2018, 39, 243–252. (In Chinese)
13. Tian, Z.-L.; Liu, X.-Y.; Wang, Y.-D.; Wen, P.-H. The modified matrix splitting iteration method for computing PageRank problem. Filomat 2019, 33, 725–740.
14. Gu, C.-Q.; Nie, Y.; Wang, J.-B. Arnoldi-PIO algorithm for PageRank. J. Shanghai Univ. Nat. Sci. 2017, 23, 555–562. (In Chinese)
15. Qiu, Z.-H.; Gu, C.-Q. A GMRES-RPIO algorithm for computing PageRank problem. Numer. Math. J. Chin. Univ. 2018, 40, 331–345.
16. Gu, C.-Q.; Wang, W.-W. An Arnoldi-MSI algorithm for computing PageRank problems. Numer. Math. J. Chin. Univ. 2016, 38, 257–268. (In Chinese)
17. Gu, C.-Q.; Shao, C.-C. A GMRES-in/out algorithm for computing PageRank problems. J. Shanghai Univ. Nat. Sci. 2017, 23, 179–184. (In Chinese)
18. Gu, X.-M.; Lei, S.-L.; Zhang, K.; Shen, Z.-L.; Wen, C.; Carpentieri, B. A Hessenberg-type algorithm for computing PageRank problems. Numer. Algorithms 2022, 89, 1845–1863.
19. Xu, W.-K.; Chen, X.-D. A modified multi-splitting iterative method with the restarted GMRES to solve the PageRank problem. Appl. Math. Mech. 2022, 43, 330–340. (In Chinese)
20. Huang, N.; Ma, C.-F. Parallel multi-splitting iteration methods based on M-splitting for the PageRank problem. Appl. Math. Comput. 2015, 271, 337–343.
21. Wu, G.; Zhang, Y.; Wei, Y.-M. Accelerating the Arnoldi-type algorithm for the PageRank problem and the ProteinRank problem. J. Sci. Comput. 2013, 57, 74–104.
22. Tan, X.-Y. A new extrapolation method for PageRank computations. J. Comput. Appl. Math. 2017, 313, 383–392.
23. Wen, C.; Hu, Q.-Y.; Pu, B.-Y.; Huang, Y.-Y. Acceleration of an adaptive generalized Arnoldi method for computing PageRank. AIMS Math. 2021, 6, 893–907.
24. Guo, P.-C.; Gao, S.-C.; Guo, X.-X. A modified Newton method for multilinear PageRank. Taiwan. J. Math. 2018, 22, 1161–1171.
25. Pu, B.-Y.; Wen, C.; Hu, Q.-Y. A multi-power and multi-splitting inner-outer iteration for PageRank computation. Open Math. 2020, 18, 1709–1718.
26. Zhang, H.-F.; Huang, T.-Z.; Wen, C.; Shen, Z.-L. FOM accelerated by an extrapolation method for solving PageRank problems. J. Comput. Appl. Math. 2016, 296, 397–409.
Figure 1. Convergence effect of three algorithms for the wb-cs-stanford matrix, τ = 10⁻⁸.
Figure 2. Convergence effect of three algorithms for the amazon0312 matrix, τ = 10⁻⁸.
Figure 3. Numerical results for the wb-cs-stanford matrix in Example 2.
Figure 4. Numerical results for the amazon0312 matrix in Example 2.
Table 1. Properties of test matrices.

Matrix         | Size              | nnz       | ρ_den
wb-cs-stanford | 9914 × 9914       | 2,312,497 | 0.291 × 10⁻²
amazon0312     | 400,727 × 400,727 | 3,200,440 | 1.993 × 10⁻³
Table 2. Test results for the wb-cs-stanford matrix.

α     | Metric  | IO     | MSI         | PMSI        | S_PMSI
0.98  | IT (MV) | 536    | 270 (541)   | 228 (457)   |
      | CPU     | 0.1261 | 0.1268      | 0.1079      | 14.90%
0.99  | IT (MV) | 1096   | 537 (1075)  | 417 (835)   |
      | CPU     | 0.2151 | 0.2274      | 0.1648      | 27.52%
0.995 | IT (MV) | 2168   | 1095 (2191) | 962 (1525)  |
      | CPU     | 0.7280 | 0.3819      | 0.2942      | 22.96%
0.997 | IT (MV) | 3577   | 1806 (3613) | 1213 (2427) |
      | CPU     | 0.6380 | 0.5954      | 0.4350      | 26.93%
0.998 | IT (MV) | 5450   | 2698 (5397) | 1663 (3327) |
      | CPU     | 0.9354 | 0.8669      | 0.5862      | 32.37%
Table 3. Test results for the amazon0312 matrix.

α     | Metric  | IO      | MSI         | PMSI       | S_PMSI
0.98  | IT (MV) | 367     | 178 (357)   | 170 (341)  |
      | CPU     | 7.3867  | 6.8073      | 6.5658     | 3.54%
0.99  | IT (MV) | 733     | 363 (727)   | 292 (585)  |
      | CPU     | 15.7108 | 14.1045     | 12.0670    | 14.44%
0.995 | IT (MV) | 1436    | 723 (1447)  | 510 (1021) |
      | CPU     | 30.1137 | 29.4107     | 21.2650    | 27.69%
0.997 | IT (MV) | 2507    | 1164 (2329) | 717 (1435) |
      | CPU     | 30.1137 | 48.6659     | 27.8410    | 42.79%
0.998 | IT (MV) | 3630    | 1863 (3727) | 911 (1823) |
      | CPU     | 90.7846 | 75.3888     | 37.6927    | 50.00%
Table 4. Numerical results for the wb-cs-stanford matrix in Example 3.

β2\β1 | 0.1       | 0.2       | 0.3       | 0.4       | 0.5       | 0.6       | 0.7       | 0.8       | 0.9
0.1   | 418 (837) | 410 (821) | 419 (839) | 419 (839) | 414 (829) | 422 (845) | 415 (831) | 417 (835) | 415 (831)
0.3   | 416 (833) | 415 (831) | 418 (837) | 417 (835) | 420 (841) | 415 (831) | 426 (853) | 422 (845) | 417 (835)
0.5   | 412 (825) | 420 (841) | 415 (831) | 416 (833) | 425 (851) | 411 (823) | 426 (853) | 424 (569) | 418 (837)
0.7   | 418 (837) | 414 (829) | 417 (837) | 414 (829) | 419 (839) | 416 (833) | 412 (825) | 419 (839) | 417 (835)
0.9   | 420 (841) | 422 (845) | 428 (857) | 420 (841) | 424 (849) | 422 (845) | 416 (833) | 417 (835) | 412 (825)
Table 5. Numerical results for the amazon0312 matrix in Example 3.

β2\β1 | 0.1       | 0.2       | 0.3       | 0.4       | 0.5       | 0.6       | 0.7       | 0.8       | 0.9
0.1   | 293 (587) | 275 (551) | 262 (525) | 302 (605) | 284 (569) | 269 (539) | 275 (551) | 277 (555) | 262 (525)
0.3   | 262 (525) | 265 (531) | 294 (589) | 262 (525) | 263 (527) | 303 (607) | 271 (543) | 269 (539) | 258 (517)
0.5   | 281 (563) | 257 (535) | 269 (539) | 308 (617) | 256 (513) | 276 (553) | 303 (607) | 284 (569) | 278 (557)
0.7   | 322 (665) | 275 (551) | 303 (607) | 255 (511) | 301 (603) | 272 (545) | 269 (539) | 314 (629) | 264 (529)
0.9   | 259 (517) | 276 (553) | 274 (549) | 270 (541) | 251 (503) | 274 (549) | 272 (545) | 272 (545) | 255 (511)