Article

Efficient Maintenance of Minimum Spanning Trees in Dynamic Weighted Undirected Graphs

School of Computer Science, Hubei University of Technology, Wuhan 430068, China
* Author to whom correspondence should be addressed.
Mathematics 2024, 12(7), 1021; https://doi.org/10.3390/math12071021
Submission received: 3 February 2024 / Revised: 25 February 2024 / Accepted: 26 March 2024 / Published: 28 March 2024
(This article belongs to the Special Issue Advances in Graph Theory: Algorithms and Applications)

Abstract

This paper presents an algorithm for effectively maintaining the minimum spanning tree in dynamic weighted undirected graphs. The algorithm efficiently updates the minimum spanning tree when the underlying graph structure changes. By identifying the portion of the original tree that can be preserved in the updated tree, our algorithm avoids recalculating the minimum spanning tree from scratch. We provide proof of correctness for the proposed algorithm and analyze its time complexity. In general scenarios, the time complexity of our algorithm is comparable to that of Kruskal’s algorithm. However, the experimental results demonstrate that our algorithm outperforms the approach of recomputing the minimum spanning tree by using Kruskal’s algorithm, especially in medium- and large-scale dynamic graphs where the graph undergoes iterative changes.

1. Introduction

Given a weighted undirected connected graph, the minimum spanning tree (MST) problem asks for a subgraph that spans all vertices of the graph, contains no cycles, and minimizes the sum of edge weights. The problem was first introduced by Borůvka in 1926. Nesetril and Nesetrilová [1] provide a detailed overview of the origin and development of the MST problem. MST has widespread applications in real-life scenarios, including transportation problems, communication network design, and many other combinatorial optimization problems. For example, MST is commonly employed in clustering (see [2,3,4]). Constructing an MST from correlation matrices is a common method for studying relationships in financial markets (see [5,6,7]). MST is also used for analyzing brain networks (see [8,9,10]). The MST problem further arises in various production and everyday applications, such as power grid configuration [11], dynamic power distribution reconfiguration [12], and optimal configuration problems [13].
Real-world problems that require solving the MST problem often involve graphs of immense scale that change dynamically. Although polynomial-time algorithms exist for the MST problem, real-time computation on large-scale graphs remains challenging. Moreover, real-world scenarios typically modify only a small portion of the original graph, so recomputing the MST of the new graph from scratch involves significant redundant computation. This underscores the need for efficient maintenance of the MST in dynamic graphs. Solutions designed for static problems are generally inadequate to meet the requirements of dynamic graphs [14]. In the Internet of Things (IoT) domain, wireless ad hoc networks are commonly used and often employ a minimum dominating tree (MDT) structure as the backbone for network traffic management [15]. An MDT is a spanning tree that encompasses a specific dominating set within the network. Given their dynamic nature, ad hoc networks frequently change over time, necessitating corresponding adjustments to their backbone structure. To maintain the backbone efficiently, an effective solution to the dynamic minimum spanning tree (MST) problem is necessary. Moreover, an algorithm for the dynamic MST problem can also be employed as a sub-procedure in MDT algorithms to enhance convergence [16].
This paper studies the problem of updating the original MST of a graph when its structure changes. Given a weighted undirected graph and its MST, the problem asks for a new MST, computed on the basis of the original one, after the graph changes its structure, i.e., after edges or vertices are added or removed.
Existing algorithms for maintaining the minimum spanning tree in dynamic graphs are usually designed for specific scenarios. Frederickson [17] proposed a top-tree-based algorithm, designed for graphs in which the maximum vertex degree is three. Amato et al. [18] conducted the first experimental study of dynamic minimum spanning tree algorithms, implementing different versions of Frederickson's algorithm using partitioning and top trees. Henzinger and King [19] introduced a fully dynamic minimum spanning tree algorithm that is the fastest on random inputs. Holm et al. [20] improved the algorithm of Henzinger and King and were the first to give a fully dynamic minimum spanning tree algorithm with poly-logarithmic time complexity. Cattaneo et al. [21] further optimized several algorithms for this problem. Eppstein et al. [22] introduced sparsification techniques on top of Frederickson's algorithm, reducing the time complexity to O(n). Building on Eppstein's work, Kopelowitz et al. [23] proposed the algorithm with the best-known results, achieving a time complexity of O(n · lg n) per update, but only allowing single-edge updates. Tarjan and Werneck [24] conducted experiments on two variants of dynamic tree data structures (self-adjusting and worst-case) and compared their advantages and disadvantages. Chen et al. [25] proposed the LDP-MST algorithm, which calculates density based on the results of a natural neighbor search, identifies local density peaks, and then constructs the minimum spanning tree. Junior et al. [26] proposed an algorithm for a variant of the fully dynamic minimum spanning tree problem, addressing the incremental minimum spanning tree problem through full retroactivity. Anderson et al. [22] proposed a parallel batch-processing dynamic algorithm for incremental minimum spanning trees: they introduced a parallel data structure for the batch-incremental MSF problem, achieving polynomial time for each edge insertion, which is more efficient than the fastest sequential single-update data structure for sufficiently large batches. Tseng et al. [27] implemented parallelization based on Holm et al.'s work and proposed a parallel batch algorithm for maintaining the MSF.
After reviewing the current state of research, it is apparent that the majority of existing studies concentrate on scenarios involving single changes. In this paper, our objective is to explore the scenario where a portion of the graph is altered. Furthermore, we propose a method that is well suited for maintaining the MST when the graph undergoes iterative structural modifications, with only a small segment of the graph altered in each iteration.
The remainder of the paper is organized as follows: Section 2 describes the problem studied in this paper in detail; Section 3 introduces the proposed MST maintenance algorithm; Section 4 provides theoretical proofs and an analysis of the time complexity of the proposed algorithm; Section 5 presents the experiments conducted and evaluates the proposed algorithm; finally, Section 6 concludes the paper and outlines prospects for future work.

2. Problem Description

Given a weighted undirected graph G = ( V , E ) , where V is the set of vertices and E is the set of edges, and its minimum spanning tree, this paper investigates a rapid updating and generation algorithm for the new minimum spanning tree when partial structural changes occur in graph G. If the updated graph becomes a non-connected graph, the algorithm should provide a minimum spanning forest. This section describes two specific scenarios of graph changes and generalizes the common variations in the graph accordingly.

2.1. Minimum Spanning Tree of the Subgraph in the Original Graph

If the changes to the graph only involve removals from the original graph, the updated graph is a subgraph of the original. The problem then becomes finding the minimum spanning tree of a subgraph of the original graph, defined as follows: Given a weighted undirected graph G = (V, E) and its minimum spanning tree T, the task is to find the minimum spanning tree T′ of graph G′ = (V, E′), where E′ ⊆ E.
The MST of a subgraph may differ from or be identical to the MST of the original graph. Figure 1 illustrates an example of the subgraph MST problem. Given the original graph shown in Figure 1a, where solid lines represent edges in the MST (the same convention is used in the subsequent schematic figures), a subgraph is formed by removing edge (A, B), as depicted in Figure 1b. It can be observed that the MST of this subgraph differs from that of the original graph. If the removed edge is not part of the original MST, the MST of the subgraph is identical to the original MST, as shown in Figure 1c.
Removing a vertex from the original graph can be reduced to removing multiple edges. The minimum spanning tree obtained after removing a vertex from the original graph has the same weight as the minimum spanning forest obtained by removing all edges incident to this vertex: the vertex then becomes isolated, and its presence or absence does not affect the weight of the minimum spanning tree. For example, in the original graph shown in Figure 2a, after deleting vertex B, the subgraph and its minimum spanning tree are shown in Figure 2b. After deleting all edges connected to vertex B, the subgraph and its minimum spanning forest are shown in Figure 2c. Deleting a vertex from the original graph changes the minimum spanning tree.

2.2. Minimum Spanning Tree of the Supergraph in the Original Graph

If the modifications to the graph only involve additions to the original graph, then the updated graph is a supergraph of the original. The problem becomes finding the minimum spanning tree (MST) of a supergraph of the original graph, defined as follows: Given a weighted undirected graph G = (V, E) and its minimum spanning tree T, the objective is to find the minimum spanning tree T⁺ of graph G⁺ = (V, E⁺), where E⁺ ⊇ E.
Figure 3 illustrates an example of the supergraph minimum spanning tree problem. Given the original graph and its spanning tree shown in Figure 3a, the supergraph with the added edge (A, E) and its minimum spanning tree are depicted in Figure 3b. It can be observed that in this example, the minimum spanning tree of the supergraph differs from that of the original graph. Through analysis, it can be deduced that the new MST changes when the weight of the added edge is less than the weight of some edge on the cycle that it forms with the original MST. In this instance, adding (A, E) creates the cycle (A, E)–(A, B)–(B, E) in the original MST, and the weight of (A, E) is less than the weight of (A, B) on this cycle. If the weight of the added edge is greater than the weights of all other edges in the cycle, then the MST of the supergraph remains the same as that of the original graph, as shown in Figure 3c.
The addition of a vertex to the original graph can be reduced to adding multiple edges: first consider the vertex to be added as an isolated vertex in the original graph, and then add the edges incident to it. For example, considering the original graph and its spanning tree shown in Figure 4 (left), after adding vertex Q along with edges (Q, C) and (Q, D), the resulting supergraph and its minimum spanning tree are illustrated in Figure 4 (right). Adding a non-isolated vertex to the original graph inevitably alters the minimum spanning tree.

2.3. Minimum Spanning Tree after Modifications to Original Graph

When the modified graph is neither a subgraph nor a supergraph of the original graph, the transformation can be broken down into two distinct sub-processes: removing edges and adding new edges. The original problem is thereby transformed into two sub-problems, solving a subgraph minimum spanning tree and then a supergraph minimum spanning tree.
Let the original graph be denoted by G = (V, E) and the changed graph by G′ = (V, E′). First, we determine the set of edges removed from G, E⁻ = E ∖ E′, and the set of edges added to G, E⁺ = E′ ∖ E. Subsequently, we calculate the minimum spanning tree of the subgraph G⁻ = (V, E ∖ E⁻). Then, we calculate the minimum spanning tree of the supergraph G⁺ = (V, (E ∖ E⁻) ∪ E⁺), where G⁺ is equivalent to G′.
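As an illustration of this decomposition, the following Java sketch (ours, not the authors' implementation; the Edge record and the referenced update steps are placeholders for the procedures introduced in Section 3) computes E⁻ and E⁺ as set differences and notes where the two sub-processes would be applied.

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of decomposing a general update into the two sub-processes.
public class UpdateDecomposition {
    // Minimal placeholder edge type (endpoints assumed stored in a canonical order).
    record Edge(int u, int v, double w) {}

    // Set difference a \ b.
    static Set<Edge> difference(Set<Edge> a, Set<Edge> b) {
        Set<Edge> out = new HashSet<>(a);
        out.removeAll(b);
        return out;
    }

    static void decompose(Set<Edge> oldEdges, Set<Edge> newEdges) {
        Set<Edge> eMinus = difference(oldEdges, newEdges); // E- = E \ E'
        Set<Edge> ePlus  = difference(newEdges, oldEdges); // E+ = E' \ E
        // Step 1: maintain the MST of the subgraph (V, E \ E-)             -> Section 3.2
        // Step 2: maintain the MST of the supergraph (V, (E \ E-) U E+),
        //         which is exactly the changed graph G'                    -> Section 3.3
        System.out.println("removed " + eMinus.size() + " edges, added " + ePlus.size() + " edges");
    }
}
```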
For example, as depicted in Figure 5a, the original graph transforms into the new graph shown in Figure 5c (by deleting (A, E) and adding (A, B)). To find the minimum spanning tree of the resulting graph, one can first determine the minimum spanning tree of the subgraph obtained by deleting (A, E), as shown in Figure 5b, and then compute the minimum spanning tree of the supergraph obtained by adding (A, B), which is equivalent to the graph in Figure 5c.

3. Fast Updating and Maintenance Algorithm for Minimum Spanning Trees

This section describes an algorithm for efficiently updating and maintaining the minimum spanning tree in a dynamic graph. To facilitate the subsequent algorithm descriptions, we first define the following symbols:
  • E₀: ordered set of the edges in E, sorted in ascending order of weight.
  • E⁺: set of edges added in the new graph compared with the original graph.
  • E⁻: set of edges removed in the new graph compared with the original graph.
  • D: array representing the disjoint-set data structure. If D[i] > 0, vertex i is a non-representative vertex, and D[i] is the index of the representative vertex of the connected component containing vertex i. If D[i] < 0, vertex i is the representative vertex of its connected component, and the absolute value of D[i] is the number of vertices in that component. This data structure is globally visible throughout the algorithm (an illustrative Java sketch follows this list).
  • V: visitation array used as an auxiliary data structure for rebuilding the disjoint set during the sub-processes. V[i] = TRUE indicates that vertex i has been visited; otherwise, V[i] = FALSE. This data structure is globally visible throughout the algorithm.
  • T: set of edges of the current spanning tree.
  • T′: set of edges of the updated spanning tree.
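To make the D array convention concrete, the following Java sketch (our illustration, not the paper's code) locates a component's representative and merges two components in the same way Algorithm 5 does below. It assumes 0-based vertex indices, so the non-representative test becomes D[i] ≥ 0.

```java
import java.util.Arrays;

// Illustrative disjoint-set array following the convention of D:
// d[i] >= 0 : i is a non-representative vertex and d[i] is its representative's index;
// d[i] <  0 : i is a representative and -d[i] is the size of its component.
final class DisjointArray {
    final int[] d;

    DisjointArray(int vertexCount) {
        d = new int[vertexCount];
        Arrays.fill(d, -1);            // every vertex starts as its own component of size 1
    }

    int findRoot(int v) {
        while (d[v] >= 0) {            // follow representative pointers until a negative entry
            v = d[v];
        }
        return v;
    }

    // Merge the components containing x and y; returns false if they were already connected.
    boolean union(int x, int y) {
        int i = findRoot(x), j = findRoot(y);
        if (i == j) return false;
        if (i < j) {                   // the smaller index becomes the representative, as in Algorithm 5
            d[i] += d[j];              // both entries are negative sizes, so this accumulates the size
            d[j] = i;
        } else {
            d[j] += d[i];
            d[i] = j;
        }
        return true;
    }
}
```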

3.1. Framework of the Algorithm

The overall framework of the algorithm is described in Algorithm 1. The input to the algorithm consists of the given graph G, the ordered edge set E₀ of G, the minimum spanning tree T of G, the set of edges to be added E⁺, and the set of edges to be removed E⁻. The algorithm first calculates the minimum spanning tree of the subgraph obtained by removing the edges in E⁻ from the original graph (Line 7). Subsequently, it computes the minimum spanning tree of the new graph obtained by adding the edges in E⁺ to this subgraph and its spanning tree (Line 11).
It is worth noting that the algorithm requires the edges of the current graph sorted by weight (the ordered edge set E₀) to enhance computational speed. Upon completion of the algorithm, the ordered edge set E₀′ of the new graph is returned for subsequent updates and maintenance of the minimum spanning tree when further changes occur. The updating algorithm for the minimum spanning tree of the subgraph in Algorithm 1, as well as the updating algorithm for the supergraph, are described in the following sections.
Algorithm 1 MST updating: dynamic graph MST fast updating and maintenance algorithm
1: Input: Graph G = (V, E), an ordered edge set E₀, a minimum spanning tree T, sets of edges to be added E⁺ and to be removed E⁻.
2: Output: Updated minimum spanning tree T′ and modified ordered edge set E₀′.
3: procedure MST_UPDATE(G, E₀, T, E⁺, E⁻)
4:     T′ ← T
5:     if E⁻ ≠ ∅ then
6:         G ← remove the edges in E⁻ from G
7:         T′ ← MST_UPDATE_SUB(G, E₀, T, E⁻)
8:     end if
9:     if E⁺ ≠ ∅ then
10:        G ← add the edges in E⁺ to G
11:        T′, E₀′ ← MST_UPDATE_SUPER(G, E₀, T′, E⁺, E⁻)
12:    end if
13:    return T′, E₀′
14: end procedure

3.2. Updating and Maintenance of Minimum Spanning Trees for Subgraphs

This section describes the updating algorithm for the minimum spanning tree of the subgraph, as depicted in Algorithm 2.
Algorithm 2 MST updating subgraph: dynamic graph MST fast updating and maintenance algorithm
1: Input: Graph G = (V, E), ordered edge set E₀, minimum spanning tree T, set of edges to be removed E⁻.
2: Output: Updated minimum spanning tree T′.
3: procedure MST_UPDATE_SUB(G, E₀, T, E⁻)
4:     T′ ← REMOVE_E_FROM_TREE(T, E⁻)
5:     if T′ = T then
6:         return T′
7:     end if
8:     n ← REBUILD_DISJOINT_SET(T′)
9:     if n = 1 then
10:        return T′
11:    end if
12:    for e ∈ E₀ do
13:        if e ∉ T′ and e ∉ E⁻ then
14:            T′ ← CHECK_IF_TREE_EDGE(T′, e)
15:        end if
16:        if |T′| = |V| − 1 then
17:            return T′
18:        end if
19:    end for
20:    return T′
21: end procedure
The algorithm takes as input the current graph G, the minimum spanning tree T before the graph changed, the set of deleted edges E⁻, and the ordered edge set E₀ of the original graph. It first removes E⁻ from the original minimum spanning tree T to obtain the pruned substructure T′ (Line 4). Subsequently, it rebuilds the disjoint-set data structure D from T′ by a depth-first traversal and obtains the count n of connected components of T′ (Line 8).
If T′ equals T or forms a single connected structure (n = 1), then T′ is already the updated minimum spanning tree (Lines 5–7 and 9–11). Otherwise, the algorithm iterates over the edges in E₀ that belong to neither T′ nor E⁻; whenever an edge connects two different connected components of T′, it is added to T′, until T′ contains |V| − 1 edges. At this point, T′ is the updated minimum spanning tree (Lines 12–19).
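The following Java sketch (ours, under the same assumptions as the earlier DisjointArray illustration and with a placeholder Edge record) shows this continuation step: starting from the surviving tree edges T′ and a disjoint-set structure that already reflects its components, it scans the pre-sorted edge list and adds only edges that join different components, stopping once |V| − 1 edges are collected.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Continuing Kruskal's algorithm from the surviving substructure T'
// (illustrative; Edge and DisjointArray are the helper types sketched earlier).
public class SubgraphUpdate {
    record Edge(int u, int v, double w) {}

    static List<Edge> completeTree(List<Edge> sortedEdges,   // E0, ascending by weight
                                   Set<Edge> survivingTree,  // T' = T with the edges of E- removed
                                   Set<Edge> removed,        // E-
                                   DisjointArray dsu,        // already rebuilt from T' (cf. Algorithm 3)
                                   int vertexCount) {
        List<Edge> tree = new ArrayList<>(survivingTree);
        for (Edge e : sortedEdges) {
            if (tree.size() == vertexCount - 1) break;       // spanning tree is complete
            if (removed.contains(e) || survivingTree.contains(e)) continue;
            if (dsu.union(e.u(), e.v())) {                   // e connects two components of T'
                tree.add(e);
            }
        }
        return tree;  // a spanning forest if the updated graph is disconnected
    }
}
```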
The latter part of the algorithm (Lines 12–19) is essentially the same as Kruskal's algorithm. The key distinction is that Kruskal's algorithm builds the minimum spanning tree from scratch, whereas Algorithm 2 maximizes the reuse of the original graph's minimum spanning tree: it starts constructing the new graph's minimum spanning tree from the substructure T′ derived from the original one. To continue Kruskal's algorithm from T′, a disjoint-set structure D matching the current T′ is required. Algorithms 3 and 4 describe the logical steps for obtaining the corresponding disjoint set D from the input T′.
Algorithm 3 Rebuilding disjoint set and returning current component count
1: Input: Minimum spanning tree T′.
2: Output: Component count n.
3: procedure REBUILD_DISJOINT_SET(T′)
4:     Set all elements in D to −1
5:     Set all elements in V to FALSE
6:     u ← FIND_UNVISITED()
7:     n ← 0
8:     while u > −1 do
9:         n ← n + 1
10:        V[u] ← TRUE
11:        COUNT_RECUR(T′, u, u)
12:        u ← FIND_UNVISITED()
13:    end while
14:    return n
15: end procedure
Algorithm 4 Recursive sub-procedure for rebuilding the disjoint set
1: Input: Minimum spanning tree T′, root vertex vᵣ, current vertex v.
2: Output: None.
3: procedure COUNT_RECUR(T′, vᵣ, v)
4:     Lᵥ ← neighbors of vertex v in T′
5:     for u ∈ Lᵥ do
6:         if V[u] = FALSE then
7:             V[u] ← TRUE
8:             D[vᵣ] ← D[vᵣ] − 1
9:             D[u] ← vᵣ
10:            COUNT_RECUR(T′, vᵣ, u)
11:        end if
12:    end for
13: end procedure
Algorithm 3 initializes D and V. After initialization, it searches for an unvisited vertex in T′ (Line 6), marks it as visited, and performs a depth-first traversal starting from it (Line 11). If no unvisited vertex remains after this depth-first traversal, the algorithm terminates; otherwise, it starts over from another unvisited vertex. The sub-procedure find_unvisited in Algorithm 3 traverses the vertices of T′ and returns the index of the first vertex whose V value is FALSE; if all vertices have TRUE V values, find_unvisited returns −1. The detailed logic of the depth-first traversal procedure count_recur is described in Algorithm 4.
Algorithm 4 is a recursive depth-first traversal over the substructure T′ that rebuilds the disjoint set, where vᵣ is the representative (root) vertex and v is the current vertex. If D[v] > 0, the value is the representative of vertex v; if D[v] < 0, vertex v is the root representative of all the other vertices in its connected component, and the absolute value of D[v] is the number of vertices in that component.
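The following Java sketch combines Algorithms 3 and 4 (our illustration; the FIND_UNVISITED scan is folded into a simple for loop, and T′ is assumed to be given as an adjacency list): it resets D and the visitation array, then runs one depth-first traversal per unvisited vertex, pointing every reached vertex at the traversal's root and counting components.

```java
import java.util.Arrays;
import java.util.List;

// Rebuilding the disjoint-set array D from the pruned tree T' by depth-first
// traversal, and counting connected components (cf. Algorithms 3 and 4).
public class DisjointSetRebuild {
    static int rebuild(List<List<Integer>> treeAdj, int[] d, boolean[] visited) {
        Arrays.fill(d, -1);            // every vertex initially forms its own component of size 1
        Arrays.fill(visited, false);
        int components = 0;
        for (int u = 0; u < d.length; u++) {
            if (!visited[u]) {         // u becomes the representative of a new component
                components++;
                visited[u] = true;
                dfs(treeAdj, d, visited, u, u);
            }
        }
        return components;             // the component count n returned by Algorithm 3
    }

    private static void dfs(List<List<Integer>> treeAdj, int[] d, boolean[] visited,
                            int root, int v) {
        for (int u : treeAdj.get(v)) {
            if (!visited[u]) {
                visited[u] = true;
                d[root] -= 1;          // the root's component grows by one vertex
                d[u] = root;           // u's representative is root
                dfs(treeAdj, d, visited, root, u);
            }
        }
    }
}
```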
Algorithm 5 checks whether the two endpoints of edge e belong to two different connected components of the current subgraph T. If so, it adds edge e to the current subgraph, updates the corresponding D values, and returns the updated subgraph T′. The process of Algorithm 5 is identical to the corresponding steps of Kruskal's algorithm.
Algorithm 5 Checking if edge e belongs to minimum spanning tree
1: Input: Minimum spanning tree T, edge e.
2: Output: Updated minimum spanning tree T′.
3: procedure CHECK_IF_TREE_EDGE(T, e)
4:     T′ ← T
5:     i ← starting vertex of e
6:     j ← ending vertex of e
7:     while D[i] > 0 do
8:         i ← D[i]
9:     end while
10:    while D[j] > 0 do
11:        j ← D[j]
12:    end while
13:    if i ≠ j then
14:        if i < j then
15:            D[i] ← D[i] + D[j]
16:            D[j] ← i
17:        else
18:            D[j] ← D[i] + D[j]
19:            D[i] ← j
20:        end if
21:        T′ ← T′ ∪ {e}
22:    end if
23:    return T′
24: end procedure
This procedure differs from Kruskal's algorithm in that it does not compute the spanning tree from scratch but reuses the original minimum spanning tree of the graph and reconstructs the corresponding disjoint set. The depth-first traversal runs in linear time, whereas obtaining the same disjoint-set structure by rerunning Kruskal's algorithm from scratch would require processing the entire sorted edge set.

3.3. Updating and Maintenance of Minimum Spanning Trees for Supergraphs

This subsection describes the minimum spanning tree updating algorithm for supergraphs. The algorithmic logic is shown in Algorithm 6. The input of Algorithm 6 includes the current graph G, the minimum spanning tree T before the graph changes, the set of added edges E⁺, the set of deleted edges E⁻, and the ordered edge set E₀ of the original graph. Algorithm 6 first forms the union of T and E⁺, denoted by E′; it then executes Kruskal's algorithm using only the elements of E′; finally, it generates the ordered edge set E₀′ of the current graph G from E₀, E⁺, and E⁻, which will be used in subsequent iterations to update the minimum spanning tree. Although a supergraph contains no deleted edges compared with the original graph, the deleted edges E⁻ here are the ones produced by the preceding subgraph procedure and are needed to generate the ordered edge set E₀′ for the next iteration.
Algorithm 6 Supergraph minimum spanning tree fast updating and maintenance algorithm
1: Input: Graph G, ordered edge set E₀, minimum spanning tree T, set of edges to be added E⁺, set of edges to be removed E⁻.
2: Output: Updated minimum spanning tree T′, updated ordered edge set E₀′.
3: procedure MST_UPDATE_SUPER(G, E₀, T, E⁺, E⁻)
4:     E′ ← T ∪ E⁺
5:     E′ ← sort the elements of E′ by weight in ascending order
6:     Set all elements in D to −1
7:     T′ ← ∅
8:     for e ∈ E′ do
9:         T′ ← CHECK_IF_TREE_EDGE(T′, e)
10:        if |T′| = |V| − 1 then
11:            goto 14
12:        end if
13:    end for
14:    E₀′ ← GEN_SORTED_EDGES(E₀, E⁺, E⁻)
15:    return T′, E₀′
16: end procedure
Algorithm 7 describes how to generate the ordered edge set E₀′ for the current graph G. The logic is a one-time merge of two ordered sets: the ordered set E₀ and the ordered set E₀⁺ obtained by sorting the added edges E⁺. It is important to note that, when scanning the elements of E₀ one by one, elements belonging to E⁻ must be skipped (a Java sketch of this merge is given after Algorithm 7).
Algorithm 7 Updating ordered edge set
1: Input: Ordered edge set E₀, set of edges to be added E⁺, set of edges to be removed E⁻.
2: Output: Updated ordered edge set E₀′.
3: procedure GEN_SORTED_EDGES(E₀, E⁺, E⁻)
4:     E₀⁺ ← sort the elements of E⁺ by weight in ascending order
5:     E₀′ ← ∅
6:     i ← 0, j ← 0
7:     while i < |E₀| and j < |E₀⁺| do
8:         while i < |E₀| and E₀[i] ∈ E⁻ do
9:             i ← i + 1
10:        end while
11:        if i ≥ |E₀| then
12:            goto 22
13:        end if
14:        if ω(E₀[i]) < ω(E₀⁺[j]) then
15:            Add E₀[i] to the end of E₀′
16:            i ← i + 1
17:        else
18:            Add E₀⁺[j] to the end of E₀′
19:            j ← j + 1
20:        end if
21:    end while
22:    if i = |E₀| then
23:        Append all remaining elements in E₀⁺ from position j to E₀′
24:    else
25:        Append all remaining elements in E₀ from position i to E₀′, excluding the ones in E⁻
26:    end if
27:    return E₀′
28: end procedure
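A compact Java version of this merge (our illustration of Algorithm 7's logic, not the paper's implementation; it uses 0-based indices and the same placeholder Edge record as in the earlier sketches):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Set;

// Merging the old ordered edge list with the sorted added edges while skipping
// removed edges (illustrative version of Algorithm 7).
public class SortedEdgeUpdate {
    record Edge(int u, int v, double w) {}

    static List<Edge> genSortedEdges(List<Edge> e0, Set<Edge> added, Set<Edge> removed) {
        List<Edge> addedSorted = new ArrayList<>(added);
        addedSorted.sort(Comparator.comparingDouble(Edge::w));    // E0+ in ascending weight order
        List<Edge> merged = new ArrayList<>(e0.size() + added.size());
        int i = 0, j = 0;
        while (i < e0.size() && j < addedSorted.size()) {
            if (removed.contains(e0.get(i))) { i++; continue; }   // skip edges belonging to E-
            if (e0.get(i).w() < addedSorted.get(j).w()) {
                merged.add(e0.get(i++));
            } else {
                merged.add(addedSorted.get(j++));
            }
        }
        while (j < addedSorted.size()) {                          // E0 exhausted: append the rest of E0+
            merged.add(addedSorted.get(j++));
        }
        while (i < e0.size()) {                                   // E0+ exhausted: append the rest of E0 minus E-
            if (!removed.contains(e0.get(i))) merged.add(e0.get(i));
            i++;
        }
        return merged;                                            // the updated ordered edge set E0'
    }
}
```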
This algorithm differs from Kruskal's algorithm in that it does not use the complete edge set of the current graph but only the edge set E′, where E′ is the union of the edge set of the original minimum spanning tree and the added edge set. For large sparse graphs with a relatively small added edge set, the size of E′ is much smaller than that of the complete edge set E. Consequently, the number of operations performed by this algorithm is lower than that required for running the complete Kruskal's algorithm on the current graph.

4. Theoretical Analysis

This section provides the theoretical proof of optimality for the proposed algorithm in this paper. Specifically, given the original graph and its minimum spanning tree, the results provided by the algorithm presented in this paper are proven to be a correct minimum spanning tree for the updated graph. Additionally, this section includes an analysis of the time complexity of the proposed algorithm.

4.1. Correctness Analysis

As described in Section 3.2, the main approach of the algorithm proposed in this paper for computing the minimum spanning tree of a subgraph is to inherit edges from the original minimum spanning tree. The goal is to achieve a fast computation of the minimum spanning tree for the subgraph. Theorem 1 establishes that the results obtained using this approach are guaranteed to be a correct minimum spanning tree for the subgraph.
Theorem 1.
Given a weighted connected undirected graph G = (V, E), where e = (x, y) is an edge of its minimum spanning tree T (i.e., e ∈ T), if e also exists in a connected subgraph G′ = (V, E′) of graph G, i.e., e ∈ E′, then a minimum spanning tree of G′ containing edge e is guaranteed to exist.
Proof. 
Let us assume that there exists a connected subgraph G′ containing e such that no minimum spanning tree of G′ contains e.
Since e = (x, y) is an edge of some minimum spanning tree T of graph G, we can infer the following two situations (denoted as verdict A):
A1: There is no simple path in graph G connecting x and y that avoids e; or
A2: There exists a simple path P in graph G connecting x and y that avoids e, and P must contain an edge whose weight is greater than or equal to the weight of e.
From the assumption, we can infer verdict B as follows:
B: In G′, there exists a simple path P′ connecting x and y that avoids e, and all edges in P′ have weights smaller than the weight of e (by the cycle property, if no minimum spanning tree of G′ contains e, then e is the strictly heaviest edge on some cycle of G′ through e, which yields such a path P′).
Since G′ is a subgraph of G, P′ must also be a path in G. Verdicts A and B contradict each other. Therefore, the assumption is false; that is, if an edge of the original graph's minimum spanning tree appears in a subgraph, it must be included in some minimum spanning tree of that subgraph. This completes the proof. □
The algorithm described in Section 3.3 aims to efficiently compute the minimum spanning tree of a supergraph. The main idea is to exclude edges that cannot appear in the minimum spanning tree of the supergraph. Theorem 2 establishes that the results obtained using this approach are guaranteed to be a correct minimum spanning tree for the supergraph.
Theorem 2.
Given an undirected weighted connected graph G = (V, E) and one of its minimum spanning trees T, if there exists an edge e = (x, y) such that e ∈ E and e ∉ T, then any supergraph G⁺ of G has a minimum spanning tree that does not include edge e.
Proof. 
Let us assume that every minimum spanning tree of a supergraph G⁺ must include edge e = (x, y).
Since the minimum spanning tree T of graph G does not contain e = (x, y), we can infer verdict A as follows:
A: There must be a simple path P in graph G connecting x and y that avoids e, in which all edges have weights smaller than or equal to the weight of e (namely, the tree path between x and y in T; if it contained a heavier edge, replacing that edge with e would yield a lighter spanning tree).
Let P⁺ be any path in G⁺ connecting x and y that avoids e. From the assumption, we can infer verdict B as follows:
B: The weight of e is less than the weight of the maximum-weight edge in P⁺; that is, P⁺ contains an edge strictly heavier than e.
Since G⁺ is a supergraph of G, P must also be a path in G⁺. Therefore, P should satisfy verdict B, i.e., there exists an edge in P with a weight greater than the weight of e. This conclusion contradicts verdict A. Thus, the assumption is false, and a minimum spanning tree without e can certainly be found for the supergraph G⁺. This completes the proof. □
Since the algorithm decomposes the general case of a graph update into the computation of minimum spanning trees for a subgraph and for a supergraph, it follows from the two theorems above that the proposed algorithm provides a correct minimum spanning tree for the updated graph.

4.2. Complexity Analysis

In this section, we conduct an analysis of the time complexity of the algorithm based on the pseudo-code in Section 3. We analyze the upper bound of the worst case and the lower bound of the best case of the proposed algorithm.

4.2.1. Worst-Case Analysis

For Algorithm 7, the main operations comprise two parts: the sorting of E⁺ in Line 4, with a time complexity of O(|E⁺| · log|E⁺|), and the subsequent merge of the ordered sequences, with a time complexity of O(|E₀| + |E⁺|). Combining these, the time complexity of Algorithm 7 is O(|E⁺| · log|E⁺| + |E₀| + |E⁺|).
For Algorithm 6, the main operations are divided into two parts: Lines 5 to 13 run Kruskal's algorithm on a specific substructure, with a time complexity of O(|T ∪ E⁺| · log|V|), and Line 14 is as described in Algorithm 7. Overall, the time complexity of Algorithm 6 is O(|T ∪ E⁺| · log|V| + |E⁺| · log|E⁺| + |E₀| + |E⁺|). Since |T| = |V| − 1, this expression can be simplified to O(|V| · log|V| + |E⁺| · log|E⁺| + |E₀| + |E⁺|). When the number of added edges is sufficiently small, it can be further simplified to O(|V| · log|V| + |E₀|). For dense graph scenarios, it can be further simplified to O(|E₀|).
The essence of Algorithm 3 is a depth-first traversal of the given substructure T. If the graph is stored using an adjacency-list structure, its time complexity is O(|V| + |T|). Since |T| = |V| − 1, this expression can be simplified to O(|V|).
Algorithm 2 has three main operations.
Line 4: deleting the edges of E⁻ from the original spanning tree T. If E⁻ is stored in a hash table, the time complexity is O(|T|).
Line 8: Algorithm 3.
Lines 12–19: a Kruskal-like pass over the elements of E₀ ∖ (T ∪ E⁻), with a time complexity of O(|E₀ ∖ (T ∪ E⁻)| · log|V|).
Overall, the time complexity of Algorithm 2 is O(2|T| + |V| + |E₀ ∖ (T ∪ E⁻)| · log|V|). Since |T| = |V| − 1, this expression can be simplified to O(3|V| + |E₀ ∖ (T ∪ E⁻)| · log|V|). When the number of removed edges is sufficiently small, it can be further simplified to O(3|V| + |E₀ ∖ T| · log|V|). For dense graphs, it can be further simplified to O(|E₀| · log|V|).
In summary, the time complexity of the proposed algorithm in the general case is
O(3|V| + |E₀ ∖ (T ∪ E⁻)| · log|V| + |V| · log|V| + |E⁺| · log|E⁺| + |E₀| + |E⁺|).
When the amount of change to the graph is sufficiently small, the expression simplifies to
O(3|V| + |E₀ ∖ T| · log|V| + |V| · log|V| + |E₀|).
When the amount of change is sufficiently small and the graph is dense, it further simplifies to
O(|E₀| · log|V|).
This matches the time complexity of the standard Kruskal's algorithm, which implies that the proposed algorithm is theoretically equivalent to a complete recalculation of the minimum spanning tree with the standard Kruskal's algorithm. In practical runtime, however, the proposed algorithm effectively reduces the actual search space, making it more efficient for multiple iterations of updates on large-scale graphs. The subsequent sections provide comparative runtime data for the proposed algorithm and Kruskal's algorithm.

4.2.2. Best-Case Analysis

When the new graph is a subgraph, the best case occurs when the removed edges do not belong to the original tree or when their removal does not disconnect it, so that Algorithm 2 terminates before the main loop (Line 12). If the removed edges do not belong to the original tree, the time complexity of Algorithm 2 is exactly that of the sub-procedure remove_e_from_tree, i.e., Ω(|E⁻|). If some of the removed edges belong to the original tree but their removal does not disconnect it, the time complexity of Algorithm 2 is dominated by Algorithm 3, i.e., Ω(|T|). The other sub-procedures do not execute when the updated graph is a subgraph. Therefore, the lower bound of the best-case time complexity for the subgraph scenario is Ω(|E⁻| + |T|); if the removal is small compared with the graph, this simplifies to Ω(|T|).
When the new graph is a supergraph, the best case occurs when all the added edges are heavier than the edges of the original tree. The proposed algorithm does not explicitly check for this situation; it simply computes the spanning tree from the reduced edge set. In this case, however, the tree edges of the updated graph are exactly the first |V| − 1 elements of that edge set, which means that the algorithm terminates after traversing |V| − 1 elements. Algorithm 7 does not execute, since E⁻ is empty. Therefore, the lower bound of the best-case time complexity for the supergraph scenario is Ω(|V| · log|V|).
Since |T| = |V| − 1, the overall lower bound of the best-case time complexity of the proposed algorithm, when a spanning tree is successfully found, is Ω(|V|).

5. Experimental Analysis

This section presents an experimental analysis of the efficiency of the proposed algorithm. The dynamic graph minimum spanning tree updating and maintenance algorithm proposed in this paper was tested in the following three scenarios.
(1) Given the original graph and its spanning tree, solve the minimum spanning tree of the original graph’s supergraph. (2) Given the original graph and its spanning tree, solve the minimum spanning tree of the original graph’s subgraph. (3) Solve the minimum spanning tree of the new graph after changes to the original graph.
Additionally, a comparison of the runtime between the proposed algorithm and Kruskal’s algorithm, which recalculates the new graph, was conducted.

5.1. Experimental Data

The algorithms used in the experiments were implemented in Java. The experimental platform consisted of a computer with a six-core processor (Intel Core i7 2.6 GHz) and 16 GB of memory, running on the Windows 11 operating system. The timing for all algorithms in the experiments was measured in milliseconds.
The test instances were obtained from a common set of instances for the minimum dominating tree problem [28]. Each instance represents a weighted undirected graph G = (V, E), where V is the set of vertices and E is the set of edges. The instances were generated as follows: |V| vertices were randomly deployed in a 500 m × 500 m area, and two vertices were connected by an edge if the straight-line distance between them was less than a given threshold. The edge weights were assigned as the distances between the vertices. The instance set was divided into three groups according to the maximum vertex connection distance, namely, Range_100, Range_125, and Range_150 (the number indicates the maximum connection distance; a larger number indicates a denser graph). Each group contained 12 instances with node counts |V| ∈ {200, 300, 400, 500}. All instances can be obtained from the author or online (https://github.com/xavierwoo/Dynamic-MST, accessed on 25 March 2024).
To evaluate the algorithm’s performance in large-scale scenarios, we generated multiple clustered graph instances of massive scale. The generation process followed these steps: (1) Vertices were organized into clusters using a matrix layout. (2) Within each cluster, vertices were randomly placed within a square and connected with random probability, where the edge cost was determined by the distance. (3) Adjacent clusters were interconnected by introducing edges between two randomly selected vertices from each, with the distance representing the edge cost. These generated instances, as well as the generator used, are available from the author.

5.2. Experimental Evaluation of Supergraph Minimum Spanning Tree Maintenance Algorithm

This section presents the experimental results for the proposed algorithm applied to the updating and maintenance of the minimum spanning tree (MST) for the original graph’s supergraph.

5.2.1. Experimental Methodology for Supergraph

To evaluate the performance of the proposed algorithm for maintaining the minimum spanning tree of a supergraph, the following testing procedure was employed:
  • Start with a test graph G′ = (V, E′). Randomly remove k edges from it to obtain a graph G = (V, E), and use G as the input graph for the algorithm.
  • Use the ordered set of the edges in E as the algorithm input parameter E₀.
  • Use the minimum spanning tree of G as the algorithm input parameter T.
  • Use the set E′ ∖ E as the algorithm input parameter E⁺.
  • Set E⁻ as an empty set.
At this point, the test graph G′ is a supergraph of graph G. The experiments were conducted for different values of k (k = 10, 50, 200). The total running time of the algorithm was recorded and compared with the time taken by Kruskal's algorithm to find the minimum spanning tree of G′.
Since a single run of the algorithm is very short, each test case was independently repeated 1000 times, and the total running time was recorded. To improve the accuracy of the experiments, each set of data was measured 100 times, and the average running time is reported in seconds.
In all comparative experiments presented in this paper, the proposed algorithm and Kruskal's algorithm were timed independently. The experiments used the same random seed to generate the random edge sets, ensuring that both algorithms operated on the same graph in every experiment.
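For reference, a minimal sketch of such a timing loop in Java (our own illustration of the protocol above, not the authors' benchmark code; UpdateTask stands for one maintenance or Kruskal run on fixed, prepared inputs):

```java
// Minimal timing harness sketch: one measurement repeats the update 'repetitions'
// times, and the reported value averages 'measurements' such totals.
public class TimingHarness {
    interface UpdateTask { void run(); }

    static double averageTotalMillis(UpdateTask task, int measurements, int repetitions) {
        double totalMs = 0.0;
        for (int m = 0; m < measurements; m++) {
            long start = System.nanoTime();
            for (int r = 0; r < repetitions; r++) {
                task.run();                     // one MST computation on the same prepared inputs
            }
            totalMs += (System.nanoTime() - start) / 1_000_000.0;
        }
        return totalMs / measurements;          // average total running time of 'repetitions' runs
    }
}
```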

5.2.2. Analysis of Experimental Results for Supergraph

The experimental results of the supergraph minimum spanning tree maintenance algorithm are shown in Table 1. The first column of Table 1 gives the group name of the test case, the second column is the name of the test case, and the next six columns compare the time consumption of the maintenance algorithm (MT) and Kruskal's algorithm (Kru) under different conditions. The last column reports the number of vertices and edges in the test case. We define a time consumption evaluation index μ = T_Kru / T, where T_Kru denotes the runtime of Kruskal's algorithm and T denotes the runtime of the maintenance algorithm; μ thus indicates the performance difference between the maintenance algorithm and Kruskal's algorithm, and the larger μ is, the better the maintenance algorithm performs. A boldfaced μ in the table indicates that the corresponding algorithm's runtime is smaller than that of the comparison algorithm. The formats and meanings of the subsequent tables are the same as those of Table 1.
From Table 1, we can conclude that the proposed algorithm for maintaining the minimum spanning tree in supergraphs is significantly superior to Kruskal’s algorithm for a complete recalculation of the MST in the new graph. When comparing Kruskal’s algorithm for different numbers of added edges in the same test case, we observe a relatively smooth fluctuation in its runtime. The fewer edges the maintenance algorithm adds, the larger the value of μ is, indicating a greater advantage of the supergraph maintenance algorithm. Additionally, it is evident that for graphs with the same number of vertices and a similar number of graph changes, the μ value is larger for denser graphs. This suggests that the updating algorithm performs better in calculating the MST of supergraphs, particularly on denser graphs compared with sparse ones.

5.3. Experimental Design for Subgraph

This section tests the updating and maintenance algorithm for the original graph’s subgraph.

5.3.1. Experimental Methodology for Subgraph

We tested the maintenance algorithm for the minimum spanning tree of the subgraph by using the following approach:
  • Let the example graph be denoted by G = (V, E), and use graph G as the input to the algorithm. Randomly remove k edges from G to obtain the subgraph G′ = (V, E′).
  • Let the ordered set of the edges of graph G be the algorithm input parameter E₀.
  • Let the minimum spanning tree of graph G be the algorithm input parameter T.
  • Set E⁺ as an empty set for the algorithm input parameter.
  • Let E⁻ be the set E ∖ E′.
At this point, the graph G′ is a subgraph of the example graph G. The experiment tested the cases k ∈ {10, 50, 200}, recorded the total runtime of the entire algorithm, and compared it with the runtime taken by Kruskal's algorithm to solve the minimum spanning tree of graph G′.
The experiment protocol is the same as the one used in the supergraph test.

5.3.2. Analysis of Experimental Results for Subgraph

From Table 2, we can conclude that for computing the minimum spanning tree of a subgraph, the proposed subgraph minimum spanning tree maintenance algorithm is significantly better than Kruskal's algorithm, which recomputes from scratch. By comparing the runtime of Kruskal's algorithm when deleting different numbers of edges in the same test case, it was observed that its runtime fluctuation is relatively small. The fewer edges the maintenance algorithm deletes, the larger the μ value is, indicating a greater advantage of the subgraph minimum spanning tree maintenance algorithm. It was also observed that for graphs with the same number of vertices and a similar number of graph changes, denser graphs yield larger μ values. This suggests that the updating algorithm performs better on dense graphs than on sparse graphs when computing the minimum spanning tree of the subgraph.

5.4. Experimental Evaluation of Dynamic Graph Minimum Spanning Tree Maintenance Algorithm

This section tests the proposed dynamic graph minimum spanning tree maintenance algorithm in this paper in general scenarios.

5.4.1. Experimental Design for Dynamic Graph Minimum Spanning Tree Maintenance Algorithm

As presented in this section, we tested the dynamic graph minimum spanning tree maintenance algorithm through continuous multiple iterations.
The initial iteration’s algorithm parameters were set as follows:
  • Let the example graph be G′ = (V, E′). Randomly delete k edges from it to obtain the graph G, which serves as the input original graph for the algorithm.
  • Denote the ordered set of the edges of graph G by E₀, and let it be an algorithm input parameter. Denote the minimum spanning tree of graph G by T, and let it be an algorithm input parameter.
  • Let E⁺ be the set E′ ∖ E, and let E⁻ be a randomly selected subset of size k of E.
For each subsequent iteration, the algorithm parameters were generated based on the results of the previous iteration, as follows:
  • Remove all edges in E⁻ from graph G, and add all edges in E⁺, to obtain the graph G′ = (V, E′).
  • Set E⁺ to the previous E⁻, and let E⁻ be a randomly selected subset of size k of E′.
  • T is the minimum spanning tree of graph G′, as given by the previous iteration.
  • E₀ is the ordered set of the edges of E′, as given by the previous iteration.
These parameters (G′, E₀, T, E⁺, and E⁻) are used as the input for the new iteration.
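The rotation of parameters between iterations can be illustrated with the following Java sketch (ours, not the authors' experiment code; Edge is the placeholder type from the earlier sketches, and the maintenance call itself is left as a comment):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Random;
import java.util.Set;

// Illustrative driver for the iterative experiment: after each change is applied,
// the next iteration re-adds the previously removed edges (E+ := old E-) and
// removes a fresh random subset of size k.
public class IterativeExperiment {
    record Edge(int u, int v, double w) {}

    static void iterate(Set<Edge> currentEdges, Set<Edge> ePlus, Set<Edge> eMinus,
                        int k, int iterations, Random rnd) {
        for (int it = 0; it < iterations; it++) {
            currentEdges.removeAll(eMinus);                  // apply the change: E' = E \ E- ∪ E+
            currentEdges.addAll(ePlus);
            // ... run the maintenance algorithm here with (E0, T, E+, E-) and record its time ...
            ePlus = eMinus;                                  // next iteration re-adds the removed edges
            eMinus = pickRandomSubset(currentEdges, k, rnd); // and removes a fresh random subset of E'
        }
    }

    static Set<Edge> pickRandomSubset(Set<Edge> edges, int k, Random rnd) {
        List<Edge> list = new ArrayList<>(edges);
        Collections.shuffle(list, rnd);
        return new HashSet<>(list.subList(0, Math.min(k, list.size())));
    }
}
```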
The experiment protocol is the same as the previous ones.

5.4.2. Analysis of Maintenance Algorithm Results for Dynamic Graph

From Table 3, we can conclude that for solving the minimum spanning tree problem in general dynamic graphs, the proposed dynamic graph minimum spanning tree maintenance algorithm is significantly better than Kruskal’s algorithm, which recomputes from scratch. By comparing different numbers of modified edges (additions and deletions) within the same test case, it was observed that the more edges are modified, the smaller the value of μ is, indicating that the advantage of the dynamic graph minimum spanning tree maintenance algorithm diminishes. Additionally, it was observed that for the same number of vertices and a similar graph modification number, higher-density graphs yield larger μ values. This implies that the updating algorithm performs better on dense graphs than on sparse ones.

5.5. Analysis for Massive Cluster Graphs

In this subsection, we evaluate the performance of our proposed algorithm on massive cluster graphs. We compared our algorithm with Kruskal's (Kru) and Prim's (Pri) algorithms. Because the instances used in this experiment are larger than those of the previous sections, each test was conducted over 50 iterations, with 50 edges modified in each iteration. The results are presented in Table 4. The columns μ_Kru and μ_Pri give the performance differences between our maintenance algorithm and Kruskal's and Prim's algorithms, respectively. The instance names follow the format cluster-nxn-g-p-s, where nxn denotes the matrix layout of the entire graph, g the number of vertices in a single cluster, p the connection probability, and s the random seed used to generate the instance.
Based on the results presented in Table 4, it is evident that the maintenance algorithm outperforms Kruskal’s and Prim’s algorithms on cluster graphs. Specifically, Prim’s algorithm demonstrated better performance compared with Kruskal’s algorithm in this experiment. This observation aligns with the fact that Prim’s algorithm tends to be more efficient than Kruskal’s algorithm when dealing with dense graphs, which was the case for all instances used in this experiment. However, despite this advantage, Prim’s algorithm is still 5 to 8 times slower compared with the proposed maintenance algorithm.

5.6. Analysis of Algorithm Performance with Graph Changes

We conducted experiments on the dynamic graph minimum spanning tree maintenance algorithm iteratively, with the same experimental protocol as the dynamic graph minimum spanning tree maintenance algorithm experiments described above. We wanted to determine the specific point at which computing from scratch becomes more advantageous than maintaining the existing version.
We selected six test cases with different densities and various vertex numbers for experimentation. For each test case, we gradually increased the number of modified edges and recorded the algorithm's running time. To enhance experimental accuracy, each set of data underwent 100 calculations. We also tested Kruskal's algorithm computing from scratch and compared the results. Note that when the edge sets E⁺ and E⁻ exceeded half of the total number of edges in the graph, the new graph became entirely different from the original, rendering the algorithm results meaningless.
The results are shown in Figure 6. The x-axis in the figures represents the increasing number of added and deleted edges, and the y-axis represents the algorithm's running time. Comparing the running times on the six test cases, the maintenance algorithm had a significant advantage when the number of modified edges was small. As the number of modified edges increased, the gap between the two algorithms gradually narrowed, and when the number of modified edges approached half of the total edges in the test case, the running times of the two algorithms became close. By comparing the figures, we found that with fewer modified edges, a decrease in graph density led to a greater advantage for the maintenance algorithm.

6. Conclusions

This paper introduces a dynamic graph minimum spanning tree maintenance algorithm. In the scenario where the original graph and its spanning tree are known, the algorithm quickly solves the minimum spanning tree of a new graph with partial modifications to the original graph. The algorithm also provides an ordered set of the new graph’s edge set to expedite subsequent dynamic minimum spanning tree computations. The paper proves that given the original graph and its spanning tree, the maintenance algorithm yields a correct minimum spanning tree for the new graph. Compared with the traditional Kruskal’s algorithm, the time complexity of the maintenance algorithm is the same as that of Kruskal’s algorithm. This implies that theoretically, the proposed algorithm is on par with the standard Kruskal’s algorithm, which completely recalculates the new graph. In practical execution, the maintenance algorithm efficiently reduces the actual search space, making it more effective for multiple iterations of updates in large-scale graph scenarios. As for future work, we aim to optimize the algorithm’s pruning strategies, for example, maintaining the disjoint-set structure by updating it rather than rebuilding it and using path compression for faster searching. We also hope that the maintenance algorithm proposed in this paper can be extended to more problem domains.

Author Contributions

Conceptualization, X.W.; Investigation, H.Q.; Methodology, M.L.; Project administration, C.X.; Software, H.Q. and X.W.; Writing—original draft, M.L.; Writing—review and editing, D.X. and Y.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (grant Nos. 62201203 and 62002105) and the Science and Technology Research Program of Hubei Province (2021BLB171).

Data Availability Statement

The instances and code used in this paper can be found at https://github.com/xavierwoo/Dynamic-MST (accessed on 25 March 2024).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Nesetril, J.; Nesetrilová, H. The origins of minimal spanning tree algorithms—Boruvka and Jarník. Doc. Math. 2012, 17, 127–141. [Google Scholar]
  2. Lv, X.; Ma, Y.; He, X.; Huang, H.; Yang, J. CciMST: A clustering algorithm based on minimum spanning tree and cluster centers. Math. Probl. Eng. 2018, 2018, 1–14. [Google Scholar] [CrossRef]
  3. Peter, S.J. Discovering local outliers using dynamic minimum spanning tree with self-detection of best number of clusters. Int. J. Comput. Appl. 2010, 9, 36–42. [Google Scholar] [CrossRef]
  4. La Grassa, R.; Gallo, I.; Landro, N. OCmst: One-class novelty detection using convolutional neural network and minimum spanning trees. Pattern Recognit. Lett. 2022, 155, 114–120. [Google Scholar] [CrossRef]
  5. Millington, T.; Niranjan, M. Construction of minimum spanning trees from financial returns using rank correlation. Phys. A Stat. Mech. Its Appl. 2021, 566, 125605. [Google Scholar] [CrossRef]
  6. Danylchuk, H.; Kibalnyk, L.; Serdiuk, O.; Ivanylova, O.; Kovtun, O.; Melnyk, T.; Zaselskiy, V. Modelling of trade relations between EU countries by the method of minimum spanning trees using different measures of similarity. In Proceedings of the CEUR Workshop Proceedings 2020, Copenhagen, Denmark, 30 March 2020; pp. 167–186. [Google Scholar]
  7. Xingguo, Y.; Chi, X. The Study of Fund Market Complex Complex Network Based on Cosine Similarity and MST Method. Theory Pract. Financ. Econ. 2020, 41, 55–61. [Google Scholar]
  8. van Dellen, E.; Sommer, I.E.; Bohlken, M.M.; Tewarie, P.; Draaisma, L.; Zalesky, A.; Di Biase, M.; Brown, J.A.; Douw, L.; Otte, W.M.; et al. Minimum spanning tree analysis of the human connectome. Hum. Brain Mapp. 2018, 39, 2455–2471. [Google Scholar] [CrossRef] [PubMed]
  9. Chen, J.; Wang, H.; Hua, C.; Wang, Q.; Liu, C. Graph analysis of functional brain network topology using minimum spanning tree in driver drowsiness. Cogn. Neurodyn. 2018, 12, 569–581. [Google Scholar] [CrossRef] [PubMed]
  10. Jonak, K.; Krukow, P.; Karakuła-Juchnowicz, H.; Rahnama-Hezavah, M.; Jonak, K.E.; Stępniewski, A.; Niedziałek, A.; Toborek, M.; Podkowiński, A.; Symms, M.; et al. Aberrant structural network architecture in Leber’s hereditary optic neuropathy. Minimum spanning tree graph analysis application into diffusion 7T MRI. Neuroscience 2021, 455, 128–140. [Google Scholar] [CrossRef] [PubMed]
  11. Chreang, S.; Kumhom, P. A Method of Selecting Cable Configuration for Microgrid Using Minimum Spanning Tree. In Proceedings of the 2018 15th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON), Chiang Rai, Thailand, 18–21 July 2018; pp. 489–492. [Google Scholar]
  12. Mosbah, M.; Arif, S.; Mohammedi, R.D.; Hellal, A. Optimum dynamic distribution network reconfiguration using minimum spanning tree algorithm. In Proceedings of the 2017 5th International Conference on Electrical Engineering-Boumerdes (ICEE-B), Boumerdes, Algeria, 29–31 October 2017; pp. 1–6. [Google Scholar]
  13. Kebir, N.; Ahsan, A.; McCulloch, M.; Rogers, D.J. Modified Minimum Spanning Tree for Optimised DC Microgrid Cabling Design. IEEE Trans. Smart Grid 2022, 13, 2523–2532. [Google Scholar] [CrossRef]
  14. Ramalingam, G.; Reps, T. On the computational complexity of dynamic graph problems. Theor. Comput. Sci. 1996, 158, 233–277. [Google Scholar] [CrossRef]
  15. Adasme, P.; de Andrade, R.C. Minimum weight clustered dominating tree problem. Eur. J. Oper. Res. 2023, 306, 535–548. [Google Scholar] [CrossRef]
  16. Xiong, C.; Liu, H.; Wu, X.; Deng, N. A two-level meta-heuristic approach for the minimum dominating tree problem. Front. Comput. Sci. 2022, 17, 171406. [Google Scholar] [CrossRef]
  17. Frederickson, G.N. Data structures for on-line updating of minimum spanning trees. In Proceedings of the 15th Annual ACM symposium on Theory of Computing, Boston, MA, USA, 25–27 April 1983; pp. 252–257. [Google Scholar]
  18. Ribeiro, C.C.; Toso, R.F. Experimental analysis of algorithms for updating minimum spanning trees on graphs subject to changes on edge weights. In Proceedings of the Experimental Algorithms: 6th International Workshop, WEA 2007, Rome, Italy, 6–8 June 2007; Springer: Berlin/Heidelberg, Germany, 2007; pp. 393–405. [Google Scholar]
  19. Henzinger, M.R.; King, V. Maintaining minimum spanning trees in dynamic graphs. In Proceedings of the Automata, Languages and Programming: 24th International Colloquium, ICALP’97, Bologna, Italy, 7–11 July 1997; Springer: Berlin/Heidelberg, Germany, 1997; pp. 594–604. [Google Scholar]
  20. Holm, J.; De Lichtenberg, K.; Thorup, M. Poly-logarithmic deterministic fully-dynamic algorithms for connectivity, minimum spanning tree, 2-edge, and biconnectivity. J. ACM (JACM) 2001, 48, 723–760. [Google Scholar] [CrossRef]
  21. Cattaneo, G.; Faruolo, P.; Petrillo, U.F.; Italiano, G.F. Maintaining dynamic minimum spanning trees: An experimental study. Discret. Appl. Math. 2010, 158, 404–425. [Google Scholar] [CrossRef]
  22. Eppstein, D.; Galil, Z.; Italiano, G.F.; Nissenzweig, A. Sparsification—a technique for speeding up dynamic graph algorithms. J. ACM (JACM) 1997, 44, 669–696. [Google Scholar] [CrossRef]
  23. Kopelowitz, T.; Porat, E.; Rosenmutter, Y. Improved worst-case deterministic parallel dynamic minimum spanning forest. In Proceedings of the 30th on Symposium on Parallelism in Algorithms and Architectures, Vienna, Austria, 16–18 July 2018; pp. 333–341. [Google Scholar]
  24. Tarjan, R.E.; Werneck, R.F. Dynamic trees in practice. J. Exp. Algorithmics (JEA) 2010, 14, 4–5. [Google Scholar]
  25. Wulff-Nilsen, C. Fully-dynamic minimum spanning forest with improved worst-case update time. In Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing, Montreal, QC, Canada, 19–23 June 2017; pp. 1130–1143. [Google Scholar]
  26. de Andrade Júnior, J.W.; Duarte Seabra, R. Fully retroactive minimum spanning tree problem. Comput. J. 2022, 65, 973–982. [Google Scholar] [CrossRef]
  27. Tseng, T.; Dhulipala, L.; Shun, J. Parallel batch-dynamic minimum spanning forest and the efficiency of dynamic agglomerative graph clustering. In Proceedings of the 34th ACM Symposium on Parallelism in Algorithms and Architectures, Philadelphia, PA, USA, 11–14 July 2022; pp. 233–245. [Google Scholar]
  28. Sundar, S.; Singh, A. New heuristic approaches for the dominating tree problem. Appl. Soft Comput. 2013, 13, 4695–4703. [Google Scholar] [CrossRef]
Figure 1. Illustration of the subgraph minimum spanning tree problem. (a) The original graph and its MST; (b) Removing edge (A, B) changes the MST; (c) Removing edge (H, E) does not change the MST. (Numbers on edges represent edge weights; the same applies to the following figures.)
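The deletion cases in Figure 1 follow the classic replacement-edge rule: removing a non-tree edge leaves the MST untouched, while removing a tree edge splits the tree into two components that are reconnected by the cheapest surviving edge across the cut. The following Python sketch makes the rule concrete; it is illustrative only, assumes the graph is stored as an adjacency dictionary {u: {v: weight}} with the MST kept as a set of frozenset edges, uses a hypothetical helper name, and is not the maintenance algorithm proposed in this paper.

```python
def delete_edge(graph, mst, u, v):
    """Remove edge (u, v) from graph and repair mst in place (hypothetical sketch)."""
    graph[u].pop(v, None)
    graph[v].pop(u, None)
    e = frozenset((u, v))
    if e not in mst:                      # non-tree edge: MST unchanged (Figure 1c)
        return mst
    mst.discard(e)                        # tree edge: the tree splits in two (Figure 1b)
    # Collect the component of u in the remaining tree.
    comp, stack = {u}, [u]
    while stack:
        x = stack.pop()
        for y in graph[x]:
            if frozenset((x, y)) in mst and y not in comp:
                comp.add(y)
                stack.append(y)
    # The cheapest surviving edge crossing the cut reconnects the two parts.
    best = None
    for x in comp:
        for y, w in graph[x].items():
            if y not in comp and (best is None or w < best[0]):
                best = (w, x, y)
    if best is not None:                  # otherwise the graph itself became disconnected
        mst.add(frozenset((best[1], best[2])))
    return mst
```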
Figure 2. Deleting vertex B yields the same MST as deleting all edges incident to B. (a) The original graph; (b) Deleting vertex B; (c) Deleting all edges connected to vertex B.
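As Figure 2 indicates, removing a vertex can be treated as removing each of its incident edges in turn. Continuing the sketch above, with the same assumed data layout and the hypothetical delete_edge helper:

```python
def delete_vertex(graph, mst, b):
    """Delete vertex b by deleting its incident edges one at a time, then dropping b."""
    for neighbor in list(graph[b]):
        delete_edge(graph, mst, b, neighbor)
    del graph[b]                          # b is now isolated and can be removed
    return mst
```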
Figure 3. Example of a supergraph minimum spanning tree. (a) The original graph and its MST; (b) Adding edge (Q, C) with weight 4 changes the MST; (c) Adding edge (Q, C) with weight 15 does not change the MST.
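The insertion cases in Figure 3 follow the standard cycle rule: a newly added edge enters the MST only if its weight is smaller than the heaviest edge on the tree path between its endpoints. A minimal sketch under the same assumptions as above (again illustrative, not the implementation of this paper):

```python
def insert_edge(graph, mst, u, v, w):
    """Add edge (u, v) with weight w to graph and repair mst in place (hypothetical sketch)."""
    graph.setdefault(u, {})[v] = w
    graph.setdefault(v, {})[u] = w
    # Find the tree path u -> v by a DFS restricted to MST edges.
    parent, stack = {u: None}, [u]
    while stack:
        x = stack.pop()
        if x == v:
            break
        for y in graph[x]:
            if frozenset((x, y)) in mst and y not in parent:
                parent[y] = x
                stack.append(y)
    if v not in parent:                   # endpoints were in different trees: just link them
        mst.add(frozenset((u, v)))
        return mst
    # Heaviest edge on the cycle closed by (u, v).
    heaviest, node = None, v
    while parent[node] is not None:
        p = parent[node]
        w_edge = graph[node][p]
        if heaviest is None or w_edge > heaviest[0]:
            heaviest = (w_edge, frozenset((node, p)))
        node = p
    if heaviest[0] > w:                   # weight-4 case in Figure 3b: swap edges
        mst.discard(heaviest[1])
        mst.add(frozenset((u, v)))
    return mst                            # weight-15 case in Figure 3c: MST unchanged
```

For the weight-4 edge in Figure 3b the swap is performed, while for the weight-15 edge in Figure 3c the existing tree path is already cheaper and the MST is unchanged. The same routine also covers a newly added vertex such as Q in Figure 4, whose first edge simply links a previously unseen endpoint.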
Figure 4. Example of a supergraph with added vertex Q.
Figure 5. Schematic diagram of the splitting operation used to obtain the minimum spanning tree of the changed graph. (a) The original graph and its MST; (b) Removing edge (A, E); (c) Removing edge (A, E) and adding edge (A, B).
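Figure 5 combines a deletion with an insertion. The paper handles such mixed updates with its splitting operation; a simpler baseline, shown here only to make the semantics of the example concrete, is to apply the single-edge sketches above in sequence:

```python
def apply_changes(graph, mst, deletions, insertions):
    """Naive mixed update: process edge deletions first, then edge insertions."""
    for u, v in deletions:
        delete_edge(graph, mst, u, v)
    for u, v, w in insertions:
        insert_edge(graph, mst, u, v, w)
    return mst
```

Because each single-edge routine restores a valid MST of the intermediate graph, applying them in sequence yields an MST of the final graph, at the cost of repeated tree searches that a dedicated batch procedure can avoid.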
Figure 6. The relationship between algorithm time consumption and graph changes. (a) Range_150_500_1; (b) Range_150_500_3; (c) Range_125_500_3; (d) Range_100_500_3; (e) Range_125_300_1; (f) Range_100_300_1.
Table 1. Experimental results of supergraph minimum spanning tree maintenance algorithm. (MT, Kru, and μ are reported after increasing 10, 50, and 200 edges.)

| Group | Instance | MT (10) | Kru (10) | μ (10) | MT (50) | Kru (50) | μ (50) | MT (200) | Kru (200) | μ (200) | V/E |
| Range_100 | ins_200_1 | 613 | 4658 | 7.60 | 792 | 4582 | 5.79 | 1507 | 4454 | 2.96 | 200\2188 |
| Range_100 | ins_200_2 | 605 | 4460 | 7.37 | 765 | 4423 | 5.78 | 1519 | 4459 | 2.94 | 200\2147 |
| Range_100 | ins_200_3 | 546 | 4306 | 7.89 | 780 | 4410 | 5.65 | 1403 | 4389 | 3.13 | 200\2069 |
| Range_100 | ins_300_1 | 1348 | 12,578 | 9.33 | 1392 | 12,519 | 8.99 | 2334 | 15,954 | 6.84 | 300\4983 |
| Range_100 | ins_300_2 | 1252 | 11,555 | 9.23 | 1416 | 11,514 | 8.13 | 1913 | 12,721 | 6.65 | 300\4737 |
| Range_100 | ins_300_3 | 1120 | 11,045 | 9.86 | 1312 | 10,857 | 8.28 | 2077 | 14,023 | 6.75 | 300\4577 |
| Range_100 | ins_400_1 | 1911 | 23,722 | 12.41 | 2398 | 24,026 | 10.02 | 3042 | 23,907 | 7.86 | 400\8738 |
| Range_100 | ins_400_2 | 1743 | 22,071 | 12.66 | 2252 | 22,810 | 10.13 | 2885 | 21,926 | 7.60 | 400\8314 |
| Range_100 | ins_400_3 | 1689 | 21,663 | 12.83 | 2227 | 22,209 | 9.97 | 2839 | 21,486 | 7.57 | 400\8109 |
| Range_100 | ins_500_1 | 3039 | 39,791 | 13.09 | 3693 | 39,676 | 10.74 | 4356 | 39,850 | 9.15 | 500\13,716 |
| Range_100 | ins_500_2 | 2808 | 37,406 | 13.32 | 3495 | 37,074 | 10.61 | 4204 | 37,198 | 8.85 | 500\13,069 |
| Range_100 | ins_500_3 | 2778 | 36,164 | 13.02 | 3262 | 36,023 | 11.04 | 3992 | 36,230 | 9.08 | 500\12,681 |
| Range_125 | ins_200_1 | 770 | 7319 | 9.51 | 1113 | 7233 | 6.50 | 1571 | 7172 | 4.57 | 200\3244 |
| Range_125 | ins_200_2 | 722 | 7060 | 9.78 | 1006 | 6855 | 6.81 | 1663 | 7008 | 4.21 | 200\3221 |
| Range_125 | ins_200_3 | 771 | 6848 | 8.88 | 945 | 6813 | 7.21 | 1570 | 6837 | 4.35 | 200\3090 |
| Range_125 | ins_300_1 | 1514 | 19,309 | 12.75 | 1957 | 19,408 | 9.92 | 2775 | 19,642 | 7.08 | 300\7406 |
| Range_125 | ins_300_2 | 1402 | 17,862 | 12.74 | 1835 | 17,996 | 9.81 | 2474 | 18,059 | 7.30 | 300\7008 |
| Range_125 | ins_300_3 | 1393 | 17,476 | 12.55 | 1795 | 17,401 | 9.69 | 2449 | 17,376 | 7.10 | 300\6841 |
| Range_125 | ins_400_1 | 2680 | 36,708 | 13.70 | 3354 | 36,714 | 10.95 | 3954 | 36,304 | 9.18 | 400\12,958 |
| Range_125 | ins_400_2 | 2386 | 31,480 | 13.19 | 3068 | 33,638 | 10.96 | 3751 | 33,768 | 9.00 | 400\12,419 |
| Range_125 | ins_400_3 | 2418 | 32,709 | 13.53 | 2781 | 32,620 | 11.73 | 3668 | 32,288 | 8.80 | 400\11,942 |
| Range_125 | ins_500_1 | 4558 | 61,871 | 13.57 | 5615 | 61,847 | 11.01 | 6101 | 62,193 | 10.19 | 500\20,377 |
| Range_125 | ins_500_2 | 4345 | 58,723 | 13.52 | 5115 | 57,326 | 11.21 | 5727 | 58,588 | 10.23 | 500\19,374 |
| Range_125 | ins_500_3 | 4183 | 56,010 | 13.39 | 4864 | 56,056 | 11.52 | 5545 | 56,176 | 10.13 | 500\18,791 |
| Range_150 | ins_200_1 | 983 | 10,254 | 10.43 | 1229 | 10,118 | 8.23 | 1788 | 10,072 | 5.63 | 200\4345 |
| Range_150 | ins_200_2 | 1006 | 10,357 | 10.30 | 1155 | 10,289 | 8.91 | 1862 | 10,439 | 5.61 | 200\4446 |
| Range_150 | ins_200_3 | 953 | 10,064 | 10.56 | 1210 | 9876 | 8.16 | 1828 | 10,672 | 5.84 | 200\4246 |
| Range_150 | ins_300_1 | 2027 | 26,763 | 13.20 | 2665 | 26,845 | 10.07 | 3197 | 26,564 | 8.31 | 300\10,039 |
| Range_150 | ins_300_2 | 1898 | 25,579 | 13.48 | 2385 | 25,576 | 10.72 | 3003 | 25,017 | 8.33 | 300\9693 |
| Range_150 | ins_300_3 | 1824 | 24,286 | 13.31 | 2252 | 24,390 | 10.83 | 3093 | 24,623 | 7.96 | 300\9418 |
| Range_150 | ins_400_1 | 3638 | 48,457 | 13.32 | 4652 | 51,637 | 11.10 | 5571 | 52,937 | 9.50 | 400\17,629 |
| Range_150 | ins_400_2 | 3454 | 47,896 | 13.87 | 4497 | 49,222 | 10.95 | 5522 | 49,464 | 8.96 | 400\17,058 |
| Range_150 | ins_400_3 | 3529 | 46,990 | 13.32 | 4156 | 46,861 | 11.28 | 4978 | 46,683 | 9.38 | 400\16,340 |
| Range_150 | ins_500_1 | 6621 | 91,353 | 13.80 | 7751 | 93,050 | 12.00 | 8574 | 97,647 | 11.39 | 500\27,720 |
| Range_150 | ins_500_2 | 6322 | 87,898 | 13.90 | 7244 | 88,159 | 12.17 | 8015 | 88,398 | 11.03 | 500\26,676 |
| Range_150 | ins_500_3 | 6085 | 84,470 | 13.88 | 6814 | 84,173 | 12.35 | 7694 | 87,448 | 11.37 | 500\25,814 |
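In Tables 1, 2, and 3, μ is the ratio Kru/MT; for instance, 4658/613 ≈ 7.60 for ins_200_1 in Table 1 after adding 10 edges. This is consistent with MT denoting the running time of the proposed maintenance algorithm and Kru the time of recomputing the MST with Kruskal's algorithm, so larger μ means a larger speedup; the V/E column appears to list the number of vertices and edges of each instance.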
Table 2. Experimental results of subgraph minimum spanning tree maintenance algorithm. (MT, Kru, and μ are reported after decreasing 10, 50, and 200 edges.)

| Group | Instance | MT (10) | Kru (10) | μ (10) | MT (50) | Kru (50) | μ (50) | MT (200) | Kru (200) | μ (200) | V/E |
| Range_100 | ins_200_1 | 469 | 4179 | 8.91 | 550 | 4157 | 7.56 | 646 | 4198 | 6.50 | 200\2188 |
| Range_100 | ins_200_2 | 447 | 4056 | 9.07 | 556 | 4076 | 7.33 | 643 | 4057 | 6.31 | 200\2147 |
| Range_100 | ins_200_3 | 416 | 3850 | 9.25 | 518 | 4080 | 7.88 | 616 | 4007 | 6.50 | 200\2069 |
| Range_100 | ins_300_1 | 900 | 11,517 | 12.80 | 955 | 11,050 | 11.57 | 1160 | 11,398 | 9.83 | 300\4983 |
| Range_100 | ins_300_2 | 826 | 10,522 | 12.74 | 941 | 10,519 | 11.18 | 1056 | 10,523 | 9.96 | 300\4737 |
| Range_100 | ins_300_3 | 819 | 10,385 | 12.68 | 927 | 10,119 | 10.92 | 996 | 10,207 | 10.25 | 300\4577 |
| Range_100 | ins_400_1 | 1553 | 21,538 | 13.87 | 1579 | 21,736 | 13.77 | 1887 | 22,012 | 11.67 | 400\8738 |
| Range_100 | ins_400_2 | 1466 | 20,272 | 13.83 | 1519 | 20,290 | 13.36 | 1806 | 20,646 | 11.43 | 400\8314 |
| Range_100 | ins_400_3 | 1428 | 19,625 | 13.74 | 1505 | 19,477 | 12.94 | 1673 | 19,447 | 11.62 | 400\8109 |
| Range_100 | ins_500_1 | 2387 | 35,003 | 14.66 | 2497 | 35,099 | 14.06 | 2702 | 35,310 | 13.07 | 500\13,716 |
| Range_100 | ins_500_2 | 2319 | 33,418 | 14.41 | 2404 | 33,227 | 13.82 | 2572 | 33,321 | 12.96 | 500\13,069 |
| Range_100 | ins_500_3 | 2217 | 32,191 | 14.52 | 2294 | 31,976 | 13.94 | 2513 | 32,856 | 13.07 | 500\12,681 |
| Range_125 | ins_200_1 | 548 | 6492 | 11.85 | 645 | 6497 | 10.07 | 717 | 6552 | 9.14 | 200\3244 |
| Range_125 | ins_200_2 | 555 | 6481 | 11.68 | 633 | 6424 | 10.15 | 709 | 6473 | 9.13 | 200\3221 |
| Range_125 | ins_200_3 | 526 | 6279 | 11.94 | 592 | 6472 | 10.93 | 709 | 6329 | 8.93 | 200\3090 |
| Range_125 | ins_300_1 | 1330 | 17,949 | 13.50 | 1351 | 17,585 | 13.02 | 1502 | 17,716 | 11.79 | 300\7406 |
| Range_125 | ins_300_2 | 1221 | 16,366 | 13.40 | 1289 | 16,247 | 12.60 | 1391 | 16,644 | 11.97 | 300\7008 |
| Range_125 | ins_300_3 | 1186 | 15,974 | 13.47 | 1284 | 16,165 | 12.59 | 1363 | 16,102 | 11.81 | 300\6841 |
| Range_125 | ins_400_1 | 2250 | 33,490 | 14.88 | 2332 | 33,260 | 14.26 | 2470 | 32,925 | 13.33 | 400\12,958 |
| Range_125 | ins_400_2 | 2100 | 31,241 | 14.88 | 2150 | 31,217 | 14.52 | 2367 | 31,288 | 13.22 | 400\12,419 |
| Range_125 | ins_400_3 | 2001 | 29,600 | 14.79 | 1992 | 29,764 | 14.94 | 2172 | 29,805 | 13.72 | 400\11,942 |
| Range_125 | ins_500_1 | 3608 | 54,641 | 15.14 | 4060 | 56,522 | 13.92 | 4176 | 56,144 | 13.44 | 500\20,377 |
| Range_125 | ins_500_2 | 3458 | 52,569 | 15.20 | 3759 | 52,829 | 14.05 | 4004 | 52,615 | 13.14 | 500\19,374 |
| Range_125 | ins_500_3 | 3286 | 49,102 | 14.94 | 3627 | 50,346 | 13.88 | 3744 | 50,755 | 13.56 | 500\18,791 |
| Range_150 | ins_200_1 | 698 | 9338 | 13.38 | 712 | 9277 | 13.03 | 835 | 9337 | 11.18 | 200\4345 |
| Range_150 | ins_200_2 | 684 | 9554 | 13.97 | 746 | 9587 | 12.85 | 812 | 9545 | 11.75 | 200\4446 |
| Range_150 | ins_200_3 | 676 | 8986 | 13.29 | 705 | 9045 | 12.83 | 805 | 8909 | 11.07 | 200\4246 |
| Range_150 | ins_300_1 | 1667 | 24,535 | 14.72 | 1694 | 24,521 | 14.48 | 1848 | 24,447 | 13.23 | 300\10,039 |
| Range_150 | ins_300_2 | 1588 | 23,525 | 14.81 | 1567 | 23,286 | 14.86 | 1703 | 23,229 | 13.64 | 300\9693 |
| Range_150 | ins_300_3 | 1520 | 22,584 | 14.86 | 1548 | 22,503 | 14.54 | 1650 | 22,669 | 13.74 | 300\9418 |
| Range_150 | ins_400_1 | 3302 | 45,758 | 13.86 | 3218 | 46,094 | 14.32 | 3318 | 47,084 | 14.19 | 400\17,629 |
| Range_150 | ins_400_2 | 3116 | 43,331 | 13.91 | 3208 | 43,747 | 13.64 | 3269 | 44,703 | 13.67 | 400\17,058 |
| Range_150 | ins_400_3 | 2929 | 43,030 | 14.69 | 2893 | 42,702 | 14.76 | 3027 | 42,243 | 13.96 | 400\16,340 |
| Range_150 | ins_500_1 | 5330 | 83,440 | 15.65 | 5873 | 83,121 | 14.15 | 6292 | 83,674 | 13.30 | 500\27,720 |
| Range_150 | ins_500_2 | 5104 | 79,433 | 15.56 | 5647 | 79,075 | 14.00 | 6009 | 79,120 | 13.17 | 500\26,676 |
| Range_150 | ins_500_3 | 4919 | 76,122 | 15.48 | 5433 | 76,347 | 14.05 | 5864 | 76,657 | 13.07 | 500\25,814 |
Table 3. Experimental results of dynamic graph minimum spanning tree maintenance algorithm. (MT, Kru, and μ are reported after changing 10, 50, and 200 edges.)

| Group | Instance | MT (10) | Kru (10) | μ (10) | MT (50) | Kru (50) | μ (50) | MT (200) | Kru (200) | μ (200) | V/E |
| Range_100 | ins_200_1 | 76 | 365 | 4.80 | 98 | 390 | 3.98 | 176 | 537 | 3.05 | 200\2188 |
| Range_100 | ins_200_2 | 71 | 359 | 5.06 | 90 | 378 | 4.20 | 173 | 524 | 3.03 | 200\2147 |
| Range_100 | ins_200_3 | 72 | 351 | 4.88 | 86 | 362 | 4.21 | 171 | 504 | 2.95 | 200\2069 |
| Range_100 | ins_300_1 | 136 | 924 | 6.79 | 161 | 959 | 5.96 | 244 | 1301 | 5.33 | 300\4983 |
| Range_100 | ins_300_2 | 133 | 866 | 6.51 | 152 | 896 | 5.89 | 231 | 1248 | 5.40 | 300\4737 |
| Range_100 | ins_300_3 | 129 | 814 | 6.31 | 149 | 873 | 5.86 | 226 | 1124 | 4.97 | 300\4577 |
| Range_100 | ins_400_1 | 246 | 1815 | 7.38 | 258 | 1814 | 7.03 | 319 | 1817 | 5.70 | 400\8738 |
| Range_100 | ins_400_2 | 235 | 1646 | 7.00 | 243 | 1667 | 6.86 | 315 | 1686 | 5.35 | 400\8314 |
| Range_100 | ins_400_3 | 224 | 1601 | 7.15 | 235 | 1671 | 7.11 | 305 | 1701 | 5.58 | 400\8109 |
| Range_100 | ins_500_1 | 363 | 2956 | 8.14 | 388 | 2941 | 7.58 | 438 | 3061 | 6.99 | 500\13,716 |
| Range_100 | ins_500_2 | 356 | 2785 | 7.82 | 381 | 2846 | 7.47 | 426 | 2900 | 6.81 | 500\13,069 |
| Range_100 | ins_500_3 | 339 | 2671 | 7.88 | 367 | 2683 | 7.31 | 421 | 2830 | 6.72 | 500\12,681 |
| Range_125 | ins_200_1 | 88 | 567 | 6.44 | 105 | 571 | 5.44 | 186 | 819 | 4.40 | 200\3244 |
| Range_125 | ins_200_2 | 86 | 563 | 6.55 | 100 | 568 | 5.68 | 191 | 809 | 4.24 | 200\3221 |
| Range_125 | ins_200_3 | 88 | 552 | 6.27 | 98 | 542 | 5.53 | 182 | 781 | 4.29 | 200\3090 |
| Range_125 | ins_300_1 | 194 | 1493 | 7.70 | 200 | 1401 | 7.01 | 296 | 1514 | 5.11 | 300\7406 |
| Range_125 | ins_300_2 | 168 | 1359 | 8.09 | 197 | 1371 | 6.96 | 289 | 1920 | 6.64 | 300\7008 |
| Range_125 | ins_300_3 | 176 | 1311 | 7.45 | 186 | 1313 | 7.06 | 289 | 1883 | 6.52 | 300\6841 |
| Range_125 | ins_400_1 | 318 | 2837 | 8.92 | 364 | 2772 | 7.62 | 441 | 2956 | 6.70 | 400\12,958 |
| Range_125 | ins_400_2 | 291 | 2480 | 8.52 | 339 | 2584 | 7.62 | 582 | 3566 | 6.13 | 400\12,419 |
| Range_125 | ins_400_3 | 284 | 2465 | 8.68 | 331 | 2509 | 7.58 | 562 | 3541 | 6.30 | 400\11,942 |
| Range_125 | ins_500_1 | 535 | 4485 | 8.38 | 556 | 4623 | 8.31 | 795 | 5028 | 6.32 | 500\20,377 |
| Range_125 | ins_500_2 | 503 | 4283 | 8.51 | 512 | 4305 | 8.41 | 787 | 4856 | 6.17 | 500\19,374 |
| Range_125 | ins_500_3 | 477 | 4230 | 8.87 | 514 | 4294 | 8.35 | 594 | 4229 | 7.12 | 500\18,791 |
| Range_150 | ins_200_1 | 113 | 778 | 6.88 | 122 | 770 | 6.31 | 197 | 791 | 4.02 | 200\4345 |
| Range_150 | ins_200_2 | 113 | 800 | 7.08 | 124 | 791 | 6.38 | 198 | 768 | 3.88 | 200\4446 |
| Range_150 | ins_200_3 | 115 | 754 | 6.56 | 120 | 764 | 6.37 | 188 | 753 | 4.01 | 200\4246 |
| Range_150 | ins_300_1 | 266 | 2143 | 8.06 | 271 | 2066 | 7.62 | 315 | 2140 | 6.79 | 300\10,039 |
| Range_150 | ins_300_2 | 243 | 1933 | 7.95 | 261 | 2006 | 7.69 | 303 | 1933 | 6.38 | 300\9693 |
| Range_150 | ins_300_3 | 239 | 1832 | 7.67 | 247 | 1881 | 7.62 | 303 | 1952 | 6.44 | 300\9418 |
| Range_150 | ins_400_1 | 453 | 3837 | 8.47 | 462 | 3947 | 8.54 | 526 | 4053 | 7.71 | 400\17,629 |
| Range_150 | ins_400_2 | 455 | 3749 | 8.24 | 444 | 3720 | 8.38 | 484 | 3767 | 7.78 | 400\17,058 |
| Range_150 | ins_400_3 | 426 | 3531 | 8.29 | 450 | 3651 | 8.11 | 532 | 3977 | 7.48 | 400\16,340 |
| Range_150 | ins_500_1 | 663 | 7628 | 11.51 | 766 | 7723 | 10.08 | 801 | 7924 | 9.89 | 500\27,720 |
| Range_150 | ins_500_2 | 642 | 7195 | 11.21 | 730 | 7225 | 9.90 | 752 | 7226 | 9.61 | 500\26,676 |
| Range_150 | ins_500_3 | 602 | 6678 | 11.09 | 716 | 6925 | 9.67 | 731 | 7155 | 9.79 | 500\25,814 |
Table 4. Experimental results on cluster graphs.

| Instance | MT | Kru | μ Kru | Pri | μ Pri | V/E |
| cluster-3x3-500-30-1 | 447 | 5293 | 11.84 | 3412 | 7.63 | 4500\333,999 |
| cluster-3x3-500-30-2 | 493 | 3809 | 7.73 | 3308 | 6.71 | 4500\332,934 |
| cluster-3x3-500-30-3 | 466 | 3932 | 8.44 | 3396 | 7.29 | 4500\333,244 |
| cluster-3x3-500-60-1 | 902 | 11,694 | 12.96 | 7936 | 8.80 | 4500\666,131 |
| cluster-3x3-500-60-2 | 902 | 12,549 | 13.91 | 7648 | 8.48 | 4500\666,314 |
| cluster-3x3-500-60-3 | 1015 | 12,237 | 12.06 | 7679 | 7.57 | 4500\666,857 |
| cluster-3x3-500-90-1 | 1987 | 21,739 | 10.94 | 14,494 | 7.29 | 4500\1,000,146 |
| cluster-3x3-500-90-2 | 1904 | 21,781 | 11.44 | 13,241 | 6.95 | 4500\1,000,481 |
| cluster-3x3-500-90-3 | 1961 | 21,826 | 11.13 | 14,642 | 7.47 | 4500\1,000,563 |
| cluster-4x4-1000-30-1 | 5663 | 53,520 | 9.45 | 31,389 | 5.54 | 16,000\2,374,940 |
| cluster-4x4-1000-30-2 | 5285 | 47,295 | 8.95 | 32,211 | 6.09 | 16,000\2,372,733 |
| cluster-4x4-1000-30-3 | 5387 | 72,869 | 13.53 | 30,511 | 5.66 | 16,000\2,372,381 |
| cluster-4x4-1000-60-1 | 11,743 | 166,820 | 14.21 | 81,401 | 6.93 | 16,000\4,746,653 |
| cluster-4x4-1000-60-2 | 11,369 | 154,381 | 13.58 | 79,838 | 7.02 | 16,000\4,745,725 |
| cluster-4x4-1000-60-3 | 11,688 | 164,323 | 14.06 | 75,286 | 6.44 | 16,000\4,745,983 |
| cluster-4x4-1000-90-1 | 20,378 | 253,497 | 12.44 | 134,857 | 6.62 | 16,000\7,121,777 |
| cluster-4x4-1000-90-2 | 18,374 | 261,348 | 14.22 | 135,870 | 7.39 | 16,000\7,120,605 |
| cluster-4x4-1000-90-3 | 18,968 | 249,905 | 13.18 | 128,954 | 6.80 | 16,000\7,120,813 |
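In Table 4, μ Kru = Kru/MT and μ Pri = Pri/MT; for cluster-3x3-500-30-1, for example, 5293/447 ≈ 11.84 and 3412/447 ≈ 7.63. The Pri column presumably reports recomputation with Prim's algorithm, which is consistently faster than Kruskal's on these dense cluster instances but still well above the maintenance time MT.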
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
