3.1. Metrics on Project Complexity
Pryke stated that Social Network Analysis (SNA) tools can be adapted to study governance in projects, focusing on two key properties: density and centrality. These properties allow for the analysis of important aspects of projects, such as the role of stakeholders and the types of contracts that are established in human activities [46].
Unlike other small-world networks, studies applying SNA to projects, conducted by authors such as Carrington et al. [47] and Kenis and Oerlemans [48], agree that networks within projects are temporary, non-repetitive, and have objectives clearly defined by stakeholders through requirements. Nowadays, projects are not executed by a single company but involve coordination among several companies of different types. These interrelationships can be formal (contracts) or informal (verbal agreements or positionings), strong (relationships in a specific phase of the project) or weak (for example, a partnership facing work affecting a pipe in a street), frequent or infrequent, and can be based on emotional aspects (projects with a social component) or pragmatic ones (work without social relevance). Thus, projects generate networks of relationships among their participants, which can present both opportunities and threats.
Borgatti and Halgin [49] mention two network research models to address governance from a structural perspective: the flow model and the coordination model.
The flow model considers how the structure of the network influences a given flow, which allows studying the positions of the actors within the network. This model evaluates three key structural properties:
Degree centrality, which measures how many connections a node has with other nodes, highlighting those nodes that have high visibility and control over actors in the periphery. An interesting extension of this concept appears in the strategy-structure literature, which suggests that performance will be higher when the strategy and structure of the firm are consistent with the strengths of the project [50].
Closeness centrality, which indicates which nodes are the most efficient for receiving and transmitting information quickly. One study delves into the problems of computing closeness centrality when networks are directed and weighted, proposing as a solution the use of effective distance instead of geometric distance through Dijkstra’s algorithm [51].
Betweenness centrality, which evaluates the importance of a node in a network based on how many times it acts as a bridge on the shortest paths between other nodes. The main drawback of this centrality is its calculation, since it requires computationally demanding algorithms [52].
Figure 4 shows the three types of centralities defined.
Density, which is the ratio of existing connections to the maximum possible number of links, and reflects the level of connectivity of the network. For large projects, such as international development projects or United Nations peacekeeping missions, calculating density and associating it with centralities is complicated. There are algorithms that allow it to be calculated by successive approximations based on centroid calculations [53].
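As an illustration, the centralities and density above can be computed with standard SNA tooling. A minimal sketch using the Python networkx library on a hypothetical five-actor project network (the actors and ties are invented for illustration, not taken from the source):

```python
import networkx as nx

# Hypothetical five-actor project network; "PM" acts as the hub (names invented).
G = nx.Graph()
G.add_edges_from([("PM", "A"), ("PM", "B"), ("PM", "C"), ("A", "B"), ("C", "D")])

degree = nx.degree_centrality(G)            # direct ties, normalized by n - 1
closeness = nx.closeness_centrality(G)      # inverse average shortest-path distance
betweenness = nx.betweenness_centrality(G)  # share of shortest paths bridged
density = nx.density(G)                     # existing links / maximum possible links

print(max(degree, key=degree.get))  # the hub "PM" has the highest degree
print(density)                      # 5 of 10 possible links -> 0.5
```

The hub actor dominates all three centralities here, while the peripheral node "D" bridges no shortest paths at all.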
The complexity of a project is intrinsically related to the centralization and density of its network. Provan and Milward argue that networks with many actors and interdependencies require high levels of coordination, which increases their complexity [54]. Kim, Yan and Dooley point out that larger network size, combined with high levels of centrality and density, is associated with higher complexity [55]. Finally, authors such as Choi and Hong point out that high centralization in the information network can generate weak interactions between the center and the periphery, which could lead to the formation of consortia among peripheral actors. This tends to slow down operations and decision making, increasing costs and the risk of failure and, consequently, the complexity of the project [56].
The coordination model, on the other hand, employs Beta or Bonacich centrality, which measures the influence of a node in a network not only as a function of its degree of connection but also as a function of its connections with influential nodes. This type of centrality adjusts the centrality weight according to whether the node is connected to nodes of high or low centrality, which allows a more focused view of the dynamics of influence in networks [57].
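networkx does not implement Bonacich's beta centrality under that name, but the closely related Katz centrality follows the same idea: a node's score is weighted by the scores of the nodes pointing to it, attenuated by a factor (alpha below) that plays the role of beta. A sketch on an invented directed influence network:

```python
import networkx as nx

# Invented directed influence network: A, B, C form a cycle; D also points at A.
G = nx.DiGraph([("A", "B"), ("B", "C"), ("C", "A"), ("D", "A")])

# Katz centrality: a node scores higher when it receives links from nodes
# that are themselves central. alpha must stay below 1/lambda_max to converge.
scores = nx.katz_centrality(G, alpha=0.1, beta=1.0)

most_influential = max(scores, key=scores.get)
print(most_influential)  # "A": it receives links from both the cycle and "D"
```

Unlike plain degree centrality, "A" outranks "B" and "C" here even though all three sit on the same cycle, because "A" additionally receives a link from "D".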
Adami and Verschoore propose a structural framework for analyzing complex projects using a multi-network that operates at three levels [58]:
Supply network, which controls the coordination and supervision of goods and services, in addition to distributing power and authority among the actors. A high level of centralization in this network implies greater control and operational burden for suppliers, while centrality at the actor level helps to identify key buyers and suppliers. Companies with high centrality usually assume the role of integrators, organizing goods and resources for the outcome of the project.
Contractual network, which manages the formal involvement of firms through contracts, and regulates the formal relationships between them, including supplier changes. A high degree of centralization in this network can generate weak interactions between central and peripheral companies, and disconnection between different levels of the supply chain. A high degree of connections in this network is usually associated with greater complexity.
Stakeholders’ information network, which deals with informal power related to information and contributes to effective governance. Centralization in this network can restrict the flow of information between central and peripheral nodes, leading to delays in problem solving and the formation of resistant nodes that negatively affect project operations.
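The three levels can be represented as separate graphs over the same set of actors and compared, for instance, by Freeman degree centralization (1.0 for a pure star with one dominant actor, 0.0 for a layer where ties are spread evenly). A sketch with invented actors and ties, not taken from the source:

```python
import networkx as nx

# Invented actors; each level of the multi-network is a separate layer.
layers = {
    "supply": nx.Graph([("Integrator", "Sup1"), ("Integrator", "Sup2"),
                        ("Integrator", "Buyer")]),
    "contractual": nx.Graph([("Owner", "Integrator"), ("Integrator", "Sup1"),
                             ("Integrator", "Sup2")]),
    "information": nx.Graph([("Owner", "Integrator"), ("Owner", "Sup1"),
                             ("Sup1", "Sup2"), ("Integrator", "Sup2")]),
}

def degree_centralization(G):
    """Freeman degree centralization: 1.0 for a star, 0.0 for a regular graph."""
    n = G.number_of_nodes()
    if n <= 2:
        return 0.0
    c = nx.degree_centrality(G)
    c_max = max(c.values())
    return sum(c_max - v for v in c.values()) / (n - 2)

centralization = {name: degree_centralization(G) for name, G in layers.items()}
print(centralization["supply"])       # 1.0: the integrator dominates the layer
print(centralization["information"])  # 0.0: a ring, information flows evenly
```

A highly centralized supply or contractual layer combined with a decentralized information layer is exactly the kind of cross-level contrast the framework is meant to expose.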
The analysis of complex networks in projects requires software tools such as UCINET 6.790 [59], which facilitates the graphical and analytical study of network structures. UCINET offers comprehensive social network analysis, allowing the study of parameters such as centrality, the identification of subgroups, and the analysis of roles within the network. These metrics help to measure the complexity of a project, analyzing both its static structure and its co-evolutionary dynamics, allowing a better understanding and management of complex construction projects.
Determining whether a system is complex is challenging due to its constantly changing nature [60]. Even so, it is essential to have metrics to measure such complexity in order to better understand its structure. The first step is to define what type of network will represent the complex projects under study, as this will influence the identification of the elements that make up the network. For example, random networks can regenerate themselves, completely changing their structure and the interrelationships between nodes.
Within these, small-world networks are particularly relevant. These networks combine a low average characteristic path length (like random networks) with a high level of clustering (like regular networks), making them a common model for phenomena as diverse as social networks, electrical networks, technical collaborations, and business partnerships [61,62].
Regarding the structural analysis of networks representing complex systems, several interesting metrics have been proposed. For example, a key metric is information flow, which is often one of the main complexity factors in projects [33]. Although these metrics may seem intuitive, they require specific calculations and in-depth knowledge of the system and its representation in the network.
Summers and Shah propose a toolkit that analyses the structure of systems along three dimensions: size, coupling and solvability. Size encompasses the objectives, the problems, and the processes that connect them. Coupling defines the level of interrelationship within the network, while solvability measures how aligned the system variables are to achieve the stated objectives. These metrics provide a holistic view of the system structure, although they require in-depth knowledge of networks [63].
There are other metrics to measure the complexity of the structure of a system, which can be found in [64], but the application of these tools in early stages can be complex and difficult to manage. Therefore, there is a need to develop faster and more efficient metrics that do not require an exhaustive analysis of the network or multi-networks of a project.
A metric that is easy to apply is the one proposed by Lassen and van der Aalst in their study [65]. In this work, they suggest that the structural complexity of systems can be decomposed into three parameters, like those mentioned by Summers and Shah: size, connections, and solvability. These are inspired by McCabe’s cyclomatic complexity described in [66] but focus only on the number of strongly connected components (SCC). They call this variation Extended Cyclomatic Complexity (ECC), which allows a more detailed analysis (Equation (3)) of the structure of complex systems:
ECC(PN) = E − N + p (3)
where PN is a Petri net [67] with three fundamental parameters: P is the number of places; T is the number of states or transitions; and F is the number of network interrelationships. E is the number of links, N is the number of nodes, and p is the number of SCC. The following example is proposed (see Figure 5), where the cyclomatic complexity of a supply network consisting of 10 nodes with 12 links and 4 strongly connected components, shown in orange, can be established.
Applying the cyclomatic complexity formula, its value is obtained:
ECC = 12 − 10 + 4 = 6
According to McCabe, this value corresponds to a simple system. If this formula were applied to a system with more links, for example 30 links, in a network with the same number of nodes (10) and the same number of strongly connected components (4), the cyclomatic complexity defined in Equation (3) would be:
ECC = 30 − 10 + 4 = 24
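The arithmetic above can be checked mechanically. The sketch below assumes the ECC form E − N + p implied by the symbol definitions (E links, N nodes, p strongly connected components) and uses networkx to count SCCs for an invented digraph:

```python
import networkx as nx

def ecc(links, nodes, scc):
    """Extended Cyclomatic Complexity, assumed here as ECC = E - N + p."""
    return links - nodes + scc

# The worked example from the text: 10 nodes, 12 links, 4 SCCs.
print(ecc(12, 10, 4))  # 6  -> below 10, a simple system on McCabe's scale
print(ecc(30, 10, 4))  # 24 -> above 20, a complex system

# For an arbitrary directed network, p can be counted directly (invented graph).
G = nx.DiGraph([(1, 2), (2, 1), (2, 3), (3, 4), (4, 3)])
p = nx.number_strongly_connected_components(G)  # {1, 2} and {3, 4} -> 2
print(ecc(G.number_of_edges(), G.number_of_nodes(), p))  # 5 - 4 + 2 = 3
```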
Although cyclomatic complexity is effective for measuring complexity in small-world networks, it does so by assuming that state transitions (key elements for understanding movement within the network) are subject to numerous initial constraints, such as the elimination of mutually beneficial links due to a lack of suitable conditions. If any of these constraints are removed, the complexity increases significantly. However, a major problem with this metric is that it does not capture changes in interactions caused by coevolution. For this reason, structural metrics must be complemented with others that allow these evolutionary dynamics to be incorporated into the complexity analysis.
When, in a complex network, both state variables and interactions have a dynamic behaviour, it is crucial to analyse how both elements evolve over time. In this context, Íñiguez and Barrio propose a simple metric to evaluate coevolution before going into its detailed evolution, since performing such an analysis directly could be complicated [67]. To approach this analysis, they establish two temporal parameters: the micro dynamics and the macro dynamics of the system. The micro dynamics describe the rapid changes in the network interactions, limited to a specific set of nodal variables, and are denoted as “dt”. The macro dynamics, on the other hand, refer to the evolution of the whole system, which requires much more time, and are symbolized as “dT”.
Coevolution, then, is measured through the parameter g, defined as the ratio between the characteristic times of the macro dynamics and the micro dynamics:
g = dT/dt
Depending on the value of g, three possible scenarios are identified. When g → 0, no coevolution is observed, and the complexity of the system is explained solely by its structure. If g → ∞, the network ceases to exist in topological terms and only a constant evolutionary function remains. Between these two extremes, the interaction between structure and coevolution generates similar emergent properties over time, resulting in a heterogeneous network with the emergence of communities, typical of small-world networks.
To simplify the calculation of g without the need for a complex network analysis, Íñiguez and Barrio [68] propose the following formula:
g = <k>/(N·<C>)
where:
<k> is the average degree of the network. In small-world networks it can be considered as the highest-order degree of the network;
N is the number of nodes; and
<C> represents the average clustering coefficient. In small-world networks, with a minimum clustering value around 0.75, it is observed that as N grows, g slowly tends to 0, as shown in Figure 6.
This behavior suggests that, in networks with few nodes, coevolution can lead to faster and possibly much more polarized opinion dynamics, influenced by time dt and limited connectivity between nodes.
The plot shows noticeable fluctuations in the degree of coevolution g when the number of nodes is low, indicating great variability in the structure and interactions of the network in its early stages of growth. These fluctuations reflect the inherent complexity of the system, since, in small networks, changes in the interrelationships between nodes have a greater impact on the overall dynamics of the network. As the number of nodes increases, the coevolution stabilizes, suggesting that the network becomes more structured and less sensitive to small changes, reducing the dynamic complexity of the system. The strong initial fluctuations are therefore associated with the emerging complexity in the early stages of the network.
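A quick numeric check of how g behaves with growing N, computed from the three quantities <k>, N and <C>. The functional form g = <k>/(N·<C>) used below is an assumption consistent with the stated limiting behaviour (g slowly tending to 0 as N grows, with <C> around 0.75), not necessarily the authors' exact expression:

```python
def coevolution_g(k_avg, n_nodes, c_avg=0.75):
    """Assumed form of the coevolution parameter: g = <k> / (N * <C>)."""
    return k_avg / (n_nodes * c_avg)

# Small networks show strong coevolution (large g); large ones stabilize.
for n in (10, 100, 1000):
    print(n, coevolution_g(k_avg=6, n_nodes=n))
```

The monotone decay with N mirrors the plot's stabilization: a 10-node network sits in the strongly coevolving regime, while a 1000-node network is essentially structure-dominated.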
3.2. GCI: A Model to Assess the Complexity of Projects
The proposed model to assess the complexity of projects is presented in Figure 7. The first step is the study and analysis of the project. After project awareness, the second step consists of developing the networks of the project, according to the supply data, the contractual data, and the stakeholder data and their relationships. The networks built and their characteristics allow the calculation of two metrics: the Structural Complexity (SC) and the Dynamic Complexity (DC).
The SC and DC metrics are defined as follows:
Structural Complexity (SC), through the measurement of Extended Cyclomatic Complexity (ECC). This complexity is always present, since it depends on the structure of the network representing the project. Thus, the following reference values [66] and their assignments to the complexity level will be considered:
- ECC ≤ 10. Simple project. Structural complexity will have a value of 0.
- ECC > 10 and ≤ 20. Complicated project. Structural complexity will have a value of 3.
- ECC > 20. Complex project. Structural complexity will have a value of 5.
Dynamic Complexity (DC), by applying the coevolution parameter g, which will depend on the number of nodes. Thus, the following reference values [67] and assignments to the complexity level will be considered:
- Number of nodes (N) ≤ 25. High coevolution. The dynamic complexity will have a value of 5.
- Number of nodes (N) > 25 and ≤ 100. Medium coevolution. The dynamic complexity will have a value of 3.
- Number of nodes (N) > 100. Low coevolution. The dynamic complexity will have a value of 0.
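The two scales reduce to simple threshold functions; a direct transcription of the reference values above:

```python
def structural_complexity(ecc):
    """SC score from ECC: simple -> 0, complicated -> 3, complex -> 5."""
    if ecc <= 10:
        return 0
    if ecc <= 20:
        return 3
    return 5

def dynamic_complexity(n_nodes):
    """DC score from network size: small networks coevolve most strongly."""
    if n_nodes <= 25:
        return 5
    if n_nodes <= 100:
        return 3
    return 0

print(structural_complexity(24))  # 24 > 20 -> complex project -> 5
print(dynamic_complexity(40))     # 25 < 40 <= 100 -> medium coevolution -> 3
```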
Taking into account the two metrics described, and following different models reported in the scientific literature [68,69,70], such as the model proposed by Stacey [71], which classifies projects according to the degree of agreement in the project team and the complexity of the project interactions, or the model of Williams and Hillson [72], which classifies projects by means of the complexity of the project structure and its interactions, a model for evaluating the complexity of projects under uncertainty conditions is proposed, classifying projects according to a Global Complexity Index (GCI).
The modeling of the uncertainty in the assessment of the SC and DC metrics, and the final assessment of the project complexity (GCI), was performed using fuzzy logic [73]. The model consists of two input variables (SC and DC) and one output variable (GCI), with a total of eleven membership functions and nine decision rules. The model uses linear membership functions to transform each input variable onto a scale ranging from 0 to 1. A value of 0 indicates that the value in question does not belong to the given set, a value of 1 signifies that the value is undoubtedly a member of the given set, and any value between 0 and 1 represents a degree of partial membership (a higher value implies a stronger degree of membership).
Table 1 provides a detailed description of the membership functions used in this study. The parameters of each membership function have been modelled using data from previous projects to introduce uncertainty into the values of each model variable. The SC and DC metrics were modelled using gamma, L, and trapezoidal functions (Table 1), associated with the three valuation levels defined above for each of the two variables (Table 2). The index (GCI) was modelled using one gamma function, one L function, and three trapezoidal functions (Table 3). Inference was performed using the max-min method, and defuzzification using the center-of-gravity method [74].
The value of the model output variable (GCI), according to the values of the input variables (SC and DC), is defined according to the following decision rules:
If SC = fm1 & DC = fm1 → GCI = fm4
If SC = fm1 & DC = fm2 → GCI = fm2
If SC = fm1 & DC = fm3 → GCI = fm1
If SC = fm2 & DC = fm1 → GCI = fm4
If SC = fm2 & DC = fm2 → GCI = fm3
If SC = fm2 & DC = fm3 → GCI = fm2
If SC = fm3 & DC = fm1 → GCI = fm5
If SC = fm3 & DC = fm2 → GCI = fm4
If SC = fm3 & DC = fm3 → GCI = fm3
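The nine rules can be read as a lookup table from the (SC, DC) membership-function pair to the GCI membership function. In the full model, SC and DC are fuzzified first, so several rules may fire with partial strengths before defuzzification; the crisp transcription below only encodes the rule base itself:

```python
# Rule base: (SC membership fn, DC membership fn) -> GCI membership fn.
RULES = {
    ("fm1", "fm1"): "fm4", ("fm1", "fm2"): "fm2", ("fm1", "fm3"): "fm1",
    ("fm2", "fm1"): "fm4", ("fm2", "fm2"): "fm3", ("fm2", "fm3"): "fm2",
    ("fm3", "fm1"): "fm5", ("fm3", "fm2"): "fm4", ("fm3", "fm3"): "fm3",
}

print(RULES[("fm3", "fm1")])  # "fm5": the highest-complexity GCI class
print(RULES[("fm1", "fm3")])  # "fm1": the lowest
```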
The defuzzification process produces values between 0 and 5 for the output variable GCI, which ranks the complexity of the project as described in Table 4.
The proposed methodology is applicable to larger projects, regardless of the number of nodes and interactions. This is because the networks are managed using the UCINET program, which allows the analysis of large-scale network structures without significant restrictions. However, it is important to note that the complexity of a project is not intrinsically related to its size, but to the nature of its interactions, the density of connections and the level of uncertainty inherent in these interactions. The methodology addresses these factors through an adaptive approach based on fuzzy logic, which may ensure its applicability to a wide variety of projects, including those with larger and more dynamic networks.