Article

Graph-Based Traffic Forecasting with the Dynamics of Road Symmetry and Capacity Performance

1 The Key Laboratory of Road and Traffic Engineering, Ministry of Education, Tongji University, Shanghai 201800, China
2 The China Electronics Technology Group, Taiji Co., Ltd., Beijing 100124, China
3 College of Mathematical Sciences, Harbin Engineering University, Harbin 150001, China
4 Law School, Hainan Normal University, Haikou 571158, China
* Author to whom correspondence should be addressed.
Symmetry 2024, 16(7), 935; https://doi.org/10.3390/sym16070935
Submission received: 19 June 2024 / Revised: 14 July 2024 / Accepted: 19 July 2024 / Published: 22 July 2024
(This article belongs to the Section Engineering and Materials)

Abstract

Symmetry in traffic patterns is a fundamental aspect of intelligent transportation systems, which aim to precisely predict traffic flow in real time despite the complex interplay of spatial and temporal factors. This paper presents a novel traffic forecasting method that incorporates parameters related to road symmetry into a Graph Convolutional Network (GCN) model. Our model is designed to adjust dynamically to real-time changes in road conditions, including the presence of symmetric and asymmetric road layouts, which substantially influence traffic flow. We develop a GCN model that not only accounts for standard traffic flow metrics but also integrates a matrix representing road symmetry. The model is trained and validated on the METR-LA dataset, showing a marked improvement in prediction accuracy. In a comparative analysis against state-of-the-art methods, our model achieved a 30.68% improvement in Mean Squared Error (MSE) and a 24.28% improvement in Mean Absolute Error (MAE) over the best-performing baseline. These results have practical implications for urban planners, traffic management systems, and navigation service providers, offering a more dependable tool for forecasting traffic conditions, aiding road design, and refining route planning strategies based on the symmetry of road networks.

1. Introduction

Traffic forecasting stands as a pivotal element within intelligent transportation systems, tasked with the formidable challenge of accurately predicting traffic patterns in real time. The complexity of this task arises from the detailed interaction between spatial and temporal dependencies that define traffic flows. Temporal dependence highlights the sequential changes in traffic flow over time, whereas spatial dependence illustrates the geographic spread of traffic. In this context, the spatial connections between various road segments significantly influence traffic patterns.
Symmetry in road networks plays a crucial role in shaping traffic dynamics. Symmetric road layouts, such as grid-like patterns or mirrored configurations, can lead to balanced traffic distribution and more predictable flow patterns. On the other hand, asymmetric road designs, characterized by irregular shapes or uneven distributions of entry and exit points, can introduce complexity and variability in traffic behavior. Incorporating symmetry considerations into traffic forecasting models can enhance their ability to capture the nuances of traffic patterns and improve prediction accuracy.
The landscape of existing traffic models is bifurcated into non-parametric and parametric approaches. Parametric models, which rely on assumptions of smoothness, often fall short in capturing the inherent nonlinearity and unpredictability of traffic states. Non-parametric models, although proficient in discerning trends from historical data, are limited by their structural constraints, which restrict their capacity to fully comprehend the dynamic essence of traffic fluctuations.
The emergence of deep learning technologies has redirected focus towards the utilization of deep neural network models in traffic forecasting. Established methods such as Long Short-Term Memory Networks (LSTMs) [1], Recurrent Neural Networks (RNNs) [2], and Gated Recurrent Units (GRUs) [3] have risen to prominence, exploiting their recurrent structure to adeptly model temporal dependencies. The integration of Convolutional Neural Networks (CNNs) into forecasting frameworks has notably advanced the characterization of spatial dependencies. A distinguished approach involves amalgamating CNNs with LSTM units to facilitate short-term traffic flow forecasting, deploying CNNs to distill spatial features while leveraging LSTMs to dissect short-term variances and cyclical trends [4].
Moreover, the Interactive Temporal Recurrent Convolution Network (ITRCN) [5] model introduces an end-to-end paradigm that transmutes network flow interactions into visual representations for CNN-based spatial analysis, concurrently employing GRUs for the extraction of temporal features. This model demonstrates a significant performance enhancement over standalone GRU and CNN models by margins of 14.3% and 13.0%, respectively [5]. In a similar vein, Yu et al. [6] orchestrated the integration of a Deep Convolutional Neural Network (DCNN) to capture spatial correlations and an LSTM for temporal dynamics elucidation, validating the efficacy of their spatiotemporal recurrent convolutional network (SRCN) model with traffic network data hailing from Beijing [7].
While CNNs exhibit prowess in handling Euclidean data like images and grids, their efficacy wanes in traffic networks characterized by non-Euclidean structures. This challenge has been addressed by the advent of Graph Convolutional Networks (GCNs) [8], which have demonstrated their aptitude in discerning the complex structural intricacies of these networks. Concurrently, RNNs and their derivatives maintain their utility in capturing ephemeral short-term trends, thanks to their proficiency in sequential data processing over time.
Despite these advancements, a plethora of models within the graph neural network domain fail to fully embrace the dynamic attributes of roadways, such as temporary closures, repairs, and events, which can significantly alter road capacity and, by extension, traffic flow patterns. Recognizing these limitations, researchers have begun to explore more sophisticated models that incorporate temporal dynamics and real-time data updates. Studies such as Zhou et al. [9], and Cui et al. [10] highlight the development of advanced GNN frameworks capable of integrating dynamic information about road networks. These models represent a promising direction for future research, aiming to create more adaptive and accurate traffic prediction systems. Furthermore, road conditions like cracks, potholes, and variations in surface evenness, typically ensuing from prolonged use, can markedly influence vehicle speed and transit times, thereby affecting the overarching traffic flow dynamics.
This paper endeavors to bridge a notable gap in the existing graph neural network models employed for traffic flow forecasting, particularly their oversight of dynamic road conditions such as temporary closures, repairs, and the ramifications of road degradation—manifested as cracks, potholes, and alterations in surface evenness due to extended usage—on vehicle speed and transit times, which, in turn, critically influences the traffic flow paradigm.
Our contributions are as follows:
  • We propose a novel graph neural network model, augmented with a road capacity matrix, to enhance traffic forecasting capabilities.
  • We propose a road capacity matrix estimation module that initializes the matrix based on initial input states and dynamically adjusts to accommodate changing traffic conditions.
  • We develop a novel loss function that seamlessly integrates the road capacity matrix, ensuring that the model’s forecasting is informed by the practical constraints of road capacities.
  • We empirically validate our model on a public dataset, demonstrating its effectiveness and superiority over existing state-of-the-art methods.
The remainder of this paper is structured as follows. Section 2 reviews the relevant literature, offering insights into previous approaches and identifying gaps in the current understanding of dynamic traffic prediction using GNNs. Section 3 describes our methodology, detailing the graph neural network architecture and the dynamic road capacity matrix. Section 4 presents the Capacity Graph Convolutional Network model and its loss function. Section 5 reports a comprehensive analysis of our experimental results, demonstrating the efficacy of our model compared to existing benchmarks. Section 6 concludes the paper, and Section 7 outlines directions for future research.

2. Related Works

Traffic forecasting stands as a fundamental technological challenge in the deployment of intelligent transportation systems. The development of traffic forecasting has seen considerable progress via diverse computational methods [11,12]. The integration of autoregressive models with moving averages in ARIMA, as discussed by Alghamdi et al. [13] and Van Der Voort et al. [14], provided a robust method for time series forecasting in traffic systems. Further enhancement in seasonal pattern recognition was achieved with the SARIMA model, as investigated by Williams and Hoel [15].
The adoption of deep learning techniques marked a pivotal shift in handling complex spatio-temporal dependencies in traffic data [16]. Early models like FC-LSTM and ConvLSTM [17] leveraged the strengths of CNNs and RNNs to encapsulate both spatial and temporal dimensions. Subsequent developments, such as ST-ResNet [18] and ST-LSTM [19], refined these approaches, focusing on grid-based data representations. GRUs have demonstrated effectiveness in grasping the temporal dynamics of traffic states [3]. The integration of attention mechanisms, which modulate the significance of various time points and consolidate global temporal insights, has further enhanced forecasting precision [20].
Graph Convolutional Networks (GCNs) have seen successful applications across diverse areas, especially in computer vision [21,22,23]. Kipf and Welling [8] pioneered the adaptation of convolutional neural networks to graph-structured data through GCNs. These networks have proven effective in identifying spatial dependencies within traffic systems by encapsulating the topological intricacies of road networks, marking a significant advancement in modeling spatio-temporal interactions. For example, Diffusion Convolutional Recurrent Neural Networks (DCRNNs) [32] integrated diffusion graph convolution with bidirectional random walks to simultaneously map out spatial and temporal patterns. This fusion of convolutional operations across spatial and temporal dimensions was further illustrated in models such as Spatio-temporal Graph Convolutional Networks (STGCNs) [25] and Graph-WaveNet [26].
Recent trends have seen the emergence of Neural Differential Equations (NDEs) as a novel approach in traffic forecasting. NDEs introduce a mechanism for generating continuous depth in neural networks. The application of Neural Ordinary Differential Equations (NODEs) in traffic forecasting has been exemplified by models like Spatial-Temporal Graph ODE Network (STGODE) [27], which utilizes Neural Graph Ordinary Differential Equations (ODEs) to more effectively capture complex spatio-temporal dependencies.
Spatio-temporal graph networks leverage the capabilities of GCNs and RNNs to comprehend both spatial and temporal dependencies in traffic data. Cao et al. [5] converted interactive network flows into images and analyzed these flows using a CNN. Meanwhile, Zhu et al. [28] introduced the Attention Temporal Graph Convolutional Network (A3T-GCN) for traffic forecasting. This model modulates the impact of historical traffic conditions and identifies global variation trends.
The introduction of attention-based models brought new insights into learning spatial and temporal correlations. Models like Attention-Based Spatial–Temporal Graph Convolutional Networks (ASTGCNs) [29] employed dual attention layers to capture these correlations, while spatial–temporal synchronous graph convolutional networks (STSGCNs) [30] focused on localized spatio-temporal subgraphs to enhance the capture of local correlations. The innovative use of Dynamic Time Warping (DTW) in Spatial–Temporal Fusion Graph Neural Networks (STFGNNs) [31] showcased the ability to learn from data-driven spatial networks, addressing the over-smoothing problem prevalent in deep GNNs. Table 1 provides an overview of various methods used in traffic forecasting.
While graph-based methods have advanced the field of traffic forecasting, there is still room for improvement, particularly in addressing the challenges of managing noise and anomalies within complex traffic networks and enhancing the accuracy of long-term forecasts. Our proposed graph neural network model, which incorporates a road capacity matrix, aims to tackle these issues by providing a more refined representation of traffic dynamics. This model not only considers the initial state of the road network but also adapts to real-time changes through a novel road capacity matrix estimation module. Furthermore, the integration of this matrix into the model is facilitated by a specially designed loss function, which ensures that forecasts are aligned with the practical constraints of road capacities.

3. Method

Our method of traffic forecasting is centered on the development of an innovative graph neural network (GNN) model, augmented with a dynamic road capacity matrix. This section delves into the intricacies of our method, encompassing the architecture of the GNN, the construction and integration of the road capacity matrix, and the formulation of a bespoke loss function.
Our model is inspired by the pioneering work on Graph Convolutional Networks (GCNs) by [8], who demonstrated how convolutional neural networks could be adapted to process data structured as graphs. This approach is particularly effective in capturing spatial dependencies, which is crucial for traffic forecasting.
Additionally, we have incorporated elements from recent advancements in recurrent neural network (RNN) architectures to capture temporal dynamics, as demonstrated by models like LSTM and GRU, which effectively address the temporal sequences in traffic data. The integration of attention mechanisms, as suggested by [20], further allows our model to selectively focus on critical temporal information, enhancing prediction accuracy.

3.1. Graph Neural Network Architecture

The foundation of our graph neural network architecture is heavily inspired by the seminal work on GCNs by [8], who introduced a method for applying convolutional neural networks to graph-structured data. Our use of spectral convolutions is directly derived from their methodology, which utilizes the graph Laplacian matrix and its eigenvectors to effectively process signals on graphs [8]. This approach is crucial for understanding spatial dependencies within traffic networks, as it allows for the modeling of complex interdependencies within non-Euclidean data.
Furthermore, our model extends the conventional capabilities of GCNs by incorporating adaptive features that respond to dynamic changes in traffic patterns and road conditions, a concept that has been explored in recent advancements like the work by [31], who introduced dynamic graph neural networks that adjust to changes in graph topology over time.
At the base of our model lies a meticulous design to capture the complex interdependencies within traffic data through layers adept at processing non-Euclidean data. GCNs, evolving from CNNs, excel in applications like image and document classification by employing spectral convolutions. Our model utilizes these convolutions, leveraging the graph Laplacian matrix and its eigenvectors to process signals on graphs, enhancing the spatial understanding of traffic networks.
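As a brief illustration of this spectral machinery, the following minimal sketch (hypothetical code, not the authors' released implementation) constructs the symmetric normalized graph Laplacian whose eigenvectors define the spectral basis used by graph convolutions:

```python
import numpy as np

def normalized_laplacian(adj: np.ndarray) -> np.ndarray:
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    with np.errstate(divide="ignore"):
        d_inv_sqrt = np.power(deg, -0.5)
    d_inv_sqrt[np.isinf(d_inv_sqrt)] = 0.0          # isolated nodes contribute nothing
    D_inv_sqrt = np.diag(d_inv_sqrt)
    return np.eye(adj.shape[0]) - D_inv_sqrt @ adj @ D_inv_sqrt

# Toy 3-segment road graph; the eigenvectors U span the spectral domain in which
# graph signals (e.g., per-segment traffic flow) can be filtered.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
eigvals, U = np.linalg.eigh(normalized_laplacian(A))
```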
GCNs encapsulate the spatial relational structure of the road network and the characteristics of individual road segments by delineating the spatial connections between a central road segment and its adjacent sections. This method can identify spatial dependencies within the network.
As shown in Figure 1, the traffic flow data from the preceding N moments are inputted, followed by the estimation of the initial traffic capacity matrix. Subsequently, the traffic flow matrix is fed into the graph neural network. The outputs for subsequent moments are then generated, concomitant with updates to the capacity matrix. This dual encoding facilitates a nuanced representation of road networks, enabling the model to incorporate spatial dependencies more effectively. This feature is crucial in improving the predictive precision of traffic flow models, particularly in situations where the road network’s topology and condition undergo frequent alterations. Through this approach, our model not only predicts traffic flow based on historical data but also adapts to real-time changes in road conditions, offering a robust solution to dynamic traffic management and forecasting challenges.

3.2. Dynamic Road Capacity Matrix

A pivotal innovation of our model is the dynamic road capacity matrix (RCM), which encapsulates the fluctuating capacity of roads under varying conditions.
The integration of a dynamic road capacity matrix into our GNN architecture is an innovative approach that has not been widely explored in existing literature. This concept draws on the idea of enhancing neural network models with additional contextual information that reflects real-time environmental changes.
Road capacity is defined as the maximum traffic volume a road segment can handle under optimal conditions. However, this capacity is not static; it fluctuates due to various factors. Road congestion, a common occurrence in urban settings, further strains this capacity by slowing down traffic flow and reducing the number of vehicles that can pass through a given point in a specific timeframe. Temporary road closures for repairs, events, or emergencies can drastically alter the available capacity, redirecting traffic flows and potentially leading to congestion in other parts of the network. Weather changes, too, play a significant role; conditions like snow, ice, or heavy rain can decrease road capacity by affecting vehicle speed and increasing the required spacing between vehicles for safety. In addition, road damage such as potholes, cracks, and surface unevenness from wear and tear also affects this capacity. Understanding and adapting to these dynamic changes in road capacity is crucial for accurate traffic forecasting and the effective implementation of intelligent transportation systems.
Matrix Construction The road capacity matrix is initially estimated from historical traffic data and road attributes. It takes into account a multitude of factors, including road width, speed limits, and typical congestion thresholds. This matrix serves as the foundational baseline for subsequent dynamic adjustments. In practice, we average and normalize the traffic flow data of the first n moments to the range of [0, 1] as the initial matrix.
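A minimal sketch of this initialization is given below; it assumes the traffic flow history is stored as a (time steps x road segments) tensor and uses min-max scaling, which is one plausible reading of the normalization described above:

```python
import torch

def init_capacity_matrix(flow_history: torch.Tensor, n: int) -> torch.Tensor:
    """Initial road capacity values from the first n moments of traffic flow.

    flow_history: tensor of shape (T, num_segments); this layout is an assumption.
    Returns per-segment values averaged over the first n moments and min-max
    scaled to [0, 1], serving as the baseline for later dynamic adjustments.
    """
    avg = flow_history[:n].mean(dim=0)
    lo, hi = avg.min(), avg.max()
    return (avg - lo) / (hi - lo + 1e-8)   # epsilon guards against a constant history
```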
Matrix Integration and Real-Time Adjustment The matrix is dynamically woven into the fabric of GCN, empowering the model to refine its predictions in response to real-time shifts in road capacity. This is achieved through a continuous updating mechanism that reacts to new data inputs, such as temporary road closures, construction activities, or unforeseen congestion patterns.
As shown in Figure 2, factors such as congestion, accidents, and roadway issues can lead to diminished traffic capacity. Conversely, successful repairs or management of roadway issues can enhance traffic capacity. Complete road closures or significant accidents may reduce capacity to the point of rendering roads impassable. The color coding and symbols in Figure 2 are as follows:
  • white: impassable conditions;
  • light black: normal traffic flow;
  • black: slight congestion;
  • green: unimpeded traffic flow;
  • yellow: congestion;
  • purple: severe congestion.
These colors are used in the matrices to represent the states caused by various traffic events, which are uniformly summarized in the representation of traffic capacity. Rather than quantifying the detailed impact of each specific situation on traffic, we model the occurrence and impact of these events through the road capacity matrix, which follows changes in flow.
This adaptability is facilitated by a continuous update mechanism that responds to incoming data, reflecting changes such as temporary road closures, construction activities, or unexpected congestion patterns. Through this process, the model maintains its relevance and accuracy in the face of evolving traffic conditions, making it a powerful tool for predictive traffic management and planning.
The updating mechanism is articulated as follows:
C_{t+1}(i, j) = C_t(i, j) + \alpha \cdot \mathrm{Update}(i, j, \mathrm{RealTimeData})
Here, C_t(i, j) denotes the capacity of the road segment between nodes i and j at time t, and α is a scaling factor that controls the impact of real-time data. Update is a function that computes the change in capacity based on real-time data inputs for the road segment between nodes i and j. In practice, the initial matrix is parameterized and trained jointly with the model; it is then adjusted according to the traffic flow changes of the corresponding segments and fed back into the prediction of the next moment. In our experiments, α is set to 1.
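The update step can be sketched as follows. This is illustrative code: the Update term is assumed to be a precomputed per-segment adjustment derived from real-time inputs (for example, observed flow deviations or closures), and clamping to [0, 1] is our assumption to keep capacities normalized:

```python
import torch

def update_capacity(C_t: torch.Tensor, realtime_delta: torch.Tensor,
                    alpha: float = 1.0) -> torch.Tensor:
    """C_{t+1}(i,j) = C_t(i,j) + alpha * Update(i, j, RealTimeData).

    realtime_delta plays the role of Update(i, j, RealTimeData): a per-segment
    change estimated from real-time data, assumed to be computed elsewhere.
    """
    return torch.clamp(C_t + alpha * realtime_delta, 0.0, 1.0)
```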

4. Capacity Graph Convolutional Network Model

In the context of Graph Convolutional Networks (GCNs), the convolution operation found in traditional Convolutional Neural Networks (CNNs) is replaced by a spectral convolution operation. This operation takes into account the capacity matrix C, the feature matrix X, and the adjacency matrix A. By considering both each graph node and its first-order neighborhood, GCNs are able to effectively capture the spatial characteristics of the graph.
Furthermore, GCNs employ a layer-wise propagation rule to stack multiple graph convolution layers. This allows for the creation of a multilayer GCN model, which can be mathematically represented as follows:
H^{(l+1)} = \sigma\left( \tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}} H^{(l)} \theta^{(l)} \right)
where \tilde{A} represents the adjacency matrix with self-connection structures, obtained by adding the identity matrix I_N to the original adjacency matrix A. \tilde{D} denotes the corresponding degree matrix, where each diagonal element \tilde{D}_{ii} is calculated by summing the elements in row i of \tilde{A}. H^{(l)} represents the output of layer l in the GCN model, a matrix of size N × d_l, where N is the number of nodes in the graph and d_l is the number of features or channels at layer l. \theta^{(l)} represents the learnable parameters of layer l, which the GCN model optimizes during training. Finally, \sigma(\cdot) is an activation function applied element-wise to introduce nonlinearity into the model, enabling it to capture complex patterns and relationships. By integrating these components, the multilayer GCN model can effectively learn and propagate information across the graph structure, taking into account the spatial dependencies and self-connections defined by the modified adjacency matrix \tilde{A} and the degree matrix \tilde{D}.
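The propagation rule above can be written as a single PyTorch layer. The sketch below is illustrative code under our notation rather than the released implementation; the ReLU activation stands in for σ:

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One step of H^(l+1) = sigma(D~^{-1/2} A~ D~^{-1/2} H^(l) Theta^(l))."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.theta = nn.Linear(in_dim, out_dim, bias=False)   # Theta^(l)

    def forward(self, H: torch.Tensor, A: torch.Tensor) -> torch.Tensor:
        A_tilde = A + torch.eye(A.size(0), device=A.device)   # add self-connections
        d_inv_sqrt = A_tilde.sum(dim=1).pow(-0.5)
        A_hat = d_inv_sqrt[:, None] * A_tilde * d_inv_sqrt[None, :]   # D~^{-1/2} A~ D~^{-1/2}
        return torch.relu(A_hat @ self.theta(H))
```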
Generally, a two-layer GCN model can be expressed as:
f(X, A) = \sigma\left( \hat{A}\, \mathrm{ReLU}\left( \hat{A} X W_0 + C W_C \right) W_1 \right)
The equation describes a Graph Convolutional Network (GCN) model for traffic forecasting. It uses the feature matrix X, the adjacency matrix A, and the preprocessed adjacency matrix \hat{A} to capture the graph structure. The model incorporates road capacity information through the capacity matrix C and learns weight matrices W_0, W_C, and W_1 to transform the input features, capacity information, and hidden representations. The output f(X, A) provides traffic forecasting predictions for each node in the graph over a forecasting horizon of length T, using the ReLU activation function to introduce nonlinearity.
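Putting the pieces together, a minimal sketch of the two-layer formulation is shown below; the hidden width, the sigmoid output activation, and the way C is projected are assumptions made for illustration:

```python
import torch
import torch.nn as nn

class CapacityGCN(nn.Module):
    """Sketch of f(X, A) = sigma(A_hat * ReLU(A_hat * X * W0 + C * WC) * W1)."""

    def __init__(self, in_dim: int, cap_dim: int, hidden_dim: int, horizon: int):
        super().__init__()
        self.w0 = nn.Linear(in_dim, hidden_dim, bias=False)    # W0: input features
        self.wc = nn.Linear(cap_dim, hidden_dim, bias=False)   # WC: capacity matrix C
        self.w1 = nn.Linear(hidden_dim, horizon, bias=False)   # W1: forecasting length T

    def forward(self, X: torch.Tensor, C: torch.Tensor, A_hat: torch.Tensor) -> torch.Tensor:
        h = torch.relu(A_hat @ self.w0(X) + self.wc(C))        # fuse topology, flow, capacity
        return torch.sigmoid(A_hat @ self.w1(h))               # per-node forecasts in [0, 1]
```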
By discerning the topological relationships between a central road section and its adjacent counterparts, GCNs facilitate a comprehensive understanding of spatial interconnections within the traffic network. This capability allows for the effective capture of spatial dependencies, which are critical in understanding how traffic conditions in one area can influence or be influenced by conditions in nearby areas.
To augment this spatial analysis, the incorporation of a road capacity matrix within the GCN framework presents a significant advancement. The road capacity matrix is a structured representation that quantifies the traffic volume each road section can accommodate. By integrating this matrix, the GCN model can further refine its understanding of spatial dependencies by considering not only the physical and topological proximities of road sections but also their varying capacities. This enriched perspective enhances the model’s ability to predict traffic flow by accounting for the potential bottlenecks and flow constraints imposed by road capacities. Consequently, this study leverages the GCN model’s strengths in learning spatial dependencies, now enriched with insights from the road capacity matrix, to provide a more nuanced and accurate depiction of traffic dynamics across the network.

Loss Function Design

To effectively train our model and ensure its proficiency in traffic forecasting, we have crafted a specialized loss function that incorporates the dynamic road capacity matrix. This loss function is designed to penalize predictions that significantly deviate from observed traffic flow, particularly in scenarios where adherence to road capacity constraints is paramount. The loss function is articulated as:
\mathcal{L} = \sum_{ij} w_{ij} \cdot \left( \text{Predicted Traffic}_{ij} - \text{Actual Traffic}_{ij} \right)^2
where:
  • w_{ij} is a weight factor that increases for road segments where the capacity constraints are more critical, derived from the road capacity matrix;
  • Predicted Traffic_{ij} and Actual Traffic_{ij} are the predicted and actual traffic flows on road segment ij, respectively.
This loss function ensures that the model is not only penalized for overall prediction errors but also for errors that occur in areas where road capacity is a critical factor, thereby fostering a more nuanced and accurate forecasting model.
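A sketch of this loss is given below. How w_ij is derived from the road capacity matrix is only partially specified here, so the weighting rule (larger weights for segments with tighter capacity) is an illustrative assumption; rcm_weight corresponds to the RCM weight varied later in the ablation study:

```python
import torch

def capacity_weighted_loss(pred: torch.Tensor, target: torch.Tensor,
                           rcm: torch.Tensor, rcm_weight: float = 1.0) -> torch.Tensor:
    """L = sum_ij w_ij * (Predicted_ij - Actual_ij)^2.

    rcm holds normalized capacities in [0, 1]; the mapping below assigns larger
    weights to capacity-constrained segments (low rcm). This specific mapping
    is an assumption for illustration.
    """
    w = 1.0 + rcm_weight * (1.0 - rcm)
    return torch.sum(w * (pred - target) ** 2)
```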

5. Results

We conduct a rigorous validation of our model on a comprehensive public traffic dataset [32]. This dataset serves as a robust testbed for our model’s performance, allowing us to assess its efficacy in accurately predicting traffic flow under a variety of conditions. The dataset provided by [32] is one of the most comprehensive and widely used datasets in the field of traffic forecasting. It has been employed as a benchmark in numerous seminal studies in the traffic forecasting domain, which allows us to directly compare our model’s performance with other leading-edge methods. It also ensures that our results are reproducible and verifiable by other researchers in the community. This transparency is vital for advancing the field and building on each other’s work effectively. Our model is implemented in Python using PyTorch, a widely adopted deep learning framework, and builds on PyTorch Geometric Temporal [33] to ensure accessibility and reproducibility.
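As a pointer to reproducibility, loading METR-LA through PyTorch Geometric Temporal [33] can be sketched as follows; the 12-in/12-out windowing matches the experimental setting described below, while the 0.8 train ratio is an assumption:

```python
from torch_geometric_temporal.dataset import METRLADatasetLoader
from torch_geometric_temporal.signal import temporal_signal_split

# Each snapshot bundles node features, the road-graph edges, and the targets.
loader = METRLADatasetLoader()
dataset = loader.get_dataset(num_timesteps_in=12, num_timesteps_out=12)  # 12 x 5-min windows
train_set, test_set = temporal_signal_split(dataset, train_ratio=0.8)
```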
In this section, we provide an evaluation of our proposed model by comparing its performance with five current, leading-edge methods in traffic forecasting. The main goal of this evaluation is to ascertain the effectiveness and efficiency of our model in forecasting traffic flow. All methods are set so that the first 12 moments predict the next 12 moments, and each moment represents 5 min.
  • RNN (Recurrent Neural Networks) [2]: RNNs, characterized by their ability to model temporal dynamic behavior, are particularly suited for sequence modeling and handling time-dependent data in traffic forecasting.
  • GRU (Gated Recurrent Units) [3]: GRUs streamline recurrent network architecture by merging forget and input gates into a unified update gate, thus offering an efficient solution for capturing temporal dependencies in traffic datasets.
  • LSTM (Long Short-Term Memory) [1]: Designed to overcome the vanishing gradient issue prevalent in conventional RNNs, LSTMs provide a more resilient framework for learning long-term dependencies, which is crucial in traffic forecasting.
  • ARGCN (Adaptive Graph Convolutional Recurrent Network) [34]: This model combines gated recurrent units for learning short-term trends in time series with graph convolutional networks for understanding spatial dependencies based on the road network’s topology.
  • A3T-GCN (Attention Temporal Graph Convolutional Network) [35]: This model employs dual modules alongside recurrent networks to autonomously discern fine-grained spatial and temporal correlations within traffic series.
As shown in Table 2, the empirical evaluation conclusively shows that our model outperforms five current, cutting-edge traffic forecasting methods. By attaining the lowest Mean Squared Error (MSE) and Mean Absolute Error (MAE) values of 0.357 and 0.315, respectively, our model underscores its enhanced predictive accuracy and efficiency. This substantiates the effectiveness of our graph neural network model, augmented with a dynamic road capacity matrix, in accurately forecasting traffic flow, thus highlighting its potential for practical applications in traffic management and planning.
In our empirical study, we meticulously analyzed the model’s training performance over a span of 30 epochs. Figure 3 illustrates the Train Root Mean Square Error (RMSE) over these epochs.
Our observations show an initial rapid descent in the Train RMSE, reflecting the model’s swift adaptation to the training data. This steep decline from 0.3881 to approximately 0.2976 within the first two epochs showcases the model’s potent learning capability. As training progresses, the rate of RMSE reduction slows, suggesting a natural plateau in learning efficiency. The RMSE stabilizes around 0.2630 in the later epochs, indicating the model’s convergence to an optimal state under the current configuration. The minimal fluctuations in RMSE throughout training reflect a stable and consistent learning trajectory, indicative of a well-tuned model.
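For reference, the Train RMSE plotted in Figure 3 can be computed as sketched below (hypothetical values, shown only to make the metric concrete):

```python
import torch

def rmse(pred: torch.Tensor, target: torch.Tensor) -> float:
    """Root Mean Square Error used to track training convergence."""
    return torch.sqrt(torch.mean((pred - target) ** 2)).item()

# With flows scaled to [0, 1], an RMSE near 0.26 corresponds to a typical
# per-reading error of roughly 0.26 in normalized units.
print(rmse(torch.tensor([0.42, 0.55, 0.31]), torch.tensor([0.40, 0.60, 0.30])))
```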
Figure 4 illustrates the comparison between actual traffic flow and the predictions generated by our model. It provides a visual understanding of our model’s predictive performance by displaying side-by-side plots of the original data and the forecasted results.

Ablation Experiments

In this section, we conduct ablation experiments to investigate the impact of adjusting the weight of the road capacity matrix in our model on its performance in comparison and prediction tasks. By selectively modifying the weight of the road capacity matrix in the loss function, we aim to understand its significance and influence on the overall model efficacy.
The experimental setup is as follows:
  • Base Model: the model trained without the RCM term in the loss function.
  • RCM weight 0.5: the RCM term is included with its weight halved relative to the default (i.e., 0.5).
  • RCM weight 1: the RCM term is included with its default weight of 1; this corresponds to our full model.
  • RCM weight 2: the weight of the RCM term is doubled relative to the default.
  • RCM weight 5: the weight of the RCM term is increased by a factor of 5 relative to the default.
Table 3 shows that our model consistently demonstrates robust performance across different configurations of the loss function. Incorporating the RCM-weighted term leads to clear improvements in MSE and MAE compared to the base model. Even when the RCM weight is increased (RCM weight 2 and RCM weight 5), the model maintains competitive performance, showcasing its resilience to changes in the loss function.

6. Conclusions

In this paper, we presented a method for traffic forecasting by incorporating road capacity difficulty parameters into a Graph Neural Network (GNN) model. Our method stands out by recognizing and embedding the complexities of varying road capacity conditions into the traffic forecasting framework. This integration aims to bridge the gap left by traditional models that often overlook these critical factors. By employing a multi-layered GNN model enriched with a dynamic road capacity matrix, our research not only leverages the typical traffic flow metrics but also embraces the intricate nature of road capacity difficulties.
The empirical validation of our model, conducted using a public dataset, underscores the effectiveness of our approach. The results clearly demonstrate a significant enhancement in prediction accuracy, especially in areas characterized by complex road networks and varied access challenges. The ability of the GNN model to decipher non-linear relationships and dependencies among road capacity factors is pivotal in achieving this improvement. By providing a more accurate and reliable tool for traffic forecasting, our model facilitates improved route planning and traffic management, contributing to more efficient and sustainable urban mobility solutions.

7. Future Work

Several avenues for future research emerge from our study. First and foremost, further exploration into the integration of additional road capacity parameters could enhance the model’s comprehensiveness and accuracy. Parameters such as real-time weather conditions, special event schedules, and even dynamic traffic regulations could provide deeper insights into traffic flow dynamics. The exploration of more sophisticated GNN architectures and training methodologies could uncover new potentials in traffic forecasting accuracy and computational efficiency. Advanced techniques such as attention mechanisms and graph pooling could offer significant improvements in the model’s performance.
Furthermore, cross-domain applications of our model present an exciting frontier. Integrating traffic forecasting models with urban planning and environmental impact assessment tools could lead to more holistic and sustainable urban development strategies.
In conclusion, our research contributes a novel perspective to the field of traffic forecasting, opening new pathways for future exploration. The integration of road capacity difficulty parameters within a GNN framework not only enhances prediction accuracy but also offers a more nuanced understanding of traffic dynamics.

Author Contributions

Y.Y.: methodology, validation, writing—original draft preparation; Y.S.: conceptualization, formal analysis, investigation, data curation; R.M.: writing—review and editing, visualization; Y.P.: supervision, project administration, funding acquisition, writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research work was supported by the National Natural Science Foundation of China: [Grant Number NSFC62206201, 52302517, BK20230893].

Data Availability Statement

Conflicts of Interest

The authors declare no conflicts of interest. The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, and in the decision to publish the results.

References

  1. Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef] [PubMed]
  2. Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning representations by back-propagating errors. Nature 1986, 323, 533–536. [Google Scholar] [CrossRef]
  3. Cho, K.; Van Merriënboer, B.; Gulcehre, C.; Bahdanau, D.; Bougares, F.; Schwenk, H.; Bengio, Y. Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv 2014, arXiv:1406.1078. [Google Scholar]
  4. Wu, Y.; Tan, H. Short-term traffic flow forecasting with spatial-temporal correlation in a hybrid deep learning framework. IEEE Trans. Intell. Transp. Syst. 2016, 16, 865–873. [Google Scholar]
  5. Cao, X.; Zhong, Y.; Yun, Z.; Jiang, W.; Zhang, W. Interactive temporal recurrent convolution network for traffic prediction in data centers. IEEE Access 2017, 6, 5276–5289. [Google Scholar] [CrossRef]
  6. Yu, D.; Xiong, W.; Droppo, J.; Stolcke, A.; Ye, G.; Li, J.; Zweig, G. Deep Convolutional Neural Networks with Layer-Wise Context Expansion and Attention. In Proceedings of the Interspeech 2016, San Francisco, CA, USA, 8–12 September 2016; pp. 17–21. [Google Scholar]
  7. Yu, H.; Wu, Z.; Wang, S.; Wang, Y.; Ma, X. Spatiotemporal recurrent convolutional networks for traffic prediction in transportation networks. Sensors 2017, 17, 1501. [Google Scholar] [CrossRef] [PubMed]
  8. Kipf, T.N.; Welling, M. Semi-supervised classification with graph convolutional networks. arXiv 2016, arXiv:1609.02907. [Google Scholar]
  9. Zhou, F.; Yang, Q.; Zhong, T.; Chen, D.; Zhang, N. Variational graph neural networks for road traffic prediction in intelligent transportation systems. IEEE Trans. Ind. Inform. 2020, 17, 2802–2812. [Google Scholar] [CrossRef]
  10. Cui, Z.; Henrickson, K.; Ke, R.; Wang, Y. Traffic graph convolutional recurrent neural network: A deep learning framework for network-scale traffic learning and forecasting. IEEE Trans. Intell. Transp. Syst. 2019, 21, 4883–4894. [Google Scholar] [CrossRef]
  11. Zhang, L.; Liu, Q.; Yang, W.; Wei, N.; Dong, D. An Improved k-Nearest Neighbor Model for Short-term Traffic Flow Prediction. Procedia Soc. Behav. Sci. 2013, 96, 653–662. [Google Scholar] [CrossRef]
  12. Li, Y.; Yu, Z.; He, G.; Shen, Y.; Li, K.; Sun, X.; Lin, S. SPD-DDPM: Denoising Diffusion Probabilistic Models in the Symmetric Positive Definite Space. In Proceedings of the AAAI Conference on Artificial Intelligence 2024, Vancouver, BC, Canada, 20–28 February 2024; Volume 38, pp. 13709–13717. [Google Scholar]
  13. Alghamdi, T.; Elgazzar, K.; Bayoumi, M.; Sharaf, T.; Shah, S. Forecasting Traffic Congestion Using ARIMA Modeling. In Proceedings of the 2019 15th International Wireless Communications & Mobile Computing Conference (IWCMC), Tangier, Morocco, 24–28 June 2019; pp. 1227–1232. [Google Scholar] [CrossRef]
  14. Voort, M.V.D.; Dougherty, M.; Watson, S. Combining Kohonen Maps with ARIMA Time Series Models to Forecast Traffic Flow. Transp. Res. Part C Emerg. Technol. 1996, 4, 307–318. [Google Scholar] [CrossRef]
  15. Williams, B.M.; Hoel, L.A. Modeling and Forecasting Vehicular Traffic Flow as a Seasonal ARIMA Process: Theoretical Basis and Empirical Results. J. Transp. Eng. 2003, 129, 664–672. [Google Scholar] [CrossRef]
  16. Sun, K.; Liu, P.; Li, P.; Liao, Z. ModWaveMLP: MLP-Based Mode Decomposition and Wavelet Denoising Model to Defeat Complex Structures in Traffic Forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence 2024, Vancouver, BC, Canada, 20–28 February 2024; Volume 38, pp. 9035–9043. [Google Scholar]
  17. Shi, X.; Chen, Z.; Wang, H.; Yeung, D.Y.; Wong, W.K.; Woo, W.C. Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting. Adv. Neural Inf. Process. Syst. 2015, 28, 802–810. [Google Scholar]
  18. Zhang, J.; Zheng, Y.; Qi, D. Deep Spatio-Temporal Residual Networks for Citywide Crowd Flows Prediction. In Proceedings of the AAAI Conference on Artificial Intelligence 2017, San Francisco, CA, USA, 4–9 February 2017. [Google Scholar]
  19. Liu, J.; Shahroudy, A.; Xu, D.; Wang, G. Spatio-Temporal LSTM with Trust Gates for 3D Human Action Recognition. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 11–14 October 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 816–833. [Google Scholar]
  20. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. Adv. Neural Inf. Process. Syst. 2017, 30, 5998–6008. [Google Scholar]
  21. Li, Y.; Liu, M.; Mishchenko, A.; Yu, C. Verilog-to-PyG-A Framework for Graph Learning and Augmentation on RTL Designs. In Proceedings of the 2023 IEEE/ACM International Conference on Computer Aided Design (ICCAD), San Francisco, CA, USA, 28 October–2 November 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–4. [Google Scholar]
  22. Park, N.; Rossi, R.; Wang, X.; Simoulin, A.; Ahmed, N.K.; Faloutsos, C. GLEMOS: Benchmark for Instantaneous Graph Learning Model Selection. Adv. Neural Inf. Process. Syst. 2024, 36, 69887–69899. [Google Scholar]
  23. Zhong, Y.; Sheng, G.; Qin, T.; Wang, M.; Gan, Q.; Wu, C. GNNFlow: A Distributed Framework for Continuous Temporal GNN Learning on Dynamic Graphs. arXiv 2023, arXiv:2311.17410. [Google Scholar]
  24. Veeriah, V.; Zhuang, N.; Qi, G.J. Differential Recurrent Neural Networks for Action Recognition. In Proceedings of the IEEE International Conference on Computer Vision 2015, Santiago, Chile, 7–13 December 2015; pp. 4041–4049. [Google Scholar]
  25. Yu, B.; Yin, H.; Zhu, Z. Spatio-temporal Graph Convolutional Networks: A Deep Learning Framework for Traffic Forecasting. In Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, IJCAI-18, Stockholm, Sweden, 13–19 July 2018; pp. 3634–3640. [Google Scholar] [CrossRef]
  26. Wu, Z.; Pan, S.; Long, G.; Jiang, J.; Zhang, C. Graph Wavenet for Deep Spatial-temporal Graph Modeling. In Proceedings of the 28th International Joint Conference on Artificial Intelligence, IJCAI’19, Macao, China, 10–16 August 2019; pp. 1907–1913. [Google Scholar]
  27. Fang, Z.; Long, Q.; Song, G.; Xie, K. Spatial-Temporal Graph ODE Networks for Traffic Flow Forecasting. In Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, Virtual, Singapore, 14–18 August 2021. [Google Scholar] [CrossRef]
  28. Zhu, J.; Zhu, J.; Song, Y.; Zhao, L.; Hou, Z.; Du, R.; Li, H. A3T-GCN: Attention Temporal Graph Convolutional Network for Traffic Forecasting. arXiv 2020, arXiv:2006.11583. [Google Scholar]
  29. Guo, S.; Lin, Y.; Feng, N.; Song, C.; Wan, H. Attention Based Spatial-Temporal Graph Convolutional Networks for Traffic Flow Forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence 2019, Honolulu, HI, USA, 27 January–1 February 2019. [Google Scholar]
  30. Song, C.; Lin, Y.; Guo, S.; Wan, H. Spatial-temporal synchronous graph convolutional networks: A new framework for spatial-temporal network data forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence 2020, New York, NY, USA, 7–12 February 2020; Volume 34, pp. 914–921. [Google Scholar]
  31. Li, Z.; Zhu, Z. Spatial-Temporal Fusion Graph Neural Networks for Traffic Flow Forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence 2021, Virtual, 2–9 February 2021. [Google Scholar]
  32. Li, Y.; Yu, R.; Shahabi, C.; Liu, Y. Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. arXiv 2017, arXiv:1707.01926. [Google Scholar]
  33. Rozemberczki, B.; Scherer, P.; He, Y.; Panagopoulos, G.; Riedel, A.; Astefanoaei, M.; Kiss, O.; Beres, F.; Lopez, G.; Collignon, N.; et al. PyTorch Geometric Temporal: Spatiotemporal Signal Processing with Neural Machine Learning Models. In Proceedings of the 30th ACM International Conference on Information and Knowledge Management, Online, 1–5 November 2021; pp. 4564–4573. [Google Scholar]
  34. Bai, L.; Yao, L.; Li, C.; Wang, X.; Wang, C. Adaptive graph convolutional recurrent network for traffic forecasting. Adv. Neural Inf. Process. Syst. 2020, 33, 17804–17815. [Google Scholar]
  35. Bai, J.; Zhu, J.; Song, Y.; Zhao, L.; Hou, Z.; Du, R.; Li, H. A3T-GCN: Attention temporal graph convolutional network for traffic forecasting. ISPRS Int. J. Geo-Inf. 2021, 10, 485. [Google Scholar] [CrossRef]
Figure 1. The overview of our method.
Figure 2. A schematic representation illustrates the correlation between traffic flow state and capacity.
Figure 3. Train Root Mean Square Error (RMSE) over 30 epochs.
Figure 4. The comparison between actual traffic flow and the predictions.
Table 1. Methods in traffic forecasting.
Method | Key Contributions and References
ARIMA, SARIMA | Traditional time series forecasting methods, robust in handling seasonal patterns.
FC-LSTM, ConvLSTM | Early deep learning models capturing spatio-temporal dependencies using CNNs and RNNs for traffic data.
ST-ResNet, ST-LSTM | Improved grid-based data representations, focusing on both spatial and temporal dimensions.
GRUs | Effective in capturing temporal dynamics of traffic states.
Attention Mechanisms | Enhanced forecasting precision by focusing on significant time points and global temporal insights.
Graph Convolutional Networks (GCNs) | Applied to graph-structured data, identifying spatial dependencies within traffic systems.
Neural Differential Equations (NDEs) | Introduced continuous depth in neural networks, capturing complex spatio-temporal dependencies.
Spatio-Temporal Graph Networks | Combined GCN and RNN capabilities to analyze spatial and temporal traffic data dependencies.
Table 2. Comparison with other traffic flow methods.
Method | MSE | MAE
RNN | 0.515 | 0.416
LSTM | 0.736 | 0.528
GRU | 0.837 | 0.616
ARGCN | 0.973 | 0.643
A3T-GCN | 0.549 | 0.464
Ours | 0.357 | 0.315
Table 3. Performance comparison of ablation experiments.
Method | MSE | MAE
Base Model | 0.549 | 0.464
RCM weight 0.5 | 0.356 | 0.297
RCM weight 1 | 0.357 | 0.315
RCM weight 2 | 0.356 | 0.304
RCM weight 5 | 0.354 | 0.296
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Yuan, Y.; Peng, Y.; Meng, R.; Sun, Y. Graph-Based Traffic Forecasting with the Dynamics of Road Symmetry and Capacity Performance. Symmetry 2024, 16, 935. https://doi.org/10.3390/sym16070935

AMA Style

Yuan Y, Peng Y, Meng R, Sun Y. Graph-Based Traffic Forecasting with the Dynamics of Road Symmetry and Capacity Performance. Symmetry. 2024; 16(7):935. https://doi.org/10.3390/sym16070935

Chicago/Turabian Style

Yuan, Ye, Yuan Peng, Ruicai Meng, and Yongliang Sun. 2024. "Graph-Based Traffic Forecasting with the Dynamics of Road Symmetry and Capacity Performance" Symmetry 16, no. 7: 935. https://doi.org/10.3390/sym16070935

