Article

Spatial–Temporal Transformer Networks for Traffic Flow Forecasting Using a Pre-Trained Language Model

School of Mechanical Engineering and Electronic Information, China University of Geosciences, Wuhan 430074, China
*
Author to whom correspondence should be addressed.
Sensors 2024, 24(17), 5502; https://doi.org/10.3390/s24175502
Submission received: 18 July 2024 / Revised: 18 August 2024 / Accepted: 23 August 2024 / Published: 25 August 2024
(This article belongs to the Section Intelligent Sensors)

Abstract

Most current methods use spatial–temporal graph neural networks (STGNNs) to analyze the complex spatial–temporal information in traffic data collected from hundreds of sensors. STGNNs combine graph neural networks (GNNs) and sequence models into hybrid structures that allow the two networks to collaborate. However, this collaboration has made such models increasingly complex. This study proposes a framework that relies solely on the original Transformer architecture and carefully designed embeddings to efficiently extract spatial–temporal dependencies in traffic flow. Additionally, we used pre-trained language models to enhance forecasting performance. We compared our framework with current state-of-the-art STGNNs and Transformer-based models on four real-world traffic datasets: PEMS04, PEMS08, METR-LA, and PEMS-BAY. The experimental results demonstrate that our framework outperforms the other models on most metrics.
Keywords: traffic flow forecasting; spatial–temporal dependency; Transformer; LLMs
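The abstract's core idea is that carefully designed embeddings can let a plain Transformer attend jointly over space and time, without a dedicated GNN component. The paper's exact embedding design is not specified in this abstract; the sketch below illustrates the general approach only, and every name, shape, and table in it (`node_emb`, `tod_emb`, `W_in`, the 5-minute slot count) is an illustrative assumption, not the authors' implementation: each sensor reading becomes a token carrying a value projection plus a learned spatial (per-sensor) embedding and a temporal (time-of-day) embedding.

```python
import numpy as np

rng = np.random.default_rng(0)

num_nodes = 5         # traffic sensors (hypothetical small example)
steps_per_day = 288   # 5-minute intervals in one day
d_model = 16          # token dimension fed to a standard Transformer
seq_len = 12          # one hour of history

# Stand-ins for learned tables (random here): one embedding per sensor
# (spatial identity) and one per time-of-day slot (temporal identity).
node_emb = rng.normal(size=(num_nodes, d_model))
tod_emb = rng.normal(size=(steps_per_day, d_model))
W_in = rng.normal(size=(1, d_model))  # projects scalar flow readings

def embed(flow, start_slot):
    """flow: (seq_len, num_nodes) readings -> (seq_len, num_nodes, d_model).

    Each token = value projection + its sensor's spatial embedding
    + the time-of-day embedding for its slot, so a vanilla Transformer
    can model spatial-temporal dependencies without an explicit graph.
    """
    x = flow[..., None] @ W_in                       # value projection
    slots = (start_slot + np.arange(len(flow))) % steps_per_day
    x = x + node_emb[None, :, :]                     # spatial identity
    x = x + tod_emb[slots][:, None, :]               # temporal identity
    return x

flow = rng.normal(size=(seq_len, num_nodes))
tokens = embed(flow, start_slot=100)
print(tokens.shape)  # (12, 5, 16)
```

After this embedding step, the (seq_len × num_nodes) token grid can be flattened and passed to an off-the-shelf Transformer encoder; the attention mechanism then decides which sensor/time pairs influence each prediction.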

Share and Cite

MDPI and ACS Style

Ma, J.; Zhao, J.; Hou, Y. Spatial–Temporal Transformer Networks for Traffic Flow Forecasting Using a Pre-Trained Language Model. Sensors 2024, 24, 5502. https://doi.org/10.3390/s24175502


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
