ADTime: Adaptive Multivariate Time Series Forecasting Using LLMs
Abstract
1. Introduction
- We investigate the domain-specific correlations among multivariate data and the heterogeneous temporal characteristics across different channels. This leads to the development of an elastic forecasting method tailored to diverse temporal patterns, enhancing the capacity of LLMs to process multi-scale temporal information.
- We propose ADTime, an adaptive LLM-based framework that integrates two modules to process multivariate time series data. First, we apply channel clustering and temporal decomposition based on the distinct temporal features of each channel. An embedding construction module then uses text alignment and adaptively designed prompts to improve the model’s ability to capture temporal patterns.
- Experimental results on seven forecasting benchmarks show that our model outperforms existing methods. Specifically, ADTime reduces MSE by 9.5% and MAE by 6.1% on public datasets, while achieving reductions of 17.1% and 13.5% on real-world refinery datasets (the refinery dataset utilized in this study was provided by Sinopec, and contains sensor and product generation data from multiple refineries that are not publicly available due to privacy concerns). Additionally, through ablation studies and model analysis, we further investigate the key factors contributing to ADTime’s effectiveness.
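As a rough illustration of the preprocessing stage summarized in the contributions above (channel clustering followed by temporal decomposition), the sketch below groups channels by simple per-channel descriptors and splits each channel into trend, seasonal, and residual parts. The feature set, the plain k-means clustering, and the moving-average decomposition are illustrative assumptions, not ADTime's exact modules (those are defined in Section 3; an STL-style decomposition would typically replace the moving average).

```python
import numpy as np

def channel_features(x, period):
    # Per-channel descriptors for clustering: mean, std, linear-trend slope,
    # and strength of the seasonal pattern at the assumed period
    # (illustrative choices, not the paper's exact features).
    t = np.arange(len(x))
    slope = np.polyfit(t, x, 1)[0]
    season = x.reshape(-1, period).mean(axis=0)  # assumes len(x) % period == 0
    return np.array([x.mean(), x.std(), slope, season.std()])

def decompose(x, period):
    # Simple moving-average decomposition into trend / seasonal / residual.
    kernel = np.ones(period) / period
    trend = np.convolve(x, kernel, mode="same")
    detrended = x - trend
    seasonal = np.tile(detrended.reshape(-1, period).mean(axis=0),
                       len(x) // period)
    residual = x - trend - seasonal
    return trend, seasonal, residual

def cluster_channels(series, period, k=2, iters=20, seed=0):
    # Plain k-means over normalized per-channel feature vectors.
    feats = np.stack([channel_features(c, period) for c in series])
    feats = (feats - feats.mean(0)) / (feats.std(0) + 1e-8)
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((feats[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = feats[labels == j].mean(0)
    return labels
```

Channels in the same cluster can then share a forecasting branch, while the decomposed components feed the embedding construction module.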
2. Related Work
2.1. Transformer-Based Time Series Models
2.2. Decomposition of Time Series
2.3. Channel Strategies in Time Series Models
3. Methodology
3.1. Multivariate Time Series Process
3.2. Modal Alignment
3.3. Adaptive Prompts for LLMs
- Dataset Context: Provides background information about the input time series, incorporating semantic features along with domain-specific knowledge such as chemical- or industry-specific insights.
- Data Features: Includes statistical metrics (e.g., mean and standard deviation), temporal attributes (e.g., periodicity and trends), and correlations derived during preprocessing.
- Task Instructions: These instructions explicitly guide the LLM to perform specific predictive tasks by leveraging the provided data features.
- An example of such a prompt is shown in Figure 4, which illustrates how the prompt incorporates dataset context, data features, and task instructions for a specific time series forecasting problem.
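The three prompt components above can be assembled programmatically. The sketch below is a minimal illustration: the statistical features, the FFT-based period estimate, and the prompt wording are all assumptions for demonstration, not the paper's actual template (Figure 4 shows the real example).

```python
import numpy as np

def dominant_period(x):
    # Estimate periodicity from the strongest FFT component (ignoring DC).
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    freq = np.argmax(spectrum[1:]) + 1
    return len(x) // freq

def build_prompt(x, dataset_context, horizon):
    # Assemble the three parts: dataset context, data features,
    # and the task instruction (wording is illustrative).
    trend = "rising" if np.polyfit(np.arange(len(x)), x, 1)[0] > 0 else "falling"
    features = (f"mean={x.mean():.3f}, std={x.std():.3f}, "
                f"period≈{dominant_period(x)} steps, trend={trend}")
    return (f"Dataset context: {dataset_context}\n"
            f"Data features: {features}\n"
            f"Task: forecast the next {horizon} values of this series.")
```

For example, calling `build_prompt` on an hourly sensor series with a daily cycle yields a prompt whose data-features line reports the estimated 24-step period and the trend direction, which the LLM can condition on alongside the aligned time series embeddings.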
4. Results
4.1. Experiment Setup
4.2. Few-Shot Forecasting
4.3. Zero-Shot Forecasting
4.4. Ablation Study
4.5. Model Analysis
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Dataset | Horizon | Ours MSE | MAE | RSE | TEMPO MSE | MAE | RSE | Time-LLM MSE | MAE | RSE | GPT4TS MSE | MAE | RSE | PatchTST MSE | MAE | RSE | DLinear MSE | MAE | RSE | Autoformer MSE | MAE | RSE | Informer MSE | MAE | RSE
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
ETTh1 | 96 | 0.602 | 0.514 | 0.737 | 0.642 | 0.550 | 1.045 | 0.535 | 0.672 | 0.779 | 0.537 | 0.525 | 0.833 | 0.689 | 0.574 | 0.818 | 0.617 | 0.584 | 0.987 | 0.790 | 0.650 | 0.850 | 1.010 | 0.740 | 0.960 |
192 | 0.686 | 0.580 | 0.742 | 0.805 | 0.598 | 1.224 | 0.971 | 0.678 | 0.938 | 0.732 | 0.615 | 0.781 | 0.859 | 0.667 | 0.846 | 0.714 | 0.621 | 0.766 | 0.975 | 0.736 | 0.902 | 1.211 | 0.840 | 1.058 | |
336 | 0.926 | 0.660 | 0.920 | 1.323 | 0.803 | 1.117 | 1.355 | 0.817 | 1.110 | 1.108 | 0.760 | 0.964 | 1.175 | 0.793 | 0.992 | 0.947 | 0.762 | 0.791 | 0.965 | 0.729 | 0.899 | 1.621 | 0.895 | 1.182 | |
avg | 0.738 | 0.584 | 0.800 | 0.923 | 0.651 | 1.129 | 0.954 | 0.722 | 0.942 | 0.792 | 0.634 | 0.859 | 0.907 | 0.678 | 0.885 | 0.759 | 0.655 | 0.848 | 0.910 | 0.705 | 0.884 | 1.281 | 0.825 | 1.067 | |
ETTh2 | 96 | 0.388 | 0.412 | 0.502 | 0.407 | 0.429 | 0.543 | 0.395 | 0.415 | 0.507 | 0.531 | 0.494 | 0.723 | 0.401 | 0.422 | 0.637 | 0.436 | 0.487 | 0.673 | 0.590 | 0.590 | 0.610 | 3.620 | 1.520 | 1.520 |
192 | 0.398 | 0.409 | 0.543 | 0.440 | 0.443 | 0.569 | 0.516 | 0.476 | 0.576 | 0.391 | 0.414 | 0.794 | 0.412 | 0.446 | 0.814 | 0.603 | 0.498 | 0.698 | 0.519 | 0.420 | 0.716 | 3.693 | 1.314 | 1.649 | |
336 | 0.394 | 0.411 | 0.518 | 0.447 | 0.454 | 0.601 | 0.524 | 0.485 | 0.578 | 0.393 | 0.393 | 0.736 | 0.422 | 0.451 | 0.840 | 0.894 | 0.514 | 0.909 | 0.535 | 0.431 | 0.747 | 3.453 | 1.660 | 2.021 | |
avg | 0.393 | 0.411 | 0.521 | 0.431 | 0.442 | 0.571 | 0.478 | 0.459 | 0.554 | 0.439 | 0.434 | 0.751 | 0.412 | 0.440 | 0.764 | 0.645 | 0.499 | 0.760 | 0.548 | 0.480 | 0.691 | 3.588 | 1.498 | 1.730 | |
ETTm1 | 96 | 0.411 | 0.410 | 0.614 | 0.535 | 0.478 | 0.835 | 0.478 | 0.438 | 0.658 | 0.399 | 0.415 | 0.631 | 0.429 | 0.429 | 0.680 | 0.422 | 0.474 | 0.815 | 0.780 | 0.610 | 0.840 | 1.040 | 0.760 | 0.970 |
192 | 0.455 | 0.450 | 0.662 | 0.549 | 0.495 | 0.855 | 0.479 | 0.451 | 0.659 | 0.490 | 0.466 | 0.639 | 0.599 | 0.533 | 0.706 | 0.427 | 0.434 | 0.596 | 0.907 | 0.677 | 0.869 | 1.476 | 0.725 | 0.936 | |
336 | 0.435 | 0.431 | 0.646 | 0.635 | 0.531 | 0.888 | 0.443 | 0.424 | 0.634 | 0.587 | 0.519 | 0.699 | 0.683 | 0.592 | 0.754 | 0.493 | 0.472 | 0.641 | 0.923 | 0.695 | 0.878 | 1.878 | 0.839 | 0.949 | |
720 | 0.487 | 0.467 | 0.694 | 0.667 | 0.553 | 1.051 | 0.593 | 0.512 | 0.733 | 0.728 | 0.591 | 0.781 | 1.010 | 0.711 | 0.920 | 0.563 | 0.525 | 0.687 | 1.036 | 0.762 | 0.931 | 1.096 | 0.844 | 1.110 | |
avg | 0.447 | 0.439 | 0.654 | 0.597 | 0.514 | 0.907 | 0.498 | 0.456 | 0.671 | 0.551 | 0.498 | 0.687 | 0.680 | 0.566 | 0.765 | 0.476 | 0.476 | 0.685 | 0.912 | 0.686 | 0.879 | 1.372 | 0.792 | 0.991 | |
ETTm2 | 96 | 0.173 | 0.253 | 0.374 | 0.225 | 0.307 | 0.489 | 0.240 | 0.311 | 0.397 | 0.194 | 0.277 | 0.441 | 0.219 | 0.299 | 0.468 | 0.320 | 0.380 | 0.460 | 0.370 | 0.340 | 0.490 | 3.590 | 1.510 | 1.530 |
192 | 0.227 | 0.248 | 0.396 | 0.302 | 0.350 | 0.453 | 0.277 | 0.333 | 0.426 | 0.250 | 0.288 | 0.486 | 0.263 | 0.322 | 0.506 | 0.327 | 0.381 | 0.627 | 0.311 | 0.322 | 0.575 | 3.598 | 1.475 | 1.317 | |
336 | 0.250 | 0.325 | 0.454 | 0.348 | 0.382 | 0.510 | 0.342 | 0.375 | 0.472 | 0.296 | 0.346 | 0.542 | 0.298 | 0.341 | 0.560 | 0.379 | 0.424 | 0.694 | 0.331 | 0.333 | 0.603 | 3.792 | 1.213 | 1.380 | |
720 | 0.407 | 0.420 | 0.515 | 0.427 | 0.421 | 0.562 | 0.471 | 0.445 | 0.552 | 0.402 | 0.416 | 0.696 | 0.371 | 0.393 | 0.659 | 0.464 | 0.488 | 0.794 | 0.385 | 0.376 | 0.675 | 3.623 | 1.503 | 1.548 | |
avg | 0.264 | 0.311 | 0.435 | 0.325 | 0.365 | 0.504 | 0.332 | 0.366 | 0.462 | 0.286 | 0.332 | 0.541 | 0.288 | 0.339 | 0.549 | 0.373 | 0.418 | 0.644 | 0.349 | 0.343 | 0.586 | 3.651 | 1.425 | 1.444 | |
Weather | 96 | 0.183 | 0.236 | 0.438 | 0.193 | 0.239 | 0.622 | 0.193 | 0.245 | 0.579 | 0.185 | 0.244 | 0.450 | 0.192 | 0.242 | 0.443 | 0.280 | 0.330 | 0.700 | 0.300 | 0.360 | 0.720 | 0.510 | 0.510 | 0.940 |
192 | 0.193 | 0.257 | 0.604 | 0.246 | 0.289 | 0.706 | 0.244 | 0.282 | 0.650 | 0.243 | 0.283 | 0.648 | 0.264 | 0.306 | 0.676 | 0.227 | 0.281 | 0.627 | 0.381 | 0.401 | 0.813 | 0.708 | 0.599 | 1.108 | |
336 | 0.284 | 0.319 | 0.718 | 0.326 | 0.343 | 0.851 | 0.290 | 0.314 | 0.708 | 0.301 | 0.330 | 0.721 | 0.314 | 0.339 | 0.736 | 0.279 | 0.324 | 0.694 | 0.404 | 0.411 | 0.835 | 0.680 | 0.583 | 1.083 | |
720 | 0.347 | 0.371 | 0.791 | 0.372 | 0.381 | 0.935 | 0.378 | 0.373 | 0.809 | 0.371 | 0.381 | 0.802 | 0.395 | 0.396 | 0.827 | 0.364 | 0.388 | 0.794 | 0.431 | 0.425 | 0.863 | 0.633 | 0.552 | 1.047 | |
avg | 0.252 | 0.296 | 0.638 | 0.284 | 0.313 | 0.778 | 0.277 | 0.303 | 0.686 | 0.275 | 0.309 | 0.655 | 0.291 | 0.321 | 0.671 | 0.288 | 0.331 | 0.704 | 0.379 | 0.399 | 0.808 | 0.633 | 0.561 | 1.044 | |
ECL | 96 | 0.182 | 0.304 | 0.408 | 0.180 | 0.292 | 0.405 | 0.215 | 0.328 | 0.461 | 0.146 | 0.242 | 0.382 | 0.144 | 0.239 | 0.379 | 0.230 | 0.261 | 0.392 | 0.510 | 0.550 | 0.710 | 1.330 | 0.930 | 1.150 |
192 | 0.158 | 0.241 | 0.382 | 0.182 | 0.285 | 0.447 | 0.225 | 0.334 | 0.471 | 0.172 | 0.273 | 0.413 | 0.225 | 0.329 | 0.472 | 0.157 | 0.252 | 0.394 | 0.436 | 0.495 | 0.656 | 1.326 | 0.947 | 1.145 | |
336 | 0.178 | 0.277 | 0.414 | 0.209 | 0.309 | 0.524 | 0.233 | 0.341 | 0.481 | 0.187 | 0.286 | 0.430 | 0.320 | 0.415 | 0.563 | 0.268 | 0.268 | 0.414 | 0.440 | 0.503 | 0.660 | 1.325 | 0.950 | 1.146 | |
720 | 0.224 | 0.312 | 0.449 | 0.279 | 0.355 | 0.574 | 0.283 | 0.377 | 0.531 | 0.241 | 0.331 | 0.490 | 0.292 | 0.382 | 0.539 | 0.218 | 0.309 | 0.465 | 0.519 | 0.551 | 0.719 | 1.279 | 0.926 | 1.128 | |
avg | 0.186 | 0.284 | 0.413 | 0.213 | 0.310 | 0.488 | 0.239 | 0.345 | 0.486 | 0.186 | 0.283 | 0.429 | 0.245 | 0.341 | 0.488 | 0.218 | 0.272 | 0.416 | 0.476 | 0.525 | 0.686 | 1.315 | 0.938 | 1.142 | |
Traffic | 96 | 0.392 | 0.320 | 0.603 | 0.438 | 0.311 | 0.625 | 0.653 | 0.443 | 0.669 | 0.434 | 0.320 | 0.659 | 0.413 | 0.286 | 0.642 | 0.460 | 0.346 | 0.634 | 0.875 | 0.513 | 1.003 | 1.730 | 0.900 | 1.090 |
192 | 0.419 | 0.311 | 0.574 | 0.496 | 0.355 | 0.687 | 0.690 | 0.468 | 0.686 | 0.4597 | 0.329 | 0.560 | 0.441 | 0.298 | 0.548 | 0.447 | 0.314 | 0.572 | 0.876 | 0.538 | 0.772 | 1.547 | 0.818 | 1.027 | |
336 | 0.437 | 0.314 | 0.599 | 0.503 | 0.356 | 0.681 | 0.690 | 0.465 | 0.683 | 0.479 | 0.334 | 0.569 | 0.460 | 0.309 | 0.557 | 0.484 | 0.340 | 0.572 | 0.962 | 0.585 | 0.806 | 1.580 | 0.829 | 1.033 | |
avg | 0.416 | 0.315 | 0.592 | 0.479 | 0.341 | 0.664 | 0.677 | 0.459 | 0.679 | 0.457 | 0.328 | 0.596 | 0.438 | 0.298 | 0.583 | 0.464 | 0.333 | 0.592 | 0.904 | 0.545 | 0.861 | 1.619 | 0.849 | 1.050 |
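For reference, the three metrics reported in these tables can be computed as follows. The MSE and MAE definitions are standard; RSE is assumed here to denote the root relative squared error (RMSE normalized by the ground-truth standard deviation), as is common in long-term forecasting work, since the paper's exact definition is not reproduced in this excerpt.

```python
import numpy as np

def mse(y, yhat):
    # Mean squared error.
    return np.mean((y - yhat) ** 2)

def mae(y, yhat):
    # Mean absolute error.
    return np.mean(np.abs(y - yhat))

def rse(y, yhat):
    # Root relative squared error: squared error normalized by the
    # total variance of the ground truth (assumed definition).
    return np.sqrt(np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2))
```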
Dataset | Horizon | Ours MSE | MAE | RSE | TEMPO MSE | MAE | RSE | Time-LLM MSE | MAE | RSE | GPT4TS MSE | MAE | RSE | PatchTST MSE | MAE | RSE | DLinear MSE | MAE | RSE | Autoformer MSE | MAE | RSE | Informer MSE | MAE | RSE
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
PI1 | 96 | 1.226 | 0.986 | 0.118 | 1.234 | 1.130 | 0.119 | 1.248 | 1.060 | 0.118 | 1.175 | 1.035 | 0.115 | 1.971 | 1.435 | 0.149 | 1.669 | 1.353 | 0.137 | 1.803 | 1.712 | 0.136 | 5.968 | 3.777 | 2.014 |
192 | 0.971 | 0.992 | 0.120 | 1.628 | 1.277 | 0.143 | 1.534 | 1.215 | 0.132 | 1.669 | 1.311 | 0.137 | 1.593 | 1.331 | 0.134 | 2.196 | 2.817 | 0.285 | 1.965 | 1.733 | 0.149 | 5.926 | 3.749 | 1.725 | |
336 | 1.929 | 1.334 | 0.142 | 2.022 | 1.423 | 0.157 | 1.938 | 1.404 | 0.148 | 2.051 | 1.447 | 0.152 | 3.649 | 2.466 | 0.203 | 1.963 | 2.524 | 0.347 | 2.112 | 1.799 | 0.155 | 5.908 | 3.743 | 1.628 | |
avg | 1.375 | 1.104 | 0.126 | 1.628 | 1.277 | 0.140 | 1.573 | 1.226 | 0.133 | 1.632 | 1.264 | 0.135 | 2.405 | 1.744 | 0.162 | 1.943 | 2.231 | 0.256 | 1.960 | 1.748 | 0.146 | 5.934 | 3.756 | 1.789 | |
PI2 | 96 | 0.410 | 0.651 | 0.366 | 0.442 | 0.724 | 0.412 | 0.417 | 0.670 | 0.393 | 0.458 | 0.683 | 0.412 | 0.430 | 0.773 | 0.399 | 0.382 | 0.710 | 0.376 | 0.489 | 1.009 | 0.409 | 2.730 | 1.821 | 1.540 |
192 | 0.552 | 0.711 | 0.371 | 0.555 | 0.842 | 0.438 | 0.509 | 0.764 | 0.429 | 0.652 | 1.016 | 0.486 | 0.537 | 0.800 | 0.441 | 0.542 | 0.869 | 0.443 | 0.518 | 1.091 | 0.433 | 2.806 | 2.210 | 1.829 | |
336 | 0.498 | 0.677 | 0.363 | 0.587 | 0.879 | 0.445 | 0.518 | 0.809 | 0.426 | 0.686 | 1.005 | 0.490 | 0.666 | 0.930 | 0.483 | 0.608 | 1.084 | 0.461 | 0.502 | 1.106 | 0.419 | 2.899 | 2.661 | 1.726 | |
avg | 0.487 | 0.679 | 0.366 | 0.528 | 0.815 | 0.432 | 0.481 | 0.748 | 0.416 | 0.599 | 0.901 | 0.463 | 0.544 | 0.834 | 0.441 | 0.511 | 0.888 | 0.427 | 0.503 | 1.069 | 0.420 | 2.812 | 2.231 | 1.698 | |
PI3 | 96 | 0.989 | 0.830 | 0.107 | 2.016 | 1.238 | 0.113 | 1.360 | 1.088 | 0.094 | 1.096 | 0.996 | 0.084 | 1.572 | 1.232 | 0.101 | 1.683 | 1.229 | 0.104 | 2.236 | 1.791 | 0.134 | 2.568 | 1.663 | 0.240 |
192 | 0.966 | 0.883 | 0.114 | 1.953 | 1.225 | 0.117 | 2.226 | 1.354 | 0.119 | 1.266 | 1.148 | 0.090 | 1.898 | 1.355 | 0.110 | 1.627 | 1.497 | 0.322 | 2.960 | 1.893 | 0.137 | 2.680 | 1.658 | 0.231 | |
336 | 1.329 | 1.291 | 0.106 | 2.139 | 1.428 | 0.173 | 3.716 | 1.707 | 0.153 | 1.986 | 1.368 | 0.112 | 2.187 | 1.492 | 0.117 | 1.547 | 1.457 | 0.312 | 4.181 | 2.102 | 0.162 | 2.695 | 1.754 | 0.223 | |
avg | 1.094 | 1.001 | 0.109 | 2.036 | 1.297 | 0.134 | 2.434 | 1.383 | 0.122 | 1.449 | 1.170 | 0.095 | 1.886 | 1.360 | 0.109 | 1.619 | 1.394 | 0.246 | 3.126 | 1.929 | 0.145 | 2.648 | 1.691 | 0.231 | |
PI4 | 96 | 0.179 | 0.184 | 0.364 | 0.197 | 0.219 | 0.395 | 0.186 | 0.210 | 0.363 | 0.290 | 0.281 | 0.454 | 0.237 | 0.255 | 0.410 | 0.239 | 0.222 | 0.397 | 0.402 | 0.366 | 0.534 | 1.416 | 0.690 | 1.002 |
192 | 0.204 | 0.211 | 0.356 | 0.230 | 0.277 | 0.420 | 0.208 | 0.226 | 0.384 | 0.219 | 0.284 | 0.448 | 0.229 | 0.245 | 0.404 | 0.229 | 0.245 | 0.404 | 0.389 | 0.409 | 0.526 | 1.545 | 0.741 | 1.048 | |
336 | 0.263 | 0.272 | 0.454 | 0.279 | 0.322 | 0.465 | 0.269 | 0.266 | 0.437 | 0.279 | 0.279 | 0.445 | 0.250 | 0.254 | 0.421 | 0.294 | 0.290 | 0.457 | 0.623 | 0.581 | 0.665 | 1.714 | 0.871 | 1.103 | |
avg | 0.215 | 0.222 | 0.391 | 0.235 | 0.273 | 0.427 | 0.221 | 0.234 | 0.395 | 0.263 | 0.281 | 0.449 | 0.239 | 0.251 | 0.412 | 0.254 | 0.252 | 0.419 | 0.471 | 0.452 | 0.575 | 1.558 | 0.767 | 1.051 | |
PI5 | 96 | 0.651 | 0.256 | 0.326 | 0.654 | 0.265 | 0.362 | 0.695 | 0.262 | 0.358 | 0.670 | 0.268 | 0.346 | 0.808 | 0.309 | 0.386 | 0.753 | 0.287 | 0.372 | 0.912 | 0.391 | 0.410 | 5.856 | 0.925 | 1.038 |
192 | 0.525 | 0.274 | 0.287 | 0.775 | 0.298 | 0.414 | 0.788 | 0.284 | 0.381 | 0.751 | 0.389 | 0.517 | 0.825 | 0.311 | 0.390 | 0.900 | 0.302 | 0.407 | 1.031 | 0.439 | 0.436 | 5.957 | 1.000 | 1.048 | |
336 | 0.663 | 0.372 | 0.332 | 0.878 | 0.327 | 0.391 | 0.963 | 0.347 | 0.421 | 0.865 | 0.318 | 0.399 | 0.825 | 0.311 | 0.390 | 1.268 | 0.418 | 0.483 | 1.368 | 0.675 | 0.502 | 5.744 | 1.033 | 1.028 | |
avg | 0.613 | 0.301 | 0.315 | 0.769 | 0.297 | 0.389 | 0.816 | 0.302 | 0.387 | 0.762 | 0.325 | 0.421 | 0.819 | 0.311 | 0.388 | 0.974 | 0.336 | 0.421 | 1.104 | 0.502 | 0.449 | 5.853 | 0.986 | 1.038 | |
LIMS1 | 96 | 0.955 | 0.573 | 0.781 | 1.277 | 0.702 | 1.128 | 1.726 | 0.864 | 1.045 | 1.020 | 0.606 | 0.807 | 1.093 | 0.638 | 0.830 | 2.078 | 1.113 | 1.152 | 1.118 | 0.690 | 0.838 | 2.014 | 1.102 | 1.124 |
192 | 1.190 | 0.647 | 0.843 | 0.987 | 0.600 | 1.162 | 1.617 | 0.808 | 0.982 | 1.192 | 0.655 | 0.843 | 1.711 | 0.847 | 1.011 | 2.159 | 1.147 | 1.135 | 1.192 | 0.716 | 0.836 | 2.292 | 1.226 | 1.158 | |
avg | 1.073 | 0.610 | 0.812 | 1.132 | 0.651 | 1.145 | 1.671 | 0.836 | 1.013 | 1.106 | 0.630 | 0.825 | 1.402 | 0.742 | 0.920 | 2.119 | 1.130 | 1.144 | 1.155 | 0.703 | 0.837 | 2.153 | 1.164 | 1.141 | |
LIMS2 | 96 | 0.430 | 0.458 | 0.478 | 0.498 | 0.477 | 0.663 | 0.624 | 0.556 | 0.597 | 0.546 | 0.489 | 0.504 | 0.523 | 0.490 | 0.546 | 2.157 | 1.131 | 1.109 | 0.681 | 0.607 | 0.624 | 1.941 | 1.104 | 1.054 |
192 | 0.438 | 0.449 | 0.482 | 0.506 | 0.527 | 0.897 | 0.686 | 0.579 | 0.631 | 0.545 | 0.480 | 0.508 | 0.523 | 0.483 | 0.551 | 1.737 | 1.025 | 1.005 | 0.574 | 0.533 | 0.578 | 2.225 | 1.188 | 1.139 | |
336 | 0.433 | 0.478 | 0.500 | 0.560 | 0.572 | 0.664 | 0.654 | 0.546 | 0.620 | 0.545 | 0.475 | 0.512 | 0.570 | 0.499 | 0.579 | 2.200 | 1.146 | 1.137 | 0.577 | 0.518 | 0.587 | 2.170 | 1.170 | 1.139 | |
avg | 0.434 | 0.461 | 0.486 | 0.521 | 0.526 | 0.741 | 0.655 | 0.560 | 0.616 | 0.545 | 0.481 | 0.508 | 0.539 | 0.491 | 0.559 | 2.031 | 1.100 | 1.083 | 0.610 | 0.553 | 0.597 | 2.112 | 1.154 | 1.111 |
Transfer | Ours MSE | MAE | RSE | Time-LLM MSE | MAE | RSE | TEMPO MSE | MAE | RSE | GPT4TS MSE | MAE | RSE | PatchTST MSE | MAE | RSE
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
ETTh1→ETTh2 | 0.367 | 0.422 | 0.431 | 0.375 | 0.424 | 0.473 | 0.392 | 0.436 | 0.460 | 0.406 | 0.422 | 0.460 | 0.380 | 0.405 | 0.443 |
ETTh2→ETTh1 | 0.611 | 0.513 | 0.699 | 0.640 | 0.524 | 0.761 | 0.742 | 0.606 | 0.839 | 0.703 | 0.578 | 0.735 | 0.665 | 0.533 | 0.747 |
ETTm1→ETTm2 | 0.203 | 0.284 | 0.364 | 0.217 | 0.292 | 0.368 | 0.238 | 0.289 | 0.379 | 0.264 | 0.295 | 0.393 | 0.296 | 0.334 | 0.408 |
ETTm2→ETTm1 | 0.442 | 0.425 | 0.633 | 0.478 | 0.438 | 0.658 | 0.515 | 0.478 | 0.662 | 0.769 | 0.567 | 0.679 | 0.568 | 0.492 | 0.639 |
PI5→PI6 | 0.703 | 0.248 | 0.329 | 0.731 | 0.264 | 0.356 | 1.704 | 0.260 | 0.691 | 0.980 | 0.340 | 0.425 | 0.829 | 0.326 | 0.391 |
PI6→PI5 | 0.204 | 0.204 | 0.364 | 0.194 | 0.215 | 0.371 | 0.212 | 0.211 | 0.352 | 0.210 | 0.227 | 0.386 | 0.225 | 0.242 | 0.400 |
LIMS1→LIMS2 | 0.503 | 0.470 | 0.504 | 0.850 | 0.672 | 0.696 | 0.628 | 0.555 | 0.670 | 0.575 | 0.495 | 0.522 | 0.601 | 0.525 | 0.592 |
LIMS2→LIMS1 | 1.075 | 0.675 | 0.810 | 1.220 | 0.685 | 0.877 | 1.270 | 0.697 | 1.116 | 1.176 | 0.684 | 0.794 | 1.298 | 0.692 | 0.872 |
Variants | Weather-96 MSE | MAE | Weather-192 MSE | MAE | PI5-96 MSE | MAE | PI5-192 MSE | MAE
---|---|---|---|---|---|---|---|---
ADTime | 0.183 | 0.236 | 0.193 | 0.247 | 0.651 | 0.256 | 0.525 | 0.274 |
TEMPO | 0.193 | 0.239 | 0.246 | 0.289 | 0.654 | 0.265 | 0.775 | 0.298 |
TEMPO + | 0.187 | 0.237 | 0.199 | 0.257 | 0.652 | 0.259 | 0.725 | 0.286 |
PatchTST | 0.192 | 0.242 | 0.264 | 0.306 | 0.808 | 0.309 | 0.825 | 0.311 |
PatchTST + | 0.186 | 0.238 | 0.213 | 0.269 | 0.779 | 0.277 | 0.781 | 0.293 |
Model | ETTh1-96 MSE | MAE | ETTh1-192 MSE | MAE | ETTh2-96 MSE | MAE | ETTh2-192 MSE | MAE | PI4-96 MSE | MAE | PI4-192 MSE | MAE | PI5-96 MSE | MAE | PI5-192 MSE | MAE
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Llama3.2 (32) | 0.561 | 0.501 | 0.647 | 0.538 | 0.375 | 0.393 | 0.372 | 0.426 | 0.152 | 0.166 | 0.190 | 0.193 | 0.679 | 0.267 | 0.779 | 0.284 |
Llama3.2 (8) | 0.592 | 0.507 | 0.698 | 0.551 | 0.361 | 0.416 | 0.377 | 0.427 | 0.159 | 0.178 | 0.192 | 0.203 | 0.694 | 0.260 | 0.683 | 0.280 |
GPT-2 (12) | 0.686 | 0.554 | 0.632 | 0.529 | 0.323 | 0.368 | 0.347 | 0.409 | 0.179 | 0.192 | 0.200 | 0.210 | 0.675 | 0.257 | 0.736 | 0.277 |
GPT-2 (6) | 0.602 | 0.514 | 0.686 | 0.580 | 0.388 | 0.412 | 0.398 | 0.409 | 0.179 | 0.184 | 0.204 | 0.211 | 0.651 | 0.256 | 0.525 | 0.274 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Pei, J.; Zhang, Y.; Liu, T.; Yang, J.; Wu, Q.; Qin, K. ADTime: Adaptive Multivariate Time Series Forecasting Using LLMs. Mach. Learn. Knowl. Extr. 2025, 7, 35. https://doi.org/10.3390/make7020035