Communication-Efficient Wireless Traffic Prediction with Federated Learning
Abstract
1. Introduction
- We develop a communication-efficient federated learning framework for wireless traffic prediction. A gradient compression scheme, designed and implemented through top-K sparsification, considerably reduces the communication between local clients and the central server.
- We design a gradient correction scheme that adds a local control variable to correct the gradient information and keep its update direction consistent with the global objective. This scheme addresses the data-heterogeneity challenge inherent in wireless traffic prediction.
- We propose an adaptive aggregation scheme at the server side based on gradient correlation. It models the spatial–temporal dependencies among local clients and substantially improves prediction performance.
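As a minimal illustration of the top-K sparsification idea above, the following sketch (pure Python; the helper name `topk_sparsify` is ours, not from the paper) keeps only the largest-magnitude fraction of a gradient's entries and returns the dropped remainder as a residual, which error compensation feeds back into the next round:

```python
def topk_sparsify(grad, ratio=0.1):
    """Keep the largest-magnitude fraction `ratio` of gradient entries,
    zero the rest, and return (sparse_grad, residual), where the residual
    holds the dropped values for error compensation in the next round."""
    k = max(1, int(len(grad) * ratio))
    # Indices of the k largest-magnitude entries.
    keep = set(sorted(range(len(grad)), key=lambda i: abs(grad[i]))[-k:])
    sparse = [g if i in keep else 0.0 for i, g in enumerate(grad)]
    residual = [g - s for g, s in zip(grad, sparse)]
    return sparse, residual

g = [0.5, -0.05, 2.0, 0.01, -1.2]
sparse_g, res = topk_sparsify(g, ratio=0.4)  # keeps 2.0 and -1.2
```

Only the k non-zero entries (values plus indices) need to be transmitted, which is where the bandwidth saving comes from; the residual stays on the client.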
2. Related Work
2.1. Federated Learning
2.2. Wireless Traffic Prediction
3. Problem Formulation
- Step 1: The central server broadcasts the global model to local base stations. Note that the base stations selected to participate in model training form a subset of all base stations; the selection criteria can be customized by mobile network operators according to their specific requirements.
- Step 2: Each selected base station performs local stochastic gradient descent on its own dataset. In this step, we introduce error compensation into stochastic gradient descent to mitigate the client-drift phenomenon encountered in traditional federated learning.
- Step 3: Each selected base station first compresses its gradient to relieve the network burden and save bandwidth, and then transfers the compressed gradient information to the central server.
- Step 4: The central server performs global model aggregation after receiving the gradient information from all local base stations. In this step, we introduce a technique named gradient re-grouping to quantify each base station's contribution to the global model and capture the spatial dependence among base stations.
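The four steps above can be sketched end-to-end in toy form. The snippet below is a pure-Python illustration, not the paper's implementation: all function names are hypothetical, a 1-D linear model stands in for the actual prediction network, and the server uses a plain uniform average where the paper's correlation-based aggregation would apply.

```python
def local_step(weights, data, lr, residual, ratio):
    """One client update (Steps 2-3): an SGD gradient on a toy
    squared-error objective, corrected with last round's compression
    residual (error compensation), then top-K sparsified before upload."""
    grad = [0.0] * len(weights)
    for x, y in data:
        err = sum(w * xi for w, xi in zip(weights, x)) - y
        for j, xi in enumerate(x):
            grad[j] += 2.0 * err * xi / len(data)
    corrected = [g + r for g, r in zip(grad, residual)]
    k = max(1, int(len(corrected) * ratio))
    keep = set(sorted(range(len(corrected)),
                      key=lambda i: abs(corrected[i]))[-k:])
    sparse = [c if i in keep else 0.0 for i, c in enumerate(corrected)]
    new_residual = [c - s for c, s in zip(corrected, sparse)]
    return [lr * s for s in sparse], new_residual

def fed_round(global_w, clients, residuals, lr=0.1, ratio=0.5):
    """One full round (Steps 1-4): broadcast, local training with error
    compensation, compressed upload, and uniform server averaging."""
    updates = []
    for data, res in zip(clients, residuals):
        upd, new_res = local_step(list(global_w), data, lr, res, ratio)
        updates.append(upd)
        res[:] = new_res  # carry the dropped gradient mass to next round
    return [w - sum(u[j] for u in updates) / len(updates)
            for j, w in enumerate(global_w)]
```

With two toy clients whose optimal weights differ, repeated rounds drive the global model toward the average optimum, while each client's residual list accumulates whatever the compressor dropped.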
4. Proposed Framework
4.1. Local Training
4.2. Gradient Compression
4.3. Model Aggregation
Algorithm 1 FedCE
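One plausible way to realize the correlation-based aggregation used in FedCE's server step is sketched below. This is our own hedged interpretation, not the paper's exact re-grouping rule: each client's gradient is weighted by its cosine similarity to the mean gradient, with a softmax normalization, so clients whose updates agree with the consensus direction contribute more.

```python
import math

def correlation_weights(client_grads):
    """Hypothetical sketch: score each client gradient by its cosine
    similarity to the mean gradient, then softmax the scores into
    aggregation weights that sum to one."""
    n = len(client_grads)
    dim = len(client_grads[0])
    mean = [sum(g[j] for g in client_grads) / n for j in range(dim)]

    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a)) or 1.0
        nb = math.sqrt(sum(x * x for x in b)) or 1.0
        return dot / (na * nb)

    sims = [cos(g, mean) for g in client_grads]
    exp = [math.exp(s) for s in sims]
    z = sum(exp)
    return [e / z for e in exp]
```

Under this rule, two clients pointing the same way as the consensus receive equal, larger weights than a client pointing against it, which matches the intuition of down-weighting outlier base stations during aggregation.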
5. Experimental Results
5.1. Dataset
5.2. Baseline Methods
5.3. Experimental Setup and Results Analysis
5.4. Comparison between Actual Values and Predicted Values
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Methods | Trento MSE | Trento MAE | Trento R² | Trento Comm. | Milano MSE | Milano MAE | Milano R² | Milano Comm. |
|---|---|---|---|---|---|---|---|---|
| Standalone | 2.1411 | 0.7751 | 0.7631 | - | 0.0978 | 0.2202 | 0.7692 | - |
| Centralized | 1.8761 | 0.7300 | 0.8550 | - | 0.0859 | 0.2011 | 0.8660 | - |
| FedAvg | 4.3719 | 1.1072 | 0.6621 | | 0.0908 | 0.2115 | 0.8585 | |
| FedDA | 2.0716 | 0.7632 | 0.8399 | | 0.0940 | 0.2128 | 0.8535 | |
| FedCE-0.1 | 1.5590 | 0.7164 | 0.8795 | | 0.1037 | 0.2274 | 0.8383 | |
| FedCE-0.2 | 1.5108 | 0.6968 | 0.8832 | | 0.0978 | 0.2183 | 0.8476 | |
| FedCE-0.4 | 1.4844 | 0.6867 | 0.8853 | | 0.0943 | 0.2132 | 0.8529 | |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Gao, F.; Zhang, C.; Qiao, J.; Li, K.; Cao, Y. Communication-Efficient Wireless Traffic Prediction with Federated Learning. Mathematics 2024, 12, 2539. https://doi.org/10.3390/math12162539