Enhanced CNN-BiLSTM-Attention Model for High-Precision Integrated Navigation During GNSS Outages
Abstract
1. Introduction
2. GNSS/INS Loosely Coupled Navigation Model
2.1. Conventional Kalman Filter Model
2.2. Improved Kalman Filter Model
2.2.1. Sage–Husa Adaptive Filtering
2.2.2. Strong Tracking Kalman Filter
2.2.3. Implementation of Filter Convergence Assessment
3. The Design of GNSS/INS Integrated Navigation Technology Based on CNN-BiLSTM-Attention
3.1. CNN-BiLSTM-Attention Model
3.2. Integrated Navigation System Assisted by CNN-BiLSTM-Attention
4. Experiments and Analysis
- During GNSS signal outages, the INS cannot be aided by GNSS measurements to correct its systematic errors. The positioning error of the standalone INS therefore diverges rapidly over time, with a progressively increasing divergence rate, and fails to meet positioning requirements.
- Compared with the standalone INS, introducing RNN or LSTM for error compensation markedly improves the positioning accuracy of the integrated navigation system. LSTM outperforms RNN, demonstrating its advantage in extracting temporal features. However, as the GNSS outage lengthens, the east-direction compensation accuracy of both RNN and LSTM gradually degrades. The proposed CNN-BiLSTM-Attention model is far less affected by prolonged outages: the bidirectional processing of BiLSTM and the dynamic focus of the attention mechanism together capture the long-term dependencies and key information in the navigation data, so high positioning accuracy is maintained throughout the outage.
- The CNN-BiLSTM-Attention method demonstrates the best compensation performance among the four methods. CNN extracts low-level spatial or temporal features from raw inputs, providing higher-quality input for the subsequent BiLSTM. BiLSTM combines forward and backward information flows, and the Attention Mechanism further optimizes the representation of sequential features by focusing on critical information. This model effectively suppresses the divergence of positioning errors, whether in the short or long term.
- In the north direction, the maximum error (MAX) of the INS reaches 26.8112 m, with MAE and RMSE being 8.2734 m and 11.3666 m, respectively. After introducing RNN, LSTM, and CNN-BiLSTM-Attention, the maximum errors are 4.4577 m, 1.7820 m, and 0.9282 m, respectively; the MAEs are 2.4095 m, 0.5577 m, and 0.4134 m; and the RMSEs are 2.7288 m, 0.7182 m, and 0.4953 m. Compared to INS, RNN, and LSTM, the positioning error of CNN-BiLSTM-Attention decreases by 95.00%, 82.84%, and 25.88% in MAE, and by 95.64%, 81.85%, and 31.04% in RMSE, respectively.
- In the east direction, the maximum error (MAX) of the INS reaches 2.1216 m, with MAE and RMSE being 1.1092 m and 1.3669 m, respectively. After introducing RNN, LSTM, and CNN-BiLSTM-Attention for error compensation, the maximum errors are 5.5597 m, 3.5069 m, and 0.5624 m, respectively; the MAEs are 2.1219 m, 2.0000 m, and 0.3152 m; and the RMSEs are 2.7420 m, 2.1932 m, and 0.3423 m. Compared to INS, RNN, and LSTM, the positioning error of CNN-BiLSTM-Attention decreases by 71.58%, 85.15%, and 84.24% in MAE, and by 74.96%, 87.52%, and 84.39% in RMSE, respectively.
- In the down direction, the maximum error (MAX) of INS reaches 2.3810 m, with MAE and RMSE being 0.4680 m and 0.7323 m, respectively. After introducing RNN, LSTM, and CNN-BiLSTM-Attention for error compensation, the maximum errors are 1.7906 m, 0.7455 m, and 0.5046 m, respectively; the MAEs are 1.0900 m, 0.5381 m, and 0.3893 m; and the RMSEs are 1.2034 m, 0.5713 m, and 0.4083 m. Compared to INS, RNN, and LSTM, the positioning error of CNN-BiLSTM-Attention decreases by 16.82%, 64.28%, and 27.65% in MAE, and by 44.24%, 66.07%, and 28.52% in RMSE, respectively.
- In the three-dimensional direction, the maximum error (MAX) of the INS is 26.9968 m, with MAE and RMSE being 8.3736 m and 11.4719 m, respectively. After introducing RNN, LSTM, and CNN-BiLSTM-Attention, the maximum errors are 7.0729 m, 3.6818 m, and 1.0580 m, respectively; the MAEs are 3.5801 m, 2.2055 m, and 0.6927 m; and the RMSEs are 4.0513 m, 2.3775 m, and 0.7275 m. In terms of MAE, compared to the other methods, the positioning accuracy of CNN-BiLSTM-Attention improves by 91.73%, 80.65%, and 68.59%, respectively. In terms of RMSE, the positioning accuracy of CNN-BiLSTM-Attention improves by 93.66%, 82.04%, and 69.40%, respectively. Overall, CNN-BiLSTM-Attention achieves the highest error compensation accuracy compared to the other methods.
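The error statistics and percentage reductions quoted above follow directly from the standard definitions of MAX, MAE, and RMSE. A minimal sketch (function names are ours, not from the paper) reproduces the north-direction INS vs. CNN-BiLSTM-Attention reductions:

```python
import numpy as np

def error_metrics(err):
    """MAX, MAE, and RMSE of a 1-D positioning-error series (metres)."""
    err = np.asarray(err, dtype=float)
    return {
        "MAX": float(np.max(np.abs(err))),
        "MAE": float(np.mean(np.abs(err))),
        "RMSE": float(np.sqrt(np.mean(err ** 2))),
    }

def reduction(baseline, value):
    """Percentage reduction of `value` relative to `baseline`."""
    return 100.0 * (baseline - value) / baseline

# North-direction figures quoted above: INS vs. CNN-BiLSTM-Attention.
print(round(reduction(8.2734, 0.4134), 2))   # MAE reduction -> 95.0
print(round(reduction(11.3666, 0.4953), 2))  # RMSE reduction -> 95.64
```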
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Component | Parameter Description | Configured Value |
|---|---|---|
| CNN | Number of convolutional layers | 2 |
| | Convolution kernel size and count | [1, 32], [1, 64] |
| | Number of pooling layers | 2 |
| | Pooling layer filter size | [1, 1] |
| BiLSTM | Number of hidden layers | 2 |
| | Neurons per hidden layer | [15, 10] |
| Attention | Total number of channels | 32 |
| Initial training | Maximum training iterations | 100 |
| | Batch size | 16 |
| | Learning rate | 0.01 |
| | Decay factor | 0.1 |
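The attention layer configured above can be read as attention pooling over the BiLSTM output sequence. A NumPy sketch of one common formulation (additive attention) is given below; all weight shapes and variable names here are illustrative assumptions, not the authors' exact layer:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(x)
    return e / np.sum(e, axis=axis, keepdims=True)

def attention_pool(h, w, b, u):
    """Additive attention over a sequence of BiLSTM outputs.

    h : (T, d) hidden states for T time steps
    w, b, u : projection weights/bias and context vector (assumed shapes)
    Returns the attention-weighted context vector (d,) and the weights (T,).
    """
    scores = np.tanh(h @ w + b) @ u  # (T,) relevance score per time step
    alpha = softmax(scores)          # weights are positive and sum to 1
    context = alpha @ h              # weighted sum of hidden states
    return context, alpha

# Toy usage. d = 20 assumes the second BiLSTM layer (10 units, bidirectional)
# feeds the attention layer; this pairing is our assumption, not stated above.
rng = np.random.default_rng(0)
T, d = 5, 20
h = rng.standard_normal((T, d))
ctx, alpha = attention_pool(h, rng.standard_normal((d, d)), np.zeros(d),
                            rng.standard_normal(d))
print(ctx.shape, round(float(alpha.sum()), 6))  # (20,) 1.0
```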
| Sensor | Range | Bias | Random Walk | Scale Factor |
|---|---|---|---|---|
| Gyroscope | | 20°/h | | 1500 ppm |
| Accelerometer | | 50 mg | | 4000 ppm |
| Index | Error Component | INS | RNN | LSTM | CNN-BiLSTM-Attention | Percentage Reduction (vs. INS) |
|---|---|---|---|---|---|---|
| MAX | North error (m) | 26.8112 | 4.4577 | 1.7820 | 0.9282 | 96.54% |
| | East error (m) | 2.1216 | 5.5597 | 3.5069 | 0.5624 | 73.49% |
| | Height error (m) | 2.3810 | 1.7906 | 0.7455 | 0.5046 | 78.81% |
| | 3D error (m) | 26.9968 | 7.0729 | 3.6818 | 1.0580 | 96.08% |
| MAE | North error (m) | 8.2734 | 2.4095 | 0.5577 | 0.4134 | 95.00% |
| | East error (m) | 1.1092 | 2.1219 | 2.0000 | 0.3152 | 71.58% |
| | Height error (m) | 0.4680 | 1.0900 | 0.5381 | 0.3893 | 16.82% |
| | 3D error (m) | 8.3736 | 3.5801 | 2.2055 | 0.6927 | 91.73% |
| RMSE | North error (m) | 11.3666 | 2.7288 | 0.7182 | 0.4953 | 95.64% |
| | East error (m) | 1.3669 | 2.7420 | 2.1932 | 0.3423 | 74.96% |
| | Height error (m) | 0.7323 | 1.2034 | 0.5713 | 0.4083 | 44.24% |
| | 3D error (m) | 11.4719 | 4.0513 | 2.3775 | 0.7275 | 93.66% |
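Since each epoch's 3D error is the Euclidean norm of the north, east, and height errors, the per-axis RMSEs must combine in quadrature (RMSE_3D² = RMSE_N² + RMSE_E² + RMSE_H²). The tabulated values satisfy this consistency check:

```python
import math

# Per-axis RMSEs reported in the results table (metres).
rmse = {
    "INS": (11.3666, 1.3669, 0.7323),
    "CNN-BiLSTM-Attention": (0.4953, 0.3423, 0.4083),
}

for method, (n, e, h) in rmse.items():
    rmse_3d = math.sqrt(n**2 + e**2 + h**2)
    print(method, round(rmse_3d, 4))  # matches the tabulated 11.4719 / 0.7275
```

Note that the MAE rows do not combine this way: the mean of a norm is not the norm of the per-axis means, so only the RMSE rows admit this check.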
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Dai, W.; Han, H.; Wang, J.; Xiao, X.; Li, D.; Chen, C.; Wang, L. Enhanced CNN-BiLSTM-Attention Model for High-Precision Integrated Navigation During GNSS Outages. Remote Sens. 2025, 17, 1542. https://doi.org/10.3390/rs17091542