Review

High Precision Navigation and Positioning for Multisource Sensors Based on Bibliometric and Contextual Analysis

1 Innovation Academy for Precision Measurement Science and Technology, Chinese Academy of Sciences, Wuhan 430077, China
2 College of Earth and Planetary Sciences, University of Chinese Academy of Sciences, Beijing 100049, China
* Author to whom correspondence should be addressed.
Remote Sens. 2025, 17(7), 1136; https://doi.org/10.3390/rs17071136
Submission received: 31 January 2025 / Revised: 18 March 2025 / Accepted: 19 March 2025 / Published: 22 March 2025
(This article belongs to the Special Issue Advanced Multi-GNSS Positioning and Its Applications in Geoscience)

Abstract:
With the increasing demand for high-precision positioning, integrated navigation technology has become a key approach to achieving accurate and reliable location tracking on modern intelligent mobile platforms. While previous studies have explored various sensor combinations, a systematic analysis of the integration of the four major sensors (GNSS, INS, vision, and LiDAR) is still lacking. This study analyzes 5193 academic articles published between 2000 and 2024 in the Web of Science database, employing bibliometric analysis, network analysis, and content analysis to evaluate the development and application of these four sensors in integrated navigation systems. Reviewing the evolution of integrated navigation technology, the study examines four typical integration modes: GNSS/INS, INS/visual, GNSS/INS/visual, and GNSS/INS/visual/LiDAR, and discusses their complementarity, fusion algorithm optimization, and emerging application scenarios. Despite significant progress in navigation accuracy and environmental adaptability, challenges persist in sensor cooperation and real-time processing in complex environments. The study concludes by summarizing existing findings and identifying gaps, and suggests that future research should focus on optimizing multisensor fusion algorithms, enhancing system adaptability, improving error models, and strengthening sensor performance in adverse environmental conditions.


1. Introduction

Since the beginning of the 21st century, significant breakthroughs have been made in multisensor technologies, including global navigation satellite systems (GNSS), inertial navigation systems (INS), visual sensors, and LiDAR. GNSS offers global coverage but requires direct satellite visibility; INS bridges temporary signal outages through motion propagation; and visual and LiDAR sensors provide the environmental perception critical for sustained operation in GNSS-denied environments such as urban canyons and indoor facilities. However, any individual navigation system is constrained by its inherent limitations and therefore struggles to provide high-precision, reliable navigation services in complex environments and under restricted observation conditions. Integrated navigation technology emerged to address this problem: by combining multiple navigation systems, it effectively overcomes the shortcomings of individual systems [1]. Integrated navigation has demonstrated strong adaptability and excellent performance and has become an important research direction in modern navigation. Its continuous development and application have significantly broadened its use across various fields. Initially, integrated navigation focused primarily on the combination of GNSS and INS. With advances in technology, however, an increasing number of sensors have been integrated into these systems, enabling reliable navigation services under diverse environmental conditions.
Integrated navigation technology offers several advantages. First, it is highly adaptable to complex environments and observation-limited conditions. This adaptability is achieved through synergistic sensor interactions: INS maintains trajectory continuity during GNSS signal loss, visual sensors adaptively extract navigational features from dynamic surroundings, and LiDAR constructs persistent spatial references, collectively enabling reliable positioning under challenging conditions such as signal-blocked urban areas, feature-deprived tunnels, and rapidly changing operational scenes. Such multimodal coordination ensures reliable positioning and navigation in applications ranging from high-altitude flight and deep-sea exploration to autonomous driving in densely populated urban areas and unmanned operations in challenging terrains. Even under degraded observation conditions, such as GNSS signal obstruction, severe environmental changes, and complex dynamics, integrated navigation technology maintains reliable navigation information [2]. Second, integrated navigation technology yields the optimal navigation solution for a specific application scenario by flexibly selecting and integrating different types of sensors and employing either loosely coupled (LC) or tightly coupled (TC) information fusion strategies [2,3]. Owing to this high degree of flexibility and versatility, integrated navigation technology can be applied in many areas and plays a key role in technological advancement and industrial innovation. With the ongoing advancement of the technology and increasing market demand, integrated navigation is progressing on several fronts. From the hardware perspective, novel sensors such as miniaturized INS, high-precision visual sensors, and long-range LiDAR have significantly enhanced the sensing capabilities of integrated navigation systems [4,5].
From the algorithmic perspective, state estimation methods such as filtering, graph optimization, and deep learning have gradually been incorporated into navigation frameworks to improve system performance. Moreover, customized integrated navigation solutions, such as GNSS/INS, INS/visual, GNSS/INS/visual, and GNSS/INS/visual/LiDAR, have been widely adopted and validated in specific application scenarios, further broadening the applications of integrated navigation technology.
To rigorously investigate the development of integrated navigation technology, this paper applies bibliometric, network, and content analysis methods [6] to comprehensively examine references, authorship, countries, institutions, keywords, and citation impacts. The objective is to reveal the underlying knowledge structure, the distribution of research capabilities, collaboration networks, key research areas, and emerging trends within the field of integrated navigation. The results provide robust data support for understanding the current state and trends of integrated navigation technology and can help address its future challenges.

2. Research Approach

2.1. Data Collection

To explore the developmental trajectory of the knowledge structure, research hotspots, and emerging trends of integrated navigation technology, this study employs the Web of Science (WOS) Core Collection database as the primary platform for literature retrieval and analysis. This platform indexes a large number of high-quality, peer-reviewed journal articles across a wide range of academic disciplines and thus provides comprehensive and systematic literature resources for this study. Additionally, the citation analysis tools of WOS effectively reveal citation relationships among the literature, helping to identify the academic trajectory, research hotspots, and emerging trends in the field [7].
Considering the widespread application of integrated navigation technology, we designed two sets of keywords to comprehensively cover the literature in the research field of GNSS/INS/visual/LiDAR integrated navigation. The first set includes “Integrated navigation”, “Multi-sensor Fusion”, “combined technology”, and “navigation system”, which capture the core concepts of system integration and performance optimization, ensuring that the literature on the fundamental theories and methods of integrated navigation is covered. The second set includes terms such as “GNSS”, “GPS”, “Global Navigation Satellite System”, “INS”, “Inertial Navigation System”, “IMU”, “Inertial Measurement Unit”, “Vision”, “Visual”, “Camera-based Navigation”, “Visual SLAM (simultaneous localization and mapping)”, “LiDAR”, and “Laser Radar”. These keywords were selected because integrated navigation systems involve a wide range of technologies and sensors: the set covers both foundational technologies, such as GNSS and INS, and advanced methods, such as SLAM and LiDAR. Including these terms ensures the retrieval of literature covering both traditional and cutting-edge technologies in integrated navigation, providing a comprehensive view of the field's research progress. The two sets of keywords are linked by the Boolean operator “AND”, while keywords within each set are connected by “OR” to ensure comprehensive coverage of all the relevant literature. The final search expression is formulated as follows: TS = ((Integrated navigation OR Multi-sensor Fusion OR combined technology OR navigation system) AND (((GNSS OR GPS OR Global Navigation Satellite System) AND (INS OR Inertial Navigation System OR IMU OR Inertial Measurement Unit)) OR ((GNSS OR GPS OR Global Navigation Satellite System) AND (Vision OR Visual OR Camera-based Navigation OR Visual SLAM)) OR ((GNSS OR GPS OR Global Navigation Satellite System) AND (LiDAR OR Laser Radar)) OR ((INS OR Inertial Navigation System OR IMU OR Inertial Measurement Unit) AND (Vision OR Visual OR Camera-based Navigation OR Visual SLAM)) OR ((INS OR Inertial Navigation System OR IMU OR Inertial Measurement Unit) AND (LiDAR OR Laser Radar)) OR ((Vision OR Visual OR Camera-based Navigation OR Visual SLAM) AND (LiDAR OR Laser Radar)))) AND DT = (Article OR Review) AND LA = English.
Using this search expression in the Web of Science, we initially retrieved 5986 relevant papers. After rigorous screening to exclude non-navigation articles, we focused on 5193 research papers published between 2000 and 2024 for further bibliometric and network analysis. To support the comprehensive study, key citation information was extracted, including author names, article titles, publication details, and citation counts, along with abstracts and reference lists.
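To make the pairwise structure of this expression concrete, the following minimal Python sketch assembles the same query from the two keyword sets. The helper function and script are illustrative only; the retrieval itself was performed directly in the Web of Science interface.

```python
from itertools import combinations

# Keyword sets taken verbatim from the search strategy described above.
system_terms = ["Integrated navigation", "Multi-sensor Fusion",
                "combined technology", "navigation system"]
sensor_groups = {
    "gnss":   ["GNSS", "GPS", "Global Navigation Satellite System"],
    "ins":    ["INS", "Inertial Navigation System", "IMU", "Inertial Measurement Unit"],
    "vision": ["Vision", "Visual", "Camera-based Navigation", "Visual SLAM"],
    "lidar":  ["LiDAR", "Laser Radar"],
}

def or_clause(terms):
    """Join one keyword set with OR, as in the published expression."""
    return "(" + " OR ".join(terms) + ")"

# Every unordered pair of sensor groups is ANDed, and the six pairs are
# ORed, reproducing the pairwise structure of the TS=(...) expression.
pairs = [f"({or_clause(a)} AND {or_clause(b)})"
         for a, b in combinations(sensor_groups.values(), 2)]

query = f"TS = ({or_clause(system_terms)} AND ({' OR '.join(pairs)}))"
print(query)
```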

2.2. Bibliometric, Network, and Content Analysis

In this paper, we adopted an integrated approach combining bibliometric, network, and content analysis to explore the structure, dynamics, and core content of published research [8]. Bibliometric analysis, as a quantitative method, reveals and emphasizes the patterns and impact of scientific output.
Network analysis is an effective method for handling large-scale bibliometric data; it can employ advanced computational tools to uncover interdisciplinary connections, construct academic network maps, and predict trends while identifying key research foci. VOSviewer (version 1.6.20) and CiteSpace (version 6.3.R1) are two commonly used bibliometric analysis tools, each with its own strengths. VOSviewer is a visualization package developed at Leiden University in the Netherlands and is widely favored for its flexibility and powerful functionality [8]. In this study, VOSviewer was used for network analysis to present the research findings in an intuitive graphical format. CiteSpace, developed by Chen [9,10], focuses on trend analysis and pattern recognition; it is highly effective in detecting emerging topics, research frontiers, and the evolution of academic networks. Through citation analysis, CiteSpace reveals the core literature, helping researchers identify key knowledge and developmental trends within a specific field [6]. Content analysis was also applied to summarize key trends and develop a comprehensive framework for the literature review. Through bibliometric analysis, we identify rapidly developing research topics, potentially uncover new ones, deepen the understanding of the current state of the technology, and provide a guide for future research. To provide a clear overview of the research methodology, Figure 1 illustrates the key steps involved in data collection, processing, and analysis, offering a comprehensive framework for understanding the research approach.
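As a concrete illustration of the network-analysis step, the short Python sketch below builds a weighted co-authorship graph of the kind VOSviewer visualizes. The records list is a hypothetical stand-in for parsed WOS export files, and the author names are borrowed from the tables discussed later purely as examples.

```python
import itertools
import networkx as nx

# Hypothetical stand-in for parsed WOS records (see Section 3.2 tables).
records = [
    {"authors": ["Niu, Xiaoji", "El-Sheimy, Naser"]},
    {"authors": ["Niu, Xiaoji", "Hsu, Li-Ta", "Noureldin, Aboelmagd"]},
]

G = nx.Graph()
for rec in records:
    # Each pair of coauthors on a paper adds or strengthens an edge.
    for a, b in itertools.combinations(sorted(rec["authors"]), 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

# Node "size" in a VOSviewer-style map is proportional to output volume.
for author in G.nodes:
    G.nodes[author]["papers"] = sum(author in r["authors"] for r in records)

print(G.edges(data=True))
```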

3. Results

3.1. Analysis of Publication Outputs

The time series of publications on GNSS, INS, visual, and LiDAR-integrated navigation technology reveals changes in research intensity, the evolution of research focus, academic interest, and research and development investment. An upward trend in the total number of publications typically signals increasing academic attention or advances in core technologies that drive further research. The spatial distribution of this research, based on an analysis of the 5193 publications retrieved from the Web of Science, is outlined below.
Figure 2 illustrates the geographical distribution of global scientific publications. It highlights the top five countries by number of publications, with China leading at 46.6%, followed by the USA at 12.4%, and Canada at 7.7%. South Korea and Germany rank fourth and fifth with 5.2% and 4.5%, respectively. These data not only reflect the relative weight of each country in global scientific publications but also reveal their competitiveness in terms of research investment, human resources, innovation capacity, and institutional efficiency. This helps us better understand the role and development trends of different countries in the research of GNSS/INS/visual/LiDAR integrated navigation technology and provides a quantitative reference for identifying leading countries.
Figure 3a illustrates the changes in the number of publications on integrated navigation research from 2000 to 2024. Based on the observed trends, the development of this field can be divided into three periods. In the first period, from 2000 to 2005, the number of publications increased slowly, indicating that research in this field was relatively inactive and received little attention. In the second period, from 2006 to 2011, the number of publications gradually increased, indicating growing attention to integrated navigation technology. In the third period, from 2012 to 2024, publication numbers rose rapidly, especially in China. Figure 3b illustrates the total number of publications and citations from the top five countries between 2000 and 2024. China was far ahead in both metrics, indicating that it has paid particular attention to the field of integrated navigation and invested considerable effort in related research. With the development of multisensor fusion technology, applications and research in integrated navigation are receiving ever more attention, and the number of publications has shown almost exponential growth, underscoring the field's growing importance. Research topics have also diversified, demonstrating that the field remains in a phase of rapid development in which new theories and technologies continually emerge.
In terms of journal publications, there are 15 journals that have published more than 50 articles in this field. Table 1 lists the top 15 major journals based on the analytical sample of the published literature. These journals can be regarded as the most prominent publishing platforms in the field of integrated navigation. As shown in the table, Sensors, IEEE Sensors Journal, Remote Sensing, IEEE Access, and IEEE Transactions on Instrumentation and Measurement are the top five journals in this field. These data highlight that these journals play an important role in promoting academic exchange.

3.2. Comprehensive Analysis of Authors, Countries, and Institutions

3.2.1. Author Contribution Analysis

Based on the VOSviewer analysis, Figure 4 shows a dense network of collaboration links among 533 core authors who have each published at least 5 articles. Table 2 lists the top 15 authors ranked by the number of published articles in this field. Further analysis of Figure 4 and Table 2 reveals that most of these authors collaborate both widely and intensively, forming tight and frequent cooperative networks. This suggests that their broad collaborations play a crucial role in knowledge dissemination, interdisciplinary research, and scientific output.
Figure 5 illustrates the cocitation network of researchers who have made significant contributions to multimodal fusion navigation technologies involving GNSS, INS, visual, and LiDAR. This network was generated through an in-depth exploration of the literature database, where each dot represents an author who has published at least 15 papers and received at least 200 citations. Of 13,662 candidate authors, 56 core scholars met this strict criterion.
Table 3 lists the top 15 authors ranked by citation frequency in this cocitation network. Shaojie Shen ranks first with 3314 citations, followed closely by Xiaoji Niu with 2163 citations; both have demonstrated significant academic influence. Citation counts serve an academic evaluation function and play an important role in reflecting the scholarly value of published papers. A comparison between Table 2 and Table 3 reveals that Xiaoji Niu (Wuhan University) and Aboelmagd Noureldin (Queen's University/Royal Military College of Canada) are not only leading authors in publication count but also rank highly in citation frequency. Additionally, scholars such as Naser El-Sheimy (University of Calgary) and Li-Ta Hsu (The Hong Kong Polytechnic University) have produced substantial research results with high citation rates, clearly demonstrating their profound expertise and impact in GNSS/INS/visual/LiDAR integrated navigation technology.

3.2.2. Geographical Distribution of Contributions

To conduct an in-depth analysis of international research collaborations, a threshold of at least 15 coauthored publications was set. Among the 98 countries analyzed, 41 met this criterion and were included in the country collaboration network shown in Figure 6. Based on highly consistent collaboration patterns within the specified research field, the network is further divided into several distinct clusters [7], with clusters of different colors revealing collaboration networks between different countries. For instance, the blue cluster highlights the intensive collaboration between countries such as China and the United States, demonstrating their significant joint contributions to research activities, while the green cluster emphasizes the close partnerships between countries such as Canada, England, and India, showcasing their collaborative research efforts within the region. Each node in the figure represents a country, and its size is proportional to the total number of academic articles that country published. Specifically, China is the largest node with 2422 papers published during the sample period, followed by the United States with 642 papers and Canada with 402 papers. The thickness of a connection indicates the strength of collaboration between the linked countries: the thicker the line, the stronger and more active the scientific collaboration. As shown in Figure 6, countries with more published papers tend to exhibit stronger collaborative relationships in the international cooperation network.

3.2.3. Institutional Collaboration Patterns

According to the available data, a total of 3108 institutions have contributed publications in the field of integrated navigation technology. As shown in Figure 7 and Table 4, among the top 15 institutions ranked by the number of publications, Wuhan University holds the top position with 262 papers published in this field, followed by Beihang University with 199. Institutions such as the University of Calgary, the National University of Defense Technology, and Southeast University have also attained leading positions through their outstanding research performance. The contributions of these institutions are reflected not only in the absolute number of their research results but also in their substantial strength and extensive influence in the research, development, and theoretical exploration of integrated navigation technology. Their results provide continuous momentum for the advancement of integrated navigation technology and lay a solid foundation for the future development of this field.

3.3. Research Hotspot Analysis

3.3.1. Keyword Analysis

Figure 8 illustrates the keyword network of publications spanning from 2000 to 2024, with a co-occurrence frequency of at least 50. To ensure focus on frequently appearing and influential keywords, a “mean-split” standardization method was applied to measure the importance and relative frequency of the keywords [11]. The size of each node corresponds to the frequency of the keywords’ appearance in the literature. Keywords with a frequency above the mean have larger nodes, while those below the mean have smaller nodes. The strength of the associations between the keywords is represented by the thickness and distance of the connections; the higher the co-occurrence frequency, the thicker the connection.
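The mean-split rule itself is simple, and the following tiny Python sketch shows the idea: keywords whose co-occurrence frequency exceeds the mean are drawn as larger nodes. The frequencies here are invented placeholders, not counts from the study.

```python
# Made-up keyword frequencies for illustration only.
freqs = {"GNSS": 820, "Kalman filter": 640, "SLAM": 510, "pose estimation": 90}

mean_freq = sum(freqs.values()) / len(freqs)      # the "split" threshold
node_class = {kw: ("large" if f > mean_freq else "small")
              for kw, f in freqs.items()}
print(mean_freq, node_class)
```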
The figure displays four main research themes, distinguished by color. The blue cluster focuses on the broad applications of GNSS and its derived technologies in positioning and navigation, such as BDS, GPS, precise point positioning (PPP), and real-time kinematic (RTK). The green cluster focuses on INS and sensor fusion, covering technologies such as accelerometers, GPS, and Kalman filters, with the aim of enhancing navigation accuracy and state estimation through sensor data fusion. The yellow cluster focuses on visual perception and positioning optimization, including visual–inertial odometry (VIO), pose estimation, feature extraction, optimization algorithms, and visualization techniques that enhance location awareness and accuracy in complex environments. The red cluster centers on autonomous driving and related technologies, encompassing multisensor fusion, SLAM, LiDAR, deep learning, autonomous navigation, and visual navigation, which leverage advanced sensing and processing methods to improve real-time positioning and navigation for autonomous vehicles.
In this study, we conducted an in-depth analysis of the 25 most notable keywords exhibiting sudden growth from 2000 to 2024 using the CiteSpace tool [12], as shown in Figure 9, to reveal the evolutionary trajectory of research themes. Each keyword is associated with specific year labels and intensity indicators, with red bar charts illustrating the periods of citation bursts and peak intensity. Through the temporal dynamic analysis of these keywords, we can clearly understand the changes in the popularity of research hotspots over time, thereby providing insights for future research directions.
Figure 9 indicates that early technological development relied primarily on the combination of INS and GPS. Between 2000 and 2008, inertial navigation displayed a burst intensity of 20.47, while the global positioning system reached 14.97 from 2000 to 2011. Together, these systems formed the foundational architecture of navigation systems, significantly enhancing their reliability. Meanwhile, GPS/INS integrated navigation, with a burst intensity of 23.97 from 2006 to 2016, emerged as a representative example of multimodal fusion, meeting the demand for high-accuracy positioning in complex scenarios [13,14]. Subsequently, computer vision achieved a burst intensity of 8.63 from 2012 to 2014 and vision technologies reached 7.65 from 2013 to 2016, reflecting a notable improvement in environmental perception capabilities and an expanding scope of sensor combinations. Additionally, Figure 9 reveals profound transformations in the integration of theoretical innovation and practical application within integrated navigation algorithms, transitioning from the classic Kalman filter to intelligent developments including the particle filter, factor graph optimization, deep learning, machine learning, and the adaptive Kalman filter. In recent years, the emergence of “real-time systems”, “location awareness”, “pose estimation”, and “ambiguity resolution” reflects notable advancements in integrated navigation. These terms highlight a growing focus on improving real-time responsiveness, enhancing environmental adaptability, and achieving greater positioning accuracy [15,16,17,18,19,20]. In summary, the developmental trajectory of these keywords not only revisits the key milestones of integrated navigation technology but also provides significant insight into future research trends, highlighting the broad application prospects of intelligent algorithms in navigation systems.

3.3.2. Citation Impact Analysis

Citation analysis serves as a powerful tool that reveals the cocitation relationships among various works, aiding researchers in exploring the intrinsic connections within the literature knowledge system. This analytical approach effectively identifies key works that have significantly influenced a specific research field and quantifies the strength of the interconnections among these works, thereby providing an effective assessment of the research’s impact and significance in the academic community [21].
Highly cited papers often signify pioneering value within their respective research fields and play a crucial role in the development of the entire domain. Table 5 lists the ten most cited works in the field of integrated navigation. These studies cover a wide range of topics, including traditional GNSS/INS integration, VIO, and LiDAR–inertial fusion, highlighting significant advancements in multisensor fusion, state estimation accuracy, and real-time performance optimization. For example, in document 10, Noureldin et al. focus on enhancing the performance of low-cost navigation systems by integrating MEMS-based INS and GPS. Their approach improves filtering algorithms to mitigate multipath effects and signal obstructions, significantly increasing navigation accuracy and reliability in urban environments [22]. Similarly, document 9 surveys various in-car positioning and navigation technologies, including GNSS, inertial sensors, and odometry, highlighting their practical applications in modern automotive systems [23]. In the domain of visual–inertial odometry, document 2, Qin et al.'s VINS-Mono, achieves high-precision state estimation by fusing data from visual and inertial sensors, making it particularly suitable for autonomous navigation in GPS-denied environments and advancing the development of drone and robotic navigation [24]. Document 5 further refines visual–inertial odometry through manifold preintegration techniques, enhancing real-time performance and accuracy and broadening applications in autonomous driving scenarios [25]. LiDAR–inertial fusion is also a key focus. Document 8, FAST-LIO2, integrates LiDAR data with inertial measurements in a fast, direct odometry approach; this innovation significantly improves real-time navigation efficiency and robustness in dynamic environments, paving the way for next-generation autonomous navigation systems [26]. Overall, these works illustrate a clear progression in integrated navigation technologies: from traditional GNSS/INS combinations to cutting-edge visual–inertial and LiDAR–inertial methods, they have collectively elevated the state of the art, enabling more accurate, reliable, and versatile navigation across a variety of challenging operational environments.
The burst index is a metric that measures sudden citation surges, highlighting works that rapidly gain influence within a short timeframe. Such bursts often reflect a temporary but significant impact on research trends. In CiteSpace II, the burst index helps identify pivotal articles that signal rising importance or shifts in intellectual focus [32]. Figure 10 shows the top 25 documents ranked by burst index, illustrating these abrupt spikes in citation activity. These works highlight distinctive research advancements in sensor combination methods, algorithm optimization strategies, and multisensor fusion technologies. Within the field of multisensor fusion, the exploration of various combination approaches has consistently been a central focus. From early GNSS/INS integration to the rise of visual–inertial SLAM and the inclusion of LiDAR with inertial sensors, each added sensor has contributed significantly to enhancing navigation accuracy and robustness. For instance, the study by Groves demonstrated a burst index of 42.27 between 2014 and 2018, indicating significant progress in addressing signal occlusion in complex environments through GNSS and inertial navigation system integration [33]. The visual–inertial SLAM algorithm proposed by Leutenegger et al. had a burst index of 52.37 during the 2017 to 2020 period, showcasing the maturity of this sensor combination in resolving localization drift in dynamic environments [34]. Subsequently, the study by Qin et al. incorporated LiDAR into a multimodal sensor fusion system, achieving a burst index of 55.9 during the 2020 to 2024 period, a milestone that greatly improved the performance of unmanned and robotic navigation technologies [24].
In the realm of algorithm optimization, research on sensor combinations has focused on enhancing the real-time performance and accuracy of data fusion. The filtering algorithm proposed by Noureldin et al. achieved a burst index of 22.49 during the 2010 to 2014 period; by improving filtering methods for low-cost inertial navigation systems, this study significantly reduced the impact of multipath effects on navigation accuracy [22]. The work of Mur-Artal et al. concentrated on improving visual–inertial SLAM algorithms, attaining burst indices of 38.21 and 37.7 during the 2017–2020 and 2019–2022 periods, respectively. These studies innovatively optimized loop closure detection and zero-drift localization methods, further enhancing the robustness and real-time performance of SLAM systems in dynamic environments and laying a critical algorithmic foundation for deeper multisensor fusion [35,36]. In conclusion, the evolution of sensor combination technology from early two-sensor integration to multimodal fusion clearly reflects the development trajectory of navigation technologies. At each stage, the exploration of sensor combinations and algorithm optimization has consistently enhanced the accuracy and adaptability of navigation systems. From GNSS/INS to visual–inertial SLAM, and finally to multimodal fusion, navigation technology has been advancing toward highly intelligent, integrated systems. Future research will continue to improve accuracy, robustness, and applicability, providing reliable support for autonomous driving, robotic navigation, and other complex scenarios.
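To make the notion of a citation burst concrete, the following Python sketch flags years in which a work's citation rate far exceeds its own baseline. Note that CiteSpace computes burst strength with Kleinberg's state-machine algorithm rather than this simple ratio; the yearly counts below are invented for illustration.

```python
# Invented yearly citation counts for a single hypothetical work.
yearly_citations = {2015: 4, 2016: 6, 2017: 48, 2018: 61, 2019: 55, 2020: 12}

baseline = sum(yearly_citations.values()) / len(yearly_citations)
# Flag "burst" years whose citation rate is more than twice the baseline;
# a crude stand-in for Kleinberg-style burst detection.
burst_years = {y: c / baseline
               for y, c in yearly_citations.items() if c > 2 * baseline}
print(burst_years)
```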

4. Discussion

4.1. Development and Innovation in Multisource Sensor Technologies

In the current era of information and intelligence, multisensor fusion has emerged as a pivotal technology for achieving high-precision positioning, navigation, environmental perception, and intelligent decision-making. By integrating data from various sensor types, multisensor fusion enables comprehensive and accurate environmental awareness, significantly enhancing system stability and reliability. However, each sensor type has its own strengths and limitations, which must be carefully considered to optimize the effectiveness of multisensor fusion in integrated navigation systems.
As illustrated in Figure 11, a visual analysis of relevant literature databases reveals a growing trend in multisensor fusion technology in recent years. GNSS provides accurate, large-area positioning and is widely used in outdoor applications, such as agricultural automation and open-road autonomous driving, due to its global coverage and high accuracy [53]. However, its reliability is compromised in urban canyons, dense forests, or tunnels due to signal obstruction, interference, and multipath effects. In contrast, INS, an autonomous and passive navigation system, operates independently of external signals and is less affected by environmental interference. It is widely adopted in missile guidance, underwater exploration, and aerospace navigation due to its robustness in extreme environments. However, its cumulative errors over time necessitate periodic calibration through external references (e.g., GNSS or visual landmarks) [54]. To compensate for the limitations of both GNSS and INS, GNSS/INS integration has become one of the most widely used techniques in integrated navigation systems [55]. The integration of GNSS and INS has evolved from independent, large-scale, and power-hungry systems to systems focusing on miniaturization, low power consumption, deep integration, and intelligence, thus laying a solid foundation for integrated navigation [54,56,57,58,59,60,61,62].
Around 2020, the incorporation of LiDAR technology marked a significant expansion in this field, greatly enhancing the environmental perception and three-dimensional mapping capabilities of navigation systems [63,64]. LiDAR provides high-precision 3D point cloud data capable of capturing detailed terrain and obstacle information. It excels in applications requiring high-precision mapping, such as autonomous vehicles, topographic surveys, and infrastructure inspections. Unlike some other sensors, LiDAR is little affected by lighting conditions and can operate in both daylight and nighttime environments. However, LiDAR systems face challenges, including high cost, sensitivity to weather conditions (such as rain or fog), and substantial power consumption, limiting their large-scale deployment in certain applications [65].
Subsequently, around 2021, the rapid rise of visual sensor technology in integrated navigation applications became another significant milestone [66,67]. The introduction of advanced technologies such as SLAM significantly improved the robustness and accuracy of navigation solutions, especially in environments where GNSS signals are weak or unavailable, such as indoor settings or urban canyons. In these scenarios, visual sensors provide rich environmental information that compensates for the lack of GNSS signals. A practical example is their use in indoor warehouse robots, where visual sensors enable precise navigation along aisles and shelves without the need for external positioning infrastructure [68]. The integration of INS/visual and GNSS/INS/visual combinations effectively improves navigation capabilities. The INS/visual combination leverages visual data to compensate for INS drift, enhancing system stability and long-term accuracy. Meanwhile, the GNSS/INS/visual combination further improves navigation performance by integrating additional positioning information from both GNSS and visual sensors, thereby strengthening navigation capabilities in GNSS-denied or weak-signal environments. However, the performance of visual sensors is highly dependent on environmental lighting and is affected by factors such as shadows, glare, and low light, which complicate real-time operation in complex and dynamic environments. For instance, on outdoor construction sites where lighting conditions change rapidly, visual sensors may struggle to maintain consistent performance, necessitating more robust processing algorithms or complementary sensors [69,70]. Nevertheless, the widespread application of visual sensors has accelerated the development of multisensor fusion systems, advancing navigation technologies toward more intelligent and flexible solutions and ultimately enabling breakthroughs in adaptive navigation systems.
To further improve navigation accuracy and system stability, GNSS/INS/visual/LiDAR integration has emerged. The fusion of LiDAR point cloud data with GNSS and INS provides comprehensive sensing capabilities, further enhancing navigation accuracy. However, this integration also introduces challenges related to computational complexity and real-time processing, which require continuous optimization to improve overall performance. In recent years, the integration of vision and language has emerged as an innovative solution, significantly enhancing the capabilities of navigation systems. Vision–language navigation combines visual input with natural language instructions, enabling more intuitive and flexible navigation solutions.
For example, in robotics, autonomous vehicles, and aerial drones, systems can process both environmental images and verbal commands to navigate, improving autonomy in environments where GNSS signals are unavailable or traditional methods face challenges. The use of vision–language navigation in aerial drones, for instance, allows them to operate autonomously in complex environments by interpreting both visual and language cues, enabling them to make real-time decisions without relying on GNSS signals [71,72,73].
In summary, the progression from the classic integration of GNSS and INS to the incorporation of LiDAR and the application of visual sensors not only highlights the increasing maturity of multisensor information fusion technology but also demonstrates the vitality of ongoing innovation in the navigation field to meet diverse demands [74]. Looking ahead, future research will focus on addressing global, seamless navigation and positioning through low-cost integration methods as well as improving system miniaturization, real-time processing capabilities, and sensor fusion techniques. Additionally, there will likely be continued efforts to overcome the individual limitations of each sensor type, particularly in terms of weather sensitivity, power consumption, and environmental adaptability.

4.2. Integration and State Estimation in Multisource Systems

4.2.1. Integration Methods of Multisource Sensors

The integration methods of multisensor information play a crucial role in combined navigation systems and are primarily categorized into LC and TC [50]. These two categories represent different levels of data integration and processing closeness. Based on the LC and TC methods, various sensor combinations are widely applied, including GNSS/INS, INS/visual, GNSS/INS/visual, and GNSS/INS/visual/LiDAR. Each combination leverages the strengths of different sensors to enhance navigation accuracy and reliability. In addition to the LC and TC approaches, GNSS/INS combinations have evolved to include deeply coupled (DC) and ultratightly coupled modes. These advanced methods achieve a higher level of integration by tightly coupling GNSS signal processing with INS data, significantly improving robustness and performance in challenging environments such as urban canyons or GNSS-denied areas. The following sections discuss the technical characteristics, advantages, and limitations of the different integration methods along with their implementation in various multisensor configurations.
1. Loosely coupled integration mode
In an LC integration mode, each sensor operates independently, calculating its own positioning information, with data fusion occurring only at a later processing stage to enhance overall positioning performance. This fusion approach is straightforward and requires no complex real-time interaction mechanisms; however, its positioning accuracy can be significantly affected by performance fluctuations of the individual sensors. In GNSS/INS systems, when GNSS is unable to provide a navigation solution, the LC system degrades to stand-alone INS mechanization [55].
In the visual–inertial navigation system (VINS) employing an LC strategy, the visual system and the INS operate independently, each performing separate error estimation and correction. The visual system is primarily responsible for updating the position while the IMU continuously provides real-time dynamic information. During fusion, the two systems are connected through a simple association mechanism such as periodically applying external parameters from the visual system to correct IMU drift. This process does not involve real-time sharing and fusion of observational data.
In GNSS/INS/visual systems, the data processing and fusion of each sensor operate independently, without direct data interaction or constraints. For example, Neal A. Carlson [75] proposed a federated square-root filter design suitable for distributed parallel processing in multisensor systems. The overall operation is divided into multiple independent local subsets, each containing a local filter—specifically, GNSS/IMU and IMU/camera constitute two subfilters. The locally optimal estimates from these subfilters are then input into a main filter for solution, achieving significant performance improvement and optimization for real-time multisensor applications such as integrated navigation systems.
In GNSS/INS/visual/LiDAR systems, GNSS provides absolute position and time information, INS continuously calculates the vehicle’s velocity, attitude, and position, the visual sensor obtains relative position or heading information through image processing, and LiDAR supplies distance and angle information from the environment. These data are processed individually and subsequently integrated to compensate for errors in the IMU’s gyroscope and accelerometer measurements [76,77].
Overall, in an LC architecture, each sensor operates independently with separate processing pipelines, generating high-level navigation outputs that are fused at a centralized node. This approach minimizes real-time intersensor communication and computational complexity, making it well suited to stable environments or resource-limited systems. However, its reliance on independent sensor outputs makes it less effective in dynamic or complex scenarios, where changes in sensor performance can significantly affect overall accuracy.
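The following minimal Python sketch illustrates the LC idea in one dimension: a stand-alone INS mechanization dead-reckons between fixes, and an independently computed GNSS position, when available, corrects the drifting INS solution at a central node by inverse-variance weighting. The dynamics, noise values, and 1D state are simplifications for illustration, not a faithful model of any cited system.

```python
import numpy as np

def ins_propagate(pos, vel, accel, dt):
    """Stand-alone INS mechanization step (1D, constant acceleration)."""
    return pos + vel * dt + 0.5 * accel * dt**2, vel + accel * dt

def lc_fuse(pos_ins, var_ins, pos_gnss, var_gnss):
    """Fuse two independent position solutions by inverse-variance weighting."""
    w_ins = var_gnss / (var_ins + var_gnss)        # weight on the INS solution
    pos = w_ins * pos_ins + (1 - w_ins) * pos_gnss
    var = var_ins * var_gnss / (var_ins + var_gnss)
    return pos, var

pos, vel, var = 0.0, 10.0, 1.0
for step in range(100):                            # 100 Hz INS mechanization
    pos, vel = ins_propagate(pos, vel, accel=0.02, dt=0.01)
    var += 0.05                                    # INS-only error grows without bound
    if step % 10 == 9:                             # a 10 Hz GNSS fix arrives
        gnss_fix = pos + np.random.randn() * 0.5   # simulated independent GNSS solution
        pos, var = lc_fuse(pos, var, gnss_fix, var_gnss=0.25)
```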
2. Tightly coupled integration mode
In contrast, TC integration achieves a deep fusion of sensor data with real-time information exchange, significantly enhancing positioning accuracy and stability in complex environments and when the performance of individual sensors is limited [50]. By tightly fusing data from multiple sensors, TC approaches provide a more integrated and robust solution, ensuring reliable performance even in dynamic and challenging scenarios. This high level of integration comes at the cost of increased computational demands and complexity, making TC methods ideal for applications where precision and stability matter more than simplicity. In GNSS/INS systems, the INS not only performs self-positioning but also provides real-time motion state information that assists GNSS signal tracking and error correction, ensuring positioning continuity and accuracy when satellite signals are weak [50,78]. Studies such as [78,79] focus on TC GNSS/INS integration based on MEMS technology. The postprocessing kinematic/INS tightly coupled mode provided by GINav demonstrates significant advantages in positioning accuracy and fixed-solution rate, particularly in environments with severe signal blockage, where it can maintain continuous and accurate trajectory tracking [80].
In an INS/visual system, both visual observation constraints and the IMU kinematic model constraints are integrated into a nonlinear optimization problem for joint state estimation. The TC strategy fully exploits the intrinsic correlation and redundant information between the two sensors, enabling more precise and comprehensive data fusion and error correction. This approach is particularly effective in scenarios with complex and dynamic environments, significant variations in lighting conditions, or when IMU errors accumulate over prolonged operation, leading to increased drift [81].
In more advanced combinations, such as GNSS/INS/visual systems, the data association and constraints between the sensors are rigorously modeled and processed. By leveraging the complementarity of GNSS, IMU, and visual sensors, the TC integration method directly fuses raw GNSS carrier phase and pseudorange measurements, IMU data, and visual features at the observation level using the extended Kalman filter (EKF). This approach fully exploits the multisensor information and rejects potential outlier measurements [82,83,84].
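As a compact illustration of observation-level fusion, the Python sketch below performs one EKF update with a single raw pseudorange, which is the key step distinguishing TC from LC designs. The four-element state, satellite geometry, and noise figures are illustrative assumptions rather than values from the cited works, and a real TC filter would also carry attitude, velocity, and IMU bias states.

```python
import numpy as np

def pseudorange_update(x, P, sat_pos, rho_meas, sigma=3.0):
    """One EKF update with a raw pseudorange; x = [x, y, z, clock bias (m)]."""
    rng = np.linalg.norm(x[:3] - sat_pos)
    h = rng + x[3]                           # predicted pseudorange
    # Measurement Jacobian: unit line-of-sight vector plus 1 for the bias.
    H = np.hstack([(x[:3] - sat_pos) / rng, [1.0]]).reshape(1, 4)
    S = H @ P @ H.T + sigma**2               # innovation covariance
    K = P @ H.T / S                          # Kalman gain, shape (4, 1)
    x = x + (K * (rho_meas - h)).ravel()
    P = (np.eye(4) - K @ H) @ P
    return x, P

x = np.array([0.0, 0.0, 0.0, 0.0])           # position + receiver clock bias
P = np.eye(4) * 100.0
sat = np.array([15e6, 10e6, 20e6])           # illustrative satellite position
x, P = pseudorange_update(x, P, sat, rho_meas=np.linalg.norm(sat) + 5.0)
```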
In a TC GNSS/INS/visual/LiDAR system, GNSS provides the absolute positioning reference, and its data are tightly integrated with the outputs of the INS gyroscopes and accelerometers. At the same time, the data from the visual sensor and LiDAR are deeply integrated into this optimization process. These two sensors provide rich spatial geometric constraints and other forms of observational information, which, particularly in complex environments such as urban canyons, underground parking garages, or areas with weak GNSS signals, can complement and enhance the navigation system's state estimation. This significantly improves the accuracy of motion trajectory tracking and the reliability of attitude perception [85]. However, it also demands more computational resources and increases algorithmic complexity. To address this, Li et al. adopted a semitight coupling strategy that combines GNSS PPP with S-VINS. This approach implements a bidirectional interaction mechanism, allowing the high-precision local positioning data generated by the S-VINS system to be exchanged and shared in real time with the PPP system, creating complementary advantages. It not only maintains high stability but also enables centimeter-level or even subcentimeter-level positioning accuracy [86].
3. Deeply coupled integration mode
The concept of deep coupling was first introduced in 1996 by Gustafson at the Draper Laboratory in the United States, who first used the term “deeply coupled” to describe a combined navigation method employing an extended code-tracking loop [87]. However, as research on deeply and ultratightly coupled systems has progressed, differing interpretations of these two concepts have emerged in both academia and industry. Two main perspectives currently prevail. One holds that ultratightly coupled and deeply coupled are essentially the same, both representing advanced techniques in which deeply integrated INS information aids the tracking loops of GNSS receivers. The other makes a more nuanced distinction, arguing that ultratight coupling is a subset of deep coupling: deeply coupled systems can be divided into scalar and vector variants based on the type of receiver tracking loop used, with ultratight coupling corresponding to the vector variant. In this mode, the INS not only provides attitude and velocity information but also participates directly in the vector tracking loop of the GNSS receiver, achieving a higher level of fusion. Both viewpoints have their proponents, but no industry consensus has yet been reached. Researchers such as Liu [88] adopted the first perspective on deeply coupled GNSS and INS, innovatively introducing real-time carrier motion parameters provided by the IMU into the tracking loop aiding process. This approach successfully addressed the decrease in GNSS carrier phase measurement accuracy in extreme environments. Wu et al. further analyzed and improved the prefiltering technology within deep integration structures and subsequently proposed an adaptive GNSS/INS deep integration algorithm featuring hybrid prefiltering [89].
In addition, GNSS/IMU ultratight integration generally performs well in high-dynamic and harsh environments, as it leverages the complementary strengths of both GNSS and IMU sensors. However, when GNSS signals are severely degraded or obstructed, the system's performance may still suffer despite the assistance of the IMU. To address this, researchers proposed a visually assisted GNSS/IMU ultratight integration navigation method, which incorporates the relative positioning information provided by visual sensors into the conventional ultratight integration system, thereby enhancing robustness and accuracy when GNSS signals are interfered with or occluded [90,91].
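The essence of INS aiding in deeply and ultratightly coupled receivers can be shown in a few lines: the INS-derived receiver velocity is projected onto the satellite line of sight to predict the carrier Doppler that steers the tracking loop. The Python sketch below, with illustrative satellite states, captures only this prediction step; a real implementation operates inside the receiver's correlator and loop-filter chain.

```python
import numpy as np

C = 299_792_458.0        # speed of light, m/s
F_L1 = 1_575.42e6        # GPS L1 carrier frequency, Hz

def predicted_doppler(v_receiver, v_sat, p_receiver, p_sat):
    """Predict carrier Doppler from INS velocity along the line of sight."""
    u_los = (p_sat - p_receiver) / np.linalg.norm(p_sat - p_receiver)
    range_rate = np.dot(v_sat - v_receiver, u_los)   # m/s
    return -range_rate * F_L1 / C                    # Hz

# INS aiding keeps the loop's frequency estimate close to truth during
# high dynamics, so the tracking loop bandwidth can remain narrow.
f_d = predicted_doppler(v_receiver=np.array([30.0, 0.0, 0.0]),
                        v_sat=np.array([0.0, 2600.0, 1200.0]),
                        p_receiver=np.zeros(3),
                        p_sat=np.array([15e6, 10e6, 20e6]))
```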
In short, DC approaches tightly integrate raw sensor measurements at the signal-tracking level within unified processing loops. This enables precise error correction and robust performance in dynamic, high-stress scenarios. However, the increased level of integration brings greater computational demands and complexity, making DC approaches most suitable for applications requiring ultrahigh precision and stability, such as navigation in urban canyons or UAVs operating under dense foliage, where sensor degradation and multisource interference are prevalent.
In summary, LC, TC, DC, and ultratight coupling each represent a different level of information fusion strategy within integrated navigation systems for enhancing positioning accuracy and stability. The choice among these fusion methods must be based on the requirements of the specific application scenario, the environmental conditions, and a balance between system complexity and cost. With continuous advances in sensor technology, improved computational power, and the integration of artificial intelligence, especially the development and application of advanced fusion methods such as deep and ultratight coupling, integrated navigation systems are poised to become more precise and intelligent. This evolution will better meet the complex environmental and high-precision positioning demands of cutting-edge fields such as autonomous driving and unmanned aerial vehicles.

4.2.2. State Estimation Methods in Integrated Navigation

State estimation lies at the core of integrated navigation technology, directly affecting the accuracy and robustness of navigation systems in complex dynamic environments. Figure 12 clearly illustrates the three key technologies in integrated navigation state estimation: filtering techniques, graph optimization methods, and deep learning. As shown in Figure 12, prior to 2018, Kalman filtering [30,92,93,94,95] and its derivative algorithms [96,97,98,99] dominated state estimation and data fusion in navigation. These algorithms, with their rigorous mathematical frameworks and efficient online processing capabilities, provided a solid foundation for high-precision positioning [96,98,99]. However, as environmental complexity increased, data dimensions expanded, and computational capabilities advanced, the limitations of these classical methods in handling nonlinearity, high noise, and model uncertainties have gradually become apparent. Since 2013, researchers have begun exploring optimization-based approaches for integrated navigation to overcome the limitations of traditional filtering algorithms and enhance system robustness and accuracy [100,101,102,103,104]. Around 2020, optimization methods, specifically graph optimization techniques, began to gain significant attention in integrated navigation, further advancing the field’s focus on refined optimization approaches. By 2022, deep learning technologies, with their powerful capabilities in feature extraction, nonlinear modeling, and adaptive learning, effectively addressed challenges faced by traditional methods. In particular, deep learning contributed significantly to the intelligence and accuracy of integrated navigation systems in complex data processing, environmental adaptability, and noise suppression [13]. The deep integration of deep learning with integrated navigation technology opens up vast prospects for future research, heralding a new era of highly intelligent, high-precision navigation.
1. Filtering techniques in integrated navigation
The origins of the Kalman filter can be traced back to 1960, when R. E. Kalman introduced this groundbreaking algorithm [105]. Over time, it has become integral to GNSS/INS integrated systems, where it fuses positioning and navigation data to provide accurate estimates of position, velocity, and attitude through prediction and update steps while enhancing the system's robustness in the absence of GPS signals [106]. To further improve the performance of GNSS/INS, researchers have proposed the EKF, particle filtering, and the unscented Kalman filter (UKF) [107]. The EKF performs a first-order Taylor expansion of nonlinear systems and is suitable for weakly nonlinear systems, but it is less effective for strongly nonlinear ones. Particle filtering and the UKF can handle strongly nonlinear systems more effectively, but they come with higher algorithmic complexity and computational resource demands [108,109]. To address these issues, the paper [110] presents an improved robust strong tracking unscented Kalman filter (RSTUKF), which specifically enhances the filtering capability of the system in the presence of dynamic model errors, reducing the adverse effects of prior information. Research in 2024 has further expanded the application of these techniques in integrated navigation. Xu et al. combined a GRU neural network with adaptive Kalman filtering (AKF) to dynamically adjust filter parameters during GNSS signal outages, significantly improving positioning accuracy [111]. Furthermore, Zhang et al. proposed an improved INS-level fusion algorithm that uses a shared covariance matrix to optimize the fusion of IMU arrays and GNSS data, effectively enhancing the consistency of the filtering results and computational efficiency [112]. These studies not only address the limitations of traditional filtering methods in strongly nonlinear scenarios but also further enhance positioning accuracy and system efficiency through novel algorithm designs, driving the advancement of integrated navigation technology.
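For readers unfamiliar with the prediction/update cycle mentioned above, the minimal linear Kalman filter below shows, in Python, the structure on which the GNSS/INS filters discussed here build: a motion model propagates the state, and a GNSS position measurement corrects it. The constant-velocity model and all noise parameters are illustrative assumptions.

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
H = np.array([[1.0, 0.0]])              # GNSS observes position only
Q = np.diag([1e-4, 1e-3])               # process noise (INS error growth)
R = np.array([[0.25]])                  # GNSS measurement noise variance

def kf_step(x, P, z):
    # Predict: propagate the state with the motion (INS) model.
    x, P = F @ x, F @ P @ F.T + Q
    # Update: correct the prediction with the GNSS position measurement z.
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)
x, P = kf_step(x, P, z=np.array([1.2]))
```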
In a VINS system, the filtering algorithm integrates the continuous, high-frequency data from the IMU, which suffers from cumulative errors, with the intermittent, low-frequency but highly accurate environmental feature information obtained from the visual sensor in order to precisely estimate the system’s state parameters. However, filtering-based systems are limited by the constraints of state variable selection, the complexity of constructing state and observation equations, and insufficient capability to handle nonlinear and noisy data [113]. To overcome these limitations, researchers have integrated IMU biases and camera parameters within the multistate constraint Kalman filter (MSCKF) framework, thereby improving data fusion accuracy [114]. Hesch et al. improved the filter’s performance in handling system rotation dynamics by identifying and addressing unobservable modes in nonlinear systems [115]. Meanwhile, R-VIO [116] and MIMC-VINS [117] enhance the accuracy and robustness of state estimation in complex dynamic environments by jointly processing visual and IMU data.
In the development of GNSS/INS/visual integrated navigation technology [101], the MSF-EKF enhances the flexibility of sensor configuration through modular design but it still has limitations when dealing with highly nonlinear dynamics [118]. The article [119] proposes an adaptive federated filter that fuses GNSS, INS, and visual odometry (VO) data, effectively addressing nonlinear issues in high-dynamic environments. By adaptively adjusting the weights, this method improves the system’s accuracy and robustness in complex settings. However, when the dynamics of the environment become more extreme, traditional filtering methods often encounter convergence issues. Researchers further proposed a GNSS/INS/visual integration method based on the invariant extended Kalman filter (IEKF). This method introduces system dynamics invariants, effectively reducing linearization errors and significantly improving convergence speed and system robustness in high-dynamic environments, demonstrating exceptional performance in complex scenarios [120]. To further improve positioning accuracy and system stability, Liao et al. improved positioning accuracy in GNSS signal-constrained environments by combining EKF with VIO [121]. Gu et al. further improved data fusion accuracy and robustness by introducing cascade Kalman filtering and dynamic object removal algorithms [122]. Additionally, with the increasing demand for high-precision and high-reliability GNSS/INS/visual integrated navigation technologies, the InGVIO algorithm was proposed. By utilizing invariant filtering, it ensured filtering performance under specific motion patterns, thereby enhancing navigation accuracy in complex scenarios [82,123]. In the realm of integrity monitoring, researchers have proposed a solution based on the error state EKF model. This approach not only estimates the system’s fundamental state but also explicitly models the system errors, thereby enhancing fault detection and system response capabilities. By incorporating the error state model, this solution significantly improves the robustness and reliability of the system in complex and dynamic environments.
Furthermore, in the integration of GNSS/INS/visual/LiDAR systems, GNSS data is tightly coupled with the outputs of the INS gyroscope and accelerometer. Filtering techniques like EKF or UKF [124,125,126,127] employ real-time error estimation and compensation mechanisms to synchronize sensor data processing. When GNSS signals are weak or lost, INS can provide inertial navigation data to compensate. Additionally, the fusion of visual and LiDAR data further enhances the system’s robustness and accuracy in complex environments. Researchers optimize the processing of point cloud data from both visual and LiDAR sensors, using the visual sensor for feature extraction and recognition in static environments, while LiDAR provides high-precision three-dimensional spatial information. Through joint filtering, these sensors complement one another, reducing individual error accumulation and thus improving the overall system accuracy and reliability. Particularly in dynamic environments, the high resolution of LiDAR and the rich texture information from visual sensors effectively mitigate positioning errors caused by GNSS signal loss [85].
2. Graph optimization methods in integrated navigation
The graph optimization algorithm estimates state variables using probabilistic graphical models, making it particularly well-suited for handling asynchronous data fusion. By constructing a highly flexible factor graph framework, it enables true “plug-and-play” integration of sensors [128,129,130,131,132]. In GNSS/INS integration, sliding-window and factor graph models are used to reduce computational burdens, while INS redundancy is leveraged to detect and exclude anomalous GNSS data, enhancing the system’s robustness and accuracy [133]. To solve the problem of IMU installation angle estimation, a hybrid factor graph method based on nonlinear optimization has been proposed. By integrating a loosely coupled GNSS/INS algorithm with an optimized objective function, this method effectively reduces errors in the IMU installation angle, significantly improving three-dimensional positioning accuracy [134]. Further research has also led to the optimization of the inertial preintegration model. By introducing a high-precision IMU factor that accounts for Earth’s rotation, combined with factor graph optimization, the performance of the inertial navigation system in attitude estimation has been significantly enhanced [135]. Moreover, in terms of short-term relative accuracy, factor graph optimization has shown improvements in both three-dimensional position and velocity accuracy compared to traditional EKF methods. This approach is particularly well-suited for applications that require short-term high-precision navigation, such as railway inspection and mobile mapping [136].
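The following toy example shows how a factor graph reduces to a sparse nonlinear least-squares problem: absolute (GNSS-like) and relative (INS-like) factors each contribute a noise-weighted residual over a set of poses, and the solver balances all of them jointly. The one-dimensional formulation, factor choices, and values are illustrative assumptions; a production system would use a dedicated incremental solver rather than scipy.

```python
import numpy as np
from scipy.optimize import least_squares

# Factor graph as nonlinear least squares: estimate three 1D poses from
# one absolute (GNSS-like) factor, two relative (INS-like) factors, and
# one longer-range relative constraint. Each residual is divided by the
# assumed standard deviation of its factor, so noisier factors weigh less.

def residuals(x):
    p0, p1, p2 = x
    return np.array([
        (p0 - 0.0) / 0.5,          # GNSS prior on pose 0 (0.5 m std)
        (p1 - p0 - 1.0) / 0.1,     # INS relative motion p0 -> p1 (1.0 m)
        (p2 - p1 - 1.0) / 0.1,     # INS relative motion p1 -> p2 (1.0 m)
        (p2 - p0 - 2.1) / 0.2,     # extra relative constraint p0 -> p2
    ])

sol = least_squares(residuals, x0=np.zeros(3))
print(sol.x)                       # jointly optimized poses
```

Adding a sensor amounts to appending its residual terms to the graph, which is the “plug-and-play” property noted above.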
In VINS, graph optimization and factor graph optimization are also widely applied. By constructing nonlinear optimization problems, these methods effectively address complex nonlinear relationships and cumulative errors, achieving globally optimal state estimation. The incremental smoothing and mapping (iSAM) algorithm [137] uses factor graphs for real-time smoothing and mapping, efficiently processing large-scale data and enhancing the global consistency and accuracy of the VINS system. Building on this, Indelman et al. proposed an incremental smoothing framework that allows dynamic updates of sensor data, improving system flexibility and adaptability to complex environments [132]. RD-VIO further utilizes IMU-PARSAC for robust keypoint detection and matching and addresses the pure rotation problem through delayed triangulation, which effectively enhances navigation performance in dynamic environments [138]. Additionally, DynaVIO employs a dynamic probabilistic propagation model to identify and reject dynamic features, thereby improving the robustness of position estimation [139]. In loop closure detection and global map consistency maintenance, graph optimization also plays a key role. For instance, loop closure processing in the MonoSLAM system [66] inspired subsequent optimization strategies in systems like the ORB-SLAM series [28,31,35], where precise pose graph optimization enables trajectory refinement and global relocalization, enhancing the long-term stability and accuracy of VINS. Furthermore, algorithms such as VINS-Mono [24] and VINS-Fusion combine IMU preintegration techniques with nonlinear optimization of visual data, significantly improving system performance in dynamic and complex environments. The development of these technologies has made VINS systems more accurate and stable in handling large-scale data and navigating complex environments.
In GNSS/INS/visual systems, the GOMSF method proposed by Mascaro et al. enhanced the flexibility and robustness of multisensor data fusion through a sliding window pose graph optimization model [140]. The IC-GVINS algorithm [141] further strengthened the application of INS-based real-time navigation systems in complex environments by efficiently integrating multisensor data through the combination of factor graph optimization and sliding window techniques. Also in 2023, the open-source GICI-LIB library [142] and the Sky-GVINS algorithm [143] improved the performance of navigation systems in dynamic and signal-constrained environments by considering GNSS errors and leveraging visual data. Figure 13 illustrates a typical factor graph optimization framework for GNSS/IMU/visual/LiDAR integration. In this integration, optimization techniques demonstrate unique advantages in addressing complex nonlinear problems and accumulated errors. The NDT SLAM algorithm proposed by Ravi et al. combined graph optimization with deep learning, significantly improving trajectory accuracy and automation levels [144]. Zhou et al. further applied these techniques to UAV platforms, achieving centimeter-level mapping accuracy through precise point cloud matching and structure-from-motion (SfM) strategies [145]. The GVIL algorithm from Wuhan University enhanced navigation performance in complex environments by using a tightly coupled graph optimization approach, overcoming challenges in accuracy and continuity under GNSS-limited conditions and demonstrating the potential for seamless navigation [146].
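A sliding-window variant of the factor graph example above, in the spirit of GOMSF- and IC-GVINS-style systems, can be sketched as follows: only the most recent W poses are optimized, and when the window slides, the oldest pose is summarized as a prior factor rather than discarded. The one-dimensional setup and all values remain illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

W = 5  # window size (number of poses kept in the optimization)

def solve_window(prior_mean, prior_std, rel_meas, abs_meas):
    """Optimize W poses given a prior on the first pose, relative (INS/VO)
    factors between consecutive poses, and sparse absolute (GNSS) fixes."""
    def residuals(x):
        res = [(x[0] - prior_mean) / prior_std]   # marginalization prior
        res += [(x[i + 1] - x[i] - d) / 0.1 for i, d in enumerate(rel_meas)]
        res += [(x[i] - z) / 0.5 for i, z in abs_meas]
        return np.array(res)
    return least_squares(residuals, x0=np.zeros(W)).x

poses = solve_window(prior_mean=0.0, prior_std=0.5,
                     rel_meas=[1.0, 1.0, 1.1, 0.9],    # four relative factors
                     abs_meas=[(0, 0.0), (4, 4.2)])    # GNSS fixes at the ends
# When a new pose arrives, the oldest optimized pose becomes the prior
# for the next window, keeping the cost bounded as the trajectory grows.
```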
3. Deep learning techniques in integrated navigation
The nonlinear modeling capability of neural networks provides new solutions for GNSS/INS integrated navigation. In navigation systems, traditional linear filtering methods (such as the EKF) have limitations when handling strongly nonlinear dynamic relationships. In contrast, neural networks, through deep analysis and adaptive adjustment of multisource data, are capable of dynamically capturing the nonlinear relationship between GNSS and INS data, demonstrating significant potential in feature modeling and error compensation in complex scenarios. Liu et al. explored the application of LSTM in GNSS/INS integration, compensating for INS errors using pseudo-GNSS position information, which significantly improved navigation performance during signal interruptions [147]. Wu et al. proposed a deep-learning-based adaptive error compensation method that dynamically corrects the INS errors that accumulate during GNSS signal interruptions; compared with the traditional extended Kalman filter, navigation accuracy improved by 77.7% [148]. Additionally, researchers have combined deep learning with extended Kalman filtering and AKF, dynamically adjusting filtering parameters during GNSS signal loss and significantly improving system positioning accuracy [149,150]. In recent years, neural networks have been further applied in complex environments such as urban canyons and under multipath interference, incorporating technologies such as CNNs and transformers and driving the wider application of GNSS/INS integrated navigation systems in autonomous driving, UAV navigation, and robotic positioning.
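A minimal sketch of this error-compensation idea, assuming a PyTorch implementation, is shown below: an LSTM is trained while GNSS is available to map a window of IMU measurements to the INS position error, and its predictions then serve as pseudo-corrections during outages. The class name, architecture, and dimensions are illustrative assumptions, not the configurations used in [147,148].

```python
import torch
import torch.nn as nn

# Hypothetical LSTM-based INS error compensator for GNSS outages.
# Training target (not shown): the difference between the INS solution
# and the GNSS reference, collected while GNSS is still available.

class InsErrorLSTM(nn.Module):
    def __init__(self, imu_channels=6, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(imu_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3)    # predicted 3D position error

    def forward(self, imu_window):
        out, _ = self.lstm(imu_window)      # (batch, time, hidden)
        return self.head(out[:, -1])        # error estimate at window end

model = InsErrorLSTM()
imu_window = torch.randn(1, 100, 6)         # 100 samples of accel + gyro
predicted_error = model(imu_window)         # subtracted from the INS output
```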
In the development of VINS, deep learning has gradually become an important tool for enhancing system autonomy, resilience to interference, and navigation accuracy. Long short-term memory (LSTM) networks are used to learn the temporal sequence characteristics of IMU data, improving the state transition model, while CNNs enhance the observation model’s accuracy by extracting multilevel features from visual images. These deep learning modules replace certain traditional filters, reducing errors introduced by simplified mathematical models [151]. Similarly, the DynaNet proposed by Chen et al. utilizes an LSTM to estimate the state transition matrix, combined with a CNN to extract high-level features, thereby constructing Kalman filter equations to improve the accuracy of state estimation [152]. Additionally, DeepVIO combines self-supervised learning with 3D geometric constraints, utilizing 2D optical flow features and IMU data for trajectory estimation without relying on external calibration data. This approach maintains high robustness in the presence of calibration errors and data loss [153]. On the other hand, VIFT employs a transformer-based deep learning model, utilizing an attention mechanism for pose estimation, optimizing the temporal dependencies of latent feature vectors, and performing rotation learning on the SE(3) manifold while addressing data imbalance. This approach is particularly well-suited for monocular visual–inertial navigation systems in autonomous driving and robotics [154]. Deep learning is also applied to the real-time evaluation of the confidence level of observational data: when encountering outlier data or sensor failures, the filter can intelligently perform data selection and weight assignment, mitigating the impact of erroneous data [155].
Deep learning has also demonstrated immense potential in GNSS/INS/visual integrated navigation, capable of handling signal attenuation in complex environments and dynamically adjusting sensor weights. By combining CNNs for visual feature extraction and RNN/LSTM networks for capturing IMU temporal dependencies, this approach is particularly suited to scenarios with weak or interrupted GNSS signals. Although research is still in its early stages, systems like VD-VIO [156] have already shown the promising prospects of deep learning in these areas. In further research, a deep-learning-assisted fusion technique for inertial, visual, and LiDAR data has been introduced, in which a CNN extracts spatial features from the visual and LiDAR data while a GRU captures the temporal dependencies of the IMU data. The method dynamically adjusts the fusion weights of the different data modalities and detects and rejects anomalous sensor data to ensure the reliability of the multimodal fusion process. With advancements in algorithms and hardware, the integration of deep learning and integrated navigation technologies is expected to address challenges such as positioning in extreme environments, rapid adaptation to new environments, and navigation in weak signal conditions, thus driving further development in the field of navigation. This trend signals the arrival of a new era of deep intelligence and high-precision navigation, opening up vast prospects for future research.
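The dynamic weight adjustment described above can be sketched as a small gating network in the spirit of selective sensor fusion [155]: per-modality feature vectors are scored and combined with softmax weights, so that degraded modalities (e.g., blurred images or multipath-corrupted GNSS) are down-weighted before the fused feature reaches a pose regressor. The module below is an illustrative sketch with assumed feature sizes, not the architecture of any cited system.

```python
import torch
import torch.nn as nn

# Hypothetical gating module for dynamic multimodal weighting: scores
# each modality's feature vector and fuses them with softmax weights.

class GatedFusion(nn.Module):
    def __init__(self, feat_dim=128, n_modalities=3):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(feat_dim * n_modalities, 64),
            nn.ReLU(),
            nn.Linear(64, n_modalities),
        )

    def forward(self, feats):                 # feats: (batch, n_mod, feat_dim)
        b, m, d = feats.shape
        w = torch.softmax(self.gate(feats.reshape(b, m * d)), dim=-1)
        return (w.unsqueeze(-1) * feats).sum(dim=1)   # weighted fused feature

fusion = GatedFusion()
feats = torch.randn(2, 3, 128)                # e.g., visual / LiDAR / IMU features
fused = fusion(feats)                         # input to a downstream pose head
```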
In summary, the three key technologies for state estimation in integrated navigation each have their own advantages and applicable scenarios, so a comprehensive consideration of various factors is required when choosing an algorithm. For simple and stable environments with limited computational resources, filtering techniques are more suitable. In cases where data processing, robustness, and fault detection are emphasized and hardware resources are sufficient, graph optimization methods are superior. In complex urban environments, particularly when GNSS signals are frequently interrupted and data accumulation occurs, deep learning techniques demonstrate clear advantages. Table 6 provides a detailed summary of the data fusion algorithms for the integration of GNSS, INS, visual, and LiDAR.
Future research will focus on several key directions. First, improving sensor quality remains fundamental, particularly in enhancing the stability, temperature adaptability, and environmental robustness of inertial sensors (e.g., MEMS inertial sensors). Recent studies have shown that employing noise compensation and temperature compensation techniques can significantly improve the accuracy of MEMS sensors [157,158,159]. Meanwhile, the adaptability of visual sensors in complex lighting conditions and high-speed motion scenarios has been progressively improved through deep learning algorithms, such as CNN-Informer. Second, the trend of hardware integration and miniaturization is driving navigation devices toward compact, low-power, and cost-effective solutions [160,161]. For example, the widespread adoption of wearable devices and embedded navigation modules has opened up new application scenarios for navigation technologies [160]. Third, the optimization of information fusion algorithms remains central to technological advancement. Future efforts will focus on improving the real-time performance of algorithms and enhancing the system’s adaptive adjustment capabilities, thereby increasing the reliability and practicality of navigation systems in dynamic and uncertain environments. The development of hybrid navigation strategies to achieve deep integration and complementarity of the three key technologies will be a priority. For example, flexibly combining filtering and optimization techniques allows GNSS and INS data to be processed efficiently through filtering, while incorporating graph optimization with visual information ensures optimal performance across diverse scenarios. Deep learning enhances robustness in real-time parameter calibration, environmental perception, and system adaptability. Its integration with traditional filtering methods can further improve system accuracy and adaptability. The combination of factor graph optimization and deep learning enhances system adaptability through real-time feature extraction and dynamic weight adjustment. Finally, the emergence of vision–language navigation has injected innovative vitality into navigation systems. Future research could explore the integration of vision–language navigation with multisensor fusion technologies, fully harnessing the potential of visual and language information to further enhance the decision-making efficiency and accuracy of navigation systems in dynamic environments. This trend heralds a new era of deeply intelligent navigation, offering greater potential for the application of navigation technologies in complex environments.

5. Conclusions

With the rapid development of integrated navigation technologies, significant progress has been made in the fusion of multisource sensors. This paper provides a literature review of research on sensor combinations, including GNSS, INS, visual, and LiDAR, spanning from 2000 to 2024. It outlines the current state, research gaps, and future development trends of mainstream integrated navigation systems such as GNSS/INS, INS/visual, GNSS/INS/visual, and GNSS/INS/visual/LiDAR. Despite the remarkable performance of these technologies in improving navigation accuracy and environmental adaptability, several challenges remain in practical applications. For instance, in complex urban environments, the combination of INS and visual proves to be more advantageous, whereas in open environments, the integration of GNSS and INS performs better. While LiDAR excels in high-precision modeling, its data collection reliability under adverse weather conditions still requires further improvement. In the future, optimizing multisensor fusion algorithms, enhancing real-time data processing capabilities, and developing adaptive adjustment strategies will be key to improving system performance. Additionally, the integration of advanced technologies such as deep learning and neural networks will further enhance the intelligence and automation levels of integrated navigation systems. The ideal future integrated navigation system should be capable of flexibly leveraging the advantages of various sensors based on environmental conditions and seamlessly integrating them through efficient data fusion algorithms. This will ensure that the navigation system maintains high accuracy and reliability even in the event of a sensor failure or environmental changes. This developmental trend will drive the advancement of integrated navigation technology toward higher precision and broader environmental adaptability. To achieve this goal, researchers must focus on overcoming key challenges such as enhancing GNSS resistance to interference, improving the INS error model and correction algorithms to suppress drift, optimizing the robustness of visual sensors under complex conditions, and addressing the performance limitations of LiDAR in adverse weather. Additionally, the development of new LiDAR technologies with better penetration capabilities or the design of more efficient environmental-adaptive data processing algorithms will be essential areas of focus for future research.
Looking ahead, the development of integrated navigation systems will increasingly focus on multisensor fusion and intelligent optimization. The introduction of emerging sensor technologies, such as quantum navigation and bionavigation, along with the application of algorithms like deep learning, will drive the widespread adoption of this technology in fields such as autonomous driving, aerospace, and robotics. These advancements will ensure precise navigation in extremely complex environments.

Author Contributions

Conceptualization, J.W.; methodology, J.W.; validation, J.W.; data curation, J.W.; writing and original draft preparation, J.W.; writing, reviewing, and editing, M.S. and Y.Y.; visualization, J.W.; supervision, Y.Y.; funding acquisition, M.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Hubei Strategic Science and Technology Talent Program (No. KJCXRC202400130), the Natural Science Foundation of Hubei Province of China (No. 2023AFB948), the Project of the State Key Laboratory of Geodesy and Earth’s Dynamics (No. S22L6201), and the National Key Research Program (No. 2022YFB3903903).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Adeel, M.; Gong, Z.; Liu, P.L.; Wang, Y.Z.; Chen, X. Research and Performance Analysis of Tightly Coupled Vision, INS and GNSS System for Land Vehicle Applications. In Proceedings of the 30th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+), Portland, OR, USA, 25–29 September 2017; pp. 3321–3330. [Google Scholar]
  2. Chiang, K.W.; Chang, H.W.; Li, Y.H.; Tsai, G.J.; Tseng, C.L.; Tien, Y.C.; Hsu, P.C. Assessment for INS/GNSS/Odometer/Barometer Integration in Loosely-Coupled and Tightly-Coupled Scheme in a GNSS-Degraded Environment. IEEE Sens. J. 2020, 20, 3057–3069. [Google Scholar]
  3. Geng, X.S.; Guo, Y.; Tang, K.H.; Wu, W.Q.; Ren, Y.C. Research on Covert Directional Spoofing Method for INS/GNSS Loosely Integrated Navigation. IEEE Trans. Veh. Technol. 2023, 72, 5654–5663. [Google Scholar]
  4. Cong, L.; Yue, S.; Qin, H.L.; Li, B.; Yao, J.T. Implementation of a MEMS-Based GNSS/INS Integrated Scheme Using Supported Vector Machine for Land Vehicle Navigation. IEEE Sens. J. 2020, 20, 14423–14435. [Google Scholar]
  5. Shang, X.Y.; Sun, F.P.; Liu, B.D.; Zhang, L.D.; Cui, J.Y. GNSS Spoofing Mitigation With a Multicorrelator Estimator in the Tightly Coupled INS/GNSS Integration. IEEE Trans. Instrum. Meas. 2023, 72, 12. [Google Scholar]
  6. Wang, J.; Wang, S.; Zou, D.; Chen, H.; Zhong, R.; Li, H.; Zhou, W.; Yan, K. Social Network and Bibliometric Analysis of Unmanned Aerial Vehicle Remote Sensing Applications from 2010 to 2021. Remote Sens. 2021, 13, 2912. [Google Scholar] [CrossRef]
  7. Huang, Z.; Wu, X.; Wang, H.; Hwang, C.; He, X. Monitoring Inland Water Quantity Variations: A Comprehensive Analysis of Multi-Source Satellite Observation Technology Applications. Remote Sens. 2023, 15, 3945. [Google Scholar] [CrossRef]
  8. Van Eck, N.; Waltman, L. Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics 2010, 84, 523–538. [Google Scholar] [PubMed]
  9. Wei, J.; Wang, F.; Lindell, M.K. The evolution of stakeholders’ perceptions of disaster: A model of information flow. J. Assoc. Inf. Sci. Technol. 2016, 67, 441–453. [Google Scholar]
  10. Chen, C.; Dubin, R.; Kim, M.C. Emerging trends and new developments in regenerative medicine: A scientometric update (2000–2014). Expert Opin. Biol. Ther. 2014, 14, 1295–1317. [Google Scholar]
  11. Schmoch, U. Mean values of skewed distributions in the bibliometric assessment of research units. Scientometrics 2020, 125, 925–935. [Google Scholar]
  12. Chen, C.M. Eugene Garfield’s scholarly impact: A scientometric review. Scientometrics 2018, 114, 489–516. [Google Scholar] [CrossRef]
  13. Yang, L.; Li, Y.; Wu, Y.L.; Rizos, C. An enhanced MEMS-INS/GNSS integrated system with fault detection and exclusion capability for land vehicle navigation in urban areas. GPS Solut. 2014, 18, 593–603. [Google Scholar] [CrossRef]
  14. Bhatti, U.I.; Ochieng, W.Y.; Feng, S.J. Integrity of an integrated GPS/INS system in the presence of slowly growing errors. Part I: A critical review. GPS Solut. 2007, 11, 173–181. [Google Scholar] [CrossRef]
  15. Luo, Q.; Cao, Y.R.; Liu, J.J.; Benslimane, A. Localization and Navigation in Autonomous Driving: Threats and Countermeasures. IEEE Wirel. Commun. 2019, 26, 38–45. [Google Scholar] [CrossRef]
  16. Hollar, S.; Brain, M.; Nayak, A.A.; Stevens, A.; Patil, N.; Mittal, H.; Smith, W.J. A New Low Cost, Efficient, Self-Driving Personal Rapid Transit System. In Proceedings of the 28th IEEE Intelligent Vehicles Symposium (IV), Redondo Beach, CA, USA, 11–14 June 2017; pp. 412–417. [Google Scholar]
  17. Li, S.; Cui, P.Y.; Cui, H.T. Autonomous navigation and guidance for landing on asteroids. Aerosp. Sci. Technol. 2006, 10, 239–247. [Google Scholar] [CrossRef]
  18. Sabatini, R.; Moore, T.; Ramasamy, S. Global navigation satellite systems performance analysis and augmentation strategies in aviation. Prog. Aeosp. Sci. 2017, 95, 45–98. [Google Scholar] [CrossRef]
  19. Walter, T.; Enge, P.; Blanch, J.; Pervan, B. Worldwide Vertical Guidance of Aircraft Based on Modernized GPS and New Integrity Augmentations. Proc. IEEE 2008, 96, 1918–1935. [Google Scholar] [CrossRef]
  20. Chang, L.; Niu, X.J.; Liu, T.Y.; Tang, J.; Qian, C. GNSS/INS/LiDAR-SLAM Integrated Navigation System Based on Graph Optimization. Remote Sens. 2019, 11, 1009. [Google Scholar] [CrossRef]
  21. Yang, Z.; Yu, X.; Dedman, S.; Rosso, M.; Zhu, J.; Yang, J.; Xia, Y.; Tian, Y.; Zhang, G.; Wang, J. UAV remote sensing applications in marine monitoring: Knowledge visualization and review. Sci. Total Environ. 2022, 838, 155939. [Google Scholar]
  22. Noureldin, A.; Karamat, T.B.; Eberts, M.D.; El-Shafie, A. Performance Enhancement of MEMS-Based INS/GPS Integration for Low-Cost Navigation Applications. IEEE Trans. Veh. Technol. 2009, 58, 1077–1096. [Google Scholar] [CrossRef]
  23. Skog, I.; Handel, P. In-Car Positioning and Navigation Technologies—A Survey. IEEE Trans. Intell. Transp. Syst. 2009, 10, 4–21. [Google Scholar] [CrossRef]
  24. Qin, T.; Li, P.; Shen, S. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator. IEEE Trans. Robot. 2018, 34, 1004–1020. [Google Scholar] [CrossRef]
  25. Forster, C.; Carlone, L.; Dellaert, F.; Scaramuzza, D. On-manifold preintegration for real-time visual–inertial odometry. IEEE Trans. Robot. 2017, 33, 1–21. [Google Scholar]
  26. Xu, W.; Cai, Y.X.; He, D.J.; Lin, J.R.; Zhang, F. FAST-LIO2: Fast Direct LiDAR-Inertial Odometry. IEEE Trans. Robot. 2022, 38, 2053–2073. [Google Scholar] [CrossRef]
  27. Geiger, A.; Lenz, P.; Stiller, C.; Urtasun, R. Vision meets robotics: The kitti dataset. Int. J. Robot. Res. 2013, 32, 1231–1237. [Google Scholar] [CrossRef]
  28. Campos, C.; Elvira, R.; Rodriguez, J.J.G.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual–Inertial, and Multimap SLAM. IEEE Trans. Robot. 2021, 37, 1874–1890. [Google Scholar] [CrossRef]
  29. Paull, L.; Saeedi, S.; Seto, M.; Li, H. AUV Navigation and Localization: A Review. IEEE J. Ocean. Eng. 2014, 39, 131–149. [Google Scholar] [CrossRef]
  30. Li, M.; Mourikis, A.I. High-precision, consistent EKF-based visual-inertial odometry. Int. J. Robot. Res. 2013, 32, 690–711. [Google Scholar]
  31. Mur-Artal, R.; Tardós, J.D. Visual-Inertial Monocular SLAM With Map Reuse. IEEE Robot. Autom. Lett. 2017, 2, 796–803. [Google Scholar] [CrossRef]
  32. Chen, C. CiteSpace II: Detecting and visualizing emerging trends and transient patterns in scientific literature. J. Am. Soc. Inf. Sci. Technol. 2006, 57, 359–377. [Google Scholar]
  33. Groves, P.D. Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems, 2nd ed.; Artech: Boston, MA, USA, 2013; pp. 1–776. [Google Scholar]
  34. Leutenegger, S.; Lynen, S.; Bosse, M.; Siegwart, R.; Furgale, P. Keyframe-based visual–inertial odometry using nonlinear optimization. Int. J. Robot. Res. 2015, 34, 314–334. [Google Scholar] [CrossRef]
  35. Mur-Artal, R.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. 2015, 31, 1147–1163. [Google Scholar] [CrossRef]
  36. Mur-Artal, R.; Tardós, J.D. ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras. IEEE Trans. Robot. 2017, 33, 1255–1262. [Google Scholar] [CrossRef]
  37. Jazwinski, A.H. Stochastic Processes and Filtering Theory; Courier Corporation: Chelmsford, MA, USA, 2007. [Google Scholar]
  38. Sinpyo, H.; Man Hyung, L.; Ho-Hwan, C.; Sun-Hong, K.; Speyer, J.L. Observability of error States in GPS/INS integration. IEEE Trans. Veh. Technol. 2005, 54, 731–743. [Google Scholar] [CrossRef]
  39. Groves, P.D. Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems; Artech: Boston, MA, USA, 2008; pp. 1–518. [Google Scholar]
  40. Farrell, J. Aided Navigation: GPS with High Rate Sensors; McGraw-Hill, Inc.: New York, NY, USA, 2008. [Google Scholar]
  41. Georgy, J.; Noureldin, A.; Korenberg, M.J.; Bayoumi, M.M. Low-Cost Three-Dimensional Navigation Solution for RISS/GPS Integration Using Mixture Particle Filter. IEEE Trans. Veh. Technol. 2010, 59, 599–615. [Google Scholar] [CrossRef]
  42. Kaplan, E.D.; Hegarty, C. Understanding GPS/GNSS: Principles and Applications; Artech house: Boston, MA, USA, 2017. [Google Scholar]
  43. Martinelli, A. Vision and IMU Data Fusion: Closed-Form Solutions for Attitude, Speed, Absolute Scale, and Bias Determination. IEEE Trans. Robot. 2012, 28, 44–60. [Google Scholar] [CrossRef]
  44. Zhang, T.; Xu, X. A new method of seamless land navigation for GPS/INS integrated system. Measurement 2012, 45, 691–701. [Google Scholar] [CrossRef]
  45. Chen, X.; Shen, C.; Zhang, W.-b.; Tomizuka, M.; Xu, Y.; Chiu, K. Novel hybrid of strong tracking Kalman filter and wavelet neural network for GPS/INS during GPS outages. Measurement 2013, 46, 3847–3854. [Google Scholar] [CrossRef]
  46. Chowdhary, G.; Johnson, E.N.; Magree, D.; Wu, A.; Shein, A. GPS-denied indoor and outdoor monocular vision aided navigation and control of unmanned aircraft. J. Field Robot. 2013, 30, 415–438. [Google Scholar] [CrossRef]
  47. Engel, J.; Schöps, T.; Cremers, D. LSD-SLAM: Large-Scale Direct Monocular SLAM. In Proceedings of the Computer Vision—ECCV 2014, Cham, Switzerland, 6–12 September 2014; pp. 834–849. [Google Scholar]
  48. Forster, C.; Pizzoli, M.; Scaramuzza, D. SVO: Fast semi-direct monocular visual odometry. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 15–22. [Google Scholar]
  49. Yang, Z.; Shen, S. Monocular Visual–Inertial State Estimation With Online Initialization and Camera–IMU Extrinsic Calibration. IEEE Trans. Autom. Sci. Eng. 2017, 14, 39–51. [Google Scholar] [CrossRef]
  50. Falco, G.; Pini, M.; Marucco, G. Loose and Tight GNSS/INS Integrations: Comparison of Performance Assessed in Real Urban Scenarios. Sensors 2017, 17, 255. [Google Scholar] [CrossRef] [PubMed]
  51. Usenko, V.; Engel, J.; Stückler, J.; Cremers, D. Direct visual-inertial odometry with stereo cameras. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1885–1892. [Google Scholar]
  52. Sun, K.; Mohta, K.; Pfrommer, B.; Watterson, M.; Liu, S.; Mulgaonkar, Y.; Taylor, C.J.; Kumar, V. Robust Stereo Visual Inertial Odometry for Fast Autonomous Flight. IEEE Robot. Autom. Lett. 2018, 3, 965–972. [Google Scholar] [CrossRef]
  53. Ning, J.; Yao, Y.; Zhang, X. Review of the Development of Global Satellite Navigation System. J. Navig. Position. 2013, 1, 3–8. [Google Scholar] [CrossRef]
  54. Liu, Z.; Zhou, Q.; Qin, Y.; El-Sheimy, N. Vision-Aided Inertial Navigation System with Point and Vertical Line Observations for Land Vehicle Applications. In China Satellite Navigation Conference (CSNC) 2017 Proceedings: Volume II; Lecture Notes in Electrical Engineering; Springer: Berlin/Heidelberg, Germany, 2017; pp. 445–457. [Google Scholar]
  55. Boguspayev, N.; Akhmedov, D.; Raskaliyev, A.; Kim, A.; Sukhenko, A. A comprehensive review of GNSS/INS integration techniques for land and air vehicle applications. Appl. Sci. 2023, 13, 4819. [Google Scholar] [CrossRef]
  56. El-Sheimy, N.; Youssef, A. Inertial sensors technologies for navigation applications: State of the art and future trends. Satell. Navig. 2020, 1, 1–21. [Google Scholar] [CrossRef]
  57. Bird, J.; Arden, D. Indoor navigation with foot-mounted strapdown inertial navigation and magnetic sensors [emerging opportunities for localization and tracking]. IEEE Wirel. Commun. 2011, 18, 28–35. [Google Scholar] [CrossRef]
  58. Vaduvescu, V.A.; Negrea, P. Inertial Measurement Unit—A Short Overview of the Evolving Trend for Miniaturization and Hardware Structures. In Proceedings of the 2021 International Conference on Applied and Theoretical Electricity (ICATE), Craiova, Romania, 27–29 May 2021. [Google Scholar]
  59. Dutta, I.; Savoie, D.; Fang, B.; Venon, B.; Alzar, C.L.G.; Geiger, R.; Landragin, A. Continuous Cold-Atom Inertial Sensor with 1 nrad/sec Rotation Stability. Phys. Rev. Lett. 2016, 116, 183003. [Google Scholar] [CrossRef]
  60. Guessoum, M.; Gautier, R.; Bouton, Q.; Sidorenkov, L.; Landragin, A.; Geiger, R. High Stability Two Axis Cold-Atom Gyroscope. In Proceedings of the 2022 9th IEEE International Symposium on Inertial Sensors and Systems (IEEE INERTIAL 2022), Avignon, France, 8–11 May 2022. [Google Scholar]
  61. Zhang, L.; Gao, W.; Li, Q.; Li, R.B.; Yao, Z.W.; Lu, S.B. A Novel Monitoring Navigation Method for Cold Atom Interference Gyroscope. Sensors 2019, 19, 222. [Google Scholar] [CrossRef]
  62. Palacios-Laloy, A.; Rutkowski, J.; Troadec, Y.; Léger, J.-M. On the Critical Impact of HF Power Drifts for Miniature Helium-Based NMR Gyroscopes. IEEE Sens. J. 2016, 17, 657–659. [Google Scholar] [CrossRef]
  63. Tan, S.X.; Stoker, J.; Greenlee, S. Detection of foliage-obscured vehicle using a multiwavelength polarimetric lidar. In Proceedings of the IGARSS: 2007 IEEE International Geoscience and Remote Sensing Symposium, VOLS 1–12: Sensing and Understanding Our Planet, Barcelona, Spain, 23–28 July 2007; pp. 2503–2506. [Google Scholar]
  64. Zhao, H.; Hua, D.X.; Mao, J.D.; Zhou, C.Y. Correction to near-range multiwavelength lidar optical parameter based on the measurements of particle size distribution. Acta Phys. Sin. 2015, 64, 124208. [Google Scholar] [CrossRef]
  65. Xu, Z.; Yan, Z.; Li, X.; Shen, Z.; Zhou, Y.; Wu, Z.; Li, X. Review of high- precision multi-sensor integrated positioning towards intelligent driving. Position Navig. Timing 2023, 10, 1–20. [Google Scholar] [CrossRef]
  66. Davison, A.J.; Reid, I.D.; Molton, N.D.; Stasse, O. MonoSLAM: Real-Time Single Camera SLAM. IEEE Trans. Pattern Anal. Mach. Intell. 2007, 29, 1052–1067. [Google Scholar] [CrossRef] [PubMed]
  67. Klein, G.; Murray, D. Parallel Tracking and Mapping for Small AR Workspaces. In Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, 13–16 November 2007; pp. 225–234. [Google Scholar]
  68. Ng, Z.Y. Indoor-Positioning for Warehouse Mobile Robots Using Computer Vision; UTAR: Kampar, Malaysia, 2021. [Google Scholar]
  69. Kim, T.-L.; Arshad, S.; Park, T.-H. Adaptive Feature Attention Module for Robust Visual–LiDAR Fusion-Based Object Detection in Adverse Weather Conditions. Remote Sens. 2023, 15, 3992. [Google Scholar] [CrossRef]
  70. Gallego, G.; Delbrück, T.; Orchard, G.; Bartolozzi, C.; Taba, B.; Censi, A.; Leutenegger, S.; Davison, A.J.; Conradt, J.; Daniilidis, K.; et al. Event-Based Vision: A Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 154–180. [Google Scholar] [CrossRef]
  71. Wang, X.; Yang, D.; Wang, Z.; Kwan, H.; Chen, J.; Wu, W.; Li, H.; Liao, Y.; Liu, S. Towards realistic UAV vision-language navigation: Platform, benchmark, and methodology. arXiv 2024, arXiv:2410.07087. [Google Scholar]
  72. Liu, S.; Zhang, H.; Qi, Y.; Wang, P.; Zhang, Y.; Wu, Q. AerialVLN: Vision-and-language navigation for UAVs. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Paris, France, 2–6 October 2023; pp. 15384–15394. [Google Scholar]
  73. Lee, J.; Miyanishi, T.; Kurita, S.; Sakamoto, K.; Azuma, D.; Matsuo, Y.; Inoue, N. CityNav: Language-Goal Aerial Navigation Dataset with Geographic Information. arXiv 2024, arXiv:2406.14240. [Google Scholar]
  74. Li, T.; Zhang, H.P.; Gao, Z.Z.; Niu, X.J.; El-sheimy, N. Tight Fusion of a Monocular Camera, MEMS-IMU, and Single-Frequency Multi-GNSS RTK for Precise Navigation in GNSS-Challenged Environments. Remote Sens. 2019, 11, 24. [Google Scholar] [CrossRef]
  75. Carlson, N.A. Federated Square Root Filter for Decentralized Parallel Processes. IEEE Trans. Aerosp. Electron. Syst. 1990, 26, 517–525. [Google Scholar] [CrossRef]
  76. Schütz, A.; Sánchez-Morales, D.E.; Pany, T. Precise Positioning Through a Loosely-Coupled Sensor Fusion of Gnss-Rtk, INS and LiDAR for Autonomous Driving. In Proceedings of the 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS), Portland, OR, USA, 20–23 April 2020; pp. 219–225. [Google Scholar]
  77. Li, T.; Pei, L.; Xiang, Y.; Wu, Q.; Xia, S.; Tao, L.; Guan, X.; Yu, W. P 3-LOAM: PPP/LiDAR loosely coupled SLAM with accurate covariance estimation and robust RAIM in urban canyon environment. IEEE Sens. J. 2020, 21, 6660–6671. [Google Scholar]
  78. Gao, Z.; Zhang, H.; Ge, M.; Niu, X.; Shen, W.; Wickert, J.; Schuh, H. Tightly coupled integration of multi-GNSS PPP and MEMS inertial measurement unit data. GPS Solut. 2016, 21, 377–391. [Google Scholar] [CrossRef]
  79. Wang, D.; Dong, Y.; Li, Z.; Li, Q.; Wu, J. Constrained MEMS-Based GNSS/INS Tightly Coupled System With Robust Kalman Filter for Accurate Land Vehicular Navigation. IEEE Trans. Instrum. Meas. 2020, 69, 5138–5148. [Google Scholar] [CrossRef]
  80. Chen, K.; Chang, G.B.; Chen, C. GINav: A MATLAB-based software for the data processing and analysis of a GNSS/INS integrated navigation system. GPS Solut. 2021, 25, 7. [Google Scholar] [CrossRef]
  81. Luo, N.; Hu, Z.; Ding, Y.; Li, J.; Zhao, H.; Liu, G.; Wang, Q. DFF-VIO: A General Dynamic Feature Fused Monocular Visual-Inertial Odometry. IEEE Trans. Circuits Syst. Video Technol. 2024, 35, 1758–1773. [Google Scholar] [CrossRef]
  82. Liu, C.; Jiang, C.; Wang, H. InGVIO: A Consistent Invariant Filter for Fast and High-Accuracy GNSS-Visual-Inertial Odometry. IEEE Robot. Autom. Lett. 2023, 8, 1850–1857. [Google Scholar] [CrossRef]
  83. Li, X.X.; Li, S.Y.; Zhou, Y.X.; Shen, Z.H.; Wang, X.B.; Li, X.; Wen, W.S. Continuous and Precise Positioning in Urban Environments by Tightly Coupled Integration of GNSS, INS and Vision. IEEE Robot. Autom. Lett. 2022, 7, 11458–11465. [Google Scholar] [CrossRef]
  84. Jiang, H.; Yan, D.; Wang, J.; Yin, J. Innovation-based Kalman filter fault detection and exclusion method against all-source faults for tightly coupled GNSS/INS/Vision integration. GPS Solut. 2024, 28, 108. [Google Scholar] [CrossRef]
  85. Li, S.Y.; Li, X.X.; Wang, H.D.; Zhou, Y.X.; Shen, Z.H. Multi-GNSS PPP/INS/Vision/LiDAR tightly integrated system for precise navigation in urban environments. Inf. Fusion 2023, 90, 218–232. [Google Scholar] [CrossRef]
  86. Li, X.X.; Wang, X.B.; Liao, J.C.; Li, X.; Li, S.Y.; Lyu, H.B. Semi-tightly coupled integration of multi-GNSS PPP and S-VINS for precise positioning in GNSS-challenged environments. Satell. Navig. 2021, 2, 14. [Google Scholar] [CrossRef]
  87. Liu, C.L.; Wang, C.; Wang, J. A Bandwidth Adaptive Pseudo-Code Tracking Loop Design for BD/INS Integrated Navigation. In Proceedings of the 2nd International Conference on Control Science and Systems Engineering (ICCSSE), Singapore, 27–29 July 2016; pp. 46–49. [Google Scholar]
  88. Liu, H.R.; Zhang, T.S.; Zhang, P.H.; Qi, F.R.; Li, Z. Accuracy Analysis of GNSS/INS Deeply-Coupled Receiver for Strong Earthquake Motion. In Proceedings of the 8th China Satellite Navigation Conference (CSNC), Shanghai, China, 23–25 May 2017; pp. 339–349. [Google Scholar]
  89. Wu, M.Y.; Ding, J.C.; Zhao, L.; Kang, Y.Y.; Luo, Z.B. An adaptive deep-coupled GNSS/INS navigation system with hybrid pre-filter processing. Meas. Sci. Technol. 2018, 29, 14. [Google Scholar] [CrossRef]
  90. Ruotsalainen, L.; Kirkko-Jaakkola, M.; Bhuiyan, M.; Söderholm, S.; Thombre, S.; Kuusniemi, H. Deeply coupled GNSS, INS and visual sensor integration for interference mitigation. In Proceedings of the 27th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2014), Tampa, FL, USA, 8–12 September 2014; pp. 2243–2249. [Google Scholar]
  91. Zuo, Z.; Yang, B.; Li, Z.; Zhang, T. A GNSS/IMU/Vision Ultra-Tightly Integrated Navigation System for Low Altitude Aircraft. IEEE Sens. J. 2022, 22, 11857–11864. [Google Scholar] [CrossRef]
  92. Indelman, V.; Gurfil, P.; Rivlin, E.; Rotstein, H. Real-time vision-aided localization and navigation based on three-view geometry. IEEE Trans. Aerosp. Electron. Syst. 2012, 48, 2239–2259. [Google Scholar]
  93. Bloesch, M.; Omari, S.; Hutter, M.; Siegwart, R. Robust visual inertial odometry using a direct EKF-based approach. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 298–304. [Google Scholar]
  94. Castellanos, J.A.; Martinez-Cantin, R.; Tardós, J.D.; Neira, J. Robocentric map joining: Improving the consistency of EKF-SLAM. Robot. Auton. Syst. 2007, 55, 21–29. [Google Scholar]
  95. Mourikis, A.I.; Roumeliotis, S.I. A multi-state constraint Kalman filter for vision-aided inertial navigation. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 3565–3572. [Google Scholar]
  96. Heo, S.; Jung, J.H.; Park, C.G. Consistent EKF-Based Visual-Inertial Navigation Using Points and Lines. IEEE Sens. J. 2018, 18, 7638–7649. [Google Scholar] [CrossRef]
  97. Li, X.H.; Jiang, H.D.; Chen, X.Y.; Kong, H.; Wu, J.F. Closed-Form Error Propagation on SEn(3) Group for Invariant EKF With Applications to VINS. IEEE Robot. Autom. Lett. 2022, 7, 10705–10712. [Google Scholar] [CrossRef]
  98. Weiss, S.; Scaramuzza, D.; Siegwart, R. Monocular-SLAM–based navigation for autonomous micro helicopters in GPS-denied environments. J. Field Robot. 2011, 28, 854–874. [Google Scholar] [CrossRef]
  99. Novara, C.; Ruiz, F.; Milanese, M. Direct Filtering: A New Approach to Optimal Filter Design for Nonlinear Systems. IEEE Trans. Autom. Control 2013, 58, 86–99. [Google Scholar] [CrossRef]
  100. Xin, S.; Wang, X.; Zhang, J.; Zhou, K.; Chen, Y. A Comparative Study of Factor Graph Optimization-Based and Extended Kalman Filter-Based PPP-B2b/INS Integrated Navigation. Remote Sens. 2023, 15, 5144. [Google Scholar] [CrossRef]
  101. Li, X.; Zhang, X.; Niu, X.; Wang, J.; Pei, L.; Yu, F.; Zhang, H.; Yang, C.; Gao, Z.; Zhang, Q.; et al. Progress and Achievements of Multi-sensor Fusion Navigation in China during 2019–2023. J. Geod. Geoinf. Sci. 2023, 6, 102–114. [Google Scholar]
  102. Zhu, F.; Xu, Z.; Zhang, X.; Zhang, Y.; Chen, W.; Zhang, X. On State Estimation in Multi-Sensor Fusion Navigation: Optimization and Filtering. arXiv 2024, arXiv:2401.05836. [Google Scholar]
  103. Kim, J.; Sukkarieh, S. 6DoF SLAM aided GNSS/INS navigation in GNSS denied and unknown environments. J. Glob. Position. Syst. 2005, 4, 120–128. [Google Scholar]
  104. Chu, T.; Guo, N.; Backén, S.; Akos, D. Monocular camera/IMU/GNSS integration for ground vehicle navigation in challenging GNSS environments. Sensors 2012, 12, 3162–3185. [Google Scholar] [CrossRef] [PubMed]
  105. Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. Trans. ASME Ser. D 1960, 82, 35–45. [Google Scholar] [CrossRef]
  106. Lv, W.F. Kalman Filtering Algorithm for Integrated Navigation System in Unmanned Aerial Vehicle. In Proceedings of the 5th Annual International Conference on Information System and Artificial Intelligence (ISAI), Zhejiang, China, 22–23 May 2020. [Google Scholar]
  107. Zhao, L.; Wang, X.; Ding, J.; Cao, W. Overview of nonlinear filter methods applied in integrated navigation system. J. Chin. Inert. Technol. 2009, 17, 46–52. [Google Scholar]
  108. Zhang, W.; Sun, R. Research on performance comparison of EKF and UKF and their application. J. Nanjing Univ. Sci. Technol. 2015, 39, 614–618. [Google Scholar]
  109. Shen, Z.; Yu, W.; Fang, J. Nonlinear algorithm based on UKF for lowcost SINS /GPS integrated navigation system. Syst. Eng. Electron. 2007, 29, 408–411. [Google Scholar]
  110. Hu, G.; Wang, W.; Zhong, Y.; Gao, B.; Gu, C. A new direct filtering approach to INS/GNSS integration. Aerosp. Sci. Technol. 2018, 77, 755–764. [Google Scholar] [CrossRef]
  111. Xu, C.; Chen, S.; Hou, Z. A hybrid information fusion method for SINS/GNSS integrated navigation system utilizing GRU-aided AKF during GNSS outages. Meas. Sci. Technol. 2024, 35, 106311. [Google Scholar] [CrossRef]
  112. Zhang, T.; Yuan, M.; Wang, L.; Tang, H.; Niu, X. A Robust and Efficient IMU Array/GNSS Data Fusion Algorithm. IEEE Sens. J. 2024, 24, 26278–26289. [Google Scholar] [CrossRef]
  113. Zhang, L.L.; Qu, H.; Mao, J.; Hu, X.P. A Survey of Intelligence Science and Technology Integrated Navigation Technology. Navig. Position. Timing 2020, 7, 50–63. [Google Scholar] [CrossRef]
  114. Fornasier, A.; van Goor, P.; Allak, E.; Mahony, R.; Weiss, S. MSCEqF: A Multi State Constraint Equivariant Filter for Vision-Aided Inertial Navigation. IEEE Robot. Autom. Lett. 2024, 9, 731–738. [Google Scholar] [CrossRef]
  115. Hesch, J.A.; Kottas, D.G.; Bowman, S.L.; Roumeliotis, S.I. Camera-IMU-based localization: Observability analysis and consistency improvement. Int. J. Robot. Res. 2013, 33, 182–201. [Google Scholar] [CrossRef]
  116. Huai, Z.; Huang, G. Robocentric visual–inertial odometry. Int. J. Robot. Res. 2019, 41, 667–689. [Google Scholar] [CrossRef]
  117. Eckenhoff, K.; Geneva, P.; Huang, G. MIMC-VINS: A Versatile and Resilient Multi-IMU Multi-Camera Visual-Inertial Navigation System. IEEE Trans. Robot. 2021, 37, 1360–1380. [Google Scholar] [CrossRef]
  118. Lynen, S.; Achtelik, M.W.; Weiss, S.; Chli, M.; Siegwart, R. A Robust and Modular Multi-Sensor Fusion Approach Applied to MAV Navigation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan, 3–8 November 2013; pp. 3923–3929. [Google Scholar]
  119. Yue, Z.; Lian, B.; Tang, C.; Tong, K. A novel adaptive federated filter for GNSS/INS/VO integrated navigation system. Meas. Sci. Technol. 2020, 31, 85102. [Google Scholar] [CrossRef]
  120. Xia, C.; Li, X.; Li, S.; Zhou, Y. Invariant-EKF-Based GNSS/INS/Vision Integration with High Convergence and Accuracy. IEEE/ASME Trans. Mechatron. 2024, 29, 1–12. [Google Scholar] [CrossRef]
  121. Liao, J.; Li, X.; Wang, X.; Li, S.; Wang, H. Enhancing navigation performance through visual-inertial odometry in GNSS-degraded environment. GPS Solut. 2021, 25, 50. [Google Scholar] [CrossRef]
  122. Gu, S.; Dai, C.; Mao, F.; Fang, W. Integration of Multi-GNSS PPP-RTK/INS/Vision with a Cascading Kalman Filter for Vehicle Navigation in Urban Areas. Remote Sens. 2022, 14, 4337. [Google Scholar] [CrossRef]
  123. Li, K.; Li, J.; Wang, A. Discussion on development of GNSS/INS/Visual integrated navigation technology and data fusion. J. Navig. Position. 2023, 11, 9–15. [Google Scholar] [CrossRef]
  124. Hu, J.-S.; Chen, M.-Y. A sliding-window visual-IMU odometer based on tri-focal tensor geometry. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 3963–3968. [Google Scholar]
  125. Xian, Z.; Hu, X.; Lian, J. Fusing stereo camera and low-cost inertial measurement unit for autonomous navigation in a tightly-coupled approach. J. Navig. 2015, 68, 434–452. [Google Scholar] [CrossRef]
  126. Kong, X.; Wu, W.; Zhang, L.; Wang, Y. Tightly-coupled stereo visual-inertial navigation using point and line features. Sensors 2015, 15, 12816–12833. [Google Scholar] [CrossRef]
  127. Huang, G.P.; Mourikis, A.I.; Roumeliotis, S.I. A quadratic-complexity observability-constrained unscented Kalman filter for SLAM. IEEE Trans. Robot. 2013, 29, 1226–1243. [Google Scholar]
  128. Xu, J.; Yang, G.; Sun, Y.; Picek, S. A Multi-Sensor Information Fusion Method Based on Factor Graph for Integrated Navigation System. IEEE Access 2021, 9, 12044–12054. [Google Scholar] [CrossRef]
  129. Taghizadeh, S.; Nezhadshahbodaghi, M.; Safabakhsh, R.; Mosavi, M.R. A low-cost integrated navigation system based on factor graph nonlinear optimization for autonomous flight. GPS Solut. 2022, 26, 78. [Google Scholar]
  130. Wen, W.; Pfeifer, T.; Bai, X.; Hsu, L.-T. Factor graph optimization for GNSS/INS integration: A comparison with the extended Kalman filter. Navig. J. Inst. Navig. 2021, 68, 315–331. [Google Scholar]
  131. Chiu, H.-P.; Zhou, X.S.; Carlone, L.; Dellaert, F.; Samarasekera, S.; Kumar, R. Constrained optimal selection for multi-sensor robot navigation using plug-and-play factor graphs. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 663–670. [Google Scholar]
  132. Indelman, V.; Williams, S.; Kaess, M.; Dellaert, F. Information fusion in navigation systems via factor graph based incremental smoothing. Robot. Auton. Syst. 2013, 61, 721–738. [Google Scholar]
  133. Li, W.; Cui, X.W.; Lu, M.Q. A Robust Graph Optimization Realization of Tightly Coupled GNSS/INS Integrated Navigation System for Urban Vehicles. Tsinghua Sci. Technol. 2018, 23, 724–732. [Google Scholar] [CrossRef]
  134. Lin, F.; Wang, S.; Chen, Y.; Zou, M.; Peng, H.; Liu, Y. Vehicle integrated navigation IMU mounting angles estimation method based on nonlinear optimization. Meas. Sci. Technol. 2023, 35, 036304. [Google Scholar] [CrossRef]
  135. Zhang, L.; Wen, W.; Hsu, L.-T.; Zhang, T. An improved inertial preintegration model in factor graph optimization for high accuracy positioning of intelligent vehicles. IEEE Trans. Intell. Veh. 2023, 9, 1641–1653. [Google Scholar]
  136. Li, T.; Zhang, H.; Han, B.; Xia, M.; Shi, C. Relative Accuracy of GNSS/INS Integration Based on Factor Graph Optimization. IEEE Sens. J. 2024, 24, 33182–33194. [Google Scholar] [CrossRef]
  137. Walcott-Bryant, A.; Kaess, M.; Johannsson, H.; Leonard, J.J. Dynamic Pose Graph SLAM: Long-term Mapping in Low Dynamic Environments. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vilamoura-Algarve, Portugal, 7–12 October 2012; pp. 1871–1878. [Google Scholar]
  138. Li, J.; Pan, X.; Huang, G.; Zhang, Z.; Wang, N.; Bao, H.; Zhang, G. RD-VIO: Robust Visual-Inertial Odometry for Mobile Augmented Reality in Dynamic Environments. IEEE Trans. Vis. Comput. Graph. 2024, 30, 6941–6955. [Google Scholar] [CrossRef]
  139. Zheng, F.; Lin, W.; Sun, L. DynaVIO: Real-Time Visual-Inertial Odometry with Instance Segmentation in Dynamic Environments. In Proceedings of the 2024 4th International Conference on Computer, Control and Robotics (ICCCR), Shanghai, China, 19–21 April 2024; pp. 21–25. [Google Scholar]
  140. Mascaro, R.; Teixeira, L.; Hinzmann, T.; Siegwart, R.; Chli, M. GOMSF: Graph-Optimization Based Multi-Sensor Fusion for robust UAV Pose estimation. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 1421–1428. [Google Scholar]
  141. Niu, X.; Tang, H.; Zhang, T.; Fan, J.; Liu, J. IC-GVINS: A Robust, Real-Time, INS-Centric GNSS-Visual-Inertial Navigation System. IEEE Robot. Autom. Lett. 2023, 8, 216–223. [Google Scholar] [CrossRef]
  142. Chi, C.; Zhang, X.; Liu, J.; Sun, Y.; Zhang, Z.; Zhan, X. GICI-LIB: A GNSS/INS/Camera Integrated Navigation Library. IEEE Robot. Autom. Lett. 2023, 8, 7977. [Google Scholar] [CrossRef]
  143. Yin, J.; Li, T.; Yin, H.; Yu, W.X.; Zou, D.P. Sky-GVINS: A sky-segmentation aided GNSS-Visual-Inertial system for robust navigation in urban canyons. Geo-Spat. Inf. Sci. 2023, 11, 2257–2267. [Google Scholar] [CrossRef]
  144. Ravi, R.; Lin, Y.-J.; Elbahnasawy, M.; Shamseldin, T.; Habib, A. Simultaneous System Calibration of a Multi-LiDAR Multicamera Mobile Mapping Platform. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 1694–1714. [Google Scholar] [CrossRef]
  145. Zhou, T.; Hasheminasab, S.M.; Habib, A. Tightly-coupled camera/LiDAR integration for point cloud generation from GNSS/INS-assisted UAV mapping systems. ISPRS J. Photogramm. Remote Sens. 2021, 180, 336–356. [Google Scholar] [CrossRef]
  146. Liao, J.; Li, X.; Feng, S. GVIL: Tightly-Coupled GNSS PPP/Visual/INS/LiDAR SLAM Based on Graph Optimization. Geomat. Inf. Sci. Wuhan Univ. 2023, 48, 1204–1215. [Google Scholar] [CrossRef]
  147. Liu, F.; Zhao, H.; Chen, W. A Hybrid Algorithm of LSTM and Factor Graph for Improving Combined GNSS/INS Positioning Accuracy during GNSS Interruptions. Sensors 2024, 24, 5605. [Google Scholar] [CrossRef]
  148. Wu, F.; Luo, H.; Zhao, F.; Wei, L.; Zhou, B. Optimizing GNSS/INS Integrated Navigation: A Deep Learning Approach for Error Compensation. IEEE Signal Process. Lett. 2024, 31, 3104–3108. [Google Scholar] [CrossRef]
  149. Meng, X.; Tan, H.; Yan, P.; Zheng, Q.; Chen, G.; Jiang, J. A GNSS/INS Integrated Navigation Compensation Method Based on CNN–GRU + IRAKF Hybrid Model During GNSS Outages. IEEE Trans. Instrum. Meas. 2024, 73, 1–15. [Google Scholar] [CrossRef]
  150. Liu, J.; Guo, G. Vehicle Localization During GPS Outages with Extended Kalman Filter and Deep Learning. IEEE Trans. Instrum. Meas. 2021, 70, 1–10. [Google Scholar] [CrossRef]
  151. Clark, R.; Wang, S.; Wen, H.; Markham, A.; Trigoni, N. VINet: Visual-Inertial Odometry as a Sequence-to-Sequence Learning Problem. In Proceedings of the 31st AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; pp. 3995–4001. [Google Scholar]
  152. Chen, C.H.; Lu, C.X.; Wang, B.; Trigoni, N.; Markham, A. DynaNet: Neural Kalman Dynamical Model for Motion Estimation and Prediction. IEEE Trans. Neural Netw. Learn. Syst. 2021, 32, 5479–5491. [Google Scholar] [CrossRef]
  153. Han, L.; Lin, Y.; Du, G.; Lian, S. DeepVIO: Self-supervised Deep Learning of Monocular Visual Inertial Odometry using 3D Geometric Constraints. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 6906–6913. [Google Scholar]
  154. Kurt, Y.B.; Akman, A.; Alatan, A.A. Causal Transformer for Fusion and Pose Estimation in Deep Visual Inertial Odometry. arXiv 2024, arXiv:2409.08769. [Google Scholar]
  155. Chen, C.H.; Rosa, S.; Miao, Y.S.; Lu, C.X.X.; Wu, W.; Markham, A.; Trigoni, N.; Soc, I.C. Selective Sensor Fusion for Neural Visual-Inertial Odometry. In Proceedings of the 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 16–20 June 2019; pp. 10534–10543. [Google Scholar]
  156. Ragab, H.; Abdelaziz, S.K.; Elhabiby, M.; Givigi, S.; Noureldin, A. Machine Learning-based Visual Odometry Uncertainty Estimation for Low-cost Integrated Land Vehicle Navigation. In Proceedings of the 33rd International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2020), Virtual, 21–25 September 2020; pp. 2569–2578. [Google Scholar]
  157. Ma, C.; Lin, J.; Zhao, Y.; Shi, Q.; Xia, G.; Qiu, A.; Huang, J. Low Noise Temperature Compensation Strategy for North-finding MEMS Gyroscope. In Proceedings of the 2024 IEEE SENSORS, Kobe, Japan, 20–23 October 2024; pp. 1–4. [Google Scholar]
  158. Li, K.; Cui, R.; Cai, Q.; Wei, W.; Shen, C.; Tang, J.; Shi, Y.; Cao, H.; Liu, J. A Fusion Algorithm for Real-Time Temperature Compensation and Noise Suppression With a Double U-Beam Vibration Ring Gyroscope. IEEE Sens. J. 2024, 24, 7614–7624. [Google Scholar] [CrossRef]
  159. Li, A.; Cui, K.; An, D.; Wang, X.; Cao, H. Multi-Frame Vibration MEMS Gyroscope Temperature Compensation Based on Combined GWO-VMD-TCN-LSTM Algorithm. Micromachines 2024, 15, 1379. [Google Scholar] [CrossRef]
  160. Rodriguez Mendoza, L.; O’Keefe, K. Wearable Multi-Sensor Positioning Prototype for Rowing Technique Evaluation. Sensors 2024, 24, 5280. [Google Scholar] [CrossRef] [PubMed]
  161. Yan, P.; Jiang, J.; Tan, H.; Zheng, Q.; Liu, J. High Precision Time Synchronization Strategy for Low-Cost Embedded GNSS/MEMS-IMU Integrated Navigation Module. IEEE Trans. Intell. Transp. Syst. 2024, 25, 14087–14099. [Google Scholar] [CrossRef]
Figure 1. Flowchart of the research methodology for data collection, processing, and analysis.
Figure 1. Flowchart of the research methodology for data collection, processing, and analysis.
Remotesensing 17 01136 g001
Figure 2. The number of global geographic distribution publications from 2000 to 2024.
Figure 2. The number of global geographic distribution publications from 2000 to 2024.
Remotesensing 17 01136 g002
Figure 3. (a) The annual publications of countries in the field of integrated navigation from 2000 to 2024, with a donut chart showing the proportion of publications contributed by the top five countries; (b) total publications and citations of the top five countries in integrated navigation research from 2000 to 2024.
Figure 3. (a) The annual publications of countries in the field of integrated navigation from 2000 to 2024, with a donut chart showing the proportion of publications contributed by the top five countries; (b) total publications and citations of the top five countries in integrated navigation research from 2000 to 2024.
Remotesensing 17 01136 g003
Figure 4. Knowledge map of author collaboration. Each node on the map represents an author, with the size of the node corresponding to the number of articles they have published. The connections between nodes illustrate collaboration between authors, with thicker lines indicating stronger and closer collaboration. The color of each node indicates the author’s affiliation to a specific research cluster, with each color representing a distinct group of authors with stronger collaboration and related research topics.
Figure 4. Knowledge map of author collaboration. Each node on the map represents an author, with the size of the node corresponding to the number of articles they have published. The connections between nodes illustrate collaboration between authors, with thicker lines indicating stronger and closer collaboration. The color of each node indicates the author’s affiliation to a specific research cluster, with each color representing a distinct group of authors with stronger collaboration and related research topics.
Remotesensing 17 01136 g004
Figure 5. Knowledge map of cocitation authors. Each node represents an author, with node size corresponding to the number of citations the author has received. Links between nodes indicate cocitation relationships, with link width reflecting cocitation strength. Node color indicates the cocitation-based cluster to which each author belongs.
Figure 6. Knowledge map of country collaboration.
Figure 7. Knowledge map of institution collaboration.
Figure 8. Visualization of the keyword co-occurrence network from 2000 to 2024.
Figure 9. The top 25 burst keywords from 2000 to 2024. The dark blue band marks the starting year of the keyword, and the red band marks the period of its burst activity.
Figure 10. The top 25 references with the strongest citation bursts from 2000 to 2024. The dark blue band marks the publication date of the paper, and the red band marks the period in which it was most heavily cited [22,24,25,30,31,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52].
Figure 11. Connections among multisensor technologies in integrated navigation and their peak research periods.
Figure 12. Timeline of research hotspots in integrated navigation algorithms.
Figure 13. GNSS/IMU/visual/LiDAR multisensor fusion factor graph optimization framework.
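To make the factor graph formulation of Figure 13 concrete, the following minimal sketch (not the framework from the figure, and independent of any library discussed in this review) fuses simulated IMU-derived relative-motion factors with sparse GNSS absolute-position factors by solving a whitened linear least-squares problem. The 1D state, noise levels, and measurement values are illustrative assumptions.

```python
# Minimal, self-contained sketch of factor-graph-style fusion (illustrative only):
# 1D positions x_0..x_4 are estimated from IMU-derived relative-motion factors
# and sparse GNSS absolute-position factors via weighted linear least squares.
import numpy as np

N = 5
imu_deltas = [1.0, 1.1, 0.9, 1.0]     # assumed relative displacements (preintegration analogue)
gnss_fixes = {0: 0.0, 4: 4.2}         # assumed GNSS position factors at epochs 0 and 4
sigma_imu, sigma_gnss = 0.1, 0.5      # assumed factor noise (standard deviations)

rows, rhs, weights = [], [], []
# Between factors: x_{k+1} - x_k = delta_k
for k, d in enumerate(imu_deltas):
    a = np.zeros(N); a[k] = -1.0; a[k + 1] = 1.0
    rows.append(a); rhs.append(d); weights.append(1.0 / sigma_imu)
# Unary GNSS factors: x_k = z_k
for k, z in gnss_fixes.items():
    a = np.zeros(N); a[k] = 1.0
    rows.append(a); rhs.append(z); weights.append(1.0 / sigma_gnss)

# Whiten each factor by its noise and solve; for this linear Gaussian graph the
# least-squares solution coincides with the maximum a posteriori estimate.
A = np.diag(weights) @ np.array(rows)
b = np.array(weights) * np.array(rhs)
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print("fused trajectory:", np.round(x_hat, 3))
```

In a full GNSS/IMU/visual/LiDAR framework, the same structure holds: each sensor contributes residual blocks to a sparse nonlinear least-squares problem, which is then solved iteratively rather than in one linear step.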
Table 1. Top 15 journals by the number of publications.

Rank | Journal | Publications
1 | Sensors | 493
2 | IEEE Sensors Journal | 226
3 | Remote Sensing | 209
4 | IEEE Access | 161
5 | IEEE Transactions on Instrumentation and Measurement | 138
6 | GPS Solutions | 114
7 | IEEE Transactions on Intelligent Transportation Systems | 104
8 | Journal of Navigation | 100
9 | IEEE Robotics and Automation Letters | 99
10 | Measurement Science and Technology | 86
11 | IEEE Transactions on Vehicular Technology | 76
12 | Measurement | 70
13 | Applied Sciences (Basel) | 68
14 | IEEE Transactions on Aerospace and Electronic Systems | 65
15 | Journal of Field Robotics | 55
Table 2. Top 15 ranking authors and number of articles published.

Rank | Author | Total Publications
1 | Niu, Xiaoji | 84
2 | Noureldin, Aboelmagd | 65
3 | El-Sheimy, Naser | 49
4 | Hsu, Li-Ta | 48
5 | Li, Xingxing | 37
6 | Liu, Jingnan | 35
7 | Chiang, Kai-Wei | 35
8 | Wang, Jian | 33
9 | Zhang, Quan | 33
10 | Li, Shengyu | 31
11 | Zhang, Tisheng | 30
12 | Zhou, Yuxuan | 29
13 | Wen, Weisong | 29
14 | Gao, Zhouzheng | 28
15 | Zhang, Hongping | 28
Table 3. Top 15 ranking cocited authors and number of citations.

Rank | Author | Citations
1 | Shen, Shaojie | 3314
2 | Niu, Xiaoji | 2163
3 | Noureldin, Aboelmagd | 2095
4 | Scaramuzza, Davide | 1621
5 | El-Sheimy, Naser | 1418
6 | Zhong, Yongmin | 1264
7 | Hsu, Li-Ta | 1239
8 | Roumeliotis, Stergios I. | 1218
9 | Xu, Wei | 1104
10 | Zhang, Fu | 1104
11 | Mourikis, Anastasios I. | 1088
12 | Hu, Gaoge | 1067
13 | Huang, Guoquan | 910
14 | Chiang, Kai-Wei | 863
15 | Wang, Jinling | 860
Table 4. Top 15 ranking organizations and number of articles published.

Rank | Affiliation | Articles
1 | Wuhan University | 262
2 | Beihang University | 199
3 | University of Calgary | 157
4 | National University of Defense Technology | 126
5 | Southeast University | 116
6 | Chinese Academy of Sciences | 106
7 | Northwestern Polytechnical University | 103
8 | Harbin Engineering University | 98
9 | Nanjing University of Aeronautics and Astronautics | 92
10 | The Hong Kong Polytechnic University | 78
11 | Tsinghua University | 77
12 | Beijing Institute of Technology | 72
13 | China University of Mining and Technology | 71
14 | Shanghai Jiao Tong University | 71
15 | Royal Military College of Canada | 63
Table 5. Top ten articles of cocitation literature [22,23,24,25,26,27,28,29,30,31].

Rank | Title | Author | Year
1 | Vision Meets Robotics: The KITTI Dataset | Geiger, Andreas | 2013
2 | VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator | Qin, Tong | 2018
3 | ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial, and Multimap SLAM | Campos, Carlos | 2021
4 | AUV Navigation and Localization: A Review | Paull, Liam | 2014
5 | On-Manifold Preintegration for Real-Time Visual–Inertial Odometry | Forster, Christian | 2017
6 | High-Precision, Consistent EKF-Based Visual-Inertial Odometry | Li, Mingyang | 2013
7 | Visual-Inertial Monocular SLAM with Map Reuse | Mur-Artal, R. | 2017
8 | FAST-LIO2: Fast Direct LiDAR-Inertial Odometry | Xu, Wei | 2022
9 | In-Car Positioning and Navigation Technologies-A Survey | Skog, Isaac | 2009
10 | Performance Enhancement of MEMS-Based INS/GPS Integration for Low-Cost Navigation Applications | Noureldin, Aboelmagd | 2009
Table 6. Summary of four typical integrated navigation modes and their algorithms.

Sensor Type | Algorithm | Fusion Type | Algorithm Type | Features
GNSS/INS | KF-GINS | LC | EKF | Adopts optimized Bayesian methods with high robustness; effectively processes noise, outliers, and uncertainties.
GNSS/INS | OB_GINS | LC | Graph Optimization | Optimized Bayesian approach; enhances robustness against noise, outliers, and uncertainties.
GNSS/INS | GINav | LC/TC | EKF | Supports both real-time and postprocessing applications; flexible sensor configurations and an extensible framework for advanced filtering and additional sensor integrations.
GNSS/INS | ignav | LC/TC | Filtering | Modular design, adaptable to various application scenarios; high flexibility.
INS/visual | ROVIO | LC | IEKF | High real-time performance and robustness; suitable for fast-moving platforms.
INS/visual | MSCKF | TC | KF | Suitable for embedded platforms; lightweight real-time navigation; suppresses motion blur and lighting changes.
INS/visual | MSCEqF | TC | Equivariant Filtering | Integrates multiple state variables with self-calibration capabilities; applies group theory for an equivariant design, ensuring improved linearization and internal consistency.
INS/visual | R-VIO | TC | EKF | Uses a mobile local coordinate system for state estimation; operates from any initial posture without requiring alignment with the global gravity direction.
INS/visual | MIMC-VINS | TC | Multistate-Constrained Kalman Filter | Fuses data from multiple uncalibrated cameras and IMUs; ensures seamless 3D motion tracking; incorporates on-manifold state interpolation and online calibration of spatiotemporal and intrinsic parameters.
INS/visual | VINS-Mono | TC | Graph Optimization | Monocular vision and inertial fusion; preintegration and partial marginalization techniques for large-scale graph optimization.
INS/visual | OKVIS | TC | Graph Optimization | Keyframe-based global pose estimation; handles scale uncertainty; suitable for complex dynamic environments.
INS/visual | StructVIO | TC | Graph Optimization | Structural line features with iterative graph optimization; improves accuracy in weak-texture and repetitive-texture environments.
INS/visual | ORB-SLAM3 | TC | Graph Optimization | Multisensor support (monocular, stereo, RGB-D); tightly coupled visual + IMU fusion; multimap system; excellent positioning accuracy and robustness.
INS/visual | RD-VIO | TC | Graph Optimization | Uses IMU-PARSAC for robust keypoint detection and matching; employs deferred triangulation to handle pure rotation; enhances performance in dynamic environments.
INS/visual | DFF-VIO | TC | Graph Optimization | Uses dynamic feature matching; handles dynamic environments; improves tracking accuracy and robustness, especially for moving objects.
INS/visual | DynaVIO | TC | Graph Optimization | Identifies and rejects dynamic features; uses dynamic probability propagation for robust pose estimation.
INS/visual | DeepVIO | TC | Deep Learning (LSTM + CNN + Fusion Network) | Combines 2D optical flow and IMU data; self-supervised learning for trajectory estimation; robust to calibration errors and missing data.
INS/visual | VIFT | TC | Deep Learning (Transformer-Based) | Uses attention mechanisms for pose estimation; refines latent feature vectors temporally; addresses data imbalance and rotation learning in SE(3); suitable for monocular VIO systems in autonomous driving and robotics.
GNSS/INS/visual | IC-GVINS | GNSS LC | Factor Graph Optimization | Adapted to complex environments and sensor failures.
GNSS/INS/visual | InGVIO | GNSS TC | IEKF | High precision and efficient computation with keyframe marginalization; robust performance.
GNSS/INS/visual | GICI-LIB | GNSS LC/TC | Factor Graph Optimization | The first open-source integrated navigation library; multifrequency processing and optimized RTK and PPP algorithms for high-precision fusion navigation.
GNSS/INS/visual | GVINS | GNSS TC | Graph Optimization | Accurate and robust 3D navigation and positioning; suitable for high-precision, high-robustness requirements.
GNSS/INS/visual | Sky-GVINS | GNSS TC | Graph Optimization | Optimized for UAVs; high-precision, low-latency 3D positioning and attitude estimation.
GNSS/INS/visual/LiDAR | GVIL | TC | Graph Optimization | Integrates multisensor information for high-precision, comprehensive 3D perception and positioning; suitable for complex environments and autonomous driving.
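As a companion to the filtering entries in Table 6, where EKF-style estimators dominate the loosely coupled GNSS/INS category, the following minimal sketch illustrates the generic loosely coupled predict-and-update cycle. It is a 1D constant-velocity toy model standing in for full INS mechanization, not KF-GINS or any other package listed above; all matrices, noise levels, and measurement values are illustrative assumptions.

```python
# Minimal loosely coupled GNSS/INS Kalman-filter sketch (illustrative only):
# the INS propagates a [position, velocity] state at a high rate, and
# lower-rate GNSS position fixes correct the accumulated drift.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
Q = np.diag([1e-4, 1e-3])               # assumed INS process noise
H = np.array([[1.0, 0.0]])              # GNSS observes position only
R = np.array([[0.25]])                  # assumed GNSS noise (0.5 m std dev)

x = np.array([0.0, 1.0])                # initial state: 0 m, 1 m/s
P = np.eye(2)                           # initial state covariance

def ins_predict(x, P):
    """Propagate state and covariance with the INS motion model."""
    return F @ x, F @ P @ F.T + Q

def gnss_update(x, P, z):
    """Correct the propagated state with a GNSS position fix."""
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for step in range(1, 11):
    x, P = ins_predict(x, P)
    if step % 5 == 0:                    # GNSS arrives at a lower rate than the INS
        x, P = gnss_update(x, P, np.array([step * dt * 1.0]))
print("fused state:", np.round(x, 3))
```

A tightly coupled variant would replace the position-fix observation with raw GNSS observables (pseudoranges or carrier phases) in the update step, which preserves partial corrections when fewer than four satellites are visible.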
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
