Article

An Integrated INS/LiDAR SLAM Navigation System for GNSS-Challenging Environments

by Nader Abdelaziz 1,2,* and Ahmed El-Rabbany 1
1 Department of Civil Engineering, Ryerson University, Toronto, ON M5B 2K3, Canada
2 Department of Civil Engineering, Tanta University, Tanta 31527, Egypt
* Author to whom correspondence should be addressed.
Sensors 2022, 22(12), 4327; https://doi.org/10.3390/s22124327
Submission received: 28 April 2022 / Revised: 20 May 2022 / Accepted: 2 June 2022 / Published: 7 June 2022
(This article belongs to the Section Navigation and Positioning)

Abstract

Traditional navigation systems rely on GNSS/inertial navigation system (INS) integration, in which the INS can provide reliable positioning during short GNSS outages. However, if the GNSS outage persists for prolonged periods of time, the performance of the system will be solely dependent on the INS, which can lead to a significant drift over time. As a result, integrating additional onboard sensors becomes essential. This study proposes a robust loosely coupled (LC) integration between the INS and LiDAR simultaneous localization and mapping (SLAM) using an extended Kalman filter (EKF). The proposed integrated navigation system was tested for three different driving scenarios and environments using the raw KITTI dataset. The first scenario used the KITTI residential datasets, totaling 48 min, while the second case study considered the KITTI highway datasets, totaling 7 min. For both case studies, a complete absence of the GNSS signal was assumed for the whole trajectory of the vehicle in all drives. In contrast, the third case study considered the use of minimal assistance from GNSS, which mimics the intermittent receipt and loss of GNSS signals in different driving environments. The positioning results of the proposed INS/LiDAR SLAM integrated system outperformed those of the INS for the residential datasets, with average reductions in the root mean square error (RMSE) in the horizontal and up directions of 88% and 32%, respectively. For the highway datasets, the RMSE reductions were 70% and 0.2% for the horizontal and up directions, respectively.

1. Introduction

Providing accurate, robust positioning is a major challenge in vehicular navigation research and development. Vehicular positioning demands navigation systems that provide an accurate position in all driving environments (i.e., urban or rural), all weather conditions (i.e., dry, rain, or snow), and all illumination conditions (i.e., day or night). In addition, the navigation system must possess redundancy, such that the system remains functional should one sensor fail. As a result, the use of a single sensor may not produce a robust navigation solution, even if that sensor produces measurements that lead to accurate positioning. This emphasizes the need for multi-sensor integration to achieve robust positioning [1,2].
GNSS/INS integrated navigation systems are the most popular. Typically, the observations of the GNSS and the IMU are fused using a Kalman filter [3]. For example, in a loosely coupled (LC) integration, the INS provides the position of the vehicle through mechanization while receiving updates from the GNSS at a lower frequency to minimize the INS solution drift. The integration relies on the ability of the INS to provide the position of the vehicle at a high frequency. As a result, in a closed-loop error scheme, the INS can provide reliable positioning during short GNSS signal outages [4]. However, if the GNSS outage persists for a prolonged period of time, the system will rely on the performance of the INS alone, which is prone to significant drift, especially when a low-cost micro-electro-mechanical system (MEMS) inertial measurement unit (IMU) is used [5,6,7,8,9].
To improve the performance of navigation systems, additional onboard sensors are required that allow the system to sustain prolonged GNSS outages. LiDAR sensors are widely used for localization through simultaneous localization and mapping (SLAM) techniques. The basic idea of SLAM algorithms is to use a LiDAR or camera sensor to construct a map of the surrounding environment while simultaneously keeping track of the sensor location [10].
There are several LiDAR SLAM algorithms. In 2014, the LiDAR odometry and mapping (LOAM) algorithm was presented as one of the leading algorithms [11], as supported by its superior performance on the KITTI benchmark [12]. Subsequently, other variations and updated versions of LOAM were presented, including A-LOAM [13], LeGO-LOAM [14], Kitware SLAM [15], R-LOAM [16], and F-LOAM [17]. The common contribution of these SLAM algorithms was an improvement in the processing time of the LOAM algorithm. For vehicular navigation, the integration of LiDAR SLAM with GNSS/INS is crucial to achieve system redundancy and robustness.
Many studies have proposed the integration of GNSS, INS, and LiDAR SLAM. In [18], an integration scheme was proposed which integrates GNSS/INS with LiDAR SLAM based on graph optimization. In this study, the GNSS/INS results were matched with the relative pose of a 3D probability map. The system was tested during a one-minute outage of the GNSS signal. The RMSE of the position in the east and north directions was reduced by roughly 80% compared to the GNSS/INS navigation solution. Similar work was undertaken in [19]; however, an odometer was included in the integration scheme. The system was tested during a 2-min outage, which resulted in reductions in the RMSE by 62.8%, 72.3%, and 52.1%, along the north, east, and height directions, respectively, compared to the GNSS/INS integrated system. In [20], the integration of a three-dimensional reduced-inertial-sensor system (3D-RISS) with GNSS and LiDAR odometry was accomplished using an extended Kalman filter (EKF). The LiDAR odometry updates to the 3D-RISS mechanization were the position and azimuth. The system was tested for different driving scenarios in downtown Toronto and Kingston, Canada. The integration results showed a reduction in positioning errors by 64% compared to the inertial navigation solution.
In this paper, an LC integration between the INS and the LiDAR SLAM algorithm using an EKF is proposed. The resulting integrated navigation system produces reliable position and attitude information during complete GNSS signal outages. The main objectives of this paper are as follows:
- to design an integrated INS/LiDAR SLAM navigation system that satisfies the concept of vehicular navigation robustness and redundancy, where the LiDAR and the INS sensors provide the position and attitude of the vehicle at moderate and high frequency, respectively;
- to present an LC integration that takes advantage of the LiDAR SLAM solution, including the 3D position and attitude angles, to update the INS solution in a closed-loop error scheme using the EKF;
- to test the proposed integrated navigation system on the raw KITTI dataset in residential driving environments, whose scenes contain dense features and involve low driving speeds;
- to test the navigation system in highway driving environments, whose scenes contain sparse features and involve high driving speeds; and
- to simulate and study the effect of intermittent receipt and loss of GNSS signals while driving in various environments.
The remainder of this paper is organized as follows: Section 2 includes the system architecture and mathematical models for the IMU mechanization, LiDAR SLAM, and the EKF; Section 3 illustrates the datasets considered in this paper, along with defining the different case studies; Section 4 includes the results of the analysis; Section 5 presents a comparison between the proposed integrated INS/LiDAR system and three state-of-the-art LiDAR SLAM algorithms; and Section 6 ends the paper with some concluding remarks.

2. System Architecture and Mathematical Models

2.1. Full IMU Mechanization

The IMU observations are measured with respect to the body frame (b-frame), whose axes are defined as follows: the Y-axis points in the forward direction, the X-axis in the transverse direction, and the Z-axis in the up direction, such that the coordinate system is right-handed. In addition, the IMU measurements are referenced to the navigation frame, also known as the local-level frame (LLF), which is a right-handed system composed of the east, north, and up axes. Given the raw accelerometer and gyroscope measurements of the IMU as input, the mechanization outputs the position (latitude, longitude, and altitude), velocity (east, north, and up directions), and attitude (roll, pitch, and yaw angles) [4].
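To illustrate the idea, a minimal sketch of a single mechanization step is given below, assuming a simplified spherical Earth model and neglecting the Earth rotation and transport-rate terms of the full mechanization in [4]; all function and variable names are illustrative, not from the authors' implementation.

```python
import numpy as np

GRAVITY_ENU = np.array([0.0, 0.0, -9.81])  # simplified local gravity vector (m/s^2)
EARTH_RADIUS = 6378137.0                   # simplified spherical Earth radius (m)

def mechanize_step(r_llh, v_enu, f_b, R_b_l, dt):
    """One simplified mechanization step in the local-level (ENU) frame.

    r_llh: (latitude [rad], longitude [rad], altitude [m])
    v_enu: velocity along the east, north, and up axes (m/s)
    f_b:   accelerometer specific force in the b-frame (m/s^2)
    R_b_l: rotation matrix from the b-frame to the LLF (from the attitude angles)
    """
    a_enu = R_b_l @ f_b + GRAVITY_ENU      # rotate specific force, remove gravity
    v_new = v_enu + a_enu * dt             # integrate velocity
    lat, lon, h = r_llh
    lat += v_new[1] * dt / (EARTH_RADIUS + h)                  # north -> latitude
    lon += v_new[0] * dt / ((EARTH_RADIUS + h) * np.cos(lat))  # east -> longitude
    h += v_new[2] * dt                                         # up -> altitude
    return np.array([lat, lon, h]), v_new
```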

2.2. LiDAR SLAM

The SLAM algorithm used in this study was the Kitware SLAM [15], which is based on the LOAM algorithm [11]. Kitware SLAM was adopted over GraphSLAM algorithms, as the latter are generally more computationally expensive because they consider all pose estimates in the calculation process. By contrast, EKF-based SLAM methods consider only the last pose [21].
The LOAM algorithm is composed of three main stages, namely, point cloud registration, LiDAR odometry, and LiDAR mapping. In the first stage, the point cloud registration is accomplished by extracting and matching feature points among every consecutive pair of point clouds. For every scan, feature points corresponding to edges and planes are extracted based on the smoothness of the local surface. This is described by the c coefficient as presented in Equation (1). Low and high c values correspond to planes and edges, respectively.
$$c = \frac{1}{|S| \cdot \left\| X_{(k,i)}^{L} \right\|} \left\| \sum_{j \in S,\, j \neq i} \left( X_{(k,i)}^{L} - X_{(k,j)}^{L} \right) \right\| \tag{1}$$
where $S$ is a set of consecutive points returned by the LiDAR in the same scan, $X_{(k,i)}^{L}$ are the coordinates of point $i$ in sweep $k$, expressed in the LiDAR frame, and $X_{(k,j)}^{L}$ are the coordinates of point $j$ in sweep $k$, expressed in the LiDAR frame.
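As an illustration, a minimal sketch of Equation (1) for one scan line is shown below; the window size is an assumed tuning parameter, not a value from the paper.

```python
import numpy as np

def smoothness(points, i, half_window=5):
    """Smoothness coefficient c of Equation (1) for point i of a single scan line.

    points: (N, 3) array of consecutive LiDAR returns in the sensor frame;
    assumes half_window <= i < N - half_window. Low c values indicate planar
    patches; high c values indicate edges.
    """
    S = np.vstack((points[i - half_window:i], points[i + 1:i + 1 + half_window]))
    diff_sum = np.sum(points[i] - S, axis=0)   # sum of (X_i - X_j) over j in S
    return np.linalg.norm(diff_sum) / (len(S) * np.linalg.norm(points[i]))
```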
In order to ensure an even distribution of feature points across the point cloud, the scan is divided into subregions. Moreover, a feature point cannot belong to a surface patch that is nearly parallel to the laser beam or lie on the boundary of an occluded region. Subsequently, the extracted points of the current scan are projected to the beginning of the consecutive scan, where feature correspondence is established using a nearest neighbor search. Then, the LiDAR motion is recovered by minimizing the overall point-to-line ($d_{\varepsilon}$) and point-to-plane ($d_{H}$) distances, as shown by Equations (2) and (3), respectively.
$$d_{\varepsilon} = \frac{\left| \left( \tilde{X}_{(k+1,i)}^{L} - \bar{X}_{(k,j)}^{L} \right) \times \left( \tilde{X}_{(k+1,i)}^{L} - \bar{X}_{(k,l)}^{L} \right) \right|}{\left| \bar{X}_{(k,j)}^{L} - \bar{X}_{(k,l)}^{L} \right|} \tag{2}$$
$$d_{H} = \frac{\left| \left( \tilde{X}_{(k+1,i)}^{L} - \bar{X}_{(k,j)}^{L} \right) \cdot \left( \left( \bar{X}_{(k,j)}^{L} - \bar{X}_{(k,l)}^{L} \right) \times \left( \bar{X}_{(k,j)}^{L} - \bar{X}_{(k,m)}^{L} \right) \right) \right|}{\left| \left( \bar{X}_{(k,j)}^{L} - \bar{X}_{(k,l)}^{L} \right) \times \left( \bar{X}_{(k,j)}^{L} - \bar{X}_{(k,m)}^{L} \right) \right|} \tag{3}$$
where $\tilde{X}_{(k+1,i)}^{L}$, $\bar{X}_{(k,j)}^{L}$, $\bar{X}_{(k,l)}^{L}$, and $\bar{X}_{(k,m)}^{L}$ are the coordinates of points $i$, $j$, $l$, and $m$ in the LiDAR frame, respectively. In order to account for point cloud motion distortion, the LiDAR is assumed to move with constant linear and angular velocities during each scan. This allows for the linear interpolation of the pose transform for LiDAR points that are acquired at different times. The result of the scan matching is the transformation between a consecutive pair of point clouds, which is the input to the odometry stage. Thereafter, the transformation is projected into the frame of the first point cloud. Then, the distortion-free point cloud is used to build a map at a frequency of 1 Hz. The map is then used to further refine the pose of the LiDAR by performing the same scan-matching technique between the scan and the map.
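The two residuals of Equations (2) and (3) can be sketched directly (names illustrative):

```python
import numpy as np

def point_to_line(x_i, x_j, x_l):
    """Point-to-edge distance d_epsilon of Equation (2): distance from the
    reprojected point x_i to the line through the edge points x_j and x_l."""
    return np.linalg.norm(np.cross(x_i - x_j, x_i - x_l)) / np.linalg.norm(x_j - x_l)

def point_to_plane(x_i, x_j, x_l, x_m):
    """Point-to-plane distance d_H of Equation (3): distance from x_i to the
    plane through the three planar feature points x_j, x_l, and x_m."""
    n = np.cross(x_j - x_l, x_j - x_m)     # (unnormalized) plane normal
    return abs(np.dot(x_i - x_j, n)) / np.linalg.norm(n)
```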
While Kitware SLAM uses the base architecture of LOAM, it introduces some improvements. Firstly, the running time is reduced through the use of C++ libraries and tools designed for better computational performance. In addition, the algorithm is independent of ROS, does not rely on hard-coded parameters, and can run on both Windows and Linux operating systems using the LidarView software. Furthermore, it is more generalized, such that it runs on several LiDAR sensors, not just Velodyne ones, and it can process point clouds from multiple LiDAR sensors. However, it is important to note that, at present, Kitware SLAM neither performs loop closure nor integrates the IMU measurements in the scan matching of subsequent frames.
After the SLAM solution is obtained in the local frame of the first point cloud of the LiDAR data, it is used to update the inertial navigation solution. However, since the latter is estimated in the WGS84 global reference frame, the poses estimated by the LiDAR SLAM must be transformed into the same reference frame, as graphically illustrated in Figure 1.
The position and rotation transformations can be performed using 4 × 4 homogeneous transformation matrices, which is computationally efficient. Let $P^{Li}$ denote a point captured in the local frame of the LiDAR and let $P^{ecef}$ denote the same point expressed in the WGS84 reference frame. The sequence of homogeneous transformations is presented by Equations (4) and (5).
$$P^{ecef} = (R/t)_{l}^{ecef} \, (R/t)_{b}^{l} \, (R/t)_{Li}^{b} \, P^{Li} \tag{4}$$
$$R_{Li}^{l} = R_{b}^{l} R_{Li}^{b} \tag{5}$$
where $(R/t)_{l}^{ecef}$ is the homogeneous transformation matrix from the l-frame to the WGS84 frame, $(R/t)_{b}^{l}$ is the homogeneous transformation matrix from the b-frame to the l-frame, $(R/t)_{Li}^{b}$ is the homogeneous transformation matrix from the LiDAR frame to the b-frame, $R_{Li}^{l}$ is the rotation matrix from the LiDAR frame to the l-frame, and $R_{Li}^{b}$ is the rotation matrix from the LiDAR frame to the b-frame.
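A minimal sketch of the chain in Equation (4) follows, assuming the three constituent transforms have already been built from the geodetic position, the mechanization attitude, and the KITTI extrinsic calibration (names illustrative):

```python
import numpy as np

def homogeneous(R, t):
    """Assemble a 4x4 homogeneous transform (R/t) from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def lidar_to_ecef(p_lidar, T_l_ecef, T_b_l, T_Li_b):
    """Equation (4): map a LiDAR-frame point to the WGS84 (ECEF) frame."""
    p = np.append(p_lidar, 1.0)            # homogeneous coordinates
    return (T_l_ecef @ T_b_l @ T_Li_b @ p)[:3]
```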

2.3. INS/LiDAR Integration

In this paper, an LC integration between the INS and the LiDAR SLAM is adopted using an EKF, which results in an integrated navigation solution, as shown in Figure 2. The raw IMU measurements of accelerations and angular rotations are used as input to a full IMU mechanization, which outputs the position, velocity, and attitude angles of the navigation system. Meanwhile, the LiDAR point clouds are fed into the Kitware SLAM algorithm, which yields the position and attitude angles of the vehicle. These are used as the measurement update to the IMU mechanization output during the update stage of the EKF, which yields an integrated navigation solution; the updated errors are fed back into the IMU mechanization, which forms a closed-loop error scheme.
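The timing of this scheme can be sketched as follows, assuming the IMU runs at 100 Hz and the LiDAR at 10 Hz (as in the KITTI platform); the function names are schematic placeholders, not the paper's implementation:

```python
# Schematic closed-loop LC integration (illustrative names).
while running:
    imu = imu_queue.get()                  # 100 Hz IMU sample
    nav = mechanize(nav, imu)              # position, velocity, attitude
    ekf.predict(imu.dt)                    # Equations (6)-(8)
    if slam_queue.has_pose():              # 10 Hz LiDAR SLAM pose (in WGS84)
        dz = measurement_error(nav, slam_queue.get())  # Equation (12)
        dx = ekf.update(dz)                # Equations (15)-(18)
        nav = apply_correction(nav, dx)    # feed errors back into mechanization
        ekf.zero_error_state()             # closed loop: error state reset
```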

2.4. System Model

The discrete form of the system model [4] is described by Equations (6) and (7), which are used to predict the error state of the system. The uncertainty of the predicted error state is estimated by the a priori covariance matrix $P_k(-)$, as presented in Equation (8). In the initialization stage of the EKF, the covariance matrix is assumed to be a diagonal matrix containing the variances of the state vector. The system noise covariance matrix $Q$ is also a diagonal matrix, populated with the noise variance of each parameter in the system state.
$$\delta x_{k}(-) = \phi_{k-1,k} \, \delta x_{k-1}(+) + G_{k-1} w \tag{6}$$
$$\phi_{k-1,k} = F(t) \, \Delta t + I \tag{7}$$
$$P_{k}(-) = \phi_{k,k-1} P_{k-1}(+) \phi_{k,k-1}^{T} + G_{k,k-1} Q_{k,k-1} G_{k,k-1}^{T} \tag{8}$$
where $\delta x_{k}(-)$ is the predicted state vector at epoch $k$, $\phi_{k-1,k}$ is the state transition matrix from epoch $k-1$ to $k$, $\delta x_{k-1}(+)$ is the estimated state vector at epoch $k-1$, $G_{k-1}$ is the noise coupling matrix [22,23], $w$ is the system noise with covariance matrix $Q$, $F(t)$ is the system dynamic matrix, $\Delta t$ is the sampling interval, $I$ is the identity matrix, $P_{k-1}(+)$ is the a posteriori covariance matrix of the state vector at epoch $k-1$, and $Q_{k,k-1}$ is the system noise covariance matrix. The error state of the system $\delta x$ for the loosely coupled integration is represented by Equation (9).
$$\delta x = \begin{bmatrix} \delta r & \delta v & \delta \varepsilon & \delta b_{a} & \delta b_{g} \end{bmatrix}^{T} \tag{9}$$
where $\delta r = \begin{bmatrix} \delta \varphi & \delta \lambda & \delta h \end{bmatrix}^{T}$ is the position error vector, $\delta v = \begin{bmatrix} \delta v_{e} & \delta v_{n} & \delta v_{u} \end{bmatrix}^{T}$ is the velocity error vector, $\delta \varepsilon = \begin{bmatrix} \delta p & \delta r & \delta y \end{bmatrix}^{T}$ is the attitude angle error vector, $\delta b_{a} = \begin{bmatrix} \delta b_{a_x} & \delta b_{a_y} & \delta b_{a_z} \end{bmatrix}^{T}$ is the accelerometer bias error vector, and $\delta b_{g} = \begin{bmatrix} \delta b_{g_x} & \delta b_{g_y} & \delta b_{g_z} \end{bmatrix}^{T}$ is the gyroscope bias error vector.
The system dynamic matrix is shown by Equation (10) [4]. It is worth noting that the accelerometer and gyroscope biases were modeled using a first-order Gauss–Markov (GM) process [9].
$$F = \begin{bmatrix} F_{rr} & F_{rv} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ F_{vr} & F_{vv} & F_{v\varepsilon} & R_{b}^{l} & 0_{3\times3} \\ F_{\varepsilon r} & F_{\varepsilon v} & F_{\varepsilon\varepsilon} & 0_{3\times3} & R_{b}^{l} \\ 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & F_{b_a} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & F_{b_g} \end{bmatrix} \tag{10}$$
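A minimal sketch of the prediction step under these models is shown below; the 15-state dimension follows Equation (9), while the Gauss–Markov correlation times are assumed tuning values, not figures from the paper.

```python
import numpy as np

def ekf_predict(dx, P, F, G, Q, dt):
    """EKF prediction of Equations (6)-(8).

    F is the 15 x 15 dynamic matrix of Equation (10); under the first-order
    Gauss-Markov bias models, its bias blocks are F_ba = -(1/tau_a) * I and
    F_bg = -(1/tau_g) * I, where tau is an assumed correlation time.
    """
    phi = np.eye(F.shape[0]) + F * dt      # Equation (7): first-order discretization
    dx_pred = phi @ dx                     # Equation (6), without the noise term
    P_pred = phi @ P @ phi.T + G @ Q @ G.T # Equation (8)
    return dx_pred, P_pred
```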

2.5. Measurement Model

The predictions of the EKF are modified during the update stage. The measurement model for the LC INS/LiDAR integration is presented by Equation (11).
$$\delta Z_{k} = H_{k} \delta x_{k} + v_{k} \tag{11}$$
where $\delta Z_{k}$ is the measurement error vector, $H_{k}$ is the design matrix, $\delta x_{k}$ is the error state, and $v_{k}$ is the measurement noise. Since the EKF operates on the error state of the system, the measurement update vector is modeled as a measurement error vector, as represented by Equation (12).
$$\delta Z_{k} = \begin{bmatrix} \delta r & \delta \varepsilon \end{bmatrix}^{T} = \begin{bmatrix} \varphi_{imu} - \varphi_{Li} & \lambda_{imu} - \lambda_{Li} & h_{imu} - h_{Li} & p_{imu} - p_{Li} & r_{imu} - r_{Li} & y_{imu} - y_{Li} \end{bmatrix}^{T} \tag{12}$$
The elements of the design matrix $H$ are depicted by Equation (13).
$$H = \begin{bmatrix} I_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} \end{bmatrix} \tag{13}$$
The covariance matrix of the measurements Rk is assumed to be a diagonal matrix, including the variances of the position and attitude angles received from the LiDAR SLAM, which is depicted by Equation (14).
$$R_{k} = \mathrm{diag} \begin{bmatrix} \sigma_{\varphi_{Li}}^{2} & \sigma_{\lambda_{Li}}^{2} & \sigma_{h_{Li}}^{2} & \sigma_{p_{Li}}^{2} & \sigma_{r_{Li}}^{2} & \sigma_{y_{Li}}^{2} \end{bmatrix} \tag{14}$$
where $\varphi_{imu} - \varphi_{Li}$, $\lambda_{imu} - \lambda_{Li}$, and $h_{imu} - h_{Li}$ are the measurement errors for the geodetic latitude, longitude, and height; $p_{imu} - p_{Li}$, $r_{imu} - r_{Li}$, and $y_{imu} - y_{Li}$ are the measurement errors for the pitch, roll, and yaw angles; $\sigma_{\varphi_{Li}}^{2}$, $\sigma_{\lambda_{Li}}^{2}$, and $\sigma_{h_{Li}}^{2}$ are the variances of the LiDAR geodetic latitude, longitude, and height; and $\sigma_{p_{Li}}^{2}$, $\sigma_{r_{Li}}^{2}$, and $\sigma_{y_{Li}}^{2}$ are the variances of the LiDAR pitch, roll, and yaw angles. Subsequently, the Kalman gain $K_k$ and the a posteriori state covariance $P_k(+)$ are calculated using Equations (15) and (16).
$$K_{k} = P_{k}(-) H_{k}^{T} \left( H_{k} P_{k}(-) H_{k}^{T} + R_{k} \right)^{-1} \tag{15}$$
$$P_{k}(+) = \left( I - K_{k} H_{k} \right) P_{k}(-) \left( I - K_{k} H_{k} \right)^{T} + K_{k} R_{k} K_{k}^{T} \tag{16}$$
The correction term $dx_k$ is calculated using Equation (17). However, since this study adopts a closed-loop error scheme in which the predicted error state is zeroed after each feedback, $\delta x_k(-) = 0$, and Equation (17) reduces to the simpler form of Equation (18).
$$dx_{k} = K_{k} \left( \delta z_{k} - H_{k} \delta x_{k}(-) \right) \tag{17}$$
$$dx_{k} = K_{k} \delta z_{k} \tag{18}$$
Finally, the error state of the system can be updated using Equation (19), which simplifies to Equation (20) for the same reason.
$$\delta x_{k}(+) = \delta x_{k}(-) + K_{k} \left( \delta z_{k} - H_{k} \delta x_{k}(-) \right) \tag{19}$$
$$\delta x_{k}(+) = \delta x_{k}(-) + K_{k} \delta z_{k} \tag{20}$$
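The whole update stage can be sketched compactly as follows (a minimal illustration of Equations (13)–(18), assuming the 15-state error vector of Equation (9); names are illustrative):

```python
import numpy as np

def ekf_update(P, dz, R):
    """Closed-loop EKF update of Equations (13)-(18).

    dz: 6 x 1 measurement error vector of Equation (12) (position and attitude).
    Because the error state is fed back and zeroed after every update,
    dx(-) = 0 and Equation (18) applies instead of Equation (17).
    """
    H = np.zeros((6, 15))                  # Equation (13)
    H[0:3, 0:3] = np.eye(3)                # position errors
    H[3:6, 6:9] = np.eye(3)                # attitude errors
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Equation (15)
    dx = K @ dz                                     # Equation (18)
    I_KH = np.eye(15) - K @ H
    P_new = I_KH @ P @ I_KH.T + K @ R @ K.T         # Equation (16), Joseph form
    return dx, P_new
```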

3. Data Source and Case Studies

The raw Karlsruhe Institute of Technology and Toyota Technological Institute (KITTI) dataset was considered in this study [24]. The KITTI dataset is an online dataset made available to all researchers worldwide. The KITTI data collection platform is a land vehicle equipped with several sensors, including an integrated GNSS/IMU unit (the OXTS RT3003 unit), a Velodyne HDL-64E mechanical LiDAR, a greyscale stereo camera pair, and a color stereo camera pair (Point Grey Flea 2). All the extrinsic and intrinsic calibration parameters of the system are estimated and provided with the dataset. The KITTI dataset can be divided into two main categories, namely the raw dataset and the odometry dataset. The raw dataset covers a wide range of driving environments in residential and highway areas, while the odometry dataset is obtained from the raw dataset for sequences of mostly residential areas [24]. Finally, the KITTI dataset provides the odometry benchmark [12] for researchers to showcase their work and the results of developing visual and LiDAR SLAM algorithms that satisfy certain conditions.
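For readers who wish to reproduce the data handling, a raw KITTI drive can be loaded, for example, with the third-party pykitti package; the sketch below assumes a local copy of the raw data and uses drive D-27 as an example (the local path is hypothetical).

```python
import pykitti

basedir = "/data/kitti_raw"            # assumed local path to the raw KITTI data
data = pykitti.raw(basedir, "2011_09_30", "0027")

oxts = data.oxts                       # OXTS RT3003 GNSS/IMU packets and poses
scan = next(iter(data.velo))           # first HDL-64E scan: N x 4 (x, y, z, reflectance)
print(oxts[0].packet.lat, oxts[0].packet.lon, scan.shape)
```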
In this study, the raw KITTI data were adopted, from which three case studies were formulated to accommodate different driving environments and scenarios.
The first case study featured the use of residential raw KITTI datasets during a complete absence of GNSS signal along the whole trajectory of the vehicle. The residential datasets represent typical urban driving environments, where driving speed is relatively low to moderate, and the scene is feature-rich. Within the first case study, the datasets were broken down into three different scenarios: a sample relatively short dataset, a sample relatively long dataset, and the remaining residential KITTI datasets.
Similarly, the second case study followed the same structure as the first. However, it featured the KITTI road datasets, which represent highway driving environments. Furthermore, the scene in the highway environments included drastically fewer features than its urban counterpart, which was expected to adversely affect the performance of the LiDAR SLAM algorithm.
The third and final case study involved an attempt to improve the performance of the integrated INS/LiDAR SLAM navigation system in cases where the GNSS signal loss persists for prolonged periods of time, which leads to degradation in the performance of the integrated navigation system. Such improvement was studied by incorporating minimal assistance from the GNSS in the form of a single observation along the whole trajectory of the vehicle. The third case study was broken down into two different scenarios: a sample residential dataset and a sample highway dataset.
Descriptive information about the trajectories of the datasets used in the three case studies and their corresponding scenarios is summarized in Table 1.

4. Analysis and Results

4.1. First Case Study—Residential Datasets (Complete GNSS Outage)

The first case study mimicked a full GNSS signal outage along the whole trajectory of the vehicle while driving in urban environments. In this case study, the raw KITTI residential datasets were adopted and further categorized into three groups: a short sample dataset, a long sample dataset, and the remaining residential datasets.
For all the case studies, the ground truth was the integrated solution of the OXTS GNSS/INS system used in the KITTI data collection platform, whose measurements were provided in the raw KITTI dataset [24]. The position and attitude solutions of the integrated INS/LiDAR SLAM navigation system were compared to the ground truth for all cases.
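For reference, the error statistics reported in the following subsections follow the usual definitions; a minimal sketch is shown below, where the per-epoch ENU error arrays are assumed to be obtained by differencing each solution against the ground truth.

```python
import numpy as np

def rmse(err):
    """Root mean square error of a per-epoch error series (m or deg)."""
    return np.sqrt(np.mean(np.square(err)))

# e_east, e_north, e_up: assumed per-epoch ENU errors versus the OXTS ground truth
rmse_horizontal = rmse(np.hypot(e_east, e_north))   # net horizontal RMSE
rmse_up = rmse(e_up)
```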

4.1.1. Sample Relatively Short Residential Dataset

The dataset used in this scenario was 2011_09_30_drive_0027_sync, labelled D-27. This was a 692.47 m drive lasting 114.85 s at an average speed of 21.71 km/h. Figure 3 illustrates the position errors in the ENU local frame, while Figure 4 presents the errors of the attitude angles (roll, pitch, and yaw) for three navigation solutions. The first used the full INS-only solution without any update from the LiDAR SLAM. The second navigation solution was obtained through the LiDAR SLAM only. Finally, the third navigation solution was the integrated INS/LiDAR SLAM system. These figures are quantified in Table 2, where the position and attitude error statistics are shown for the three navigation solutions.
It is noticeable from Figure 3 that there was a large drift in the INS solution over time, which indicates that it cannot be used as the sole system for precise navigation. The integrated INS/LiDAR navigation solution, by contrast, exhibited significantly less error. This is because the integrated filter was tuned to emphasize the weight of the position updates from the LiDAR SLAM while de-emphasizing that of the INS solution; as a result, the integrated INS/LiDAR position solution was closer to the LiDAR SLAM solution.
In contrast, in Figure 4, the integrated INS/LiDAR solution was tuned to follow the INS solution for the attitude angles, which outperformed that of the LiDAR SLAM. The reason for this was that the IMU directly measured the vehicle’s accelerations and angular rotations, after which the attitude was estimated. By contrast, the LiDAR estimates of the attitude angles were obtained through the SLAM solution (point cloud registration), which led to a significant amount of error accumulation. These trends are numerically noticeable in Table 2, where the RMSE of the integrated INS/LiDAR position errors converged towards that of the LiDAR SLAM. By contrast, the RMSE of the attitude errors was closer to the mechanization errors. All trajectories were compared to the ground truth, which was the integrated solution of the OXTS unit provided in the raw KITTI dataset. The trajectories for the complete outage of the residential dataset are presented in Figure 5, where the mechanization, LiDAR SLAM, and INS/LiDAR trajectories are compared to the ground truth.

4.1.2. Sample Relatively Long Residential Dataset

In this scenario, a significantly longer dataset, 2011_09_30_drive_0028_sync, labelled D-28, was considered. This was an approximately 9 min drive at an average speed of 28.17 km/h. Similar to the previous scenario, the same analysis was performed for this drive. Figure 6 and Figure 7 present the position and attitude errors, respectively. Table 3 shows the statistical characteristics of the position and attitude errors of D-28.
It is noticeable from Figure 6 that the INS position solution deteriorated significantly in all directions. The drift rate was much higher than in the previous scenario (D-27) because D-28 was significantly longer. This is reflected numerically in Table 3 for the INS navigation solution statistics. However, the INS still succeeded in yielding a more stable solution for the attitude angles, even for the longer dataset. While this may seem counter-intuitive, because the INS solution was prone to drift in both position and attitude, the reason for the better performance was the quality of the IMU used in the KITTI data collection platform (the OXTS unit). Furthermore, as mentioned in the discussion of the previous scenario, the IMU measures the attitude angles more directly than the LiDAR SLAM. A careful examination of the IMU mechanization equations [4] shows that the IMU senses rotation with respect to the inertial frame, from which the attitude angles can be estimated in the local-level reference frame. The attitude angles evolve from one epoch to the next using the updated quaternion. This led to less accumulated error over time, given that the IMU used was high-grade rather than low-cost. Conversely, the attitude angles resulting from the LiDAR SLAM algorithm depended on the quality of the pair-wise point cloud matching, which led to significant error accumulation while processing a stream of point clouds. Figure 8 presents a comparison between the trajectories of the INS, LiDAR, and INS/LiDAR algorithms versus the ground truth for D-28.
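A minimal sketch of the quaternion propagation mentioned above is given below (first-order integration, scalar-first convention; omega_b is assumed to be the compensated gyro output):

```python
import numpy as np

def propagate_quaternion(q, omega_b, dt):
    """Propagate an attitude quaternion q = [w, x, y, z] with body rate omega_b (rad/s)."""
    wx, wy, wz = omega_b
    Omega = np.array([[0.0, -wx, -wy, -wz],
                      [wx,  0.0,  wz, -wy],
                      [wy, -wz,  0.0,  wx],
                      [wz,  wy, -wx,  0.0]])
    q = q + 0.5 * dt * (Omega @ q)         # first-order quaternion integration
    return q / np.linalg.norm(q)           # re-normalize to unit length
```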

4.1.3. KITTI Residential Datasets

The third and final scenario in this first case study involved showcasing the results of the proposed integrated navigation system for several of the raw KITTI datasets. This is depicted in Figure 9, Figure 10 and Figure 11, where the navigation solutions of the INS, LiDAR SLAM, and INS/LiDAR are compared to the ground truth for every dataset.
Figure 12 presents a thorough evaluation of the performance of the developed integrated INS/LiDAR navigation system using the datasets in this case study. The figure graphically depicts the reductions in the RMSE as a result of using the proposed INS/LiDAR integrated navigation system as opposed to its INS counterpart. Additionally, Tables A1–A17 in Appendix A show the detailed position and attitude error statistics.
Figure 9, Figure 10 and Figure 11 reinforce the conclusion that the integrated INS/LiDAR SLAM navigation system performed significantly better than the INS for position estimation. However, there were some cases where it was observable that the INS performance was almost as accurate as the integrated navigation system, which was exemplified by drives D-19 and D-79. Additionally, it is observable from Figure 12 that, for most drives, the proposed integrated navigation system outperformed the INS position estimation. However, for drives D-19, D-23, and D-64, the INS outperformed the integrated system for the up-direction estimates. The reason for this was that the datasets were shorter in length and time with minimal turns in the E-N plane (almost straight lines). Therefore, for these reasons and because of the relatively high grade of the OXTS IMU, the total drift was low.

4.2. Second Case Study—Highway Datasets (Complete GNSS Outage)

The second case study featured driving in highway environments. When using LiDAR SLAM algorithms, one of the main challenges is extracting and matching features among consecutive point cloud frames. Highway scenes are feature-sparse, which makes highway driving particularly challenging.
The proposed integrated navigation system was evaluated in such a challenging environment using different scenarios by testing several datasets of different lengths from the raw KITTI highway dataset.

4.2.1. Sample Relatively Short Highway Dataset

In the first scenario, D-101 was considered, which represents a relatively short dataset. This dataset involved a 1299.13 m trajectory driven in 96.62 s at an average speed of 48.40 km/h. Figure 13 and Figure 14 show the position and attitude errors, respectively, while Table 4 quantifies these errors statistically.
As shown in Figure 13, overall, the integrated INS/LiDAR outperformed the INS solution, even though the INS yielded better position estimates in the east direction. This was because the INS drifted significantly in the north direction. As a result, the net horizontal position was in favor of the integrated navigation system. This is noticeable numerically from the values presented in Table 4. In Figure 14, similar to the residential datasets in the first case study, the results show that the INS attitude estimations outperformed their LiDAR counterpart.
Figure 15 illustrates the resultant trajectories of the three navigation solutions versus the ground truth trajectory for D-101.

4.2.2. Sample Relatively Long Highway Dataset

In the second scenario of this case study, the integrated INS/LiDAR SLAM navigation system was tested using a relatively long highway drive of the raw KITTI dataset, 2011_10_03_drive_0042_sync, labelled as D-42. The drive involves travel along an approximately 2.6-km highway segment in 121.19 s at an average speed of approximately 77 km/h, which is the longest highway dataset in the raw KITTI datasets. The position and attitude errors of D-42 are presented in Figure 16 and Figure 17, respectively, while Table 5 quantifies these errors numerically for all resultant navigation solutions.
Similar to the previous scenario in the second case study, Figure 16 and Figure 17 reinforce the same conclusions: the LiDAR SLAM outperformed the INS for position estimation, albeit not for attitude estimation.
The INS, LiDAR SLAM, and INS/LiDAR trajectories of D-42 are compared against the reference trajectory in Figure 18.

4.2.3. KITTI Highway Datasets

In the third and final scenario of the second case study, a combination of different highway raw KITTI datasets was used to test the proposed integrated navigation system. Figure 19 presents the results of comparing the navigation solution trajectories to the ground truth trajectory for all the datasets considered.
Figure 20 illustrates a graphical summary of the reduction in the RMSE values as a result of using the proposed integrated navigation system as opposed to the sole use of the INS. The results are presented in detail in Tables A18–A23 in Appendix A.
It is noticeable from Figure 20 that the overall performance of the LiDAR SLAM, and thereby of the INS/LiDAR SLAM integrated system, was significantly better than that of the INS, especially for the net horizontal positioning estimation. However, there was only a slight average improvement in the up direction, whereas the improvements in the up direction for the residential datasets were significantly greater, as shown in Figure 12. This was potentially due to the way the up direction is estimated in comparison with the east and north directions. By nature, the collected point clouds include points that are well distributed around the LiDAR scanner, such that the LiDAR is centered among the points in the horizontal plane. However, the vast majority of the collected points were above the LiDAR sensor, while very few points, if any, were below its level. As a result, the points were poorly distributed along the vertical direction, which led to poor geometry. This is analogous to GNSS positioning, where the horizontal dilution of precision (HDOP) is always lower than the vertical dilution of precision (VDOP) due to the poor geometry of the satellites around the GNSS receiver in the vertical direction [25].
Based on the above analysis, the LiDAR SLAM appears to be highly sensitive in the up direction, such that many features are needed to obtain a good up-direction estimate. This explains why feature-rich residential environments (Figure 12) yielded significantly better up-direction estimates than feature-poor highway environments (Figure 20).
It is essential to note that the improvements presented in Figure 20 are percentages, which do not necessarily match a visual comparison of the trajectories in Figure 19. For example, D-15 showed an improvement in the horizontal direction of just over 80%; however, this is not visually apparent in Figure 19b because the drive was too short. Numerical quantification is therefore essential to avoid drawing misleading conclusions.
It is important to note that the performance of the LiDAR SLAM in the case of the highway dataset was generally poorer when compared to its performance in the first case study. This was essentially due to the different nature of the driving environment and the sparseness of the highway scene in terms of the availability of features in comparison with the urban scenes.

4.3. Third Case Study—GNSS Assistance

The third case study considered improvement in the performance of the proposed integrated navigation system in cases where the GNSS signal outages persist for prolonged periods of time. This was accomplished by assuming minimal assistance from the GNSS, mimicking the intermittent receipt and outage of GNSS signals. An analogy to this scenario is driving into an urban canyon with high-rise buildings and skyscrapers which almost completely block the GNSS signal.
Two scenarios were considered in this case study. The first involved the use of a long, residential dataset (D-28), which was the same as the one used in the first case study (second scenario). Similarly, the second scenario considered the effect of GNSS assistance using one of the highway datasets of the KITTI datasets (D-42), which was previously considered in the second case study (second scenario).

4.3.1. Sample Residential Dataset (GNSS-Assisted)

The first scenario involved the nearly 9-min drive (D-28) of the residential raw KITTI drives. A complete GNSS signal outage was assumed along the whole trajectory of the vehicle, except that a single simulated GNSS update was received after half the drive time had elapsed (at approximately the 4.5 min mark). Figure 21 and Figure 22 show the position and attitude errors, respectively, for D-28 with assistance from GNSS. The errors are presented numerically and statistically in Table 6.
It is evident from Figure 21 and Figure 22 that a significant improvement in both position and attitude estimations was accomplished by a single GNSS update at the middle point of the trajectory. This is further illustrated in Table 6, where the RMSE for the whole trajectory was reduced to 4.96 m for the horizontal direction. To emphasize the significance of such an improvement, for the same drive D-28 presented in the first case study, the RMSE value for the horizontal direction was 15.06 m. The results are represented visually on the base map shown in Figure 23, where the three navigation solutions of D-28 (GNSS-assisted) are shown versus the ground truth trajectory.

4.3.2. Sample Highway Dataset (GNSS-Assisted)

The second and final scenario in this case study involved D-42, a relatively long highway drive from the raw KITTI dataset. Similar to the previous scenario, only one GNSS update was received at the middle point of the drive (at approximately the 39 s mark). The position and attitude errors of D-42 are presented in Figure 24 and Figure 25, respectively, while Table 7 shows the statistical characteristics of these errors.
Figure 24 and Figure 25 illustrate the improvements in the position and attitude of D-42 in comparison with Figure 16 and Figure 17 for the same dataset in the case of an artificial complete GNSS signal outage. Similarly, the positioning results show significant improvements, as shown in Table 7, when compared to the results shown in Table 5, where the horizontal position RMSE was reduced from 32.15 m to 13.19 m. Figure 26 shows the resulting three navigation solutions of D-42 versus the ground truth trajectory.

5. Comparison to State-of-the-Art SLAM Algorithms

The proposed INS/LiDAR integrated navigation system was compared to three other state-of-the-art LiDAR SLAM algorithms, namely A-LOAM [13,26], LeGO-LOAM [14,27], and F-LOAM [17,28]. These SLAM algorithms were used to process three datasets, namely D-28 (residential), D-42 (highway), and D-101 (highway). Since the performance of SLAM algorithms can be sensitive to the machine used, it is important to note that these algorithms were installed and tested on a machine with the following specifications and environment configuration: ROS Melodic on Ubuntu 18.04, with an Intel® Core™ i7-8550U CPU @ 1.80 GHz and 16 GB RAM.
As shown in Table 8 and Figure 27, the proposed integrated INS/LiDAR system slightly outperformed the A-LOAM and LeGO-LOAM algorithms for the residential drive D-28, and produced an almost equal RMSE compared to F-LOAM. This was expected, as residential areas are feature-rich, which simplifies and enhances the accuracy of SLAM algorithms. However, for the highway datasets (D-42 and D-101), the proposed system significantly outperformed all three algorithms.

6. Conclusions

In this paper, an LC integrated navigation system was proposed to fuse the INS and the LiDAR SLAM (Kitware) algorithms using an EKF. The proposed navigation system was tested for various driving environments and scenarios using the raw residential and highway KITTI datasets. Three case studies were presented. One featured the use of residential datasets of the raw KITTI datasets, which involved a total driving time of 48 min for all the considered datasets during a complete artificial outage of the GNSS signal. For all the scenarios, the LiDAR SLAM system outperformed the INS for position estimation in all directions. However, for short datasets (below 1 min), the INS performed as well as the LiDAR SLAM system due to the high quality of the IMU (OXTS) of the KITTI data collection platform, which made it less prone to drift for short periods of time. In contrast, the INS yielded better attitude estimations than the LiDAR SLAM system.
The second case study examined the use of highway datasets for a total time of 7 min, comprising all the available highway datasets in the raw KITTI data. Similar to the first case study, the LiDAR SLAM system yielded better positioning results for the longer datasets. However, the INS yielded better results for the up direction, unlike in the first case study. This was because the LiDAR SLAM is very sensitive when estimating the up direction in comparison to the north and east directions, due to the poor geometry of the point cloud distribution in the vertical direction. Therefore, the more features extracted, the better the up-direction estimation.
The INS also outperformed the LiDAR SLAM for attitude estimation. Although the LiDAR SLAM system produced satisfactory results for the highway datasets, its performance was better in the case of driving in residential areas. This was because residential environments are feature-rich.
The third and final case study involved driving within residential and highway environments where intermittent GNSS signal outages occurred, mimicking driving through urban canyons and highways with tunnels. The results for both the residential and highway datasets showed significant improvement in comparison with the same datasets in the first case study, where a complete artificial outage of the GNSS signal was involved.
For all the driving scenarios, the integrated INS/LiDAR SLAM navigation system yielded positioning results for the residential datasets that outperformed the INS by an average RMSE reduction of 88% and 32% in the horizontal and up components, respectively. For the highway datasets, the improvements were of the order of 70% and 0.2% for the horizontal and up components, respectively.
The proposed system was compared to three state-of-the-art LiDAR SLAM algorithms. The system yielded slightly better performance than its counterparts for residential datasets, but significantly outperformed other SLAM algorithms for highway environments.

Author Contributions

N.A. and A.E.-R. conceived the idea and wrote the paper. N.A. implemented the analysis and performed the validation under the supervision of A.E.-R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Natural Sciences and Engineering Research Council of Canada (NSERC), the Ryerson Graduate Fellowship (RGF), and the Ontario Graduate Scholarship (OGS).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data used in this study can be found at: http://www.cvlibs.net/datasets/kitti/ (accessed on 27 April 2022).

Acknowledgments

This work is supported by the Natural Sciences and Engineering Research Council of Canada (NSERC), Ryerson University, and the Government of Ontario.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Appendix A.1. Residential Datasets

Table A1. D-18—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 870.78 | 1252.61 | 3118.31 | −0.50 | 3.34 | 7.35 | −0.50 | 3.34 | 7.35 |
| North | 130.45 | 229.34 | 482.44 | −9.51 | 11.88 | 20.71 | −9.51 | 11.88 | 20.71 |
| Horizontal | 901.05 | 1273.43 | 3155.41 | 10.24 | 12.34 | 20.72 | 10.24 | 12.34 | 20.72 |
| Up | 17.78 | 23.83 | 56.71 | 3.71 | 4.20 | 10.54 | 3.94 | 4.72 | 9.81 |
| Roll | 0.119 | 0.740 | 1.763 | 0.274 | 1.151 | 2.482 | 0.130 | 0.735 | 1.767 |
| Pitch | −0.036 | 0.641 | 1.651 | −0.285 | 0.898 | 2.841 | −0.039 | 0.631 | 1.639 |
| Yaw | 2.123 | 3.100 | 4.776 | 0.989 | 1.277 | 2.742 | 2.164 | 3.190 | 5.249 |
Table A2. D-19—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | −3.71 | 4.39 | 7.75 | 2.75 | 2.83 | 4.10 | 2.74 | 2.83 | 4.10 |
| North | 0.29 | 1.73 | 5.01 | −2.97 | 3.04 | 3.56 | −2.97 | 3.04 | 3.56 |
| Horizontal | 4.00 | 4.71 | 9.23 | 4.08 | 4.15 | 5.27 | 4.08 | 4.15 | 5.27 |
| Up | 2.25 | 2.86 | 5.53 | 6.17 | 7.32 | 11.08 | 5.56 | 7.31 | 12.70 |
| Roll | −0.047 | 0.120 | 0.311 | −1.014 | 1.290 | 2.523 | −0.050 | 0.122 | 0.312 |
| Pitch | 0.050 | 0.140 | 0.316 | −1.218 | 1.409 | 2.753 | 0.047 | 0.136 | 0.309 |
| Yaw | 0.169 | 0.220 | 0.497 | −0.575 | 0.638 | 1.456 | 0.170 | 0.221 | 0.497 |
Table A3. D-20—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | −162.07 | 220.09 | 480.21 | −0.90 | 1.00 | 1.95 | −0.90 | 1.00 | 1.95 |
| North | −283.97 | 375.19 | 827.29 | 1.68 | 1.86 | 3.80 | 1.68 | 1.86 | 3.80 |
| Horizontal | 327.41 | 434.98 | 956.56 | 2.02 | 2.11 | 4.01 | 2.02 | 2.11 | 4.01 |
| Up | 3.67 | 5.28 | 13.48 | 3.08 | 3.43 | 6.38 | 3.09 | 3.40 | 6.04 |
| Roll | −0.066 | 0.592 | 0.921 | 0.444 | 1.218 | 1.980 | −0.065 | 0.592 | 0.926 |
| Pitch | −0.020 | 0.481 | 0.875 | −0.292 | 0.953 | 1.936 | −0.021 | 0.483 | 0.872 |
| Yaw | 1.439 | 1.448 | 1.708 | −0.669 | 1.067 | 3.623 | 1.433 | 1.442 | 1.688 |
Table A4. D-22—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 17.14 | 24.39 | 50.71 | 0.27 | 1.98 | 2.90 | 0.27 | 1.98 | 2.90 |
| North | −76.01 | 107.52 | 267.92 | −3.53 | 3.83 | 5.57 | −3.53 | 3.83 | 5.57 |
| Horizontal | 78.00 | 110.25 | 272.67 | 4.03 | 4.31 | 6.11 | 4.03 | 4.31 | 6.11 |
| Up | 5.67 | 6.97 | 13.17 | 1.92 | 2.20 | 4.01 | 2.21 | 2.76 | 5.45 |
| Roll | 0.127 | 0.461 | 0.902 | −0.313 | 0.592 | 1.373 | 0.126 | 0.452 | 0.888 |
| Pitch | 0.157 | 0.377 | 0.883 | −0.304 | 0.614 | 1.479 | 0.153 | 0.370 | 0.864 |
| Yaw | 0.180 | 0.351 | 0.683 | −0.016 | 0.733 | 1.734 | 0.174 | 0.346 | 0.677 |
Table A5. D-23—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 15.55 | 21.10 | 47.48 | 0.45 | 0.63 | 1.30 | 0.45 | 0.63 | 1.30 |
| North | 6.63 | 8.25 | 13.72 | 0.83 | 1.29 | 2.20 | 0.83 | 1.29 | 2.20 |
| Horizontal | 18.00 | 22.65 | 47.68 | 1.20 | 1.43 | 2.26 | 1.20 | 1.43 | 2.26 |
| Up | 1.79 | 2.61 | 6.21 | 4.48 | 6.31 | 14.15 | 3.93 | 5.80 | 13.41 |
| Roll | −0.193 | 0.336 | 0.619 | −0.061 | 0.164 | 0.543 | −0.200 | 0.346 | 0.646 |
| Pitch | 0.250 | 0.294 | 0.466 | −1.471 | 1.726 | 3.227 | 0.240 | 0.284 | 0.444 |
| Yaw | 0.107 | 0.196 | 0.353 | −1.015 | 1.027 | 1.380 | 0.108 | 0.198 | 0.363 |
Table A6. D-27_d—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 47,110.14 | 62751.46 | 140945.00 | 1.98 | 20.16 | 54.11 | 1.98 | 20.16 | 54.11 |
| North | 13,282.11 | 15,974.94 | 23,896.15 | 2.13 | 23.92 | 62.92 | 2.13 | 23.92 | 62.92 |
| Horizontal | 49,127.97 | 64,752.95 | 142,305.11 | 24.38 | 31.28 | 74.57 | 24.38 | 31.29 | 74.56 |
| Up | 2234.59 | 3017.39 | 6928.42 | 7.24 | 8.23 | 17.69 | 22.01 | 23.81 | 45.11 |
| Roll | 0.120 | 6.341 | 10.456 | 0.296 | 1.775 | 6.077 | 0.143 | 5.749 | 10.012 |
| Pitch | 0.018 | 6.213 | 10.939 | −0.196 | 1.399 | 4.327 | −0.061 | 5.797 | 10.928 |
| Yaw | −1.059 | 3.438 | 8.489 | 8.942 | 42.785 | 269.993 | 1.605 | 4.211 | 9.148 |
Table A7. D-33—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 39.11 | 73.59 | 168.64 | −0.18 | 5.07 | 8.62 | −0.19 | 5.07 | 8.61 |
| North | 249.35 | 382.20 | 991.30 | −0.96 | 6.63 | 13.08 | −0.96 | 6.63 | 13.07 |
| Horizontal | 254.16 | 389.22 | 1005.54 | 7.92 | 8.35 | 13.38 | 7.92 | 8.35 | 13.37 |
| Up | 11.86 | 13.61 | 20.29 | 6.45 | 7.91 | 16.35 | 6.96 | 9.08 | 18.21 |
| Roll | −0.030 | 0.435 | 1.142 | −0.947 | 2.132 | 5.235 | 0.006 | 0.444 | 0.977 |
| Pitch | −0.028 | 0.615 | 1.195 | 0.074 | 1.937 | 7.026 | 0.014 | 0.573 | 1.117 |
| Yaw | −0.104 | 0.214 | 0.470 | 1.561 | 7.476 | 26.579 | −0.124 | 0.270 | 0.702 |
Table A8. D-34—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | −79.24 | 122.03 | 320.11 | 69.19 | 78.54 | 119.98 | 69.19 | 78.54 | 119.97 |
| North | 96.27 | 147.53 | 421.25 | −4.50 | 9.12 | 17.64 | −4.50 | 9.12 | 17.64 |
| Horizontal | 125.03 | 191.46 | 529.08 | 70.12 | 79.06 | 119.98 | 70.12 | 79.06 | 119.98 |
| Up | 9.30 | 13.07 | 23.81 | 9.81 | 11.40 | 21.19 | 9.65 | 11.75 | 21.31 |
| Roll | 0.223 | 0.367 | 1.146 | −0.544 | 1.351 | 4.481 | 0.277 | 0.444 | 1.334 |
| Pitch | −0.196 | 0.714 | 1.442 | −0.466 | 2.728 | 4.549 | −0.155 | 0.680 | 1.528 |
| Yaw | 0.068 | 0.169 | 0.425 | 10.368 | 10.388 | 12.454 | −0.041 | 0.238 | 0.546 |
Table A9. D-34_d—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 484.13 | 795.01 | 1556.31 | −2.90 | 11.08 | 35.52 | −2.90 | 11.08 | 35.52 |
| North | 743.81 | 1806.97 | 5738.22 | −23.26 | 28.30 | 56.44 | −23.26 | 28.30 | 56.44 |
| Horizontal | 1507.41 | 1974.13 | 5805.37 | 25.99 | 30.39 | 59.88 | 25.99 | 30.39 | 59.88 |
| Up | 165.96 | 316.38 | 1063.35 | 28.37 | 32.61 | 62.30 | 31.56 | 37.90 | 78.94 |
| Roll | 0.247 | 2.021 | 6.144 | −0.089 | 3.475 | 11.262 | 0.077 | 2.430 | 8.920 |
| Pitch | −0.070 | 2.644 | 6.298 | −0.573 | 3.763 | 10.065 | −0.040 | 3.570 | 9.032 |
| Yaw | 0.928 | 1.068 | 4.015 | −2.987 | 12.188 | 66.465 | 0.423 | 1.337 | 3.942 |
Table A10. D-36—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 15.94 | 21.86 | 50.58 | −0.85 | 1.28 | 2.45 | −0.85 | 1.28 | 2.45 |
| North | −67.85 | 100.41 | 252.75 | 1.71 | 2.11 | 4.20 | 1.71 | 2.11 | 4.20 |
| Horizontal | 69.90 | 102.77 | 257.77 | 2.28 | 2.46 | 4.21 | 2.28 | 2.46 | 4.21 |
| Up | 4.60 | 5.94 | 10.51 | 1.09 | 1.28 | 2.54 | 0.88 | 1.23 | 3.04 |
| Roll | −0.387 | 0.464 | 0.743 | −0.987 | 1.127 | 1.929 | −0.400 | 0.480 | 0.767 |
| Pitch | 0.198 | 0.507 | 0.795 | −0.580 | 0.916 | 1.892 | 0.186 | 0.494 | 0.770 |
| Yaw | 0.194 | 0.220 | 0.444 | 0.024 | 0.386 | 1.050 | 0.193 | 0.217 | 0.446 |
Table A11. D-39—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | −8.03 | 10.37 | 21.34 | −6.28 | 7.47 | 12.20 | −6.28 | 7.47 | 12.20 |
| North | 11.53 | 17.57 | 44.76 | 5.28 | 6.28 | 10.13 | 5.28 | 6.28 | 10.13 |
| Horizontal | 14.41 | 20.40 | 49.59 | 8.29 | 9.76 | 15.85 | 8.29 | 9.76 | 15.85 |
| Up | 6.15 | 7.61 | 14.47 | 1.69 | 2.11 | 4.17 | 3.24 | 3.60 | 4.73 |
| Roll | −0.361 | 0.392 | 0.615 | 1.045 | 1.337 | 2.870 | −0.360 | 0.391 | 0.614 |
| Pitch | −0.254 | 0.306 | 0.493 | −0.998 | 1.175 | 2.153 | −0.245 | 0.295 | 0.491 |
| Yaw | −0.077 | 0.106 | 0.310 | 2.898 | 2.937 | 3.481 | −0.077 | 0.107 | 0.310 |
Table A12. D-46—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | −0.77 | 0.99 | 2.14 | −0.26 | 0.44 | 0.88 | −0.26 | 0.44 | 0.88 |
| North | 0.32 | 0.42 | 1.11 | −0.30 | 0.39 | 0.69 | −0.30 | 0.39 | 0.69 |
| Horizontal | 0.84 | 1.08 | 2.41 | 0.57 | 0.59 | 0.91 | 0.56 | 0.58 | 0.91 |
| Up | 0.39 | 0.48 | 1.00 | 1.34 | 1.35 | 1.62 | 0.34 | 0.41 | 0.78 |
| Roll | 0.037 | 0.068 | 0.186 | −0.182 | 0.441 | 1.061 | 0.037 | 0.068 | 0.186 |
| Pitch | −0.034 | 0.062 | 0.104 | 0.074 | 0.298 | 0.614 | −0.034 | 0.062 | 0.103 |
| Yaw | −0.107 | 0.153 | 0.316 | −0.776 | 0.990 | 1.858 | −0.108 | 0.154 | 0.316 |
Table A13. D-61—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | −125.23 | 212.27 | 618.43 | 0.26 | 0.50 | 1.35 | 0.26 | 0.50 | 1.35 |
| North | 9.36 | 18.80 | 57.37 | 1.63 | 1.90 | 3.07 | 1.63 | 1.90 | 3.07 |
| Horizontal | 126.06 | 213.10 | 621.09 | 1.70 | 1.97 | 3.27 | 1.70 | 1.97 | 3.27 |
| Up | 11.66 | 19.69 | 55.26 | 3.79 | 4.60 | 10.40 | 5.91 | 8.07 | 16.39 |
| Roll | −0.603 | 1.168 | 2.447 | −0.373 | 0.497 | 1.634 | −0.387 | 0.873 | 1.930 |
| Pitch | 2.201 | 2.828 | 4.468 | −1.376 | 1.750 | 3.034 | 2.258 | 2.905 | 4.519 |
| Yaw | 0.227 | 0.328 | 0.676 | −0.271 | 0.502 | 1.441 | 0.227 | 0.331 | 0.676 |
Table A14. D-64—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | −4.70 | 5.72 | 13.46 | −7.50 | 9.62 | 19.22 | −7.50 | 9.62 | 19.22 |
| North | −26.14 | 38.60 | 94.03 | 8.64 | 10.43 | 16.53 | 8.64 | 10.43 | 16.53 |
| Horizontal | 26.77 | 39.02 | 94.99 | 11.66 | 14.19 | 24.06 | 11.66 | 14.19 | 24.06 |
| Up | 1.16 | 1.37 | 2.42 | 1.65 | 2.08 | 6.14 | 1.43 | 1.69 | 4.15 |
| Roll | 0.152 | 0.358 | 0.817 | −1.510 | 2.038 | 3.508 | 0.160 | 0.352 | 0.815 |
| Pitch | 0.226 | 0.270 | 0.535 | −1.667 | 2.143 | 4.440 | 0.214 | 0.256 | 0.503 |
| Yaw | −0.021 | 0.101 | 0.306 | 3.245 | 3.288 | 4.696 | −0.023 | 0.100 | 0.293 |
Table A15. D-79—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 0.60 | 0.68 | 1.19 | 0.32 | 0.33 | 0.43 | 0.32 | 0.33 | 0.43 |
| North | 0.49 | 0.58 | 1.12 | 0.16 | 0.20 | 0.30 | 0.16 | 0.20 | 0.30 |
| Horizontal | 0.78 | 0.89 | 1.63 | 0.38 | 0.39 | 0.51 | 0.37 | 0.39 | 0.51 |
| Up | 0.12 | 0.16 | 0.31 | 1.90 | 1.92 | 2.15 | 0.14 | 0.17 | 0.32 |
| Roll | −0.063 | 0.103 | 0.171 | 0.968 | 1.005 | 1.865 | −0.063 | 0.103 | 0.172 |
| Pitch | −0.044 | 0.071 | 0.118 | −0.056 | 0.165 | 0.491 | −0.044 | 0.071 | 0.118 |
| Yaw | −0.005 | 0.051 | 0.113 | −0.238 | 0.440 | 0.810 | −0.005 | 0.051 | 0.113 |
Table A16. D-86—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 23.17 | 32.12 | 66.08 | 3.23 | 4.02 | 8.01 | 3.23 | 4.02 | 8.01 |
| North | −43.10 | 63.11 | 141.80 | −1.75 | 2.02 | 3.18 | −1.75 | 2.02 | 3.18 |
| Horizontal | 49.42 | 70.81 | 156.44 | 3.79 | 4.50 | 8.11 | 3.79 | 4.50 | 8.11 |
| Up | 7.05 | 9.22 | 19.31 | 0.45 | 0.53 | 0.94 | 1.30 | 1.45 | 2.42 |
| Roll | −0.257 | 0.421 | 0.907 | 1.186 | 1.689 | 3.914 | −0.253 | 0.425 | 0.913 |
| Pitch | −0.125 | 0.211 | 0.385 | 0.209 | 1.514 | 4.972 | −0.150 | 0.231 | 0.437 |
| Yaw | 0.685 | 0.717 | 1.623 | 1.351 | 1.432 | 2.613 | 0.682 | 0.713 | 1.604 |
Table A17. D-87—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 14.96 | 30.21 | 93.41 | −0.37 | 0.61 | 1.78 | −0.37 | 0.61 | 1.78 |
| North | −22.19 | 32.94 | 92.80 | −0.78 | 1.16 | 2.63 | −0.78 | 1.16 | 2.62 |
| Horizontal | 28.67 | 44.70 | 131.67 | 1.09 | 1.31 | 2.68 | 1.09 | 1.31 | 2.67 |
| Up | 4.56 | 6.68 | 16.36 | 1.30 | 1.57 | 3.47 | 1.99 | 2.42 | 4.32 |
| Roll | 0.491 | 0.672 | 1.449 | −3.105 | 3.711 | 5.740 | 0.453 | 0.616 | 1.328 |
| Pitch | 0.094 | 0.283 | 0.843 | −0.591 | 1.050 | 3.277 | 0.063 | 0.239 | 0.689 |
| Yaw | −0.063 | 0.560 | 1.137 | 0.638 | 0.855 | 1.616 | −0.061 | 0.557 | 1.122 |

Appendix A.2. Highway Datasets

Table A18. D-04—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 2.388 | 3.190 | 7.487 | 5.890 | 6.397 | 8.745 | 5.894 | 6.405 | 8.757 |
| North | −5.394 | 7.385 | 16.745 | −5.845 | 6.496 | 9.873 | −5.849 | 6.507 | 9.895 |
| Horizontal | 5.908 | 8.045 | 18.343 | 8.317 | 9.117 | 13.184 | 8.320 | 9.130 | 13.209 |
| Up | 0.321 | 0.344 | 0.481 | 2.894 | 3.068 | 4.575 | 0.327 | 0.350 | 0.484 |
| Roll | 0.142 | 0.149 | 0.225 | 0.096 | 0.410 | 0.919 | 0.142 | 0.149 | 0.225 |
| Pitch | 0.065 | 0.069 | 0.120 | −0.743 | 0.834 | 1.220 | 0.065 | 0.069 | 0.120 |
| Yaw | 0.076 | 0.099 | 0.169 | −3.196 | 3.203 | 3.409 | 0.076 | 0.099 | 0.169 |
Table A19. D-15—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 1.97 | 2.56 | 5.53 | −0.94 | 0.95 | 1.29 | −0.91 | 0.94 | 1.51 |
| North | 3.29 | 4.45 | 10.38 | −0.86 | 0.88 | 1.31 | −0.84 | 0.87 | 1.48 |
| Horizontal | 3.84 | 5.14 | 11.77 | 1.28 | 1.30 | 1.84 | 1.24 | 1.28 | 2.12 |
| Up | 0.48 | 0.60 | 1.30 | 3.53 | 3.94 | 7.99 | 0.48 | 0.60 | 1.31 |
| Roll | 0.008 | 0.027 | 0.065 | 1.165 | 1.169 | 1.394 | 0.008 | 0.027 | 0.065 |
| Pitch | −0.184 | 0.203 | 0.400 | −0.745 | 0.899 | 1.764 | −0.184 | 0.203 | 0.400 |
| Yaw | 0.053 | 0.071 | 0.176 | −0.372 | 0.392 | 0.635 | 0.053 | 0.071 | 0.176 |
Table A20. D-16—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | −0.61 | 0.81 | 1.37 | −1.34 | 1.61 | 2.86 | −1.34 | 1.61 | 2.86 |
| North | 2.45 | 3.34 | 7.00 | −1.08 | 1.13 | 1.69 | −1.07 | 1.12 | 1.64 |
| Horizontal | 2.55 | 3.43 | 7.13 | 1.77 | 1.97 | 3.32 | 1.76 | 1.97 | 3.30 |
| Up | 1.58 | 1.90 | 3.33 | 2.89 | 4.14 | 9.92 | 1.58 | 1.90 | 3.33 |
| Roll | −0.014 | 0.050 | 0.139 | −1.496 | 1.811 | 3.116 | −0.014 | 0.050 | 0.139 |
| Pitch | −0.072 | 0.119 | 0.220 | −1.261 | 1.492 | 2.709 | −0.072 | 0.119 | 0.220 |
| Yaw | 0.025 | 0.042 | 0.111 | 0.187 | 0.203 | 0.384 | 0.025 | 0.042 | 0.111 |
Table A21. D-28—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | −2.72 | 3.63 | 7.94 | 4.86 | 4.97 | 7.24 | 4.81 | 4.97 | 8.74 |
| North | −6.63 | 9.31 | 23.20 | −6.86 | 6.98 | 8.22 | −6.84 | 7.00 | 8.06 |
| Horizontal | 7.17 | 10.00 | 24.52 | 8.54 | 8.57 | 9.19 | 8.50 | 8.58 | 10.05 |
| Up | 1.31 | 1.54 | 2.78 | 7.72 | 9.92 | 22.09 | 1.32 | 1.56 | 2.82 |
| Roll | −0.135 | 0.146 | 0.238 | −0.595 | 1.144 | 2.061 | −0.135 | 0.146 | 0.238 |
| Pitch | 0.118 | 0.150 | 0.370 | −1.883 | 2.207 | 3.835 | 0.118 | 0.150 | 0.370 |
| Yaw | 0.294 | 0.317 | 0.450 | −0.032 | 0.516 | 1.059 | 0.294 | 0.317 | 0.450 |
Table A22. D-29—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | −4.71 | 6.59 | 13.50 | −0.70 | 0.83 | 1.36 | −0.70 | 0.84 | 1.37 |
| North | 18.41 | 25.28 | 58.60 | 0.12 | 0.44 | 0.79 | 0.14 | 0.46 | 0.80 |
| Horizontal | 19.04 | 26.12 | 60.13 | 0.82 | 0.94 | 1.51 | 0.83 | 0.96 | 1.53 |
| Up | 3.28 | 3.73 | 5.54 | 4.43 | 4.61 | 5.85 | 3.28 | 3.73 | 5.52 |
| Roll | 0.184 | 0.322 | 0.658 | 0.610 | 0.687 | 1.245 | 0.184 | 0.322 | 0.658 |
| Pitch | −0.158 | 0.281 | 0.378 | −0.243 | 0.806 | 1.250 | −0.158 | 0.281 | 0.378 |
| Yaw | −0.084 | 0.307 | 1.660 | −0.934 | 1.329 | 3.620 | −0.084 | 0.306 | 1.660 |
Table A23. D-32—position (m) and attitude (deg) error statistics.

| | INS | | | LiDAR | | | INS/LiDAR | | |
| | Mean | RMSE | Max | Mean | RMSE | Max | Mean | RMSE | Max |
| East | 2.31 | 2.87 | 5.45 | 0.73 | 0.77 | 1.06 | 0.74 | 0.77 | 1.05 |
| North | 1.18 | 1.39 | 2.36 | 0.41 | 0.85 | 2.18 | 0.41 | 0.84 | 2.13 |
| Horizontal | 2.63 | 3.19 | 5.75 | 1.04 | 1.15 | 2.25 | 1.04 | 1.14 | 2.20 |
| Up | 2.45 | 3.36 | 7.56 | 8.70 | 10.46 | 21.40 | 2.45 | 3.36 | 7.56 |
| Roll | −0.019 | 0.072 | 0.196 | 0.760 | 0.767 | 1.042 | −0.019 | 0.072 | 0.196 |
| Pitch | −0.038 | 0.077 | 0.194 | −1.411 | 1.634 | 2.996 | −0.038 | 0.077 | 0.194 |
| Yaw | 0.023 | 0.050 | 0.093 | −0.409 | 0.492 | 0.863 | 0.023 | 0.050 | 0.093 |

References

  1. De Ponte Müller, F.J.S. Survey on ranging sensors and cooperative techniques for relative positioning of vehicles. Sensors 2017, 17, 271.
  2. Martínez-Díaz, M.; Soriguera, F. Autonomous vehicles: Theoretical and practical challenges. Transp. Res. Procedia 2018, 33, 275–282.
  3. Shin, E.-H. Estimation Techniques for Low-Cost Inertial Navigation. Ph.D. Thesis, University of Calgary, Calgary, AB, Canada, 2005.
  4. Noureldin, A.; Karamat, T.B.; Georgy, J. Fundamentals of Inertial Navigation, Satellite-Based Positioning and Their Integration; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2013; Volume 314, pp. 1–18.
  5. Wang, G.; Han, Y.; Chen, J.; Wang, S.; Zhang, Z.; Du, N.; Zheng, Y. A GNSS/INS integrated navigation algorithm based on Kalman filter. IFAC-PapersOnLine 2018, 51, 232–237.
  6. Elmezayen, A.; El-Rabbany, A. Performance evaluation of real-time tightly-coupled GNSS PPP/MEMS-based inertial integration using an improved robust adaptive Kalman filter. J. Appl. Geod. 2020, 14, 413–430.
  7. Elmezayen, A.; El-Rabbany, A. Ultra-Low-Cost Tightly Coupled Triple-Constellation GNSS PPP/MEMS-Based INS Integration for Land Vehicular Applications. Geomatics 2021, 1, 258–286.
  8. Gao, B.; Hu, G.; Zhong, Y.; Zhu, X. Cubature Kalman Filter with Both Adaptability and Robustness for Tightly-Coupled GNSS/INS Integration. IEEE Sens. J. 2021, 21, 14997–15011.
  9. Abd Rabbou, M.; El-Rabbany, A. Tightly coupled integration of GPS precise point positioning and MEMS-based inertial systems. GPS Solut. 2015, 19, 601–609.
  10. Ben-Ari, M.; Mondada, F. Elements of Robotics; Springer International Publishing: Cham, Switzerland, 2018.
  11. Zhang, J.; Singh, S. LOAM: Lidar Odometry and Mapping in Real-time. In Proceedings of Robotics: Science and Systems, Berkeley, CA, USA, 12–16 July 2014.
  12. KITTI. 2013. Available online: http://www.cvlibs.net/datasets/kitti/eval_odometry.php (accessed on 10 October 2021).
  13. LOAM. 2018. Available online: https://github.com/HKUST-Aerial-Robotics/A-LOAM (accessed on 10 October 2021).
  14. Shan, T.; Englot, B. LeGO-LOAM: Lightweight and ground-optimized lidar odometry and mapping on variable terrain. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018.
  15. KITWARE. 2020. Available online: https://gitlab.kitware.com/keu-computervision/slam (accessed on 1 May 2020).
  16. Oelsch, M.; Karimi, M.; Steinbach, E. R-LOAM: Improving LiDAR Odometry and Mapping With Point-to-Mesh Features of a Known 3D Reference Object. IEEE Robot. Autom. Lett. 2021, 6, 2068–2075.
  17. Wang, H.; Wang, C.; Chen, C.L.; Xie, L. F-LOAM: Fast lidar odometry and mapping. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 4390–4396.
  18. Chang, L.; Niu, X.; Liu, T.; Tang, J.; Qian, C. GNSS/INS/LiDAR-SLAM integrated navigation system based on graph optimization. Remote Sens. 2019, 11, 1009.
  19. Chang, L.; Niu, X.; Liu, T. GNSS/IMU/ODO/LiDAR-SLAM Integrated Navigation System Using IMU/ODO Pre-Integration. Sensors 2020, 20, 4702.
  20. Aboutaleb, A.; El-Wakeel, A.S.; Elghamrawy, H.; Noureldin, A. LiDAR/RISS/GNSS dynamic integration for land vehicle robust positioning in challenging GNSS environments. Remote Sens. 2020, 12, 2323.
  21. Das, S.; Kumari, R.; Kumar, S.D. A Review on Applications of Simultaneous Localization and Mapping Method in Autonomous Vehicles. In Advances in Interdisciplinary Engineering; Springer: Singapore, 2021; pp. 367–375.
  22. Abd Rabbou, M.; El-Rabbany, A. Precise point positioning using multi-constellation GNSS observations for kinematic applications. J. Appl. Geod. 2015, 9, 15–26.
  23. Gao, Y.; Liu, S.; Atia, M.M.; Noureldin, A. INS/GPS/LiDAR integrated navigation system for urban and indoor environments using hybrid scan matching algorithm. Sensors 2015, 15, 23286–23302.
  24. Geiger, A.; Lenz, P.; Stiller, C.; Urtasun, R. Vision meets robotics: The KITTI dataset. Int. J. Robot. Res. 2013, 32, 1231–1237.
  25. El-Rabbany, A. Introduction to GPS: The Global Positioning System; Artech House: London, UK, 2002.
  26. LOAM. 2020. Available online: https://github.com/Mitchell-Lee-93/kitti-A-LOAM (accessed on 25 May 2022).
  27. LeGO-LOAM. 2020. Available online: https://github.com/Mitchell-Lee-93/kitti-lego-loam (accessed on 25 May 2022).
  28. F-LOAM. 2021. Available online: https://github.com/wh200720041/floam (accessed on 25 May 2022).
Figure 1. Graphical illustration of LiDAR SLAM pose transformation into the WGS84 reference frame.
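Figure 1 illustrates mapping the LiDAR SLAM pose, expressed in a local frame anchored at the first epoch, into WGS84. Below is a minimal sketch of one way to perform this step, converting a local ENU offset into geodetic coordinates via the WGS84 radii of curvature; the function name and the flat-earth linearization are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

# WGS84 semi-major axis (m) and first eccentricity squared
A, E2 = 6378137.0, 6.69437999014e-3

def enu_to_geodetic(e, n, u, lat0, lon0, h0):
    """Convert a local ENU offset (m) at a reference point (lat0/lon0 in
    radians, h0 in m) to updated geodetic coordinates."""
    s = np.sin(lat0) ** 2
    r_m = A * (1.0 - E2) / (1.0 - E2 * s) ** 1.5   # meridian radius of curvature
    r_n = A / np.sqrt(1.0 - E2 * s)                # prime-vertical radius of curvature
    lat = lat0 + n / (r_m + h0)                    # north offset -> latitude change
    lon = lon0 + e / ((r_n + h0) * np.cos(lat0))   # east offset -> longitude change
    return lat, lon, h0 + u
```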
Figure 2. A block diagram for the INS/LiDAR SLAM LC integration.
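In the LC architecture of Figure 2, the LiDAR SLAM solution serves as the measurement update to the INS prediction in an EKF. The following is a minimal sketch of the standard EKF update step under that reading; the matrix names follow textbook notation and are not taken from the authors' code.

```python
import numpy as np

def ekf_update(x, P, innovation, H, R):
    """One EKF measurement update.
    x: predicted (error-)state; P: its covariance;
    innovation: measurement minus predicted measurement (e.g., LiDAR SLAM
    position minus INS-predicted position); H: measurement matrix;
    R: measurement noise covariance."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ innovation                   # corrected state
    P = (np.eye(P.shape[0]) - K @ H) @ P     # updated covariance
    return x, P
```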
Figure 3. D-27—complete GNSS signal outage: position errors (ENU).
Figure 4. D-27—complete GNSS signal outage: errors of attitude angles (roll, pitch, and yaw).
Figure 5. D-27—complete GNSS signal outage: comparison of trajectories.
Figure 6. D-28—complete GNSS signal outage: position errors (ENU).
Figure 7. D-28—complete GNSS signal outage: errors of attitude angles (roll, pitch, and yaw).
Figure 8. D-28—complete GNSS signal outage: comparison of trajectories.
Figure 9. Complete GNSS signal outage: comparison of trajectories (D-18 to D-27_d).
Figure 10. Complete GNSS signal outage: comparison of trajectories (D-33 to D-64).
Figure 11. Complete GNSS signal outage: comparison of trajectories (D-79 to D-87).
Figure 12. Reductions in RMSE of KITTI residential datasets.
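The reductions plotted in Figure 12 can be read as the relative RMSE improvement of the integrated solution over the stand-alone INS. A minimal sketch, assuming the reduction is computed as (RMSE_INS − RMSE_INS/LiDAR) / RMSE_INS; the helper name is illustrative.

```python
def rmse_reduction(rmse_ins: float, rmse_fused: float) -> float:
    """Percentage reduction in RMSE relative to the stand-alone INS."""
    return 100.0 * (rmse_ins - rmse_fused) / rmse_ins

# Example with the D-27 horizontal RMSEs from Table 2 (143.03 m vs. 4.57 m):
print(round(rmse_reduction(143.03, 4.57), 1))  # -> 96.8
```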
Figure 13. D-101—complete GNSS signal outage: position errors (ENU).
Figure 14. D-101—complete GNSS signal outage: errors of attitude angles (roll, pitch, and yaw).
Figure 15. D-101—complete GNSS signal outage: comparison of trajectories.
Figure 16. D-42—complete GNSS signal outage: position errors (ENU).
Figure 17. D-42—complete GNSS signal outage: errors of attitude angles (roll, pitch, and yaw).
Figure 18. D-42—complete GNSS signal outage: comparison of trajectories.
Figure 19. Complete GNSS signal outage: comparison of trajectories.
Figure 20. Reductions in RMSE of KITTI highway datasets.
Figure 21. D-28—GNSS-assisted: position errors (ENU).
Figure 22. D-28—GNSS-assisted: errors of attitude angles (roll, pitch, and yaw).
Figure 23. D-28—GNSS-assisted: comparison of trajectories.
Figure 24. D-42—GNSS-assisted: position errors (ENU).
Figure 25. D-42—GNSS-assisted: errors of attitude angles (roll, pitch, and yaw).
Figure 26. D-42—GNSS-assisted: comparison of trajectories.
Figure 27. Performance comparison between the proposed integrated system and state-of-the-art SLAM algorithms (RMSE in horizontal direction).
Table 1. Trajectory information for the KITTI datasets considered [24].
Drive Label    Drive Number                    Length (m)    Time (s)    Average Speed (km/h)    No. of Frames
Residential datasets
D-18           2011_09_30_drive_0018_sync      2206.47       287.53      27.63                   2761
D-19           2011_09_26_drive_0019_sync      406.48        49.61       29.50                   480
D-20           2011_09_30_drive_0020_sync      1233.74       114.49      38.79                   1104
D-22           2011_09_26_drive_0022_sync      515.17        82.67       22.43                   799
D-23           2011_09_26_drive_0023_sync      413.99        48.91       30.47                   473
D-27           2011_09_30_drive_0027_sync      692.47        114.85      21.71                   1106
D-27-d         2011_10_03_drive_0027_sync      3669.18       465.97      28.35                   4497
D-28           2011_09_30_drive_0028_sync      4208.65       537.78      28.17                   5177
D-33           2011_09_30_drive_0033_sync      1709.57       165.31      37.23                   1594
D-34           2011_09_30_drive_0034_sync      920.52        126.88      26.12                   1224
D-34-d         2011_10_03_drive_0034_sync      5043.67       481.31      37.72                   4642
D-36           2011_09_26_drive_0036_sync      715.57        82.68       31.16                   802
D-39           2011_09_26_drive_0039_sync      297.78        40.57       26.42                   394
D-46           2011_09_26_drive_0046_sync      47.56         12.81       13.37                   125
D-61           2011_09_26_drive_0061_sync      494.02        72.61       24.49                   703
D-64           2011_09_26_drive_0064_sync      437.89        58.74       26.84                   569
D-79           2011_09_26_drive_0079_sync      26.27         10.24       9.23                    100
D-86           2011_09_26_drive_0086_sync      268.72        72.84       13.28                   705
D-87           2011_09_26_drive_0087_sync      286.79        75.27       13.72                   728
Highway datasets
D-04           2011_09_29_drive_0004_sync      255.05        35.15       26.13                   339
D-15           2011_09_26_drive_0015_sync      362.81        30.65       42.61                   297
D-16           2011_09_30_drive_0016_sync      404.71        28.84       50.52                   278
D-28           2011_09_26_drive_0028_sync      779.50        44.43       63.16                   430
D-29           2011_09_26_drive_0029_sync      351.24        44.46       28.44                   430
D-32           2011_09_26_drive_0032_sync      578.30        40.31       51.64                   390
D-42           2011_10_03_drive_0042_sync      2591.80       121.19      76.99                   1170
D-101          2011_09_26_drive_0101_sync      1299.13       96.62       48.40                   936
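The Average Speed column follows directly from the listed length and duration of each drive; a quick check in Python for drive D-27:

```python
# Average speed (km/h) = length (m) / time (s) * 3.6
length_m, time_s = 692.47, 114.85          # drive D-27 from Table 1
print(round(length_m / time_s * 3.6, 2))   # -> 21.71 km/h, matching the table
```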
Table 2. D-27—position (m) and attitude (deg) error statistics.
              INS (Mean / RMSE / Max)    LiDAR (Mean / RMSE / Max)    INS/LiDAR (Mean / RMSE / Max)
East          −30.38 / 44.27 / 122.57    −0.88 / 3.34 / 5.84          −0.88 / 3.34 / 5.84
North         −88.24 / 136.01 / 353.70   −0.22 / 3.12 / 6.71          −0.22 / 3.12 / 6.71
Horizontal    94.16 / 143.03 / 374.34    4.22 / 4.57 / 7.19           4.22 / 4.57 / 7.19
Up            7.80 / 10.62 / 22.85       2.31 / 2.55 / 4.57           2.57 / 2.95 / 4.78
Roll          0.012 / 0.444 / 0.838      0.504 / 1.276 / 3.084        0.022 / 0.441 / 0.852
Pitch         −0.076 / 0.332 / 0.885     −0.098 / 0.989 / 2.123       −0.073 / 0.330 / 0.881
Yaw           0.280 / 0.337 / 0.863      1.233 / 1.555 / 3.027        0.262 / 0.322 / 0.868
Table 3. D-28—position (m) and attitude (deg) error statistics.
              INS (Mean / RMSE / Max)             LiDAR (Mean / RMSE / Max)    INS/LiDAR (Mean / RMSE / Max)
East          −7196.25 / 10,042.85 / 23,518.43    −9.94 / 13.57 / 29.68        −9.94 / 13.57 / 29.68
North         −6188.34 / 8316.52 / 17,618.60      −2.29 / 6.53 / 20.69         −2.29 / 6.53 / 20.70
Horizontal    9500.92 / 13,039.30 / 29,385.90     12.09 / 15.06 / 31.86        12.09 / 15.06 / 31.86
Up            252.12 / 367.13 / 893.97            9.99 / 12.46 / 60.66         10.61 / 13.57 / 62.23
Roll          0.111 / 0.956 / 1.926               0.085 / 1.675 / 4.495        0.167 / 1.036 / 1.968
Pitch         0.171 / 0.937 / 2.032               0.020 / 1.647 / 4.326        0.157 / 1.025 / 2.304
Yaw           0.802 / 0.963 / 2.213               −1.556 / 2.146 / 5.159       1.247 / 1.670 / 4.797
Table 4. D-101—position (m) and attitude (deg) error statistics.
              INS (Mean / RMSE / Max)    LiDAR (Mean / RMSE / Max)    INS/LiDAR (Mean / RMSE / Max)
East          0.07 / 3.06 / 5.48         9.93 / 11.43 / 20.46         9.93 / 11.44 / 20.42
North         −40.20 / 60.39 / 152.14    −8.23 / 9.40 / 14.95         −8.24 / 9.42 / 14.95
Horizontal    40.42 / 60.46 / 152.22     13.07 / 14.80 / 25.10        13.08 / 14.82 / 25.03
Up            4.89 / 6.06 / 11.72        25.00 / 36.65 / 87.50        4.86 / 6.03 / 11.66
Roll          −0.234 / 0.273 / 0.567     0.733 / 0.998 / 5.318        −0.234 / 0.273 / 0.568
Pitch         0.057 / 0.145 / 0.558      −2.995 / 4.092 / 7.882       0.057 / 0.145 / 0.558
Yaw           0.375 / 0.444 / 0.911      −0.238 / 0.368 / 0.965       0.375 / 0.444 / 0.911
Table 5. D-42—position (m) and attitude (deg) error statistics.
              INS (Mean / RMSE / Max)     LiDAR (Mean / RMSE / Max)     INS/LiDAR (Mean / RMSE / Max)
East          −155.14 / 232.32 / 600.40   11.88 / 18.67 / 35.49         11.85 / 18.65 / 35.49
North         25.46 / 31.75 / 50.88       −23.10 / 26.17 / 34.33        −23.11 / 26.19 / 34.36
Horizontal    160.79 / 234.48 / 600.52    27.86 / 32.15 / 48.39         27.86 / 32.15 / 48.40
Up            19.48 / 24.09 / 47.33       130.98 / 167.81 / 285.49      19.15 / 23.58 / 45.66
Roll          −0.167 / 0.569 / 1.421      −0.045 / 6.294 / 16.392       −0.166 / 0.569 / 1.420
Pitch         −0.314 / 0.613 / 1.210      −5.189 / 9.368 / 16.017       −0.315 / 0.613 / 1.209
Yaw           0.172 / 0.355 / 0.903       −0.345 / 1.221 / 2.483        0.170 / 0.352 / 0.897
Table 6. D-28—GNSS-assisted—position (m) and attitude (deg) error statistics.
              INS (Mean / RMSE / Max)       LiDAR (Mean / RMSE / Max)    INS/LiDAR (Mean / RMSE / Max)
East          −687.00 / 1423.76 / 4839.25   −0.25 / 3.64 / 18.39         −0.25 / 3.64 / 18.39
North         −194.20 / 1710.08 / 4720.22   0.23 / 3.36 / 20.69          0.23 / 3.36 / 20.70
Horizontal    1504.87 / 2225.19 / 6760.09   3.83 / 4.96 / 27.69          3.83 / 4.96 / 27.69
Up            −12.51 / 37.62 / 149.04       −10.36 / 14.40 / 60.66       −10.80 / 15.50 / 62.23
Roll          −0.027 / 0.794 / 1.926        0.268 / 2.115 / 5.540        0.019 / 0.752 / 1.667
Pitch         0.105 / 0.734 / 1.787         0.094 / 1.888 / 5.390        0.098 / 0.755 / 1.625
Yaw           0.732 / 0.891 / 2.213         −0.497 / 1.079 / 3.780       0.725 / 0.922 / 2.297
Table 7. D-42—GNSS-assisted—position (m) and attitude (deg) error statistics.
              INS (Mean / RMSE / Max)    LiDAR (Mean / RMSE / Max)    INS/LiDAR (Mean / RMSE / Max)
East          −20.83 / 33.76 / 92.36     1.01 / 4.27 / 8.43           0.99 / 4.30 / 8.29
North         −3.93 / 25.72 / 79.29      −6.41 / 12.47 / 32.36        −6.41 / 12.47 / 32.31
Horizontal    28.63 / 42.44 / 121.73     8.97 / 13.18 / 33.44         8.97 / 13.19 / 33.35
Up            −3.42 / 6.49 / 18.70       −29.74 / 39.60 / 113.50      −3.40 / 6.48 / 18.60
Roll          −0.102 / 0.340 / 0.878     −0.340 / 2.290 / 6.046       −0.102 / 0.340 / 0.878
Pitch         −0.217 / 0.410 / 0.817     −3.056 / 4.817 / 10.521      −0.217 / 0.410 / 0.818
Yaw           0.139 / 0.313 / 0.822      0.043 / 0.799 / 1.891        0.139 / 0.313 / 0.821
Table 8. Performance comparison between the proposed integrated system and state-of-the-art SLAM algorithms (position errors in m).
              Proposed System (Mean / RMSE)    A-LOAM (Mean / RMSE)    LeGO-LOAM (Mean / RMSE)    F-LOAM (Mean / RMSE)
D-28
Horizontal    12.09 / 15.06                    26.81 / 34.77           15.78 / 17.84              27.46 / 31.27
Up            10.61 / 13.57                    10.94 / 13.34           16.72 / 18.92              7.99 / 11.00
D-42
Horizontal    27.86 / 32.15                    167.42 / 201.33         575.94 / 715.11            67.55 / 87.35
Up            19.15 / 23.58                    79.76 / 104.23          55.66 / 70.25              25.29 / 37.36
D-101
Horizontal    13.08 / 14.82                    16.05 / 22.27           24.19 / 32.52              7.66 / 9.09
Up            4.86 / 6.03                      19.59 / 28.57           19.13 / 29.33              5.18 / 7.82
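For perspective, on the most demanding highway drive (D-42), the proposed system's horizontal RMSE of 32.15 m is about 84% lower than A-LOAM's 201.33 m, since (201.33 − 32.15)/201.33 ≈ 0.84, and about 63% lower than F-LOAM's 87.35 m.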
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
