*Article*

## **MIMU**/**Odometer Fusion with State Constraints for Vehicle Positioning during BeiDou Signal Outage: Testing and Results**

**Kai Zhu 1,2, Xuan Guo 1, Changhui Jiang 3,\*, Yujingyang Xue 1, Yuanjun Li 1, Lin Han 3 and Yuwei Chen 4,\***


Received: 29 February 2020; Accepted: 16 April 2020; Published: 17 April 2020

**Abstract:** With the rapid development of autonomous vehicles, the demand for reliable positioning results is urgent. Currently, ground vehicles depend heavily on the Global Navigation Satellite System (GNSS) and the Inertial Navigation System (INS) to provide reliable and continuous navigation solutions. In dense urban areas, especially narrow streets with tall buildings, the GNSS signals may be blocked by the surrounding buildings; under this condition, the geometry distribution of the in-view satellites is very poor, and Non-Line-Of-Sight (NLOS) and Multipath (MP) effects heavily degrade the positioning accuracy. Further, the INS positioning errors will quickly diverge over time without GNSS corrections. Aiming at improving the positioning accuracy in signal-challenging environments, in this paper we developed a MIMU (Micro Inertial Measurement Unit)/Odometer integration system with vehicle state constraints (MO-C) for improving the vehicle positioning accuracy without GNSS. The MIMU/Odometer integration model and the constrained measurements are given in detail. Several field tests were carried out to evaluate and assess the MO-C system. The experiments were divided into two parts: first, field testing with data post-processing and real-time processing was carried out to fully assess the performance of the MO-C system; second, the MO-C was implemented in a BeiDou Satellite Navigation System (BDS)/MIMU integrated navigation system to evaluate the MO-C performance during a BDS signal outage. The MIMU standalone positioning accuracy was compared with that of the MIMU/Odometer integration (MO), MO-C, and MIMU with constraints (M-C) to assess the respective influences of the odometer and the constraints on positioning error reduction. The results showed that the latitude and longitude errors could be suppressed with odometer assistance, and the height errors were suppressed when the state constraints were included.

**Keywords:** GNSS; MIMU; odometer; state constraints

## **1. Introduction**

Unmanned Ground Vehicles (UGV) with complete automatic operation are regarded as the most promising technology to become available in the near future [1,2]. Precise and reliable position and navigation information is fundamental for autonomous driving vehicles [3]. Currently, the Global Navigation Satellite System (GNSS) and the Inertial Navigation System (INS) are the most popular solutions for providing comparatively reliable positioning information [4]. GNSS is usually integrated with INS since they are highly complementary. GNSS performance relies on the geometry distribution of the in-view satellites and the signal quality; if the satellite signals are blocked by surrounding buildings or obstacles, GNSS will fail to generate precise positioning results [5]. Under this condition, the INS can provide moderate navigation solutions for a short time. However, due to the complex noises contained in the raw measurements from the gyroscope and accelerometer, the INS errors will accumulate quickly over time [4–6].

In the past decade, much of the literature has focused on improving the positioning accuracy under GNSS signal-challenging environments. These methods can be divided into two categories. The first solution is to suppress the INS noise and compensate for its positioning errors. The INS generates navigation solutions by processing the raw accelerometer and gyroscope outputs. Limited by the manufacturing technology, complex noises are contained in the raw measurements. Sheimy employed the Allan Variance method to characterize and quantify the noise [7,8]. Grip proposed an exponentially stable attitude and gyroscope bias estimation method in GNSS/INS integration [9]. Machine learning (LS-SVM, LSTM-RNN) methods were employed for modeling the errors [10–13]. Some calibration methods were also proposed to improve the positioning accuracy [14–18]. Wu investigated the self-calibration of the Inertial Measurement Unit (IMU)/odometer integrated system for land vehicle navigation [14]. In addition, in the GNSS/INS integrated navigation system, some machine learning methods were employed and investigated to compensate for the INS errors during GNSS signal outages [15–18]. These machine learning methods were trained while the GNSS signal was normal.

The second solution is to employ more sensors in the GNSS/INS integration system and construct a multi-sensor fusion system. Among these sensors, LiDAR, vision cameras, altitude barometers, the Chip Scale Atomic Clock (CSAC), and the odometer are the most popular [19–24]. LiDAR is a sensor collecting the point cloud of the surrounding environment. By continuously matching the point cloud sequences, LiDAR can generate relative displacements and attitudes [19–25]. As for vision sensors, attitude changes can also be extracted by matching image sequences. With two well-calibrated vision cameras or depth cameras, this method can also provide positioning information [23,24]. An altitude barometer and an odometer can provide height and traveled-distance information, respectively. The GNSS/LiDAR/HD-Map/INS integration system is a popular solution for autonomous driving vehicle positioning and navigation [25]. In addition, with improvements in size and accuracy, the CSAC is employed for augmenting the GNSS accuracy by providing a more precise frequency base [21].

In general, ground vehicles are usually equipped with an odometer for measuring the traveled distance. Therefore, it is convenient to develop a GNSS/MEMS-INS/odometer fusion system. Some works have revealed and demonstrated its effectiveness in improving positioning accuracy [26–29]. Georgy investigated the stochastic drift model of a MEMS (Micro-Electro-Mechanical System) gyroscope in a gyroscope/odometer/GPS integrated navigation system [26]. A mixture of particle filter and fuzzy neural network was employed for enhancing the MEMS-IMU/odometer/GPS integration for land vehicle applications [27,28]. An odometer and a MEMS IMU were also employed for enhancing Precise Point Positioning (PPP) under weak satellite observability environments [29]. However, these studies were conducted while the GNSS was available, and the respective influences of the constraints and the odometer on the positioning errors were not presented clearly.

Scientists have explored and investigated this issue by using vehicle trajectory constraints to reduce the INS errors while the GNSS is unavailable [30–33]. Non-Holonomic Constraints (NHC) were employed as measurements for suppressing the INS errors during GNSS outages. While the vehicle is driving on the road, the velocities in the up direction and perpendicular to the traveling direction are almost zero [33]. The observability of the NHC was analyzed to demonstrate its effectiveness in land vehicle navigation systems [33]. However, NHC cannot suppress the positioning errors in the forward direction. Therefore, in this paper, apart from the NHC, an odometer was also employed to suppress the positioning errors in the forward direction. We developed a MEMS-IMU/odometer integrated navigation system considering the vehicle state constraints (MO-C) for ground vehicle positioning without GNSS. Both data post-processing and real-time processing experiments were carried out for assessing the navigation solution accuracy. Comparisons between MO-C, MO, and M-C were presented for evaluating and validating the influence of the odometer and the constraints on the navigation solution accuracy. The contribution and innovation of this paper are summarized as follows:


The rest of the paper is organized as follows: Section 2 introduces the model of the MO, MO-C (state measurement equations), the integration filter, and the vehicle state detection method. Section 3 presents the results and numerical analysis of the field tests. Then, we discuss and conclude the paper, and the limitations and the future work are detailed.

## **2. Model**


## *2.1. GNSS*/*MIMU Loose Integration Model*

The state vector of the GNSS/MIMU loose integration model contains 15 states, and the state vector *XI* is defined as:

$$\mathbf{X}_{I} = \begin{bmatrix} \delta\boldsymbol{\varphi} & \delta\mathbf{v} & \delta\mathbf{p} & \delta\boldsymbol{\varepsilon} & \delta\boldsymbol{\nabla} \end{bmatrix}^{\mathrm{T}} \tag{1}$$

where $\delta\boldsymbol{\varphi} = [\alpha, \beta, \gamma]$ denotes the three-axis attitude errors (pitch, roll, and yaw angle errors), $\delta\mathbf{v} = [\delta v_e, \delta v_n, \delta v_u]$ denotes the three-axis velocity errors (east, north, and up velocity errors) in the ENU navigation frame, $\delta\mathbf{p} = [\delta L, \delta\lambda, \delta h]$ denotes the three-axis positioning errors (latitude, longitude, and height errors), $\delta\boldsymbol{\varepsilon} = [\varepsilon_x, \varepsilon_y, \varepsilon_z]$ denotes the bias errors of the three-axis gyroscopes in the body frame, and $\delta\boldsymbol{\nabla} = [\nabla_x, \nabla_y, \nabla_z]$ denotes the bias errors of the three-axis accelerometers.

The state equation of the GNSS/MIMU loose integration can be written as:

$$\dot{\mathbf{X}}\_{l}(t) = \mathbf{F}\_{l}(t) \cdot \mathbf{X}\_{l}(t) + \mathbf{G}\_{l}(t)\mathbf{W}\_{l}(t) \tag{2}$$

where $\mathbf{F}_I(t)$ is the state transition matrix, $\mathbf{G}_I(t)$ is the noise driving matrix, and $\mathbf{W}_I(t)$ denotes the state model noise [18–21]. Specifically, the detailed description of the state equation is:

$$\begin{bmatrix} \delta\dot{\mathbf{p}}_{3\times1} \\ \delta\dot{\mathbf{v}}_{3\times1} \\ \delta\dot{\boldsymbol{\varphi}}_{3\times1} \\ \dot{\boldsymbol{\varepsilon}}_{3\times1} \\ \dot{\boldsymbol{\nabla}}_{3\times1} \end{bmatrix} = \begin{bmatrix} \mathbf{F}_{pp} & \mathbf{F}_{pv} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} \\ \mathbf{F}_{vp} & \mathbf{F}_{vv} & \mathbf{F}_{v\varphi} & \mathbf{0}_{3\times3} & \mathbf{C}_b^n \\ \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{F}_{\varphi\varphi} & \mathbf{C}_b^n & \mathbf{0}_{3\times3} \\ \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{F}_{\varepsilon\varepsilon} & \mathbf{0}_{3\times3} \\ \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{F}_{\nabla\nabla} \end{bmatrix} \begin{bmatrix} \delta\mathbf{p}_{3\times1} \\ \delta\mathbf{v}_{3\times1} \\ \delta\boldsymbol{\varphi}_{3\times1} \\ \boldsymbol{\varepsilon}_{3\times1} \\ \boldsymbol{\nabla}_{3\times1} \end{bmatrix} + \begin{bmatrix} \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} \\ \mathbf{C}_b^n & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} \\ \mathbf{0}_{3\times3} & \mathbf{C}_b^n & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} \\ \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{I}_{3\times3} & \mathbf{0}_{3\times3} \\ \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{I}_{3\times3} \end{bmatrix} \begin{bmatrix} \mathbf{w}_{v\,3\times1} \\ \mathbf{w}_{\varphi\,3\times1} \\ \mathbf{w}_{\varepsilon\,3\times1} \\ \mathbf{w}_{\nabla\,3\times1} \end{bmatrix} \tag{3}$$

Further, the first-order discrete form of the state equation is as:

$$\mathbf{X}_I(k+1) = (\mathbf{I} + \mathbf{F}_I \cdot T) \cdot \mathbf{X}_I(k) + \mathbf{G}_I \cdot T \cdot \mathbf{W}_I(k) \tag{4}$$

$$\begin{bmatrix} \delta\mathbf{p}_{3\times1} \\ \delta\mathbf{v}_{3\times1} \\ \delta\boldsymbol{\varphi}_{3\times1} \\ \boldsymbol{\varepsilon}_{3\times1} \\ \boldsymbol{\nabla}_{3\times1} \end{bmatrix}_{k+1} = \begin{bmatrix} \mathbf{I}_{3\times3}+\mathbf{F}_{pp}T & \mathbf{F}_{pv}T & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} \\ \mathbf{F}_{vp}T & \mathbf{I}_{3\times3}+\mathbf{F}_{vv}T & \mathbf{F}_{v\varphi}T & \mathbf{0}_{3\times3} & \mathbf{C}_b^n T \\ \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{I}_{3\times3}+\mathbf{F}_{\varphi\varphi}T & \mathbf{C}_b^n T & \mathbf{0}_{3\times3} \\ \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{I}_{3\times3}+\mathbf{F}_{\varepsilon\varepsilon}T & \mathbf{0}_{3\times3} \\ \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times3} & \mathbf{I}_{3\times3}+\mathbf{F}_{\nabla\nabla}T \end{bmatrix} \begin{bmatrix} \delta\mathbf{p}_{3\times1} \\ \delta\mathbf{v}_{3\times1} \\ \delta\boldsymbol{\varphi}_{3\times1} \\ \boldsymbol{\varepsilon}_{3\times1} \\ \boldsymbol{\nabla}_{3\times1} \end{bmatrix}_{k} + \mathbf{G}_I\,T\,\mathbf{W}_I(k) \tag{5}$$

where *T* denotes the integration filter updating interval.
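As a sketch of the first-order discretization in Equation (4), the transition matrix is formed as $\Phi = \mathbf{I} + \mathbf{F}_I T$ and the discrete noise driving matrix as $\mathbf{G}_I T$. The function name and the toy two-state matrices below are our own illustration, not the paper's $\mathbf{F}_I$:

```python
import numpy as np

def discretize_error_model(F, G, T):
    """First-order discretization used in Equation (4):
    Phi = I + F*T and Gd = G*T for the model x_dot = F x + G w."""
    Phi = np.eye(F.shape[0]) + F * T
    Gd = G * T
    return Phi, Gd

# Illustrative 2-state chain (position error driven by velocity error);
# the numeric values are toy inputs, not the paper's F_I
F = np.array([[0.0, 1.0],
              [0.0, 0.0]])
G = np.array([[0.0],
              [1.0]])
Phi, Gd = discretize_error_model(F, G, 0.01)
```

The first-order approximation is adequate here because the filter update interval $T$ is short relative to the error dynamics.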

Then, the measurement model of the GNSS/MIMU integration is expressed as:

$$\mathbf{Z}\_{k+1} = H\_{k+1} \mathbf{X}\_{k+1} + \mu\_{k+1} \tag{6}$$

where $\mathbf{Z}_{k+1}$ denotes the measurement vector, $\mathbf{H}_{k+1}$ denotes the observation matrix, and $\mu_{k+1}$ denotes the measurement noise. In the loose integration model, the measurement vector is composed of the position and velocity differences between the GNSS and the MIMU:

$$\mathbf{Z}_{k+1} = \begin{bmatrix} \left(L_{k+1}^{MIMU} - L_{k+1}^{GNSS}\right)(R_M + h) \\ \left(\lambda_{k+1}^{MIMU} - \lambda_{k+1}^{GNSS}\right)(R_N + h)\cos L \\ h_{k+1}^{MIMU} - h_{k+1}^{GNSS} \\ v_{E,k+1}^{MIMU} - v_{E,k+1}^{GNSS} \\ v_{N,k+1}^{MIMU} - v_{N,k+1}^{GNSS} \\ v_{U,k+1}^{MIMU} - v_{U,k+1}^{GNSS} \end{bmatrix} \tag{7}$$

A detailed description of the observation matrix *Hk*+<sup>1</sup> is:

$$\mathbf{Z}_{k+1} = \begin{bmatrix} \operatorname{diag}\!\left[(R_M + h),\ (R_N + h)\cos L,\ 1\right] & \mathbf{0}_{3\times3} & \mathbf{0}_{3\times9} \\ \mathbf{0}_{3\times3} & \mathbf{I}_{3\times3} & \mathbf{0}_{3\times9} \end{bmatrix} \mathbf{X}_{k+1} + \mu_{k+1} \tag{8}$$

## *2.2. State Constraints*

The definition of the vehicle body coordinates is presented in Figure 1. In the coordinates, the origin is the center of gravity of the vehicle body, the **Y** axis points to the direction of the vehicle traveling, the **X** axis points to the right side of the vehicle body, and the **Z** axis points to the up direction of the vehicle.

According to the driving characteristics of the vehicle on the road, while the vehicle is running normally on the road without sideslip or jump, e.g., the vehicle is driving on an expressway, the X-axis and Z-axis speeds of the vehicle in the defined vehicle frame are approximately zero. The characteristic is modeled as: 

$$\begin{cases} \ V\_x^{\upsilon} \approx 0\\ V\_z^{\upsilon} \approx 0 \end{cases} \tag{9}$$

In the dynamic trajectory, vehicle kinematic constraints are employed to suppress the diverging positioning errors in GNSS-denied environments. The kinematic constraints are constructed in the vehicle body coordinates. Although the Inertial Measurement Unit (IMU) is installed on the vehicle body, there is usually a misalignment angle between the IMU body coordinates and the vehicle body coordinates. The misalignment angles will affect the velocity constraints listed in Equation (9).

**Figure 1.** Illustration of the vehicle body coordinates.

Assume the conversion matrix between the IMU body coordinates and the vehicle body coordinates is $\mathbf{C}_v^b$. The heading misalignment angle is $\alpha_\psi$, the pitch misalignment angle is $\alpha_\theta$, and the roll misalignment angle is $\alpha_\gamma$. The velocity in the vehicle body frame is converted to the IMU body frame as:

$$\mathbf{V}^b = \mathbf{C}\_v^b \mathbf{V}^v \tag{10}$$

where $\mathbf{V}^v$ is the velocity vector in the vehicle body coordinates, and $\mathbf{V}^b$ is the velocity vector in the IMU body coordinates.

Specifically,

$$\mathbf{V}^{b} = \begin{bmatrix} V\_x^b \\ V\_y^b \\ V\_z^b \end{bmatrix} = \mathbf{C}\_v^b \begin{bmatrix} V\_x^v \\ V\_y^v \\ V\_z^v \end{bmatrix} = \begin{bmatrix} \sin a\_\psi \cos a\_\theta \\ \cos a\_\psi \cos a\_\theta \\ \sin a\_\theta \end{bmatrix} V\_y^v \tag{11}$$
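Equation (11) shows that only the small misalignment angles leak the forward speed into the constrained axes. A minimal numerical sketch of this projection (the function name is ours; angles and speed are illustrative):

```python
import math

def project_forward_velocity(v_y, a_psi, a_theta):
    """Project the vehicle forward velocity V_y^v into the IMU body frame
    through the heading (a_psi) and pitch (a_theta) misalignment angles,
    following Equation (11). Angles are in radians."""
    v_x_b = math.sin(a_psi) * math.cos(a_theta) * v_y
    v_y_b = math.cos(a_psi) * math.cos(a_theta) * v_y
    v_z_b = math.sin(a_theta) * v_y
    return v_x_b, v_y_b, v_z_b

# A 1-degree heading and 0.5-degree pitch misalignment at 20 m/s leak
# roughly 0.35 m/s and 0.17 m/s into the X and Z axes, respectively
vx, vy, vz = project_forward_velocity(20.0, math.radians(1.0), math.radians(0.5))
```

Even degree-level misalignment therefore produces velocity leakage well above the constraint noise, which is why the misalignment angles are estimated as states below.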

While the vehicle is moving, $V_y^v$ is not zero. $V_y^v$ is projected onto $V_x^b$ and $V_z^b$ through the heading and pitch misalignment angles; the roll angle does not influence $V_x^b$ and $V_z^b$. The influence of the misalignment angle errors on the conversion matrix can be described as

$$
\delta \mathcal{C}\_b^v = - \begin{bmatrix} \delta \alpha\_\theta \\ 0 \\ \delta \alpha\_\psi \end{bmatrix} \times \mathcal{C}\_b^v = -\delta \alpha \times \mathcal{C}\_b^v \tag{12}
$$

While employing the constraints in the GNSS/MIMU loose integration, the misalignment angle between the MIMU body frame and the vehicle body frame should be considered and added to the state vector. The new state vector is as:

$$\mathbf{X} = \begin{bmatrix} \mathbf{X}\_l & \mathbf{X}\_a \end{bmatrix}^T \tag{13}$$

where $\mathbf{X}_I$ is the same as that in Equation (1), and $\mathbf{X}_\alpha = \begin{bmatrix}\delta\alpha_\theta & \delta\alpha_\psi\end{bmatrix}^T$, where $\delta\alpha_\theta$ is the pitch misalignment angle error and $\delta\alpha_\psi$ is the heading misalignment angle error.

Once the IMU is fixed on the vehicle, the misalignment angles can be regarded as constant values. Therefore, the first-order derivative of the misalignment angles is zero.

$$\begin{cases} \delta \dot{\alpha}\_{\theta} = 0\\ \delta \dot{\alpha}\_{\psi} = 0 \end{cases} \tag{14}$$

Then, the state equation of the augmented system can be written as:

$$\dot{\mathbf{X}}(t) = \begin{bmatrix} \dot{\mathbf{X}}_{I}(t) \\ \dot{\mathbf{X}}_{\alpha}(t) \end{bmatrix} = \begin{bmatrix} \mathbf{F}_{I}(t) & \mathbf{0} \\ \mathbf{0} & \mathbf{F}_{\alpha}(t) \end{bmatrix} \begin{bmatrix} \mathbf{X}_{I}(t) \\ \mathbf{X}_{\alpha}(t) \end{bmatrix} + \begin{bmatrix} \mathbf{G}_{I}(t) & \mathbf{0} \\ \mathbf{0} & \mathbf{G}_{\alpha}(t) \end{bmatrix} \begin{bmatrix} \mathbf{W}_{I}(t) \\ \mathbf{W}_{\alpha}(t) \end{bmatrix} \tag{15}$$

where **<sup>F</sup>***I*(*t*) is the state transformation matrix of the IMU's states,**F**α(*t*) is the state transformation matrix of the misalignment angles, **<sup>W</sup>***I*(*t*) is the IMU state noise matrix, and **<sup>W</sup>**α(*t*) is the misalignment angle state noise matrix.

Commonly, the MIMU and GNSS loose integration model is constructed in East–North–Up (ENU) coordinates. Position and velocity information from the GNSS and INS are subtracted and employed as the measurement information. The SINS velocity is converted from the ENU coordinates to the vehicle body frame as:

$$\mathbf{V}^{v} = \mathbf{C}\_{b}^{v} \mathbf{C}\_{n}^{b} \mathbf{V}^{n} \tag{16}$$

where $\mathbf{C}_b^v$ is the velocity transformation matrix from the IMU body coordinates to the vehicle body frame, $\mathbf{C}_n^b$ is the velocity transformation matrix from the ENU navigation frame to the IMU body coordinates, and $\mathbf{V}^n$ is the velocity vector in the ENU navigation frame.

Combining Equations (11)–(16), the differential equation is

$$\begin{aligned} \delta\mathbf{V}^{v} &= \mathbf{C}_{b}^{v}\left(\mathbf{C}_{n}^{b}\left(\boldsymbol{\varphi}\times\mathbf{V}^{n}\right) + \mathbf{C}_{n}^{b}\,\delta\mathbf{V}^{n}\right) - \delta\boldsymbol{\alpha}\times\mathbf{C}_{b}^{v}\mathbf{C}_{n}^{b}\mathbf{V}^{n} \\ &= -\mathbf{C}_{b}^{v}\mathbf{C}_{n}^{b}\left(\mathbf{V}^{n}\times\right)\boldsymbol{\varphi} + \mathbf{C}_{b}^{v}\mathbf{C}_{n}^{b}\,\delta\mathbf{V}^{n} + \left(\left(\mathbf{C}_{b}^{v}\mathbf{C}_{n}^{b}\mathbf{V}^{n}\right)\times\right)\delta\boldsymbol{\alpha} \\ &= \mathbf{M}_{3\times3}^{1}\,\boldsymbol{\varphi} + \mathbf{M}_{3\times3}^{2}\,\delta\mathbf{V}^{n} + \mathbf{M}_{3\times3}^{3}\,\delta\boldsymbol{\alpha} \end{aligned} \tag{17}$$

The measurement equation is

$$\mathbf{Z}\mathbf{v} = \begin{bmatrix} V\_x^\upsilon - 0\\ V\_z^\upsilon - 0 \end{bmatrix} = \mathbf{H}\mathbf{v}\mathbf{X} + \mathbf{V}\mathbf{v} \tag{18}$$

where **H***V* is the measurement matrix, and the **V***V* is the noise matrix.

$$\mathbf{H}_{V} = \begin{bmatrix} \mathbf{M}_{3\times3}^{1}(1,:) & \mathbf{M}_{3\times3}^{2}(1,:) & \mathbf{0}_{1\times9} & 0 & \mathbf{M}_{3\times3}^{3}(1,3) \\ \mathbf{M}_{3\times3}^{1}(3,:) & \mathbf{M}_{3\times3}^{2}(3,:) & \mathbf{0}_{1\times9} & \mathbf{M}_{3\times3}^{3}(3,1) & 0 \end{bmatrix} \tag{19}$$

where $\mathbf{M}_{3\times3}^{1}(1,:)$ is the first row of the matrix $\mathbf{M}_{3\times3}^{1}$, $\mathbf{M}_{3\times3}^{3}(1,3)$ is the element in the first row and third column, $\mathbf{M}_{3\times3}^{1}(3,:)$ is the third row of the matrix $\mathbf{M}_{3\times3}^{1}$, and $\mathbf{M}_{3\times3}^{3}(3,1)$ is the element in the third row and first column.
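As an illustration, the two constraint rows can be assembled numerically from Equations (17) and (19). This is a minimal sketch under an assumed 17-state ordering $[\delta\varphi, \delta v, \delta p, \varepsilon, \nabla, \delta\alpha_\theta, \delta\alpha_\psi]$; the function names and the ordering are our assumptions, not stated explicitly in the paper:

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix: skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def constraint_measurement_matrix(C_bv, C_nb, V_n):
    """Assemble the two constraint rows of H_V, Equations (17)/(19):
    M1 = -C_b^v C_n^b (V^n x), M2 = C_b^v C_n^b, M3 = ((C_b^v C_n^b V^n) x)."""
    C = C_bv @ C_nb
    M1 = -C @ skew(V_n)
    M2 = C
    M3 = skew(C @ V_n)
    H = np.zeros((2, 17))
    H[0, 0:3] = M1[0, :]    # X-axis (lateral) constraint row
    H[0, 3:6] = M2[0, :]
    H[0, 16] = M3[0, 2]     # coupling to heading misalignment error
    H[1, 0:3] = M1[2, :]    # Z-axis (vertical) constraint row
    H[1, 3:6] = M2[2, :]
    H[1, 15] = M3[2, 0]     # coupling to pitch misalignment error
    return H

# With identity attitude matrices and a 5 m/s north velocity, the lateral
# row couples the constraint to attitude and heading-misalignment errors
H = constraint_measurement_matrix(np.eye(3), np.eye(3), np.array([0.0, 5.0, 0.0]))
```

The faster the vehicle moves, the larger the misalignment-related entries become, which makes those angle errors observable through the constraint measurements.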

## *2.3. MIMU*/*Odometer Measurement Model*

The state vector of the MIMU/Odometer integration model is the same as Equations (13)–(15), however, the measurement equation is different. The odometer output is listed as:

$$\mathbf{V}_{odo}^{b} = \begin{bmatrix} 0 & V_{odo}^{b} & 0 \end{bmatrix}^{\mathrm{T}} \tag{20}$$

*Sensors* **2020**, *20*, 2302

We then convert the odometer velocity from the IMU body frame to the ENU navigation frame, and subtract it from the velocity of the MIMU. The equation is listed as:

$$\mathbf{Z}_O = \mathbf{V}_I^n - \mathbf{C}_b^n \mathbf{V}_{odo}^b = \begin{bmatrix} V_{IE} - V_{odoE}^n \\ V_{IN} - V_{odoN}^n \\ V_{IU} - V_{odoU}^n \end{bmatrix} = \mathbf{H}_O \mathbf{X} + \mathbf{V}_O \tag{21}$$

where $\mathbf{V}_I^n$ is the velocity from the MIMU, $\mathbf{V}_{odo}^b$ is the velocity from the odometer, $\mathbf{H}_O$ is the measurement matrix, and $\mathbf{V}_O$ is the noise matrix.

$$\mathbf{H}_{O} = \begin{bmatrix} \mathbf{0}_{1\times3} & 1 & 0 & 0 & \mathbf{0}_{1\times11} \\ \mathbf{0}_{1\times3} & 0 & 1 & 0 & \mathbf{0}_{1\times11} \\ \mathbf{0}_{1\times3} & 0 & 0 & 1 & \mathbf{0}_{1\times11} \end{bmatrix} \tag{22}$$
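The odometer innovation of Equation (21) can be sketched in a few lines; the function name is ours, and the attitude matrix is assumed to come from the INS mechanization:

```python
import numpy as np

def odometer_innovation(V_I_n, C_bn, v_odo):
    """Form the measurement vector Z_O of Equation (21): rotate the
    body-frame odometer velocity [0, v_odo, 0] into the ENU frame with
    the attitude matrix C_b^n, then subtract it from the MIMU velocity."""
    V_odo_b = np.array([0.0, v_odo, 0.0])
    V_odo_n = C_bn @ V_odo_b
    return V_I_n - V_odo_n

# With an identity attitude, a 10 m/s odometer reading exactly cancels a
# 10 m/s north MIMU velocity, so the innovation is zero
z = odometer_innovation(np.array([0.0, 10.0, 0.0]), np.eye(3), 10.0)
```

In practice, any attitude error in $\mathbf{C}_b^n$ shows up directly in this innovation, which is how the filter corrects the MIMU states from the odometer.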

#### *2.4. MIMU*/*Odometer Measurement Model with Constraints*

Combining Equations (18) and (21), the MIMU/odometer measurement model with constraints is:

$$\mathbf{Z}_{OV} = \begin{bmatrix} V_x^{v} - 0 \\ V_{IN} - V_{odoN}^{n} \\ V_z^{v} - 0 \end{bmatrix} = \mathbf{H}_{OV}\mathbf{X} + \mathbf{V}_{OV} \tag{23}$$

where $V_x^v$ is the X-axis velocity in the vehicle body coordinates, $V_z^v$ is the Z-axis velocity in the vehicle body coordinates, $\mathbf{H}_{OV}$ is the measurement matrix, and $\mathbf{V}_{OV}$ is the measurement noise matrix.

$$\mathbf{H}_{OV} = \begin{bmatrix} \mathbf{M}_{3\times3}^{1}(1,:) & \mathbf{M}_{3\times3}^{2}(1,1) & \mathbf{M}_{3\times3}^{2}(1,2) & \mathbf{M}_{3\times3}^{2}(1,3) & \mathbf{0}_{1\times9} & 0 & \mathbf{M}_{3\times3}^{3}(1,3) \\ \mathbf{0}_{1\times3} & 0 & 1 & 0 & \mathbf{0}_{1\times9} & 0 & 0 \\ \mathbf{M}_{3\times3}^{1}(3,:) & \mathbf{M}_{3\times3}^{2}(3,1) & \mathbf{M}_{3\times3}^{2}(3,2) & \mathbf{M}_{3\times3}^{2}(3,3) & \mathbf{0}_{1\times9} & \mathbf{M}_{3\times3}^{3}(3,1) & 0 \end{bmatrix} \tag{24}$$

## *2.5. Integration Method*

The state models are given in Equations (5) and (15), and the measurement models under the different conditions are given in Equations (18), (21), and (23). A Kalman filter is employed to carry out the integration.

The Kalman filter state vector and state covariance prediction are:

$$
\hat{\mathbf{X}}\_k^- = \mathbf{ \Phi}\_{k|k-1} \hat{\mathbf{X}}\_{k-1} \tag{25}
$$

$$P\_k^{-} = \Phi\_{k|k-1} \mathbf{P}\_{k-1} \Phi\_{k|k-1}^T + Q\_{k-1} \tag{26}$$

The updating of the gain matrix, state vector, and the covariance are as follows:

$$\mathbf{K}\_k = \mathbf{P}\_k^- \mathbf{H}\_k^T (\mathbf{H}\_k \mathbf{P}\_k^- \mathbf{H}\_k^T + \mathbf{R}\_k)^{-1} \tag{27}$$

$$\mathbf{\hat{X}}\_{k} = \mathbf{\hat{X}}\_{k}^{-} + \mathbf{K}\_{k}(\mathbf{Z}\_{k} - \mathbf{H}\_{k}\mathbf{\hat{X}}\_{k}^{-}) \tag{28}$$

$$\mathbf{P}_k = (\mathbf{I} - \mathbf{K}_k \mathbf{H}_k) \mathbf{P}_k^{-} \tag{29}$$

where $\boldsymbol{\Phi}_{k|k-1}$ is the state transformation matrix; $\hat{\mathbf{X}}_k^-$ is the state vector predicted from the state transformation matrix and the state estimate at the previous epoch; $\mathbf{P}_k^-$ is the predicted covariance matrix; $\mathbf{K}_k$ is the gain matrix at the $k$th epoch, which decides the relative weight between the predicted state vector and the new measurements; $\hat{\mathbf{X}}_k$ is the estimated state vector at the $k$th epoch; and $\mathbf{P}_k$ is the updated covariance matrix.
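Equations (25)–(29) can be condensed into a short predict/update pair. A minimal sketch with toy two-state inputs (our own illustrative values, not the paper's 17-state model):

```python
import numpy as np

def kf_predict(x, P, Phi, Q):
    """Kalman prediction step, Equations (25)-(26)."""
    x_pred = Phi @ x
    P_pred = Phi @ P @ Phi.T + Q
    return x_pred, P_pred

def kf_update(x_pred, P_pred, z, H, R):
    """Kalman update step, Equations (27)-(29)."""
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)  # gain, Eq. (27)
    x = x_pred + K @ (z - H @ x_pred)                       # state, Eq. (28)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred              # covariance, Eq. (29)
    return x, P

# Toy 2-state run: one predict/update cycle with a single scalar measurement
x0, P0 = np.zeros(2), np.eye(2)
Phi, Q = np.eye(2), 0.01 * np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[0.1]])
xp, Pp = kf_predict(x0, P0, Phi, Q)
xu, Pu = kf_update(xp, Pp, np.array([1.0]), H, R)
```

Swapping the constraint matrix $\mathbf{H}_V$, the odometer matrix $\mathbf{H}_O$, or the combined $\mathbf{H}_{OV}$ into the update step is all that changes between the M-C, MO, and MO-C modes.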

Based on the above model, Figure 2 shows the structure of the integration system with constraints. When the GNSS is available, the GNSS/MIMU integration system can provide satisfying navigation solutions. While GNSS is unavailable, constraints are employed in the integration system for estimating the IMU state errors and compensating them.

**Figure 2.** Structure of the Global Navigation Satellite System (GNSS)/Inertial Navigation System (INS)/odometer/Non-Holonomic Constraints (NHC) integrated system.

## **3. Results**

For fully testing and assessing the performance of the MO-C system, two different field tests were carried out. The equipment employed in the field testing is shown in Figure 3: the employed MIMU is presented in Figure 3a, and the vehicle is shown in Figure 3b,c. In the first field test, the MIMU and Odometer dataset was collected and post-processed through software implemented in Matlab [34]. In the second field test, the algorithm was implemented on a DSP+FPGA hardware platform (Digital Signal Processor, DSP; Field Programmable Gate Array, FPGA), and the results were obtained from real-time processing of the data. The MO-C was also implemented in the BDS/MIMU integrated navigation system to improve the positioning performance during a BDS signal outage.

**(a)** Inertial Measurement Unit

**Figure 3.** Field testing equipment.

Parameters of the employed MIMU are given in Table 1. The gyroscope bias stability was 3 degrees/h, and the accelerometer bias stability was 0.1 mg. The MIMU sampling frequency was 400 Hz, and the odometer data output frequency was 20 Hz. The integration filter update period was one second.
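To put this sensor grade in context, a back-of-envelope sketch (our own, not a result from the paper) of how such biases alone grow into position error over a 90-s free-inertial run:

```python
import math

# Rough free-inertial error growth over a 90-s outage for the Table 1
# sensor grade (3 deg/h gyro bias, 0.1 mg accelerometer bias).
# These are textbook order-of-magnitude bounds, not measured results.
t = 90.0                                  # outage duration, s
g = 9.8                                   # gravity, m/s^2
accel_bias = 0.1e-3 * g                   # 0.1 mg -> m/s^2
gyro_bias = math.radians(3.0) / 3600.0    # 3 deg/h -> rad/s

# An uncompensated accelerometer bias integrates twice into position
pos_err_accel = 0.5 * accel_bias * t**2

# A gyro bias tilts the platform; gravity then leaks into horizontal
# acceleration and integrates three times into position
pos_err_gyro = (1.0 / 6.0) * gyro_bias * g * t**3
```

The tens-of-meters drift this implies is what motivates the odometer and constraint aiding evaluated in the following subsections.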


#### *3.1. Field Testing with Data Post-Processing*

MIMU and Odometer data were collected with the equipment presented in Figure 3. Positioning comparisons among GNSS, MIMU, MIMU/odometer (MO), and MIMU/Odometer with constraints (MO-C) are presented in Figures 4–6. Trajectories obtained from the different methods are presented in Figure 4; the MO-C and MO results were similar in this trajectory. The positioning curves in Figure 5 correspond to the same trajectory. Velocity comparisons are presented in Figure 6. The positioning and velocity error analysis, including the Root Mean Square Error (RMSE) and the 90-s errors without GNSS, is listed in Table 2. Through a comparison of the results of the GNSS, MIMU, MO, and MO-C, it can be seen that:


**Figure 4.** The trajectory from different methods.

**Figure 5.** Positioning results comparison.

**Figure 6.** Velocity results comparison.


**Table 2.** Positioning and velocity error comparison.

#### *3.2. Field Testing with Real-Time Data Processing*

After the post-processing field testing, we carried out real-time data processing-based field testing for further evaluation of the performance of the method. The algorithm was implemented using the DSP+FPGA hardware platform with real-time processing data fusion. This sub-section is divided into four parts. In the first part, we evaluate the MIMU with constraints; in the second part, the MIMU/odometer integration is assessed; in the third part, the MO-C results are presented and analyzed; in the last part, the MO-C was integrated into GNSS/MIMU integrated navigation system, and the navigation solutions are presented and compared during a signal outage.

## 3.2.1. MIMU with Constraints

The field-testing trajectory is presented in Figure 7, and the positioning and velocity errors are presented in Figures 8 and 9. The latitude and longitude errors of M-C gradually accumulated but tended to flatten, and the altitude errors gradually stabilized and remained small; although the three-dimensional velocity errors were stable, the east and north velocity errors were relatively large, while the up velocity errors were small. Without GNSS, the position and velocity errors within 90 s were as follows: the latitude error was −25.85 m, the longitude error was −28.80 m, the altitude error was −3.06 m, the east velocity error was −0.91 m/s, the north velocity error was 0.27 m/s, and the up velocity error was −0.38 m/s. The results showed that the constraints were effective for suppressing the errors of the MIMU in the dynamic trajectory. However, the M-C 90-s error values were still not ideal, which was also affected by the heading angle. The heading angle is presented in Figure 10, and it varied between 40° and 42°. The vehicle kinematic constraints could only suppress the X-axis and Z-axis position and velocity errors.

**Figure 7.** Field testing trajectory.

**Figure 10.** Heading angle.

## 3.2.2. MIMU/Odometer Integration

Three-axis position and velocity errors are presented in Figures 11 and 12. It can be seen that the MO latitude and longitude errors were stable and small, while the altitude errors first decreased and then increased; the east and north velocity errors were stable and kept within 0.2 m/s, while the up velocity errors gradually increased and tended to diverge. The 90-s position and velocity errors were as follows: the latitude error was 1.74 m, the longitude error was 4.73 m, the altitude error was −13.35 m, the east velocity error was 0.03 m/s, the north velocity error was −0.06 m/s, and the up velocity error was −0.85 m/s. The results showed that the MO was effective for reducing the MIMU positioning errors. It is worth noting that the up velocity error increased gradually without good constraints, which was the same as the results in Section 3.1. To obtain high-precision three-dimensional positioning and velocity measurement under GNSS-denied environments, additional sensors or methods are necessary to suppress the height and up-direction velocity errors.

## 3.2.3. MIMU/Odometer Integration with Constraints

In Figures 13 and 14, the three-dimensional position and velocity errors of MO-C were stable and small. The height errors and the up velocity errors were significantly smaller than those of the M-C. The added kinematic constraint information performed well in suppressing the up velocity errors and height errors. The 90-s position and velocity errors were as follows: the latitude error was 1.72 m, the longitude error was −1.36 m, the altitude error was −4.38 m; the east velocity error was 0.02 m/s, the north velocity error was −0.04 m/s, and the up velocity error was −0.23 m/s. The position and velocity error comparison for MINS/BDS, M-C, MO, and MO-C is listed in Table 3. It can be seen that the latitude and longitude errors were suppressed when the odometer was included in the system. With odometer assistance, the MO-C 90-s latitude and longitude errors decreased by over 90% compared with those of M-C. When adding the constraints to the MO, the MO-C longitude and height errors decreased by 71.2% and 67.2%, and the north and up velocity errors decreased by 33.3% and 72.9%. These results demonstrated the effectiveness of the odometer and the constraints in position and velocity error suppression.

**Figure 14.** Velocity errors.



3.2.4. Implementation of MO-C in MIMU/BDS during Signal Outage

In this part, the MO-C was integrated into the BDS/MINS coupled navigation system to improve its positioning accuracy during a signal outage. Figure 15 presents the field-testing trajectory on Google Maps. The change in the number of visible BDS satellites is presented in Figure 16. The red line represents the number of in-view GPS and BDS satellites, which was employed as the reference. The blue line represents the number of BDS satellites employed in the experiment. At 75 s, the antenna was removed to simulate the signal outage; therefore, the satellite count was zero after 75 s.

Position and velocity errors are presented in Figures 17 and 18. During 0–75 s, the system worked in BDS/MINS integration mode, and the position and velocity errors were within the normal range. Since the system worked in GNSS/MIMU/Odometer mode during 0–75 s, the benefits can be summarized as follows: first, the GNSS provided the initial position and velocity information for the MIMU, and the GNSS velocity was helpful for the attitude estimation; second, the odometer could also help the navigation solution estimation, and in this mode, some parameters of the odometer could be estimated from the reliable navigation solutions.

**Figure 15.** Trajectory.

**Figure 17.** Position errors.

After 75 s, the BDS antenna was removed to simulate the signal outage and assess the performance of the MO-C. Between 75 s and 165 s, the position errors experienced a minor increase; however, the position errors still kept within 10 m. After 165 s, the odometer was disconnected to assess the M-C performance. The position and velocity errors then increased dramatically: the latitude and longitude errors exceeded 20 m. However, the height errors still kept within 10 m, which demonstrated the effectiveness of the up-velocity constraint. After 180 s, the odometer was re-connected to the system, and the position and velocity errors converged quickly back to the normal range.

**Figure 18.** Velocity errors.

## **4. Discussion**

Our experimental results demonstrated that the odometer and the state constraints were effective for suppressing the positioning errors of the MIMU while GNSS was unavailable. The odometer was effective for reducing the errors along the vehicle moving direction, and the constraints also performed well in reducing the height errors. During the 90-s testing time, the MO-C three-dimensional position errors remained within five meters with the IMU grade described above. However, we consider the following work worthy of further investigation:


## **5. Conclusions**

In this paper, we present a comprehensive investigation of the MIMU/odometer integrated navigation system with vehicle state constraints. The algorithm is described and listed in detail. Abundant experiments were conducted for evaluating and comparing the performance of the MO, M-C, and MO-C methods. We could conclude that:


**Author Contributions:** K.Z. wrote the first draft of this paper, X.G. conducted the field tests, C.J. proposed the method, Y.X. and Y.L. reviewed and revised the paper, L.H. developed the software, and Y.C. guided the paper writing and discussed the method. All authors have read and agreed to the published version of the manuscript.

**Funding:** The authors acknowledge the support of the National Natural Science Foundation of China (Grant No. 61601225).

**Conflicts of Interest:** The authors declare no conflict of interest.
