Article

A Model-Based Method for Estimating the Attitude of Underground Articulated Vehicles

School of Mechanical Engineering, University of Science & Technology Beijing, Beijing 100083, China
*
Author to whom correspondence should be addressed.
Sensors 2019, 19(23), 5245; https://doi.org/10.3390/s19235245
Submission received: 28 September 2019 / Revised: 11 November 2019 / Accepted: 15 November 2019 / Published: 28 November 2019
(This article belongs to the Section Physical Sensors)

Abstract
This paper presents a novel model-based method for estimating the attitude of underground articulated vehicles (UAVs). We selected the Load–Haul–Dump (LHD) vehicle as our application object, as it is a typical UAV. First, we established the involved models of the LHD vehicle, including a kinematic model, the linear and angular constraints of a center articulation model, and a dynamic four degrees-of-freedom (DOF) yaw model. Second, we designed a Kalman filter (KF) to integrate the kinematic and constraint models with the data from an inertial measurement unit (IMU), overcoming gyroscope drift and disturbances from external acceleration. In addition, we designed another KF to estimate the yaw based on the dynamic yaw model. The accuracy of the estimations was further enhanced by data fusion. The proposed method was then validated by a simulation and a field test under different dynamic conditions. The errors in the estimation of roll, pitch, and yaw were 3.8%, 2.4%, and 4.2%, respectively, in the field test. The estimated longitudinal acceleration was used to obtain the velocity of the LHD vehicle; the error was found to be 1.2%. A comparison of these results with those of other methods showed that the proposed method has high precision. The proposed model-based method will greatly benefit the localization, navigation, and control of UAVs without any artificial infrastructure in a global positioning system (GPS)-free environment.

1. Introduction

With the increase in depth of underground mines, a hazardous environment and safety problems are becoming increasingly prominent. Autonomous vehicles (AVs) are urgently needed to ensure that underground mines are safe and efficient [1,2]. As a typical and important kind of underground vehicle, underground articulated vehicles (UAVs) are widely used in underground mines due to their advantages of a higher maneuverability and efficiency [3]. Therefore, automated UAVs (AUAVs) have first priority in underground mines. However, the automation of their precise control and navigation requires related information on the target vehicle [4,5,6], which includes the attitude, velocities, and even accelerations relative to different directions [7].
The acquisition of the attitude and states of vehicles has been studied intensively for many years [8,9,10,11]. What needs to be emphasized is that, although a global positioning system (GPS) can provide the related information with high accuracy, obstruction of the signal can cause the system to be invalid in some urban areas and tunnels, which is an especially serious problem in underground mines [9]. Two kinds of methods were designed in order to cope with this issue with UAVs according to the specific characteristics of underground mines: infrastructure-based methods [8] and sensor-based methods [12,13]. In addition, a mixture of both is sometimes used [14]. As efficient vehicles for underground mines, Load–Haul–Dump (LHD) vehicles, as shown in Figure 1, are considered to be study and application objects.
Infrastructure-based methods were applied during an early stage, but have not been widely used due to their cost and obstructions in the environment [8,15].
Regarding the sensor-based methods, sensors with various functions, such as laser scanners [16], odometers [9,17], and charge-coupled device (CCD) cameras [18], have been applied to obtain the states of vehicles in underground mines. Mäkelä [17] designed a guidance system for LHD vehicles that required no extra infrastructure in the tunnel, where only the yaw motion was considered. The yaw angles of the front and rear frames were obtained by a gyroscope, an angle encoder, and an odometer; however, the accuracy of the yaw angle was not given. Chi et al. [16] designed a positioning system based on laser sensors, in which the yaw angle is measured and corrected based on barcode theory; the authors only provided the longitudinal error from a field test, which was less than 15 cm after 40 m of travel. These sensor-based methods rely on external signals; for example, the lasers need to receive reflected signals, and the cameras take pictures based on reflected light. However, the dusty and dim environment of an underground mine can obstruct these signals [19], which can bring about uncertainties and errors during the measurement process. For these reasons, the robustness of these methods is poor. Additionally, although the data from the gyroscope provide basic information on the yaw motion, gyroscope drift is not considered, and the accumulated error can only be corrected with the assistance of beacons. Sometimes, continuous estimation during operation cannot be realized, which can seriously restrict speed and efficiency [20].
To overcome these difficulties with the infrastructure-based and sensor-based methods, some special devices and algorithms have been applied. Marshall et al. [21] designed a system without infrastructure for the automation of an LHD vehicle to improve the robustness and reliability. Laser rangefinders, a wheel odometer, and angle encoders were applied to provide the necessary data. An extended Kalman filter (EKF) was used to estimate the planar states, which included position and orientation. Although the errors in the estimation were not provided, the path-tracking errors in the heading angle and the lateral position, which were less than 0.1 rad and 0.28 m, respectively, demonstrated the validity of the system. Wu et al. [22] presented a system for estimating the yaw motion of an LHD vehicle, in which two ultra-wide band (UWB) modules were adopted and planar motion was assumed for the LHD vehicle. A field test showed that the error in the estimation was less than 3°. Jin et al. [9] proposed a model-based method for estimating the states of electric drive UAVs, in which an unscented Kalman filter (UKF) and a dynamic model were used. The estimated errors in the longitudinal velocity, yaw motion, and side-slip were less than 5% on a cement pavement. Although the accuracy was found to be acceptable, the uncertainties in the parameters of the dynamic model may bring about errors under uneven road conditions. Byeolteo et al. [23] presented a novel method for obtaining the states of position and direction with a magnetic field and simultaneous localization and mapping (SLAM) in an underground environment. Regarding the position and direction, results from field tests showed that the proposed method reduced the root mean square (RMS) of the errors by 95.56% and 99.34%, respectively, compared with the results of dead reckoning.
Although the robustness of the system was enhanced, some basic installations were necessary, and most of the sensors required a feasible environment. Moreover, the method was not verified on mine-like roads, where the road surface is rough and bumpy.
The gyroscope that was used in the abovementioned methods has no special requirements for operation in a mine environment, which makes it the preferred sensor. However, a high-grade gyroscope is too expensive to be widely used, and the drift in a general gyroscope may also lead to errors in the results. With the advances in micro-electro-mechanical systems (MEMS), inertial measurement units (IMUs), in which a gyroscope and an accelerometer are included, have been widely used for the estimation of attitude and positioning. Zhu et al. [24] presented a novel system for estimating the attitude and measuring the stability of articulated heavy vehicles. A set of IMU sensors formed the hardware foundation of the system, and several sensors were adopted by considering the special structure of articulated vehicles. The states of the target vehicle were obtained by data fusion. A field test showed that the errors in the pitch, roll, and yaw angles in a static state were 0.027°, 0.025°, and 0.426°, respectively. The pitch and roll motions were found to vary between ±20°, and the yaw motion was found to vary between 0° and 60°. The corresponding errors were all less than 4%. However, the results were obtained on pavement, and the system did not cope well with gyroscope drift.
Drift in the gyroscope and interference with the accelerometer remain key problems with IMUs, and some methods have been proposed to improve the accuracy of estimations. These include the threshold-based switching approach [25], the adaptive filter method [26], a model-based algorithm [10], and a comprehensive algorithm [27]. Furthermore, Ahmed et al. [28] investigated a novel algorithm for attitude estimation using low-cost IMU sensors, which was applied to a land vehicle. A kinematic model and the vector norm property of gravity were used to estimate the pitch and roll angles. After that, the obtained results were used to estimate the yaw angle in combination with a magnetometer. The results of a simulation and field tests showed that the proposed algorithm was robust and accurate under different dynamic conditions.
Most of the target vehicles that were studied in previous papers were four-wheeled commercial vehicles [29], which can be considered to be single rigid bodies. However, different from the target vehicles in the abovementioned studies, UAVs contain two parts, which are connected to each other with center articulation [24]. Although some methods have been applied to articulated vehicles, the dynamics, the kinematic model, and the defects in IMU sensors have not been considered. Moreover, the roll and pitch angles of UAVs have been neglected in related research [9,22], as the motion of the target vehicle was always considered to be on a plane. The motions of UAVs involve violent pitch and roll motions that occur during normal operation between different levels of underground mines or on an uneven road. Estimations of roll and pitch are essential to the navigation and dynamical control of UAVs in an underground mine environment.
In addition, as the most important piece of information about a target vehicle, the yaw motion is usually measured by the magnetometer in an IMU, which senses the Earth's magnetic field to obtain the yaw angle. Unfortunately, UAVs are usually put to work in underground metal mines, where the magnetic field may be distorted by metallic minerals. Moreover, unlike for roll and pitch, the accuracy of the yaw angle's estimation cannot be improved by using data on acceleration. Therefore, the yaw angle of a UAV cannot be obtained by an IMU alone. Inspired by the work in [9], we use the information from measurements and a kinematic model of center articulation to improve the accuracy of yaw angle estimation, which is obtained primarily from a dynamic model. Motivated by the above discussion, we improve the accuracy of the estimation of the states of a UAV. The three original contributions that distinguish the present work from previous works can be summarized as follows:
(1)
A novel model-based method for estimating the attitudes of UAVs is proposed. In the first step, the kinematic model and the constraints of the center articulation model are used to overcome the drift in the IMU.
(2)
A model-based algorithm that combines the dynamic model with a KF is developed to estimate the yaw motion of an LHD vehicle.
(3)
A fusion of the data from different models and sensors is carried out to improve the accuracy of attitude estimations.
The remainder of this paper is organized as follows. In Section 2, the involved models are established, these models are embedded into the KFs, and algorithms are described for the pitch and roll motion and yaw motion. In Section 3, a simulation of an LHD vehicle is carried out under different dynamic conditions, and data from the simulation are used to verify the method. Section 4 presents the results of a field test and a discussion. Finally, Section 5 summarizes our conclusions and future work.

2. Method

The proposed method is briefly described in the diagram shown in Figure 2. Several sensors are necessary, including IMUs, an encoder, a pressure sensor, and a length sensor. KFs are the major algorithms, and the yaw and the other attitude angles are obtained separately: in Figure 2, the black line after the angular model denotes the yaw of the LHD vehicle, and the green line denotes the roll and pitch. First, the data from the IMU located on the front frame are used to obtain the dynamic acceleration and attitude (roll and pitch) of the front frame. During this process, the kinematic model of the front frame and the norm property of the gravitational acceleration vector are used by the KF. Second, the information on acceleration is combined with the linear constraints to calculate the acceleration of the rear frame, which can be used to calculate the pitch and roll of the rear frame. Simultaneously, the attitude of the front frame is used to calculate the attitude of the rear frame with the angular constraints. The two results are then fused to obtain the final attitude of the LHD vehicle. The dynamic model of yaw is used to obtain the yaw of the front frame with the help of a KF and data from the sensors; the corresponding angle of the rear frame is obtained from the angular constraint model.

2.1. Models

2.1.1. Kinematic Model of the LHD Vehicle

The directions of the linear and angular velocities of the LHD vehicle are defined as shown in Figure 3. The front and rear frames are considered to be rigid bodies. According to the motion of a rigid body, the kinematic model of the front frame and the rear frame of the LHD vehicle can be expressed as:
$a_{x1} = \dot{v}_{x1} - \omega_{z1} v_{y1} + \omega_{y1} v_{z1}$ (1a)
$a_{y1} = \dot{v}_{y1} + \omega_{z1} v_{x1} - \omega_{x1} v_{z1}$ (1b)
$a_{z1} = \dot{v}_{z1} - \omega_{y1} v_{x1} + \omega_{x1} v_{y1}$ (1c)
$a_{x2} = \dot{v}_{x2} - \omega_{z2} v_{y2} + \omega_{y2} v_{z2}$ (1d)
$a_{y2} = \dot{v}_{y2} + \omega_{z2} v_{x2} - \omega_{x2} v_{z2}$ (1e)
$a_{z2} = \dot{v}_{z2} - \omega_{y2} v_{x2} + \omega_{x2} v_{y2}$ (1f)
where the subscripts x, y, and z indicate the different directions in Figure 3. The subscripts 1 and 2 indicate the front frame and the rear frame of the LHD vehicle, respectively.
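In component form, Equations (1a)–(1f) are simply the rigid-body relation a = v̇ + ω × v applied to each frame. A minimal NumPy sketch with hypothetical values:

```python
import numpy as np

def body_frame_acceleration(v_dot, omega, v):
    # Component form of Eqs. (1a)-(1f): a = v_dot + omega x v,
    # applied separately to the front (i = 1) and rear (i = 2) frames.
    return v_dot + np.cross(omega, v)

# Front frame turning at 0.2 rad/s about Z while moving forward at 3 m/s:
a1 = body_frame_acceleration(np.zeros(3),
                             np.array([0.0, 0.0, 0.2]),   # [wx1, wy1, wz1]
                             np.array([3.0, 0.0, 0.0]))   # [vx1, vy1, vz1]
# a_y1 = wz1 * vx1 = 0.6 m/s^2, the centripetal term of Eq. (1b)
```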
Kinematic constraints, which contain linear and angular components, hold between the two frames. More specifically, the linear constraints in the different directions can be expressed as:
$v_{x2} = v_{x1}\cos\varphi - \left(v_{y1} - L_{f2}\,\omega_{z1}\right)\sin\varphi$ (2a)
$v_{y2} = v_{x1}\sin\varphi + \left(v_{y1} - L_{f2}\,\omega_{z1}\right)\cos\varphi - L_{r2}\,\omega_{z2}$ (2b)
$v_{z2} = v_{z1} + \left(L_{f2} + L_{r2}\right)\omega_{y1}$ (2c)
where φ is the steering angle of the center articulation, and Lf2 and Lr2 are the distances from the centers of gravity of the front frame and the rear frame to the point of articulation, respectively, as shown in Figure 3. Similarly, the angular constraints in different directions can be expressed as:
$\theta_2 = \theta_1\cos\varphi + \phi_1\sin\varphi$ (3a)
$\phi_2 = \phi_1\cos\varphi - \theta_1\sin\varphi$ (3b)
$\varphi = \psi_1 - \psi_2$ (3c)
where θ, ϕ, and ψ are the pitch, roll, and yaw angles, respectively (ϕ denotes roll, while φ denotes the articulation angle).
According to [10], the external acceleration must be removed to improve the accuracy of pitch and roll estimation. The constraints (2a)–(2c) may be useful for the removal of external acceleration. In addition, Equations (3a)–(3c) can also be used to estimate the attitude of the rear frame. However, neither of them can be the only source of the attitude’s estimation because of the overly ideal assumptions and the errors in the measurement.
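The constraints above can be sketched as two small helper functions. This is a sketch under the sign conventions reconstructed here (φ = ψ1 − ψ2); the function names and numerical values are hypothetical:

```python
import numpy as np

def rear_linear_states(vx1, vy1, vz1, wy1, wz1, wz2, phi, Lf2, Lr2):
    # Linear constraints (2a)-(2c): rear-frame velocities from
    # front-frame states and the articulation angle phi = psi1 - psi2.
    c, s = np.cos(phi), np.sin(phi)
    vx2 = vx1 * c - (vy1 - Lf2 * wz1) * s
    vy2 = vx1 * s + (vy1 - Lf2 * wz1) * c - Lr2 * wz2
    vz2 = vz1 + (Lf2 + Lr2) * wy1
    return vx2, vy2, vz2

def rear_angular_states(theta1, phi1, psi1, phi):
    # Angular constraints (3a)-(3c): rear-frame pitch, roll, and yaw.
    c, s = np.cos(phi), np.sin(phi)
    return theta1 * c + phi1 * s, phi1 * c - theta1 * s, psi1 - phi

# With zero articulation the two frames share the same states:
states = rear_linear_states(3.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.2, 1.4)
```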

2.1.2. Dynamic Model of Yaw for the LHD Vehicle

The dynamic model of yaw is used to restrain the drift in the yaw information from the gyroscope in the IMU. Considering the efficiency of the method, a simplified dynamic model of the LHD vehicle was adopted, which is shown in Figure 4. The directions of the motions are the same as the directions that are defined in Figure 3.
According to Newton’s second law, the simplified dynamic model can be given by:
$\left(I_{z1} + m_1 L_{f2}^2\right)\dot{\omega}_{z1} = T_0 - F_{y1}\left(L_{f2} - L_{f1}\right)$ (4a)
$\left(I_{z2} + m_2 L_{r2}^2\right)\dot{\omega}_{z2} = -T_0 + F_{y2}\left(L_{r2} + L_{r1}\right)$ (4b)
$m_1\dot{v}_{x1} + m_2\dot{v}_{x2}\cos\varphi = F_{xf} + F_{xr}\cos\varphi - F_{yr}\sin\varphi$ (4c)
$m_1\dot{v}_{y1} + m_2\dot{v}_{y2}\cos\varphi = F_{yf} + F_{yr}\cos\varphi - F_{xr}\sin\varphi$ (4d)
where Iz1 and Iz2 are the moments of inertia of the front frame and the rear frame, respectively. Fxf, Fxr, Fyf, and Fyr are the tire forces of the front and rear axles in the X and Y directions, respectively. T0 is the steering moment; although it can be modeled as a function of φ in a steady state [9], the LHD vehicle normally operates in an unsteady state. Here, we consider the steering moment as an input of the dynamic model, which can be indirectly measured by the pressure and displacement sensors [30]. Considering the constraint in Equation (3c), it is sufficient to focus on one of the two yaw motions in Equation (4).
In addition, the cornering force of a tire can be given by:
$F_{yi} = k_i\,\alpha_i$ (5)
where ki and αi are the cornering (side-slip) stiffness and the side-slip angle of tire i, respectively. The side-slip angles can be described by:
$\alpha_1 = \left(v_{y1} - L_{f1}\,\omega_{z1}\right)/v_{x1}$ (6a)
$\alpha_2 = \left[v_{y1} + \left(L_{r1} - L_{f2} - L_{r2}\right)\omega_{z1} + \left(L_{r2} - L_{r1}\right)\dot{\varphi}\right]\big/ v_{x1} + \varphi$ (6b)
Therefore, the state equation of the dynamic model of yaw for the front frame can be given by:
$\dot{\omega}_{z1} = \dfrac{T_0}{I_{z1} + m_1 L_{f2}^2} - \dfrac{k_1\beta\left(L_{f2} - L_{f1}\right)}{I_{z1} + m_1 L_{f2}^2} + \dfrac{k_1\,\omega_{z1} L_{f1}\left(L_{f2} - L_{f1}\right)}{\left(I_{z1} + m_1 L_{f2}^2\right)v_{x1}}$ (7)
where the sideslip angle β can be expressed as:
$\beta = v_{y1}/v_{x1} \approx -L_{f1}/R_1$ (8a)
$R_1 = \left(L_{f2}\cos\varphi + L_{r2}\right)/\sin\varphi$ (8b)
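As a sketch of how the state equation (7), together with the sideslip relations (8a)–(8b), can be evaluated numerically. The moment-arm term (Lf2 − Lf1), the sign conventions, and all parameter values below are assumptions for illustration:

```python
import numpy as np

def yaw_rate_derivative(T0, wz1, vx1, phi, k1, Iz1, m1, Lf1, Lf2, Lr2):
    # State equation (7) with the sideslip angle from Eqs. (8a)-(8b).
    # Moment arm (Lf2 - Lf1) and signs follow the conventions assumed here.
    I_eff = Iz1 + m1 * Lf2 ** 2
    R1 = (Lf2 * np.cos(phi) + Lr2) / np.sin(phi)      # Eq. (8b)
    beta = -Lf1 / R1                                  # Eq. (8a)
    return (T0 / I_eff
            - k1 * beta * (Lf2 - Lf1) / I_eff
            + k1 * wz1 * Lf1 * (Lf2 - Lf1) / (I_eff * vx1))

# With zero cornering stiffness the steering moment alone drives the yaw:
wdot = yaw_rate_derivative(T0=2300.0, wz1=0.0, vx1=3.0, phi=0.3,
                           k1=0.0, Iz1=23000.0, m1=0.0,
                           Lf1=1.0, Lf2=1.5, Lr2=1.5)
# wdot = T0 / Iz1 = 0.1 rad/s^2
```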
In addition to the kinematic model and the dynamic model, a model of the sensors should also be taken into account.

2.1.3. Model of the Sensors

Two kinds of sensor are applied in the proposed method, i.e., an IMU sensor and an angular encoder. Considering the noise in the signals from the involved sensors, which include a tri-axis gyroscope, a tri-axis accelerometer, and an angular encoder, the models of the involved sensors can be described in the corresponding frame as follows:
$y_{i,G} = \omega_{i,G} + n_{i,G}$ (9a)
$y_{i,A} = g_{i,A} + a_{i,A} + n_{i,A}$ (9b)
$y_v = v_x + n_v$ (9c)
$y_c = \varphi_c + n_c$ (9d)
where the subscript i = 1, 2 denotes the front frame and the rear frame of the LHD vehicle, respectively; yi,G is a column vector [ωi,Gx ωi,Gy ωi,Gz]T containing the three angular velocities about the X, Y, and Z axes of the frame of installation; ωi,G is the corresponding true value [ωi,x ωi,y ωi,z]T; and ni,G is zero-mean white Gaussian noise about each axis. In Formula (9b), yi,A is also a column vector [yi,Ax yi,Ay yi,Az]T, which contains the acceleration of gravity gi,A = [gi,x gi,y gi,z]T, the true dynamic acceleration of the target ai,A = [ai,Ax ai,Ay ai,Az]T, and the corresponding zero-mean white Gaussian noise ni,A. Similarly, in Formula (9c), yv is the scalar longitudinal velocity from the angular encoder, including the true value vx and noise nv. In Formula (9d), yc is the scalar value of the angular transducer, which contains the true articulation angle φc and noise nc.

2.2. State Vector

2.2.1. State Vector of Pitch and Roll

Considering the conditions of an LHD vehicle, the Euler angle method is a good choice for attitude representation. Then, the states of the LHD vehicle can be defined as pitch (θi), roll (ϕi), and yaw (ψi). The transformation relationship between the vehicle coordinate frame (VX) and the navigation coordinate frame (NX) can be expressed as [27]:
$X_{Ni} = T_i X_{Vi}$ (10)
where Ti is the transformation matrix, and the subscripts i = 1,2 represent the front frame and the rear frame, respectively. The transformation matrix can be expressed as:
$T_i = \begin{bmatrix} c\psi_i c\theta_i & c\psi_i s\theta_i s\phi_i - s\psi_i c\phi_i & c\psi_i s\theta_i c\phi_i + s\psi_i s\phi_i \\ s\psi_i c\theta_i & s\psi_i s\theta_i s\phi_i + c\psi_i c\phi_i & s\psi_i s\theta_i c\phi_i - c\psi_i s\phi_i \\ -s\theta_i & c\theta_i s\phi_i & c\theta_i c\phi_i \end{bmatrix}$ (11)
where c and s are the trigonometric functions cosine and sine, respectively. We note that the last row of the transformation matrix only contains the pitch and roll angles, which can be used to calculate these angles with:
$\phi_i = \arctan\left(T_{i,32}/T_{i,33}\right)$ (12a)
$\theta_i = \arctan\left(-T_{i,31}\big/\left(T_{i,32}\sin\phi_i + T_{i,33}\cos\phi_i\right)\right)$ (12b)
where Ti,jk is the (j, k)-th element of Ti.
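Equations (12a)–(12b) can be checked with a round trip: build the last row of Ti from known angles and recover them. The sketch below uses atan2 (rather than arctan) to keep the correct quadrant, which is an implementation choice rather than the paper's formula:

```python
import numpy as np

def roll_pitch_from_last_row(T31, T32, T33):
    # Eqs. (12a)-(12b): roll and pitch from the last row of T_i.
    roll = np.arctan2(T32, T33)
    pitch = np.arctan2(-T31, T32 * np.sin(roll) + T33 * np.cos(roll))
    return roll, pitch

# Round trip: build the last row of Eq. (11) from known angles.
theta, roll = 0.2, -0.1   # pitch, roll in radians
row = (-np.sin(theta),
       np.cos(theta) * np.sin(roll),
       np.cos(theta) * np.cos(roll))
est_roll, est_pitch = roll_pitch_from_last_row(*row)
```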
As mentioned in [10], the state vector of a target vehicle at time t, which is also the actual state, can be expressed as:
$X_{i,t} = \left[T_{i,31}\;\; T_{i,32}\;\; T_{i,33}\right]^T$ (13)
The acceleration of gravity in Equation (9b) can be given by:
$g_{i,A} = g\,X_{i,t}$ (14)
where g is the gravitational acceleration constant, which is 9.81 m/s2.

2.2.2. State Vector of Yaw

According to Equation (7), the yaw motion of the front frame can be estimated by a state equation, and the corresponding angle of the rear frame can be obtained by Equation (3c). Therefore, the state vector of yaw motion can be defined as:
$X_{z,t} = \omega_{z1}$ (15)

2.3. Design of KFs

2.3.1. KF for Pitch and Roll

The model of a linear KF, which includes the model of process and measurement, can be expressed as:
$X_{i,t} = \Phi_{i,t-1} X_{i,t-1} + w_{i,t-1}$ (16a)
$z_{i,t} = H_i X_{i,t} + v_{i,t}$ (16b)
where Xi,t is the state vector at time t as defined in Equation (13), Φi,t−1 is the state transition matrix, Hi is the observation matrix, and w and v are the white Gaussian noise in the process and measurement, respectively.
The transformation matrix can be calculated with an integration from t − 1 to t [13]:
$T_{i,t} = T_{i,t-1}\left(I_3 + \Delta t\,\tilde{\omega}_{i,t-1}\right)$ (17)
where I3 is a 3 × 3 identity matrix, Δt is the sampling interval, and the tilde (~) denotes the skew-symmetric cross-product matrix of the target vector [27].
Not all of the terms in Ti,t are involved, as only the last row of Ti,t is useful for the estimation of attitude. We replace the transformation matrix Ti,t in Equation (17) with the state vector Xi,t, which is defined in Equation (13). The new integration can be given by
$X_{i,t} = \left(I_3 + \Delta t\,\tilde{\omega}_{i,t-1}\right) X_{i,t-1}$ (18)
where ω can only be measured by the gyroscope in the IMU, which is modeled in Equation (9a). We substitute Equation (9a) into Equation (18) to obtain
$X_{i,t} = \left(I_3 + \Delta t\,\tilde{y}_{i,G,t-1}\right) X_{i,t-1} + \Delta t\,\tilde{X}_{i,t-1}\, n_{i,G}$ (19)
Comparing Equation (19) with Equation (16a), the state transition matrix and the noise in the process can be expressed as:
$\Phi_{i,t-1} = I_3 + \Delta t\,\tilde{y}_{i,G,t-1}$ (20a)
$w_{i,t-1} = \Delta t\,\tilde{X}_{i,t-1}\, n_{i,G}$ (20b)
When combined with Equation (15), the covariance of the noise in the process can be expressed as:
$Q_{i,t-1} = \Delta t^2\,\tilde{X}_{i,t-1}\, E\!\left[n_{i,G}\, n_{i,G}^T\right]\tilde{X}_{i,t-1}^T$ (21)
where E is the expectation operator; the covariance of the noise from the gyroscope [27] is equal to σi,G2 I3, which assumes that the noise covariance is the same in the different directions.
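The process model (20a) and the covariance (21) reduce to a few lines once the tilde (skew-symmetric) operator is written out. A sketch, assuming E[n nᵀ] = σ²I3 as stated above:

```python
import numpy as np

def skew(v):
    # The tilde operator: cross-product (skew-symmetric) matrix of v.
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def process_model(y_gyro, X_prev, dt, sigma_g):
    # State transition matrix (20a) and process-noise covariance (21),
    # with E[n n^T] = sigma_g^2 * I3 as stated in the text.
    Phi = np.eye(3) + dt * skew(y_gyro)
    Xs = skew(X_prev)
    Q = dt ** 2 * sigma_g ** 2 * (Xs @ Xs.T)
    return Phi, Q

Phi, Q = process_model(np.array([0.01, -0.02, 0.3]),
                       np.array([0.0, 0.0, 1.0]), dt=0.01, sigma_g=0.02)
```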
In order to improve the accuracy of estimations, we identify the external acceleration with the kinematic model of the target vehicle [10]. Equations (1a)–(1c) can be used to determine the external acceleration of the front frame while neglecting some minor terms. Finally, the kinematic model can be given by:
$a_{y1} = \dot{v}_{y1} + \omega_{z1} v_{x1}$ (22a)
$a_{z1} = \dot{v}_{z1} - \omega_{y1} v_{x1}$ (22b)
where the longitudinal velocity vx1 is vital to the model, and the angular velocities ωy1 and ωz1 can be replaced by data from the IMU.
In addition, $\dot{v}_{y1}$ and $\dot{v}_{z1}$ can be described with a first-order low-pass filter [10]:
$a_{1y,t} = c_{1y,a}\, a_{1y,t-1} + \varepsilon_t$ (23a)
$a_{1z,t} = c_{1z,a}\, a_{1z,t-1} + \varepsilon_t$ (23b)
where c1y,a and c1z,a are constants between 0 and 1 that determine the cut-off frequency [10]; both are considered to be equal to ca. εt is the time-varying error of this model. We substitute Equations (22a) and (22b) into Equations (23a) and (23b), respectively, to obtain:
$a_{t,1} = c_a a_{t-1} + y_{Gt,1}\, v_t - n_{G,1} n_{v,1} + \varepsilon_t$ (24)
where at,1 = [a1y,t a1z,t]T, yGt,1 = [yG1z,t −yG1y,t]T, nG,1 = [nGz −nGy]T, nv,1 = [nv nv]T, and εt = [εyt εzt]T.
Then, we substitute Equation (9b) into Equation (24) to obtain:
$y_{At,1} = H_m\, g X_t + c_a a_{t-1} + y_{Gt,1}\, v_t - n_{G,1} n_{v,1} + \varepsilon_t + n_{A,1}$ (25)
where yAt,1 = [yA1y,t yA1z,t]T, Hm is a selection matrix that discards the acceleration along the X-axis, and nA,1 = [nAy nAz]T.
In order to correspond with the measurement model in Equation (16b), Equation (25) can be written as:
$y_{At,1} - c_a a_{t-1} - y_{Gt,1}\, v_t = H_m\, g X_t - n_{G,1} n_{v,1} + \varepsilon_t + n_{A,1}$ (26)
Then, we can obtain
$z_{t,1} = y_{At,1} - c_a a_{t-1} - y_{Gt,1}\, v_t$ (27a)
$H_1 = g H_m$ (27b)
$v_t = -n_{G,1} n_{v,1} + \varepsilon_t + n_{A,1}$ (27c)
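A sketch of how the measurement (27a) and matrix (27b) might be assembled; the explicit form of Hm (selecting the Y and Z axes) and all input values are assumptions:

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def roll_pitch_measurement(y_acc_yz, a_ext_prev, y_gyro_sel, v_meas, c_a):
    # Measurement (27a): subtract the predicted external acceleration
    # (low-pass term plus rotational term) from the Y/Z accelerometer data.
    z = y_acc_yz - c_a * a_ext_prev - y_gyro_sel * v_meas
    # H_1 = g * H_m, Eq. (27b); H_m is assumed to select the Y and Z axes.
    H = G * np.array([[0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0]])
    return z, H

# Static vehicle: the measurement reduces to the gravity components.
z, H = roll_pitch_measurement(np.array([0.0, 9.81]), np.zeros(2),
                              np.zeros(2), 0.0, c_a=0.9)
```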
Then, we can obtain the improved estimate of X1,t(2) and X1,t(3) in the state vector X1,t. All the terms of the vector satisfy:
$X_{1,t}(1)^2 + X_{1,t}(2)^2 + X_{1,t}(3)^2 = 1$ (28)
This provides an approach to calculating X1,t(1) in combination with the measurement model, which can be given by [27]
$y_{Ax,t} = \sqrt{1 - \left(X_{1,t}(2)^2 + X_{1,t}(3)^2\right)} + \gamma_t$ (29)
Then, we can obtain
$z_{t,2} = y_{Ax,t}$ (30a)
$H_2 = \left[1\;\; 0\;\; 0\right]$ (30b)
$v_{t,2} = \gamma_t$ (30c)
After the external acceleration of the front frame has been obtained, on the one hand, the data regarding acceleration from the IMU can be used to estimate the attitude of the front frame; on the other hand, these data can also be used to estimate the corresponding acceleration of the rear frame by the derived linear kinematic constraints. The attitude of the front frame can also be used to estimate the attitude of the rear frame with the angular constraints.
According to Equations (2a)–(2c), the acceleration of the rear frame can be expressed as
$y_{At,2} - o_t\!\left(a_{t,1}\right) = g X_{2,t} + n_{A,2}$ (31)
where yAt,2 = [yA2x,t yA2y,t yA2z,t]T is the measurement of the accelerometer on the rear frame, and ot(at,1) = [a2x,t a2y,t a2z,t]T is the external acceleration of the rear frame derived from at,1 with the linear constraints.
Then, we can obtain
$z_{t,3} = y_{At,2} - o_t\!\left(a_{t,1}\right)$ (32a)
$H_3 = g I_3$ (32b)
$v_{t,3} = n_{A,2}$ (32c)
Similarly, the pitch and roll of the rear frame can be obtained with Equations (12a) and (12b).
According to Equations (3a)–(3c), the attitude vector of the rear frame can also be expressed as a function of that of the front frame:
$\begin{bmatrix}\theta_2 \\ \phi_2 \\ \psi_2\end{bmatrix} = \begin{bmatrix}\cos\varphi & \sin\varphi & 0 \\ -\sin\varphi & \cos\varphi & 0 \\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}\theta_1 \\ \phi_1 \\ \psi_1\end{bmatrix} + \begin{bmatrix}0 \\ 0 \\ -1\end{bmatrix}\varphi$ (33)
In a general way, we can obtain the attitude of the rear frame by substituting the estimation of the attitude of the front frame into Equation (33).

2.3.2. KF for Yaw

According to Equations (7), (12a) and (12b), the KF for yaw can be expressed as
$X_{z,t} = A_{z,t-1} X_{z,t-1} + B_z u_{z,t-1} + w_{z,t-1}$ (34a)
$z_{z,t} = H_z X_{z,t} + v_{z,t}$ (34b)
where $A_{z,k-1} = k_1 L_{f1}\left(L_{f2} - L_{f1}\right)\big/\left[\left(I_{z1} + m_1 L_{f2}^2\right)v_{x,k-1}\right]$, $B_{z,k} = \left[\,1/\!\left(I_{z1} + m_1 L_{f2}^2\right)\;\; -k_1\left(L_{f2} - L_{f1}\right)/\!\left(I_{z1} + m_1 L_{f2}^2\right)\right]$, $u_{z,k} = \left[\,T_{0,k}\;\; \beta_k\,\right]^T$, and $H_z = 1$.
Then, the yaw of the front frame can be obtained by:
$\hat{X}_{z,k}^- = A_z \hat{X}_{z,k-1} + B_z u_{z,k-1}$
$P_{z,k}^- = A_z P_{z,k-1} A_z^T + Q_z$
$K_{z,k} = P_{z,k}^- H_z^T \left(H_z P_{z,k}^- H_z^T + R_z\right)^{-1}$
$\hat{X}_{z,k} = \hat{X}_{z,k}^- + K_{z,k}\left(z_{z,k} - H_z \hat{X}_{z,k}^-\right)$
$P_{z,k} = \left(I - K_{z,k} H_z\right) P_{z,k}^- \left(I - K_{z,k} H_z\right)^T + K_{z,k} R_z K_{z,k}^T$ (35)
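One cycle of the recursion in Equation (35) for the scalar yaw state can be sketched as follows; the Joseph-form covariance update mirrors the last line of (35), and all numbers are hypothetical:

```python
import numpy as np

def yaw_kf_step(x, P, u, z, A, B, H, Q, R):
    # One cycle of Eq. (35) for the scalar yaw state, Joseph-form update.
    x_pred = A * x + B @ u            # a-priori state
    P_pred = A * P * A + Q            # a-priori covariance
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1 - K * H) * P_pred * (1 - K * H) + K * R * K
    return x_new, P_new

# One step with hypothetical numbers: prior x = 0, P = 1, measurement z = 1.
x, P = yaw_kf_step(x=0.0, P=1.0, u=np.zeros(2), z=1.0,
                   A=1.0, B=np.zeros(2), H=1.0, Q=0.0, R=1.0)
# K = 0.5, so x -> 0.5 and P -> 0.5
```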
The corresponding part of the rear frame can be obtained with Equation (3c).

2.4. Fusion of States

The two estimates of the attitude of the rear frame, obtained from the linear and the angular constraints, respectively, should be fused to improve the accuracy of the estimation. The attitudes from the different sources can be expressed as:
$\bar{X}_t = \left[\bar{\theta}_{2,t}\;\; \bar{\phi}_{2,t}\right]^T$ (36a)
$X_t = \left[\theta_{2,t}\;\; \phi_{2,t}\right]^T$ (36b)
where $\bar{X}_t$ and $X_t$ are the estimated values from the angular and linear constraints, respectively.
The fusion of the data can be given by
$\hat{X}_{2,t} = \left(U_{1,t}^{-1} + U_{2,t}^{-1}\right)^{-1}\left(U_{1,t}^{-1}\bar{X}_t + U_{2,t}^{-1} X_t\right)$ (37)
where $U_{1,t}$ and $U_{2,t}$ are the corresponding estimation error covariances, respectively.
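The fusion rule (37) is a standard inverse-covariance (weighted least-squares) combination. A sketch with hypothetical estimates:

```python
import numpy as np

def fuse_attitudes(x_ang, U_ang, x_lin, U_lin):
    # Inverse-covariance fusion, Eq. (37): weight each source by the
    # inverse of its estimation error covariance.
    W_ang, W_lin = np.linalg.inv(U_ang), np.linalg.inv(U_lin)
    return np.linalg.solve(W_ang + W_lin, W_ang @ x_ang + W_lin @ x_lin)

# Equal covariances reduce the fusion to a simple average:
fused = fuse_attitudes(np.array([0.10, -0.04]), np.eye(2),
                       np.array([0.12, -0.02]), np.eye(2))
# fused = [0.11, -0.03]
```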

3. Simulation Verification

To verify the validity and accuracy of the proposed method, a simulation was performed with a verified prototype model of the LHD vehicle in ADAMS; the relevant parameters are shown in Appendix A. The data were processed with the proposed method to estimate the attitude of the LHD vehicle.

3.1. Simulation Setup

According to [15,25], the sampling frequency of the simulation was set to 100 Hz. The speed of the LHD vehicle was set to 14 km/h under normal running conditions and 9.5 km/h during the constant radius steering process. The noise variances of the accelerometer and the gyroscope were set to 0.4 mg/√Hz and 0.02 dps/√Hz, respectively. The noise variances of the encoder and the velocity measurement were set to 0.01 dps/√Hz and 0.2 (m/s)/√Hz, respectively.

3.2. Simulation Conditions

In order to test the performance of the proposed method thoroughly, we included nine conditions, which are the static condition (C1, 0–3 s), the acceleration condition (C2, 3–8 s), the obstacle condition (C3, 9–10 s), the sine condition (C4, 10–20 s), the steady-state condition (C5, 22–25 s), the unilateral bridge condition (C6, 26–34 s), the single lane change condition (C7, 38–42 s), the brake condition (C8, 45–48 s), and the constant radius steering condition (C9, 55–68 s). The acceleration data and the angular velocity data from the IMUs are shown in Figure 5 and Figure 6, respectively.
As we can see from Figure 5, during C1, the LHD vehicle was static; the accelerations in the lateral and longitudinal directions were zero, and the vertical accelerations of the two frames were equal to the gravitational acceleration. During C2, the LHD vehicle accelerated in the longitudinal direction according to the acceleration data in Figure 5a, and the corresponding angular velocities were very small because the LHD vehicle has no suspension. During C3, the left and right front wheels of the LHD vehicle simultaneously encountered a bump; as a result, the vertical acceleration and the pitch motion changed dramatically. During C4, the LHD vehicle ran over a sine bump, and the pitch motion changed with the rise and fall of the road. After that, the LHD vehicle ran at a constant speed, C5, which was similar to C1.
According to Figure 6, the velocity of the LHD vehicle oscillated during C4 and C5. This was caused by the rough control strategy for velocity during the simulation. However, the range remained within the LHD vehicle’s normal running velocity range.
Then, the left wheel of the LHD vehicle encountered a sine bump, C6, which brought about changes in the roll motion, and the lateral acceleration also changed with the rise and fall of the road. During C7, the yaw motion of the front frame and the rear frame changed obviously. During C8, the acceleration in the longitudinal direction, in contrast to C2, was negative, and the speed of the LHD vehicle was also reduced to 9.5 km/h. Then, the LHD vehicle steered in a constant radius. At the start of C9, the angular velocities of the front frame and the rear frame were different, which implies that the angles of central articulation had changed.

3.3. Estimation Results

The results of the estimation of the pitch, roll, and yaw angles of the LHD vehicle are shown in Figure 7 and Figure 8, respectively, and the time history of the error covariance matrix during the estimation is shown in Figure 9. The upper part of each figure is a comparison between the estimated values and the true values, and the lower part shows the errors in the estimation. We found that the roll and pitch of the front frame and the rear frame were almost the same, so only the attitude of the rear frame is shown in Figure 7; however, we included the yaw of both frames. As we can see from Figure 7, Figure 8 and Figure 9, the estimated values and the true values are in good agreement. The RMS errors in the roll, pitch, and yaw were 0.2° (2.4%), 0.19° (2.1%), and 1.1° (0.23%), respectively. The percentages were obtained from the maximum RMS and the largest values of the corresponding angles. The maximum error in the estimations appeared in C9 (the constant radius steering condition), which was caused by the slip of the LHD vehicle during steering. Although small differences appeared during the dynamic process according to the error, the estimations were found to be satisfactory.
Then, the estimated longitudinal accelerations of the rear frame were integrated to obtain the velocity of the LHD vehicle. Figure 10a compares the estimated velocity with the reference velocity, and Figure 10b shows the error in the estimation. Between 15 and 18 s, the drastic changes in the pitch angle and in the tire characteristics increased the estimation errors of the acceleration and of the dynamic model, and the acceleration error in turn caused a deviation in the velocity. The error in the estimated speed was less than 0.27 km/h (3%). This means that the proposed method can be used in the control and navigation of an LHD vehicle without any infrastructure.
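The integration of the longitudinal acceleration to velocity can be sketched with the trapezoidal rule; the signal values and the 100 Hz sample rate below are assumptions for illustration:

```python
import numpy as np

def velocity_from_acceleration(a_x, dt, v0=0.0):
    """Integrate longitudinal acceleration (m/s^2) to velocity (m/s)
    with the trapezoidal rule, starting from an initial velocity v0."""
    a_x = np.asarray(a_x, dtype=float)
    v = np.empty_like(a_x)
    v[0] = v0
    # cumulative trapezoidal integration over uniform time steps
    v[1:] = v0 + np.cumsum(0.5 * (a_x[1:] + a_x[:-1]) * dt)
    return v

dt = 0.01                 # 100 Hz sample period (assumed)
a_x = np.full(100, 0.5)   # constant 0.5 m/s^2 for about 1 s (assumed)
v = velocity_from_acceleration(a_x, dt)
```

Any bias left in the estimated acceleration accumulates linearly under this integration, which is why the error between 15 and 18 s propagates into the velocity.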

4. Experimental Results

4.1. Experimental Setup

The field test of the LHD vehicle was carried out on an uneven road similar to the roads in underground mines. The test system is shown in Figure 11. An LMS SCADAS system (①, SCM05) was used to record the data during the field test, and the sample frequency was set to 100 Hz according to [15]. Two IMUs (②, ③, Beijingsanchi, SC-AHRS-200A) were installed in the front frame and the rear frame, respectively. Two kinds of encoders were used to measure the articulation angle (④, MIRAN, WOA-C) and the transmission speed (⑤, SCHMERSAL IFL 5-18M-10P), respectively. A high-precision gyroscope (⑥; bias stability less than 0.05 °/h, random walk coefficient less than 0.005 °/√Hz) was installed to provide reference values.
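As a rough plausibility check on the reference gyroscope specification, the random walk coefficient can be converted into an expected 1-sigma angle drift over a test run. The sketch below interprets the coefficient as a rate noise density and assumes a 600 s run; both are assumptions, not figures from the paper:

```python
import math

def angle_random_walk_drift(arw_deg_per_sqrt_s, duration_s):
    """1-sigma angle drift (deg) accumulated from angle random walk
    after duration_s seconds, using the standard sigma = N * sqrt(t) rule."""
    return arw_deg_per_sqrt_s * math.sqrt(duration_s)

# 0.005 deg/sqrt(Hz) noise density, 600 s test duration (assumed)
sigma = angle_random_walk_drift(0.005, 600.0)   # roughly 0.12 deg
```

A drift on the order of 0.1 deg over the run is small compared with the estimation errors reported below, which supports using this gyroscope as the reference.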
Different conditions were adopted to simulate the working conditions of the LHD vehicle. The route of an LHD vehicle in underground mines is usually special, always including a square turn and a change in operating direction. Thus, the route for the field test was planned as shown in Figure 12, where the direction of the arrow represents the direction of the LHD vehicle and the solid and hollow lines correspond to the forward and backward directions, respectively. The key dimensions are L1 = 50 m and L2 = 20 m. The operations between Point A and Point B were used to simulate the operations between the blast point and the dump point in underground mines [2]. Although the trajectory in the field test was not as long as those of other autonomous ground vehicles, the operation and dynamic characteristics of this route cover the characteristics of the LHD vehicle during normal operations.

4.2. Experimental Results

According to the data from the field test, the estimation results for the front and rear frames were almost the same; thus, only the results for the rear frame’s attitude angles are shown and analyzed. The estimation results for roll, pitch, and yaw are shown in Figure 13, Figure 14 and Figure 15, respectively. The upper part (a) of each figure compares the reference values with the estimation, and the lower part (b) shows the error in the estimation, with the dramatic or typical changes in the errors shown in detail. The reference values were sourced from the high-precision reference gyroscope.

4.3. Discussion of Results

As we can see from Figure 13 and Figure 14, the estimations of roll and pitch are consistent with the reference values, while some errors fluctuate sharply. This is due to the special structure of the LHD vehicle’s chassis, which is also a common characteristic of UAVs. The frames of an LHD vehicle are rigidly connected to the front and rear drive axles, which means that these vehicles do not have a suspension that can filter vibrations from the road. Although rubber pads are used to attenuate sharp impacts, the improvement is limited. Hence, vibrations bring about sharp fluctuations in the estimation errors.
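One common mitigation for such vibration-induced fluctuations is to low-pass filter the accelerometer signals before they enter the attitude filter. A first-order IIR sketch follows; the 5 Hz cutoff is an assumption for illustration, not a value from the paper:

```python
import math

def low_pass(signal, dt, f_cut):
    """First-order low-pass (exponential smoothing) of a uniformly
    sampled signal; dt is the sample period, f_cut the cutoff in Hz."""
    alpha = dt / (dt + 1.0 / (2.0 * math.pi * f_cut))
    out = [float(signal[0])]
    for x in signal[1:]:
        out.append(out[-1] + alpha * (x - out[-1]))
    return out

# 100 Hz samples (dt = 0.01 s), 5 Hz cutoff (assumed)
filtered = low_pass([0.0, 1.0, 1.0, 1.0], 0.01, 5.0)
```

The trade-off is phase lag: too low a cutoff delays genuine attitude changes, so the cutoff must sit above the vehicle's roll and pitch dynamics but below the dominant road vibration band.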
As we can see from Figure 15, the estimation of yaw is highly consistent with the reference value; the error is less than 5 degrees and does not increase with time. The error only changed obviously after the square turns (100–150 s, 260–310 s). During a square turn, the tires of the LHD vehicle slipped severely, so errors were introduced through the constraints in Equations (8a) and (8b). In addition, according to Figure 16, ripples also occurred between 164 s and 166 s. This error can be traced back to the input of the estimation model, namely the steering moment, which is determined by the pressure in the steering cylinders. The pressure is shown in Figure 16, where “right” and “left” refer to the rod-less cavity pressure in the right and left steering cylinders, respectively. As we can see from Figure 16, the two pressures rippled alternately between 12.5 MPa and 0.2 MPa after a steering maneuver. This means that oscillatory yaw motion of the frames occurred, which is typically caused by instability in the hydraulic steering system. Although various errors occurred during the estimation, the accuracy was found to meet the requirements of practical applications.
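The steering moment input to the yaw model is derived from the measured cylinder pressures. A minimal sketch under simplified geometry follows; the piston area and moment arm are hypothetical placeholders, and the rod-side pressures and geometry changes with articulation angle are neglected:

```python
def steering_moment(p_left, p_right, piston_area, moment_arm):
    """Net articulation moment (N·m) from the rod-less cavity
    pressures (Pa) of the left and right steering cylinders,
    assuming both cylinders act on the same moment arm about
    the articulation joint."""
    return (p_left - p_right) * piston_area * moment_arm

# Hypothetical values: 12.5 MPa vs 0.2 MPa, 8e-3 m^2 piston, 0.4 m arm
M = steering_moment(12.5e6, 0.2e6, 8.0e-3, 0.4)
```

Under this simplification, the alternating pressure ripple between 12.5 MPa and 0.2 MPa translates directly into an alternating moment input, which explains the ripples seen in the yaw estimate.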
During the estimation, the estimated longitudinal dynamic acceleration was converted to the velocity of the LHD vehicle by integration. Figure 17 compares the estimated velocity with the reference value obtained from the shaft encoder. As we can see from the figure, not only are the trends consistent, but the estimated and reference values are also very close, as shown by the details in Figure 17.
Overall, the accuracy of the estimation is high enough to indicate the attitude and speed of articulated vehicles, as verified by the root mean square (RMS) errors of the estimated values listed in Table 1. The RMS errors for the roll, pitch, and yaw were 0.19° (3.8%), 0.1° (2.4%), and 2.08° (4.2%), respectively. The error in the estimated velocity was 0.21 km/h (1.2%). Compared with the estimation error of 5% in [9], the proposed model-based method improves the accuracy of the estimations and provides more attitude information, which is useful for the navigation and control of autonomous vehicles [28].

5. Conclusions

A model-based method was presented for estimating the attitude of articulated vehicles in a GPS-free environment. A kinematic model, a two-constraint model of articulation, and a dynamic four degrees-of-freedom (DOF) yaw model of UAVs were established and integrated with raw data from IMUs by KFs to obtain attitude information. The method was then verified by a simulation under different dynamic conditions, and a field test of the LHD vehicle was carried out to prove the method’s practical efficacy.
The results of the field test showed that the method could estimate the yaw and velocity accurately. The errors in the estimated roll, pitch, and yaw were 3.8%, 2.4%, and 4.2%, respectively. The error in the estimated velocity was 1.2%. The characteristics of these errors were analyzed and interpreted. The errors in the estimated pitch and roll were brought about by the direct vibrations from the ground. The errors in the estimated yaw included trend errors and ripples. The trend errors occurred at the square turn, when the constraints between the tires and the road changed. Local ripples were induced by instability in the hydraulic steering system.
The field test was designed around a trajectory of the typical LHD work cycle, which covers most of the working conditions of the LHD vehicle. For the rare long-distance transitions, the method still needs further verification.
Therefore, motivated by these limitations and sources of error, future research will focus on improving accuracy. The method will be verified during long-distance transitions, the dynamic yaw model will be improved to cover tire slip, and the instability in the hydraulic steering system will be studied and controlled to eliminate ripples. In addition, the algorithm will be improved to decrease the effect of vibrations on the frames.

Author Contributions

L.G. and C.J. developed the method, performed the simulation and the experiment, and wrote the article. F.M. supervised the overall study and experiments and revised the article.

Funding

This research was funded by the National Key Research and Development Program of China (Grant No. 2016YFC0802900) and the National Natural Science Foundation of China (Grant No. 51774019).

Acknowledgments

The authors would like to acknowledge Shandong Gold Group Co., Ltd. for the experimental articulated vehicle and test sites.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Parameters of the target LHD.

Symbol   Parameter                                             Value
m1       mass of front frame                                   12,850 kg
m2       mass of rear frame                                    15,650 kg
Iz1      moment of inertia of front frame                      307,420 kg·m2
Iz2      moment of inertia of rear frame                       380,320 kg·m2
Lf1      distance of front mass center to bridge               0.3 m
Lf2      distance of front mass center to articulation point   2.05 m
Lr1      distance of rear mass center to bridge                0.16 m
Lr2      distance of rear mass center to articulation point    1.59 m
B        half distance of track                                0.89 m
R        free radius of tire                                   0.89 m
fv       vertical stiffness of tire                            80 kN/m

References

  1. Li, J.; Zhan, K. Intelligent Mining Technology for an Underground Metal Mine Based on Unmanned Equipment. Engineering 2018, 4, 381–391. [Google Scholar] [CrossRef]
  2. Gustafson, A.; Lipsett, M.; Schunnesson, H.; Galar, D.; Kumar, U. Development of a Markov model for production performance optimisation. Application for semi-automatic and manual LHD machines in underground mines. Int. J. Min. Reclam. Environ. 2014, 28, 342–355. [Google Scholar] [CrossRef]
  3. Gao, Y.; Shen, Y.; Xu, T.; Zhang, W.; Güvenç, L. Oscillatory Yaw Motion Control for Hydraulic Power Steering Articulated Vehicles Considering the Influence of Varying Bulk Modulus. IEEE Trans. Control Syst. Technol. 2018, 99, 1–9. [Google Scholar] [CrossRef]
  4. Nayl, T.; Nikolakopoulos, G.; Gustafsson, T. Effect of kinematic parameters on MPC based on-line motion planning for an articulated vehicle. Robot. Auton. Syst. 2015, 70, 16–24. [Google Scholar] [CrossRef]
  5. Yang, L.; Yue, M.; Ma, T. Path Following Predictive Control for Autonomous Vehicles Subject to Uncertain Tire-ground Adhesion and Varied Road Curvature. Int. J. Control. Syst. 2019, 17, 193–202. [Google Scholar] [CrossRef]
  6. Naisi, Z.; Jun, N.; Jibin, H. Robust H state feedback control for handling stability of intelligent vehicles on a novel all-wheel independent steering mode. IET Intell. Transp. Syst. 2019, 10, 1579–1589. [Google Scholar]
  7. Gao, S.; Liu, Y.; Wang, J.; Deng, W.; Oh, H. The Joint Adaptive Kalman Filter (JAKF) for Vehicle Motion State Estimation. Sensors 2016, 16, 1103. [Google Scholar] [CrossRef]
  8. Roberts, J.M.; Duff, E.S.; Corke, P.I. Reactive navigation and opportunistic localization for autonomous underground mining vehicles. Inf. Sci. 2002, 145, 127–146. [Google Scholar] [CrossRef]
  9. Chun, J.T.L.Y. State Estimation of the Electric Drive Articulated Dump Truck Based on UKF. J. Harbin Inst. Technol. (New Ser.) 2015, 22, 21–30. [Google Scholar]
  10. Lee, J.K.; Park, E.J.; Robinovitch, S.N. Estimation of Attitude and External Acceleration Using Inertial Sensor Measurement during Various Dynamic Conditions. IEEE Trans Instrum. Meas. 2012, 61, 2262–2273. [Google Scholar] [CrossRef]
  11. Shi, G.; Li, X.; Jiang, Z. An Improved Yaw Estimation Algorithm for Land Vehicles Using MARG Sensors. Sensors 2018, 18, 3251. [Google Scholar] [CrossRef] [PubMed]
  12. Lee, C.J.; Kim, K.E.; Lim, M.T. Sensor fusion for vehicle tracking based on the estimated probability. IET Intell. Transp. Syst. 2018, 10, 1386–1395. [Google Scholar] [CrossRef]
  13. Xu, Z.; Yang, W.; You, K.; Li, W.; Kim, Y.I. Vehicle autonomous localization in local area of coal mine tunnel based on vision sensors and ultrasonic sensors. PLoS ONE 2017, 12, 1–31. [Google Scholar] [CrossRef] [PubMed]
  14. Dissanayake, G.; Sukkarieh, S.; Nebot, E.; Durrant-Whyte, H. The aiding of a low-cost strapdown inertial measurement unit using vehicle model constraints for land vehicle applications. IEEE Trans. Robot. Autom. 2001, 17, 731–747. [Google Scholar] [CrossRef]
  15. Mäkelä, H. Navigation System for LHD Machines. In Proceedings of the IFAC Intelligent Autonomous Vehicles, Espoo, Finland, 12–14 June 1995. [Google Scholar] [CrossRef]
  16. Chi, H.; Zhan, K.; Shi, B. Automatic guidance of underground mining vehicles using laser sensors. Tunn. Undergr. Space Technol. 2012, 27, 142–148. [Google Scholar] [CrossRef]
  17. Mäkelä, H. Overview of LHD navigation without artificial beacons. Robot. Auton. Syst. 2001, 36, 21–35. [Google Scholar] [CrossRef]
  18. Scheding, S.; Dissanayake, G.; Nebot, E.M. An experiment in autonomous navigation of an underground mining vehicle. IEEE Trans. Robot. Autom. 2001, 15, 85–95. [Google Scholar] [CrossRef]
  19. Paraszczak, J.; Gustafson, A.; Schunnesson, H. Technical and operational aspects of autonomous LHD application in metal mines. Int. J. Min. Reclam. Environ. 2015, 29, 391. [Google Scholar]
  20. Gustafson, A.; Schunnesson, H.; Kumar, U. Reliability Analysis and Comparison between Automatic and Manual Load Haul Dump Machines. Qual. Reliab. Eng. Int. 2015, 31, 523–531. [Google Scholar] [CrossRef]
  21. Marshall, J.; Barfoot, T.; Larsson, J. Autonomous underground tramming for center-articulated vehicles. J. Field Robot. 2008, 25, 400–421. [Google Scholar] [CrossRef]
  22. Wu, D.; Meng, Y.; Gu, Q.; Ma, F.; Zhan, K. A novel method for estimating the heading angle for underground Load-Haul-Dump based on Ultra Wideband. Trans. Inst. Meas. Control 2017, 40, 1608–1614. [Google Scholar] [CrossRef]
  23. Park, B.; Myung, H. Underground localization using dual magnetic field sequence measurement and pose graph SLAM for directional drilling. Meas. Sci. Technol. 2014, 25, 1–12. [Google Scholar] [CrossRef]
  24. Zhu, Q.; Xiao, C.; Hu, H.; Liu, Y.; Wu, J. Multi-Sensor Based Online Attitude Estimation and Stability Measurement of Articulated Heavy Vehicles. Sensors 2018, 18, 212. [Google Scholar] [CrossRef] [PubMed]
  25. Sabatini, A.M. Quaternion-based extended Kalman filter for determining orientation by inertial and magnetic sensing. IEEE Trans. Bio-Med. Eng. 2006, 53, 1346–1356. [Google Scholar] [CrossRef] [PubMed]
  26. Suh, Y.S. Orientation Estimation Using a Quaternion-Based Indirect Kalman Filter With Adaptive Estimation of External Acceleration. IEEE Trans. Instrum. Meas. 2010, 59, 3296–3305. [Google Scholar] [CrossRef]
  27. Oh, J.J.; Choi, S.B. Vehicle Velocity Observer Design Using 6-D IMU and Multiple-Observer Approach. IEEE Trans. Intell. Transp. Syst. 2012, 13, 1865–1879. [Google Scholar] [CrossRef]
  28. Ahmed, H.; Tahir, M. Accurate Attitude Estimation of a Moving Land Vehicle Using Low-Cost MEMS IMU Sensors. IEEE Trans. Intell. Transp. Syst. 2017, 18, 1723–1739. [Google Scholar] [CrossRef]
  29. Eltrass, A.; Khalil, M. Automotive radar system for multiple-vehicle detection and tracking in urban environments. IET Intell. Transp. Syst. 2018, 12, 783–792. [Google Scholar] [CrossRef]
  30. He, Y.; Khajepour, A.; McPhee, J.; Wang, X. Dynamic modelling and stability analysis of articulated frame steer vehicles. Int. J. Heavy Veh. Syst. 2005, 12, 28–59. [Google Scholar] [CrossRef]
Figure 1. The typical structure of a Load–Haul–Dump (LHD) vehicle.
Figure 2. A block diagram of the model-aided method for estimating the attitude of an LHD vehicle.
Figure 3. Directions of the linear and angular velocities of the LHD vehicle.
Figure 4. A dynamic model of yaw for the LHD vehicle.
Figure 5. Data from the inertial measurement units (IMUs). (a) acceleration data of front frame; (b) angular velocity data of front frame; (c) acceleration data of rear frame; (d) angular velocity data of rear frame.
Figure 6. The velocity of the LHD vehicle.
Figure 7. Results of the estimation of pitch and roll. (a) estimation results of pitch and roll; (b) error of pitch and roll estimation.
Figure 8. Results of the estimation of yaw. (a) estimation results of yaw; (b) error of yaw estimation.
Figure 9. Trace of the error covariance matrix.
Figure 10. Results of the estimation of velocity. (a) estimation results of velocity; (b) error of velocity estimation.
Figure 11. The system configuration of the LHD vehicle field test.
Figure 12. The route of the LHD vehicle during the field test.
Figure 13. Results of the estimation of roll during the field test. (a) estimation results of roll in field test; (b) error of roll estimation in field test.
Figure 14. Results of the estimation of pitch during the field test. (a) estimation results of pitch in field test; (b) error of pitch estimation in field test.
Figure 15. Results of the estimation of yaw during the field test. (a) estimation results of yaw in field test; (b) error of yaw estimation in field test.
Figure 16. The pressure in the steering cylinders.
Figure 17. Estimation of velocity during the field test.
Table 1. Root mean square (RMS) error in the estimation.

Error      RMS
Roll       0.19 deg
Pitch      0.10 deg
Yaw        2.08 deg
Velocity   0.21 km/h
Distance   0.26 m

Gao, L.; Ma, F.; Jin, C. A Model-Based Method for Estimating the Attitude of Underground Articulated Vehicles. Sensors 2019, 19, 5245. https://doi.org/10.3390/s19235245