Article

IMU/UWB Fusion Method Using a Complementary Filter and a Kalman Filter for Hybrid Upper Limb Motion Estimation

1 School of Aeronautic Science and Engineering, Beihang University, Beijing 100191, China
2 Aircraft and Propulsion Laboratory, Ningbo Institute of Technology, Beihang University, Ningbo 315100, China
* Author to whom correspondence should be addressed.
Sensors 2023, 23(15), 6700; https://doi.org/10.3390/s23156700
Submission received: 29 June 2023 / Revised: 21 July 2023 / Accepted: 24 July 2023 / Published: 26 July 2023
(This article belongs to the Special Issue UAV and Sensors Applications for Navigation and Positioning)

Abstract: Motion capture systems have greatly benefited research into human–computer interaction in the aerospace field. Given the high cost and susceptibility to lighting conditions of optical motion capture systems, and considering the drift of IMU sensors, this paper uses a fusion approach with low-cost wearable sensors for hybrid upper limb motion tracking. We propose a novel algorithm that combines a fourth-order Runge–Kutta (RK4) Madgwick complementary orientation filter with a Kalman filter for motion estimation through the data fusion of an inertial measurement unit (IMU) and an ultrawideband (UWB) system. The Madgwick RK4 orientation filter compensates gyroscope drift through the optimal fusion of a magnetic, angular rate, and gravity (MARG) system, without requiring knowledge of the noise distribution. Then, given the error distribution provided by the UWB system, a Kalman filter estimates and fuses the UWB measurements to further reduce the drift error. With four anchors arranged in a cube distribution, the drift-free position obtained by the UWB localization Kalman filter is fused with the position calculated from the IMU. The proposed algorithm was tested on various movements and achieved an average RMSE decrease of 1.2 cm from the IMU-only method to the IMU/UWB fusion method. The experimental results demonstrate the feasibility and stability of the proposed algorithm for accurately tracking the movements of human upper limbs.

1. Introduction

Motion capture (MoCap) systems provide technical support for the operation of space robots, which has greatly benefited research into human–computer interaction in the aerospace field. By capturing human upper limb movement and mapping it to space robot motion, operators can remotely control space robot arms and have them perform complex, elaborate actions such as grasping and handling. Thus, we need to accurately estimate the movement of human upper limbs [1,2,3]. Optical MoCap systems employ multiple cameras to capture the positions of reflective markers and have been widely used in human motion tracking. Although these systems offer precise real-time tracking and strong anti-interference ability, they are expensive and suffer from lighting and occlusion problems [4,5]. To track human movements flexibly, it is necessary to employ ubiquitous wearable sensors.
With the intensive study and rapid advancement of micro-electro-mechanical systems, wearable inertial measurement units (IMUs) have been extensively adopted for 3-D human motion tracking due to their low cost, portability, and high independence [6]. In many of the proposed IMU MoCap methods, accelerometers and gyroscopes are fused to estimate body segment orientation [7,8], including the knee joint angle [9,10], arm adduction angle [11], and shoulder and elbow joint angles [12,13]. However, the attitude angle calculated from the gyroscope exhibits obvious drift error, which degrades estimation accuracy and stability. Since accelerometers can only compensate for the tilt angles relative to the direction of gravity, it is necessary to introduce magnetometers to compensate for the yaw angle, yielding the nine-axis IMU, also known as the magnetic, angular rate, and gravity (MARG) system [14]. Data fusion of the three inertial sensors can be achieved with either a complementary filter [14,15,16,17] or a classic Kalman filter [18,19]. The Kalman filter has been widely used as the orientation filter in the majority of the proposed methods. Wu et al. [20,21,22,23] employed a particle filter and an unscented Kalman filter for information fusion and developed a micro-sensor MoCap system to achieve real-time tracking. Roetenberg et al. [24] designed a Kalman filter that improved orientation estimation through the compensation of magnetic disturbances. Because the Kalman filter requires knowledge of the noise distribution and is prone to computational load and parameter tuning issues, many researchers prefer the complementary filter. Fourati et al. [25] proposed a complementary observer based on the quaternion method for motion tracking.
However, fusing inertial sensors alone still leaves drift in the motion estimate, so a stable, high-precision technology is needed to fuse with the IMU.
Given the flexibility and high sampling rate of the IMU, ultrawideband (UWB) technology is attractive for data fusion due to its low cost, portability, and low energy consumption [26,27], although it requires a clear line-of-sight (LOS) channel [28]. Compared with traditional UWB positioning methods such as maximum likelihood estimation (MLE), linearized least square estimation (LLSE), and weighted centroid estimation (WCE), the extended Kalman filter (EKF) requires less computational time and less complexity [29]. Applying UWB and IMU fusion to human motion tracking can not only compensate for the low sampling rate of the UWB system but also reduce IMU drift. Depending on whether the fusion operates on raw range measurements or computed locations, data fusion methods are divided into tightly coupled [30] and loosely coupled [31,32] categories. Kok et al. [33] provided tightly coupled IMU and UWB fusion using an optimization method that handled outliers well but suffered from clock skew challenges. Therefore, to facilitate the calculation, the loosely coupled method is often employed. Zihajehzadeh et al. [32,34,35] proposed a Kalman-filter-based IMU and UWB fusion method without a magnetometer and accurately captured lower body motion under magnetic disturbances. However, when their rotation matrices are extended to the upper limb, the algorithm may suffer from the singularity problem. Zhang et al. [36] adopted a Mahony filter and quaternions for foot attitude estimation via IMU and UWB fusion, but obtaining position by double integration of acceleration leads to large cumulative errors.
In this paper, the human upper limb is taken as our research subject, and a novel IMU/UWB data fusion method is proposed for 3-D motion estimation by applying the Runge–Kutta Madgwick filter, the UWB localization Kalman filter, and the IMU/UWB Kalman filter. On the basis of the established kinematics model of the upper limbs, the quaternion method is employed to calculate the attitude angle and avoid the gimbal lock problem. Our proposed algorithm has two novel aspects: (1) it combines the Madgwick RK4 complementary orientation filter and the Kalman filter for motion tracking; the Madgwick RK4 filter reduces gyroscope drift without requiring knowledge of the noise distribution, while the Kalman filter performs UWB localization and fusion using the known error distribution of the UWB system; and (2) the drift-free position obtained by the UWB localization system is fused with the position calculated from the IMU for upper limb motion estimation, which greatly reduces the drift caused by the double integration of acceleration. The experimental results demonstrate the feasibility and stability of our proposed algorithm.
The rest of the paper is structured as follows. In Section 2, the theoretical fusion method for 3-D upper limb tracking is described. The experimental setup and protocol are demonstrated in Section 3. Then, the experimental results are shown and analyzed in Section 4. Finally, conclusions are provided in Section 5.

2. Theoretical Method

In this section, the proposed information fusion algorithm for human upper limb motion estimation is described. As shown in Figure 1, we first simplified the human upper limb as a kinematics model with three joints (shoulder, elbow, and wrist) and two segments (upper arm and forearm) [21]. The shoulder joint was set as a fixed point with three degrees of freedom (DoFs) and two DoFs for the elbow and wrist [37]. Taking the right arm as an example, two IMU sensors were arranged on the lateral side above the wrist and the elbow, respectively, as well as two UWB tags. The navigation coordinate frame (n) was set as the reference system, which was consistent with the coordinate system of UWB and MoCap. The body frame (b) was attached to each body segment where the sensors were located. It aligned with the n-frame initially. Neglecting the installation errors, the sensor frame (s) was attached to each IMU and aligned with the b-frame to track the movements of human upper limbs.
The overall framework of the proposed algorithm consists of three parts, illustrated in Figure 2, including (1) Quaternion RK4 Based Madgwick Orientation Complementary Filter, (2) UWB localization Kalman filter, and (3) IMU/UWB Fusion Kalman filter. For the purpose of estimating the 3-D spatial trajectory of the movements of human upper limbs, we first employed the quaternion method to calculate the attitude angle using gyroscope measurements. However, the integration of gyroscope measurement errors contributed to an accumulating error in the quaternion algorithm [14]. Therefore, we considered adopting the Madgwick orientation filter to improve the motion estimation accuracy through the optimal fusion of the accelerometer, gyroscope, and magnetometer of the MARG system. Then, in order to further reduce the drift error, we proposed that the UWB localization system be combined with the IMU sensors. An extended Kalman filter was utilized to fuse IMU and UWB in order to perform stable and high-precision motion estimation of the human upper limbs.

2.1. Quaternion RK4 Based Madgwick Orientation Complementary Filter

2.1.1. Fourth-Order Runge–Kutta-Based Quaternion Update Algorithm

Rotation transformation can be described as a vector rotating around a specified rotation axis with a certain angle in a coordinate system. The unit quaternion is defined as
$$ q = q_1 + i q_2 + j q_3 + k q_4 \tag{1} $$
For the same vector $r^n$ and $r^b$, defined in the n-frame and b-frame, respectively, $r'^n$ and $r'^b$ denote their extended quaternion forms, with a 0 inserted as the real part [14]. $q_b^n$ describes the rotation of the n-frame relative to the b-frame, so $r'^n$ can be expressed as
$$ r'^n = q_b^n \otimes r'^b \otimes (q_b^n)^* = M_L(q_b^n)\,M_R\big((q_b^n)^*\big)\begin{bmatrix} 0 \\ x \\ y \\ z \end{bmatrix} \tag{2} $$

$$ q_b^n \otimes r'^b = M_L(q_b^n)\,r'^b = \begin{bmatrix} q_1 & -q_2 & -q_3 & -q_4 \\ q_2 & q_1 & -q_4 & q_3 \\ q_3 & q_4 & q_1 & -q_2 \\ q_4 & -q_3 & q_2 & q_1 \end{bmatrix}\begin{bmatrix} 0 \\ x \\ y \\ z \end{bmatrix} \tag{3} $$
where $(q_b^n)^*$ is the conjugate of $q_b^n$; the symbol $\otimes$ denotes the quaternion product [21], as defined in (3); $M_L(q_b^n)$ is the left multiplication matrix of the quaternion $q_b^n$; and $M_R\big((q_b^n)^*\big)$ is the right multiplication matrix of the conjugate quaternion $(q_b^n)^*$. Furthermore, the rotation can be represented by a rotation matrix.
$$ r^n = C_b^n\, r^b \tag{4} $$

where $C_b^n$ represents the rotation of the n-frame relative to the b-frame.

$$ C_b^n = \begin{bmatrix} q_1^2+q_2^2-q_3^2-q_4^2 & 2(q_2 q_3 - q_1 q_4) & 2(q_2 q_4 + q_1 q_3) \\ 2(q_2 q_3 + q_1 q_4) & q_1^2-q_2^2+q_3^2-q_4^2 & 2(q_3 q_4 - q_1 q_2) \\ 2(q_2 q_4 - q_1 q_3) & 2(q_3 q_4 + q_1 q_2) & q_1^2-q_2^2-q_3^2+q_4^2 \end{bmatrix} \tag{5} $$
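As a quick sanity check of (5), the rotation matrix can be assembled directly from a scalar-first quaternion. The sketch below (NumPy; the helper name `quat_to_rotmat` is ours, not from the paper) builds $C_b^n$ term by term:

```python
import numpy as np

def quat_to_rotmat(q):
    """Rotation matrix C_b^n from a unit quaternion q = [q1, q2, q3, q4]
    (scalar first), assembled exactly as in (5)."""
    q1, q2, q3, q4 = q / np.linalg.norm(q)  # normalize defensively
    return np.array([
        [q1**2 + q2**2 - q3**2 - q4**2, 2*(q2*q3 - q1*q4),             2*(q2*q4 + q1*q3)],
        [2*(q2*q3 + q1*q4),             q1**2 - q2**2 + q3**2 - q4**2, 2*(q3*q4 - q1*q2)],
        [2*(q2*q4 - q1*q3),             2*(q3*q4 + q1*q2),             q1**2 - q2**2 - q3**2 + q4**2],
    ])
```

For a unit quaternion the result is orthonormal, so `R @ R.T` recovers the identity; a rotation of 90° about the z-axis maps the x-axis onto the y-axis.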
The quaternion update algorithm uses the angular velocity increment in the sample period measured by the IMU sensors to calculate the quaternion at each time in order to update the human motion data. The quaternion derivative is given by
$$ \dot q_b^n = \tfrac{1}{2}\, q_b^n \otimes \omega'^b = \tfrac{1}{2} M_L(q_b^n)\,\omega'^b = \tfrac{1}{2} M_R(\omega'^b)\, q_b^n = \tfrac{1}{2}\begin{bmatrix} 0 & -\omega_{bx} & -\omega_{by} & -\omega_{bz} \\ \omega_{bx} & 0 & \omega_{bz} & -\omega_{by} \\ \omega_{by} & -\omega_{bz} & 0 & \omega_{bx} \\ \omega_{bz} & \omega_{by} & -\omega_{bx} & 0 \end{bmatrix}\begin{bmatrix} q_1 \\ q_2 \\ q_3 \\ q_4 \end{bmatrix} \tag{6} $$

where $\omega'^b = \begin{bmatrix} 0 & \omega_{bx} & \omega_{by} & \omega_{bz} \end{bmatrix}^T$ is the extended form of the angular velocity measured by the gyroscope.
Given the initial value and the rotational angular velocity, the orientation at each time, $q^n_{b,\omega,t}$, can be obtained by numerically integrating the quaternion derivative $\dot q^n_{b,\omega,t}$.

$$ q^n_{b,\omega,t} = \hat q^n_{b,\mathrm{est},t-1} + \dot q^n_{b,\omega,t}\,\Delta t = \hat q^n_{b,\mathrm{est},t-1} + \tfrac{1}{2}\Delta t\,\hat q^n_{b,\mathrm{est},t-1} \otimes \omega'^b_t \tag{7} $$

where $\Delta t$ is the sampling period, $\hat q^n_{b,\mathrm{est},t-1}$ is the quaternion estimate at the previous time step, and $\omega'^b_t$ is the angular velocity at time $t$.
Although increasing the order of the above algorithm can improve the computational accuracy, it also increases the complexity. The Runge–Kutta method is a high-precision single-step algorithm, of which the fourth-order Runge–Kutta algorithm is the most classic and widely used. It iterates on initial value problems with known initial values and equations without solving the differential equations analytically, which greatly reduces the computational complexity. In this paper, the calculation of human upper limb motion is treated as the initial value problem of a differential equation. The calculation formulas are as follows.
$$ \begin{aligned} \Delta\dot q_1 &= \tfrac{1}{2}\,\hat q^n_{b,\mathrm{est},t-1} \otimes \omega'^b_{t-1} \\ \Delta\dot q_2 &= \tfrac{1}{2}\big(\hat q^n_{b,\mathrm{est},t-1} + \tfrac{\Delta t}{2}\Delta\dot q_1\big) \otimes \omega'^b_{t-1/2} \\ \Delta\dot q_3 &= \tfrac{1}{2}\big(\hat q^n_{b,\mathrm{est},t-1} + \tfrac{\Delta t}{2}\Delta\dot q_2\big) \otimes \omega'^b_{t-1/2} \\ \Delta\dot q_4 &= \tfrac{1}{2}\big(\hat q^n_{b,\mathrm{est},t-1} + \Delta t\,\Delta\dot q_3\big) \otimes \omega'^b_t \\ q^n_{b,\omega,t} &= \hat q^n_{b,\mathrm{est},t-1} + \tfrac{1}{6}\Delta t\,\big(\Delta\dot q_1 + 2\Delta\dot q_2 + 2\Delta\dot q_3 + \Delta\dot q_4\big) \end{aligned} \tag{8} $$

where $\omega'^b_{t-1/2}$ is the angular velocity at the intermediate time between $t-1$ and $t$.
The quaternion fourth-order Runge–Kutta algorithm interpolates within the integration interval, iteratively refining the slope at each step of the calculation to obtain the updated value.
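The RK4 update (8) can be sketched as follows. This is a minimal NumPy illustration with hypothetical helper names (`quat_mult`, `rk4_quat_step`), assuming scalar-first quaternions and a gyroscope sample available at the half step:

```python
import numpy as np

def quat_mult(q, p):
    """Hamilton product q ⊗ p, scalar-first [q1, q2, q3, q4], as in (3)."""
    q1, q2, q3, q4 = q
    p1, p2, p3, p4 = p
    return np.array([
        q1*p1 - q2*p2 - q3*p3 - q4*p4,
        q1*p2 + q2*p1 + q3*p4 - q4*p3,
        q1*p3 - q2*p4 + q3*p1 + q4*p2,
        q1*p4 + q2*p3 - q3*p2 + q4*p1,
    ])

def qdot(q, omega):
    """Quaternion derivative (6): 0.5 * q ⊗ [0, wx, wy, wz]."""
    return 0.5 * quat_mult(q, np.concatenate(([0.0], omega)))

def rk4_quat_step(q_prev, w_prev, w_mid, w_now, dt):
    """One RK4 quaternion update per (8); w_mid is the rate at the half step."""
    k1 = qdot(q_prev, w_prev)
    k2 = qdot(q_prev + 0.5*dt*k1, w_mid)
    k3 = qdot(q_prev + 0.5*dt*k2, w_mid)
    k4 = qdot(q_prev + dt*k3, w_now)
    q = q_prev + (dt/6.0)*(k1 + 2*k2 + 2*k3 + k4)
    return q / np.linalg.norm(q)  # renormalize to a unit quaternion
```

Integrating a constant rate of $\pi/2$ rad/s about the z-axis for one second, for instance, should yield the quaternion of a 90° z-rotation.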

2.1.2. Madgwick RK4 Orientation Complementary Filter for MARG

By formulating an objective function, this filter fuses the data of the tri-axis accelerometer, gyroscope, and magnetometer of the MARG system and iteratively optimizes it to calculate the orientation $\hat q_b^n$.
Let the predefined reference vector in the n-frame be $\hat d^n$ and the sensor measurement in the b-frame be $\hat s^b$. The objective function is defined as

$$ \min_{\hat q_b^n \in \mathbb{R}^4} f\big(\hat q_b^n, \hat d^n, \hat s^b\big), \qquad f\big(\hat q_b^n, \hat d^n, \hat s^b\big) = (\hat q_b^n)^* \otimes \hat d^n \otimes \hat q_b^n - \hat s^b, \qquad \hat d^n = \begin{bmatrix} 0 \\ d_x \\ d_y \\ d_z \end{bmatrix}, \quad \hat s^b = \begin{bmatrix} 0 \\ s_x \\ s_y \\ s_z \end{bmatrix} \tag{9} $$
Given the initial quaternion $\hat q_b^n(0)$ and step size $\mu$, the estimate $\hat q_b^n(n+1)$ is obtained after $n$ iterations of the general gradient descent algorithm in (10). The symbol $\nabla$ represents the gradient of the objective function, which can be computed from the objective function and its Jacobian, as shown in (11).
$$ \hat q_b^n(k+1) = \hat q_b^n(k) - \mu \frac{\nabla f\big(\hat q_b^n(k), \hat d^n, \hat s^b\big)}{\big\|\nabla f\big(\hat q_b^n(k), \hat d^n, \hat s^b\big)\big\|}, \qquad k = 0, 1, 2, \ldots, n \tag{10} $$

$$ \nabla f\big(\hat q_b^n(k), \hat d^n, \hat s^b\big) = J^T\big(\hat q_b^n(k), \hat d^n\big)\, f\big(\hat q_b^n(k), \hat d^n, \hat s^b\big) \tag{11} $$
The specific expressions for the accelerometer are given in (12), where the normalized gravity $\hat g^n$ and the normalized accelerometer measurement $\hat a^b$ substitute for $\hat d^n$ and $\hat s^b$, respectively.

$$ \hat g^n = \begin{bmatrix} 0 & 0 & 0 & 1 \end{bmatrix}^T, \qquad \hat a^b = \begin{bmatrix} 0 & a_x & a_y & a_z \end{bmatrix}^T $$

$$ f_g\big(\hat q_b^n, \hat a^b\big) = \begin{bmatrix} 2(q_2 q_4 - q_1 q_3) - a_x \\ 2(q_1 q_2 + q_3 q_4) - a_y \\ 2(0.5 - q_2^2 - q_3^2) - a_z \end{bmatrix}, \qquad J_g\big(\hat q_b^n\big) = \begin{bmatrix} -2 q_3 & 2 q_4 & -2 q_1 & 2 q_2 \\ 2 q_2 & 2 q_1 & 2 q_4 & 2 q_3 \\ 0 & -4 q_2 & -4 q_3 & 0 \end{bmatrix} \tag{12} $$
The accelerometer compensates for the tilt angles of the motion estimate obtained from the gyroscope measurements, but not for the heading angle, which requires the introduction of a magnetometer. The magnetometer measures the strength and direction of the magnetic field and can be used to determine the orientation of the device.
As for the magnetometer, the geomagnetic field $\hat b^n$, substituted for $\hat d^n$, can theoretically be decomposed into a horizontal and a vertical component. Additionally, the normalized magnetometer measurement $\hat m^b$ substitutes for $\hat s^b$. The specific form can be written as

$$ \hat b^n = \begin{bmatrix} 0 & 0 & b_y & b_z \end{bmatrix}^T, \qquad \hat m^b = \begin{bmatrix} 0 & m_x & m_y & m_z \end{bmatrix}^T $$

$$ f_b\big(\hat q_b^n, \hat b^n, \hat m^b\big) = \begin{bmatrix} 2 b_y (q_1 q_4 + q_2 q_3) + 2 b_z (q_2 q_4 - q_1 q_3) - m_x \\ 2 b_y (0.5 - q_2^2 - q_4^2) + 2 b_z (q_1 q_2 + q_3 q_4) - m_y \\ 2 b_y (q_3 q_4 - q_1 q_2) + 2 b_z (0.5 - q_2^2 - q_3^2) - m_z \end{bmatrix} $$

$$ J_b\big(\hat q_b^n, \hat b^n\big) = \begin{bmatrix} 2 b_y q_4 - 2 b_z q_3 & 2 b_y q_3 + 2 b_z q_4 & 2 b_y q_2 - 2 b_z q_1 & 2 b_y q_1 + 2 b_z q_2 \\ 2 b_z q_2 & -4 b_y q_2 + 2 b_z q_1 & 2 b_z q_4 & -4 b_y q_4 + 2 b_z q_3 \\ -2 b_y q_2 & -2 b_y q_1 - 4 b_z q_2 & 2 b_y q_4 - 4 b_z q_3 & 2 b_y q_3 \end{bmatrix} \tag{13} $$
However, the magnetometer may be disturbed by the bias of hard iron and soft iron, causing errors in the measurement direction of the earth’s magnetic field, so magnetic distortion compensation needs to be employed.
$$ \hat h^n_t = \begin{bmatrix} 0 & h_x & h_y & h_z \end{bmatrix}^T = \hat q^n_{b,\mathrm{est},t-1} \otimes \hat m^b_t \otimes \big(\hat q^n_{b,\mathrm{est},t-1}\big)^* \tag{14} $$

$$ \hat b^n_t = \begin{bmatrix} 0 & 0 & \sqrt{h_x^2 + h_y^2} & h_z \end{bmatrix}^T \tag{15} $$

where $\hat h^n_t$ is the normalized magnetometer measurement in the n-frame and $\hat b^n_t$ is the compensated geomagnetic field at time $t$.
Combining the specific algorithms of the accelerometer and magnetometer, the formulas were developed as follows.
$$ f_{g,b}\big(\hat q_b^n, \hat a^b, \hat b^n, \hat m^b\big) = \begin{bmatrix} f_g\big(\hat q_b^n, \hat a^b\big) \\ f_b\big(\hat q_b^n, \hat b^n, \hat m^b\big) \end{bmatrix}, \qquad J_{g,b}\big(\hat q_b^n, \hat b^n\big) = \begin{bmatrix} J_g\big(\hat q_b^n\big) \\ J_b\big(\hat q_b^n, \hat b^n\big) \end{bmatrix} \tag{16} $$
By substituting (16) into (11), the gradient of the combined objective function can be written as
$$ \nabla f = J_{g,b}^T\big(\hat q^n_{b,\mathrm{est},t-1}, \hat b^n\big)\, f_{g,b}\big(\hat q^n_{b,\mathrm{est},t-1}, \hat a^b, \hat b^n, \hat m^b\big) \tag{17} $$

The estimated quaternion $q^n_{b,\nabla,t}$ calculated at time $t$ is then given by

$$ q^n_{b,\nabla,t} = \hat q^n_{b,\mathrm{est},t-1} - \mu_t \frac{\nabla f}{\|\nabla f\|} \tag{18} $$
Employing the Madgwick orientation filter, the estimated rotation $q^n_{b,\mathrm{est},t}$ is obtained by fusing $q^n_{b,\omega,t}$ and $q^n_{b,\nabla,t}$.

$$ q^n_{b,\mathrm{est},t} = \gamma_t\, q^n_{b,\nabla,t} + (1 - \gamma_t)\, q^n_{b,\omega,t}, \qquad 0 \le \gamma_t \le 1 \tag{19} $$

where $\gamma_t$ is the weight, chosen so that the weighted divergence of $q^n_{b,\omega,t}$ equals the weighted convergence of $q^n_{b,\nabla,t}$.
According to [14], when the step size $\mu_t$ is assumed to be extremely large, (19) can be rewritten as

$$ q^n_{b,\mathrm{est},t} = \hat q^n_{b,\mathrm{est},t-1} + \dot q^n_{b,\mathrm{est},t}\,\Delta t = \hat q^n_{b,\mathrm{est},t-1} + \Delta t \left( \dot q^n_{b,\omega,t} - \beta \frac{\nabla f}{\|\nabla f\|} \right) \tag{20} $$

where $\beta$ is the divergence rate of $q^n_{b,\omega,t}$.
By substituting (8) into (20), it can be written in fourth-order Runge–Kutta form as

$$ q^n_{b,\mathrm{est},t} = \hat q^n_{b,\mathrm{est},t-1} + \Delta t \left[ \tfrac{1}{6}\big(\Delta\dot q_1 + 2\Delta\dot q_2 + 2\Delta\dot q_3 + \Delta\dot q_4\big) - \beta \frac{\nabla f}{\|\nabla f\|} \right] \tag{21} $$
The process of the quaternion RK4-based Madgwick orientation filter algorithm is summarized as follows in Algorithm 1.
Algorithm 1: Process of quaternion RK4-based Madgwick orientation complementary filter.
Initialization: $q^n_b(0)$
Input: $\omega'^b_t$, $\Delta t$, $\hat a^b_t$, $\hat m^b_t$, $\beta$
Output: $q^n_{b,\mathrm{est},t}$
1: $\dot q^n_{b,\omega,t} \leftarrow \tfrac{1}{2}\,\hat q^n_{b,\mathrm{est},t-1} \otimes \omega'^b_t$
2: $q^n_{b,\omega,t} \leftarrow \hat q^n_{b,\mathrm{est},t-1} + \dot q^n_{b,\omega,t}\,\Delta t$
3: $\Delta\dot q_1, \Delta\dot q_2, \Delta\dot q_3, \Delta\dot q_4 \leftarrow \hat q^n_{b,\mathrm{est},t-1}, \omega'^b_{t-1}, \omega'^b_{t-1/2}, \omega'^b_t$ by (8)
4: $q^n_{b,\omega,t} \leftarrow \hat q^n_{b,\mathrm{est},t-1} + \Delta t\,\big(\Delta\dot q_1 + 2\Delta\dot q_2 + 2\Delta\dot q_3 + \Delta\dot q_4\big)/6$
5: $\hat h^n_t \leftarrow \hat q^n_{b,\mathrm{est},t-1} \otimes \hat m^b_t \otimes \big(\hat q^n_{b,\mathrm{est},t-1}\big)^*$; $\hat b^n_t \leftarrow \hat h^n_t$ by (15)
6: $f_{g,b} \leftarrow \big[\, f_g^T\big(\hat q_b^n, \hat a^b\big) \;\; f_b^T\big(\hat q_b^n, \hat b^n, \hat m^b\big) \,\big]^T$
7: $J_{g,b} \leftarrow \big[\, J_g^T\big(\hat q_b^n\big) \;\; J_b^T\big(\hat q_b^n, \hat b^n\big) \,\big]^T$
8: $\nabla f \leftarrow J_{g,b}^T\big(\hat q^n_{b,\mathrm{est},t-1}, \hat b^n\big)\, f_{g,b}\big(\hat q^n_{b,\mathrm{est},t-1}, \hat a^b, \hat b^n, \hat m^b\big)$
9: $q^n_{b,\mathrm{est},t} \leftarrow q^n_{b,\omega,t} - \Delta t\,\beta\,\nabla f / \|\nabla f\|$
Return: $q^n_{b,\mathrm{est},t}$
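As an illustration of steps 5–9, the sketch below implements the gravity (accelerometer-only) part of the gradient step, i.e., (12) with the correction of (20); the full MARG filter would additionally stack $f_b$ and $J_b$ from (13) as in (16) and (17). Function names are ours and this is a simplified sketch, not the authors' implementation:

```python
import numpy as np

def grad_accel(q, a):
    """Gradient (11) of the gravity objective f_g in (12).
    q: scalar-first quaternion estimate; a: normalized accelerometer vector."""
    q1, q2, q3, q4 = q
    ax, ay, az = a
    f = np.array([
        2*(q2*q4 - q1*q3) - ax,
        2*(q1*q2 + q3*q4) - ay,
        2*(0.5 - q2**2 - q3**2) - az,
    ])
    J = np.array([
        [-2*q3,  2*q4, -2*q1, 2*q2],
        [ 2*q2,  2*q1,  2*q4, 2*q3],
        [ 0.0,  -4*q2, -4*q3, 0.0],
    ])
    return J.T @ f  # 4-element gradient

def madgwick_fuse(q_omega, q_prev, a, dt, beta):
    """Fused update per (20), accelerometer-only variant:
    steps the gyro-propagated quaternion toward gravity alignment."""
    g = grad_accel(q_prev, a / np.linalg.norm(a))
    n = np.linalg.norm(g)
    q = q_omega - dt * beta * (g / n if n > 0 else g)
    return q / np.linalg.norm(q)
```

When the estimate is already gravity-aligned (identity quaternion, accelerometer reading straight down), the gradient vanishes and the gyroscope prediction passes through unchanged.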

2.2. UWB Localization Kalman Filter

The UWB positioning system contained tag-anchor wireless communication channels, as shown in Figure 3. The distances between each tag and anchor were calculated using the double-sided two-way ranging (DS-TWR) method, and the position was estimated from these distances [28,29]. In addition to classical UWB positioning algorithms such as MLE, LLSE, and WCE, we used the EKF for position estimation due to its lower complexity and shorter computational time [29].
The UWB localization system was defined to be aligned with the n-frame. The system model adopted for the UWB EKF algorithm can be expressed as
$$ \begin{aligned} x_{u,t} &= x_{u,t-1} + w_{u,t-1} \\ y_{u,t} &= h_{u,t}(x_{u,t}) + v_{u,t} \end{aligned} \tag{22} $$

where the subscript $u$ denotes the UWB localization Kalman filter, and the state vector $x_{u,t} = P_{UWB,t} = \begin{bmatrix} x_t & y_t & z_t \end{bmatrix}^T$ is a 3 × 1 vector, where $P_{UWB,t}$ represents the 3-D position of the UWB tag in the n-frame. The measurement vector $y_{u,t} = \begin{bmatrix} d_{1,t} & d_{2,t} & d_{3,t} & d_{4,t} \end{bmatrix}^T$ contains the distances between each anchor and the tag. The expanded form of $h_{u,t}(x_{u,t})$ is given in (23). The process noise $w_{u,t-1}$ and measurement noise $v_{u,t}$ are zero-mean additive Gaussian white noise with covariance matrices $Q_{u,t-1}$ and $R_{u,t}$, respectively. These matrices were obtained from the UWB system and distinguish the ranging accuracy of the different anchors.

$$ h_{u,t}(x_{u,t}) = \begin{bmatrix} \sqrt{(x_t - x_1)^2 + (y_t - y_1)^2 + (z_t - z_1)^2} \\ \sqrt{(x_t - x_2)^2 + (y_t - y_2)^2 + (z_t - z_2)^2} \\ \sqrt{(x_t - x_3)^2 + (y_t - y_3)^2 + (z_t - z_3)^2} \\ \sqrt{(x_t - x_4)^2 + (y_t - y_4)^2 + (z_t - z_4)^2} \end{bmatrix} \tag{23} $$

where $(x_i, y_i, z_i)$, $i = 1, 2, 3, 4$, is the position of the $i$-th anchor.
Thus, the Jacobian matrix $H_{u,t}$ can be calculated as shown in (24), where, for brevity, $D_{i,t}$, $i = 1, 2, 3, 4$, denotes the corresponding row of $h_{u,t}(x_{u,t})$.

$$ H_{u,t} = \left. \frac{\partial h_{u,t}(x_{u,t})}{\partial x_{u,t}} \right|_{x_{u,t}} = \begin{bmatrix} \dfrac{x_t - x_1}{D_{1,t}} & \dfrac{y_t - y_1}{D_{1,t}} & \dfrac{z_t - z_1}{D_{1,t}} \\ \dfrac{x_t - x_2}{D_{2,t}} & \dfrac{y_t - y_2}{D_{2,t}} & \dfrac{z_t - z_2}{D_{2,t}} \\ \dfrac{x_t - x_3}{D_{3,t}} & \dfrac{y_t - y_3}{D_{3,t}} & \dfrac{z_t - z_3}{D_{3,t}} \\ \dfrac{x_t - x_4}{D_{4,t}} & \dfrac{y_t - y_4}{D_{4,t}} & \dfrac{z_t - z_4}{D_{4,t}} \end{bmatrix} \tag{24} $$
Assuming that the initial state estimation covariance $P_{u,0}$ and the initial state vector $x_{u,0}$ are known as priors, the posterior estimates $P_{u,t}$ and $x_{u,t}$ are recursively obtained through the prediction and update steps. The predictions $P_{u,t|t-1}$ and $x_{u,t|t-1}$, given in (25), are calculated from the posterior estimates $P_{u,t-1}$ and $x_{u,t-1}$ at time $t-1$.

$$ \begin{aligned} P_{u,t|t-1} &= P_{u,t-1} + Q_{u,t-1} \\ x_{u,t|t-1} &= x_{u,t-1} \end{aligned} \tag{25} $$

Then, the predictions and the measurement vector $y_{u,t}$ are used to update the prior estimates, and the posterior estimates at time $t$ can be expressed as

$$ \begin{aligned} K_{u,t} &= P_{u,t|t-1} H_{u,t}^T \big( H_{u,t} P_{u,t|t-1} H_{u,t}^T + R_{u,t} \big)^{-1} \\ x_{u,t} &= x_{u,t|t-1} + K_{u,t} \big( y_{u,t} - h_{u,t}(x_{u,t|t-1}) \big) \\ P_{u,t} &= \big( I_{3\times3} - K_{u,t} H_{u,t} \big) P_{u,t|t-1} \end{aligned} \tag{26} $$

where $x_{u,t}$ is the 3-D position of the UWB tag, which will be used as the measurement vector in the IMU/UWB Kalman filter.
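One predict/update cycle of the UWB localization EKF, (22)–(26), can be sketched in NumPy as follows. Function names and the anchor layout in the usage note are illustrative, not from the paper:

```python
import numpy as np

def h_uwb(x, anchors):
    """Measurement model (23): distance from tag position x to each anchor row."""
    return np.linalg.norm(anchors - x, axis=1)

def H_uwb(x, anchors):
    """Jacobian (24): row i is (x - anchor_i)^T / D_i."""
    d = h_uwb(x, anchors)
    return (x - anchors) / d[:, None]

def uwb_ekf_step(x, P, y, anchors, Q, R):
    """One predict/update cycle per (25)-(26) with the random-walk model (22)."""
    # Predict: position random walk, so the state carries over and P grows by Q
    P_pred = P + Q
    x_pred = x
    # Update with the measured ranges y
    H = H_uwb(x_pred, anchors)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - h_uwb(x_pred, anchors))
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new
```

With four non-coplanar anchors and noiseless ranges, iterating this step from a reasonable initial guess drives the estimate onto the true tag position.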

2.3. IMU/UWB Fusion Kalman Filter

The innovative aspect of this IMU/UWB Kalman filter is that it uses the drift-free position calculated by the UWB system to compensate for the orientation and position estimated by the IMU system. The information obtained by the IMU system was employed as the state vector, and the position calculated by the UWB system was applied as the measurement vector. The system model can be expressed as
$$ \begin{aligned} x_{iu,t} &= f_{iu,t-1}\big(x_{iu,t-1}, u_{iu,t-1}, w_{iu,t-1}\big) \\ y_{iu,t} &= h_{iu,t}\big(x_{iu,t}, v_{iu,t}\big) \end{aligned} \tag{27} $$

where the subscript $iu$ denotes the IMU/UWB Kalman filter; $x_{iu,t}$ is the state vector, $u_{iu,t-1}$ is the input vector, $y_{iu,t}$ is the measurement vector, and $w_{iu,t-1}$ and $v_{iu,t}$ are the process and measurement noise, respectively.
The state vector $x_{iu,t} = \begin{bmatrix} P_{IMU,t}^T & \big(q^n_{b,\mathrm{est},t}\big)^T \end{bmatrix}^T$ is a 7 × 1 vector, where $P_{IMU,t}$ represents the 3-D position of the human upper limb and $q^n_{b,\mathrm{est},t}$ is the orientation obtained from the Madgwick complementary filter. The state model consists of two parts; the $q^n_{b,\mathrm{est},t}$ part can be expressed as

$$ q^n_{b,\mathrm{est},t} = q^n_{b,\mathrm{est},t-1} + \Delta t\,\dot q^n_{b,\mathrm{est},t} = q^n_{b,\mathrm{est},t-1} + \Delta t \left( \dot q^n_{b,\omega,t} - \beta \frac{\nabla f}{\|\nabla f\|} \right) = q^n_{b,\mathrm{est},t-1} + \Delta t \left[ \tfrac{1}{2}\, q^n_{b,\mathrm{est},t-1} \otimes \big( \omega'^b_t + \omega'^b_{b,t} \big) - \beta \frac{\nabla f}{\|\nabla f\|} \right] \tag{28} $$

where $\omega'^b_{b,t}$ is the gyroscope bias and $\nabla f / \|\nabla f\|$ is the input vector.
Supposing that $P_{IMU,0}$ is the initial position of the IMU sensors, $P_{IMU,t}$ can be recursively calculated from a geometric perspective using the rotation quaternion $q^n_{b,\mathrm{est},t}$ or the rotation matrix $C_b^n(t)$ in (5) updated by $q^n_{b,\mathrm{est},t}$. The position part of the state model is given by (29) and (30).

$$ \begin{aligned} P^u_{IMU,1} &= \big( C^n_b(u,1)\,(P^u_{IMU,0})^T \big)^T \\ P^u_{IMU,2} &= \big( C^n_b(u,2)\,(P^u_{IMU,0})^T \big)^T = \big( C^n_b(u,2)\, C^n_b(u,1)^T (P^u_{IMU,1})^T \big)^T \\ &\;\;\vdots \\ P^u_{IMU,t} &= \big( C^n_b(u,t)\, C^n_b(u,t-1)^T (P^u_{IMU,t-1})^T \big)^T \end{aligned} \tag{29} $$

where $P^u_{IMU,t}$ and $C^n_b(u,t)$ represent the position and rotation matrix of the upper arm. The position of the forearm depends on the upper arm, and is given by

$$ P^f_{IMU,t} = P^u_{IMU,t} + \Big( C^n_b(f,t)\, C^n_b(f,t-1)^T \big( P^f_{IMU,t-1} - P^u_{IMU,t-1} \big)^T \Big)^T \tag{30} $$

where $P^f_{IMU,t}$ and $C^n_b(f,t)$ represent the position and rotation matrix of the forearm. When using the Kalman filter for the forearm, the input vector should contain the upper arm position.
The Jacobian matrices $F_{iu,t-1}$ and $L_{iu,t-1}$ are obtained as

$$ F_{iu,t-1} = \left. \frac{\partial f_{iu,t-1}}{\partial x_{iu,t-1}} \right|_{x_{iu,t-1}} = \begin{bmatrix} C^n_b(t)\, C^n_b(t-1)^T & 0_{3\times4} \\ 0_{4\times3} & I_{4\times4} + \tfrac{1}{2}\Delta t\, M_R\big( \omega'^b_t + \omega'^b_{b,t} \big) \end{bmatrix} \tag{31} $$

$$ L_{iu,t-1} = \left. \frac{\partial f_{iu,t-1}}{\partial w_{iu,t-1}} \right|_{x_{iu,t-1}} = \begin{bmatrix} 0_{3\times4} \\ \tfrac{1}{2}\Delta t\, M_L\big( \hat q^n_{b,\mathrm{est},t-1} \big) \end{bmatrix} \tag{32} $$
The measurement vector $y_{iu,t} = P_{UWB,t}$ is a 3 × 1 vector obtained from the UWB localization Kalman filter, representing the 3-D position of the UWB tag in the n-frame. The measurement equation can be expressed as

$$ y_{iu,t} = H_{iu,t}\, x_{iu,t} + v_{iu,t} \tag{33} $$

where $H_{iu,t}$ is the Jacobian matrix, expressed as

$$ H_{iu,t} = \begin{bmatrix} I_{3\times3} & 0_{3\times4} \end{bmatrix} \tag{34} $$

The covariance matrices $Q_{iu,t-1}$ and $R_{iu,t}$ of the process noise $w_{iu,t-1}$ and the measurement noise $v_{iu,t}$ were calculated as

$$ \begin{aligned} Q_{iu,t-1} &= E\big[ w_{iu,t-1} w_{iu,t-1}^T \big] = \Sigma_{\nabla} = \sigma_{\nabla}^2\, I_{4\times4} \\ R_{iu,t} &= E\big[ v_{iu,t} v_{iu,t}^T \big] = \Sigma_{P,UWB} = \sigma_{P,UWB}^2\, I_{3\times3} \end{aligned} \tag{35} $$
Assuming that $P_{iu,0}$ and $x_{iu,0}$ are the initial state estimation covariance and state vector, respectively, the predictions $P_{iu,t|t-1}$ and $x_{iu,t|t-1}$ can be recursively calculated using the following equations.

$$ \begin{aligned} P_{iu,t|t-1} &= F_{iu,t-1} P_{iu,t-1} F_{iu,t-1}^T + L_{iu,t-1} Q_{iu,t-1} L_{iu,t-1}^T \\ x_{iu,t|t-1} &= f_{iu,t-1}\big(x_{iu,t-1}, u_{iu,t-1}, 0\big) \end{aligned} \tag{36} $$

Then, the prior estimates are updated, and the posterior estimates $P_{iu,t}$ and $x_{iu,t}$ are written as

$$ \begin{aligned} K_{iu,t} &= P_{iu,t|t-1} H_{iu,t}^T \big( H_{iu,t} P_{iu,t|t-1} H_{iu,t}^T + R_{iu,t} \big)^{-1} \\ x_{iu,t} &= x_{iu,t|t-1} + K_{iu,t} \big( y_{iu,t} - h_{iu,t}(x_{iu,t|t-1}, 0) \big) \\ P_{iu,t} &= \big( I_{7\times7} - K_{iu,t} H_{iu,t} \big) P_{iu,t|t-1} \end{aligned} \tag{37} $$

where $x_{iu,t}$ contains the fused 3-D position in the IMU/UWB Kalman filter, which was used for 3-D motion reconstruction of the human upper limbs. A detailed flowchart of the UWB localization and IMU/UWB Kalman filters is shown in Figure 4. The parameters were set as follows. The initial state estimation covariance $P_{u,0}$ was set to $0.1 \times I_{3\times3}$. All elements of $P_{iu,0}$ were set to 0.1. $x_{u,0}$ and $x_{iu,0}$ were obtained from the UWB system and the optical MoCap system, respectively, for each movement trial.
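The measurement update (37) reduces to a standard Kalman correction of the 7-element state $[P_{IMU};\, q]$ by the 3-D UWB position through $H_{iu,t}$ in (34). A minimal NumPy sketch follows (the function name is ours, and the quaternion renormalization after the additive correction is a common practical step not stated explicitly in the text):

```python
import numpy as np

def fuse_uwb_position(x_pred, P_pred, p_uwb, R):
    """Measurement update per (37): the drift-free UWB position p_uwb corrects
    the IMU-propagated 7-state [position; quaternion] through H = [I3 0] of (34)."""
    H = np.hstack([np.eye(3), np.zeros((3, 4))])  # UWB observes position only
    S = H @ P_pred @ H.T + R                      # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)           # 7x3 Kalman gain
    x = x_pred + K @ (p_uwb - H @ x_pred)         # state correction
    P = (np.eye(7) - K @ H) @ P_pred              # covariance correction
    x[3:] /= np.linalg.norm(x[3:])                # keep the quaternion unit-norm
    return x, P
```

With a small measurement covariance, the corrected position is pulled almost entirely onto the UWB fix, while a block-diagonal $P_{pred}$ leaves the quaternion part untouched.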

3. Experimental Demonstration

3.1. Experimental Setup

To estimate the movement of human upper limbs, wearable 9-axis MPU9250 inertial sensor units (each including a tri-axis accelerometer, a tri-axis gyroscope, and a tri-axis magnetometer) were adopted in our experiments. To avoid interference from the bulging of the upper limb muscles, the two IMU sensors were placed on the lateral side, above the wrist and the elbow, respectively [11,21]. Each IMU sensor was connected to the computer via a serial port module for data transmission. The sampling rate of the IMU sensors was set to 100 Hz, and the installation direction is shown in Figure 1.
Additionally, a DW1000 UWB real-time localization system manufactured by Haoru Technology, Dalian, China, was employed to track the 3-D position with an update rate of about 10 Hz. The UWB communication technology was implemented between the anchors and tags through the DS-TWR method based on time-domain transmission of radio signals, and the anchors were also connected to the computer via a serial port module. It provided an accuracy of 10 cm for the X/Y axis and 30 cm for the Z axis. Two UWB tags were fixed on the bracelets with the IMU sensors to avoid relative movement, as shown in Figure 5. To ensure the positioning stability of UWB, four anchors were arranged in our laboratory in a cube distribution.
The Nokov optical MoCap system captures human motion through sixteen cameras evenly arranged in the laboratory, as illustrated in Figure 5, and is used as a reference system for algorithm verification and further comparison [34]. For each segment of the human upper limb, at least four reflective markers were attached to the skin surface and sensors, constituting an envelope rather than a cluster [9,11,38,39]. The optical MoCap system achieved sub-millimeter positioning accuracy and higher reliability.

3.2. Experimental Protocol

Our experiments aimed to capture several common movements of the human upper limbs, including simple whole-arm movements and combined upper-arm and forearm movements. These common movements comprised motion 1: shoulder flexion/extension and abduction/adduction (the whole arm moving along a circular trajectory); motion 2: shoulder abduction/adduction, internal/external rotation, and elbow flexion/extension; and motion 3: shoulder flexion/extension, internal/external rotation, and elbow flexion/extension, as shown in Figure 6. In each trial, one type of movement was performed periodically, with a duration of about one minute.
As for the UWB system, four anchors arranged in a conventional cube distribution were employed, as shown in Figure 7. The above movements were performed by one subject under each distribution throughout the experiment. The subject was a 24-year-old healthy female, 165 cm in height and 40 kg in weight. It is important to mention that the optical MoCap system tracked the movement of the human upper limbs for each trial, which could be used to initialize the IMU sensors and obtain the positions of the UWB anchors. Before each trial, the IMU sensors were calibrated, followed by a few seconds of stillness to enhance the stability of the proposed algorithm.

4. Experimental Results and Discussion

4.1. Performance of Madgwick RK4 Orientation Filter

The performance of the Madgwick orientation filter in terms of attitude angle and 3-D position estimation is shown in this section. Figure 8 shows the Euler angles of the wrist for motion 1, the whole arm moving along a circular trajectory, which is useful for verifying the tri-axis rotation (roll, pitch, and yaw) of the upper limbs. On the basis of the attitude angles obtained by the quaternion fourth-order Runge–Kutta algorithm and the original second-order update algorithm in (7), combined with the Madgwick orientation filter as in (21) and (20), respectively, the tri-axis positions of sensor unit 1 during the movement were calculated, as shown in Figure 9. The solid lines represent the positions calculated using the various methods from the IMU system, whereas the black dotted lines are the reference positions obtained by the optical MoCap system. As can be seen, the proposed methods were able to track the movements of the upper limbs in spite of the errors at the inflection points.
The position accuracies for the three kinds of movements are summarized in Table 1, which compares the 3-D position accuracy of forearm motion tracking by the fourth-order Runge–Kutta Madgwick method and by the original second-order update algorithm against the optical MoCap system. According to Table 1, the root mean square error (RMSE) values of tri-axis motion tracking with the Madgwick RK4 method varied from 6.17 cm to 12.76 cm, which demonstrates the feasibility of precise motion tracking of the human upper limbs. To further reduce the error, it is necessary to utilize the UWB measurements.
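The per-axis RMSE values reported here and in the following sections can be reproduced with a computation of this form (a generic sketch; array names are illustrative):

```python
import numpy as np

def position_rmse(estimate, reference):
    # Per-axis root mean square error between an estimated trajectory and
    # a reference trajectory, both of shape (N, 3); units match the inputs.
    err = np.asarray(estimate, dtype=float) - np.asarray(reference, dtype=float)
    return np.sqrt(np.mean(err**2, axis=0))
```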

4.2. Performance of UWB Localization Kalman Filter

As for the UWB localization Kalman filter, we utilized multiple sets of movements of the human upper limb to verify the accuracy of the UWB positioning system. It is important to mention that the installation direction of the IMU was always the same as that in Figure 1 during the experiment. The subject stood at the edge of the UWB area to ensure that there was no occlusion between the tags and anchors during the movement of the upper limbs. Before the experiment, we first kept the tag still and verified the static positioning accuracy of UWB. Table 2 shows the static positioning variance of the four-anchor UWB system. The mean values of the tri-axis variances fell within 0.6 cm2, demonstrating the good positioning robustness of the UWB system.
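For reference, a common way to turn the four anchor ranges into a position fix is a linearized least-squares solve. The sketch below illustrates the idea under ideal range measurements; the anchor coordinates in the test are hypothetical, and the paper's UWB localization Kalman filter additionally smooths these fixes over time:

```python
import numpy as np

def uwb_trilaterate(anchors, ranges):
    # Linearized least-squares position fix from ranges to >= 4
    # non-coplanar anchors.
    # anchors: (M, 3) anchor coordinates; ranges: (M,) measured distances.
    a = np.asarray(anchors, dtype=float)
    d = np.asarray(ranges, dtype=float)
    # Subtract the first anchor's sphere equation from the others to
    # eliminate the quadratic term |p|^2, leaving a linear system in p:
    #   2 (a_i - a_0) . p = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2
    A = 2.0 * (a[1:] - a[0])
    b = (d[0]**2 - d[1:]**2) + np.sum(a[1:]**2, axis=1) - np.sum(a[0]**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```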
Figure 10 compares the position of upper limb motion tracking for the three kinds of movements by the IMU, UWB, and optical MoCap systems. Motion 1, shown in Figure 10a, consisted of movement along a circular trajectory, consistent with the motion in Figure 9. Motion 2, shown in Figure 10b, was a combined motion comprising shoulder abduction/adduction, internal/external rotation, and elbow flexion/extension. Motion 3, shown in Figure 10c and similar to motion 2, consisted of shoulder flexion/extension, internal/external rotation, and elbow flexion/extension. As can be seen in these figures, the position calculated by the UWB system was closer to the reference value of the optical MoCap system than that calculated by the IMU sensors at certain steps, especially at the inflection points. Therefore, the UWB measurements can be fused with the IMU for upper limb motion estimation to reduce the error caused by the drift of the IMU sensors.

4.3. Performance of IMU/UWB Kalman Filter

The performance of the IMU/UWB Kalman filter in 3-D upper limb motion tracking is presented in this section. As mentioned previously, by adopting the cube distribution of four anchors, the UWB measurements were fused with the IMU for drift error reduction and better positioning accuracy in motion estimation. The position RMSE values are listed in Table 3, where motion 1, motion 2, and motion 3 correspond to Figure 10a–c, respectively. The table compares the 3-D position accuracy of the various upper limb motion tracking methods, i.e., the IMU method, the UWB method, and the IMU/UWB fusion method, against the optical MoCap system. Although the position accuracy of the UWB system was sometimes lower than that of the IMU, the adopted Kalman filter adaptively adjusted the weights between them based on their covariances. It can be observed that the tri-axis position RMSE calculated by the IMU/UWB Kalman filter was consistently lower than that without UWB fusion. In our experiment, the proposed method was extensively tested with various movements. For simple movements such as motion 1, a circular trajectory, the tri-axis position RMSE values were all less than 10 cm. Compared with the IMU method, the errors of the IMU/UWB fusion method were reduced by 40%, 3.6%, and 25.5% on the X-axis, Y-axis, and Z-axis, respectively. Complex combined movements such as motions 2 and 3 consisted of multiple simple movements, such as shoulder flexion/extension, abduction/adduction, internal/external rotation, and elbow flexion/extension. Although the tri-axis errors for combined motions were usually larger than those for simple movements, the IMU/UWB fusion method still demonstrated better position accuracy than the IMU method, with a maximum RMSE of 12.2 cm. Our proposed methodology achieved an average decrease in RMSE of 1.2 cm from the IMU method to the IMU/UWB fusion method.
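The covariance-based weighting described above can be illustrated with a deliberately simplified linear Kalman filter whose state is the 3-D position alone: the IMU supplies the prediction (an integrated displacement) and UWB supplies the measurement. This is a sketch of the fusion principle under assumed noise matrices, not the paper's exact filter formulation:

```python
import numpy as np

class ImuUwbKalman:
    """Minimal linear Kalman filter fusing an IMU-propagated position
    with drift-free (but per-step noisier) UWB position fixes."""

    def __init__(self, p0, P0, Q, R):
        self.x = np.asarray(p0, dtype=float)   # 3-D position estimate
        self.P = np.asarray(P0, dtype=float)   # estimate covariance
        self.Q = np.asarray(Q, dtype=float)    # IMU process noise covariance
        self.R = np.asarray(R, dtype=float)    # UWB measurement noise covariance

    def predict(self, imu_delta):
        # Propagate with the position increment integrated from the IMU.
        self.x = self.x + np.asarray(imu_delta, dtype=float)
        self.P = self.P + self.Q

    def update(self, uwb_pos):
        # Fuse the UWB fix; the gain weighs IMU vs. UWB by their covariances.
        S = self.P + self.R
        K = self.P @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(uwb_pos, dtype=float) - self.x)
        self.P = (np.eye(3) - K) @ self.P
        return self.x
```

Because the gain K grows with the prediction covariance P and shrinks with the measurement covariance R, the filter automatically trusts UWB more as IMU drift accumulates, which matches the adaptive weighting behavior described above.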
For comparison, after converting angle RMSE into position RMSE, the position RMSE values in Table 3 fall within the accuracy range of forearm motion without constraints reported in [21].
Figure 11 shows a comparison of the 3-D spatial trajectory of upper limb motion reconstructed by the proposed algorithm and the real movement for motion 3. As can be seen in this figure, the proposed method was able to accurately track the movement of the human upper limb, and showed high feasibility and stability.

4.4. Power Consumption and Cost

According to the datasheet of the MPU9250, which was adopted in our experiment, the power consumption of the sensor depends on the selected operating mode and sensor configuration. In general, the device draws between 8.3 μA and 3.9 mA. At a supply voltage of 3.3 V, the power consumption is as follows: in low-power mode, it ranges from 27.4 μW to 66.7 μW; in normal mode with both the accelerometer and the gyroscope enabled, it varies from 31.5 μW to 764.7 μW; and in normal mode with all three sensors (accelerometer, gyroscope, and magnetometer) enabled, it ranges from 37.3 μW to 926.7 μW. The average price of a single MPU9250 chip is between USD 1 and 5.
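These power figures follow from P = V · I; for instance, the 8.3 μA floor at 3.3 V gives roughly 27.4 μW, matching the low-power-mode figure quoted above. A one-line helper (illustrative):

```python
def power_uw(current_a, voltage_v=3.3):
    # Electrical power P = V * I, with current in amperes, returned in microwatts.
    return current_a * voltage_v * 1e6

# Low-power floor from the datasheet: power_uw(8.3e-6) -> ~27.4 uW
```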
As for the UWB system, according to the DW1000 datasheet, the typical power consumption values for the device in different modes are as follows: in idle mode, 24.75 mW; in receive mode, 56.84 mW; in transmit mode (at 6.8 Mbps data rate), around 270.76 mW; and in sleep mode, 20 nW. Typically, the cost of a single DW1000 module ranges from around USD 10 to 25.
However, the power consumption of an optical MoCap system can range from a few hundred watts to several kilowatts, because the cameras employed in such a system require significant processing power and strict lighting conditions. The cost of a basic system with a few cameras and basic software is around USD 1000. Therefore, compared with the optical MoCap system, the IMU/UWB system adopted here exhibits far lower power consumption and is more economical and applicable.

5. Conclusions

In this paper, we proposed a novel method for hybrid upper limb motion tracking and 3-D positioning by fusing the IMU and UWB systems. We first simplified the human upper limb as a kinematics model and employed the quaternion method to calculate the attitude angle of each segment. To compensate for the accumulated error of the gyroscope measurements, the fourth-order Runge–Kutta Madgwick orientation filter was adopted to improve the accuracy of 3-D motion tracking through the optimal fusion of the accelerometer, gyroscope, and magnetometer of the MARG system. The RMSE values varied from 6.17 cm to 12.76 cm, which demonstrates the feasibility of precise motion tracking of the human upper limbs. The remaining error was mainly caused by the drift of the IMU sensors, together with an inevitable slight misalignment of the coordinate frames at the initial moment.
To further reduce the drift error, we combined the UWB localization system with the IMU sensors. The static positioning variance of the four-anchor UWB system was tested, and the mean values of the tri-axis variances were within 0.6 cm2, demonstrating good positioning robustness. The accuracy of the UWB localization Kalman filter was then verified with multiple sets of movements of the human upper limbs.
Adopting the four-anchor UWB system, we tested the IMU/UWB Kalman filter with various movements. The experimental results show that the proposed fusion algorithm achieved an average decrease in RMSE of 1.2 cm from the IMU method to the IMU/UWB fusion method. With high feasibility and stability, the proposed algorithm was able to accurately track the movements of the human upper limbs.
In future work, we aim to study the effects of various distributions of multiple anchors on UWB localization accuracy, as well as to extend the proposed algorithm to whole-body motion estimation.

Author Contributions

Conceptualization, Y.S. and Y.Z.; methodology, Y.S.; software, Y.S.; validation, Y.S., Y.Z. and Z.L.; formal analysis, Y.S.; investigation, Y.S.; resources, Y.S. and S.Y.; data curation, Y.S. and S.Z.; writing—original draft preparation, Y.S.; writing—review and editing, Y.S., Y.Z. and Z.L.; visualization, Y.S.; supervision, Y.Z.; project administration, Y.Z.; funding acquisition, Y.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Li, R.; Wang, H.; Liu, Z. Survey on Mapping Human Hand Motion to Robotic Hands for Teleoperation. IEEE Trans. Circuits Syst. Video Technol. 2022, 32, 2647–2665.
2. Liarokapis, M.V.; Artemiadis, P.K.; Kyriakopoulos, K.J. Mapping human to robot motion with functional anthropomorphism for teleoperation and telemanipulation with robot arm hand systems. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots & Systems, Tokyo, Japan, 3–7 November 2013; p. 2075.
3. Vongchumyen, C.; Bamrung, C.; Kamintra, W.; Watcharapupong, A. Teleoperation of Humanoid Robot by Motion Capturing Using KINECT. In Proceedings of the 2018 International Conference on Engineering, Applied Sciences, and Technology (ICEAST), Phuket, Thailand, 4–7 July 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–4.
4. Wan, K.; Sawada, H. 3D measurement of human upper body for gesture recognition. In Optomechatronic Computer-Vision Systems II; SPIE: Bellingham, WA, USA, 2007.
5. Liu, Y.-T.; Zhang, Y.-A.; Zeng, M. Sensor to segment calibration for magnetic and inertial sensor based motion capture systems. Measurement 2019, 142, 1–9.
6. Yuan, Q.; Chen, I.M. Human velocity and dynamic behavior tracking method for inertial capture system. Sens. Actuators A Phys. 2012, 183, 123–131.
7. Fan, B.; Li, Q.; Tan, T.; Kang, P.; Shull, P.B. Effects of IMU Sensor-to-Segment Misalignment and Orientation Error on 3D Knee Joint Angle Estimation. IEEE Sens. J. 2021, 22, 2543–2552.
8. Luinge, H.J.; Veltink, P.H. Measuring orientation of human body segments using miniature gyroscopes and accelerometers. Med. Biol. Eng. Comput. 2005, 43, 273–282.
9. Seel, T.; Raisch, J.; Schauer, T. IMU-based joint angle measurement for gait analysis. Sensors 2014, 14, 6891–6909.
10. Favre, J.; Jolles, B.M.; Aissaoui, R.; Aminian, K. Ambulatory measurement of 3D knee joint angle. J. Biomech. 2008, 41, 1029–1035.
11. Luinge, H.J.; Veltink, P.H.; Baten, C.T. Ambulatory measurement of arm orientation. J. Biomech. 2007, 40, 78–85.
12. El-Gohary, M.; Mcnames, J. Shoulder and Elbow Joint Angle Tracking With Inertial Sensors. IEEE Trans. Bio-Med. Eng. 2012, 59, 2635–2641.
13. Álvarez, D.; Alvarez, J.C.; González, R.C.; López, A.M. Upper limb joint angle measurement in occupational health. Comput. Methods Biomech. Biomed. Eng. 2016, 19, 159–170.
14. Madgwick, S. An Efficient Orientation Filter for Inertial and Inertial/Magnetic Sensor Arrays; Report x-io and University of Bristol: Bristol, UK, 2010; Volume 25, pp. 113–118.
15. Mahony, R.; Hamel, T.; Pflimlin, J.M. Nonlinear Complementary Filters on the Special Orthogonal Group. IEEE Trans. Autom. Control 2008, 53, 1203–1218.
16. Bachmann, E.R. Inertial and Magnetic Tracking of Limb Segment Orientation for Inserting Humans into Synthetic Environments. Doctoral Dissertation, Naval Postgraduate School, Monterey, CA, USA, 2000.
17. Shen, H.-M.; Lian, C.; Wu, X.-W.; Bian, F.; Yu, P.; Yang, G. Full-pose estimation using inertial and magnetic sensor fusion in structurized magnetic field for hand motion tracking. Measurement 2021, 170, 108697.
18. Zihajehzadeh, S.; Loh, D.; Lee, M.; Hoskinson, R.; Park, E.J. A cascaded two-step Kalman filter for estimation of human body segment orientation using MEMS-IMU. In Proceedings of the 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Chicago, IL, USA, 26–30 August 2014; IEEE: Piscataway, NJ, USA, 2014.
19. Young, A.D. Use of Body Model Constraints to Improve Accuracy of Inertial Motion Capture. In Proceedings of the International Conference on Body Sensor Networks, Singapore, 7–9 June 2010.
20. Zhang, Z.; Wong, L.W.C.; Wu, J.-K. 3D Upper Limb Motion Modeling and Estimation Using Wearable Micro-sensors. In Proceedings of the 2010 International Conference on Body Sensor Networks, Singapore, 7–9 June 2010; pp. 117–123.
21. Zhang, Z.-Q.; Wu, J.-K. A novel hierarchical information fusion method for three-dimensional upper limb motion estimation. IEEE Trans. Instrum. Meas. 2011, 60, 3709–3719.
22. Tao, G.; Sun, S.; Huang, S.; Huang, Z.; Wu, J. Human modeling and real-time motion reconstruction for micro-sensor motion capture. In Proceedings of the 2011 IEEE International Conference on Virtual Environments, Human-Computer Interfaces and Measurement Systems Proceedings, Ottawa, ON, Canada, 19–21 September 2011; IEEE: Piscataway, NJ, USA, 2011.
23. Zhang, Z.Q.; Wong, W.C.; Wu, J.K. Ubiquitous Human Upper-Limb Motion Estimation using Wearable Sensors. IEEE Trans. Inf. Technol. Biomed. 2011, 15, 513–521.
24. Roetenberg, D.; Luinge, H.J.; Baten, C.; Veltink, P.H. Compensation of magnetic disturbances improves inertial and magnetic sensing of human body segment orientation. IEEE Trans. Neural Syst. Rehabil. Eng. 2005, 13, 395.
25. Fourati, H.; Manamanni, N.; Afilal, L.; Handrich, Y. Complementary Observer for Body Segments Motion Capturing by Inertial and Magnetic Sensors. IEEE/ASME Trans. Mechatron. 2014, 19, 149–157.
26. Bharadwaj, R.; Parini, C.; Alomainy, A. Indoor tracking of human movements using UWB technology for motion capture. In Proceedings of the 2014 8th European Conference on Antennas and Propagation (EuCAP), The Hague, The Netherlands, 6–11 April 2014.
27. Queralta, J.P.; Almansa, C.M.; Schiano, F.; Floreano, D.; Westerlund, T. UWB-based system for UAV Localization in GNSS-Denied Environments: Characterization and Dataset. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 25–29 October 2020.
28. Kim, D.-H.; Pyun, J.-Y. NLOS identification based UWB and PDR hybrid positioning system. IEEE Access 2021, 9, 102917–102929.
29. Poulose, A.; Emeršič, Ž.; Eyobu, O.S.; Han, D.S. An accurate indoor user position estimator for multiple anchor UWB localization. In Proceedings of the 2020 International Conference on Information and Communication Technology Convergence (ICTC), Jeju Island, Republic of Korea, 21–23 October 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 478–482.
30. Zampella, F.; Angelis, A.D.; Skog, I.; Zachariah, D.; Jiménez, A. A constraint approach for UWB and PDR fusion. In Proceedings of the International Conference on Indoor Positioning & Indoor Navigation, Montbeliard, France, 28–31 October 2013.
31. Corrales, J.A.; Candelas, F.A.; Torres, F. Hybrid tracking of human operators using IMU/UWB data fusion by a Kalman filter. In Proceedings of the Third ACM/IEEE International Conference on Human-Robot Interaction, Amsterdam, The Netherlands, 12–15 March 2008; IEEE: Piscataway, NJ, USA, 2008; p. 193.
32. Zihajehzadeh, S.; Yoon, P.K.; Park, E.J. A magnetometer-free indoor human localization based on loosely coupled IMU/UWB fusion. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; IEEE: Piscataway, NJ, USA, 2015.
33. Kok, M.; Hol, J.D.; Schon, T.B. Indoor Positioning Using Ultrawideband and Inertial Measurements. IEEE Trans. Veh. Technol. 2015, 64, 1293–1303.
34. Zihajehzadeh, S.; Park, E.J. A novel biomechanical model-aided IMU/UWB fusion for magnetometer-free lower body motion capture. IEEE Trans. Syst. Man Cybern. Syst. 2016, 47, 927–938.
35. Zihajehzadeh, S.; Yoon, P.K.; Kang, B.S.; Park, E.J. UWB-Aided Inertial Motion Capture for Lower Body 3-D Dynamic Activity and Trajectory Tracking. IEEE Trans. Instrum. Meas. 2015, 64, 3577–3587.
36. Zhang, H.; Zhang, Z.; Gao, N.; Xiao, Y.; Meng, Z.; Li, Z. Cost-Effective Wearable Indoor Localization and Motion Analysis via the Integration of UWB and IMU. Sensors 2020, 20, 344.
37. Peppoloni, L.; Filippeschi, A.; Ruffaldi, E.; Avizzano, C.A. A novel 7 degrees of freedom model for upper limb kinematic reconstruction based on wearable sensors. In Proceedings of the 2013 IEEE 11th International Symposium on Intelligent Systems and Informatics (SISY), Subotica, Serbia, 26–28 September 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 105–110.
38. Tian, Y.; Meng, X.; Tao, D.; Liu, D.; Feng, C. Upper limb motion tracking with the integration of IMU and Kinect. Neurocomputing 2015, 159, 207–218.
39. Roux, E.; Bouilland, S.; Godillon-Maquinghen, A.P.; Bouttens, D. Evaluation of the global optimisation method within the upper limb kinematics analysis. J. Biomech. 2009, 35, 1279–1283.
Figure 1. Upper limb kinematics model and sensor unit arrangement.
Figure 2. Framework of the proposed upper limb motion tracking algorithm.
Figure 3. UWB positioning system.
Figure 4. Flowchart of UWB localization and IMU/UWB Kalman filters.
Figure 5. Setup of IMU sensors, UWB system, and optical MoCap system.
Figure 6. Tested movements. (ac) represent motions 1–3, respectively.
Figure 7. Cube distribution of four anchors of the UWB system.
Figure 8. Euler angles of the forearm for motion 1 using the IMU system.
Figure 9. Positions of sensor unit 1 for motion 1.
Figure 10. Comparison of the sensor unit 1 position of upper limb motion tracking for three kinds of movements by the IMU, UWB, and optical Mocap systems: (ac) represent motion 1, motion 2, and motion 3, respectively.
Figure 11. 3-D spatial trajectory of sensor units 1 and 2 for motion 3. (a) Sequences of the real movement. (b) Motion reconstructed by the proposed algorithm.
Table 1. The tri-axis position accuracy of forearm motion tracking by the Madgwick RK4 method and the original second-order update algorithm compared with the optical Mocap system.
            Position RMSE—Original Second-Order (cm)   Position RMSE—Madgwick RK4 (cm)
            X        Y        Z                        X        Y        Z
Motion 1    9.8361   4.9402   10.1491                  8.6124   6.1745   9.3199
Motion 2    7.4554   7.5898   10.7254                  6.6717   7.6876   10.5322
Motion 3    13.2050  11.1597  11.2611                  12.7563  10.3026  11.9849
Table 2. Static positioning variance of the four-anchor UWB system.
        Static Positioning Variance (cm2)
        X        Y        Z
1       0.2970   0.4106   0.4080
2       0.2803   0.2255   0.7632
3       0.1668   0.3004   0.2671
4       0.1990   0.1649   0.2265
5       0.3813   0.4273   0.9591
Mean    0.2649   0.3057   0.5248
Table 3. Comparison of the 3-D position accuracy of various upper limb motion tracking methods, i.e., the IMU method, UWB method, and IMU/UWB fusion method with the optical Mocap system.
            Position RMSE—IMU (cm)        Position RMSE—UWB (cm)        Position RMSE—IMU/UWB (cm)
            X        Y        Z           X        Y        Z           X        Y        Z
Motion 1    9.1448   8.6423   8.5407      5.7678   4.3311   12.8657     5.4827   8.3291   6.3664
Motion 2    5.1256   8.5176   10.7520     11.5741  5.6851   15.9024     4.9606   7.8206   9.7357
Motion 3    12.7599  10.2743  11.9922     16.0117  12.5021  13.7803     12.2170  9.0845   11.1169

Share and Cite

Shi, Y.; Zhang, Y.; Li, Z.; Yuan, S.; Zhu, S. IMU/UWB Fusion Method Using a Complementary Filter and a Kalman Filter for Hybrid Upper Limb Motion Estimation. Sensors 2023, 23, 6700. https://doi.org/10.3390/s23156700

