**Citation:** Rahman, M.M.; Gan, K.B.; Aziz, N.A.A.; Huong, A.; You, H.W. Upper Limb Joint Angle Estimation Using Wearable IMUs and Personalized Calibration Algorithm. *Mathematics* **2023**, *11*, 970. https://doi.org/10.3390/math11040970

Academic Editors: Camelia Petrescu and Valeriu David

Received: 17 November 2022; Revised: 10 January 2023; Accepted: 11 January 2023; Published: 14 February 2023

**Copyright:** © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

**1. Introduction**

Stroke is a leading cause of upper limb disability among adults, particularly the elderly; it results in immobility and can be fatal in severe cases [1,2]. Numerous factors, such as age, gender, and level of physical activity, affect upper limb movement [3]. People who lose the use of their upper limbs experience a wide range of health issues that alter their way of life and make regular activities more difficult [4]. One of the leading causes of mobility loss in middle-aged and older people is the age-related decline of the musculoskeletal system driven by chronic illnesses [2,5]. Joint data can usefully predict the early onset of degeneration in limb mobility. Thus, quantifying human mobility-related parameters can support decisions about human health status and performance [6], and measuring joint characteristics such as joint angles and angular accelerations can play a vital role in accurately assessing the human body's stability and functional capacity.

A human motion capture system used in a therapeutic setting should be easy to use, portable, and inexpensive while remaining accurate [7]. Marker-based optical motion capture systems, such as VICON, are often considered the state of the art for such studies. By directly measuring marker positions, these systems provide position and orientation with extremely high accuracy [8]. The primary function of an optical system is to reconstruct a person's spatial posture, typically using high-speed cameras to record the locations of reflective markers on the body surface. Because of this precision, the optical approach is often regarded as the industry standard. However, lighting, occlusion, location, cost, and lengthy preparation and setup times are its primary constraints [9,10], and the high price, highly complex design, and need for trained personnel make these systems impractical outside institutional settings such as hospitals and large-scale laboratories [6].

Micro-electromechanical systems (MEMS) technology has made low-cost, energy-efficient, and sensitive miniature sensors practical. Nowadays, smartphones, smartwatches, and fitness trackers contain MEMS-based IMUs comprising accelerometers, gyroscopes, magnetometers, and other sensors [11–13]. Compared with an optoelectronic system, they offer portability and convenience of use owing to their small size, low cost, and light weight. Several studies have examined the validity and reliability of wearable sensors in motion analysis: IMU sensors can analyze gait, lower limb joint and pelvic angle kinematics, upper limb motion and joint parameters, and whole-body motion [14]. Many commercially available IMU systems, including Xsens, IMeasureU, BioSyn Systems, and Shimmer Sensing, were developed specifically for motion analysis, and the precision of their 3D kinematics has been validated [15,16]. Moreover, because it can sense the strength, angular velocity, and orientation of body limbs, the IMU is a frequently chosen option [17]. Low-power, small IMUs can be integrated with smart textiles to create wearable, noninvasive health monitoring devices for the elderly, allowing remote healthcare workers to monitor, assess, and retain data on their patients' mobility, activity, physical fitness, and rehabilitation [18,19].

However, inertial measurement units (IMUs) have their own challenges. The data produced by an IMU may be insufficient, imprecise, or contaminated with errors. Accelerometer readings can be inaccurate because they include the gravitational component [20]. Typically, the orientation of a segment is determined by integrating the gyroscope measurement (angular velocity), and the translational acceleration detected by the accelerometer is double-integrated to determine position. A critical problem with integration is that measurement errors accumulate quickly, so the precision of the result degrades over time. The magnetometer, in turn, suffers from interference generated by surrounding magnetic fields and by ferromagnetic materials close to the sensor [21].
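The drift problem described above is easy to demonstrate numerically. The sketch below (illustrative only; the sample rate and bias values are assumptions, not figures from this study) integrates a simulated stationary gyroscope axis whose reading carries a small constant bias, showing how the orientation estimate walks away from the truth:

```python
import numpy as np

# Simulate a stationary gyroscope axis: the true angular velocity is zero,
# but the reading carries a constant bias plus white noise (assumed values).
rng = np.random.default_rng(0)
fs = 100.0                                 # assumed sample rate (Hz)
t = np.arange(0, 60, 1 / fs)               # one minute of data
bias = 0.5                                 # deg/s, typical of low-cost MEMS
gyro = bias + rng.normal(0, 0.1, t.size)   # measured rate (deg/s)

# Naive orientation estimate: integrate the angular rate.
angle = np.cumsum(gyro) / fs               # integrated angle (deg)

# After 60 s the estimate has drifted by roughly bias * 60 s = 30 deg,
# even though the sensor never moved.
print(f"drift after 60 s: {angle[-1]:.1f} deg")
```

This linear error growth is why raw integration alone is unusable for joint angle tracking and why a fusion algorithm with an absolute reference (gravity, magnetic north) is needed.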

Because of the aforementioned difficulties, human kinematics research using accelerometers, gyroscopes, or magnetometers alone was severely restricted. To solve these problems, a tri-axial gyroscope, accelerometer, and magnetometer were integrated into a single device, accompanied by a fusion algorithm. Fusing the data obtained from the sensors can reduce measurement error and produce a more precise estimate of the motion [22]. Several sensor fusion methods have been reported in the scientific literature, including the Kalman filter (KF) and its variants, such as the extended Kalman filter (EKF), the particle filter, and the unscented Kalman filter (UKF), as well as the complementary filter and its variants [23]. The EKF, a nonlinear variation of the KF, is a common attitude estimation technique among researchers [24,25]. Indeed, the KF maintains its undisputed coverage in industrial applications and is an optimal filtering strategy in terms of minimum mean squared error [26]. However, several issues remain with traditional KF techniques. The state and observation models must be linear for the classical KF to work, and the noise sources of both models must be uncorrelated white Gaussian noise [27]. In contrast to actual engineering practice, where systems are typically nonlinear, classical Kalman filtering theory applies only to linear systems and demands linear observation equations. As a result, Bucy and Sunahara developed the extended Kalman filter (EKF) in the 1970s, in which the nonlinear system is linearized before the Kalman filter is applied to estimate its state. The linearization procedure introduces errors into the nonlinear system, which reduces the accuracy of the final state estimate [28,29]. Researchers therefore turned to the complementary filter (CF) to overcome the problems related to the KF [30]. However, a nonlinear complementary filter (NCF) became necessary, since linear complementary filters cannot track the changing bias of low-cost sensors [31]. For attitude estimation, Madgwick proposed a gradient-descent-based CF in 2011 [32], while Mahony proposed a CF variant on the special orthogonal group in 2008 [33]. In terms of execution time, the Madgwick filter requires less time than the EKF because of the EKF's higher computational load [34]. The EKF was recommended for reliable tilt measurements at higher angular speeds, while the Madgwick filter appeared best for angular velocities up to 100–150° s⁻¹ [35]. In a comparison of fusion algorithms aimed at finding a reliable and efficient technique [36], the authors concluded that the Madgwick filter performs marginally better than all other evaluated algorithms in terms of root mean square error (RMSE).
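The core idea shared by these fusion filters can be illustrated with the simplest member of the family, a linear single-axis complementary filter: high-pass the integrated gyro rate (accurate short-term, drifts long-term) and low-pass the accelerometer tilt (noisy short-term, unbiased long-term). This is only a sketch of the principle; this study itself uses the Madgwick filter, and all numeric values below are assumptions:

```python
import numpy as np

def complementary_filter(gyro_rate, accel_angle, dt, alpha=0.98):
    """Single-axis linear complementary filter (deg).

    theta <- alpha * (theta + gyro*dt) + (1 - alpha) * accel_angle
    blends short-term gyro integration with the long-term accelerometer
    reference. Illustrative only, not the paper's Madgwick implementation.
    """
    theta = accel_angle[0]
    fused = np.empty(len(gyro_rate))
    for k in range(len(gyro_rate)):
        theta = alpha * (theta + gyro_rate[k] * dt) + (1 - alpha) * accel_angle[k]
        fused[k] = theta
    return fused

# Stationary segment held at a true tilt of 10 degrees (assumed scenario):
rng = np.random.default_rng(0)
dt, n = 0.01, 6000                       # 100 Hz for 60 s (assumed rates)
gyro = 0.5 + rng.normal(0, 0.1, n)       # biased gyro reading (deg/s)
accel = 10 + rng.normal(0, 1.0, n)       # noisy accelerometer tilt (deg)

fused = complementary_filter(gyro, accel, dt)
drifted = np.cumsum(gyro) * dt           # gyro-only estimate drifts away
```

After 60 s the fused estimate stays close to the true 10° tilt, while pure gyro integration has drifted by tens of degrees; this bounded residual bias is also why nonlinear variants (NCF, Madgwick, Mahony) that explicitly track sensor bias are preferred for low-cost IMUs.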

An inevitable consequence is that sensor calibration becomes a concern that affects the measurement outcome, and the angle measurement from an individual sensor is not exact. Although a sensor fusion technique is required to avoid the individual problems of the accelerometer, gyroscope, and magnetometer, most operational algorithms suffer from the issues discussed above. Excessive external acceleration and distortion in the magnetometer data may introduce errors into the computed joint angles, and contemporary works have not addressed external acceleration. In this work, therefore, a two-stage algorithm using three IMUs was proposed to estimate the joint angles of the upper limbs; it improves accuracy and reduces bias in the joint angle calculation, addressing these gaps. In the first stage, each IMU sensor's orientation was estimated using a sensor fusion algorithm. In the second stage, the joint angle was derived from these orientations, which are less susceptible to noise introduced by external acceleration. The rest of this article is organized as follows: Section 2 covers the experimental setup, the general methodology, and the specifics of the proposed algorithm. Section 3 evaluates and discusses the algorithm's performance. Section 4 draws final conclusions and outlines directions for further study using the proposed algorithm.

#### **2. Materials and Methods**

#### *2.1. Experimental Scenario and Platform*

The system comprised three wireless sensor nodes, each equipped with a relatively inexpensive MPU9250 MEMS inertial sensor packaged in a low-cost nine-axis module by WitMotion. The inertial sensor comprises a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer. The module measures 51.3 mm × 36 mm × 15 mm and weighs 20 g. Table 1 provides a comprehensive summary of the IMU's features and parameters.

Figure 1a,b depicts the hardware configuration of the inertial motion capture system used in this study. BLE 5.0 multiple-connection software was used to link a PC with the IMU sensors over a range of up to 50 m (without obstacles such as walls). The software can retrieve accelerometer, gyroscope, and magnetometer data and save them as CSV files.


**Table 1.** IMU specifications. Source: https://www.wit-motion.com/. Accessed on 5 September 2022.

**Figure 1.** (**a**,**b**) The hardware of the motion capture system. Source: https://www.wit-motion.com/. (**c**) Three-dimensional rigid body structure.

The elbow joint of the human body was represented by a 3D rigid body, shown in Figure 1c, which resembles a human arm. The rigid body was designed and built from two electro-goniometers joined with a strong adhesive, as shown in Figure 1c. The electro-goniometers have an accuracy of 0.5° over a measuring range of 0–200°. This structure was then placed on a platform to form the complete arrangement. The rigid body has three segments or arms: one represents the upper arm, the second the forearm, and the third the shoulder. Sensors were attached to each arm of the rigid body, and raw sensor data were collected at 10 samples per second. These IMU data were used to determine the elbow joint angle of the rigid body under static and dynamic conditions.

#### *2.2. Joint Angle Measurement Algorithm*

In this study, we proposed a two-stage orientation measurement algorithm that estimates the elbow joint angle from the raw accelerometer, gyroscope, and magnetometer data of three IMUs. Two IMUs were attached above and below the elbow joint of the rigid body, and the third served as a static reference at the shoulder position. At the beginning of the proposed algorithm, each IMU sensor's data were calibrated. In the first stage, all sensor data were fused using the Madgwick filter, which outputs orientation in quaternion form. In the second stage, the joint angle was estimated by combining the quaternion data of all IMUs obtained in the first stage. Finally, to remove high-frequency components, the estimated joint angle was passed through a fourth-order Butterworth low-pass filter. Algorithm 1 outlines the proposed procedure for estimating the joint angle, and Figure 2 shows the proposed algorithm as a block diagram.
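The final smoothing step can be sketched as follows. The cut-off frequency below is an assumed value chosen for illustration (the paper specifies only the filter order and the 10 Hz sampling rate), and the synthetic "elbow motion" signal is likewise hypothetical:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 10.0   # sampling rate of the IMU data (10 samples/s, as in the setup)
fc = 2.0    # cut-off frequency in Hz -- an assumed value, not from the paper

# Fourth-order low-pass Butterworth; Wn is normalized to the Nyquist rate.
b, a = butter(4, fc / (fs / 2))

# Synthetic slow elbow motion (deg) corrupted by measurement noise:
t = np.arange(0, 10, 1 / fs)
angle = 45 + 30 * np.sin(2 * np.pi * 0.2 * t)
noisy = angle + np.random.default_rng(1).normal(0, 2, t.size)

# filtfilt applies the filter forward and backward, giving zero phase lag,
# which matters when the filtered angle is compared against a reference.
smooth = filtfilt(b, a, noisy)
```

Zero-phase filtering (`filtfilt`) is suitable for offline processing as done here; a real-time implementation would instead use a causal filter and accept a small delay.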

#### **Algorithm 1** Proposed Joint Angle Measurement Algorithm

Input:
- Accelerometer data: *a<sup>i</sup>*
- Gyroscope data: *g<sup>i</sup>*
- Magnetometer data: *m<sup>i</sup>*

4 Calibration and bias correction of the accelerometer data:

$$a_c^k = a^k - b_i$$

5 Conversion of the magnetometer data (*m<sup>i</sup>*) from the local frame to the global frame

6 Calibration and correction of the magnetometer data in the global frame:

$$m_c^k = \begin{bmatrix} q_b & q_c & q_d \end{bmatrix}$$

7 First stage: use of the Madgwick filter for all three sensors' orientation estimation. Output:

Forearm sensor → *q<sub>FA</sub>*

Upper arm sensor → *q<sub>UA</sub>*

Reference sensor → *q<sub>R</sub>*

8 Second stage: calculation of the joint angle (*θ*) from the three quaternions

9 Filtering of the joint angle (*θ*) using a fourth-order Butterworth filter
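The second stage of the algorithm, extracting a joint angle from two segment orientations, can be sketched with standard quaternion algebra. The relative quaternion between the upper-arm and forearm frames encodes the rotation between the segments, and its scalar part gives the rotation angle. This is a plausible reading of the second stage, not the paper's exact formula, and the function and variable names are illustrative:

```python
import numpy as np

def quat_conj(q):
    """Conjugate of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(p, q):
    """Hamilton product p ⊗ q of two quaternions [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw * qw - px * qx - py * qy - pz * qz,
        pw * qx + px * qw + py * qz - pz * qy,
        pw * qy - px * qz + py * qw + pz * qx,
        pw * qz + px * qy - py * qx + pz * qw,
    ])

def joint_angle(q_ua, q_fa):
    """Rotation angle (deg) between upper-arm and forearm orientations.

    q_rel = q_ua* ⊗ q_fa rotates the upper-arm frame into the forearm
    frame; 2*arccos(|w|) recovers the angle of that rotation.
    """
    q_rel = quat_mul(quat_conj(q_ua), q_fa)
    w = np.clip(abs(q_rel[0]), 0.0, 1.0)
    return np.degrees(2 * np.arccos(w))

# Example: a 90-degree rotation about z between the two segment frames.
q_ua = np.array([1.0, 0.0, 0.0, 0.0])                      # identity
q_fa = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
print(joint_angle(q_ua, q_fa))   # ≈ 90.0
```

In practice, both segment quaternions would first be expressed relative to the static reference sensor (*q<sub>R</sub>*) so that common-mode motion of the whole rig cancels out.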

**Figure 2.** Proposed algorithm for joint angle estimation.
