#### *2.3. IMU*

Two wireless MTw IMU sensors (Xsens, Enschede, The Netherlands) are used, which are placed on the users' body segments (weight: 10 g; size: 47 × 30 × 13 mm). Each MTw sensor comprises a 3D gyroscope, a 3D accelerometer, and a 3D magnetometer. The IMUs send their data to a receiving station (Awinda, Xsens) via a proprietary wireless communication protocol at a sampling frequency of 100 Hz. The orientation data, in quaternion format, were estimated with a customized Kalman filter developed by the manufacturer (XKF3-hm) [32].

We implemented a method using two IMU units to estimate the elbow flexion-extension angle. The IMU on the proximal body segment (i.e., the upper arm) was used as a reference and is hereafter referred to as *S*1. The second IMU, located on the distal body segment (i.e., the forearm), is referred to as *S*2. The algorithm uses five coordinate systems. *S*1 and *S*2 are the local coordinate systems embedded in each IMU sensor. The technical-anatomical coordinate systems (*B*1 and *B*2) correspond to the orientation of the body segments [33]. Finally, the global coordinate system, *G*, is a frame formed by the gravity acceleration vector and the Earth's magnetic north. The reference IMU sensor (*S*1) was carefully aligned with the body segment, so that, at the calibration instant, with a neutral standing posture and after applying a gravity alignment, the *B*1 (*B*2) coordinate system can be estimated from *S*1 (*S*2).

The algorithm aims to estimate the angle between the orientations of the proximal and distal segments. Therefore, the segments' orientations must be projected onto a common coordinate system (i.e., the global coordinate system). This is done through the orientation quaternions ${}^{G}\mathbf{q}_{B_1}(t)$ and ${}^{G}\mathbf{q}_{B_2}(t)$. The estimation process presupposes that the *x*-axes of both technical-anatomical coordinate systems are aligned with the gravity vector at the calibration instant, in a straight neutral posture, as proposed in previous works [33–35]. In addition, the elbow joint is simplified as a 1-DOF hinge joint. The flexion-extension angle (*α*) is calculated following the steps below (a code sketch of these steps is provided after the list):


• To estimate the initial orientation of the body segments at the calibration instant (*t* = 0) by applying the gravity-alignment correction quaternion $\mathbf{q}_c$ to the sensor orientation, using Equations (2) and (3).

$${}^{G}\mathbf{q}_{B_1}(0) = \mathbf{q}_c \otimes {}^{G}\mathbf{q}_{S_1}(0) \tag{2}$$

$${}^{G}\mathbf{q}_{B_1}(0) = {}^{G}\mathbf{q}_{B_2}(0) \tag{3}$$

where ⊗ denotes the Hamilton product. Note that, due to the assumptions mentioned above, at the calibration instant ${}^{G}\mathbf{q}_{B_1}(0)$ and ${}^{G}\mathbf{q}_{B_2}(0)$ are equal, which means that the initial angle in a straight neutral posture is zero.

• To calculate the orientation of sensor *S*1 (*S*2) with respect to its associated body segment *B*1 (*B*2) using Equation (4). Note that ${}^{B_1}\mathbf{q}_{S_1}$ and ${}^{B_2}\mathbf{q}_{S_2}$ are constant at all time instants.

$${}^{B_1}\mathbf{q}_{S_1} = \left({}^{G}\mathbf{q}_{B_1}(0)\right)^{-1} \otimes {}^{G}\mathbf{q}_{S_1}(0) \tag{4}$$

• To estimate the body segments' orientations ${}^{G}\mathbf{q}_{B_1}(t)$ and ${}^{G}\mathbf{q}_{B_2}(t)$ using Equation (5). Note that Equations (4) and (5) are expressed in terms of *B*1 and *S*1, but they are equally applicable to *B*2 and *S*2.

$${}^{G}\mathbf{q}_{B_1}(t) = {}^{G}\mathbf{q}_{S_1}(t) \otimes \left({}^{B_1}\mathbf{q}_{S_1}\right)^{-1} \tag{5}$$

• To calculate the relative orientation between ${}^{G}\mathbf{q}_{B_1}(t)$ and ${}^{G}\mathbf{q}_{B_2}(t)$ using Equation (6). The resulting quaternion ${}^{B_1}\mathbf{q}_{B_2}(t)$ is then decomposed into 'ZXY' Euler angles. The rotation about the *z*-axis represents the elbow flexion-extension angle, *α*, consistent with the ISB recommendations [16,36].

$${}^{B_1}\mathbf{q}_{B_2}(t) = \left({}^{G}\mathbf{q}_{B_1}(t)\right)^{-1} \otimes {}^{G}\mathbf{q}_{B_2}(t) \tag{6}$$
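For illustration, a minimal Python sketch of Equations (2)–(6) is given below. This is not the implementation used in this work: it relies on SciPy's `Rotation` class (scalar-last quaternion order, so the MTw scalar-first output would need reordering), assumes the 'ZXY' decomposition is the intrinsic Z-X-Y sequence, and uses placeholder calibration values.

```python
# Minimal sketch of Equations (2)-(6) (illustrative only, not the authors' code).
# Quaternions are handled through SciPy's Rotation class, which uses the
# scalar-last (x, y, z, w) order; the Hamilton product maps to the '*' operator.
import numpy as np
from scipy.spatial.transform import Rotation as R

def sensor_to_segment(q_GB_0, q_GS_0):
    """Eq. (4): constant sensor-to-segment orientation, e.g., B1_q_S1."""
    return q_GB_0.inv() * q_GS_0

def segment_orientation(q_GS_t, q_BS):
    """Eq. (5): body-segment orientation in the global frame, G_q_B(t)."""
    return q_GS_t * q_BS.inv()

def elbow_flexion_angle(q_GB1_t, q_GB2_t):
    """Eq. (6): relative orientation B1_q_B2(t), decomposed into 'ZXY' Euler
    angles (intrinsic sequence assumed). The z rotation is alpha, in degrees."""
    q_B1B2_t = q_GB1_t.inv() * q_GB2_t
    alpha, _, _ = q_B1B2_t.as_euler('ZXY', degrees=True)
    return alpha

# --- calibration (t = 0), straight neutral standing posture ---
q_GS1_0 = R.from_quat([0.0, 0.0, 0.0, 1.0])      # upper-arm IMU output (placeholder)
q_GS2_0 = R.from_quat([0.0, 0.05, 0.0, 0.9987])  # forearm IMU output (placeholder)
q_c     = R.identity()                           # gravity-alignment correction (placeholder), Eq. (2)

q_GB1_0 = q_c * q_GS1_0                          # Eq. (2)
q_GB2_0 = q_GB1_0                                # Eq. (3): equal initial segment orientations
q_B1S1  = sensor_to_segment(q_GB1_0, q_GS1_0)    # Eq. (4), constant over time
q_B2S2  = sensor_to_segment(q_GB2_0, q_GS2_0)

# --- at each sample t (replace with streamed IMU quaternions) ---
q_GS1_t, q_GS2_t = q_GS1_0, q_GS2_0
q_GB1_t = segment_orientation(q_GS1_t, q_B1S1)   # Eq. (5)
q_GB2_t = segment_orientation(q_GS2_t, q_B2S2)
alpha   = elbow_flexion_angle(q_GB1_t, q_GB2_t)  # flexion-extension angle (degrees)
```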

#### *2.4. Sensors Characterization*

To validate the measurements of the camera-based and IMU-based systems, a goniometer is used as a reference standard. This reference has two adjustable lever locks, which are positioned to limit the flexion-extension motion between 20° (lower bound) and 90° (upper bound). The goniometer was placed and aligned with the elbow joint of subject M1, who was then asked to perform flexion-extension movements in the sagittal plane so as to reach both locks. Lastly, the data were acquired by the camera-based system and the IMUs. The POF curvature sensor was characterized using the procedure described in Section 2.2.

#### *2.5. Experimental Protocol*

Eleven participants without motor impairments were enrolled in this study: six females, referred to as F1, F2, F3, F4, F5, and F6 (age: 27.3 ± 4.9 years, body weight: 56.8 ± 16.3 kg), and five males, referred to as M1, M2, M3, M4, and M5 (age: 27.4 ± 3.3 years, body weight: 70.2 ± 3.8 kg), as shown in Table 1. This research was approved by the Ethical Committee of UFES (Research Project CAAE: 64797816.7.0000.5542). As shown in Table 1, there is variability not only in gender, but also in the subjects' weight and height, which supports a further generalization of the proposed study.


**Table 1.** Characteristics of the participants of this research.

Two IMU sensors, one POF curvature sensor, and two RGB-D cameras were used to estimate elbow joint angles. The IMU reference sensor was placed on the superior third of the right upper arm, and the second IMU was attached dorso-distally to the right forearm, as shown in Figure 6e. In a standing neutral posture, both sensors were positioned with the *x*-axis pointing cranially, the *z*-axis pointing laterally, and the *y*-axis orthogonal to the *x* and *z* axes. These placements have been suggested by different authors [9,35,37]. Moreover, the POF curvature sensor was carefully aligned with the elbow joint in such a way that the sensitive zone of the optical fiber was located on the axis of rotation (flexion-extension axis).

In this test, the 11 participants (see Table 1) performed flexion-extension movements at a comfortable self-selected velocity in three planes: sagittal, transverse, and frontal. Each participant stood at the center of the room, observing the middle point between the two RGB-D cameras (see Figure 6d). All trials start with a synchronization movement, which consists of keeping the elbow in maximum extension in a standing posture, then performing an elbow flexion of 90° and returning to the extended elbow position, where each transition lasts 5 s. Then, the subject was asked to perform three repetitions of flexion-extension in a specific plane. In the sagittal plane, the shoulder is in a neutral position and the participant performs elbow flexion-extension to reach the maximum possible angle (see Figure 6a). In the transverse plane, the shoulder is abducted (at most 90°) and kept in that position for 5 s before the elbow flexion-extension movements (Figure 6b). In the frontal plane, the shoulder is abducted (at most 90°) and externally rotated, so that the palm of the hand faces forward, as shown in Figure 6c. These steps are summarized in Figure 7.

**Figure 6.** Sensors' placement on the human upper limb, and movements performed during the experimental protocol. (**a**) User movement representation in sagittal plane, (**b**) transverse plane, and (**c**) frontal plane. (**d**) Top view of RGB-D system arrangement. (**e**) User using the IMU system and POF sensor.

**Figure 7.** Summary of the protocol's phases.

#### **3. Results and Discussion**

The comparison variables of the three systems were: (i) the correlation coefficient and (ii) the root mean squared error (RMSE) between the RGB-D cameras, IMUs, and POF. After the comparison among the sensors, a novel approach for angle correction in the markerless camera-based system was proposed and validated for the correction of angular errors in the sagittal plane. The proposed technique is based on anthropometric measurements of each subject: the errors of the camera-based system with respect to the POF curvature sensor and the IMU system are corrected through a correlation between the measured errors and the arm length of each subject (*d*1, *d*2, and *d*3 of Equation (1)). In this way, an equation for error correction considering the length *d*3 of each subject was obtained.
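As a reference for how these two metrics can be computed, a minimal sketch is shown below (illustrative only, not the analysis code of this work); it assumes the two angle series have already been synchronized and resampled to a common time base.

```python
# Pearson correlation coefficient and RMSE between two angle time series,
# e.g., camera-based vs. IMU (or POF) estimates, both in degrees.
import numpy as np

def compare_angle_series(ang_a_deg, ang_b_deg):
    ang_a = np.asarray(ang_a_deg, dtype=float)
    ang_b = np.asarray(ang_b_deg, dtype=float)
    r = np.corrcoef(ang_a, ang_b)[0, 1]            # correlation coefficient
    rmse = np.sqrt(np.mean((ang_a - ang_b) ** 2))  # RMSE, in degrees
    return r, rmse
```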

#### *3.1. Sensors Characterization*

#### 3.1.1. IMUs and RGB-D Cameras

Figure 8 presents the results for the RGB-D camera system and the IMUs during the characterization phase, using a goniometer as a reference for the upper and lower bounds. The performed movement was divided into three cycles, and each cycle presents a correlation between the cameras and the IMUs higher than 0.98.

**Figure 8.** Camera-based system and IMU angles of the first test.

Table 2 shows the maximum and minimum angles for the camera system and IMUs, compared to the upper and lower bounds, respectively. The average error for the camera system, compared with the goniometer at the two reference values (90° and 20°), was 4.9°, with a maximum error of 9°, which is lower than the mean error of 14.6° reported in Tannous et al. [38]. However, in Tannous et al. [38] only one camera was used, leading to more self-occlusion and, consequently, to errors in the angle assessment. Since our system consists of two cameras, self-occlusion decreases, consequently reducing the errors. Compared to the goniometer, the IMUs' average error was 3.7°, which is lower than that of the camera-based system.


**Table 2.** Maximum and minimum angles of camera-based system and IMU of each cycle for the first test.

#### 3.1.2. POF

Figure 9 presents the normalized POF power attenuation as a function of the angle for the experimental setup shown in Figure 6. A high correlation with the measured angle was obtained, since the coefficient of determination (R²) between the POF power variation and the angle was 0.996.

**Figure 9.** Normalized POF power attenuation as a function of the measured angle.

Thereafter, the POF curvature sensor was positioned on the subject's elbow joint as previously described (see Section 2.5). The data of the elbow movement in each plane (sagittal, frontal, and transverse) were recorded and are presented in Figure 10 for subject M3. The initial movement of 90° in the sagittal plane was used to adjust the sensor response with respect to the differences arising from its positioning on the subject's elbow.

**Figure 10.** POF curvature sensor response for elbow movements in the sagittal, frontal, and transverse planes for subject M3.

The sensor response follows the pattern of the flexion/extension movements performed by the subject. Furthermore, the angles estimated by the POF curvature sensor are within the range of movement described in the literature for the elbow (0–145°) [39]. Thus, the results presented in Figure 10, in conjunction with the high correlation of 0.996 (obtained during the sensor characterization as a function of the angle measured by a potentiometer), show the feasibility of the POF sensor for angle assessment, which makes it a suitable option for comparison with the angles estimated by the camera-based system.
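For completeness, the sketch below illustrates how a characterization curve such as the one in Figure 9 could be fitted and then inverted to read an angle back from the normalized power attenuation. The linear model and function names are assumptions made only for illustration; the actual relation follows from the characterization procedure of Section 2.2.

```python
# Fit of the POF characterization curve (attenuation vs. angle) and its
# inversion to estimate the elbow angle from the normalized attenuation.
# A linear model is assumed here purely for illustration.
import numpy as np

def fit_pof_characterization(angle_deg, norm_attenuation):
    slope, intercept = np.polyfit(angle_deg, norm_attenuation, deg=1)
    return slope, intercept

def pof_angle(norm_attenuation, slope, intercept):
    return (np.asarray(norm_attenuation, dtype=float) - intercept) / slope
```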

#### *3.2. Comparison among the Sensors*

After the first evaluation of the sensors and the POF curvature sensor characterization, the sensors used in this work were compared using the experimental protocol described in Section 2.5. In this case, the camera-based system was compared with both the IMU and POF systems. The comparison was made with respect to the correlation coefficient and RMSE (as in the previous sections). Figure 11 shows the results obtained for all sensors in the different planes, i.e., sagittal, transverse, and frontal, for subject M1.

**Figure 11.** Comparison among camera-based system, POF curvature sensor and IMU in the sagittal, frontal, and transverse planes, for subject M1.

The results presented in Figure 11 show a good correlation between the errors of the POF curvature sensor and the IMU, especially in the sagittal and frontal planes. Although we used the same number of cycles to compare the sensors, the period of each movement is different, because each subject was allowed to perform the movements at a comfortable self-selected velocity.

Furthermore, the range of movement in each plane is different: the movement in the sagittal plane occurs in a range of about 0–145°, whereas the one in the transverse plane reaches angles lower than 130°. Similarly, the angles in the frontal plane can be as high as 145° (as in the sagittal plane). From the tests, the mean deviation between the POF curvature sensor and the IMUs was about 6.5% in the sagittal plane. However, this deviation increased to about 10% in the transverse and frontal planes. The reason for this increase can be related to the POF positioning during the tests, since positioning is a critical factor in the angle assessment using such technology. In addition, it can also be related to the increase of the IMU errors when the test was performed in planes other than the sagittal one, as reported in Vargas-Valencia et al. [33]. Regarding the camera-based system, the results in the sagittal plane show an overestimation of the angle when compared to the IMU and POF curvature sensor. In this case, the angles estimated by the markerless camera system had a maximum value of about 160°, which is higher than the elbow range of motion [39]. In contrast, the camera-based system underestimates the angles in the frontal plane when compared with the other two systems for angle assessment.

As mentioned above, the errors of the markerless camera system for angle assessment are related to issues such as frame errors, the exploitation of multiple image streams and, especially, self-occlusions. To further evaluate the errors obtained by the camera-based system, Table 3 presents the correlation coefficient and RMSE between the markerless system and the IMUs for each of the 11 participants in all three planes tested, whereas Table 4 presents the correlation and RMSE between the markerless system and the POF curvature sensor.


**Table 3.** RMSE and correlation coefficient for Cameras and IMU.

**Table 4.** RMSE and correlation coefficient for Cameras and POF.


Tables 3 and 4 show a correlation coefficient higher than 0.9 in all analyzed cases, which indicates a high correlation between the responses of the sensors. In addition, the standard deviation of the correlation coefficient was below 0.01 in all the analyzed cases. Thus, it is possible to verify not only a high correlation between the data of the camera-based system and the wearable ones, but also that the results provide promising evidence of the repeatability of such systems. The mean correlation coefficients between the camera-based system and the IMUs were 0.990, 0.984, and 0.979 in the sagittal, transverse, and frontal planes, respectively. It is noteworthy that higher correlations were obtained between the camera-based system and the IMUs than between the markerless system and the POF curvature sensor. The mean correlation coefficients for the latter comparison were 0.978, 0.964, and 0.975 for the sagittal, transverse, and frontal planes, respectively.

Even though the proposed camera-based system presented a high correlation with the wearable sensors in all scenarios, the errors of such a system are generally high. As can be observed in Figure 11, there are deviations in the angle estimation of the camera-based system when compared with the wearable sensors, where, considering all the performed tests, these errors can be as high as 15° in the worst case. In addition, the mean errors are about 10° when compared with the wearable sensors. It is noteworthy that these errors are lower than the ones reported in the literature [40], which is mainly due to the use of two cameras to reduce the errors related to occlusions. However, errors of about 10° are still too high when a reliable system for movement analysis is required. Nevertheless, the high correlations obtained in all tests for the comparison with the wearable sensors (see Tables 3 and 4) indicate that the proposed markerless camera-based system can be a feasible solution for angle estimation if a post-processing technique for the correction of the angular errors is applied, as also discussed in Schmitz et al. [40].

#### *3.3. Technique for Angle Correction in Markerless Camera-Based Systems*

The primary assumption of the proposed compensation technique for angle errors in markerless camera-based systems is that the errors mainly occur due to occlusions or errors in the computer vision algorithm used to track the anatomical points from which the parameters *d*1 and *d*2 in Figure 4 are calculated. If these parameters are incorrectly estimated, errors in the angle assessment will occur. Thus, it is possible to assume that these angular errors are correlated with the anthropometric measurements of each participant. To verify this assumption and develop the compensation technique for the markerless system, each participant performed three flexion/extension cycles only in the sagittal plane (see Section 2.5), and the angles estimated with the markerless camera-based system were compared with the ones measured by the POF curvature sensor. We used the POF curvature sensor for the development of the compensation technique, since it had already been evaluated with respect to the potentiometer, presenting low errors in that characterization. However, we must emphasize that other sensor systems can be used as a reference for the proposed compensation technique, including IMUs, marker-based camera systems, and goniometers. The technique proposed here is based on the premise that the errors are mainly related to the detection of the parameters *d*1, *d*2, and *d*3 (due to self-occlusions, numerical errors in the computer vision algorithm, among other reasons). Therefore, the errors can be correlated (and then compensated) by considering the actual values of the anthropometric measurements (*d*1 and *d*2) used in the angle estimation, which can be measured on each subject or estimated from the subject's height [41].

For the first characterization of the technique, the flexion/extension cycles of five subjects (M1, M3, M5, F2, and F5) were analyzed, and a polynomial regression between the angles estimated by the camera-based system and the POF curvature sensor was performed for each of the five subjects, where each equation has the form shown in Equation (7):

$$ang_{ref} = a \cdot ang_{cam}^{3} + b \cdot ang_{cam}^{2} + c \cdot ang_{cam} + d \tag{7}$$

where $ang_{cam}$ is the angle estimated by the camera-based system and $ang_{ref}$ is the angle measured by the POF curvature sensor. In addition, a, b, c, and d are polynomial regression coefficients obtained experimentally through a least-squares regression between the angular responses of both sensor systems, i.e., the markerless camera-based system and the POF sensor. The coefficient d is the offset of the sensor response (in °). Therefore, if the sensors' responses are normalized at the beginning of the test, the offset is null. For this reason, the coefficient d is not employed in the analysis of the correlation between the coefficients of the angular error correction and the anthropometric measurements of the subjects.
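As an illustration, the per-cycle regression of Equation (7) can be obtained by ordinary least squares, for instance with NumPy's `polyfit`. The sketch below is not the fitting code used in this study; it assumes both angle series are synchronized and normalized so that d ≈ 0.

```python
# Third-order least-squares fit of Equation (7) and its evaluation.
import numpy as np

def fit_correction(ang_cam_deg, ang_ref_deg):
    # polyfit returns the coefficients from the highest degree down: (a, b, c, d)
    a, b, c, d = np.polyfit(ang_cam_deg, ang_ref_deg, deg=3)
    return a, b, c, d

def apply_correction(ang_cam_deg, coeffs):
    # evaluates a*x^3 + b*x^2 + c*x + d for the camera-based angles
    return np.polyval(coeffs, ang_cam_deg)
```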

Figure 12 shows the regression between the angle measured by the camera-based system and the POF curvature sensor for the third flexion cycle (as an example) of subject F5. The results show a high correlation (0.998) between the responses using a third-order polynomial regression. In fact, such a high correlation occurs for all the cycles of the five subjects analyzed, with the correlation coefficient higher than 0.9 in all cases. Hence, the assumption of correlation between the errors of both sensor systems holds true (based on the analyses performed). The next step is then to correlate the polynomial coefficients (a, b and c) with the anthropometric measurements of each participant.

**Figure 12.** Polynomial regression between camera-based system and POF curvature sensor angular responses for the third flexion cycle of subject F5.

As discussed in Section 2.1, the parameters used in the angle estimation by the camera-based system are the anthropometric distances (*d*1 and *d*2), which are detected through computer vision algorithms. Thus, errors in the detection of the corresponding anatomical points will lead to errors in the angle estimation, and such errors can be related to those anthropometric distances. However, these parameters (*d*1 and *d*2) are intrinsic to each subject and can be easily measured. In addition, it is possible to use the height of each subject (as shown in Table 1) in conjunction with anthropometric data for males and females to correlate the arm length with the subject's height. Figure 13 shows the correlation of the polynomial regression coefficients (a, b and c) with the subjects' arm lengths (D).

**Figure 13.** Polynomial regression between the subjects' arm length (parameter D) and the coefficients of angular error correction.

The results, as well as the equations presented in Figure 13, indicate the feasibility of using the anthropometric measurements of each subject in equations for angular error correction in camera-based systems. The correlation coefficient is higher than 0.9 for all analyzed coefficients, indicating the possibility of using the proposed compensation technique for angle correction. Then, by substituting the equations shown in Figure 13 into Equation (7), it is possible to obtain a corrected angle, as depicted in Figure 14 for three flexion/extension cycles of subject F1. In addition, the uncompensated response, i.e., the response of the camera-based system without applying the equations for angle correction, is also presented for comparison purposes. The RMSE of the compensated response is also presented in order to verify the accuracy enhancement provided by the proposed technique. Compared to the uncompensated responses, for which the RMSE was 15.04°, 9.25°, and 10.23° for cycles 1, 2, and 3, respectively, the proposed angular error compensation was able to reduce the errors substantially in all three cycles. To further verify the performance of the proposed technique, the aforementioned compensation equations were applied to the responses in the sagittal plane for all subjects. The comparison between the RMSEs for the cases with and without the compensation technique is presented in Table 5 for each subject over the three flexion/extension cycles analyzed, where the mean and standard deviation of the three cycles are presented for each participant.

The results presented in Table 5 show the feasibility of the proposed technique, where the RMSE was reduced for all 11 subjects analyzed. The highest reduction occurred for subject F1, whose RMSE was reduced from 11.52° to 3.52° after applying the correction equations. The mean RMSE of the compensated responses is about 4.90°, whereas the uncompensated one is 10.42°, which means a two-fold reduction of the RMSE when the proposed compensation is applied. It is also worth mentioning that the lowest RMSE reduction occurred for subject M3, whose RMSE was reduced by 2.11°. However, one should note that the RMSE of the uncompensated response of this subject was already low (6.90°) when compared to those of the other subjects, and even when compared to the errors reported in the literature for similar systems [40].

**Figure 14.** Compensated and uncompensated angular responses of the camera-based system for three flexion/extension cycles of subject F1.


**Table 5.** Comparison between RMSEs of the compensated and uncompensated responses for each subject.

The proposed technique for angular error correction in markerless camera-based systems is a feasible and straightforward option to enhance the angular accuracy of such systems. There is a calibration step in which the response of the camera-based system has to be compared with that of a reference sensor system, e.g., a wearable or marker-based camera system. Then, the errors obtained with the markerless camera-based system are related to the subject's anthropometric parameters (arm length in this case) in order to obtain an equation that relates the angle correction with the parameters of each subject. Therefore, an important caveat should be mentioned: the calibration routine must be performed with respect to a reliable reference, and the movements should be performed in a single plane, i.e., sagittal, frontal, or transverse. In addition, the calibration has to be performed over the same range at which the angle analysis will be performed, i.e., if an angular interval of 0 to 160° will be analyzed, the calibration has to be made over this same angular range (0 to 160°). By following these steps, it is possible to obtain accurate single-plane angle measurements with a markerless camera-based system. Therefore, the main limitation of this approach is the necessity of a calibration stage, prior to the application of the proposed sensor system, in the same range and planes of movement envisaged in the proposed application. However, it is worth noting that the proposed approach can be extended to movement analysis of different degrees of freedom by adjusting the calibration stage accordingly and correlating the errors with the anthropometric parameters of each subject.
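To make the calibration-plus-compensation workflow described above concrete, a short sketch is given below. The per-coefficient regressions of Figure 13 are not reproduced here: they are represented by placeholder callables (a hypothetical `regressions` mapping) that must be replaced by the equations actually fitted during the calibration stage.

```python
# Sketch of the compensation flow only (not the authors' code). The regressions
# mapping stands in for the fitted curves of Figure 13, relating the arm length D
# to the coefficients a, b and c of Equation (7); d is taken as zero because the
# sensor responses are normalized at the start of the test.
import numpy as np

def compensate(ang_cam_deg, arm_length_m, regressions):
    a = regressions['a'](arm_length_m)
    b = regressions['b'](arm_length_m)
    c = regressions['c'](arm_length_m)
    ang_cam = np.asarray(ang_cam_deg, dtype=float)
    return a * ang_cam**3 + b * ang_cam**2 + c * ang_cam  # Equation (7) with d = 0

# Usage with hypothetical placeholder regressions (NOT the ones from Figure 13):
# regressions = {'a': lambda D: ..., 'b': lambda D: ..., 'c': lambda D: ...}
# corrected = compensate(camera_angles_deg, arm_length_m=0.33, regressions=regressions)
```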
