## **8. Results**

The measured coordinates of each target marker relative to the desired position are depicted in Figures 15 and 16. The blue markers represent the first variant, in which the user had no visual feedback; the red markers represent the second variant of each experiment. To quantify the precision of pointing in each experiment, the Euclidean distance between each measured point and the desired position was computed. Figures 17 and 18 show boxplots of the Euclidean distance data for each experimental scenario. The mean deviation in position and the overall deviation are listed in Tables 2 and 3.

Additionally, the dispersion of the measured points along each axis was calculated to evaluate the repeatability of pointing. The results are shown in Tables 4 and 5. All data are given in millimeters.
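The two metrics described above can be sketched as follows. This is a minimal illustration, not the authors' analysis code; the coordinate values and variable names are hypothetical placeholders for the measured data.

```python
import numpy as np

# Hypothetical measured pointing coordinates (mm) for one target marker,
# expressed relative to the desired position (illustrative values only).
measured = np.array([
    [12.0, -8.0,  3.0],
    [-5.0, 14.0, -2.0],
    [ 7.0, -3.0,  9.0],
])
desired = np.zeros(3)  # desired position is the origin in this frame

# Euclidean distance of each measured point from the desired position;
# the mean corresponds to the deviations reported in Tables 2 and 3.
distances = np.linalg.norm(measured - desired, axis=1)
mean_deviation = distances.mean()

# Per-axis dispersion (sample standard deviation), as in Tables 4 and 5.
dispersion = measured.std(axis=0, ddof=1)
```

The sample standard deviation (`ddof=1`) is used here as one common choice of dispersion measure; the paper does not specify which estimator was applied.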

**Figure 15.** The acquired position data for experiment no. 1.

**Figure 16.** The acquired position data for experiment no. 2.

**Figure 17.** Boxplots of the Euclidean distance data for experiment no. 1.

**Figure 18.** Boxplots of the Euclidean distance data for experiment no. 2.


**Table 2.** The mean Euclidean distance between the desired position and the measured data for each horizontally placed target marker.

**Table 3.** The mean Euclidean distance between the desired position and the measured data for each vertically placed target marker.


**Table 4.** The dispersion of measured points for each target in the first experimental scenario.


**Table 5.** The dispersion of measured points for each target in the second experimental scenario.


## **9. Discussion**

The data show that the error of pointing at a specific location, relying solely on the user's estimation, is approximately 50 mm. The consistency of the user's pointing is around 30–50 mm, depending on the user's distance to the target. The results also show a significant improvement in both accuracy and consistency when visual feedback is provided. These results outline the potential range of applications in terms of the size of manipulated objects, as mentioned previously for a real-life scenario. They also highlight the importance of feedback, which should be implemented in potential real-world applications. However, the data are not yet sufficient to draw general conclusions, and a further, more thorough analysis is needed.

During the experiments, several factors were identified that could significantly influence pointing accuracy. The first is the posture and position of the human body and the distortion caused by the user's point of view. The second is the tracking capability and precision of the LMC sensor: numerous glitches of the LMC were observed during the experiments, in which the controller misinterpreted the positions of joints and fingers, leading to inaccuracies in the measurements. Another factor influencing the results is the choice of joints used to define the half-line intersecting the working plane. The last identified factor is the influence of the command pistol gesture on pointing accuracy, as extending the thumb may shift the intended pointing location. Further evaluation of the method, considering all of the factors mentioned, must be carried out in the future to determine the full potential of the proposed gestural framework. In all, we believe that the experiments and the data served their purpose: the main factors influencing accuracy were identified, and the obtained data can serve as a basis for further experiments and evaluation of the method.
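To make the joint-selection factor concrete, the pointing direction can be modeled as a half-line through two tracked joints, intersected with the working plane. The sketch below assumes generic 3-D joint positions and a plane given by a point and a normal; the function name and signature are illustrative, not the paper's implementation.

```python
import numpy as np

def ray_plane_intersection(origin, through, plane_point, plane_normal):
    """Intersect the half-line from `origin` through `through` (e.g. two
    tracked hand joints) with a plane. Returns the intersection point, or
    None if the ray is parallel to the plane or points away from it."""
    direction = through - origin
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None  # half-line is (nearly) parallel to the plane
    t = np.dot(plane_normal, plane_point - origin) / denom
    if t < 0:
        return None  # plane lies behind the half-line's origin
    return origin + t * direction
```

Because the estimated target is the intersection point, a small angular error in the joint pair defining the half-line scales with the distance to the plane, which is consistent with the observation that dispersion grows with the user's distance to the target.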
