**10. Conclusions**

In this work, we proposed and implemented a framework for gestural HRI based on pointing gestures. The core of the method builds on the previous work of Tölgyessy et al. [6], particularly their concept of linear HRI, in which a mobile robotic platform is controlled via pointing gestures. This article extends the potential of that concept from mobile platforms to other fields of robotics and, by employing a different sensory system and a different type of robotic system, demonstrates the universal nature of linear HRI. The work contains a general overview as well as the mathematical background of the core method. A collaborative robotic workplace, referred to as COCOHRIP, was built around the proposed framework, and practical experiments in a laboratory environment were carried out to validate the described method. The obtained data provide the basis for further development of the whole gestural framework.
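The geometric core of linear HRI described above can be illustrated with a minimal sketch: a line defined by two tracked arm joints is intersected with the workspace plane to obtain the pointed-at target. The joint names, coordinate values, and tolerance below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def pointing_target(joint_a, joint_b, plane_point, plane_normal):
    """Intersect the line through two tracked joints with the workspace plane.

    joint_a, joint_b: 3-D positions of two arm joints (e.g., elbow and wrist)
    defining the pointing line; plane_point/plane_normal define the plane.
    Returns the intersection point, or None if the line is (near-)parallel.
    """
    a = np.asarray(joint_a, dtype=float)
    d = np.asarray(joint_b, dtype=float) - a          # pointing direction
    n = np.asarray(plane_normal, dtype=float)
    denom = n.dot(d)
    if abs(denom) < 1e-9:                             # line parallel to plane
        return None
    t = n.dot(np.asarray(plane_point, dtype=float) - a) / denom
    return a + t * d

# Elbow at (0, 0, 1), wrist at (0.2, 0, 0.8): pointing down toward a table
# plane z = 0; the intersection lands at (1, 0, 0).
print(pointing_target([0, 0, 1.0], [0.2, 0, 0.8], [0, 0, 0], [0, 0, 1.0]))
```

In practice, the two joint positions would come from the visual-based skeletal tracker, and the resulting intersection point would be transformed into the robot's base frame before commanding a motion.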

The main contribution of this work is a novel approach to controlling a collaborative robotic arm using pointing gestures. The majority of the cited works focus on teleoperation of the robotic arm, either through movement mirroring or by binding the robot's control to a specific gesture; the operator does not interact directly with the (shared) workspace, the robot is controlled directly by the user, and its movements are entirely subordinate to the operator's movements. In our approach, the operator interacts with the robot through the shared workspace rather than teleoperating it directly, so the robot acts as an assistant rather than a subordinate. This dynamic could facilitate more effective collaboration between humans and robotic systems. The method most comparable to ours is the gesture-based interface of Cipolla and Hollinghurst [22], who similarly use pointing gestures to direct the robot's movement, with hand tracking achieved using stereo monochromatic CCD cameras. However, their method suffers from shortcomings such as the need for a high-contrast background and specific lighting conditions, and its workspace is much smaller than ours. Our proposed concept offers far more robust gesture recognition and tracking performance and can additionally recognize multiple gestures. Other cited works in which the user interacts with a workspace require some form of wearable equipment. We believe the future of HRI lies in the use of multiple interconnected vision-based sensors, with the ultimate purpose of ridding humans of any additional restrictive equipment and allowing them to interact with robots as naturally and ergonomically as possible, without the need for advanced technical skills. Furthermore, the core mathematical method of the proposed framework is universally applicable, as it does not rely on specific hardware or software.

The practical utilization of the proposed framework was outlined in the form of a collaborative robotic assistant that could potentially be used in manufacturing, academia, or research and development. This paper develops the concept further, widening its usability from mobile robotic applications to robotic arm manipulation while utilizing a new state-of-the-art vision-based sensory system. Our framework, along with the built COCOHRIP workplace, could serve as a basis for more complex applications, such as naturally controlled object picking and manipulation, robot-assisted assembly, or material processing. Our future research will focus on improving the system's accuracy based on the identified factors, and on designing and implementing more complex algorithms and methods to further increase the usability and flexibility of the proposed gestural framework.

**Author Contributions:** Conceptualization, M.T. and M.Č.; methodology, M.T.; software, M.Č.; validation, M.Č.; formal analysis, M.T. and M.Č.; investigation, M.T. and M.Č.; resources, M.T. and M.Č.; data curation, M.Č.; writing—original draft preparation, M.Č.; writing—review and editing, M.T. and M.Č.; visualization, M.Č.; supervision, M.T.; project administration, M.T.; funding acquisition, P.H. All authors have read and agreed to the published version of the manuscript.

**Funding:** This work was funded by APVV-17-0214.

**Data Availability Statement:** All data presented in this paper are available at: https://drive.google.com/drive/folders/1V4cUwvD7barr6BBPcmY9KFzPz3vc1Ee8?usp=sharing.

**Conflicts of Interest:** The authors declare no conflict of interest.
