#### *2.2. Description of the Controllers*

The control architecture developed in [31] for each robotic manipulator of the application is as follows.

The control of the WR is given by 4 prioritized tasks:


The control of the STR is given by 3 prioritized tasks:


#### *2.3. Description of the Conventional PC-Based Interface*

The authors in [31] proposed a conventional PC-based interface, which displays a 3D virtual environment on a screen and is composed of the following visual elements (see Figure 2):


Note that the user commands both robots by means of the gamepad.

**Figure 2.** Conventional PC-based user interface: visual references and effects. (**a**) Video: 0 m 20 s. (**b**) Video: 0 m 23 s.

Figure 3 shows several frames of the performance of the described application, focusing on the interface; see the video at https://media.upv.es/player/?id=15ffabe0-a733-11eb-a0b0-2fbcb59aaef7 (accessed on 26 April 2022) [75].

**Figure 3.** Frames of the video showing the functionalities of the conventional user interface. See the video at https://media.upv.es/player/?id=15ffabe0-a733-11eb-a0b0-2fbcb59aaef7 (accessed on 26 April 2022) [75]. (**a**) Video: 0 m 20 s. (**b**) Video: 1 m 00 s. (**c**) Video: 1 m 24 s. (**d**) Video: 1 m 33 s. (**e**) Video: 1 m 43 s.

#### *2.4. Discussion of Human–Robot Interaction Using Conventional Interfaces*

The conventional PC-based user interface presents several problems that directly affect the task performance. Next, the three most relevant problems, identified from questions asked of several users who tested the application described above, are discussed.

The first significant problem reported by the users is that their interaction with the virtual environment was not natural. In particular, the robotic system is teleoperated in 3D space and, hence, the user needs to change the screen view to properly track the task. To do this, the user has to stop the robot teleoperation and adjust the interface, which increases the total time needed to complete the task.

The second significant problem reported by the users is that it was difficult for them to keep the real system in view at all times. In this sense, Figure 3e shows the user looking at the real system instead of the interface while performing the task. The reason given by several users, who exhibited the same behavior, was that they needed to see what the real system was doing because they did not know whether the task was being performed correctly or not. This means that this type of interface does not properly help the user to conduct the real task.

The third significant problem reported by several users is that it was difficult for them to move the references in the virtual 3D space, which wasted considerable time before the robotic task could be resumed.

All these issues show the difficulties of using conventional interfaces and make evident the need to develop new interfaces that allow a more intuitive user interaction, especially when working with complex systems such as the bimanual robotic system considered in this work.

#### **3. Proposed Augmented Reality-Based User Interface**

In order to overcome the aforementioned problems of the conventional PC-based interface, this work proposes the use of AR technology to improve user ergonomics and task performance. In particular, the conventional PC-based interface used in the previous setup (see Figure 1a) is replaced by an AR headset in the new setup (see Figure 4), allowing the user to see the relevant information in the form of holograms while still seeing the real elements involved in the task (robots, workpiece, tool, etc.) at all times. Note that the remaining elements of the new setup (see Figure 4) are the same as in the previous setup (see Figure 1a): an STR with an F/T sensor and a cylinder-shaped tool with a piece of cloth; a WR with a flat methacrylate workpiece attached to the end-effector using a self-made piece; and a gamepad to command both robots.

**Figure 4.** New setup used for the real experimentation.

Figure 5 shows the methodology considered in this work to develop and validate the proposed AR-based interface. Although this methodology is used below to design the AR interface for the specific bimanual robot teleoperation task at hand, it is generic and can be applied to design AR interfaces for other types of applications.

**Figure 5.** Flowchart of the methodology proposed in this work for designing the AR-based interface.

Firstly, the requirements of the application were established based on the opinions of several users who had previously tested the conventional PC-based interface. These requirements are summarized in Table 1.

**Table 1.** Application requirements.

| Requirement |
| --- |
| The user should have the option to see the full boundaries when required |
| The part of the boundary activated should be indicated (e.g., visually, by sound, etc.) |
| The STR tool reference direction should be indicated |
| The WR rotation reference direction should be indicated |
| The new interface should use an interaction device similar to that of the previous PC-based interface (i.e., gamepad, joystick, or similar) |
| Alarm sounds should be used to indicate boundary activation |
| The user should have the option to remove all holograms |
| Holograms should not disturb the user's visibility during the task |
| The user should have the option to configure, activate, and deactivate the alarm sounds |

A mockup design was developed taking this information into account. From a functionality perspective, the designed AR-based interface has two kinds of virtual objects: firstly, those representing the STR and WR reference indicators; and, secondly, those conveying the boundary information. Several tools and strategies related to mockup design were used to develop both kinds of virtual objects. These preliminary designs were validated by some users before their implementation.

Once the preliminary design was finished, the following step was to study the most suitable AR headset for the application at hand. Several considerations were taken into account, especially the following: first, the capability of the device to be used in industrial environments; second, the stability of the holograms, which is important when working in this kind of application; third, the computational power of the device; fourth, the sound capabilities; and fifth, the communication capabilities (i.e., Bluetooth and WiFi). Note that most AR headsets on the market fulfill the aforementioned requirements. However, among all of them, the Microsoft *HoloLens glasses* [76] were chosen because the second generation of this device offers several services that could be added to the final version of the interface according to the company needs [77].

Once the AR headset was selected, the interface was developed. Using a PC workstation, the proposed virtual objects were created and assembled in a virtual space using Blender 2.7 [78] and *Unity* [79], respectively. This was an iterative design process, where the main characteristics of the virtual objects (e.g., size, color, shape, etc.) and their interactions were verified and modified, connecting the workstation to the AR headset in remote mode from the Unity editor (note that the perception of the holograms differs between displaying them on a PC screen and projecting them onto the real world through the AR headset), until the result was satisfactory.

Figure 6 shows the holograms designed for the robot references. In the case of the WR, the user can command the robot through the 3D workspace and modify the end-effector orientation. For this reason, two different holograms were designed. The translation reference hologram was modeled as an orange 3D cube; see Figure 6a. This hologram appears when the user teleoperates the WR translation reference. To reduce the number of holograms present at any moment, this hologram disappears 3 s after the user has stopped moving the WR translation reference. The orientation reference hologram was modeled as an animated yellow circle with arrows; see Figure 6b. This hologram appears when the user teleoperates the WR rotation reference and disappears 3 s after the user has stopped moving it. It should be noted that, in both cases, the movement of the references is relative to the position of the user, i.e., the AR headset, making their use more intuitive and natural. The STR translation reference was modeled as a yellow arrow attached to a green sphere; see Figure 6c. Note that this hologram is constrained to the plane of the workpiece surface, allowing a 2D movement. This hologram disappears 3 s after the user has stopped moving the STR translation reference.
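As a minimal sketch of the auto-hide behavior described above (the class and method names are hypothetical and are not taken from the actual Unity implementation, which uses C# scripts), the 3 s timeout can be expressed as follows:

```python
import time

class ReferenceHologram:
    """Sketch of the hologram auto-hide logic: the hologram is shown
    while the user moves the reference and is hidden HIDE_DELAY seconds
    after the last movement."""

    HIDE_DELAY = 3.0  # seconds, as stated in the text

    def __init__(self):
        self.visible = False
        self._last_moved = None

    def on_reference_moved(self):
        # Called whenever the user teleoperates the reference.
        self.visible = True
        self._last_moved = time.monotonic()

    def update(self):
        # Called once per rendering frame.
        if self.visible and self._last_moved is not None:
            if time.monotonic() - self._last_moved > self.HIDE_DELAY:
                self.visible = False
```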

Figure 7 shows the holograms designed for the 2D and 3D boundaries.

The 3D boundary is modeled by a superellipsoid—see Figure 7a—which is defined as:

$$\left|\frac{x}{W}\right|^m + \left|\frac{y}{H}\right|^m + \left|\frac{z}{M}\right|^m = 1,\tag{1}$$

where {*W*, *H*, *M*} are the superellipsoid axes and *m* represents the smoothing parameter of the superellipsoid, i.e., it is equivalent to an ellipsoid for *m* = 2, whereas it tends to a cuboid as *m* tends to infinity. For the bimanual robot application at hand, *m* = 4 was chosen.
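For illustration, Equation (1) can be evaluated directly to test whether a commanded point lies inside the 3D boundary; the following sketch uses illustrative axis sizes that are not taken from the paper:

```python
def superellipsoid_value(point, W, H, M, m=4.0):
    """Left-hand side of Equation (1): < 1 inside the 3D boundary,
    1 on its surface, > 1 outside."""
    x, y, z = point
    return abs(x / W) ** m + abs(y / H) ** m + abs(z / M) ** m

# A point well inside a hypothetical 0.8 x 0.6 x 0.4 m boundary.
print(superellipsoid_value((0.1, 0.05, 0.02), W=0.4, H=0.3, M=0.2))  # < 1
```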

The 2D boundary is modeled by a modified superellipse—see Figure 7c—which is defined as:

$$\left|\frac{x}{W}\right|^m + \left(\frac{\max(|y| - (H - W), 0)}{W}\right)^m = 1,\tag{2}$$

where it is implicitly assumed that the value of axis *H* is greater than that of axis *W* (the expression is easily modified for the analogous case *H* < *W*). This equation represents a rectangle with smooth corners, with 2*H* for its long side and 2*W* for its short side, by joining a 2*W* × 2(*H* − *W*) rectangle to two offset halves of an even-sided 2*W* × 2*W* superellipse.
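Analogously to Equation (1), Equation (2) can be evaluated to test whether a point on the workpiece surface lies inside the 2D boundary; a minimal sketch, assuming *H* > *W* as in the text:

```python
def superellipse_value(x, y, W, H, m=4.0):
    """Left-hand side of Equation (2), assuming H > W: the max(.) term
    yields straight sides for |y| <= H - W and superelliptic caps
    beyond, i.e., a 2W x 2H rectangle with smooth corners."""
    assert H > W, "Equation (2) as written assumes H > W"
    return abs(x / W) ** m + (max(abs(y) - (H - W), 0.0) / W) ** m
```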


**Figure 6.** Proposed holograms for the robot references. (**a**) WR: translation reference hologram. (**b**) WR: rotation reference hologram. (**c**) STR: translation reference hologram.

Note that if the proposed boundary holograms were permanently shown, they could occlude some real elements from the user's view, affecting the task performance. For this reason, a new material shader [80] was designed; see Figure 8. This shader computes the minimum distance between the robot end-effector and the 3D boundary, in the case of the WR, or between the closest point of the robot tool and the 2D boundary, in the case of the STR. Thus, the shader only displays the affected part of the boundary hologram. That is, as the WR end-effector and/or the STR tool approach the 3D and 2D boundaries, respectively, the affected part of the boundary hologram is progressively displayed; see Figure 7b,d.
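A per-fragment sketch of this visibility rule is given below; it assumes the hologram fades in with the distance between each surface point and the tracked robot point, with `reveal_radius` as an illustrative tuning parameter (the actual shader implementation is not detailed here):

```python
import numpy as np

def fragment_alpha(p_fragment, p_robot, reveal_radius=0.15):
    """Opacity of a point p_fragment on the boundary hologram given the
    tracked robot point p_robot (WR end-effector or STR tool): fully
    transparent beyond reveal_radius, progressively opaque as the robot
    point approaches."""
    d = np.linalg.norm(np.asarray(p_fragment) - np.asarray(p_robot))
    return float(np.clip(1.0 - d / reveal_radius, 0.0, 1.0))
```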

In addition to this, and according to the user requirements, two warning sounds were included in the interface: the first one to indicate that the STR tool is close to the 2D boundary, and the second one to indicate that the WR end-effector is close to the 3D boundary. Moreover, the user is able to deactivate these warning sounds at any time.
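The trigger logic for these sounds is not detailed here; a plausible sketch, using the boundary functions of Equations (1) and (2) with hysteresis thresholds (both illustrative assumptions) to avoid the alarm flickering on and off, is:

```python
class BoundaryAlarm:
    """Sketch of the warning-sound logic: the alarm starts when the
    boundary function exceeds ON_LEVEL and stops once it drops below
    OFF_LEVEL, unless the user has deactivated the sounds."""

    ON_LEVEL = 0.9   # illustrative proximity thresholds on the
    OFF_LEVEL = 0.8  # boundary functions of Equations (1) and (2)

    def __init__(self):
        self.enabled = True   # the user can deactivate the sounds
        self.playing = False

    def update(self, boundary_value):
        if not self.enabled:
            self.playing = False
        elif not self.playing and boundary_value >= self.ON_LEVEL:
            self.playing = True   # start the alarm sound
        elif self.playing and boundary_value < self.OFF_LEVEL:
            self.playing = False  # stop the alarm sound
        return self.playing
```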

Once the main holograms and sound elements were implemented, the communication protocols were programmed. Bluetooth communication between the Microsoft *HoloLens glasses* and the gamepad was established to allow the user to provide commands to the interface. Moreover, in order to avoid undesired interactions with the interface, voice and gesture commands were deactivated by default. In addition, the AR interface and the robot controller communicate via WiFi using TCP/UDP at 10 Hz.
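As an illustration of this link, a minimal 10 Hz UDP sender is sketched below; the address, port, and message fields are assumptions, since the actual message format of the robot controller is not described here:

```python
import json
import socket
import time

ROBOT_ADDR = ("192.168.1.10", 5005)  # hypothetical controller endpoint
RATE_HZ = 10.0                       # update rate stated in the text

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_references(wr_pose, str_position):
    """Send the current WR pose and STR position references
    (JSON-serializable lists) to the robot controller."""
    msg = json.dumps({"wr_pose": wr_pose, "str_pos": str_position})
    sock.sendto(msg.encode(), ROBOT_ADDR)

def teleoperation_loop(read_gamepad):
    """Pace the reference updates at RATE_HZ."""
    period = 1.0 / RATE_HZ
    while True:
        wr_pose, str_position = read_gamepad()
        send_references(wr_pose, str_position)
        time.sleep(period)  # simple pacing; a real loop would compensate drift
```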

**Figure 8.** Material shader designed for controlling the visibility of the 3D and 2D boundaries depending on the proximity of the WR end-effector and STR tool, respectively.

#### **4. Results**

This section presents four experiments to show: the main functionalities of the developed AR-based interface; the performance of the 2D boundary and the STR reference hologram; the performance of the 3D boundary and the WR reference hologram; and the performance of the overall system when the user simultaneously commands both robots using the proposed AR-based interface.

Figure 9 depicts several frames of the *first experiment*, which shows the main functionalities of the AR interface implemented in the Microsoft *HoloLens glasses*; see the video at https://media.upv.es/player/?id=a64014f0-8a5a-11ec-ac0a-b3aa330d3dad (accessed on 26 April 2022) [81]. Figure 9a shows the full 3D boundary hologram, whilst Figure 9b shows the full 2D boundary hologram. Note that both holograms are hidden by default. Figure 9c shows the WR end-effector translation reference hologram, whilst Figure 9e,f show the WR end-effector rotation reference hologram. Note that, in the case of the rotation, the yellow circle indicates the commanded rotation (in roll, pitch, and yaw, or a combination of them), while the animated arrows indicate the direction of the commanded angle. Figure 9d shows the STR reference hologram.

**Figure 9.** First experiment: frames of the video showing the functionalities of the proposed AR-based interface. See the video at https://media.upv.es/player/?id=a64014f0-8a5a-11ec-ac0a-b3aa330d3dad (accessed on 26 April 2022) [81]. (**a**) Video: 0 m 20 s. (**b**) Video: 0 m 23 s. (**c**) Video: 0 m 30 s. (**d**) Video: 0 m 36 s. (**e**) Video: 0 m 43 s. (**f**) Video: 0 m 57 s.

Figure 10 depicts several frames of the *second experiment*, which shows the performance of the 2D boundary and the STR reference hologram; see the video at https://media.upv.es/player/?id=9504e6f0-8a61-11ec-b7c7-7d27dda7c5d5 (accessed on 26 April 2022) [82]. Figure 10a shows how the user commands the STR tool towards one side of the workpiece and, when the tool approaches the 2D boundary, the boundary region closest to the STR tool is shown in red and the warning sound is activated; see Figure 10b,c. Note that, when the user reference exceeds the 2D boundary, the tool is automatically kept within the allowed region. This aspect can be further analyzed in Figure 11, which shows the allowed region on the workpiece surface, the trajectory followed by the user reference, and the trajectory followed by the STR tool. Figure 10d shows how the 2D boundary hologram automatically disappears when the STR tool moves away from the 2D boundary.

**Figure 10.** Second experiment: frames of the video showing the performance of the 2D boundary and the STR reference hologram. See the video at https://media.upv.es/player/?id=9504e6f0-8a61-11ec-b7c7-7d27dda7c5d5 (accessed on 26 April 2022) [82].

**Figure 11.** The 2D trajectory performance for the second experiment, showing the 2D boundary and the STR reference hologram (see the video at https://media.upv.es/player/?id=9504e6f0-8a61-11ec-b7c7-7d27dda7c5d5 (accessed on 26 April 2022) [82]): 2D allowed workpiece region in green; trajectory followed by the user reference in thin red line; and trajectory followed by the STR tool in thick blue line.
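The clamping of the user reference to the allowed region, mentioned above, is enforced by the prioritized-task controller of [31]; purely as an illustration of the geometry, the following sketch projects an out-of-bounds 2D reference radially back onto the boundary of Equation (2):

```python
def superellipse_value(x, y, W, H, m=4.0):
    """Left-hand side of Equation (2), assuming H > W."""
    return abs(x / W) ** m + (max(abs(y) - (H - W), 0.0) / W) ** m

def clamp_to_boundary(x, y, W, H, m=4.0, tol=1e-6):
    """Scale the reference towards the origin until it lies within the
    2D boundary. Bisection works here because the allowed region is
    convex and contains the origin."""
    if superellipse_value(x, y, W, H, m) <= 1.0:
        return x, y                    # already inside the allowed region
    lo, hi = 0.0, 1.0                  # admissible / inadmissible scale factors
    while hi - lo > tol:
        s = 0.5 * (lo + hi)
        if superellipse_value(s * x, s * y, W, H, m) <= 1.0:
            lo = s
        else:
            hi = s
    return lo * x, lo * y
```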

Figure 12 shows the position followed by the STR tool on the workpiece surface, resulting from the STR teleoperation, together with the reference values provided by the user. In particular, it can be appreciated that the trajectory described by the STR tool corresponds closely to the user reference values, except obviously when the 2D boundary constraint is active; see the bottom graph in Figure 12. In fact, the maximum deviation of the actual STR position values from the user reference values, when the 2D boundary constraint was not active, was around 3.2 cm, with a standard deviation of around 0.8 cm; see Table 2. Note that these teleoperation error values include all the potential sources of error: communication delays, high-level and low-level robot control, the accuracy of the workpiece location, the teleoperation system, etc. Therefore, it can be concluded that the accuracy of the proposed AR-based teleoperation of the STR is sufficient for the task at hand.
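The reported deviation statistics can be reproduced from logged trajectories as sketched below; the array names and shapes (N samples of 2D positions, plus a boolean constraint-activation signal) are assumptions about the logging format:

```python
import numpy as np

def teleoperation_errors(reference, actual, constraint_active):
    """Maximum and standard deviation of the Euclidean error between
    user reference and actual tool positions, restricted to samples
    where the boundary constraint was not active."""
    reference = np.asarray(reference)          # shape (N, 2)
    actual = np.asarray(actual)                # shape (N, 2)
    free = ~np.asarray(constraint_active, dtype=bool)
    err = np.linalg.norm(reference[free] - actual[free], axis=1)
    return err.max(), err.std()
```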

**Figure 12.** Performance of the STR position teleoperation for the second experiment. First two graphs: user position references in thin red line, actual position values of the STR tool on the workpiece surface (coordinates relative to the surface) in thick blue line, and position limits given by the 2D boundary constraint in dashed lines. Bottom graph: activation of the 2D boundary constraint for the position of the STR tool on the workpiece surface.


**Table 2.** Teleoperation errors for the 2D position **p**<sup>s</sup> of the STR tool on the workpiece surface.

Figure 13 shows several frames of the *third experiment*, which shows the performance of the 3D boundary and the WR reference hologram; see the video at https://media.upv.es/player/?id=17d88200-8f0b-11ec-be22-d786eca82090 (accessed on 26 April 2022) [83]. Figure 13a shows how the user commands the WR and, when the WR end-effector approaches the 3D boundary, the boundary region closest to the WR end-effector is shown in blue and the warning sound is activated; see Figure 13b–d. Note that, when the user reference exceeds the 3D boundary, the WR end-effector is automatically kept within the allowed region. This aspect can be further analyzed in Figure 14, which shows the allowed 3D region, the trajectory followed by the user reference, and the trajectory followed by the WR end-effector.

**Figure 13.** Third experiment: frames of the video showing the performance of the 3D boundary and the WR reference hologram. See the video at https://media.upv.es/player/?id=17d88200-8f0b-11ec-be22-d786eca82090 (accessed on 26 April 2022) [83]. (**a**) Video: 0 m 22 s. (**b**) Video: 0 m 24 s. (**c**) Video: 1 m 04 s. (**d**) Video: 1 m 46 s.

**Figure 14.** The 3D trajectory performance for the third experiment, showing the 3D boundary and the WR reference hologram (see the video at https://media.upv.es/player/?id=17d88200-8f0b-11ec-be22-d786eca82090 (accessed on 26 April 2022) [83]): 3D allowed region in green; trajectory followed by the user reference in thin red line; and trajectory followed by the WR end-effector in thick blue line.

Figures 15 and 16 show the position and orientation, respectively, followed by the workpiece as a result of the WR teleoperation, together with the reference values provided by the user. In particular, it can be appreciated that the trajectory described by the workpiece corresponds closely to the user reference values, except obviously when the 3D boundary constraint is active; see the bottom graph in Figure 15. In fact, the maximum deviation of the actual workpiece position values from the user reference values, when the 3D boundary constraint was not active, was around 1.2 cm, with a standard deviation of around 0.4 cm; see Table 3. Moreover, the maximum deviation of the actual workpiece orientation values from the user reference values was around 1.7°, with a standard deviation of around 0.3°; see Table 3. Note that these teleoperation error values include all the potential sources of error: communication delays, high-level and low-level robot control, the teleoperation system, etc. Therefore, it can be concluded that the accuracy of the proposed AR-based teleoperation of the WR is sufficient for the task at hand.

**Figure 15.** Performance of the WR position teleoperation for the third experiment. First three graphs: user position references in thin red line, actual position values of the workpiece in thick blue line, and position limits given by the 3D boundary constraint in dashed lines. Bottom graph: activation of the 3D boundary constraint for the workpiece position.

**Table 3.** Teleoperation errors for the pose **p**<sup>w</sup> (i.e., position and orientation) of the WR.


Figure 17 depicts several frames of the *fourth experiment*, which shows the performance of the overall system when the user simultaneously commands both robots using the proposed AR-based interface; see the video at https://media.upv.es/player/?id=29330720-8a8b-11ec-97cd-ab744f931636 (accessed on 26 April 2022) [84]. Figure 17a–d show how the user modifies the orientation of the WR while, at the same time, commanding the STR tool towards one side of the workpiece. Note that, in this situation, when the WR end-effector is close to one side of the 3D boundary, the boundary is partially shown by the corresponding blue hologram. Furthermore, Figure 17e shows how the user simultaneously commands both robots to reach the 2D and 3D boundaries, which are partially shown by the red and blue holograms, respectively. It is worth noting that, in addition to the mentioned holograms, the user hears the different warning sounds. Figure 17f,g show how the user again modifies the orientation of the WR while, at the same time, commanding the STR tool towards the other side of the workpiece. Finally, Figure 17h shows how the STR tool reaches the 2D boundary while the user is also commanding the WR end-effector.

**Figure 16.** Performance of the WR angle teleoperation for the third experiment: user angular references in thin red line and actual angular values of the workpiece in thick blue line.


**Figure 17.** Fourth experiment: frames of the video showing the simultaneous teleoperation of both robots with the proposed AR-based interface. See the video at https://media.upv.es/player/?id=29330720-8a8b-11ec-97cd-ab744f931636 (accessed on 26 April 2022) [84]. (**a**) Video: 1 m 19 s. (**b**) Video: 1 m 44 s. (**c**) Video: 1 m 54 s. (**d**) Video: 1 m 55 s. (**e**) Video: 2 m 13 s. (**f**) Video: 2 m 42 s. (**g**) Video: 2 m 52 s. (**h**) Video: 3 m 05 s.

For the fourth experiment, Figure 18 shows the complete 2D trajectories followed by the user STR reference and the STR tool, whilst Figure 19 shows the complete 3D trajectories followed by the user WR reference and the WR end-effector. In both cases, as in the second and third experiments, the STR tool and the WR end-effector are automatically kept within the allowed regions despite the fact that, at some point, the user references exceed the 2D and 3D boundaries, respectively.

**Figure 18.** The 2D trajectory performance for the fourth experiment, showing the simultaneous teleoperation of both robots (see the video at https://media.upv.es/player/?id=29330720-8a8b-11ec-97cd-ab744f931636 (accessed on 26 April 2022) [84]): 2D allowed workpiece region in green; trajectory followed by the user reference in thin red line; and trajectory followed by the STR tool in thick blue line.

**Figure 19.** The 3D trajectory performance for the fourth experiment, showing the simultaneous teleoperation of both robots (see the video at https://media.upv.es/player/?id=29330720-8a8b-11ec-97cd-ab744f931636 (accessed on 26 April 2022) [84]): 3D allowed region in green; trajectory followed by the user reference in thin red line; and trajectory followed by the WR end-effector in thick blue line.

The teleoperation errors for the fourth experiment, in which the user simultaneously commands both robots using the proposed AR-based interface, are similar to those shown above for the second experiment (STR teleoperation) and third experiment (WR teleoperation): approximately 0.8 cm standard deviation for the position of the STR tool (see Table 2) and approximately 0.4 cm and 0.3° standard deviation for the WR position and orientation, respectively (see Table 3). As mentioned above, these teleoperation error values include all the potential sources of error: communication delays, high-level and low-level control of both robots, the teleoperation system, etc. Therefore, it is concluded that the accuracy achieved by the proposed AR-based approach for teleoperating the bimanual robot system is satisfactory.

#### **5. Conclusions**

A solution to improve assisted bimanual robot teleoperation has been developed in this work using augmented reality (AR) technology and tools. In particular, a new AR interface using the Microsoft *HoloLens glasses* has been proposed to mitigate the problems, in terms of user ergonomics and task performance (i.e., completion time and finishing quality), arising from the use of conventional PC-based user interfaces. In addition, this work has proposed and followed a new methodology to design and develop AR interfaces for bimanual robotic systems.

The effectiveness and applicability of the proposed AR interface were shown by means of real experimentation with an advanced bimanual robot application consisting of two robotic arms: a 7R cobot and a 6R industrial manipulator.

It is worth noting that several users, who tested both the conventional PC-based interface and the proposed AR interface, found the latter more intuitive and were able to conduct the robot teleoperation task faster. Note that when the users teleoperated the bimanual robot system using the conventional PC-based interface, most of them complained about the difficulty of checking whether the robots were performing the task correctly or not. In addition, the users indicated that, with the conventional PC-based interface, it was not easy for them to command both robots simultaneously because they could not pay attention to so many reference signals shown at once. These facts negatively affected the performance of the users in terms of the time required to complete the task. Thus, the mentioned issues were mitigated with the proposed AR interface, significantly improving the user performance in the teleoperation task.

Another relevant remark is that the users also indicated that the warning sounds helped them in the early stages of the teleoperation task but, as the time of use of the interface increased, these sounds became annoying and they preferred only the visual warnings.

**Author Contributions:** Conceptualization, J.E.S. and L.G.; Funding acquisition, J.T.; Investigation, J.E.S., A.M. and L.G.; Methodology, J.E.S. and L.G.; Resources, J.T.; Software, A.G. and A.M.; Supervision, L.G. and J.T.; Writing—original draft, J.E.S.; Writing—review & editing, L.G. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research was funded by the Spanish Government (Grant PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033) and by the Generalitat Valenciana (Grants GV/2021/181 and ACIF/2019/007).

**Conflicts of Interest:** The authors declare no conflict of interest.
