Article

Augmented Reality for Supporting Workers in Human–Robot Collaboration

Ana Moya, Leire Bastida, Pablo Aguirrezabal, Matteo Pantano and Patricia Abril-Jiménez
1 TECNALIA, Basque Research and Technology Alliance (BRTA), 48160 Derio, Spain
2 TECNALIA, Basque Research and Technology Alliance (BRTA), 01510 Miñano, Spain
3 Technology Department, Siemens Aktiengesellschaft, 81739 Munich, Germany
4 Life Supporting Technologies—LifeSTech, Universidad Politécnica de Madrid, 28030 Madrid, Spain
* Authors to whom correspondence should be addressed.
Multimodal Technol. Interact. 2023, 7(4), 40; https://doi.org/10.3390/mti7040040
Submission received: 20 March 2023 / Revised: 31 March 2023 / Accepted: 4 April 2023 / Published: 10 April 2023

Abstract

This paper discusses the potential benefits of using augmented reality (AR) technology to enhance human–robot collaborative industrial processes. The authors describe a real-world use case at Siemens premises in which an AR-based authoring tool is used to reduce cognitive load, assist human workers in training robots, and support calibration and inspection tasks during assembly tasks. The study highlights the potential of AR as a solution for optimizing human–robot collaboration and improving productivity. The article describes the methodology used to deploy and evaluate the ARContent tool, which demonstrated improved usability, reduced task load, and increased efficiency in the assembly process. However, the study is limited by the restricted availability of workers and their knowledge of assembly tasks with robots. The authors suggest that future work should focus on testing the ARContent tool with a larger user pool and improving the authoring tool based on the shortcomings identified during the study. Overall, this work shows the potential for AR technology to revolutionize industrial processes and improve collaboration between humans and robots.

1. Introduction

The fifth industrial revolution is already a reality: it puts workers at the center of production while using new technologies to promote prosperity, job creation, and environmental sustainability [1]. This revolution leverages workers’ skills and knowledge to collaborate effectively with machines and robots [2,3]. Additionally, it introduces flexibility into production processes to reduce environmental impact.
Human–robot collaboration (HRC) is a key enabler of Industry 5.0, as it allows humans and robots to work together seamlessly and productively. The idea is to combine the unique strengths of humans, such as creativity, problem-solving, and adaptability, with the precision, speed, and accuracy of robots [4]. HRC has the potential to revolutionize manufacturing and other industries by improving productivity, safety, and efficiency while also reducing production costs. In general, HRC will play a critical role in creating new job opportunities, promoting innovation, and achieving sustainable growth in Industry 5.0 [5]. Collaborative robots (cobots) are versatile and flexible tools that can be used for small-batch production. This is particularly advantageous for small and medium-sized enterprises (SMEs), as it reduces the time and effort needed to adapt to new processes. Instead of deskilling the workforce, as occurred in Europe’s previous automation practices [6], Industry 5.0 emphasizes the need for highly skilled human–machine teams to improve productivity and ergonomics in carrying out versatile, customer-specific tasks [1].
In this context, technology plays a crucial role in supporting workers and enabling them to take an active part in the digital transformation [4]. Manufacturing tasks are crucial elements in any industrial revolution, as they can impact the cost, time, and quality of a product. Assembly tasks, in particular, can be complex and require precise adjustments for optimal results. Augmented reality (AR) and human–robot collaboration are two technologies that can help the workforce perform more effectively and productively. AR technology can provide workers with real-time information to help them in their daily tasks, including operational activities, maintenance, and the control of industrial machinery and systems [7]. This technology allows workers to interact with their physical environment more efficiently and effectively. By integrating workers into the system, providing them with real-time information about procedures and processes, and allowing them to make decisions that improve efficiency and productivity, time and cost can be reduced while increasing worker engagement with technology. As a result, greater progress can be achieved with less effort [7].
AR combined with HRC can enable workers to leverage their strengths, while robots handle repetitive or physically demanding tasks. By working together, humans and robots can achieve better results than either could alone. Furthermore, workers who are trained and skilled can work more efficiently and effectively, especially when collaborating with robots. To ensure smooth collaboration, non-invasive devices (such as AR glasses) can support workers by allowing them to focus on the task at hand and integrating the cobots’ tasks into the process flow [8]. However, there are challenges associated with the use of AR in manufacturing, which can be summarized as [9]:
  • Specialized workers to create the content for the AR experience: The creation of AR content requires a certain level of technical expertise, and the authoring tools can be complex and difficult to use. In addition, the cost of creating AR content can be high, and there is a need for ongoing maintenance and updates to keep the AR content relevant and effective.
  • Technical background in AR: The integration of AR with existing manufacturing systems and processes can be complex, and there may be a need for additional hardware and software to support the use of AR. In addition, the use of AR may require changes to existing processes and workflows, which can be time-consuming and disruptive.
Overall, addressing the two challenges is crucial to realizing the full potential of augmented reality (AR) for human–robot collaboration (HRC). This paper focuses on the need for developing personalized AR applications for industrial settings and emphasizes the importance of providing authoring tools that are accessible to non-IT professionals.
The current state of the art in AR authoring tools is reviewed, revealing a significant gap in the availability of industry-specific AR authoring tools for manufacturing scenarios, with special emphasis on HRC. Additionally, the paper highlights the absence of established requirements and guidelines for creating such tools. Furthermore, the lack of context-aware authoring tools is emphasized, which are critical for effective HRC in industrial environments. As a main contribution, the paper presents the ARContent tool, a context-aware AR authoring tool that provides support for human–robot collaboration scenarios via a web interface for non-expert users. The ARContent tool enables the creation of augmented manuals using AR and 3D model animation to monitor and validate a manufacturing procedure and translate it into an animated sequence of steps, thereby supporting workers in learning how to develop a task or maintain a machine or system through AR manuals rather than traditional PDF manuals.
Additionally, the paper discusses the integration of the ARContent tool with the FIWARE context broker to support real-time coordination of the different activities to be performed by the worker and the robot in the same ecosystem, enabling real-time data analysis and improved performance. Through a real use case at Siemens premises, we demonstrate how AR can effectively reduce cognitive load, assist human workers in training robots, and support calibration and inspection tasks. The main objective is to highlight the potential of AR as a solution for optimizing human–robot collaboration during assembly tasks while improving productivity.
This article is structured as follows. Section 2 reviews the state of the art in authoring tools for AR. Section 3 describes the AR-based authoring tool proposed by the SHOP4CF H2020 project for smart factories of the future, including the methodology followed to deploy and evaluate the tool’s performance in an assembly use case developed within the project. Finally, Sections 4–6 present the results, discuss them, and summarize the main findings and future directions.

2. Related Work

In recent years, there has been an increasing interest in the use of augmented reality (AR) in industrial environments. However, creating customized AR applications for these environments can be challenging without extensive knowledge of programming languages, which makes it difficult for companies to adopt and use this technology. Additionally, the development and personalization of AR applications can be expensive and time-consuming, as it requires the involvement of AR developers. To address these challenges, there has been a growing focus on the development of authoring tools that enable the configuration and personalization of AR applications in a simple and efficient manner, while also allowing companies to adjust the application to their specific preferences. These authoring tools can help reduce the cost and time required for the development and personalization of AR applications, making it easier for companies to adopt and utilize this technology in their industrial processes. This section provides a review of the current state of the art of authoring tools and AR applications in industrial environments.

2.1. AR Authoring Tools for Industrial Environments

An authoring tool, also known as an editing tool, is visual software that enables the creation of applications without requiring programming skills. This type of tool provides non-IT professionals with a convenient way to rapidly build simple business applications without coding [10,11]. Authoring tools have been used for years in various fields, including instructional design, multimedia creation, and education [12]. Over time, authoring tools, including those for AR, have become crucial in the context of digital content creation and exploitation.
AR authoring heavily depends on the content and real environment used during authoring, which is known as the authoring context. Simple content, such as purely textual instructions registered in a simple environment, such as 2D artificial markers, is easy to author. In comparison, using complex content, such as multiple 3D models with relevant animations within a complex environment, such as a physical assembly with homogeneous color, is much harder to author. Thus, it is important to recognize the authoring context of every authoring tool [13]. The literature typically categorizes AR authoring tools through various taxonomies, and there are dozens of concrete examples sorted according to different categories.
In 2015, the study in [14] sorted 19 commercial and academic AR authoring tools by authoring paradigm (for example, stand-alone vs. AR-plugin) and deployment strategy (platform-specific vs. platform-independent). Later, in 2018, Ref. [15] classified AR authoring tools by their level of fidelity in AR/VR and by the required skills and resources involved. Five classes were identified: basic mobile screens and interactions, basic AR/VR scenes and interactions, AR/VR-focused interactions, 3D content, and 3D games and applications. More recently, in 2019, Ref. [13] proposed a new taxonomy for categorizing AR authoring tools based on design, including linking systems, AR previewers, virtual registration, hybrid methods, context-awareness, knowledge-based approaches, and third-party packages. Despite this breadth of taxonomies and concrete examples, however, there is a lack of established requirements and guidelines for developing industry-specific AR authoring tools for human–robot collaboration, possibly due to a lack of real-world evaluation and field experiments.
AR authoring tools can play a crucial role in enhancing human–robot collaboration by providing a common platform for information sharing and collaboration between humans and robots. By providing a visual representation of the information being shared, AR authoring tools can improve communication and coordination between human and robot participants. To our knowledge, the only work that addresses augmented reality and human–robot collaboration and proposes a classification and taxonomy is [16]. That survey focuses on the intersection of augmented reality and robotics and on how AR technology can enhance their interaction. It also highlights the potential benefits of AR in enhancing human–robot interaction and robotic interfaces and points to areas where further research is needed. In this comprehensive study [16], comprising 460 references on augmented reality and HRC, only two refer to authoring tools. The first one [17] focuses mainly on capturing human actions with a motion-capture system and using them to generate future robotic actions, in order to speed up collaboration with the robot. The second one [18] deals with programming swarm user interfaces (Swarm UI) by leveraging direct physical manipulation; in other words, a very ad hoc solution.
After analyzing taxonomies, existing AR tools, and industrial applications, we find that our no-code tool, called ARContent, is the only context-aware AR authoring tool with the ability to bind human workers into the collaborative workflow.

3. Materials and Methods

This section describes the methodology used for the application of one AR authoring tool in a real smart factory use case. Initially, the methodological and architecture framework of SHOP4CF is explained in Section 3.1. Subsequently, Section 3.2 provides a detailed description of the authoring tool provided in SHOP4CF, called the AR-based Content Tool (ARContent). The evaluation framework and the Siemens use case are described in Section 3.3 and Section 3.4, respectively.

3.1. Methodological Framework and Architecture

The SHOP4CF architecture consists of two main layers: the software layer and the hardware layer. The software layer comprises proprietary components such as the ARContent tool, middleware, third-party information systems, and containers that provide operating system-level virtualization. The hardware layer, on the other hand, is composed of servers and cyber-physical systems. Additionally, IoT devices can belong to either layer [19].
To handle middleware tasks, SHOP4CF uses the open-source FIWARE platform because of its extensive support from other European projects and its focus on context information management. Context information is critical for smart factory applications as it pertains to the current state of the physical and virtual objects on the shop floor. SHOP4CF components exchange information through the FIWARE middleware whenever possible, except for connections that require real-time processing. In such cases, the components or IoT devices communicate directly because the FIWARE middleware cannot guarantee real-time response times.
In terms of the platform’s architecture, there are different levels that are categorized based on their proximity to the hardware (see Figure 1). The topmost level includes the SHOP4CF components and third-party information systems that use software components to transmit and receive information. The middle level houses middleware, containers, and software for cyber-physical systems, which are responsible for moving data between hardware and software components. At the bottom of the platform are hardware components such as servers and IoT devices, which act on the physical environment.

3.2. AR-Based Content Tool

The AR-based content editor, called ARContent, provides a tool for creating augmented reality (AR) content using 3D model animation. Its purpose is to monitor and validate manufacturing procedures by translating them into an animated sequence of steps. These steps are later presented using AR and 3D models of the involved machines and systems.
The tool consists of an easy-to-use authoring tool for creating augmented manuals with different steps to guide a worker in performing a task, and a visualization tool to display the AR manual created by the worker during the task. Instead of using traditional PDF manuals, the ARContent tool allows the creation of AR manuals that are focused on supporting workers to learn how to perform a task or maintain a machine or system.
The ARContent tool supports the creation, visualization, and maintenance of AR applications in an agile and simple way, without prior programming knowledge, using a web solution. The ARContent editor creates and configures AR manuals based on a set of steps with different assets, such as 3D models, audio, PDF files, images, and video. These assets can be configured to be displayed when a pattern is scanned or when the step is started. Objects can be configured with animations to provide clearer instructions when the AR manual is viewed. Figure 2 shows an overview of the main screen of the editor.
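For illustration only, the following Python sketch shows one way such a step-based manual could be represented as data; the structure and field names are hypothetical and do not reflect the actual ARContent schema.

```python
# Hypothetical, simplified representation of an AR manual as created in the
# ARContent editor: an ordered list of steps, each with assets (3D models,
# audio, PDF files, images, video) and a trigger describing when the assets
# are shown (when a pattern is scanned or when the step starts).
# Field names are illustrative, not the tool's actual data model.
ar_manual = {
    "name": "Gearbox assembly",
    "steps": [
        {
            "title": "Place the base plate",
            "trigger": "on_step_start",           # alternative: "on_pattern_scan"
            "assets": [
                {"type": "3d_model", "uri": "models/base_plate.glb",
                 "animation": "highlight_mounting_points"},
                {"type": "audio", "uri": "audio/step_1_instructions.mp3"},
            ],
        },
        {
            "title": "Check sub-assembly 1 (light barrier)",
            "trigger": "on_pattern_scan",
            "assets": [
                {"type": "image", "uri": "images/light_barrier_ok.png"},
                {"type": "pdf", "uri": "docs/quality_checklist.pdf"},
            ],
        },
    ],
}
```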
To enable a scenario focused on human–robot collaboration, the tool is integrated with FIWARE [21] as middleware to manage context information, such as the current state of the surrounding real world as perceived by a robot. The Orion context broker is a powerful tool for managing data in a scalable and efficient manner, making it an ideal solution for robots that generate large amounts of sensor data or other types of information. With its scalability, real-time data analysis, and interoperability features, the context broker can help unlock new insights and improve the performance of robots in various contexts. This technology is particularly valuable for applications that require rapid response times or need to share data across multiple systems or platforms. In this case, using the context broker, the status of the different activities to be performed by the worker and the robot can be easily shared and coordinated in the same ecosystem, enabling real-time data analysis and improved performance.
The integration procedure to coordinate the collaborative tasks between the ARContent tool, FIWARE, and the cobot is as follows: each time the robot completes an activity, a message is published in the FIWARE context broker. The message contains information about the activity, such as the type of task performed, the next task to be completed, and any relevant parameters. The ARContent tool is subscribed to the context broker to receive updates about new messages related to tasks to be performed by the worker using the AR manual. These tasks are identified with the value “urn:ngsi-ld:Device:siemens:hololense”, as detailed in Figure 3. Based on the information received from the context broker, the tool can determine whether the worker needs to start an AR-based task or whether additional information is needed before the task can be started. This approach enables real-time collaboration between robots and workers, allowing for more efficient and effective task execution.
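As a rough sketch of this publish/subscribe pattern (not the actual SHOP4CF integration code), the Python example below assumes an NGSI-LD deployment of the Orion context broker at a hypothetical address; the Task entity type, its attributes, and the notification endpoint are illustrative assumptions rather than the project’s data model.

```python
import requests

ORION = "http://orion:1026/ngsi-ld/v1"  # hypothetical broker address

# Robot side: publish the completion of an activity as an NGSI-LD entity.
# The entity type "Task" and its attributes are illustrative assumptions.
task_update = {
    "id": "urn:ngsi-ld:Task:gearbox-assembly:step-3",
    "type": "Task",
    "status": {"type": "Property", "value": "completed"},
    "nextTask": {"type": "Property",
                 "value": "urn:ngsi-ld:Task:gearbox-assembly:step-4"},
    "assignedTo": {
        "type": "Relationship",
        # Worker tasks are addressed to the AR device, following the
        # identifier shown in Figure 3.
        "object": "urn:ngsi-ld:Device:siemens:hololense",
    },
}
requests.post(f"{ORION}/entities", json=task_update)

# ARContent side: subscribe to changes on Task entities so the tool is
# notified whenever the worker has a new AR-guided step to perform.
subscription = {
    "type": "Subscription",
    "entities": [{"type": "Task"}],
    "watchedAttributes": ["status"],
    "notification": {
        "endpoint": {
            "uri": "http://arcontent:8080/notify",  # hypothetical callback URL
            "accept": "application/json",
        }
    },
}
requests.post(f"{ORION}/subscriptions", json=subscription)
```

In an actual deployment, the robot would more likely update an existing Task entity (for example, by patching its attributes) rather than creating a new one each time, but the coordination principle is the same.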

3.3. Evaluation Framework

To assess the deployment of a complex infrastructure in industrial environments, it is necessary to have a comprehensive evaluation framework. Such a framework should support the holistic assessment of components while providing enough flexibility to adapt methods and evaluation objectives to the individual needs and contexts of different industrial use cases.
The SHOP4CF evaluation framework aims to provide a highly adaptable methodological framework that can produce comparable results across the involved pilots [20]. The framework proposes an iterative evaluation process based on the assessment of the immediate implications of the pilots, including user experience, user acceptance, usability, ergonomics, safety, ethical aspects, and impacts on company workflows.
The iterative approach comprises several coordinated actions in different steps and tasks of the pilot design and execution, incrementally refined and reviewed. To support these steps and tasks, a set of tools was designed to facilitate the definition of objectives, the selection of measurement instruments, and the collection and analysis of data.
During the definition of pilot use cases, the involved technical and pilot staff worked together to define specific use case objectives with a focus on worker well-being and company benefits. These objectives were continually refined as the project platform and components matured. Moreover, the use cases were updated to adapt to the changing European context, such as the pandemic or energy crisis.
Once the objectives were defined, the study preparation actions started, which included the practical definition of methods to facilitate data collection. A set of tools was co-created with SHOP4CF evaluation experts and technical staff to support the definition of assessment methodologies. The first tool enables the guided definition of KPIs in four main domains: integration, functionality, process requirements, and human-related values. These domains were subdivided into subtopics, supporting a more detailed definition of the KPIs.
  • Integration: Safety and security, enterprise SW, and deployment.
  • Process: Quality, throughput, logistics, and raw materials.
  • Functionality: Manipulation, AR/HMI, mobility, sensors and measurement, decision making, maintenance, data traceability, and communication.
  • Human-related values: Productivity, change work and skills, autonomy and control, acceptance and usability, wellbeing, safety, and ethics.
For each domain and subdomain, one or more KPIs were defined according to the needs of the use case, a measurement instrument was selected to gather the data necessary for assessment, and a baseline value and an estimation of the expected improvement were established as success measures for the KPI. At this stage, the recruitment procedures and their ethical aspects were also considered, as well as the responsibilities for the correct management of the data.
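Purely as an illustration of what such a KPI record could look like (the field names are hypothetical, not the SHOP4CF template), the sketch below combines the elements just described: domain, subdomain, measurement instrument, and baseline and post-deployment values; the example numbers are taken from Table 1.

```python
from dataclasses import dataclass

# Illustrative KPI record; field names are hypothetical, not the SHOP4CF template.
@dataclass
class KPI:
    domain: str        # e.g., "Human-related values"
    subdomain: str     # e.g., "Safety"
    definition: str    # what is being measured and why
    instrument: str    # measurement instrument used to gather the data
    baseline: float    # value measured before deploying the component
    after: float       # value measured (or targeted) after deployment

workload_kpi = KPI(
    domain="Human-related values",
    subdomain="Safety",
    definition="Stress of the human worker working collaboratively with the robot",
    instrument="Workload measurement (NASA-TLX)",
    baseline=43.61,    # baseline workload reported in Table 1
    after=24.17,       # workload after using the ARContent tool (Table 1)
)
```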
Once the integration of components was ready, the study execution stage began. This involved collecting data according to the different strategies and procedures defined in the previous phase, with real users participating to validate the technologies under real industrial conditions. As the process was iterative, the selected instruments could be refined along the way.
To support the worker-centered approach of SHOP4CF, a set of questionnaires was defined to carry out the evaluation process. These include a human factors questionnaire, aimed at gathering the user experience and the impact on participants’ well-being in the domains of user experience, usability, ethics, safety, acceptance, ergonomics, and usefulness, and an integration questionnaire, an open guided interview aimed at evaluating the experience of integrating the SHOP4CF components to achieve the different use case objectives [22].
The process ended with data analysis to appropriately visualize the evaluation results based on the analysis of the collected data. The whole process is summarized in Figure 4.

3.4. Siemens Use Case Description

As a practical implementation of the SHOP4CF architecture and its components, we utilized a production scenario featuring a robotic arm collaborating with human workers across different work cells. The primary challenge in this use case is that the manufacturing engineer faces difficulties in designing manufacturing flows due to the lack of a tool for describing human–robot collaboration flows. Human operators can bridge the gap between flexible automation and high-mix low-volume production when supported by adequate tools that leverage their flexibility and experience with different processes.
More precisely, in the gearbox assembly use case (UC), a human operator and a robot work in a collaborative workspace to assemble a gearbox. This task cannot be fully automated, as it requires fine handling of small pieces and precise snap-fitting assembly that can more easily be performed by an operator [23]. To improve the assembly process, we identified two necessary tools resulting from the combination of different SHOP4CF components:
  • A visualization tool that displays specific information related to the assembly using AR technology, which reduces the cognitive load and stress associated with the task.
  • An editor that coordinates the flow between worker and robot tasks while creating an AR manual to support the worker.
In the manufacturing of gearboxes (see Figure 5), several manual assembly steps are necessary, which can be considered complex as they involve different manipulation skills. However, the manufacturing of these products occurs only in defined time frames due to low-batch production, and users may forget how to perform these assembly steps. Therefore, there is a need for tools to alleviate the mental burden necessary to perform these steps. AR-based visualization tools can be very helpful for this purpose.
For a better understanding of the assembly procedure and its complexity, Figure 6 provides the workflow of the collaborative assembly task using Business Process Model and Notation (BPMN), a widely adopted notation model [24].
This workflow illustrates all the steps involved in the assembly process, which are split into two lanes: one for the “Robot” (in green) and another for the “Operator” (in pink). Each lane describes the actions that each actor should perform for the collaborative task. Additionally, the workflow details the passive resources (e.g., the gripper and motor holder) that are complementary to the actions (in orange) and how they are connected to the different actions.
Thus, the assembly is composed of a set of actions to complete three different activities (see Figure 7). There is a main assembly activity, in which components are attached to a base plate (the black-and-white box). Two further sub-assembly activities are required to create sub-products that are necessary for the gearbox: the sub-assembly 1 activity creates a light barrier, while the sub-assembly 2 activity assembles the servo motor and the gears.
As the description indicates, it is evident that a new user may find it challenging to comprehend all the necessary steps and carry out the assembly process consistently with high quality without any visual assistance.

Experimental Setup: Hardware and Robot Cell

Figure 8 shows the hardware setup of the laboratory environment work cell.
The experimental setup is as described in [25]: it comprises a collaborative robotic cell with a Universal Robots® UR10 CB3 with a 10 kg payload (see Figure 8a), a modular gripper tested for safety as described in [26] (see Figure 8b), and a laser scanner.
To enable visualization of the manual, the operator is equipped with a tablet. Figure 9 shows the real implementation of the work cell from the perspective of the worker.

4. Results

To explain the practical execution and evaluation process of the proposed use case, this section summarizes the outcomes of the evaluation procedures and the updates and improvements made during the entire design, implementation, and assessment cycle. A total of six male users participated in the study, with an average age of around 26 years.

4.1. Siemens Use Case: KPIs

As described in the methodological approach, defining the KPIs is one of the core steps for the use case evaluation. This enabled qualitative and quantitative assessment of the use case objectives based on a set of comparable rates and values adaptable to the real conditions of each use case deployment environment. The use case defined two main objectives: (1) to reduce the cognitive load of workers due to the need to verify the quality of the assembled parts and (2) to assist human workers in training robots and supporting human workers in inspection and calibration tasks.
Following the SHOP4CF evaluation approach, a set of KPIs and their measurement instruments were defined in the four domains of the approach. The KPI definitions, measurement instruments, baseline values, and values after using the described components are summarized in Table 1. It is important to note that the baseline values were collected in the current scenario before using the ARContent tool for the assembly procedure. These baseline values are compared to the values obtained after using the ARContent component. In general, all KPI values improved after implementing the solution, demonstrating the enhancements achieved in the four evaluated domains (integration, process, functionality, and human-related values).

4.2. Siemens Use Case: Added Value for the Workers

All human-factor-related topics were found to be at a good level in this use case. Almost all of the domains scored higher than 4 points out of 5, implying a satisfactory evaluation of the tool, as summarized in Figure 10.
A detailed breakdown of the questions included in the human factors questionnaire highlights the good results in the usability perceived by the workers involved in the evaluation process (see Figure 11).
The values of the questionnaire reinforce the positive results of the human-related KPIs described in Section 4.1, especially in the areas of user experience, usability, acceptance, and usefulness.

5. Discussion

This section discusses the results presented in Section 4, providing a comparison of usability, time required, and task load index (TLX) scores when using the ARContent tool versus the baseline. In general, we conclude that the tool is highly effective in supporting workers in manufacturing tasks, especially assembly tasks.
The results (see Figure 12) showed that workers using the ARContent tool had a higher System Usability Scale (SUS) [27] score than the baseline without any AR tool, indicating that they found the tool to be more user-friendly and easier to use. The SUS score can be considered good (the average was around 75); however, further improvements can be considered in future developments to ensure that the tool is accepted by the whole user pool, such as adding information about the status of robot tasks and the expected time remaining while the worker is waiting.
Additionally, the experiment showed that workers had lower NASA Task Load Index (TLX) [28] scores (around 25%) than the baseline (around 40%), indicating that the tool reduced the mental and physical workload for workers during assembly tasks. Furthermore, the experiment demonstrated that the use of the tool resulted in a reduction in the overall time needed for the assembly activity when compared to the baseline. This suggests that the ARContent tool not only improves the experience for workers but also increases the efficiency of the assembly process.
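For readers unfamiliar with how these scores are derived, the following sketch shows the standard calculations: the SUS scoring rule from Brooke [27] and an unweighted ("raw") NASA-TLX average [28]. The numbers in the example are illustrative and are not the study data; the study may have used a weighted TLX variant.

```python
def sus_score(responses):
    """System Usability Scale score (0-100) from the ten item responses,
    each on a 1-5 scale, following Brooke [27]: odd-numbered items
    contribute (response - 1), even-numbered items contribute
    (5 - response), and the sum is scaled by 2.5."""
    assert len(responses) == 10
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)


def raw_tlx(ratings):
    """Unweighted ('raw') NASA-TLX workload score: the mean of the six
    subscale ratings (each 0-100), reported as a percentage [28]."""
    assert len(ratings) == 6
    return sum(ratings) / len(ratings)


# Illustrative values only (not the study data):
print(sus_score([5, 2, 4, 1, 5, 2, 4, 2, 5, 1]))  # -> 87.5
print(raw_tlx([30, 20, 25, 20, 30, 25]))          # -> 25.0
```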
These findings highlight the potential benefits of using the ARContent tool, which has demonstrated its ability to improve usability, reduce task load, and increase efficiency in the assembly process. However, future versions of the tool should focus on continuously improving usability and adding new features to ease the editing of AR manuals, such as using 3D recognition instead of patterns for placing AR information. Overall, while 3D recognition has the potential to provide more accurate and intuitive AR experiences, it poses several technical and practical challenges that need to be addressed to fully realize its potential.

6. Conclusions

This paper presents a study on the potential of augmented reality (AR) technology to revolutionize human–robot collaborative industrial processes. The study showcases a real use case at Siemens premises to efficiently reduce cognitive load, assist human workers in training robots, and support calibration and inspection tasks. The main objective is to highlight AR as a solution for optimizing human–robot collaboration during assembly tasks and enhancing productivity. The article first introduces the state of the art in authoring tools for AR and describes the AR-based authoring tool proposed by the SHOP4CF H2020 project for smart factories of the future. The methodology used to deploy and evaluate the tool’s performance in a use case for assembly tasks is presented, followed by a summary of the main findings and future directions.
The findings demonstrate the potential benefits of using the ARContent tool, which has shown its ability to improve usability, reduce task load, and increase efficiency in assembly tasks. However, limitations of the study are acknowledged due to the restricted availability of workers with knowledge of assembly tasks with robots. Future work will focus on testing the ARContent tool with a larger user pool to assess its perceived usability and impact, as well as on improving the authoring tool based on the identified shortcomings.
The implications of this research extend beyond the assembly tasks presented in the study. AR technology can support workers in various manufacturing tasks, such as maintenance, quality control, and inspection, by providing them with real-time instructions and guidance, reducing the risk of errors, and increasing efficiency. With real-time information, workers can quickly identify, diagnose, and resolve issues. In this way, AR can improve worker safety, reduce errors, and increase productivity, while also helping companies reduce the time and cost associated with training new workers and improve the quality of their products.
In addition, AR can help companies stay competitive by enabling workers to adapt and learn quickly about new manufacturing processes and technologies. Overall, the research presented in this paper about the use of AR in manufacturing has great potential to revolutionize the industry by improving worker safety, productivity, and quality, and allowing companies to stay competitive in a rapidly changing environment.

Author Contributions

Conceptualization, A.M. and L.B.; methodology, P.A.; software, P.A.; validation, M.P., L.B. and P.A.-J.; formal analysis, M.P.; investigation, A.M. and L.B.; resources, P.A.-J. and M.P.; data curation, P.A.-J.; writing—original draft preparation, A.M.; writing—review and editing, L.B.; visualization, P.A.-J.; supervision, L.B.; project administration, L.B.; funding acquisition, L.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received funding from the European Union’s Horizon 2020 Research and Innovation Programme under grant agreement No. 873087. The results obtained in this work reflect only the authors’ views and not those of the European Commission; the Commission is not responsible for any use that may be made of the information they contain.

Institutional Review Board Statement

Ethical review and approval were waived for this study due to anonymized data collection, which, in Bavaria, where the study was conducted, does not need approval from an ethics committee (https://ethikkommission.blaek.de/studien/sonstige-studien/antragsunterlagen-ek-primarberatend-15-bo) (accessed on 19 February 2023).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study, and participation in the study was voluntary.

Data Availability Statement

Data are contained within the article.

Acknowledgments

The authors would like to thank the users who took part in the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. EU. Industry 5.0. Available online: https://research-and-innovation.ec.europa.eu/research-area/industry/industry-50_en (accessed on 23 March 2023).
2. Demir, K.A.; Döven, G.; Sezen, B. Industry 5.0 and Human-Robot Co-Working. Procedia Comput. Sci. 2019, 158, 688–695.
3. Iftikhar, H.M.; Iftikhar, L. Post COVID-19 Industrial Revolution 5.0. The Dawn of Cobot, Chipbot and Curbot. Pak. J. Surg. Med. 2020, 1, 122–126.
4. Matheson, E.; Minto, R.; Zampieri, E.; Faccio, M.; Rosati, G. Human–Robot Collaboration in Manufacturing Applications: A Review. Robotics 2019, 8, 100.
5. Lin, C.J.; Lukodono, R.P. Sustainable Human–Robot Collaboration Based on Human Intention Classification. Sustainability 2021, 13, 5990.
6. Li, L. Reskilling and Upskilling the Future-ready Workforce for Industry 4.0 and Beyond. Inf. Syst. Front. 2022.
7. Choi, T.M.; Kumar, S.; Yue, X.; Chan, H.L. Disruptive technologies and operations management in the Industry 4.0 era and beyond. Prod. Oper. Manag. 2022, 31, 9–31.
8. Turner, C.J.; Garn, W. Next generation DES simulation: A research agenda for human centric manufacturing systems. J. Ind. Inf. Integr. 2022, 28, 100354.
9. Masood, T.; Egger, J. Augmented reality in support of Industry 4.0—Implementation challenges and success factors. Robot. Comput.-Integr. Manuf. 2019, 58, 181–195.
10. Montero O’Farrill, J.L.; Herrero Tunis, E. Las herramientas de autor en el proceso de producción de cursos en formato digital. Pixel-Bit. Rev. Medios Educ. 2008, 33, 59–72.
11. Zhao, Y. The impacts of Low/No-code development on digital transformation and software development. arXiv 2021, arXiv:2112.14073.
12. Dabbagh, N. Authoring Tools and Learning Systems: A Historical Perspective. In Annual Proceedings of Selected Research and Development [and] Practice Papers Presented at the National Convention of the Association for Educational Communications and Technology (24th), Atlanta, GA, USA, 8–12 November 2001.
13. Bhattacharya, B.; Winer, E.H. Augmented reality via expert demonstration authoring (AREDA). Comput. Ind. 2019, 105, 61–79.
14. Mota, R.C.; Roberto, R.A.; Teichrieb, V. [POSTER] Authoring tools in augmented reality: An analysis and classification of content design tools. In Proceedings of the 2015 IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2015), Fukuoka, Japan, 29 September–3 October 2015; pp. 164–167.
15. Nebeling, M.; Speicher, M. The Trouble with Augmented Reality/Virtual Reality Authoring Tools. In Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct 2018), Munich, Germany, 16–20 October 2018; pp. 333–337.
16. Suzuki, R.; Karim, A.; Xia, T.; Hedayati, H.; Marquardt, N. Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April–5 May 2022.
17. Cao, Y.; Wang, T.; Qian, X.; Rao, P.S.; Wadhawan, M.; Huo, K.; Ramani, K. GhostAR: A time-space editor for embodied authoring of human-robot collaborative task with augmented reality. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019), New Orleans, LA, USA, 20–23 October 2019; pp. 521–534.
18. Suzuki, R.; Kato, J.; Gross, M.D.; Yeh, T. Reactile: Programming swarm user interfaces through direct physical manipulation. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018.
19. Zimniewicz, M. Deliverable 3.2—SHOP4CF Architecture. 2020. Available online: https://live-shop4cf.pantheonsite.io/wp-content/uploads/2021/07/SHOP4CF-WP3-D32-DEL-210119-v1.0.pdf (accessed on 14 March 2023).
20. Abril-Jimenez, P.; Carvajal, D.; Buhid, E.; Gaeta, E.; Cabrera-Umpierrez, M.F.; Vilarino Gutierrez, S.; Pantano, M.; Engin, U.; Amaç, H.; Cezary, T. Deliverable 5.6—Evaluation of Early Pilot Prototypes; EU: Maastricht, The Netherlands; internal SHOP4CF project deliverable, pending publication.
21. FIWARE. Developers Catalogue. Available online: https://www.fiware.org/catalogue/ (accessed on 2 February 2023).
22. Aromaa, S.; Heikkilä, P. Design of a human factors questionnaire to evaluate digital solutions developed for industrial work. In Ergonomics in Design, Proceedings of the AHFE (2022) International Conference; Rebelo, F., Ed.; AHFE International: New York, NY, USA, 2022; Volume 47.
23. Panagiotis, B. Deliverable 5.1—Definition of the Deployment Scenarios. Available online: https://live-shop4cf.pantheonsite.io/wp-content/uploads/2021/07/SHOP4CF-WP5-D51-DEL-201215-v1.0.pdf (accessed on 17 March 2023).
24. OMG. Business Process Model and Notation (BPMN), Version 2.0.2. 2013. Available online: https://www.omg.org/spec/BPMN/2.0.2 (accessed on 31 March 2023).
25. Pantano, M.; Pavlovskyi, Y.; Schulenburg, E.; Traganos, K.; Ahmadi, S.; Regulin, D.; Lee, D.; Saenz, J. Novel Approach Using Risk Analysis Component to Continuously Update Collaborative Robotics Applications in the Smart, Connected Factory Model. Appl. Sci. 2022, 12, 5639.
26. Pantano, M.; Blumberg, A.; Regulin, D.; Hauser, T.; Saenz, J.; Lee, D. Design of a Collaborative Modular End Effector Considering Human Values and Safety Requirements for Industrial Use Cases. In Human-Friendly Robotics 2021; Springer Proceedings in Advanced Robotics; Palli, G., Melchiorri, C., Meattini, R., Eds.; Springer: Cham, Switzerland, 2022; Volume 23.
27. Brooke, J. SUS—A quick and dirty usability scale. Usability Eval. Ind. 1996, 189, 4–7.
28. Hart, S.G.; Staveland, L.E. Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. Adv. Psychol. 1988, 52, 139–183.
Figure 1. Top-level logical platform architecture of the SHOP4CF project [20].
Figure 2. ARContent editor: main screen.
Figure 3. Example of a JSON message published in the context broker.
Figure 4. SHOP4CF evaluation framework.
Figure 5. Render of the gearbox assembly, which has different gears and components which require several manual steps to be assembled.
Figure 6. BPMN workflow of the assembly procedure.
Figure 7. Steps of the assembly procedure.
Figure 8. Experimental setup: (a) the robotic cell and (b) the EE integrated into the use case.
Figure 9. Real implementation of the work cell in a laboratory environment.
Figure 10. Results of the human factors questionnaire for the Siemens use case.
Figure 11. Questions in the human factor questionnaire and answers.
Figure 12. Summary of results against the baseline: (a) SUS score; (b) TLX score; (c) time requested to complete the task.
Table 1. KPI definition for Siemens UC.

Type | Category | Requirement Specification | KPI Definition | Baseline Values | After Using SHOP4CF
Integration | Deployment | Flexible degree of automation, based on human worker preferences and skills and taking into account performance and quality requirements | Workload measurement [workload] | 43.61 | 24.17
Integration | Quality | Reduction in quality issues, relevant to improper assembly, robot teaching or calibration | Inventory [Pcs/h] | 26 | 30
Process | Manipulation | Reduces the cognitive load of the worker in following up with the robot during the assembly task, as well as for the robot calibration task | Workload measurement [workload] | 43.61 | 24.17
Functionality | Change work and skills | Reduces the time and expertise required for a range of different tasks (collaborative assembly, robot teaching, and robot calibration) | Lead time [min] | 135.47 | 118.1
Human-related values | Acceptance and usability | Acceptance of the system | System usability [usability score] | 37.08 | 10.8
Human-related values | Safety | Reduces the stress of the human worker working collaboratively with the robot | Workload measurement [workload] | 43.61 | 24.17
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

