Review

Human–Robot Collaboration Trends and Safety Aspects: A Systematic Review

Institute of Electronics and Computer Science, 14 Dzerbenes St., LV-1006 Riga, Latvia
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
J. Sens. Actuator Netw. 2021, 10(3), 48; https://doi.org/10.3390/jsan10030048
Submission received: 30 April 2021 / Revised: 28 June 2021 / Accepted: 2 July 2021 / Published: 13 July 2021
(This article belongs to the Special Issue Robot Systems, Networks and Sensing Technologies)

Abstract

Smart manufacturing and smart factories depend on automation and robotics, and human–robot collaboration (HRC) contributes to increasing the effectiveness and productivity of today’s and future factories. Industrial robots, especially in HRC settings, can be hazardous if safety is not addressed properly. In this review, we look at the collaboration levels of HRC and the safety actions that have been used to address safety. One hundred and ninety-three articles were identified, of which, after the screening and eligibility stages, 46 articles were used for the extraction stage. Predefined parameters such as devices, algorithms, collaboration level, safety action, and standards used for HRC were extracted. Despite close human and robot collaboration, 25% of all reviewed studies did not use any safety action, and more than 50% did not use any standard to address safety issues. This review shows HRC trends and what kind of functionalities are lacking in today’s HRC systems. HRC systems can be tremendously complex; therefore, proper safety mechanisms must be addressed at an early stage of development.

1. Introduction

Alongside the world, society, and everyday life, manufacturing is also changing, and digitization is rapidly becoming the de facto standard. Jobs that are repetitive, tedious, and do not require high skills are slowly being replaced by smart manufacturing systems. AI-based systems show great promise in automating tasks that traditionally require human intelligence for adaptive decision making [1].
The media often emphasize that automation will make humans obsolete and that “robots will take our jobs” [2]. In fact, it is estimated that 14% of jobs in OECD countries are at risk of automation [3]; however, the jobs at risk usually lack meaningfulness [4] or pose a high risk of injuries [5]. Automation and robotics are introducing new, more creative job openings and shifting manufacturing in a more intelligent and safer direction, not only regarding the production processes but also with respect to the human workforce. Even though simple, repetitive jobs that are poorly paid and carry high safety and health risks will become obsolete in smart manufacturing and smart factories, humans will still have a significant role in them [6], albeit with a different set of competencies and tasks.
Technological progress and achievements in the field of AI provide possibilities to tackle today’s manufacturing challenges in completely different ways than decades ago, although today’s challenges are also more complex [7]. One of the outcomes enabled by this progress is Human–Robot Collaboration (HRC), where humans and robots work alongside each other or together to achieve shared goals, and where the collaboration, particularly in more complex scenarios, is driven by emerging Human–Robot Interaction (HRI) methods. The new interaction methods are improving real-world HRC deployments; however, safety in accordance with the safety standard ISO/TS 15066 [8] should be considered in order to embrace the new technologies in a safe way. HRC can be divided into three levels: coexistence, cooperation, and collaboration. The collaboration levels are directly connected to how the human worker and the robot interact through their workspaces. Figure 1 illustrates the robot’s and human worker’s workspaces (RW, WW), which create a shared workspace (SW) where they overlap.
Accordingly, different collaboration levels require respective safety actions and measures. Even though the safety modes are outlined in ISO/TS 15066, the terminology for describing HRC has not been well established, and throughout the years multiple ways to categorize HRC have been introduced. The aforementioned terms coexistence, cooperation, and collaboration are often applied inconsistently. In this article, the meanings of the collaboration levels are adapted from [9]; they are illustrated in Figure 2 and described as follows:
  • Coexistence: human works in (partially or completely) shared space with a robot with no shared goals;
  • Cooperation: human and robot work towards a shared goal in (partially or completely) shared space;
  • Collaboration: human and robot work simultaneously on a shared object in a shared space.
Bearing in mind all the possible hazards that can be posed by industrial robots, safety must be the highest priority when establishing an HRC workcell. If proper safety mechanisms are not present, there is a high risk of worker injuries, as most industrial robots are unaware of their surroundings. Even though there is a need for more intuitive and effective HRI methods to raise the usability and performance of HRC, these methods should be developed in accordance with safety standards. Safety-related guidelines and instructions are discussed in ISO 10218-2 [10] and ISO/TS 15066, where collaborative operations must include one or more of the following four modes:
  • Safety-rated monitored stop (SRMS): The human and the robot can perform tasks in separate workspaces, and the robot can operate without restrictions as long as the human has not entered its workspace. The human may enter the robot’s workspace only when a safety-rated monitored stop is active, and the robot may resume only when the human has exited its workspace. Safety-rated devices should be used to detect the presence of humans.
  • Hand guiding (HG): In this mode, the human can manually provide motion commands to the robot by utilizing an HG device. When the human is outside the collaborative area, the robot can move at full speed; however, the human is allowed to enter the robot’s workspace and proceed with HG tasks only after the robot achieves an SRMS. When the human takes control of the robot with the HG device, the SRMS is released; accordingly, when the HG device is disengaged, the SRMS is activated.
  • Speed and separation monitoring (SSM): In this mode, the human and the robot can work in the same workspace. The speed of the robot is adjusted according to the distance between the human and the robot. The robot must not come closer to the human than the protective separation distance; otherwise, the robot must stop (a minimal sketch of this distance check is shown after this list).
  • Power and force limiting (PFL): This mode allows physical contact between human and robot. PFL operations are limited to collaborative robots that have integrated force and torque sensors. Contact between human and robot is allowed; however, the forces applied to the human body through intentional or unintentional contact should remain below threshold limit values, which should be determined during the risk assessment.
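To make the SSM mode more concrete, below is a minimal sketch (in Python) of the kind of distance check referenced above. It follows the spirit of the ISO/TS 15066 protective separation distance, which budgets for the operator’s motion, the robot’s reaction and stopping behavior, an intrusion distance, and measurement uncertainties. The function names and all numeric values are illustrative assumptions, not values taken from the standard or from any reviewed system; a certified implementation would run on safety-rated hardware.

```python
# Minimal SSM sketch: scale robot speed against a protective separation distance.
# Structured after the ISO/TS 15066 idea S_p = S_h + S_r + S_s + C + Z_d + Z_r;
# all numeric values are hypothetical placeholders for illustration only.

def protective_separation_distance(
    v_human: float,             # assumed operator approach speed [m/s]
    v_robot: float,             # current robot speed towards the operator [m/s]
    t_reaction: float,          # sensing + control reaction time [s]
    t_stop: float,              # robot stopping time [s]
    c_intrusion: float = 0.2,   # intrusion distance, e.g., a reaching arm [m]
    z_uncertainty: float = 0.1, # combined position measurement uncertainty [m]
) -> float:
    s_h = v_human * (t_reaction + t_stop)  # operator travel during reaction + stop
    s_r = v_robot * t_reaction             # robot travel during reaction time
    s_s = 0.5 * v_robot * t_stop           # robot travel while braking (linear decel assumed)
    return s_h + s_r + s_s + c_intrusion + z_uncertainty

def ssm_speed_command(distance: float, v_max: float, v_human: float = 1.6,
                      t_reaction: float = 0.1, t_stop: float = 0.3) -> float:
    """Full speed far away, reduced speed near the limit, stop when violated."""
    v = v_max
    # Lower the setpoint until the protective distance holds at that speed.
    while v > 0 and distance < protective_separation_distance(
            v_human, v, t_reaction, t_stop):
        v -= 0.05  # coarse decrement, for illustration only
    return max(v, 0.0)

if __name__ == "__main__":
    for d in (2.0, 1.0, 0.5):
        print(f"separation {d:.1f} m -> speed setpoint {ssm_speed_command(d, v_max=1.0):.2f} m/s")
```

With these placeholder parameters, the sketch keeps full speed at a 2.0 m separation, throttles to roughly 0.2 m/s at 1.0 m, and commands a stop at 0.5 m, which captures the qualitative behavior SSM requires.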
The main goal of this article is to analyze and discuss the HRC trends within the smart manufacturing environment and how HRI methods contribute towards enabling safe and efficient HRC.

2. Materials and Methods

The study followed four consecutive stages—identification, screening, eligibility, and inclusion. Figure 3 shows the flow diagram of the systematic review’s methodology.

2.1. Identification

In order to find the articles for this systematic review, a search was conducted in March 2021 in two databases, Scopus and Web of Science; the queries used in the search are shown in Table 1. Title, Abstract, and Keywords were used as search fields in the Scopus database, and Topic, which includes Title, Abstract, Author keywords, and Keywords Plus, was used as the search field in the Web of Science database. Only peer-reviewed articles were searched in these two databases.
In total, 177 matching articles were found in the Scopus database and 74 matching articles in the Web of Science database. After removing 58 duplicates, 193 matching articles remained for the next stage.

2.2. Screening

All authors were involved in the screening stage; each examined a random set of articles by reading the title and abstract. The criteria for inclusion in the next stage were the following:
  • Article is in English.
  • Article is not a review paper.
  • Article is about human–robot collaboration.
A total of 193 articles were processed in this stage, and 142 articles were excluded as they did not meet the aforementioned criteria, leaving 51 articles for the next stage.

2.3. Eligibility

In the eligibility stage, the full text of each article was examined. In this stage, as in the screening stage, all authors were involved, each examining a random set of articles. The eligibility criteria for inclusion in the next stage were the following:
  • Full text of the article is available.
  • Article is in English.
  • Article is not a review paper.
  • Article is about human–robot collaboration.
The criteria repeated from the previous stage were rechecked against the full text, since in some cases the abstract did not provide enough information to make a decision. A total of 51 articles were processed in this stage and 5 were excluded, leaving 46 articles for the final stage.

2.4. Included

In this stage, predefined parameters were extracted from the remaining articles. Each author performed extraction for a random set of articles. The extracted parameters were:
  • Sensors/devices used for HRC.
  • Algorithms for HRC.
  • Collaboration level.
  • Safety action.
  • Standards used for HRC.

3. Results

In total, 251 articles were identified. After removing duplicates, 193 unique articles remained; screening reduced these to 51, and the eligibility check left 46 articles for further analysis.
The majority of the articles (60%) were from the last three years (2019–2021). There were 7 articles from 2018 but only 5 from 2017. Between 2010 and 2016, only 6 articles met the criteria for inclusion. No articles were found before the year 2010 (see Figure 4).
From these 46 articles, operations were extracted that described what kind of human–robot collaboration level was present in the study and how safety issues were assessed (see Table 2). Sixteen of the 46 studies featured collaboration, where human and robot worked together on a shared object in the same workspace; 22 articles featured cooperation and 14 featured coexistence (several studies spanned more than one level). In 7 articles, the proposed system did not meet the criteria for any of the collaboration levels; however, a new interface method was proposed.
The number of safety actions used by the studies was disappointing: 12 studies did not use any safety action in the workspace. Moreover, 29 studies did not use any standard to assure safety in the workspace. SSM was the most used safety action (19 studies), while PFL was the least used (6 studies). SRMS was used in 13 studies and HG in 10. ISO 10218 was used in 12 articles, and ISO/TS 15066 was likewise used in 12 articles. Other standards used in the articles were ISO 13855 [11] and ISO 9001 [12].
The most used devices for HRC were 3D cameras, which appeared in 22 studies. Other commonly used devices were force/tactile sensors (10 studies) and wearables (7 studies). Laser scanners were used in 6 studies, while VR/AR, 2D cameras, and microphones/speakers were each used in only 4 studies. All other devices were used fewer than 3 times (see Figure 5).
In most cases, human detection/recognition was the algorithm used for HRC (22 studies). Force detection/recognition and gesture recognition were each used in 7 studies, motion planning in 6 studies, and human physiology detection and speech recognition in 5 studies each.

4. Discussion

4.1. Safety Aspects within Different Collaboration Levels

The required safety actions differ between collaboration levels, so we describe our findings and thoughts separately for each collaboration level, combining the statistical data acquired from the systematic review with our experience and observations while producing this work. We also try to highlight HRI and human workforce training; the fact that they are not widely represented in the data we have gathered suggests that these trends are only just emerging.
Coexistence can be considered the lowest level of HRC, where human and robot work in a shared space with no shared goals; accordingly, the robot needs to perform an SRMS if a human enters its workspace, or the speed of the robot needs to be adjusted based on the distance between human and robot (SSM). In 14 of the reviewed articles, the human and the robot coexist. If deployed properly, coexistence by its nature poses the fewest potential hazards, and the risk of injuring humans is minimal, as the human should not come into contact with the robot while it is still moving. The methods proposed in the reviewed articles for human–robot coexistence can in a way be directly interpreted as safety features, as there is no interaction between robot and human other than stopping or adjusting the speed of robot motions. However, a mechanism is not a safety feature unless it is developed as one, in accordance with the relevant standards. Only half of the coexistence operations (50%) reference relevant standards, and the majority of these articles only acknowledge the existence of such standards without deeper studies of the safety requirements. Safety operations are time critical; without an analysis of safety requirements, the methods used for detecting human presence might not meet the criteria for safe coexistence. The most commonly used method in the reviewed articles is human detection/tracking (51%), which is also mainly connected to the SRMS and SSM safety operations; a minimal sketch of such presence-gated stopping follows below. As the development of human detection/tracking algorithms and 3D camera usage is driven by a wide variety of application scenarios outside the field of HRC, the maturity of those solutions is increasing rapidly, contributing to more advanced and flexible applicability in HRC scenarios.
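As a minimal illustration of the presence-gated stopping mentioned above, the sketch below latches a monitored stop whenever a human is reported inside the robot workspace and resumes once the workspace is clear. The class and state names are hypothetical; in a real cell this logic belongs in safety-rated hardware and would typically also require an operator acknowledgment before resuming.

```python
# Hypothetical SRMS gating sketch for coexistence: stop while a human is
# detected inside the robot workspace, resume only once it is clear again.
from enum import Enum, auto

class RobotState(Enum):
    RUNNING = auto()
    MONITORED_STOP = auto()

class SRMSGate:
    def __init__(self) -> None:
        self.state = RobotState.RUNNING

    def update(self, human_in_workspace: bool) -> RobotState:
        if human_in_workspace:
            # Any detection immediately latches the monitored stop.
            self.state = RobotState.MONITORED_STOP
        elif self.state is RobotState.MONITORED_STOP:
            # Workspace clear again: allow the robot to resume.
            self.state = RobotState.RUNNING
        return self.state

gate = SRMSGate()
for detected in (False, True, True, False):
    print(detected, "->", gate.update(detected).name)
```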
Cooperation is similar to coexistence: the human and robot also work in a shared space, but in cooperation they have a shared goal. All the safety actions that apply to coexistence also apply to cooperation. In some cases, cooperation and coexistence can even be identical from the safety viewpoint; the levels differ in the human worker's role and actions. As the collaboration level increases, the potential frequency of possible hazards also increases. Unfortunately, the relevant safety standards have been addressed in even fewer of the reviewed articles compared to coexistence (only 41%). The main difference in cooperation is how closely the human and robot work to achieve a shared goal. Close cooperation introduces the HG safety operation, which should be engaged together with SRMS: the robot should reach an SRMS before hand guiding proceeds, which was not the case in most of the articles. Within cooperation, the human's task depends on the robot's activities, so timing and bidirectional interaction become more important.
Collaboration can be viewed as the highest level of HRC, where human and robot work simultaneously on a shared object. Depending on the scenario, collaboration can include all four collaborative modes. However, the distinct feature of collaboration is the simultaneous work on a shared object, and the most promising methods to achieve this functionality are connected to PFL. PFL allows close collaboration and contact between human and robot; however, close collaboration also poses higher threats to the human worker, as an inappropriate force on a particular part of the human body can result in an injury. To lower the risk of injuries, speed limits should be established according to safety standards so that force and pressure values stay below the pain sensitivity threshold of the human body; a minimal sketch of such a force-threshold check follows below. Compared to coexistence and cooperation, the situation regarding safety standards is better for collaboration methods, as relevant standards have been referenced in 50% of the reviewed articles. Even though some methods connected to PFL show promise in establishing efficient HRC, they are not yet applied in manufacturing and lack risk analysis. Establishing speed limits could significantly decrease usability, thus rendering this method economically unattractive.
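To illustrate the PFL reasoning above, the following sketch compares a measured contact force against a per-body-region limit and derives a speed-scaling factor that reaches zero when the limit is exceeded. The numeric limits in the dictionary are placeholders, not the biomechanical values tabulated in ISO/TS 15066; the function name is likewise an assumption for this example.

```python
# Hypothetical PFL check: compare a measured contact force against a
# per-body-region limit and derive a speed-scaling factor in [0, 1].
# The limits below are illustrative placeholders, NOT the biomechanical
# limit values from ISO/TS 15066.
FORCE_LIMITS_N = {
    "hand": 140.0,
    "forearm": 160.0,
    "chest": 120.0,
}

def pfl_speed_scale(measured_force_n: float, body_region: str) -> float:
    """Return 1.0 for no contact, less as force nears the limit, 0.0 at/above it."""
    limit = FORCE_LIMITS_N[body_region]
    if measured_force_n >= limit:
        return 0.0  # limit exceeded: stop and require a recovery procedure
    return 1.0 - measured_force_n / limit  # scale down as the limit is approached

print(pfl_speed_scale(30.0, "hand"))   # light contact -> mild slowdown (~0.79)
print(pfl_speed_scale(150.0, "hand"))  # above limit -> 0.0 (stop)
```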

4.2. Human–Robot Interaction Methods for More Intuitive Collaboration

Establishing an HRC workcell does not mean changing only the robot; the whole process needs to be redesigned, reflecting on safety standards and emerging HRI. Some of the reviewed solutions propose new human interaction methods, in line with one of the goals of HRI: to develop more natural ways for humans to interface with robots. These HRI methods can be used to make the interaction between human and robot smoother and to improve the usability of the system. HRI methods do not necessarily involve sharing the workspace between the human and the robot; however, in scenarios where the workspace is shared, safety should be ensured according to the collaboration level. The human needs to be aware of the robot's movements and actions, and the robot likewise should be aware of human intentions. Ten percent of the articles have addressed this issue by detecting and recognizing human physiology and by synthesizing/recognizing speech to communicate more naturally. Facial recognition appears to be a less explored way to extract valuable data from humans, as only 4% of the reviewed articles address this method. Gesture recognition is a more common way to give commands to robots, addressed in 21% of the articles. Even though gestures can be a more natural way of communicating with the robot, there is a lack of consistency in the gestures used, which decreases the potential usability across different industries and nationalities of human workers.

4.3. Efficient and Safe Collaboration through Virtual Training

HRC can be a safe way to raise effectiveness in manufacturing and to cope with shortages in the human workforce. The full potential of HRC cannot be achieved only by developing new HRI methods and improving the existing ones. The HRC process will be efficient when both parties work together in harmony, meaning the human needs to learn how to collaborate efficiently with the robot. The robot will, in a way, be treated as a "colleague", and the human's willingness to collaborate with it should be fostered accordingly. This willingness is highly connected to safety and to whether the human worker feels safe to collaborate. The human needs to be aware of the robot's intentions and actions, and also of his/her own limits and the potential hazards. Human nature can be exploratory, and sometimes the limits are learned the hard way [59]. Traditionally, such training is performed with manual documentation and physical instructions; however, this requires interrupting the manufacturing process, which costs valuable production time. Moreover, the human needs to adjust gradually to collaborating with the robot. Digital training utilizing virtual reality can increase awareness of possible hazards and prepare the human operator for real-world tasks, meanwhile allowing the human to explore the capabilities of HRC and his/her own limitations. Training for HRC in particular has been addressed in two of the reviewed articles. Even though the safety standards have been developed to minimize the risks for human workers, there is still a need to teach human workers how to collaborate safely.

4.4. Benefits of Human–Robot Collaboration

Whilst automation tends to decrease human involvement in performing tasks [60], HRC maximizes the potential of both parties through collaboration. This raises a question: is HRC easier to build than full automation, or do the aims of the manufacturing process need to be reassessed? The HRC workcell is, in a way, more flexible towards the dynamic adjustments of today's and future factories, whereas full automation requires automating some processes that are not yet feasible, and thus requires redesigning the whole manufacturing process, which leads to high initial costs and limited adaptability [61]. HRC can be viewed as a step between traditional manufacturing and the fully automated factory inside large companies. However, HRC can significantly increase effectiveness for small and medium-sized enterprises (SMEs), also in the long term [62], as it requires less funding to achieve agile manufacturing than redesigning the whole factory. It is foreseen that in the future robots will become more affordable and easier to integrate into manufacturing; therefore, it can be expected that more SMEs from a wide variety of industrial sectors will adopt HRC systems [63].
In this phase, the human is not necessarily helping the robot or vice versa; the collaboration between both parties is complementary. The reviewed articles show collaborative systems not only where the strengths of industrial robots are utilized but also where artificial intelligence is given to robots to increase the effectiveness of collaboration. Distinct trends can be seen in the development of more intuitive HRI methods; thus, we foresee that in the upcoming years the focus will shift towards developing more explainable and understandable HRC systems to enable more human-like performance of robots.
Even though in this study the main benefits of HRC and HRI are discussed for the manufacturing sector, the reviewed methods and concepts can be transferred to our daily life as well. A great example is social robotics, where robots are designed to engage people in an interpersonal manner to achieve a desired outcome in domains such as domestic chores, elderly care, health, and more [64]. Some types of social robots can already be seen in our daily life as tutors or peer learners [65] or as socially assistive robots [66]. However, to accomplish more complex tasks related to domestic chores or helping elderly people, social robots also require high degrees of freedom and dexterity, which are hardly affordable for personal use.

4.5. Artificial Intelligence within Smart Factories

HRC is only one of the fields in which AI and its developments are high priorities for international policymakers [67]. AI is internationally accepted as a main driver [68] of the digitization and transformation of factories, as flexibility and a deep understanding of complex manufacturing processes are becoming the key advantages for raising competitiveness. The smart factory is, in a way, a manufacturing solution and key construct of Industry 4.0 [69] that utilizes these advantages of AI by introducing flexible and adaptive production to solve problems that can arise in a production facility with dynamic and rapidly changing conditions [70]. However, real-world AI systems, including HRC applications, should be designed in a way that excludes unintended or harmful behavior, as there are concrete problems in AI safety [71]. HRC may or may not include AI, but the existing HRC safety operations are designed to minimize the risk of human worker injuries, as in the end the industrial robot is just a tool, one that can have high momentum even when moving at low speeds.

4.6. Challenges of Human–Robot Collaboration

Traditionally, industrial robots have been separated from human workers due to their operational speeds and heavy payloads. HRC tends to remove this segregation but also introduces a wide variety of challenges resulting from the complexity of the collaboration process. HRC needs to be an efficient way to raise productivity without losing the flexibility offered by industrial robots. As this study shows, there is no doubt that safety is one of the key challenges in HRC; however, characteristics such as modularity and operability also play an important role in the deployment of HRC systems. The challenge within modularity includes efficient adaptation to changes in the environment by replacing the respective component, such that the replacement of one component has no (or only minimal) impact on the other components within the system [72]. Modularity in the context of a flexible system is crucial, as different industries, manufacturing sectors, or even different tasks on the same production floor have diverse conditions. The goal is to keep both the effort and the cost as low as possible when adapting the system to a new task. The challenge with respect to operability, in turn, includes user friendliness and easy adaptability to other specified goals, which should be manageable without any deep specific knowledge of the underlying target technology.
The field of HRC is constantly developing, and new technologies around HRC are emerging. In this study we focused on collaboration between human and robot; however, the sustainability of HRC cannot be guaranteed by the HRC system alone. Safe and secure interconnectivity of the other elements and stations within the smart factory can raise future challenges. Potential users of HRC might not be sufficiently supported to facilitate the integration of such systems into their applications. The potential of edge-computing devices can partly address these issues by transferring intelligence from the cloud to the edge. As edge computing deploys computing and storage resources closer to the production floor, it significantly reduces data-processing delays, which can also introduce new perspectives in HRC. However, in these settings, precaution is advised and security issues should be addressed very carefully [73]. Edge computing can provide more efficient computing resources for HRC applications and beyond, but it can also introduce more security threats, as it increases the real-world attack surface from new angles [74]. Thus, the security risks should be carefully assessed and monitored by developers and system integrators, as industrial robots by themselves can possess a significant number of vulnerabilities [75].

5. Conclusions

The safety aspect is crucial with regard to HRC because industrial robots can pose threats to human workers if proper safety mechanisms are not established. This review shows that a quarter of all studies did not use any safety action; the functionalities in the majority of those articles only resemble safety actions. More than half did not address safety issues even when the human and robot collaborated in the same space and worked on the same object. As the level of collaboration increases, standards were used more often; however, in the studies where only HRI methods were proposed, safety was not addressed at all, although most of these interface methods did not involve sharing the workspace, in which case the safety aspect is not critical.
HRC systems combining artificial and natural intelligence can become tremendously complex. The human needs to feel safe when collaborating with the robot, and this feeling of safety can be achieved by trusting the robot, the system, and essentially the algorithms designed for HRC. The level of effectiveness that HRC is designed for can be achieved through safe collaboration, in which the human worker does not feel endangered. Trust can be earned through explainable, predictable, and understandable robot actions, which should be supported by smart bidirectional communication between the human and the robot. However, even small accidents could cause a justified loss of trust in automated systems. Proper safety mechanisms should be addressed not only at an early stage of development but throughout the whole development phase and also during deployment of the system, while being aware of the critical AI safety problems. Thus, there is still a need for more intuitive, human-to-human-like interaction in today's HRC, whilst addressing safe methods for training human workers.

Author Contributions

Conceptualization, J.A. and V.A.; methodology, J.A. and V.A.; validation, J.A., V.A., J.J.; formal analysis, J.A., V.A., J.J., O.V., A.O., K.O.; investigation, J.A., V.A., J.J., O.V., A.O., K.O.; resources, J.A., V.A., J.J., O.V., A.O., K.O.; data curation, J.A., J.J., O.V., A.O.; writing—original draft preparation, J.A., V.A., J.J., K.O.; writing—review and editing, J.A., V.A., J.J., O.V., A.O., K.O.; visualization, V.A.; supervision, K.O.; project administration, K.O.; funding acquisition, K.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research was performed within the “Digital Technologies, Advanced Robotics and increased Cyber-security for Agile Production in Future European Manufacturing Ecosystems” (TRINITY) project, which has been co-funded by the Horizon 2020 Framework Programme of the European Union under grant agreement No. 825196.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data underlying this article will be shared on reasonable request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Lee, K. Artificial intelligence, automation, and the economy. Exec. Off. Pres. USA 2016, 20. Available online: https://obamawhitehouse.archives.gov/sites/whitehouse.gov/files/documents/Artificial-Intelligence-Automation-Economy.PDF (accessed on 7 April 2021).
  2. Dahlin, E. Are robots stealing our jobs? Socius 2019, 5, 2378023119846249.
  3. Nedelkoska, L.; Quintini, G. Automation, Skills Use and Training; OECD Publishing: Paris, France, 2018.
  4. Smids, J.; Nyholm, S.; Berkers, H. Robots in the Workplace: A Threat to—Or opportunity for—Meaningful Work? Philos. Technol. 2020, 33, 503–522.
  5. Wadsworth, E.; Walters, D. Safety and Health at the Heart of the Future of Work: Building on 100 Years of Experience. 2019. Available online: https://www.ilo.org/safework/events/safeday/WCMS_686645/lang--en/index.htm (accessed on 7 April 2021).
  6. Evjemo, L.D.; Gjerstad, T.; Grøtli, E.I.; Sziebig, G. Trends in Smart Manufacturing: Role of Humans and Industrial Robots in Smart Factories. Curr. Robot. Rep. 2020, 1, 35–41.
  7. Petrillo, A.; De Felice, F.; Cioffi, R.; Zomparelli, F. Fourth industrial revolution: Current practices, challenges, and opportunities. Digit. Transform. Smart Manuf. 2018, 1–20.
  8. ISO/TS 15066:2016. Robots and Robotic Devices—Collaborative Robots; International Organization for Standardization: Geneva, Switzerland, 2016.
  9. Aaltonen, I.; Salmi, T.; Marstio, I. Refining levels of collaboration to support the design and evaluation of human-robot interaction in the manufacturing industry. Procedia CIRP 2018, 72, 93–98.
  10. ISO 10218-2:2011. Robots and Robotic Devices—Safety Requirements for Industrial Robots—Part 2: Robot Systems and Integration; International Organization for Standardization: Geneva, Switzerland, 2011.
  11. ISO 13855:2010. Safety of Machinery—Positioning of Safeguards with Respect to the Approach Speeds of Parts of the Human Body; International Organization for Standardization: Geneva, Switzerland, 2010.
  12. ISO 9001:2015. Quality Management Systems—Requirements; International Organization for Standardization: Geneva, Switzerland, 2015.
  13. Melchiorre, M.; Scimmi, L.S.; Mauro, S.; Pastorelli, S.P. Vision-based control architecture for human–robot hand-over applications. Asian J. Control 2021, 23, 105–117.
  14. Zlatanski, M.; Sommer, P.; Zurfluh, F.; Madonna, G.L. Radar sensor for fenceless machine guarding and collaborative robotics. In Proceedings of the 2018 IEEE International Conference on Intelligence and Safety for Robotics (ISR), Shenyang, China, 24–27 August 2018; pp. 19–25.
  15. Komenda, T.; Reisinger, G.; Sihn, W. A Practical Approach of Teaching Digitalization and Safety Strategies in Cyber-Physical Production Systems. Procedia Manuf. 2019, 31, 296–301.
  16. Dianatfar, M.; Latokartano, J.; Lanz, M. Concept for Virtual Safety Training System for Human-Robot Collaboration. Procedia Manuf. 2020, 51, 54–60.
  17. Casalino, A.; Messeri, C.; Pozzi, M.; Zanchettin, A.M.; Rocco, P.; Prattichizzo, D. Operator awareness in human–robot collaboration through wearable vibrotactile feedback. IEEE Robot. Autom. Lett. 2018, 3, 4289–4296.
  18. Sievers, T.S.; Schmitt, B.; Rückert, P.; Petersen, M.; Tracht, K. Concept of a Mixed-Reality Learning Environment for Collaborative Robotics. Procedia Manuf. 2020, 45, 19–24.
  19. Dombrowski, U.; Stefanak, T.; Reimer, A. Simulation of human-robot collaboration by means of power and force limiting. Procedia Manuf. 2018, 17, 134–141.
  20. De Gea Fernández, J.; Mronga, D.; Günther, M.; Wirkus, M.; Schröer, M.; Stiene, S.; Kirchner, E.; Bargsten, V.; Bänziger, T.; Teiwes, J.; et al. iMRK: Demonstrator for intelligent and intuitive human–robot collaboration in industrial manufacturing. KI-Künstliche Intell. 2017, 31, 203–207.
  21. Meißner, D.W.I.J.; Schmatz, M.S.F.; Beuß, D.I.F.; Sender, D.W.I.J.; Flügge, I.W.; Gorr, D.K.F.E. Smart Human-Robot-Collaboration in Mechanical Joining Processes. Procedia Manuf. 2018, 24, 264–270.
  22. De Gea Fernández, J.; Mronga, D.; Günther, M.; Knobloch, T.; Wirkus, M.; Schröer, M.; Trampler, M.; Stiene, S.; Kirchner, E.; Bargsten, V.; et al. Multimodal sensor-based whole-body control for human–robot collaboration in industrial settings. Robot. Auton. Syst. 2017, 94, 102–119.
  23. Murali, P.K.; Darvish, K.; Mastrogiovanni, F. Deployment and evaluation of a flexible human–robot collaboration model based on AND/OR graphs in a manufacturing environment. Intell. Serv. Robot. 2020, 13, 439–457.
  24. Antão, L.P.S. Cooperative Human-Machine Interaction in Industrial Environments. In Proceedings of the 2018 13th APCA International Conference on Automatic Control and Soft Computing (CONTROLO), Azores, Portugal, 4–6 June 2018; pp. 430–435.
  25. Bejarano, R.; Ferrer, B.R.; Mohammed, W.M.; Lastra, J.L.M. Implementing a Human-Robot Collaborative Assembly Workstation. In Proceedings of the 2019 IEEE 17th International Conference on Industrial Informatics (INDIN), Helsinki, Finland, 22–25 July 2019; Volume 1, pp. 557–564.
  26. Mazhar, O.; Navarro, B.; Ramdani, S.; Passama, R.; Cherubini, A. A real-time human-robot interaction framework with robust background invariant hand gesture detection. Robot. Comput. Manuf. 2019, 60, 34–48.
  27. Weitschat, R.; Aschemann, H. Safe and efficient human–robot collaboration part II: Optimal generalized human-in-the-loop real-time motion generation. IEEE Robot. Autom. Lett. 2018, 3, 3781–3788.
  28. Vivo, G.; Zanella, A.; Tokcalar, O.; Michalos, G. The ROBO-PARTNER EC Project: CRF Activities and Automotive Scenarios. Procedia Manuf. 2017, 11, 364–371.
  29. Peter, T.; Bexten, S.; Müller, V.; Hauffe, V.; Elkmann, N. Object Classification on a High-Resolution Tactile Floor for Human-Robot Collaboration. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020; Volume 1, pp. 1255–1258.
  30. Al-Yacoub, A.; Buerkle, A.; Flanagan, M.; Ferreira, P.; Hubbard, E.M.; Lohse, N. Effective human-robot collaboration through wearable sensors. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020; Volume 1, pp. 651–658.
  31. Pulikottil, T.B.; Pellegrinelli, S.; Pedrocchi, N. A software tool for human-robot shared-workspace collaboration with task precedence constraints. Robot. Comput. Manuf. 2021, 67, 102051.
  32. Cacacea, J.; Caccavalea, R.; Finzia, A. Supervised Hand-Guidance during Human Robot Collaborative Task Execution: A Case Study. In Proceedings of the 7th Italian Workshop on Artificial Intelligence and Robotics (AIRO 2020), Online, 26 November 2020; pp. 1–6.
  33. Ferraguti, F.; Landi, C.T.; Costi, S.; Bonfè, M.; Farsoni, S.; Secchi, C.; Fantuzzi, C. Safety barrier functions and multi-camera tracking for human–robot shared environment. Robot. Auton. Syst. 2020, 124, 103388.
  34. Darvish, K.; Simetti, E.; Mastrogiovanni, F.; Casalino, G. A Hierarchical Architecture for Human–Robot Cooperation Processes. IEEE Trans. Robot. 2020, 37, 567–586.
  35. Wang, X.V.; Zhang, X.; Yang, Y.; Wang, L. A Human-Robot Collaboration System towards High Accuracy. Procedia CIRP 2020, 93, 1085–1090.
  36. Aljinovic, A.; Crnjac, M.; Nikola, G.; Mladineo, M.; Basic, A.; Ivica, V. Integration of the human-robot system in the learning factory assembly process. Procedia Manuf. 2020, 45, 158–163.
  37. Weistroffer, V.; Paljic, A.; Fuchs, P.; Hugues, O.; Chodacki, J.P.; Ligot, P.; Morais, A. Assessing the acceptability of human-robot co-presence on assembly lines: A comparison between actual situations and their virtual reality counterparts. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication, Edinburgh, UK, 25–29 August 2014; pp. 377–384.
  38. Maurtua, I.; Ibarguren, A.; Kildal, J.; Susperregi, L.; Sierra, B. Human–robot collaboration in industrial applications: Safety, interaction and trust. Int. J. Adv. Robot. Syst. 2017, 14, 1729881417716010.
  39. Vogel, C.; Schulenburg, E.; Elkmann, N. Projective-AR Assistance System for shared Human-Robot Workplaces in Industrial Applications. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020; Volume 1, pp. 1259–1262.
  40. Heredia, J.; Cabrera, M.A.; Tirado, J.; Panov, V.; Tsetserukou, D. CobotGear: Interaction with Collaborative Robots using Wearable Optical Motion Capturing Systems. In Proceedings of the 2020 IEEE 16th International Conference on Automation Science and Engineering (CASE), Hong Kong, China, 20–21 August 2020; pp. 1584–1589.
  41. Ogura, Y.; Fujii, M.; Nishijima, K.; Murakami, H.; Sonehara, M. Applicability of hand-guided robot for assembly-line work. J. Robot. Mechatron. 2012, 24, 547–552.
  42. Tashtoush, T.; Garcia, L.; Landa, G.; Amor, F.; Laborde, A.N.; Oliva, D.; Safar, F. Human-Robot Interaction and Collaboration (HRI-C) Utilizing Top-View RGB-D Camera System. Int. J. Adv. Comput. Sci. Appl. 2021, 12.
  43. Terreran, M.; Lamon, E.; Michieletto, S.; Pagello, E. Low-cost Scalable People Tracking System for Human-Robot Collaboration in Industrial Environment. Procedia Manuf. 2020, 51, 116–124.
  44. Kousi, N.; Gkournelos, C.; Aivaliotis, S.; Giannoulis, C.; Michalos, G.; Makris, S. Digital twin for adaptation of robots’ behavior in flexible robotic assembly lines. Procedia Manuf. 2019, 28, 121–126.
  45. Pichler, A.; Akkaladevi, S.C.; Ikeda, M.; Hofmann, M.; Plasch, M.; Wögerer, C.; Fritz, G. Towards shared autonomy for robotic tasks in manufacturing. Procedia Manuf. 2017, 11, 72–82.
  46. Ko, D.; Lee, S.; Park, J. A study on manufacturing facility safety system using multimedia tools for cyber physical systems. Multimed. Tools Appl. 2020, 1–18.
  47. Bhana, M.; Bright, G. Theoretical 3-D Monitoring System for Human-Robot Collaboration. In Proceedings of the 2020 International SAUPEC/RobMech/PRASA Conference, Cape Town, South Africa, 29–31 January 2020; pp. 1–6.
  48. Engemann, H.; Du, S.; Kallweit, S.; Cönen, P.; Dawar, H. OMNIVIL—An Autonomous Mobile Manipulator for Flexible Production. Sensors 2020, 20, 7249.
  49. Iossifidis, I. Development of a Haptic Interface for Safe Human Robot Collaboration. In Proceedings of the 4th International Conference on Pervasive and Embedded Computing and Communication Systems (PECCS-2014), Lisbon, Portugal, 7–9 January 2014; pp. 61–66.
  50. Lee, H.; Liau, Y.Y.; Kim, S.; Ryu, K. Model-Based Human Robot Collaboration System for Small Batch Assembly with a Virtual Fence. Int. J. Precis. Eng. Manuf. Technol. 2020, 7, 609–623.
  51. Araiza-Illan, D.; Clemente, A.d.S.B. Dynamic Regions to Enhance Safety in Human-Robot Interactions. In Proceedings of the 2018 IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA), Turin, Italy, 4–7 September 2018; Volume 1, pp. 693–698.
  52. Xue, C.; Qiao, Y.; Murray, N. Enabling Human-Robot-Interaction for Remote Robotic Operation via Augmented Reality. In Proceedings of the 2020 IEEE 21st International Symposium on “A World of Wireless, Mobile and Multimedia Networks” (WoWMoM), Cork, Ireland, 31 August–3 September 2020; pp. 194–196.
  53. Akan, B.; Cürüklü, B.; Spampinato, G.; Asplund, L. Towards robust human robot collaboration in industrial environments. In Proceedings of the 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Osaka, Japan, 2–5 March 2010; pp. 71–72.
  54. Chen, H.; Leu, M.C.; Tao, W.; Yin, Z. Design of a Real-time Human-robot Collaboration System Using Dynamic Gestures. In Proceedings of the ASME International Mechanical Engineering Congress and Exposition (IMECE), Virtual, Online, 16–19 November 2020.
  55. Angleraud, A.; Houbre, Q.; Netzev, M.; Pieters, R. Cognitive Semantics For Dynamic Planning In Human-Robot Teams. In Proceedings of the 2019 IEEE 15th International Conference on Automation Science and Engineering (CASE), Vancouver, BC, Canada, 22–26 August 2019; pp. 942–947.
  56. Tirmizi, A.; De Cat, B.; Janssen, K.; Pane, Y.; Leconte, P.; Witters, M. User-Friendly Programming of Flexible Assembly Applications with Collaborative Robots. In Proceedings of the 2019 20th International Conference on Research and Education in Mechatronics (REM), Wels, Austria, 23–24 May 2019; pp. 1–7.
  57. Maurtua, I.; Fernandez, I.; Kildal, J.; Susperregi, L.; Tellaeche, A.; Ibarguren, A. Enhancing safe human-robot collaboration through natural multimodal communication. In Proceedings of the 2016 IEEE 21st International Conference on Emerging Technologies and Factory Automation (ETFA), Berlin, Germany, 6–9 September 2016; pp. 1–8.
  58. Moniri, M.M.; Valcarcel, F.A.E.; Merkel, D.; Sonntag, D. Human gaze and focus-of-attention in dual reality human-robot collaboration. In Proceedings of the 2016 12th International Conference on Intelligent Environments (IE), London, UK, 14–16 September 2016; pp. 238–241.
  59. Jiang, B.C.; Gainer, C.A. A cause-and-effect analysis of robot accidents. J. Occup. Accid. 1987, 9, 27–45.
  60. Carbonero, F.; Ernst, E.; Weber, E. Robots Worldwide: The Impact of Automation on Employment and Trade; International Labour Organization: Geneva, Switzerland, 2020.
  61. Vysocky, A.; Novak, P. Human–Robot collaboration in industry. Sci. J. 2016, 2016, 903–906.
  62. Probst, L.; Frideres, L.; Pedersen, B.; Caputi, C. Service innovation for smart industry: Human–robot collaboration. Eur. Comm. Luxemb. 2015. Available online: https://ec.europa.eu/docsroom/documents/13392/attachments/4/translations/en/renditions/native (accessed on 16 April 2021).
  63. Matheson, E.; Minto, R.; Zampieri, E.; Faccio, M.; Rosati, G. Human–Robot Collaboration in Manufacturing Applications: A Review. Robotics 2019, 8, 100.
  64. Breazeal, C.; Dautenhahn, K.; Kanda, T. Social robotics. In Springer Handbook of Robotics; Springer: Cham, Switzerland, 2016; pp. 1935–1972.
  65. Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot. 2018, 3.
  66. Feil-Seifer, D.; Mataric, M.J. Defining socially assistive robotics. In Proceedings of the 9th International Conference on Rehabilitation Robotics, ICORR, Chicago, IL, USA, 28 June–1 July 2005; pp. 465–468.
  67. Vanderborght, B. Unlocking the Potential of Industrial Human–Robot Collaboration: A Vision on Industrial Collaborative Robots for Economy and Society. 2019. Available online: https://ec.europa.eu/info/publications/unlocking-potential-industrial-human-robot-collaboration_en (accessed on 5 April 2021).
  68. Probst, L.; Pedersen, B.; Lefebvre, V.; Dakkak, L. USA-China-EU plans for AI: Where do we stand. In Digital Transformation Monitor of the European Commission; 2018; Available online: https://ati.ec.europa.eu/reports/technology-watch/usa-china-eu-plans-ai-where-do-we-stand-0 (accessed on 12 April 2021).
  69. Osterrieder, P.; Budde, L.; Friedli, T. The smart factory as a key construct of industry 4.0: A systematic literature review. Int. J. Prod. Econ. 2020, 221, 107476.
  70. Hozdić, E. Smart factory for industry 4.0: A review. J. Mod. Manuf. Syst. Technol. 2015, 7, 28–35.
  71. Amodei, D.; Olah, C.; Steinhardt, J.; Christiano, P.; Schulman, J.; Mané, D. Concrete Problems in AI Safety. arXiv 2016, arXiv:1606.06565.
  72. Karnouskos, S.; Sinha, R.; Leitão, P.; Ribeiro, L.; Strasser, T.I. The applicability of ISO/IEC 25023 measures to the integration of agents and automation systems. In Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, Washington, DC, USA, 21–23 October 2018; pp. 2927–2934.
  73. Zeyu, H.; Geming, X.; Zhaohang, W.; Sen, Y. Survey on Edge Computing Security. In Proceedings of the 2020 International Conference on Big Data, Artificial Intelligence and Internet of Things Engineering (ICBAIE), Fuzhou, China, 12–14 June 2020; pp. 96–105.
  74. Xiao, Y.; Jia, Y.; Liu, C.; Cheng, X.; Yu, J.; Lv, W. Edge computing security: State of the art and challenges. Proc. IEEE 2019, 107, 1608–1631.
  75. Endika, G.U.; Víctor, M.V.; Oxel, U.; Nuria, G.; Unai, A.; Juan, M. The Week of Universal Robots’ Bugs. 2020. Available online: https://news.aliasrobotics.com/week-of-universal-robots-bugs-exposing-insecurity/ (accessed on 25 April 2021).
Figure 1. Robots’ and workers’ workspaces.
Figure 2. Collaboration levels adapted for this study: (a) Coexistence; (b) Cooperation; (c) Collaboration.
Figure 3. Flow diagram overview of consecutive stages and results at each stage.
Figure 4. Distribution by year of articles included in the review.
Figure 5. Percentage of different kinds of sensors/devices used for HRC.
Table 1. Queries used in search and matching article count in each database.
Database | Query | Results
Scopus | TITLE-ABS-KEY ((“human robot collaboration” OR “HRC”) AND (“smart manufacturing” OR “smart factories” OR “industrial environment” OR “factory”)) | 177
Web of Science | TOPIC: (((“human robot collaboration” OR “HRC”) AND (“smart manufacturing” OR “smart factories” OR “industrial environment” OR “factory”))) | 74
Total | | 251
After removing duplicates | | 193
Table 2. Extracted elements.
Study | Sensors/Devices Used for HRC | Algorithms for HRC | Applied Application | Collaboration Level | Safety Action | Standard
[13] | 3D cameras | Human detection/tracking | Generic assembly line mockup | Collaboration | SRMS | Not mentioned
[14] | Radars | Human detection/tracking | Not mentioned | Collaboration | SSM | ISO 10218, ISO/TS 15066
[15] | Ultrasonic sensors | Human detection/tracking | Collaborative assembly of a toy car | Collaboration | SSM | ISO 13855, ISO/TS 15066
[16] | VR/AR | Specific for training operators for HRC | Virtual safety training | Collaboration | SSM, HG | ISO 10218-2, ISO/TS 15066
[17] | Wearables, haptic feedback, 3D cameras | Human physiology detection/recognition | Collaborative assembly task | Collaboration | None | Not mentioned
[18] | VR/AR, haptic feedback | Specific for training operators for HRC | Collaborative assembly tasks | Collaboration | None | Not mentioned
[19] | Force/tactile sensors | Force detection/recognition | Car assembly | Collaboration | PFL | ISO/TS 15066, ISO 10218
[20] | 3D cameras, laser scanners | Human detection/tracking, gesture recognition, motion planning/collision avoidance | Gearbox assembly station | Collaboration | SSM, SRMS | Not mentioned
[21] | Force/tactile sensors | Force detection/recognition | Mechanical joining processes | Collaboration | HG, SRMS | ISO 10218-1
[22] | 3D cameras, laser scanners, wearables | Human detection/tracking, gesture recognition | Gearbox assembly | Collaboration | SSM, PFL, HG | ISO 10218, ISO/TS 15066
[23] | Force/tactile sensors | Force detection/recognition | Palletization task | Collaboration, Cooperation | PFL | Not mentioned
[24] | Wearables, 3D cameras | Human physiology detection/recognition, human detection/tracking | Pick and place tasks | Collaboration, Cooperation | None | Not mentioned
[25] | Force/tactile sensors | Force detection/recognition | Wooden box assembly | Collaboration, Cooperation | None | Not mentioned
[26] | 3D cameras | Gesture recognition | Lab demo | Collaboration, Cooperation | SSM, SRMS, HG, PFL | ISO/TS 15066
[27] | 3D cameras | Motion planning/collision avoidance | Lab demo | Collaboration, Cooperation | SSM | ISO/TS 15066
[28] | SafetyEYE, capacitive sensor | Human detection/tracking | Brakes assembly-Twin Engine assembly | Collaboration, Coexistence | SSM, SRMS | ISO 10218, ISO/TS 15066, ISO 9001
[29] | Force/tactile sensors | Human detection/tracking | Packaging process | Cooperation | SSM | Not mentioned
[30] | Wearables | Human physiology detection | Not mentioned | Cooperation | None | Not mentioned
[31] | 3D cameras | Human detection/tracking | An industrial assembly station mockup | Cooperation | SRMS | Not mentioned
[32] | Force/tactile sensors | Human detection/tracking | Task of inserting metallic items on the monocoque | Cooperation | HG, PFL | Not mentioned
[33] | 3D cameras | Motion planning/collision avoidance | Not mentioned | Cooperation | SSM | ISO 10218-1/2, ISO/TS 15066
[34] | Wearables, 3D cameras | Human physiology detection | Assembly tasks | Cooperation | SSM | Not mentioned
[35] | 3D cameras | Human detection/tracking, motion planning/collision avoidance | Not mentioned | Cooperation | SSM | Not mentioned
[36] | Force/tactile sensors | Force detection/recognition | Assembly of the drive module and the base plate of a vehicle | Cooperation | PFL | Not mentioned
[37] | Force/tactile sensors, laser scanner | Human detection/tracking | Mockup vehicle door assembly | Cooperation | SRMS | Not mentioned
[38] | 3D cameras, force/tactile sensors | Speech recognition/synthesized speech, gesture recognition, force detection/recognition, human detection/tracking | Demos in industrial fairs and exhibitions (TECHNISHOW-BIEMH) | Cooperation | HG, SSM | ISO 10218-1/2, ISO/TS 15066
[39] | Laser scanners | Human detection/tracking | Pick and place operation on two conveyor belts | Cooperation, Coexistence | SSM, SRMS | ISO 10218-2, ISO/TS 15066, ISO 13855
[40] | Wearables, infrared sensors/thermal cameras | Motion planning/collision avoidance | Not mentioned | Cooperation, Coexistence | SSM, SRMS | Not mentioned
[41] | Laser scanners | Human detection/tracking | Assembly line work for vehicle assembly | Cooperation, Coexistence | HG | ISO 10218-1
[42] | 3D cameras | Human detection/tracking | Generic assembly line mockup | Cooperation, Coexistence | SSM, SRMS, HG | ISO 10218-1/2, ISO/TS 15066
[43] | 3D cameras | Human detection/tracking | Not mentioned | Cooperation, Coexistence | SRMS | Not mentioned (ISO/TS 15066 in future)
[44] | 3D cameras | Gesture recognition | Assembly of a vehicle’s front axle | Cooperation, Coexistence | SSM | Not mentioned
[45] | 3D cameras, 2D cameras | Human detection/tracking | Cylinder head assembly for combustion engines; steam cooker parts assembly | Cooperation, Coexistence | SSM | ISO 10218
[46] | 3D cameras | Human detection/tracking | Not mentioned | Coexistence | SRMS | Not mentioned
[47] | 3D cameras | Human detection/tracking | Not mentioned | Coexistence | None | Not mentioned
[48] | 3D cameras, 2D cameras, infrared sensors/thermal cameras, laser scanners | Human detection/tracking, motion planning/collision avoidance | Robot cell with delta picker; manual workbench with augmented reality support | Coexistence | SSM, SRMS | Not mentioned
[49] | Force/tactile sensors | Force detection/recognition | Mockup in lab | Coexistence | HG, PFL | Not mentioned
[50] | 2D cameras | Human detection/tracking, gesture recognition | Assembly cell | Coexistence | SSM | Not mentioned
[51] | 3D cameras | Human detection/tracking, facial recognition | Pick and place task | Coexistence | SSM, SRMS | ISO/TS 15066
[52] | AR/VR, wearables | Gesture recognition | Pick and place operation | Interface Method | None | Not mentioned
[53] | Microphone/speakers | Speech recognition/synthesized speech | Simple pick and place applications | Interface Method | None | Not mentioned
[54] | 2D cameras | Gesture recognition | Not mentioned | Interface Method | None | Not mentioned
[55] | Microphone/speakers | Speech recognition/synthesized speech | Not mentioned | Interface Method | None | Not mentioned
[56] | Microphone/speakers | Speech recognition/synthesized speech | Air compressor assembly | Interface Method | None | Not mentioned
[57] | Microphone/speakers, 3D cameras | Speech recognition/synthesized speech, gesture recognition | Die assembly and deburring of wax pieces | Interface Method | None | Not mentioned
[58] | VR/AR, 3D cameras | Human physiology detection, facial recognition | Pick and place task | Interface Method | HG | Not mentioned