Perception and Computation for Speed and Separation Monitoring Architectures
Abstract
1. Introduction
1.1. Safety-Rated Monitored Stop
1.2. Power and Force Limiting
1.3. Hand Guiding
1.4. Speed and Separation Monitoring
1.5. HRC Collaboration Method Trade-Offs
2. Perception Technologies
2.1. IR Sensors
2.2. LiDAR Sensors
2.3. Time-of-Flight Sensors
2.4. Radar Sensors
2.5. Vision Sensors
2.5.1. Stereo Vision
2.5.2. Mono Vision
2.6. Thermal Vision
3. Speed and Separation Monitoring Architecture
3.1. Perception System Mounting
3.1.1. Off-Robot
3.1.2. On-Robot
3.2. Perception Sensor Performance
3.2.1. Sample Rate
3.2.2. Coverage
3.2.3. Point Density
3.2.4. Calibration
3.3. Computation
4. Materials and Methods
5. Current Trends and Limitations of SSM
5.1. Perception Trends
5.2. Computational Trends
5.3. Scope Limitations
5.4. Technical Limitations
6. Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations

Abbreviation | Definition
---|---
MDPI | Multidisciplinary Digital Publishing Institute
SSM | Speed and Separation Monitoring
PFL | Power and Force Limiting
ISO | International Organization for Standardization
ToF | Time-of-Flight
IR | Infrared
HRC | Human-Robot Collaboration
HRI | Human-Robot Interaction
TCP | Tool Center Point
SBC | Single-Board Computer
SOM | System on Module
SOC | System on Chip
IoT | Internet of Things
ADC | Analog-to-Digital Converter
SVD | Singular Value Decomposition
AGV | Automated Guided Vehicle
AGR | Autonomous Guided Robot
AoA | Angle of Arrival
Appendix A
Reference | Category | Hardware |
---|---|---
Zhang, Chenyang; et al. [76] | Monovision, Stereovision | PC |
Rakhmatulin, Viktor; et al. [104] | Monovision, MOCAP | Embedded, Baremetal |
Ubezio, Barnaba; et al. [62] | Radar | PC |
Podgorelec, David; et al. [27] | – | Embedded |
Flowers, Jared; et al. [75] | Stereovision | PC |
Rashid, Aquib; et al. [25] | LiDAR, Stereovision | PC |
Tsuji, Satoshi; et al. [53] | 1DTOF | Baremetal |
Tsuji, Satoshi [52] | 1DTOF | Baremetal |
Amaya-Mejía, Lina María; et al. [78] | 3DTOF | PC |
Yang, Botao; et al. [43] | Thermal, Stereovision | PC |
Sifferman, Carter; et al. [95] | 1DTOF | Baremetal |
Karagiannis, Panagiotis; et al. [73] | Stereovision | PC, PLC |
Lacevic, Bakir; et al. [42] | 3DTOF | PC, Embedded |
Park, Jinha; et al. [21] | 3DTOF, LiDAR | PC |
Ubezio, Barnaba; et al. [65] | Radar | PC |
Costanzo, Marco; et al. [72] | Thermal, Monovision, Stereovision | PC |
Scibilia, Adriano; et al. [5] | – | – |
Lucci, Niccolo; et al. [35] | 3DTOF | PC |
Rashid, Aquib; et al. [17] | LiDAR, Monovision | PC |
Du, Guanglong; et al. [41] | 3DTOF, Monovision | PC |
Tsuji, Satoshi; et al. [50] | 1DTOF | Baremetal |
Glogowski, Paul; et al. [116] | 3DTOF | PC |
Svarny, Petr; et al. [71] | Monovision, Stereovision | – |
Antão, Liliana; et al. [70] | Stereovision | PC |
Kumar, Shitij; et al. [22] | 1DTOF | PC, Baremetal |
Benli, Emrah; et al. [85] | Thermal, Stereovision | PC |
Lemmerz, Kai; et al. [117] | 3DTOF, Monovision | PC |
Kumar, Shitij; et al. [51] | 1DTOF | PC, Baremetal |
Hughes, Dana; et al. [12] | 1DTOF | PC, Baremetal |
Marvel, Jeremy A.; et al. [18] | LiDAR | PC |
Zanchettin, Andrea Maria; et al. [37] | 3DTOF | PC |
Marvel, Jeremy A. [13] | LiDAR, Stereovision, MOCAP | PC |
Tan, Jeffrey Too Chuan; et al. [68] | Stereovision | PC |
Lacevic, Bakir; et al. [118] | – | – |
References
- Barata, J.; Kayser, I. Industry 5.0—Past, Present, and Near Future. Procedia Comput. Sci. 2023, 219, 778–788. [Google Scholar] [CrossRef]
- Subramanian, K.; Singh, S.; Namba, J.; Heard, J.; Kanan, C.; Sahin, F. Spatial and Temporal Attention-Based Emotion Estimation on HRI-AVC Dataset. In Proceedings of the 2023 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Honolulu, HI, USA, 1–4 October 2023; pp. 4895–4900. [Google Scholar] [CrossRef]
- Namba, J.R.; Subramanian, K.; Savur, C.; Sahin, F. Database for Human Emotion Estimation Through Physiological Data in Industrial Human-Robot Collaboration. In Proceedings of the 2023 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Honolulu, HI, USA, 1–4 October 2023; pp. 4901–4907. [Google Scholar] [CrossRef]
- ISO/TS 15066:2016(en); Robots and Robotic Devices—Collaborative Robots. ISO-International Organization for Standardization: Geneva, Switzerland, 2016.
- Scibilia, A.; Valori, M.; Pedrocchi, N.; Fassi, I.; Herbster, S.; Behrens, R.; Saenz, J.; Magisson, A.; Bidard, C.; Kuhnrich, M.; et al. Analysis of Interlaboratory Safety Related Tests in Power and Force Limited Collaborative Robots. IEEE Access 2021, 9, 80873–80882. [Google Scholar] [CrossRef]
- Kuka. LBR iiwa. 2023. Available online: https://www.kuka.com/en-us/products/robotics-systems/industrial-robots/lbr-iiwa (accessed on 29 June 2024).
- ABB. Product Specification-IRB 14000. 2015. Available online: https://library.e.abb.com/public/5f8bca51d2b541709ea5d4ef165e46ab/3HAC052982%20PS%20IRB%2014000-en.pdf (accessed on 29 June 2024).
- UR10e Medium-Sized, Versatile Cobot. Available online: https://www.universal-robots.com/products/ur10-robot/ (accessed on 29 June 2024).
- myUR. 2019. Available online: https://myur.universal-robots.com/manuals/content/SW_5_14/Documentation%20Menu/Software/Introduction/Freedrive (accessed on 29 June 2024).
- Sharp. GP2Y0A21YK0F. Available online: https://global.sharp/products/device/lineup/data/pdf/datasheet/gp2y0a21yk_e.pdf (accessed on 29 June 2024).
- Buizza Avanzini, G.; Ceriani, N.M.; Zanchettin, A.M.; Rocco, P.; Bascetta, L. Safety Control of Industrial Robots Based on a Distributed Distance Sensor. IEEE Trans. Control Syst. Technol. 2014, 22, 2127–2140. [Google Scholar] [CrossRef]
- Hughes, D.; Lammie, J.; Correll, N. A Robotic Skin for Collision Avoidance and Affective Touch Recognition. IEEE Robot. Autom. Lett. 2018, 3, 1386–1393. [Google Scholar] [CrossRef]
- Marvel, J.A. Performance metrics of speed and separation monitoring in shared workspaces. IEEE Trans. Autom. Sci. Eng. 2013, 10, 405–414. [Google Scholar] [CrossRef]
- McManamon, P. LiDAR Technologies and Systems; SPIE Press: Bellingham, WA, USA, 2019. [Google Scholar]
- Horaud, R.; Hansard, M.; Evangelidis, G.; Ménier, C. An overview of depth cameras and range scanners based on time-of-flight technologies. Mach. Vis. Appl. 2016, 27, 1005–1020. [Google Scholar] [CrossRef]
- Zlatanski, M.; Sommer, P.; Zurfluh, F.; Madonna, G.L. Radar Sensor for Fenceless Machine Guarding and Collaborative Robotics. In Proceedings of the 2018 International Conference on Intelligence and Safety for Robotics (ISR 2018), Shenyang, China, 24–27 August 2018; pp. 19–25. [Google Scholar] [CrossRef]
- Rashid, A.; Peesapati, K.; Bdiwi, M.; Krusche, S.; Hardt, W.; Putz, M. Local and Global Sensors for Collision Avoidance. In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Virtual, 14–16 September 2020; pp. 354–359. [Google Scholar] [CrossRef]
- Marvel, J.A.; Roger, B. Test Methods for the Evaluation of Manufacturing Mobile Manipulator Safety. J. Robot. Mechatron. 2016, 28, 199–214. [Google Scholar]
- Marvel, J.A.; Norcross, R. Implementing speed and separation monitoring in collaborative robot workcells. Robot. Comput. Integr. Manuf. 2017, 44, 144–155. [Google Scholar] [CrossRef]
- Byner, C.; Matthias, B.; Ding, H. Dynamic speed and separation monitoring for collaborative robot applications–Concepts and performance. Robot. Comput. Integr. Manuf. 2019, 58, 239–252. [Google Scholar] [CrossRef]
- Park, J.; Sorensen, L.C.; Mathiesen, S.F.; Schlette, C. A Digital Twin-based Workspace Monitoring System for Safe Human-Robot Collaboration. In Proceedings of the 2022 10th International Conference on Control, Mechatronics and Automation (ICCMA 2022), Luxembourg, 9–12 November 2022; pp. 24–30. [Google Scholar] [CrossRef]
- Kumar, S.; Arora, S.; Sahin, F. Speed and separation monitoring using on-robot time-of-flight laser-ranging sensor arrays. In Proceedings of the IEEE International Conference on Automation Science and Engineering, Vancouver, BC, Canada, 22–26 August 2019; IEEE Computer Society: Washington, DC, USA, 2019; pp. 1684–1691. [Google Scholar] [CrossRef]
- Zlatanski, M.; Sommer, P.; Zurfluh, F.; Zadeh, S.G.; Faraone, A.; Perera, N. Machine Perception Platform for Safe Human-Robot Collaboration. In Proceedings of the 2019 IEEE SENSORS, Montreal, QC, Canada, 27–30 October 2019; pp. 1–4. [Google Scholar] [CrossRef]
- Rashid, A.; Bdiwi, M.; Hardt, W.; Putz, M.; Ihlenfeldt, S. Efficient Local and Global Sensing for Human Robot Collaboration with Heavy-duty Robots. In Proceedings of the 2021 IEEE International Symposium on Robotic and Sensors Environments (ROSE), Virtually, 28–29 October 2021; pp. 1–7. [Google Scholar] [CrossRef]
- Rashid, A.; Alnaser, I.; Bdiwi, M.; Ihlenfeldt, S. Flexible sensor concept and an integrated collision sensing for efficient human-robot collaboration using 3D local global sensors. Front. Robot. AI 2023, 10, 1028411. [Google Scholar] [CrossRef]
- Kim, E.; Yamada, Y.; Okamoto, S.; Sennin, M.; Kito, H. Considerations of potential runaway motion and physical interaction for speed and separation monitoring. Robot. Comput. Integr. Manuf. 2021, 67, 102034. [Google Scholar] [CrossRef]
- Podgorelec, D.; Uran, S.; Nerat, A.; Bratina, B.; Pečnik, S.; Dimec, M.; žaberl, F.; žalik, B.; šafarič, R. LiDAR-Based Maintenance of a Safe Distance between a Human and a Robot Arm. Sensors 2023, 23, 4305. [Google Scholar] [CrossRef] [PubMed]
- Arora, S.; Subramanian, K.; Adamides, O.; Sahin, F. Using Multi-channel 3D Lidar for Safe Human-Robot Interaction. In Proceedings of the 2024 IEEE 20th International Conference on Automation Science and Engineering (CASE), Bari, Italy, 28 August–1 September 2024; pp. 1823–1830. [Google Scholar] [CrossRef]
- Adamides, O.A.; Avery, A.; Subramanian, K.; Sahin, F. Evaluation of On-Robot Depth Sensors for Industrial Robotics. In Proceedings of the 2023 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Honolulu, HI, USA, 1–4 October 2023; pp. 1014–1021. [Google Scholar] [CrossRef]
- Li, L. Time-of-Flight Camera—An Introduction; Texas Instruments: Dallas, TX, USA, 2014. [Google Scholar]
- Microsoft. Azure Kinect DK Hardware Specifications|Microsoft Learn. 2022. Available online: https://learn.microsoft.com/en-us/previous-versions/azure/kinect-dk/hardware-specification (accessed on 29 June 2024).
- Adamides, O.A.; Modur, A.S.; Kumar, S.; Sahin, F. A time-of-flight on-robot proximity sensing system to achieve human detection for collaborative robots. In Proceedings of the IEEE International Conference on Automation Science and Engineering, Vancouver, BC, Canada, 22–26 August 2019; IEEE Computer Society: Washington, DC, USA, 2019; pp. 1230–1236. [Google Scholar] [CrossRef]
- Ostermann, B. Biomechanical Requirements for Collaborative Robots in the Medical Field. Master’s Thesis, RWTH Aachen University, Aachen, Germany, 2009. Available online: https://www.dguv.de/medien/ifa/de/fac/kollaborierende_roboter/medizin_biomech_anforderungen/master_thesis_bjoern_ostermann.pdf (accessed on 29 June 2024).
- Vicentini, F.; Pedrocchi, N.; Giussani, M.; Molinari Tosatti, L. Dynamic safety in collaborative robot workspaces through a network of devices fulfilling functional safety requirements. In Proceedings of the ISR/Robotik 2014: 41st International Symposium on Robotics, Munich, Germany, 2–3 June 2014; pp. 1–7. [Google Scholar]
- Lucci, N.; Lacevic, B.; Zanchettin, A.M.; Rocco, P. Combining speed and separation monitoring with power and force limiting for safe collaborative robotics applications. IEEE Robot. Autom. Lett. 2020, 5, 6121–6128. [Google Scholar] [CrossRef]
- Andersen, M.R.; Jensen, T.; Lisouski, P.; Mortensen, A.K.; Hansen, M.K.; Gregersen, T.; Ahrendt, P. Kinect Depth Sensor Evaluation for Computer Vision Applications; Aarhus University: Copenhagen, Denmark, 2012; pp. 1–37. [Google Scholar]
- Zanchettin, A.M.; Ceriani, N.M.; Rocco, P.; Ding, H.; Matthias, B. Safety in human-robot collaborative manufacturing environments: Metrics and control. IEEE Trans. Autom. Sci. Eng. 2016, 13, 882–893. [Google Scholar] [CrossRef]
- Parigi Polverini, M.; Zanchettin, A.M.; Rocco, P. A computationally efficient safety assessment for collaborative robotics applications. Robot. Comput. Integr. Manuf. 2017, 46, 25–37. [Google Scholar] [CrossRef]
- Rosenstrauch, M.J.; Pannen, T.J.; Krüger, J. Human robot collaboration-using kinect v2 for ISO/TS 15066 speed and separation monitoring. Procedia Cirp 2018, 76, 183–186. [Google Scholar] [CrossRef]
- Andres, C.P.C.; Hernandez, J.P.L.; Baldelomar, L.T.; Martin, C.D.F.; Cantor, J.P.S.; Poblete, J.P.; Raca, J.D.; Vicerra, R.R.P. Tri-modal speed and separation monitoring technique using static-dynamic danger field implementation. In Proceedings of the 2018 IEEE 10th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM 2018), Baguio City, Philippines, 29 November–2 December 2018. [Google Scholar] [CrossRef]
- Du, G.; Liang, Y.; Yao, G.; Li, C.; Murat, R.J.; Yuan, H. Active Collision Avoidance for Human-Manipulator Safety. IEEE Access 2020, 10, 16518–16529. [Google Scholar] [CrossRef]
- Lacevic, B.; Zanchettin, A.M.; Rocco, P. Safe Human-Robot Collaboration via Collision Checking and Explicit Representation of Danger Zones. IEEE Trans. Autom. Sci. Eng. 2022, 20, 846–861. [Google Scholar] [CrossRef]
- Yang, B.; Xie, S.; Chen, G.; Ding, Z.; Wang, Z. Dynamic Speed and Separation Monitoring Based on Scene Semantic Information. J. Intell. Robot. Syst. 2022, 106, 35. [Google Scholar] [CrossRef]
- Microsoft. HoloLens 2 Hardware. 2023. Available online: https://learn.microsoft.com/en-us/hololens/hololens2-hardware (accessed on 12 March 2025).
- Subramanian, K.; Arora, S.; Adamides, O.; Sahin, F. Using Mixed Reality for Safe Physical Human-Robot Interaction. In Proceedings of the 2024 IEEE Conference on Telepresence, Pasadena, CA, USA, 16–17 November 2024; pp. 225–229. [Google Scholar] [CrossRef]
- ORBBEC. Femto Bolt. 2023. Available online: https://www.orbbec.com/products/tof-camera/femto-bolt/ (accessed on 8 March 2025).
- ORBBEC. Broadening the Application and Accessibility of 3D Vision. 2023. Available online: https://www.orbbec.com/microsoft-collaboration/ (accessed on 8 March 2025).
- Tsuji, S.; Kohama, T. Proximity Skin Sensor Using Time-of-Flight Sensor for Human Collaborative Robot. IEEE Sens. J. 2019, 19, 5859–5864. [Google Scholar] [CrossRef]
- Tsuji, S.; Kohama, T. Sensor Module Combining Time-of-Flight with Self-Capacitance Proximity and Tactile Sensors for Robot. IEEE Sens. J. 2022, 22, 858–866. [Google Scholar] [CrossRef]
- Tsuji, S.; Kohama, T. A General-Purpose Safety Light Curtain Using ToF Sensor for End Effector on Human Collaborative Robot. IEEJ Trans. Electr. Electron. Eng. 2020, 15, 1868–1874. [Google Scholar] [CrossRef]
- Kumar, S.; Savur, C.; Sahin, F. Dynamic Awareness of an Industrial Robotic Arm Using Time-of-Flight Laser-Ranging Sensors. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC 2018), Miyazaki, Japan, 7–10 October 2018; pp. 2850–2857. [Google Scholar] [CrossRef]
- Tsuji, S. String-Like Time of Flight Sensor Module for a Collaborative Robot. IEEJ Trans. Electr. Electron. Eng. 2023, 18, 1576–1582. [Google Scholar] [CrossRef]
- Tsuji, S.; Kohama, T. Proximity and Tactile Sensor Combining Multiple ToF Sensors and a Self-Capacitance Proximity and Tactile Sensor. IEEJ Trans. Electr. Electron. Eng. 2023, 18, 797–805. [Google Scholar] [CrossRef]
- Arducam. Time of Flight (ToF) Camera for Raspberry Pi. Available online: https://www.arducam.com/time-of-flight-camera-raspberry-pi/ (accessed on 12 March 2025).
- Rinaldi, A.; Menolotto, M.; Kelly, D.; Torres-Sanchez, J.; O’Flynn, B.; Chiaberge, M. Assessing Latency Cascades: Quantify Time-to-Respond Dynamics in Human-Robot Collaboration for Speed and Separation Monitoring. In Proceedings of the 2024 Smart Systems Integration Conference and Exhibition (SSI), Hamburg, Germany, 16–18 April 2024; pp. 1–6. [Google Scholar] [CrossRef]
- Iovescu, C.; Rao, S. The Fundamentals of Millimeter Wave Radar Sensors. 2020. Available online: https://www.ti.com/lit/wp/spyy005a/spyy005a.pdf?ts=1737121458941&ref_url=https%253A%252F%252Fwww.google.com%252F (accessed on 17 January 2025).
- mmWave Radar Sensors|TI.com. Available online: https://www.ti.com/sensors/mmwave-radar/overview.html (accessed on 12 March 2025).
- IWR6843AOP Data Sheet, Product Information and Support|TI.com. Available online: https://www.ti.com/product/IWR6843AOP (accessed on 12 March 2025).
- Radar Sensors. Available online: https://www.d3embedded.com/product-category/radar-sensors/ (accessed on 12 March 2025).
- TI. xWRL6432 MMWAVE-L-SDK: 2D Capon Beamforming. Available online: https://software-dl.ti.com/ra-processors/esd/MMWAVE-L-SDK/05_04_00_01/exports/api_guide_xwrL64xx/CAPON_BEAMFORMING_2D.html (accessed on 12 March 2025).
- Wang, G.; Munoz-Ferreras, J.M.; Gu, C.; Li, C.; Gomez-Garcia, R. Application of linear-frequency-modulated continuous-wave (LFMCW) radars for tracking of vital signs. IEEE Trans. Microw. Theory Tech. 2014, 62, 1387–1399. [Google Scholar] [CrossRef]
- Ubezio, B.; Zangl, H.; Hofbaur, M. Extrinsic Calibration of a Multiple Radar System for Proximity Perception in Robotics. In Proceedings of the 2023 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Kuala Lumpur, Malaysia, 22–25 May 2023; pp. 1–6. [Google Scholar] [CrossRef]
- Gietler, H.; Ubezio, B.; Zangl, H. Simultaneous AMCW ToF Camera and FMCW Radar Simulation. In Proceedings of the 2023 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Kuala Lumpur, Malaysia, 22–25 May 2023; pp. 1–6. [Google Scholar] [CrossRef]
- Nimac, P.; Petrič, T.; Krpič, A.; Gams, A. Evaluation of FMCW Radar for Potential Use in SSM. In Proceedings of the Advances in Service and Industrial Robotics; Müller, A., Brandstötter, M., Eds.; Springer: Cham, Switzerland, 2022; pp. 580–588. [Google Scholar] [CrossRef]
- Ubezio, B.; Schoffmann, C.; Wohlhart, L.; Mulbacher-Karrer, S.; Zangl, H.; Hofbaur, M. Radar Based Target Tracking and Classification for Efficient Robot Speed Control in Fenceless Environments. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Prague, Czech Republic, 27 September–1 October 2021; pp. 799–806. [Google Scholar] [CrossRef]
- Moravec, H. Robot spatial perception by stereoscopic vision and 3d evidence grids. Perception 1996, 483, 484. [Google Scholar]
- Intel. Intel® RealSenseTM Product Family D400 Series. 2023. Available online: https://www.intelrealsense.com/wp-content/uploads/2024/10/Intel-RealSense-D400-Series-Datasheet-October-2024.pdf?_ga=2.253170190.609063794.1743342439-1801352430.1743342439 (accessed on 29 June 2024).
- Tan, J.T.C.; Arai, T. Triple stereo vision system for safety monitoring of human-robot collaboration in cellular manufacturing. In Proceedings of the 2011 IEEE International Symposium on Assembly and Manufacturing (ISAM), Tampere, Finland, 25–27 May 2011; pp. 1–6. [Google Scholar] [CrossRef]
- Rybski, P.; Anderson-Sprecher, P.; Huber, D.; Niessl, C.; Simmons, R. Sensor fusion for human safety in industrial workcells. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 7–12 October 2012; pp. 3612–3619, ISSN 2153-0866. [Google Scholar] [CrossRef]
- Antão, L.; Reis, J.; Gonçalves, G. Voxel-based Space Monitoring in Human-Robot Collaboration Environments. In Proceedings of the IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Zaragoza, Spain, 10–13 September 2019; pp. 552–559. [Google Scholar] [CrossRef]
- Svarny, P.; Tesar, M.; Behrens, J.K.; Hoffmann, M. Safe physical HRI: Toward a unified treatment of speed and separation monitoring together with power and force limiting. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Macau, China, 3–8 November 2019; pp. 7580–7587. [Google Scholar] [CrossRef]
- Costanzo, M.; Maria, G.D.; Lettera, G.; Natale, C. A Multimodal Approach to Human Safety in Collaborative Robotic Workcells. IEEE Trans. Autom. Sci. Eng. 2021, 19, 1202–1216. [Google Scholar] [CrossRef]
- Karagiannis, P.; Kousi, N.; Michalos, G.; Dimoulas, K.; Mparis, K.; Dimosthenopoulos, D.; Tokçalar, Ö.; Guasch, T.; Gerio, G.P.; Makris, S. Adaptive speed and separation monitoring based on switching of safety zones for effective human robot collaboration. Robot. Comput. Integr. Manuf. 2022, 77, 102361. [Google Scholar] [CrossRef]
- Flowers, J.; Faroni, M.; Wiens, G.; Pedrocchi, N. Spatio-Temporal Avoidance of Predicted Occupancy in Human-Robot Collaboration. In Proceedings of the 2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Busan, Republic of Korea, 28–31 August 2023; pp. 2162–2168. [Google Scholar] [CrossRef]
- Flowers, J.; Wiens, G. A Spatio-Temporal Prediction and Planning Framework for Proactive Human–Robot Collaboration. J. Manuf. Sci. Eng. 2023, 145, 121011. [Google Scholar] [CrossRef]
- Zhang, C.; Peng, J.; Ding, S.; Zhao, N. Binocular Vision-based Speed and Separation Monitoring of Perceive Scene Semantic Information. In Proceedings of the 2024 36th Chinese Control and Decision Conference (CCDC), Xi’an, China, 25–27 May 2024; pp. 3200–3205. [Google Scholar] [CrossRef]
- Lu, Y.F.; Shivam, K.; Hsiao, J.C.; Chen, C.C.; Chen, W.M. Enhancing Human-Machine Collaboration Safety Through Personnel Behavior Detection and Separate Speed Monitoring. In Proceedings of the 2024 International Conference on Advanced Robotics and Intelligent Systems (ARIS), Taipei, Taiwan, 22–24 August 2024; pp. 1–6. [Google Scholar] [CrossRef]
- Amaya-Mejía, L.M.; Duque-Suárez, N.; Jaramillo-Ramírez, D.; Martinez, C. Vision-Based Safety System for Barrierless Human-Robot Collaboration. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; pp. 7331–7336. [Google Scholar] [CrossRef]
- An, S.; Zhou, F.; Yang, M.; Zhu, H.; Fu, C.; Tsintotas, K.A. Real-Time Monocular Human Depth Estimation and Segmentation on Embedded Systems. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 55–62. [Google Scholar] [CrossRef]
- anshan XR-ROB. HDES-Net. 2021. Available online: https://github.com/anshan-XR-ROB/HDES-Net?tab=readme-ov-file (accessed on 9 March 2025).
- Hu, M.; Yin, W.; Zhang, C.; Cai, Z.; Long, X.; Wang, K.; Chen, H.; Yu, G.; Shen, C.; Shen, S. Metric3Dv2: A Versatile Monocular Geometric Foundation Model for Zero-shot Metric Depth and Surface Normal Estimation. IEEE Trans. Pattern Anal. Mach. Intell. 2024, 46, 10579–10596. [Google Scholar] [CrossRef]
- Bochkovskii, A.; Delaunoy, A.; Germain, H.; Santos, M.; Zhou, Y.; Richter, S.R.; Koltun, V. Depth Pro: Sharp Monocular Metric Depth in Less Than a Second. arXiv 2024, arXiv:2410.02073. [Google Scholar] [CrossRef]
- Terabee. TeraRanger Evo Thermal User Manual. 2023. Available online: https://acroname.com/sites/default/files/assets/teraranger-evo-thermal-user-manual.pdf?srsltid=AfmBOorcFGGPBEiNlHVTcAy7o8mv8zG20rtjJ1hR2HQ0ZlgVd8K-yAqd (accessed on 30 March 2025).
- Voynick, S. What is a Microbolometer? 2023. Available online: https://sierraolympia.com/what-is-a-microbolometer/ (accessed on 8 November 2024).
- Benli, E.; Spidalieri, R.L.; Motai, Y. Thermal Multisensor Fusion for Collaborative Robotics. IEEE Trans. Ind. Inform. 2019, 15, 3784–3795. [Google Scholar] [CrossRef]
- Himmelsbach, U.B.; Wendt, T.M.; Hangst, N.; Gawron, P.; Stiglmeier, L. Human–Machine Differentiation in Speed and Separation Monitoring for Improved Efficiency in Human–Robot Collaboration. Sensors 2021, 21, 7144. [Google Scholar] [CrossRef] [PubMed]
- Himmelsbach, U.B.; Wendt, T.M.; Lai, M. Towards safe speed and separation monitoring in human-robot collaboration with 3D-time-of-flight cameras. In Proceedings of the 2nd IEEE International Conference on Robotic Computing (IRC 2018), Laguna Hills, CA, USA, 31 January–2 February 2018; pp. 197–200. [Google Scholar] [CrossRef]
- Optris. PI 450i. Available online: https://optris.com/us/products/infrared-cameras/precision-line/pi-450i/ (accessed on 24 January 2025).
- Mouser. TR-EVO-T33-USB Terabee|Mouser. Available online: https://www.mouser.com/ProductDetail/Terabee/TR-EVO-T33-USB?qs=OTrKUuiFdkYKUuhq9B0%252BOA%3D%3D (accessed on 9 March 2025).
- STMicroelectronics. VL53L1X-Time-of-Flight (ToF) Ranging Sensor Based on ST’s FlightSense Technology-STMicroelectronics. Available online: https://www.st.com/en/imaging-and-photonics-solutions/vl53l1x.html (accessed on 12 March 2025).
- RS-1843A mmWAVE RADAR SENSOR EVALUATION KIT. Available online: https://www.d3embedded.com/wp-content/uploads/2020/02/D3Eng-DesignCore-RS-1843AandRS-6843-DataSheet.pdf (accessed on 12 March 2025).
- TI. mmWaveSensingEstimator. Available online: https://dev.ti.com/gallery/view/mmwave/mmWaveSensingEstimator/ver/2.4.0/ (accessed on 12 March 2025).
- D3. Social Distance Tracking Using mmWave Radar. Available online: https://www.d3embedded.com/solutions/tracking-social-distancing/ (accessed on 12 March 2025).
- Esposito, M.; O’Flaherty, R.; Li, Y.; Virga, S.; Joshi, R.; Haschke, R. IFL-CAMP/Easy_Handeye. Original-Date: 2017-06-25T20:22:05Z. 2024. Available online: https://github.com/IFL-CAMP/easy_handeye (accessed on 13 October 2024).
- Sifferman, C.; Mehrotra, D.; Gupta, M.; Gleicher, M. Geometric Calibration of Single-Pixel Distance Sensors. IEEE Robot. Autom. Lett. 2022, 7, 6598–6605. [Google Scholar] [CrossRef]
- Sifferman, C.; Wang, Y.; Gupta, M.; Gleicher, M. Unlocking the Performance of Proximity Sensors by Utilizing Transient Histograms. IEEE Robot. Autom. Lett. 2023, 8, 6843–6850. [Google Scholar] [CrossRef]
- Intel. Intel® CoreTM i9-12900K Processor (30M Cache, up to 5.20 GHz)-Product Specifications. Available online: https://www.intel.com/content/www/us/en/products/sku/134599/intel-core-i912900k-processor-30m-cache-up-to-5-20-ghz/specifications.html (accessed on 12 March 2025).
- NVIDIA. NVIDIA RTX A5000 Datasheet. Available online: https://resources.nvidia.com/en-us-briefcase-for-datasheets/nvidia-rtx-a5000-dat-1 (accessed on 12 March 2025).
- Newegg. NeweggBusiness-PNY VCNRTXA5000-PB RTX A5000 24GB 384-bit GDDR6 PCI Express 4.0 Workstation Video Card. Available online: https://www.neweggbusiness.com/product/product.aspx?item=9siv7kvjy39435&bri=9b-14-133-832 (accessed on 12 March 2025).
- Intel. Intel® CoreTM i7-920 Processor (8M Cache, 2.66 GHz, 4.80 GT/s Intel® QPI)-Product Specifications. Available online: https://www.intel.com/content/www/us/en/products/sku/37147/intel-core-i7920-processor-8m-cache-2-66-ghz-4-80-gts-intel-qpi/specifications.html (accessed on 12 March 2025).
- Amazon. Amazon.com: Intel Core i7 Processor i7-920 2.66GHz 8 MB LGA1366 CPU BX80601920: Electronics. Available online: https://www.amazon.com/Intel-Processor-2-66GHz-LGA1366-BX80601920/dp/B001H5T7LK/ref=asc_df_B001H5T7LK?mcid=cf2f78a548833789b337453383ab2693&tag=hyprod-20&linkCode=df0&hvadid=693562313188&hvpos=&hvnetw=g&hvrand=15348408733688645708&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9005654&hvtargid=pla-2007964176847&psc=1 (accessed on 12 March 2025).
- PassMark. AMD EPYC 9655P Benchmark. Available online: https://www.cpubenchmark.net/cpu.php?cpu=AMD+EPYC+9655P&id=6354 (accessed on 12 March 2025).
- Kumar, S.; Savur, C.; Sahin, F. Survey of Human-Robot Collaboration in Industrial Settings: Awareness, Intelligence, and Compliance. IEEE Trans. Syst. Man Cybern. Syst. 2021, 51, 280–297. [Google Scholar] [CrossRef]
- Rakhmatulin, V.; Grankin, D.; Konenkov, M.; Davidenko, S.; Trinitatova, D.; Sautenkov, O.; Tsetserukou, D. AirTouch: Towards Safe Human-Robot Interaction Using Air Pressure Feedback and IR Mocap System. In Proceedings of the 2023 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Honolulu, HI, USA, 1–4 October 2023; pp. 2034–2039. [Google Scholar] [CrossRef]
- IEEE. IEEE Xplorer. 2025. Available online: https://ieeexplore.ieee.org/ (accessed on 11 March 2025).
- ASME. ASME Digital Collection. 2025. Available online: https://asmedigitalcollection.asme.org/ (accessed on 11 March 2025).
- Elsevier. ScienceDirect. 2025. Available online: https://www.sciencedirect.com/ (accessed on 11 March 2025).
- Wiley. Wiley Online Library. 2025. Available online: https://onlinelibrary.wiley.com/ (accessed on 11 March 2025).
- ProQuest. ProQuest. 2025. Available online: https://www.proquest.com/ (accessed on 11 March 2025).
- Springer. SpringerLink. 2025. Available online: https://link.springer.com/ (accessed on 11 March 2025).
- Frontiers Media. Frontiers. 2025. Available online: https://www.frontiersin.org/ (accessed on 11 March 2025).
- MDPI. 2025. Available online: https://www.mdpi.com/ (accessed on 11 March 2025).
- Journal of Open Source Software. 2025. Available online: https://joss.theoj.org/ (accessed on 11 March 2025).
- Subramanian, K. Survey_SSM_Robotics. 2024. Available online: https://github.com/kxs8997/survey_SSM_robotics (accessed on 12 March 2025).
- inciteful. Available online: https://inciteful.xyz/ (accessed on 9 October 2024).
- Glogowski, P.; Lemmerz, K.; Hypki, A.; Kuhlenkotter, B. Extended calculation of the dynamic separation distance for robot speed adaption in the human-robot interaction. In Proceedings of the 2019 19th International Conference on Advanced Robotics (ICAR 2019), Belo Horizonte, Brazil, 2–6 December 2019; pp. 205–212. [Google Scholar] [CrossRef]
- Lemmerz, K.; Glogowski, P.; Kleineberg, P.; Hypki, A.; Kuhlenkötter, B. A Hybrid Collaborative Operation for Human-Robot Interaction Supported by Machine Learning. In Proceedings of the International Conference on Human System Interaction (HSI), Richmond, VA, USA, 25–27 June 2019; IEEE Computer Society: Washington, DC, USA, 2019; pp. 69–75. [Google Scholar] [CrossRef]
- Lacevic, B.; Rocco, P. Kinetostatic danger field - A novel safety assessment for human-robot interaction. In Proceedings of the IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems (IROS 2010), Taipei, Taiwan, 18–22 October 2010; pp. 2169–2174. [Google Scholar] [CrossRef]
Sensor Product | GP2Y0A21YK0F | Femto Bolt | VL53L1X | RS-6843A | RealSense D435i | TeraBee Evo Thermal 33 | Optris PI 450i | Ouster OS0
---|---|---|---|---|---|---|---|---
Sensor Type | IR | ToF | ToF | Radar | Stereo | Thermopile | Bolometer | LiDAR
Sensor Range | 0.1–0.8 m | 0.25–5.46 m | 0.04–4 m | 0.3–9.62 m | 0.02–10 m | 30–45 °C | −20–900 °C | −40–60 °C
Unit Cost | $12.95 | $415.00 | $5.77 | $599.00 | $343.75 | $95.70 | $6300.00 | $6000.00
FPS | 26 ^1 | 30 ^2 | 50 ^3 | 50 | 90 ^4 | 7 | 80 | 20
Coverage ^5 (m²) | 0.27 | 17.28 | 0.26 | 1.5 | 3.03 | 0.51 | 2.46 | 9.05
PD ^6 (pts/m²) | N/A | 60,681.48 | 981.79 | 1108.779 | 304,171.15 | 2026.13 | 44,673.91 | 3621.66
Typical Power | 0.165 W | 4.35 W | 0.066 W | 0.75 W | 3.5 W | 0.225 W | 2.5 W | 20 W
Full Coverage Min Sensor # | 10 | 3 | 14 | 3 | 4 | 10 | 5 | 1
Full Coverage Power | 1.65 W | 13.05 W | 0.924 W | 2.25 W | 14 W | 2.25 W | 12.5 W | 20 W
Full Coverage Cost | $129.50 | $1245.00 | $80.78 | $1797.00 | $1375.00 | $957.00 | $31,500.00 | $6000.00
Computational Platform | Known Components and Software | Cost ($) | Platform Power (W) | Processing Power (TOPS)
---|---|---|---|---
PC Specs from [74] | Intel i9 with 16 CPU cores | >$648.00 ^1 | >250 ^1 | <1.2 ^2
PC Specs from [76] | 3.33 GHz CPU and RTX A5000 GPU running Ubuntu 18.04.2 LTS | >$2099.00 ^1 | >230 ^1 | 222.2
PC Specs from [69] | 2.67 GHz Intel i7 920 quad-core running Ubuntu 10.04 LTS | >$75.00 ^1 | >130 ^1 | <1.2 ^2
Jetson Orin NX | 8-core ARM A78, 16 GB LPDDR5, Ampere GPU | $699.00 | 25 | 70
Jetson Nano NX | 6-core ARM A57, 4 GB LPDDR4, Maxwell GPU | $200.00 | 15 | 0.5
PILZ PSS4000 PLC [73] | Safety-critical microprocessors | $20,000.00 | 50 | ARM A7 and high-end microprocessor capabilities
STM32 Nucleo Board | 48 MHz ARM M0, 64 KB flash, 8 KB RAM | $11.04 | <1 | Serial and low-speed data processing only
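For platforms in the table with a numeric TOPS figure, a cost-per-TOPS ratio makes the trade-off concrete. This sketch derives that ratio from the tabulated cost and throughput values only (the dictionary keys are labels, not product claims):

```python
# Dollars per TOPS, computed from the cost and TOPS columns of the table above.
platforms = {
    "PC from [76] (RTX A5000)": (2099.00, 222.2),
    "Jetson Orin NX": (699.00, 70.0),
    "Jetson Nano NX": (200.00, 0.5),
}

cost_per_tops = {name: round(cost / tops, 2)
                 for name, (cost, tops) in platforms.items()}
# The discrete-GPU workstation and the Orin NX land within ~$0.60/TOPS of each
# other, while the Nano-class board pays a far higher price per unit of compute.
```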
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Adamides, O.; Subramanian, K.; Arora, S.; Sahin, F. Perception and Computation for Speed and Separation Monitoring Architectures. Robotics 2025, 14, 41. https://doi.org/10.3390/robotics14040041