LiDAR-Based Maintenance of a Safe Distance between a Human and a Robot Arm
Abstract
1. Introduction
2. Materials and Methods
2.1. Devices, Materials and Validation Software
2.2. Intelligent Control System
2.2.1. Geometric Data Registration
2.2.2. Robot Arm Forward Kinematics
2.2.3. Geometric Data Segmentation
2.2.4. Motion Prediction
- The intruder has just been detected, and is thus present in a single frame only. The prediction is that the intruder is moving directly towards the robot at the standard fast walking speed of 1.6 m/s [49];
- There are already two consecutive frames containing the intruder. The intruder's constant speed between the two frames is computed. The prediction is that the intruder continues moving at the same speed in the same direction;
- There are already three or more consecutive frames containing the intruder. The intruder's locations from the last three frames are used to assess how the speed and the direction of motion are changing, and these trends are extrapolated to predict the future position. The intruder's trajectory in this case is a quadratic Bézier curve, i.e., a parabolic arc (a minimal sketch of the three rules follows this list).
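A minimal sketch of the three prediction rules, assuming positions are NumPy vectors sampled once per sensor frame. The function name `predict_intruder`, its signature, and the parabolic extrapolation formula in rule 3 are illustrative choices, not the paper's exact implementation:

```python
import numpy as np

V_WALK = 1.6  # standard fast walking speed [m/s], per ISO 13855 [49]

def predict_intruder(history, robot_pos, frame_dt):
    """Predict the intruder's position one sensor frame ahead from the
    (up to three) most recent observed positions, one per frame."""
    p = history[-1]
    if len(history) == 1:
        # Rule 1: head straight towards the robot at the standard walking speed.
        d = robot_pos - p
        norm = np.linalg.norm(d)
        return p if norm == 0.0 else p + (d / norm) * V_WALK * frame_dt
    if len(history) == 2:
        # Rule 2: constant velocity -- repeat the last per-frame displacement.
        return 2.0 * history[-1] - history[-2]
    # Rule 3: quadratic (parabolic) extrapolation through the last three
    # positions, one frame ahead: p(3) for the quadratic through p(0..2).
    p0, p1, p2 = history[-3], history[-2], history[-1]
    return p0 - 3.0 * p1 + 3.0 * p2
```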
- The robot controller reports the current robot joint coordinates. ICS uses FK to translate them into the positions of the corresponding links;
- The ICS must synchronise the real-time trajectory and the stored one. In the described setup, we used a very limited repertoire of the robot's velocities (a few fixed fractions of the original speed from the robot programme), so the ICS was able to determine simply how many stored positions should be skipped from the current one;
- In the same manner, the recorded positions may be skipped to reach the “predicted” position at a selected future moment (a minimal indexing sketch follows this list).
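Because only a few fixed speed fractions occur, this bookkeeping reduces to skipping a proportional number of recorded positions. A minimal sketch under that assumption; the constant `FULL_SPEED_STEP` (recorded positions passed per sensor frame at full programmed speed) and the function `advance` are hypothetical names:

```python
FULL_SPEED_STEP = 10  # hypothetical: recorded positions per frame at full speed

def advance(current_index: int, speed_fraction: float, frames: int = 1) -> int:
    # Skip the number of recorded positions the robot covers in `frames`
    # sensor frames while running at the given fraction of its full speed.
    # frames=1 synchronises with the next real-time frame; frames>1 reaches
    # the "predicted" position several frames ahead.
    return current_index + round(FULL_SPEED_STEP * speed_fraction * frames)
```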
2.2.5. Speed and Separation Monitoring
- Determine the PSD for the current situation for all four robot speeds.
- Set D to the distance between the robot and the intruder.
- Choose the maximum PSD below D and set the robot speed related to that PSD.
- If predictions are used, then:
  - Predict the positions after the selected prediction time, using the chosen robot speed.
  - Determine the PSDs for the predicted situation for robot speeds not exceeding the chosen one.
  - Set D to the distance between the predicted positions of the robot and the intruder.
  - Choose the maximum PSD below D and update the chosen robot speed accordingly.
- Send the chosen robot speed to the robot controller (a minimal sketch of this selection loop follows).
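A minimal sketch of the selection loop, assuming hypothetical helpers: `psd(v, predicted)` returns the PSD for robot speed `v` in the current or predicted situation, `separation(predicted)` returns the corresponding robot–intruder distance D, and `predict(v)` advances the scene one step ahead at speed `v`. The four speeds are assumed sorted in descending order:

```python
def select_speed(speeds, psd, separation, predict=None):
    """Select the fastest robot speed whose PSD stays below the separation D."""
    d = separation()
    # "Maximum PSD below D": with speeds sorted descending and the PSD
    # increasing with speed, this is the first admissible speed (else stop).
    v_r = next((v for v in speeds if psd(v) < d), 0.0)
    if predict is not None:
        predict(v_r)  # predict robot and intruder positions one step ahead
        d_pred = separation(predicted=True)
        v_r = next((v for v in speeds
                    if v <= v_r and psd(v, predicted=True) < d_pred), 0.0)
    return v_r  # this value is sent to the robot controller
```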
2.3. Validation of the Protective Separation Distance Calculation
3. Results
3.1. Test Scenarios
- Scenario 1—Slow movement of the intruder towards the robot. The robot arm carried a cube-shaped load, with the tip moving at a speed of 0.2 m/s towards the intruder. The latter was moving towards the HRC workspace at approximately 0.4 m/s. When the two came close to each other, the robot first slowed down and then stopped completely. The intruder then moved away from the robot, and the robot started to move again (Figure 9c, Supplementary Video S1).
- Scenario 2—Fast movement of the intruder towards the robot. The scenario was similar to the previous one. Here, the speed of the intruder approaching the robot arm was approximately 1.6 m/s (Supplementary Video S2).
- Scenario 3—The intruder was standing in the HRC workspace and the robot arm was moving towards the intruder. This scenario extended the previous two, with a case where the speed of the intruder was zero (Supplementary Video S3).
- Scenario 4—The intruder approached the HRC workspace with his hand only. This scenario demonstrated that the ICS also responded to movements of the intruder’s body parts, not just to walking. When the intruder’s arm moved away from the robot arm, the robot programme continued (Supplementary Video S4).
3.2. Validation Results of the Protective Separation Distance Calculation
3.3. Protective Separation Distance Calculation
4. Discussion
- Grey zones. A grey zone is an area in the HRC workspace which cannot be safeguarded all the time due to obstacles between the sensor and this area.
- Inability to detect narrow objects. A human arm is a reference object, requiring that two scanning rows or columns at the operational distance should not be more than 5–6 cm apart [51].
- Safety margin violation. This critical situation can arise if an intruder suddenly appears from a grey zone, or is already present close to the robot when the ICS activates the SSM.
- Low sensor scanning speed. If the time between two consecutive sensor frames is too long, the speed and the direction of movement of the human or the robot may change between frames in such a way that it is no longer possible to stop the robot in time.
- “Out-of-range” speeds and/or weight of the load. The validation confirmed safe operation within reasonably limited ranges of robot and human speeds. The maximum robot speed and the load capacity of 25 kg were given in the FANUC M-20iD/25 robot arm specifications. However, the maximum human speed was not defined strictly, and depended on the physical limitations of the individual. At the standard fast walking speed of 1.6 m/s, the highest robot speed of 2.0 m/s (see Table 3), and a load of 25 kg, the calculated PSD ensured that the robot arm, with a reach of 1.831 m, would stop at least 32 cm from a human, which is outside the required minimum of 20 cm (Table 2). On the other hand, human speeds above the validated range do not guarantee safe operation, as the calculated PSD is often above the dimensions of the workspace, which usually results in a safety margin violation (the general ISO/TS 15066 formula behind these calculations is recalled after this list).
- Advances in models and calculations. The distances from each of the robot’s voxels to the closest intruder’s voxel could be found easily, but this would increase the computation time significantly. Therefore, only the origins of the local coordinate systems of the robot’s joints are considered in the current version of the ICS. The tests in Section 3 were additionally accelerated by considering only the TCP (see Figure 6b, Figure 9b and Equation (10)), which was indeed the robot’s point closest to the intruder in most cases. Furthermore, circular moves of the robot links were interpolated linearly, and the resulting error was compensated within the parameter C.
- False negatives. False negatives usually occur when the calculated PSD is too high, due to the oversizing of individual parameters, some of which cannot be determined accurately. For example, we used the worst-case stopping time (at the highest speed and the heaviest load). It is highly important to estimate such parameters in a manner that increases the PSD rather than decreasing it; efficiency may be sacrificed for the good of safety, while the opposite is not allowed. Note that false negatives can also appear during validation, due to the approximations made when estimating the positions of the intruder and the robot from the recordings.
- Multiple intruders. Two intruders forming a connected voxel region are identified as one. The number of intruders detected may vary through time as they move closer or further apart. This makes tracking impossible. As a consequence, the prediction mode is only useful in situations with a single intruder. Particularly dangerous are situations where an intruder suddenly appears from a grey zone behind another intruder.
- Motion prediction. The predictions improve safety slightly by forcing the robot to brake earlier and preventing it from accelerating too soon. They can also make the robot’s operation smoother and more efficient. With the current sensor capabilities, a single prediction one frame ahead is acceptable. In general, however, intermediate predictions in the interval between two frames, and predictions several frames ahead, could also be useful, depending on the sensor frame time and the speeds of the robot and the intruder.
- An overhead LiDAR scanner and/or multiple scanners represent the only reasonable way to address grey zones. This approach can also significantly improve, or even enable, the detection of multiple intruders when they are not too close to each other. The registration of data from multiple sensors is conducted in the initialisation phase. Therefore, only a slight extension of the processing time is expected, due to the merging of segmented point clouds. Of course, the sensors must be synchronised, as the point clouds to be merged are assumed to be acquired at the same time. In addition, each sensor must be able to distinguish its own reflected laser beam from the beams of other sensors. Wearable sensors are a possible alternative, but represent too large a deviation from the presented concept.
- Higher sensor resolution would improve detection of narrow objects.
- Safety margin violation can be addressed partly, together with grey zones. Besides this, the detection of potential intruders is required before the robot is started, which places additional requirements on the synchronisation of the robot and sensors.
- Higher sensor scanning speed means simply replacing the presented prototype LiDAR scanner with an off-the-shelf product. The increased frame rate would make changes between two consecutive frames more predictable. Consequently, a higher maximum intruder speed could be allowed, if reasonable given the limited workspace dimensions. Furthermore, a higher scanning speed is also a prerequisite for advanced intermediate and multiple predictions. Finally, commercial LiDAR scanners typically have an integrated IMU that could, importantly, improve the accuracy of the proposed direct registration method (Section 2.2.1).
- Improved specifications of the system parameters could reduce the number of false negatives detected.
- Most of these modifications would increase the processing time and, consequently, require a more powerful computer. The latter would also enable the use of advanced models and calculations.
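For reference, the PSD calculations discussed above instantiate the general speed and separation monitoring formula from ISO/TS 15066 [1]; the paper's concrete parametrisation is not reproduced here:

```latex
% General SSM protective separation distance (ISO/TS 15066 [1]):
%   S_h -- human contribution during robot reaction and stopping time,
%   S_r -- robot contribution during the reaction time,
%   S_s -- robot stopping distance,
%   C   -- intrusion distance (the error compensation mentioned above),
%   Z_d, Z_r -- sensor and robot position uncertainties.
S_p(t_0) = S_h + S_r + S_s + C + Z_d + Z_r
```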
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning |
---|---|
CIP | Common industrial protocol |
FK | Forward kinematics |
GPS | Global positioning system |
HRC | Human–robot collaboration |
ICS | Intelligent control system |
IMU | Inertial measurement unit |
LiDAR | Light detection and ranging |
PCL | Point cloud library |
PSD | Protective separation distance |
RGB | Red, Green, Blue |
RGB-D | Red, Green, Blue – Depth |
SSM | Speed and separation monitoring |
TCP | Tool centre point |
TIN | Triangulated irregular network |
TOF | Time-of-flight |
UDP | User datagram protocol |
References
1. ISO/TS 15066:2016; Robots and Robotic Devices—Collaborative Robots. International Organization for Standardization: Geneva, Switzerland, 2016.
2. Marvel, J.A.; Norcross, R. Implementing speed and separation monitoring in collaborative robot workcells. Robot. Comput. Integr. Manuf. 2017, 44, 144–155.
3. Hanna, A.; Larsson, S.; Götvall, P.L.; Bengtsson, K. Deliberative safety for industrial intelligent human–robot collaboration: Regulatory challenges and solutions for taking the next step towards industry 4.0. Robot. Comput. Integr. Manuf. 2022, 78, 102386.
4. Martinetti, A.; Chemweno, P.K.; Nizamis, K.; Fosch-Villaronga, E. Redefining Safety in Light of Human-Robot Interaction: A Critical Review of Current Standards and Regulations. Front. Chem. Eng. 2021, 3, 666237.
5. Zacharaki, A.; Gasteratos, A.; Kostavelis, I.; Dokas, I. Safety bounds in human robot interaction: A survey. Saf. Sci. 2020, 127, 104667.
6. Hameed, A.; Ordys, A.; Możaryn, J.; Sibilska-Mroziewicz, A. Control System Design and Methods for Collaborative Robots: Review. Appl. Sci. 2023, 13, 675.
7. Karagiannis, P.; Kousi, N.; Michalos, G.; Dimoulas, K.; Mparis, K.; Dimosthenopoulos, D.; Tokçalar, Ö.; Guasch, T.; Gerio, G.P.; Makris, S. Adaptive speed and separation monitoring based on switching of safety zones for effective human robot collaboration. Robot. Comput. Integr. Manuf. 2022, 77, 102361.
8. Vogel, C.; Walter, C.; Elkmann, N. Safeguarding and supporting future human-robot cooperative manufacturing processes by a projection- and camera-based technology. In Proceedings of the 27th International Conference on Flexible Automation and Intelligent Manufacturing (FAIM2017), Modena, Italy, 27–30 June 2017; Volume 11, pp. 39–46.
9. Malm, T.; Salmi, T.; Marstio, I.; Montonen, J. Dynamic safety system for collaboration of operators and industrial robots. Open Eng. 2019, 9, 61–71.
10. Rosenstrauch, M.J.; Pannen, T.J.; Krüger, J. Human robot collaboration—Using Kinect v2 for ISO/TS 15066 speed and separation monitoring. In Proceedings of the 7th CIRP Conference on Assembly Technologies and Systems (CATS 2018), Tianjin, China, 10–12 May 2018; Volume 76, pp. 183–186.
11. Reddy, A.; Bright, G.; Padayachee, J. A Review of Safety Methods for Human-robot Collaboration and a Proposed Novel Approach. In Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics (ICINCO), Prague, Czech Republic, 29–31 July 2019; pp. 243–248.
12. Himmelsbach, U.B.; Wendt, T.M.; Hangst, N.; Gawron, P.; Stiglmeier, L. Human–machine differentiation in speed and separation monitoring for improved efficiency in human–robot collaboration. Sensors 2021, 21, 7144.
13. Szabo, S.; Shackleford, W.; Norcross, R.; Marvel, J. A Testbed for Evaluation of Speed and Separation Monitoring in a Human Robot Collaborative Environment (NISTIR 7851); U.S. Department of Commerce, National Institute of Standards and Technology: Gaithersburg, MD, USA, 2012.
14. Himmelsbach, U.B.; Wendt, T.M.; Lai, M. Towards safe speed and separation monitoring in human-robot collaboration with 3D-time-of-flight cameras. In Proceedings of the 2018 Second IEEE International Conference on Robotic Computing (IRC), Laguna Hills, CA, USA, 31 January–2 February 2018; pp. 197–200.
15. Kumar, S.; Arora, S.; Sahin, F. Speed and separation monitoring using on-robot time-of-flight laser-ranging sensor arrays. In Proceedings of the 2019 IEEE 15th International Conference on Automation Science and Engineering (CASE), Vancouver, BC, Canada, 22–26 August 2019; pp. 1684–1691.
16. Safeea, M.; Neto, P. Minimum distance calculation using laser scanner and IMUs for safe human-robot interaction. Robot. Comput. Integr. Manuf. 2019, 58, 33–42.
17. Park, J.; Sørensen, L.C.; Mathiesen, S.F.; Schlette, C. A Digital Twin-based Workspace Monitoring System for Safe Human-Robot Collaboration. In Proceedings of the 2022 10th International Conference on Control, Mechatronics and Automation (ICCMA), Luxembourg, 9–12 November 2022; pp. 24–30.
18. Yang, B.; Xie, S.; Chen, G.; Ding, Z.; Wang, Z. Dynamic Speed and Separation Monitoring Based on Scene Semantic Information. J. Intell. Robot. Syst. 2022, 106, 35.
19. Vicentini, F.; Giussani, M.; Tosatti, L.M. Trajectory-dependent safe distances in human-robot interaction. In Proceedings of the IEEE Emerging Technology and Factory Automation (ETFA), Barcelona, Spain, 16–19 September 2014; pp. 1–4.
20. Pereira, A.; Althoff, M. Overapproximative arm occupancy prediction for human-robot co-existence built from archetypal movements. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016; pp. 1394–1401.
21. Ragaglia, M.; Zanchettin, A.M.; Rocco, P. Trajectory generation algorithm for safe human-robot collaboration based on multiple depth sensor measurements. Mechatronics 2018, 55, 267–281.
22. Močnik, G.; Kačič, Z.; Šafarič, R.; Mlakar, I. Capturing conversational gestures for embodied conversational agents using an optimized Kaneda–Lucas–Tomasi tracker and Denavit–Hartenberg-based kinematic model. Sensors 2022, 22, 8318.
23. Marvel, J.A. Performance metrics of speed and separation monitoring in shared workspaces. IEEE Trans. Autom. Sci. Eng. 2013, 10, 405–414.
24. Byner, C.; Matthias, B.; Ding, H. Dynamic speed and separation monitoring for collaborative robot applications—Concepts and performance. Robot. Comput. Integr. Manuf. 2019, 58, 239–252.
25. Balan, L.; Bone, G. Real-time 3D collision avoidance method for safe human and robot coexistence. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 276–282.
26. Rodriguez, L.; Przedworska, Z.; Obidat, O.; Parron, J.; Wang, W. Development and Implementation of an AI-Embedded and ROS-Compatible Smart Glove System in Human-Robot Interaction. In Proceedings of the 2022 IEEE 19th International Conference on Mobile Ad Hoc and Smart Systems (MASS), Denver, CO, USA, 19–23 October 2022; pp. 699–704.
27. Lasota, P.A.; Fong, T.; Shah, J.A. A Survey of Methods for Safe Human-Robot Interaction. Found. Trends Robot. 2017, 5, 261–349.
28. Cherubini, A.; Navarro-Alarcon, D. Sensor-Based Control for Collaborative Robots: Fundamentals, Challenges, and Opportunities. Front. Neurorobot. 2021, 14, 113.
29. Palleschi, A.; Hamad, M.; Abdolshah, S.; Garabini, M.; Haddadin, S.; Pallottino, L. Fast and safe trajectory planning: Solving the cobot performance/safety trade-off in human-robot shared environments. IEEE Robot. Autom. Lett. 2021, 6, 5445–5452.
30. Liu, B.; Zhao, F.; Sun, Z.; Liu, X.; Jiang, G. A Short-term Motion Prediction Approach for Guaranteed Collision-Free Planning. In Proceedings of the 2019 IEEE International Conference on Advanced Robotics and its Social Impacts (ARSO), Beijing, China, 31 October–2 November 2019; pp. 153–156.
31. TEASER—A Fast and Robust Point Cloud Registration Library. Available online: https://github.com/MIT-SPARK/TEASER-plusplus (accessed on 13 March 2023).
32. FANUC ADRIA d.o.o. Available online: https://www.fanuc.eu/si/en/who-we-are/sl-country-landing-page (accessed on 13 March 2023).
33. EN IEC 61496-3:2019; Safety of Machinery—Electro-Sensitive Protective Equipment Part 3: Particular Requirements for Active Opto-Electronic Protective Devices Responsive to Diffuse Reflection (AOPDDR). CEI: Milano, Italy, 2019.
34. FOKUS TECH d.o.o. Available online: https://fokus.si (accessed on 13 March 2023).
35. Dimec, M.; Kraljević, M.; Žalik, B.; Krejan, M.; Pečnik, S.; Podgorelec, D. Use of LiDAR and autonomous mobile robots in safety and inspection applications on railways. In Proceedings of the 7th International Conference on Road and Rail Infrastructure (CETRA 2022), Pula, Croatia, 11–13 May 2022; pp. 229–234.
36. Network Infrastructure for Ethernet/IP: Introduction and Considerations. Available online: https://www.odva.org/wp-content/uploads/2020/05/PUB00035R0_Infrastructure_Guide.pdf (accessed on 13 March 2023).
37. FANUC Simulation Software ROBOGUIDE. Available online: https://www.fanuc.eu/si/en/robots/accessories/roboguide (accessed on 13 March 2023).
38. Kaufman, A.; Cohen, D.; Yagel, R. Volume graphics. Computer 1993, 26, 51–64.
39. Bellekens, B.; Spruyt, V.; Berkvens, R.; Penne, R.; Weyn, M. A benchmark survey of rigid 3D point cloud registration algorithms. Int. J. Adv. Intell. Syst. 2015, 8, 118–127.
40. Huang, Y.P. Triangular irregular network generation and topographical modeling. Comput. Ind. 1989, 12, 203–213.
41. Méndez, D.; Manuel, G.F.; Murcía, H.F. Comparative study of point cloud registration techniques between ICP and others. In Proceedings of the Applications of Digital Image Processing XLIII, SPIE, Online, 24 August–4 September 2020; Volume 11510, pp. 292–305.
42. Huang, X.; Mei, G.; Zhang, J.; Abbas, R. A comprehensive survey on point cloud registration. arXiv 2021, arXiv:2103.02690.
43. Aldoma, A.; Marton, Z.C.; Tombari, F.; Wohlkinger, W.; Potthast, C.; Zeisl, B.; Rusu, R.B.; Gedikli, S.; Vincze, M. Tutorial: Point cloud library: Three-dimensional object recognition and 6 DOF pose estimation. IEEE Robot. Autom. Mag. 2012, 19, 80–91.
44. Holz, D.; Ichim, A.E.; Tombari, F.; Rusu, R.B.; Behnke, S. Registration with the point cloud library: A modular framework for aligning in 3-D. IEEE Robot. Autom. Mag. 2015, 22, 110–124.
45. Yang, H.; Shi, J.; Carlone, L. TEASER: Fast and certifiable point cloud registration. IEEE Trans. Robot. 2020, 37, 314–333.
46. Mongus, D.; Žalik, B. Parameter-free ground filtering of LiDAR data for automatic DTM generation. ISPRS J. Photogramm. Remote Sens. 2012, 67, 1–12.
47. Mongus, D.; Lukač, N.; Žalik, B. Ground and building extraction from LiDAR data based on differential morphological profiles and locally fitted surfaces. ISPRS J. Photogramm. Remote Sens. 2014, 93, 145–156.
48. Fu, K.S.; Gonzalez, R.C.; Lee, C.S.G. Robotics: Control, Sensing, Vision, and Intelligence; McGraw-Hill: New York, NY, USA, 1987.
49. ISO 13855:2010; Safety of Machinery—Positioning of Safeguards with Respect to the Approach Speeds of Parts of the Human Body. International Organization for Standardization: Geneva, Switzerland, 2010.
50. FANUC America Corporation SYSTEM R-30iA and R-30iB Controller KAREL Reference Manual. Available online: https://studylib.net/doc/25629757/karel-programming-guide (accessed on 13 March 2023).
51. ROB-MSD-3—Test 3D Safety Sensors in Speed and Separation Monitoring Cobot Applications, COVR Toolkit Protocol. 2021. Available online: https://covrfilestorage.blob.core.windows.net/documents/protocols/ROB-MSD-3-Test_3D_Safety_Sensors_in_Speed_and_Separation_Monitoring_Cobot_Applications.pdf (accessed on 14 March 2023).
Samples per Line | Scan Lines | MinX | MaxX | MinY | MaxY | Frame Rate |
---|---|---|---|---|---|---|
293 | 292 | −146 | 146 | −146 | 145 | 0.4 |
142 | 141 | −71 | 70 | −70 | 70 | 1.6 |
142 | 70 | −71 | 70 | −35 | 34 | 3.3 |
142 | 47 | −71 | 70 | −23 | 23 | 4.8 |
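Across the four scanner configurations in the table above, the product of samples per line, scan lines, and frame rate stays roughly constant (about 32,000–34,000 samples per second), so the configurations trade angular coverage and resolution for frame rate at a fixed sensor throughput. A quick check, assuming the Frame Rate column is in frames per second:

```python
# Per-second sample throughput of the four scanner configurations.
configs = [(293, 292, 0.4), (142, 141, 1.6), (142, 70, 3.3), (142, 47, 4.8)]
for samples, lines, fps in configs:
    print(f"{samples}x{lines} @ {fps} fps -> {samples * lines * fps:,.0f} samples/s")
# Prints roughly 34,222; 32,035; 32,802; 32,035 samples/s.
```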
Example | Scenario | Video | Video Frame | xMan [dot] | yMan [dot] | SDMan [cm] | SDTest [cm] | PSD [cm] | Passed/Failed |
---|---|---|---|---|---|---|---|---|---|
1 | 1 | S5 | 0:00:02.460 | 8.1 | 2.9 | 43.0 | 24.7 | 20.0 | Passed |
2 | 1 | S6 | 0:00:03.125 | 10.6 | 3.0 | 55.1 | 31.6 | 20.0 | Passed |
3 | 1 | S7 | 0:00:02.416 | 11.1 | 0.6 | 55.6 | 31.9 | 20.0 | Passed |
4 | 1 | S8 | 0:00:01.958 | 7.1 | 1.9 | 36.7 | 21.1 | 20.0 | Passed |
5 | 1 | S9 | 0:00:03.208 | 9.5 | 2.0 | 48.5 | 27.9 | 20.0 | Passed |
6 | 3 | S10 | 0:00:03.750 | 8.5 | 0.0 | 42.5 | 24.4 | 20.0 | Passed |
7 | 3 | S11 | 0:00:02.458 | 6.6 | 2.2 | 34.8 | 19.9 | 20.0 | Failed |
8 | 3 | S12 | 0:00:03.375 | 7.0 | 0.5 | 35.1 | 20.1 | 20.0 | Passed |
9 | 4 | S13 | 0:00:04.333 | 7.3 | 1.7 | 37.5 | 21.5 | 20.0 | Passed |
10 | 4 | S14 | 0:00:04.291 | 5.7 | 1.8 | 29.9 | 17.2 | 20.0 | Failed |
11 | 4 | S15 | 0:00:03.418 | 6.2 | 0.1 | 31.0 | 17.8 | 20.0 | Failed |
12 | 4 | S16 | 0:00:04.500 | 6.4 | 3.7 | 37.0 | 21.2 | 20.0 | Passed |
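In the validation table above, an example is marked Passed exactly when the measured separation distance SDTest is not below the calculated PSD. A trivial check of this criterion:

```python
def validation_passed(sd_test_cm: float, psd_cm: float) -> bool:
    # An example passes when the measured separation distance does not
    # fall below the calculated protective separation distance.
    return sd_test_cm >= psd_cm

assert validation_passed(20.1, 20.0)      # example 8: Passed
assert not validation_passed(19.9, 20.0)  # example 7: Failed
```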
vh [m/s] | vr [m/s] | PSD [m] (Proposed) | PSD [m] (2021 [12]) | PSD [m] (2012 [13]) |
---|---|---|---|---|
0.25 | 0.0 | 0.428 | 0.330 | 1.256 |
0.25 | 0.5 | 0.756 | 0.578 | 1.508 |
0.25 | 1.0 | 1.084 | 0.827 | 1.813 |
0.25 | 1.5 | 1.412 | 1.075 | 2.168 |
0.25 | 2.0 | 1.740 | 1.324 | 2.573 |
1.60 | 0.0 | 1.659 | 1.680 | 1.806 |
1.60 | 0.5 | 1.987 | 1.928 | 2.196 |
1.60 | 1.0 | 2.315 | 2.177 | 2.636 |
1.60 | 1.5 | 2.643 | 2.425 | 3.126 |
1.60 | 2.0 | 2.971 | 2.677 | 3.666 |
2.50 | 0.0 | 2.480 | 2.580 | 2.175 |
2.50 | 0.5 | 2.808 | 2.828 | 2.655 |
2.50 | 1.0 | 3.136 | 3.077 | 3.185 |
2.50 | 1.5 | 3.464 | 3.325 | 3.765 |
2.50 | 2.0 | 3.792 | 3.577 | 4.395 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).