Dynamic Camera Planning for Robot-Integrated Manufacturing Processes Using a UAV
Abstract
1. Introduction
- How can the optimal viewpoint for any robot application be determined and evaluated?
- What methods can efficiently and accurately verify collision-free paths between two points in a continuous simulation?
- What is the optimal system configuration, including minimum distances and collision zones, for ensuring safe cooperation between UAVs and industrial robots?
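The second and third questions concern verifying that a straight flight path between two points avoids all protected collision zones. As a minimal sketch of such a check (not the paper's implementation; the function names and the use of spherical zones are illustrative assumptions), one can test the segment against each zone:

```python
import numpy as np

def segment_hits_sphere(p0, p1, center, radius):
    """Check whether the straight segment p0 -> p1 intersects a spherical
    collision zone given by (center, radius). Points are 3D numpy arrays."""
    d = p1 - p0
    f = p0 - center
    # Parameter of the closest point on the infinite line, clamped to the segment.
    t = -np.dot(f, d) / max(np.dot(d, d), 1e-12)
    t = min(max(t, 0.0), 1.0)
    closest = p0 + t * d
    return np.linalg.norm(closest - center) <= radius

def path_is_collision_free(p0, p1, zones):
    """A candidate path is admissible only if it misses every zone."""
    return not any(segment_hits_sphere(p0, p1, c, r) for c, r in zones)
```

For example, a segment passing straight through a zone centered at the origin is rejected, while one offset beyond the zone radius is accepted. Real zones around a robot or no-fly areas would typically be unions of such primitives, checked the same way.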
2. Related Works
2.1. Camera Planning
2.2. Applications in Robot Vision Involving Dynamic Camera Planning
2.3. Visual Servoing a Dynamic Camera in Robot Tasks for Viewpoint Optimization
2.4. Using a UAV as Dynamic Camera Within Robot Tasks
2.5. Requirements for Visually Assisting Robot Applications with a UAV
3. Our Approach
3.1. Measurability of Perspective Coverage from Geometric Relationship
3.2. Eliminating the Effect of Distortion
3.3. Finding Occlusion-Free Optimal Viewpoints
Maximize: the perspective coverage of the ROI from the candidate viewpoint

Subject to:
- the view onto the ROI is unoccluded;
- the viewpoint lies outside all defined collision and no-fly zones;
- the ROI remains within half the FOV of the camera.
3.4. Starting Points for the Hill Climber
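The optimization above can be attacked with a hill climber launched from several starting points to reduce the risk of ending in a local optimum. The following is a generic sketch of that idea, not the paper's exact algorithm; the step schedule and the `score` callback (which would embed the coverage model and constraints) are assumptions:

```python
def hill_climb(score, start, step=0.1, iters=200):
    """Greedy local search over a 3D viewpoint position: perturb each axis
    and keep any move that improves the score; halve the step when stuck."""
    best = list(start)
    best_s = score(best)
    for _ in range(iters):
        improved = False
        for axis in range(3):
            for delta in (step, -step):
                cand = list(best)
                cand[axis] += delta
                s = score(cand)
                if s > best_s:
                    best, best_s, improved = cand, s, True
        if not improved:
            step *= 0.5          # refine once no coarse move helps
            if step < 1e-3:
                break
    return best, best_s

def best_of_starts(score, starts):
    """Run the climber from multiple starting points and keep the best result."""
    return max((hill_climb(score, s) for s in starts), key=lambda r: r[1])
```

Infeasible candidates (occluded, inside a collision zone, or outside the FOV constraint) can be handled by having `score` return a strongly negative value for them.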
3.5. Collision-Free Path Planning
3.6. Decision for the Optimal Viewpoint
3.7. Implementation of a State Machine
4. Experimental Results and Discussion
4.1. Results of the Palletizing Application
4.2. Results of the Welding Application
4.3. Limitations and Future Work
- If a viewpoint disappears shortly after the UAV has entered the changing viewpoint state, before the viewpoint has been reached, the theoretical perspective coverage predicted by the score is not achieved.
- The UAV can reach a point in the routine where suddenly no other viewpoint is available, leaving it stuck.
- The UAV does not recognize that a viewpoint which is worse at the current moment may provide higher perspective coverage in the future.
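Closely related to these limitations is the requirement that near-equal viewpoint costs must not cause oscillating decisions. A common mitigation is a switching hysteresis: only change viewpoints when the candidate beats the current one by a margin. This is an illustrative sketch, not the paper's decision strategy; the function name and the 15% margin are assumptions:

```python
def should_switch(current_score, candidate_score, margin=0.15):
    """Switch viewpoints only when the candidate exceeds the current score
    by a relative margin, suppressing flip-flopping between near-equal
    viewpoints. A non-positive current score means the viewpoint is lost."""
    if current_score <= 0:
        return True                      # current viewpoint unavailable: must move
    return candidate_score > current_score * (1.0 + margin)
```

With a 15% margin, a candidate scoring 1.10 against a current 1.00 is ignored, while one scoring 1.20 triggers a change; a lost viewpoint always forces a switch.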
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Xiao, X.; Dufek, J.; Murphy, R.R. Tethered Aerial Visual Assistance. arXiv 2020, arXiv:2001.06347.
- Gonzalez-Barbosa, J.-J.; Garcia-Ramirez, T.; Salas, J.; Hurtado-Ramos, J.-B.; Rico-Jimenez, J.-J. Optimal camera placement for total coverage. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 844–848.
- Christie, M.; Machap, R.; Normand, J.M.; Olivier, P.; Pickering, J. Virtual Camera Planning: A Survey. In Smart Graphics; Butz, A., Fisher, B., Krüger, A., Olivier, P., Eds.; Springer: Berlin/Heidelberg, Germany, 2005; pp. 40–52.
- Kaiser, D.; Quek, G.L.; Cichy, R.M.; Peelen, M.V. Object Vision in a Structured World. Trends Cogn. Sci. 2019, 23, 672–685.
- Rudoy, D.; Zelnik-Manor, L. Viewpoint Selection for Human Actions. Int. J. Comput. Vis. 2012, 97, 243–254.
- Bares, W.H.; Thainimit, S.; McDermott, S. A Model for Constraint-Based Camera Planning. In Smart Graphics, Proceedings of the 2000 AAAI Spring Symposium, Palo Alto, CA, USA, 20–22 March 2000; AAAI Press: Menlo Park, CA, USA, 2000; pp. 84–91.
- Halper, N.; Olivier, P. CamPlan: A Camera Planning Agent. In Smart Graphics, Proceedings of the 2000 AAAI Spring Symposium, Palo Alto, CA, USA, 20–22 March 2000; AAAI Press: Menlo Park, CA, USA, 2000; pp. 92–100.
- Magaña, A.; Dirr, J.; Bauer, P.; Reinhart, G. Viewpoint Generation Using Feature-Based Constrained Spaces for Robot Vision Systems. Robotics 2023, 12, 108.
- Jangir, R.; Hansen, N.; Ghosal, S.; Jain, M.; Wang, X. Look Closer: Bridging Egocentric and Third-Person Views with Transformers for Robotic Manipulation. IEEE Robot. Autom. Lett. 2022, 7, 3046–3053.
- Singh, A.; Kalaichelvi, V.; Karthikeyan, R. A survey on vision guided robotic systems with intelligent control strategies for autonomous tasks. Cogent Eng. 2022, 9, 2050020.
- Nicolis, D.; Palumbo, M.; Zanchettin, A.M.; Rocco, P. Occlusion-Free Visual Servoing for the Shared Autonomy Teleoperation of Dual-Arm Robots. IEEE Robot. Autom. Lett. 2018, 3, 796–803.
- Chao, F.; Zhu, Z.; Lin, C.-M.; Hu, H.; Yang, L.; Shang, C.; Zhou, C. Enhanced Robotic Hand–Eye Coordination Inspired From Human-Like Behavioral Patterns. IEEE Trans. Cogn. Dev. Syst. 2018, 10, 384–396.
- Triggs, B.; Laugier, C. Automatic camera placement for robot vision tasks. In Proceedings of the 1995 IEEE International Conference on Robotics and Automation, Nagoya, Japan, 21–27 May 1995; pp. 1732–1737.
- Baumgärtner, J.; Bertschinger, B.; Hoffmann, K.; Puchta, A.; Sawodny, O.; Reichelt, S.; Fleischer, J. Camera Placement Optimization for a Novel Modular Robot Tracking System. In Proceedings of the 2023 IEEE SENSORS, Vienna, Austria, 29 October–1 November 2023; pp. 1–4.
- Akinola, I.; Varley, J.; Kalashnikov, D. Learning Precise 3D Manipulation from Multiple Uncalibrated Cameras. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 4616–4622.
- Boshoff, M.; Kuhlenkötter, B.; Jakschik, M.; Sinnemann, J. Dynamische Kameraverfolgung von Regions of Interest in der Produktion mit Flugrobotern. Z. Wirtsch. Fabr. 2022, 117, 733–736.
- Boshoff, M.; Barros, G.; Kuhlenkötter, B. Performance measurement of unmanned aerial vehicles to suit industrial applications. Prod. Eng. 2024.
- Tarabanis, K.A.; Tsai, R.Y.; Allen, P.K. The MVP sensor planning system for robotic vision tasks. IEEE Trans. Robot. Automat. 1995, 11, 72–85.
- Abrams, S.; Allen, P.K.; Tarabanis, K.A. Dynamic sensor planning. In Proceedings of the IEEE International Conference on Robotics and Automation, Atlanta, GA, USA, 2–6 May 1993; pp. 605–610.
- Cowan, C.K.; Kovesi, P.D. Automatic sensor placement from vision task requirements. IEEE Trans. Pattern Anal. Mach. Intell. 1988, 10, 407–416.
- Tarabanis, K.; Tsai, R.Y. Computing viewpoints that satisfy optical constraints. In Proceedings of the 1991 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Maui, HI, USA, 3–6 June 1991; pp. 152–158.
- Tarabanis, K.; Tsai, R.Y.; Abrams, S. Planning viewpoints that simultaneously satisfy several feature detectability constraints for robotic vision. In Proceedings of the Fifth International Conference on Advanced Robotics ‘Robots in Unstructured Environments’, Pisa, Italy, 19–22 June 1991; Volume 2, pp. 1410–1415.
- Tarabanis, K.; Tsai, R.Y.; Allen, P.K. Automated sensor planning for robotic vision tasks. In Proceedings of the 1991 IEEE International Conference on Robotics and Automation, Sacramento, CA, USA, 9–11 April 1991; pp. 76–82.
- Abrams, S.; Allen, P.K.; Tarabanis, K. Computing Camera Viewpoints in an Active Robot Work Cell. Int. J. Robot. Res. 1999, 18, 267–285.
- Peuzin-Jubert, M.; Polette, A.; Nozais, D.; Mari, J.-L.; Pernot, J.-P. Survey on the View Planning Problem for Reverse Engineering and Automated Control Applications. Comput.-Aided Des. 2021, 141, 103094.
- Zeng, R.; Wen, Y.; Zhao, W.; Liu, Y.-J. View planning in robot active vision: A survey of systems, algorithms, and applications. Comput. Vis. Media 2020, 6, 225–245.
- Magaña, A.; Gebel, S.; Bauer, P.; Reinhart, G. Knowledge-Based Service-Oriented System for the Automated Programming of Robot-Based Inspection Systems. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020.
- Moniruzzaman, M.D.; Rassau, A.; Chai, D.; Islam, S.M.S. Teleoperation methods and enhancement techniques for mobile robots: A comprehensive survey. Robot. Auton. Syst. 2022, 150, 103973.
- Xiao, X.; Dufek, J.; Murphy, R.R. Autonomous Visual Assistance for Robot Operations Using a Tethered UAV. In Field and Service Robotics: Results of the 12th International Conference; Springer: Singapore, 2019; Volume 16, pp. 15–29.
- Sato, R.; Kamezaki, M.; Niuchi, S.; Sugano, S.; Iwata, H. Derivation of an Optimum and Allowable Range of Pan and Tilt Angles in External Sideway Views for Grasping and Placing Tasks in Unmanned Construction Based on Human Object Recognition. In Proceedings of the 2019 IEEE/SICE International Symposium on System Integration (SII), Paris, France, 14–16 January 2019; pp. 776–781.
- Gawel, A.; Lin, Y.; Koutros, T.; Siegwart, R.; Cadena, C. Aerial-Ground collaborative sensing: Third-Person view for teleoperation. In Proceedings of the 2018 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Philadelphia, PA, USA, 6–8 August 2018; pp. 1–7.
- Haldankar, T.; Kedia, S.; Panchmatia, R.; Parmar, D.; Sawant, D. Review of Implementation of Vision Systems in Robotic Welding. In Proceedings of the 2021 5th International Conference on Intelligent Computing and Control Systems (ICICCS), Madurai, India, 6–8 May 2021; pp. 692–700.
- Glorieux, E.; Franciosa, P.; Ceglarek, D. Coverage path planning with targetted viewpoint sampling for robotic free-form surface inspection. Robot. Comput.-Integr. Manuf. 2020, 61, 101843.
- Herakovic, N. Robot Vision in Industrial Assembly and Quality Control Processes. In Robot Vision; Ude, A., Ed.; IntechOpen: Rijeka, Croatia, 2010.
- Gemerek, J. Active Vision and Perception. Ph.D. Thesis, Cornell University, Ithaca, NY, USA, 2020.
- Rakita, D.; Mutlu, B.; Gleicher, M. Remote Telemanipulation with Adapting Viewpoints in Visually Complex Environments. In Proceedings of the Robotics: Science and Systems XV, Freiburg im Breisgau, Germany, 22–26 June 2019; pp. 1–10.
- Rakita, D.; Mutlu, B.; Gleicher, M. An Autonomous Dynamic Camera Method for Effective Remote Teleoperation. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; Kanda, T., Ŝabanović, S., Hoffman, G., Tapus, A., Eds.; ACM: New York, NY, USA, 2018; pp. 325–333.
- Jia, R.; Yang, L.; Cao, Y.; Or, C.K.; Wang, W.; Pan, J. Learning Autonomous Viewpoint Adjustment from Human Demonstrations for Telemanipulation. J. Hum.-Robot. Interact. 2024, 13, 32.
- Saakes, D.; Choudhary, V.; Sakamoto, D.; Inami, M.; Igarashi, T. A teleoperating interface for ground vehicles using autonomous flying cameras. In Proceedings of the 2013 23rd International Conference on Artificial Reality and Telexistence (ICAT), Tokyo, Japan, 11–13 December 2013; pp. 13–19.
- Claret, J.-A.; Zaplana, I.; Basanez, L. Teleoperating a mobile manipulator and a free-flying camera from a single haptic device. In Proceedings of the 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Lausanne, Switzerland, 23–27 October 2016; pp. 291–296.
- Claret, J.-A.; Basañez, L. Using an UAV to guide the teleoperation of a mobile manipulator. In Proceedings of the XXXVIII Jornadas de Automática, Gijón, Spain, 6–8 September 2017; Universidade da Coruña, Servizo de Publicacións: Oviedo, Spain, 2017; pp. 694–700.
- Claret, J.-A.; Basañez, L. Teleoperating a mobile manipulator using a UAV camera without robot self-occlusions. In Proceedings of the XL Jornadas de Automática: Libro de Actas, Ferrol, Spain, 4–6 September 2019; pp. 694–701.
- Xiao, X.; Dufek, J.; Murphy, R. Visual servoing for teleoperation using a tethered UAV. In Proceedings of the 15th IEEE International Symposium on Safety, Security and Rescue Robotics, Shanghai, China, 11–13 October 2017; IEEE: Piscataway, NJ, USA, 2017.
- Dufek, J.; Xiao, X.; Murphy, R.R. Best Viewpoints for External Robots or Sensors Assisting Other Robots. IEEE Trans. Hum.-Mach. Syst. 2021, 51, 324–334.
- Senft, E.; Hagenow, M.; Praveena, P.; Radwin, R.; Zinn, M.; Gleicher, M.; Mutlu, B. A Method for Automated Drone Viewpoints to Support Remote Robot Manipulation. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; pp. 7704–7711.
- Hartley, R.; Zisserman, A. Camera Models. In Multiple View Geometry in Computer Vision; Cambridge University Press: Cambridge, UK, 2004; pp. 153–177.
- Cormen, T.H. Introduction to Algorithms, 3rd ed.; MIT Press: Cambridge, MA, USA, 2009.
- Liu, J.; Sridharan, S.; Fookes, C. Recent Advances in Camera Planning for Large Area Surveillance. ACM Comput. Surv. 2016, 49, 6.
- Sumi Suresh, M.S.; Menon, V.; Setlur, S.; Govindaraju, V. Maximizing Coverage over a Surveillance Region Using a Specific Number of Cameras. In Pattern Recognition; Antonacopoulos, A., Chaudhuri, S., Chellappa, R., Liu, C.-L., Bhattacharya, S., Pal, U., Eds.; Springer Nature: Cham, Switzerland, 2025; pp. 303–320.
- Melo, A.G.; Pinto, M.F.; Marcato, A.L.M.; Honório, L.M.; Coelho, F.O. Dynamic Optimization and Heuristics Based Online Coverage Path Planning in 3D Environment for UAVs. Sensors 2021, 21, 1108.
- Chattaoui, S.; Jarray, R.; Bouallègue, S. Comparison of A* and D* Algorithms for 3D Path Planning of Unmanned Aerial Vehicles. In Proceedings of the 2023 IEEE International Conference on Artificial Intelligence & Green Energy (ICAIGE), Sousse, Tunisia, 12–14 October 2023; pp. 1–6.
- Wang, P.; Mutahira, H.; Kim, J.; Muhammad, M.S. ABA*–Adaptive Bidirectional A* Algorithm for Aerial Robot Path Planning. IEEE Access 2023, 11, 103521–103529.
- Liang, X.; Meng, G.; Xu, Y.; Luo, H. A geometrical path planning method for unmanned aerial vehicle in 2D/3D complex environment. Intell. Serv. Robot. 2018, 11, 301–312.
- Yao, Z.; Wang, W. An efficient tangent based topologically distinctive path finding for grid maps. arXiv 2023, arXiv:2311.00853.
- O’Neill, B. Elementary Differential Geometry, 2nd ed.; Revised; Elsevier Academic Press: Amsterdam, The Netherlands, 2006.
- Park, S.; Deyst, J.; How, J. A New Nonlinear Guidance Logic for Trajectory Tracking. In Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit, Providence, RI, USA, 16–19 August 2004.
Requirement | Explanation |
---|---|
Determine the optimal viewpoint from a defined ROI | The user should be able to define the position, orientation, and size of the ROI dynamically during a running task. The optimal viewpoint is derived from the geometric relationship between the ROI and the camera perspective so as to maximize process insight over the runtime. |
Use UAV as dynamic camera within a control framework | Using a UAV as dynamic camera offers the highest flexibility in continuously achieving the optimal viewpoint, independent of the environment or other robot systems, which may be constrained by their reach and flexibility. A technical framework is needed to enable control while considering the specific characteristics of the system. |
Unoccluded | The optimal viewpoint must grant a clear, unoccluded view onto the ROI. |
Protected collision zones for path planning | Collision zones must be defined between the robot and the UAV, as well as between the UAV and its environment. No-fly zones should be defined in the same way. These collision zones must also be considered when determining the available flight paths. |
Comparison of the costs of viewpoints | The costs of getting from one viewpoint to another should be considered. The costs are defined as the length of the route and the lack of perspective coverage on the route. The path with the lowest costs is considered optimal. |
Consideration of fluctuations in the selection of the optimal viewpoint | It can be assumed that there are system states in which the costs of flying a path are similar, but sudden changes in path decisions can occur. This would result in an unstable flight behavior of the UAV, as it responds to spontaneous changes in movement direction. Therefore, a decision-making method or strategy should be implemented to ensure consistent decision behavior. |
Measurability of method | The entire method of visually covering an ROI should be measurable. For this purpose, perspective coverage of the ROI over time should be expressed in a numerical value and the resulting method should be assessable. |
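The measurability requirement asks for perspective coverage as a single numerical value. One simple proxy, consistent with the geometric view in Section 3.1 but not necessarily the paper's exact definition, is the area of the ROI projected through a pinhole camera model as a percentage of the image area; the intrinsics below (focal length in pixels, image size) are illustrative assumptions:

```python
def perspective_coverage(roi_corners_cam, f=800.0, w=1280, h=720):
    """Percentage of the image area covered by the ROI polygon projected
    through a pinhole camera. `roi_corners_cam` are ROI corners in camera
    coordinates, ordered along the polygon boundary, with z > 0 in front."""
    pts = []
    for X, Y, Z in roi_corners_cam:
        if Z <= 0:
            return 0.0                  # ROI behind the camera: no coverage
        pts.append((f * X / Z + w / 2, f * Y / Z + h / 2))
    # Shoelace formula for the area of the projected polygon.
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:] + pts[:1]):
        area += x0 * y1 - x1 * y0
    return abs(area) / 2.0 / (w * h) * 100.0
```

For instance, a 1 m square ROI viewed frontally from 10 m projects to an 80 x 80 px square under these intrinsics, i.e., roughly 0.69% coverage, in the same order of magnitude as the coverage values reported in Section 4.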
State | Explanation |
---|---|
Following State | In the following state, the UAV follows an optimal viewpoint until it is no longer available or the threshold condition is reached. If the optimal viewpoint changes, the UAV switches to the changing viewpoint state. Once it arrives at the next viewpoint, it returns to the following state. |
Changing Viewpoint State | This state becomes active when a next viewpoint has been identified, either because it has a better score or because the last followed viewpoint is no longer available. The UAV navigates to the next viewpoint and then returns to the following state. |
Hovering State | If a viewpoint is suddenly no longer available in the following state, the UAV switches to the hovering state. In this state, it remains hovering in the air for a maximum of two seconds and waits for a viewpoint to become available. If no viewpoint becomes available within this time, the UAV switches to the safe position state. |
Safe Position State | The safe position is a point manually defined by the user and located so that it can be reached safely, e.g., above the robot with a sufficient safety distance. Flying to the safe position has to be completed before changing to another viewpoint, in order to avoid abrupt changes in direction. The UAV waits at the safe position until a new viewpoint becomes available. |
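The four states and their transitions can be sketched as a small supervisor loop. This follows the table above; the boolean input names and the single-tick structure are illustrative, not the paper's implementation:

```python
from enum import Enum, auto

class State(Enum):
    FOLLOWING = auto()
    CHANGING = auto()
    HOVERING = auto()
    SAFE_POSITION = auto()

def next_state(state, viewpoint_available, better_viewpoint_found,
               arrived, hover_timeout):
    """One tick of the state machine. `hover_timeout` becomes True after
    more than two seconds of hovering without any available viewpoint."""
    if state is State.FOLLOWING:
        if not viewpoint_available:
            # Lost viewpoint: change if a replacement exists, else hover.
            return State.CHANGING if better_viewpoint_found else State.HOVERING
        if better_viewpoint_found:
            return State.CHANGING
    elif state is State.CHANGING:
        if arrived:
            return State.FOLLOWING
    elif state is State.HOVERING:
        if viewpoint_available:
            return State.CHANGING
        if hover_timeout:
            return State.SAFE_POSITION
    elif state is State.SAFE_POSITION:
        # The flight to the safe position must complete before changing.
        if arrived and viewpoint_available:
            return State.CHANGING
    return state
```

Running this once per control cycle with freshly evaluated inputs reproduces the behavior described in the table, e.g., following until the viewpoint is lost, hovering for up to two seconds, then retreating to the safe position.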
Final Results | Value |
---|---|
Theoretical maximum coverage in the scenario | 2.513% |
Average perspective coverage of the UAV | 2.166% |
Highest average coverage of a static camera | 0.850% |
Portion of following time | 100% |
Portion of changing time | 0% |
Portion of safe position state time | 0% |
Portion of time hovering | 0% |
Number of viewpoint changes | 0 |
Number of safe position flights | 0 |
Results | Value |
---|---|
Theoretically highest possible coverage in the scene | 2.513% |
Average perspective coverage of the UAV | 1.667% |
Highest average coverage of a static camera | 0.298% |
Portion of following time | 74.136% |
Portion of changing time | 14.987% |
Portion of safe position state time | 8.199% |
Portion of time hovering | 2.678% |
Number of viewpoint changes | 5 |
Number of safe position flights | 3 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Boshoff, M.; Kuhlenkötter, B.; Koslowski, P. Dynamic Camera Planning for Robot-Integrated Manufacturing Processes Using a UAV. Robotics 2025, 14, 23. https://doi.org/10.3390/robotics14030023