An Autonomous Tracking and Landing Method for Unmanned Aerial Vehicles Based on Visual Navigation
Abstract
1. Introduction
2. Materials and Methods
3. Results
3.1. Tracking Experiments
3.1.1. Linear Reciprocating Trajectory
3.1.2. Circular Trajectory
3.1.3. Straight-Sided Ellipse Trajectory
3.2. Landing Experiments
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Attitude Parameters | Performance Indicators | | | |
|---|---|---|---|---|
| | (s) | (s) | (s) | (%) |
| - | 0.40 | 0.60 | - | - |
| - | 0.45 | 0.60 | - | - |
| - | 0.27 | 0.40 | - | - |
| Parameter Types | SY011HD | T265 |
|---|---|---|
| Active pixels/frames per second | - | - |
| Angle of vision | - | - |
| Interface type | USB 2.0 | USB 3.1 |
| Size (mm) | Reference Value (mm) | Measured Value 1 (mm) | Measured Value 2 (mm) | Measured Value 3 (mm) | Average Value (mm) |
|---|---|---|---|---|---|
| 20 × 20 | 300 | 306 | 310 | 312 | 309.3 |
| 20 × 20 | 350 | - | - | - | - |
| 20 × 20 | 400 | - | - | - | - |
| 20 × 20 | 500 | - | - | - | - |
| 20 × 20 | 600 | - | - | - | - |
| 30 × 30 | 300 | 313 | 312 | 311 | 312 |
| 30 × 30 | 350 | 369 | 366 | 368 | 367.7 |
| 30 × 30 | 400 | 426 | 419 | 421 | 422 |
| 30 × 30 | 500 | - | - | - | - |
| 30 × 30 | 600 | - | - | - | - |
| 40 × 40 | 300 | 313 | 311 | 313 | 312.3 |
| 40 × 40 | 350 | 369 | 366 | 371 | 368.7 |
| 40 × 40 | 400 | 426 | 419 | 421 | 423 |
| 40 × 40 | 500 | 524 | 527 | 520 | 523.7 |
| 40 × 40 | 600 | - | - | - | - |
| Size (mm) | Reference Value (mm) | Measured Value 1 (mm) | Measured Value 2 (mm) | Measured Value 3 (mm) | Average Value (mm) |
|---|---|---|---|---|---|
| 100 × 100 | 300 | 305 | 306 | 306 | 306 |
| 100 × 100 | 350 | 361 | 360 | 361 | 361 |
| 100 × 100 | 400 | 411 | 412 | 413 | 412 |
| 100 × 100 | 500 | 517 | 518 | 519 | 518 |
| 100 × 100 | 600 | 620 | 623 | 621 | 621 |
| 150 × 150 | 300 | 306 | 306 | 306 | 306 |
| 150 × 150 | 350 | 359 | 358 | 358 | 358 |
| 150 × 150 | 400 | 410 | 411 | 410 | 410 |
| 150 × 150 | 500 | 516 | 514 | 515 | 515 |
| 150 × 150 | 600 | 621 | 619 | 620 | 620 |
| 200 × 200 | 300 | 307 | 307 | 307 | 307 |
| 200 × 200 | 350 | 359 | 360 | 358 | 359 |
| 200 × 200 | 400 | 410 | 411 | 410 | 410 |
| 200 × 200 | 500 | 515 | 514 | 515 | 515 |
| 200 × 200 | 600 | 619 | 618 | 618 | 618 |
| Velocity | Average Value Error (m/s) | | Mean Square Error (m/s) | |
|---|---|---|---|---|
| v1 | - | - | - | - |
| v2 | - | - | - | - |
| Velocity | Mean Square Error (m/s) | |
|---|---|---|
| v1 | - | - |
| v2 | - | - |
| Velocity | Average Value Error (m/s) | | Mean Square Error (m/s) | |
|---|---|---|---|---|
| v1 | - | - | - | - |
| v2 | - | - | - | - |
| Velocity | Mean Square Error (m/s) | |
|---|---|---|
| v1 | - | - |
| v2 | - | - |
Experimental Sequence | x-Axis Landing Accuracy (m) | y-Axis Landing Accuracy (m) |
---|---|---|
Test 1 | 0.05 | 0.04 |
Test 2 | 0.01 | 0.04 |
Test 3 | 0.04 | 0.05 |
Average value | 0.03 | 0.04 |
Velocity | x-Axis Landing Accuracy (m) | y-Axis Landing Accuracy (m) |
---|---|---|
v1 | 0.14 | 0.04 |
v2 | 0.23 | 0.01 |
v3 | 0.32 | 0.01 |
Average value | 0.23 | 0.02 |
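The "Average value" rows in the two landing-accuracy tables are plain per-axis means of the individual runs. As a quick check, using the moving-platform values copied from the table directly above:

```python
def axis_mean(errors):
    """Mean landing error along one axis, rounded to centimetres."""
    return round(sum(errors) / len(errors), 2)

x_err = [0.14, 0.23, 0.32]  # x-axis landing accuracy per run (m), from the table
y_err = [0.04, 0.01, 0.01]  # y-axis landing accuracy per run (m), from the table
print(axis_mean(x_err), axis_mean(y_err))  # 0.23 0.02
```

Both results match the table's "Average value" row, so the reported averages are simple arithmetic means rather than, say, root-mean-square errors.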
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, B.; Ma, R.; Zhu, H.; Sha, Y.; Yang, T. An Autonomous Tracking and Landing Method for Unmanned Aerial Vehicles Based on Visual Navigation. Drones 2023, 7, 703. https://doi.org/10.3390/drones7120703