Distance Measurement of Unmanned Aerial Vehicles Using Vision-Based Systems in Unknown Environments
Abstract
1. Introduction
2. Materials and Methods
2.1. Materials
2.2. The Proposed Methods
2.3. UAV Attitude Estimation
2.4. Camera Position Correction
2.5. Semantic Segmentation
2.6. Distance Measurement
2.7. Sensor Specifications
3. Results
3.1. The User Interface
3.2. Frame Size Comparison
3.3. Segmentation
3.4. Distance Measurement
4. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Sensor specifications:

| Component | Output interface | Supply voltage (V) | Current draw (mA) | Range |
|---|---|---|---|---|
| HC-SR04 | I2C | 5 | 15 | 200–400 cm |
| XBee Pro S2B | UART | 2.7–3.6 | 295 | 1600 m |
| Wireless camera | Audio and video | 9–12 | 250 | 100 m |
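To make the ultrasonic entry in the table concrete, the sketch below shows how an echo round-trip time from an HC-SR04-class ranger maps to a one-way distance. This is a minimal illustration under the usual speed-of-sound assumption, not the implementation used in the paper; the function names are hypothetical, and the 200–400 cm window is the operating range listed in the table above.

```python
# Minimal sketch (not the authors' implementation): convert an ultrasonic
# echo round-trip time into a one-way distance, as an HC-SR04-class ranger does.
# Assumptions: ~343 m/s speed of sound in air at roughly 20 degrees C, and the
# 200-400 cm usable window taken from the specification table above.

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 degrees C


def echo_time_to_distance_cm(round_trip_s: float) -> float:
    """Convert an echo round-trip time (seconds) to a one-way distance in cm."""
    one_way_m = SPEED_OF_SOUND_M_S * round_trip_s / 2.0  # sound travels out and back
    return one_way_m * 100.0


def in_usable_range(distance_cm: float, lo_cm: float = 200.0, hi_cm: float = 400.0) -> bool:
    """Check a reading against the operating window listed in the table above."""
    return lo_cm <= distance_cm <= hi_cm


if __name__ == "__main__":
    # Example: a 17.5 ms round trip corresponds to roughly 300 cm.
    t = 0.0175
    d = echo_time_to_distance_cm(t)
    print(f"round trip {t * 1000:.1f} ms -> {d:.1f} cm, usable: {in_usable_range(d)}")
```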
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).