Visual SLAM for Indoor Livestock and Farming Using a Small Drone with a Monocular Camera: A Feasibility Study
Abstract
1. Introduction
2. Material and Methods
2.1. Visual SLAM Algorithms
2.2. UAV and Software Architecture
2.3. Experimental Setup
2.4. Evaluation Metrics
3. Results
3.1. Comparing Absolute and Relative Position Errors
3.2. Waypoints Navigation
3.3. Semantic Mapping with Octomaps
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
Abbreviations
UAV | Unmanned Aerial Vehicle
SVM | Support Vector Machine
UGV | Unmanned Ground Vehicle
ML | Machine Learning
DL | Deep Learning
CV | Computer Vision
VSLAM | Visual Simultaneous Localization and Mapping
SDK | Software Development Kit
ROS | Robot Operating System
RMSE | Root Mean Squared Error
References
- McLeod, A. World Livestock 2011-Livestock in Food Security; Food and Agriculture Organization of the United Nations (FAO): Paris, France, 2011. [Google Scholar]
- Ahirwar, S.; Swarnkar, R.; Bhukya, S.; Namwade, G. Application of Drone in Agriculture. Int. J. Curr. Microbiol. Appl. Sci. 2019, 8, 2500–2505. [Google Scholar] [CrossRef]
- Gongal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Sensors and systems for fruit detection and localization: A review. Comput. Electron. Agric. 2015, 116, 8–19. [Google Scholar] [CrossRef]
- McBratney, A.; Whelan, B.; Ancev, T.; Bouma, J. Future directions of precision agriculture. Precis. Agric. 2005, 6, 7–23. [Google Scholar] [CrossRef]
- Yang, I.C.; Chen, S. Precision cultivation system for greenhouse production. In Intelligent Environmental Sensing; Springer: Berlin/Heidelberg, Germany, 2015; pp. 191–211. [Google Scholar]
- Borges, D.L.; Guedes, S.T.C.d.M.; Nascimento, A.R.; Melo-Pinto, P. Detecting and grading severity of bacterial spot caused by Xanthomonas spp. in tomato (Solanum lycopersicon) fields using visible spectrum images. Comput. Electron. Agric. 2016, 125, 149–159. [Google Scholar] [CrossRef]
- Liu, X.; Zhao, D.; Jia, W.; Ji, W.; Ruan, C.; Sun, Y. Cucumber fruits detection in greenhouses based on instance segmentation. IEEE Access 2019, 7, 139635–139642. [Google Scholar] [CrossRef]
- Odintsov Vaintrub, M.; Levit, H.; Chincarini, M.; Fusaro, I.; Giammarco, M.; Vignola, G. Review: Precision livestock farming, automats and new technologies: Possible applications in extensive dairy sheep farming. Animal 2020, 15, 100143. [Google Scholar] [CrossRef]
- Cadéro, A.; Aubry, A.; Dourmad, J.Y.; Salaün, Y.; Garcia-Launay, F. Towards a decision support tool with an individual-based model of a pig fattening unit. Comput. Electron. Agric. 2018, 147, 44–50. [Google Scholar] [CrossRef]
- Jorquera-Chavez, M.; Fuentes, S.; Dunshea, F.R.; Jongman, E.C.; Warner, R.D. Computer vision and remote sensing to assess physiological responses of cattle to pre-slaughter stress, and its impact on beef quality: A review. Meat Sci. 2019, 156, 11–22. [Google Scholar] [CrossRef]
- Norton, T.; Chen, C.; Larsen, M.L.V.; Berckmans, D. Precision livestock farming: Building ‘digital representations’ to bring the animals closer to the farmer. Animal 2019, 13, 3009–3017. [Google Scholar] [CrossRef] [Green Version]
- Tasdemir, S.; Urkmez, A.; Inal, S. Determination of body measurements on the Holstein cows using digital image analysis and estimation of live weight with regression analysis. Comput. Electron. Agric. 2011, 76, 189–197. [Google Scholar] [CrossRef]
- Chou, W.C.; Tsai, W.R.; Chang, H.H.; Lu, S.Y.; Lin, K.F.; Lin, P. Prioritization of pesticides in crops with a semi-quantitative risk ranking method for Taiwan postmarket monitoring program. J. Food Drug Anal. 2019, 27, 347–354. [Google Scholar] [CrossRef] [Green Version]
- Schor, N.; Bechar, A.; Ignat, T.; Dombrovsky, A.; Elad, Y.; Berman, S. Robotic disease detection in greenhouses: Combined detection of powdery mildew and tomato spotted wilt virus. IEEE Robot. Autom. Lett. 2016, 1, 354–360. [Google Scholar] [CrossRef]
- Schor, N.; Berman, S.; Dombrovsky, A.; Elad, Y.; Ignat, T.; Bechar, A. Development of a robotic detection system for greenhouse pepper plant diseases. Precis. Agric. 2017, 18, 394–409. [Google Scholar] [CrossRef]
- Vakilian, K.A.; Massah, J. Design, development and performance evaluation of a robot to early detection of nitrogen deficiency in greenhouse cucumber (Cucumis sativus) with machine vision. Int. J. Agric. Res. Rev. 2012, 2, 448–454. [Google Scholar]
- Ju, C.; Son, H.I. Multiple UAV systems for agricultural applications: Control, implementation, and evaluation. Electronics 2018, 7, 162. [Google Scholar] [CrossRef] [Green Version]
- Gonzalez-de Soto, M.; Emmi, L.; Perez-Ruiz, M.; Aguera, J.; Gonzalez-de Santos, P. Autonomous systems for precise spraying—Evaluation of a robotised patch sprayer. Biosyst. Eng. 2016, 146, 165–182. [Google Scholar] [CrossRef]
- Montalvo, M.; Guerrero, J.M.; Romeo, J.; Emmi, L.; Guijarro, M.; Pajares, G. Automatic expert system for weeds/crops identification in images from maize fields. Expert Syst. Appl. 2013, 40, 75–82. [Google Scholar] [CrossRef]
- Roldán, J.J.; Joossen, G.; Sanz, D.; Del Cerro, J.; Barrientos, A. Mini-UAV Based Sensory System for Measuring Environmental Variables in Greenhouses. Sensors 2015, 15, 3334–3350. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Guo, Y.; Guo, J.; Liu, C.; Xiong, H.; Chai, L.; He, D. Precision Landing Test and Simulation of the Agricultural UAV on Apron. Sensors 2020, 20, 3369. [Google Scholar] [CrossRef]
- Roldán, J.J.; Garcia-Aunon, P.; Garzón, M.; De León, J.; Del Cerro, J.; Barrientos, A. Heterogeneous Multi-Robot System for Mapping Environmental Variables of Greenhouses. Sensors 2016, 16, 1018. [Google Scholar] [CrossRef] [Green Version]
- Le, T.; Omholt Gjevestad, J.G.; From, P.J. Online 3D Mapping and Localization System for Agricultural Robots. IFAC-PapersOnLine 2019, 52, 167–172. [Google Scholar] [CrossRef]
- Huang, Z.; Fukuda, H.; Wai Jacky, T.L.; Zhao, X.; Habaragamuwa, H.; Shiigi, T.; Suzuki, T.; Naoshi, K. Greenhouse Based Orientation Measurement System using Spread Spectrum Sound. IFAC-PapersOnLine 2018, 51, 108–111. [Google Scholar] [CrossRef]
- Mehta, S.S.; Burks, T.F.; Dixon, W.E. Vision-based localization of a wheeled mobile robot for greenhouse applications: A daisy-chaining approach. Comput. Electron. Agric. 2008, 63, 28–37. [Google Scholar] [CrossRef]
- Famili, A.; Park, J.M. ROLATIN: Robust Localization and Tracking for Indoor Navigation of Drones. In Proceedings of the 2020 IEEE Wireless Communications and Networking Conference (WCNC), Seoul, Korea, 25–28 May 2020. [Google Scholar]
- Kempke, B.; Pannuto, P.; Dutta, P. PolyPoint: Guiding Indoor Quadrotors with Ultra-Wideband Localization. 2015. Available online: https://patpannuto.com/pubs/kempke15polypoint.pdf (accessed on 1 February 2021).
- Ajay Kumar, G.; Patil, A.K.; Patil, R.; Park, S.S.; Chai, Y.H. A LiDAR and IMU Integrated Indoor Navigation System for UAVs and Its Application in Real-Time Pipeline Classification. Sensors 2017, 17, 1268. [Google Scholar] [CrossRef] [Green Version]
- Khosrobeygi, Z.; Rafiee, S.; Mohtasebi, S.S.; Nasiri, A. Simultaneous Localization and Mapping in Greenhouse with Stereo Vision. J. Agric. Mach. 2020, 10, 141–153. [Google Scholar] [CrossRef]
- Floreano, D.; Wood, R.J. Science, technology and the future of small autonomous drones. Nature 2015, 521, 460–466. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Durrant-Whyte, H.; Bailey, T. Simultaneous localization and mapping: Part I. IEEE Robot. Autom. Mag. 2006, 13, 99–110. [Google Scholar] [CrossRef] [Green Version]
- Taketomi, T.; Uchiyama, H.; Ikeda, S. Visual SLAM algorithms: A survey from 2010 to 2016. IPSJ Trans. Comput. Vis. Appl. 2017, 9, 16. [Google Scholar] [CrossRef]
- Li, Y.; Scanavino, M.; Capello, E.; Dabbene, F.; Guglieri, G.; Vilardi, A. A novel distributed architecture for UAV indoor navigation. In Transportation Research Procedia; Elsevier: Amsterdam, The Netherlands, 2018; Volume 35, pp. 13–22. [Google Scholar]
- Shu, F.; Lesur, P.; Xie, Y.; Pagani, A.; Stricker, D. SLAM in the Field: An Evaluation of Monocular Mapping and Localization on Challenging Dynamic Agricultural Environment. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Virtual, 5–9 January 2021; pp. 1761–1771. [Google Scholar] [CrossRef]
- Jiang, G.; Yin, L.; Jin, S.; Tian, C.; Ma, X.; Ou, Y. A simultaneous localization and mapping (SLAM) framework for 2.5D map building based on low-cost LiDAR and vision fusion. Appl. Sci. 2019, 9, 2105. [Google Scholar] [CrossRef] [Green Version]
- Huletski, A.; Kartashov, D.; Krinkin, K. Evaluation of the modern visual SLAM methods. In Proceedings of the Artificial Intelligence and Natural Language and Information Extraction, Social Media and Web Search FRUCT Conference, AINL-ISMW FRUCT 2015, St. Petersburg, Russia, 9–14 November 2015; pp. 19–25. [Google Scholar] [CrossRef]
- López, E.; García, S.; Barea, R.; Bergasa, L.M.; Molinos, E.J.; Arroyo, R.; Romera, E.; Pardo, S. A Multi-Sensorial Simultaneous Localization and Mapping (SLAM) System for Low-Cost Micro Aerial Vehicles in GPS-Denied Environments. Sensors 2017, 17, 802. [Google Scholar] [CrossRef] [PubMed]
- Mingachev, E.; Lavrenov, R.; Tsoy, T.; Matsuno, F.; Svinin, M.; Suthakorn, J.; Magid, E. Comparison of ROS-Based Monocular Visual SLAM Methods: DSO, LDSO, ORB-SLAM2 and DynaSLAM. Interactive Collaborative Robotics; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2020; Volume 12336, pp. 222–233. [Google Scholar]
- Engel, J.; Sturm, J.; Cremers, D. Scale-aware navigation of a low-cost quadrocopter with a monocular camera. In Robotics and Autonomous Systems; Elsevier: Amsterdam, The Netherlands, 2014; Volume 62, pp. 1646–1656. [Google Scholar] [CrossRef] [Green Version]
- Mur-Artal, R.; Tardós, J.D. ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Trans. Robot. 2017, 33, 1255–1262. [Google Scholar] [CrossRef] [Green Version]
- Klein, G.; Murray, D. Parallel Tracking and Mapping for Small AR Workspaces. In Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, 13–16 November 2007; pp. 225–234. [Google Scholar]
- Rublee, E.; Rabaud, V.; Konolige, K.; Bradski, G. ORB: An efficient alternative to SIFT or SURF. In Proceedings of the IEEE International Conference on Computer Vision, Barcelona, Spain, 6–13 November 2011; pp. 2564–2571. [Google Scholar]
- Campos, C.; Elvira, R.; Gómez Rodríguez, J.J.; Montiel, J.M.M.; Tardós, J.D. ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial and Multi-Map SLAM. arXiv 2020, arXiv:2007.11898. [Google Scholar]
- Grupp, M. evo: Python Package for the Evaluation of Odometry and SLAM. 2017. Available online: https://github.com/MichaelGrupp/evo (accessed on 1 February 2021).
- Umeyama, S. Least-squares estimation of transformation parameters between two point patterns. IEEE Trans. Pattern Anal. Mach. Intell. 1991, 13, 376–380. [Google Scholar] [CrossRef] [Green Version]
- Sturm, J.; Engelhard, N.; Endres, F.; Burgard, W.; Cremers, D. A benchmark for the evaluation of RGB-D SLAM systems. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vilamoura, Portugal, 7–12 October 2012; pp. 573–580. [Google Scholar] [CrossRef] [Green Version]
- Sun, L.; Yan, Z.; Zaganidis, A.; Zhao, C.; Duckett, T. Recurrent-OctoMap: Learning State-Based Map Refinement for Long-Term Semantic Mapping with 3-D-Lidar Data. IEEE Robot. Autom. Lett. 2018, 3, 3749–3756. [Google Scholar] [CrossRef] [Green Version]
- Papadopoulos, A.P.; Ormrod, D.P. Plant spacing effects on growth and development of the greenhouse tomato. Can. J. Plant Sci. 1991, 71, 297–304. [Google Scholar] [CrossRef]
- Webb, A.M.; Brown, G.; Luján, M. ORB-SLAM-CNN: Lessons in Adding Semantic Map Construction to Feature-Based SLAM. In Towards Autonomous Robotic Systems; Springer International Publishing: Cham, Switzerland, 2019; pp. 221–235. [Google Scholar]
- Filipenko, M.; Afanasyev, I. Comparison of Various SLAM Systems for Mobile Robot in an Indoor Environment. In Proceedings of the 9th International Conference on Intelligent Systems 2018: Theory, Research and Innovation in Applications, IS 2018—Proceedings, Funchal, Portugal, 25–27 September 2018; pp. 400–407. [Google Scholar] [CrossRef]
- Gaoussou, H.; Dewei, P. Evaluation of the visual odometry methods for semi-dense real-time. Adv. Comput. Int. J. (ACIJ) 2018, 9. [Google Scholar] [CrossRef] [Green Version]
- Qin, T.; Li, P.; Shen, S. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator. IEEE Trans. Robot. 2018, 34, 1004–1020. [Google Scholar] [CrossRef] [Green Version]
- Mahdoui, N.; Frémont, V.; Natalizio, E. Communicating Multi-UAV System for Cooperative SLAM-based Exploration. J. Intell. Robot. Syst. 2019, 98, 325–343. [Google Scholar] [CrossRef] [Green Version]
- Islam, N.; Rashid, M.M.; Pasandideh, F.; Ray, B.; Moore, S.; Kadel, R. A Review of Applications and Communication Technologies for Internet of Things (IoT) and Unmanned Aerial Vehicle (UAV) Based Sustainable Smart Farming. Sustainability 2021, 13, 1821. [Google Scholar] [CrossRef]
- Krishnamoorthy, V. The Drone of Drones: A Preliminary Investigation of Drone Noise and Animal Welfare in New Zealand Sheep. Ph.D. Thesis, University of Auckland, Auckland, New Zealand, 2019. [Google Scholar]
- Fossel, J.; Hennes, D.; Claes, D.; Alers, S.; Tuyls, K. OctoSLAM: A 3D Mapping Approach to Situational Awareness of Unmanned Aerial Vehicles. In Proceedings of the 2013 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 28–31 May 2013; pp. 179–188. [Google Scholar]
- Steenbeek, A. CNN Based Dense Monocular Visual SLAM for Indoor Mapping and Autonomous Exploration. 2020. Available online: http://essay.utwente.nl/81420/ (accessed on 1 February 2021).
- Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172, 107148. [Google Scholar] [CrossRef]
- Mozaffari, M.; Saad, W.; Bennis, M.; Nam, Y.H.; Debbah, M. A tutorial on UAVs for wireless networks: Applications, challenges, and open problems. IEEE Commun. Surv. Tutor. 2019, 21, 2334–2360. [Google Scholar] [CrossRef] [Green Version]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
MDPI and ACS Style
Krul, S.; Pantos, C.; Frangulea, M.; Valente, J. Visual SLAM for Indoor Livestock and Farming Using a Small Drone with a Monocular Camera: A Feasibility Study. Drones 2021, 5, 41. https://doi.org/10.3390/drones5020041
AMA Style
Krul S, Pantos C, Frangulea M, Valente J. Visual SLAM for Indoor Livestock and Farming Using a Small Drone with a Monocular Camera: A Feasibility Study. Drones. 2021; 5(2):41. https://doi.org/10.3390/drones5020041
Chicago/Turabian Style
Krul, Sander, Christos Pantos, Mihai Frangulea, and João Valente. 2021. "Visual SLAM for Indoor Livestock and Farming Using a Small Drone with a Monocular Camera: A Feasibility Study" Drones 5, no. 2: 41. https://doi.org/10.3390/drones5020041
APA Style
Krul, S., Pantos, C., Frangulea, M., & Valente, J. (2021). Visual SLAM for Indoor Livestock and Farming Using a Small Drone with a Monocular Camera: A Feasibility Study. Drones, 5(2), 41. https://doi.org/10.3390/drones5020041