Intelligent Grazing UAV Based on Airborne Depth Reasoning
Abstract
1. Introduction
2. Materials and Methods
2.1. Introduction to the Study Area and Study Object
2.2. Study Workflow
- Data acquisition. Video-stream image data are captured by flying the P600 intelligent UAV, equipped with a three-axis photoelectric pod, to a specified location.
- Animal detection and localization. The acquired video-stream data are reshaped into 299 × 299 images and processed by the improved YOLOv5 animal detection model, which predicts the presence or absence of animals and maps each detection to an initial ROI prediction box.
- Trajectory recording. Once an object of interest is detected, the KCF (kernelized correlation filter) algorithm is used to track it, chosen for its high precision and high processing speed in both tracking quality and tracking rate. At each frame, the KCF tracker checks whether the target animal is still present in the detection box; if so, the target is learned and tracked, otherwise the next frame is re-examined to re-detect the animal of interest. Fast extraction of the tracked trajectory yields more accurate ROI annotation boxes, which in turn support the extraction of more realistic visual features.
- Generate space–time trajectories. The individual ROI annotation boxes obtained from KCF are assembled into a set of space–time trajectories.
- Individual prediction. The weights of both the CNN and the Long Short-Term Memory (LSTM) model are shared across time, allowing real-time identification and tracking of targets in the video. Each space–time trajectory is rescaled and passed through an Inception V3 network up to the third pooling layer, where visual features are extracted from the input frames and fed into an LSTM recurrent neural network; the recurrent state is then combined with subsequent image frames as the time-lapse sequence is processed. After the whole spatiotemporal trajectory has been processed, a fully connected layer produces the final ID prediction for the input sequence.
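The detect-then-track loop described in the steps above can be sketched as a small state machine: run the detector until an animal is found, hand the ROI to a tracker, and fall back to detection when the track is lost. This is an illustrative sketch only, not the authors' implementation; the YOLOv5 detector and KCF tracker are replaced by hypothetical toy stand-ins.

```python
class DetectTrackPipeline:
    """Toy sketch of the detect/track loop: detect until an animal is
    found, then track its ROI; if the track is lost, re-detect on
    subsequent frames while accumulating a space-time trajectory."""

    def __init__(self, detector, make_tracker):
        self.detector = detector          # frame -> ROI or None
        self.make_tracker = make_tracker  # (frame, roi) -> callable tracker
        self.tracker = None
        self.trajectory = []              # accumulated (t, ROI) pairs

    def step(self, t, frame):
        if self.tracker is None:
            roi = self.detector(frame)
            if roi is None:
                return None               # nothing found, try next frame
            self.tracker = self.make_tracker(frame, roi)
        else:
            roi = self.tracker(frame)
            if roi is None:               # track lost: re-detect later
                self.tracker = None
                return None
        self.trajectory.append((t, roi))
        return roi


# Toy stand-ins: "frames" are plain ints; detection succeeds from
# frame value 2 onward, and the tracker simply keeps the initial ROI.
def toy_detector(frame):
    return (0, 0, 50, 30) if frame >= 2 else None

def toy_tracker(frame, roi):
    return lambda f: roi

pipeline = DetectTrackPipeline(toy_detector, toy_tracker)
for t, frame in enumerate([0, 1, 2, 3, 4]):
    pipeline.step(t, frame)
print(len(pipeline.trajectory))  # frames 2..4 contribute -> 3
```

In the real system, `toy_detector` corresponds to a YOLOv5 inference call and `toy_tracker` to a KCF update; the accumulated `trajectory` is what the LRCN stage consumes.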
2.3. Data Acquisition
2.4. Hardware Communication Architecture
2.5. Experimental Platform
3. Airborne Depth Reasoning Network
3.1. Annotation and Augmentation of Training Data
3.2. Species Detection and Localization Based on Improved YOLOv5
- 1. Modification of the anchor box size
- 2. Improvement of the neck layer
- 3. Introduction of attention mechanisms
- 4. Optimization of the backbone layer
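As one concrete example of the attention mechanisms listed above, a squeeze-and-excitation (SE) channel-attention block can be inserted into a detector's feature maps. The section heading does not specify which mechanism was adopted, so the NumPy sketch below is a hypothetical illustration of the general idea, not the paper's module.

```python
import numpy as np

def se_attention(x, w1, w2):
    """Squeeze-and-excitation channel attention over a (C, H, W) feature map.

    w1: (C // r, C) reduction weights; w2: (C, C // r) expansion weights,
    where r is the channel reduction ratio.
    """
    z = x.mean(axis=(1, 2))                # squeeze: global average pool -> (C,)
    s = np.maximum(w1 @ z, 0.0)            # excitation: FC + ReLU -> (C // r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))    # FC + sigmoid -> per-channel weight
    return x * s[:, None, None]            # reweight each channel


rng = np.random.default_rng(0)
x = rng.random((8, 4, 4))                  # toy C=8 feature map
w1 = rng.standard_normal((2, 8)) * 0.1     # reduction ratio r = 4
w2 = rng.standard_normal((8, 2)) * 0.1
y = se_attention(x, w1, w2)
print(y.shape)  # (8, 4, 4)
```

Because the sigmoid gate lies in (0, 1), the block only rescales channels; the feature map's shape is unchanged, which is why such blocks can be dropped into an existing backbone or neck without altering downstream layers.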
3.3. LRCN Identification of Cattle from Real-Time Aerial Photography
4. Results
4.1. Detection and Location of Cattle
4.2. Accuracy Comparison
4.3. Video-Based LRCN Identification
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Feature | Illustration |
|---|---|
| Tonal | Black, gray-black, and other dark colors |
| Color | The main colors are white, black-white, and black |
| Texture | A single solid color or a splicing of several solid colors |
| Size | Adult domestic cattle have a body length of about 1.6–2.2 m; at the survey imaging resolution, an individual spans more than 40–50 pixels in length |
| Shape | The overall shape is nearly elliptical or rectangular; the length-to-width ratio is mostly between 1.4:1 and 3:1 |
| Group image | (image) |
| Individual recognition paradigm | (image) |
| Shape features | (image) |
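The size and shape priors in the table above suggest a simple plausibility filter for candidate detection boxes. The function below is a hypothetical sketch using those ranges (longer side over 40 pixels, length-to-width ratio between 1.4:1 and 3:1); it is not part of the published pipeline.

```python
def plausible_cattle_box(w_px, h_px, min_len_px=40, ratio_range=(1.4, 3.0)):
    """Return True if a detection box matches the cattle shape priors:
    the longer side exceeds min_len_px and the length-to-width ratio
    falls within ratio_range (orientation-agnostic)."""
    long_side, short_side = max(w_px, h_px), min(w_px, h_px)
    if short_side <= 0 or long_side < min_len_px:
        return False
    ratio = long_side / short_side
    return ratio_range[0] <= ratio <= ratio_range[1]

print(plausible_cattle_box(90, 45))   # ratio 2.0, long enough -> True
print(plausible_cattle_box(30, 15))   # longer side under 40 px -> False
print(plausible_cattle_box(200, 40))  # ratio 5.0, too elongated -> False
```

Such a filter could cheaply reject implausible candidates before the heavier tracking and identification stages.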
| Network | FPS | Precision | Recall | Average Precision | Model Size |
|---|---|---|---|---|---|
| Faster RCNN | 10.24 | 0.964 | 0.893 | 0.971 | 345 MB |
| YOLOv5 | 46.37 | 0.969 | 0.902 | 0.975 | 14.5 MB |
| Modified YOLOv5 | 43.63 | 0.984 | 0.921 | 0.983 | 15.2 MB |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Luo, W.; Zhang, Z.; Fu, P.; Wei, G.; Wang, D.; Li, X.; Shao, Q.; He, Y.; Wang, H.; Zhao, Z.; et al. Intelligent Grazing UAV Based on Airborne Depth Reasoning. Remote Sens. 2022, 14, 4188. https://doi.org/10.3390/rs14174188