Optimizing Deep Learning Algorithms for Effective Chicken Tracking through Image Processing
Abstract
1. Introduction
2. Materials and Methods
2.1. Video Collection Setup
2.2. Deep Learning Algorithm
2.2.1. Detection Model—YOLOv8n
- Bounding Box Loss (train/box_loss): this metric measures the error in predicting the coordinates of the bounding boxes.
- Classification Loss (train/cls_loss): this metric measures the error in classifying the objects.
- Distribution Focal Loss (train/dfl_loss): this metric refines the predicted distribution of bounding-box boundaries. (YOLOv8 is anchor-free and logs dfl_loss rather than the objectness loss of earlier YOLO versions.)
- Bounding Box Loss (val/box_loss): similar to the training bounding box loss, this validation metric tends to decrease, reflecting the improved generalization of the model for object localization on unseen data.
- Classification Loss (val/cls_loss): the reduction in validation classification loss indicates the enhanced generalization for object classification.
- Distribution Focal Loss (val/dfl_loss): this metric, which also decreases over the epochs, shows that the model’s box-boundary predictions improve on the validation set.
- Precision (metrics/precision(B)): precision measures the accuracy of positive predictions; the (B) suffix in Ultralytics logs denotes box-level metrics. A rising precision curve indicates that the model produces fewer false positives.
- Recall (metrics/recall(B)): recall measures the model’s ability to detect all relevant instances. The upward trend in recall shows that the model becomes more effective at finding all true positives over time.
- Mean Average Precision at IoU = 0.50 (metrics/mAP50(B)): this metric evaluates the precision of the model at an intersection-over-union (IoU) threshold of 0.50.
- Mean Average Precision at IoU = 0.50:0.95 (metrics/mAP50-95(B)): this metric averages the precision over multiple IoU thresholds from 0.50 to 0.95.
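The IoU-thresholded metrics above can be made concrete with a short sketch. The code below is illustrative and not from the article: a minimal IoU computation for axis-aligned boxes (assumed format `(x1, y1, x2, y2)`) and a greedy precision/recall evaluation at a single IoU threshold. Full mAP additionally averages precision over recall levels and, for mAP50-95, over ten IoU thresholds.

```python
def iou(a, b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def precision_recall(preds, truths, thr=0.50):
    """Greedy one-to-one matching of predictions to ground truths at IoU >= thr."""
    matched = set()
    tp = 0
    for p in preds:
        best, best_j = 0.0, None
        for j, t in enumerate(truths):
            if j in matched:
                continue
            score = iou(p, t)
            if score > best:
                best, best_j = score, j
        if best >= thr and best_j is not None:
            matched.add(best_j)   # each ground truth may be matched once
            tp += 1
    fp = len(preds) - tp          # unmatched predictions are false positives
    fn = len(truths) - tp         # unmatched ground truths are false negatives
    precision = tp / (tp + fp) if preds else 0.0
    recall = tp / (tp + fn) if truths else 0.0
    return precision, recall
```

A prediction only counts as a true positive when it both overlaps a ground-truth box above the threshold and that box has not already been claimed, which is why precision and recall at mAP50 can diverge from raw classification accuracy.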
2.2.2. The New Approach: A Modified Deep Learning Algorithm
3. Results
3.1. YOLO Detection Model
3.2. Evaluation of the New Modified Deep Learning Algorithm
3.2.1. Scenario One: Reappearing Chicken with Temporary Invisibility
3.2.2. Scenario Two: Multiple Missing Chickens, Occlusion with Objects
3.2.3. Scenario Three: Multiple Missing Chickens, Coalescing Chickens
3.3. Tracking Four Chickens Using a Modified Deep Learning Algorithm
4. Discussion
5. Practical Applications and Perspectives
- Validation in Diverse Environments: Validating the algorithm in different poultry environments, including various housing systems and flock sizes, will ensure its robustness and generalizability.
- Artificial Intelligence Integration: Integrating artificial intelligence and machine learning technologies with the algorithm can enhance its predictive capabilities and adaptability, enabling more sophisticated analyses of bird behavior and health.
- Collaborative Research: Interdisciplinary collaboration among researchers, veterinarians, and industry professionals is essential to advance the field. Combining expertise from various domains can lead to innovative solutions and broader applications.
- Technological Innovations: Continuous advances in video sensors and image processing techniques will further refine the algorithm’s performance. Incorporating these innovations can drive significant progress in automated poultry monitoring.
6. Conclusions
- Bird Overlap and Occlusion: Despite advances, challenges remain such as bird overlap and temporary occlusion by objects. These situations can lead to misidentification and tracking errors, particularly in environments with high bird density or complex structures.
- Environmental Variability: The algorithm’s performance can be affected by varying lighting conditions, camera angles, and backgrounds typical in different poultry farming setups. These factors can introduce noise and reduce the accuracy of detection and tracking.
- Generalization Across Different Contexts: While the algorithm shows promise in specific environments, its generalizability to different poultry contexts and species requires further validation. Adapting the algorithm to diverse farming practices and housing conditions is essential for broader applicability.
- Real-Time Processing: Although the YOLOv8n model is lightweight, the computational requirements for real-time processing in large-scale poultry facilities may pose challenges. Ensuring the algorithm can operate efficiently without significant delays is crucial for practical implementation.
- Behavioral Complexity: The dynamic and often unpredictable nature of bird behavior adds another layer of complexity. The algorithm needs continuous refinement to accurately interpret and respond to various behavioral patterns and interactions among birds.
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Algorithm A1: Pseudo-code for chicken tracking algorithm (with head and tail removal).
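The pseudo-code of Algorithm A1 itself is not reproduced here. As a loosely hedged illustration of the detect-then-associate pattern such a tracker follows, the sketch below assigns stable IDs by nearest-centroid matching and keeps unmatched tracks alive so a temporarily invisible chicken can reclaim its previous ID (cf. Scenario One). The `detect` callable and the `max_dist` threshold are hypothetical stand-ins, not the article's method.

```python
def track(frames, detect, max_dist=50.0):
    """Generic detect-then-associate tracking sketch (not Algorithm A1).

    frames   -- iterable of video frames
    detect   -- hypothetical detector returning a list of (x, y) centroids
    max_dist -- maximum pixel distance for matching a detection to a track
    """
    tracks = {}      # track id -> last known centroid (x, y)
    next_id = 0
    history = []     # per-frame mapping of id -> centroid
    for frame in frames:
        centroids = detect(frame)
        assigned = {}
        free = dict(tracks)               # tracks still available this frame
        for c in centroids:
            # match detection to the nearest unclaimed track within max_dist
            best_id, best_d = None, max_dist
            for tid, pos in free.items():
                d = ((c[0] - pos[0]) ** 2 + (c[1] - pos[1]) ** 2) ** 0.5
                if d < best_d:
                    best_id, best_d = tid, d
            if best_id is None:
                best_id = next_id         # no match: start a new track
                next_id += 1
            else:
                free.pop(best_id)
            assigned[best_id] = c
        tracks.update(assigned)           # unmatched old tracks persist
        history.append(dict(assigned))
    return history
```

Because stale tracks are never deleted, a detection that vanishes for a few frames and reappears near its last position is re-associated with the same ID rather than spawning a new one.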
References
| Scenario | Precision | Accuracy | F1-Score | Specificity | Sensitivity | Error Rate | Kappa Coefficient |
|---|---|---|---|---|---|---|---|
| I | 1 | 1 | 1 | N/A | 1 | 0 | 1 |
| II | 0.993 | 0.993 | 0.996 | 0 | 0.996 | 0.007 | 0.756 |
| III | 0.983 | 0.983 | 0.992 | 0 | 0.992 | 0.017 | 0.719 |
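The columns of the table above follow the standard binary-classification definitions, including Cohen's kappa for agreement beyond chance. The sketch below restates those formulas from confusion-matrix counts; the counts in the test are hypothetical and are not the study's data.

```python
def summary(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    n = tp + fp + fn + tn
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)                          # a.k.a. recall
    specificity = tn / (tn + fp) if (tn + fp) else float('nan')
    accuracy = (tp + tn) / n
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    error_rate = 1 - accuracy
    # Cohen's kappa: observed agreement corrected for chance agreement
    p_o = accuracy
    p_e = ((tp + fp) / n) * ((tp + fn) / n) + ((fn + tn) / n) * ((fp + tn) / n)
    kappa = (p_o - p_e) / (1 - p_e) if p_e != 1 else float('nan')
    return dict(precision=precision, accuracy=accuracy, f1=f1,
                specificity=specificity, sensitivity=sensitivity,
                error_rate=error_rate, kappa=kappa)
```

Note that when true negatives are scarce, as in the table's Scenarios II and III, accuracy can stay high while kappa drops, since kappa penalizes agreement that chance alone would produce.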
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Mehdizadeh, S.A.; Siriani, A.L.R.; Pereira, D.F. Optimizing Deep Learning Algorithms for Effective Chicken Tracking through Image Processing. AgriEngineering 2024, 6, 2749-2767. https://doi.org/10.3390/agriengineering6030160