YOLO-Based Model for Automatic Detection of Broiler Pathological Phenomena through Visual and Thermal Images in Intensive Poultry Houses
Abstract
1. Introduction
2. Materials and Methods
2.1. The Dataset Characteristics
2.2. Thermal Image Processing
2.3. Feature Extraction from Thermal Images
2.4. Experimental Environment and Hyper-Parameter Settings
2.5. Data Augmentation
2.6. Mosaic Data Augmentation
2.7. Architecture of YOLOv8
2.8. Improved YOLOv8 with Anchor-Free Detection
2.9. Evaluation Metrics
2.9.1. Detection Evaluation Metrics: Mean Average Precision (mAP)
2.9.2. Performance Assessors
2.10. Training the YOLOv8 Model
3. Results and Discussion
3.1. Experimental Analysis of Chicken Detection and Disease Classification
3.2. Model Comparison and the Influence of Different Dataset Augmentation Methods
3.3. Capacity of the Developed Models
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Willits-Smith, A.; Aranda, R.; Heller, M.C.; Rose, D. Addressing the carbon footprint, healthfulness, and costs of self-selected diets in the USA: A population-based cross-sectional study. Lancet Planet. Health 2020, 4, e98–e106.
- Baéza, E.; Guillier, L.; Petracci, M. Review: Production factors affecting poultry carcass and meat quality attributes. Animal 2022, 16, 100331.
- Neethirajan, S.; Tuteja, S.K.; Huang, S.T.; Kelton, D. Recent advancement in biosensors technology for animal and livestock health management. Biosens. Bioelectron. 2017, 98, 398–407.
- Aydin, A. Development of an early detection system for lameness of broilers using computer vision. Comput. Electron. Agric. 2017, 136, 140–146.
- Dawkins, M.S.; Cain, R.; Roberts, S.J. Optical flow, flock behaviour and chicken welfare. Anim. Behav. 2012, 84, 219–223.
- Mortensen, A.K.; Lisouski, P.; Ahrendt, P. Weight prediction of broiler chickens using 3D computer vision. Comput. Electron. Agric. 2016, 123, 319–326.
- Okinda, C.; Lu, M.; Liu, L.; Nyalala, I.; Muneri, C.; Wang, J.; Zhang, H.; Shen, M. A machine vision system for early detection and prediction of sick birds: A broiler chicken model. Biosyst. Eng. 2019, 188, 229–242.
- Zhuang, X.; Bi, M.; Guo, J.; Wu, S.; Zhang, T. Development of an early warning algorithm to detect sick broilers. Comput. Electron. Agric. 2018, 144, 102–113.
- Okinda, C.; Liu, L.; Zhang, G.; Shen, M. Swine live weight estimation by adaptive neuro-fuzzy inference system. Indian J. Anim. Res. 2018, 52, 923–928.
- Wongsriworaphon, A.; Arnonkijpanich, B.; Pathumnakul, S. An approach based on digital image analysis to estimate the live weights of pigs in farm environments. Comput. Electron. Agric. 2015, 115, 26–33.
- Kurnianggoro, L.; Wahyono; Jo, K. A survey of 2D shape representation: Methods, evaluations, and future research directions. Neurocomputing 2018, 300, 1–16.
- Patel, D.; Upadhyay, S. Optical Flow Measurement Using Lucas–Kanade Method. Int. J. Comput. Appl. 2013, 61, 6–10.
- Neethirajan, S. ChickTrack—A quantitative tracking tool for measuring chicken activity. Measurement 2022, 191, 110819.
- Okinda, C.; Nyalala, I.; Korohou, T.; Okinda, C.; Wang, J.; Achieng, T.; Wamalwa, P.; Mang, T.; Shen, M. A review on computer vision systems in monitoring of poultry: A welfare perspective. Artif. Intell. Agric. 2020, 4, 184–208.
- Wang, C.; Chen, H.; Zhang, X.; Meng, C. Evaluation of a laying-hen tracking algorithm based on a hybrid support vector machine. J. Anim. Sci. Biotechnol. 2016, 7, 60.
- Lin, C.Y.; Hsieh, K.W.; Tsai, Y.C.; Kuo, Y.F. Monitoring chicken heat stress using deep convolutional neural networks. In Proceedings of the ASABE Annual International Meeting, Detroit, MI, USA, 29 July–1 August 2018; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2018; p. 1.
- Zhuang, X.; Zhang, T. Detection of sick broilers by digital image processing and deep learning. Biosyst. Eng. 2019, 179, 106–116.
- Fang, C.; Huang, J.; Cuan, K.; Zhuang, X.; Zhang, T. Comparative study on poultry target tracking algorithms based on a deep regression network. Biosyst. Eng. 2020, 190, 176–183.
- Maegawa, Y.; Ushigome, Y.; Suzuki, M.; Taguchi, K.; Kobayashi, K.; Haga, C.; Matsui, T. A new survey method using convolutional neural networks for automatic classification of bird calls. Ecol. Inform. 2021, 61, 101164.
- Li, P.; Lu, H.; Wang, F.; Zhao, S.; Wang, N. Detection of sick laying hens by infrared thermal imaging and deep learning. J. Phys. Conf. Ser. 2021, 2025, 012008.
- Nasirahmadi, A.; Gonzalez, J.; Sturm, B.; Hensel, O.; Knierim, U. Pecking activity detection in group-housed turkeys using acoustic data and a deep learning technique. Biosyst. Eng. 2020, 194, 40–48.
- Cuan, K.; Zhang, T.; Li, Z.; Huang, J.; Ding, Y.; Fang, C. Automatic Newcastle disease detection using sound technology and deep learning method. Comput. Electron. Agric. 2022, 194, 106740.
- Nasiri, A.; Yoder, J.; Zhao, Y.; Hawkins, S.; Prado, M.; Gan, H. Pose estimation-based lameness recognition in broiler using CNN-LSTM network. Comput. Electron. Agric. 2022, 197, 106931.
- Pu, H.; Lian, J.; Fan, M. Automatic recognition of flock behavior of chickens with convolutional neural network and Kinect sensor. Int. J. Pattern Recognit. Artif. Intell. 2018, 32, 1850023.
- Cheng, D.; Rong, T.; Cao, G. Density map estimation for crowded chicken. In Image and Graphics; Zhao, Y., Barnes, N., Chen, B., Westermann, R., Kong, X., Lin, C., Eds.; ICIG 2019; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2019; Volume 11903.
- Cuan, K.; Zhang, T.; Huang, J.; Fang, C.; Guan, Y. Detection of avian influenza-infected chickens based on a chicken sound convolutional neural network. Comput. Electron. Agric. 2020, 178, 105688.
- Geffen, O.; Yitzhaky, Y.; Barchilon, N.; Druyan, S.; Halachmi, I. A machine vision system to detect and count laying hens in battery cages. Animal 2020, 14, 2628–2634.
- Li, G.; Ji, B.; Li, B.; Shi, Z.; Zhao, Y.; Dou, Y.; Brocato, J. Assessment of layer pullet drinking behaviors under selectable light colors using convolutional neural network. Comput. Electron. Agric. 2020, 172, 105333.
- Yao, Y.; Yu, H.; Mu, J.; Li, J.; Pu, H. Estimation of the Gender Ratio of Chickens Based on Computer Vision: Dataset and Exploration. Entropy 2020, 22, 719.
- Zhang, H.; Chen, C. Design of Sick Chicken Automatic Detection System Based on Improved Residual Network. In Proceedings of the 2020 IEEE 4th Information Technology, Networking, Electronic and Automation Control Conference, Chongqing, China, 12–14 June 2020; pp. 2480–2485.
- Cao, L.; Xiao, Z.; Liao, X.; Yao, Y.; Wu, K.; Mu, J.; Li, J.; Pu, H. Automated chicken counting in surveillance camera environments based on the point supervision algorithm: LC-DenseFCN. Agriculture 2021, 11, 493.
- Li, G.; Zhao, Y.; Porter, Z.; Purswell, J.L. Automated measurement of broiler stretching behaviors under four stocking densities via faster region-based convolutional neural network. Animal 2021, 15, 100059.
- Jung, D.H.; Kim, N.Y.; Moon, S.H.; Kim, H.S.; Lee, T.S.; Yang, J.S.; Lee, J.Y.; Han, X.; Park, S.H. Classification of Vocalization Recordings of Laying Hens and Cattle Using Convolutional Neural Network Models. J. Biosyst. Eng. 2021, 46, 217–224.
- Fanioudakis, L.; Potamitis, I. Deep Networks tag the location of bird vocalisations on audio spectrograms. arXiv 2017, arXiv:1711.04347.
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
- Li, G.; Suo, R.; Zhao, G.; Gao, C.; Fu, L.; Shi, F.; Dhupia, J.; Li, R.; Cui, Y. Real-time detection of kiwifruit flower and bud simultaneously in orchard using YOLOv4 for robotic pollination. Comput. Electron. Agric. 2022, 193, 106641.
- Lu, S.Y.; Wang, B.Z.; Wang, H.J.; Chen, L.H.; Ma, L.J.; Zhang, X.Y. A real-time object detection algorithm for video. Comput. Electr. Eng. 2019, 77, 398–408.
- Yan, B.; Fan, P.; Lei, X.Y.; Liu, Z.J.; Yang, F.Z. A real-time apple targets detection method for picking robot based on improved YOLOv5. Remote Sens. 2021, 13, 1619.
- Jiang, K.; Xie, T.; Yan, R.; Wen, X.; Li, D.; Jiang, H.; Jiang, N.; Feng, L.; Duan, X.; Wang, J. An attention mechanism-improved YOLOv7 object detection algorithm for hemp duck count estimation. Agriculture 2022, 12, 1659.
- Wang, C.; Bochkovskiy, A.; Liao, H.Y.M. YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. arXiv 2022, arXiv:2207.02696.
- Siriani, A.L.R.; Kodaira, V.; Mehdizadeh, S.A.; Nääs, I.; Jorge de Moura, D.; Pereira, D.F. Detection and tracking of chickens in low-light images using YOLO network and Kalman filter. Neural Comput. Appl. 2022, 34, 21987–21997.
- Roboflow Blog: What’s New in YOLOv8. Available online: https://blog.roboflow.com/whats-new-in-yolov8/ (accessed on 13 January 2023).
- Dataset. Available online: https://drive.google.com/drive/folders/1jj9LKL0d1YDyDez8xrmKWRWd3psFoeZ2?usp=sharing (accessed on 25 February 2023).
- Roboflow Annotate. Available online: https://roboflow.com/annotate (accessed on 10 January 2023).
- Ciaglia, F.; Zuppichini, F.S.; Guerrie, P.; McQuade, M.; Solawetz, J. Roboflow 100: A Rich, Multi-Domain Object Detection Benchmark. arXiv 2022, arXiv:2211.13523.
- Roboflow Documentation. Available online: https://docs.roboflow.com/ (accessed on 13 January 2023).
- Gu, Y.; Wang, S.C.; Yan, Y.; Tang, S.J.; Zhao, S.D. Identification and analysis of emergency behavior of cage-reared laying ducks based on YOLOv5. Agriculture 2022, 12, 485.
- Tian, Y.N.; Yang, G.D.; Wang, Z.; Wang, H.; Li, E.; Liang, Z.Z. Apple detection during different growth stages in orchards using the improved YOLO-V3 model. Comput. Electron. Agric. 2019, 157, 417–426.
- Grubbly Farms. Available online: https://grubblyfarms.com (accessed on 25 October 2022).
- Poultry Hub. Available online: https://www.poultryhub.org (accessed on 5 September 2022).
- UNI-T Thermal Imaging Software. Available online: https://thermal.uni-trend.com/service-support/download/ (accessed on 25 November 2022).
- PyTorch 1.8.1. Available online: https://pytorch.org/get-started/previous-versions/ (accessed on 15 January 2023).
- Python 3.8. Available online: https://www.python.org/ (accessed on 15 January 2023).
- Kisantal, M.; Wojna, Z.; Murawski, J.; Naruniec, J.; Cho, K. Augmentation for small object detection. arXiv 2019, arXiv:1902.07296.
- Zulkifley, M.A.; Moubark, A.M.; Saputro, A.H.; Abdani, S.R. Automated Apple Recognition System Using Semantic Segmentation Networks with Group and Shuffle Operators. Agriculture 2022, 12, 756.
- Chen, L.Y.; Zheng, M.C.; Duan, S.Q.; Luo, W.L.; Yao, L.G. Underwater Target Recognition Based on Improved YOLOv4 Neural Network. Electronics 2021, 10, 1634.
- Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. YOLOv4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934.
- Roboflow 100. Available online: https://www.rf100.org/ (accessed on 13 January 2023).
- Ahmed, G.; Malick, R.A.S.; Akhunzada, A.; Zahid, S.; Sagri, M.R.; Gani, A. An approach towards IoT-based predictive service for early detection of diseases in poultry chickens. Sustainability 2021, 13, 13396.
- Wu, D.; Jiang, S.; Zhao, E.; Liu, Y.; Zhu, H.; Wang, W.; Wang, R. Detection of Camellia oleifera Fruit in Complex Scenes by Using YOLOv7 and Data Augmentation. Appl. Sci. 2022, 12, 11318.
- Everingham, M.; Gool, L.V.; Williams, C.K.I.; Winn, J.; Zisserman, A. The PASCAL Visual Object Classes (VOC) Challenge. Int. J. Comput. Vis. 2010, 88, 303–338.
**Hyper-parameter settings used for training YOLOv7, YOLOv5, and YOLOv8.**

| Hyperparameter | YOLOv7 | YOLOv5 | YOLOv8 |
|---|---|---|---|
| Initial learning rate | 0.01 | 0.01 | 0.01 |
| Final learning rate | 0.1 | 0.01 | 0.01 |
| Momentum | 0.937 | 0.937 | 0.937 |
| Weight decay | 0.0005 | 0.0005 | 0.0005 |
| Warmup epochs | 3 | 3 | 3 |
| Warmup initial momentum | 0.8 | 0.8 | 0.8 |
| Warmup initial bias learning rate | 0.1 | 0.1 | 0.1 |
| Box loss gain | 0.05 | 0.05 | 7.5 |
| Classification loss gain | 0.3 | 0.5 | 0.5 |
| Classification binary cross-entropy loss positive weight | 1 | 1 | 1.5 |
| Objectness loss gain | 0.7 | 1 | – |
| Objectness binary cross-entropy loss positive weight | 1 | 1 | – |
| IoU training threshold | 0.2 | 0.2 | 7 |
| Anchor multiple threshold | 4 | 4 | 4.43 |
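To make the YOLOv8 column concrete, the following is a minimal sketch of how these values map onto the Ultralytics training API. The dataset YAML name, epoch count, and image size are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch: launching YOLOv8 training with the table's hyper-parameters
# via the Ultralytics API. "broilers.yaml", epochs, and imgsz are assumptions.
from ultralytics import YOLO

model = YOLO("yolov8s.pt")  # pretrained checkpoint as the starting point

model.train(
    data="broilers.yaml",    # hypothetical dataset config (train/val paths, class names)
    epochs=100,              # assumed; not specified in the table
    imgsz=640,               # assumed input resolution
    lr0=0.01,                # initial learning rate
    lrf=0.01,                # final learning rate (fraction of lr0)
    momentum=0.937,
    weight_decay=0.0005,
    warmup_epochs=3,
    warmup_momentum=0.8,
    warmup_bias_lr=0.1,
    box=7.5,                 # box loss gain (YOLOv8 column)
    cls=0.5,                 # classification loss gain
)
```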
**Data augmentation methods and their configurations.**

| Augmentation Method | Configuration |
|---|---|
| Flip | Horizontal and vertical |
| Flip of bounding box | Horizontal and vertical |
| Rotation | Between −30° and +30° |
| Rotation of bounding box | Between −15° and +15° |
| Blur | Up to 1.25 px |
| Blur of bounding box | Up to 10 px |
| Cutout | 10 boxes, each 5% of image size |
| 90° rotation of bounding box | Clockwise, counter-clockwise, upside down |
| Hue | Between −25° and +25° |
| Exposure | Between −20% and +20% |
| Shear | ±20° horizontal, ±20° vertical |
| Brightness | Between −30% and +30% |
| Noise (salt) | Up to 5% of pixels |
| Saturation | Between −30% and +30% |
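The paper applied these augmentations in Roboflow; the sketch below approximates the image-level ones with the Albumentations library so the pipeline can be reproduced locally. The mapping is inexact by construction: blur here uses odd kernel sizes rather than a 1.25 px radius, hue/saturation shifts are in OpenCV channel units rather than degrees/percent, and `PixelDropout` is only a rough stand-in for salt noise. All probabilities are assumed values.

```python
# Approximate local re-creation of the augmentation table with Albumentations.
# Limits are best-effort translations of the table's settings, not exact.
import albumentations as A

transform = A.Compose(
    [
        A.HorizontalFlip(p=0.5),
        A.VerticalFlip(p=0.5),
        A.Rotate(limit=30, p=0.5),                        # between -30 and +30 degrees
        A.Blur(blur_limit=3, p=0.3),                      # mild blur (kernel-based approximation)
        A.CoarseDropout(max_holes=10, max_height=32,
                        max_width=32, p=0.3),             # cutout: 10 boxes, ~5% of a 640 px image
        A.HueSaturationValue(hue_shift_limit=25,
                             sat_shift_limit=30,
                             val_shift_limit=20, p=0.5),  # hue/saturation/exposure approximation
        A.RandomBrightnessContrast(brightness_limit=0.3,
                                   contrast_limit=0.0, p=0.5),  # brightness +/-30%
        A.Affine(shear={"x": (-20, 20), "y": (-20, 20)}, p=0.3),  # shear +/-20 degrees
        A.PixelDropout(dropout_prob=0.05, drop_value=255, p=0.3),  # rough salt-noise stand-in
    ],
    bbox_params=A.BboxParams(format="yolo", label_fields=["class_labels"]),
)

# usage: out = transform(image=img, bboxes=yolo_boxes, class_labels=labels)
```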
**Detection performance of YOLOv8, YOLOv7, and YOLOv5 on the thermal and visual image datasets.**

| YOLO Version | Precision | Recall | F1 Score | mAP50 | mAP50-95 |
|---|---|---|---|---|---|
| **Thermal image dataset** | | | | | |
| Version 8 | 0.988 | 0.956 | 0.972 | 0.988 | 0.857 |
| Version 7 | 0.989 | 0.944 | 0.966 | 0.976 | 0.823 |
| Version 5 | 0.929 | 0.917 | 0.923 | 0.952 | 0.587 |
| **Visual image dataset** | | | | | |
| Version 8 | 0.861 | 0.691 | 0.767 | 0.829 | 0.679 |
| Version 7 | 0.802 | 0.850 | 0.825 | 0.905 | 0.649 |
| Version 5 | 0.950 | 0.404 | 0.567 | 0.497 | 0.298 |
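For readers reproducing these columns, the sketch below shows how the metrics are derived: detections are sorted by confidence and flagged true/false positive against ground truth (IoU ≥ 0.5 for mAP50), AP is the area under the interpolated precision-recall curve, mAP averages AP over classes, and mAP50-95 further averages over IoU thresholds 0.50:0.05:0.95. The toy arrays are illustrative, not the paper's data; only the F1 check reuses a value from the table.

```python
# Sketch of AP and F1 computation from per-detection outcomes.
import numpy as np

def average_precision(tp: np.ndarray, n_gt: int) -> float:
    """AP from detections sorted by descending confidence; tp is 1 (TP) or 0 (FP)."""
    cum_tp = np.cumsum(tp)
    cum_fp = np.cumsum(1 - tp)
    recall = cum_tp / n_gt
    precision = cum_tp / (cum_tp + cum_fp)
    # all-point interpolation: make precision monotonically non-increasing
    precision = np.maximum.accumulate(precision[::-1])[::-1]
    # close the curve at recall 0, then integrate precision over recall
    recall = np.concatenate(([0.0], recall))
    precision = np.concatenate(([precision[0]], precision))
    return float(np.trapz(precision, recall))

def f1(p: float, r: float) -> float:
    return 2 * p * r / (p + r)

tp = np.array([1, 1, 0, 1, 0, 1])              # toy outcomes for one class
print(average_precision(tp, n_gt=5))           # AP for this toy example
print(round(f1(0.988, 0.956), 3))              # 0.972, matching the YOLOv8 thermal row
```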
**Influence of different data augmentation methods on detection performance for the thermal and visual image datasets.**

| Augmentation Method | Dataset Type | Precision | Recall | F1 Score | mAP50 | mAP50-95 |
|---|---|---|---|---|---|---|
| Flip + rotation + blur + cutout | Thermal | 0.719 | 0.451 | 0.554 | 0.557 | 0.281 |
| | Visual | 0.914 | 0.529 | 0.670 | 0.573 | 0.349 |
| Brightness (±30%) | Thermal | 0.750 | 0.459 | 0.570 | 0.552 | 0.316 |
| | Visual | 0.969 | 0.592 | 0.735 | 0.654 | 0.412 |
| Horizontal flips | Thermal | 0.743 | 0.408 | 0.527 | 0.504 | 0.286 |
| | Visual | 0.707 | 0.615 | 0.658 | 0.661 | 0.396 |
| Raw dataset (non-augmented) | Thermal | 0.623 | 0.439 | 0.515 | 0.478 | 0.278 |
| | Visual | 0.780 | 0.529 | 0.630 | 0.652 | 0.535 |
| Cutout | Thermal | 0.595 | 0.399 | 0.478 | 0.469 | 0.282 |
| | Visual | 0.964 | 0.606 | 0.744 | 0.668 | 0.422 |
| Shear + brightness + noise | Thermal | 0.498 | 0.391 | 0.438 | 0.424 | 0.235 |
| | Visual | 0.903 | 0.539 | 0.675 | 0.595 | 0.338 |
| Hue + saturation + exposure + 90° rotation of bounding box | Thermal | 0.430 | 0.340 | 0.380 | 0.399 | 0.230 |
| | Visual | 0.883 | 0.493 | 0.633 | 0.537 | 0.337 |
| Brightness (±25%) of bounding box only | Thermal | 0.251 | 0.317 | 0.280 | 0.314 | 0.154 |
| | Visual | 0.851 | 0.501 | 0.631 | 0.546 | 0.300 |
| Flip, rotation and blur of bounding box only | Thermal | 0.039 | 0.063 | 0.048 | 0.040 | 0.008 |
| | Visual | 0.766 | 0.222 | 0.344 | 0.301 | 0.146 |
| 90° rotation + grayscale + box-shear of bounding box | Thermal | 0.664 | 0.044 | 0.083 | 0.073 | 0.017 |
| | Visual | 0.901 | 0.232 | 0.369 | 0.296 | 0.144 |
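The gap between the mAP50 and mAP50-95 columns above comes down to the IoU threshold at which a prediction counts as a true positive. The sketch below shows that computation on illustrative boxes (corner format `(x1, y1, x2, y2)`; the coordinates are made up): a moderately accurate box passes the 0.5 threshold used for mAP50 but fails the stricter thresholds that mAP50-95 averages over, which is why loose localization depresses mAP50-95 more than mAP50.

```python
# Sketch of the IoU test that separates mAP50 from mAP50-95.
def iou(a, b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

pred, gt = (10, 10, 60, 60), (15, 12, 62, 58)   # illustrative boxes
v = iou(pred, gt)                                # ~0.80
print(v, [v >= t for t in (0.5, 0.75, 0.95)])    # TP at 0.5 and 0.75, FP at 0.95
```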