Using Data Augmentation to Improve the Generalization Capability of an Object Detector on Remote-Sensed Insect Trap Images
Abstract
1. Introduction
- Our findings indicate that by using the appropriate combination of augmentation methods, YOLOv5’s mean average precision value (mAP50) could be further increased from 0.844 to 0.992 and from 0.421 to 0.727 on the two remote-sensed trap image datasets.
- This study reveals that incorporating photometric image transformations into the mosaic augmentation can be more effective than the standard combination of augmentation techniques.
- The experimental results show that this approach is more effective than both the native combination of augmentation techniques and YOLOv5’s built-in image enrichment with HSV (hue, saturation, and value) and mosaic augmentations; a minimal code sketch of the modified mosaic idea follows below.
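To make the modified mosaic idea concrete, the snippet below is a minimal sketch, not the author’s exact implementation: it composes four trap images into a 2×2 mosaic and applies photometric perturbations (random brightness–contrast and Gaussian pixel noise) to each tile before pasting, then shifts the bounding boxes into the mosaic frame. The function name, parameter ranges, and the dependency-free nearest-neighbour resizing are illustrative assumptions.

```python
import numpy as np

def photometric_mosaic(images, boxes, out_size=640, rng=None):
    """Compose four images into a 2x2 mosaic, applying random photometric
    perturbations (brightness-contrast and Gaussian noise) to each tile
    before pasting. Boxes are (x1, y1, x2, y2) in pixel coordinates and
    are rescaled/translated into the mosaic frame."""
    rng = rng or np.random.default_rng()
    half = out_size // 2
    canvas = np.zeros((out_size, out_size, 3), dtype=np.uint8)
    merged_boxes = []
    offsets = [(0, 0), (half, 0), (0, half), (half, half)]  # (x_off, y_off) per tile

    for img, img_boxes, (x_off, y_off) in zip(images, boxes, offsets):
        h, w = img.shape[:2]
        # Nearest-neighbour resize via index selection keeps the sketch
        # dependency-free; a real pipeline would use a proper resize routine.
        ys = np.linspace(0, h - 1, half).astype(int)
        xs = np.linspace(0, w - 1, half).astype(int)
        tile = img[ys][:, xs].astype(np.float32)

        # Photometric perturbation: random contrast/brightness, then additive noise.
        alpha = rng.uniform(0.7, 1.3)           # contrast factor (illustrative range)
        beta = rng.uniform(-30, 30)             # brightness shift (illustrative range)
        tile = alpha * tile + beta
        tile += rng.normal(0, 8, tile.shape)    # Gaussian pixel noise
        canvas[y_off:y_off + half, x_off:x_off + half] = np.clip(tile, 0, 255).astype(np.uint8)

        # Rescale and translate the boxes into the mosaic coordinate frame.
        sx, sy = half / w, half / h
        for x1, y1, x2, y2 in img_boxes:
            merged_boxes.append((x1 * sx + x_off, y1 * sy + y_off,
                                 x2 * sx + x_off, y2 * sy + y_off))
    return canvas, merged_boxes
```

Consistent with the result tables below, the combinations that add pixel noise or brightness–contrast changes on top of mosaic composition gave the highest mAP50 values on both datasets.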
2. Materials and Methods
2.1. The YOLO Model Family
2.2. Data Augmentation
2.3. Datasets
2.4. Evaluation Metrics
3. Results and Discussion
Modified Mosaic Augmentation
4. Conclusions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Augmentation Strategy | mAP50 | mAP50-95 | Number of Images |
|---|---|---|---|
| Gamma + brightness–contrast | 0.922 | 0.605 | 990 |
| Gamma + noise | 0.914 | 0.609 | 990 |
| Gamma + translation | 0.878 | 0.591 | 990 |
| Gamma + rotation | 0.897 | 0.567 | 990 |
| Gamma + mosaic | 0.981 | 0.679 | 990 |
| Brightness–contrast + noise | 0.920 | 0.599 | 990 |
| Brightness–contrast + translation | 0.924 | 0.615 | 990 |
| Brightness–contrast + rotation | 0.919 | 0.595 | 990 |
| Brightness–contrast + mosaic | 0.989 | 0.683 | 990 |
| Noise + translation | 0.928 | 0.605 | 990 |
| Noise + rotation | 0.949 | 0.616 | 990 |
| Noise + mosaic | 0.992 | 0.672 | 990 |
| Translation + rotation | 0.968 | 0.639 | 990 |
| Translation + mosaic | 0.985 | 0.668 | 990 |
| Rotation + mosaic | 0.980 | 0.675 | 990 |
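A short note on the two metric columns, using the standard COCO-style definitions applied by YOLOv5’s validation routine rather than anything specific to this paper: mAP50 is the mean average precision at an IoU threshold of 0.50, while mAP50-95 averages the AP over IoU thresholds from 0.50 to 0.95 in steps of 0.05. The snippet below illustrates the relationship with placeholder AP values.

```python
import numpy as np

# COCO-style metric definitions as used by YOLOv5 (AP values below are placeholders).
iou_thresholds = np.arange(0.50, 1.00, 0.05)                 # 10 thresholds: 0.50, 0.55, ..., 0.95
ap_at_threshold = np.array([0.99, 0.97, 0.95, 0.92, 0.86,
                            0.78, 0.66, 0.48, 0.25, 0.06])   # placeholder per-threshold APs
map50 = ap_at_threshold[0]                                   # AP at IoU threshold 0.50
map50_95 = ap_at_threshold.mean()                            # mean AP over IoU 0.50-0.95
print(f"mAP50 = {map50:.3f}, mAP50-95 = {map50_95:.3f}")
```

Across all four tables, the augmentation strategies that include mosaic composition achieve the highest scores on both metrics.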
| Augmentation Strategy | mAP50 | mAP50-95 | Number of Images |
|---|---|---|---|
| Brightness–contrast + noise | 0.514 | 0.349 | 990 |
| Brightness–contrast + translation | 0.471 | 0.335 | 990 |
| Brightness–contrast + rotation | 0.451 | 0.323 | 990 |
| Brightness–contrast + mosaic | 0.695 | 0.459 | 990 |
| Noise + translation | 0.606 | 0.409 | 990 |
| Noise + rotation | 0.549 | 0.362 | 990 |
| Noise + mosaic | 0.701 | 0.481 | 990 |
| Translation + rotation | 0.540 | 0.362 | 990 |
| Translation + mosaic | 0.647 | 0.447 | 990 |
| Rotation + mosaic | 0.594 | 0.383 | 990 |

| Augmentation Strategy | mAP50 | mAP50-95 | Number of Images |
|---|---|---|---|
| Mosaic + noise + gamma | 0.981 | 0.647 | 1320 |
| Mosaic + noise + brightness–contrast | 0.988 | 0.686 | 1320 |
| Mosaic + noise + translation | 0.985 | 0.654 | 1320 |
| Mosaic + noise + rotation | 0.979 | 0.678 | 1320 |

| Augmentation Strategy | mAP50 | mAP50-95 | Number of Images |
|---|---|---|---|
| Mosaic + noise + brightness–contrast | 0.727 | 0.466 | 1320 |
| Mosaic + noise + translation | 0.709 | 0.460 | 1320 |
| Mosaic + noise + rotation | 0.714 | 0.464 | 1320 |
Share and Cite
Suto, J. Using Data Augmentation to Improve the Generalization Capability of an Object Detector on Remote-Sensed Insect Trap Images. Sensors 2024, 24, 4502. https://doi.org/10.3390/s24144502