YOLO-Based Light-Weight Deep Learning Models for Insect Detection System with Field Adaption
Abstract
1. Introduction
- developing a faster and more accurate YOLO-based detection and classification model with an attention mechanism that classifies the insect order and carries the classification down to the species level within the same model;
- comparing the proposed model’s performance with other state-of-the-art insect detection techniques (a minimal inference sketch illustrating this kind of pipeline follows this list).
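As a rough illustration of the detection pipeline these contributions describe, the snippet below loads a stock YOLOv5 model through PyTorch Hub and runs it on a single trap image. This is a minimal sketch, not the authors' released code; the image path and the use of pretrained COCO weights (rather than the insect-trained checkpoint) are assumptions.

```python
# Minimal YOLOv5 inference sketch (assumed setup, not the authors' code).
import torch

# Load the small stock variant; the paper also evaluates the m, l, and x variants.
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

# "trap_image.jpg" is a hypothetical field-trap photograph.
results = model("trap_image.jpg")
results.print()               # prints class, confidence, and box for each detection
detections = results.xyxy[0]  # tensor of [x1, y1, x2, y2, confidence, class]
```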
2. Related Work
2.1. Opto-Acoustic Techniques
2.2. Image Processing Techniques
2.3. Machine Learning Techniques
2.4. Deep Learning Techniques
3. Materials and Methods
3.1. Dataset Collection
3.2. Data Annotation
3.3. Data Augmentation
3.4. YOLOv5
3.5. Attention Mechanism
3.6. Experimental Setup
4. Results and Discussions
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
DL | Deep Learning |
ML | Machine Learning |
SVM | Support Vector Machine |
CNN | Convolutional Neural Network |
WSN | Wireless Sensor Network |
SPP | Spatial Pyramid Pooling |
CBAM | Convolutional Block Attention Module |
CBL | CBAM layer |
CSP | Cross Stage Partial |
BN | Batch Normalization |
ReLU | Rectified Linear Unit |
mAP | Mean Average Precision |
FPN | Feature Pyramid Network |
References
- Cheng, D.F.; Wu, K.M.; Tian, Z.; Wen, L.P.; Shen, Z.R. Acquisition and analysis of migration data from the digitised display of a scanning entomological radar. Comput. Electron. Agric. 2002, 35, 63–75.
- Martineau, M.; Conte, D.; Raveaux, R.; Arnault, I.; Munier, D.; Venturini, G. A survey on image-based insect classification. Pattern Recognit. 2017, 65, 273–284.
- Thenmozhi, K.; Reddy, U.S. Crop pest classification based on deep convolutional neural network and transfer learning. Comput. Electron. Agric. 2019, 164, 104906.
- Ngô-Muller, V.; Garrouste, R.; Nel, A. Small but important: A piece of mid-Cretaceous Burmese amber with a new genus and two new insect species (Odonata: Burmaphlebiidae & ‘Psocoptera’: Compsocidae). Cretac. Res. 2020, 110, 104405.
- Serres, J.R.; Viollet, S. Insect-inspired vision for autonomous vehicles. Curr. Opin. Insect Sci. 2018, 30, 46–51.
- Fox, R.; Harrower, C.A.; Bell, J.R.; Shortall, C.R.; Middlebrook, I.; Wilson, R.J. Insect population trends and the IUCN red list process. J. Insect Conserv. 2019, 23, 269–278.
- Krawchuk, M.A.; Meigs, G.W.; Cartwright, J.M.; Coop, J.D.; Davis, R.; Holz, A.; Kolden, C.; Meddens, A.J. Disturbance refugia within mosaics of forest fire, drought, and insect outbreaks. Front. Ecol. Environ. 2020, 18, 235–244.
- Gullan, P.J.; Cranston, P.S. The Insects: An Outline of Entomology; Wiley: Hoboken, NJ, USA, 2014.
- Wheeler, W.C.; Whiting, M.; Wheeler, Q.D.; Carpenter, J.M. The Phylogeny of the Extant Hexapod Orders. Cladistics 2001, 17, 113–169.
- Kumar, N.; Nagarathna. Survey on Computational Entomology: Sensors based Approaches to Detect and Classify the Fruit Flies. In Proceedings of the 2020 11th International Conference on Computing, Communication and Networking Technologies (ICCCNT), Kharagpur, India, 1–3 July 2020; pp. 1–6.
- Amarathunga, D.; Grundy, J.; Parry, H.; Dorin, A. Methods of insect image capture and classification: A systematic literature review. Smart Agric. Technol. 2021, 1, 100023.
- Yang, Z.; He, W.; Fan, X.; Tjahjadi, T. PlantNet: Transfer learning based fine-grained network for high-throughput plants recognition. Soft Comput. 2022, 26, 10581–10590.
- Rehman, M.Z.U.; Ahmed, F.; Khan, M.A.; Tariq, U.; Jamal, S.S.; Ahmad, J.; Hussain, L. Classification of citrus plant diseases using deep transfer learning. Comput. Mater. Contin. 2022, 70, 1401–1417.
- Xiao, Z.; Yin, K.; Geng, L.; Wu, J.; Zhang, F.; Liu, Y. Pest identification via hyperspectral image and deep learning. Signal Image Video Process. 2022, 16, 873–880.
- Amrani, A.; Sohel, F.; Diepeveen, D.; Murray, D.; Jones, M. Deep learning-based detection of aphid colonies on plants from a reconstructed Brassica image dataset. Comput. Electron. Agric. 2023, 205, 107587.
- Wang, Q.J.; Zhang, S.Y.; Dong, S.F.; Zhang, G.C.; Yang, J.; Li, R.; Wang, H.Q. Pest24: A large-scale very small object data set of agricultural pests for multi-target detection. Comput. Electron. Agric. 2020, 175, 105585.
- Amrani, A.; Sohel, F.; Diepeveen, D.; Murray, D.; Jones, M. Insect detection from imagery using YOLOv3-based adaptive feature fusion convolution network. Crop Pasture Sci. 2023.
- Zhang, J.P.; Li, Z.W.; Yang, J. A parallel SVM training algorithm on large-scale classification problems. In Proceedings of the 2005 International Conference on Machine Learning and Cybernetics, Guangzhou, China, 18–21 August 2005.
- Xia, D.; Chen, P.; Wang, B.; Zhang, J.; Xie, C. Insect detection and classification based on an improved convolutional neural network. Sensors 2018, 18, 4169.
- Shi, Z.; Dang, H.; Liu, Z.; Zhou, X. Detection and Identification of Stored-Grain Insects Using Deep Learning: A More Effective Neural Network. IEEE Access 2020, 8, 163703–163714.
- Mamdouh, N.; Khattab, A. YOLO-Based Deep Learning Framework for Olive Fruit Fly Detection and Counting. IEEE Access 2021, 9, 84252–84262.
- Sciarretta, A.; Tabilio, M.R.; Amore, A.; Colacci, M.; Miranda, M.A.; Nestel, D.; Papadopoulos, N.T.; Trematerra, P. Defining and evaluating a decision support system (DSS) for the precise pest management of the Mediterranean fruit fly, Ceratitis capitata, at the farm level. Agronomy 2019, 9, 608.
- Potamitis, I.; Rigakis, I.; Tatlas, N.A. Automated surveillance of fruit flies. Sensors 2017, 17, 110.
- Doitsidis, L.; Fouskitakis, G.N.; Varikou, K.N.; Rigakis, I.I.; Chatzichristo, S.A.; Papafilippaki, A.K.; Birouraki, A.E. Remote monitoring of the Bactrocera oleae (Gmelin) (Diptera: Tephritidae) population using an automated McPhail trap. Comput. Electron. Agric. 2017, 137, 69–78.
- Tirelli, P.; Borghese, N.A.; Pedersini, F.; Galassi, G.; Oberti, R. Automatic monitoring of pest insects traps by Zigbee-based wireless networking of image sensors. In Proceedings of the 2011 IEEE International Instrumentation and Measurement Technology Conference, Hangzhou, China, 10–12 May 2011; pp. 1–5.
- Sun, C.; Flemons, P.; Gao, Y.; Wang, D.; Fisher, N.; La Salle, J. Automated image analysis on insect soups. In Proceedings of the 2016 International Conference on Digital Image Computing: Techniques and Applications (DICTA), Gold Coast, QLD, Australia, 30 November–2 December 2016; pp. 1–6.
- Philimis, P.; Psimolophitis, E.; Hadjiyiannis, S.; Giusti, A.; Perelló, J.; Serrat, A.; Avila, P. A centralised remote data collection system using automated traps for managing and controlling the population of the Mediterranean (Ceratitis capitata) and olive (Dacus oleae) fruit flies. In Proceedings of the First International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2013), Paphos, Cyprus, 8 April 2013.
- Kaya, Y.; Kayci, L. Application of artificial neural network for automatic detection of butterfly species using color and texture features. Vis. Comput. 2014, 30, 71–79.
- Zhong, Y.; Gao, J.; Lei, Q.; Zhou, Y. A vision-based counting and recognition system for flying insects in intelligent agriculture. Sensors 2018, 18, 1489.
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
- Kalamatianos, R.; Karydis, I.; Doukakis, D.; Avlonitis, M. DIRT: The dacus image recognition toolkit. J. Imaging 2018, 4, 129.
- Ding, W.; Taylor, G. Automatic moth detection from trap images for pest management. Comput. Electron. Agric. 2016, 123, 17–28.
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
- Kaggle. Available online: https://www.kaggle.com/mistag/arthropod-taxonomy-orders-object-detection-dataset (accessed on 1 December 2022).
- Patrício, D.I.; Rieder, R. Computer vision and artificial intelligence in precision agriculture for grain crops: A systematic review. Comput. Electron. Agric. 2018, 153, 69–81.
- Murthy, C.B.; Hashmi, M.F.; Muhammad, G.; AlQahtani, S.A. YOLOv2pd: An efficient pedestrian detection algorithm using improved YOLOv2 model. Comput. Mater. Contin. 2021, 69, 3015–3031.
- Wang, Y.M.; Jia, K.B.; Liu, P.Y. Impolite pedestrian detection by using enhanced YOLOv3-Tiny. J. Artif. Intell. 2020, 2, 113–124.
- Wu, Z. Using YOLOv5 for Garbage Classification. In Proceedings of the 2021 4th International Conference on Pattern Recognition and Artificial Intelligence (PRAI), Yibin, China, 20–22 August 2021; pp. 35–38.
- Han, S.; Dong, X.; Hao, X.; Miao, S. Extracting Objects’ Spatial–Temporal Information Based on Surveillance Videos and the Digital Surface Model. ISPRS Int. J. Geo-Inf. 2022, 11, 103.
- Jaderberg, M.; Simonyan, K.; Zisserman, A. Spatial transformer networks. Adv. Neural Inf. Process. Syst. 2015, 28, 2017–2025.
- Hou, G.; Qin, J.; Xiang, X.; Tan, Y.; Xiong, N.N. AF-net: A medical image segmentation network based on attention mechanism and feature fusion. Comput. Mater. Contin. 2021, 69, 1877–1891.
- Fukui, H.; Hirakawa, T.; Yamashita, T. Attention branch network: Learning of attention mechanism for visual explanation. In Proceedings of the CVPR, Long Beach, CA, USA, 15–20 June 2019; pp. 10705–10714.
- Rezatofighi, H.; Tsoi, N.; Gwak, J.; Sadeghian, A.; Reid, I.; Savarese, S. Generalized Intersection Over Union: A Metric and a Loss for Bounding Box Regression. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019; pp. 658–666.
- Redmon, J.; Farhadi, A. YOLO9000: Better, faster, stronger. In Proceedings of the CVPR, Honolulu, HI, USA, 21–26 July 2017; pp. 7263–7271.
- Ahmad, I.; Yang, Y.; Yue, Y.; Ye, C.; Hassan, M.; Cheng, X.; Wu, Y.; Zhang, Y. Deep Learning Based Detector YOLOv5 for Identifying Insect Pests. Appl. Sci. 2022, 12, 10167.
- Kasinathan, T.; Singaraju, D.; Uyyala, S.R. Insect classification and detection in field crops using modern machine learning techniques. Inform. Process. Agric. 2020, 8, 446–457.
- Karar, M.E.; Alsunaydi, F.; Albusaymi, S.; Alotaibi, S. A new mobile application of agricultural pests recognition using deep learning in cloud computing system. Alex. Eng. J. 2021, 60, 4423–4432.
Configuration | Specification |
---|---|
CPU | v4 CPU |
RAM | 16 GB |
GPU | Nvidia Tesla T4 |
GPU Memory | 16 GB |
Python Version | 3.8.10 |
Dataset Size | 12 GB |
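A quick way to confirm that a runtime matches the configuration above (Python 3.8.10, a 16 GB Tesla T4) is sketched below; the exact printed values naturally depend on the machine in use.

```python
# Environment check sketch: report the Python version and the visible GPU,
# for comparison with the experimental-setup table above.
import sys
import torch

print("Python:", sys.version.split()[0])
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print("GPU:", props.name)
    print(f"GPU memory: {props.total_memory / 1024**3:.1f} GB")
else:
    print("No CUDA-capable GPU visible")
```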
YOLO Version | No. of Layers | No. of Parameters | Training Time (in h) | Inference Time (in s) |
---|---|---|---|---|
Version-5s | 272 | 7,027,720 | 4 | 0.15 |
Version-5m | 369 | 20,879,400 | 6.5 | 0.23 |
Version-5l | 468 | 46,149,064 | 7.8 | 0.33 |
Version-5x | 567 | 83,365,852 | 8.1 | 0.59 |
New Version-5x | 641 | 94,246,049 | 8.97 | 0.50 |
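Parameter counts of the magnitude shown above can be reproduced for the stock variants by loading a model and summing its parameter tensors, as sketched below for YOLOv5x. This is only an approximation of the table: the "New Version-5x" row is the authors' CBAM-augmented model, which is not loadable this way, and the exact figure also depends on the number of classes in the detection head, so it will not match the table exactly.

```python
# Sketch for reproducing the parameter count of a stock YOLOv5 variant via PyTorch Hub.
import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5x", pretrained=True)
n_params = sum(p.numel() for p in model.parameters())
print(f"YOLOv5x parameters: {n_params:,}")  # same order of magnitude as the table above
```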
Model | Training P | Training R | Training F1 | Training mAP | Validation P | Validation R | Validation F1 | Validation mAP | Testing P | Testing R | Testing F1 | Testing mAP |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Version-5s | 0.808 | 0.816 | 0.811 | 0.864 | 0.806 | 0.821 | 0.813 | 0.865 | 0.812 | 0.799 | 0.850 | 0.851 |
Version-5m | 0.859 | 0.857 | 0.857 | 0.908 | 0.861 | 0.857 | 0.858 | 0.908 | 0.836 | 0.855 | 0.865 | 0.889 |
Version-5l | 0.868 | 0.870 | 0.869 | 0.920 | 0.870 | 0.869 | 0.869 | 0.921 | 0.865 | 0.854 | 0.859 | 0.908 |
Version-5x | 0.884 | 0.875 | 0.880 | 0.928 | 0.883 | 0.876 | 0.879 | 0.928 | 0.882 | 0.846 | 0.863 | 0.912 |
New Version-5x | 0.899 | 0.857 | 0.877 | 0.930 | 0.898 | 0.856 | 0.876 | 0.930 | 0.868 | 0.886 | 0.878 | 0.922 |
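The F1 column is the harmonic mean of precision (P) and recall (R), F1 = 2PR/(P + R); the small check below reproduces the tabulated F1 for the Version-5l testing row.

```python
# Worked check of the F1 column: harmonic mean of precision and recall.
def f1_score(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

# Version-5l, testing split: P = 0.865, R = 0.854 -> F1 ≈ 0.859 (matches the table).
print(round(f1_score(0.865, 0.854), 3))
```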
Reference | Dataset | Model | Insects per Image | Purpose | Accuracy (%) |
---|---|---|---|---|---|
[21] | 24 classes of insects from the internet | VGG19 | One Insect | Classification | 89.22 |
[44] | 8 classes of stored grain insects | DenseNet-121 | Multiple Insects | Classification and Detection | 88.06 |
[46] | 5 classes of insects | CNN | One Insect | Classification | 90 |
[47] | 5 classes of insects | Faster R-CNN | One Insect | Classification | 98.9 |
[42] | DIRT Dataset (Olive fruit fly) | MobileNet | One Insect | Classification | 96.89 |
Proposed Model | 7 classes of flying insects | YOLOv5 + CBAM | Multiple Insects | Classification and Detection | 93 |
Share and Cite
Kumar, N.; Nagarathna; Flammini, F. YOLO-Based Light-Weight Deep Learning Models for Insect Detection System with Field Adaption. Agriculture 2023, 13, 741. https://doi.org/10.3390/agriculture13030741