Classification of Fruit Flies by Gender in Images Using Smartphones and the YOLOv4-Tiny Neural Network
Abstract
1. Introduction
1.1. Biological Motivation
1.2. Related Works
1.3. Contribution of the Work
2. Materials and Methods
2.1. D. melanogaster Lines
2.2. Imaging Protocols
2.2.1. Images for Neural Network Training
2.2.2. Protocol for Taking Pictures of Flies on a Grid
2.3. Labelling Flies in Images
2.4. Preprocessing Step
2.5. Network Architecture
2.6. Estimation of the Fly Gender Recognition Performance
2.7. Synthetic Image Generation
2.8. Data Stratification
2.9. Training Strategies
2.10. Comparison of Expert Prediction Performance with Network Prediction
2.11. Mobile App FlyCounter for Fly Gender Recognition
3. Results
3.1. Fly Gender Recognition Performance by Neural Network
3.2. Comparison of the Performance of Automatic and Expert Recognition
3.3. Analysis of Factors Affecting the Accuracy of Recognition
3.4. Recognition Performance Analysis Depending on the Position of Flies
3.5. FlyCounter Mobile App
4. Discussion
4.1. Choosing the Network Model
4.2. Dataset Preparation and Expansion by Synthetic Images
4.3. Performance of the Network Model
4.4. Comparison with Other Methods and Experts’ Evaluation
4.5. Factors Affecting the Performance of the Method
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
AUC | Area under the receiver operating characteristic (ROC) curve
AP | Average precision
CNN | Convolutional neural network
HSV | Color space described by hue, saturation, and value components
IoU | Intersection over union measure
mAP | Mean average precision
SVM | Support vector machine
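The IoU measure listed above scores how well a predicted bounding box matches a ground-truth box. A minimal illustrative sketch for axis-aligned boxes (not code from the paper):

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)
```

A detection is typically counted as a true positive when its IoU with a ground-truth box exceeds a threshold such as 0.5.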
Device (Number of Cores/Memory Size) | Main Camera Configuration, Aperture
---|---
Xiaomi Mi Max 3 (4/64 GB) | 12 MP, f/1.90; 5 MP
Xiaomi Mi Note 10 Lite (6/64 GB) | 64 MP, f/1.89; 9 MP, f/2.20; 5 MP, f/2.40; 2 MP, f/2.40
Xiaomi Redmi 5 (3/32 GB) | 12 MP, f/2.20
Samsung Galaxy A3 (SM-A320F) | 13 MP, f/1.90
Samsung Galaxy J2 (SM-J250F) | 8 MP, f/2.20
Sony Xperia XA | 13 MP, f/2.00
Xiaomi Redmi Note 8T (3/32 GB) | 48 MP, f/1.75; 8 MP, f/2.20; 2 MP, f/2.40; 2 MP, f/2.40
Xiaomi Redmi Note 9S (4/64 GB) | 48 MP, f/1.79; 8 MP, f/2.20; 5 MP, f/2.40; 2 MP, f/2.40
Canon EOS 5D Mark IV | Canon EF 100 mm f/2.8 L lens, aperture f/5.0, shutter speed 1/100 s, ISO 250, manual focus mode
Device | Number of Images for Training/Validation/Testing
---|---
Xiaomi Mi Max 3 (4/64 GB) | 111/9/3 |
Xiaomi Mi Note 10 Lite (6/64 GB) | 8/7/3 |
Xiaomi Redmi 5 (3/32 GB) | 97/9/3 |
Samsung Galaxy A3 (SM-A320F) | 17/0/0 |
Samsung Galaxy J2 (SM-J250F) | 99/7/3 |
Sony Xperia XA | 16/0/0 |
Xiaomi Redmi Note 8T (3/32 GB) | 0/0/23 |
Xiaomi Redmi Note 9S (4/64 GB) | 0/0/7 |
Canon EOS 5D Mark IV | 17/9/0 |
Training Strategy | Validation, Precision | Validation, Recall | Validation, F1 | Test, Precision | Test, Recall | Test, F1
---|---|---|---|---|---|---
YOLOv4-tiny-base | 0.741 | 0.953 | 0.834 | 0.628 | 0.981 | 0.766 |
YOLOv4-tiny-synt | 0.852 | 0.966 | 0.905 | 0.700 | 0.980 | 0.819 |
YOLOv4-tiny-synt + mosaic | 0.860 | 0.951 | 0.904 | 0.726 | 0.991 | 0.838 |
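The F1 values in the table above are the harmonic mean of precision and recall; a minimal sketch reproducing them (illustrative code, not from the paper):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall: F1 = 2PR / (P + R)."""
    return 2 * precision * recall / (precision + recall)

# E.g., the YOLOv4-tiny-base validation row: P = 0.741, R = 0.953 give F1 ≈ 0.834.
```

Because F1 weights precision and recall equally, the high recall of all three strategies is tempered by their lower precision, which is why the synthetic-data and mosaic augmentations raise F1 mainly through improved precision.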
Prediction Source | Image Source | Precision
---|---|---
Expert 1 | Mobile device | 0.732 |
Expert 2 | Mobile device | 0.713 |
Expert 3 | Mobile device | 0.704
YOLOv4-tiny-synt + mosaic | Mobile device | 0.726 |
Expert 1 | Canon 5D Mark IV | 0.900 |
Expert 2 | Canon 5D Mark IV | 0.880 |
Expert 3 | Canon 5D Mark IV | 0.920
YOLOv4-tiny-synt + mosaic | Canon 5D Mark IV | 0.824 |
Device | Image Number | TP | FP | FN | F1 Measure
---|---|---|---|---|---
Xiaomi Redmi 5 | 3 | 39 | 33 | 1 | 0.696 |
Xiaomi Mi Max 3 | 3 | 41 | 32 | 0 | 0.719 |
Xiaomi Redmi Note 9S | 7 | 123 | 79 | 2 | 0.752 |
Xiaomi Mi Note 10 Lite | 3 | 47 | 20 | 0 | 0.824 |
Samsung J2 | 3 | 53 | 20 | 0 | 0.841 |
Xiaomi Redmi Note 8T | 23 | 535 | 132 | 4 | 0.887 |
Illumination, lm | Image Number | TP | FP | FN | F1 Measure
---|---|---|---|---|---
400 | 6 | 134 | 40 | 1 | 0.867 |
600 | 8 | 180 | 52 | 2 | 0.869 |
800 | 9 | 221 | 40 | 1 | 0.915 |
Without measurement | 19 | 303 | 184 | 3 | 0.764 |
Line | Image Number | TP | FP | FN | F1 Measure
---|---|---|---|---|---
Canton-S | 12 | 222 | 89 | 0 | 0.833 |
Harwich | 30 | 616 | 227 | 7 | 0.840 |
Gender | Image Number | TP | FP | FN | F1 Measure
---|---|---|---|---|---
Female | 574 | 466 | 211 | 4 | 0.812 |
Male | 581 | 372 | 105 | 3 | 0.873 |
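The F1 measures in the detection tables above follow directly from the TP/FP/FN counts; a minimal sketch (illustrative, not from the paper):

```python
def f1_from_counts(tp, fp, fn):
    """F1 measure from detection counts: F1 = 2·TP / (2·TP + FP + FN)."""
    return 2 * tp / (2 * tp + fp + fn)

# E.g., the male row: TP = 372, FP = 105, FN = 3 give F1 ≈ 0.873,
# matching the tabulated value to rounding.
```

Note that across all of these tables false positives dominate false negatives, so the F1 differences between devices, illumination levels, lines, and genders are driven almost entirely by precision rather than recall.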
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Genaev, M.A.; Komyshev, E.G.; Shishkina, O.D.; Adonyeva, N.V.; Karpova, E.K.; Gruntenko, N.E.; Zakharenko, L.P.; Koval, V.S.; Afonnikov, D.A. Classification of Fruit Flies by Gender in Images Using Smartphones and the YOLOv4-Tiny Neural Network. Mathematics 2022, 10, 295. https://doi.org/10.3390/math10030295