Detection of Power Poles in Orchards Based on Improved Yolov5s Model
Abstract
1. Introduction
2. Materials and Methods
2.1. Power Pole Data Collection
2.2. Introduction of Yolov5s Model
2.3. Model Improvement
2.3.1. Mixup Data Enhancement
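As a hedged sketch of the mixup augmentation applied here (following Zhang et al., 2018, not the paper's own code), two training images are blended with a weight drawn from a Beta distribution; for detection, the boxes of both images are kept. All names are illustrative.

```python
import numpy as np

def mixup(img1, labels1, img2, labels2, alpha=0.2, rng=None):
    """Blend two images and merge their detection labels (illustrative)."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)              # mixing coefficient in (0, 1)
    mixed = lam * img1 + (1.0 - lam) * img2   # pixel-wise convex combination
    # YOLO-style mixup for detection keeps the boxes of both source images,
    # so the two label arrays are simply concatenated.
    labels = np.concatenate([labels1, labels2], axis=0)
    return mixed.astype(img1.dtype), labels, lam
```

In classification settings the labels would be blended with the same weight `lam`; keeping both box sets is the common detection variant.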
2.3.2. Replace the GhostBottleneck Module
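The parameter saving that motivates the GhostBottleneck replacement can be sketched with a simple count (a minimal sketch of the GhostNet idea from Han et al., 2020, with illustrative names): a ghost module produces only `c_out/s` channels with a regular k×k convolution and generates the remaining channels with cheap d×d depthwise operations.

```python
def conv_params(c_in, c_out, k):
    """Weights of an ordinary k x k convolution (bias ignored)."""
    return c_in * c_out * k * k

def ghost_params(c_in, c_out, k, s=2, d=3):
    """Approximate weights of a GhostNet ghost module (bias ignored)."""
    intrinsic = c_out // s                # channels from the primary conv
    primary = c_in * intrinsic * k * k    # regular convolution part
    cheap = intrinsic * (s - 1) * d * d   # depthwise "cheap" operations
    return primary + cheap
```

For example, `conv_params(64, 128, 3)` gives 73,728 weights while `ghost_params(64, 128, 3)` gives 37,440, roughly the halving that shows up in the ablation results (about 7.0 M vs. 4.0 M parameters).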
2.3.3. Adding SA Module
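One characteristic step of the Shuffle Attention module (SA-Net, Zhang and Yang, 2021) is the channel shuffle that mixes information across attention groups; a minimal sketch of that step, assuming an NCHW tensor layout and illustrative names:

```python
import numpy as np

def channel_shuffle(x, groups):
    """Interleave channels across groups, as in SA-Net / ShuffleNet."""
    b, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    x = x.reshape(b, groups, c // groups, h, w)
    x = x.transpose(0, 2, 1, 3, 4)   # swap the group and per-group axes
    return x.reshape(b, c, h, w)
```

After the shuffle, channel 0 of each group sits next to channel 0 of every other group, so the following layers see features from all groups.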
2.4. Model Evaluation Indicators
- 1. Precision refers to the proportion of predicted power poles that are truly power poles, i.e., the ratio of correctly predicted samples to all samples predicted as power poles:
$\mathrm{Precision} = \dfrac{TP}{TP + FP}$
- 2. Recall refers to the proportion of true power pole samples that are correctly predicted as power poles:
$\mathrm{Recall} = \dfrac{TP}{TP + FN}$
- 3. mAP@50 is a composite indicator obtained from the area under the precision-recall curve:
$AP = \int_{0}^{1} P(R)\,\mathrm{d}R$
Specifically, mAP@50 is the mean of the AP values calculated for all categories when the IoU threshold is 0.5. Here, TP, FP, and FN denote the numbers of true positives, false positives, and false negatives, respectively.
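The indicators above can be sketched directly from detection counts. This is a minimal illustration (names are illustrative, and the AP integral is approximated with the rectangle rule rather than the 101-point interpolation YOLOv5 actually uses), assuming predictions have already been matched to ground truth at an IoU threshold of 0.5:

```python
def precision(tp, fp):
    # Fraction of predicted poles that are truly poles: TP / (TP + FP)
    return tp / (tp + fp) if (tp + fp) else 0.0

def recall(tp, fn):
    # Fraction of true poles that were detected: TP / (TP + FN)
    return tp / (tp + fn) if (tp + fn) else 0.0

def average_precision(precisions, recalls):
    # Area under the precision-recall curve, approximated by rectangles
    # over the recall points sorted in increasing order.
    ap, prev_r = 0.0, 0.0
    for p, r in sorted(zip(precisions, recalls), key=lambda pr: pr[1]):
        ap += p * (r - prev_r)
        prev_r = r
    return ap
```

mAP@50 would then be the mean of `average_precision` over all classes; with a single class, as in this paper, mAP@50 equals the AP of the power pole class.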
3. Results
3.1. Yolov5s Experiment
3.2. Experiment with the Improved Yolov5s
3.2.1. Ablation Experiments
3.2.2. Different Brightness Comparison
3.2.3. Comparison of Yolov5s-Pole and Yolov5s Effects
4. Discussion
4.1. Comparison of Different Models
4.2. The Effect of Brightness
4.3. Adding Identification Classes
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Components | Parameters |
---|---|
CPU | AMD Ryzen 7 5800X 8-Core Processor (3801 MHz) |
Motherboard | MAG B550M MORTAR WIFI (MS-7C94) |
Memory | 16.00 GB (2133 MHz) |
Main Hard Drive | 1000 GB (KIOXIA-EXCERIA SSD) |
Video Cards | NVIDIA GeForce RTX 3060 (12,288 MB) |
Monitors | Redmi Monitor (32-bit true color, 60 Hz) |
Hyperparameters | Value |
---|---|
Batch_size | 16 |
Steps | 300 |
Lr0 | 0.01 |
Lrf | 0.01 |
Momentum | 0.937 |
Weight_decay | 0.0005 |
Image Size | Precision | Recall | mAP@50 | Inference Time (ms) |
---|---|---|---|---|
416 | 0.803 | 0.7 | 0.719 | 2.8 |
512 | 0.804 | 0.7 | 0.722 | 4.7 |
640 | 0.798 | 0.731 | 0.746 | 6.5 |
Brightness Classification | Precision | Recall | mAP@50 | Inference Time (ms) |
---|---|---|---|---|
Dark | 0.758 | 0.683 | 0.696 | 4.8 |
Medium | 0.797 | 0.756 | 0.74 | 6.7 |
Bright | 0.838 | 0.88 | 0.865 | 6.7 |
Group | Mixup | GhostBottleneck | SA |
---|---|---|---|
A | | | |
B | √ | | |
C | | √ | |
D | | | √ |
E | √ | √ | |
F | √ | √ | √ |
Group | Precision | Recall | mAP@50 | Weights (MB) | Layers | Parameters (Pieces) | GFLOPs | Inference Time (ms) |
---|---|---|---|---|---|---|---|---|
A | 0.798 | 0.731 | 0.746 | 13.6 | 214 | 7,022,326 | 15.9 | 6.5 |
B | 0.825 | 0.796 | 0.836 | 13.6 | 214 | 7,022,326 | 15.9 | 6.4 |
C | 0.864 | 0.732 | 0.783 | 7.86 | 271 | 3,974,118 | 9 | 4.2 |
D | 0.812 | 0.785 | 0.79 | 13.6 | 218 | 7,022,518 | 15.9 | 6.6 |
E | 0.823 | 0.8 | 0.812 | 7.86 | 271 | 3,974,118 | 9 | 4.1 |
F | 0.803 | 0.831 | 0.838 | 7.86 | 275 | 3,974,310 | 9 | 4.2 |
Group | Dark Precision | Dark Recall | Dark mAP@50 | Medium Precision | Medium Recall | Medium mAP@50 | Bright Precision | Bright Recall | Bright mAP@50 |
---|---|---|---|---|---|---|---|---|---|
A | 0.758 | 0.683 | 0.696 | 0.797 | 0.756 | 0.74 | 0.838 | 0.88 | 0.865 |
B | 0.756 | 0.8 | 0.786 | 0.821 | 0.817 | 0.849 | 0.944 | 0.88 | 0.958 |
C | 0.804 | 0.684 | 0.716 | 0.91 | 0.733 | 0.807 | 0.913 | 0.839 | 0.893 |
D | 0.817 | 0.745 | 0.736 | 0.785 | 0.822 | 0.799 | 0.817 | 0.894 | 0.917 |
E | 0.827 | 0.75 | 0.784 | 0.816 | 0.778 | 0.803 | 0.905 | 0.88 | 0.915 |
F | 0.742 | 0.783 | 0.754 | 0.955 | 0.778 | 0.871 | 0.961 | 0.986 | 0.99 |
Model | Precision | Recall | mAP@50 | Weights (MB) | Layers | Parameters (Pieces) | GFLOPs | Inference Time (ms) |
---|---|---|---|---|---|---|---|---|
Yolov5s | 0.798 | 0.731 | 0.746 | 13.6 | 214 | 7,022,326 | 15.9 | 6.5 |
Yolov5m | 0.768 | 0.765 | 0.784 | 40.1 | 291 | 20,871,318 | 48.2 | 7.8 |
Yolov5l | 0.796 | 0.812 | 0.823 | 88.4 | 368 | 46,138,294 | 108.2 | 13 |
Yolov7-tiny | 0.775 | 0.769 | 0.783 | 11.6 | 255 | 6,014,038 | 13.2 | 3 |
Yolov5s-Pole | 0.803 | 0.831 | 0.838 | 7.86 | 275 | 3,974,310 | 9 | 4.2 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhang, Y.; Lu, X.; Li, W.; Yan, K.; Mo, Z.; Lan, Y.; Wang, L. Detection of Power Poles in Orchards Based on Improved Yolov5s Model. Agronomy 2023, 13, 1705. https://doi.org/10.3390/agronomy13071705