Article

Multi-Feature Fusion Recognition and Localization Method for Unmanned Harvesting of Aquatic Vegetables

by Xianping Guan 1,2,*, Longyuan Shi 1,2, Weiguang Yang 1,2, Hongrui Ge 1,2, Xinhua Wei 1,2 and Yuhan Ding 3
1 Key Laboratory of Modern Agricultural Equipment and Technology, Ministry of Education, Jiangsu University, Zhenjiang 212013, China
2 School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China
3 School of Electrical and Information Engineering, Jiangsu University, Zhenjiang 212013, China
* Author to whom correspondence should be addressed.
Agriculture 2024, 14(7), 971; https://doi.org/10.3390/agriculture14070971
Submission received: 20 May 2024 / Revised: 17 June 2024 / Accepted: 19 June 2024 / Published: 21 June 2024
(This article belongs to the Section Digital Agriculture)

Abstract

The vision-based recognition and localization system plays a crucial role in the unmanned harvesting of aquatic vegetables. Field investigation shows that factors such as illumination, shading, and computational cost are the main difficulties restricting the identification and positioning of Brasenia schreberi. Therefore, this paper proposes a new lightweight detection method, YOLO-GS, which fuses feature information from RGB and depth images for recognition and localization tasks. YOLO-GS replaces traditional convolution with the Ghost convolution module and innovatively introduces C3-GS, a cross-stage module, to effectively reduce parameters and computational cost. With a redesigned detection head structure, its feature extraction capability in complex environments is significantly enhanced. Moreover, the model adopts Focal-EIoU as the regression loss function to mitigate the adverse effects of low-quality samples on gradients. We have developed a dataset of Brasenia schreberi covering various complex scenarios, comprising 1500 images in total. Trained on this dataset, the YOLO-GS model achieves an average accuracy of 95.7%. The model size is 7.95 MB, with 3.75 M parameters and a computational cost of 9.5 GFLOPS. Compared with the original YOLOv5s model, YOLO-GS improves recognition accuracy by 2.8%, reduces the model size and parameter count by 43.6% and 46.5%, respectively, and requires 39.9% less computation. Furthermore, the positioning errors of picking points are less than 5.01 mm in the X direction, 3.65 mm in the Y direction, and 1.79 mm in the Z direction. As a result, YOLO-GS combines high recognition accuracy with low computational demands, enabling precise target identification and localization in complex environments and meeting the requirements of real-time harvesting tasks.
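The abstract names Focal-EIoU as the regression loss used to down-weight low-quality samples. As a rough single-box-pair sketch of that idea only (not the paper's implementation; the function name, corner-coordinate box format, and the γ default below are illustrative assumptions):

```python
def focal_eiou_loss(pred, gt, gamma=0.5):
    """Illustrative Focal-EIoU loss for one box pair; boxes are (x1, y1, x2, y2)."""
    px1, py1, px2, py2 = pred
    gx1, gy1, gx2, gy2 = gt

    # Intersection over union
    iw = max(0.0, min(px2, gx2) - max(px1, gx1))
    ih = max(0.0, min(py2, gy2) - max(py1, gy1))
    inter = iw * ih
    area_p = (px2 - px1) * (py2 - py1)
    area_g = (gx2 - gx1) * (gy2 - gy1)
    iou = inter / (area_p + area_g - inter)

    # Width and height of the smallest enclosing box
    cw = max(px2, gx2) - min(px1, gx1)
    ch = max(py2, gy2) - min(py1, gy1)

    # EIoU penalties: center distance plus separate width and height gaps
    d2 = ((px1 + px2) / 2 - (gx1 + gx2) / 2) ** 2 \
       + ((py1 + py2) / 2 - (gy1 + gy2) / 2) ** 2
    dw2 = ((px2 - px1) - (gx2 - gx1)) ** 2
    dh2 = ((py2 - py1) - (gy2 - gy1)) ** 2
    eiou = 1.0 - iou + d2 / (cw**2 + ch**2) + dw2 / cw**2 + dh2 / ch**2

    # Focal weighting: scaling by IoU**gamma suppresses the gradient
    # contribution of low-overlap (low-quality) samples
    return iou**gamma * eiou
```

A perfectly matched pair yields zero loss, while a partially overlapping pair yields a positive loss scaled down by its low IoU, which is the effect the abstract attributes to this loss choice.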
Keywords: deep learning; localization; object detection; lightweight network

Share and Cite

MDPI and ACS Style

Guan, X.; Shi, L.; Yang, W.; Ge, H.; Wei, X.; Ding, Y. Multi-Feature Fusion Recognition and Localization Method for Unmanned Harvesting of Aquatic Vegetables. Agriculture 2024, 14, 971. https://doi.org/10.3390/agriculture14070971

AMA Style

Guan X, Shi L, Yang W, Ge H, Wei X, Ding Y. Multi-Feature Fusion Recognition and Localization Method for Unmanned Harvesting of Aquatic Vegetables. Agriculture. 2024; 14(7):971. https://doi.org/10.3390/agriculture14070971

Chicago/Turabian Style

Guan, Xianping, Longyuan Shi, Weiguang Yang, Hongrui Ge, Xinhua Wei, and Yuhan Ding. 2024. "Multi-Feature Fusion Recognition and Localization Method for Unmanned Harvesting of Aquatic Vegetables" Agriculture 14, no. 7: 971. https://doi.org/10.3390/agriculture14070971

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.

