Article

YOLOv7-Based Intelligent Weed Detection and Laser Weeding System Research: Targeting Veronica didyma in Winter Rapeseed Fields

1 Research Institute of Motor and Intelligent Control Technology, Taizhou University, Taizhou 318000, China
2 School of Mechanical and Power Engineering, Harbin University of Science and Technology, Harbin 150080, China
* Author to whom correspondence should be addressed.
Agriculture 2024, 14(6), 910; https://doi.org/10.3390/agriculture14060910
Submission received: 7 May 2024 / Revised: 5 June 2024 / Accepted: 6 June 2024 / Published: 8 June 2024
(This article belongs to the Section Digital Agriculture)

Abstract

In recent years, rapeseed oil has received considerable attention in the agricultural sector, experiencing appreciable growth. However, weed-related challenges are hindering the expansion of rapeseed production. This paper outlines the development of an intelligent weed detection and laser weeding system—a non-chemical, precision agricultural protection method for weeding Veronica didyma in winter rapeseed fields in the Yangtze River Basin. A total of 234 Veronica didyma images were obtained to compile a database for a deep-learning model, and YOLOv7 was used as the detection model for training. The effectiveness of the model was demonstrated, with a final accuracy of 94.94%, a recall of 95.65%, and a mAP@0.5 of 0.972 obtained. Subsequently, parallel-axis binocular cameras were selected as the image acquisition platform, with binocular calibration and semi-global block matching (SGBM) used to locate Veronica didyma within a cultivation box, yielding minimum confidence and camera height values of 70% and 30 cm, respectively. The intelligent weed detection and laser weeding system was then built, and the experimental results indicated that laser weeding was practicable at a 100 W power and an 80 mm/s scanning speed, resulting in a visible loss of activity in Veronica didyma and no resprouting within 15 days of weeding. The successful execution of Veronica didyma detection and laser weeding provides a new reference for the precision agricultural protection of rapeseed in winter and holds promise for practical application in agricultural settings.

1. Introduction

Rapeseed oil is abundant in nutrients and stands as a widely distributed and extensively cultivated oil crop in China, holding a crucial position in the Chinese edible oil market [1,2,3]. The Chinese government has explicitly emphasized the need to vigorously implement edible oil production capacity enhancement projects, as well as proposed to develop winter fallow fields and expand rapeseed cultivation in the Yangtze River Basin. These efforts play a significant role in boosting rapeseed oil production, advancing the rapeseed industry, ensuring a secure supply of edible oil, promoting sustainable agricultural development, and driving rural economic growth [4,5]. However, in current agricultural production, climate conditions favorable for weed growth in the middle and lower reaches of the Yangtze River pose a challenge to the cultivation of rapeseed. Here, broadleaved weeds and grasses of the Poaceae family have emerged as critical factors affecting rapeseed yield [6,7]. The competition between weeds and young rapeseed seedlings for essential resources such as light, water, and nutrients significantly impacts rapeseed growth and development. Additionally, weeds in the field can act as intermediate hosts for rapeseed pathogens and pests, substantially increasing the likelihood of disease and pest outbreaks [8,9]. Therefore, it is crucial to conduct weeding during rapeseed cultivation.
Weed control has been a long-standing problem in the agricultural sector. In general, weeds can cause a 10% to 20% reduction in rapeseed production, and in severe cases, they can even reduce production by 50% [10]. The main methods of weed control currently used include manual weeding [11], chemical weeding [12], mechanical weeding [13,14], biotechnological weeding [15], and thermal weeding [16,17]. Among them, manual weeding, chemical weeding, and mechanical weeding are the most commonly used methods in rapeseed fields. Manual weeding has good efficacy and causes minimal damage to the field environment and crops, but requires a large amount of labor [18]. Chemical weeding kills weeds efficiently and rapidly, especially on large-scale farmland, but most herbicides are toxic to humans and may also pollute the soil, water sources, and air, disrupting the ecological balance [19]. Mechanical weeding destroys the stems, leaves, or roots of weeds through rotation, pulling, and other movements of the machine's end-effector, making the weeds unable to continue growing or removing them from the field; however, this method has drawbacks, such as the turning-over of dead soil and the high risk of damaging crop roots [20].
The main goal of weeding is to provide the most suitable method of ensuring sustainable agricultural ecosystems, minimizing the impact of harmful weeds in various situations, and increasing agricultural yield [21,22,23]. Precision weeding [24] can reduce labor, protect people from exposure to herbicides, and minimize the negative impact on the ecological environment and crops. Therefore, using automated robotic systems equipped with high-powered laser weed-killing equipment for precise weed management is a promising weeding solution for sustainable agricultural ecosystems.
In recent years, the fields of machine vision, deep learning (DL), and object detection have attracted much attention and achieved rapid development, including in the agricultural sector [25]. DL technology, built on artificial neural networks (ANNs), is currently positioned at the forefront of advancements in weed detection [26]. The accurate detection of weeds is the first step to achieving precise weeding for weed targets. To this end, Lee et al. [27] designed an intelligent weed recognition system that focuses on weeds in experimental fields and extracts their shape contours as the main features, achieving a recognition accuracy of 68.8%, with a single image taking 344 ms. Partel et al. [28] used datasets of sunflowers and weeds, and peppers and weeds (purslane), to train the You Only Look Once (YOLO) v3 network; the final accuracy and recall rate of the entire network model exceeded 95% and 89%, respectively. Zhang et al. [29] studied a classification system for lettuce seedlings and weeds based on a squeeze-and-excitation-YOLOv5x model that used the plants' morphological characteristics to detect the position of lettuce stem emergence points, achieving a classification accuracy of 97.14% and providing a theoretical reference for weed control automation. Nitin Rai et al. [30,31] used the YOLO-Spot M model to identify weeds in images captured by a drone, moving from recognition to localization in 72.4 ms, with a recognition accuracy of up to 84%.
Another key point in precision weeding is the problem of locating weeds. Building on the application of binocular stereo vision in agriculture, Liu et al. [32] proposed a pineapple detection and localization method based on binocular stereo vision and an improved YOLOv3 model, achieving an average absolute error of 24.414 mm and an average relative error of 1.17% at a distance range of 1.7–2.7 m. Sun et al. [33] used a MATLAB calibration tool to perform stereo calibration on the cameras, combined with OpenCV for stereo correction and stereo matching. The authors then calculated the distance to the camera using parallax, and a distance measurement experiment conducted in the VS2013 environment showed an error of less than 5% within 2 m. Wu et al. [34] used binocular stereo vision principles to obtain location information on ripe tomatoes through DL, calculated the depth information, and controlled a robot arm to grasp the tomatoes, resulting in a picking success rate of 82%. Li et al. [35] proposed an improved apple binocular localization method based on DL fruit detection, which detected apples in binocular images using a Faster Region-Convolutional Neural Network (Faster R-CNN) model, achieving an average standard deviation of 0.51 cm for apple localization. In recent years, due to the extreme irregularity of weed traits and the complexity of field environments, research on the precise positioning of weeds in the field using binocular stereo vision has begun to receive attention, alongside other agricultural applications such as fruit and vegetable picking [36,37] and disease and pest detection [38,39]. Binocular stereo vision technology has also made progress in agricultural applications; for example, Tang et al. [40] studied a fruit detection and positioning technology for oil tea plantations based on an improved YOLOv4-tiny model and binocular stereo vision.
Using this binocular camera, the median absolute deviations of oil tea fruits under sunlit and shady conditions were calculated to be 7.420 and 7.428 mm, respectively; this positioning accuracy met the application requirements of picking robots. Özlüoymak [41] used binocular vision technology to carefully locate and detect samples of one artificial crop and six artificial weeds under laboratory conditions, achieving measurement errors of less than 3.50% and less than 4.20%, respectively. Zhang et al. [42] focused on a binocular stereo vision-based canopy volume extraction system for precision pesticide application by an Unmanned Aerial Vehicle (UAV). When the speed of the UAV was set to 2 m/s, the maximum errors between the measured volumes of the rectangular and triangular systems used and the actual volume were 6.58% and 9.37%, respectively, meeting the accuracy requirements for spraying.
Laser technology is widely applied in agriculture due to its advantages of fast propagation, good beam concentration, and high energy density, in applications such as crop growth monitoring [43], farmland measurement [44], and the navigation and autopilot of agricultural vehicles [45]. Laser technology has also been applied to weed-killing treatments, whereby plants are precisely irradiated by high-energy laser beams to destroy plant cells and achieve complete weed eradication. For example, Mathiassen et al. [46] investigated the effects of laser treatment targeting the apical meristematic tissues of selected weeds at the cotyledon stage, including the effects of different laser types, spot sizes, and irradiation durations on weed control, to experimentally verify the effect of laser beams on weed growth and thus determine the feasibility of laser weed control. Shah et al. [47] developed an innovative method for the on-board laser treatment of weeds in rows, achieving a weed control efficacy of 63%.
However, systematic research into the precision management of weeds in winter rapeseed fields that combines deep-learning models, high-energy lasers, and agricultural weeding robots is lacking. To address this gap, this paper aims to build an intelligent weed detection and laser weeding system based on YOLOv7, and to realize the intelligent detection and precise management of the weed Veronica didyma in winter rapeseed fields in the Yangtze River Basin.
The main research contents and contributions of this paper are as follows: (1) Multiple images of Veronica didyma in a winter rapeseed field were collected, and a dataset was created using an offline augmentation method. (2) Veronica didyma was recognized using the trained YOLOv7 model, and a binocular camera was calibrated, corrected, and stereo-matched to determine the precise location of Veronica didyma. (3) An intelligent weed detection and laser weeding system was built, the localization and movement control of the laser via the target coordinates obtained by the YOLOv7 model and binocular camera were realized, and the feasibility of laser weeding was experimentally verified. In addition, the optimal scanning speed for weeding was evaluated.

2. Overall Technical Route

To achieve the systematic combination and practical application of the CNN, binocular stereo vision, and laser technology, it was first necessary to create an image dataset and train the model. The next steps were to realize Veronica didyma detection, complete the stereo correction and matching of the binocular camera, convert the anchor box information detected by the model into the physical coordinates used by the camera, and transform the obtained physical coordinates to the laser vibrating mirror. Finally, the laser spot was moved to the center of Veronica didyma to achieve the goal of laser weeding. The overall technical route of this paper is shown in Figure 1.

3. Detection and Localization of Veronica didyma

3.1. Dataset Preparation

Veronica didyma is an annual or biennial spreading, multi-branched, broad-leaved herbaceous plant 10–25 cm in height. The plant is widely found in many parts of China, such as Henan, Shandong, Jiangsu, and Zhejiang, and is among the weeds with the widest distribution and highest abundance in oilseed rape fields in the middle and lower reaches of the Yangtze River (a sample image is shown in Figure 2). Cultivating Veronica didyma under artificial greenhouse conditions and controlling its density to the average distribution density of weeds in rape fields (310 plants/m2) reflected the distribution of this species during the seedling stage in rape fields in the middle and lower reaches of the Yangtze River under natural conditions.
Considering that the detection of Veronica didyma in the field is affected by light conditions in different weather and time periods, certain light supplementation measures were taken during shooting in different time periods to improve the accuracy of weed detection. The images, with a resolution of 3000 × 4000 pixels, were obtained with a Huawei P60 mobile phone (Huawei Technologies Co., Ltd., Shenzhen, China) at a relative shooting height of 20 to 60 cm. As shown in Figure 3, the images were acquired at 8:00 (fill light of 20%), 12:00, 18:00, and 24:00 (fill light of 100%) local time, with a total of 234 images containing Veronica didyma obtained.
The dataset was randomly divided into a training set (a total of 210 images) and a validation set (a total of 24 images) in a ratio of 9:1, and the training set was labeled with data using the LabelImg (Version 1.8.6) tool.
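The 9:1 random split described above can be sketched in a few lines; the directory layout and `.jpg` extension are assumptions for illustration:

```python
import random
from pathlib import Path

def split_dataset(image_dir, train_ratio=0.9, seed=42):
    """Randomly split an image folder into training and validation lists.

    With 234 images and a 0.9 ratio this yields 210 training and
    24 validation images, matching the split reported in the text.
    """
    images = sorted(Path(image_dir).glob("*.jpg"))
    random.Random(seed).shuffle(images)  # fixed seed for a reproducible split
    n_train = int(len(images) * train_ratio)
    return images[:n_train], images[n_train:]
```

The fixed seed is a convenience so the split can be reproduced; the paper does not state how its random division was seeded.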

3.2. YOLOv7 Detection of Veronica didyma

Considering the real-time requirements of intelligent laser weeding detection, and given that the main purpose of this paper was the development and deployment of a practical application rather than algorithm design, it was necessary to select an accurate single-stage target detection network model. To this end, YOLOv7 (You Only Look Once) [48], which offers better real-time performance, was chosen as the Veronica didyma detection model.
Details of the environment configuration for performing the YOLOv7 training are shown in Table 1. The training parameters used were as follows: the training epoch was 200, the batch size was set as 4, the initial learning rate was 0.01, and the Final OneCycleLR learning rate was 0.001. In addition, we used Adam optimization with the momentum and weight decay, respectively, set to 0.937 and 0.0005.
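For reference, the reported training settings can be collected into a configuration dictionary; the key names (`lr0`, `lrf`) follow the convention of YOLOv7's hyperparameter files and are an assumption about the exact setup. Note that the final OneCycleLR rate of 0.001 is the initial rate scaled by a factor of 0.1:

```python
# Training settings as reported in the text; key names mirror YOLOv7's
# hyp.*.yaml convention (an assumption, not the authors' exact config file).
hyp = {
    "epochs": 200,
    "batch_size": 4,
    "lr0": 0.01,            # initial learning rate
    "lrf": 0.1,             # final OneCycleLR lr = lr0 * lrf
    "momentum": 0.937,      # Adam beta1-style momentum
    "weight_decay": 0.0005,
    "optimizer": "Adam",
}

final_lr = hyp["lr0"] * hyp["lrf"]  # 0.001, as stated in the text
```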
Using the YOLOv7 model, the following results were obtained: a mAP@0.5 of 0.984, a precision of 0.954, and a recall of 0.967. The YOLOv5 model was then trained with the same parameters, yielding a mAP@0.5 of 0.951, a precision of 0.967, and a recall of 0.893. Comparing the training results of the two models together, the YOLOv7 model was chosen.

3.3. Localization of Veronica didyma

Due to the randomness and high uncertainty of the position of Veronica didyma in the field, and the need for real-time localization, we selected a parallel optical axis binocular camera with good stereo matching accuracy (PXYZ-S-AR135-130T400, Pixel XYZ™, Shenzhen, China; 640 × 480 pixels @ 30 FPS, focal length 3.4 mm) in combination with binocular stereo vision localization methods [40] to enable the acquisition of 3D information on Veronica didyma during the robot's movement.
In the ideal parallel optical axis binocular camera model, the two cameras would be in the same plane, and the binocular localization and ranging based on the parallax principle would be perfectly accurate. In practice, however, manufacturing and assembly errors will ultimately lead to the production of bias in the localization results. To solve these problems, the binocular camera was first calibrated using a calibration plate, with an overview of the calibration process shown in Figure 4. From Figure 4d, it can be seen that the final total average error of the binocular camera after calibration was 0.11 pixels, which indicates that the calibration effect was good.
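The 0.11-pixel figure above is a mean reprojection error, the quantity that calibration routines such as OpenCV's `stereoCalibrate` report after optimizing the camera parameters. A minimal numpy sketch of the metric itself, assuming arrays of detected and reprojected chessboard corners:

```python
import numpy as np

def mean_reprojection_error(detected, reprojected):
    """Average Euclidean distance (in pixels) between chessboard corners
    detected in the calibration images and the same corners reprojected
    through the calibrated camera model. Lower is better; the paper
    reports 0.11 px after binocular calibration."""
    detected = np.asarray(detected, dtype=float)
    reprojected = np.asarray(reprojected, dtype=float)
    return float(np.mean(np.linalg.norm(detected - reprojected, axis=-1)))
```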
To further improve the localization accuracy of the binocular camera for Veronica didyma in areas with dense vegetation and complicated terrain, and to enhance its localization stability under different lighting conditions, a stereo matching algorithm was used in this study to make the localization accuracy robust to environmental changes. The block matching (BM) algorithm [49] was compared with the semi-global block matching (SGBM) algorithm [50], with the results shown in Figure 5. The SGBM algorithm produced a relatively good parallax map, and had a better stereo matching effect and faster matching speed (by 0.289 s) than the BM algorithm. Considering that the algorithm was to be deployed on the mobile laser weeding platform, and that the crop's growing environment was relatively complex, the SGBM algorithm was selected for stereo matching.
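Once SGBM produces a parallax (disparity) map, depth follows from the parallax principle Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the disparity. A minimal sketch, with hypothetical focal-length and baseline values in the usage comment:

```python
def disparity_to_depth(disparity_px, focal_px, baseline_mm):
    """Depth from the parallax principle: Z = f * B / d.

    disparity_px: disparity for a pixel (from SGBM or BM matching)
    focal_px:     focal length expressed in pixels
    baseline_mm:  distance between the two optical centers
    Returns the depth Z in millimetres.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_mm / disparity_px

# Hypothetical values: f = 700 px, B = 60 mm, d = 140 px -> Z = 300 mm,
# i.e. a target at roughly the 30 cm camera height used in the experiments.
```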
The binocular camera was fixed on the X-Y two-axis experimental table, and the camera was placed at a height of 30 cm from the ground, so as to carry out the experimental weed localization based on binocular vision in order to verify the localization accuracy of the binocular vision system. Eight Veronica didyma plants in different locations were selected as the experimental objects, and the method of “moving the camera and calculating the difference” was adopted. After completing the acquisition of the coordinate points, the error rate σ in the X and Y directions was calculated according to the following formula:
σ = |S − |C0 − C1|| / S
where C0 denotes the coordinate value before moving on the X and Y axes, C1 denotes the coordinate value after moving on the X and Y axes, and S denotes the distance moved on the X and Y axes; the experimental results are shown in Table 2.
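The error-rate formula above translates directly into code; a minimal sketch:

```python
def localization_error_rate(c0, c1, s):
    """Error rate sigma = |S - |C0 - C1|| / S.

    c0: coordinate value (on the X or Y axis) before moving the camera
    c1: coordinate value after moving the camera
    s:  known distance the camera was moved along that axis
    The measured shift |C0 - C1| is compared against the known
    distance S, following the paper's formula.
    """
    return abs(s - abs(c0 - c1)) / s
```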
The laser weeding system uses the heat from laser scanning to rapidly ablate the meristematic buds of the weed, thus preventing the plant from growing until it eventually dies. After binocular calibration and SGBM matching, the localization errors of the binocular camera in the X and Y directions were 3.06% and 2.81%, respectively. This error met the localization accuracy requirements of the laser weeding system for Veronica didyma.
Pre-tests were first conducted to determine the range of values for the minimum confidence level and camera height. In these tests, false detection occurred when the confidence level was less than 70%, and the detection rate was close to zero when the confidence level was more than 90%; the confidence level was therefore set to 70–90%. Furthermore, the field of view was too small and the efficiency was low when the camera height was below 20 cm, and the detection rate was poor when it was above 60 cm because the weed targets were too small and their features were difficult to retain; consequently, the camera height was set to 20–60 cm. Finally, the lowest confidence level was used for three repetitions of the test, taking the number of recognized objects divided by the total number of objects in the field of view as the detection rate, and taking the average of the three detection rates. These Veronica didyma detection results are shown in Table 3.
When the minimum confidence level was 70% and the camera height was 20 cm, the detection rate of Veronica didyma was 100%, whereas when the minimum confidence level was 70% and the camera height was 30 cm, the detection rate of Veronica didyma was 98%. Ultimately, the final detection conditions selected were a 30 cm installation height of the camera and 70% for the lowest confidence level.
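The detection-rate computation described above (recognized objects divided by the total objects in the field of view, averaged over three repetitions) can be sketched as follows; the trial counts in the test are hypothetical:

```python
def mean_detection_rate(trials):
    """Average detection rate over repeated trials.

    trials: list of (recognized, total) pairs, one per repetition;
    the paper uses three repetitions per condition and averages
    the per-trial rates.
    """
    rates = [recognized / total for recognized, total in trials]
    return sum(rates) / len(rates)
```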

4. Weeding Experiment and Result Analysis

4.1. The Intelligent Weed Detection and Laser Weeding System Set-Up

The fixed-position laser weeding system was built in a laboratory environment, as shown in Figure 6. The set-up mainly consisted of a PC console (12th Gen Intel® Core™ i9-12900H 2.50 GHz (Intel, California, USA), NVIDIA GeForce RTX 3090 GPU, 16 GB RAM (ASUS, Shenzhen, China)), a binocular camera, a continuous fiber laser (QSFL-100Q, Shenzhen Super Laser, Shenzhen, China; output spot 8 mm, 100 W, 1055–1070 nm), a laser vibrating mirror (ZB2D-10C-1064, Shenzhen Zbtk Technology Co., Ltd., Shenzhen, China), and a Veronica didyma cultivation box.
The overall workflow of the intelligent weed detection and laser weeding system is shown in Figure 7. In brief, after the binocular camera images the weed, the coordinates of the weed's center point in the galvanometer coordinate system are obtained through the matrix derived from the camera–galvanometer calibration, and the coordinates of the four vertices of the weed anchor frame in the galvanometer coordinate system are obtained through the matrix derived from the image–galvanometer calibration.
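Mapping a point from the camera (or image) frame into the galvanometer frame with a calibration matrix can be sketched as a planar homography transform; treating the calibration result as a 3 × 3 matrix acting on homogeneous coordinates is an assumption for illustration, since the paper does not state the exact calibration model:

```python
import numpy as np

def to_galvo_coords(point_xy, H):
    """Map a 2D point into the galvanometer frame.

    point_xy: (x, y) in the source frame (camera or image coordinates)
    H:        3x3 calibration matrix (e.g. from camera-galvanometer
              calibration), applied in homogeneous coordinates.
    """
    x, y = point_xy
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]  # de-homogenize
```

With such a mapping, the center point and the four anchor-frame vertices are transformed with the appropriate calibration matrix before being sent to the galvanometer controller.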

4.2. Laser Weeding of Veronica didyma

After the test bed was built, the vision system was first connected to view the position information of the weeds in the binocular camera coordinate system, and the position of the weed anchor frame was identified, as shown in Figure 8a. After the position information of the weeds in the camera coordinate system was obtained, the galvanometer delineated a square area according to the coordinates obtained from the binocular camera and the weed anchor frame coordinates to guide the laser for ablation. Since the laser used in this study had an invisible wavelength of 1064 nm, a red light was used to indicate the scanning area. Due to the binocular camera frame rate, the complete scanning frame indicated by the red light was not captured, so additional shots were taken using a camera at the side of the galvanometer, as shown in Figure 8b.
Laser ablation was performed at a laser scanning speed of 20 mm/s, a power of 100 W, a scanning path of a 45° cross-fill, and a line spacing of 1 mm; a comparison of the weeds before and after laser irradiation is shown in Figure 9. Under these laser conditions, the weed leaves and terminal buds were completely carbonized, and later observation determined that the weeds were completely dead. This experimental result demonstrated that laser weeding under binocular camera guidance and galvanometer control is feasible. However, it is not necessary to carbonize the weeds to kill them or stop their growth, but only to reach the critical temperature of their cells. Although the above laser parameters ensured the destructive effect of the laser weed control, they also caused substantial energy waste.

4.3. Determination of Optimal Scanning Parameters

Using a scanning path of a 45° cross-fill and a line spacing of 1 mm, the laser spot fully covered the weed blade. Therefore, it was only necessary to change the laser scanning speed. By changing the amount of time for which the laser energy would act on the weed leaf through different scanning speeds in the weed control experiments, it was possible to observe the wilting effects on Veronica didyma and thus determine the scanning speed that would maximize these wilting effects and thus the weed control efficacy.
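At a fixed power, the scanning speed determines how much energy is deposited per unit of scan path (E = P/v), which is the quantity these experiments effectively vary. A minimal sketch of that relationship:

```python
def line_energy_density(power_w, speed_mm_s):
    """Energy deposited per millimetre of scan path, E = P / v (J/mm).

    Halving the scanning speed at constant power doubles the delivered
    line energy, which is why slower scans act on the leaf for longer
    and produce stronger wilting.
    """
    if speed_mm_s <= 0:
        raise ValueError("scanning speed must be positive")
    return power_w / speed_mm_s

# At the 100 W power used here: 80 mm/s delivers 1.25 J/mm,
# while 160 mm/s delivers only 0.625 J/mm.
```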
In order to reduce the error caused by the uncertainty of weed growth, the experiment was repeated in three groups for each scanning speed, and the average height of the three groups of plants after the laser treatment was taken as the representative plant height. Through the pre-test, it was found that, at a laser scanning speed of about 100 mm/s, two of the three groups of experimental Veronica didyma plants died, while one group survived. Therefore, the scanning speeds were adjusted to find an optimal value. A blank control group was set up to exclude the influence of the growing environment on the laser scanning effect. The change in height of Veronica didyma under different scanning speeds is shown in Figure 10, in which the height of wilted weeds is noted as zero. As can be seen from the figure, at laser scanning speeds of 160 mm/s and 140 mm/s, the Veronica didyma plant height decreased on the first day because of dehydration, although no wilted plants were observed; the average plant height changed little, and the overall trend was a gradual increase. At laser scanning speeds of 120 mm/s and 100 mm/s, the average plant height decreased rapidly in conjunction with an increase in wilted plants, reached a minimum after the third day, and then began to rise slowly. Finally, at laser scanning speeds of 80 mm/s and below, the average height of Veronica didyma plants decreased with their dehydration, collapse, and complete withering on the first day after scanning. Furthermore, no signs of resprouting were observed by day 15.
After the test, the scanning speed for Veronica didyma was set to 80 mm/s, as this had resulted in the weeds completely withering, which satisfied the design requirements of the laser weeding test. Under this scanning speed, together with a 100 W power, a scanning path of a 45° cross-fill, and a 0.5 mm line spacing, the morphological transformation of Veronica didyma after laser scanning is shown in Figure 11.

5. Conclusions

This paper aimed to address the problem of Veronica didyma weed management in winter oilseed rape fields in the Yangtze River Basin by building and validating an intelligent weed detection and laser weeding system based on YOLOv7 for the precise management of this weed. A dataset of 234 images containing Veronica didyma was collected and organized, and YOLOv7 was used for training, yielding a final detection accuracy of 94.94% and a mAP@0.5 of 0.972. When combined with binocular stereo vision technology, this resulted in the accurate localization of Veronica didyma, which provided reliable target location information for laser weeding. Subsequently, laser weeding experiments were carried out, and the optimal power and scanning speed for laser weeding were determined to be 100 W and 80 mm/s, respectively. Veronica didyma was inactivated and did not resprout within 15 days after ablation, which satisfied the experimental design requirements. The main contribution of this paper is the innovation of a non-chemical, precision agricultural protection method through the integration of deep learning, binocular stereo vision, and laser technology. This method opens new possibilities for weed management in sustainable agroecosystems. Overall, this study not only verifies the feasibility of the intelligent system in precision agriculture management, but also provides a solid theoretical and practical foundation for the future automation and intelligence of agriculture, suggesting broad application prospects for the system in other agricultural fields. However, this study was conducted on young Veronica didyma plants, and no tests were conducted on mature plants. In subsequent studies, Veronica didyma will be investigated at various growth stages, and the selection of laser wavelengths, generators, etc., will be considered. Research on other weed species with different mechanical properties will also be continued.

Author Contributions

Conceptualization, L.Q.; methodology, L.Q.; software, Z.X. and L.Q.; data curation and writing—original draft preparation, W.W.; writing—review and editing, W.W. and Z.X.; supervision and project administration, X.W. All authors have read and agreed to the published version of the manuscript.

Funding

The research project was funded by Taizhou Science and Technology Planning Project (grant no. 22nya10).

Data Availability Statement

Data is contained within the article.

Acknowledgments

Thanks to all of the authors cited in this article and the referees for their helpful comments and suggestions.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Yang, J.M.; Long, Y.; Ye, H.; Wu, Y.L.; Zhu, Q.; Zhang, J.H.; Huang, H.; Zhong, Y.B.; Luo, Y.; Wang, M.Y. Effects of rapeseed oil on body composition and glucolipid metabolism in people with obesity and overweight: A systematic review and meta-analysis. Eur. J. Clin. Nutr. 2024, 78, 6–18. [Google Scholar] [CrossRef] [PubMed]
  2. Todorović, Z.B.; Mitrović, P.M.; Zlatković, V.; Grahovac, N.L.; Banković-Ilić, I.B.; Troter, D.Z.; Marjanović-Jeromela, A.M.; Veljković, V.B. Optimization of oil recovery from oilseed rape by cold pressing using statistical modeling. Food Meas. 2024, 18, 474–488. [Google Scholar] [CrossRef]
  3. Ji, C.X.; Zhai, Y.J.; Zhang, T.Z.; Shen, X.X.; Bai, Y.Y.; Hong, J.L. Carbon, energy and water footprints analysis of rapeseed oil production: A case study in China. J. Environ. Manag. 2021, 287, 112359. [Google Scholar] [CrossRef] [PubMed]
  4. Liu, L.; Xu, X.L.; Hu, Y.M.; Liu, Z.J.; Qiao, Z. Efficiency analysis of bioenergy potential on winter fallow fields: A case study of rape. Sci. Total Environ. 2018, 628–629, 103–109. [Google Scholar] [CrossRef]
  5. Tian, Z.; Ji, Y.H.; Xu, H.Q.; Sun, L.X.; Zhong, H.; Liu, J.G. The potential contribution of growing rapeseed in winter fallow fields across Yangtze River Basin to energy and food security in China. Resour. Conserv. Recycl. 2021, 164, 105159. [Google Scholar] [CrossRef]
  6. Biswas, B.; Timsina, J.; Garai, S.; Mondal, M.; Banerjee, H.; Adhikary, S.; Kanthal, S. Weed control in transplanted rice with post-emergence herbicides and their effects on subsequent rapeseed in Eastern India. Int. J. Pest Manag. 2023, 69, 89–101. [Google Scholar] [CrossRef]
  7. Li, R.H.; Qiang, S.; Qiu, D.S.; Chu, Q.H.; Pan, G.X. Effects of long-term diferent fertilization regimes on the diversity of weed communities in oilseed rape fields under rice-oilseed rape cropping system. Biodivers. Sci. 2008, 2, 118–125. [Google Scholar]
  8. Zheng, X.; Koopmann, B.; Ulber, B.; Tiedemann, A.V. A Global Survey on Diseases and Pests in Oilseed Rape-Current Challenges and Innovative Strategies of Control. Front. Agron. 2020, 2, 590908. [Google Scholar] [CrossRef]
  9. Williams, I.H. The Major Insect Pests of Oilseed Rape in Europe and Their Management: An Overview; Springer: Dordrecht, The Netherlands, 2020; pp. 1–43. [Google Scholar] [CrossRef]
  10. Diepenbrock, W. Yield analysis of winter oilseed rape (Brassica napus L.): A review. Field Crop. Res. 2000, 67, 35–49. [Google Scholar] [CrossRef]
  11. Sundaram, P.K.; Rahman, A.; Singh, A.K.; Sarkar, B. A novel method for manual weeding in row crops. Indian J. Agric. Sci. 2021, 91, 946–948. [Google Scholar] [CrossRef]
  12. Altmanninger, A.; Brandmaier, V.; Spangl, B.; Gruber, E.; Takács, E.; Mörtl, M.; Klátyik, S.; Székács, A.; Zaller, J.G. Glyphosate-Based Herbicide Formulations and Their Relevant Active Ingredients Affect Soil Springtails Even Five Months after Application. Agriculture 2023, 13, 2260. [Google Scholar] [CrossRef]
  13. Cordova-Cardenas, R.; Emmi, L.; Gonzalez-de-Santos, P. Enabling Autonomous Navigation on the Farm: A Mission Planner for Agricultural Tasks. Agriculture 2023, 13, 2181. [Google Scholar] [CrossRef]
  14. Zingsheim, M.L.; Döring, T.F. What weeding robots need to know about ecology. Agric. Ecosyst. Environ. 2024, 364, 108861. [Google Scholar] [CrossRef]
  15. Pessina, A.; Humair, L.; Naderi, R.; Röder, G.; Seehausen, M.L.; Rasmann, S.; Weyl, P. Investigating the host finding behaviour of the weevil Phytobius vestitus for the biological control of the invasive aquatic weed Myriophyllum aquaticum. Biol. Control 2024, 192, 105509. [Google Scholar] [CrossRef]
  16. Hanley, M.E. Thermal shock and germination in North-West European Genisteae: Implications for heathland management and invasive weed control using fire. Appl. Veg. Sci. 2009, 12, 385–390. [Google Scholar] [CrossRef]
  17. Krupanek, J.; Santos, P.G.; Emmi, L.; Wollweber, M.; Sandmann, H.; Scholle, K.; Tran, D.D.M.; Schouteten, J.J.; Andreasen, C. Environmental performance of an autonomous laser weeding robot: A case study. Int. J. Life Cycle Assess. 2024, 29, 1021–1052. [Google Scholar] [CrossRef]
  18. N’cho, S.A.; Mourits, M.; Rodenburg, J.; Lansink, A.O. Inefficiency of manual weeding in rainfed rice systems affected by parasitic weeds. Agric. Econ. 2018, 50, 151–163. [Google Scholar] [CrossRef]
  19. Jacquet, F.; Delame, N.; Vita, J.L.; Huyghe, C.; Reboud, X. The micro-economic impacts of a ban on glyphosate and its replacement with mechanical weeding in French vineyards. Crop Prot. 2021, 150, 105778. [Google Scholar] [CrossRef]
  20. Pannacci, E.; Tei, F.; Guiducci, M. Evaluation of mechanical weed control in legume crops. Crop Prot. 2018, 104, 52–59. [Google Scholar] [CrossRef]
  21. Radicetti, E.; Mancinelli, R. Sustainable Weed Control in the Agro-Ecosystems. Sustainability 2021, 13, 8639. [Google Scholar] [CrossRef]
  22. Bajwa, A.A. Sustainable weed management in conservation agriculture. Crop Prot. 2014, 65, 105–113. [Google Scholar] [CrossRef]
  23. Katie-Kangas, D.V.M. A Perspective on Glyphosate Toxicity: The Expanding Prevalence of This Chemical Herbicide and Its Vast Impacts on Human and Animal Health. J. Am. Holist. Vet. Med. Assoc. 2022, 68, 11–21. [Google Scholar] [CrossRef]
  24. Raj, J.; Kumar, P.; Jat, S.; Yadav, A. A Review on Weed Management Techniques. Int. J. Plant Soil Sci. 2023, 35, 66–74. [Google Scholar] [CrossRef]
  25. Shams, M.Y.; Gamel, S.A.; Talaat, F.M. Enhancing crop recommendation systems with explainable artificial intelligence: A study on agricultural decision-making. Neural Comput. Appl. 2024, 36, 5695–5714. [Google Scholar] [CrossRef]
  26. Vijayakumar, V.; Ampatzidis, Y.; Schueller, J.K.; Burks, T. Smart spraying technologies for precision weed management: A review. Smart Agric. Technol. 2023, 6, 100337. [Google Scholar] [CrossRef]
  27. Lee, W.S.; Slaughter, D.C.; Giles, D.K. Robotic weed control system for tomatoes. Precis. Agric. 1999, 1, 95–113. [Google Scholar] [CrossRef]
  28. Partel, V.; Kakarla, S.C.; Ampatzidis, Y. Development and evaluation of a low-cost and smart technology for precision weed management utilizing artificial intelligence. Comput. Electron. Agric. 2019, 157, 339–350. [Google Scholar] [CrossRef]
  29. Zhang, J.; Su, W.; Zhang, H.; Peng, Y. SE-YOLOv5x: An Optimized Model Based on Transfer Learning and Visual Attention Mechanism for Identifying and Localizing Weeds and Vegetables. Agronomy 2022, 12, 2061. [Google Scholar] [CrossRef]
  30. Rai, N.; Zhang, Y.; Villamil, M.; Howatt, K.; Ostlie, M.; Sun, X. Agricultural weed identification in images and videos by integrating optimized deep learning architecture on an edge computing technology. Comput. Electron. Agric. 2024, 216, 108442. [Google Scholar] [CrossRef]
  31. Rai, N.; Sun, X. WeedVision: A single-stage deep learning architecture to perform weed detection and segmentation using drone-acquired images. Comput. Electron. Agric. 2024, 219, 108792. [Google Scholar] [CrossRef]
  32. Liu, T.H.; Nie, X.N.; Wu, J.M.; Zhang, D.; Liu, W.; Cheng, Y.; Zheng, Y.; Qiu, J.; Qi, L. Pineapple (Ananas comosus) fruit detection and localization in natural environment based on binocular stereo vision and improved YOLOv3 model. Precis. Agric. 2023, 24, 139–160. [Google Scholar] [CrossRef]
  33. Sun, X.; Jiang, Y.; Ji, Y.; Fu, W.; Yan, S.; Chen, Q.; Yu, B.; Gan, X. Distance measurement system based on binocular stereo vision. IOP Conf. Ser. Earth Environ. Sci. 2019, 252, 052051. [Google Scholar] [CrossRef]
  34. Wu, Y.; Qiu, C.; Liu, S.; Zou, X.; Li, X. Tomato Harvesting Robot System Based on Binocular Vision. In Proceedings of the 2021 IEEE International Conference on Unmanned Systems (ICUS), Beijing, China, 15–17 October 2021; pp. 757–761. [Google Scholar] [CrossRef]
  35. Li, T.F.; Fang, W.T.; Zhao, G.A.; Gao, F.F.; Wu, Z.C.; Li, R.; Fu, L.S.; Dhupia, J. An improved binocular localization method for apple based on fruit detection using deep learning. Inf. Process. Agric. 2021, 10, 276–287. [Google Scholar] [CrossRef]
  36. Pal, A.; Leite, A.C.; From, P.J. A novel end-to-end vision-based architecture for agricultural human–robot collaboration in fruit picking operations. Robot. Auton. Syst. 2024, 172, 104567. [Google Scholar] [CrossRef]
  37. Shu, Y.F.; Zheng, W.B.; Xiong, C.W.; Xie, Z.M. Research on the vision system of lychee picking robot based on stereo vision. J. Radiat. Res. Appl. Sci. 2024, 17, 100777. [Google Scholar] [CrossRef]
  38. Thai, H.; Le, K.; Nguyen, N. FormerLeaf: An efficient vision transformer for Cassava Leaf Disease detection. Comput. Electron. Agric. 2023, 204, 107518. [Google Scholar] [CrossRef]
  39. Wójtowicz, A.; Piekarczyk, J.; Wójtowicz, M.; Jasiewicz, J.; Królewicz, S.; Starzycka-Korbas, E. Classification of Plenodomus lingam and Plenodomus biglobosus in Co-Occurring Samples Using Reflectance Spectroscopy. Agriculture 2023, 13, 2228. [Google Scholar] [CrossRef]
  40. Tang, Y.C.; Zhou, H.; Wang, H.J.; Zhang, Y.Q. Fruit detection and positioning technology for a Camellia oleifera C. Abel orchard based on improved YOLOv4-tiny model and binocular stereo vision. Expert Syst. Appl. 2023, 211, 118573. [Google Scholar] [CrossRef]
  41. Özlüoymak, Ö.B. Determination of Plant Height for Crop and Weed Discrimination by Using Stereo Vision System. J. Tekirdag Agric. Fac. 2020, 17, 97–107. [Google Scholar] [CrossRef]
  42. Zhang, R.R.; Lian, S.K.; Li, L.L.; Zhang, L.H.; Zhang, C.C.; Chen, L.P. Design and experiment of a binocular vision-based canopy volume extraction system for precision pesticide application by UAVs. Comput. Electron. Agric. 2023, 213, 108197. [Google Scholar] [CrossRef]
  43. Miao, Y.L.; Wang, L.Y.; Peng, C.; Li, H.; Li, X.H.; Zhang, M. Banana plant counting and morphological parameters measurement based on terrestrial laser scanning. Plant Methods 2022, 18, 66. [Google Scholar] [CrossRef] [PubMed]
  44. Liu, G.Y.; Xia, J.F.; Zheng, K.; Cheng, J.; Wang, K.X.; Liu, Z.Y.; Wei, Y.S.; Xie, D.Y. Measurement and evaluation method of farmland microtopography feature information based on 3D LiDAR and inertial measurement unit. Soil Tillage Res. 2024, 236, 105921. [Google Scholar] [CrossRef]
  45. Thanpattranon, P.; Ahamed, T.; Takigawa, T. Navigation of an Autonomous Tractor for a Row-Type Tree Plantation Using a Laser Range Finder—Development of a Point-to-Go Algorithm. Robotics 2015, 4, 341–364. [Google Scholar] [CrossRef]
  46. Mathiassen, S.K.; Bak, T.; Christensen, S.; Kudsk, P. The effect of laser treatment as a weed control method. Biosyst. Eng. 2006, 95, 497–505. [Google Scholar] [CrossRef]
  47. Shah, R.; Lee, W.S. An approach to a laser weeding system for elimination of in-row weeds. In Precision Agriculture ’15; Stafford, J.V., Ed.; Wageningen Academic: Wageningen, The Netherlands, 2015; pp. 307–312. [Google Scholar] [CrossRef]
  48. Redmon, J.; Farhadi, A. YOLO9000: Better, Faster, Stronger. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 6517–6525. [Google Scholar] [CrossRef]
  49. Rao, K.S.; Paramkusam, A.V.; Darimireddy, N.K.; Chehri, A. Block Matching Algorithms for the Estimation of Motion in Image Sequences: Analysis. Procedia Comput. Sci. 2021, 192, 2980–2989. [Google Scholar] [CrossRef]
  50. Yin, W.; Ji, Y.F.; Chen, J.T.; Li, R.; Feng, S.J.; Chen, Q.; Pan, B.; Jiang, Z.Y.; Zou, C. Initializing and accelerating Stereo-DIC computation using semi-global matching with geometric constraints. Opt. Lasers Eng. 2024, 172, 107879. [Google Scholar] [CrossRef]
Figure 1. The overall technical route of this paper.
Figure 2. Veronica didyma. (a) The seedling of Veronica didyma. (b) The plant diagram of Veronica didyma.
Figure 3. Veronica didyma images collected at different times of day. (a) 8:00 (fill light of 20%). (b) 12:00. (c) 18:00. (d) 24:00 (fill light of 100%).
Figure 4. Overview of the calibration process for the binocular cameras. (a,b) Checkerboard calibration boards. (c) Collected images of multiple sets of checkerboard calibration boards. (d) Corner detection on the checkerboard calibration board. (e) Bar chart of the binocular camera calibration error.
Figure 5. Stereo matching parallax image comparison between the SGBM and BM algorithms. (a) SGBM algorithm parallax image. (b) BM algorithm parallax image.
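The disparity maps compared in Figure 5 are converted to depth through the standard pinhole-stereo relation Z = fB/d. The sketch below shows that relation only; the focal length, baseline, and disparity values are purely illustrative and are not the paper's calibration results.

```python
# Standard pinhole-stereo depth from disparity: Z = f * B / d.
# f: focal length in pixels, B: baseline in mm, d: disparity in pixels.
# All numbers below are illustrative, not the paper's calibrated values.
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_mm: float) -> float:
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid stereo match")
    return focal_px * baseline_mm / disparity_px

# Example: 700 px focal length, 60 mm baseline, 140 px disparity
z = depth_from_disparity(140, 700, 60)  # 700 * 60 / 140 = 300.0 mm
print(f"depth: {z:.1f} mm")
```

Larger disparities map to nearer points, which is why a dense, low-noise disparity map (SGBM in Figure 5a) localizes weeds more reliably than a sparse one (BM in Figure 5b).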
Figure 6. The intelligent weed detection and laser weeding system set-up.
Figure 7. The overall process of the intelligent weed detection and laser weeding system.
Figure 8. Localization process of the binocular camera for Veronica didyma. (a) Anchor box position under the left camera. (b) Anchor box position after scanning is completed.
Figure 9. Laser weeding ablation experiment. (a) Before ablation. (b) Ablating. (c) After ablation.
Figure 10. Growth height trend of Veronica didyma at different laser scanning speeds.
Figure 11. Morphological transformation of Veronica didyma after laser scanning. (a) Before ablation. (b) 7 days after ablation. (c) 15 days after ablation.
Table 1. The environment configuration for YOLOv7 training.

| System | Conda | GPU | CUDA | PyTorch | Torchvision |
|--------|-------|-----|------|---------|-------------|
| Win10 | 23.3.1 | NVIDIA GeForce RTX 4070 Ti (ASUS, Shenzhen, China) | 11.3 | 1.12.1 | 0.13.1 |
Table 2. The coordinates of the Veronica didyma before and after the camera move.

| Weed No. | Coordinate before Moving | Coordinate after Moving (Moved 80 mm) | Actual Movement Distance of X-axis (σ [%]) | Actual Movement Distance of Y-axis (σ [%]) |
|---|---|---|---|---|
| 1 | (10.31, 13.10) | (93.47, −70.32) | 83.16 (3.16%) | 83.42 (3.42%) |
| 2 | (81.42, 25.48) | (159.26, −58.06) | 77.84 (2.16%) | 83.54 (3.54%) |
| 3 | (−78.14, 67.69) | (3.97, −13.68) | 82.11 (2.11%) | 81.37 (1.37%) |
| 4 | (1.04, 63.71) | (85.46, −20.61) | 84.42 (4.42%) | 84.32 (4.32%) |
| 5 | (20.69, −74.38) | (−60.54, 5.56) | 81.23 (1.23%) | 79.94 (0.06%) |
| 6 | (137.03, −69.90) | (61.54, 13.64) | 75.49 (4.51%) | 83.54 (3.54%) |
| 7 | (65.96, −54.06) | (−17.06, 22.53) | 83.02 (3.02%) | 76.59 (3.41%) |
| 8 | (153.53, −18.64) | (69.65, 64.21) | 83.88 (3.88%) | 82.85 (2.85%) |
| Average error | | | 3.06% | 2.81% |
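The Table 2 averages can be reproduced with a few lines of Python. Note that the σ values in the table appear to report the absolute deviation |d − 80| mm directly as percentage points; the sketch below follows that convention. All variable and function names here are ours, not the paper's.

```python
# Reproduce the Table 2 localization-error averages. Each weed's (x, y)
# coordinates in mm are measured by the binocular camera before and after
# the rig is translated by a commanded 80 mm.
MOVE = 80.0  # commanded camera displacement in mm

# (x, y) before and after moving, copied from Table 2
weeds = [
    ((10.31, 13.10), (93.47, -70.32)),
    ((81.42, 25.48), (159.26, -58.06)),
    ((-78.14, 67.69), (3.97, -13.68)),
    ((1.04, 63.71), (85.46, -20.61)),
    ((20.69, -74.38), (-60.54, 5.56)),
    ((137.03, -69.90), (61.54, 13.64)),
    ((65.96, -54.06), (-17.06, 22.53)),
    ((153.53, -18.64), (69.65, 64.21)),
]

def axis_errors(axis):
    """Absolute deviation |measured displacement - 80 mm| per weed."""
    return [abs(abs(after[axis] - before[axis]) - MOVE)
            for before, after in weeds]

mean_x = sum(axis_errors(0)) / len(weeds)  # ~3.06, matching the table
mean_y = sum(axis_errors(1)) / len(weeds)  # ~2.81, matching the table
print(f"average X error: {mean_x:.2f}%, average Y error: {mean_y:.2f}%")
```

Running this reproduces the 3.06% and 2.81% average errors reported in the last row of Table 2.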
Table 3. Orthogonal analysis table of experimental design and results.

| Experiment No. | Confidence [%] | Camera Height [cm] | Recognition Rate |
|---|---|---|---|
| 1 | 1 (70%) | 1 | 1.00 |
| 2 | 1 | 2 | 0.98 |
| 3 | 1 | 3 | 0.94 |
| 4 | 1 | 4 | 0.94 |
| 5 | 1 | 5 | 0.61 |
| 6 | 2 (75%) | 1 | 1.00 |
| 7 | 2 | 2 | 0.94 |
| 8 | 2 | 3 | 0.90 |
| 9 | 2 | 4 | 0.84 |
| 10 | 2 | 5 | 0.48 |
| 11 | 3 (80%) | 1 | 1.00 |
| 12 | 3 | 2 | 0.84 |
| 13 | 3 | 3 | 0.87 |
| 14 | 3 | 4 | 0.71 |
| 15 | 3 | 5 | 0.39 |
| 16 | 4 (85%) | 1 | 0.90 |
| 17 | 4 | 2 | 0.77 |
| 18 | 4 | 3 | 0.65 |
| 19 | 4 | 4 | 0.55 |
| 20 | 4 | 5 | 0.26 |
| 21 | 5 (90%) | 1 | 0.57 |
| 22 | 5 | 2 | 0.48 |
| 23 | 5 | 3 | 0.19 |
| 24 | 5 | 4 | 0.16 |
| 25 | 5 | 5 | 0.00 |
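A quick way to read a full-factorial/orthogonal table like Table 3 is a range (mean-level) analysis: the average recognition rate at each level of each factor. The sketch below applies that standard analysis to the Table 3 rates; the level indexing follows the table, and all names are ours, not the paper's.

```python
# Mean-level analysis of the Table 3 orthogonal experiment: average
# recognition rate per factor level. Rates are copied from Table 3,
# ordered by experiment number (1..25).
rates = [1.00, 0.98, 0.94, 0.94, 0.61,
         1.00, 0.94, 0.90, 0.84, 0.48,
         1.00, 0.84, 0.87, 0.71, 0.39,
         0.90, 0.77, 0.65, 0.55, 0.26,
         0.57, 0.48, 0.19, 0.16, 0.00]

# Zero-based experiment i uses confidence level i // 5 and height level i % 5.
def level_means(factor):
    """Mean rate per level; factor 0 = confidence, factor 1 = camera height."""
    means = []
    for lvl in range(5):
        vals = [r for i, r in enumerate(rates)
                if (i // 5 if factor == 0 else i % 5) == lvl]
        means.append(sum(vals) / len(vals))
    return means

conf_means = level_means(0)    # confidence levels 1..5 (70%..90%)
height_means = level_means(1)  # camera-height levels 1..5
print("confidence level means:", [round(m, 3) for m in conf_means])
print("camera height level means:", [round(m, 3) for m in height_means])
```

Both factors degrade the rate monotonically on average, and at confidence level 1 (70%) the rate is still 0.94 at height level 3, consistent with the 70% minimum confidence and 30 cm camera height adopted in the paper.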
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Qin, L.; Xu, Z.; Wang, W.; Wu, X. YOLOv7-Based Intelligent Weed Detection and Laser Weeding System Research: Targeting Veronica didyma in Winter Rapeseed Fields. Agriculture 2024, 14, 910. https://doi.org/10.3390/agriculture14060910

