Review

Key Technologies of Intelligent Weeding for Vegetables: A Review

1
College of Engineering, South China Agricultural University, Guangzhou 510642, China
2
School of Intelligent Engineering, Shaoguan University, Shaoguan 512005, China
3
Key Laboratory of Key Technology on Agricultural Machine and Equipment (South China Agricultural University), Ministry of Education, Guangzhou 510642, China
4
State Key Laboratory of Agricultural Equipment Technology, Guangzhou 510642, China
5
Guangdong Provincial Key Laboratory of Agricultural Artificial Intelligence (GDKL-AAI), Guangzhou 510642, China
*
Author to whom correspondence should be addressed.
Agriculture 2024, 14(8), 1378; https://doi.org/10.3390/agriculture14081378
Submission received: 15 July 2024 / Revised: 13 August 2024 / Accepted: 14 August 2024 / Published: 16 August 2024
(This article belongs to the Section Agricultural Technology)

Abstract
Vegetables are an essential part of people’s daily diet, and weeds can cause serious losses in vegetable yield and quality. Intelligent weeding technology for vegetables will be one of the mainstream technologies in modern agricultural development. This article reviews the current research status of intelligent weeding technology for vegetables, including vegetable and weed detection technology, weeding actuators, and weeding robots. Firstly, the vegetable and weed detection technology was introduced in detail from three aspects: global weed detection, crop-rows detection, and vegetable/weed precise recognition technology. The research results of some researchers were summarised, and the vegetable/weed precise recognition technology, including machine learning and proximal sensor technology, was introduced. Secondly, the weeding actuators and robots were introduced, including intelligent chemical weeding, mechanical weeding, physical weeding, and integrated weed management methods. Some weeding actuators and robots developed by researchers and agricultural companies were showcased. Finally, the challenges and future development directions of intelligent weeding technology were discussed and analysed. Intelligent weeding technology for vegetables is still mainly limited by natural conditions and a lack of technology. In the future, it will be possible to develop in the direction of multi-algorithm and multi-sensor fusion technologies. It is necessary to improve the applicability of intelligent weeding equipment for various environments, crops, and weeds. This article can provide a reference for future research in the field of intelligent weeding for vegetables.

1. Introduction

1.1. Background and Motivation

Vegetables, which have a very high nutritional value, can provide a variety of essential vitamins and minerals for the human body [1,2,3,4,5]. The nutrients and trace elements contained in vegetables can effectively help prevent various diseases that affect human health [6,7,8,9]. Therefore, there is a great demand for vegetables around the world. According to official statistics from the Food and Agriculture Organization of the United Nations (FAO) [10], by 2022 the global vegetable planting area was approximately 5.83 × 10⁷ hm², with a yield of about 2.01 × 10⁴ kg/hm². Compared with 2010, the vegetable planting area increased by 12% and the yield increased by 8.9%. The world population is expected to exceed 9 billion by 2050, so the planting area and yield of vegetables will continue to increase to meet demand. Vegetable cultivation, as one of the main income sources of farmers, can also improve economic and social benefits [11,12,13]. However, harmful weeds, which compete with vegetables for sunlight, air, water, and nutrients [14,15,16], can affect the growth and development of vegetables [17] and exacerbate the severity of vegetable pests and diseases [18,19,20,21], leading to serious losses in vegetable yield, quality, and nutritional value [22,23]. Therefore, weed control in vegetable fields is of great importance.
The main methods of weed control in vegetables include manual weeding [24,25], chemical weeding [26,27,28,29], mechanical weeding [30,31,32], biological weeding [33], and physical weeding [34,35,36]. Chemical weeding, the most widely used weed control method worldwide, has the advantages of low cost and high efficiency, but it can cause environmental pollution, and herbicide residues are harmful to human health [37,38,39,40]. The widespread use of herbicides can also cause problems such as increased weed resistance [41,42,43] and soil degradation [44,45]. Several countries around the world have adopted various measures to reduce the use of herbicides [46]. In addition, with the improvement of people's living standards, organic vegetables have received more and more attention due to their good quality, high safety, and high nutrition [47,48,49,50,51]. Therefore, reducing the use of herbicides is an urgent issue.
Non-chemical weeding methods include manual weeding, mechanical weeding, and physical weeding. Manual weeding has good weeding effects, but it is labour intensive and inefficient [52,53]. Mechanical weeding, which removes weeds with mechanical devices [54,55,56,57], can loosen the soil, increase soil permeability, and accelerate the mineralisation of soil organic matter and nutrient uptake [58,59], but it has difficulty removing intra-row weeds and can easily damage crops. Physical weeding methods mainly include mulching [60,61], flame weeding [36,62,63,64,65], electric weeding [66,67,68,69], thermal weeding (hot oil, hot steam, and hot foam) [35,70,71,72], and laser weeding [73]. The widespread use of physical weeding is limited by disadvantages such as high cost and high resource consumption.
With the development of agricultural technology, plant detection and automatic control technologies have been applied to weed management. These advanced technologies are combined with chemical, mechanical, or physical weeding equipment to form intelligent weeding technology [71,72,73,74]. In intelligent weeding, crops or weeds are first identified by machine vision technology, and chemical, mechanical, or physical weeding actuators are then directed by automatic control technology to remove the weeds. Intelligent weeding can remove weeds according to the needs of farmers [75,76,77]. It can accurately and selectively remove harmful weeds at specific locations while retaining beneficial ones [78,79], and it can identify crops during large-scale weed control to avoid damaging them [73,74]. It can also carry out site-specific weed management (SSWM) using weed maps. Compared with the large-scale use of chemical weeding, intelligent weeding technology can reduce the use of herbicides. It can readily remove intra-row weeds, avoid the crop damage caused by applying mechanical or physical weeding actuators alone [55], and reduce the cost and energy consumption of physical weeding [69,70,71,72]. Therefore, intelligent weeding technology will become one of the mainstream technologies for vegetable weeding.
Intelligent weeding technology targeting vegetables and weeds integrates various advanced technologies, such as image recognition, sensing, communication, and automatic control. Although intelligent weeding technology for vegetables has been studied deeply and extensively in many countries, and a series of mature equipment and products have been developed, the research results have not been summarised in a detailed and systematic way, which is not conducive to the development of key technologies. Therefore, this article reviews the relevant literature and research materials and provides a comprehensive overview of the research status of key intelligent weeding technologies for vegetables.

1.2. Related Surveys

We investigated the relevant literature on intelligent weeding technology, which provided references for this paper. In this section, some similar studies are introduced and discussed. Roberts et al. [80] introduced the application of weed detection in the agricultural field and analysed the benefits, challenges, and limitations of various detection methods, imagery, and sensor systems. Their review also showcased and described emerging weed control technologies, such as automatic chemical weeding, mechanical weeding, electric weeding, laser weeding, and thermal weeding, and explored the challenges and opportunities for their application in the field; however, it does not specifically introduce the weed detection process and its associated methods. Xiang et al. [81] reviewed intelligent mechanical weeding technologies, including crop and weed detection technologies (contact sensors, non-contact sensors, and machine vision) and mechanical weeding executive parts (hoes, spring teeth, fingers, brushes, and swing and rotary executive parts), and discussed the limitations and future development directions of intelligent mechanical weeding; however, their paper covers only intelligent mechanical weeding. Coleman et al. [82] reviewed the imperatives of integrated weed management (IWM) in vegetable production. Their review investigated the effectiveness of prevailing weed control methods and evaluated emerging weeding technologies for vegetables; it can provide useful information for advanced vegetable production systems globally and for smallholder vegetable production in developing economies. Li et al. [83] showcased typical weeding robots from the past thirty years and provided a detailed introduction to weed detection techniques based on machine learning and deep learning, offering a guide for researchers and practitioners. Zhang et al. [84] introduced weed detection methods based on machine learning and deep learning, presented some emerging weeding robots, and discussed the limitations and development trends of current systems.
These reviews can provide valuable information for researchers. However, few literature reviews provide detailed information on both weed detection and weed control techniques in the vegetable field. In this paper, the literature and research results in the field of intelligent weed control for vegetables are investigated and summarised. Weed detection technologies, including global weed detection, crop-rows detection, and the precise identification of vegetables and weeds, are comprehensively introduced. Through this review, researchers can learn about the research and application of weed detection technology in the field of vegetable weed control and can also discover which aspects of the technology are relatively lacking. Knowledge regarding the image acquisition technology, image processing methods, detection algorithms, and detection precision for vegetables and their associated weeds can all be obtained from this article. In addition, weeding actuators and robots (including intelligent chemical, mechanical, and physical weeding) that can be used for vegetable weeding are also presented. Researchers and vegetable growers can learn which weed control actuators and robots have been applied in the field of vegetable weed control, and which vegetables and weeds have been studied for detection and weed control techniques. The challenges and development directions of vegetable detection and weed control technologies are discussed at the end of the paper. In conclusion, this review can provide a reference for researchers' and vegetable growers' research and practice in the field of vegetable weed control.

1.3. Literature Indexing Methods

The literature in this review was retrieved from the Web of Science and EI databases, which include journals published by Elsevier, SpringerLink, Wiley, MDPI, IEEE Xplore, and others. The aim of the literature search was to obtain the latest research results in the field of intelligent weed control for vegetables over the last decade (2014–2024), including reviews and research papers. The search keywords were set based on the review content of each section. For Section 3, we initially executed the following search queries: {“vegetable” AND “weed detection”}; {“vegetable” AND “weed map”}; {“vegetable” AND “crop row detection”}; {“vegetable” AND “machine learning” OR “deep learning”}; {“vegetable” AND “sensor detection”}; {“vegetable” AND “spectral” OR “fluorescence” OR “ultrasonic” OR “LiDAR”}; {“vegetable” AND “crop signal” OR “plant label”}; etc. For Section 4, we initially executed the following search queries: {“vegetable” AND “weeding”}; {“vegetable” AND “intelligent weeding” OR “chemical weeding” OR “mechanical weeding” OR “physical weeding”}; etc. In addition, we also searched for some major vegetables, such as sugar beet, potatoes, tomatoes, cabbage, lettuce, and carrots. We filtered the retrieved results by reading the abstracts, removed duplicate studies, and retained only those in the agricultural field. A total of 402 articles were obtained, including 39 reviews and 357 research papers; 270 of these were used in this review.

1.4. Paper Organisation

This paper is organised as follows: Section 1 elaborates on the research background and motivation, related surveys, and literature indexing methods of intelligent weeding technology for vegetables; Section 2 provides a brief introduction to the types of vegetables and weeds; Section 3 introduces vegetable weed detection technology from three aspects: global weed detection, crop-rows detection, and precise identification of vegetables/weeds; Section 4 presents the weeding execution mechanism and weeding robot, including intelligent chemical weeding, mechanical weeding, physical weeding, and comprehensive weed management methods; Section 5 discusses the problems and future development of intelligent weeding technology; and Section 6 provides a summary of the entire paper.

2. Types of Vegetables and Weeds

The intelligent weeding technology introduced in this article mainly targets vegetables and their associated weeds, so a brief introduction to the types of vegetables and weeds is given here. There are hundreds of edible vegetable species around the world, of which more than a hundred are commonly cultivated, and there is no unified, authoritative classification system for vegetable types. In this study, according to their edible parts, vegetables were divided into rhizomes, leafy vegetables, cabbage, fruits and melons, scallions and garlic, sprouted vegetables, beans, aquatic vegetables, and fungi, as shown in Table 1. Some vegetable crops, especially those of the same species, have similar biological characteristics and planting patterns, which makes intelligent weeding technology more widely applicable but also makes crop types difficult to distinguish. In addition, the wide variety of vegetables poses a challenge for researchers studying weeding techniques.
There are many more types of weeds than vegetables, with over 1000 types around the world. According to their biological characteristics, weeds can be divided into categories such as annual and perennial weeds [85]. According to the needs of chemical control, they can also be divided into grasses, broad-leaved weeds, and sedges. Because of the wide variety of weed species and their biological similarity to vegetables, research on weed identification technology faces great difficulties. Many researchers therefore detect only one or a few types of weeds, or identify weeds indirectly by identifying crop rows [74,76,77].

3. Vegetable and Weed Identification Technology

Identification technology can identify vegetables and weeds in the field, serving the weed control mechanisms. The identification of vegetables and weeds has the following advantages: (1) it can detect the precise location and quantity of vegetables and weeds to achieve precise weed control [72,85]; (2) it can clarify the types of vegetables and weeds so that appropriate weeding methods can be selected [15,76,77,78]; (3) it can determine the growth stage of vegetables and weeds so that the optimal weeding time can be selected [86]. Researchers have conducted in-depth research on identification technology, which mainly includes global weed detection, crop-rows detection, and the precise identification of vegetables and weeds [86,87].

3.1. Global Weed Detection (GWD)

The appearance, distribution, and growth density of weeds in vegetable fields can be obtained by GWD technology, which helps farmers carry out SSWM and formulate reasonable and effective weeding measures. Fernández et al. [87] reviewed weed detection techniques, including GWD for SSWM, introduced image acquisition platforms and application scenarios, and evaluated the applicability and usability of various acquisition methods. The image acquisition platforms used for GWD mainly include drones and satellite remote sensing, and the image acquisition sensors include red–green–blue (RGB) cameras, multispectral sensors, and hyperspectral sensors. Table 2 shows some research achievements in GWD techniques in recent years. Unmanned aerial vehicles (UAVs) have been widely used in GWD due to their high flexibility and efficiency [86,88,89,90,91], but their endurance is poor, and they are greatly affected by lighting and wind. Compared with UAVs, manned aircraft and satellites have a wide detection range, but the resolution of the images they acquire is relatively low. RGB and multispectral sensors have also been widely used in GWD. Compared with these, hyperspectral sensors have a high spectral resolution but are applied less often because of their high cost and complex data processing [90]. Machine learning and deep learning algorithms with high recognition accuracy are used in GWD, and, as shown in Table 2, some researchers have created highly accurate weed distribution maps [92,93,94]. Researchers have conducted extensive research on food crops and grains in GWD, but there is a lack of research on vegetables; future work should therefore address vegetables to fill this gap. The research results shown in Table 2 can provide a reference for such studies.

3.2. Crop-Rows Detection (CWD)

CWD is one of the techniques used to achieve SSWM. CWD technology can be used for vehicle navigation [122,123], inter-row weed detection, and inter-row weeding [124,125]; it can control the travel routes of the vehicle and weeding device and prevent damage to crops. Most researchers use CWD for automatic navigation [123]. Shi et al. [126] reviewed the methods and applications of CWD technology in agricultural machinery navigation and summarised the advantages and disadvantages of the current mainstream CWD methods. CWD technology has great potential for identifying inter-row weeds, although there is limited research in this area. Table 3 shows the research results of some researchers on CWD for vegetables. The collection platforms available for CWD include UAVs, robots, tractors, moving vehicles, and handheld cameras, and the collection sensors include RGB, multispectral, hyperspectral, and LiDAR (Light Detection and Ranging) sensors. Crop-row recognition algorithms include the Hough transform, linear regression, vertical projection combined with the Hough transform, random sample consensus (RANSAC), frequency analysis, the exploration of horizontal strips, clustering algorithms, and deep learning. The Hough transform, a widely used image analysis method, can extract shapes such as lines, circles, and ellipses from images. Its detection principle is to convert spatial coordinate points into curves in a parameter space and to detect shapes in the image by statistically analysing the intersection points of these curves. As shown in Table 3, researchers have studied about thirty types of vegetables in CWD using different methods, and most of the detection accuracies reached over 80% [127,128,129,130,131]. These research methods and results can provide references for the study of other vegetables in CWD.
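To make the voting principle concrete, the following is a minimal pure-Python sketch of the Hough transform for straight lines (an illustrative assumption of ours, not any cited author's implementation): every foreground (plant) pixel votes for all (θ, ρ) parameter bins of lines passing through it, and a peak in the accumulator indicates a crop row.

```python
import math
from collections import Counter

def hough_lines(points, theta_steps=180, rho_res=1.0):
    """Vote in (theta, rho) parameter space for each (x, y) foreground
    pixel; a line is rho = x*cos(theta) + y*sin(theta)."""
    acc = Counter()
    for x, y in points:
        for t in range(theta_steps):
            theta = math.pi * t / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(t, round(rho / rho_res))] += 1
    return acc

# Synthetic "crop row": 20 plant pixels along the vertical line x = 5.
acc = hough_lines([(5, y) for y in range(20)])
# The bin for theta = 0, rho = 5 (i.e. the line x = 5) collects every vote.
print(acc[(0, 5)])  # 20
```

In practice the accumulator is a 2-D array over quantised (θ, ρ), and local maxima above a vote threshold are returned as detected rows.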

3.3. Vegetable Weed Precise Identification

3.3.1. Machine Learning (ML)

Machine vision technology, which is widely used in the agricultural field, can identify crops and weeds through image acquisition and image recognition. Image acquisition technology collects crop and weed images with various sensors such as RGB, multispectral, LiDAR, and ultrasonic sensors. Image recognition technology includes traditional image recognition and ML. Traditional image recognition uses segmentation models to distinguish crops and weeds based on plant features such as colour, shape, texture, and geometric parameters [149]. ML utilises classification models with data learning capabilities, which are more accurate and efficient in handling complex data. ML, which can be divided into traditional machine learning (TML) and deep learning (DL), involves five key steps: plant image acquisition, image pre-processing, image segmentation, plant image feature extraction, and plant classification [150,151,152].
Researchers have conducted extensive research on plant detection based on ML methods [150]. Some studies have reviewed and introduced ML methods and detection steps [150,151]. Al-Badri et al. [152] introduced traditional machine learning and deep learning weed detection methods and processes, analysed the advantages and limitations of ML, and discussed the future development trends of weed detection. Wang et al. [153] described the process of ML and detection, and provided a detailed introduction to image processing, image segmentation, plant feature extraction, and plant classification. Some review papers have also introduced DL methods and processes for plant detection [85,154]. Hu et al. [155] reviewed research on weed detection based on deep learning, presented common datasets and evaluation metrics, introduced deep learning architecture, and discussed future challenges and potential solutions. Qu et al. [156] reviewed the application of deep learning in vegetable and weed detection, provided detailed introductions to image data acquisition, image processing, image segmentation, image feature extraction, and plant classification algorithms, and presented the research results of some researchers. Table 4 and Table 5 show the research findings of some researchers in vegetable and weed detection based on TML and DL.
As shown in Table 5, researchers have used different methods to study the precise detection of various types of vegetables and weeds. Some researchers chose to identify weeds [163,164,165], while others chose to identify crops [161,162,171]. Most of them achieved high detection accuracy and precision. It can be seen that researchers are currently using deep learning methods more than traditional machine learning methods in the study of precise detection for vegetables. The literature and research findings in Table 4 and Table 5 can provide references and inspiration for future research in this field. The detection processes of traditional machine learning and deep learning methods are briefly introduced as follows:

Traditional Machine Learning (TML)

(1) Plant image acquisition
Some researchers collect crop and weed images under different natural conditions (different lighting, windy weather, etc.) through autonomous collection [157,160,161,162]. Image acquisition platforms generally include drones, ground vehicles, robots, and handheld cameras, and acquisition sensors include RGB, spectral, near-infrared, and LiDAR sensors, etc. [198,199]. RGB sensors have been widely used due to their low cost and ability to clearly capture spatial backgrounds, plant colours, leaf morphology, textures, and other features, as shown in Table 4 and Table 5. Some researchers have used public datasets containing images from different scenarios, including images that have already undergone background segmentation and image annotation, which provides great convenience to researchers [158,173,181,182,183]. Deng et al. [200] investigated image datasets that can be used for weed detection, collected a total of 36 public datasets, and presented a new dataset. These datasets can be found online and are regularly updated [201,202]. Datasets are expanded by various augmentation methods (such as rotation, scaling, trimming, flipping, and spatial colour transformations), which can reduce the risk of overfitting, alleviate data scarcity, improve the generalisation ability, robustness, and classification performance of classification models, and improve data balance. When identifying plants, the dataset is divided into training, validation, and test sets in certain proportions. The training set is used to construct the model, the validation set is used to determine the network structure or the parameters that control the complexity of the model, and the test set is used to evaluate the performance of the final optimal model.
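The proportional splitting described above can be sketched as follows (the file names and the 70/15/15 ratios are placeholders of ours, not values from any cited study):

```python
import random

def split_dataset(samples, train=0.7, val=0.15, seed=42):
    """Shuffle and split a list of samples into training, validation,
    and test sets; the test set receives the remainder."""
    items = list(samples)
    random.Random(seed).shuffle(items)  # fixed seed for reproducibility
    n_train = int(len(items) * train)
    n_val = int(len(items) * val)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

images = [f"img_{i:03d}.jpg" for i in range(100)]  # placeholder file names
tr, va, te = split_dataset(images)
print(len(tr), len(va), len(te))  # 70 15 15
```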
(2) Image pre-processing
The collected images are pre-processed to improve image quality. Image pre-processing methods generally include resizing, image enhancement, normalisation, colour space conversion, and denoising. Resizing the original images appropriately removes excess information and minimises computational cost. Image enhancement adjusts the contrast and other parameters of the original image to overcome problems such as lighting and shadows [164]. Normalisation transforms images into a fixed standard form, which can avoid interference caused by uneven lighting. Colour space conversion converts image colours for different purposes; colour space models, including RGB, Hue–Saturation–Intensity (HSI), Hue–Saturation–Value (HSV), Lab (luminosity (L) and colour channels a and b), and YCrCb (luminance and two chrominance components), can be obtained from RGB transformation functions [153]. Image noise can be removed by image filtering, which can smooth or sharpen images and improve their quality [166]. Common filtering methods include mean filtering, median filtering, bilateral filtering, Gaussian filtering, blur filtering, and homomorphic filtering.
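As one concrete example of the denoising step, a k × k median filter can be sketched in pure Python (a simplified illustration on a toy greyscale patch, not a cited implementation); it replaces each interior pixel with the median of its neighbourhood, which removes isolated "salt" noise without blurring edges as much as mean filtering:

```python
def median_filter(img, k=3):
    """Apply a k x k median filter to a 2-D list of grey values,
    leaving border pixels unchanged (a common simple convention)."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [row[:] for row in img]  # copy so the input is untouched
    for y in range(r, h - r):
        for x in range(r, w - r):
            window = sorted(img[y + dy][x + dx]
                            for dy in range(-r, r + 1)
                            for dx in range(-r, r + 1))
            out[y][x] = window[len(window) // 2]
    return out

# A flat grey patch with one bright "salt" noise pixel in the centre.
patch = [[10] * 5 for _ in range(5)]
patch[2][2] = 255
print(median_filter(patch)[2][2])  # 10 (the noise pixel is removed)
```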
(3) Image segmentation
Image segmentation is the process of dividing an image into specific, unique regions and defining regions of interest. In weed detection, crops and weeds are mainly separated from the background by segmentation methods based on colour indices, thresholding, and learning-based segmentation [203]. The colour indices include the Normalised Difference Index (NDI), Excess Green Index (ExG), Excess Green (EG), and Excess Green minus Excess Red Index (ExGR), etc. [159,160,161,162,163]. Threshold segmentation, which compares the pixel values of the processed images with a preset feature threshold to obtain the region of interest, is the most basic and widely used method in weed detection [164,171,180]. Common methods include the fixed threshold (a fixed greyscale value is used as the threshold, and pixels are classified into two categories by comparison with it), the Otsu method (an automatic threshold segmentation technique that selects the optimal global threshold from the greyscale histogram of the image so as to maximise inter-class variance or minimise intra-class variance), the dynamic threshold (the threshold is adjusted automatically to adapt to fluctuations and changes in the data), the adaptive threshold (local thresholds are calculated from the brightness distribution of different regions in the image to adapt to its local characteristics), and histogram entropy (the optimal threshold is determined from the entropy of the image histogram). Learning-based methods use machine learning algorithms to classify image pixels and thereby segment plants from the background [157]. Compared with segmentation based on colour indices and thresholds, learning-based methods have better classification performance but require a large amount of computation.
The segmented image may contain undesirable noise, misclassified pixels, and spurious connected regions, so morphological polishing is necessary to improve image quality. Common measures include morphological opening and closing, median filtering, and pattern filtering.
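The colour-index and threshold steps described above can be illustrated with a minimal sketch combining the ExG index with Otsu's method (the pixel values and the two "soil"/"plant" populations are synthetic placeholders of ours):

```python
def excess_green(r, g, b):
    """ExG = 2g - r - b computed on chromatic (sum-normalised) coordinates."""
    s = r + g + b
    if s == 0:
        return 0.0
    return 2 * g / s - r / s - b / s

def otsu_threshold(pixels):
    """Return the 8-bit threshold maximising between-class variance
    over a list of grey values (Otsu's method)."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b, w_b = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]          # background weight up to t
        if w_b == 0:
            continue
        w_f = total - w_b       # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        mean_b = sum_b / w_b
        mean_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (mean_b - mean_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# A green leaf pixel scores high on ExG; a grey one scores zero.
print(excess_green(50, 200, 50), excess_green(100, 100, 100))
# Two well-separated grey populations: "soil" near 40, "plant" near 200.
t = otsu_threshold([40] * 50 + [200] * 50)
print(t)  # a threshold between the two populations
```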
(4) Image feature extraction
There are several characteristic differences between crops and weeds, which can be divided into four categories: texture features, biological morphology features, spectral features, and spatial background. Extraction methods for texture features (such as leaf vein texture [165], smoothness, and roughness) include statistical methods (such as the grey level co-occurrence matrix (GLCM)) [157,171], structural methods (such as local binary patterns (LBP)) [168], model-based methods (such as fractal models, autoregressive (AR) models, and Markov random field models), and transformation methods (such as curve transformations and wavelet transformations) [200]. Texture feature methods offer high accuracy, adaptability, and robustness. Biological morphological features include region shape parameters (such as area, perimeter, length, and width) [167], region shape indices (such as eccentricity, circularity, convexity, and rectangularity), and boundary-based shape descriptors (such as moment invariants (MI), beam angle statistics (BAS), Fourier descriptors (FD), and tensor scale descriptors (TSD)) [161]. Biological morphology methods have strong independence and noise resistance but are strongly influenced by environmental factors. When there is a significant difference in spectral reflectance between crop and weed leaves, spectral features (colour indices) can be used for feature extraction [160,162]. The spatial background can also be applied to weed identification; some researchers identify inter-row weeds indirectly by identifying crop rows [125]. In addition, multi-feature fusion methods are commonly used to distinguish crops from weeds; these complement each feature's strengths, improve detection accuracy and stability, and mitigate some interference factors under non-ideal conditions.
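A few of the region shape parameters and indices named above can be computed directly from a binary mask; the following pure-Python sketch (pixel-counting definitions are our simplifying assumption) measures area, 4-connected boundary perimeter, and circularity = 4πA/P², which is near 1 for round leaves and lower for elongated ones:

```python
import math

def shape_features(mask):
    """Compute area, 4-connected boundary perimeter, and circularity
    from a binary mask given as a list of lists of 0/1."""
    h, w = len(mask), len(mask[0])
    area = sum(sum(row) for row in mask)
    perimeter = 0
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            # count edges facing background or the image border
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if ny < 0 or ny >= h or nx < 0 or nx >= w or not mask[ny][nx]:
                    perimeter += 1
    circ = 4 * math.pi * area / perimeter ** 2 if perimeter else 0.0
    return area, perimeter, circ

# A filled 4 x 4 square region inside a 6 x 6 mask.
square = [[0] * 6] + [[0, 1, 1, 1, 1, 0] for _ in range(4)] + [[0] * 6]
a, p, c = shape_features(square)
print(a, p, round(c, 2))  # 16 16 0.79
```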
(5) Plant classification
Plant classification uses machine learning algorithms to classify plant features. TML classification algorithms, which can be divided into supervised and unsupervised learning, first assemble the extracted image features into a feature dataset and then train classification models on these features. The trained model can classify and combine different feature data to distinguish between crops and weeds. Supervised learning algorithms, including the Support Vector Machine (SVM) [157,160], Linear Classifier (LC) [159], Decision Tree (DT), Random Forest (RF) [158,161], and K-Nearest Neighbour (K-NN) [166], train on and classify labelled feature data. Unsupervised learning algorithms, including K-means clustering and principal component analysis (PCA), do not require labelled data and can automatically group similar features or attributes in the data.
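As a minimal example of the supervised stage, a K-NN classifier over hand-crafted feature vectors can be sketched in a few lines (the two-feature vectors, e.g. a colour index and a circularity score, and their values are hypothetical placeholders of ours, not from any cited study):

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify a feature vector by majority vote among its k nearest
    labelled neighbours (Euclidean distance).
    train is a list of (feature_vector, label) pairs."""
    dists = sorted((math.dist(feats, query), label) for feats, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy labelled features: crops score higher on both hypothetical features.
train = [
    ([0.8, 0.7], "crop"), ([0.9, 0.8], "crop"), ([0.7, 0.9], "crop"),
    ([0.2, 0.3], "weed"), ([0.1, 0.2], "weed"), ([0.3, 0.1], "weed"),
]
print(knn_classify(train, [0.85, 0.75]))  # crop
print(knn_classify(train, [0.15, 0.25]))  # weed
```

In real systems the training pairs come from the feature-extraction step above, and k is tuned on the validation set.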

Deep Learning (DL)

DL algorithms can learn the inherent patterns and representation levels in data, extract the multi-scale and multi-dimensional spatial semantic feature information of plants, and improve the accuracy of weed identification. TML requires manual feature extraction from image data based on experience and professional knowledge, while DL is a hierarchical learning approach that uses representational learning to automatically extract deep features from images, with stronger representational capabilities and unique network feature structures. Compared with TML, DL can effectively avoid the subjectivity brought by the feature extraction process and improve the accuracy of plant identification. The detection process based on DL is similar to TML, and the main difference is the classification algorithm. Hasan et al. [204] reviewed the application of DL in weed detection, evaluated the advantages of DL over TML, introduced the detection process, and also presented some datasets.
DL includes supervised learning, unsupervised learning, and semi-supervised learning. Supervised learning requires image annotation during pre-processing to distinguish different plants. Common annotation methods include bounding box annotation [174,175,176], pixel-level annotation, image-level annotation [169,170], polygon annotation, and composite annotation. Unsupervised learning does not require image annotation; it can automatically extract discriminative information and features and divide them into separate groups. As shown in Table 5, common DL architectures include Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), Fully Convolutional Networks (FCN), Deep Belief Networks (DBN), Generative Adversarial Networks (GAN), and Graph Convolutional Networks (GCN).
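The convolution-plus-activation operation at the core of CNN feature extraction can be sketched in a few lines of pure Python. The tiny image and edge kernel below are hypothetical and only illustrate how a learned filter would respond to a leaf/soil boundary; real networks learn thousands of such filters.

```python
def conv2d(img, kernel):
    """Valid-mode 2-D cross-correlation on nested lists (no padding)."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(img) - kh + 1):
        row = []
        for j in range(len(img[0]) - kw + 1):
            row.append(sum(img[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

def relu(fmap):
    """Rectified linear activation applied element-wise."""
    return [[max(0, v) for v in row] for row in fmap]

# Hypothetical 3x4 intensity image with a sharp vertical boundary at column 2.
image = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
edge_kernel = [[-1, 1], [-1, 1]]  # responds to left-to-right intensity jumps
fmap = relu(conv2d(image, edge_kernel))
```

The feature map activates only at the boundary column, which is the kind of low-level response a CNN's first layers build higher-level plant features from.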

3.3.2. Sensor Detection Technology

Some researchers have used proximal sensors (such as spectral sensors, ultrasound, LiDAR, and thermal imaging) and corresponding recognition algorithms for plant detection [149]. Palottino et al. [205] reviewed the main types of sensors and their applications in plant detection, including greyscale/RGB imaging, visible near-infrared (VIS-NIR) and near-infrared (NIR) sensors, stereoscopic vision, thermal imaging, combination sensors, and phenotype sensors. Singh et al. [149] reviewed existing technologies that can be used to develop ground sensor systems, including multispectral and hyperspectral recognition and fluorescence imaging techniques for weed identification. Fernandez-Quintanilla et al. [87] described the current research status and applicability of remote sensing and ground detection technologies and introduced the limiting factors and future development directions of weed detection systems. The research methods and results in sensor detection technology can provide reference and inspiration for future research in vegetable detection.

Spectral Detection

Differences in spectral reflectance can be used for plant identification. Spectral sensors, including multispectral and hyperspectral sensors, can collect spectral images that capture the reflectance of visible, infrared, near-infrared, and ultraviolet light on plant leaves. Multispectral sensors usually acquire 3 to 10 bands and capture a relatively small amount of information, while hyperspectral sensors can acquire hundreds or thousands of bands with high resolution and rich information content, but at a high cost and with complex information processing, slow computing speed, and the need for specific processing algorithms. The steps of spectral plant detection include image acquisition, image pre-processing, image binarisation, image segmentation, feature extraction, and decision/weed detection [149]. Owing to their high resolution, spectral sensors are often used in remote sensing systems to detect plants in the field. Table 6 shows some research methods and results of vegetable and weed detection using spectral sensors. It can be seen that the technology has high detection accuracy for vegetables and weeds, but its application to vegetables is limited.
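A common colour-index step in the spectral pipeline above is computing a vegetation index from two bands and then binarising it. The sketch below uses NDVI with a hypothetical threshold; real systems calibrate the threshold per sensor and scene, and the tiny band arrays are invented for illustration.

```python
def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red) for two band images."""
    return [[(n - r) / (n + r) if (n + r) else 0.0
             for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir, red)]

def binarise(index_map, threshold=0.3):
    """1 = vegetation, 0 = soil/background (threshold is an assumption)."""
    return [[1 if v > threshold else 0 for v in row] for row in index_map]

# Hypothetical 2x2 reflectance values: left column leafy, right column soil.
nir = [[0.8, 0.1], [0.7, 0.2]]
red = [[0.1, 0.1], [0.1, 0.2]]
mask = binarise(ndvi(nir, red))
```

Healthy vegetation reflects strongly in NIR and weakly in red, so its NDVI is high while soil sits near zero, giving the binary vegetation mask used in later segmentation steps.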
In addition, Mohammadi et al. [206] developed a new type of multispectral camera that can be used for crop and weed detection. The multispectral camera was based on Fabry-Pérot technology and can cover visible and near-infrared wavelengths ranging from 380 to 950 nm. A genetic algorithm was used to optimise spectral bands, and a total of 16 spectral bands suitable for agricultural applications were selected. Tao et al. [207] designed a multispectral sensor for winter rape detection and weed detection to eliminate the influence of astigmatism on spectral sensors. Experimental verification results showed that the system can achieve high reflectivity, measurement accuracy, and efficiency. Duncan et al. [208] designed a low-cost multispectral plant detection sensor and the experimental results showed that the near-infrared bands of 810 and 860 nm had the best response to vegetation and soil.
Table 6. Research methods and results of spectral detection for some vegetables and weeds.
| Platforms | Crops | Weeds | Sensors | Resolution Ratio | Bands | Detection Algorithms | Precision | References |
|---|---|---|---|---|---|---|---|---|
| Handheld | Sorghum | Amaranthus, Chenopodium retusum, Malva sylvestris, Cyperus rotundus Linn., Alloteropsis cimicina | Hyperspectral | - | 20 bands, 325–1075 nm | Stepwise linear discriminant analysis (SLDA) | 80–100% | Che Ya et al. [209] |
| UAV | | | Multispectral | 0.87 mm | 440, 560, 680, 710, 720, and 850 nm | OBIA | 93% | |
| Aircraft | Corn | Sorghum halepense, Xanthium strumarium, Abutilon theophrasti | Hyperspectral | 2 m | 21 bands, 456–1650 nm | Spectral angle mapper and spectral mixture analysis | 60–80% | Martin et al. [210] |
| Tractor | Lettuce | Broadleaf herba | Multispectral | 0.95 mm | 525, 650, and 850 nm | Otsu's multi-level threshold | >81% | Elstone et al. [211] |
| Flatbed truck | Field pea, spring wheat, canola | Avena fatua Linn., Amaranthus retroflexus L., Chenopodium retusum | Hyperspectral | 1.25 mm | 61 bands, 400–1000 nm | Modified Chlorophyll Absorptance Reflectance Index; principal component analysis and stepwise discriminant analysis | 88–94% | Eddy et al. [212] |

Fluorescence Detection

Under irradiation by specific light, plant stems and leaves re-emit a portion of the absorbed light energy as fluorescence with characteristics inherent to the plant, which is why this highly sensitive technology is used for plant detection. The two most commonly used types of fluorescence, which can be induced by laser, ultraviolet, or red–blue light, are blue–green fluorescence in the range of 400–600 nm and chlorophyll fluorescence in the range of 650–800 nm. Fluorescence detection is a unique detection technique, but it still has certain limitations and can only be used for some plants. Table 7 shows some research methods and results of vegetable and weed detection by fluorescence detection technology. It can be seen that the detection accuracy of the technology is relatively high, exceeding 90%, but its application to vegetables is also limited.
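As a heavily simplified illustration of how the two fluorescence bands above might feed a classifier, the sketch below thresholds the blue-green to chlorophyll fluorescence intensity ratio. The threshold, labels, and intensity values are hypothetical assumptions and would need calibration against measured spectra for any real species pair.

```python
def fluorescence_ratio(blue_green, chlorophyll):
    """Ratio of blue-green (400-600 nm) to chlorophyll (650-800 nm)
    fluorescence intensity for one plant measurement."""
    return blue_green / chlorophyll

def classify_plant(blue_green, chlorophyll, threshold=1.0):
    """Hypothetical rule: a measurement whose ratio exceeds the (assumed)
    threshold is labelled 'weed', otherwise 'crop'."""
    ratio = fluorescence_ratio(blue_green, chlorophyll)
    return "weed" if ratio > threshold else "crop"
```

The point is only that a single band ratio can act as a discriminative feature; which species falls on which side of the threshold is an empirical question.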

Ultrasonic Detection

Ultrasonic sensors measure the distance between the sensor and the plant by emitting ultrasonic waves and receiving the reflected signals, thereby detecting plant height information. Zhao et al. [219] used ultrasound to detect the canopy height of wheat and studied detection accuracy at different operating speeds. The results showed that the average detection deviation across five wheat canopy densities was 0.14 m at moving speeds between 0.3 and 1.2 m/s. Ultrasonic sensors, which can also be used to assess plant biomass, ground cover, weed density, etc., offer fast detection, low cost, and stability, and can characterise plant structures relatively simply in complex field conditions. However, ultrasonic sensors are greatly affected by leaf obstruction and cannot specifically distinguish between plant species [220].
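The time-of-flight geometry described above is simple enough to sketch directly: the echo travels to the canopy and back, so the one-way distance is half the round trip, and canopy height is the mounting height minus that distance. The mounting height and echo time below are hypothetical values.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_distance(echo_time_s):
    """Round-trip time of flight -> one-way sensor-to-canopy distance (m)."""
    return SPEED_OF_SOUND * echo_time_s / 2

def canopy_height(sensor_mount_height_m, echo_time_s):
    """Plant height = mounting height minus sensor-to-canopy distance."""
    return sensor_mount_height_m - echo_distance(echo_time_s)
```

For example, a 2 ms echo from a sensor mounted 1 m above the soil corresponds to a canopy about 0.66 m tall; in the field, temperature affects the speed of sound and is usually compensated for.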

LiDAR Detection

LiDAR, which emits high-frequency laser beams at plants and receives the reflected signals, can obtain information about the posture, shape, distance, and height of plants and present it in the form of point clouds. Compared with ultrasonic sensors, LiDAR has a higher measurement frequency, higher accuracy, and a larger measurement area, and it has great potential in plant phenotyping and target detection [221,222]. Table 8 shows some research methods and results of crop and weed detection by LiDAR.
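One simple way to turn a LiDAR point cloud into a plant height estimate is a percentile spread on the z-coordinates, which rejects spurious high or low returns better than a raw max-minus-min. The point cloud and percentile choices below are hypothetical, and the percentile is a simple floor-index approximation.

```python
def plant_height(points, ground_pct=5, top_pct=95):
    """Estimate plant height from (x, y, z) LiDAR returns as the spread
    between a low and a high z-percentile (floor-index approximation)."""
    zs = sorted(p[2] for p in points)
    def pct(p):
        return zs[min(len(zs) - 1, int(p / 100 * (len(zs) - 1)))]
    return pct(top_pct) - pct(ground_pct)

# Hypothetical cloud: ground hits near z=0, canopy near z=0.3,
# plus one spurious high return that the percentile cut discards.
cloud = [(0.00, 0.0, 0.00), (0.10, 0.0, 0.01), (0.20, 0.1, 0.02),
         (0.10, 0.1, 0.30), (0.20, 0.2, 0.31), (0.15, 0.1, 0.32),
         (0.10, 0.2, 0.95)]
```

Real pipelines first separate ground points from vegetation (e.g., by plane fitting) before measuring height; this sketch only shows the robustness idea.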

Combination Sensors Detection

Combination sensor systems use multiple sensors for plant detection. Guo et al. [230] developed a high-throughput platform for plant phenotype analysis with multi-sensor fusion, which utilises LiDAR sensors, high-resolution cameras, thermal cameras, and hyperspectral image sensors. The platform can obtain plant shape parameters and vegetation indices. Zhang et al. [231] designed a drone-based LiDAR and multispectral detection system to evaluate the leaf area index of snap beans, which can also be used for sugar beets, soybeans, and winter wheat. Weis et al. [232] used LiDAR, ultrasonic sensors, RGB cameras, and spectral sensors to detect the plant characteristics of spring barley and oilseed rape under different conditions (planting density and herbicide stress) and compared two different data fusion methods. Table 9 shows some research methods and results of crop and weed detection using combination sensors. It can be seen that this approach achieves high precision for vegetable detection, but combination sensors must synchronise multiple sensors and process large amounts of data, and therefore require more computing power.
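A minimal late-fusion sketch, assuming each sensor's detector outputs a "weed" confidence in [0, 1]: per-sensor scores are combined by weighted averaging and thresholded. The scores, weights, and threshold are hypothetical; the cited systems use their own fusion schemes.

```python
def late_fusion(scores, weights=None):
    """Combine per-sensor 'weed' confidence scores by weighted averaging
    (one simple late-fusion strategy among many)."""
    weights = weights or [1.0] * len(scores)
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

def fused_decision(scores, weights=None, threshold=0.5):
    """Final label from the fused confidence (threshold is an assumption)."""
    return "weed" if late_fusion(scores, weights) > threshold else "crop"

# Hypothetical confidences from, say, an RGB, a spectral, and a LiDAR channel.
agree_weed = [0.9, 0.8, 0.7]
agree_crop = [0.1, 0.2, 0.3]
```

Late fusion is attractive because each sensor keeps its own detector; the alternative, early fusion, concatenates raw features before classification and needs tighter synchronisation.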

3.3.3. Plant Modification Technology

Plant modification technology, which alters the appearance, internal substances, or genes of plants by physical, chemical, or biological techniques, can be used to detect crops and weeds. Su et al. [237] introduced crop signal technology, reviewed the research progress of plant signal technology in crop and weed detection, and discussed the advantages, challenges, and application prospects of the technology. As shown in Table 10, plant modification technologies can be divided into physical, chemical, and biological modification. Physical modification involves placing physical labels (such as biodegradable plastics, fluorescent coatings, etc.) on or near the surface of plants and then using machine vision for crop recognition. Chemical modification involves applying compounds (such as fluorescent compounds, tracers, etc.) to plant seeds, roots, stems, or leaves for global or local labelling and then detecting crops by induced fluorescence or other technologies. Biological modification genetically modifies plants to produce substances that can be used for fluorescence detection. Some research results indicate that the application of plant modification technology in plant detection is feasible and has a high detection accuracy, but it requires a significant amount of labour, cost, or time [238,239,240,241,242,243,244,245,246].

4. Weeding Actuators and Robots

Researchers and agricultural companies have developed a series of weeding actuators that can be divided into chemical weeding, mechanical weeding, physical weeding, and combination weeding for weed control operations [248,249,250]. Roberts et al. [80] reviewed new technologies for weed control, such as chemical, mechanical, electrical, laser, and thermal weed control, and discussed the challenges and future development trends of these new technologies in weed control. Fennimore et al. [248,249,250] introduced some automated and mechanised weed control technologies that can distinguish between crops and weeds for selective weed control.
Weeding robots that integrate plant recognition and weed control technologies have also been developed by researchers and agricultural companies. Some developed countries (such as the United States, Germany, the United Kingdom, and Australia) have developed a series of mature weeding robots that are widely used locally, while weeding robot technology in other countries is relatively rudimentary [83,84,251]. The walking platforms used by weeding robots generally include drones, robots, and tractors. Li et al. [83] introduced the machine-vision-based weeding robots developed by researchers over the past thirty years; Zhang et al. [84] reviewed different weed detection methods and weed control robots for precise weed management, summarised the development trends in this field, discussed the limitations of existing weeding systems, and proposed ideas for future research directions. Gerhards et al. [251] compared seven new weeding robots and systems and conducted experiments in sugar beet and winter rapeseed fields to evaluate the weed control effectiveness, work efficiency, and cost of each system. The experimental results indicate that weeding robots have great potential for weeding vegetables and other crops and can reduce the use of herbicides.

4.1. Intelligent Chemical Weeding

With the development of machine vision and other technologies, intelligent chemical weeding technology, which builds on weed detection to spray weeds precisely and to apply different types of herbicides according to weed resistance, is gradually becoming widely used in green agricultural production. The technology reduces herbicide use and has high weeding efficiency, but it does not eliminate herbicides entirely. Allmendinger et al. [252] reviewed precise chemical weeding technology for patch spraying and spot spraying based on convolutional neural network detection and introduced tractor sprayers and autonomous robots. Özlüoymak et al. [253] designed a novel camera-integrated spraying needle nozzle for precise weed control; the camera detects the position of weeds, and the nozzle sprays automatically. The experimental results showed that the system had a small spraying error and a good spraying effect. Table 11 shows some intelligent chemical weeding robots and weeding methods developed by researchers and agricultural companies. In addition to the weeding robots shown in Table 11, some intelligent chemical weeding robots have also been developed by agricultural companies, such as Garford (garford.com, accessed on 28 June 2024), Agrifac (agrifac.com, accessed on 28 June 2024), AgriCon (agriconagroproducer.com, accessed on 28 June 2024), Fobro Mobil D49/D92 (fobro-mobil.com, accessed on 28 June 2024), Greeneye Technology (greeneye.ag, accessed on 28 June 2024), Goldacres (goldacres.com.au, accessed on 28 June 2024), Agrointelli (agrointelli.com, accessed on 28 June 2024), Blue-River Technology (bluerivertechnology.com, accessed on 28 June 2024), Bilberry, Amazone, etc.
It can be seen that intelligent chemical weeding robots based on drones, tractors, and robots have been developed for the weeding of some vegetables, which use different weed detection methods, weeding actuators, and spraying methods. Some robots have good work efficiency and have achieved good effects in saving herbicides.

4.2. Intelligent Mechanical Weeding

Intelligent mechanical weeding, which can overcome the crop damage caused by traditional mechanical weeding and can also remove intra-row weeds, has gradually become widely used in vegetable weeding. Xiang et al. [81] reviewed intelligent mechanical weeding technology, introduced various mechanical actuators, and discussed their limitations and future development trends. Some researchers have developed and tested intelligent mechanical weeding actuators [261,262,263], but these mechanisms have not yet been applied in actual production. Table 12 shows some intelligent mechanical weeding robots developed by researchers and agricultural companies. In addition, other agricultural companies have developed similar robots: Kverneland (ien.kverneland.com, accessed on 29 June 2024), Naïo-technologies (naio-technologies.com, accessed on 29 June 2024), Steketee IC-Weeder AI, etc. These weeding robots can identify vegetables or weeds by machine vision and sensor technology and then control mechanical weeding actuators to remove inter-row and intra-row weeds. Table 13 shows some mechanical weeding tools developed by researchers and agricultural companies. In addition, other companies have also developed mechanical weeding components, such as KULT Kress (kult-kress.com, accessed on 29 June 2024), which has developed comb teeth, the Habicht hoe, etc. These weeding tools and robots can provide references for the development of intelligent mechanical weeding technology.

4.3. Intelligent Physical Weeding

Physical methods with good weeding effects, such as electric shock [268], laser [269], and thermal weeding [71], are used for weed control. Broadcast physical weeding suffers from high cost, while point-precise physical weeding, enabled by plant recognition technology, can reduce this cost to some extent. Physical weeding has the potential to successfully control weeds and has been studied by researchers for a long time [268,269]. Electric weeding kills plant cells with short bursts of high-voltage electricity [69]. The method contacts only small areas of plant stems, leaves, or roots and cannot completely kill weeds; more contact time and energy output are required for complete removal, which undoubtedly increases costs. Bloomer et al. [24] reviewed traditional and new technologies for vegetable weed management and separately introduced pulsed electric microshock weed control, which can be combined with automation and artificial intelligence to create an autonomous, ultra-low-energy, selective, non-chemical weeding system. Laser weeding kills weeds with laser beams from laser-emitting equipment, including carbon dioxide, diode, and fibre lasers [73]. Thermal weeding, including flame [63], steam [70], hot foam, and hot oil weeding, kills weeds using high temperatures and energy [71,72]. Table 14 shows some intelligent physical weeding robots developed by researchers and agricultural companies. These robots can identify vegetables or weeds using machine vision and sensor technology and then control physical weeding actuators to remove the weeds, providing references for the development of intelligent physical weeding technology.

4.4. Integrated Weed Management (IWM)

Traditional weeding robots adopt a single weeding method or equipment type. IWM, which combines multiple weeding methods and strategies, has been used by some researchers to improve weeding effects and efficiency [270,271,272]. Young [270] introduced the integrated weed management method, discussed its future development directions, and illustrated the system devices that an integrated weed management robot could include, as shown in Figure 1a. The integrated weed management system arranges reasonable weed control strategies based on the field environment and weed types. Bawden et al. [271] developed a modular robot platform, Agbot II (qut.edu.au, accessed on 29 June 2024), with heterogeneous agricultural weed control arrays and introduced its weed control mechanism, which combines chemical and mechanical methods, as shown in Figure 1b. Wu et al. [272] also designed a modular weed control system that includes mechanical and chemical weed control tools, as shown in Figure 1c. With the cooperation of the weed detection system, the weeding system can achieve precise weed control. The combination of mechanical and chemical weeding methods can reduce the use of chemical herbicides to a certain extent and also improve work efficiency.

5. Challenges and Future Development

Intelligent weed control technology for vegetables is one of the mainstream technologies in modern agricultural development. Researchers have conducted extensive research in this field. Although some achievements have been made, there are still many challenges and problems:
(1) In terms of plant detection (including global weed detection, crop-rows recognition, and precise vegetable/weed identification), there is extensive research on detection technology based on machine learning and deep learning, while research on detection based on proximal sensors and plant modification is relatively scarce. Although machine learning and deep learning detection technologies achieve high accuracy, they still have limitations. The resolution and information content of images collected by platforms (drones, ground vehicles, remote sensing, etc.) and sensors (RGB, multispectral sensors, etc.) are degraded and incomplete under natural environmental conditions. Therefore, most researchers need to collect thousands, or even more, field images under various natural conditions, which undoubtedly increases the workload but is currently unavoidable [164,165,166,167,168,169]. Traditional image processing techniques suffer from poor accuracy and slow processing speed; machine learning and deep learning can improve recognition accuracy but require more time for image processing and feature training [174,175,176,177,178,179]. In addition, the plant characteristics of vegetables and weeds change across growth stages, which has a significant impact on detection accuracy. Existing plant detection technology can typically only identify individual plant types and has a narrow application range; accurately distinguishing types of crops or weeds and expanding the application range therefore remains a major problem [15,78,79]. Given the variety of vegetables and the existing detection technologies, there is still a lack of research on detection technologies for many vegetables and their associated weeds. In conclusion, detection technology for vegetables and weeds still needs further in-depth research.
(2) In terms of weeding actuators, intelligent chemical weeding has high efficiency and weeding efficacy [256,257,258,259,260], but herbicide application has not been completely eliminated, and problems such as environmental and soil pollution and weed resistance remain. Intelligent mechanical weeding can remove inter-row and intra-row weeds simultaneously, and some actuators achieve a high weeding rate and efficiency [264,265,266,267], which may make it a better option for farmers. However, some mechanical actuators have not yet demonstrated their weeding efficacy and require further field experiments. In addition, compared with intelligent chemical weeding, intelligent mechanical weeding has lower working efficiency and requires a higher technology investment [81]. Physical weeding also has certain limitations. Laser and electric weeding have some effect on weed removal but cannot completely kill weeds and have lower efficiency and higher costs, which is not conducive to use in large-scale vegetable fields [24,78]. Thermal weeding (such as flames, hot oil, steam, etc.) can kill harmful bacteria and insects while removing weeds, but it also kills beneficial bacteria and insects [60,70,71,72,73], and it too suffers from high cost and low efficiency. Therefore, existing weeding machinery has a series of problems, and efficient, environmentally friendly, and cost-effective weeding equipment still needs to be developed and improved.
(3) Weeding robots integrate plant detection and intelligent weeding devices, which need to be combined with information and automatic control technologies to complete the weeding operation. Existing weeding robots are still limited by problems such as image processing speed and data transmission, resulting in low efficiency and poor real-time performance. Merfield et al. [273] proposed ten characteristics and functions that weeding robots should possess, and this article agrees with these viewpoints. Some developed countries have developed a series of mature weeding robots that are widely used locally, while weeding robot technology in other countries is relatively rudimentary. The vegetable weeding robots presented in this article can provide reference and inspiration for such research.
Despite the problems mentioned above, intelligent weeding technology for vegetables still has certain development potential. In the future development of modern agriculture, the technology will become one of the mainstream technologies in vegetable production. In response to existing problems, the future development direction is as follows: In terms of plant detection, the impact of the natural environment can be addressed by manual intervention or multi-sensor methods. The collection technology should be combined with actual field conditions, so that the collected images can reflect real field information. For image processing problems, new algorithms or multi-algorithm fusion methods can be used to minimise time costs as much as possible. In precision agriculture, future plant detection technology should be able to accurately identify vegetables and weeds at different growth stages, as well as distinguish vegetable varieties and weed species. In terms of weeding actuators, integrated weed management will become a mainstream technology to overcome the limitations of a single weeding method by integrating multiple weed control methods. As for weeding robots, the adaptability and applicability of weeding robots in complex agricultural environments should be improved, especially in terms of weeding efficiency. With the development of sensor technology, weeding robots will integrate multiple sensors for field operations in the future, enabling them to have multiple functions such as “hands”, “feet”, “eyes”, and a “brain”.

6. Conclusions

This review provides a detailed overview of intelligent weeding technology for vegetables from the perspective of vegetable weed detection technology, weeding actuators, and robots. In terms of vegetable weed detection technology, global weed detection, crop-rows detection, and precise vegetable/weed recognition technologies were introduced. Global weed detection can create a weed map and arrange appropriate weed control strategies based on the weed map; crop-rows detection can be used to detect inter-row weeds and correct weed control trajectories; the precise identification technology for vegetables/weeds can be used for precise weed control to avoid damaging seedlings. Researchers have achieved significant results in the plant detection technology for vegetables using various sensors and classification algorithms, but there are still some problems. The next step should be to conduct research on issues such as the impact on the natural environment, slow processing speed, and poor real-time performance. In terms of weeding actuators and weeding robots, intelligent chemical weeding, mechanical weeding, physical weeding, and integrated weed management methods were described. Although some researchers and agricultural companies have developed various weeding actuators and robots, some still suffer from high costs, low weeding efficiency, and high seedling damage rates. At the end of the article, the problems and future development directions of intelligent weeding technology were discussed and analysed. The systematic analysis method is a popular literature review method nowadays, but this review adopts statistical and descriptive methods, which may be a limitation of this paper. We hope that this review presents the research status of intelligent weeding technology for vegetables from several aspects, which can provide a reference for future research in this field.

Author Contributions

Conceptualization, J.J. and Y.Z.; methodology, J.J.; software, C.C.; validation, J.J.; formal analysis, J.J. and C.C.; investigation, J.J.; resources, C.C.; data curation, J.J.; writing—original draft preparation, J.J.; writing—review and editing, Y.Z.; visualization, J.J. and C.C.; supervision, Y.Z.; project administration, Y.Z.; funding acquisition, Y.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by The Earmarked Fund for CARS (Grant No. CARS-01) and the Key Realm R&D Program of Guangdong Province (No. 2019B020221002).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data are contained within the article.

Acknowledgments

We would like to thank the anonymous reviewers for their critical comments and suggestions for improving the manuscript.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Sultanbawa, Y.; Sivakumar, D. Enhanced nutritional and phytochemical profiles of selected underutilized fruits, vegetables, and legumes. Curr. Opin. Food Sci. 2022, 46, 100853. [Google Scholar] [CrossRef]
  2. Appleton, K.; Dinnella, C.; Spinelli, S.; Morizet, D.; Saulais, L.; Hemingway, A.; Monteleone, E.; Depezay, L.; Perez-Cueto, F.; Hartwell, H. Consumption of a high quantity and a wide variety of vegetables are predicted by different food choice motives in older adults from France, Italy and the UK. Nutrients 2017, 9, 923. [Google Scholar] [CrossRef]
  3. Mwadzingeni, L.; Afari-Sefa, V.; Shimelis, H.; N’Danikou, S.; Figlan, S.; Depenbusch, L.; Shayanowako, A.I.T.; Chagomoka, T.; Mushayi, M.; Schreinemachers, P.; et al. Unpacking the value of traditional African vegetables for food and nutrition security. Food Secur. 2021, 13, 1215–1226. [Google Scholar] [CrossRef]
  4. Lee, S.; Choi, Y.; Jeong, H.S.; Lee, J.; Sung, J. Effect of different cooking methods on the content of vitamins and true retention in selected vegetables. Food Sci. Biotechnol. 2018, 27, 333–342. [Google Scholar] [CrossRef] [PubMed]
  5. Li, X.; Guo, C.; Zhang, Y.; Yu, L.; Ma, F.; Wang, X.; Zhang, L.; Li, P. Contribution of different food types to vitamin A intake in the Chinese diet. Nutrients 2023, 15, 4028. [Google Scholar] [CrossRef] [PubMed]
  6. Schreinemachers, P.; Simmons, E.B.; Wopereis, M.C.S. Tapping the economic and nutritional power of vegetables. Glob. Food Secur. 2018, 16, 36–45. [Google Scholar] [CrossRef]
  7. Shinali, T.S.; Zhang, Y.; Altaf, M.; Nsabiyeze, A.; Han, Z.; Shi, S.; Shang, N. The Valorization of wastes and byproducts from cruciferous vegetables: A review on the potential utilization of cabbage, cauliflower, and broccoli byproducts. Foods 2024, 13, 1163. [Google Scholar] [CrossRef] [PubMed]
  8. Popovic-Djordjevic, J.B.; Kostic, A.Z.; Rajkovic, M.B.; Miljkovic, I.; Krstic, D.; Caruso, G.; Siavash, M.S.; Brceski, I. Organically vs. conventionally grown vegetables: Multi-elemental analysis and nutritional evaluation. Biol. Trace Elem. Res. 2022, 200, 426–436. [Google Scholar] [CrossRef] [PubMed]
  9. Wang, H.; Zheng, Q.; Dong, A.; Wang, J.; Si, J. Chemical constituents, biological activities, and proposed biosynthetic pathways of steroidal saponins from healthy nutritious vegetable-allium. Nutrients 2023, 15, 2233. [Google Scholar] [CrossRef]
  10. Available online: https://www.fao.org/faostat/zh/#data/QCL (accessed on 20 June 2024).
  11. Fischer, G.; Patt, N.; Ochieng, J.; Mvungi, H. Participation in and gains from traditional vegetable value chains: A gendered analysis of perceptions of labour, income and expenditure in producers’ and traders’ households. Eur. J. Dev. Res. 2020, 32, 1080–1104. [Google Scholar] [CrossRef]
  12. Liu, L.; Ross, H.; Ariyawardana, A. Building rural resilience through agri-food value chains and community interactions: A vegetable case study in Wuhan, China. J. Rural. Stud. 2023, 101, 103047. [Google Scholar] [CrossRef]
  13. Ganesh, K.S.; Sridhar, A.; Vishali, S. Utilization of fruit and vegetable waste to produce value-added products: Conventional utilization and emerging opportunities: A review. Chemosphere 2022, 287, 132221. [Google Scholar] [CrossRef]
  14. Velasco-Ramírez, A.P.; Velasco-Ramírez, A.; Hernández-Herrera, R.M.; Ceja-Esquivez, J.; Velasco-Ramírez, S.F.; Ramírez-Anguiano, A.C.; Torres-Morán, M.I. The impact of aqueous extracts of Verbesina sphaerocephala and Verbesina fastigiata on germination and growth in Solanum lycopersicum and Cucumis sativus seedlings. Horticulturae 2022, 8, 652. [Google Scholar] [CrossRef]
  15. Gonzalez-Andujar, J.L.; Aguilera, M.J.; Davis, A.S.; Navarrete, L. Disentangling weed diversity and weather impacts on long-term crop productivity in a wheat-legume rotation. Field Crops Res. 2019, 232, 24–29. [Google Scholar] [CrossRef]
  16. Tanveer, A.; Khaliq, A.; Javaid, M.M.; Chaudhry, M.N.; Awan, I. Implications of weeds of genus Euphorbia for crop production: A review. Planta Daninha 2013, 31, 723–731. [Google Scholar] [CrossRef]
  17. Abdallah, I.S.; Atia, M.A.M.; Nasrallah, A.K.; El-Beltagi, H.S.; Kabil, F.F.; El-Mogy, M.M.; Abdeldaym, E.A. Effect of new pre-emergence herbicides on quality and yield of potato and its associated weeds. Sustainability 2021, 13, 9796. [Google Scholar] [CrossRef]
  18. Cloyd, R.A.; Herrick, N.J. The case for sanitation as an insect pest management strategy in greenhouse production systems. J. Entomol. Sci. 2022, 57, 315–322. [Google Scholar] [CrossRef]
  19. Madden, M.K.; Widick, I.V.; Blubaugh, C.K. Weeds impose unique outcomes for pests, natural enemies, and yield in two vegetable crops. Environ. Entomol. 2021, 50, 330–336. [Google Scholar] [CrossRef]
  20. Thies, J.A. Grafting for managing vegetable crop pests. Pest. Manag. Sci. 2021, 77, 4825–4835. [Google Scholar] [CrossRef]
  21. Dentika, P.; Ozier-Lafontaine, H.; Penet, L. Weeds as pathogen hosts and disease risk for crops in the wake of a reduced use of herbicides: Evidence from yam (Dioscorea alata) fields and Colletotrichum pathogens in the tropics. J. Fungi 2021, 7, 283. [Google Scholar] [CrossRef]
  22. Pumariño, L.; Sileshi, G.W.; Gripenberg, S.; Kaartinen, R.; Barrios, E.; Muchane, M.N.; Midega, C.; Jonsson, M. Effects of agroforestry on pest, disease and weed control: A meta-analysis. Basic Appl. Ecol. 2015, 16, 573–582. [Google Scholar] [CrossRef]
  23. Tolman, J.H.; McLeod, D.G.R.; Harris, C.R. Cost of crop losses in processing tomato and cabbage in southwestern Ontario due to insects, weeds and/or diseases. Can. J. Plant Sci. 2004, 3, 915–921. [Google Scholar] [CrossRef]
  24. Bloomer, D.J.; Harrington, K.C.; Ghanizadeh, H.; James, T.K. Robots and shocks: Emerging non-herbicide weed control options for vegetable and arable cropping. N. Z. J. Agric. Res. 2024, 67, 81–103. [Google Scholar] [CrossRef]
  25. Abit, M.; Dimas, E.; Ramirez, A. Weed survey of small-scale vegetable farms in Ormoc City, Philippines with emphasis on altitude variation. Philipp. J. Crop Sci. 2022, 3, 40–48. [Google Scholar]
  26. Da, S.S.R.; Vechia, J.; Dos, S.C.; Almeida, D.P.; Da, C.F.M. Relationship of contact angle of spray solution on leaf surfaces with weed control. Sci. Rep. 2021, 11, 9886. [Google Scholar] [CrossRef]
  27. Kaur, R.; Das, T.K.; Banerjee, T.; Raj, R.; Singh, R.; Sen, S. Impacts of sequential herbicides and residue mulching on weeds and productivity and profitability of vegetable pea in North-western Indo-Gangetic Plains. Sci. Hortic. 2020, 270, 109456. [Google Scholar] [CrossRef]
  28. Raja, R.; Nguyen, T.T.; Slaughter, D.C.; Fennimore, S.A. Real-time weed-crop classification and localisation technique for robotic weed control in lettuce. Biosyst. Eng. 2020, 192, 257–274. [Google Scholar] [CrossRef]
  29. Parkash, V.; Saini, R.; Singh, M.; Singh, S. Comparison of the effects of ammonium nonanoate and an essential oil herbicide on weed control efficacy and water use efficiency of pumpkin. Weed Technol. 2022, 36, 64–72. [Google Scholar] [CrossRef]
  30. Asaf, E.; Rozenberg, G.; Shulner, I.; Eizenberg, H.; Lati, R.N. Evaluation of finger weeder safety and efficacy for intra-row weed removal in irrigated field crops. Weed Res. 2023, 63, 102–114. [Google Scholar] [CrossRef]
  31. Jiao, J.; Wang, Z.; Luo, H.; Chen, G.; Liu, H.; Guan, J.; Hu, L.; Zang, Y. Development of a mechanical weeder and experiment on the growth, yield and quality of rice. Int. J. Agric. Biol. Eng. 2022, 15, 92–99. [Google Scholar] [CrossRef]
  32. Jiao, J.; Hu, L.; Chen, G.; Chen, C.; Zang, Y. Development and experimentation of intra-row weeding device for organic rice. Agriculture 2024, 14, 146. [Google Scholar] [CrossRef]
  33. Abdelaal, K.; Alsubeie, M.S.; Hafez, Y.; Emeran, A.; Moghanm, F.; Okasha, S.; Omara, R.; Basahi, M.A.; Darwish, D.B.E.; Ibrahim, M.F.M.; et al. Physiological and biochemical changes in vegetable and field crops under drought, salinity and weeds stresses: Control strategies and management. Agriculture 2022, 12, 2084. [Google Scholar] [CrossRef]
  34. Lewis, D.G.; Cutulle, M.A.; Schmidt-Jeffris, R.A.; Blubaugh, C.K. Better together? Combining cover crop mulches, organic herbicides, and weed seed biological control in reduced-tillage systems. Environ. Entomol. 2020, 49, 1327–1334. [Google Scholar] [CrossRef]
  35. Merfield, C.N.; Hampton, J.G.; Wratten, S.D. A direct-fired steam weeder. Weed Res. 2009, 49, 553–556. [Google Scholar] [CrossRef]
  36. Sportelli, M.; Frasconi, C.; Fontanelli, M.; Pirchio, M.; Gagliardi, L.; Raffaelli, M.; Peruzzi, A.; Antichi, D. Innovative living mulch management strategies for organic conservation field vegetables: Evaluation of continuous mowing, flaming, and tillage performances. Agronomy 2022, 12, 622. [Google Scholar] [CrossRef]
  37. Rastgordani, F.; Oveisi, M.; Mashhadi, H.R.; Naeimi, M.H.; Hosseini, N.M.; Asadian, N.; Bakhshian, A.; Müller-Schärer, H. Climate change impact on herbicide efficacy: A model to predict herbicide dose in common bean under different moisture and temperature conditions. Crop Prot. 2023, 163, 106097. [Google Scholar] [CrossRef]
  38. Abdallah, I.S.; Abdelgawad, K.F.; El-Mogy, M.M.; El-Sawy, M.B.I.; Mahmoud, H.A.; Fahmy, M.A.M. Weed control, growth, nodulation, quality and storability of peas as affected by pre- and postemergence herbicides. Horticulturae 2021, 7, 307. [Google Scholar] [CrossRef]
  39. Mosqueda, E.G.; Lim, C.A.; Sbatella, G.M.; Jha, P.; Lawrence, N.C.; Kniss, A.R. Effect of crop canopy and herbicide application on kochia (Bassia scoparia) density and seed production. Weed Sci. 2020, 68, 278–284. [Google Scholar] [CrossRef]
  40. Colquhoun, J.B.; Heider, D.J.; Rittmeyer, R.A. Potato injury risk and weed control from reduced rates of PPO-inhibiting herbicides. Weed Technol. 2021, 35, 632–637. [Google Scholar] [CrossRef]
  41. Buzanini, A.C.; Boyd, N.S. Tomato and bell pepper tolerance to preemergence herbicides applied posttransplant in plasticulture production. Weed Technol. 2023, 37, 67–70. [Google Scholar] [CrossRef]
  42. Boyd, N.S.; Moretti, M.L.; Sosnoskie, L.M.; Singh, V.; Kanissery, R.; Sharpe, S.; Besançon, T.; Culpepper, S.; Nurse, R.; Hatterman-Valenti, H.; et al. Occurrence and management of herbicide resistance in annual vegetable production systems in North America. Weed Sci. 2022, 70, 515–528. [Google Scholar] [CrossRef]
  43. Jhala, A.J.; Singh, M.; Shergill, L.; Singh, R.; Jugulam, M.; Riechers, D.E.; Ganie, Z.A.; Selby, T.P.; Werle, R.; Norsworthy, J.K. Very long chain fatty acid-inhibiting herbicides: Current uses, site of action, herbicide-resistant weeds, and future. Weed Technol. 2023, 38, e1. [Google Scholar] [CrossRef]
  44. Rao, A.N.; Singh, R.G.; Mahajan, G.; Wani, S.P. Weed research issues, challenges, and opportunities in India. Crop Prot. 2020, 134, 104451. [Google Scholar] [CrossRef]
  45. Martin, R.J. Weed research issues, challenges, and opportunities in Cambodia. Crop Prot. 2020, 134, 104288. [Google Scholar] [CrossRef]
  46. Zawada, M.; Legutko, S.; Gościańska-Łowińska, J.; Szymczyk, S.; Nijak, M.; Wojciechowski, J.; Zwierzyński, M. Mechanical weed control systems: Methods and effectiveness. Sustainability 2023, 15, 15206. [Google Scholar] [CrossRef]
  47. Zejak, D.; Popović, V.; Spalević, V.; Popović, D.; Radojević, V.; Ercisli, S.; Glišić, I. State and economical benefit of organic production: Fields crops and fruits in the world and Montenegro. Not. Bot. Horti Agrobot. Cluj-Napoca 2022, 50, 12815. [Google Scholar] [CrossRef]
  48. Mazur-Włodarczyk, K.; Gruszecka-Kosowska, A. Conventional or organic? Motives and trends in Polish vegetable consumption. Int. J. Environ. Res. Public Health 2022, 19, 4667. [Google Scholar] [CrossRef] [PubMed]
  49. Migliavada, R.; Ricci, F.Z.; Denti, F.; Haghverdian, D.; Torri, L. Is purchasing of vegetable dishes affected by organic or local labels? Empirical evidence from a university canteen. Appetite 2022, 173, 105995. [Google Scholar] [CrossRef]
  50. Loera, B.; Murphy, B.; Fedi, A.; Martini, M.; Tecco, N.; Dean, M. Understanding the purchase intentions for organic vegetables across EU: A proposal to extend the TPB model. Br. Food J. 2022, 124, 4736–4754. [Google Scholar] [CrossRef]
  51. de Lima, D.P.; Dos Santos Pinto Júnior, E.; de Menezes, A.V.; de Souza, D.A.; de São José, V.P.B.; Da Silva, B.P.; de Almeida, A.Q.; de Carvalho, I.M.M. Chemical composition, minerals concentration, total phenolic compounds, flavonoids content and antioxidant capacity in organic and conventional vegetables. Food Res. Int. 2024, 175, 113684. [Google Scholar] [CrossRef] [PubMed]
  52. Imran; Amanullah. Assessment of chemical and manual weed control approaches for effective weed suppression and maize productivity enhancement under maize-wheat cropping system. Gesunde Pflanz. 2022, 74, 167–176. [Google Scholar] [CrossRef]
  53. Awan, D.; Ahmad, F.; Ashraf, S. Effective weed control strategy in tomato kitchen gardens—herbicides, mulching or manual weeding. Curr. Sci. India 2018, 6, 1325–1329. [Google Scholar] [CrossRef]
  54. Gazoulis, I.; Kanatas, P.; Antonopoulos, N. Cultural practices and mechanical weed control for the management of a low-diversity weed community in spinach. Diversity 2021, 13, 616. [Google Scholar] [CrossRef]
  55. Pandey, H.S.; Tiwari, G.S.; Sharma, A.K. Design and development of an e-powered inter row weeder for small farm mechanization. J. Sci. Ind. Res. 2023, 82, 671–682. [Google Scholar] [CrossRef]
  56. Baidhe, E.; Kigozi, J.; Kambugu, R.K. Design, construction and performance evaluation for a maize weeder attachable to an ox-plough frame. J. Biosyst. Eng. 2020, 45, 65–70. [Google Scholar] [CrossRef]
  57. Richard, D.; Leimbrock-Rosch, L.; Keßler, S.; Stoll, E.; Zimmer, S. Soybean yield response to different mechanical weed control methods in organic agriculture in Luxembourg. Eur. J. Agron. 2023, 147, 126842. [Google Scholar] [CrossRef]
  58. Jiao, J.K.; Hu, L.; Chen, G.L.; Tu, T.P.; Wang, Z.M.; Zang, Y. Design and experiment of an inter-row weeding equipment applied in paddy field. Trans. Chin. Soc. Agric. Eng. (Trans. CSAE) 2023, 39, 11–22. [Google Scholar]
  59. Ghorai, A.K. “Agricultural weeder with nail assembly” for weed control, soil moisture conservation, soil aeration and increasing crop productivity. Int. J. Environ. Clim. Chang. 2022, 12, 3056–3068. [Google Scholar] [CrossRef]
  60. Robačer, M.; Canali, S.; Kristensen, H.L.; Bavec, F.; Mlakar, S.G.; Jakop, M.; Bavec, M. Cover crops in organic field vegetable production. Sci. Hortic. 2016, 208, 104–110. [Google Scholar] [CrossRef]
  61. Greer, L.; Dole, J.M. Aluminum foil, aluminum-painted, plastic, and degradable mulches increase yields and decrease insect-vectored viral diseases of vegetables. Hort. Technol. 2003, 2, 276–284. [Google Scholar] [CrossRef]
  62. McCollough, M.R.; Poulsen, F.; Melander, B. Informing the operation of intelligent automated intra-row weeding machines in direct-sown sugar beet (Beta vulgaris L.): Crop effects of hoeing and flaming across early growth stages, tool working distances, and intensities. Crop Prot. 2024, 177, 106562. [Google Scholar] [CrossRef]
  63. Morselli, N.; Puglia, M.; Pedrazzi, S.; Muscio, A.; Tartarini, P.; Allesina, G. Energy, environmental and feasibility evaluation of tractor-mounted biomass gasifier for flame weeding. Sustain. Energy Technol. Assess. 2022, 50, 101823. [Google Scholar] [CrossRef]
  64. Borowy, A.; Kapłan, M. Evaluating glufosinate-ammonium and flame weeding for weed control in sweet marjoram (Origanum majorana L.) cultivation. Acta Sci. Pol. Hortorum Cultus 2022, 21, 71–83. [Google Scholar] [CrossRef]
  65. Rajković, M.; Malidža, G.; Tomaš Simin, M.; Milić, D.; Glavaš-Trbić, D.; Meseldžija, M.; Vrbničanin, S. Sustainable organic corn production with the use of flame weeding as the most sustainable economical solution. Sustainability 2021, 13, 572. [Google Scholar] [CrossRef]
  66. Galbraith, C.G. Electrical Weed Control in Integrated Weed Management: Impacts on Vegetable Production, Weed Seed Germination, and Soil Microbial Communities. Master’s Thesis, Michigan State University, East Lansing, MI, USA, 2023. [Google Scholar]
  67. Moore, L.D.; Jennings, K.M.; Monks, D.W.; Boyette, M.D.; Leon, R.G.; Jordan, D.L.; Ippolito, S.J.; Blankenship, C.D.; Chang, P. Evaluation of electrical and mechanical Palmer amaranth (Amaranthus palmeri) management in cucumber, peanut, and sweetpotato. Weed Technol. 2023, 37, 53–59. [Google Scholar] [CrossRef]
  68. Matsuda, Y.; Kakutani, K.; Toyoda, H. Unattended electric weeder (UEW): A novel approach to control floor weeds in orchard nurseries. Agronomy 2023, 13, 1954. [Google Scholar] [CrossRef]
  69. Bloomer, D.J.; Harrington, K.C.; Ghanizadeh, H.; James, T.K. Micro electric shocks control broadleaved and grass weeds. Agronomy 2022, 12, 2039. [Google Scholar] [CrossRef]
  70. Guerra, N.; Fennimore, S.A.; Siemens, M.C.; Goodhue, R.E. Band steaming for weed and disease control in leafy greens and carrots. Hortscience 2022, 57, 1453–1459. [Google Scholar] [CrossRef]
  71. Zhang, Y.; Staab, E.S.; Slaughter, D.C.; Giles, D.K.; Downey, D. Automated weed control in organic row crops using hyperspectral species identification and thermal micro-dosing. Crop Prot. 2012, 41, 96–105. [Google Scholar] [CrossRef]
  72. Rasmussen, J.; Griepentrog, H.W.; Nielsen, J.; Henriksen, C.B. Automated intelligent rotor tine cultivation and punch planting to improve the selectivity of mechanical intra-row weed control. Weed Res. 2012, 52, 327–337. [Google Scholar] [CrossRef]
  73. Zhu, H.; Zhang, Y.; Mu, D.; Bai, L.; Zhuang, H.; Li, H. YOLOX-based blue laser weeding robot in corn field. Front. Plant Sci. 2022, 13, 1017803. [Google Scholar] [CrossRef] [PubMed]
  74. Kennedy, H.; Fennimore, S.A.; Slaughter, D.C.; Nguyen, T.T.; Vuong, V.L.; Raja, R.; Smith, R.F. Crop signal markers facilitate crop detection and weed removal from lettuce and tomato by an intelligent cultivator. Weed Technol. 2020, 34, 342–350. [Google Scholar] [CrossRef]
  75. Mennan, H.; Jabran, K.; Zandstra, B.H.; Pala, F. Non-chemical weed management in vegetables by using cover crops: A review. Agronomy 2020, 10, 257. [Google Scholar] [CrossRef]
  76. Merfield, C.N. Could the dawn of Level 4 robotic weeders facilitate a revolution in ecological weed management? Weed Res. 2023, 63, 83–87. [Google Scholar] [CrossRef]
  77. Cutulle, M.A.; Maja, J.M. Determining the utility of an unmanned ground vehicle for weed control in specialty crop systems. Ital. J. Agron. 2021, 16, 1865. [Google Scholar] [CrossRef]
  78. Stenchly, K.; Lippmann, S.; Waongo, A.; Nyarko, G.; Buerkert, A. Weed species structural and functional composition of okra fields and field periphery under different management intensities along the rural-urban gradient of two West African cities. Agric. Ecosyst. Environ. 2017, 237, 213–223. [Google Scholar] [CrossRef]
  79. Cruz-Garcia, G.S.; Price, L.L. Weeds as important vegetables for farmers. Acta Soc. Bot. Pol. 2012, 81, 397–403. [Google Scholar] [CrossRef]
  80. Roberts, J.; Florentine, S. Advancements and developments in the detection and control of invasive weeds: A global review of the current challenges and future opportunities. Weed Sci. 2024, 72, 205–215. [Google Scholar] [CrossRef]
  81. Xiang, M.; Qu, M.; Wang, G.; Ma, Z.; Chen, X.; Zhou, Z.; Qi, J.; Gao, X.; Li, H.; Jia, H. Crop detection technologies, mechanical weeding executive parts and working performance of intelligent mechanical weeding: A review. Front. Plant Sci. 2024, 15, 1361002. [Google Scholar] [CrossRef]
  82. Coleman, M.J.; Kristiansen, P.; Sindel, B.M.; Fyfe, C. Imperatives for integrated weed management in vegetable production: Evaluating research and adoption. Weed Biol. Manag. 2024, 24, 3–14. [Google Scholar] [CrossRef]
  83. Li, Y.; Guo, Z.; Shuang, F.; Zhang, M.; Li, X. Key technologies of machine vision for weeding robots: A review and benchmark. Comput. Electron. Agric. 2022, 196, 106880. [Google Scholar] [CrossRef]
  84. Zhang, W.; Miao, Z.; Li, N.; He, C.; Sun, T. Review of current robotic approaches for precision weed management. Curr. Robot. Rep. 2022, 3, 139–151. [Google Scholar] [CrossRef]
  85. Murad, N.Y.; Mahmood, T.; Forkan, A.; Morshed, A.; Jayaraman, P.P.; Siddiqui, M.S. Weed detection using deep learning: A systematic literature review. Sensors 2023, 23, 3670. [Google Scholar] [CrossRef] [PubMed]
  86. Esposito, M.; Crimaldi, M.; Cirillo, V.; Sarghini, F.; Maggio, A. Drone and sensor technology for sustainable weed management: A review. Chem. Biol. Technol. Agric. 2021, 8, 18. [Google Scholar] [CrossRef]
  87. Fernández Quintanilla, C.; Peña, J.M.; Andújar, D.; Dorado, J.; Ribeiro, A.; López Granados, F. Is the current state of the art of weed monitoring suitable for site-specific weed management in arable crops? Weed Res. 2018, 58, 259–272. [Google Scholar] [CrossRef]
  88. Singh, V.; Rana, A.; Bishop, M.; Filippi, A.M.; Cope, D.; Rajan, N.; Bagavathiannan, M. Chapter Three—Unmanned aircraft systems for precision weed detection and management: Prospects and challenges. Adv. Agron. 2020, 159, 93–134. [Google Scholar] [CrossRef]
  89. Bolch, E.A. Comparing Mapping Capabilities of Small Unmanned Aircraft and Manned Aircraft for Monitoring Invasive Plants in a Wetland Environment. Master’s Thesis, University of California, Merced, CA, USA, 2020. [Google Scholar]
  90. Mohidem, N.A.; Che’Ya, N.N.; Juraimi, A.S.; Fazlil Ilahi, W.F.; Mohd Roslim, M.H.; Sulaiman, N.; Saberioon, M.; Mohd Noor, N. How can unmanned aerial vehicles be used for detecting weeds in agricultural fields? Agriculture 2021, 10, 1004. [Google Scholar] [CrossRef]
  91. Huang, Y.; Reddy, K.N.; Fletcher, R.S.; Pennington, D. UAV low-altitude remote sensing for precision weed management. Weed Technol. 2018, 32, 2–6. [Google Scholar] [CrossRef]
  92. Su, J.; Yi, D.; Coombes, M.; Liu, C.; Zhai, X.; McDonald-Maier, K.; Chen, W. Spectral analysis and mapping of blackgrass weed by leveraging machine learning and UAV multispectral imagery. Comput. Electron. Agric. 2022, 192, 106621. [Google Scholar] [CrossRef]
  93. de Camargo, T.; Schirrmann, M.; Landwehr, N.; Dammer, K.; Pflanz, M. Optimized deep learning model as a basis for fast UAV mapping of weed species in winter wheat crops. Remote Sens. 2021, 13, 1704. [Google Scholar] [CrossRef]
  94. Sonobe, R.; Yamaya, Y.; Tani, H.; Wang, X.; Kobayashi, N.; Mochizuki, K. Assessing the suitability of data from Sentinel-1A and 2A for crop classification. GISci. Remote Sens. 2017, 54, 918–938. [Google Scholar] [CrossRef]
  95. Anderegg, J.; Tschurr, F.; Kirchgessner, N.; Treier, S.; Schmucki, M.; Streit, B.; Walter, A. On-farm evaluation of UAV-based aerial imagery for season-long weed monitoring under contrasting management and pedoclimatic conditions in wheat. Comput. Electron. Agric. 2023, 204, 107558. [Google Scholar] [CrossRef]
  96. Fraccaro, P.; Butt, J.; Edwards, B.; Freckleton, R.P.; Childs, D.Z.; Reusch, K.; Comont, D. A deep learning application to map weed spatial extent from unmanned aerial vehicles imagery. Remote Sens. 2022, 14, 4197. [Google Scholar] [CrossRef]
  97. Lambert, J.P.T.; Hicks, H.L.; Childs, D.Z.; Freckleton, R.P. Evaluating the potential of Unmanned Aerial Systems for mapping weeds at field scales: A case study with Alopecurus myosuroides. Weed Res. 2018, 58, 35–45. [Google Scholar] [CrossRef]
  98. Lambert, J.P.T.; Childs, D.Z.; Freckleton, R.P. Testing the ability of unmanned aerial systems and machine learning to map weeds at subfield scales: A test with the weed Alopecurus myosuroides (Huds). Pest Manag. Sci. 2019, 75, 2283–2294. [Google Scholar] [CrossRef] [PubMed]
  99. Castaldi, F.; Pelosi, F.; Pascucci, S.; Casa, R. Assessing the potential of images from unmanned aerial vehicles (UAV) to support herbicide patch spraying in maize. Precis. Agric. 2017, 18, 76–94. [Google Scholar] [CrossRef]
  100. Gao, J.; Liao, W.; Nuyttens, D.; Lootens, P.; Vangeyte, J.; Pižurica, A.; He, Y.; Pieters, J.G. Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery. Int. J. Appl. Earth Obs. 2018, 67, 43–53. [Google Scholar] [CrossRef]
  101. Garcia-Ruiz, F.J.; Wulfsohn, D.; Rasmussen, J. Sugar beet (Beta vulgaris L.) and thistle (Cirsium arvensis L.) discrimination based on field spectral data. Biosyst. Eng. 2015, 139, 1–15. [Google Scholar] [CrossRef]
  102. Sa, I.; Popovic, M.; Khanna, R.; Chen, Z.; Lottes, P.; Liebisch, F.; Nieto, J.; Stachniss, C.; Walter, A.; Siegwart, R. WeedMap: A large-scale semantic weed mapping framework using aerial multispectral imaging and deep neural network for precision farming. Remote Sens. 2018, 10, 1423. [Google Scholar] [CrossRef]
  103. Zou, K.; Chen, X.; Zhang, F.; Zhou, H.; Zhang, C. A field weed density evaluation method based on UAV imaging and modified U-Net. Remote Sens. 2021, 13, 310. [Google Scholar] [CrossRef]
  104. Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Iqbal, J.; Alam, M. A novel semi-supervised framework for UAV based crop/weed classification. PLoS ONE 2021, 16, e0251008. [Google Scholar] [CrossRef] [PubMed]
  105. Gašparović, M.; Zrinjski, M.; Barković, Đ.; Radočaj, D. An automatic method for weed mapping in oat fields based on UAV imagery. Comput. Electron. Agric. 2020, 173, 105385. [Google Scholar] [CrossRef]
  106. de Castro, A.; Torres-Sánchez, J.; Peña, J.; Jiménez-Brenes, F.; Csillik, O.; López-Granados, F. An automatic random forest-OBIA algorithm for early weed mapping between and within crop rows using UAV imagery. Remote Sens. 2018, 10, 285. [Google Scholar] [CrossRef]
  107. López-Granados, F.; Torres-Sánchez, J.; Serrano-Pérez, A.; de Castro, A.I.; Mesas-Carrascosa, F.J.; Peña, J. Early season weed mapping in sunflower using UAV technology: Variability of herbicide treatment maps against weed thresholds. Precis. Agric. 2016, 17, 183–199. [Google Scholar] [CrossRef]
  108. Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. Selecting patterns and features for between and within crop-row weed mapping using UAV-imagery. Expert. Syst. Appl. 2016, 47, 85–94. [Google Scholar] [CrossRef]
  109. Rozenberg, G.; Kent, R.; Blank, L. Consumer-grade UAV utilized for detecting and analyzing late-season weed spatial distribution patterns in commercial onion fields. Precis. Agric. 2021, 22, 1317–1332. [Google Scholar] [CrossRef]
  110. Genze, N.; Wirth, M.; Schreiner, C.; Ajekwe, R.; Grieb, M.; Grimm, D.G. Improved weed segmentation in UAV imagery of sorghum fields with a combined deblurring segmentation model. Plant Methods 2023, 19, 87. [Google Scholar] [CrossRef]
  111. Zhang, Y.; Slaughter, D.C.; Staab, E.S. Robust hyperspectral vision-based classification for multi-season weed mapping. ISPRS J. Photogramm. Remote Sens. 2012, 69, 65–73. [Google Scholar] [CrossRef]
  112. Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Deng, X.; Zhang, L. A fully convolutional network for weed mapping of unmanned aerial vehicle (UAV) imagery. PLoS ONE 2018, 13, e0196302. [Google Scholar] [CrossRef] [PubMed]
  113. Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Deng, X.; Wen, S.; Zhang, H.; Zhang, Y. Accurate weed mapping and prescription map generation based on fully convolutional networks using UAV imagery. Sensors 2018, 18, 3299. [Google Scholar] [CrossRef]
  114. de Castro, A.I.; Jurado-Expósito, M.; Peña-Barragán, J.M.; López-Granados, F. Airborne multi-spectral imagery for mapping cruciferous weeds in cereal and legume crops. Precis. Agric. 2012, 13, 302–321. [Google Scholar] [CrossRef]
  115. de Castro, A.I.; López-Granados, F.; Jurado-Expósito, M. Broad-scale cruciferous weed patch classification in winter wheat using QuickBird imagery for in-season site-specific control. Precis. Agric. 2013, 14, 392–413. [Google Scholar] [CrossRef]
  116. Castillejo-González, I.L.; Peña-Barragán, J.M.; Jurado-Expósito, M.; Mesas-Carrascosa, F.J.; López-Granados, F. Evaluation of pixel- and object-based approaches for mapping wild oat (Avena sterilis) weed patches in wheat fields using QuickBird imagery for site-specific management. Eur. J. Agron. 2014, 59, 57–66. [Google Scholar] [CrossRef]
  117. Revathy, R.; Setia, R.; Jain, S.; Das, S.; Gupta, S.; Pateriya, B. Classification of potato in Indian Punjab using time-series sentinel-2 images. In Artificial Intelligence and Machine Learning in Satellite Data Processing and Services; Lecture Notes in Electrical Engineering; Springer Nature: Singapore, 2023; Volume 970, pp. 193–201. [Google Scholar]
  118. Mudereri, B.T.; Dube, T.; Niassy, S.; Kimathi, E.; Landmann, T.; Khan, Z.; Abdel-Rahman, E.M. Is it possible to discern Striga weed (Striga hermonthica) infestation levels in maize agro-ecological systems using in-situ spectroscopy? Int. J. Appl. Earth Obs. 2020, 85, 102008. [Google Scholar] [CrossRef]
  119. Mudereri, B.T.; Abdel-Rahman, E.M.; Dube, T.; Niassy, S.; Khan, Z.; Tonnang, H.E.Z.; Landmann, T. A two-step approach for detecting Striga in a complex agroecological system using Sentinel-2 data. Sci. Total Environ. 2021, 762, 143151. [Google Scholar] [CrossRef] [PubMed]
  120. Mkhize, Y.; Madonsela, S.; Cho, M.; Nondlazi, B.; Main, R.; Ramoelo, A. Mapping weed infestation in maize fields using Sentinel-2 data. Phys. Chem. Earth Parts A/B/C 2024, 134, 103571. [Google Scholar] [CrossRef]
  121. Mudereri, B.T.; Dube, T.; Abdel-Rahman, E.M.; Niassy, S.; Kimathi, E.; Khan, Z.; Landmann, T. A comparative analysis of PlanetScope and Sentinel-2 space-borne sensors in mapping Striga weed using guided regularised random forest classification ensemble. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-2/W13, 701–708. [Google Scholar] [CrossRef]
  122. He, J.; Zang, Y.; Luo, X.; Zhao, R.; He, J.; Jiao, J. Visual detection of rice rows based on Bayesian decision theory and robust regression least squares method. Int. J. Agric. Biol. Eng. 2021, 14, 199–206. [Google Scholar] [CrossRef]
  123. Wang, S.; Su, D.; Jiang, Y.; Tan, Y.; Qiao, Y.; Yang, S.; Feng, Y.; Hu, N. Fusing vegetation index and ridge segmentation for robust vision based autonomous navigation of agricultural robots in vegetable farms. Comput. Electron. Agric. 2023, 213, 108235. [Google Scholar] [CrossRef]
  124. Suh, H.K.; Hofstee, J.W.; IJsselmuiden, J.; van Henten, E.J. Sugar beet and volunteer potato classification using Bag-of-Visual-Words model, Scale-Invariant Feature Transform, or Speeded Up Robust Feature descriptors and crop row information. Biosyst. Eng. 2018, 166, 210–226. [Google Scholar] [CrossRef]
  125. Bah, M.; Hafiane, A.; Canals, R. Deep learning with unsupervised data labeling for weed detection in line crops in UAV images. Remote Sens. 2018, 10, 1690. [Google Scholar] [CrossRef]
  126. Shi, J.; Bai, Y.; Diao, Z.; Zhou, J.; Yao, X.; Zhang, B. Row detection-based navigation and guidance for agricultural robots and autonomous vehicles in row-crop fields: Methods and applications. Agronomy 2023, 13, 1780. [Google Scholar] [CrossRef]
  127. Ronchetti, G.; Mayer, A.; Facchi, A.; Ortuani, B.; Sona, G. Crop row detection through UAV surveys to optimize on-farm irrigation management. Remote Sens. 2020, 12, 1967. [Google Scholar] [CrossRef]
  128. Bah, M.D.; Hafiane, A.; Canals, R. CRowNet: Deep network for crop row detection in UAV images. IEEE Access 2020, 8, 5189–5200. [Google Scholar] [CrossRef]
  129. de Silva, R.; Cielniak, G.; Gao, J. Vision based crop row navigation under varying field conditions in arable fields. Comput. Electron. Agric. 2024, 217, 108581. [Google Scholar] [CrossRef]
  130. Han, C.J.; Zheng, K.; Zhao, X.G.; Zheng, S.Y.; Fu, H.; Zhai, C.Y. Design and experiment of row identification and row-oriented spray control system for field cabbage crops. Trans. Chin. Soc. Agric. Mach. 2022, 53, 89–101. [Google Scholar]
  131. de Silva, R.; Cielniak, G.; Wang, G.; Gao, J. Deep learning-based crop row detection for infield navigation of agri-robots. J. Field Robot. 2023, 40, 1–23. [Google Scholar] [CrossRef]
  132. Winterhalter, W.; Fleckenstein, F.V.; Dornhege, C.; Burgard, W. Crop row detection on tiny plants with the pattern hough transform. IEEE Robot. Autom. Let. 2018, 3, 3394–3401. [Google Scholar] [CrossRef]
  133. Winterhalter, W.; Fleckenstein, F.; Dornhege, C.; Burgard, W. Localization for precision navigation in agricultural fields—Beyond crop row following. J. Field Robot. 2021, 38, 429–451. [Google Scholar] [CrossRef]
  134. Chen, J.; Qiang, H.; Wu, J.; Xu, G.; Wang, Z. Navigation path extraction for greenhouse cucumber-picking robots using the prediction-point Hough transform. Comput. Electron. Agric. 2021, 180, 105911. [Google Scholar] [CrossRef]
  135. Cruz Ulloa, C.; Krus, A.; Barrientos, A.; Cerro, J.D.; Valero, C. Robotic fertilization in strip cropping using a cnn vegetables detection-characterization method. Comput. Electron. Agric. 2022, 193, 106684. [Google Scholar] [CrossRef]
  136. Wendel, A.; Underwood, J. Self-supervised weed detection in vegetable crops using ground based hyperspectral imaging. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 5128–5135. [Google Scholar]
  137. Ospina, R.; Noguchi, N. Simultaneous mapping and crop row detection by fusing data from wide angle and telephoto images. Comput. Electron. Agric. 2019, 162, 602–612. [Google Scholar] [CrossRef]
  138. Tian, Z.; Junfang, X.; Gang, W.; Jianbo, Z. Automatic navigation path detection method for tillage machines working on high crop stubble fields based on machine vision. Int. J. Agric. Biol. Eng. 2014, 7, 29. [Google Scholar]
  139. Shi, J.; Bai, Y.; Zhou, J.; Zhang, B. Multi-crop navigation line extraction based on improved YOLO-v8 and threshold-DBSCAN under complex agricultural environments. Agriculture 2024, 14, 45. [Google Scholar] [CrossRef]
  140. Yang, R.; Zhai, Y.; Zhang, J.; Zhang, H.; Tian, G.; Huang, P.; Li, L. Potato visual navigation line detection based on deep learning and feature midpoint adaptation. Agriculture 2022, 12, 1363. [Google Scholar] [CrossRef]
  141. Louargant, M.; Jones, G.; Faroux, R.; Paoli, J.; Maillot, T.; Gée, C.; Villette, S. Unsupervised classification algorithm for early weed detection in row-crops by combining spatial and spectral information. Remote Sens. 2018, 10, 718–761. [Google Scholar] [CrossRef]
  142. Bah, M.D.; Hafiane, A.; Canals, R. Hierarchical graph representation for unsupervised crop row detection in images. Expert. Syst. Appl. 2023, 216, 119478. [Google Scholar] [CrossRef]
  143. Zhao, R.; Yuan, X.; Yang, Z.; Zhang, L. Image-based crop row detection utilizing the Hough transform and DBSCAN clustering analysis. IET Image Process 2024, 18, 1161–1177. [Google Scholar] [CrossRef]
  144. Rabab, S.; Badenhorst, P.; Chen, Y.P.; Daetwyler, H.D. A template-free machine vision-based crop row detection algorithm. Precis. Agric. 2021, 22, 124–153. [Google Scholar] [CrossRef]
  145. Vidović, I.; Cupec, R.; Hocenski, Ž. Crop row detection by global energy minimization. Pattern Recogn. 2016, 55, 68–86. [Google Scholar] [CrossRef]
  146. Wang, A.C.; Zhang, M.; Liu, Q.S.; Wang, L.L.; Wei, X.H. Seedling crop row extraction method based on regional growth and mean shift clustering. Trans. Chin. Soc. Agric. Eng. (Trans. CSAE) 2021, 37, 202–210. [Google Scholar]
  147. Chen, Z.W.; Li, W.; Zhang, W.Q.; Li, Y.W.; Li, M.S.; Li, H. Vegetable crop row extraction method based on accumulation threshold of Hough Transformation. Trans. Chin. Soc. Agric. Eng. (Trans. CSAE) 2019, 35, 314–322. [Google Scholar]
  148. Hamuda, E.; Mc Ginley, B.; Glavin, M.; Jones, E. Improved image processing-based crop detection using Kalman filtering and the Hungarian algorithm. Comput. Electron. Agric. 2018, 148, 37–44. [Google Scholar] [CrossRef]
  149. Singh, K.; Agrawal, K.N.; Bora, G.C. Advanced techniques for Weed and crop identification for site specific Weed management. Biosyst. Eng. 2011, 109, 52–64. [Google Scholar] [CrossRef]
  150. Saleem, M.H.; Potgieter, J.; Arif, K.M. Automation in agriculture by machine and deep learning techniques: A review of recent developments. Precis. Agric. 2021, 22, 2053–2091. [Google Scholar] [CrossRef]
  151. Wu, Z.; Chen, Y.; Zhao, B.; Kang, X.; Ding, Y. Review of weed detection methods based on computer vision. Sensors 2021, 21, 3647. [Google Scholar] [CrossRef] [PubMed]
  152. Al-Badri, A.H.; Ismail, N.A.; Al-Dulaimi, K.; Salman, G.A.; Khan, A.R.; Al-Sabaawi, A.; Salam, M.S.H. Classification of weed using machine learning techniques: A review—Challenges, current and future potential techniques. J. Plant Dis. Protect 2022, 129, 745–768. [Google Scholar] [CrossRef]
  153. Wang, A.; Zhang, W.; Wei, X. A review on weed detection using ground-based machine vision and image processing techniques. Comput. Electron. Agric. 2019, 158, 226–240. [Google Scholar] [CrossRef]
  154. Hu, W.; Wane, S.O.; Zhu, J.; Li, D.; Zhang, Q.; Bie, X.; Lan, Y. Review of deep learning-based weed identification in crop fields. Int. J. Agric. Biol. Eng. 2023, 16, 1–10. [Google Scholar] [CrossRef]
  155. Hu, K.; Wang, Z.; Coleman, G.; Bender, A.; Yao, T.; Zeng, S.; Song, D.; Schumann, A.; Walsh, M. Deep Learning Techniques for in-Crop Weed Identification: A Review; Cornell University Library: Ithaca, NY, USA, 2024. [Google Scholar]
  156. Qu, H.; Su, W. Deep learning-based weed–crop recognition for smart agricultural equipment: A review. Agronomy 2024, 14, 363. [Google Scholar] [CrossRef]
  157. Rico-Fernández, M.P.; Rios-Cabrera, R.; Castelán, M.; Guerrero-Reyes, H.I.; Juarez-Maldonado, A. A contextualized approach for segmentation of foliage in different crop species. Comput. Electron. Agric. 2019, 156, 378–386. [Google Scholar] [CrossRef]
  158. Kamath, R.; Balachandra, M.; Prabhu, S. Crop and weed discrimination using Laws’ texture masks. Int. J. Agric. Biol. Eng. 2020, 13, 191–197. [Google Scholar] [CrossRef]
  159. Liu, B.; Li, R.; Li, H.; You, G.; Yan, S.; Tong, Q. Crop/weed discrimination using a field imaging spectrometer system. Sensors 2019, 19, 5154. [Google Scholar] [CrossRef] [PubMed]
  160. Kazmi, W.; Garcia-Ruiz, F.; Nielsen, J.; Rasmussen, J.; Andersen, H.J. Exploiting affine invariant regions and leaf edge shapes for weed detection. Comput. Electron. Agric. 2015, 118, 290–299. [Google Scholar] [CrossRef]
  161. Lottes, P.; Hörferlin, M.; Sander, S.; Stachniss, C. Effective vision-based classification for separating sugar beets and weeds for precision farming. J. Field Robot. 2017, 34, 1160–1178. [Google Scholar] [CrossRef]
  162. Pulido-Rojas, C.A.; Molina-Villa, M.A.; Solaque-Guzmán, L.E. Machine vision system for weed detection using image filtering in vegetables crops. Rev. Fac. Ing. Univ. Antioq. 2016, 80, 124–130. [Google Scholar] [CrossRef]
  163. Zhang, L.; Zhang, Z.; Wu, C.; Sun, L. Segmentation algorithm for overlap recognition of seedling lettuce and weeds based on SVM and image blocking. Comput. Electron. Agric. 2022, 201, 107284. [Google Scholar] [CrossRef]
  164. Espejo-Garcia, B.; Mylonas, N.; Athanasakos, L.; Fountas, S.; Vasilakoglou, I. Towards weeds identification assistance through transfer learning. Comput. Electron. Agric. 2020, 171, 105306. [Google Scholar] [CrossRef]
  165. Gai, J.; Tang, L.; Steward, B.L. Automated crop plant detection based on the fusion of colour and depth images for robotic weed control. J. Field Robot. 2020, 37, 35–52. [Google Scholar] [CrossRef]
  166. Pallottino, F.; Menesatti, P.; Figorilli, S.; Antonucci, F.; Tomasone, R.; Colantoni, A.; Costa, C. Machine vision retrofit system for mechanical weed control in precision agriculture applications. Sustainability 2018, 10, 2209. [Google Scholar] [CrossRef]
  167. Li, N.; Zhang, C.; Chen, Z.; Ma, Z.; Sun, Z.; Yuan, T.; Li, W.; Zhang, J. Crop positioning for robotic intra-row weeding based on machine vision. Int. J. Agric. Biol. Eng. 2015, 8, 20. [Google Scholar] [CrossRef]
  168. Jin, X.; Sun, Y.; Che, J.; Bagavathiannan, M.; Yu, J.; Chen, Y. A novel deep learning-based method for detection of weeds in vegetables. Pest Manag. Sci. 2022, 78, 1861–1869. [Google Scholar] [CrossRef]
  169. Ma, Z.; Wang, G.; Yao, J.; Huang, D.; Tan, H.; Jia, H.; Zou, Z. An improved U-Net model based on multi-scale input and attention mechanism: Application for recognition of Chinese cabbage and weed. Sustainability 2023, 15, 5764. [Google Scholar] [CrossRef]
  170. Hussain, N.; Farooque, A.A.; Schumann, A.W.; Abbas, F.; Acharya, B.; McKenzie-Gopsill, A.; Barrett, R.; Afzaal, H.; Zaman, Q.U.; Cheema, M.J.M. Application of deep learning to detect Lamb’s quarters (Chenopodium album L.) in potato fields of Atlantic Canada. Comput. Electron. Agric. 2021, 182, 106040. [Google Scholar] [CrossRef]
  171. Sabzi, S.; Abbaspour-Gilandeh, Y.; Arribas, J.I. An automatic visible-range video weed detection, segmentation and classification prototype in potato field. Heliyon 2020, 6, e03685. [Google Scholar] [CrossRef]
  172. Zhao, J.; Tian, G.; Qiu, C.; Gu, B.; Zheng, K.; Liu, Q. Weed detection in potato fields based on improved YOLOv4: Optimal speed and accuracy of weed detection in potato fields. Electronics 2022, 11, 3709. [Google Scholar] [CrossRef]
  173. Abouzahir, S.; Sadik, M.; Sabir, E. Bag-of-visual-words-augmented Histogram of Oriented Gradients for efficient weed detection. Biosyst. Eng. 2021, 202, 179–194. [Google Scholar] [CrossRef]
  174. Nnadozie, E.C.; Iloanusi, O.; Ani, O.; Yu, K. Cassava detection from UAV images using YOLOv5 object detection model: Towards weed control in a cassava farm. BioRxiv 2022, 2011–2022. [Google Scholar] [CrossRef]
  175. Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Anwar, S. Deep learning-based identification system of weeds and crops in strawberry and pea fields for a precision agriculture sprayer. Precis. Agric. 2021, 22, 1711–1727. [Google Scholar] [CrossRef]
  176. Sun, H.; Liu, T.; Wang, J.; Zhai, D.; Yu, J. Evaluation of two deep learning-based approaches for detecting weeds growing in cabbage. Pest Manag. Sci. 2024, 80, 2817–2826. [Google Scholar] [CrossRef]
  177. Suh, H.K.; IJsselmuiden, J.; Hofstee, J.W.; van Henten, E.J. Transfer learning for the classification of sugar beet and volunteer potato under field conditions. Biosyst. Eng. 2018, 174, 50–65. [Google Scholar] [CrossRef]
  178. Ruigrok, T.; van Henten, E.; Booij, J.; van Boheemen, K.; Kootstra, G. Application-specific evaluation of a weed-detection algorithm for plant-specific spraying. Sensors 2020, 20, 7262. [Google Scholar] [CrossRef]
  179. Moazzam, S.I.; Khan, U.S.; Qureshi, W.S.; Tiwana, M.I.; Rashid, N.; Alasmary, W.S.; Iqbal, J.; Hamza, A. A patch-image based classification approach for detection of weeds in sugar beet crop. IEEE Access 2021, 9, 121698–121715. [Google Scholar] [CrossRef]
  180. Bakhshipour, A.; Jafari, A. Evaluation of support vector machine and artificial neural networks in weed detection using shape features. Comput. Electron. Agric. 2018, 145, 153–160. [Google Scholar] [CrossRef]
  181. Zhang, C.; Liu, J.; Li, H.; Chen, H.; Xu, Z.; Ou, Z. Weed detection method based on lightweight and contextual information fusion. Appl. Sci. 2023, 13, 13074. [Google Scholar] [CrossRef]
  182. Guo, Z.; Goh, H.H.; Li, X.; Zhang, M.; Li, Y. WeedNet-R: A sugar beet field weed detection algorithm based on enhanced RetinaNet and context semantic fusion. Front. Plant Sci. 2023, 14, 1226329. [Google Scholar] [CrossRef] [PubMed]
  183. Wang, A.; Peng, T.; Cao, H.; Xu, Y.; Wei, X.; Cui, B. TIA-YOLOv5: An improved YOLOv5 network for real-time detection of crop and weed in the field. Front. Plant Sci. 2022, 13, 1091655. [Google Scholar] [CrossRef] [PubMed]
  184. Jin, X.; Che, J.; Chen, Y. Weed identification using deep learning and image processing in vegetable plantation. IEEE Access 2021, 9, 10940–10950. [Google Scholar] [CrossRef]
  185. López-Correa, J.M.; Moreno, H.; Ribeiro, A.; Andújar, D. Intelligent weed management based on object detection neural networks in tomato crops. Agronomy 2022, 12, 2953. [Google Scholar] [CrossRef]
  186. Bender, A.; Whelan, B.; Sukkarieh, S. A high-resolution, multimodal data set for agricultural robotics: A Ladybird's-eye view of Brassica. J. Field Robot. 2020, 37, 73–96. [Google Scholar] [CrossRef]
  187. Moreno, H.; Gómez, A.; Altares-López, S.; Ribeiro, A.; Andújar, D. Analysis of Stable Diffusion-derived fake weeds performance for training Convolutional Neural Networks. Comput. Electron. Agric. 2023, 214, 108324. [Google Scholar] [CrossRef]
  188. Patel, J.; Ruparelia, A.; Tanwar, S.; Alqahtani, F.; Tolba, A.; Sharma, R.; Raboaca, M.S.; Neagu, B.C. Deep learning-based model for detection of brinjal weed in the era of precision agriculture. Comput. Mater. Contin. 2023, 77, 1281–1301. [Google Scholar] [CrossRef]
  189. Gallo, I.; Rehman, A.U.; Dehkordi, R.H.; Landro, N.; La Grassa, R.; Boschetti, M. Deep object detection of crop weeds: Performance of YOLOv7 on a real case dataset from UAV images. Remote Sens. 2023, 15, 539. [Google Scholar] [CrossRef]
  190. Fatima, H.S.; Ul Hassan, I.; Hasan, S.; Khurram, M.; Stricker, D.; Afzal, M.Z. Formation of a lightweight, deep learning-based weed detection system for a commercial autonomous laser weeding robot. Appl. Sci. 2023, 13, 3997. [Google Scholar] [CrossRef]
  191. Sharpe, S.M.; Schumann, A.W.; Boyd, N.S. Goosegrass detection in strawberry and tomato using a convolutional neural network. Sci. Rep. 2020, 10, 9548. [Google Scholar] [CrossRef] [PubMed]
  192. Albraikan, A.A.; Aljebreen, M.; Alzahrani, J.S.; Othman, M.; Mohammed, G.P.; Ibrahim Alsaid, M. Modified barnacles mating optimization with deep learning based weed detection model for smart agriculture. Appl. Sci. 2022, 12, 12828. [Google Scholar] [CrossRef]
  193. Janneh, L.L.; Zhang, Y.; Cui, Z.; Yang, Y. Multi-level feature re-weighted fusion for the semantic segmentation of crops and weeds. J. King Saud Univ.—Comput. Inf. Sci. 2023, 35, 101545. [Google Scholar] [CrossRef]
  194. Madanan, M.; Muthukumaran, N.; Tiwari, S.; Vijay, A.; Saha, I. RSA based improved YOLOv3 network for segmentation and detection of weed species. Multimed. Tools Appl. 2024, 83, 34913–34942. [Google Scholar] [CrossRef]
  195. Reedha, R.; Dericquebourg, E.; Canals, R.; Hafiane, A. Transformer neural network for weed and crop classification of high resolution UAV images. Remote Sens. 2022, 14, 592. [Google Scholar] [CrossRef]
  196. Ying, B.; Xu, Y.; Zhang, S.; Shi, Y.; Liu, L. Weed detection in images of carrot fields based on improved YOLO v4. Trait. Signal 2021, 38, 341–348. [Google Scholar] [CrossRef]
  197. Hamuda, E.; Mc Ginley, B.; Glavin, M.; Jones, E. Automatic crop detection under field conditions using the HSV colour space and morphological operations. Comput. Electron. Agric. 2017, 133, 97–107. [Google Scholar] [CrossRef]
  198. Rai, N.; Zhang, Y.; Ram, B.G.; Schumacher, L.; Yellavajjala, R.K.; Bajwa, S.; Sun, X. Applications of deep learning in precision weed management: A review. Comput. Electron. Agric. 2023, 206, 107698. [Google Scholar] [CrossRef]
  199. Su, W. Advanced machine learning in point spectroscopy, RGB and hyperspectral-imaging for automatic discriminations of crops and weeds: A review. Smart Cities 2020, 3, 767–792. [Google Scholar] [CrossRef]
  200. Deng, B.; Lu, Y.; Xu, J. Weed database development: An updated survey of public weed datasets and cross-season weed detection adaptation. Ecol. Inform. 2024, 81, 102546. [Google Scholar] [CrossRef]
  201. Available online: https://github.com/vicdxxx/Weed-Datasets-Survey-2023 (accessed on 25 June 2024).
  202. Lu, Y. 2seasonweeddet8: A Two-season, 8-class Dataset for Cross-season Weed Detection Generalization Evaluation. Zenodo, 2024. [CrossRef]
  203. Hamuda, E.; Glavin, M.; Jones, E. A survey of image processing techniques for plant extraction and segmentation in the field. Comput. Electron. Agric. 2016, 125, 184–199. [Google Scholar] [CrossRef]
  204. Hasan, A.S.M.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G.K. A survey of deep learning techniques for weed detection from images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
  205. Pallottino, F.; Antonucci, F.; Costa, C.; Bisaglia, C.; Figorilli, S.; Menesatti, P. Optoelectronic proximal sensing vehicle-mounted technologies in precision agriculture: A review. Comput. Electron. Agric. 2019, 162, 859–873. [Google Scholar] [CrossRef]
  206. Mohammadi, V.; Gouton, P.; Rossé, M.; Katakpe, K.K. Design and development of large-band Dual-MSFA sensor camera for precision agriculture. Sensors 2023, 24, 64. [Google Scholar] [CrossRef]
  207. Tao, T.; Wu, S.; Li, L.; Li, J.; Bao, S.; Wei, X. Design and experiments of weeding teleoperated robot spectral sensor for winter rape and weed identification. Adv. Mech. Eng. 2018, 10, 2072046762. [Google Scholar] [CrossRef]
  208. Duncan, L.; Miller, B.; Shaw, C.; Graebner, R.; Moretti, M.L.; Walter, C.; Selker, J.; Udell, C. Weed Warden: A low-cost weed detection device implemented with spectral triad sensor for agricultural applications. Hardwarex 2022, 11, e00303. [Google Scholar] [CrossRef]
  209. Che Ya, N.N.; Dunwoody, E.; Gupta, M. Assessment of weed classification using hyperspectral reflectance and optimal multispectral UAV imagery. Agronomy 2021, 11, 1435. [Google Scholar] [CrossRef]
  210. Martín, M.P.; Ponce, B.; Echavarría, P.; Dorado, J.; Fernández-Quintanilla, C. Early-season mapping of Johnsongrass (Sorghum halepense), Common Cocklebur (Xanthium strumarium) and Velvetleaf (Abutilon theophrasti) in corn fields using airborne hyperspectral imagery. Agronomy 2023, 13, 528. [Google Scholar] [CrossRef]
  211. Elstone, L.; How, K.Y.; Brodie, S.; Ghazali, M.Z.; Heath, W.P.; Grieve, B. High speed crop and weed identification in lettuce fields for precision weeding. Sensors 2020, 20, 455. [Google Scholar] [CrossRef] [PubMed]
  212. Eddy, P.R.; Smith, A.M.; Hill, B.D.; Peddle, D.R.; Coburn, C.A.; Blackshaw, R.E. Weed and crop discrimination using hyperspectral image data and reduced bandsets. Can. J. Remote Sens. 2014, 39, 481–490. [Google Scholar] [CrossRef]
  213. Longchamps, L.; Panneton, B.; Samson, G.; Leroux, G.D.; Thériault, R. Discrimination of corn, grasses and dicot weeds by their UV-induced fluorescence spectral signature. Precis. Agric. 2010, 11, 181–197. [Google Scholar] [CrossRef]
  214. Panneton, B.; Guillaume, S.; Roger, J.M.; Samson, G. Improved discrimination between monocotyledonous and dicotyledonous plants for weed control based on the blue-green region of ultraviolet-induced fluorescence spectra. Appl. Spectrosc. 2010, 64, 30–36. [Google Scholar] [CrossRef] [PubMed]
  215. Wang, A.; Li, W.; Men, X.; Gao, B.; Xu, Y.; Wei, X. Vegetation detection based on spectral information and development of a low-cost vegetation sensor for selective spraying. Pest Manag. Sci. 2022, 78, 2467–2476. [Google Scholar] [CrossRef]
  216. Wang, A.C.; Gao, B.J.; Zhao, C.J.; Xu, Y.F.; Wang, M.L.; Yan, S.G.; Li, L.; Wei, X.H. Detecting green plants based on fluorescence spectroscopy. Spectrosc. Spectr. Anal. 2022, 42, 788–794. [Google Scholar] [CrossRef]
  217. Wang, P.; Peteinatos, G.; Li, H.; Gerhards, R. Rapid in-season detection of herbicide resistant Alopecurus myosuroides using a mobile fluorescence imaging sensor. Crop Prot. 2016, 89, 170–177. [Google Scholar] [CrossRef]
  218. Lednev, V.N.; Grishin, M.Y.; Sdvizhenskii, P.A.; Kurbanov, R.K.; Litvinov, M.A.; Gudkov, S.V.; Pershin, S.M. Fluorescence mapping of agricultural fields utilizing drone-based LIDAR. Photonics 2022, 9, 963. [Google Scholar] [CrossRef]
  219. Zhao, X.; Zhai, C.; Wang, S.; Dou, H.; Yang, S.; Wang, X.; Chen, L. Sprayer boom height measurement in wheat field using ultrasonic sensor: An exploratory study. Front. Plant Sci. 2022, 13, 1008122. [Google Scholar] [CrossRef] [PubMed]
  220. Wei, Z.; Xue, X.; Salcedo, R.; Zhang, Z.; Gil, E.; Sun, Y.; Li, Q.; Shen, J.; He, Q.; Dou, Q.; et al. Key Technologies for an orchard variable-rate sprayer: Current status and future prospects. Agronomy 2023, 13, 59. [Google Scholar] [CrossRef]
  221. Lin, Y. LiDAR: An important tool for next-generation phenotyping technology of high potential for plant phenomics? Comput. Electron. Agric. 2015, 119, 61–73. [Google Scholar] [CrossRef]
  222. Rivera, G.; Porras, R.; Florencia, R.; Sánchez-Solís, J.P. LiDAR applications in precision agriculture for cultivating crops: A review of recent advances. Comput. Electron. Agric. 2023, 207, 107737. [Google Scholar] [CrossRef]
  223. Krus, A.; van Apeldoorn, D.; Valero, C.; Ramirez, J.J. Acquiring plant features with optical sensing devices in an organic strip-cropping system. Agronomy 2020, 10, 197. [Google Scholar] [CrossRef]
  224. Shahbazi, N.; Ashworth, M.B.; Callow, J.N.; Mian, A.; Beckie, H.J.; Speidel, S.; Nicholls, E.; Flower, K.C. Assessing the capability and potential of LiDAR for weed detection. Sensors 2021, 21, 2328. [Google Scholar] [CrossRef] [PubMed]
  225. Cai, S.; Gou, W.; Wen, W.; Lu, X.; Fan, J.; Guo, X. Design and development of a low-cost UGV 3D phenotyping platform with integrated LiDAR and electric slide rail. Plants 2023, 12, 483. [Google Scholar] [CrossRef]
  226. Reiser, D.; Vázquez-Arellano, M.; Paraforos, D.S.; Garrido-Izard, M.; Griepentrog, H.W. Iterative individual plant clustering in maize with assembled 2D LiDAR data. Comput. Ind. 2018, 99, 42–52. [Google Scholar] [CrossRef]
  227. Forero, M.G.; Murcia, H.F.; Méndez, D.; Betancourt-Lozano, J. LiDAR platform for acquisition of 3D plant phenotyping database. Plants 2022, 11, 2199. [Google Scholar] [CrossRef]
  228. Jayakumari, R.; Nidamanuri, R.R.; Ramiya, A.M. Object-level classification of vegetable crops in 3D LiDAR point cloud using deep learning convolutional neural networks. Precis. Agric. 2021, 22, 1617–1633. [Google Scholar] [CrossRef]
  229. Martínez-Guanter, J.; Garrido-Izard, M.; Valero, C.; Slaughter, D.C.; Pérez-Ruiz, M. Optical sensing to determine tomato plant spacing for precise agrochemical application: Two scenarios. Sensors 2017, 17, 1096. [Google Scholar] [CrossRef]
  230. Guo, Q.; Wu, F.; Pang, S.; Zhao, X.; Chen, L.; Liu, J.; Xue, B.; Xu, G.; Li, L.; Jing, H.; et al. Crop 3D—A LiDAR based platform for 3D high-throughput crop phenotyping. Sci. China Life Sci. 2018, 61, 328–339. [Google Scholar] [CrossRef] [PubMed]
  231. Zhang, F.; Hassanzadeh, A.; Kikkert, J.; Pethybridge, S.J.; van Aardt, J. Evaluation of leaf area index (LAI) of broadacre crops using UAS-based LiDAR point clouds and multispectral imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 4027–4044. [Google Scholar] [CrossRef]
  232. Weis, M.; Andújar, D.; Peteinatos, G.G.; Gerhards, R. Improving the determination of plant characteristics by fusion of four different sensors. In Precision Agriculture’13; Wageningen Academic Publishers: Wageningen, The Netherlands, 2013; pp. 63–69. [Google Scholar]
  233. Maldaner, L.F.; Molin, J.P.; Canata, T.F.; Martello, M. A system for plant detection using sensor fusion approach based on machine learning model. Comput. Electron. Agric. 2021, 189, 106382. [Google Scholar] [CrossRef]
  234. Wang, G.; Huang, D.; Zhou, D.; Liu, H.; Qu, M.; Ma, Z. Maize (Zea mays L.) seedling detection based on the fusion of a modified deep learning model and a novel Lidar points projecting strategy. Int. J. Agric. Biol. Eng. 2022, 15, 172–180. [Google Scholar] [CrossRef]
  235. Liu, X.; Bo, Y. Object-based crop species classification based on the combination of airborne hyperspectral images and LiDAR data. Remote Sens. 2015, 7, 922–950. [Google Scholar] [CrossRef]
  236. Chen, B.; Shi, S.; Gong, W.; Xu, Q.; Tang, X.; Bi, S.; Chen, B. Wavelength selection of dual-mechanism LiDAR with reflection and fluorescence spectra for plant detection. Opt. Express 2023, 31, 3660. [Google Scholar] [CrossRef] [PubMed]
  237. Su, W. Crop plant signaling for real-time plant identification in smart farm: A systematic review and new concept in artificial intelligence for automated weed control. Artif. Intell. Agric. 2020, 4, 262–271. [Google Scholar] [CrossRef]
  238. Jiang, B.; Zhang, H.Y.; Su, W.H. Automatic localization of soybean seedlings based on crop signaling and multi-view imaging. Sensors 2024, 24, 3066. [Google Scholar] [CrossRef]
  239. Raja, R.; Slaughter, D.C.; Fennimore, S.A.; Nguyen, T.T.; Vuong, V.L.; Sinha, N.; Tourte, L.; Smith, R.F.; Siemens, M.C. Crop signalling: A novel crop recognition technique for robotic weed control. Biosyst. Eng. 2019, 187, 278–291. [Google Scholar] [CrossRef]
  240. Raja, R.; Nguyen, T.T.; Slaughter, D.C.; Fennimore, S.A. Real-time robotic weed knife control system for tomato and lettuce based on geometric appearance of plant labels. Biosyst. Eng. 2020, 194, 152–164. [Google Scholar] [CrossRef]
  241. Raja, R.; Nguyen, T.T.; Vuong, V.L.; Slaughter, D.C.; Fennimore, S.A. RTD-SEPs: Real-time detection of stem emerging points and classification of crop-weed for robotic weed control in producing tomato. Biosyst. Eng. 2020, 195, 152–171. [Google Scholar] [CrossRef]
  242. Su, W.; Sheng, J.; Huang, Q. Development of a three-dimensional plant localization technique for automatic differentiation of soybean from intra-row weeds. Agriculture 2022, 12, 195. [Google Scholar] [CrossRef]
  243. Li, J.; Su, W.; Zhang, H.; Peng, Y. A real-time smart sensing system for automatic localization and recognition of vegetable plants for weed control. Front. Plant Sci. 2023, 14, 1133969. [Google Scholar] [CrossRef]
  244. Su, W.; Fennimore, S.A.; Slaughter, D.C. Computer vision technology for identification of snap bean crops using Systemic Rhodamine B. In Proceedings of the 2019 ASABE Annual International Meeting, Boston, MA, USA, 7–10 July 2019; Paper No. 1900075. [Google Scholar] [CrossRef]
  245. Su, W.; Fennimore, S.A.; Slaughter, D.C. Development of a systemic crop signalling system for automated real-time plant care in vegetable crops. Biosyst. Eng. 2020, 193, 62–74. [Google Scholar] [CrossRef]
  246. Su, W.; Fennimore, S.A.; Slaughter, D.C. Fluorescence imaging for rapid monitoring of translocation behaviour of systemic markers in snap beans for automated crop/weed discrimination. Biosyst. Eng. 2019, 186, 156–167. [Google Scholar] [CrossRef]
  247. Su, W.; Slaughter, D.C.; Fennimore, S.A. Non-destructive evaluation of photostability of crop signaling compounds and dose effects on celery vigor for precision plant identification using computer vision. Comput. Electron. Agric. 2020, 168, 105155. [Google Scholar] [CrossRef]
  248. Fennimore, S.A.; Siemens, M.C. Mechanized Weed Management in Vegetable Crops. In Encyclopedia of Digital Agricultural Technologies; Zhang, Q., Ed.; Springer International Publishing: Cham, Switzerland, 2023; pp. 807–817. [Google Scholar]
  249. Fennimore, S.A.; Slaughter, D.C.; Siemens, M.C.; Leon, R.G.; Saber, M.N. Technology for automation of weed control in specialty crops. Weed Technol. 2016, 30, 823–837. [Google Scholar] [CrossRef]
  250. Fennimore, S.A.; Cutulle, M. Robotic weeders can improve weed control options for specialty crops. Pest Manag. Sci. 2019, 75, 1767–1774. [Google Scholar] [CrossRef] [PubMed]
  251. Gerhards, R.; Risser, P.; Spaeth, M.; Saile, M.; Peteinatos, G. A comparison of seven innovative robotic weeding systems and reference herbicide strategies in sugar beet (Beta vulgaris subsp. vulgaris L.) and rapeseed (Brassica napus L.). Weed Res. 2024, 64, 42–53. [Google Scholar] [CrossRef]
  252. Allmendinger, A.; Spaeth, M.; Saile, M.; Peteinatos, G.G.; Gerhards, R. Precision chemical weed management strategies: A review and a design of a new CNN-based modular spot sprayer. Agronomy 2022, 12, 1620. [Google Scholar] [CrossRef]
  253. Özlüoymak, Ö.B. Development and assessment of a novel camera-integrated spraying needle nozzle design for targeted micro-dose spraying in precision weed control. Comput. Electron. Agric. 2022, 199, 107134. [Google Scholar] [CrossRef]
  254. Özlüoymak, Ö.B. Design and development of a servo-controlled target-oriented robotic micro-dose spraying system in precision weed control. Semin. Ciências Agrárias 2021, 42, 635–656. [Google Scholar] [CrossRef]
  255. Hussain, N.; Farooque, A.; Schumann, A.; McKenzie-Gopsill, A.; Esau, T.; Abbas, F.; Acharya, B.; Zaman, Q. Design and development of a smart variable rate sprayer using deep learning. Remote Sens. 2020, 12, 4091. [Google Scholar] [CrossRef]
  256. Zhang, X.; Cao, C.; Luo, K.; Wu, Z.; Qin, K.; An, M.; Ding, W.; Xiang, W. Design and operation of a Peucedani Radix weeding device based on YOLOV5 and a parallel manipulator. Front. Plant Sci. 2023, 14, 1171737. [Google Scholar] [CrossRef] [PubMed]
  257. Raja, R.; Slaughter, D.C.; Fennimore, S.A.; Siemens, M.C. Real-time control of high-resolution micro-jet sprayer integrated with machine vision for precision weed control. Biosyst. Eng. 2023, 228, 31–48. [Google Scholar] [CrossRef]
  258. Dammer, K. Real-time variable-rate herbicide application for weed control in carrots. Weed Res. 2016, 56, 237–246. [Google Scholar] [CrossRef]
  259. Utstumo, T.; Urdal, F.; Brevik, A.; Dørum, J.; Netland, J.; Overskeid, Ø.; Berge, T.W.; Gravdahl, J.T. Robotic in-row weed control in vegetables. Comput. Electron. Agric. 2018, 154, 36–45. [Google Scholar] [CrossRef]
  260. Spaeth, M.; Sökefeld, M.; Schwaderer, P.; Gauer, M.E.; Sturm, D.J.; Delatrée, C.C.; Gerhards, R. Smart sprayer a technology for site-specific herbicide application. Crop Prot. 2024, 177, 106564. [Google Scholar] [CrossRef]
  261. Parasca, S.C.; Spaeth, M.; Rusu, T.; Bogdan, I. Mechanical weed control: Sensor-based inter-row hoeing in sugar beet (Beta vulgaris L.) in the transylvanian depression. Agronomy 2024, 14, 176. [Google Scholar] [CrossRef]
  262. Ye, S.; Xue, X.; Si, S.; Xu, Y.; Le, F.; Cui, L.; Jin, Y. Design and testing of an elastic comb reciprocating a soybean plant-to-plant seedling avoidance and weeding device. Agriculture 2023, 13, 2157. [Google Scholar] [CrossRef]
  263. Chang, C.; Xie, B.; Chung, S. Mechanical control with a deep learning method for precise weeding on a farm. Agriculture 2021, 11, 1049. [Google Scholar] [CrossRef]
  264. Quan, L.; Jiang, W.; Li, H.; Li, H.; Wang, Q.; Chen, L. Intelligent intra-row robotic weeding system combining deep learning technology with a targeted weeding mode. Biosyst. Eng. 2022, 216, 13–31. [Google Scholar] [CrossRef]
  265. Fennimore, S.A.; Smith, R.F.; Tourte, L.; LeStrange, M.; Rachuy, J.S. Evaluation and economics of a rotating cultivator in bok choy, celery, lettuce, and radicchio. Weed Technol. 2014, 28, 176–188. [Google Scholar] [CrossRef]
  266. Tillett, N.D.; Hague, T.; Grundy, A.C.; Dedousis, A.P. Mechanical within-row weed control for transplanted crops using computer vision. Biosyst. Eng. 2008, 99, 171–178. [Google Scholar] [CrossRef]
  267. Van Der Weide, R.Y.; Bleeker, P.O.; Achten, V.T.J.M.; Lotz, L.A.P.; Fogelberg, F.; Melander, B. Innovation in mechanical weed control in crop rows. Weed Res. 2008, 48, 215–224. [Google Scholar] [CrossRef]
  268. Lati, R.N.; Rosenfeld, L.; David, I.B.; Bechar, A. Power on! Low-energy electrophysical treatment is an effective new weed control approach. Pest Manag. Sci. 2021, 77, 4138–4147. [Google Scholar] [CrossRef]
  269. Xiong, Y.; Ge, Y.; Liang, Y.; Blackmore, S. Development of a prototype robot and fast path-planning algorithm for static laser weeding. Comput. Electron. Agric. 2017, 142, 494–503. [Google Scholar] [CrossRef]
  270. Young, S.L. Beyond precision weed control: A model for true integration. Weed Technol. 2018, 32, 7–10. [Google Scholar] [CrossRef]
  271. Bawden, O.; Kulk, J.; Russell, R.; McCool, C.; English, A.; Dayoub, F.; Lehnert, C.; Perez, T. Robot for weed species plant-specific management. J. Field Robot. 2017, 34, 1179–1199. [Google Scholar] [CrossRef]
  272. Wu, X.; Aravecchia, S.; Lottes, P.; Stachniss, C.; Pradalier, C. Robotic weed control using automated weed and crop classification. J. Field Robot. 2020, 37, 322–340. [Google Scholar] [CrossRef]
  273. Merfield, C.N. Robotic weeding’s false dawn? Ten requirements for fully autonomous mechanical weed management. Weed Res. 2016, 56, 340–344. [Google Scholar] [CrossRef]
Figure 1. Integrated weed management robots: (a) IWM robot; (b) Agbot II robot; (c) Modular weed control system.
Table 1. Types of vegetables.
| Categories | Description | Examples |
| --- | --- | --- |
| Root vegetables | Vegetables with tuberous roots and tubers as edible portions | Carrots, turnips, root beets, white radishes, potatoes, taro, yams, sweet potatoes, winter bamboo shoots, etc. |
| Leafy vegetables | Vegetables with fresh and tender green leaves, petioles, and tender stems as edible portions | Lettuce, spinach, coriander, water spinach, sunflower, celery, chrysanthemum, amaranth, rapeseed, etc. |
| Cabbages | Vegetables with leaf bulbs, tender stems, flower bulbs, and tender leaf clusters as edible portions | Chinese cabbage, head cabbage, bulbous cabbage, cauliflower, baby cabbage, broccoli, pickled cabbage, head mustard, etc. |
| Fruits and melons | Vegetables with fruit as edible portions | Pumpkin, golden pumpkin, winter melon, bitter gourd, breast gourd, cucumber, luffa, bergamot, Hu gua, gourd, watermelon, tomato, eggplant, etc. |
| Scallions and garlics | Vegetables with bulbs, pseudostems, tubular leaves, or strip-shaped leaves as edible portions | Onions, scallions, garlic, chives, etc. |
| Sprouted vegetables | Edible "sprouts" cultivated from the seeds of various grains, beans, and trees | Pea sprouts, Chinese toon sprouts, radish sprouts, buckwheat sprouts, peanut sprouts, ginger sprouts, soybean sprouts, mung bean sprouts, etc. |
| Beans | Vegetables with bean kernels or tender pods as edible portions | Kidney beans, cowpeas, peas, string beans, knife beans, lentils, green beans, mung beans, broad beans, eyebrow beans, four-winged beans, etc. |
| Aquatic vegetables | Edible vegetables that grow in water | Lotus root, water chestnut, water celery, cigu, water bamboo, etc. |
| Fungi | Fungi that are non-toxic to the human body and can be safely consumed | Black fungus, white fungus, ground fungus, stone fungus, shiitake mushroom, monkey head mushroom, etc. |
Table 2. Research methods and results of global weed detection for some crops.
| Platforms | Crops | Weeds | Sensors | Resolution Ratio (mm·pixel⁻¹) | Recognition Algorithms | Precision | References |
|---|---|---|---|---|---|---|---|
| UAV | Wheat | Broadleaf herba, Gramineae | RGB | 2.7 | Random forest (RF) | 72% | Anderegg et al. [95] |
| | | Gahnia tristis | RGB/Multispectral | 10/30 | UNET-ResNet | >90% | Fraccaro et al. [96] |
| | | | Multispectral | 11.6 | RF | >93% | Su et al. [92] |
| | | Alopecurus myosuroides | RGB, Multispectral | 32 | RF | 87% (RGB); 61% (Multispectral) | Lambert et al. [97] |
| | | | Multispectral | 82.7 | Convolutional neural network (CNN) | 82.5% | Lambert et al. [98] |
| | | Matricaria chamomilla L., Papaver rhoeas L., Veronica hederifolia L., and Viola arvensis ssp. arvensis | RGB | - | Deep residual convolutional neural network (ResNet-18) | 94% | Camargo et al. [93] |
| | Maize | Cyperus rotundus L., Cynodon dactylon L., Malva sylvestris L., Artemisia vulgaris L., Polygonum aviculare L. | Multispectral | 50/80/90 | Support vector machine (SVM) | 61% | Castaldi et al. [99] |
| | | Convolvulus canariensis, Chenopodium album, Digitaria sanguinalis | RGB | 1.78 | Hough transform (HT) and object-based image analysis (OBIA) | 94.5% | Gao et al. [100] |
| | Sugar beet | Cirsium arvensis L. | Multispectral | 5.2 | PLS-DA analysis model | 95% (weeds); 89% (sugar beet) | Garcia-Ruiz et al. [101] |
| | | Galinsoga spec., Amaranthus retroflexus, Atriplex spec., Polygonum spec., Gramineae, Convolvulus arvensis, Stellaria media, Taraxacum spec. | Multispectral | 82 | Improved deep neural network (DNN) | >78% | Sa et al. [102] |
| | Marigold | Setaria viridis, Asclepias syriaca, Cistothorus stellaris | RGB | 5 | U-Net | >93% | Zou et al. [103] |
| | Pea, strawberry | Eleusine indica | RGB | 3 | Semi-supervised generative adversarial network (SGAN) | 90% | Khan et al. [104] |
| | Oat | Chamaemelum nobile, Cirsium arvense | RGB | 35 | RF, K-means | >87% | Gašparović et al. [105] |
| | Cotton, sunflower | Conyza crispa, Fallopia japonica, Chenopodium retusum, Phalaris canariensis, Convolvulus arvensis | RGB | 12–24 | OBIA | 81% (cotton); 84% (sunflower) | de Castro et al. [106] |
| | Sunflower | Chenopodium retusum, Brassica juncea var. megarrhiza, Convolvulus arvensis, Chenopodium album Linn. | RGB, Multispectral | - | OBIA | >85% | López-Granados et al. [107] |
| | Sunflower, maize | Chenopodium retusum, Brassica juncea var. megarrhiza, Convolvulus arvensis, Batis maritima | RGB | 14 | OBIA | 95% (sunflower); 79% (maize) | Pérez-Ortiz et al. [108] |
| | Onion | Cyperus rotundus, Sorghum halepense, Chenopodium album, Xanthium strumarium, Amaranth, Conyza bonariensis L., Solanum L. | RGB | 5 | Maximum likelihood and SVM | >85% | Rozenberg et al. [109] |
| | Sorghum | Chenopodium album L., Cirsium arvense L. | RGB | 1 | Deblurring and segmentation models | 83.7% | Genze et al. [110] |
| | Tomato | Solanum nigrum, Amaranthus retroflexus | Hyperspectral | - | Multiclassifier architecture | 95.8% | Zhang et al. [111] |
| | Rice | Leptochloa chinensis Nees, Cyperus L. | RGB | 3/5 | Fully convolutional network (FCN) | >90% | Huang et al. [112,113] |
| Aviation aircraft | Pea, broad bean, and winter wheat | Diplotaxis spp., Sinapis spp. | RGB, NIR | 250 | Vegetation indices, maximum likelihood | >85% | de Castro et al. [114] |
| QuickBird satellite | Wheat | Brassica napus | Multispectral | 2.4 × 10³ | Vegetation indices, maximum likelihood | >89% | de Castro et al. [115] |
| | | Avena sterilis | Multispectral | 2.4 × 10³ | Maximum likelihood, SVM | >91% | Castillejo-González et al. [116] |
| Sentinel-1A and Sentinel-2A | Beans, beetroot, maize, potato, and wheat | - | Multispectral, MSI | 10/20/60 × 10³ | Kernel-based extreme learning machine (KELM) | 96.8% | Sonobe et al. [94] |
| Sentinel-2 satellite | Potato | Triticum aestivum | Multispectral, NIR, SWIR | - | Unsupervised classification | 86% | Revathy et al. [117] |
| | Maize | Davilla strigosa | Multispectral | 10 × 10³ | Guided regularized random forest (GRRF)/machine learning discriminant | >85% | Mudereri et al. [118,119] |
| | | | | | RF/subpixel multiple endmember spectral mixture analysis (MESMA) | 88% | |
| | | Richardia brasiliensis, Chenopodium album, Cyperus esculentus, Megathyrsus maximus | | 10/20/60 × 10³ | RF | 95% | Mkhize et al. [120] |
| PlanetScope and Sentinel-2 | | Davilla strigosa | | 3/10 × 10³ | GRRF | 92% (PS); 88% (S2) | Mudereri et al. [121] |
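Most of the multispectral pipelines in Table 2 first separate vegetation from soil with a spectral index before the crop/weed classifier runs. A minimal per-pixel sketch of that first stage using NDVI; the band values and the 0.3 cut-off are illustrative assumptions, not taken from the cited studies:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for a single pixel."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

def vegetation_mask(nir_band, red_band, threshold=0.3):
    """Boolean mask: True where NDVI exceeds the (assumed) threshold."""
    return [[ndvi(n, r) > threshold for n, r in zip(nir_row, red_row)]
            for nir_row, red_row in zip(nir_band, red_band)]

# Toy 2x2 reflectance bands: NIR-bright pixels behave like vegetation.
nir = [[0.8, 0.2], [0.7, 0.1]]
red = [[0.1, 0.2], [0.2, 0.1]]
mask = vegetation_mask(nir, red)
```

Real systems apply this over whole raster bands (typically with NumPy) and pass only the masked pixels to the classifier listed in the table.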
Table 3. Research methods and results of crop-rows detection for some vegetables.
| Platforms | Crops | Sensors | Resolution Ratio | Detection Algorithms | Precision | References |
|---|---|---|---|---|---|---|
| UAV | Grapevine, pear, and tomato | RGB | 0.1 m | Thresholding algorithms, classification algorithms, and Bayesian segmentation | >90% | Ronchetti et al. [127] |
| | Sugar beet, maize | Multispectral | 2.5 cm | CRowNet, a model formed with SegNet (S-SegNet) and a CNN-based Hough transform (HoughCNet) | 93.58% | Bah et al. [128] |
| | Spinach, legumes | RGB | 0.35 cm | Hough transform; simple linear iterative clustering (SLIC); ResNet-18 (CNN) | >90% | Bah et al. [125] |
| Robot | Sugar beet, maize | RGB | - | Deep learning (DL) | 90.25% | Silva et al. [129] |
| | Cabbage | RGB | 1920 × 1080 px | Cabbage crop-row localization and multi-row adaptive ROI extraction based on limited-threshold vertical projection | 95.75% | Han et al. [130] |
| | Sugar beet | RGB-D | 1280 × 720 px | Semantic segmentation approach for crop-row detection based on U-Net | 85% | Silva et al. [131] |
| | Sugar beet, canola, and leek | 3D laser and vision camera | - | Pattern Hough transform | - | Winterhalter et al. [132] |
| | Kohlrabi, Chinese cabbage, and pointed cabbage | GNSS and colour camera | 5 megapixel | Localization beyond crop-row following | 83% | Winterhalter et al. [133] |
| | Cucumber | RGB | 5 megapixel | Prediction-point Hough transform algorithm | - | Chen et al. [134] |
| | Cabbage and red cabbage | RGB/Multispectral | 1280 × 960 px | CNN vegetable detection and characterization method | 90.5% | Cruz Ulloa et al. [135] |
| | Maize | Hyperspectral | 2 nm | Self-supervised method | >80% | Wendel et al. [136] |
| Tractor | Soybean | RGB | 1280 × 480 px | Image analysis method | - | Ospinad et al. [137] |
| | Rice, rape, and wheat | RGB | 480 × 640 px | Shear-binary-image algorithm; least-squares method | 96.7% | Zhang et al. [138] |
| Handheld camera/UAV | Cabbage, kohlrabi, and rice | RGB | 4032 × 3024 px | Improved YOLO-v8 and threshold-DBSCAN | 98.9%/97.9%/100% | Shi et al. [139] |
| Mobile phone/moving vehicle | Potato | RGB | 1280 × 720 px | Deep learning (VGG16) and feature midpoint adaptation | >90% | Yang et al. [140] |
| Handheld camera | Sugar beet, maize | Multispectral | 6 mm | Unsupervised classification algorithm based on spatial information (Hough transform) and spectral information | 100% | Louargant et al. [141] |
| Public data | Sugar beet, legumes | Multispectral | - | CNN (autoencoder) and simple linear iterative clustering (SLIC) | 88% | Bah et al. [142] |
| | Corn, celery, potato, onion, sunflower, and soybean | RGB | - | Hough transform and DBSCAN clustering analysis | 63.3% | Zhao et al. [143] |
| | Maize, celery, potato, onion, sunflower, and soybean | RGB | 2560 × 1920 px | New crop-row detection algorithm | 84% | Rabab et al. [144] |
| | | | | New method | 74.7% | Vidović et al. [145] |
| | Garlic, corn, oilseed rape, rice, and wheat | RGB | - | Seedling crop-row extraction method based on region growing and mean-shift clustering | 98.18% | Wang et al. [146] |
| | Lettuce and cabbage | RGB | 768 × 1024 px | Automatic accumulation threshold of Hough transformation; K-means clustering method | 97.1% | Chen et al. [147] |
| | Cauliflower | RGB | 2704 × 1520 px | Kalman filtering and Hungarian algorithm | 99.34% | Hamuda et al. [148] |
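Many of the crop-row detectors in Table 3 are built on the Hough transform: each detected plant centroid votes for every line passing through it in a discretized (theta, rho) accumulator, and the crop row is the line that collects the most votes. A compact sketch of that voting step (accumulator resolution and the toy centroids are illustrative assumptions):

```python
import math

def hough_line(points, theta_steps=180, rho_res=1.0):
    """Vote each point into a discretized (theta, rho) accumulator and
    return the dominant line as (theta_radians, rho)."""
    votes = {}
    for x, y in points:
        for t in range(theta_steps):
            theta = math.pi * t / theta_steps
            # Normal form of a line: rho = x*cos(theta) + y*sin(theta)
            rho = x * math.cos(theta) + y * math.sin(theta)
            key = (t, round(rho / rho_res))
            votes[key] = votes.get(key, 0) + 1
    (t_best, rho_bin), _ = max(votes.items(), key=lambda kv: kv[1])
    return math.pi * t_best / theta_steps, rho_bin * rho_res

# Plant centroids lying exactly on the vertical row x = 5.
row_points = [(5, y) for y in range(0, 50, 10)]
theta, rho = hough_line(row_points)  # vertical line: theta = 0, rho = 5
```

In the cited pipelines the input points come from a vegetation mask or a CNN, and several accumulator peaks are kept to recover multiple parallel rows.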
Table 4. Research methods and results of traditional machine learning for some vegetables.
| Crops | Weeds | Platforms | Sensors | Image Pre-Processing Methods | Image Segmentation Methods | Feature Extraction Methods | Classification Algorithms | Accuracy | Precision | References |
|---|---|---|---|---|---|---|---|---|---|---|
| Carrot, maize, and tomato | - | Robot | RGB | Colour space transformation from RGB to CIE Luv | Based on learning method | Neighbourhood pixel information | SVM | - | 91%/88%/89% | Rico-Fernández et al. [157] |
| Carrot | - | Public data | RGB | Contrast-limited adaptive histogram equalisation (CLAHE); dimensionality reduction | - | Laws' texture masks | RF | 94% | - | Kamath et al. [158] |
| | Impatiens balsamina Linn., Humifuse Euphorbia Herb, Eleusine indica | - | Hyperspectral | Data normalization, soil background removal | Normalized difference vegetation index (NDVI) | Band selection, wavelet transform | Fisher linear discriminant analysis (LDA) and SVM | - | >85% | Liu et al. [159] |
| Sugar beet | Thistle | Ground vehicle | RGB | - | Modified excess green (ExG) index | Local features based on affine-invariant regions and scale-invariant keypoints, colour vegetation indices | Bag-of-visual-words scheme with SVM | 99.07% | - | Kazmi et al. [160] |
| | - | Robot | RGB + NIR | Normalization processing | Near-infrared information | Statistical and shape features | RF | - | >90% | Lottes et al. [161] |
| Spinach | - | Robot | RGB | - | ExG | Image filtering to extract colour and area features | Classification based on area | - | >80% | Pulido-Rojas et al. [162] |
| Lettuce | Chenopodium serotinum, Polygonum lapathifolium | Robot | RGB | - | Based on colour features | Histogram of oriented gradients (HOG), local binary pattern (LBP), and grey-level co-occurrence matrix (GLCM) | SVM and image blocking | - | 94.73% | Zhang et al. [163] |
| Tomato, cotton | Solanum nigrum, Abutilon theophrasti | - | RGB | Random rotation and zooming; Gaussian, Poisson, and salt-and-pepper noise addition; contrast and brightness shifts | Normalized difference vegetation index, Otsu threshold | - | Fine-tuned DenseNet and SVM | - | 99.29% (F1) | Espejo-Garcia et al. [164] |
| Broccoli, lettuce | Bromus, Chenopodium retusum, Chenopodium album Linn., Dysophylla stellata Benth., Echinochloa crusgalli L., Convolvulus arvensis, Impatiens balsamina Linn., Trifolium repens | Remote-controlled ground vehicle | RGB-D | Removal of invalid and noise pixels in point clouds | Colour and depth information | Canopy and leaf features | ML; colour-depth fusion-segmentation algorithm | - | 96.6%/92.4% | Gai et al. [165] |
| | - | Tractor | RGB | Illumination correction for increased uniformity; transformation of colour coordinates from RGB to hue-saturation-value; extraction of the H and S channels; filtering for noise reduction | - | Image area and shape factors | Shape analysis and colourimetric k-nearest neighbour (k-NN) clustering | - | 65–79%/69–96% | Pallottino et al. [166] |
| Lettuce, cauliflower, and maize | - | Tractor | RGB | - | ExG | - | A novel method calculating lateral offset | >95% | - | Li et al. [167] |
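Several pipelines in Table 4 segment plants from soil with the excess green index (ExG = 2g − r − b on chromatic coordinates) followed by a threshold such as Otsu's. A minimal per-pixel sketch, with a fixed illustrative threshold standing in for the Otsu step:

```python
def excess_green(r, g, b):
    """ExG = 2g - r - b on chromatic (sum-normalized) coordinates.
    Returns 0.0 for an all-black pixel to avoid division by zero."""
    total = r + g + b
    if total == 0:
        return 0.0
    r, g, b = r / total, g / total, b / total
    return 2 * g - r - b

def segment(pixels, threshold=0.1):
    """Label each (R, G, B) pixel: True = plant, False = soil/background.
    The fixed threshold is an assumption; the cited works derive it (e.g. Otsu)."""
    return [excess_green(*p) > threshold for p in pixels]

# A green leaf pixel versus a brownish soil pixel.
labels = segment([(40, 180, 50), (120, 90, 60)])
```

The resulting binary mask is what the feature-extraction and classification stages in the table operate on.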
Table 5. Research methods and results of deep learning for some vegetables.
| Crops | Weeds | Platforms | Sensors | Image Pre-Processing Methods | Image Segmentation Methods | Feature Extraction/Data Annotation Methods | Classification Algorithms | Accuracy | Precision | References |
|---|---|---|---|---|---|---|---|---|---|---|
| Chinese white cabbage | - | - | RGB | Data augmentation: brightness, rotation, flips, colour variations, and image definition | ExG | Bounding box annotation | YOLO-v3 | - | 97.1% | Jin et al. [168] |
| Chinese cabbage | - | Mobile trolley | RGB | Gaussian noise, random rotation, random cropping, and random flip | - | Image annotation | Improved U-Net | 98.24% | 94.56% | Ma et al. [169] |
| Potato | Chenopodium album | Smart sprayers | RGB | Adjusting image size | - | Image annotation | DCNNs (GoogLeNet, VGG-16, and EfficientNet), PyTorch/TensorFlow | >90% | >90% | Hussain et al. [170] |
| | Malva neglecta, Portulaca oleracea, Chenopodium album L., Secale cereale L., Xanthium strumarium | Moving platform | RGB | - | Based on threshold | Based on the grey-level co-occurrence matrix (GLCM), colour features, spectral descriptors of texture, moment invariants, and shape features | Hybrid artificial neural network-ant colony (ANN-ACO), hybrid ANN-simulated annealing (ANN-SA), and hybrid ANN-genetic algorithm (ANN-GA) | 98% | - | Sabzi et al. [171] |
| | - | - | RGB | Data enhancement | | Image annotation | Improved YOLOv4 model (MC-YOLOv4) | 98.52% | | Zhao et al. [172] |
| Sugar beet, carrot, soybean | Heuchera americana, Chenopodium album Linn., Chenopodium retusum, Rapistrum rugosum, Mentha requienii, Daucus carota, Poaceae Barnhart, Broadleaf herba | Public data | RGB | - | ExG | Bag-of-visual-words-augmented histogram of oriented gradients | Backpropagation neural network | >90% | >92% | Abouzahir et al. [173] |
| Cassava | - | UAV | RGB | Image-space and colour-space transformations and the novel mosaic data augmentation | - | Bounding box annotation | YOLOv5 | - | 96.7% | Nnadozie et al. [174] |
| Strawberry, pea | Eleusine indica | UAV | RGB | - | - | Bounding box annotation | Faster R-CNN | 94.73% | - | Khan et al. [175] |
| Cabbage | Acalypha australis Linn., Chenopodium album, Humulus scandens, Ivy glorybind, Capsella bursa-pastoris | - | RGB | Cropping | ExG | Bounding box annotation | YOLOv5/YOLOv8 | - | 98.6%/97.3% | Sun et al. [176] |
| Sugar beet | Solanum tuberosum | Robot | RGB | Adjusting image size; data enhancement | - | - | AlexNet, VGG-19, GoogLeNet, ResNet-50, ResNet-101, and Inception-v3 | >90% | - | Suh et al. [177] |
| | | Robot | RGB | Translation, rotation, shear, scale, reflection, saturation, brightness, adjusting image size | - | Image annotation | YOLOv3 | - | 84% | Ruigrok et al. [178] |
| | Fumaria indica | UAV | Multispectral | Small-patch classification | - | Image annotation | VGG-BEET network | - | 93.4% | Moazzam et al. [179] |
| | Chenopodium retusum, Chenopodium album Linn., Eruca vesicaria, Rapistrum rugosum | - | RGB | Removal of duplicate and low-quality images; data annotation using LabelImg | Based on threshold | Fourier descriptors and moment-invariant features | Artificial neural network (ANN)/SVM | - | >92% | Bakhshipour et al. [180] |
| | - | Public data | RGB | - | - | Bounding box annotation | CCCS-YOLO | 79.5% | 81.3% | Zhang et al. [181] |
| | - | Public data | RGB | - | - | Bounding box annotation | WeedNet-R | - | >85% | Guo et al. [182] |
| | - | Public data | RGB | Data augmentation and background segmentation | - | - | TIA-YOLOv5 network | 90% | 80% | Wang et al. [183] |
| Bok choy | - | - | RGB | Colour, brightness, rotation, and image definition; data enhancement | Based on colour | Bounding box annotation | CenterNet | - | 95.6% | Jin et al. [184] |
| Tomato | Monocotyledonous, dicotyledonous | Public data | RGB | Manual labelling; adjusting image size | - | Bounding box annotation | RetinaNet, YOLOv7, and Faster-RCNN | >90% | - | López-Correa et al. [185] |
| Cauliflower and broccoli | - | Robot | RGB | Images corrected for vignetting and flash patterns and downsampled to 1080 × 720 pixels | - | Bounding box annotation | Faster R-CNN architecture with a ResNet-101 feature extractor | - | 94.79%/94.3% | Bender et al. [186] |
| Maize, tomato | Solanum nigrum L., Portulaca oleracea L., Setaria verticillata L. | - | RGB | Data augmentation: scaling, horizontal flips, and random adjustments to exposure and saturation | ExGR | Bounding box annotation | YOLOv8l and RetinaNet algorithms | - | 93% | Moreno et al. [187] |
| Brinjal | - | Public data | RGB | Vertical and horizontal flipping with added noise; adjusting image size; normalization processing | - | Bounding box annotation | ResNet-18, YOLOv3, CenterNet, and Faster RCNN | 88% | 85% | Patel et al. [188] |
| Chicory, sugar beet | Mercury Herb | UAV/Public data | RGB | Marking, cropping | - | Bounding box annotation | YOLOv7 | - | 61.3% | Gallo et al. [189] |
| Okra, bitter gourd, sponge gourd | Conyza canadensis, Paris quadrifolia, Gramineae | Robot | RGB | Scaling | - | Bounding box annotation | YOLOv5 | 88% | 83% | Fatima et al. [190] |
| Strawberry, tomato | Eleusine indica | - | RGB | Annotation of the entire plant and of partial sections of the leaf blade | - | Bounding box annotation | YOLOv3-tiny | 87%/77% | 74%/38% | Sharpe et al. [191] |
| Crops | - | - | RGB | Gabor filtering and short-term Fourier transformation for image enhancement | - | - | Elman neural network (ENN) and modified barnacles mating optimization (MBMO) | 98.99% | 96.13% | Albraikan et al. [192] |
| Carrot, sugar beet, rice | - | Public data | RGB | Data augmentation: random horizontal flip, normalisation, and random Gaussian blur | - | - | Improved deep convolutional neural network (DCNN) algorithms | - | >91% | Janneh et al. [193] |
| Vegetables | - | Public data | RGB | Data augmentation and filtering | - | - | Improved YOLOv3 | 96% | 96% | Madanan et al. [194] |
| Beet, parsley, and spinach | - | UAV | RGB | Data augmentation and normalisation processing | - | Bounding box annotation | Vision transformers (ViT-16) | - | >97% | Reedha et al. [195] |
| Carrot | Crabgrass, Musa spp., Pale persicaria, Cephalanoplos | - | RGB | Data augmentation | - | Bounding box annotation | YOLOv4-weeds | 88.46% | - | Ying et al. [196] |
| Cauliflower | Chenopodium album, Capsella bursa-pastoris, Lemna minor | - | RGB | Reducing image size; Gaussian fuzzy filtering; HSV colour space conversion | Based on colour | Bounding box annotation | PASCAL visual object classes method | - | 99.04% | Hamuda et al. [197] |
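The precision figures reported for the detectors in Table 5 typically count a predicted bounding box as a true positive when its intersection-over-union (IoU) with some ground-truth box exceeds a threshold, commonly 0.5. A minimal sketch of that evaluation (the boxes and threshold here are illustrative, not from any cited study):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def precision(predictions, ground_truth, iou_thr=0.5):
    """Fraction of predicted boxes that match some ground-truth box at IoU >= thr."""
    tp = sum(any(iou(p, g) >= iou_thr for g in ground_truth) for p in predictions)
    return tp / len(predictions) if predictions else 0.0

gt = [(0, 0, 10, 10)]                      # one annotated weed
preds = [(1, 1, 11, 11), (50, 50, 60, 60)]  # one hit, one false alarm
p = precision(preds, gt)                    # 0.5
```

Full benchmarks additionally match each ground-truth box to at most one prediction and sweep confidence thresholds to obtain mAP, which this sketch omits.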
Table 7. Research methods and results of fluorescence detection for some vegetables and weeds.
| Crops | Weeds | Induced Light Source | Collected Spectral Bands | Optimal Spectral Bands | Analysis Methods | Accuracy | References |
|---|---|---|---|---|---|---|---|
| Corn | Monocotyledonous and dicotyledonous weeds | Ultraviolet | 400–760 nm | - | Principal component analysis and linear discriminant analysis | 91.8% | Longchamps et al. [213] |
| | | | | 400–425 nm and 425–490 nm | Partial least squares discriminant analysis (PLS-DA) | >91.7% | Panneton et al. [214] |
| Crop | Weeds, fake leaves, soil, and plant residues | Blue, red, and white light | 650–800, 675–825, and 500–950 nm | 685 nm (blue) and 740 nm (red) | Mean normalisation and standard normal variate transformation (SNV); principal component analysis combined with linear discriminant analysis (PCA-LDA) and SVM | 100% | Wang et al. [215] |
| | Green plants, simulated green plants, and soil | | 650–850 nm | 731.1 nm (white), 730.76 nm (blue), 731.1 nm (red) | Soft independent modelling of class analogy (SIMCA) and linear discriminant analysis (LDA) | >92% | Wang et al. [216] |
| Winter wheat | Herbicide-resistant weeds | Blue light | >680 nm | - | - | 95% | Wang et al. [217] |
| Corn | - | Laser light | 350–820 nm | 680 and 740 nm | - | - | Lednev et al. [218] |
Table 8. Research methods and results of LiDAR detection for some vegetables and weeds.
| Platforms | Crops | Weeds | Sensors | Angle Resolution | Frequency | Analysis Algorithms | Precision | References |
|---|---|---|---|---|---|---|---|---|
| Tractor | Cabbage, leek, potato, and wheat | Leymus chinensis Tzvelev | LMS111 LiDAR | 0.5° | 50 Hz | Based on Euclidean distance | >85% | Krus et al. [223] |
| Fixed bracket | Wheat | Avena fatua, Sonchus oleraceus L. | MRS6000 LiDAR | 0.13° | 10 Hz | Euclidean clustering | 100% | Shahbaz et al. [224] |
| Unmanned ground vehicle | Crisphead lettuce, wild lettuce, romaine, stem lettuce, butterhead lettuce, and loose-leaf lettuce | - | VLP-16 LiDAR | 0.1°–0.4° | - | Random sample consensus (RANSAC), Euclidean clustering, and K-means clustering algorithm | - | Cai et al. [225] |
| Robot | Corn | - | LMS111 2D-LiDAR | - | - | Plant detection with Euclidean clustering (PDEC)/iterative plant clustering method (IPCM) | 73.7%/100% | Reiser et al. [226] |
| Moving platform | - | | SICK LMS4121R-13000 LiDAR | 0.1° | 600 Hz | RF | 89.4% | Forero et al. [227] |
| A movable tripod | Cabbage, tomato, and eggplant | - | Terrestrial laser scanner | - | - | Deep convolutional neural network (CNN) model CropPointNet | 90% (cabbage, eggplant); 70% (tomato) | Jayakumari et al. [228] |
| Tractor | Tomato | - | LMS 111 LiDAR laser scanner | 0.5° | 50 Hz | K-means clustering | 96% | Martinez-Guanter et al. [229] |
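Several of the LiDAR pipelines in Table 8 rely on Euclidean clustering: points closer together than a distance threshold are merged into one plant candidate. A minimal single-linkage flood-fill sketch on 2D points (the eps value and toy coordinates are illustrative assumptions):

```python
def euclidean_clusters(points, eps=0.5):
    """Group 2D points into clusters: a point joins a cluster when it lies
    within eps of any existing member (single-linkage flood fill)."""
    unvisited = list(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop(0)
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unvisited
                    if (points[i][0] - points[j][0]) ** 2 +
                       (points[i][1] - points[j][1]) ** 2 <= eps ** 2]
            for j in near:
                unvisited.remove(j)
            cluster.extend(near)
            frontier.extend(near)
        clusters.append(sorted(cluster))
    return clusters

# Two well-separated plant canopies seen as 2D point clusters.
pts = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 5.0)]
groups = euclidean_clusters(pts, eps=0.3)  # two clusters
```

Production systems use a k-d tree for the neighbour query and work on 3D point clouds, but the grouping logic is the same.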
Table 9. Research methods and results of combination sensors for some vegetables.
| Platforms | Crops | Sensors | Detection Algorithms | Precision | References |
|---|---|---|---|---|---|
| Tractor | Sugarcane | BA2M-DDT photoelectric sensor; HC-SR04 ultrasonic sensor | Decision tree (DT) | >90% | Maldaner et al. [233] |
| Moving platform | Maize | RGB and LiDAR sensors | Deep learning (SSD) | >90% | Wang et al. [234] |
| Y-12 aircraft | Shelter forest, cereal crops (maize, wheat), and vegetables (leek, lettuce, cauliflower, potato, watermelon, and pepper) | Compact Airborne Spectrographic Imager (CASI) and Leica ALS70 LiDAR | OBIA | 90.33% | Liu et al. [235] |
| Fixed platform | Different types of blades | Reflection and fluorescence spectra and LiDAR | RF | 90.69% | Chen et al. [236] |
| Remote-controlled ground vehicle | Broccoli and lettuce | RGB and depth sensors | Colour-depth fusion algorithm, ML | 96.6% (broccoli), 92.4% (lettuce) | Gai et al. [165] |
Table 10. Research methods and results of plant modification technology for some crops.
| Crops | Modification Methods | Marking Materials/Substances | Marking Position | Detection Methods | Analysis Algorithms | Precision | References |
|---|---|---|---|---|---|---|---|
| Tomato | Physical | An environmentally friendly straw and an environmentally friendly paint | Main stem | Colour mark sensor | Method based on colour and threshold | 95.19% | Li et al. [243] |
| Soybean | Chemical | Rhodamine B (Rh-B) | Seeds | Green light-induced fluorescence method | Improved multi-view positioning algorithm | 96.7% | Jiang et al. [238] |
| Snap bean | Chemical | Rh-B | Seeds | White/green/UV light-induced fluorescence method | Method based on colour and threshold | - | Su et al. [244,245,246] |
| Tomato, lettuce | Chemical | Rh-B | Roots | Sunlight/UV-induced fluorescence method | Machine vision (MV) | - | Raja et al. [28,239,240,241] |
| | Biological | Green fluorescent protein | Leaves | Blue light-induced fluorescence method | Method based on colour and threshold | - | |
| | Biological | Lc maize anthocyanin regulatory gene | Seeds | Sunlight-induced fluorescence method | Method based on colour | - | |
| | Physical | Maize-based plastic straws (water-based latex fluorescent green/orange striping paint) | Independent labels | Sunlight/UV-induced fluorescence method | MV | 97.8% | |
| | Chemical | Water-based latex fluorescent paint | Stems | UV-induced fluorescence method | MV | 98–100% | |
| | Chemical | Water-based synthetic polymer-based paints | Foliage | White light/UV-induced fluorescence method | MV | 99.7% | |
| | Physical | Biodegradable beverage straws (green or orange fluorescent water-based paint) | Independent labels | Sunlight/UV-induced fluorescence method | MV | - | Kennedy et al. [74] |
| | Chemical | Green or orange fluorescent water-based paint | Lettuce foliage, tomato seedlings | | | | |
| Soybean | Chemical | Rh-B | Seeds | Green light-induced fluorescence method | Method based on colour and threshold | 97% | Su et al. [242] |
| Celery | Chemical | Rh-B | Seedlings | | | - | Su et al. [247] |
Table 11. Intelligent chemical weeding robots.
| Robots | Photos | Platforms | Crops | Actuators | Detection Methods | Spraying Methods | Work Efficiency | Pesticide Savings | Developers/Companies |
|---|---|---|---|---|---|---|---|---|---|
| Mobile robot | - | Self-developed robot | - | Lechler standard flat-fan nozzle | MV | Micro-dose directional spraying | - | 95% | Özlüoymak et al. [254] |
| Smart variable-rate sprayer | - | | Potato | Spraying nozzle | DL | Variable spraying | - | >40% | Hussain et al. [255] |
| Robot | - | | Peucedani radix, tomato, and eggplant | Parallel robotic arms and circular nozzles | DL | Precise spraying | - | - | Zhang et al. [256] |
| Sprayer | - | | Lettuce | Precision spray assembly | Fluorescence detection | Precise spraying | - | - | Raja et al. [257] |
| - | - | Tractor | Carrot | Injector nozzle | MV | Real-time, variable-rate herbicide application | - | >30% | Dammer et al. [258] |
| Asterix | Agriculture 14 01378 i001 | Robot | | Drip irrigation nozzle | MV | Drip irrigation | - | 90% | Utstumo et al. [259] |
| Smart sprayer | Agriculture 14 01378 i002 | Tractor | Sugar beet, sunflower, and maize | Spot sprayer | MV | Targeted spraying | - | 10–55% | Spaeth et al. [260] |
| ARA smart sprayer | Agriculture 14 01378 i003 | Tractor | Pastures, field vegetables, large crops, and lawns | A strip of 156 nozzles spaced 4 cm apart | Multi-camera vision system | Ultra-precise spraying | 4 hm²/h | 95% | Ecorobotix (ecorobotix.com, accessed on 29 June 2024) |
| AVO | Agriculture 14 01378 i004 | Solar-powered platform | - | - | - | Smart and ultra-ecological spraying | 10 hm²/d | 95% | |
| Aviro D15 Pro | Agriculture 14 01378 i005 | UAV | - | 1 to 12 nozzle options with cone/jet choices | AI prescription map | Point-to-point spraying | 4.08 hm²/h | - | Avirtech (avirtech.co, accessed on 29 June 2024) |
| XAG XP 2020 | Agriculture 14 01378 i006 | | - | Rotary atomisation nozzle | | Variable-rate spraying | 10 hm²/h | - | |
| WEED-IT | Agriculture 14 01378 i007 | Tractor | - | - | WEED-IT detection sensor | Spot spraying/dual spraying/variable-rate spraying | 90 hm²/h | - | WEED-IT (weed-it.com, accessed on 29 June 2024) |
| WeedSeeker 2 | Agriculture 14 01378 i008 | Tractor | - | Fixed nozzle | Infrared sensor and high-resolution blue LED spectrometer | Spot spraying | - | 90% | Trimble Agriculture (ptxtrimble.com, accessed on 29 June 2024) |
| AX-1 | Agriculture 14 01378 i009 | Robot | Carrots, parsley root, spinach, radish, rocket, baby leaves, and celeriac | A new type of nozzle | RTK | Selective spraying | - | - | Kilter (kiltersystems.com, accessed on 29 June 2024) |
Table 12. Intelligent mechanical weeding robots.
| Robots | Photos | Platforms | Crops | Actuators | Detection Methods | Inter-Row/Intra-Row Weeding | Work Efficiency | Weeding Rate | Developers/Companies |
|---|---|---|---|---|---|---|---|---|---|
| Intelligent intra-row robotic weeding system | Agriculture 14 01378 i010 | Mobile robot platform | Maize | Vertical disc weeding knife | DL | Intra-row | - | 85.91% | Quan et al. [264] |
| FD20 | Agriculture 14 01378 i011 | Solar-powered platform | Sugar beets, beetroots, onions, spinach, rapeseed, and different herbs | Hoeing wires inter-row, cutting knives in-row | RTK-GPS records the exact position of each crop seed | Inter-row and intra-row | - | - | FarmDroid (farmdroid.co.uk, accessed on 30 June 2024) |
| ROBOTTI 150D | Agriculture 14 01378 i012 | Automatic navigation platform | Sugar beets, basil, lettuce, maize, onions, faba beans, parsnips, white cabbage, etc. | Hoeing inter-row and finger or horizontal hoeing in-row | Deep learning and big data | - | 0.5–2 hm²/h | - | Agrointelli (agrointelli.com, accessed on 30 June 2024) |
| Bonirob | Agriculture 14 01378 i013 | Robot | Carrot | Rotating shovel or tooth | ML | - | - | 90% | Deepfield Robotics (deepfield-robotics.com, accessed on 30 June 2024) |
| Farming Revolution W4 | Agriculture 14 01378 i014 | Robot | Bok choy, broccoli, cabbage, carrot, cauliflower, lettuce, corn, garlic, onion, soybean, pumpkin, sugar beet, etc. | Side-cut knives inter-row, rotary hoes in-row | Multispectral and CNN | Inter-row and intra-row | 70 hm²/week | 100% | Farming Revolution (farming-revolution.com, accessed on 30 June 2024) |
| Fobro Mobil B38 | Agriculture 14 01378 i015 | Self-developed tractor | - | Hoeing wires inter-row and finger weeding in-row; FOBRO row-hoeing brush | - | Inter-row and intra-row | - | - | Fobro Mobil (fobro-mobil.com, accessed on 30 June 2024) |
| - | Agriculture 14 01378 i016 | Tractor | Sugar beet, lettuce, broccoli, cabbage, carrot, corn, etc. | Horizontal hoeing | MV | Inter-row and intra-row | 1.2 hm²/h | - | FarmWise (farmwise.io, accessed on 29 June 2024) |
| Robocrop InRow Weeder | Agriculture 14 01378 i017 | Tractor | Lettuce, cabbage, celery, etc. | Hoeing inter-row; rotating notched disc in-row | Robocrop video image analysis techniques | Inter-row and intra-row | - | - | Garford (garford.com, accessed on 29 June 2024) |
| WEAI | Agriculture 14 01378 i018 | Battery-powered robot | Onions, beetroots, carrots, etc. | - | Crop and weed detection based on AI | Intra-row | 10 hm²/time | - | Ekobot (ekobot.se, accessed on 29 June 2024) |
| Mechanical Robovator | Agriculture 14 01378 i019 | Tractor | Lettuce, onion, etc. | Automatic hoeing in the row | MV | Intra-row | - | - | Visionweeding (visionweeding.com, accessed on 29 June 2024) |
Table 13. Mechanical weeding tools.
| Weeding Tools | Photos | Inter-Row/Intra-Row Weeding | Actuation Methods | Developers/Companies |
|---|---|---|---|---|
| V-shaped hoeing and finger | Agriculture 14 01378 i020 | Inter-row and intra-row | Soil resistance | KULT Kress (kult-kress.com, accessed on 29 June 2024) |
| Hoeing | Agriculture 14 01378 i021 | Intra-row | Hydraulic or electric power | |
| Rotating notched disc | Agriculture 14 01378 i022 | Intra-row | Electric power | Fennimore et al. [265]; Tillett et al. [266] |
| Torsion bar | Agriculture 14 01378 i023 | Intra-row | Soil resistance | Vandereide et al. [267] |
| Hoeing brush | Agriculture 14 01378 i024 | Inter-row and intra-row | PTO 540 rpm or hydraulic | Fobro Mobil (fobro-mobil.com, accessed on 29 June 2024) |
| Horizontal hoeing | Agriculture 14 01378 i025 | Intra-row | Hydraulic | FarmWise (farmwise.io, accessed on 29 June 2024) |
| Vertical disc weeding knife | Agriculture 14 01378 i026 | Intra-row | Servo motor | Quan et al. [264] |
| Weeding knife | Agriculture 14 01378 i027 | Intra-row | Pneumatic cylinders | Raja et al. [246] |
Table 14. Intelligent physical weeding robots.
| Robots | Photos | Platforms | Crops | Weeding Methods | Actuators | Developers/Companies |
|---|---|---|---|---|---|---|
| AI-powered laser weeding robot | Agriculture 14 01378 i028 | Tractor | 100+ crops | Laser weeding | 30 × 150 W CO₂ 10.6 μm lasers with tracking cameras | Carbon Robotics (carbonrobotics.com, accessed on 29 June 2024) |
| CLAWS robot | Agriculture 14 01378 i029 | 100% battery- and solar-powered robot | Brassicas and lettuce | Concentrated-light autonomous weeding | Short pulses of concentrated light | Earth Rover (earthrover.farm, accessed on 29 June 2024) |
| Tensorbot | Agriculture 14 01378 i030 | V2 robot | - | Hot-oil weeding | Canola oil at 160 °C | Tensorfield Agriculture (tensorfield.ag, accessed on 29 June 2024) |
| Thermal Robovator | - | - | Sugar beets | Flame weeding | Propane flames | Visionweeding (visionweeding.com, accessed on 29 June 2024) |
| eWeeding | Agriculture 14 01378 i031 | Tractor | - | Electric weeding | High-frequency alternating current | RootWave (rootwave.com, accessed on 29 June 2024) |
| Annihilator 12R30/16R30, Terminator T1/T2/T3 | Agriculture 14 01378 i032 | | Alfalfa, oats, rye peas, clover | Electric weeding | A copper bar carrying voltage | The Weed Zapper (theweedzapper.oldschoolmanufacturing.com, accessed on 29 June 2024) |
| XPS | Agriculture 14 01378 i033 | | - | Electric weeding | Fixed and movable electrodes | Zasso (zasso.com, accessed on 29 June 2024) |

Share and Cite

MDPI and ACS Style

Jiao, J.; Zang, Y.; Chen, C. Key Technologies of Intelligent Weeding for Vegetables: A Review. Agriculture 2024, 14, 1378. https://doi.org/10.3390/agriculture14081378
