3.1.1. Spraying Robots
The locomotion and navigation methods proposed in the literature depend on the mission (environment and target crops). Typically, the robots follow a straight path within a crop row. For example, a low-cost spraying robot for crops with narrow rows is proposed in [
18]. The robot uses vision both to detect weeds and apply herbicide to them, and to detect the crop row for in-row navigation. Thanks to an autonomous recharging system, the robot is able to operate continuously in flaxseed fields. The low-cost robot described in [
19] is designed to move on a rail along pre-defined paths in sugar beet fields. In this study, the camera detects weeds and the robot applies an amount of herbicide that depends on the size of the detected weeds. The authors observed that the amount of herbicide required decreased as the speed of the robot and the height of the spraying nozzle were adjusted.
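The size-dependent dosing described above can be sketched as a simple mapping from detected weed area to nozzle-open time. The function name, thresholds and gains below are illustrative assumptions, not parameters from [19]:

```python
# Illustrative sketch of size-dependent dosing: nozzle-open time grows
# with detected weed area up to a cap. All names and values are
# assumptions for illustration, not parameters from [19].

def spray_duration_s(weed_area_cm2: float,
                     base_s: float = 0.2,
                     gain_s_per_cm2: float = 0.01,
                     max_s: float = 1.5) -> float:
    """Return nozzle-open time (seconds) for one detected weed."""
    if weed_area_cm2 <= 0:
        return 0.0  # nothing detected, no spray
    return min(base_s + gain_s_per_cm2 * weed_area_cm2, max_s)
```

Larger weeds thus receive proportionally more herbicide until the cap is reached, reflecting the idea that the dose can be tuned to the detected weed size.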
In terms of sensing, robots predominantly use vision for weed and crop recognition and localisation. The purpose is to recognise the location of the weeds in order to apply the herbicide in a targeted manner, thus limiting the amount dispersed in the environment. An early work regarding vision for precision spraying is presented in [
20]. In this work, the cameras mounted on a modular robot are used to control the eight spraying nozzles that are attached to its weeding implement. The prototype described in [
21] is designed to work with an image processing algorithm which allows the detection of weeds around finger millet and subsequent selective application of herbicide. The authors reported successful weed identification in almost all cases, with identification and spraying together taking approximately 3 s. In a similar study, a prototype robot for detecting weeds in onion fields is presented in [
22]. Their proposed vision-based weed identification system successfully identified weeds in the onion field with approximately 97% accuracy. Another machine learning-based weed detection method is described in [
23]. In this work, a four-wheeled robotic platform moves in a straight line in cotton field rows and uses its multiple spraying nozzles to deliver the herbicide at the recognised weed locations. Their method is reported to achieve a similarly high precision (over 97%) in identifying weeds. The Asterix robot’s targeted spraying has been tested in carrot fields [
24]. The robot uses a forward-facing camera to detect the seed lines and apply a controlled quantity of herbicides when weeds are detected.
Taking into account the weed location provided by the machine vision methods described above, the spraying devices employed by these robots are designed to apply the chemical substance selectively. This results in both reducing the amount of chemical used and minimising its effect on the actual crop. For example, the two-wheeled intra-row robot described in [
25] uses a novel tool which applies herbicide in a targeted manner after first damaging the weed tissue. With this configuration, the robot is able to direct herbicide delivery to working zones detected by the vision model. Experiments revealed that the robot achieved a weed removal rate of up to 90% and a crop damage rate of less than 1.95% at a good working speed in a maize field, and a 94.45% weed removal rate with a 0.82% crop damage rate in a Chinese cabbage field. A combination of spraying and mechanical weeding can also be found in [
26], where vision guides the spraying and stamping tool. Another selective spraying device is described in [
27], where the simulated BonnBot-I robot is equipped with spot-spray nozzles to treat the detected weeds individually.
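To illustrate how weed locations from vision can be translated into selective nozzle commands (as in the multi-nozzle implements above, e.g. the eight-nozzle setup of [20]), a weed's lateral offset can be binned to a nozzle index. The geometry, names and parameters below are illustrative assumptions, not details from the cited works:

```python
# Hypothetical mapping from a weed's lateral position under the
# implement to the index of the nozzle that should fire; implement
# width and nozzle count are illustrative assumptions.

def nozzle_index(weed_x_m: float,
                 implement_width_m: float = 2.0,
                 n_nozzles: int = 8) -> int:
    """weed_x_m: lateral offset from the left edge of the implement."""
    if not 0.0 <= weed_x_m <= implement_width_m:
        raise ValueError("weed outside implement footprint")
    idx = int(weed_x_m / implement_width_m * n_nozzles)
    return min(idx, n_nozzles - 1)  # the right edge maps to the last nozzle
```

Only the nozzle over each detected weed opens, which is what confines the chemical to the detected locations.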
Table 1 summarises the characteristics of spraying weeding robots. In all cases, vision was used for weed and crop detection.
Figure 2 shows examples of spraying weeding robots and an example of an implement for targeted spraying.
3.1.2. Non-Chemical Weeding Robots
As in the previous section, weeding robots in this category display various navigation methods, mainly dictated by the available sensing equipment and the target crop. The Korean K-Weedbot extracts morphological features (seedling characteristic points) from its vision module in order to identify the paddy field rows that guide its navigation [
28,
29]. The authors of this study reported high precision guidance with a less than 1-degree error in the estimated guidance line. A similar navigational strategy is demonstrated for paddy fields by [
30]. In recent work, a machine vision-based navigation method for a weeding robot is presented in [
31]. In this case, vision is used for rice seedling detection, which leads to seedling line extraction with the least squares method and provides visual feedback for navigation corrections. Field experiments using the proposed method yielded an average weed control rate of 82.4% and a seedling injury rate of 2.8%. On the other hand, Cowbot, an autonomous weed mowing robot for maintaining cow pastures, uses online coverage-planning algorithms that take continuously retrieved weed-detection information into account in order to optimise path length and ensure coverage. The robot fuses data from Real-Time Kinematic (RTK) positioning, an Inertial Measurement Unit (IMU) and two cameras to assist navigation [
32]. Another sensor fusion method for the navigation of a weeding robot is presented in [
33]. The proposed method attempts to integrate satellite-based location information, compass and machine vision to accurately guide the robot along a pre-defined route to cover the entire paddy field. Their algorithm was able to identify the paddy field rows and guide the robot with good accuracy (less than 2.5° in orientation error). In the case of [
34], however, the authors chose to assign navigation to a human operator controlling the vehicle remotely, while the weeding operation is carried out autonomously, with a camera locating the weeds and guiding a gripper for their removal. In [
35], the target environment is a cucumber greenhouse, so the weeding robot moves on an installed monorail. In [
36], the proposed robot possesses a novel screw-type wheel design and wheel angle adjustment that provide better in-row navigation while weeding slurry paddy fields.
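The least-squares seedling-line extraction mentioned for [31] can be sketched as follows, assuming seedling centres in image coordinates. This is a generic least-squares formulation, not necessarily the exact one used in [31]:

```python
# Generic least-squares line fit through detected seedling centres
# (image coordinates). Fitting x = a*y + b rather than y = a*x + b
# keeps the fit well-conditioned for near-vertical crop rows.

def fit_seedling_line(points):
    """points: list of (x, y) seedling centres spanning several rows of
    the image; returns (a, b) of the line x = a*y + b."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    syy = sum(p[1] ** 2 for p in points)
    denom = n * syy - sy * sy  # zero only if all y values coincide
    a = (n * sxy - sx * sy) / denom
    b = (sx - a * sy) / n
    return a, b
```

The angle of the fitted line relative to the image's vertical axis can then serve as the heading error fed back to the navigation controller.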
Table 2 summarises these works.
In terms of sensing, vision-based weed detection is predominant in the literature. A complete weeding robotic system is presented in [
37]. In this work, a robot is able to navigate a cotton field by following waypoints, performing vision-based weed recognition and applying targeted treatment using both spraying and mechanical tools. The robot demonstrates an accuracy of more than 92% in recognising a range of weed species. In [
38], the AgBotII robot is used for weed scouting. The objective of the proposed method is to enable the robot to be deployed in a field without prior knowledge of the target crop. Using data collected from field trials, the method applies machine vision features and clustering algorithms to group the observed plants. A planning algorithm for treating weeds using laser beams is demonstrated in [
39]. While moving, the robot recognises the weeds within an area using vision, and the laser beam is directed to consecutive locations of detected weeds. The system displayed good accuracy, with a weeding hit rate of up to 97%. Unlike contact weed removal methods, contactless laser weeding requires an arm with only two degrees of freedom; however, the accuracy and speed of the gimbals are important to ensure the accurate application of the beam. In this study, two Class 1 laser pointers were used in place of the more powerful Class 4 lasers that would be used in actual weeding. Nevertheless, the proposed system achieved a moving speed of 30 mm per second while applying the beam for 0.64 s per weed. In [
40], a vision-based system using the BoniRob platform is presented. The system uses two cameras: one camera in front of the manipulator to detect the weed and a second camera attached to the end effector for visual servoing to drive the weeding tool. The system was able to remove 1.75 weeds per second. The robotic system described in [
41] exhibited good performance in recognising corn seedlings (up to 93% recognition rate) and weeds (up to 89% recognition rate) in corn fields, and this resulted in high weed prevention effectiveness and a low seedling injury rate, using a laser beam as the weeding tool. In this case, the blue laser device is the end effector of a 5-degree-of-freedom arm, which moves with the robot’s mobile platform and adjusts to the detected weeds. The authors calculated the appropriate laser emission doses to apply in order to inhibit weed growth. The recognition accuracy depended on the speed of the robot. The proposed robotic platform AGRIBOT is designed to perform autonomous navigation in a field for real-time weed detection [
42]. In the relevant simulation work, the authors showed that their trained model successfully performed weed recognition with good performance in terms of accuracy (approximately 99%) and processing rate (2.5 frames per second), suitable for real-time weed detection. In a similar manner, the weed-detecting three-legged robot presented in [
43] was able to identify weeds using a trained vision model with an identification rate above 99.5% in order to guide a delta arm for weed removal. Preliminary investigations on robots with machine vision for weed and crop discrimination can also be found in [
44,
45]. For weed detection, these studies use a Convolutional Neural Network (CNN) and a fuzzy real-time classifier, respectively.
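Sequencing the laser over consecutive weed locations, as in [39], can be approximated by a greedy nearest-neighbour ordering of detected targets. The actual planner of [39] is not reproduced here; this is only an illustrative heuristic:

```python
import math

# Greedy nearest-neighbour ordering of detected weed positions,
# approximating the "consecutive locations" sequencing of a laser
# weeder; an illustrative heuristic, not the planner from [39].

def order_targets(start, weeds):
    """start: (x, y) of the beam's current aim point;
    weeds: list of (x, y) detections. Returns a visiting order."""
    remaining = list(weeds)
    order, current = [], start
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        order.append(nxt)
        current = nxt
    return order
```

Minimising beam travel between targets matters because the gimbal's slew time adds directly to the per-weed treatment time.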
Table 3 summarises the research described above.
Other studies have focused on the weeding tools employed by the weeding robots. For example, a weed suppression mechanism is presented in [
46]. The robot possesses an arm with a brush at its end-effector, which applies force to weed seedlings, thus suppressing their growth. The authors have also considered controlling the posture of the robot when operating on uneven ground. Another early work, within the context of the RemoteFarming project, describes the modular BoniRob platform, which can accept various tools depending on the target crops [
47]. The study presented in [
48] compares three mechanical weeding implements mounted on an AgBotII robot, namely an arrow hoe, a tine, and a cutting tool, for treating cotton and grasses. The study found that, for treating cotton, the most effective tools were the tine and the cutting mechanism, although the cutting mechanism was ineffective against grasses. Also, the experiments suggested that early treatment (four weeks after planting) is the most effective strategy. An approach using a floating robot frame in a paddy field to perform weeding is employed in [
49]. In this approach, the teleoperated robot uses propellers for steering and maintaining its heading, while it uses chains to stir the soil under the water and thus dislocate the weeds. Using a similar principle, the paddy field weeding robot presented in [
50] possesses specially designed wheels, which serve as the weeding implements themselves, stirring the soil and removing the weeds. In this case, however, the robot moves autonomously using coverage planning based on rice seedlings detected by capacitive sensors. This contrasts with the vision-based sensing used by the vast majority of the other approaches discussed in this paper. A robot that moves on a conveyor belt and engages its rotating weeding tool (two vertically rotating discs with weeding knives) when weeds are detected is described in [
51]. Using this tool, the authors report a weed removal rate of up to 84.4% in field trials, while they achieved a crop injury rate of around 1%. A prototype robot with a gesture-controlled weeding arm is proposed in [
52]. Here, the weeding arm is taught to perform the weeding action through movements made with a hand glove.
Table 4 summarises these aforementioned results.
The reviewed literature also includes works focusing on subjects that cannot be grouped with the above, such as performance comparisons. For example, a comparison between a commercial robotic lawnmower and non-automated weed removal methods in terms of cost and effectiveness for pear orchards is presented in [
53]. Specific measures were employed to assess both the effectiveness and efficiency of the machines, including weed-cutting efficiency (the ratio of cut weeds to the total weeds present) and total annual costs (including ownership, maintenance, energy, etc.). The authors found that, for smaller fields, the robotic device is more cost-effective than conventional lawnmowers, and it displayed good performance across all field sizes. In a recent study [
54], the performance of seven robotic weeding systems in sugar beet fields and winter oil-seed rape was examined and compared to traditional herbicide treatment. Field experiments measured weed and crop density as well as working rate. The study found that the robots reduced weed density at least as much as the standard herbicide treatment. In some cases, robotic treatment also resulted in significant herbicide savings. Furthermore, in some experiments, robotic weeding achieved a weed control efficacy of 93%, compared to the average of 83% for herbicide applications. The study concluded that, despite the high cost of weeding robots, this approach is reliable and effective.
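The total-annual-cost measure used in the lawnmower comparison can be illustrated with a toy calculation; all figures below are invented for illustration and are not taken from [53]:

```python
# Toy total-annual-cost model: fixed costs (ownership, maintenance)
# plus an area-dependent energy term. All figures are invented for
# illustration and are not taken from [53].

def total_annual_cost(ownership: float, maintenance: float,
                      energy_per_ha: float, area_ha: float) -> float:
    return ownership + maintenance + energy_per_ha * area_ha

# Example: a machine with high fixed but low per-hectare cost versus
# one with low fixed but high per-hectare cost, both over 2 ha.
robot = total_annual_cost(900.0, 150.0, 12.0, 2.0)   # 1074.0
mower = total_annual_cost(400.0, 300.0, 180.0, 2.0)  # 1060.0
```

Which machine wins depends on the field area, since the fixed and area-dependent terms trade off against each other.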
There have also been investigations on the environmental impacts of weeding robots. In 2015, a study investigated the fuel consumption of robotic weeding tractors for various weeding approaches [
55]. A model of fuel consumption was developed, and methods for adjusting gear and throttle positions to reduce consumption were proposed. A more recent study compared energy use, crop yields and emissions across several non-chemical weeding methods using electric and diesel-powered robotic tractors. It was found that intelligent robotic weed control methods are more efficient than conventional ones [
56]. More specifically, the lowest total energy consumption was achieved with vision-based mechanical inter-row loosening. This weeding method resulted in lower emissions when a diesel-powered robot was used. Also, the use of weeding robots resulted in a higher sugar beet root yield compared to inter-row mulching. In a subsequent study, a conventional weeding system and the FD20 robot were evaluated in terms of performance and effect on soil [
57]. The results showed that the average inter-row weed control effectiveness was higher for the robotic system, which also had a minimal effect on soil penetration resistance, whereas the conventional weed control systems increased soil penetration resistance by up to 20%. In [
58], a life cycle assessment (LCA) of the WeLASER weeding robot is presented in order to identify its environmental strengths and weaknesses. It was found that even though autonomous laser-based weeding robots show potential for environmental efficiency, their energy demands remain the most challenging issue. Energy-related environmental impacts stem mainly from thermal energy generation by the machine’s diesel engine, but the laser-weeding method has only a moderate environmental impact compared to mechanical and chemical weed control methods, as long as it is used in a targeted manner. However, the study does not examine additional expected benefits of laser-based weeding, such as higher food quality and lower soil compaction.
Table 5 summarises the approaches presented above.
Figure 3 shows examples of weeding robots with mechanical weeding tools.
3.1.3. Cooperative Approaches
Cooperative approaches, mainly multi-robot, have also been proposed in the literature, aimed at improving the efficiency and performance of the weeding systems. For example, in [
59], the authors propose the use of the BoniRob robotic platform [
60] as the Unmanned Ground Vehicle (UGV), paired with an Unmanned Aerial Vehicle (UAV). In this configuration, the UAV identifies weed pressure in the crops from the air and then communicates this information to the ground robot, which, in turn, moves towards the desired areas for targeted weed removal. To achieve this, a map that can be shared between the UAV and the UGV is constructed, using a pipeline that registers the georeferenced point clouds generated by the two heterogeneous vehicles. In practical terms, this approach aims at efficient weed removal interventions, where the UGV only visits the locations of interest, taking advantage of the UAV’s more efficient monitoring capabilities. In [
61], a fleet of heterogeneous ground and aerial robots is employed for pest control. In this study, various weeding approaches are evaluated for different crops, including chemical spraying, mechanical weeding and thermal weeding. The fleet is coordinated by a mission manager, which includes planners for the aerial and ground teams and can orchestrate both inspection and treatment missions according to the user’s mission parameters. It is responsible for generating robot trajectories, obtaining data from the perception systems and supervising the behaviour of the robots during the mission. Another multi-robot approach is proposed in the simulation work described in [
62,
63], where the focus is on developing an appropriate coordination planner through novel simulation software. This work involves a centralised planner that assigns tasks to identical agents, which recognise and destroy weeds in the crop rows while navigating. The planner optimises performance by utilising environmental information shared by the weeding agents, so that the agents are directed to specific rows in the field based on a reward model. The simulation environment developed for this work was used to perform various computational experiments in a simulated field in order to investigate how parameters such as initial weed density, weed growth duration, robot team size and time of deployment affect the effectiveness of coordinated weeding. These parameters help design more effective multi-robot weeding interventions. Finally, a co-robot weeding approach is proposed in [
64]. In this system, a human operator rides the machine, but the operation of a pair of intra-row hoes is controlled automatically according to the known plant positions. With accurate positioning, it is shown that weeding can be performed while protecting the crop, and the system was shown to significantly reduce manual labour. The time required by the proposed system to hoe a specific intra-row region was measured at 10.2 h per hectare, compared to the 24.1 h per hectare required by manual hoeing, indicating a significant time gain.
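The reward-based row assignment performed by the centralised planner of [62,63] can be sketched as a greedy allocation of agents to the highest-reward rows. The reward values and names below are illustrative, not the model from those works:

```python
# Greedy sketch of centralised row assignment: each agent receives the
# unassigned row with the highest expected reward (e.g. estimated weed
# density). The reward model of [62,63] is not reproduced here.

def assign_rows(n_agents, row_rewards):
    """row_rewards: dict row_id -> expected reward.
    Returns {agent_index: row_id} covering the n_agents best rows."""
    ranked = sorted(row_rewards, key=row_rewards.get, reverse=True)
    return {agent: row for agent, row in enumerate(ranked[:n_agents])}
```

As the agents share updated weed observations, the planner can re-run this assignment so that the team concentrates on the rows with the highest remaining weed pressure.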
Table 6 below summarises the reviewed cooperative approaches.