Author Contributions
Conceptualization, R.G. and F.M.; methodology, R.G. and G.F.O.; software, G.F.O.; validation, R.G. and F.M.; formal analysis, G.F.O.; investigation, G.F.O.; resources, R.G. and F.M.; data curation, G.F.O.; writing–original draft preparation, G.F.O. and F.M.; writing–review and editing, R.G.; visualization, R.G.; supervision, R.G. and F.M.; project administration, R.G. and F.M.; funding acquisition, R.G. and F.M. All authors have read and agreed to the published version of the manuscript.
Figure 1.
Overview of the approach: the LiDAR sensor provides points; from those points, lines (models) are extracted, and from those lines the motor speeds are deduced.
Figure 2.
The line finding problem: the objective is to identify the crop rows from LiDAR data. The left part of the figure depicts a top view of the scene (a robot moving between two crop rows), the middle part depicts the data from the sensors (i.e., points corresponding to plants), and the right part shows an ideal result of the line finding algorithm (i.e., the left and right crop rows have been identified from the LiDAR data set).
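As a rough illustration of the line finding step, a plain least-squares fit of a line y = m·x + b to a set of 2D points can be sketched as below. This is only a stand-in: the paper's actual method is more robust to outliers, and the function name `fit_line` is hypothetical.

```python
def fit_line(points):
    """Least-squares fit of y = m*x + b to a list of (x, y) points.

    A minimal sketch of line extraction from LiDAR points; a real
    crop-row finder must additionally handle outliers (weeds, noise).
    """
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    # Standard normal-equation solution for slope and intercept.
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b
```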
Figure 3.
Overview of the Ruby method.
Figure 4.
The idea behind considering and . Without this split, there is a chance that the model search considers improbable models with respect to the orientation of the robot (middle part of the illustration), reducing the chance of finding the best models. By dividing the model search into left and right points, as the rows should lie on the left or on the right of the robot, the chances of finding the correct models are increased (right part of the illustration).
Figure 5.
Details of the control approach.
Figure 6.
Computing the distance between two parallel models and . Note that for the models to be parallel.
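The paper's exact model parameterization is not reproduced here; assuming for illustration that each line model is written in slope-intercept form y = m·x + b, the perpendicular distance between two parallel models (equal slopes) can be computed as follows:

```python
import math

def parallel_line_distance(m: float, b1: float, b2: float) -> float:
    """Perpendicular distance between two parallel lines
    y = m*x + b1 and y = m*x + b2 (parallel means equal slope m)."""
    # For equal slopes, the distance reduces to |b2 - b1| / sqrt(1 + m^2).
    return abs(b2 - b1) / math.sqrt(1.0 + m * m)
```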
Figure 7.
Assuming that, from the top configuration, the line finding algorithm returns the models , , and (bottom part of the figure), the expected behavior of the filter is to keep the models and (removing the model ) and, knowing the distance d between the rows, to compute the missing models (dotted lines). Note that, in the bottom part of the figure, the colors of the dots correspond to the models with which the points are associated.
Figure 8.
The filtered models , with , resulting in the configuration depicted in Figure 7. The models and are computed according to , , and d.
Figure 9.
Input membership functions.
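The exact shapes of the input membership functions are those given in the figure; as a generic sketch, a triangular membership function (a common building block for fuzzy controller inputs such as position and orientation) can be written as:

```python
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function: 0 outside [a, c], 1 at the
    peak b, linear in between. Assumes a < b < c; the parameters
    here are illustrative, not the paper's tuned values."""
    if x <= a or x >= c:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)
```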
Figure 10.
Output membership functions. Note that the left wheel speed and the right wheel speed have the same membership function, and that “max” means the maximal possible speed value.
Figure 11.
Rule chart. The detailed rules for the left and right wheel speeds according to the position and orientation of the robot.
Figure 12.
The simulator overview.
Figure 13.
The simulated robot: the white wheels are the differential wheels, the black balls are the caster wheels and the blue/red boxes are the LiDARs. Note that the blue rays depict the LiDAR measurements. (a) General view; (b) Side view; (c) Top view.
Figure 14.
Simulated test crops. The green dots represent crops while the red dots represent weeds. (a) Crop 1; (b) Crop 2; (c) Crop 3; (d) Crop 4.
Figure 15.
Example of a trajectory: the robot starts at position (0, 0), parallel to the crops, as depicted in the figure, and has to reach the end of the fifth row without running over the plants. The figure depicts an example of a successful trajectory.
Figure 16.
An example of a failed trajectory: the robot reaches the end of the fifth row but runs over the crop plants.
Table 1.
A more classical representation of the controller rules. L: left, R: right, V: very, C: center, 1: slow, 2: medium, 3: maximum, F: forward, B: backward.
Left Wheel Speed

| Orientation \ Position | VL | L | C | R | VR |
|---|---|---|---|---|---|
| VL | 2 F | 2 F | 1 F | 3 F | 3 F |
| L | 1 F | 1 F | 2 F | 1 F | 2 F |
| C | 3 F | 3 F | 3 F | 2 F | 1 F |
| R | 2 F | 1 F | 1 F | 1 B | 1 B |
| VR | 2 F | 1 F | 1 B | 2 B | 2 B |

Right Wheel Speed

| Orientation \ Position | VL | L | C | R | VR |
|---|---|---|---|---|---|
| VL | 2 B | 2 B | 1 B | 1 F | 2 F |
| L | 1 B | 1 B | 1 F | 1 F | 2 F |
| C | 1 F | 2 F | 3 F | 3 F | 3 F |
| R | 2 F | 1 F | 2 F | 1 F | 1 F |
| VR | 3 F | 3 F | 1 F | 2 F | 2 F |
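Setting aside the fuzzification and defuzzification stages, the rule chart of Table 1 can be read as a crisp lookup table. The encoding below (speed level 1-3 scaled by a maximal speed, F forward / B backward) is an illustrative assumption, not the paper's actual controller:

```python
# Crisp reading of the Table 1 rule chart.
# Rows: orientation (VL, L, C, R, VR); columns: position (VL, L, C, R, VR).
LEVELS = ["VL", "L", "C", "R", "VR"]
LEFT = [
    ["2F", "2F", "1F", "3F", "3F"],
    ["1F", "1F", "2F", "1F", "2F"],
    ["3F", "3F", "3F", "2F", "1F"],
    ["2F", "1F", "1F", "1B", "1B"],
    ["2F", "1F", "1B", "2B", "2B"],
]
RIGHT = [
    ["2B", "2B", "1B", "1F", "2F"],
    ["1B", "1B", "1F", "1F", "2F"],
    ["1F", "2F", "3F", "3F", "3F"],
    ["2F", "1F", "2F", "1F", "1F"],
    ["3F", "3F", "1F", "2F", "2F"],
]

def wheel_speeds(orientation: str, position: str, v_max: float = 1.0):
    """Look up (left, right) wheel speeds for a linguistic
    orientation/position pair. Level 1/2/3 is mapped linearly onto
    v_max; B (backward) flips the sign."""
    i, j = LEVELS.index(orientation), LEVELS.index(position)
    def decode(cell: str) -> float:
        magnitude = int(cell[0]) / 3.0 * v_max
        return magnitude if cell[1] == "F" else -magnitude
    return decode(LEFT[i][j]), decode(RIGHT[i][j])
```

For instance, a centered, well-aligned robot (orientation C, position C) drives both wheels forward at maximum speed, while a strongly misaligned one turns in place.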
Table 2.
The ratio of successful runs regarding the total number of tries (five, in this case). A result of 1 means that the algorithm never failed, a result of 0 means that the robot never reached the end of the field without crushing crop plants.
Successful Runs (ratio)

| | Crop 1 | Crop 2 | Crop 3 | Crop 4 | Mean |
|---|---|---|---|---|---|
| Ruby | 0.8 | 0.8 | 1 | 0.2 | 0.75 |
| RG | 0.8 | 0.8 | 0.4 | 0 | 0.5 |
| RGOP | 1 | 1 | 1 | 1 | 1 |
| RGOPPN | 1 | 1 | 1 | 0.8 | 0.95 |
| RGOPPNI | 1 | 1 | 1 | 0 | 0.75 |
Table 3.
The positioning error that the robot made when moving between two crop rows. In a perfect situation, the robot should be exactly in the middle of the two crop rows, and thus should have an error of 0.
Mean Squared Error (mm²)

| | Crop 1 | Crop 2 | Crop 3 | Crop 4 | Mean |
|---|---|---|---|---|---|
| Ruby | 2900 | 360 | 120 | 5500 | 2200 |
| RG | 8700 | 3100 | 89 | 9640 | 5300 |
| RGOP | 65 | 430 | 67 | 750 | 320 |
| RGOPPN | 300 | 700 | 140 | 2240 | 840 |
| RGOPPNI | 67 | 310 | 260 | 60,000 | 15,000 |
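The metric in Table 3 is presumably the mean of the squared lateral deviations (in mm) from the centerline between the two crop rows over a run; a minimal sketch of that computation:

```python
def mean_squared_error_mm2(lateral_errors_mm):
    """Mean squared lateral error in mm^2. A value of 0 means the
    robot stayed exactly in the middle of the two crop rows."""
    return sum(e * e for e in lateral_errors_mm) / len(lateral_errors_mm)
```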
Table 4.
The average execution time for the line finding algorithms (LiDAR data to models) during the 500 first iterations of each run.
Average Execution Time (ms)

| | Crop 1 | Crop 2 | Crop 3 | Crop 4 | Mean |
|---|---|---|---|---|---|
| Ruby | 4.14 | 21.15 | 9.39 | 9.61 | 11.07 |
| RG | 5.25 | 8.12 | 6.56 | 6.11 | 6.51 |
| RGOP | 2.35 | 3.44 | 2.9 | 2.7 | 2.84 |
| RGOPPN | 1.58 | 3.57 | 2.8 | 2.37 | 2.58 |
| RGOPPNI | 3.3 | 6.37 | 4.42 | 4.05 | 4.5 |
Table 5.
The average number of points considered by the line finding algorithms during the 500 first iterations of each run.
Average Number of Points

| | Crop 1 | Crop 2 | Crop 3 | Crop 4 | Mean |
|---|---|---|---|---|---|
| Ruby | 91 | 179.3 | 122.54 | 120.36 | 128.3 |
| RG | 91.16 | 178.1 | 124.53 | 120 | 128.4 |
| RGOP | 40.1 | 73.12 | 54.18 | 47.16 | 53.64 |
| RGOPPN | 40 | 73.15 | 54.19 | 47.12 | 53.61 |
| RGOPPNI | 40.08 | 73.96 | 54.19 | 47.17 | 53.85 |
Table 6.
The average execution time per 100 points for the line finding algorithms (LiDAR data to models) during the 500 first iterations of each run.
Execution Time per 100 Points (ms)

| | Crop 1 | Crop 2 | Crop 3 | Crop 4 | Mean |
|---|---|---|---|---|---|
| Ruby | 4.54 | 11.79 | 7.66 | 7.98 | 7.99 |
| RG | 5.75 | 4.55 | 5.26 | 5.09 | 5.11 |
| RGOP | 5.86 | 4.7 | 5.35 | 5.7 | 5.4 |
| RGOPPN | 3.95 | 4.88 | 5.16 | 5.02 | 4.75 |
| RGOPPNI | 8.2 | 8.61 | 8.15 | 8.58 | 8.38 |
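Table 6 appears to be the per-iteration times of Table 4 normalized by the point counts of Table 5 (execution time scaled to a nominal 100-point scan); for instance, Ruby on Crop 1 gives 4.14 ms over 91 points, i.e., roughly 4.55 ms per 100 points, matching the tabulated 4.54 up to rounding. A one-line sketch of that normalization:

```python
def time_per_100_points(avg_time_ms: float, avg_points: float) -> float:
    # Normalize an average execution time to a nominal 100-point scan,
    # so that algorithms seeing different numbers of points are comparable.
    return avg_time_ms / avg_points * 100.0
```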