Article

Development of EV Crawler-Type Weeding Robot for Organic Onion

Liangliang Yang, Sota Kamata, Yohei Hoshino, Yufei Liu and Chiaki Tomioka
1 Laboratory of Bio-Mechatronics, Faculty of Engineering, Kitami Institute of Technology, Koentyo 165, Kitami Shi 090-8507, Japan
2 College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Agriculture 2025, 15(1), 2; https://doi.org/10.3390/agriculture15010002
Submission received: 9 November 2024 / Revised: 8 December 2024 / Accepted: 12 December 2024 / Published: 24 December 2024
(This article belongs to the Special Issue Design and Development of Smart Crop Protection Equipment)

Abstract

The decline in the number of essential farmers has become a significant issue in Japanese agriculture. In response, there is increasing interest in the electrification and automation of agricultural machinery, particularly in relation to the United Nations Sustainable Development Goals (SDGs). This study focuses on the development of an electric vehicle (EV) crawler-type robot designed for weed cultivation operations, with the aim of reducing herbicide use in organic onion farming. Weed cultivation requires precise, delicate operations over extended periods, making it a physically and mentally demanding task. To alleviate the labor burden associated with weeding, we employed a color camera to capture crop images and used artificial intelligence (AI) to identify crop rows. An automated system was developed in which the EV crawler followed the identified crop rows. The recognition data were transmitted to a control PC, which directed the crawler’s movements via motor drivers equipped with Controller Area Network (CAN) communication. Based on the crop row recognition results, the system adjusted motor speed differentials, enabling the EV crawler to follow the crop rows with high precision. Field experiments demonstrated the effectiveness of the system, with automated operation maintaining a lateral deviation of ±2.3 cm, compared to a maximum error of ±10 cm in manual operation. These results indicate that the automation system provides greater accuracy and is suitable for weed cultivation tasks in organic farming.

1. Introduction

In recent years, Japanese agriculture has faced increasing challenges, particularly due to the declining number of core agricultural workers and the growing proportion of elderly workers. These demographic trends, influenced by the nation’s declining birthrate and aging population, have significantly affected the agricultural workforce [1]. Concurrently, heightened environmental awareness has brought organic farming into the spotlight. Organic farming not only mitigates the risk of environmental pollution, but also enhances the value of agricultural products [2,3,4,5]. However, a major challenge in organic farming is the increased labor intensity required for weeding, which is more demanding compared to conventional herbicide usage [6]. Robotic automation has been identified as an effective solution to this issue [7]. Among the various methods for automating weed control in organic farming, weeding cultivators are particularly notable [8]. These devices are used in the cultivation of several crops, including legumes, corn, potatoes, onions, and sugar beets. The operation of weeding cultivators requires delicate precision, often at the centimeter level, over extended periods, which is physically and mentally demanding, further emphasizing the need for automation.
Automating weeding cultivator operations requires systems capable of both following crop rows and recognizing these rows accurately. In actual field conditions, the transplanting of crops—whether mechanically or manually—can result in varying degrees of row misalignment. Consequently, weeding machines without feedback mechanisms cannot automatically adjust to the curves of the crop rows, which may result in crop damage. Therefore, it is essential for automated systems to avoid damaging seedlings. Weeding cultivators are commonly employed in organic onion cultivation [9]. The row spacing in onion farming typically ranges from 12 to 15 cm, which is narrower than that of crops such as corn (20–70 cm) and soybeans (60–70 cm). As such, a high-precision tracking performance relative to the crop rows is required [10,11,12]. This precision is crucial for successful automation. Research on automated steering systems for agricultural machinery has explored methods utilizing both Global Navigation Satellite Systems (GNSSs) and camera-based systems [13,14]. In the case of automatic steering for tractors, hydraulic steering systems have been developed that can track target lines with a steering angle deviation of less than 2.192° and a maximum lateral path tracking error of 4.39 cm [15,16]. Using GNSSs and Inertial Navigation Systems (INSs), agricultural tractors have been able to achieve a lateral path tracking error as low as 2.94 cm under straight path conditions [17]. However, GNSS-based navigation systems are vulnerable to environmental factors such as weather conditions, satellite positioning, and surrounding obstacles. To overcome these limitations, vision-based road detection systems independent of GNSSs are being developed. These systems work by segmenting road surfaces and edges from RGB images captured by cameras. The resulting lateral error from the machine vision system can then be transmitted to the tractor’s automatic navigation system, enabling automatic driving with lateral errors of less than 0.2 m on unpaved roads and less than 0.4 m on paved roads [18]. This highlights that automatic weeding cultivator operations are feasible if crop rows can be effectively recognized.
Crop row recognition and estimation have been extensively studied using image processing techniques and artificial intelligence (AI). Image processing techniques for crop row estimation generally involve identifying weed density, providing guidance, and extracting overlapping data for region-specific processing, which allows for the estimation and recognition of crop rows [19]. Curve recognition is typically performed by extracting feature points from images and progressively estimating paths [20]. However, image-based crop row estimation is often susceptible to misrecognition due to crop growth, requiring multiple pattern settings. Crop row detection, based on machine vision, generally faces challenges such as a low detection accuracy and a suboptimal real-time performance. Furthermore, complex field conditions—such as a high weed density, poor lighting, and the presence of shadows cast by vegetation—pose significant challenges to crop row detection [21,22]. Recent advancements in AI-driven image recognition techniques provide solutions to these challenges [23]. For example, semantic segmentation using deep neural network architectures, such as Fully Convolutional Networks (FCNs) and U-Net, has been successfully applied to tea-picking operations, enabling the extraction of tea row contours for accurate crop row estimation [24]. Additionally, deep learning models such as R-CNN and SSD have demonstrated efficacy in crop row estimation in rice fields [25]. In strawberry fields, convolutional neural networks (CNNs) have been utilized to segment RGB images into crop and non-crop areas, effectively handling uneven contours [26]. For potato crops, U-Net with a VGG16 backbone has been employed to adaptively adjust the visual navigation line position according to crop growth stages, providing accurate navigation line detection [27]. Lettuce crop row estimation has utilized vegetation indices derived from captured images, combined with the Progressive Sample Consensus (PROSAC) algorithm and distance filtering, to reliably extract crop row centerlines and achieve real-time recognition at 10 frames per second (FPS). The YOLOv8-seg model has been proposed for its balanced performance in both real-time detection and accuracy, out-performing other segmentation models such as Mask R-CNN, YOLOv5-seg, and YOLOv7-seg, with improvements in mean Average Precision (mAP50) of 10.8%, 13.4%, and 1.8%, respectively. Furthermore, it achieved a detection speed of 24.8 FPS on a Jetson Orin Nano standalone device [28]. In initial tests on onion fields, the YOLOv8-seg model demonstrated its ability to reliably estimate crop rows under challenging conditions, such as a high weed density and variable lighting. Given these findings, the YOLOv8-seg model is considered to be particularly suitable for real-time performance and a high detection accuracy, especially for onion crop row estimation using AI image recognition.
To reduce environmental impacts and improve driving precision, the adoption of electric crawler-type machines has been considered. It is estimated that the electrification of agricultural machinery will reduce agricultural carbon dioxide emissions by 44–70%, significantly contributing to environmental sustainability [29,30]. Combustion-engine-driven machines, such as tractors, have slower acceleration and deceleration compared to electric vehicles, making high-precision autonomous driving more challenging. Electrification enables more precise automatic driving [31]. Among agricultural machinery, both wheeled and crawler types are commonly used. Crawler machines exert a lower ground pressure and offer a greater traction efficiency than wheeled machines, which supports the development of electric crawlers [32]. Research on the automatic driving of crawler-based systems has shown high levels of driving accuracy, with RTK-GPS and IMU systems achieving lateral and directional accuracies of approximately 1 cm and 0.2°, respectively [33]. For these reasons, this paper adopts the electric crawler type for its operational benefits.
Therefore, this study aims to automate weeding cultivation tasks by developing an electric vehicle (EV) crawler that replaces traditional tractor machines in agricultural operations. A system is developed to perform autonomous driving by recognizing crop rows through AI-based image recognition and the estimation of crop lines. A comparative analysis is conducted between manual and automatic driving systems, using both tractors and EV crawlers, to evaluate the tracking performance of the developed system. The practical implementation of crawler mechanisms is highly promising; however, challenges such as cost, scalability, and data quality and accessibility arise during operation. Overcoming these challenges will require not only technological advancements, but also the optimization of system design and resource management. In the implementation of AI systems, it is crucial to develop strategies for efficiently utilizing crawlers and aim for sustainable operation. This study is believed to contribute to addressing these issues.

2. Materials and Methods

2.1. EV Crawler-Type Weed Control Robot

As illustrated in Figure 1, the EV crawler-type weed cultivation robot consists of an electric vehicle (EV) crawler and an AI-based image recognition system. The EV crawler captures images of crop rows using an integrated color camera, and these images are transmitted in real time to the image recognition PC for AI-based image recognition. The recognition results are subsequently sent to the control PC, which generates and transmits driving commands to the EV crawler, controlling it to follow the crop rows and avoid stepping on them during automatic operation.
In the AI-based image recognition process, the images captured during operation are annotated based on the collected data. These labeled data are then used for training the AI model on a dedicated training PC. Once trained, the model is deployed to the image recognition PC, enabling the real-time recognition of crop rows during autonomous operation. Table 1 outlines the components utilized in the EV crawler-type robot, while Figure 2 illustrates the actual configuration of the EV crawler-type robot.
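Conceptually, the data flow described above forms a single perception–control loop. The following Python sketch is purely illustrative (the function bodies are simple stand-ins, not the authors' implementation) and shows how camera frames, AI recognition, and drive commands fit together.

```python
# Illustrative sketch of the perception-control loop described above.
# The function bodies are simple stand-ins, not the authors' implementation.
import cv2

def recognize_crop_rows(frame):
    """Stand-in for the AI segmentation step: return the image-center column
    as a dummy 'crop row' position (the real system fits a line to the
    segmented crop-row region)."""
    return frame.shape[1] / 2.0

def compute_turn_rate(row_x, frame_width, gain=0.1):
    """Proportional steering from the row position's offset to the image center."""
    lateral_error_px = row_x - frame_width / 2.0
    return max(-50.0, min(50.0, gain * lateral_error_px))

def send_drive_command(m_rpm, turn_rate):
    """Stand-in for the CAN/PDO transmission to the motor drivers."""
    print(f"m_rpm={m_rpm}, turn_rate={turn_rate:.1f}")

def control_loop(camera_index=0, m_rpm=1000):
    cap = cv2.VideoCapture(camera_index)   # on-board color camera
    try:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            row_x = recognize_crop_rows(frame)               # image recognition PC
            turn = compute_turn_rate(row_x, frame.shape[1])  # control PC
            send_drive_command(m_rpm, turn)                  # CAN to motor drivers
    finally:
        cap.release()

if __name__ == "__main__":
    control_loop()
```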

2.2. EV Crawler Basic Design

The basic design of the EV crawler-type weed cultivation robot was carried out first. Since the width (inter-row spacing) of onion crop rows can vary across fields, the vehicle width had to be adjustable to prevent damage to the crop rows. A crawler-type design was adopted for several reasons: it reduces the number of operational components through differential drive, simplifies the system construction, enables tight turning in place, minimizes slippage, and improves traversing ability across diverse terrains. Figure 3a,b illustrate the basic design of the EV crawler, while Table 2 provides its detailed specifications. Because a crawler moves on tracks (caterpillars) that distribute its weight over a wide contact area, it generates greater friction with the ground and is less likely to slip. The track system also allows stable movement on uneven terrain or sandy surfaces, and the flexibility of the tracks lets them conform to stones or other obstacles, making these easier to overcome.

2.3. Motor Control of EV Crawler

A driving program for motor control was developed for the EV crawler-type robot. Table 3 presents the specifications of the motor, while Table 4 outlines the specifications of the motor driver (MD). The motor drivers employed in this system are compatible with Controller Area Network (CAN) communication. Figure 4 depicts a flowchart of the driving commands used during the EV crawler’s operation.
The control PC calculates the required motor speeds using the two expressions below, based on the parameters m_rpm (set speed: −3000 to 3000 rpm) and turn_rate (turning rate: −50 to 50). These values adjust the speed differential between the left and right motors to enable turning. The system uses differential two-wheel control, allowing smooth turning through speed differentials. Communication between the control PC and the motor drivers is carried out via PDO (Process Data Object) communication within the CANopen protocol. Read/write operations are performed at predetermined intervals, enabling the transmission of speed commands and the monitoring of motor states. The monitored parameters include rotational speed, current, torque, error status, and motor configuration status, as specified by the MD specifications. By leveraging PDO communication, the system ensures periodic, real-time acquisition and transmission of data, providing high responsiveness during operation.
$$R\_speed = m\_rpm \times \frac{100 + \min(2 \times turn\_rate,\ 0)}{100}$$
$$L\_speed = m\_rpm \times \frac{100 - \max(2 \times turn\_rate,\ 0)}{100}$$
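As a concrete reading of the two expressions above, the following Python sketch (not the authors' code) clamps m_rpm and turn_rate to their stated ranges and returns the right and left motor speeds.

```python
def wheel_speeds(m_rpm, turn_rate):
    """Compute right/left motor speeds (rpm) from the set speed and turning rate,
    following the two expressions above."""
    m_rpm = max(-3000.0, min(3000.0, m_rpm))       # set speed range
    turn_rate = max(-50.0, min(50.0, turn_rate))   # turning rate range
    r_speed = m_rpm * (100 + min(2 * turn_rate, 0)) / 100
    l_speed = m_rpm * (100 - max(2 * turn_rate, 0)) / 100
    return r_speed, l_speed

# Example: turn_rate = 25 slows the left track to half speed while the right
# track keeps the set speed, producing a gentle turn.
print(wheel_speeds(1000, 25))   # (1000.0, 500.0)
```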

2.4. Crop Row Recognition Using AI

2.4.1. Collecting and Labeling Training Images

In an actual field, a camera was mounted on a tractor to capture images of the crop rows. Images were collected from onion fields on different dates and under varying weather conditions. However, because the camera was attached to the tractor, the captured images only showed the crop rows as straight lines, and it was not possible to capture images of diagonal crop rows. With only straight-line images, diagonal rows cannot be recognized, which may make it difficult to recover from or correct errors while the vehicle follows the rows. Therefore, a program was developed in MATLAB R2023b to resize and rotate these images, artificially generating synthetic images of diagonal crop rows. Randomly selected images were then labeled in polygon format using the Image Labeler app in MATLAB, as shown in Figure 5. In total, 270 images of straight rows and 130 images of diagonal rows were labeled, resulting in a dataset of 400 labeled images that was subsequently used for training.
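The rotation-based augmentation described above was implemented in MATLAB; the following Python/OpenCV sketch shows an equivalent idea for reference only, with the input path and angle values as placeholders. (Since labeling in this study was performed after the synthetic images were generated, no label transformation is needed here.)

```python
import cv2

def synthesize_diagonal_row(image, angle_deg, scale=1.0):
    """Rotate (and optionally rescale) a straight-row image to emulate a
    diagonal crop row, analogous to the MATLAB procedure described above."""
    h, w = image.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, scale)
    # BORDER_REFLECT avoids black corners that the model could learn as a cue.
    return cv2.warpAffine(image, M, (w, h), borderMode=cv2.BORDER_REFLECT)

# Example: generate mild left/right diagonals from one straight-row image.
img = cv2.imread("straight_row.jpg")   # hypothetical input path
for a in (-10, -5, 5, 10):
    cv2.imwrite(f"diagonal_{a:+d}.jpg", synthesize_diagonal_row(img, a))
```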

2.4.2. AI Learning

Using the labeled training data, AI model training was conducted on the devices specified in Table 5. The training process utilized a specific proportion of images for each labeled category, as detailed in Table 6. The dataset was divided into the following three subsets: the training dataset was used to train the network, the validation dataset was employed to prevent overfitting, and the inference dataset was used post-training to evaluate the recognition performance of the AI model. The reason for selecting YOLOv8s in this study is that it enables AI image processing using only the CPU, which allows for the miniaturization of the image recognition PC.
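As an illustration of this training step, the snippet below sketches how a YOLOv8s segmentation model can be trained with the Ultralytics Python API; the dataset YAML path, epoch count, and image size are placeholder values, not the settings used in this study.

```python
from ultralytics import YOLO

# Segmentation variant of YOLOv8s; all training arguments below are illustrative.
model = YOLO("yolov8s-seg.pt")
model.train(
    data="onion_rows.yaml",   # hypothetical dataset config (splits, class names)
    epochs=100,
    imgsz=640,
    device=0,                 # GPU index; use device="cpu" for CPU-only training
)

# Validate on the held-out split and export for deployment on the recognition PC.
metrics = model.val()
model.export(format="onnx")
```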

2.4.3. AI Learning Evaluation

The performance of the trained model was evaluated using a set of evaluation metrics. For this evaluation, the images that had been set aside as the inference subset of the 400 labeled images were used. The primary evaluation metric employed was Intersection over Union (IoU):
$$IoU\,[\%] = \frac{TP}{TP + FP + FN} \times 100$$
Dice coefficient (Dice):
$$Dice\,[\%] = \frac{2 \times TP}{2 \times TP + FP + FN} \times 100$$
Precision:
$$Precision\,[\%] = \frac{TP}{TP + FP} \times 100$$
Recall:
$$Recall\,[\%] = \frac{TP}{TP + FN} \times 100$$
Boundary F measure (BF):
$$BF\,[\%] = \frac{TP}{TP + FP + FN} \times 100$$
Table 7 presents the confusion matrix, where TP (True Positive) refers to a correct prediction of the crop row class, FP (False Positive) indicates a false identification as the crop row class, and FN (False Negative) represents a failure to identify the crop row class. For example, when evaluating the crop row class, a pixel labeled as a crop row class is considered a TP if predicted as a crop row, and an FN if predicted as a different class. Conversely, a pixel labeled as a non-crop row class is classified as an FP if predicted as a crop row.
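To make the metric definitions above concrete, the following Python snippet (illustrative, not the authors' evaluation code) computes IoU, Dice, Precision, and Recall from pixel-wise TP/FP/FN counts as defined in Table 7; the example counts are arbitrary.

```python
def segmentation_metrics(tp, fp, fn):
    """Pixel-wise metrics from the confusion-matrix counts defined in Table 7
    (values returned as percentages)."""
    iou       = 100 * tp / (tp + fp + fn)
    dice      = 100 * 2 * tp / (2 * tp + fp + fn)
    precision = 100 * tp / (tp + fp)
    recall    = 100 * tp / (tp + fn)
    return {"IoU": iou, "Dice": dice, "Precision": precision, "Recall": recall}

# Example with arbitrary counts (not experimental data):
print(segmentation_metrics(tp=9000, fp=500, fn=700))
```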
Figure 6 and Figure 7 present the recognition results for the onion crop rows using the trained AI model, which was trained with 340 images. The crop rows identified by the model are highlighted in red. From these results, it is evident that the AI model is capable of recognizing crop rows, regardless of whether they are straight or diagonal. Figure 8 and Figure 9 show the performance metrics, including bounding box loss (train/box_loss), segmentation loss (train/seg_loss), classification loss (train/cls_loss), and objectness loss (train/obj_loss). These metrics consistently decrease over the training epochs, indicating that the model is improving in terms of object localization and class identification. This reflects the model’s enhanced generalization ability for object localization on previously unseen data. The precision for Class B (metrics/precision(B)), recall for Class B (metrics/recall(B)), and mean average precision at IoU = 0.50 for Class B (metrics/mAP50(B)) all show increasing precision curves. This suggests that the model effectively reduces False Positives, improves its ability to identify all True Positives over time, and achieves a good precision and recall for Class B at this IoU threshold. Additionally, the mean Average Precision at IoU = 0.50:0.95 for Class B (metrics/mAP50-95(B)) consistently improves, reaching high final values. This demonstrates that the model performs well across varying levels of localization strictness.
Table 8 presents the performance metrics (Precision (P), Recall (R), and Average Precision (AP)) after training YOLOv8s. The results demonstrate that both GPU and CPU implementations achieved high precision and recall values.
Table 9 compares the processing speeds for crop row estimation using GPU processing and CPU processing. While CPU processing took approximately twice as long as GPU processing, it was found to be sufficiently fast for the current application.
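For reference, one simple way to compare per-frame inference time on GPU and CPU with a trained model is sketched below; the weight and image paths are placeholders, and the measured times naturally depend on the hardware listed in Tables 5 and 9.

```python
import time
from ultralytics import YOLO

model = YOLO("best.pt")   # hypothetical path to the trained weights

def mean_inference_time(device, image="field_frame.jpg", runs=50):
    """Average per-frame prediction time on the given device (illustrative only)."""
    model.predict(image, device=device, verbose=False)   # warm-up run
    t0 = time.perf_counter()
    for _ in range(runs):
        model.predict(image, device=device, verbose=False)
    return (time.perf_counter() - t0) / runs

print("GPU:", mean_inference_time(0))      # e.g., CUDA device 0
print("CPU:", mean_inference_time("cpu"))
```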

2.5. Target Line Setting and Turn Rate Calculation for Automatic Driving

Figure 10 illustrates the method used to detect crop row lines. The crop row lines are extracted by applying the least squares method to the crop row regions identified through image recognition. These extracted lines are treated as the crop rows and designated as target lines. From these target lines, Figure 11a shows how the lateral error is calculated, while Figure 11b shows the calculation of the azimuth error. Since the vertical axis of the image is oriented downwards as positive, a coordinate transformation is performed so that the positive y-direction points upwards. In Figure 11, the detected crop row line is expressed as
$$x_{detect}(y) = a_{detect}\, y + b_{detect}$$
and the straight line representing the position and orientation (travel path) of the EV crawler is expressed as
$$x_{EV}(y) = a_{EV}\, y + b_{EV}$$
The lateral error at the image height y′ [pixels] is calculated as follows:
$$\varepsilon(y') = x_{detect}(y') - x_{EV}(y')$$
In the automatic steering system of this study, the image height used to determine the lateral error is set to y′ = 0 pixels. The angle α between the y-axis and the detected crop row line is obtained from the slope a_detect of the detected line:
$$\tan(\alpha) = a_{detect}$$
$$\alpha\,[\mathrm{deg}] = \arctan(a_{detect}) \times \frac{180}{\pi}$$
Similarly, the angle β between the y-axis and the straight line of the EV crawler is given below.
$$\beta\,[\mathrm{deg}] = \arctan(a_{EV}) \times \frac{180}{\pi}$$
Therefore, the orientation error θ is as follows:
$$\theta\,[\mathrm{deg}] = \alpha - \beta$$
We thus obtain the lateral error ε and the azimuth error θ. To determine the turning rate ψ of the EV crawler, PID feedback control is applied to calculate ψ so that both the lateral error ε and the azimuth error θ are driven to zero.
$$\psi_{\varepsilon} = k_{p\varepsilon}\,\varepsilon + k_{d\varepsilon}\,\frac{d\varepsilon}{dt}$$
$$\psi_{\theta} = k_{p\theta}\,\theta + k_{d\theta}\,\frac{d\theta}{dt}$$
$$\psi = \psi_{\varepsilon} + \psi_{\theta} \cdot \sin\theta$$
The calculated ψ will be output as the turn_rate of the EV crawler, as shown in Figure 4, to enable the robot to follow the crop rows.
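A compact sketch of the target-line fitting and steering computation described in this subsection is given below, assuming crop-row pixel coordinates are already available from the segmentation result; the function names and gain values are illustrative placeholders, since the gains in this study were tuned manually in the field.

```python
import numpy as np

def fit_row_line(xs, ys):
    """Least-squares fit x = a*y + b to crop-row pixel coordinates
    (y already transformed so that up is positive, as described above)."""
    a, b = np.polyfit(ys, xs, 1)
    return a, b

def steering_errors(a_det, b_det, a_ev, b_ev, y_ref=0.0):
    """Lateral error (pixels) at the reference height and heading error (deg)."""
    eps = (a_det * y_ref + b_det) - (a_ev * y_ref + b_ev)
    alpha = np.degrees(np.arctan(a_det))
    beta = np.degrees(np.arctan(a_ev))
    return eps, alpha - beta

def turn_rate(eps, theta, d_eps, d_theta,
              kp_e=0.5, kd_e=0.1, kp_t=0.8, kd_t=0.1):
    """Feedback on both errors, combined as in the expressions above.
    Gains are placeholders, not the tuned values from the paper."""
    psi_e = kp_e * eps + kd_e * d_eps
    psi_t = kp_t * theta + kd_t * d_theta
    psi = psi_e + psi_t * np.sin(np.radians(theta))
    return float(np.clip(psi, -50, 50))   # clamp to the turn_rate range
```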
The automatic driving system of the EV crawler is structured according to the flow outlined in Figure 12. The driving procedure involves determining the steering angle of the EV crawler based on the detected crop row lines. This is achieved by placing reference lines on the image to indicate the position and orientation of the EV crawler. Next, the target speed (m_rpm) is set on the screen, and the driving process is initiated by pressing the start button. The equation below describes the PID control law. PID control is applied so that the errors used to calculate the steering command (turn_rate) are driven to zero. The tuning of the PID control parameters is performed manually by operating the EV crawler in practice. Currently, the system is designed to follow only straight and diagonal lines, and it cannot handle situations where crop rows are no longer recognized or when there are changes in row spacing.
$$u(t) = K_p\, e(t) + K_i \int_0^t e(\tau)\, d\tau + K_d\, \dot{e}(t)$$
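For completeness, a minimal discrete-time implementation of this PID law is sketched below; the gains and sampling period are illustrative placeholders, not the manually tuned values used on the EV crawler.

```python
class PID:
    """Discrete-time PID controller corresponding to the control law above."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: one update of the steering command at a 10 Hz control cycle.
pid = PID(kp=0.6, ki=0.05, kd=0.1)   # placeholder gains
command = pid.update(error=2.3, dt=0.1)
```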

3. Experiment Results

The experiments were conducted in the fields of Yahagi Agriculture Co., Ltd., located in Tsubetsu Town, Abashiri District, Hokkaido, during the 2024 weeding season. The following two types of tests were performed: one simulated crop rows using green hoses, and the other involved actual weeding with a cultivator in onion fields. A green hose was used as a substitute for crop rows because it can easily form straight or curved lines, facilitating the evaluation of tracking accuracy toward the target. Additionally, the hose can be prepared regardless of the season, offering high versatility in simulation environments and enabling efficient, consistent experiments that mimic real field conditions. Both manual and automatic operations of the tractor and the EV crawler were recorded. The systems were driven at approximately 5 km/h with a weeding cultivator attached, and their tracking performance was compared.
An additional 50 images were incorporated into the training dataset to enable the system to accurately detect the green hoses. The accuracy of crop row tracking was evaluated by recording the lateral errors based on the AI image recognition results. Figure 13 illustrates the experimental setup using the hoses.
To evaluate the tracking accuracy of the developed autonomous driving system, actual weeding operations were conducted in onion fields using a cultivator. The manual operation of the tractor by the farmer, as well as both the manual and autonomous operation of the EV crawler, were recorded. The accuracy of crop row tracking was assessed by recording the lateral errors from the AI image recognition results, similar to the experiment with the green hoses. Figure 14 illustrates the experimental environment in which the system followed the actual onion crop rows.
Figure 15 shows the AI recognition results for the green hose during operation. Figure 16 demonstrates the tracking accuracy of the autonomous driving system. As illustrated in Figure 16, the system maintained a maximum error within ±2 cm while following the path. These results suggest that autonomous driving using AI image recognition on the EV crawler is feasible.
Figure 17 presents the AI recognition results for the onion crop rows during operation. Figure 18 illustrates the tracking accuracy of the manually driven tractor, while Figure 19 shows the tracking accuracy of the manually operated EV crawler. Figure 20 demonstrates the tracking accuracy of the EV crawler during autonomous operation. Table 10 lists the standard deviation values for each scenario. Based on the data in Figure 20 and Table 10, it is evident that the tracking accuracy of the EV crawler was highest during autonomous operation.
During the experiments, the system operated for a maximum of approximately three hours per day, consuming about 40% of the battery.

4. Discussion

This paper focused on the automation of weeding cultivation and aimed to develop a control system for an EV crawler-type weeding robot. The research encompassed the development of the EV crawler, crop row recognition using AI, the design of an automatic crop row tracking system, and simulated driving experiments on field paths. In terms of crop row recognition with AI, significant progress was made in miniaturizing the image recognition PC. Additionally, the recognition of crop rows in diagonal patterns was successfully achieved.
The robot developed in this study achieved a high level of accuracy (1.4 cm) by combining AI-based image recognition technology with an electric system. The AI technology accurately recognized crop rows within the field and operated efficiently without being affected by environmental conditions (lighting, soil changes, or plant growth patterns). This improvement in accuracy was due to the AI’s ability to dynamically adjust the path by making real-time decisions based on the complex conditions within the field. Additionally, the adoption of the electric system provided a higher responsiveness and precision compared to traditional engine-driven systems. This enables stable operation and will significantly enhance the accuracy of agricultural automation tasks.
In contrast, other agricultural robot systems, such as John Deere’s autonomous tractor or Naio Technologies’ robots (e.g., Oz), rely on GPS technology and basic image recognition, usually operating with an accuracy from around 2 to 5 cm. These systems, being based on GPS and cameras, are particularly sensitive to weather and environmental conditions. For instance, fog, strong sunlight, or soil moisture can impact the accuracy of the sensors, potentially leading to misidentifications. In contrast, AI-based systems can handle these variations more flexibly, achieving highly accurate crop recognition and movement. Furthermore, the electric system experiences less mechanical wear and allows for precise control, enabling the robot to operate steadily and accurately even on complex or uneven terrain—an advantage over other systems.

5. Conclusions

In this study, we developed an EV crawler-type weeding cultivator robot for use in organic onion fields. The experimental results demonstrated a navigational precision of about 1.4 cm, with the automatic navigation outperforming human-operated control. This high accuracy can be attributed to both hardware and system components. The crawler-type design enhanced mobility and stability in rough agricultural fields, while the shift to electric power improved responsiveness and precision. The integration of AI-based image recognition technology further enhanced accuracy, enabling the robot to reliably identify crop rows, even in complex field conditions with varying lighting, soil, and growth patterns.
The robot’s performance highlights the potential of combining AI with electric systems for agricultural automation. The system’s ability to recognize both straight and diagonal crop rows, while minimizing misrecognition errors, allowed it to adjust its path dynamically in real time. This capability ensures efficient and accurate task execution across diverse environments, offering a significant improvement over traditional systems.
Despite these advancements, there are potential challenges regarding the autonomous driving system that need further attention. One potential issue is the AI model’s ability to accurately differentiate between crop rows and other elements in the field, such as green straight lines, which could be mistakenly recognized as crop rows. While we briefly mentioned the possibility of this situation, it remains a concern that could affect the robot’s performance, especially in fields with varying plant growth patterns, shadows, or similar visual cues. Future research should focus on developing learning models that are more robust to such confounding elements, potentially incorporating additional data sources like plant behavior, growth stages, and environmental changes over time.
Additionally, while the current system performed well in the experimental environment, its behavior may vary in different weather conditions and environments. For example, extreme weather such as heavy rain or high winds could impact the robot’s navigational precision or its ability to detect crop rows reliably. To address this, future studies should compare the results of autonomous driving in various weather conditions and field environments. This would help to identify any environmental factors that influence the robot’s performance and could lead to improvements in the model’s adaptability.
Another limitation of the current research is the robot’s handling of large-scale, dense crop fields, where the inter-row movement systems may face challenges. Enhancing the robot’s ability to operate efficiently in such environments, while maintaining its high level of precision, will be a key area of future development. Furthermore, while the integration of AI has improved task execution, the model’s ability to adapt to highly diverse crop behaviors and irregular field layouts needs further refinement.
In terms of future research directions, one approach could be to expand the AI learning models to focus more on crop behavior patterns over time, rather than relying solely on visual recognition. By incorporating a broader set of environmental and biological data, the model could be trained to make more accurate predictions in real time. Additionally, exploring multi-sensor fusion, combining visual, thermal, and possibly even auditory data, could improve the robot’s ability to recognize crops in challenging conditions.
In conclusion, while the robot’s performance demonstrates significant promise, there are several areas for further research and improvement. These include refining the AI model to minimize misrecognition, enhancing its adaptability to diverse environments, and developing more efficient systems for large-scale agricultural tasks. By addressing these limitations, future models can improve their applicability and reliability in various agricultural settings, supporting sustainable farming practices and reducing reliance on manual labor.

Author Contributions

Data curation, S.K. and C.T.; formal analysis, S.K. and L.Y.; funding acquisition, L.Y., Y.H. and Y.L.; investigation, Y.H., Y.L. and C.T.; methodology, S.K., L.Y., Y.H., Y.L. and C.T.; resources, L.Y. and Y.H.; software, S.K. and L.Y.; supervision, L.Y.; validation, L.Y.; writing—original draft, S.K., L.Y., Y.H., Y.L. and C.T.; writing—review and editing, S.K. and L.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the National Agriculture and Food Research Organization under the project “Development and Demonstration of Smart Agriculture Technology” [grant No.: 21448282], National Innovation Park for Forestry and Grass Equipments [grant numbers 2023YG08], Zhejiang province agricultural machinery research, manufacturing and application integration project [grant numbers 2023-YT-06], Sichuan Province Engineering Technology Research Center of Modern Agriculture Equipment [grant numbers XDNY2023-003], Zhejiang University-Yongkang Intelligent Agricultural Machinery Equipment Joint Research Center [grant numbers Zdyk2303Y].

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The dataset will be made available on request from the authors.

Acknowledgments

This research is being conducted in collaboration with Q-hoe Co., Ltd.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

1. Statistics on Agricultural Labor Force in Japan (2023); Ministry of Agriculture, Forestry and Fisheries of Japan: Tokyo, Japan, 2023.
2. Boone, L.; Roldán-Ruiz, I.; Van Linden, V.; Muylle, H.; Dewulf, J. Environmental Sustainability of Conventional and Organic Farming: Accounting for Ecosystem Services in Life Cycle Assessment. Sci. Total Environ. 2019, 695, 133841.
3. Lankoski, J.; Thiem, A. Linkages between Agricultural Policies, Productivity and Environmental Sustainability. Ecol. Econ. 2020, 178, 106809.
4. Franco, S. Assessing the Environmental Sustainability of Local Agricultural Systems: How and Why. Curr. Res. Environ. Sustain. 2021, 3, 100028.
5. Cachero-Martínez, S. Consumer Behaviour towards Organic Products: The Moderating Role of Environmental Concern. J. Risk Financ. Manag. 2020, 13, 330.
6. Gamage, A.; Gangahagedara, R.; Gamage, J.; Jayasinghe, N.; Kodikara, N.; Suraweera, P.; Merah, O. Role of Organic Farming for Achieving Sustainability in Agriculture. Farming Syst. 2023, 1, 100005.
7. Vahdanjoo, M.; Gislum, R.; Sørensen, C.A.G. Operational, Economic, and Environmental Assessment of an Agricultural Robot in Seeding and Weeding Operations. AgriEngineering 2023, 5, 299–324.
8. Melander, B.; Lattanzi, B.; Pannacci, E. Intelligent versus Non-Intelligent Mechanical Intra-Row Weed Control in Transplanted Onion and Cabbage. Crop Prot. 2015, 72, 1–8.
9. Oonisi, T.; Tanaka, S. Onion Work Handbook; Rural Culture Association: Kyoto, Japan, 2012.
10. Akazaki, S. Onion Encyclopedia; Rural Culture Association: Kyoto, Japan, 2019.
11. Maddonni, G.A.; Otegui, M.E.; Cirilo, A.G. Plant Population Density, Row Spacing and Hybrid Effects on Maize Canopy Architecture and Light Attenuation. Field Crops Res. 2001, 71, 183–193.
12. Nakano, H.; Kawamoto, K.; Isida, K. Relationship between distance between rows and growth, yield, and downfall in soybeans. Rep. Chugoku Branch Crop Sci. Soc. Jpn. 1998, 35, 35–36.
13. Roshanianfard, A.; Noguchi, N.; Okamoto, H.; Ishii, K. A Review of Autonomous Agricultural Vehicles (The Experience of Hokkaido University). J. Terramechanics 2020, 91, 155–183.
14. Yao, Z.; Zhao, C.; Zhang, T. Agricultural Machinery Automatic Navigation Technology. iScience 2024, 27, 108714.
15. Bo, H.; Liang, W.; Yuefeng, D.; Zhenghe, S.; Enrong, M.; Zhongxiang, Z. Design and Experiment on Integrated Proportional Control Valve of Automatic Steering System. IFAC-PapersOnLine 2018, 51, 389–396.
16. An, G.; Zhong, Z.; Yang, S.; Yang, L.; Jin, C.; Du, J.; Yin, X. EASS: An Automatic Steering System for Agricultural Wheeled Vehicles Using Fuzzy Control. Comput. Electron. Agric. 2024, 217, 108544.
17. Jing, Y.; Li, Q.; Ye, W.; Liu, G. Development of a GNSS/INS-Based Automatic Navigation Land Levelling System. Comput. Electron. Agric. 2023, 213, 108187.
18. Saha, S.; Morita, T.; Ospina, R.; Noguchi, N. A Vision-Based Navigation System for an Agricultural Autonomous Tractor. IFAC-PapersOnLine 2022, 55, 48–53.
19. Guerrero, J.M.; Ruz, J.J.; Pajares, G. Crop Rows and Weeds Detection in Maize Fields Applying a Computer Vision System Based on Geometry. Comput. Electron. Agric. 2017, 142, 461–472.
20. Bentley, L.; Macinnes, J.; Bhadani, R.; Bose, T. A Pseudo-Derivative Method for Sliding Window Path Mapping in Robotics-Based Image Processing; CAT Vehicle Research Experience for Undergraduates: Tucson, AZ, USA, 2019.
21. Yang, Y.; Zhou, Y.; Yue, X.; Zhang, G.; Wen, X.; Ma, B.; Xu, L.; Chen, L. Real-Time Detection of Crop Rows in Maize Fields Based on Autonomous Extraction of ROI. Expert Syst. Appl. 2023, 213, 118826.
22. Zhang, S.; Liu, Y.; Xiong, K.; Tian, Y.; Du, Y.; Zhu, Z.; Du, M.; Zhai, Z. A Review of Vision-Based Crop Row Detection Method: Focusing on Field Ground Autonomous Navigation Operations. Comput. Electron. Agric. 2024, 222, 109086.
23. Ruan, Z.; Chang, P.; Cui, S.; Luo, J.; Gao, R.; Su, Z. A Precise Crop Row Detection Algorithm in Complex Farmland for Unmanned Agricultural Machines. Biosyst. Eng. 2023, 232, 1–12.
24. Lin, Y.-K.; Chen, S.-F. Development of Navigation System for Tea Field Machine Using Semantic Segmentation. IFAC-PapersOnLine 2019, 52, 108–113.
25. Liu, F.; Yang, Y.; Zeng, Y.; Liu, Z. Bending Diagnosis of Rice Seedling Lines and Guidance Line Extraction of Automatic Weeding Equipment in Paddy Field. Mech. Syst. Signal Process. 2020, 142, 106791.
26. Ponnambalam, V.R.; Bakken, M.; Moore, R.J.D.; Glenn Omholt Gjevestad, J.; Johan From, P. Autonomous Crop Row Guidance Using Adaptive Multi-ROI in Strawberry Fields. Sensors 2020, 20, 5249.
27. Yang, R.; Zhai, Y.; Zhang, J.; Zhang, H.; Tian, G.; Zhang, J.; Huang, P.; Li, L. Potato Visual Navigation Line Detection Based on Deep Learning and Feature Midpoint Adaptation. Agriculture 2022, 12, 1363.
28. Lyu, Z.; Lu, A.; Ma, Y. Improved YOLOv8-Seg Based on Multiscale Feature Fusion and Deformable Convolution for Weed Precision Segmentation. Appl. Sci. 2024, 14, 5002.
29. Farokhi Soofi, A.; Manshadi, S.D.; Saucedo, A. Farm Electrification: A Road-Map to Decarbonize the Agriculture Sector. Electr. J. 2022, 35, 107076.
30. Ghobadpour, A.; Monsalve, G.; Cardenas, A.; Mousazadeh, H. Off-Road Electric Vehicles and Autonomous Robots in Agricultural Sector: Trends, Challenges, and Opportunities. Vehicles 2022, 4, 843–864.
31. Yamazaki, M. Electrification, Automated Driving Trends and Future Automotive Industry. J. Jpn. Soc. Colour Mater. 2018, 91, 351–354.
32. Yamada, H.; Chida, Y.; Tanemura, M. Improvement of Linear Tracking Response of Two-Degree-of-Freedom Control of Discrete-Valued Driven Crawler. IFAC-PapersOnLine 2023, 56, 313–318.
33. Takai, R.; Barawid, O.; Ishii, K.; Noguchi, N. Development of Crawler-Type Robot Tractor Based on GPS and IMU. IFAC Proc. Vol. 2010, 43, 151–156.
Figure 1. EV crawler type weeding robot system diagram.
Figure 2. EV crawler-type robot.
Figure 3. Basic design of the EV crawler: (a) side and (b) front.
Figure 4. Flowchart of the motor control.
Figure 5. Crop rows labeling with red lines.
Figure 6. AI recognition result (GPU). The recognized crop rows are filled in red.
Figure 7. AI recognition result (CPU). The recognized crop rows are filled in red.
Figure 8. Evolution of performance of training and validation of AI model (GPU).
Figure 9. Evolution of performance of training and validation of AI model (CPU).
Figure 10. Crop row straight line calculation procedure.
Figure 11. Definition of the lateral and heading errors.
Figure 12. Automatic driving control system.
Figure 13. Experimental environment (hose).
Figure 14. Experimental environment (onion).
Figure 15. AI image recognition during experiments (hose).
Figure 16. Driving tracking graph (EV crawler (auto) (hose)).
Figure 17. AI image recognition during experiments (onion).
Figure 18. Driving tracking graph (tractor (manual)).
Figure 19. Driving tracking graph (EV crawler (manual)).
Figure 20. Driving tracking graph (EV crawler (auto) (onion)).
Table 1. Parts used in EV crawler robots.
Item | Specification
Motor | AG120D4-2A000 HW2.2 FW 2.2.6
Motor drivers | CHFM-5107P.SVB30
Control processing PC | Panasonic FZ-G1 TOUGHPAD
Image recognition PC | Intel NUC13 Pro
Camera | See3CAM_24CUG
CAN to USB dongle | CANUSB
Battery | HC100-12 (lead–acid battery)
Table 2. EV crawler specifications.
Item | Specification
Number of motors | 2
Motor output | 1 kW × 2
Communication method | CAN communication
Driving type | Differential two wheels
Drive voltage | 48 V
Width | 1200–1600 mm
Vertical width | 1560 mm
Vehicle height | 560–680 mm
Table 3. Specifications of the motor.
Item | Specification
Model | CHFM-5107P-SV-B-30
Ratio | 30
Volts | 25 V
Rating | 3.15 N·m
r/min | 3000
Output | 1 kW
Table 4. Specifications of the motor driver.
Item | Specification
Model | AG120D4-2A000 HW2.2 FW 2.3.1
Input voltage | 48 Vdc (max 60 Vdc)
Output current | 46 Arms (max 140 Arms, 2 s)
Communication method | CAN communication (CANopen)
Table 5. Equipment used during AI learning.
Item | Specification
Model number | Mouse DAIV DGX760H2-M2S5
OS | Windows 11
CPU | Intel Core i9-9900X @ 3.50 GHz
GPU | NVIDIA GeForce RTX 2080 Ti
RAM | 64 GB
Programming language | Python 3.8.12
AI model | YOLOv8s
Table 6. Image usage settings during AI learning.
Image set | Proportion
Training | 75%
Validation | 20%
Inference | 5%
Table 7. Confusion matrix.
Actual \ Predicted | True | False
True | TP (True Positive) | FN (False Negative)
False | FP (False Positive) | TN (True Negative)
Table 8. The performance metrics (Precision (P), Recall (R), and Average Precision (AP)) after training YOLOv8s.
Processing | Precision (P) (%) | Recall (R) (%) | Average Precision (AP) (%)
GPU | 97.4 | 96.1 | 53.2
CPU | 96.5 | 95.5 | 49.5
Table 9. Comparison of average processing time for crop row estimation.
Device | Processor | Average processing time
Mouse DAIV DGX760H2-M2S5 | GPU | 0.0329 s
Intel NUC13 Pro | CPU | 0.0725 s
Table 10. Standard deviation.
State | Standard Deviation
Tractor (manual) | 1.924 cm
EV crawler (manual) | 4.838 cm
EV crawler (auto (onion)) | 1.428 cm