Review

A Comprehensive Overview of Control Algorithms, Sensors, Actuators, and Communication Tools of Autonomous All-Terrain Vehicles in Agriculture

by Hamed Etezadi 1 and Sulaymon Eshkabilov 2,*
1 Department of Bioresource Engineering, McGill University, Montréal, QC H9X 3V9, Canada
2 Department of Agricultural and Biosystems Engineering, North Dakota State University, Fargo, ND 58102, USA
* Author to whom correspondence should be addressed.
Agriculture 2024, 14(2), 163; https://doi.org/10.3390/agriculture14020163
Submission received: 27 December 2023 / Revised: 15 January 2024 / Accepted: 17 January 2024 / Published: 23 January 2024
(This article belongs to the Special Issue Application of Mechatronics in Agriculture)

Abstract:
This review paper discusses the development trends of agricultural autonomous all-terrain vehicles (AATVs) from four cornerstones: (1) control strategies and algorithms, (2) sensors, (3) data communication tools and systems, and (4) controllers and actuators, based on 221 papers published in peer-reviewed journals between 1960 and 2023. The paper presents a comparative analysis of commonly employed control methods and algorithms, outlining their advantages and disadvantages, and gives comparative analyses of sensors, data communication tools, actuators, and hardware-embedded controllers. In recent years, many novel developments in AATVs have been made due to advancements in wireless and remote communication, high-speed data processors, sensors, computer vision, and broader applications of AI tools. Technical advancements toward fully autonomous control of AATVs remain limited, requiring research into accurate estimation of terrain mechanics, identification of uncertainties, fast and accurate decision-making, and the use of wireless communication and edge cloud computing. Furthermore, most developments are at the research level and have many practical limitations due to terrain and weather conditions.

1. Introduction

Autonomous all-terrain vehicle (AATV)-based farming promises to produce more crops with less effort and reduce environmental impact. Drones, self-driving all-terrain vehicles, and seed-planting robots could play a crucial role in future food supplies. According to the Global Market Insights report, the autonomous farm equipment market is projected to grow at over 6% CAGR between 2021 and 2027. By 2027, industry shipments are expected to exceed 210,000 units. Market growth is likely to be driven by rising demand for autonomous farming equipment, especially in regions with a low farmer population. Growth of the autonomous farm equipment market was negatively impacted by the COVID-19 pandemic: several governments imposed travel restrictions and lockdown measures in response to the outbreak, which hindered the logistics and manufacturing capabilities of several companies and resulted in lower sales figures. Additionally, the introduction of travel regulations increased labor shortage challenges for farm owners and restricted their ability to hire suitable workers to conduct fieldwork [1]. Food chains are being affected by global farming labor shortages. According to the agriculture and agri-food labor task force [2], Canada's farming sector may face a shortage of 114,000 workers by 2025. The situation is similar in the US, where immigration changes have contributed to a shortage of farm labor. The use of AATVs can improve management by facilitating in-field tasks in a time-effective manner. A rise in farm labor costs and a fall in self-driving technology prices will also accelerate the transition. A growing body of evidence suggests that precision agriculture is already saving growers money and increasing yields; a 10% increase in farmer revenue could be achieved, as well as a reduction in labor costs [3]. In addition, because of their lighter weight and compact size, AATVs would reduce soil compaction, which can lower crop yields and has been a persistent problem with heavy tractor machinery.
AATVs, which are mostly four-wheeled vehicles, are now used for a variety of purposes [4]; in agricultural fields, they are primarily used for moving, carrying tools, applying chemical fertilizers, plowing, mowing grass, spreading seeds, transporting livestock, detecting weeds [5], and transporting firewood.
In agricultural operations, farmers are faced with different fields, each of which has its own characteristics depending on the type of product or desired operation, and these machines must perform as correctly as possible in these fields. In this regard, a review of the management of agricultural machinery for future intelligent manned and/or unmanned agricultural operations was presented, covering the strategic, tactical, operational, and evaluation aspects of mechanization [6]. Among this machinery, ATVs play an essential role in agricultural operations. The special features of ATVs, such as low-pressure tires, short axle distances, narrow track widths, and high centers of gravity, make them maneuverable [7].
Considering the numerous uses and versatility of ATVs, they form a dynamic field of study. Terrain properties clearly influence the design, performance, and mobility of vehicles. Better traction and minimal sinkage will increase maneuverability, especially on wet terrain. In other words, a significant objective is to reduce roll and steering effort and thus enhance stability. Increased maneuverability, in turn, requires vehicles to be more stable, especially on rough terrain. Up to 85% of fatal events in agriculture could be prevented by improving the balance control of these machines [8]. Any system failure in an ATV may lead to irreversible consequences [9]. Detecting such faults accurately and in a timely manner is crucial to preventing critical hazards and halting operations. For this reason, supervisory intelligent control algorithms and controllers should be continuously monitored in a smart setting to maximize ATV uptime and prevent potential hazards [10]. Control systems are therefore crucial to ATVs.
There are a few crucial issues to be considered while designing and building autonomous ATVs that need to be investigated and improved, such as safety (especially against overturning), autonomy, use of the latest artificial intelligence technologies, and high-precision navigation. Improvement in each of these areas will increase output and farmers' work efficiency. Experimental studies have been conducted on maintaining the balance of AATVs [11]. These tests included turning at low speed on surfaces with uneven slopes and sharp turns on flat ground, and the vehicle's condition was assessed statically and dynamically. However, the tests were not repeatable because the vehicle was operated by radio remote control, and, considering human errors, automatic speed and steering control systems were deemed imperative for these kinds of experiments [11]. An AATV was modeled, simulated, and tested on a test track [12]. The purpose of that study was to examine the degree to which simplified ADAMS modeling can accurately simulate the response of an AATV on uneven ground. The researchers concluded that chassis flexibility significantly affects the vibration response of the vehicle body [12].
Wireless communication. In another experiment, Aras modified an ATV for semi-autonomous control using wireless communication. Yaw motion was used to determine the ATV's stability, and the ATV model was identified using the MATLAB System Identification Toolbox based on the experimental yaw-motion results [13]. A vehicle's motion is greatly influenced by rolling resistance, especially when it is operating off-road. Reducing rolling resistance allows smaller motors to be implemented, which reduces manufacturing costs, vehicle weight, and energy loss, further reducing rolling resistance. Petterson and Gooch presented new data for the rolling resistance of seven different ATV tires in an agricultural environment [14]. They found that rolling resistance is heavily influenced by the diameter of ATV tires; despite large tire diameters, significant tire width and a wide, deep tread were found to adversely affect rolling resistance. Rolling resistance increased with speed at low speeds, and inflation pressure affected rolling resistance significantly.
Rolling resistance. Rolling resistance is strongly influenced by the firmness of the ground. In contrast to hard surfaces (such as concrete), soft soils (such as sand) produce a much higher resistance force. Rolling resistance is also influenced by the ground surface. Compared with tires driven on smooth surfaces, tires driven on rough macro- or micro-textures will experience greater deformation and suffer larger energy losses [15].
Terramechanics. From the terrain mechanics aspect, there are three general categories of modeling for ATVs: (1) empirical models, the simplest models but difficult to apply beyond testing conditions; (2) physics-based models, exhibiting the greatest degree of fidelity but at the expense of a high computational expenditure; and (3) semi-empirical models, which are better suited for real-time estimation and control because they strike a balance between computational efficiency and fidelity [16]. One of the most widely used semi-empirical methods is the Bekker-based model [17,18]. For Bekker-based models to produce an accurate representation of the complex stress distribution generated at the contact patch, several parameters are considered, such as cohesion and internal friction angle. In ATV operation, however, it is difficult to determine these parameters because vehicles may operate on terrain that is unknown or whose properties vary. In addition, the soil contact model (SCM) has been used, but since stress is discretized and integrated, this method may not be suitable for real-time applications due to its computational cost [19]. Other methods, such as the Bekker-based SCM surrogate model, were developed to obtain better results. However, model-dependent navigation algorithms, such as MPC, were difficult to use due to the lack of twice-continuous differentiability [20]. Using a neural network terramechanics model for terrain estimation, the tire's lateral forces can be predicted with sufficient accuracy [20].
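For illustration, a commonly cited form of the Bekker pressure-sinkage relation underlying such semi-empirical models is shown below; the notation follows standard terramechanics usage rather than any specific reference in this review:
$p = \left(\dfrac{k_c}{b} + k_{\phi}\right) z^{n}$
Here, $p$ is the normal pressure at the contact patch, $z$ is the sinkage, $b$ is the smaller dimension of the contact patch, $k_c$ and $k_{\phi}$ are the cohesive and frictional moduli of soil deformation, and $n$ is the sinkage exponent. Soil cohesion $c$ and internal friction angle $\phi$ then enter through the Mohr-Coulomb shear strength $\tau_{max} = c + \sigma \tan\phi$.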
Operational environment. Agricultural AATVs, unlike conventional autonomous vehicles, cannot rely on traffic signs, road lanes, or other road guidance tools for navigation on the farm. Therefore, the navigation system of AATVs is the most essential part that differs from that of on-road autonomous vehicles. Planning [21], perception [22], and control are the major components of an autonomous navigation system. Researchers have also analyzed the effect of algorithms such as artificial neural networks (ANNs), genetic algorithms (GAs), and Kalman filters (KFs) on enhancing the accuracy of vehicle control with respect to a selected route (Figure 1) [23]. As Mousazadeh [23] concluded, an intelligent autonomous system would be enhanced by combining all categorized algorithms: Kalman filters for precise navigation, machine vision for detecting crops and rows, neural networks for weed and plant classification, fuzzy systems for detecting obstacles, and other techniques would also be required. Moreover, the review presented by Zhou and others [24] addresses path-planning problems involving multiple constraints. According to their classification, there are three stages: stage 1, structural, kinematic, and dynamic modeling; stage 2, route planning; and stage 3, trajectory and motion planning; the review also covers various methods used for USVs [24].
Studies have also been conducted to determine the correct route for USVs in the fields of route planning, motion planning, kinematics, and dynamics in order to design optimal control of such vehicles [24]. Another study used LiDAR sensors with LoRa communication in electric ATVs to predict movement paths using data collected by the automatic driving algorithm. Using the high-resolution LiDAR scanning system developed by Rus, an automated driving algorithm was developed to map an enclosure for autonomous driving purposes. To predict the location of the autonomous vehicle, a LoRa communication system was correlated with the LiDAR-type scanning system [25].
Automated vehicles currently range from completely manual to fully autonomous [26]. At the fully manual end, the vehicle's operation is entirely under the driver's control, although the system may still alert the driver in case of danger. Self-driving vehicles can be classified into six levels defined by the Society of Automotive Engineers (SAE), which differ in the degree of human interaction required during driving (Figure 2).
L0: Fully human-operated, with no assistance while driving.
L1: Driver-assistance systems assist the driver by controlling the vehicle's speed or maneuvering the steering wheel (driver assistance).
L2: Driving is a shared responsibility between the driver and the automated system, which can control the vehicle's speed and steering while the driver continues to monitor the vehicle (partial automation).
L3: The vehicle can monitor its environment and drive itself, but the driver must remain ready to take charge when necessary (conditional automation).
L4: The system does not require human intervention; it can drive itself and monitor its surroundings (high automation).
L5: The vehicle can drive autonomously on any road and in any environment, without human intervention (fully autonomous).
Level 4 and 5 ATVs are the subject of this review paper. ATVs used in agriculture have been the subject of valuable studies [28,29] (Figure 3). A variety of navigation sensors (machine vision, GPS, laser sensors, dead-reckoning sensors, IMUs, and GDSs) have been used to develop positioning and orientation control algorithms and controllers. These studies have also included various computational methods as well as navigation planning algorithms for agricultural vehicles (AVs). There have been numerous research publications on the designs, control algorithms, and methods employed for sensors. However, there are limited published papers on the comparative analysis of settings and performances of the employed sensors, algorithms, control methods, communication tools and methods, control systems, and uncertainties in the study environments of AATVs. Moreover, the published papers do not cover sufficiently in-depth technical aspects of employed platforms, operational functions, sensors, internal or external communication methods, control units, and test environments of AATVs in conjunction. Another important point is that many guidance systems and their components are commercialized [29]; therefore, many essential technical details of the guidance systems are not available.
One of the potential advantages of automated ATVs is that they are capable of reducing human-operator errors while providing stable and acceptable results and more accurate control in the long run [11,30]. The same is true for autonomous ATVs.
ATVs with self-driving control systems have several advantages, including reducing mortality, reducing overturning (especially in rough conditions), and eliminating human error in control [11,30]. Moreover, autonomous ATVs increase the accuracy of calculations, reduce labor costs, compensate for labor shortages, and enable cameras and robotic tools with artificial intelligence to be used in precision agriculture while reducing production and operating costs, particularly for products such as sugar beets and carrots that are sensitive to compaction [31]. A study was conducted on the development of unmanned autonomous ATVs using the remote-control system method [32]. Road routing, speed control, and wheel and brake systems were tested along with three models of the autonomous command control system. Platforms equipped with sensors and actuators can be specially developed or adapted from existing hardware. As part of the robotic system, many components can be implemented on the platform, including a controllable steering wheel, a computer connected to a programmable electronic control unit, and input and output ports through which sensors and actuators connect and communicate.
In agriculture, the autonomous concept will lead to the development of new equipment based on small, intelligent machines capable of performing various tasks more efficiently and in a more environmentally friendly manner. Using smart tractors [33], for instance, farmland and people can be kept safe by intelligently generating operational routes and avoiding field obstacles.
AATVs have become prominent in precision agriculture (PAG) thanks to the efficient use of information and communication technology tools to monitor crop yields and employ variable-rate technologies on farms [34]. Due to the growing agricultural knowledge system, large amounts of data are available. A recent review found that 37% of robotic systems are four-wheel drive and 22.06% are used for weeding. Furthermore, 50% of the navigation systems are equipped with cameras, 20% with RTK/GNSS/INS, and 16% with LiDAR [35]. According to these statistical data, AATVs rely on large amounts of data to achieve optimal results in real-time operations.
PAG research trends could also benefit from the fifth and sixth generations of communication (5G and 6G). These next-generation wireless communication technologies feature high-frequency electromagnetic waves and low latency [36]. Compared to previous wireless communication technologies, these networks offer faster data transmission speeds and greater throughput, providing device communications, user-side artificial intelligence (AI) algorithms, distributed fault diagnosis methods, and complex security strategies that may be useful in agriculture [37]. The main challenges of this technology are the lack of network infrastructure in many regions, as well as spectrum availability and implementation difficulties. There are also security concerns, and it remains an expensive technology.
From an environmental aspect, due to the complexity and dynamic nature of the work environment, challenges may arise from agricultural environments with varying conditions, canopy structures, and physio-chemical properties [38]. Each environment may have its own features and limitations. Depending on the weather conditions, such as rain, fog, or dust, sensor function may be adversely affected. When working in an open field, lighting conditions, wind, and muddy soil may present different challenges to the sensors. Navigating in orchards can be challenging due to the surrounding trees. A major problem in paddy fields is muddy soil and the difficulty of maneuvering. Designing an ATV should take the farming environment into account [39]. Agricultural farm fields involve several such complicating factors. At the same time, working areas generally remain the same, so it is easy to place landmarks around the corners of a field and consider them stationary, and the plants are generally the same and can be identified easily [29].
In PAG operations, AATVs are in high demand for maneuvering and accuracy in navigation. Developing a fully autonomous ATV capable of navigating and overcoming obstacles reliably is a challenging task. To plan a vehicle's drive path and maneuver the vehicle, it is essential to correctly estimate the vehicle's dynamic behaviors, such as stability, controllability, and maneuverability, which are directly linked with terrain conditions such as soil properties and obstacles, in real-time. The terrain can be classified based on topography and soil mechanics parameters by integrating ground and environmental data systems. To minimize energy consumption, an energy cost-to-go map of unstructured environments can be coupled with a model predictive controller that uses higher-fidelity models to capture important aspects of off-road energy consumption, including terramechanics, altitude variations, and the dynamics of the vehicle [40]. In addition, an index of wheel mobility performance can be extended to a vehicle mobility performance index, and a combination of the parameters could be used to describe the vehicle's technical productivity and efficiency, enabling the estimation and control of vehicle mobility performance as well as facilitating the design of autonomous control systems [41]. Recent review articles have discussed some opportunities and challenges for AATVs and analyzed their functional subsystems. Research has been summarized by application type and by performance measures for evaluating AATVs, and a large number of examples of AATV applications were presented. In comparison to previous reviews, this article aims to: (a) highlight the distinctive issues, requirements, and challenges in a more structured format, (b) describe existing approaches to implementing these functions in AATVs as well as their relationship with other approaches, (c) provide more detailed comparisons of different methods and approaches, and (d) discuss potential future strategies for overcoming the limitations of these approaches.

2. Objective

In this paper, we present a structured and comprehensive review and analysis of the studies dedicated to AATVs and their design and performance characteristics, based on papers published in peer-reviewed journals, from four cornerstones: (1) control strategies and control algorithms; (2) sensors; (3) data communication; and (4) controllers and actuators (hardware implementation). The flowchart shown in Figure 4 presents the overall logical structure of the paper.
The key objectives of this article are to (1) illustrate fundamental aspects of how AATVs operate in agricultural environments, (2) provide a comparative analysis of the commonly employed control algorithms, sensors, controllers, actuators, and communication tools in AATVs by highlighting their key advantages and drawbacks, (3) elaborate common challenges and technical and technological constraints in developing and building AATVs, and (4) underline some future development trends for AATV designs. Figure 5 shows a schematic of an AATV and its features.

3. Models and Control Algorithms

In an uncertain or contested environment, autonomous control systems (ACS) enable the self-governance of vehicle control functions with little to no human intervention. They are developed using model-based engineering, artificial intelligence (AI), machine learning (ML), and data acquisition. An AATV can be controlled by three elements: (1) actuators and sensors, which serve as a reliable base for the ATV and through which digital signals are used to control the handlebars of the ATV in the forward direction; (2) speed feedback controls; and (3) brake controls. Regarding the steering control system, an AI model with an image processing system using real-time images can be used. Moreover, with the help of a trained machine learning model, it is possible to pass the current position of the ATV and the expected trajectory to the CAN bus, which is used to steer the ATV by setting the handlebar angle expected from the predicted trajectory [43]. Using different types of control strategies is another aspect of ATVs' control systems. There are different types of control systems: PID, adaptive, open-loop, and closed-loop controls. PID controllers, encoders, and DC motors control the steering systems of autonomous vehicles. Encoders and PID controllers are used to control systems based on the required conditions. As the steering wheel turns, the encoder generates pulses, which are sent to the DC motor attached to the front axle that turns the vehicle. Using feedback from the error value, it is possible to correct the required navigation parameters with a PID controller. A PID controller has three built-in terms (proportional, integral, and derivative) to control the error. PID controllers are popular because they are easy to tune and implement.
Open-loop and closed-loop controls are each suited to different situations. In open-loop control systems, there is no feedback or error handling. Despite its simplicity and economy, open-loop control cannot be optimized, although such systems are easier to maintain. Closed-loop control systems handle feedback and errors; feedback is the key difference of closed-loop control. The advantages of closed-loop controls include automatic correction of disturbances, the ability to maintain a set point, and the ability to stabilize unstable processes. A simple process whose output rarely changes merits open-loop control. On the other hand, vehicle guidance systems have recently been a topic of renewed interest due to rapid advances in electronics, computers, and computing technologies. Many different types of guidance technologies have been investigated, such as mechanical guidance, radio navigation, optical guidance, and ultrasonic guidance, among others [44,45]. The advent of autonomous navigation systems for agricultural vehicles represents a significant advance in PAG and a promising alternative to a rapidly declining farm labor force, as well as meeting the need for increased production efficiency and safety [44,46].

3.1. Vehicle Motion Models

3.1.1. Kinematic Model

Geometric relationships within the system are described by a kinematic model. Based on state-space representation, it describes the relationship between input (control) parameters and system behavior. Using first-order differential equations, a kinematic model describes system velocities. It is usually sufficient to use kinematic models in wheeled mobile robotic systems to design locomotion strategies; however, dynamic models are necessary for other systems [47]. Kinematic models can be classified into several types: internal kinematics, external kinematics, direct kinematics, and inverse kinematics. In internal kinematics, variables within a system are explained in relation to each other, such as wheel rotation and robot motion. External kinematics describes the position and orientation of a robot with respect to some reference coordinate frame. In direct kinematics, robot states are defined by their inputs, such as wheel speed and wheel steering. Motion planning can be designed using inverse kinematics, which means that inputs for a desired robot state sequence can be calculated [47]. ATV kinematics is concerned with the modeling of the horizontal motion of the vehicle. Figure 6 shows a schematic diagram of an autonomous tractor [48]. The linear velocities $\dot{x}_R$ and $\dot{y}_R$ and the angular velocity $\dot{\psi}_R$ at the rear axle of the tractor (point R) are found from the kinematic Equations (1)–(3):
$\dot{x}_R = v_x \cos\psi$ (1)
$\dot{y}_R = v_x \sin\psi$ (2)
$\dot{\psi}_R = \dfrac{v_x \tan\delta}{L}$ (3)
The variables $v_x$, $\psi$, $\delta$, and $L$ represent the longitudinal velocity, the yaw angle, the steering angle, and the distance between the front axle and the rear axle of the tractor, respectively. When the center of gravity (CG) is considered, the linear velocities $\dot{x}$ and $\dot{y}$ are projected onto the CG as given in Equations (4) and (5).
$\dot{x} = v_x \cos\psi - v_y \sin\psi$ (4)
$\dot{y} = v_x \sin\psi + v_y \cos\psi$ (5)
In this case, $v_y$ is the tractor's lateral velocity at the center of gravity.
Despite the simplicity of kinematic models, researchers have used them to quantify lateral errors without considering vehicle dynamics [49,50,51]. Due to sliding, deformed tires, or changes in wheel-ground contact conditions, pure rolling constraints are almost impossible to meet when performing agricultural tasks. In order to provide accurate guidance, improved kinematic models have been developed that can be adapted to consider tire slippage aspects [49,50,51,52,53,54].
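To make the use of Equations (1)–(3) concrete, a minimal forward-Euler simulation sketch is given below. It is illustrative only: the wheelbase, speed, steering angle, and step size are assumed values, not parameters from the cited studies.

```python
import math

def kinematic_bicycle_step(x, y, psi, v_x, delta, L, dt):
    """Advance the rear-axle kinematic model (Equations (1)-(3)) by one Euler step.

    x, y  : rear-axle position [m]
    psi   : yaw angle [rad]
    v_x   : longitudinal velocity [m/s]
    delta : steering angle [rad]
    L     : wheelbase (front-to-rear axle distance) [m]
    dt    : integration step [s]
    """
    x_dot = v_x * math.cos(psi)              # Equation (1)
    y_dot = v_x * math.sin(psi)              # Equation (2)
    psi_dot = v_x * math.tan(delta) / L      # Equation (3)
    return x + x_dot * dt, y + y_dot * dt, psi + psi_dot * dt

# Example: a tractor (assumed wheelbase 2.5 m) driving at 2 m/s with a
# constant 5-degree steering angle traces an arc.
x, y, psi = 0.0, 0.0, 0.0
for _ in range(500):
    x, y, psi = kinematic_bicycle_step(x, y, psi, v_x=2.0,
                                       delta=math.radians(5.0), L=2.5, dt=0.01)
print(f"Position after 5 s: x={x:.2f} m, y={y:.2f} m, heading={math.degrees(psi):.1f} deg")
```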

3.1.2. Dynamic Model

A dynamic model describes how a system moves when forces are applied to it. In these models, forces, energies, mass, inertia, and velocity parameters are taken into account, and second-order differential equations describe the system's behavior. Developing fully autonomous vehicles that are capable of reliably navigating and overcoming obstacles is a major challenge. It is crucial to predict vehicle dynamics and behavior in real-time based on soil properties and obstacles. In addition, ground and environmental data systems can be integrated into the vehicle dynamics model, which can classify terrain based on soil mechanics and topography. Predicting an optimal driving path requires a high-fidelity vehicle dynamics model. For analyzing the dynamics of vehicles, Newton's second law of motion is commonly used [55,56]. In order to perform well in various operational and environmental conditions, all-terrain vehicles need to be more flexible and adaptable. As a result, it is necessary to better understand the stability and dynamic response of the ATV under high nonlinearity and complex loading conditions. In order to achieve high autonomy, an ATV must be able to operate in any natural environment. ATVs are advantageous in poor terrain situations due to their good traversability and their capability to operate in unsafe conditions. However, ATVs are characterized by low stability margins, roll-over risks, and excessive side slippage due to dynamic constraints [57].
A mode is a natural vibration characteristic of a mechanical system, and each mode has a particular natural frequency, damping ratio, and mode shape. In ATV design, modal analysis is capable of identifying weak links in the components of the mechanism during the movement process, enhancing the stiffness of the structure, and providing a reference for improving the design. Figure 7 shows a dynamic model of an autonomous tractor.
For modal analysis, the vertical motion of the system can be modeled as a spring-mass-damper system [59]. Because of the tractor's limited driving speed, the lateral forces on the left and right wheels can be assumed to be equivalent and can be summed; as a result, the tractor is modeled in 2D as a bicycle system. Figure 8 schematically illustrates the velocities, sideslip angles, and forces acting on the rigid body of an autonomous tractor.
Dynamic motion models have been successfully studied in recent years. On the basis of an established nonlinear dynamic model, Alipour developed a robust sliding-mode trajectory-tracking controller that accounts for the lateral and longitudinal slips of the wheels [60]. In another study, several critical factors, such as chassis kinematics, chassis dynamics, the interaction between the wheel and the ground, and the wheel dynamics, were included in Liao's integrated dynamic model, and model-based coordinated adaptive robust controllers with a three-level design for the robot dynamics were developed [61].
For agricultural vehicle navigation, dynamic models are relatively complex since all vehicle characteristics (inertia, sliding, and springing) must be described. It is difficult to determine most of these parameter values (mass, contact conditions between wheel and ground, tire and wheel deformation) even with experimental identification. Researchers are interested in studying how agricultural vehicles handle dynamic tasks [62,63].
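For reference, a commonly used planar (bicycle) form of the lateral dynamic equations, consistent with the quantities illustrated in Figure 8, is sketched below. This is a generic textbook formulation under small-angle and linear-tire assumptions, not the specific model of the cited studies:
$m(\dot{v}_y + v_x \dot{\psi}) = F_{yf} + F_{yr}$
$I_z \ddot{\psi} = l_f F_{yf} - l_r F_{yr}$
where $m$ is the vehicle mass, $I_z$ is the yaw moment of inertia, $l_f$ and $l_r$ are the distances from the CG to the front and rear axles, and $F_{yf}$ and $F_{yr}$ are the lateral tire forces, often approximated as $F_{yf} \approx C_f \alpha_f$ and $F_{yr} \approx C_r \alpha_r$ with cornering stiffnesses $C_f$, $C_r$ and tire sideslip angles $\alpha_f$, $\alpha_r$.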

3.1.3. Mathematical Modeling

Because ATVs have many nonlinear subsystems, their behavior is always nonlinear. The design of appropriate controllers can be aided by mathematical modeling, and the dynamic motion of an ATV is usually described through mathematical modeling. The throttle of the engine, the left and right braking forces, and the turning rate are just a few of the variables controlled by an AATV. The last three decades have seen a variety of mathematical models developed for AATVs, but none have considered all types of nonlinearity. A lot of progress has been made in the development of both human-controlled (manned or teleoperated) vehicles and automatic guided vehicles; however, there is still a fundamental difference: compared to the relative rigidity of autonomous controllers, humans can diagnose and adapt to changing or unexpected operating conditions [64]. A UGATV/AATV recognizes its environment and performs missions autonomously without human intervention [65]. A UGATV's/AATV's nonlinear mathematical model is crucial for the development of control systems and for ensuring good driving performance. In order to facilitate controller design and provide computationally efficient simulations, nonlinear mathematical models should be as simple as possible [66]. To design a controller, it is important to know how UGATVs/AATVs interact with terrain, which requires nonlinear mathematical modeling [67]. Nonlinear aspects of AATV operations were modeled, and four different transient conditions, namely increasing mass, change of road slope, and sudden right and left braking, were tested [68]. The controller results were compared, and the appropriate controller was selected based on the comparison [68].

3.2. Logic and Control Systems

A number of logic and control systems can be used in agricultural fields to control lateral and longitudinal errors, owing to the nonlinear behavior of autonomous navigation; for example, neural networks, fuzzy logic, PID controllers, FPIDs, or MPCs could be used. An orchard speed sprayer can be operated autonomously using a fuzzy logic controller [69]. To maintain drive-path accuracy, model predictive control can be employed to account for slippage and pseudo-slippage of agricultural vehicles on slippery terrains [53].

3.2.1. PID

PID controllers are used to correct deviations between the reference and measurement feedback within an acceptable period of time. In order to do so, it increases or decreases the output of the process by using three main parameters known as gains (proportional gain, integral gain, and derivative gain), which can accelerate, delay, and stabilize this correction (Figure 9).
To guide a tractor through crop rows, Kodagoda combined a feed-forward controller with a fuzzy proportional-derivative/proportional-integral controller [70]. They showed that their controller was insensitive to parameter uncertainty, load, and parameter fluctuations, and it could be implemented in real-time. To develop a closed-loop controller, Cortner used a PID implementation in a microcontroller developed for the steering system of an AATV; the PID gains for the controller were P = (3/4), I = (1/2048), and D = 4 [71]. Eski and Kus used a PID controller to control a UAV; the results showed an overshoot of 1 cm in 0.00001 s, after which the unit step signal was tracked with no overshoot [72]. Because the autonomous vehicle modifies its speed according to the estimated size and type of the reference trajectory curves, Hossain used PID controllers to maintain an optimal speed while following the given trajectory [73].
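As an illustration of the structure described above, a minimal discrete-time PID sketch for steering correction follows. The gains, sample time, and the crude one-line plant stand-in are placeholders, not the tuned values reported in [70,71,72,73].

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt, u_min=-1.0, u_max=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.u_min, self.u_max = u_min, u_max    # actuator saturation limits
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                  # integral term
        derivative = (error - self.prev_error) / self.dt  # derivative term
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.u_min, min(self.u_max, u))        # clamp to actuator range

# Example: drive a 0.4 m lateral offset toward zero (hypothetical gains).
pid = PID(kp=1.2, ki=0.05, kd=0.3, dt=0.05, u_min=-0.5, u_max=0.5)
lateral_error = 0.4
for _ in range(100):
    steering_cmd = pid.update(setpoint=0.0, measurement=lateral_error)
    lateral_error += 0.5 * steering_cmd * 0.05   # crude stand-in for the vehicle response
print(f"Remaining lateral error: {lateral_error:.3f} m")
```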

3.2.2. Fuzzy Logic

Fuzzification, fuzzy inference, and defuzzification are the three steps of fuzzy logic control. Fuzzification transforms crisp input values into fuzzy ones. By applying fuzzy IF-THEN rules and logical operations, fuzzy values are mapped to different fuzzy values. Mamdani and Sugeno [74] are the two main types of fuzzy inference systems; Mamdani systems are applied to systems of all types, while Sugeno systems are suited to dynamic nonlinear systems. In the defuzzification process, the linguistic output values from the previous step are aggregated, and once defuzzification is complete, a single crisp value is produced as the outcome (Figure 10).
Using many-valued fuzzy logic (FL), Sumarsono built an ATV control system using GPS data [75]. To obtain useful data for hardware implementation, a fuzzy logic controller (FLC) simulation model was developed. To determine the radius and speed of the vehicle, the FLC used two inputs: the vehicle heading and the offset from the planned path. Expert experience and knowledge formed the FLC's control knowledge base, and the center of gravity for singletons (CoGS) was used for defuzzification [75]. In addition, Sumarsono showed that the accuracy of the FLC depends on the determination of the membership functions [75]; the numbers of membership functions for heading, offset, steering angle, radius, and speed were set to 5, 5, 7, 7, and 3, respectively. Based on a constructed kinematic bicycle model of a tractor and implement, a fuzzy logic-based controller was proposed to automatically steer an implement using hydraulic cylinder actuators to cover crop fields [76]. To navigate in agricultural environments, Bonadies and others [77] applied PID and fuzzy controllers to an unmanned ground vehicle. Agricultural produce and ground were differentiated from an image obtained from the camera, and within the image, the left and right boundaries of the crops and the center of the row were identified (Figure 11). The average percent overshoot for the fuzzy controller was 45.75%. The average velocity was 35.26 rpm, with a velocity standard deviation of 0.32 rpm and an absolute average error of 28.37. They found that the fuzzy logic and PID controllers performed similarly [77]. Using an improved fuzzy logic control method, Yao and others [78] generated optimal steering angles for a wheeled autonomous vehicle to track a sequence of waypoints. In developing the path-tracking algorithm, accuracy, stability, and convergence speed were considered. The proposed IFM provides higher accuracy, stability, and convergence speed when compared with conventional fuzzy logic control [78].
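A minimal Mamdani-style sketch of the fuzzify-infer-defuzzify pipeline is given below. The membership functions, rule base, and universes are invented for illustration and are not those used in [75,76,77,78].

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

# Linguistic terms for the offset input (m): Negative, Zero, Positive.
def offset_sets(x):
    return {"N": tri(x, -1.0, -0.5, 0.0),
            "Z": tri(x, -0.5, 0.0, 0.5),
            "P": tri(x, 0.0, 0.5, 1.0)}

# Output universe: steering angle in degrees, with three terms.
steer = np.linspace(-30, 30, 241)
steer_sets = {"L": tri(steer, -30, -20, 0),   # steer left
              "S": tri(steer, -10, 0, 10),    # straight
              "R": tri(steer, 0, 20, 30)}     # steer right

# Rule base (Mamdani-style): IF offset is Negative THEN steer Right, etc.
rules = {"N": "R", "Z": "S", "P": "L"}

def fuzzy_steering(offset_m):
    """Fuzzify the offset, apply the rules (min implication, max aggregation),
    and defuzzify with the centroid of the aggregated output set."""
    mu_in = offset_sets(np.array([offset_m]))
    aggregated = np.zeros_like(steer)
    for term, strength in mu_in.items():
        clipped = np.minimum(strength[0], steer_sets[rules[term]])  # implication
        aggregated = np.maximum(aggregated, clipped)                # aggregation
    if aggregated.sum() == 0:
        return 0.0
    return float((steer * aggregated).sum() / aggregated.sum())     # centroid

# Vehicle 0.3 m to the right of the path (positive offset) -> negative (left) steering.
print(fuzzy_steering(0.3))
```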

3.2.3. Genetic (Evolutionary) Algorithms

A genetic algorithm (GA) is a global, parallel search and optimization method that uses a population of potential solutions to solve a problem. Within the population, each individual represents a particular solution to the problem, which is usually encoded in genetic code. Over generations, the population evolves to produce better solutions to the problem. GAs are efficient and appropriate optimization methods for control system design [79], for both feedback controllers and feedforward controllers. Figure 12 illustrates the algorithm schematically.
The GA's operational principle is based on natural selection to select better-fit solution candidates and to generate the most-fit solution by using crossover operations and very limited artificial mutation operations. A genetic algorithm is a method of solving optimization and search problems using true or approximate solutions. Inheritance, mutation, selection, and crossover are techniques inspired by biological evolution. GAs were used by Ryerson and Zhang to plan a guided vehicle's optimal path [81]. They found that the total coverage achieved was greater than 90% on four of the eight runs, with an average coverage of 89%. Several combinations of lateral and heading deviations were tested by Ashraf using genetic algorithms [82]. According to their findings for vehicles moving along sloped land, the mean and standard deviation of lateral deviation along contour directions were 0.047 m and 0.039 m, respectively, which are negligible. Shiltagh and Jalal used a modified genetic algorithm (MGA) developed for global path planning, and the application of the MGA to the navigation problem was investigated under the assumption that a model of the environment had been developed [83]. The simulation results demonstrated that this algorithm has great potential to solve path planning with satisfactory results in terms of minimizing distance and execution time; 374.47 s were required to find the shortest generated path with a length of 30.86 m [83]. In another study similar to Shiltagh's research, the genetic algorithm used the grid method to mark the geographical environment information and map the environmental information to the grid [84]. According to the algorithm in that paper, mutation probability had the greatest impact on the effective path ratio: when the mutation probability was 0.09, the effective path ratio was 80.65%. Furthermore, the effective path ratio remained approximately 80%, regardless of how the population size, number of evolutionary generations, and crossover probability were selected.
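A compact sketch of the GA loop (selection, ordered crossover, swap mutation) applied to a toy waypoint-ordering problem is shown below. The waypoints, population size, and operator rates are arbitrary assumptions for illustration, not settings from [81,82,83,84].

```python
import random, math

# Hypothetical field waypoints (x, y) in metres; the GA searches for a visiting
# order that minimizes total travel distance (a simple path-planning surrogate).
WAYPOINTS = [(0, 0), (10, 2), (3, 8), (7, 12), (15, 5), (12, 14), (1, 15)]

def route_length(order):
    return sum(math.dist(WAYPOINTS[order[i]], WAYPOINTS[order[i + 1]])
               for i in range(len(order) - 1))

def crossover(p1, p2):
    """Ordered crossover: copy a slice from parent 1, fill the rest from parent 2."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def mutate(order, rate=0.1):
    """Swap mutation applied with a small probability per individual."""
    order = order[:]
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

def genetic_algorithm(pop_size=60, generations=200):
    pop = [random.sample(range(len(WAYPOINTS)), len(WAYPOINTS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=route_length)              # selection: keep the fittest half
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    best = min(pop, key=route_length)
    return best, route_length(best)

order, length = genetic_algorithm()
print("Best visiting order:", order, f"({length:.1f} m)")
```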

3.2.4. Artificial Neural Network

Typically, neural networks consist of three layers: input, hidden, and output (Figure 13). The hidden layer may comprise one or more layers, depending on the requirements. Adaptations were made to the neural networks to compensate for inclination and magnetism errors. Prior to performing the main maneuvers, it was necessary to train the neural network and the AATV. Because of their hidden layers, one of the major disadvantages of neural networks is their black-box nature in programming. In order to explain the input-output relationship of vehicle motion on sloping land, Torisu developed a neural network (NN) vehicle model rather than a dynamic or kinematic model [85]. In a comparison of experiments and simulations of vehicle motion in slope-land environments, Zhu found that the NN model was suitable for representing the input-output relationship of vehicle motion, and the NN vehicle model was deemed suitable for representing the tractor's motion on sloping terrain for autonomous navigation [76]. Eski and Kus showed that unmanned agricultural vehicles with a model-based neural network PID control system followed the given reference trajectory with minimum errors in formidable road conditions and reduced overshoot to a considerable extent [72]. The backpropagation algorithm is probably the most fundamental component of a neural network: a chain-rule algorithm used to train neural networks effectively, it performs a backward pass through the network after each forward pass while adjusting the model's parameters (weights and biases). Using the backpropagation algorithm, Ashraf developed an NN vehicle model for sloped terrain conditions. In order to generalize the optimal steering for different land slopes, Ashraf developed a model that utilizes an NN-based steering controller. With a prototype test tractor, he conducted autonomous travel tests on sloping lands and found that the tractor could follow predetermined rectangular paths precisely [82]. In his study, the average lateral deviations were only 0.058 m and 0.063 m for the four rectilinear motions, whereas the average heading angles were 2.950 and 1.935, both of which are insignificant for tractor motion even on flat terrain.
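The following sketch illustrates backpropagation on a tiny two-input network trained to imitate a synthetic steering mapping. The data set, network size, and learning rate are invented for illustration and do not reproduce the NN vehicle models in [72,76,82,85].

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (invented): inputs are [lateral offset, heading error], target is a
# normalized steering command. A real controller would use logged vehicle data.
X = rng.uniform(-1, 1, size=(200, 2))
y = np.tanh(-1.5 * X[:, :1] - 0.8 * X[:, 1:])     # synthetic "expert" steering

# One hidden layer with tanh activation.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros((1, 1))
lr = 0.1

for epoch in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y
    # Backward pass (chain rule) for the mean-squared-error loss
    d_out = 2 * err / len(X)
    dW2 = h.T @ d_out;  db2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * (1 - h ** 2)            # derivative of tanh
    dW1 = X.T @ d_h;    db1 = d_h.sum(axis=0, keepdims=True)
    # Gradient-descent update of weights and biases
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("Final MSE:", float((err ** 2).mean()))
```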

3.2.5. MPC

MPC consists of predicting the future behavior of the controlled system over a finite time horizon and computing an optimal control input that minimizes an a priori defined cost functional while satisfying the system constraints. For a more precise calculation of the control input, an optimal open-loop control problem over a finite horizon is solved at each sampling instant. After applying the first part of the optimal input trajectory to the system, the horizon is shifted, and the process is repeated. As a result of its ability to explicitly incorporate a performance criterion and hard state constraints into its design, MPC is particularly successful (Figure 14).
Agricultural systems are multivariate, complex, and unpredictable [86]. Traditional control technologies such as on/off, P, and PID are easy to implement, but they cannot control time-delayed processes [87,88]. In addition, adjustment of the controller takes considerable time and effort [89]. In the past few decades, MPC has been extensively investigated as a promising control strategy [90,91,92,93] and has also been applied in the industrial sector [94]. There are a few alternative versions of the MPC algorithm used in autonomous vehicle control, e.g., a prescribed performance control algorithm proposed to establish accurate tracking control for a tractor trailer [95]. A combine harvester was controlled using constrained MPC and an alternative ASM as part of the cruise control process [96]. A navigation task was completed by Backman using NMPC, requiring sufficient accuracy with lateral errors of at most 10 cm at 12 km/h [97]. It is also possible to use MPC for inland navigation in addition to the navigation of agricultural machines [98]. It is extremely difficult to control path-tracking in agriculture because there are so many complex bodies involved. Using NMHE and rapid distributed nonlinear MPC, Kayacan developed an estimation scheme for the state and parameters of the system [99]. As part of the path-tracking error control of unmanned ground vehicles, a linear MPC was designed, resulting in average Euclidean distance errors of 23.49 cm and 21.21 cm on straight lines and 39.82 cm and 36.21 cm on curved lines; the LMPC calculates in approximately 1.1 ms, which is faster than NMPC [48]. Based on an mp-MIQP technique, Yang presented explicit MPC for the reduction of trailer tracking errors and smoothing of tractor steering angle behavior [100]. MPC outperformed LQC in tracking multivariate systems under constraints, according to Yakub [101]. A high-precision closed-loop tracking method based on LTV-MPC has been proposed by Plessen, with which precision tracking is possible within millimeters [102]. Additionally, MPC can be used to track the path of hydraulic forestry cranes in forests [103,104]. MPC uses a model to predict and optimize the steering angle and velocity of an AV, as well as other features of the process. To overcome the shortcomings of PID, ANN, and other similar controllers, different types of MPC were developed, such as hybrid MPC, robust MPC, adaptive MPC, nonlinear MPC, tube-based MPC, distributed MPC, stochastic MPC, and explicit MPC [105]. The MPC controller can be used for a number of reasons: (1) MPCs are capable of handling MIMO systems in the same way as ANNs, which is useful when inputs and outputs are predicted to interact. A prediction model, a rolling optimization, and feedback adjustments are all involved in MPC; the outputs are controlled simultaneously by a multivariable controller, which takes into account all variables within a system. (2) When AVs violate constraints, there will be unwanted consequences, and such constraints can be handled by MPC. There should be a safe distance between autonomous vehicles and obstacles, and multi-robot systems should follow speed limits; the acceleration limits of AVs are a further constraint. An MPC algorithm should track a desired trajectory while all these constraints are met. In MPC, previewing complements feedback control.
If the controller is unaware that a corner is approaching, the AV can only apply its brakes during cornering, once it is already traveling on the curve or turning at the headland. AVs with safety sensors, such as cameras that provide trajectory information, supply the controller with information about the upcoming corner ahead of time. As a result, the vehicle is able to brake sufficiently to stay on the road safely. Control performance can therefore be improved by incorporating future reference information into the MPC.
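A minimal receding-horizon sketch is shown below: at each step, a finite-horizon cost over the kinematic bicycle model of Section 3.1.1 is optimized subject to a steering bound, and only the first input is applied. The horizon length, weights, wheelbase, and the use of a general-purpose optimizer are assumptions for illustration, not the formulations of [48,97,99,100,101,102].

```python
import numpy as np
from scipy.optimize import minimize

L, DT, N = 2.5, 0.1, 10          # assumed wheelbase [m], step [s], horizon length

def rollout(state, steer_seq, v=2.0):
    """Simulate the kinematic bicycle model over the horizon for a steering sequence."""
    x, y, psi = state
    traj = []
    for d in steer_seq:
        x += v * np.cos(psi) * DT
        y += v * np.sin(psi) * DT
        psi += v * np.tan(d) / L * DT
        traj.append((x, y, psi))
    return np.array(traj)

def mpc_cost(steer_seq, state):
    """Penalize lateral deviation from the reference line y = 0, heading error,
    and steering effort over the prediction horizon."""
    traj = rollout(state, steer_seq)
    return (np.sum(traj[:, 1] ** 2)                 # cross-track error
            + 0.5 * np.sum(traj[:, 2] ** 2)         # heading error
            + 0.1 * np.sum(np.square(steer_seq)))   # control effort

def mpc_step(state, prev_seq):
    """Solve the finite-horizon problem and apply only the first steering input."""
    res = minimize(mpc_cost, prev_seq, args=(state,),
                   bounds=[(-0.5, 0.5)] * N)        # steering constraint [rad]
    return res.x[0], res.x

# Closed loop: start 1 m off the line with a 10-degree heading error.
state, seq = np.array([0.0, 1.0, np.radians(10)]), np.zeros(N)
for _ in range(40):
    delta, seq = mpc_step(state, seq)
    state = rollout(state, [delta])[-1]             # apply first input to the "plant"
print(f"Cross-track error after 4 s: {state[1]:.3f} m")
```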

3.2.6. Kalman Filter

Digital control engineering is one area in which the Kalman filter (KF) is frequently used. Control engineers use the filter to eliminate measurement noise that may interfere with the performance of a system under control. Additionally, it provides an estimate of the current state of the process or system. The KF is used (1) to estimate future position and velocity based on the current position of objects or people; (2) to estimate the state, position, and velocity of a vehicle in navigation systems using the output of an inertial measurement unit (IMU) and a global navigation satellite system (GNSS) receiver; and (3) for feature-tracking or cluster-tracking applications based on computer vision. Figure 15 shows an intuitive explanation of the KF. The KF process has two steps: (1) prediction step: based on previous measurements, the KF predicts the system's next state; (2) update step: the KF calculates the system's current state based on measurements taken at that time.
Multi-sensor data fusion is based on the KF, which offers a solid theoretical framework. The approach relies on tracking the vehicle's location, or the system's state, at all times. INS and GPS need to be integrated using Kalman filters in a highly dynamic system that is likely to experience significant acceleration. When the GPS signal is lost, these integrated systems can provide short-term positioning information that is reliable. A wide range of literature exists on integrating INSs with GPSs and/or other sensors [107,108,109,110]. A KF was applied to DGPS measurements and effectively reduced the RMS positioning error by removing DGPS noise [111]. Hague and Tillett combined image processing with bandpass filtering and an extended KF [112]. A new sigma-point filter has been proposed in order to improve KF performance [113]. Based on simulations, Zhang found that the sigma-point Kalman filter had better numerical robustness and computational efficiency [114]. By considering ATVs' multiple positioning systems and comparing them, Pratama used the extended Kalman filter (EKF) to detect abnormal deviations; based on residue values, their model can detect one fault condition at a time [115]. An innovative, robust CKF with a scaling factor was presented by Gao. The Mahalanobis distance criterion is used to identify abnormal observations, and an increased observation noise covariance is achieved by adding a robust factor (scaling factor) to the standard CKF using this criterion. This results in a decrease in filtering gain when abnormal observations are present. In addition to improving the robustness of the CKF, the proposed solution does not allow abnormal observations to influence navigation solutions [116]. GPS-based navigation was developed that allowed an autonomous ATV to follow a virtual path in the field [32]. The ATV was equipped with an EPS, a PID low-level servo controller, and a high-level controller based on an EKF estimation algorithm. The position data were obtained by an RTK global navigation satellite system and used to precisely turn the EPS motor on the ATV's steering shaft [32].
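A minimal one-dimensional constant-velocity KF sketch, fusing noisy position fixes, is given below to illustrate the predict/update cycle. The process and measurement noise values are assumed, and the example is not tied to any of the integrated INS/GPS systems cited above.

```python
import numpy as np

DT = 0.1
F = np.array([[1, DT], [0, 1]])        # constant-velocity state transition
H = np.array([[1.0, 0.0]])             # only position is measured (e.g., GNSS)
Q = np.diag([0.01, 0.1])               # process noise (assumed)
R = np.array([[1.0]])                  # measurement noise (assumed)

def kf_predict(x, P):
    """Prediction step: propagate the state and its covariance."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z):
    """Update step: correct the prediction with the new measurement z."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)                  # corrected state
    P = (np.eye(len(x)) - K @ H) @ P         # corrected covariance
    return x, P

# Simulated vehicle moving at 2 m/s with noisy position fixes.
rng = np.random.default_rng(1)
x_est, P = np.array([0.0, 0.0]), np.eye(2)
true_pos = 0.0
for _ in range(100):
    true_pos += 2.0 * DT
    z = np.array([true_pos + rng.normal(0, 1.0)])    # noisy GNSS-like fix
    x_est, P = kf_predict(x_est, P)
    x_est, P = kf_update(x_est, P, z)
print(f"True position 20.0 m, estimate {x_est[0]:.2f} m, speed estimate {x_est[1]:.2f} m/s")
```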

3.2.7. Machine Learning: DRL

Reinforcement learning (RL). Machine learning techniques such as reinforcement learning reward and/or punish desired behaviors. There are two types of RL methods: model-free and model-based. A value-based, model-free approach takes a state and an action as inputs and outputs their value; the policy is extracted by selecting the action that maximizes the value function. A model-based approach learns a predictive function that maps the current state and a sequence of actions to the future state; utilizing predicted future states, policies are extracted by selecting actions that maximize future rewards. Model-free algorithms are capable of learning complex tasks [117]; however, they are typically sample-inefficient, as opposed to model-based algorithms, which are sample-efficient but difficult to scale to complex, high-dimensional tasks. Reinforcement learning agents are capable of interpreting their environment, taking actions, and learning through trial and error. RL focuses on the problem of goal-directed agents interacting with uncertain environments, which is crucial for navigating rough terrain in unknown conditions.
Deep reinforcement learning (DRL) has been incredibly successful since its introduction [118]. In particular, DRL has found a niche in vision-based robotic manipulation. It has been shown that robots controlled by neural networks with DRL-trained representations can solve complex tasks even in unstructured environments without requiring imitation learning. DRL-based local navigation of unknown rough terrain was presented by Josef and Degani; in comparison with traditional local planning methods, their method reduced planning time and improved planning success rates, and in a goal-directed task, they used reward shaping to provide a dense reward signal [119]. With Kahn's algorithm, robot navigation policies can be learned in a self-supervised, sample-efficient manner that requires minimal human interaction and performs stably and well [120]. According to Wiberg, deep RL is capable of learning control for rough-terrain vehicles that have continuous, high-dimensional observations and actions in their environment [121].
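As a small, concrete instance of value-based, model-free RL, the sketch below runs tabular Q-learning on a toy grid "field" with one obstacle. The grid, rewards, and hyperparameters are invented for illustration; the cited DRL studies use far richer observations and deep function approximators.

```python
import random

# A 5x5 grid "field": start at (0, 0), goal at (4, 4), one obstacle cell.
SIZE, GOAL, OBSTACLE = 5, (4, 4), (2, 2)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]          # up, down, left, right
Q = {((r, c), a): 0.0 for r in range(SIZE) for c in range(SIZE)
     for a in range(len(ACTIONS))}
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1

def step(state, action):
    """Environment transition: bounded grid, penalty at the obstacle, reward at goal."""
    dr, dc = ACTIONS[action]
    nxt = (min(max(state[0] + dr, 0), SIZE - 1), min(max(state[1] + dc, 0), SIZE - 1))
    if nxt == OBSTACLE:
        return state, -5.0, False       # bump into the obstacle: stay put, penalty
    if nxt == GOAL:
        return nxt, 10.0, True
    return nxt, -0.1, False             # small step cost encourages short paths

for episode in range(2000):
    s, done = (0, 0), False
    while not done:
        a = (random.randrange(4) if random.random() < EPS
             else max(range(4), key=lambda a_: Q[(s, a_)]))      # epsilon-greedy
        s2, r, done = step(s, a)
        best_next = max(Q[(s2, a_)] for a_ in range(4))
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])  # TD update
        s = s2

print("Greedy action from start:", max(range(4), key=lambda a_: Q[((0, 0), a_)]))
```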
A comparison analysis of control algorithms for AATVs is shown in Table 1, which shows the advantages and disadvantages of commonly used control algorithms.

4. Sensors

Sensors are one of the main components of robots. Data from sensors are transmitted to the controlling unit. There are three main types of sensors deployed in AATVs: position sensors that measure the robot’s location, attitude sensors that determine its orientation, and safety sensors that alert in case of an emergency. There are six categories of sensors based on their applications: ground-based beacons, active ranging sensors, heading sensors, tactile sensors, motion/speed sensors, and vision-based sensors [122,123].
Ground-based beacons are used to locate and position AVs [124]. An AV's orientation and attitude are determined by heading sensors. Physical contact is detected by tactile sensors. Active-ranging sensors measure the distance between a sensor and an object through geometric triangulation, reflectivity, or time-of-flight approaches. Motion/speed sensors measure an AV's speed and acceleration relative to fixed or moving objects. Using vision-based sensors, objects can be ranged and images can be analyzed, segmented, and used to recognize obstacles [125,126,127].

4.1. Attitude

In autonomous navigation, the attitude sensor indicates the vehicle's orientation. When tunnels, trees, or buildings prevent the GPS signal from being received, sensor fusion methods can improve positioning accuracy. Agricultural robots previously used GDSs to measure the heading angle (so-called yaw). There are two major problems with these sensors: magnetic fields affect their accuracy, and the inclination of the vehicle can cause errors. FOG sensors then became popular; however, their cost makes them unsuitable for agricultural robots. Vehicle orientation can be indicated most accurately with IMUs. Such sensors are capable of measuring quaternions, headings (pitch, roll, and yaw velocities, as well as indirect pitch, roll, and yaw angles), linear accelerations, and gravity. IMUs contain inertial sensors (gyroscopes and accelerometers), and some also have magnetometers. Sensor fusion combining an IMU and RTK-GPS can provide high-accuracy vehicle guidance. Automated driving relies heavily on slip angle and attitude; an IMU-based method is used to estimate a vehicle's slip angle and attitude based on vehicle dynamics and GNSS [128].

4.2. Navigation Planners

Agricultural vehicles use navigation planners to drive autonomously, which convert position deviations (headings and positions) into steering angles. Navigation planning must incorporate tracking methods as well as sensor information and vehicle motion to guide the vehicle in the desired direction. Two parts are involved in local navigation on unknown, rough terrain. In the first step, the UGV’s/ATV’s kinematic constraints and surface geometry are used to plan a possible path toward the target position. Second, the UGV/ATV must track a selected path while considering the terrain’s interactions with it [129].

4.3. Tracking Methods

In navigation planning, there are four methods to consider: tracking the position, tracking the line, tracking the map, and avoiding obstacles. A guidance line or trajectory is usually used when operating a guidance system; this method typically uses crop rows, swath edges, and tilled or untilled boundaries. A weakening or disappearance of the tracking signal, however, results in a failure of the operation. GPS systems often use map tracking, which is time- and labor-intensive [29].

4.4. Sensor Types

4.4.1. Vision

The machine vision sensor on a vehicle can sense its surroundings and determine the relative position and heading of the vehicle. Two static image sensors can be used as subsystems to control a single main system mounted on a robotic platform, using the triangulation principle as the control method [130]; the robot’s location is determined from the angle to the base line and the predetermined distance between the two subsystems. A common machine vision application is the detection of guidance directions from rows of crops, edges of harvested crops, and soil tillage. Benson developed a guidance combine harvester based on the position of the lateral edge of the cut crop [131]. Marchant and Brivot used the Hough transform for real-time row tracking (sampling rate of 10 Hz); the method tolerated outliers (such as weeds) when their number was low [132]. Søgaard and Olsen’s machine vision guidance method did not require plant segmentation; the position and orientation of rows were determined using weighted linear regression instead of calculating the center of gravity of each row segment in the image [133]. Han segmented crop rows using a k-means clustering algorithm, and this information was then used to guide a tractor. Okamoto developed an automatic guidance device for a weeding cultivator, in which images from a color CCD camera were processed by a computer to calculate the offset between the machine and the target row [134]. The vision guidance system is relatively insensitive to the visual “noise” of weeds. To identify outliers for row guidance, the researchers performed linear regression on three crop-row segments and calculated a cost function comparable to the moment of the best-fit line [135]. Researchers have also combined two monocular field images taken simultaneously by a binocular camera into a three-dimensional (3D) field image using stereovision systems; to reduce the influence of ambient light, 3D images are reconstructed from monocular images of different disparities. Using stereovision, a crop-row tracking navigation system was developed for agricultural machinery; at speeds of up to 3.0 m/s, following both straight and curved rows, the RMS error was 3–5 cm [136]. It is worth mentioning that a speed of up to 3 m/s is very slow and makes such systems impractical to employ in agricultural processes such as seeding, tillage, plowing, or harvesting.

Computational Methods

In an autonomous agricultural vehicle guidance system, the main purpose of a computational method is to detect image features from image processing or to provide basic information by combining sensor data. The Hough transform and the Kalman filter are the two main computational approaches.

Hough Transform

An image can be analyzed using the Hough transform technique to isolate specific features. Initially, the transform was used to identify lines in an image, but it was later extended to identify arbitrary shapes, such as circles and ellipses. The Hough transform has the advantage of being quite robust: it can still find a straight line even when the contributing points are noisy or scattered. Several publications have discussed how the Hough transform can be used to deduce guidance signals from plant structures [132,137,138,139,140,141]. For agricultural machinery guidance, an automated tractor guidance system based on stereovision was developed; to detect crop rows, the algorithm is composed of stereo-image processing, elevation map generation, and navigation point detection [136].
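To make the idea concrete, the following is a minimal sketch of crop-row line detection with the probabilistic Hough transform, assuming OpenCV (cv2) and NumPy are available; the input file name, excess-green threshold, and Hough parameters are illustrative placeholders rather than values from the cited studies.

```python
# Illustrative sketch: approximate crop-row lines via the probabilistic Hough
# transform. "field.png" is a hypothetical top-view field image.
import cv2
import numpy as np

img = cv2.imread("field.png")
# Segment green vegetation with the excess-green index 2G - R - B.
b, g, r = cv2.split(img.astype(np.float32))
exg = 2 * g - r - b
mask = (exg > 20).astype(np.uint8) * 255        # threshold is a placeholder

# Probabilistic Hough transform on the vegetation mask.
lines = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=20)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(img, (x1, y1), (x2, y2), (0, 0, 255), 2)
cv2.imwrite("rows_detected.png", img)
```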

Kalman Filter

For multi-sensor data fusion, the Kalman filter [142] provides a solid theoretical framework. The key to the approach is recursively estimating the vehicle’s location or the system’s state from noisy measurements. GPS receivers often use Kalman filter models to estimate position from raw GPS signals. There is abundant literature on the integration of inertial navigation systems (INS) and/or other sensors with GPS [109,110,143,144].
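As a minimal illustration of the idea, the sketch below fuses noisy GPS position fixes with a constant-velocity motion model in one dimension; the noise covariances and synthetic measurements are assumptions chosen for readability, not parameters from the cited integrations.

```python
# Minimal linear Kalman filter: constant-velocity model updated by noisy
# GPS position fixes (1D for brevity). Noise values are illustrative only.
import numpy as np

dt = 0.1                                  # time step [s]
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                # GPS measures position only
Q = np.diag([0.01, 0.1])                  # process noise covariance (assumed)
R = np.array([[1.0]])                     # GPS measurement noise, (1 m std)^2

x = np.array([[0.0], [0.0]])              # initial state estimate
P = np.eye(2)                             # initial covariance

def kalman_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with GPS measurement z
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [0.9, 2.1, 2.9, 4.2]:            # synthetic GPS readings [m]
    x, P = kalman_step(x, P, np.array([[z]]))
print("estimated position/velocity:", x.ravel())
```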

4.4.2. Position

GPS

As a global guidance sensor, GPS receivers have been widely used since the 1990s [145,146,147,148]. Four of the most important positioning systems are GNSS, GPS, DGPS, and RTK-GPS. Although RTK-GPS, GNSS, and DGPS differ in several respects, they all follow the same general principles. With GPS, latitude and longitude are determined by three satellites and altitude by an additional satellite, ensuring roughly 3 m accuracy [39]. Unlike machine vision guidance systems, GPS is not adversely affected by weed density, shadows, missing plants, or other conditions that degrade vision-based performance. The GPS guidance system can also be programmed to follow curved rows, which is another advantage [149]. The satellites orbiting Earth transmit their locations to ground control stations, which calculate ground position through trilateration, forming a GNSS. The term GNSS encompasses a wide range of satellite-based positioning, navigation, and timing (PNT) systems. PNT plays a critical role in telecommunications, land surveying, precision agriculture, scientific research, and so on. The mobile unit (rover) carries an antenna (or cellphone) to receive GNSS signals. An RTK-GPS can provide an accuracy within about 5 cm. Using GPS for vehicle guidance has three limitations. A major limitation is that GPS cannot provide consistent centimeter-level positioning accuracy in all terrains (for example, on steep hills, among trees, or with interruptions in satellite signals) [29]. Due to the inherent time delay (known as data latency) required for signal processing and location determination, high field speeds can present control system challenges. Finally, for agricultural applications, the cost is high [29]. An RTK-GPS is a system for correcting the position of mobile units based on positional coordinates from a base station. An RTK-GPS guidance system was used to control an autonomous tractor along a curved path [150,151]. An RMS error of 6 cm with a maximum error of 13 cm was recorded while following a sinusoidal path at 6.5 km/h with an amplitude of 2.5 m and a wavelength of 30 m [150,151]. Using RTK-GPS position information as a reference, a number of low-cost GPS receivers were evaluated for their dynamic accuracy; on average, these receivers had a cross-track error of around 1 m [152]. For an automated six-row rice transplanter, an RTK-GPS was used for positioning and FOG sensors for maintaining inclination [109]. In DGPS, atmospheric errors are similar between two receivers that are relatively close to each other. To use DGPS, a GPS receiver must be installed at a precisely known location; this receiver is the base station, or reference station. Satellite signals are used to calculate the position of the base-station receiver, which is then compared with its known location. The second GPS receiver, called the roving receiver, records GPS data to which the computed difference is applied. Corrections can be applied to field data in real time via radio signals or by postprocessing with specialized software after data capture with the roving receiver. In DGPS, a known positional coordinate is thus used to enhance the position estimate, and an accuracy of 10 cm can be achieved [39].
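The trilateration principle mentioned above can be illustrated with a simplified two-dimensional example: given ranges to three transmitters at known coordinates, the receiver position follows from a linearized least-squares solution. This is only a didactic sketch (real GNSS solvers work with 3D pseudoranges and a receiver clock bias); the anchor positions and ranges below are synthetic.

```python
# Illustrative 2D trilateration: estimate a receiver position from ranges to
# three anchors at known coordinates. All numbers are synthetic.
import numpy as np

anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])   # known positions [m]
ranges = np.array([50.0, 80.6, 67.1])                          # measured distances [m]

# Subtract the first range equation from the others to obtain a linear system.
x1, y1 = anchors[0]
A, b = [], []
for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
    A.append([2 * (xi - x1), 2 * (yi - y1)])
    b.append(ranges[0]**2 - ri**2 + xi**2 - x1**2 + yi**2 - y1**2)

pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
print("estimated position:", pos)   # approx. [30, 40] for these numbers
```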
LNAV is an off-the-wire navigation system that uses electromagnetic induction along power cables installed around the field. The strength of the generated magnetic field correlates strongly with the distance from each cable. To use this positioning system, the robot had to be manually taught the boundaries of the field. In large fields, the construction cost can be substantial, but the system performs well in different weather conditions.
DGPS, a TMS, and an IMU were used in the SNAV developed by JAEI [153]; the IMU and TMS are used to improve the update interval of the positioning system. However, the cost, information service, and required reference station limited the use of SNAV. Sanyo Electric Co. manufactured XNAV on behalf of BRAIN-IAM. This positioning sensor used optical measurement: from the reference station, the target installed on the vehicle could be observed, and both the diagonal distance and the horizontal angle from the reference station were measured [39].

Dead-Reckoning Sensors

An inexpensive, reliable dead-reckoning sensor is used for short-distance mobile robots. Using a simple mathematical formula, it determines a vehicle’s current location by advancing its previous position using the estimated speed and heading over a specified time period. Odometry is a simple form of dead reckoning. Because odometry integrates incremental motion information over time, errors accumulate without bound. Despite these limitations, odometry remains important in robot navigation systems [154,155]. Additionally, dead reckoning is widely used by autonomous vehicles in conjunction with other sensors [156].
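A minimal dead-reckoning sketch for a differential-drive platform is given below; the wheel speeds, track width, and time step are illustrative assumptions, and the unbounded error growth discussed above arises precisely because each step integrates the previous, already uncertain, pose.

```python
# Dead-reckoning (odometry) sketch for a differential-drive platform.
# Wheel speeds are synthetic; in practice they would come from encoders.
import math

def dead_reckoning_step(x, y, theta, v_left, v_right, track_width, dt):
    """Advance the pose (x, y, theta) from measured wheel speeds [m/s]."""
    v = (v_left + v_right) / 2.0               # forward speed
    omega = (v_right - v_left) / track_width   # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

x, y, theta = 0.0, 0.0, 0.0
for _ in range(100):                            # 10 s of motion at dt = 0.1 s
    x, y, theta = dead_reckoning_step(x, y, theta, 1.0, 1.1, 1.2, 0.1)
print(f"pose after 10 s: x={x:.2f} m, y={y:.2f} m, heading={math.degrees(theta):.1f} deg")
```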

4.4.3. Laser Sensors

Compared to other sensors, laser sensors offer a wider range and higher resolution. In order for guidance systems to function properly, there should be at least three reflectors (landmarks) around the work area.
Based on the times at which the laser beam returns are detected, the guidance system uses triangulation to determine the vehicle’s location. Although laser-based sensors are insensitive to environmental conditions, they have two major disadvantages. First, they do not work well if artificial landmarks are moved [29]; if natural landmarks are registered during map building and used for navigation, the map must be kept up to date. Second, road unevenness leads to noisy laser measurements. Laser-based sensors are widely used in orchards because the tree canopy frequently blocks satellite microwaves. A system for navigating between rows of trees was developed, achieving an 11 cm lateral error and a 1.5° heading error [137]. A machine vision and laser radar-based autonomous guidance system was developed for citrus grove navigation [157]. A gyroscope, laser rangefinder, and RTK-GPS were incorporated into an autonomous tractor system [158].

4.4.4. Ultrasonic Sensor

The ultrasonic sensor transmits and receives sound waves: it emits a pulse, measures the time until the echo returns, and converts this duration into a distance using the speed of sound (approximately 340 m/s). Ultrasonic sensors are used to detect objects in the vehicle’s surroundings. In an experiment by Chandan and Akhil [159], ultrasonic sensor accuracy varied between 92.20% and 92.88%, whereas LiDAR sensor accuracy varied between 92.55% and 93.33%; the LiDAR sensor was therefore slightly more accurate than the ultrasonic sensor in that study [159]. Using a frequency-modulated ultrasonic sensor, Harper and Mckerrow detected and categorized plants, returning a signal containing geometric information about the plants to improve navigation [160].
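The underlying time-of-flight calculation is simple: the measured echo time is halved to account for the out-and-back path and multiplied by the speed of sound, as in the short sketch below (the echo time is a synthetic example value).

```python
# Time-of-flight distance calculation for an ultrasonic sensor.
SPEED_OF_SOUND = 340.0          # m/s, approximate value used in the text

def ultrasonic_distance(echo_time_s: float) -> float:
    """Return the distance [m] to the reflecting object."""
    return SPEED_OF_SOUND * echo_time_s / 2.0   # halve for the round trip

print(ultrasonic_distance(0.01))   # a 0.01 s round trip -> 1.7 m
```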

4.4.5. Light Sensor (LiDAR)

LiDAR measures the range of a target object using light emitted by the sensor. Since light travels at a constant speed, the sensor emits a light pulse and measures the time it takes for the reflected pulse to return. Sophisticated software processes these data, creates a route, and sends instructions to the actuators that control acceleration, braking, and steering. Modern LiDAR sensors with multiple lasers or channels (8–128) can produce up to 2.2 million points per second [161]. Some LiDAR units have a 360-degree horizontal field of view, producing a dense point cloud representing the surrounding area. LiDAR sensors can also gather multiple returns per light pulse, because several objects, such as the leaves and branches of a tree canopy, may reflect the pulse as it travels from the sensor. By recording this information, LiDAR sensors can provide detailed information about the tree canopy as well as the underlying structure. LiDAR has several advantages, such as quick and precise data gathering; integration with other sensors such as sonar, IMU, and GPS; and 24 h operation thanks to active illumination. As long as a LiDAR is configured properly, it can run largely independently and collect large amounts of data over large areas of land. Among the disadvantages of LiDAR are that it can be extremely expensive depending on the project specifications; it is ineffective in rain, thick cloud, fog, or smoke, or when transparent obstacles are present; analyzing the large amounts of collected data may take time and resources; the laser beams can damage eyes; and penetration of extremely dense materials can be challenging. Autonomous vehicles can detect obstacles quickly and interpret the environment using LiDAR systems combined with video cameras and radar sensors [162]. An online LiDAR localization approach based on range images was presented by Chen; a triangular-mesh-based map representation, combined with range images generated from LiDAR scans, allowed this method to successfully localize autonomous systems [163]. A number of studies have been conducted over the last few years on simultaneous localization and mapping (SLAM) for autonomous vehicles, which keeps track of the vehicle’s location while constructing maps of unknown environments. A framework for handling the dynamics of SLAM problems is proposed in the article “Semantics Aware Dynamic SLAM Based on 3D MODT”, in which the dynamic regions of the scene are handled with visual-LiDAR multi-object detection and tracking (MODT). A dataset developed for LiDAR-based autonomous driving was used to evaluate and compare the framework with state-of-the-art SLAM algorithms; with budgeted computational resources, the proposed dynamic framework can perform SLAM in real time [164].
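As an illustration of how such point clouds are used, the sketch below converts a synthetic 2D LiDAR scan (ranges at known beam angles) into Cartesian points and flags returns inside an assumed safety radius; the scan values, beam count, and threshold are placeholders, not parameters of any cited system.

```python
# Convert a 2D LiDAR scan to Cartesian points and check a simple safety zone.
import numpy as np

angles = np.linspace(-np.pi, np.pi, 360, endpoint=False)   # one beam per degree
ranges = np.full(360, 15.0)                                # open field at 15 m
ranges[170:190] = 2.5                                      # obstacle roughly straight ahead

# Polar -> Cartesian conversion in the sensor frame.
xs = ranges * np.cos(angles)
ys = ranges * np.sin(angles)
points = np.stack([xs, ys], axis=1)

# Simple safety check: any return closer than 3 m triggers a slow-down.
safety_radius = 3.0
too_close = ranges < safety_radius
if np.any(too_close):
    nearest = points[np.argmin(ranges)]
    print(f"{too_close.sum()} returns inside {safety_radius} m, "
          f"nearest at x={nearest[0]:.2f} m, y={nearest[1]:.2f} m")
```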

4.4.6. Radar Sensor

Radar (radio detection and ranging) was developed before World War II. The principle involves emitting electromagnetic waves into an area of interest and receiving the dispersed waves or reflections from targets in order to process the signals and determine the targets’ range. Based on the Doppler property of electromagnetic waves, radar determines the relative speed and position of identified obstacles [165]. The Doppler effect, also known as the Doppler shift, describes the variation in wave frequency caused by relative motion between a wave source and its target; the frequency of the detected signal increases when the target moves toward the radar system. Radar sensors monitor blind spots for distance control and braking assistance using short-range (24 GHz) and long-range (77 GHz) radio waves.
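A short worked example of the Doppler relation for a monostatic radar is given below, using the commonly applied approximation f_d = 2 v_r f_c / c; the carrier frequency and measured shift are illustrative values, not data from a specific sensor.

```python
# Recover relative radial speed from a measured Doppler shift (monostatic radar).
C = 3.0e8                 # speed of light [m/s]

def radial_speed(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Relative radial speed [m/s] from the measured Doppler shift."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# A 77 GHz long-range radar observing a 5.13 kHz Doppler shift:
print(radial_speed(5.13e3, 77e9))   # ~10 m/s closing speed
```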

4.4.7. Inertia Sensor

There are numerous applications for inertial sensors in vehicles [109,141,166]. An inertial sensor measures the vehicle’s own motion. Due to their packaging and seals, inertial sensors can withstand harsh environmental conditions. Accelerometers and gyroscopes are the two common forms of inertial sensors. An accelerometer measures acceleration with respect to an inertial reference frame; this includes gravitational, rotational, and linear accelerations. A gyroscope measures rotation rate independent of a coordinate frame. These sensors can also detect wheel slippage. The problem with this type of sensor is that its position estimate is prone to drift [167]. Inertial sensors are most often combined with GPS or machine vision. Using a vision sensor, a fiber-optic gyroscope (FOG), and RTK-GPS, Zhang and Reid developed an on-field navigation system [168]. Noguchi developed a GPS and inertial measurement unit system for agricultural navigation [169].

4.4.8. Geo-Magnetic Sensor

The GDS detects the earth’s magnetic field and, similar to an electronic compass, can be used as a heading sensor [44]. Electromagnetic interference is one of the limitations of GDS sensors, so the GDS generally supplements other sensors. GPS and GDS were used to guide vehicles along straight or directional lines [49]. By controlling these error sources and combining the GDS with a medium-accuracy GPS (20 cm), the authors managed to track a straight line with an average error of less than 1 cm. Agricultural guidance systems have been explored using the GDS in conjunction with other sensor applications.
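A minimal sketch of using a geomagnetic reading as a heading indication is shown below; it assumes the sensor is level and ignores hard-/soft-iron and inclination effects, which is exactly where the limitations discussed above (interference and vehicle tilt) enter in practice. The field readings are synthetic.

```python
# Heading indication from horizontal magnetic-field components (sensor assumed level).
import math

def heading_deg(mag_x: float, mag_y: float) -> float:
    """Angle [deg] of the horizontal magnetic field relative to the sensor x-axis."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0

print(heading_deg(20.0, 20.0))   # ~45 deg for equal x and y field components
```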

4.4.9. Safety Sensor

The safety sensors on AATVs include laser scanners (2D/3D), cameras (2D, 3D, and OSV), and switches (tap switches, proximity switches, bumper switches, and emergency switches). This type of sensor can either cover a specific safety zone (such as a circle with a diameter of 10 m) or be operated physically. A dynamic stability index can be calculated using angle sensors and accelerometers, which can be used to control protective measures such as automatically deployable ROPS [170]. Field mapping and hazardous location identification have been conducted using static stability indices (based on static stability angles) [171]. Using low-cost sensors and the MEMSIC accelerometer, Nichol detected potential tractor instability and alerted the operator [172]. An accelerometer was incorporated into the sensing device to sense static and dynamic accelerations and calculate the tractor’s angle of rotation. In order to identify an upset vehicle and contact emergency personnel for rescue, a smartphone-based stability sensor was developed [173]. To improve the accuracy of the roll and pitch angles and the roll and pitch rates, a complementary filter was used to combine data from the accelerometer, gyroscope (MPU-6050), and GPS (Skylab SKM53) sensors.
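In the spirit of the complementary filter mentioned above, the sketch below blends an integrated gyro rate with an accelerometer-derived roll angle; the blending gain, sensor readings, and update rate are illustrative assumptions rather than the values used in the cited smartphone-based system.

```python
# Complementary-filter sketch: gyro integration for short-term accuracy,
# accelerometer-derived roll to remove long-term drift.
import math

ALPHA = 0.98   # weight on the integrated gyro estimate (assumed)

def complementary_roll(prev_roll, gyro_roll_rate, acc_y, acc_z, dt):
    """Return the filtered roll angle [rad]."""
    roll_acc = math.atan2(acc_y, acc_z)              # gravity-based roll estimate
    roll_gyro = prev_roll + gyro_roll_rate * dt      # integrated gyro rate
    return ALPHA * roll_gyro + (1.0 - ALPHA) * roll_acc

roll = 0.0
# Synthetic stream: constant 2 deg/s roll rate, accelerometer reading a small tilt.
for _ in range(100):                                 # 1 s of data at 100 Hz
    roll = complementary_roll(roll, math.radians(2.0), 0.3, 9.7, 0.01)
print(f"estimated roll: {math.degrees(roll):.1f} deg")
```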

4.4.10. Power Status Sensors

In the literature, the wide variety of sensors for autonomous vehicles is classified on the basis of parameters, energy, and application [174,175].
First, sensors are classified according to the type of parameters they measure, which may be internal (so-called proprioceptive) or external (so-called exteroceptive) [123,176]. Proprioceptive sensors measure the internal parameters of AVs, such as motor speed and joint angle, or force through load cells. Exteroceptive sensors, such as LiDAR or laser scanners, allow AVs to detect obstacles and measure distances to their surroundings [177]. Exteroceptive sensors are usually used for observing and mapping the environment, autonomous navigation, action estimation, or describing environmental variables. They can serve as safety sensors, positioning sensors, attitude sensors, or any combination of these [39].

Energy

Second, sensors are classified by the direction of energy flow: energy either originates from the sensor (active sensors) or is directed toward it (passive sensors) [123]. Active sensors, such as encoders, laser rangefinders, and ultrasonic sensors, transmit energy into the environment to determine how it reacts. Passive sensors, such as thermal sensors and temperature probes, measure the energy entering the sensor from its surroundings [127].

4.4.11. Sensor Fusion

Sensor fusion combines information from multiple sensing sources to automate navigation operations in various vehicles and environments. Individual sensors perform according to the field conditions during operation, but under certain field conditions multiple fused sensors can provide better results than any individual sensor. An agent for agricultural navigation was developed in which several positioning sensors (DGPS, digital compass, and a dead-reckoning system) and safety sensors (laser rangefinder, bumper, inclinometer, and emergency stop) were integrated into the farming vehicle, along with onboard processors, wireless communication systems, and electrohydraulic actuators; sensor-fusion algorithms were developed to provide continuous and precise positioning without GPS signals [178]. Using sensor fusion, Kulkarni combined an IMU and GPS and mitigated the drawbacks of each, such as slow update rates, accumulating errors, and drift over time [179]. Table 2 shows a comparative analysis of sensors.

5. Data Communication

5.1. Types of Communications

5.1.1. Internal Communication

In a vehicle, internal communication typically takes place between the electronic control unit (ECU) and the controlling personal computer (PC). Early agricultural vehicles did not have an ECU, so communication with a PC was impossible. Such situations normally require direct control of the actuators, where the PC controls the actuators directly; there is no communication system in the direct-control method, and the motors are driven through an I/O port, an AD converter, a DA converter, and a PMC. Later, autonomous vehicles with ECUs were controlled directly by an ECU, which actuated the vehicle based on signals from the PC. Several standards were in use at that time, such as RS232c for slow, low-speed communication and RS422 and RS485 for multidrop serial buses, as well as Ethernet and USB. In contrast to RS232c, which is limited to communication between two devices at a time, 10 to 32 devices can communicate simultaneously using the RS422 and RS485 standards [39].

CAN

Designing an autonomous ATV requires robust electrical signaling protocols for all major subsystems. In 1990, ISOBUS was published as an international communication protocol. The agricultural vehicle communication system is defined by ISO 11783, which is based on SAE standard J1939. The adoption of this standard led to the development of controller area networks (CANs) [180]. Using a bus connection, many ECUs, PCs, IoT devices, and microcontrollers can communicate with each other. Vehicle robotics laboratories first developed ‘direct control’ communication systems, then RS232 communications were further developed, and eventually CAN-bus networks were employed. CAN is an asynchronous serial broadcast protocol. In vehicles, the CAN bus reduces the number of cables running between sensors and controllers. A CAN bus consists of two wires and can transmit data at rates of up to 1 megabit per second. Data are transmitted between the nodes through a half-duplex CAN bus with two wires, CAN L and CAN H (Figure 16). A dominant level corresponds to logic 0 (TTL = 0 V) and a recessive level to logic 1 (TTL = 5 V); in bus arbitration, dominant levels always override recessive levels [181]. It is important to understand that CAN buses are carrier-sense broadcast buses, which allow efficient distributed control with high security and can connect a number of different processors in real time [182,183]. The CAN model contains three processes: the processor queues a message in the queueing window; the processor then checks the status of the CAN bus, and if another message is using the bus at the same time, the queued message waits until the bus is idle; transmission of the queued message occurs at the end of the process [184]. Henderson successfully installed a CAN-bus network on an ATV: a new node was developed for steering control, another node controlled the throttle and brakes, and the steering control node and remote-control functionality were reintegrated into the system [185].
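For illustration, the sketch below sends and receives a CAN frame with the open-source python-can package, assuming a Linux SocketCAN interface named can0; the arbitration ID and payload layout are hypothetical and are not taken from ISO 11783/J1939 message definitions.

```python
# Send and receive a CAN frame with python-can over SocketCAN (interface "can0").
import can

bus = can.interface.Bus(channel="can0", interface="socketcan")

# Transmit an 8-byte frame (e.g., a commanded steering angle, scaled; layout assumed).
msg = can.Message(arbitration_id=0x123,
                  data=[0x01, 0x2C, 0, 0, 0, 0, 0, 0],
                  is_extended_id=False)
bus.send(msg)

# Block for up to 1 s waiting for any frame on the bus.
received = bus.recv(timeout=1.0)
if received is not None:
    print(f"ID=0x{received.arbitration_id:X} data={received.data.hex()}")
bus.shutdown()
```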

ROS

Robot software can be written using ROS, a flexible framework. ROS is not a communication protocol but rather a middleware framework for developing robotic software; it is used to communicate between the different components of a robotic system. The framework consists of tools, libraries, and conventions for building robust and complex robotic behaviors across a wide range of platforms [186]. In essence, it is an open-source meta-operating system for robots. It provides many operating-system-like services, including hardware abstraction, low-level device control, implementations of commonly used functionality, and message passing between processes [186]. A ROS implementation consists of nodes, messages, topics, and services, where nodes are computation processes (Figure 17). There are typically many nodes in a ROS system; its modularity is fine-grained. Nodes communicate by passing messages. Multiple nodes can publish and/or subscribe to a single topic concurrently, and a single node can publish and/or subscribe to multiple topics at once (Figure 8). Using ROS, Rhoades automated an existing RC-controlled ATV; additional sensors were added after the base robot had been tested. ROS makes it possible to deploy advanced localization and mapping methods quickly and easily, although, because ROS commands operate at a high level, they must be enabled by the user [187]. Using an open-source framework, Zhu presented a UAV system with four layers: the robot body, the data service, the cloud service, and the user interaction layer [188].
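A minimal ROS 1 (rospy) node illustrating the publish/subscribe pattern is sketched below; the topic names and the use of a plain Float32 message are hypothetical choices for illustration and are not taken from the cited implementations.

```python
# Minimal rospy node: publish a speed command and subscribe to speed feedback.
import rospy
from std_msgs.msg import Float32

def on_speed_feedback(msg):
    rospy.loginfo("measured speed: %.2f m/s", msg.data)

rospy.init_node("speed_commander")
pub = rospy.Publisher("atv/speed_cmd", Float32, queue_size=10)      # hypothetical topic
rospy.Subscriber("atv/speed_meas", Float32, on_speed_feedback)      # hypothetical topic

rate = rospy.Rate(10)                 # publish at 10 Hz
while not rospy.is_shutdown():
    pub.publish(Float32(data=2.0))    # command 2 m/s
    rate.sleep()
```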

5.1.2. External Communication

An external communication system connects an ATV with a computer, a switch, or another vehicle located far from its control center. AVs can also be controlled from control and monitoring rooms, cloud servers, and user interfaces, as well as from separate vehicles such as tractors, combine harvesters, and drones; this requires external communication. Communication can be one-way, two-way, or master-slave. Based on the system application and research target, Wi-Fi routers, cell phones, Bluetooth devices, or PDAs may be used for external communication.
There are three types of connectivity: wired, wireless, and hybrid wired-wireless. Wireless technology predominates and is mostly used for external communication [189]. External communication technologies include Wi-Fi, ZigBee, LoRa, RFID, mobile (cellular) communications, Bluetooth, and so on. A number of different devices, including computers, smartphones, and IoT devices, can be connected through them. An overview of existing wireless communication technologies can be found in Table 3.
A low-power wide-area network (LPWAN) enables long-range data transmission, up to a few tens of kilometers, at a low data rate. LoRa is the most widely used LPWAN radio modulation technology and is combined with the LoRaWAN standard, which defines how LoRa devices communicate with gateways [189]. Although developed for data transmission, recent research has shown that LoRa can also be used to estimate the location of transmitting devices: the location of the signal source can be determined by estimating the distances between the LoRa node and multiple spatially separated gateways [194]. Locations can be determined with an accuracy of tens of meters, but such results are often obtained under near-ideal conditions, such as line-of-sight communication. LoRa localization is less effective in real-life conditions because the estimated distance between a node and a gateway varies greatly with the node’s location and the radio channel attenuation [194]. Rus developed a vehicle location and positioning system using LoRa and LiDAR, laying a foundation for safe and reliable autonomous vehicles [25].
Other technologies, such as WiMAX, Sigfox, and Narrowband Internet of Things (NB-IoT), can also be used for communication. Both LoRa and NB-IoT are promising technologies, although they differ in a number of ways [195]. LoRaWAN is described in this paper on the basis of its key characteristics.

Cloud

Cloud-based technologies, communications, and networking innovations have the potential to revolutionize the driving experience by transforming the ATV into a fully connected device, and communication will be a key technology in ATVs. Using the cloud, vehicles can benefit from the experiences of other vehicles, download their data into freely accessible maps in real time, and transmit danger warnings to their surroundings (Figure 18). A stable, reliable network that can handle real-time communication between the devices and the cloud infrastructure is needed to realize a cloud-controlled robotic system [196]. In addition to easing maintenance, moving the control logic to the cloud improves the system’s resilience to software and hardware failures. In 2012, Gerla discussed the design principles, issues, and potential applications of vehicular cloud computing (VCC) [197]. In the same year, Kumar proposed a cloud-based Octree design to help autonomous vehicles plan their trajectories [198]. Figure 18 illustrates the high-level system architecture.
Autonomous vehicles need access to vast amounts of data, such as sensor network data, maps, images, videos, weather forecasts, programs, and algorithms. Cloud infrastructures provide virtually unlimited, elastic, on-demand storage on cloud servers capable of holding large amounts of big data and supporting intensive computations [200].

6. Control Units and Actuators

6.1. Control Units

Control units are always used to control the autonomous navigation of an AATV. Control is carried out by a computer with multiple connections, and various languages, including C# and C, are used to write the controlling program. A direct control system was used for robot navigation when vehicles did not have ECUs; PCs connected to the actuators and sensors via AD, DA, or PMC boards were usually used for this type of control system. Over time, microcontrollers and single-board computers, such as Arduino, Raspberry Pi, and Renesas devices, became increasingly common in control units. Such a control unit allows a computer program to control an actuator based on signals received from sensors; the steering, throttle, and braking are controlled by the control unit (microcontroller). Vehicles without ECUs may benefit from this type of control system, and microcontrollers can be used as ECUs for such vehicles. An ECU-equipped vehicle can be controlled by a computer connected to the ECU. As opposed to a direct control system, which uses multiple control boards, the ECU is connected only to the PC: the PC only sends commands to the ECU, and the ECU controls the movements. Manufacturers of ECUs provide an API library for developing control algorithms using predetermined standard codes. Vehicle sensors and actuators are connected to the ECU, while external sensors and communication systems are connected to the PC. A CAN bus has been developed to replace RS-232c as the mechanism for communication between the PC and the ECU [39]. A CAN-bus communication system allows the addition of different ECUs for different units, and implements and external functions can be connected to an external ECU.

6.2. Actuators

6.2.1. Steering

A steering wheel converts the driver’s turning movements into a change in the steering angle of the wheels. Different types of steering (Table 4) can be used in ATVs: rack-and-pinion steering, hydraulic power steering (HPS), parameterizable hydraulic power steering (PHPS), electrohydraulic power steering (EHPS), and electromechanical power steering (EMPS). Electric power steering (EPS) offers the advantages of light weight, high efficiency, and reduced energy consumption, and it is gradually replacing hydraulic power steering [201]. Direct-current brush motors and surface permanent magnet synchronous motors (SPMSMs) are the two most common power-assist motors used in EPS [202]. In general, brushed DC motors have a short service life due to their structural design, whereas SPMSM motors are characterized by small size, high power density, and a large torque-to-inertia ratio [203,204] and can provide smoother electric assist torque [205]. In automatic or autonomous navigation, steering controllers must be able to generate appropriate steering action based on several parameters, such as the equipment operation state, vehicle speed, tire cornering stiffness, surface conditions, and other parameters important to steering dynamics; agricultural vehicles often face even and uneven terrain or changing and unpredictable surfaces such as asphalt or spongy topsoil. Agricultural vehicle steering controller design is therefore complex. Guidance systems have been developed and implemented using PID, feed-forward PID, and fuzzy logic steering controllers [168,206]. A hydraulic steering system is used in most modern agricultural vehicles, and recent advances in autonomous steering controllers incorporate modifications to existing hydraulic systems to account for vehicle dynamics, such as terrain conditions and vehicle speed (and acceleration). AVs have a control algorithm (steering controller) that makes the steering decisions [207]. Using this algorithm, a kinematic and dynamic model is developed that controls the lateral and longitudinal errors of the vehicle and guides it along the desired path; an optimal steering controller with curved-path guidance achieved good results [151]. Based on the kinematic model, the hydraulic actuator and front wheels are gain-related through the steering linkage geometry [51]. The autonomous steering system drives the steered wheels to follow the desired steering angle calculated by the path-tracking system [208]. PID controllers and sliding-mode controllers (SMCs) are widely used to control the steered wheels [209]. Because the state feedback and control signals in SMC are discontinuous functions, they are insensitive to parametric uncertainties and external disturbances [210]; as a result, SMC performs well in nonlinear systems.
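As a simple example of the PID steering controllers cited above, the sketch below drives the cross-track error toward zero with a saturated steering-angle command; the gains and steering limit are illustrative assumptions, not values from the referenced controllers.

```python
# PID steering-controller sketch: cross-track error in, steering angle out.
class PIDSteering:
    def __init__(self, kp=0.8, ki=0.05, kd=0.2, max_angle_rad=0.52):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.max_angle = max_angle_rad        # ~30 deg steering limit (assumed)
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, cross_track_error, dt):
        """Return a saturated steering-angle command [rad]."""
        self.integral += cross_track_error * dt
        derivative = (cross_track_error - self.prev_error) / dt
        self.prev_error = cross_track_error
        u = (self.kp * cross_track_error
             + self.ki * self.integral
             + self.kd * derivative)
        return max(-self.max_angle, min(self.max_angle, u))

controller = PIDSteering()
print(controller.update(cross_track_error=0.4, dt=0.1))   # lateral offset of 0.4 m
```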

6.2.2. Speed Control

In an internal combustion engine-based vehicle, the throttle angle is the primary input to the speed controller. The throttle angle refers to the position of the throttle plate in the intake manifold. When the throttle opening is close to 0°, the throttle plate restricts nearly all airflow into the engine; as the throttle angle increases, the plate becomes less restrictive and allows more air to pass through, allowing the engine to produce more torque. Pneumatic actuators are most commonly used to adjust the throttle angle, while stepping motors mounted directly on the throttle plate are generally used when a faster and more accurate throttle response is required. A fuzzy speed controller was developed for throttle-regulated internal combustion engines on ATVs; results showed smooth throttle movement, robustness in varying terrain, and commanded speeds ranging from 2 to 30 mph [211]. Based on fuzzy-neural-network algorithms, Alvarado and García updated the vehicle’s velocity according to the terrain roughness so that the vehicle navigates as fast as possible while remaining safe [212]. A proportional and internal model controller was proposed for AGV longitudinal speed tracking control [213]. MPC-based longitudinal velocity tracking is robust to model uncertainty and external disturbances, yielding a faster response and less overshoot than PI control [214]. An SMC controller with an input-output linearization method was presented by Cao [215]. A new velocity control strategy for collision avoidance uses collision prediction and a velocity generator to adjust the velocity of autonomous agricultural vehicles depending on the movement state, the degree of danger of obstacles, and the distance between the obstacles and the vehicle [216]. An integrated cloud model was used to implement the velocity generator; the algorithm met the real-time requirement with an average processing time of 0.2 s [217].
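For comparison with the fuzzy, MPC, and SMC approaches discussed above, the sketch below shows a plain PI throttle controller tracking a speed setpoint against a crude first-order vehicle response; all gains and model constants are illustrative assumptions.

```python
# PI throttle controller tracking a speed setpoint against a crude first-order model.
def pi_throttle(speed_error, integral, kp=0.15, ki=0.05, dt=0.1):
    """Return (throttle command in [0, 1], updated integral term)."""
    integral += speed_error * dt
    throttle = kp * speed_error + ki * integral
    return max(0.0, min(1.0, throttle)), integral

target, speed, integral = 5.0, 0.0, 0.0     # setpoint and state [m/s]
for _ in range(200):                        # 20 s simulation at dt = 0.1 s
    throttle, integral = pi_throttle(target - speed, integral)
    # Crude first-order response: acceleration proportional to throttle minus drag.
    speed += (3.0 * throttle - 0.1 * speed) * 0.1
print(f"speed after 20 s: {speed:.2f} m/s")
```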

6.2.3. Brake Control

The brakes are designed so that the vehicle can be stopped in the shortest possible time and distance, and ATV safety depends heavily on them. To maintain steering control under heavy braking and to use the full deceleration capacity of all four tires, the system must deliver an optimal braking force balance between the front and rear wheels. During braking, kinetic energy is converted into heat, slowing the vehicle: the caliper pads apply a clamping force to the brake rotors, and the resulting friction between rotor and pads produces large amounts of heat that must be dissipated. Figure 19 shows the different types of brakes used in ATVs, and Table 5 presents their main parts, advantages, and disadvantages.

6.3. Operation Control Consistency

Two types of control methods appear in the literature. In the first group, a path controller is developed to track position and trajectory commands based on time [217,218]. Even though several experiments have been conducted [217,219], validating the performance of these control methods is not trivial, since, in practice, it is difficult or meaningless to command the AGV/ATV appropriately in all situations using time-dependent position trajectory commands. In the second approach, the path controller is designed to follow velocity and heading angle commands [220,221]. In contrast to time-dependent commands, velocity and heading angle commands can be generated simply and intuitively. It is particularly advantageous to develop velocity and heading angle controllers for steering-type ground vehicles separately, because the dynamic model can be divided into longitudinal (velocity) and lateral (yaw motion) subsystems. PID controllers have been used to follow heading reference commands [222]. To control the lateral motion, a robust control method with continuous control input was developed [223], and time-varying nonlinear lateral vehicle models were controlled using robust gain scheduling [224]. In addition, a lateral motion controller was developed to deal with steering systems that exhibit backlash-type hysteresis [224]. Since Ohnishi proposed the disturbance observer (DOB) in 1983, it has been one of the most widely used robust control tools. In DOB-based robust control, robustness is achieved simply by feeding back disturbance estimates computed from the identified dynamics and the measurable states of the plant. Because rough terrain involves many uncertainties, the robustness of the path control system is essential, and Shin proposed a DOB-based control method to improve this robustness [21].

7. Conclusions

There have been significant developments in AATVs used in agriculture in recent years, thanks to advancements in wireless and remote communication, fast data processors, electronic sensors, and computer vision, as well as broader AI applications. New techniques and methods have been proposed and verified for sensor data analysis. Some ideas are in the early research stages, while others are mature for industrial application. However, fully autonomous control capabilities in AATVs are limited and require further study in terra-mechanics, uncertainty identification, faster decision-making, and wireless communication.
Most developments are on a research level, with practical limitations on farmlands due to terrain and weather conditions. A major drawback is their low speed, approximately 3 m/s, making them impractical for farming operations. Despite this, there is a growing demand for AATVs, driven by broader AI applications and electromechanical systems.
Future studies will focus on energy-efficient powertrains, steering, and robust control for AATV designs capable of reaching speeds of around 7 m/s on soft soil terrains. Improving tracking robustness and studying dynamic models, especially terra-mechanics, combined with AI, will be crucial. Deep RL is expected to play a key role in vehicle control, particularly in high-dimensional state spaces. Additionally, future work will involve repeatable navigation and rollover simulations during field tests with accurate steering control systems. Testing AATVs in real environments using 5G connectivity and Edge Cloud environments will also be explored.

Author Contributions

Conceptualization, S.E.; methodology, S.E.; investigation, H.E. and S.E.; writing—original draft preparation, H.E.; writing—review and editing, S.E.; visualization, H.E.; supervision, S.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. GMI. Global Market Insights. 2021. Available online: https://www.gminsights.com/industry-analysis/autonomous-farm-equipment-market (accessed on 13 March 2023).
  2. Singer, C.R. Agricultural Worker Shortage Could Rise to 114,000. 2021. Available online: https://www.immigration.ca/agricultural-worker-shortage-rise-114000/?nowprocket=1 (accessed on 13 March 2023).
  3. Future of Farming: Driverless Tractors, ag Robots. 2016. Available online: https://www.cnbc.com/2016/09/16/future-of-farming-driverless-tractors-ag-robots.html (accessed on 13 March 2023).
  4. McBain-Rigg, K.E.; Franklin, R.C.; McDonald, G.C.; Knight, S.M. Why Quad Bike Safety is a Wicked Problem: An Exploratory Study of Attitudes, Perceptions, and Occupational Use of Quad Bikes in Northern Queensland, Australia. J. Agric. Saf. Health 2014, 20, 33–50. [Google Scholar] [CrossRef] [PubMed]
  5. Patel, D.; Gandhi, M.; Shankaranarayanan, H.; Darji, A.D. Design of an Autonomous Agriculture Robot for Real-Time Weed Detection Using CNN. In Advances in VLSI and Embedded Systems; Lecture Notes in Electrical Engineering; Darji, A.D., Joshi, D., Joshi, A., Sheriff, R., Eds.; Springer: Singapore, 2023; Volume 962. [Google Scholar] [CrossRef]
  6. Bochtis, D.D.; Sørensen, C.G.; Busato, P. Advances in agricultural machinery management: A review. Biosyst. Eng. 2014, 126, 69–81. [Google Scholar] [CrossRef]
  7. Ayers, P.; Conger, J.B.; Comer, R.; Troutt, P. Stability Analysis of Agricultural Off-Road Vehicles. J. Agric. Saf. Health 2018, 24, 167–182. [Google Scholar] [CrossRef] [PubMed]
  8. Mcintosh, A.S.; Patton, D.A.; Rechnitzer, G.; Grzebieta, R. Injury mechanisms in fatal Australian quad bike incidents. Traffic Inj. Prev. 2016, 17, 386–390. [Google Scholar] [CrossRef] [PubMed]
  9. Abid, A.; Khan, M.T.; Iqbal, J. A review on fault detection and diagnosis techniques: Basics and beyond. Artif. Intell. Rev. 2021, 54, 3639–3664. [Google Scholar] [CrossRef]
  10. Gültekin, Ö.; Cinar, E.; Özkan, K.; Yazıcı, A. Multisensory data fusion-based deep learning approach for fault diagnosis of an industrial autonomous transfer vehicle. Expert Syst. Appl. 2022, 200, 117055. [Google Scholar] [CrossRef]
  11. Zellner, J.; Kebschull, S. Full-Scale Dynamic Overturn Tests of an ATV with and without a “Quadbar” CPD Using an Injury-Monitoring Dummy; Dynamic Research Inc.: Torrance, CA, USA, 2015. [Google Scholar]
  12. Kanchwala, H.; Chatterjee, A. ADAMS model validation for an all-terrain vehicle using test track data. Adv. Mech. Eng. 2019, 11. [Google Scholar] [CrossRef]
  13. Aras, M.; Zambri, M.; Azis, F.; Rashid, M.; Kamarudin, M. System identification modelling based on modification of all terrain vehicle (ATV) using wireless control system. J. Mech. Eng. Sci. 2015, 9, 1640–1654. [Google Scholar] [CrossRef]
  14. Petterson, T.C.; Gooch, S.D. Rolling Resistance of Atv Tyres In Agriculture. In Proceedings of the Design Society: DESIGN Conference, Cavtat, Croatia, 26–29 October 2020; Cambridge University Press: Cambridge, UK, 2020. [Google Scholar] [CrossRef]
  15. Board, T.R. Tires and Passenger Vehicle Fuel Economy: Informing Consumers, Improving Performance; Special Report 286; The National Academies Press: Washington, DC, USA, 2006; p. 174. [Google Scholar]
  16. Taheri, S.; Sandu, C.; Pinto, E.; Gorsich, D. A technical survey on Terramechanics models for tire–terrain interaction used in modeling and simulation of wheeled vehicles. J. Terramech. 2015, 57, 1–22. [Google Scholar] [CrossRef]
  17. Gallina, A.; Krenn, R.; Scharringhausen, M.; Uhl, T.; Schäfer, B. Parameter Identification of a Planetary Rover Wheel-Soil Contact Model via a Bayesian Approach. J. Field Robot. 2014, 31, 161–175. [Google Scholar] [CrossRef]
  18. Guo, T. Power Consumption Models for Tracked and Wheeled Small Unmanned Ground Vehicles on Deformable Terrains. Ph.D. Thesis, University of Michigan, Detroit, MI, USA, 2016. Available online: https://deepblue.lib.umich.edu/bitstream/handle/2027.42/133484/tianyou_1.pdf?sequence=1 (accessed on 13 March 2023).
  19. Dallas, J.; Jain, K.; Dong, Z.; Sapronov, L.; Cole, M.P.; Jayakumar, P.; Ersal, T. Online terrain estimation for autonomous vehicles on deformable terrains. J. Terramech. 2020, 91, 11–22. [Google Scholar] [CrossRef]
  20. Dallas, J.; Cole, P.M.; Jayakumar, P.; Ersal, T. Neural network based terramechanics modeling and estimation for deformable terrains. arXiv 2020, arXiv:2003.02635. [Google Scholar] [CrossRef]
  21. Shin, J.; Kwak, D.; Lee, T. Robust path control for an autonomous ground vehicle in rough terrain. Control Eng. Pract. 2020, 98, 104384. [Google Scholar] [CrossRef]
  22. Sock, J.; Kim, J.; Min, J.; Kwak, K. Probabilistic traversability map generation using 3D-LIDAR and camera. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 5631–5637. [Google Scholar]
  23. Mousazadeh, H. A technical review on navigation systems of agricultural autonomous off-road vehicles. J. Terramech. 2013, 50, 211–232. [Google Scholar] [CrossRef]
  24. Zhou, C.; Gu, S.; Wen, Y.; Du, Z.; Xiao, C.; Huang, L.; Zhu, M. The review unmanned surface vehicle path planning: Based on multi-modality constraint. Ocean Eng. 2020, 200, 107043. [Google Scholar] [CrossRef]
  25. Rus, C.; Leba, M.; Negru, N.; Marcuş, R.; Costandoiu, A. Autonomous Control System for an Electric ATV. MATEC Web Conf. 2021, 343, 6003. [Google Scholar] [CrossRef]
  26. Ma, X.; Hu, X.; Schweig, S.; Pragalathan, J.; Schramm, D. A Vehicle Guidance Model with a Close-to-Reality Driver Model and Different Levels of Vehicle Automation. Appl. Sci. 2021, 11, 380. [Google Scholar] [CrossRef]
  27. SAE J3016. 2019. Available online: https://www.sae.org/news/2019/01/sae-updates-j3016-automated-driving-graphic (accessed on 1 March 2023).
  28. Bak, T.; Jakobsen, H. Agricultural Robotic Platform with Four Wheel Steering for Weed Detection. Biosyst. Eng. 2004, 87, 125–136. [Google Scholar] [CrossRef]
  29. Li, M.; Imou, K.; Wakabayashi, K.; Yokoyama, S. Review of research on agricultural vehicle autonomous guidance. Int. J. Agric. Biol. Eng. 2009, 2, 1–16. [Google Scholar] [CrossRef]
  30. Heydinger, G.; Bixel, R.; Yapp, J.; Zagorski, S.; Sidhu, A.; Nowjack, J.; Jebode, H. Vehicle Characteristics Measurements of All-Terrain Vehicles; For Consumer Products Safety Commission Contract HHSP I; SEA Vehicle Dynamics Division: Columbus, OH, USA, 2016. [Google Scholar]
  31. Finch, H.J.S.; Samuel, A.M.; Lane, G.P.F. Introduction, in Lockhart & Wiseman’s Crop Husbandry Including Grassland, 9th ed.; Finch, H.J.S., Samuel, A.M., Lane, G.P.F., Eds.; Woodhead Publishing: Sawston, UK, 2014; pp. 17–42. [Google Scholar]
  32. Chou, H.-Y.; Khorsandi, F.; Vougioukas, S.G. Developing and Testing a GPS-Based Steering Control System for an Autonomous All-Terrain Vehicle. In Proceedings of the ASABE Annual International, Virtual, 13–15 July 2020; p. 1. [Google Scholar]
  33. Cheein, F.A.A.; Carelli, R. Agricultural Robotics: Unmanned Robotic Service Units in Agricultural Tasks. IEEE Ind. Electron. Mag. 2013, 7, 48–58. [Google Scholar] [CrossRef]
  34. Mavridou, E.; Vrochidou, E.; Papakostas, G.A.; Pachidis, T.; Kaburlasos, V.G. Machine Vision Systems in Precision Agriculture for Crop Farming. J. Imaging 2019, 5, 89. [Google Scholar] [CrossRef] [PubMed]
  35. Oliveira, L.F.P.; Moreira, A.P.; Silva, M.F. Advances in Agriculture Robotics: A State-of-the-Art Review and Challenges Ahead. Robotics 2021, 10, 52. [Google Scholar] [CrossRef]
  36. Chowdhury, M.Z.; Shahjalal, M.; Hasan, M.K.; Jang, Y.M. The Role of Optical Wireless Communication Technologies in 5G/6G and IoT Solutions: Prospects, Directions, and Challenges. Appl. Sci. 2019, 9, 4367. [Google Scholar] [CrossRef]
  37. Tomaszewski, L.; Kołakowski, R.; Zagórda, M. Application of mobile networks (5G and beyond) in precision agriculture. In Proceedings of the IFIP International Conference on Artificial Intelligence Applications and Innovations, Hersonissos, Greece, 17–20 June 2022; Springer: Berlin/Heidelberg, Germany, 2022. [Google Scholar] [CrossRef]
  38. Vougioukas, S.G. Agricultural robotics. Annual review of control, robotics, and autonomous systems. Ann. Rev. 2019, 2, 365–392. [Google Scholar]
  39. Roshanianfard, A.; Noguchi, N.; Okamoto, H.; Ishii, K. A review of autonomous agricultural vehicles (The experience of Hokkaido University). J. Terramech. 2020, 91, 155–183. [Google Scholar] [CrossRef]
  40. Goulet, N.; Ayalew, B. Energy-Optimal Ground Vehicle Trajectory Planning on Deformable Terrains. IFAC-PapersOnLine 2022, 55, 196–201. [Google Scholar] [CrossRef]
  41. Vantsevich, V.V.; Gorsich, D.J.; Paldan, J.R.; Ghasemi, M.; Moradi, L. Terrain mobility performance optimization: Fundamentals for autonomous vehicle applications. Part I. New mobility indices: Optimization and analysis. J. Terramech. 2022, 104, 31–47. [Google Scholar] [CrossRef]
  42. Jonsson, F. CAKE-Kibb. 2022. Available online: https://www.umu.se/en/umea-institute-of-design/education/student-work/masters-programme-in-transportation-design/2022/fanny-jonsson/ (accessed on 3 March 2023).
  43. Erian, K.H. Autonomous Control of an All-Terrain Vehicle Using Embedded Systems and Artificial Intelligence Techniques. Ph.D. Thesis, The University of North Carolina at Charlotte, Charlotte, NC, USA, 2022. ACM Digital Library. Available online: https://dl.acm.org/doi/book/10.5555/AAI29164332 (accessed on 13 March 2023).
  44. Reid, J.F.; Zhang, Q.; Noguchi, N.; Dickson, M. Agricultural automatic guidance research in North America. Comput. Electron. Agric. 2000, 25, 155–167. [Google Scholar] [CrossRef]
  45. Tillett, N. Automatic guidance sensors for agricultural field machines: A review. J. Agric. Eng. Res. 1991, 50, 167–187. [Google Scholar] [CrossRef]
  46. Murakami, N.; Ito, A.; Will, J.D.; Steffen, M.; Inoue, K.; Kita, K.; Miyaura, S. Environment identification technique using hyper omni-vision and image map. In Proceedings of the 3rd IFAC International Workshop Bio-Robotics, Sapporo, Japan, 6–8 September 2006. [Google Scholar]
  47. Klančar, G.; Zdešar, A.; Blažič, S.; Škrjanc, I. Chapter 2—Motion Modeling for Mobile Robots. In Wheeled Mobile Robotics; Klančar, G., Zdešar, A., Blažič, S., Škrjanc, I., Eds.; Butterworth-Heinemann: Oxford, UK, 2017; pp. 13–59. [Google Scholar]
  48. Kayacan, E.; Kayacan, E.; Ramon, H.; Saeys, W. Towards agrobots: Identification of the yaw dynamics and trajectory tracking of an autonomous tractor. Comput. Electron. Agric. 2015, 115, 78–87. [Google Scholar] [CrossRef]
  49. Benson, E.; Stombaugh, T.S.; Noguchi, N.; Will, J.D.; Reid, J.F. An evaluation of a geomagnetic direction sensor for vehicle guidance in precision agriculture applications. In Proceedings of the ASAE Annual International Meeting, Orlando, FL, USA, 12–15 July 1998. [Google Scholar]
  50. O’Connor, M.; Bell, T.; Elkaim, G.; Parkinson, B. Automatic steering of farm vehicles using GPS. In Proceedings of the 3rd International Conference on Precision Agriculture, Minneapolis, MN, USA, 23–26 June 1996; Wiley and Sons: Hoboken, NJ, USA, 1996. [Google Scholar] [CrossRef]
  51. Zhang, Q.; Wu, D.; Reid, J.; Benson, E. Model recognition and validation for an off-road vehicle electrohydraulic steering controller. Mechatronics 2002, 12, 845–858. [Google Scholar] [CrossRef]
  52. Fang, H.; Fan, R.; Thuilot, B.; Martinet, P. Trajectory tracking control of farm vehicles in presence of sliding. Robot. Auton. Syst. 2006, 54, 828–839. [Google Scholar] [CrossRef]
  53. Lenain, R.; Thuilot, B.; Cariou, C.; Martinet, P. Model Predictive Control for Vehicle Guidance in Presence of Sliding: Application to Farm Vehicles Path Tracking. In Proceedings of the IEEE International Conference on Robotics and Automation, Barcelona, Spain, 18–22 April 2005; pp. 885–890. [Google Scholar]
  54. Lenain, R.; Thuilot, B.; Cariou, C.; Martinet, P. High accuracy path tracking for vehicles in presence of sliding: Application to farm vehicle automatic guidance for agricultural tasks. Auton. Robot. 2006, 21, 79–97. [Google Scholar] [CrossRef]
  55. Franceschetti, B.; Lenain, R.; Rondelli, V. Comparison between a rollover tractor dynamic model and actual lateral tests. Biosyst. Eng. 2014, 127, 79–91. [Google Scholar] [CrossRef]
  56. Kayacan, E.; Kayacan, E.; Ramon, H.; Saeys, W. Learning in Centralized Nonlinear Model Predictive Control: Application to an Autonomous Tractor-Trailer System. IEEE Trans. Control Syst. Technol. 2014, 23, 197–205. [Google Scholar] [CrossRef]
  57. Bouton, N.; Lenain, R.; Thuilot, B.; Martinet, P. Backstepping observer dedicated to tire cornering stiffness estimation: Application to an all terrain vehicle and a farm tractor. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, USA, 29 October–2 November 2007; pp. 1763–1768. [Google Scholar]
  58. Biral, F.; Pelanda, R.; Cis, A. Anti-dive front suspension for agricultural tractors: Dynamic model and validation. In Advances in Italian Mechanism Science: Proceedings of the First International Conference of IFToMM Italy; Springer: Berlin/Heidelberg, Germany, 2017. [Google Scholar] [CrossRef]
  59. Pan, K.; Zheng, W.; Shen, X. Optimization Design and Analysis of All-terrain Vehicle Based on Modal Analysis. J. Phys. Conf. Ser. 2021, 1885, 52055. [Google Scholar] [CrossRef]
  60. Alipour, K.; Robat, A.B.; Tarvirdizadeh, B. Dynamics modeling and sliding mode control of tractor-trailer wheeled mobile robots subject to wheels slip. Mech. Mach. Theory 2019, 138, 16–37. [Google Scholar] [CrossRef]
  61. Liao, J.; Chen, Z.; Yao, B. Model-Based Coordinated Control of Four-Wheel Independently Driven Skid Steer Mobile Robot with Wheel–Ground Interaction and Wheel Dynamics. IEEE Trans. Ind. Inform. 2018, 15, 1742–1752. [Google Scholar] [CrossRef]
  62. Fnadi, M.; Plumet, F.; Amar, F.B. Nonlinear Tire Cornering Stiffness Observer for a Double Steering Off-Road Mobile Robot. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 24 May 2019; pp. 7529–7534. [Google Scholar] [CrossRef]
  63. Feng, L.; He, Y.; Bao, Y.; Fang, H. Development of Trajectory Model for a Tractor-Implement System for Automated Navigation Applications. In Proceedings of the IEEE Instrumentation and Measurement Technology, Ottawa, ON, Canada, 16–19 May 2005; pp. 1330–1334. [Google Scholar]
  64. Soe, T.T.; Tun, H.M. Implementation of Double Closed-Loop Control System for Unmanned Ground Vehicles. Int. J. Sci. Technol. Res. 2014, 3, 257–262. [Google Scholar]
  65. Kim, J.-H.; Kim, C.-K.; Jo, G.-H.; Kim, B.-W. The research of parking mission planning algorithm for unmanned ground vehicle. In Proceedings of the 2010 International Conference on Control, Automation and Systems (ICCAS 2010), Gyeonggi-do, Republic of Korea, 27–30 October 2010; pp. 1093–1096. [Google Scholar] [CrossRef]
  66. Deur, J.; Petric, J.; Asgari, J.; Hrovat, D. Recent Advances in Control-Oriented Modeling of Automotive Power Train Dynamics. In Proceedings of the IEEE International Symposium on Industrial Electronics, Dubrovnik, Croatia, 20–23 June 2005; pp. 269–278. [Google Scholar] [CrossRef]
  67. Tran, T.H.; Kwok, N.M.; Scheding, S.; Ha, Q.P. Dynamic Modelling of Wheel-Terrain Interaction of a UGV. In Proceedings of the 2007 IEEE International Conference on Automation Science and Engineering, Scottsdale, AZ, USA, 22–25 September 2007; pp. 369–374. [Google Scholar] [CrossRef]
  68. Dave, P.N.; Patil, J.B. Modeling and control of nonlinear unmanned ground all terrain vehicle. In Proceedings of the 2015 International Conference on Trends in Automation, Communications and Computing Technology (I-TACT-15), Bangalore, India, 21–22 December 2015; pp. 1–7. [Google Scholar] [CrossRef]
  69. Cho, S.I.; Lee, J.H. Autonomous Speedsprayer using Differential Global Positioning System, Genetic Algorithm and Fuzzy Control. J. Agric. Eng. Res. 2000, 76, 111–119. [Google Scholar] [CrossRef]
  70. Kodagoda, K.R.S.; Wijesoma, W.S.; Teoh, E.K. Fuzzy speed and steering control of an AGV. IEEE Trans. Control Syst. Technol. 2002, 10, 112–120. [Google Scholar] [CrossRef]
  71. Cortner, A.; Conrad, J.M.; BouSaba, N.A. Autonomous all-terrain vehicle steering. In Proceedings of the IEEE Southeastcon, Orlando, FL, USA, 15–18 March 2012; pp. 1–5. [Google Scholar] [CrossRef]
  72. Eski, I.; Kuş, Z.A. Control of unmanned agricultural vehicles using neural network-based control system. Neural Comput. Appl. 2019, 31, 583–595. [Google Scholar] [CrossRef]
  73. Hossain, T.; Habibullah, H.; Islam, R. Steering and Speed Control System Design for Autonomous Vehicles by Developing an Optimal Hybrid Controller to Track Reference Trajectory. Machines 2022, 10, 420. [Google Scholar] [CrossRef]
  74. Behrooz, F.; Mariun, N.; Marhaban, M.H.; Radzi, M.A.M.; Ramli, A.R. Review of Control Techniques for HVAC Systems—Nonlinearity Approaches Based on Fuzzy Cognitive Maps. Energies 2018, 11, 495. [Google Scholar] [CrossRef]
  75. Sumarsono, S. Control System for an All Terrain Vehicle Using DGPS and Fuzzy Logic. Ph.D. Thesis, Civil and Environmental Engineering, University of Melbourne, Melbourne, Australia, 1999. [Google Scholar]
  76. Delavarpour, N.; Eshkabilov, S.; Bon, T.; Nowatzki, J.; Bajwa, S. The Tractor-Cart System Controller with Fuzzy Logic Rules. Appl. Sci. 2020, 10, 5223. [Google Scholar] [CrossRef]
  77. Bonadies, S.; Smith, N.; Niewoehner, N.; Lee, A.S.; Lefcourt, A.M.; Gadsden, S.A. Development of Proportional–Integral–Derivative and Fuzzy Control Strategies for Navigation in Agricultural Environments. J. Dyn. Syst. Meas. Control 2018, 140, 4038504. [Google Scholar] [CrossRef]
  78. Yao, L.; Pitla, S.K.; Zhao, C.; Liew, C.; Hu, D.; Yang, Z. An Improved Fuzzy Logic Control Method for Path Tracking of an Autonomous Vehicle. Trans. ASABE 2020, 63, 1895–1904. [Google Scholar] [CrossRef]
  79. Fleming, P.; Purshouse, R. Evolutionary algorithms in control systems engineering: A survey. Control Eng. Pract. 2002, 10, 1223–1241. [Google Scholar] [CrossRef]
  80. Mathworks. Available online: https://www.mathworks.com/help/gads/what-is-the-genetic-algorithm.html (accessed on 21 March 2023).
  81. Ryerson, A.F.; Zhang, Q. Vehicle path planning for complete field coverage using genetic algorithms. Agric. Eng. Int. CIGR J. 2007, IX. [Google Scholar]
  82. Ashraf, M.A.; Takeda, J.-I.; Torisu, R. Neural Network Based Steering Controller for Vehicle Navigation on Sloping Land. Eng. Agric. Environ. Food 2010, 3, 100–104. [Google Scholar] [CrossRef]
  83. Shiltagh, N.A.; Jalal, L.D. Path planning of intelligent mobile robot using modified genetic algorithm. Int. J. Soft Comput. Eng. 2013, 3, 31–36. [Google Scholar]
  84. Chen, Z.; Xiao, J.; Wang, G. An Effective Path Planning of Intelligent Mobile Robot Using Improved Genetic Algorithm. Wirel. Commun. Mob. Comput. 2022, 2022, 9590367. [Google Scholar] [CrossRef]
  85. Torisu, R.; Hai, S.; Takeda, J.-I.; Ashraf, M.A. Automatic Tractor Guidance on Sloped Terrain (Part 1) Formulation of NN Vehicle Model and Design of control Law for contour Line Travel. J. Jpn. Soc. Agric. Mach. 2002, 64, 88–95. [Google Scholar] [CrossRef]
  86. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef]
  87. Afram, A.; Janabi-Sharifi, F. Theory and Applications of HVAC Control systems–A Review of Model Predictive Control (MPC). Build. Environ. 2014, 72, 343–355. [Google Scholar] [CrossRef]
  88. Christofides, P.D.; Scattolini, R.; de la Peña, D.M.; Liu, J. Distributed model predictive control: A tutorial review and future research directions. Comput. Chem. Eng. 2013, 51, 21–41. [Google Scholar] [CrossRef]
  89. Wang, Y.-G.; Shi, Z.-G.; Cai, W.-J. PID autotuner and its application in HVAC systems. In Proceedings of the American Control Conference, Arlington, VA, USA, 25–27 June 2001; Volume 3, pp. 2192–2196. [Google Scholar] [CrossRef]
  90. Froisy, J.B. Model predictive control: Past, present and future. ISA Trans. 1994, 33, 235–243. [Google Scholar] [CrossRef]
  91. Morari, M.; Lee, J.H. Model predictive control: Past, present and future. Comput. Chem. Eng. 1999, 23, 667–682. [Google Scholar] [CrossRef]
  92. Qin, S.; Badgwell, T.A. A survey of industrial model predictive control technology. Control Eng. Pract. 2003, 11, 733–764. [Google Scholar] [CrossRef]
  93. Rawlings, J.B.; Mayne, D.Q. Model Predictive Control: Theory and Design, 2nd ed.; Nob Hill Pub LLC: San Francisco, CA, USA, 2009. [Google Scholar]
  94. Han, H.; Qiao, J. Nonlinear Model-Predictive Control for Industrial Processes: An Application to Wastewater Treatment Process. IEEE Trans. Ind. Electron. 2013, 61, 1970–1982. [Google Scholar] [CrossRef]
  95. Lu, E.; Xue, J.; Chen, T.; Jiang, S. Robust Trajectory Tracking Control of an Autonomous Tractor-Trailer Considering Model Parameter Uncertainties and Disturbances. Agriculture 2023, 13, 869. [Google Scholar] [CrossRef]
  96. Coen, T.; Anthonis, J.; De Baerdemaeker, J. Cruise control using model predictive control with constraints. Comput. Electron. Agric. 2008, 63, 227–236. [Google Scholar] [CrossRef]
  97. Backman, J.; Oksanen, T.; Visala, A. Navigation system for agricultural machines: Nonlinear Model Predictive path tracking. Comput. Electron. Agric. 2012, 82, 32–43. [Google Scholar] [CrossRef]
  98. Horvath, K.; Petreczky, M.; Rajaoarisoa, L.; Duviella, E.; Chuquet, K. MPC control of water level in a navigation canal—The Cuinchy-Fontinettes case study. In Proceedings of the 2014 European Control Conference (ECC), Strasbourg, France, 24–27 June 2014; pp. 1337–1342. [Google Scholar] [CrossRef]
  99. Kayacan, E.; Kayacan, E.; Ramon, H.; Saeys, W. Distributed nonlinear model predictive control of an autonomous tractor–trailer system. Mechatronics 2014, 24, 926–933. [Google Scholar] [CrossRef]
  100. Bin, Y.; Shim, T. Constrained model predictive control for backing-up tractor-trailer system. In Proceedings of the 10th World Congress on Intelligent Control and Automation (WCICA 2012), Beijing, China, 6–8 July 2012; pp. 2165–2170. [Google Scholar] [CrossRef]
  101. Yakub, F.; Mori, Y. Comparative study of autonomous path-following vehicle control via model predictive control and linear quadratic control. Proc. Inst. Mech. Eng. Part D J. Automob. Eng. 2015, 229, 1695–1714. [Google Scholar] [CrossRef]
  102. Plessen, M.M.G.; Bemporad, A. Reference trajectory planning under constraints and path tracking using linear time-varying model predictive control for agricultural machines. Biosyst. Eng. 2017, 153, 28–41. [Google Scholar] [CrossRef]
  103. Kalmari, J.; Backman, J.; Visala, A. Nonlinear model predictive control of hydraulic forestry crane with automatic sway damping. Comput. Electron. Agric. 2014, 109, 36–45. [Google Scholar] [CrossRef]
  104. Kalmari, J.; Backman, J.; Visala, A. Coordinated motion of a hydraulic forestry crane and a vehicle using nonlinear model predictive control. Comput. Electron. Agric. 2017, 133, 119–127. [Google Scholar] [CrossRef]
  105. Ding, Y.; Wang, L.; Li, Y.; Li, D. Model predictive control and its application in agriculture: A review. Comput. Electron. Agric. 2018, 151, 104–117. [Google Scholar] [CrossRef]
  106. Kalmanfilter. Available online: https://www.kalmanfilter.net/kalman1d.html (accessed on 1 April 2023).
  107. Gan-Mor, S.; Upchurch, B.; Clark, R.; Hardage, D. Implement Guidance Error as Affected by Field Conditions Using Automatic DGPS Tractor Guidance; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2002. [Google Scholar]
  108. Shen, J.; Huang, X. GNSS Application Case: Agricultural Auto-Steering and Guidance Systems. In Proceedings of the 16th Meeting of the International Committee on Global Navigation Satellite Systems (ICG-16), Abu Dhabi, United Arab Emirates, 10–14 October 2022; Beijing UniStrong Science and Technology Co., Ltd.: Beijing, China, 2022. Available online: https://www.unoosa.org (accessed on 1 April 2023).
  109. Nagasaka, Y.; Umeda, N.; Kanetai, Y.; Taniwaki, K.; Sasaki, Y. Autonomous guidance for rice transplanting using global positioning and gyroscopes. Comput. Electron. Agric. 2004, 43, 223–234. [Google Scholar] [CrossRef]
  110. Nørremark, M.; Griepentrog, H.; Nielsen, J.; Søgaard, H. The development and assessment of the accuracy of an autonomous GPS-based system for intra-row mechanical weed control in row crops. Biosyst. Eng. 2008, 101, 396–410. [Google Scholar] [CrossRef]
  111. Han, S.; Zhang, Q.; Ni, B.; Reid, J. A guidance directrix approach to vision-based vehicle guidance systems. Comput. Electron. Agric. 2004, 43, 179–195. [Google Scholar] [CrossRef]
  112. Hague, T.; Tillett, N. A bandpass filter-based approach to crop row location and tracking. Mechatronics 2001, 11, 1–12. [Google Scholar] [CrossRef]
  113. Crassidis, J.L. Sigma-point Kalman filtering for integrated GPS and inertial navigation. IEEE Trans. Aerosp. Electron. Syst. 2006, 42, 750–756. [Google Scholar] [CrossRef]
  114. Zhang, Y.; Gao, F.; Tian, L. INS/GPS integrated navigation for wheeled agricultural robot based on sigma-point Kalman Filter. In Proceedings of the 2008 Asia Simulation Conference–7th International Conference on System Simulation and Scientific Computing (ICSC), Beijing, China, 10–12 October 2008; pp. 1425–1431. [Google Scholar] [CrossRef]
  115. Pratama, P.S.; Gulakari, A.V.; Setiawan, Y.D.; Kim, D.H.; Kim, H.K.; Kim, S.B. Trajectory tracking and fault detection algorithm for automatic guided vehicle based on multiple positioning modules. Int. J. Control Autom. Syst. 2016, 14, 400–410. [Google Scholar] [CrossRef]
  116. Gao, B.; Hu, G.; Zhu, X.; Zhong, Y. A Robust Cubature Kalman Filter with Abnormal Observations Identification Using the Mahalanobis Distance Criterion for Vehicular INS/GNSS Integration. Sensors 2019, 19, 5149. [Google Scholar] [CrossRef] [PubMed]
  117. Lillicrap, T.P. Continuous control with deep reinforcement learning. arXiv 2015, arXiv:1509.02971. Available online: http://arxiv.org/abs/1509.02971 (accessed on 21 April 2023).
  118. Mnih, V.; Kavukcuoglu, K.; Silver, D.; Graves, A.; Antonoglou, I.; Wierstra, D.; Riedmiller, M. Playing atari with deep reinforcement learning. arXiv 2013, arXiv:1312.5602. [Google Scholar] [CrossRef]
  119. Josef, S.; Degani, A. Deep Reinforcement Learning for Safe Local Planning of a Ground Vehicle in Unknown Rough Terrain. IEEE Robot. Autom. Lett. 2020, 5, 6748–6755. [Google Scholar] [CrossRef]
  120. Kahn, G.; Villaflor, A.; Ding, B.; Abbeel, P.; Levine, S. Self-Supervised Deep Reinforcement Learning with Generalized Computation Graphs for Robot Navigation. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 1–8. [Google Scholar] [CrossRef]
  121. Wiberg, V.; Wallin, E.; Nordfjell, T.; Servin, M. Control of Rough Terrain Vehicles Using Deep Reinforcement Learning. IEEE Robot. Autom. Lett. 2021, 7, 390–397. [Google Scholar] [CrossRef]
  122. Ampatzidis, Y.G.; Vougioukas, S.G.; Bochtis, D.D.; Tsatsarelis, C.A. A yield mapping system for hand-harvested fruits based on RFID and GPS location technologies: Field testing. Precis. Agric. 2009, 10, 63–72. [Google Scholar] [CrossRef]
  123. Narvaez, F.Y.; Reina, G.; Torres-Torriti, M.; Kantor, G.; Cheein, F.A. A Survey of Ranging and Imaging Techniques for Precision Agriculture Phenotyping. IEEE/ASME Trans. Mechatron. 2017, 22, 2428–2439. [Google Scholar] [CrossRef]
  124. Prado, J.; Michałek, M.M.; Cheein, F.A. Machine-learning based approaches for self-tuning trajectory tracking controllers under terrain changes in repetitive tasks. Eng. Appl. Artif. Intell. 2018, 67, 63–80. [Google Scholar] [CrossRef]
  125. Moshou, D.; Bravo, C.; Oberti, R.; West, J.S.; Ramon, H.; Vougioukas, S.; Bochtis, D. Intelligent multi-sensor system for the detection and treatment of fungal diseases in arable crops. Biosyst. Eng. 2011, 108, 311–321. [Google Scholar] [CrossRef]
  126. Rahnemoonfar, M.; Sheppard, C. Deep Count: Fruit Counting Based on Deep Simulated Learning. Sensors 2017, 17, 905. [Google Scholar] [CrossRef] [PubMed]
  127. Vasconez, J.P.; Salvo, J.; Auat, F. Toward Semantic Action Recognition for Avocado Harvesting Process based on Single Shot MultiBox Detector. In Proceedings of the 2018 IEEE International Conference on Automation/23rd Congress of the Chilean Association of Automatic Control (ICA-ACCA), Concepcion, Chile, 17–19 October 2018; pp. 1–6. [Google Scholar] [CrossRef]
  128. Xiong, L.; Xia, X.; Lu, Y.; Liu, W.; Gao, L.; Song, S.; Han, Y.; Yu, Z. IMU-Based Automated Vehicle Slip Angle and Attitude Estimation Aided by Vehicle Dynamics. Sensors 2019, 19, 1930. [Google Scholar] [CrossRef] [PubMed]
  129. Yang, Y.; Fu, M.; Zhu, H.; Xiong, G.; Sun, C. Control methods of mobile robot rough-terrain trajectory tracking. In Proceedings of the 8th IEEE International Conference on Control and Automation (ICCA 2010), Xiamen, China, 9–11 June 2010; pp. 731–738. [Google Scholar] [CrossRef]
  130. Ishii, K.; Terao, H.; Noguchi, N. Studies on Self-learning Autonomous Vehicles (Part 3) Positioning System for Autonomous Vehicle. J. Jpn. Soc. Agric. Mach. 1998, 60, 51–58. [Google Scholar]
  131. Benson, E.; Reid, J.; Zhang, Q. Machine Vision-based Guidance System for Agricultural Grain Harvesters using Cut-edge Detection. Biosyst. Eng. 2003, 86, 389–398. [Google Scholar] [CrossRef]
  132. Marchant, J.A.; Brivot, R. Real-Time Tracking of Plant Rows Using a Hough Transform. Real-Time Imaging 1995, 1, 363–371. [Google Scholar] [CrossRef]
  133. Søgaard, H.; Olsen, H. Determination of crop rows by image analysis without segmentation. Comput. Electron. Agric. 2003, 38, 141–158. [Google Scholar] [CrossRef]
  134. Okamoto, H.; Hamada, K.; Kataoka, T.; Hata, M.T.A.S.; Terawaki, M.; Hata, S. Automatic Guidance System with Crop Row Sensor. In Proceedings of the Automation Technology for Off-Road Equipment, Chicago, IL, USA, 26–27 July 2002; p. 307. [Google Scholar] [CrossRef]
  135. Billingsley, J.; Schoenfisch, M. The successful development of a vision guidance system for agriculture. Comput. Electron. Agric. 1997, 16, 147–163. [Google Scholar] [CrossRef]
  136. Kise, M.; Zhang, Q.; Más, F.R. A Stereovision-based Crop Row Detection Method for Tractor-automated Guidance. Biosyst. Eng. 2005, 90, 357–367. [Google Scholar] [CrossRef]
  137. Barawid, O.C.; Mizushima, A.; Ishii, K.; Noguchi, N. Development of an Autonomous Navigation System using a Two-dimensional Laser Scanner in an Orchard Application. Biosyst. Eng. 2007, 96, 139–149. [Google Scholar] [CrossRef]
  138. Lee, J.-W.; Choi, S.-U.; Lee, Y.-J.; Lee, K. A study on recognition of road lane and movement of vehicles using vision system. In Proceedings of the 40th SICE Annual Conference (SICE 2001), Nagoya, Japan, 27 July 2001; pp. 38–41. [Google Scholar] [CrossRef]
  139. Leemans, V.; Destain, M.-F. Line cluster detection using a variant of the Hough transform for culture row localisation. Image Vis. Comput. 2006, 24, 541–550. [Google Scholar] [CrossRef]
  140. Marchant, J. Tracking of row structure in three crops using image analysis. Comput. Electron. Agric. 1996, 15, 161–179. [Google Scholar] [CrossRef]
  141. Yu, B.; Jain, A. Lane boundary detection using a multiresolution Hough transform. In Proceedings of the International Conference on Image Processing, Santa Barbara, CA, USA, 26–29 October 1997; Volume 2, pp. 748–751. [Google Scholar] [CrossRef]
  142. Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. J. Basic Eng. 1960, 82, 35–45. [Google Scholar] [CrossRef]
  143. Heidman, B.; Abidine, A.; Upadhyaya, S.; Hills, D. Application of RTK GPS based auto-guidance system in agricultural production. In Proceedings of the 6th International Conference on Precision Agriculture and Other Precision Resources Management, Minneapolis, MN, USA, 14–17 July 2002; American Society of Agronomy: Minneapolis, MN, USA, 2003. [Google Scholar]
  144. Kumagai, H.; Kubo, Y.; Kihara, M.; Sugimoto, S. DGPS/INS/VMS Integration for High Accuracy Land-Vehicle Positioning. In Proceedings of the 12th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GPS 1999), Nashville, TN, USA, 14–17 September 1999. [Google Scholar] [CrossRef]
  145. Bell, T. Automatic tractor guidance using carrier-phase differential GPS. Comput. Electron. Agric. 2000, 25, 53–66. [Google Scholar] [CrossRef]
  146. Gan-Mor, S.; Ronen, B.; Josef, S.; Bilanki, Y. Guidance of Automatic Vehicle for Greenhouse Transportation. Acta Hortic. 1996, 1, 99–104. [Google Scholar] [CrossRef]
  147. Larsen, W.; Nielsen, G.; Tyler, D. Precision navigation with GPS. Comput. Electron. Agric. 1994, 11, 85–95. [Google Scholar] [CrossRef]
  148. Yamamoto, S.; Yukumoto, O.; Matsuo, Y. Robotization of Agricultural Vehicles—Various Operation with Tilling Robot. IFAC Proc. Vol. 2000, 34, 203–208. [Google Scholar] [CrossRef]
  149. Slaughter, D.; Giles, D.; Downey, D. Autonomous robotic weed control systems: A review. Comput. Electron. Agric. 2008, 61, 63–78. [Google Scholar] [CrossRef]
  150. Kise, M.; Noguchi, N.; Ishii, K.; Terao, H. Development of the Agricultural Autonomous Tractor with an RTK-GPS and a Fog. IFAC Proc. Vol. 2001, 34, 99–104. [Google Scholar] [CrossRef]
  151. Kise, M.; Noguchi, N.; Ishii, K.; Terao, H. The Development of the Autonomous Tractor with Steering Controller Applied by Optimal Control. In Proceedings of the Automation Technology for Off-Road Equipment, Chicago, IL, USA, 26–27 July 2002; IFAC Proceedings Volumes. Volume 34, pp. 99–104. [Google Scholar] [CrossRef]
  152. Ehsani, M.R.; Sullivan, M.D.; Zimmerman, T.L.; Stombaugh, T. Evaluating the Dynamic Accuracy of Low-Cost GPS Receivers. In Proceedings of the 2003 ASAE Annual Meeting, Las Vegas, NV, USA, 27–30 July 2003; p. 1. [Google Scholar]
  153. Kohno, Y.; Kondo, N.; Iida, M.; Kurita, M.; Shiigi, T.; Ogawa, Y.; Kaichi, T.; Okamoto, S. Development of a Mobile Grading Machine for Citrus Fruit. Eng. Agric. Environ. Food 2011, 4, 7–11. [Google Scholar] [CrossRef]
  154. Borenstein, J. Experimental results from internal odometry error correction with the Omni Mate mobile robot. IEEE Trans. Robot. Autom. 1998, 14, 963–969. [Google Scholar] [CrossRef]
  155. Chenavier, F.; Crowley, J. Position estimation for a mobile robot using vision and odometry. In Proceedings of the 1992 IEEE International Conference on Robotics and Automation, Nice, France, 12–14 May 1992; pp. 2588–2593. [Google Scholar] [CrossRef]
  156. Morimoto, E.; Suguri, M.; Umeda, M. Vision-based Navigation System for Autonomous Transportation Vehicle. Precis. Agric. 2005, 6, 239–254. [Google Scholar] [CrossRef]
  157. Subramanian, V.; Burks, T.F.; Arroyo, A. Development of machine vision and laser radar based autonomous vehicle guidance systems for citrus grove navigation. Comput. Electron. Agric. 2006, 53, 130–143. [Google Scholar] [CrossRef]
  158. Ahamed, T.; Takigawa, T.; Koike, M.; Honma, T.; Hasegawa, H.; Zhang, Q. Navigation using a laser range finder for autonomous tractor (part 1) positioning of implement. J. Jpn. Soc. Agric. Mach. 2006, 68, 68–77. [Google Scholar] [CrossRef]
  159. Chandan, K.J.; Akhil, V.V. Investigation on Accuracy of Ultrasonic and LiDAR for Complex Structure Area Measurement. In Proceedings of the 6th International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India, 28–30 April 2022; pp. 134–139. [Google Scholar] [CrossRef]
  160. Harper, N.; McKerrow, P. Recognising plants with ultrasonic sensing for mobile robot navigation. Robot. Auton. Syst. 2001, 34, 71–82. [Google Scholar] [CrossRef]
  161. Geo-Matching. Available online: https://geo-matching.com/articles/vectornav-gnss-ins-systems-for-lidar-mapping#:~:text=Modern%20LiDAR%20sensors%20have%20multiple,that%20represents%20the%20surrounding%20area (accessed on 1 April 2023).
  162. Wang, G.; Wu, J.; Xu, T.; Tian, B. 3D Vehicle Detection with RSU LiDAR for Autonomous Mine. IEEE Trans. Veh. Technol. 2021, 70, 344–355. [Google Scholar] [CrossRef]
  163. Chen, X.; Vizzo, I.; Labe, T.; Behley, J.; Stachniss, C. Range Image-based LiDAR Localization for Autonomous Vehicles. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 5802–5808. [Google Scholar] [CrossRef]
  164. Sualeh, M.; Kim, G.-W. Semantics Aware Dynamic SLAM Based on 3D MODT. Sensors 2021, 21, 6355. [Google Scholar] [CrossRef]
  165. Jahromi, B.S.; Tulabandhula, T.; Cetin, S. Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles. Sensors 2019, 19, 4357. [Google Scholar] [CrossRef]
  166. Schönberg, T.; Ojala, M.; Suomela, J.; Torpo, A.; Halme, A. Positioning an autonomous off-road vehicle by using fused DGPS and inertial navigation. IFAC Proc. Vol. 1995, 28, 211–216. [Google Scholar] [CrossRef]
  167. Barshan, B.; Durrant-Whyte, H. Inertial navigation systems for mobile robots. IEEE Trans. Robot. Autom. 1995, 11, 328–342. [Google Scholar] [CrossRef]
  168. Zhang, Q.; Reid, J.F.; Noguchi, N. Agricultural vehicle navigation using multiple guidance sensors. In Proceedings of the International Conference on Field and Service Robotics, Pittsburgh, PA, USA, 29–31 August 1999; University of Illinois at Urbana: Urbana, IL, USA, 1999. [Google Scholar]
  169. Noguchi, N.; Terao, H. Path planning of an agricultural mobile robot by neural network and genetic algorithm. Comput. Electron. Agric. 1997, 18, 187–204. [Google Scholar] [CrossRef]
  170. Liu, J.; Ayers, P.D. Application of a Tractor Stability Index for Protective Structure Deployment. J. Agric. Saf. Health 1998, 4, 171–181. [Google Scholar] [CrossRef]
  171. Liu, J.; Ayers, P.D. Off-road Vehicle Rollover and Field Testing of Stability Index. J. Agric. Saf. Health 1999, 5, 59–72. [Google Scholar] [CrossRef]
  172. Nichol, C.I.; Sommer, H.J., III; Murphy, D.J. Simplified Overturn Stability Monitoring of Agricultural Tractors. J. Agric. Saf. Health 2005, 11, 99–108. [Google Scholar] [CrossRef]
  173. Liu, B.; Koc, A.B. Field Tests of a Tractor Rollover Detection and Emergency Notification System. J. Agric. Saf. Health 2015, 21, 113–127. [Google Scholar] [CrossRef]
  174. Siegwart, R.; Nourbakhsh, I.R.; Scaramuzza, D. Introduction to Autonomous Mobile Robots; MIT Press: Cambridge, MA, USA, 2011. [Google Scholar]
  175. Vasconez, J.P.; Kantor, G.A.; Cheein, F.A.A. Human–robot interaction in agriculture: A survey and current challenges. Biosyst. Eng. 2019, 179, 35–48. [Google Scholar] [CrossRef]
  176. Soter, G.; Conn, A.; Hauser, H.; Rossiter, J. Bodily Aware Soft Robots: Integration of Proprioceptive and Exteroceptive Sensors. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 2448–2453. [Google Scholar] [CrossRef]
  177. Vasconez, J.P.; Guevara, L.; Cheein, F.A. Social robot navigation based on HRI non-verbal communication: A case study on avocado harvesting. In Proceedings of the SAC ‘19: The 34th ACM/SIGAPP Symposium on Applied Computing, Limassol, Cyprus, 8–12 April 2019; pp. 957–960. [Google Scholar] [CrossRef]
  178. García-Pérez, L.; García-Alegre, M.; Ribeiro, A.; Guinea, D. An agent of behaviour architecture for unmanned control of a farming vehicle. Comput. Electron. Agric. 2008, 60, 39–48. [Google Scholar] [CrossRef]
  179. Kulkarni, A.D.; Narkhede, G.G.; Motade, S.N. SENSOR FUSION: An Advance Inertial Navigation System using GPS and IMU. In Proceedings of the 6th International Conference on Computing, Communication, Control And Automation (ICCUBEA), Pune, India, 26–27 August 2022; pp. 1–5. [Google Scholar] [CrossRef]
  180. AEF. Agricultural Industry Electronics Foundation. 2019. Available online: https://www.aef-online.org/about-us/isobus.html#/About (accessed on 3 April 2023).
  181. Gurram, S.K.; Conrad, J.M. Implementation of CAN bus in an autonomous all-terrain vehicle. In Proceedings of the 2011 IEEE Southeastcon, Nashville, TN, USA, 17–20 March 2011; pp. 250–254. [Google Scholar] [CrossRef]
  182. Corrigan, S. Introduction to the Controller Area Network (CAN); Texas Instruments: Dallas, TX, USA, 2002. [Google Scholar]
  183. Tindell, K.; Burns, A.; Wellings, A. Calculating controller area network (can) message response times. Control Eng. Pract. 1995, 3, 1163–1169. [Google Scholar] [CrossRef]
  184. Baek, W.; Jang, S.; Song, H.; Kim, S.; Song, B.; Chwa, D. A CAN-based Distributed Control System for Autonomous All-Terrain Vehicle (ATV). IFAC Proc. Vol. 2008, 41, 9505–9510. [Google Scholar] [CrossRef]
  185. Henderson, J.R.; Conrad, J.M.; Pavlich, C. Using a CAN bus for control of an All-terrain Vehicle. In Proceedings of the IEEE SoutheastCon 2014, Lexington, KY, USA, 13–16 March 2014; pp. 1–5. [Google Scholar] [CrossRef]
  186. Open-Source Robotics Foundation. Robot Operating System (ROS). Available online: https://www.ros.org/ (accessed on 3 April 2023).
  187. Rhoades, B.B.; Srivastava, D.; Conrad, J.M. Design and Development of a ROS Enabled CAN Based All-Terrain Vehicle Platform. In Proceedings of the Southeastcon 2018, St. Petersburg, FL, USA, 19–22 April 2018; pp. 1–6. [Google Scholar] [CrossRef]
  188. Zhu, M.; Wang, H.; Li, P.; Liu, J. An Open Source Framework Based Unmanned All-Terrain Vehicle(U-ATV) for Wild Patrol and Surveillance. In Proceedings of the IEEE 8th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), Tianjin, China, 19–23 July 2018; pp. 1000–1005. [Google Scholar] [CrossRef]
  189. LoRa Alliance. LoRaWAN™ Specification; Technical Report; LoRa Alliance: Fremont, CA, USA, 2015. [Google Scholar]
  190. Tao, W.; Zhao, L.; Wang, G.; Liang, R. Review of the internet of things communication technologies in smart agriculture and challenges. Comput. Electron. Agric. 2021, 189, 106352. [Google Scholar] [CrossRef]
  191. IEEE Standards Association. Available online: http://standards.ieee.org/ (accessed on 15 December 2023).
  192. International Organization for Standardization (ISO). Available online: http://www.iso.org/standard/ (accessed on 15 December 2023).
  193. European Telecommunications Standards Institute (ETSI). Available online: http://www.etsi.org/ (accessed on 15 December 2023).
  194. Strzoda, A.; Marjasz, R.; Grochla, K. How Accurate is LoRa Positioning in Realistic Conditions? In Proceedings of the 12th ACM International Symposium on Design and Analysis of Intelligent Vehicular Networks and Applications, Montreal, QC, Canada, 24–28 October 2022; Association for Computing Machinery: New York, NY, USA, 2022. [Google Scholar]
  195. Boursianis, A.D.; Papadopoulou, M.S.; Diamantoulakis, P.; Liopa-Tsakalidi, A.; Barouchas, P.; Salahas, G.; Karagiannidis, G.; Wan, S.; Goudos, S.K. Internet of Things (IoT) and Agricultural Unmanned Aerial Vehicles (UAVs) in smart farming: A comprehensive review. Internet Things 2022, 18, 100187. [Google Scholar] [CrossRef]
  196. Balogh, M.; Vidacs, A.; Feher, G.; Maliosz, M.; Horvath, M.A.; Reider, N.; Racz, S. Cloud-Controlled Autonomous Mobile Robot Platform. In Proceedings of the IEEE 32nd Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), Helsinki, Finland, 13–16 September 2021; pp. 1–6. [Google Scholar] [CrossRef]
  197. Gerla, M. Vehicular Cloud Computing. In Proceedings of the 11th Annual Mediterranean Ad Hoc Networking Workshop (Med-Hoc-Net), Ayia Napa, Cyprus, 19–22 June 2012; pp. 152–155. [Google Scholar] [CrossRef]
  198. Kumar, S.; Gollakota, S.; Katabi, D. A cloud-assisted design for autonomous driving. In Proceedings of the 1st Edition of the MCC Workshop on Mobile Cloud Computing, Helsinki, Finland, 13–17 August 2012; pp. 41–46. [Google Scholar] [CrossRef]
  199. Shahzad, K. Cloud robotics and autonomous vehicles. In Autonomous Vehicle; Polish Naval Academy: Gdynia, Poland, 2016. [Google Scholar] [CrossRef]
  200. Warren, J.; Marz, N. Big Data: Principles and Best Practices of Scalable Realtime Data Systems; Simon and Schuster: New York, NY, USA, 2015. [Google Scholar]
  201. Jie, C.; Yanling, G. Research on Control Strategy of the Electric Power Steering System for All-Terrain Vehicles Based on Model Predictive Current Control. Math. Probl. Eng. 2021, 2021, 6642042. [Google Scholar] [CrossRef]
  202. Park, H.-J.; Lim, M.-S.; Lee, C.-S. Magnet Shape Design and Verification for SPMSM of EPS System Using Cycloid Curve. IEEE Access 2019, 7, 137207–137216. [Google Scholar] [CrossRef]
  203. Dutta, R.; Rahman, M.F. Design and Analysis of an Interior Permanent Magnet (IPM) Machine with Very Wide Constant Power Operation Range. IEEE Trans. Energy Convers. 2008, 23, 25–33. [Google Scholar] [CrossRef]
  204. Fodorean, D.; Idoumghar, L.; Brevilliers, M.; Minciunescu, P.; Irimia, C. Hybrid Differential Evolution Algorithm Employed for the Optimum Design of a High-Speed PMSM Used for EV Propulsion. IEEE Trans. Ind. Electron. 2017, 64, 9824–9833. [Google Scholar] [CrossRef]
  205. Kim, J.-M.; Yoon, M.-H.; Hong, J.-P.; Kim, S.-I. Analysis of cogging torque caused by manufacturing tolerances of surface-mounted permanent magnet synchronous motor for electric power steering. IET Electr. Power Appl. 2016, 10, 691–696. [Google Scholar] [CrossRef]
  206. Qiu, H.; Zhang, Q.; Reid, J.F.; Wu, D. Nonlinear Feedforward-Plus-PID Control for Electrohydraulic Steering Systems. In Proceedings of the ASME 1999 International Mechanical Engineering Congress and Exposition, Nashville, TN, USA, 14–19 November 1999; pp. 89–94. [Google Scholar] [CrossRef]
  207. Wu, D.; Zhang, Q.; Reid, J.F. Adaptive steering controller using a Kalman estimator for wheel-type agricultural tractors. Robotica 2001, 19, 527–533. [Google Scholar] [CrossRef]
  208. Xia, D.; Kong, L.; Hu, Y.; Ni, P. Silicon microgyroscope temperature prediction and control system based on BP neural network and Fuzzy-PID control method. Meas. Sci. Technol. 2015, 26, 25101. [Google Scholar] [CrossRef]
  209. Marino, R.; Scalzi, S.; Netto, M. Nested PID steering control for lane keeping in autonomous vehicles. Control Eng. Pract. 2011, 19, 1459–1467. [Google Scholar] [CrossRef]
  210. Amer, N.H.; Zamzuri, H.; Hudha, K.; Kadir, Z.A. Modelling and Control Strategies in Path Tracking Control for Autonomous Ground Vehicles: A Review of State of the Art and Challenges. J. Intell. Robot. Syst. 2016, 86, 225–254. [Google Scholar] [CrossRef]
  211. Trebi-Ollennu, A.; Dolan, J.M.; Khosla, P.K. Adaptive fuzzy throttle control for an all-terrain vehicle. Proc. Inst. Mech. Eng. Part I J. Syst. Control. Eng. 2001, 215, 189–198. [Google Scholar] [CrossRef]
  212. Alvarado, M.; García, F. Wheeled vehicles’ velocity updating by navigating on outdoor terrains. Neural Comput. Appl. 2011, 20, 1097–1109. [Google Scholar] [CrossRef]
  213. Wang, J.; Sun, Z.; Xu, X.; Liu, D.; Song, J.; Fang, Y. Adaptive speed tracking control for autonomous land vehicles in all-terrain navigation: An experimental study. J. Field Robot. 2013, 30, 102–128. [Google Scholar] [CrossRef]
  214. Zhu, M.; Chen, H.; Xiong, G. A model predictive speed tracking control approach for autonomous ground vehicles. Mech. Syst. Signal Process. 2017, 87, 138–152. [Google Scholar] [CrossRef]
  215. Cao, H.; Song, X.; Zhao, S.; Bao, S.; Huang, Z. An optimal model-based trajectory following architecture synthesising the lateral adaptive preview strategy and longitudinal velocity planning for highly automated vehicle. Veh. Syst. Dyn. 2017, 55, 1143–1188. [Google Scholar] [CrossRef]
  216. Xue, J.; Xia, C.; Zou, J. A velocity control strategy for collision avoidance of autonomous agricultural vehicles. Auton. Robot. 2020, 44, 1047–1063. [Google Scholar] [CrossRef]
  217. Kayacan, E.; Ramon, H.; Saeys, W. Robust Trajectory Tracking Error Model-Based Predictive Control for Unmanned Ground Vehicles. IEEE/ASME Trans. Mechatron. 2015, 21, 806–814. [Google Scholar] [CrossRef]
  218. Huang, J.; Wen, C.; Wang, W.; Jiang, Z.-P. Adaptive output feedback tracking control of a nonholonomic mobile robot. Automatica 2014, 50, 821–831. [Google Scholar] [CrossRef]
  219. Yi, J.; Song, D.; Zhang, J.; Goodwin, Z. Adaptive Trajectory Tracking Control of Skid-Steered Mobile Robots. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 2605–2610. [Google Scholar] [CrossRef]
  220. Huang, X.; Zhang, H.; Wang, J. Robust weighted gain-scheduling H∞ vehicle lateral dynamics control in the presence of steering system backlash-type hysteresis. In Proceedings of the 2013 American Control Conference (ACC), Washington, DC, USA, 17–19 June 2013; pp. 2827–2832. [Google Scholar] [CrossRef]
  221. Kang, J.; Kim, W.; Lee, J.; Yi, K. Skid Steering-Based Control of a Robotic Vehicle with Six in-Wheel Drives. Proc. Inst. Mech. Eng. Part D J. Automob. Eng. 2010, 224, 1369–1391. [Google Scholar] [CrossRef]
  222. Urmson, C.; Anhalt, J.; Bartz, D.; Clark, M.; Galatali, T.; Gutierrez, A.; Harbaugh, S.; Johnston, J.; Kato, H.; Koon, P.; et al. A robust approach to high-speed navigation for unrehearsed desert terrain. J. Field Robot. 2006, 23, 467–508. [Google Scholar] [CrossRef]
  223. Shin, J.; Huh, J.; Park, Y. Asymptotically stable path following for lateral motion of an unmanned ground vehicle. Control Eng. Pract. 2015, 40, 102–112. [Google Scholar] [CrossRef]
  224. Zhang, H.; Zhang, X.; Wang, J. Robust gain-scheduling energy-to-peak control of vehicle lateral dynamics stabilization. Veh. Syst. Dyn. 2014, 52, 309–340. [Google Scholar] [CrossRef]
Figure 1. Basic control diagram of autonomous vehicles [23] (written permission obtained).
Figure 2. Levels of automated driving graphics to reflect evolving standards [27].
Figure 3. Autonomous weed detector (Reprinted/adapted with permission from [28]).
Figure 4. A flowchart of the structure and development of this paper.
Figure 5. Schematic of AATV systems based on the structure of the paper (Reprinted/adapted with permission from [42]).
Figure 6. Kinematic model for an autonomous tractor [48].
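To make the geometry of such a model concrete, the following minimal Python sketch integrates a generic kinematic bicycle model of the kind shown in Figure 6; the wheelbase, speed, and time step are illustrative assumptions rather than parameters from [48].

```python
import math

def bicycle_kinematics_step(x, y, heading, v, steer, wheelbase=2.5, dt=0.1):
    """One Euler-integration step of a generic kinematic bicycle model.

    x, y      -- rear-axle position [m]
    heading   -- yaw angle [rad]
    v         -- forward speed [m/s]
    steer     -- front-wheel steering angle [rad]
    wheelbase -- distance between axles [m] (illustrative value)
    dt        -- integration step [s]
    """
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += v / wheelbase * math.tan(steer) * dt
    return x, y, heading

# Example: drive 5 s at 2 m/s with a constant 5-degree steering angle.
state = (0.0, 0.0, 0.0)
for _ in range(50):
    state = bicycle_kinematics_step(*state, v=2.0, steer=math.radians(5))
print("pose after 5 s:", [round(s, 2) for s in state])
```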
Figure 7. Dynamic model of an autonomous tractor [58].
Figure 8. Bicycle dynamics model for a tractor system [48], where γ and m denote the yaw rate and the mass of the tractor, F_t,f and F_l,f the traction and lateral forces on the front wheel, and F_l,r the lateral force on the rear wheel.
Figure 9. PID Diagram.
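As a concrete illustration of the loop in Figure 9, the short Python sketch below implements a discrete, parallel-form PID controller driving a toy heading plant; the gains and plant model are illustrative assumptions, not values from the cited ATV studies.

```python
class PID:
    """Minimal discrete PID controller (parallel form) with Euler integration."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy example: regulate a heading toward 0.5 rad with a crude integrator plant.
pid = PID(kp=1.2, ki=0.1, kd=0.05, dt=0.1)
heading = 0.0
for _ in range(100):
    steer_cmd = pid.update(setpoint=0.5, measurement=heading)
    heading += 0.1 * steer_cmd  # crude first-order plant response
print("final heading:", round(heading, 3))
```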
Figure 10. Fuzzy Logic Control (FLC) Diagram.
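The sketch below walks through the FLC pipeline of Figure 10 (fuzzification, rule evaluation, defuzzification) with three triangular membership functions mapping lateral path error to a steering command; the membership breakpoints, rule consequents, and sign convention are illustrative assumptions.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steer(lateral_error):
    """Map lateral path error [m] to a steering command [rad] with three rules."""
    # Fuzzification (breakpoints are illustrative assumptions).
    left   = tri(lateral_error, -2.0, -1.0,  0.0)
    center = tri(lateral_error, -1.0,  0.0,  1.0)
    right  = tri(lateral_error,  0.0,  1.0,  2.0)
    # Rule consequents (singleton outputs, rad) and weighted-average defuzzification.
    weights = [left, center, right]
    outputs = [0.3, 0.0, -0.3]          # steer back toward the path
    total = sum(weights)
    return sum(w * u for w, u in zip(weights, outputs)) / total if total else 0.0

for e in (-1.5, -0.3, 0.0, 0.8):
    print(f"error {e:+.1f} m -> steer {fuzzy_steer(e):+.2f} rad")
```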
Figure 11. Investigating the application of PID and fuzzy controllers (Reprinted/adapted with permission from [77]).
Figure 12. Schematic of a genetic algorithm [80].
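The following toy Python sketch follows the GA flow of Figure 12 (selection, crossover, mutation) to order a handful of field way-points by travel distance, loosely in the spirit of the GA-based path-planning studies cited in this review (e.g., [81,83,84]); the way-point coordinates, population size, and operators are illustrative assumptions.

```python
import math
import random

random.seed(1)
# Toy field way-points (x, y) in metres; coordinates are illustrative.
points = [(0, 0), (10, 2), (3, 8), (7, 5), (1, 6), (9, 9)]

def tour_length(order):
    return sum(math.dist(points[order[i]], points[order[i + 1]])
               for i in range(len(order) - 1))

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest in b's order."""
    i, j = sorted(random.sample(range(len(a)), 2))
    child = a[i:j]
    rest = [g for g in b if g not in child]
    return rest[:i] + child + rest[i:]

def mutate(order, rate=0.2):
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

population = [random.sample(range(len(points)), len(points)) for _ in range(30)]
for _ in range(200):                       # generations
    population.sort(key=tour_length)       # selection: keep the fittest half
    parents = population[:15]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(15)]
    population = parents + children
best = min(population, key=tour_length)
print("best visiting order:", best, "length:", round(tour_length(best), 2))
```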
Figure 13. Neural network schematic.
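As a minimal illustration of Figure 13, the sketch below runs a forward pass through a tiny two-layer feed-forward network mapping tracking errors to a bounded steering command; the weights are random placeholders, whereas the cited controllers train such networks on recorded driving data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer network: 3 inputs -> 5 hidden units -> 1 output.
# Weights are random placeholders; a real controller would train them on data
# such as (lateral error, heading error, speed) -> steering angle.
W1, b1 = rng.normal(size=(5, 3)), np.zeros(5)
W2, b2 = rng.normal(size=(1, 5)), np.zeros(1)

def forward(x):
    h = np.tanh(W1 @ x + b1)        # hidden layer with tanh activation
    return np.tanh(W2 @ h + b2)     # bounded steering command in [-1, 1]

features = np.array([0.4, -0.1, 1.5])  # e.g., lateral error, heading error, speed
print("steering command:", forward(features))
```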
Figure 14. MPC Basic Control Loop.
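The sketch below mirrors the receding-horizon loop of Figure 14: at each step it optimizes a short steering sequence over a predicted horizon and applies only the first input. The kinematic prediction model, horizon length, and cost weights are illustrative assumptions rather than the NMPC formulations of the cited works, and SciPy's general-purpose optimizer stands in for a dedicated MPC solver.

```python
import math
import numpy as np
from scipy.optimize import minimize

DT, WHEELBASE, HORIZON = 0.2, 2.5, 8   # illustrative parameters

def rollout(state, steer_seq, v=2.0):
    """Predict future poses of a kinematic bicycle model for a steering sequence."""
    x, y, th = state
    poses = []
    for d in steer_seq:
        x += v * math.cos(th) * DT
        y += v * math.sin(th) * DT
        th += v / WHEELBASE * math.tan(d) * DT
        poses.append((x, y))
    return poses

def cost(steer_seq, state, y_ref=0.0):
    """Penalize lateral deviation from the line y = y_ref plus steering effort."""
    poses = rollout(state, steer_seq)
    return sum((y - y_ref) ** 2 for _, y in poses) + 0.1 * np.sum(np.square(steer_seq))

state = (0.0, 1.0, 0.0)                 # start 1 m off the reference line
for _ in range(20):                     # receding-horizon loop
    res = minimize(cost, np.zeros(HORIZON), args=(state,),
                   bounds=[(-0.5, 0.5)] * HORIZON, method="SLSQP")
    steer = res.x[0]                    # apply only the first optimized input
    x, y, th = state
    state = (x + 2.0 * math.cos(th) * DT,
             y + 2.0 * math.sin(th) * DT,
             th + 2.0 / WHEELBASE * math.tan(steer) * DT)
print("lateral offset after 4 s:", round(state[1], 3))
```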
Figure 15. Intuitive Explanation of the Kalman Filter [106].
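In the spirit of the one-dimensional example cited in [106], the sketch below runs a scalar Kalman filter that blends a (nearly) constant-state prediction with noisy position measurements; the noise variances and simulated GNSS-like measurements are illustrative assumptions.

```python
import random

random.seed(0)

# One-dimensional Kalman filter tracking a slowly varying position from noisy
# GNSS-like measurements.  Variances below are illustrative assumptions.
x_est, p_est = 0.0, 1.0        # initial state estimate and its variance
q, r = 0.01, 0.25              # process and measurement noise variances

true_pos = 0.0
for step in range(20):
    true_pos += 0.1                                   # vehicle creeps forward
    z = true_pos + random.gauss(0.0, r ** 0.5)        # noisy measurement

    # Predict: the state is assumed nearly constant between measurements.
    p_pred = p_est + q

    # Update: blend prediction and measurement with the Kalman gain.
    k = p_pred / (p_pred + r)
    x_est = x_est + k * (z - x_est)
    p_est = (1.0 - k) * p_pred

print("true:", round(true_pos, 2), "estimated:", round(x_est, 2),
      "variance:", round(p_est, 3))
```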
Figure 16. CAN Structure.
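The sketch below shows the send/receive pattern of two CAN nodes, as discussed for the CAN-based AATV platforms cited above, using the third-party python-can package on an in-process virtual bus so that it runs without hardware; the arbitration ID and payload layout are illustrative assumptions, not ISOBUS/J1939 assignments.

```python
import can  # third-party "python-can" package (pip install python-can)

# Two endpoints on an in-process virtual bus so the example runs without hardware;
# on a real vehicle the interface would be e.g. "socketcan" with channel "can0".
ecu = can.Bus(interface="virtual", channel="demo")
node = can.Bus(interface="virtual", channel="demo")

# Illustrative frame: the arbitration ID and payload layout are assumptions.
speed_cmd = can.Message(arbitration_id=0x123,
                        data=[0x02, 0x58],          # e.g. 600 -> 6.00 km/h
                        is_extended_id=False)
ecu.send(speed_cmd)

received = node.recv(timeout=1.0)
print("ID=0x%X data=%s" % (received.arbitration_id, received.data.hex()))

ecu.shutdown()
node.shutdown()
```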
Figure 17. ROS concept.
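A minimal ROS 1 (rospy) node illustrating the publisher side of the concept in Figure 17 is sketched below; the topic name and message type are illustrative assumptions rather than a standard AATV interface such as those of the ROS-enabled platforms cited above.

```python
#!/usr/bin/env python
# Minimal ROS 1 (rospy) node publishing a steering command at 10 Hz.
# The topic name and message type are illustrative assumptions.
import rospy
from std_msgs.msg import Float32

def main():
    rospy.init_node("steering_commander")
    pub = rospy.Publisher("/atv/steer_cmd", Float32, queue_size=10)
    rate = rospy.Rate(10)                      # 10 Hz control loop
    while not rospy.is_shutdown():
        pub.publish(Float32(data=0.05))        # constant 0.05 rad demo command
        rate.sleep()

if __name__ == "__main__":
    main()
```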
Figure 18. A high-level architecture for cloud-assisted autonomous vehicle design [199].
Figure 19. Types of brakes.
Table 1. Comparison of commonly used algorithms of AATVs.
Control Method | Advantages | Disadvantages | References
Fuzzy
Advantages:
  • Comparable to human reasoning (based on membership rules and functions)
  • Using linguistic models.
  • Applying simple mathematics to nonlinear, integrated, and even complex systems.
  • High level of precision.
  • Fast operation.
Disadvantages:
  • More fuzzy grades result in an exponentially increasing number of rules.
  • Low speed and longer run time of the system.
  • Response time is not real-time.
  • A learning strategy cannot simply be implemented by receiving feedback.
  • A limited number of input variables can be used.
  • It is impossible to determine the membership function parameters and the optimal number of fuzzy rules in a straightforward manner.
[74,75,76,77,78,79]
GA
Advantages:
  • Global, derivative-free optimization.
  • Focuses on the optimization goal regardless of the underlying mathematical model.
Disadvantages:
  • Applicable to real-time applications only in some cases.
  • Relatively low accuracy (~80%)
[79,80,81,82,83,84]
ANN
Advantages:
  • An excellent ability to predict models.
  • Can identify and control nonlinear systems.
  • Can be applied to non-mathematical models.
  • Ability to manage an abundance of input variables and data.
  • Assurances of trustworthiness
Disadvantages:
  • The neural network must be trained before it can operate.
  • Large neural networks take a long time to process.
  • Taking a long time to train off-line.
  • A lot of data are required to make quality predictions.
  • Computationally costly.
[72,76,82,85]
MPC
Advantages:
  • Enhanced energy savings at a low cost.
  • Robustness to disturbances and shifts in performance conditions.
  • Multi-variable control within bounds.
  • Improvements to steady-state responses and reductions in offset errors.
  • Disturbance prediction.
  • Predicting future control actions.
  • Improved transient response.
  • Can control slow processes with time delays.
  • Capacity to shift peak loads.
  • Better regulation and reduction of fluctuations from a set point.
  • Performance improvements and efficiency.
  • Reduced computation time.
Disadvantages:
  • The system needs to be modeled properly.
  • Installation/implementation can be costly.
  • The algorithm is considerably complex.
[86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105]
ML: DRL (deep reinforcement learning)
Advantages:
  • Works with large datasets.
  • Due to its ability to learn different levels of abstraction from data, it can solve more complicated tasks with less prior knowledge.
  • Has accelerated progress in RL
  • Most useful for state space problems with high dimensions
  • Relatively easier to implement.
  • No mathematical modelling required.
Disadvantages:
  • For better decision-making, large datasets are required.
  • Increasing model complexity requires more data for reinforcement learning algorithms.
  • Reinforcement learning models are limited by the agent’s exploration of the environment.
  • In a constantly changing environment, making a good decision can be challenging.
  • Designing reward structures is a challenge.
[117,118,119,120,121]
PID
Advantages:
  • Among the most widely used controllers in the industry.
  • In the field of ATVs, researchers have studied these control techniques for a long time
  • Feedback controller
  • Combating sudden changes in system load with derivative terms
Disadvantages:
  • Undershoots and overshoots apply sudden forces to the actuators and wear them out over time.
  • Mathematical modeling of the system required.
  • This method is not suitable for non-linear, complex, or uncertain information systems.
  • Unreliable over long operating periods.
  • Considerable performance limitations.
  • It is difficult to adjust the controller parameters.
[70,71,72,73]
Kalman filter
Advantages:
  • Efficient in terms of computation
  • High-dimensionality can be handled with limited or no extra computational cost
  • Capable of handling short periods of sensor silence
  • Combining pattern recognition with parameter estimation
  • Many variants available.
Disadvantages:
  • Assumptions are too restrictive.
  • Only capable of representing Gaussian distributions
  • It has significant limitations in extremely uncertain conditions.
[106,107,108,109,110,111,112,113,114,115,116]
Table 2. Comparison analysis of sensors.
Sensor | Advantages | Drawbacks | Accuracy | Energy Efficiency | Robustness
Vision
Advantages:
  • Reduces the possibility of human error
  • Downtime is small.
  • Improves throughput
  • Measurement accuracy
  • Good object identification.
Drawbacks:
  • Dependent on the location of the camera
  • Low to zero accuracy in foggy or similar situations
  • Expensive
  • Requires heavy data analysis
Working distance: up to 250 m.
GPS
Advantages:
  • Easy to navigate.
  • GPS works well in different temperature conditions.
  • Affordable.
  • 100% coverage around the globe.
  • Integration with other technologies is easy.
Drawbacks:
  • Insufficient accuracy due to obstructions or atmospheric conditions.
  • High power consumption: GPS chips drain batteries in 8 to 12 h.
  • Do not penetrate solid walls.
  • Susceptibility to radio interferences.
Accuracy: 2 m (CEP); approximately 99.88%.
Power usage: on average around 30 mA at 3.3 V.
GPS signals typically have a −125 dBm power level.
Dead-Reckoning
Advantages:
  • Continuous positioning
  • Ability to work in closed environments
  • Helpful where other positioning technology or Internet-based infrastructure is unavailable.
Drawbacks:
  • Estimation errors grow with distance from the last known position.
  • Require a lot of memory
  • For accurate position determination, both speed and direction must be known
  • Errors are cumulative
LiDAR
Advantages:
  • A high level of accuracy is maintained while data are collected quickly.
  • It is capable of collecting elevation data even in dense forests.
  • Works both during the day and at night.
  • There are no distortions in the geometry.
  • Human supervision is minimal.
  • Extreme weather is not an issue.
Drawbacks:
  • Rain and low hanging clouds make it ineffective.
  • Reflections and high sun light angles affect it.
  • Large amount of data.
  • There is no international protocol.
  • Cannot penetrate thick vegetation.
  • Health hazard: human eyes can be affected negatively by powerful laser beams.
  • Special data analysis skills are required.
  • Operates at low altitudes.
  • Expensive.
Accuracy: ranging accuracy of 0.5 to 10 mm; up to 1 cm horizontal and 2 cm vertical mapping accuracy; accuracy range 92.55% to 93.03%.
Power consumption: 8–30 W.
Working distance: 200 m.
Inertial
Advantages:
  • Provides absolute position and attitude.
  • External electromagnetic interference does not affect it
  • Can work all day long
  • Information is continuous and low noise.
  • A high rate of data updates
Drawbacks:
  • Accuracy is poor over the long term.
  • Prior to each use, long alignment times are required
  • Expensive
  • It is not possible to provide time information
GDS (geomagnetic direction sensor)
Advantages:
  • Provides an absolute heading reference.
Drawbacks:
  • Heading is distorted by magnetic anomalies.
Ultrasonic sensor
Advantages:
  • Affected by neither color nor transparency of objects
  • Dark environments are not a problem
  • Cost-effective
  • Environments with dust, dirt, or high moisture are not a major issue
Drawbacks:
  • Temperature changes of 5–10 degrees or more affect sensing accuracy.
  • Detection range is limited
  • Inapplicable to high-speed operations
Accuracy range: 92.20% to 92.88%.
Average operating current: 5 mA.
Working distance: up to 20 m.
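Because no single sensor in the table is sufficient on its own, a drifting proprioceptive signal is commonly fused with an absolute but noisy exteroceptive one. The sketch below shows a scalar complementary filter blending an integrated gyro yaw rate with an absolute heading fix (e.g., GNSS course or a geomagnetic sensor); the blending gain, bias, and noise levels are illustrative assumptions, and a production system would more likely use one of the Kalman-filter variants discussed above.

```python
import random

random.seed(2)

# Complementary filter: integrate the (drifting) gyro yaw rate for smoothness and
# correct it slowly with an absolute but noisy heading measurement.
ALPHA, DT = 0.98, 0.05                             # illustrative gain and step
true_heading, fused = 0.0, 0.0
gyro_bias = 0.02                                   # rad/s drift (assumed)

for _ in range(200):
    true_heading += 0.1 * DT                          # vehicle turning slowly
    gyro = 0.1 + gyro_bias + random.gauss(0, 0.01)    # biased, noisy rate
    absolute = true_heading + random.gauss(0, 0.05)   # noisy absolute heading

    fused = ALPHA * (fused + gyro * DT) + (1 - ALPHA) * absolute

print("true %.3f rad, fused %.3f rad" % (true_heading, fused))
```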
Table 3. Comparison of existing external communication technologies [190].
Parameters | Standard | Frequency Band | Data Rate | Transmission Range | Energy Consumption | Cost
Wi-Fi | IEEE 802.11a/c/b/d/g/n [191] | 5–60 GHz | 1 Mb/s–7 Gb/s | 20–100 m | High | High
ZigBee | IEEE 802.15.4 [191] | 2.4 GHz | 20–250 kb/s | 10–20 m | Low | Low
LoRa | LoRaWAN R1.0 [189] | 868/900 MHz | 0.3–50 kb/s | <30 km | Very low | High
RFID | ISO 18000-6C [192] | 860–960 MHz | 40 to 160 kb/s | 1–5 m | Low | Low
Mobile communication | 2G-GSM, CDMA, 3G-UMTS, CDMA2000, 4G-LTE, 5G-LTE, GPRS [193] | 865 MHz, 2.4 GHz | 2G: 50–100 kb/s; 3G: 200 kb/s; 4G: 0.1–1 Gb/s | Entire cellular area | Low | Low
Bluetooth | IEEE 802.15.1 [191] | 2.4 GHz | 1–24 Mb/s | 8–10 m | Very low | Low
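The data rates in the table constrain how much telemetry an AATV can push over a long-range link, so payloads are usually packed into compact fixed-size records. The sketch below packs a 14-byte position/status record and sends it over loop-back UDP as a stand-in transport; the field layout, scaling, address, and port are illustrative assumptions, and a LoRa or cellular deployment would hand the same bytes to its modem's own API instead.

```python
import socket
import struct

# Compact 14-byte telemetry record: latitude, longitude (1e-7 deg), speed (cm/s),
# heading (0.01 deg), battery (%), status flags.  The layout is an illustrative
# assumption sized for low-rate links such as LoRa (see Table 3).
RECORD = struct.Struct("<iiHHBB")

def encode(lat_deg, lon_deg, speed_ms, heading_deg, battery_pct, flags):
    return RECORD.pack(int(lat_deg * 1e7), int(lon_deg * 1e7),
                       int(speed_ms * 100), int(heading_deg * 100),
                       battery_pct, flags)

payload = encode(46.8772, -96.7898, 2.35, 181.5, 87, 0b0000_0001)
print(len(payload), "bytes:", payload.hex())

# Demonstration transport: loop-back UDP; a field deployment would pass the same
# bytes to its radio modem instead.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(payload, ("127.0.0.1", 9000))
sock.close()
```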
Table 4. Steering types: main parts, advantages, and drawbacks.
Steering Type | Main Parts | Advantages | Drawbacks
Rack and pinion steering
Main parts:
  • Rack and pinion gear.
  • Eliminates the need for a center link and pitman arm.
  • Two tie rod ends.
Advantages:
  • Reduction of gears.
Drawbacks:
  • It takes more energy to drive.
  • High manufacturing cost.
  • Complex structure.
Hydraulic power steering
Main parts:
  • Fluid reservoir or tank
  • Rotary valve
  • Hydraulic pump
  • Hydraulic chamber
Advantages:
  • More powerful than electric and rack and pinion power steering.
  • Good feedback.
  • Less expensive than EPS.
  • The mechanism is more reliable.
Drawbacks:
  • Consumption of more power.
  • Hydraulic fluid needs to be replaced from time to time.
  • HPS pumps do not work well with high revving motors.
  • Steering is a little complex.
  • In comparison to EPS, it is heavier.
Electric power steering
Main parts:
  • Steering angle sensor
  • Torque sensor
  • Reduction gearbox
  • ECU
  • Vehicle speed sensor
  • Electric motor
Advantages:
  • Lower maintenance needs.
  • Does not require power steering fluid.
  • Fuel efficient
  • Smaller dimensions of the mechanism
Drawbacks:
  • Higher cost of the mechanism and its elements.
  • Less powerful.
Electro-hydraulic power steering
Main parts:
  • Vane pump
  • Two electric motors
  • Steering gearbox
  • Electronics equipment
Advantages:
  • It does not need the engine running to drive the hydraulic pump.
  • Low energy consumption
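For the front-steered layouts in the table, the inner and outer wheel angles required for a given turn radius follow from simple Ackermann geometry, as the worked sketch below shows; the wheelbase and track width are illustrative values for a small agricultural ATV, and tire slip is neglected.

```python
import math

def ackermann_angles(turn_radius, wheelbase=2.0, track=1.2):
    """Inner/outer front-wheel angles [deg] for a turn radius [m] at the rear axle.

    Pure geometry (no tire slip); wheelbase and track width are illustrative
    values for a small agricultural ATV.
    """
    inner = math.atan(wheelbase / (turn_radius - track / 2))
    outer = math.atan(wheelbase / (turn_radius + track / 2))
    return math.degrees(inner), math.degrees(outer)

for r in (4.0, 8.0, 15.0):
    inner, outer = ackermann_angles(r)
    print(f"R = {r:4.1f} m -> inner {inner:5.2f} deg, outer {outer:5.2f} deg")
```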
Table 5. Brake types used in ATVs and their advantages and drawbacks.
Brake Type | Main Parts | Advantages | Drawbacks
Electromagnetic braking system
Main parts:
  • Friction disc
  • Field coil
  • Armature hub
  • Plate Spring
Advantages:
  • Fast and cheap.
  • Maintenance cost is low
  • Higher system capacity (e.g., higher speeds, heavy loads).
  • Negligible heat generation.
  • They are very fast
  • Its operation is smooth
Drawbacks:
  • Their initial cost is high.
  • Incompatible with high temperatures
  • Requirement of more electric power supply
Hydraulic braking system
Main parts:
  • Master Cylinder
  • Oil Reservoir
  • Brake Shoes
  • Pipeline
Advantages:
  • The braking force is higher compared to a mechanical braking system.
  • Brake failure is very rare (safe).
  • Compared to mechanical brakes, heat is dissipated more thoroughly
  • Less wear and tear makes them durable
  • Compared to mechanical brakes, they are more effective
  • There is no difference in braking effort between tires
Drawbacks:
  • They are more expensive than mechanical brakes.
  • It is important to use brake fluid that is compatible with the brake material
  • If braking fluid leaks, brakes could fail
  • The construction and maintenance are more complex
Mechanical braking system
Main parts:
  • Friction pads
  • Caliper
  • Disc
  • Fluid reservoir
  • Pipeline
Advantages:
  • Construction and maintenance are simple.
  • Compared to hydraulic brakes, it is less expensive
  • Suitable for emergency and parking brakes
Drawbacks:
  • Enormous heat is produced.
  • Brake failure rate is high.
  • Not as effective as hydraulic brakes.