*3.3. Optitrack Validation*

The second experiment was conducted in an indoor environment equipped with the Optitrack camera system, which was used to evaluate the reference tracking with an external sensor. The reference tracking error in this experiment includes the odometry-based mobile base control error, the vibrations of the mobile base, the robot arm encoder measurement errors and the Optitrack measurement noise. For practical reasons, the Optitrack markers were placed at the last joint of the robot arm (Figure 13), and the position of the spray frame was calculated using a single static transformation.
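The static transformation step can be illustrated with a short sketch in Python with NumPy. The 15 cm nozzle offset below is a hypothetical calibration value chosen for illustration, not the one used in the experiment:

```python
import numpy as np

# Fixed transform from the marker frame (attached at the last arm joint) to the
# spray frame, as a 4x4 homogeneous matrix. The offset is a hypothetical
# calibration value; in practice it would be measured once and kept static.
T_marker_spray = np.eye(4)
T_marker_spray[:3, 3] = [0.0, 0.0, 0.15]  # e.g. spray nozzle 15 cm past the joint

def spray_frame_position(T_world_marker: np.ndarray) -> np.ndarray:
    """Given the marker pose measured by Optitrack (4x4 homogeneous matrix in
    the world frame), return the 3D position of the spray frame."""
    T_world_spray = T_world_marker @ T_marker_spray
    return T_world_spray[:3, 3]
```

Because the transform is static, no additional calibration is needed at run time; every Optitrack marker pose maps directly to a spray frame position.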

Graphs showing the *x* and *z* components of the spray frame position during the experiment, along with the tracking errors, are shown in Figures 14 and 15, respectively, and the corresponding error data are given in Table 3. In this experiment, the measured root mean square (RMS) error and the maximum error are 9.76 mm and 52.81 mm, respectively. The errors are larger than those in the previous experiment, which is expected: the external sensor captures the additional error sources mentioned above, as well as a significant amount of measurement noise. The external sensor data confirm that the odometry-based control does not result in a significant drift of the mobile base, as seen in Figure 14.
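The reported error statistics follow the standard definitions; a minimal sketch, assuming time-aligned N x 3 position arrays from the encoders and from Optitrack:

```python
import numpy as np

def tracking_errors(p_encoder: np.ndarray, p_optitrack: np.ndarray):
    """Compute per-sample Euclidean position errors between the encoder-based
    and externally measured spray frame positions (both N x 3, time-aligned),
    and return the RMS error and the maximum error."""
    err = np.linalg.norm(p_encoder - p_optitrack, axis=1)
    rms = np.sqrt(np.mean(err ** 2))
    return rms, err.max()
```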

In Figure 16, the overall spray frame position calculated from the Optitrack data is compared to the position calculated using the joint encoder and vehicle odometry feedback.

**Figure 13.** For the second experiment, reference tracking is externally validated using Optitrack cameras to measure the position of the spray frame in the real world. Optitrack markers are attached to the end-effector of the robot arm.

**Figure 14.** Comparison between the *x* component of the spray frame position determined by the encoder measurements and that determined externally via the Optitrack camera system, denoted *p*<sub>*S*,*x*</sub> and *p*<sup>*O*</sup><sub>*S*,*x*</sub>, respectively. The bottom plot shows the corresponding error *p*<sup>err</sup><sub>*S*,*x*</sub>.

**Table 3.** Spray frame position errors measured with the Optitrack camera system during the indoor experiment.


**Figure 15.** Comparison between the *z* component of the spray frame position determined by the encoder measurements and that determined externally via the Optitrack camera system, denoted *p*<sub>*S*,*z*</sub> and *p*<sup>*O*</sup><sub>*S*,*z*</sub>, respectively. The bottom plot shows the corresponding error *p*<sup>err</sup><sub>*S*,*z*</sub>.

**Figure 16.** Comparison between the position of the spray frame obtained from the encoder measurements and the position obtained externally via the Optitrack camera system, denoted *p*<sub>*S*</sub> and *p*<sup>*O*</sup><sub>*S*</sub>, respectively. *p*<sup>Ref</sup><sub>*S*</sub> represents the reference lawnmower trajectory, and *z̄*<sub>*R*</sub> and *z*<sub>*R*</sub> represent the upper and lower foliage boundaries, respectively.

As mentioned earlier, the task space controller selects joint velocities that follow the desired linear and roll spray frame velocities, while attempting to maintain the desired robot arm joint configuration. This results in the yaw and pitch angles of the spray frame shown in Figure 17. This type of control produces a pitch angle (*θ*<sub>*T*</sub>) graph similar to the *z* position shown in Figure 15. Similarly, the yaw angle graph (*ψ*<sub>*T*</sub>) follows the motion of the robot arm in the *x* direction.
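A common way to realize this kind of behavior is damped least-squares velocity resolution with a secondary objective projected into the Jacobian nullspace. The following Python sketch illustrates the idea under generic assumptions; it is not the paper's exact criterion function, and the gains are placeholders:

```python
import numpy as np

def task_space_velocities(J, v_des, q, q_des, damping=0.05, k_null=1.0):
    """Resolve a desired task-space velocity v_des (length m) into joint
    velocities (length n) via a damped pseudo-inverse of the m x n task
    Jacobian J, with a nullspace term nudging the joints toward the preferred
    configuration q_des. Generic redundancy-resolution sketch."""
    n = J.shape[1]
    # Damped pseudo-inverse: J^T (J J^T + lambda^2 I)^-1
    JJt = J @ J.T
    J_pinv = J.T @ np.linalg.inv(JJt + damping ** 2 * np.eye(JJt.shape[0]))
    # Secondary objective projected into the nullspace of the primary task
    dq_null = k_null * (q_des - q)
    N = np.eye(n) - J_pinv @ J
    return J_pinv @ v_des + N @ dq_null
```

Because the secondary term acts only in the nullspace of J, pulling the joints toward the desired configuration does not disturb the tracking of the commanded spray frame velocity.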

**Figure 17.** Spray frame orientation during the indoor experiment. The pitch and yaw angles of the spray frame are denoted as *θ*<sub>*T*</sub> and *ψ*<sub>*T*</sub>, respectively. These angles are not directly controlled, but are a result of the task space control criterion function.

#### **4. Conclusions and Future Work**

In this paper, a vineyard spraying algorithm for mobile manipulators is presented, based on task space model predictive control. The reference is generated based on grapevine canopy description, with the aim of minimizing unnecessary spraying waste and pollution.
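The canopy-based reference is a lawnmower (boustrophedon) pattern between the foliage boundaries; a minimal generator sketch, assuming constant scalar boundaries (in the paper they vary along the row with the canopy description):

```python
import numpy as np

def lawnmower_reference(x_start, x_end, z_low, z_high,
                        pass_spacing, n_samples_per_pass=50):
    """Generate a vertical lawnmower spray path between the lower and upper
    foliage boundaries. Returns an N x 2 array of (x, z) waypoints, with the
    sweep direction alternating on every other vertical pass."""
    xs = np.arange(x_start, x_end + 1e-9, pass_spacing)
    points = []
    for i, x in enumerate(xs):
        zs = np.linspace(z_low, z_high, n_samples_per_pass)
        if i % 2 == 1:  # alternate sweep direction on every other pass
            zs = zs[::-1]
        points.append(np.column_stack([np.full_like(zs, x), zs]))
    return np.vstack(points)
```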

There are certain limitations to the presented method. The time required to spray a row of grapevines is limited by the maximum velocity of the vehicle, as well as by the maximum joint velocities of the robot arm. Task space control is used to calculate the joint velocity commands for the robot arm, but it is not considered in the planning (MPC) phase of the algorithm. This could lead to large spray frame velocities that the task space controller cannot track. Therefore, some experimentation is required to determine the maximum feasible velocity of the lawnmower pattern reference trajectory. Moreover, the task space control algorithm has no direct way of enforcing the joint position constraints of the robot arm. This problem is handled indirectly, by allowing different angular velocities of the spray frame and constraining the optimization problem so that solutions moving the joints towards the desired configuration are preferred. No problems regarding joint position constraints were encountered in the experiments.

The mobile base is controlled using odometry feedback, which may lead to reference tracking problems since there is no external sensing. The second experiment shows that this open-loop control performs well, mainly because the vehicle moves in a straight line, which allows precise odometry. In future work, we plan to close this control loop using a localization algorithm. Moreover, the tilt of the vehicle and other effects of uneven terrain are not taken into account in the current version of the algorithm, which could also be addressed in future work.

In the presented experiments, an operator-selected grapevine row description was used. Manual selection of canopy areas proved to be error-prone, tedious and time-consuming. In future work, a foliage detection algorithm will be incorporated to generate the grapevine row description.
Since the detection algorithm must be robust to changing lighting conditions, we plan to base it on a combination of deep learning and depth information captured by an RGBD camera. Depth information acquired via infrared projection is sensitive to sunlight, so a camera based on pure stereo vision would be more suitable for this task.

The presented method was evaluated in a vineyard spraying experiment, demonstrating its ability to adapt to a specific grapevine row structure. The mobile base velocity adapts to the row structure, as can be seen in the accompanying video and in the graphs presented in Section 3.2. An additional experiment evaluated the reference tracking using Optitrack cameras as external sensors. The error data show RMS spray frame position errors of 4.32 mm and 9.76 mm for the first and second experiment, respectively. Since the spray frame is located at a certain distance from the last link of the robot arm, its position is sensitive to small joint position errors. The presented error values are sufficiently small for the task of vineyard spraying, while a trade-off exists between reference tracking precision and the time required to execute the task.

The focus of this work was on the control algorithm that sprays a single row of vines. In the future, mission planning and navigation would allow the mobile manipulator to autonomously treat the entire vineyard by entering each row and executing the presented algorithm. Experiments evaluating the spray quality using water-sensitive paper are planned. Extensive experiments are also to be conducted to determine the impact of the presented method on plant health and fruit production, and to compare it to manual spraying. The presented method will further be tested for the task of fruit spraying, rather than spraying the entire foliage, which is the focus of this article. Another challenge is the presence of dust in the vineyard, against which the equipment must be adequately protected. Moreover, while excessive robot arm heating was not noticed during the presented experiments, it could present a potential problem in the case of prolonged robot operation. In that case, some form of active cooling could be used to mitigate the problem. Currently, the spray tank has a volume of 30 L, which will be increased in the future.

**Author Contributions:** Conceptualization, Z.K., G.V. and I.V.; methodology, I.V.; software, I.V.; validation, I.V. and G.V.; investigation, I.V.; writing—original draft preparation, I.V.; writing—review and editing, G.V. and Z.K.; visualization, I.V.; supervision, Z.K. and G.V.; project administration, Z.K. and G.V.; funding acquisition, Z.K. and G.V. All authors have read and agreed to the published version of the manuscript.

**Funding:** The research work presented in this article has been supported by the project titled Heterogeneous Autonomous Robotic system in Viticulture and Mariculture (HEKTOR), financed by the European Union through the European Regional Development Fund—The Competitiveness and Cohesion Operational Programme (KK.01.1.1.04.0041).

**Institutional Review Board Statement:** Not applicable.

**Informed Consent Statement:** Not applicable.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **Abbreviations**

The following abbreviations are used in this manuscript:


