Central Manager

This central manager is an application that is divided into the following:


A human–machine interface (HMI) is a device or program that enables a user to communicate with another device, system, or machine. In this study, an HMI based on portable devices (Android tablets) is proposed to allow farmers to perform the following:


To achieve these characteristics, a graphic device was integrated with the portable/remote controller of the mobile platform. This controller provides manual and remote vehicle control and integrates an emergency button.

#### 2.1.6. Sequence of Actions

The relationships among these components and modules and the information flow are illustrated in Figures 2 and 3. The process is a repeated sequence of actions (A0 to A6), defined as follows:


#### *2.2. Integration Methods*

Integrating all of the components defined in the previous section into an autonomous robot depends on the nature of the applications the robot is devoted to, so the connections and communication among the different components must be precisely defined. Thus, this section first describes the computing architecture of the controller, which integrates the different subsystems and modules. Second, the interfaces between subsystems are precisely defined. Finally, the operation procedure is described.

#### 2.2.1. Computing Architecture

A distributed architecture based on the open-source Robot Operating System (ROS) is proposed to integrate the system's main components onboard the mobile platform in this study. ROS is the framework most widely adopted by software developers for creating robotics applications. It consists of a set of software libraries and tools that include drivers and advanced algorithms to help developers build robot applications [15].

In this study, ROS, installed in the central controller, is used as a meta-operating system for the testing prototype. The necessary interfaces (bridges) are developed to establish communication with the autonomous vehicle, the perception system, and the laser-based weeding tool. Because of ROS's versatility and its publisher/subscriber communication model, it is possible to adapt the messages to protocols commonly used in IoT, such as Message Queuing Telemetry Transport (MQTT).
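As a minimal sketch of such a bridge, the function below maps a ROS-style topic and message dictionary onto an MQTT topic and JSON payload. The topic names and message fields are hypothetical examples, not the actual interfaces of the prototype.

```python
import json

def ros_to_mqtt(ros_topic: str, ros_msg: dict) -> tuple[str, bytes]:
    """Translate a ROS-style topic/message pair into an MQTT topic and
    JSON payload, mirroring the publisher/subscriber model on both sides."""
    # ROS topic names use '/' separators, which MQTT also accepts; strip
    # the leading slash so the MQTT topic has no empty root level.
    mqtt_topic = ros_topic.lstrip("/")
    payload = json.dumps(ros_msg).encode("utf-8")
    return mqtt_topic, payload

# Hypothetical pose message forwarded from the robot to an MQTT broker.
topic, payload = ros_to_mqtt("/robot/pose", {"x": 12.4, "y": 3.1, "theta": 0.5})
```

In a real deployment, the resulting topic and payload would be handed to an MQTT client library for publication; the sketch only shows the message adaptation step.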

ROS supports software developers in creating robotics functionalities to monitor and control robot components connected to a local network. However, this solution is not extendible to a wider network, such as the Internet. Fortunately, there exist some ROS modules that solve the problem. One is ROSLink, a protocol for extensions defining an asynchronous communication procedure between the users and the robots through the cloud [16]. ROSLink performance has been shown to be efficient and reliable, and it is widely accepted by the robotics software community [17]. Although ROSLink has been widely used to connect robotic systems with the cloud, it is oriented toward transmitting low-level messages. There is no convention to define standard data models that allow intelligent robotics systems to be scalable.

One alternative to a more internet-oriented communication framework is FIWARE, which offers interaction with the cloud using cloud services that provide well-known benefits, such as (a) cost and flexibility, (b) scalability, (c) mobility, and (d) disaster recovery [18]. FIWARE is an open software curated platform fostered by the European Commission and the European Information and Communication Technology (ICT) industry for the development and worldwide deployment of Future Internet applications. It attempts to provide a completely open, public, and free architecture and a collection of specifications that allows organizations (designers, service providers, businesses, etc.) to develop open and innovative applications and services on the Internet that fulfill their needs [19].

In this study, a cloud-based communication architecture has been implemented using FIWARE as the core, which allows messages between the edge and the cloud to be transferred and stored. The selection was made because this is an open-source platform that provides free development modules and has many enablers already developing and integrating solutions for smart agriculture.

In addition to FIWARE, we use Kafka, a robust distributed framework for streaming data (see Section 2.1.5) that allows producers to send data and consumers to subscribe to and process such updates. Kafka enables the processing of streams of events/messages in a scalable and fault-tolerant manner, and decouples producers and consumers (i.e., a consumer can process data even after a producer has gone offline). For historic data, HDFS allows the download of batches of data at any time and replicates each data block in three copies to prevent data loss.
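The producer/consumer decoupling described above can be illustrated with a minimal in-memory append-only log, in the spirit of Kafka's offset-based consumption. This is a conceptual sketch, not the Kafka client API; the event contents are invented for illustration.

```python
class EventLog:
    """Minimal append-only log illustrating Kafka-style decoupling:
    producers append events; each consumer reads from its own offset,
    even after the producer has stopped."""

    def __init__(self):
        self._events = []

    def produce(self, event: dict) -> None:
        self._events.append(event)

    def consume(self, offset: int) -> tuple[list, int]:
        """Return all events from `offset` onward plus the new offset."""
        return self._events[offset:], len(self._events)

log = EventLog()
log.produce({"sensor": "gnss", "fix": True})
log.produce({"sensor": "imu", "ok": True})
# A consumer that connects later (the producer may already be offline)
# still sees every retained event from offset 0.
events, offset = log.consume(0)
```

Real Kafka adds partitioning, replication, and retention policies on top of this basic idea, which is what provides the scalability and fault tolerance mentioned above.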

The visual dashboard will also be available on the HMI for field operations, and through it the operator will interact with the robot. FIWARE smart data models do not suffice to represent our application domain or to integrate the agricultural and robotic domains; therefore, we have extended the existing models and updated some existing entities. Since smart data models from FIWARE overlap and are sometimes inconsistent, we had to envision a unified model to integrate and reconcile the data. To connect the robotic system with the cloud, specific data models were developed to represent the different robotic elements, following the guidelines of FIWARE and its smart data models [12].
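For context, the sketch below builds a robot entity in the NGSI v2 representation that FIWARE context brokers consume. The entity type (`AgriRobot`) and attribute names are illustrative placeholders, not the actual data models developed in this study.

```python
def make_ngsi_entity(robot_id: str, lat: float, lon: float,
                     battery: float) -> dict:
    """Build an NGSI v2 entity for a field robot.

    NGSI v2 entities carry an `id`, a `type`, and attributes that each
    declare their own `type` and `value`. GeoJSON uses [lon, lat] order.
    """
    return {
        "id": f"urn:ngsi-ld:AgriRobot:{robot_id}",
        "type": "AgriRobot",
        "location": {
            "type": "geo:json",
            "value": {"type": "Point", "coordinates": [lon, lat]},
        },
        "batteryLevel": {"type": "Number", "value": battery},
    }

entity = make_ngsi_entity("001", 40.31, -3.48, 0.87)
```

In a deployment, this dictionary would be serialized to JSON and POSTed to the context broker's `/v2/entities` endpoint, after which the dashboard and other consumers could subscribe to changes.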

The IoT devices deployed in the field must be able to establish connections through WiFi and LoRa technologies. WiFi is a family of wireless network protocols. These protocols are generally used for Internet access and communication in local area networks, allowing nearby electronic devices to exchange data using radio waves. LoRa technology is a wireless protocol designed for long-range connectivity and low-power communications and is primarily targeted for the Internet of Things (IoT) and M2M networks. LoRa tolerates noise, multipath signals, and the Doppler effect. The cost of achieving this is a very low bandwidth compared to other wireless technologies. This study uses a 4G LTE-M modem to connect to the Internet.

At a lower level of communication, CANbus or ISOBUS is generally used to control and monitor the autonomous vehicle. This study uses CANbus and its communication protocol CANopen. Autonomous vehicles and agricultural tools typically contain their own safety controllers. The former behaves as a master and, in a risky situation, commands the tool to stop.
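To make the stop command concrete, the sketch below assembles a CANopen NMT "stop remote node" message, which is how a master can halt a node on the bus. The surrounding transport (a CAN interface driver) is omitted, and whether the prototype uses exactly this NMT service is an assumption; the frame layout itself follows the CANopen standard.

```python
def nmt_stop_frame(node_id: int) -> tuple[int, bytes]:
    """Build a CANopen NMT 'stop remote node' message.

    NMT messages use COB-ID 0x000 and carry two data bytes:
    [command specifier, target node ID]. Specifier 0x02 stops the
    node; node_id 0 addresses all nodes on the bus at once.
    """
    if not 0 <= node_id <= 127:
        raise ValueError("CANopen node IDs are 7-bit (0-127)")
    COMMAND_STOP = 0x02
    return 0x000, bytes([COMMAND_STOP, node_id])

# Broadcast stop to every node, e.g. when the e-stop is pressed.
cob_id, data = nmt_stop_frame(0)
```

Because NMT uses the highest-priority COB-ID (0x000), such a stop command wins bus arbitration over ordinary process data, which suits its safety role.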

The human–machine interface (HMI) will include synchronous remote procedure call-style communication over the ROS services protocol and asynchronous (topic-based) communication to ensure the robot's safety. In addition to these ROS-based protocols, the HMI has a safety control connected to the low-level safety system (by radiofrequency) for emergency stops and manual control.

Figure 6 illustrates the overall architecture, indicating the following:


The main characteristics of this architecture are summarized in Table 1.

**Figure 6.** Overall architecture.

**Table 1.** Architecture components.


#### 2.2.2. Interfaces between System Components

This architecture considers four main interfaces between systems and modules, as follows:

#### Smart Navigation Manager (M4)/Perception System (M1) interface

To receive the raw information from the perception system (sensors, cameras, etc.), the central manager uses direct connections via the transmission control protocol/Internet protocol (TCP/IP) for sensors and the universal serial bus (USB) for RGB and ToF cameras. All IoT devices use the available wireless communication technologies (WiFi and LoRa) to access the Internet and the cloud.
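As an illustration of the TCP/IP side of this interface, the sketch below reads one sensor reading from an open connection. The framing (a 4-byte big-endian length prefix followed by a UTF-8 payload) is a hypothetical convention for the example; the paper does not specify the sensors' wire format.

```python
import socket
import struct

def read_reading(conn: socket.socket) -> str:
    """Read one length-prefixed sensor reading from a TCP connection.

    Assumed framing: 4-byte big-endian length, then that many bytes of
    UTF-8 payload. TCP is a byte stream, so we loop until the full
    payload has arrived.
    """
    header = conn.recv(4)
    if len(header) < 4:
        raise ConnectionError("sensor closed before sending a header")
    (length,) = struct.unpack(">I", header)
    data = b""
    while len(data) < length:
        chunk = conn.recv(length - len(data))
        if not chunk:
            raise ConnectionError("sensor closed mid-message")
        data += chunk
    return data.decode("utf-8")
```

The explicit length prefix lets the central manager delimit readings reliably even when TCP delivers them in arbitrarily sized chunks.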

To guide the robot, the obstacle detection system obtains data from the guiding vision system (RGB and ToF cameras) through the Ethernet link that connects the central manager with the perception system. This communication is established using the ROS manager and the perception–ROS bridge (see Figure 3).

#### Smart Navigation Manager (M4)/Agricultural Tool (M3) interface

These systems can communicate through ROS messaging protocols, where the publisher/subscriber pattern is preferred. This interface exchanges simple test messages to verify the communication interface.

It is worth mentioning that the perception system and the agricultural tool are connected directly in some specific applications. This solution decreases the latency of data communication but demands moving a portion of the decision algorithms from the smart navigation manager to the tool controller; therefore, the tool must exhibit computational features. This scheme is used in the weeding system to test the proposed architecture.

#### Smart Navigation Manager (M4)/Autonomous Robot (M2) interface

Initially, these systems communicate via CANbus with the CANopen protocol. The central manager uses this protocol to receive information on the status of the autonomous vehicle and basic information from the onboard sensors (GNSS, IMU, safety system, etc.). A CANbus–ROS bridge is used to adapt the communication protocols.

#### Autonomous Robot (M2)/Agricultural Tool (M3) interface

Usually, it is not necessary for the vehicle to directly communicate with the tool because the smart navigation manager coordinates them. However, as autonomous vehicles and agricultural tools usually have safety controllers, there is wired communication between the two safety controllers. In such a case, the autonomous vehicle safety controller works as a master and commands the tool safety controller to stop the tool if a dangerous situation appears.

#### Perception System (M1)/Agricultural Tool (M3) interface

This communication is required to inform the agricultural tools about the crop status. In weeding applications, the information is related to the positions of the weeds. In this specific application, the perception system (weed meristem detection module) sends the weed meristem positions to the laser scanner module of the agricultural tool. This communication is carried out using a conventional Ethernet connection. The metadata generated via the detection system are made available in the existing ROS network and sent to the smart navigation manager.

#### Smart Navigation Manager internal/cloud communications

The smart navigation manager is a distributed system that consists of three main modules:


The central manager and the smart operation manager communicate via NGSI v2, a FIWARE application programming interface, using a FIWARE–ROS bridge to adapt ROS protocols to NGSI v2 messages. In contrast, the HMI communicates with the central manager via WiFi and the Internet, directly accessing the web services hosted in the cloud. The HMI also includes a panic button connected via radiofrequency to the safety systems of the autonomous robot and the agricultural tool.
