*2.4. Framework*

The whole framework runs on board the Jetson board and was developed in the *ROS* (Robot Operating System) environment. The main advantage offered by ROS is its flexibility and modularity: multiple software packages, called nodes, can be interconnected through a publisher–subscriber scheme. The framework is shown in Figure 4, where the nodes are drawn in blue, the topics in green, and the functions implemented within the nodes in black. The DJI OSDK node provides the communication interface to the Matrice 300 RTK: through the telemetry function it is possible to obtain all the data of interest, while through the control function it is possible to fly the drone autonomously by sending a set-point velocity vector. In the current implementation, the telemetry data actually used by the framework are the *GNSS* (global navigation satellite system) coordinates and the attitude vector of the drone. These data are then used to build the radar maps, as explained in Section 2.5.
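The node interconnection described above can be illustrated with a minimal publisher–subscriber sketch. This is plain Python rather than the actual ROS API (`rospy`), and the topic name and message fields are illustrative, not those of the real framework:

```python
from collections import defaultdict

class TopicBus:
    """Minimal publisher-subscriber dispatcher mimicking ROS topics:
    publishers and subscribers are decoupled and only share a topic name."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every callback registered on this topic.
        for callback in self._subscribers[topic]:
            callback(message)

bus = TopicBus()
received = []
# Sketch: a map-building node subscribes to the telemetry topic that the
# DJI OSDK node would publish (GNSS coordinates and attitude vector).
bus.subscribe("telemetry", received.append)
bus.publish("telemetry", {"gnss": (45.06, 7.66, 250.0), "attitude": (0.0, 0.0, 1.57)})
```

Because nodes only agree on topic names and message formats, any node can be replaced or added without modifying the others, which is the modularity the framework relies on.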

**Figure 4.** Block diagram of the framework that implements the obstacle detection and avoidance algorithm. In particular, the framework is able to communicate with the drone and to receive data from the sensors to search for obstacles in the surrounding environment.

The radar node publishes a topic containing the 2D coordinates of the obstacles detected in the radar frame; as already explained in Section 2.3, the AWR1843BOOST radar does not reliably provide the elevation coordinate. The ZED 2 node is the interface toward the stereoscopic camera: it provides the fixed-frame position estimate computed from the vision data, the camera attitude, and the point cloud topic that contains the 3D image of the environment. As will be explained in Section 2.6, by suitably processing this topic it is possible to build an obstacle-detection system. The *ODS* (obstacle-detection system) node implements several functions: in addition to the obstacle-detection task, it is responsible for building the maps [12] explained in Section 2.5, exploiting the absolute paths generated from the GNSS data [12] provided by the DJI OSDK node and from the SLAM (simultaneous localization and mapping) estimate produced by the vision process [18–21].
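Since the radar reports obstacles in its own 2D frame, mapping them into the fixed frame requires the drone pose from telemetry. The following sketch shows a simplified, yaw-only planar transform; the function name and the reduction to a single rotation angle are assumptions for illustration, not the framework's actual map-building code:

```python
import math

def radar_to_fixed_frame(x_r, y_r, drone_x, drone_y, yaw):
    """Rotate a 2D radar-frame detection by the drone's yaw and translate
    it by the drone's fixed-frame position (planar, yaw-only sketch)."""
    x_f = drone_x + x_r * math.cos(yaw) - y_r * math.sin(yaw)
    y_f = drone_y + x_r * math.sin(yaw) + y_r * math.cos(yaw)
    return x_f, y_f

# Obstacle 10 m ahead in the radar frame, drone at (2, 3) heading 90 degrees:
# in the fixed frame the obstacle lies 10 m along the y axis from the drone.
obstacle = radar_to_fixed_frame(10.0, 0.0, 2.0, 3.0, math.pi / 2)
```

In the real system the full attitude vector (roll, pitch, yaw) and the GNSS-derived position would enter this transform, but the 2D case captures why both telemetry streams are needed to place radar detections on the map.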

Since the vision maps cannot be built directly from the point cloud topic, a dedicated node called "point cloud to voxel map" was developed. This node converts the point cloud topic into a usable data structure [22], and the processed data are then used to build the vision maps.
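The core of such a conversion is quantizing each 3D point into a discrete cell. The sketch below illustrates the idea on plain coordinate tuples; the actual node operates on ROS point cloud messages, and the voxel size and set-based structure are illustrative assumptions:

```python
def point_cloud_to_voxel_map(points, voxel_size):
    """Quantize 3D points into a set of occupied voxel indices.
    Nearby points collapse into the same voxel, turning a dense point
    cloud into a compact occupancy structure usable for map building."""
    occupied = set()
    for x, y, z in points:
        voxel = (int(x // voxel_size),
                 int(y // voxel_size),
                 int(z // voxel_size))
        occupied.add(voxel)
    return occupied

# Three points: the first two fall into the same 0.5 m voxel.
cloud = [(0.1, 0.2, 0.05), (0.15, 0.22, 0.07), (1.4, 0.9, 0.3)]
voxels = point_cloud_to_voxel_map(cloud, voxel_size=0.5)
```

Collapsing millions of points into a bounded set of voxels is what makes the downstream vision maps tractable on the embedded Jetson hardware.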

The last node implemented in the framework is the control node, which defines the policy (planner) that generates the desired set points for the autonomous flight algorithm. In addition, the control node implements the avoidance strategy, which is enabled whenever a potentially dangerous obstacle is detected. In this case, the function that realizes the avoidance strategy bypasses the planner function, which is otherwise active during the autonomous mission.
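The bypass logic amounts to an arbitration step between two set-point sources. A minimal sketch, assuming velocity set points as 3-tuples and a boolean obstacle flag (both illustrative, not the framework's actual interfaces):

```python
def select_setpoint(planner_setpoint, avoidance_setpoint, obstacle_detected):
    """Arbitrate between the planner and the avoidance strategy:
    when a dangerous obstacle is flagged, the avoidance set point
    bypasses the planner; otherwise the planner output is forwarded."""
    return avoidance_setpoint if obstacle_detected else planner_setpoint

planner = (1.0, 0.0, 0.0)     # nominal mission velocity set point
avoidance = (0.0, 1.0, 0.0)   # evasive velocity set point

nominal = select_setpoint(planner, avoidance, obstacle_detected=False)
evading = select_setpoint(planner, avoidance, obstacle_detected=True)
```

Keeping the arbitration in the control node means the planner never needs to know about obstacles: the avoidance strategy simply overrides its output at the last stage before the velocity command is sent to the drone.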
