### *3.4. Electro-Optical*

The electro-optic sensing system transmits, detects, and analyzes radiation in the optical spectrum, including visible, infrared, and ultraviolet light. It supports long-range imaging and yields reliable results under different illumination levels. Typical components include optics, lasers, detectors, cameras, and a processing unit. Such systems have been used for continuous UAV detection, direction finding, and localization across a range of weather conditions.

In [46], the authors proposed using machine learning techniques to automatically detect and track small moving objects in an airfield from their motion patterns, i.e., the ways an object moves. The system utilized remote digital towers with high-resolution cameras covering a 360-degree view of airports to construct a video dataset comprising aircraft in an airfield and drones. Harris corner detection and a convolutional neural network, followed by optical flow, were applied to the dataset to locate and track very small moving objects in the wide-area scene. The results showed that the system can detect objects as small as 15 × 15 pixels in 1080p images with a low miss rate. Motion-based features are extracted from the object trajectories, after which a K-nearest neighbor classifier distinguishes drones from aircraft with an accuracy of 93%.
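The trajectory-classification step can be sketched with a minimal K-nearest neighbor classifier. The motion features and training values below are illustrative assumptions, not data from [46]:

```python
import math

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training samples (Euclidean distance in feature space)."""
    dists = sorted(
        (math.dist(feat, query), label) for feat, label in train
    )
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Toy motion features: (mean speed in px/frame, heading-change variance).
# Drones are assumed here to be slower but more erratic than aircraft.
train = [
    ((2.0, 0.80), "drone"), ((1.5, 0.95), "drone"), ((2.5, 0.70), "drone"),
    ((9.0, 0.05), "aircraft"), ((8.0, 0.10), "aircraft"), ((10.0, 0.02), "aircraft"),
]

print(knn_classify(train, (2.2, 0.75)))   # a slow, erratic track
print(knn_classify(train, (8.5, 0.07)))   # a fast, straight track
```

In practice, the features would be computed from the tracked trajectories produced by the optical-flow stage rather than hand-specified.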

The approach proposed in [42] performs detection by integrating a 3D LADAR sensing system. The study employed voxel-based background subtraction and variable radially bounded nearest neighbor (V-RBNN) techniques to detect small UAVs at ranges up to 2 km. During the development phase, this integration was supported with an augmented dataset to enhance the model's performance. The developed LADAR scanner can be rotated to cover a wide area, e.g., 350 degrees in azimuth and 120 degrees in elevation. Furthermore, the clustering algorithm used, V-RBNN, has a positive impact on target UAV classification, which may broaden the use of the proposed detection system in various applications.
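The idea behind radially bounded nearest-neighbor clustering can be illustrated with a plain O(n²) grouping of 3D points; this is a simplified sketch of the concept, not the V-RBNN implementation of [42]:

```python
import math

def rbnn_cluster(points, radius):
    """Radially bounded nearest-neighbor clustering: two points share a
    cluster when they lie within `radius` of each other (transitively)."""
    labels = [-1] * len(points)
    current = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        labels[i] = current
        stack = [i]
        while stack:                      # flood-fill the neighborhood
            p = stack.pop()
            for j in range(len(points)):
                if labels[j] == -1 and math.dist(points[p], points[j]) <= radius:
                    labels[j] = current
                    stack.append(j)
        current += 1
    return labels

# Two well-separated 3D blobs of LADAR returns -> two clusters.
pts = [(0, 0, 0), (0.3, 0.1, 0), (0.2, 0.4, 0.1),
       (5, 5, 5), (5.2, 4.9, 5.1)]
print(rbnn_cluster(pts, radius=1.0))
```

The "variable" aspect of V-RBNN adapts the radius with range, since point density falls off with distance from the scanner; a fixed radius is used here for brevity.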

In some electro-optic solutions, the detection data transferred to the analysis phase (advanced processing, machine vision, or machine learning) are not accurate enough to track small UAVs effectively. The authors in [44] proposed an electro-optic system integrated with an all-sky camera system to obtain a wider view of the monitored area and improve the detection resolution. Multiple experiments were performed to evaluate the proposed solution and to test its integration with other cues, e.g., acoustic sensing. The combination of all three systems, electro-optic, all-sky camera, and acoustic cues, was also evaluated.

The study in [41] improved the reliability of its outcomes by using an electro-optical method for small UAV detection and tracking. A live video stream was processed in real time, and a differential method, which computes the differences between consecutive frames in the stream, was employed for analyzing and investigating UAV detection and tracking. For hovering UAVs, or objects moving axially with little apparent displacement between frames, contrast was computed only for the displaced part of the object, and the video streams were processed using the DIS algorithm.
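A minimal sketch of such a differential (frame-differencing) step, assuming 8-bit grayscale frames held as NumPy arrays:

```python
import numpy as np

def diff_detect(prev_frame, frame, threshold=25):
    """Flag pixels whose absolute intensity change between two
    consecutive frames exceeds `threshold` (8-bit grayscale)."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# Static 64x64 background with one 4x4 object that shifts right by 3 px.
prev_frame = np.zeros((64, 64), dtype=np.uint8)
frame = prev_frame.copy()
prev_frame[30:34, 10:14] = 200   # object in the old position
frame[30:34, 13:17] = 200        # object in the new position
mask = diff_detect(prev_frame, frame)
ys, xs = np.nonzero(mask)
print(mask.sum(), xs.min(), xs.max())
```

Note that the overlapping part of a slowly moving object produces no difference, which is exactly why hovering UAVs need the special handling described above.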

The electro-optical detection method needs to consider the following factors: the size and 3D movement of detected airborne objects, their speed, the maximal distance of the objects from the camera position, the optical lens characteristics, and the linear image resolution of the object. Since these factors feed directly into the image processing methods, the detection distance and detection outputs are affected negatively if any of them contains faulty or inaccurate information. The main objective of the method is to detect moving objects in real time at the maximal distance from the camera.
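The interplay between target size, distance, focal length, and linear image resolution follows from the pinhole camera model; the lens and sensor figures below are hypothetical, chosen only to show how quickly on-target resolution shrinks with range:

```python
def pixels_on_target(object_size_m, distance_m, focal_mm, pixel_pitch_um):
    """Linear image resolution of a target under the pinhole model:
    image size = focal_length * object_size / distance, in pixels."""
    image_mm = focal_mm * (object_size_m * 1000.0) / (distance_m * 1000.0)
    return image_mm / (pixel_pitch_um / 1000.0)

# Assumed example: a 0.4 m drone, 50 mm lens, 3.45 um pixel pitch.
for d in (100, 500, 1000):
    print(d, round(pixels_on_target(0.4, d, 50.0, 3.45), 1))
```

Under these assumptions the drone spans roughly 58 pixels at 100 m but only a handful at 1 km, which is why detectors tuned for 15 × 15 pixel objects matter for long-range operation.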

A single dynamic vision sensor (DVS) camera, a base station, a UAV, and a blinking marker are used in [45] to detect and locate mobile UAVs. During video streaming, the differences among the captured frames are computed and filtered with a temporal-filtering algorithm to detect UAVs against the background image. A triangulation algorithm is also used to localize the captured UAVs or drones by extracting spatial localization parameters and providing details about the physical size of the detected object.
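A basic 2D version of the triangulation step, assuming two observation stations with known positions and measured bearings to the target (a simplification of the spatial localization in [45]):

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Intersect two bearing rays (radians, measured from the +x axis)
    cast from known station positions p1 and p2; returns (x, y)."""
    x1, y1 = p1
    x2, y2 = p2
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 via 2D cross products.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; no unique intersection")
    t1 = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    return (x1 + t1 * d1[0], y1 + t1 * d1[1])

# Target at (100, 50) observed from stations at (0, 0) and (200, 0).
b1 = math.atan2(50, 100)
b2 = math.atan2(50, -100)
x, y = triangulate((0, 0), b1, (200, 0), b2)
print(round(x, 3), round(y, 3))
```

With the measured angular size added, the same geometry yields an estimate of the target's physical size, as reported in [45].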

Similarly, in [43], Seidaliyeva et al. developed an algorithm to detect drones from a video stream. The system overview for the process is illustrated in Figure 6. The input frames are passed into a moving-object detector. The authors relied on background subtraction followed by threshold filtering and morphological operations for detecting moving objects. The background subtraction method maintains a model background image that is subtracted from each frame to extract the foreground; it therefore heavily depends on the background remaining static throughout the operation. A CNN-based classifier is then applied to the detections to distinguish drones from other objects such as birds.
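The pipeline of [43] (background subtraction, threshold filtering, morphological operations) can be approximated as follows; the threshold value and 3 × 3 erosion kernel are illustrative assumptions:

```python
import numpy as np

def foreground_mask(background, frame, threshold=30):
    """Background subtraction with threshold filtering: pixels that
    deviate from the static background model become foreground."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

def erode(mask):
    """Crude 3x3 morphological erosion to suppress single-pixel noise."""
    m = np.pad(mask, 1)
    out = np.ones_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= m[1 + dy : 1 + dy + mask.shape[0],
                     1 + dx : 1 + dx + mask.shape[1]]
    return out

background = np.zeros((32, 32), dtype=np.uint8)
frame = background.copy()
frame[10:16, 10:16] = 255    # a 6x6 moving object
frame[2, 2] = 255            # single-pixel sensor noise
mask = erode(foreground_mask(background, frame))
print(mask.sum())
```

The erosion removes the isolated noise pixel while preserving the interior of the real object; the surviving regions would then be cropped and passed to the CNN-based classifier.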

The limited detection range can pose a challenge when employing the electro-optic technique. Detection performance can be enhanced by incorporating other supporting algorithms. A summary of the reviewed studies of this method is given in Table 5.

**Figure 6.** Proposed UAV detection algorithm used in electro-optical method [43].


**Table 5.** Summary of reviewed electro-optical sensor-based techniques for UAV characterization.

The performance of vision-based solutions degrades with no LOS (camera angle), poor-quality lenses, foggy, dark, and dusty environments (weather conditions), and unfavorable background temperatures [9,10,21,63]. These limitations can be addressed to some extent by using an IR camera, i.e., detection based on the heat of drone components, but that increases the system cost significantly and limits the detection range and operating environment due to the sensitivity of the sensors that measure the thermal difference between the drone and the background [14,63].

### *3.5. Hybrid Fusion Systems*

The hybrid fusion of multiple cues, such as radio frequency, radar, acoustic, and visual sensors, improves the performance of detection, classification, and localization of both the drone and its controller. Jovanoska et al. [47] suggested an array of sensors to collect the detected drone's data to be fed to a fusion engine for further analysis using multiple hypothesis tracker (MHT) techniques, as illustrated in Figure 7. The RF signal is received and processed to compute the detected drone's DOA for drone localization. The captured signals from the integrated acoustic sensors are filtered to remove unwanted noise. After the drone signatures are identified and detected, a coherent broadband beamforming technique is used to estimate the drone's bearing angle, whose error is reduced by a two-step filter. The extracted DOA and bearing angle are then passed to the fusion engine for localization purposes. Finally, a GSM passive radar [41] is used for UAV detection and localization, and its output is fed to the fusion engine of the overall system. Combining all these technologies improved the system performance and enhanced the localization accuracy.
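The bearing-estimation step can be illustrated with a simple delay-and-sum scan over a linear microphone array. The array geometry, tone, and integer-sample delay compensation below are simplifying assumptions, not the broadband beamformer of [47]:

```python
import numpy as np

def delay_and_sum_doa(signals, mic_x, fs, c=343.0):
    """Scan candidate bearings and return the one where delay-compensated
    microphone signals add most coherently (far-field linear array,
    integer-sample delays)."""
    n = signals.shape[1]
    best_angle, best_power = 0.0, -1.0
    for deg in range(-90, 91):
        th = np.deg2rad(deg)
        delays = np.round(mic_x * np.sin(th) / c * fs).astype(int)
        summed = np.zeros(n)
        for sig, d in zip(signals, delays):
            summed += np.roll(sig, -d)   # advance each mic to align wavefronts
        power = np.mean(summed ** 2)
        if power > best_power:
            best_power, best_angle = power, float(deg)
    return best_angle

# Simulate a 1 kHz tone arriving at 30 degrees on a 4-mic linear array.
fs, c = 48000, 343.0
mic_x = np.array([0.0, 0.1, 0.2, 0.3])      # mic positions along x (m)
t = np.arange(2048) / fs
true_th = np.deg2rad(30)
signals = np.array([np.sin(2 * np.pi * 1000 * (t - x * np.sin(true_th) / c))
                    for x in mic_x])
print(delay_and_sum_doa(signals, mic_x, fs))
```

A broadband implementation would perform this alignment per frequency band and with fractional delays; the sketch keeps the core idea of steering by delay compensation.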

**Figure 7.** RF sensor direction finding reported in [47].

In [40], the proposed UAV detection and localization system relies on the time delay and beamforming of the acoustic signal collected from a set of microphones. Acoustic signal characteristics, together with such signal processing, are used to find the DOA of the UAV's recorded signal. Furthermore, Kalman filtering is used to refine the UAV's trajectory. The system in [48] is designed to identify and track the RF signal emitted by portable RF devices. It consists of two parts: (1) RF signal acquisition, achieved by an antenna array followed by a Nyquist-rate ADC, and (2) signal processing. The RF signal from the first part is passed into an FFT to measure the DOA. The signal is then passed through a digital bandpass filter to measure the TDOA, which is used together with the DOA to estimate the location. The AOA calculated from the DOA, the estimated location, and past tracking information are used for tracking the drone's position [40].
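A standard way to obtain the TDOA between two sensors is the peak of their cross-correlation; the following is a generic sketch with synthetic noise, not the exact processing chain of [48]:

```python
import numpy as np

def tdoa_samples(sig_a, sig_b):
    """Estimate the delay of sig_a relative to sig_b (in samples)
    from the peak of their full cross-correlation."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    return int(np.argmax(corr)) - (len(sig_b) - 1)

rng = np.random.default_rng(0)
s = rng.standard_normal(4096)   # broadband source signal
delay = 37
b = np.roll(s, delay)           # sensor B hears the signal 37 samples later
print(tdoa_samples(b, s))
```

Dividing the sample offset by the sampling rate gives the time difference, and multiplying by the propagation speed gives the range difference used in the localization step.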

#### *3.6. Comparison of Detection Technologies*

Earlier sections have discussed different techniques for detecting, identifying, and localizing UAVs. Each technique's performance varies according to equipment complexity and cost, coverage range and distance, operation efficiency, accuracy and precision measurements, etc. Table 6 summarizes the techniques cited in this study with their main features and affected factors. Combining the different techniques and integrating different sensors can increase the accuracy and reliability of the UAV detection systems, reduce the possibility of errors, and improve the system's ability to adapt.


**Table 6.** Summary of all reviewed UAV detection techniques.

#### **4. Drone Controller Detection and Localization**

Once UAVs are detected, detection and localization of the drone controller are performed to monitor its communication and limit illegal use.

UAV detection systems differ according to the technologies and functions performed, such as identification, classification, tracking, localization, interdiction, destruction, and damage. Technology and functions are selected and implemented based on the main requirements of the UAV detection system. In this section, some detection systems that support localization functionality are reviewed.

The process of locating and positioning a UAV is mainly based on collecting direct measurements of the detected UAV and its emitted signals. These measurements and other extracted features are calculated and utilized in the UAV detection system to estimate the geolocation of the UAV. The computed geolocation parameters for the direction-finding methods include angle of arrival (AOA) [40], time of arrival (TOA) [64], direction of arrival (DOA) [18,47], frequency difference of arrival (FDOA) [51], time difference of arrival (TDOA) [11,35], and received signal strength (RSS) [65].

The proposed system in [12] utilizes a low-cost passive RF-based UAV detection and localization method. The system computes the AOA of the received RF signal to determine whether the transmitted signal peaks correspond to the UAV or its controller. Then, it uses the triangulation technique to estimate the location of the RF signal peak sources. A combination of the free-space path loss model and triangulation is reported in [50] to detect and localize a stationary drone controller. The proposed system contains two direction-finding systems for direction identification and localization of the drone and its controller. Each direction-finding system has an omnidirectional antenna for detecting the occurrence of drone signals and a mechanically agile directional antenna for direction identification and localization of RF signal peaks for the UAV and/or its controller signals. The whole system is depicted in Figure 8.
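Inverting the free-space path-loss model yields a range estimate from transmit and receive power; the link-budget numbers below are hypothetical, not values from [50]:

```python
import math

def fspl_distance_km(p_tx_dbm, p_rx_dbm, freq_mhz, gains_dbi=0.0):
    """Invert the free-space path-loss model to estimate range:
    FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    path_loss_db = p_tx_dbm + gains_dbi - p_rx_dbm
    return 10 ** ((path_loss_db - 20 * math.log10(freq_mhz) - 32.44) / 20)

# Assumed example: a 2.4 GHz controller at 20 dBm, received at -60 dBm.
print(round(fspl_distance_km(20, -60, 2400), 4))
```

Combining such a range estimate from one station with a bearing (or a second range) supports the triangulation described above; in practice, multipath and obstructions make the free-space estimate an upper-bound approximation.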

**Figure 8.** Diagram of direction finding mechanism in the proposed system [12].

The study [12] noted that distinguishing the RF signals of the UAV and its controller from other RF signals in the surrounding area poses a challenge for drone controller localization. The reported direction-finding station consists of two modules: a drone signal analysis module to classify drone and remote controller (RC) signals, and a direction-finding module. Each direction-finding station extracts the required parameters from the detected RF signals and uses a mechanically steered antenna for identification and localization. The precision of the direction-finding function depends on the antenna's directivity and gain, the drone velocity, the scanning velocity, and the beam width of the directional antenna used.

The detection system reported in [18] targets frequency hopping spread spectrum (FHSS) signals to detect and locate UAVs and RCs. A cyclostationarity analysis algorithm is used to identify FHSS-type drone RC signals and differentiate them from other background signals operating in the same frequency band. After successful classification of the drone RC signals, STFT and additional re-sampling processing are applied to enhance the detection accuracy of the reconstructed RC signal. Finally, the direction-finding phase is achieved by implementing subspace algorithms to identify the AOA of the FHSS drone RC signal. The proposed system [49] utilizes a uniform linear array of quasi-Yagi antennas in the experimental setup to enhance the precision of the direction-finding function.
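The STFT stage can be reduced to tracking the dominant frequency in consecutive analysis windows; the synthetic hop sequence below is illustrative, not a real FHSS controller signal from [18]:

```python
import numpy as np

def dominant_freqs(signal, fs, win=256):
    """Track the strongest frequency in consecutive non-overlapping
    windows (a bare-bones STFT magnitude analysis), as one might
    follow the hop sequence of an FHSS transmitter."""
    hops = []
    for start in range(0, len(signal) - win + 1, win):
        seg = signal[start:start + win] * np.hanning(win)
        spectrum = np.abs(np.fft.rfft(seg))
        hops.append(np.fft.rfftfreq(win, 1 / fs)[np.argmax(spectrum)])
    return hops

# Synthetic hopper: four windows, one tone per window (bin-aligned).
fs, win = 8192, 256
tones = [1024, 2048, 3072, 1536]
t = np.arange(win) / fs
signal = np.concatenate([np.sin(2 * np.pi * f * t) for f in tones])
print(dominant_freqs(signal, fs, win))
```

A real receiver would use overlapping windows and re-sampling to resolve hops that straddle window boundaries, which is the refinement step the authors describe.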
