Advances in Detection and Tracking Applications for Drones and UAM Systems

A special issue of Drones (ISSN 2504-446X).

Deadline for manuscript submissions: closed (20 April 2024) | Viewed by 4718

Special Issue Editors


Guest Editor
Department of Industrial Engineering, University of Naples Federico II, 80125 Naples, Italy
Interests: UAS; payloads for UAV; electro-optical sensors; radar systems; ATM and UTM; attitude sensors

Guest Editor
Department of Industrial Engineering, University of Naples Federico II, 80125 Naples, Italy
Interests: unmanned aerial systems; unmanned traffic management; trajectory prediction

Special Issue Information

Dear Colleagues,

The technological improvement of Unmanned Aerial Systems supports innovative tasks in civil and military applications. Detection and tracking tasks can be achieved by integrated on-board systems that exploit advanced processing based on reliable, high-performance sensors such as radars, optical sensors, lidars, acoustic sensors, and RF analyzers. Traffic management and surveillance require accurate and efficient systems that must be properly integrated to enable new applications in Unmanned Traffic Management and Urban Air Mobility.

The proposed Special Issue aims to investigate innovative detection and tracking solutions that can be used for navigation, traffic management, traffic integration, detect-and-avoid, and surveillance purposes. Artificial Intelligence techniques can be applied for data processing in simulated or real scenarios. Advances in on-board data processing for target detection and tracking aim to improve aerial vehicle performance or enable advanced payload tasks. Surveillance can also be supported by properly designed ground systems and services, considering the Urban Air Mobility scenario under development.

We are pleased to invite original contributions and reviews. Topics can be related (but are not limited) to:

  • detection and tracking of targets and incoming traffic for navigation, traffic management, and traffic integration;
  • detection and tracking of Unmanned Aerial Systems for surveillance and detect-and-avoid;
  • Urban Air Mobility applications and services;
  • innovative image processing and sensor fusion;
  • advanced solutions based on electro-optical, radar, and/or lidar sensors.

Prof. Dr. Giancarlo Rufino
Dr. Claudia Conte
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Drones is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • unmanned aerial systems
  • urban air mobility
  • sense and avoid
  • artificial intelligence
  • sensor fusion
  • tracking

Published Papers (3 papers)


Research

19 pages, 746 KiB  
Article
Impact Analysis of Time Synchronization Error in Airborne Target Tracking Using a Heterogeneous Sensor Network
by Seokwon Lee, Zongjian Yuan, Ivan Petrunin and Hyosang Shin
Drones 2024, 8(5), 167; https://doi.org/10.3390/drones8050167 - 23 Apr 2024
Viewed by 433
Abstract
This paper investigates the influence of time synchronization on sensor fusion and target tracking. As a benchmark, we design a target tracking system based on track-to-track fusion architecture. Heterogeneous sensors detect targets and transmit measurements through a communication network, while local tracking and track fusion are performed in the fusion center to integrate measurements from these sensors into a fused track. The time synchronization error is mathematically modeled, and local time is biased from the reference clock during the holdover phase. The influence of the time synchronization error on target tracking system components such as local association, filtering, and track fusion is discussed. The results demonstrate that an increase in the time synchronization error leads to deteriorating association and filtering performance. In addition, the results of the simulation study validate the impact of the time synchronization error on the sensor network. Full article
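The mechanism the abstract describes can be illustrated with a minimal sketch. This is not the paper's model; the offset and drift values below are hypothetical, chosen only to show how a clock bias that grows during the holdover phase shifts measurement timestamps and, for a moving target, maps directly into position error:

```python
# Illustrative sketch (not the paper's code): a local clock that drifts from
# the reference clock during holdover, and the position error it induces.

def holdover_bias(t, offset=0.002, drift=1e-4):
    """Local-clock error model: initial offset plus linear drift (seconds)."""
    return offset + drift * t

def position_error(velocity, t):
    """Position error when a measurement taken at true time t is stamped
    with the biased local time: the target moves during the bias interval."""
    return velocity * holdover_bias(t)

# A 50 m/s target observed 60 s into holdover: 50 * (0.002 + 1e-4 * 60) = 0.4 m
err = position_error(50.0, 60.0)
```

For the same drift rate, faster targets incur proportionally larger position errors, which is consistent with the paper's finding that association and filtering performance deteriorate as the time synchronization error grows.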

20 pages, 3022 KiB  
Article
PVswin-YOLOv8s: UAV-Based Pedestrian and Vehicle Detection for Traffic Management in Smart Cities Using Improved YOLOv8
by Noor Ul Ain Tahir, Zhe Long, Zuping Zhang, Muhammad Asim and Mohammed ELAffendi
Drones 2024, 8(3), 84; https://doi.org/10.3390/drones8030084 - 28 Feb 2024
Cited by 1 | Viewed by 2102
Abstract
In smart cities, effective traffic congestion management hinges on adept pedestrian and vehicle detection. Unmanned Aerial Vehicles (UAVs) offer a solution with mobility, cost-effectiveness, and a wide field of view, and yet optimizing recognition models is crucial to surmounting challenges posed by small and occluded objects. To address these issues, we utilize the YOLOv8s model and a Swin Transformer block and introduce the PVswin-YOLOv8s model for UAV-based pedestrian and vehicle detection. Firstly, the backbone network of YOLOv8s incorporates the Swin Transformer model for global feature extraction for small object detection. Secondly, to address the challenge of missed detections, we integrate the Convolutional Block Attention Module (CBAM) into the neck of YOLOv8s; both its channel and spatial attention modules are used because of how effectively they propagate feature information across the network. Finally, we employ Soft-NMS to improve the accuracy of pedestrian and vehicle detection in occlusion situations: Soft-NMS increases performance and handles overlapping bounding boxes well. The proposed network reduced the fraction of small objects overlooked and enhanced model detection performance. Performance comparisons with different YOLO versions (for example, YOLOv3-tiny, YOLOv5, YOLOv6, and YOLOv7), YOLOv8 variants (YOLOv8n, YOLOv8s, YOLOv8m, and YOLOv8l), and classical object detectors (Faster-RCNN, Cascade R-CNN, RetinaNet, and CenterNet) were used to validate the superiority of the proposed PVswin-YOLOv8s model. The efficiency of the PVswin-YOLOv8s model was confirmed by the experimental findings, which showed a 4.8% increase in mean average precision (mAP) compared to YOLOv8s on the VisDrone2019 dataset. Full article
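Of the three components, Soft-NMS is a standard, self-contained algorithm: instead of discarding boxes that overlap the current best detection, it decays their scores. A minimal pure-Python sketch of the Gaussian variant (the general technique, not the authors' implementation; box format, sigma, and thresholds here are illustrative):

```python
import math

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def soft_nms(dets, sigma=0.5, score_thresh=0.001):
    """Gaussian Soft-NMS: decay the scores of boxes overlapping the current
    best box by exp(-iou^2 / sigma) instead of suppressing them outright."""
    dets = [list(d) for d in dets]  # each detection: [x1, y1, x2, y2, score]
    keep = []
    while dets:
        best = max(dets, key=lambda d: d[4])
        dets.remove(best)
        keep.append(tuple(best))
        for d in dets:
            d[4] *= math.exp(-iou(best, d) ** 2 / sigma)
        dets = [d for d in dets if d[4] > score_thresh]
    return keep

# Three detections: two heavily overlapping, one distinct.
# All three survive, but the overlapped box keeps only a reduced score.
kept = soft_nms([(0, 0, 10, 10, 0.9), (1, 1, 10, 10, 0.8), (20, 20, 30, 30, 0.7)])
```

This soft decay is what helps in occlusion scenes: a genuinely distinct but overlapping pedestrian is down-weighted rather than deleted, so it can still be recovered if its score stays above the threshold.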

14 pages, 765 KiB  
Article
Towards Real-Time On-Drone Pedestrian Tracking in 4K Inputs
by Chanyoung Oh, Moonsoo Lee and Chaedeok Lim
Drones 2023, 7(10), 623; https://doi.org/10.3390/drones7100623 - 6 Oct 2023
Viewed by 1497
Abstract
Over the past several years, significant progress has been made in object tracking, but challenges persist in tracking objects in high-resolution images captured from drones. Such images usually contain very tiny objects, and the movement of the drone causes rapid changes in the scene. In addition, the computing power of mission computers on drones is often insufficient to achieve real-time processing of deep learning-based object tracking. This paper presents a real-time on-drone pedestrian tracker that takes 4K aerial images as input. The proposed tracker effectively hides the long latency required for deep learning-based detection (e.g., YOLO) by exploiting both the CPU and GPU equipped in the mission computer. We also propose techniques to minimize detection loss in drone-captured images, including tracker-assisted confidence boosting and an ensemble for identity association. In our experiments, using real-world inputs captured by drones at a height of 50 m, the proposed method with an NVIDIA Jetson TX2 proves its efficacy by achieving real-time detection and tracking in 4K video streams. Full article
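The latency-hiding idea (a slow detector running asynchronously while a cheap per-frame tracker keeps the output stream full) can be sketched as a toy pipeline. This is a hypothetical illustration, not the authors' system: slow_detector stands in for the deep detector on the GPU, and lightweight_tracker for the CPU-side tracker that bridges frames between detections:

```python
import threading, queue, time

def slow_detector(frame):
    """Stand-in for a deep detector: slow but authoritative."""
    time.sleep(0.05)                                  # simulated inference latency
    return [(frame * 2.0, frame * 2.0, 10.0, 10.0)]   # one (x, y, w, h) box

def lightweight_tracker(last_boxes, dx=2.0, dy=2.0):
    """Stand-in for a cheap tracker: propagate the last known boxes."""
    return [(x + dx, y + dy, w, h) for (x, y, w, h) in last_boxes]

def run_pipeline(num_frames):
    results_q = queue.Queue()
    boxes = [(0.0, 0.0, 10.0, 10.0)]
    pending = None
    outputs = []
    for frame in range(num_frames):
        # Launch a detection in the background if none is in flight.
        if pending is None:
            pending = threading.Thread(
                target=lambda f=frame: results_q.put((f, slow_detector(f))))
            pending.start()
        # The tracker produces boxes every frame without waiting.
        boxes = lightweight_tracker(boxes)
        # Fold in a finished detection, if any, without blocking.
        try:
            _, det_boxes = results_q.get_nowait()
            boxes = det_boxes
            pending = None
        except queue.Empty:
            pass
        outputs.append(boxes)
    if pending is not None:
        pending.join()
    return outputs

# One box list per frame, produced without ever stalling on the detector.
outputs = run_pipeline(8)
```

The key property is that the per-frame loop never blocks on the detector: detection results are merged whenever they arrive, while the tracker guarantees output at the camera's frame rate.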
