Drone-Based Information Fusion to Improve Autonomous Navigation

A special issue of Drones (ISSN 2504-446X).

Deadline for manuscript submissions: 25 October 2024 | Viewed by 5255

Special Issue Editors


Guest Editor
COPELABS, Universidade Lusófona, Campo Grande 376, 1749-024 Lisbon, Portugal
Interests: computer vision; artificial intelligence; UAV; autonomous systems; navigation; localization; remote sensing

Guest Editor
PDMFC, 1300-609 Lisbon, Portugal
Interests: reliability; UAV; coverage; 5G; communication; artificial intelligence; localization

Special Issue Information

Dear Colleagues,

Due to their mobility, miniaturization, and flexible usage, UAVs have enabled a plethora of rapidly expanding applications in domains such as monitoring, search and rescue, telecommunications, and agriculture. UAV deployment in smart cities is promising for the provision of efficient delivery services, dynamically deployable mobile base stations for broadband hotspot connectivity, infrastructure inspection, and first-responder services, including responses to earthquakes, gas leaks, and explosions. However, the future ubiquity of drones in urban areas brings significant technical and societal challenges in privacy, cyber security, and public safety. Potential privacy, security, and safety concerns must therefore be addressed concurrently with the development of fully autonomous drone operation, via improvements in the performance, reliability, autonomy, and connectivity of UAV platforms.

This Special Issue welcomes high-quality papers detailing the latest research and application results in UAV development from experts in a wide array of fields, including navigation, autonomous control, secure localization, drone vision and sensing, nonlinear optimization models, and machine learning/artificial intelligence algorithms.

Potential topics for this Special Issue include, but are not limited to:

  • 5G and beyond wireless localization;
  • Machine learning and artificial intelligence for localization systems;
  • Information fusion;
  • Time-series analysis;
  • Autonomous navigation;
  • Unmanned aerial vehicle (UAV) communications relay;
  • Collision avoidance;
  • Collaborative localization and mapping;
  • Target tracking;
  • Computer vision;
  • Image processing;
  • Knowledge discovery in remote sensing imagery;
  • Novel applications of localization and tracking;
  • Security, data privacy, and trustworthiness of localization systems.

The proposed Special Issue considers target localization in wireless sensor networks, thus fitting well within the scope of Drones.

Dr. João Pedro Matos-Carvalho
Dr. Luís Miguel Campos
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com after registering and logging in to the website. Once registered, authors can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Drones is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • autonomous navigation
  • localization
  • computer vision
  • information fusion
  • artificial intelligence

Published Papers (2 papers)


Research

21 pages, 4974 KiB  
Article
Improving Visual SLAM by Combining SVO and ORB-SLAM2 with a Complementary Filter to Enhance Indoor Mini-Drone Localization under Varying Conditions
by Amin Basiri, Valerio Mariani and Luigi Glielmo
Drones 2023, 7(6), 404; https://doi.org/10.3390/drones7060404 - 19 Jun 2023
Cited by 2 | Viewed by 2743
Abstract
Mini-drones can be used for a variety of tasks, ranging from weather monitoring to package delivery, search and rescue, and recreation. In outdoor scenarios, they leverage Global Positioning Systems (GPS) and/or similar systems for localization in order to preserve safety and performance. In indoor scenarios, technologies such as Visual Simultaneous Localization and Mapping (V-SLAM) are used instead. However, further advancements are still required for mini-drone navigation applications, especially under stricter safety requirements. In this research, a novel method for enhancing indoor mini-drone localization performance is proposed. By merging Oriented FAST and Rotated BRIEF SLAM (ORB-SLAM2) and Semi-Direct Monocular Visual Odometry (SVO) via an Adaptive Complementary Filter (ACF), the proposed strategy achieves better position estimates under various conditions (low light in low-surface-texture environments and high flying speed), showing average percentage errors 18.1% and 25.9% smaller than those of ORB-SLAM2 and SVO, respectively, against the ground truth.
(This article belongs to the Special Issue Drone-Based Information Fusion to Improve Autonomous Navigation)
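The fusion step described in the abstract, blending two visual position estimates with a complementary filter, can be sketched as follows. This is a minimal illustration with a fixed weight `alpha`; the paper's ACF adapts the weight to flight conditions, and the function name and fixed weighting here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def complementary_fuse(p_orb, p_svo, alpha=0.7):
    """Blend two 3D position estimates with a complementary filter.

    alpha weights the ORB-SLAM2 estimate; (1 - alpha) weights the SVO
    estimate. The paper adapts this weight online; here it is fixed.
    """
    p_orb = np.asarray(p_orb, dtype=float)
    p_svo = np.asarray(p_svo, dtype=float)
    return alpha * p_orb + (1.0 - alpha) * p_svo
```

In an adaptive variant, `alpha` would be raised or lowered per frame based on condition indicators such as scene texture or flight speed, favoring whichever estimator is more reliable at that moment.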

20 pages, 5814 KiB  
Article
Attitude Determination for Unmanned Cooperative Navigation Swarm Based on Multivectors in Covisibility Graph
by Yilin Liu, Ruochen Liu, Ruihang Yu, Zhiming Xiong, Yan Guo, Shaokun Cai and Pengfei Jiang
Drones 2023, 7(1), 40; https://doi.org/10.3390/drones7010040 - 6 Jan 2023
Cited by 3 | Viewed by 1786
Abstract
To reduce costs, an unmanned swarm usually consists of nodes with high-accuracy navigation sensors (HANs) and nodes with low-accuracy navigation sensors (LANs). Transmitting and fusing the navigation information obtained by HANs enables LANs to improve their positioning accuracy, an approach generally called cooperative navigation (CN). In this method, the accuracy of relative observations between platforms in the swarm has a dramatic effect on the positioning results. Most existing research constructs constraints in the three-dimensional (3D) frame that optimize only the position and velocity of LANs while neglecting attitude estimation, so LANs cannot maintain high attitude accuracy over long-duration maneuvers when relying on their installed sensors. Considering the performance of the inertial measurement unit (IMU) and other common sensors, this paper proposes a new method to estimate the attitude of LANs in a swarm. Because small unmanned nodes are strictly limited by practical engineering constraints such as size, weight, and power, the proposed method compensates for the attitude error caused by strapdown gyroscope drift using only visual vectors built from targets detected by cameras with a range-finding function. In our method, the coordinates of targets are mainly given by the You Only Look Once (YOLO) algorithm; the visual vectors are then built by connecting the targets in the covisibility graph of the nodes in the swarm. The attitude transformation matrices between the camera frames are calculated using a multivector attitude determination algorithm. Finally, we design an information filter (IF) to determine the attitude of LANs based on the observations of HANs. Considering the problem of positioning reference, a field test was conducted in the open air using two-wheeled robots and one UAV.
The results show that the relative attitude error between nodes is less than 4 degrees using the visual vectors. After filtering, the attitude divergence of the low-precision IMUs installed on LANs can be effectively constrained, and high-precision attitude estimation in an unmanned CN swarm can be realized.
(This article belongs to the Special Issue Drone-Based Information Fusion to Improve Autonomous Navigation)
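The multivector attitude determination step described above, recovering a rotation from paired direction vectors observed in two camera frames, is classically posed as Wahba's problem. The sketch below uses the standard SVD solution; the function name and interface are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def attitude_from_vectors(v_body, v_ref, weights=None):
    """Solve Wahba's problem: find rotation R such that R @ v_body[i] ~ v_ref[i].

    v_body, v_ref: (N, 3) arrays of paired direction vectors observed in
    the two frames (e.g. visual vectors between covisible targets).
    Uses the SVD solution, which guarantees a proper rotation (det R = +1).
    """
    v_body = np.asarray(v_body, dtype=float)
    v_ref = np.asarray(v_ref, dtype=float)
    if weights is None:
        weights = np.ones(len(v_body))
    # Attitude profile matrix B = sum_i w_i * v_ref_i v_body_i^T
    B = sum(w * np.outer(r, b) for w, b, r in zip(weights, v_body, v_ref))
    U, _, Vt = np.linalg.svd(B)
    # Correct for a possible reflection so that det(R) = +1
    d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    return U @ np.diag([1.0, 1.0, d]) @ Vt
```

At least two non-collinear vector pairs are needed to fix the rotation; in the paper's setting, these pairs come from targets that are covisible between nodes of the swarm.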
