
UAV Imaging and Sensing

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensing and Imaging".

Deadline for manuscript submissions: closed (30 November 2022) | Viewed by 16873

Special Issue Editors


Guest Editor: Dr. Chinthaka Premachandra
Department of Electronic Engineering, School of Engineering, Shibaura Institute of Technology, Tokyo 135-8548, Japan
Interests: UAV; Artificial Intelligence (AI); ITS; aerial/mobile robotics; audio/video processing

Guest Editor: Dr. Tomotaka Kimura
Faculty of Science and Engineering, Doshisha University, 1-3 Tatara Miyakodani, Kyotanabe-shi, Kyoto 610-0321, Japan
Interests: wireless communications; UAV networks; sensor networks

Guest Editor: Dr. Benoit Vozel
Institute of Electronics and Telecommunications IETR UMR CNRS 6164, University of Rennes, 22305 Lannion, France
Interests: blind estimation of degradation characteristics (noise, PSF); blind restoration of multicomponent images; multimodal image correction; multicomponent image compression; multi-channel adaptive processing of signals and images; unsupervised machine learning and deep learning; multi-mode remote sensing data processing; remote sensing

Special Issue Information

Dear Colleagues,

Aerial sensing and imaging has become a highly active research field with recent advances in UAVs and related sensing devices (optical, multispectral, hyperspectral, thermal, lidar, laser, and SAR). These developments have enabled a wide range of applications in agriculture, transportation, the construction industry, rescue operations, and more. This Special Issue calls for papers presenting the latest research findings on UAV-based imaging and sensing and their applications.

Dr. Chinthaka Premachandra
Dr. Tomotaka Kimura
Dr. Benoit Vozel
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • UAV sensing devices
  • radiometric and geometric calibration of UAV sensing devices
  • AI-based aerial sensing and processing
  • aerial big data
  • UAV onboard data processing
  • aerial image processing
  • registering and fusion
  • UAV-based image collaborative processing
  • UAV camera/sensor networks
  • autoflight for sensing/imaging
  • aerial object and change detection
  • aerial sensing/imaging applications

Published Papers (4 papers)


Research

19 pages, 31139 KiB  
Article
Vision-Based Detection of Low-Emission Sources in Suburban Areas Using Unmanned Aerial Vehicles
by Marek Szczepański
Sensors 2023, 23(4), 2235; https://doi.org/10.3390/s23042235 - 16 Feb 2023
Cited by 2 | Viewed by 1540
Abstract
The paper discusses the problem of detecting emission sources in an area of low-rise buildings using unmanned aerial vehicles. The problem was analyzed, and methods of solving it are presented. Various data acquisition scenarios and their impact on the feasibility of the task were analyzed. A method is proposed for detecting smoke over buildings using stationary video sequences acquired by a drone hovering with its camera in the nadir position. The method uses differential frame information from stabilized video sequences and the YOLOv7 classifier. A convolutional network classifier was used to detect building roofs, trained on a custom training set adapted to the type of data used. Such a solution, although quite effective, is not very practical for the end user, but it enables the automatic generation of a comprehensive training set for classifiers based on deep neural networks. The effectiveness of this approach was tested with the latest version of the YOLOv7 classifier. The tests confirmed the effectiveness of the described method, both for single images and for video sequences. In addition, the obtained classifier correctly recognizes objects in sequences that do not meet some of the initial assumptions, such as the angle of the camera capturing the image.
(This article belongs to the Special Issue UAV Imaging and Sensing)
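The differential-frame cue the abstract describes can be sketched as follows. This is a minimal illustration of frame differencing over a stabilized hover sequence, not the paper's implementation; the function names and the threshold value are assumptions for illustration only.

```python
import numpy as np

def frame_difference_mask(prev_frame, curr_frame, threshold=25):
    """Binary mask of pixels that changed between two stabilized frames.

    Slow, semi-transparent smoke shows up as persistent low-amplitude
    change, so a per-pixel absolute difference followed by a threshold
    is a simple first-stage cue before a learned detector such as YOLOv7.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

def accumulate_motion(frames, threshold=25):
    """Sum change masks over a hovering sequence.

    Regions with sustained change (e.g. drifting smoke) accumulate high
    counts, while static roofs and ground stay near zero.
    """
    acc = np.zeros(frames[0].shape, dtype=np.int32)
    for prev, curr in zip(frames, frames[1:]):
        acc += frame_difference_mask(prev, curr, threshold)
    return acc
```

In a real pipeline the accumulated mask would only propose candidate regions; the classifier then decides whether a region actually contains smoke.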

17 pages, 4798 KiB  
Article
YOLOv5 with ConvMixer Prediction Heads for Precise Object Detection in Drone Imagery
by Ranjai Baidya and Heon Jeong
Sensors 2022, 22(21), 8424; https://doi.org/10.3390/s22218424 - 2 Nov 2022
Cited by 18 | Viewed by 3993
Abstract
The potency of object detection techniques using unmanned aerial vehicles (UAVs) is unprecedented due to their mobility. This potency has stimulated the use of UAVs with object detection functionality in numerous crucial real-life applications, and more efficient and accurate object detection techniques are being researched and developed for use in UAV applications. However, object detection from UAVs presents challenges that are not common in general object detection. First, as UAVs fly at varying altitudes, the objects imaged by UAVs vary vastly in size, making the task more challenging. Second, due to the motion of the UAVs, the captured images may be blurred. To deal with these challenges, we present a You Only Look Once v5 (YOLOv5)-like architecture with ConvMixers in its prediction heads and an additional prediction head for minutely small objects. The proposed architecture has been trained and tested on the VisDrone 2021 dataset, and the results obtained are comparable with existing state-of-the-art methods.
(This article belongs to the Special Issue UAV Imaging and Sensing)
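For readers unfamiliar with ConvMixer, the building block the abstract refers to can be sketched roughly as below: a generic ConvMixer block (depthwise spatial mixing with a residual connection, followed by pointwise channel mixing). The kernel size and its integration into a YOLOv5 prediction head here are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class ConvMixerBlock(nn.Module):
    """A generic ConvMixer block.

    Depthwise convolution mixes information spatially within each channel
    (with a residual connection), then a 1x1 convolution mixes across
    channels; each stage uses GELU and BatchNorm. Stacking such blocks
    inside a detection head is the idea described in the abstract.
    """
    def __init__(self, dim, kernel_size=9):
        super().__init__()
        self.depthwise = nn.Sequential(
            nn.Conv2d(dim, dim, kernel_size, groups=dim, padding="same"),
            nn.GELU(),
            nn.BatchNorm2d(dim),
        )
        self.pointwise = nn.Sequential(
            nn.Conv2d(dim, dim, kernel_size=1),
            nn.GELU(),
            nn.BatchNorm2d(dim),
        )

    def forward(self, x):
        x = x + self.depthwise(x)  # residual around spatial mixing
        return self.pointwise(x)   # channel mixing, shape preserved
```

Because the block preserves the feature-map shape, it can be dropped in front of an existing detection head without changing the head's input dimensions.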

24 pages, 12729 KiB  
Article
UWB and IMU-Based UAV’s Assistance System for Autonomous Landing on a Platform
by Aitor Ochoa-de-Eribe-Landaberea, Leticia Zamora-Cadenas, Oier Peñagaricano-Muñoa and Igone Velez
Sensors 2022, 22(6), 2347; https://doi.org/10.3390/s22062347 - 18 Mar 2022
Cited by 8 | Viewed by 3179
Abstract
This work presents a novel landing assistance system (LAS) capable of locating a drone for a safe landing after its inspection mission. The drone is located by fusing ultra-wideband (UWB), inertial measurement unit (IMU), and magnetometer data. Unlike other typical landing assistance systems, the fixed UWB sensors are placed around a 2 × 2 m landing platform and two tags are attached to the drone. Since this type of set-up is suboptimal for UWB location systems, a new positioning algorithm is proposed for correct performance. First, an extended Kalman filter (EKF) algorithm is used to calculate the position of each tag, and then both positions are combined for a more accurate and robust localisation. As a result, the positioning errors can be reduced by 50% compared with a typical UWB-based landing assistance system. Moreover, because it requires little space, the proposed landing assistance system can be used almost anywhere and is easily deployed.
(This article belongs to the Special Issue UAV Imaging and Sensing)
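The final fusion step, combining the two per-tag position estimates into a single drone position, can be illustrated with standard inverse-covariance weighting. This is a generic sketch assuming two independent estimates of the same point (e.g. tags mounted symmetrically about the drone centre), not the authors' exact algorithm; all names are illustrative.

```python
import numpy as np

def fuse_tag_positions(p1, p2, cov1, cov2):
    """Fuse two position estimates by inverse-covariance weighting.

    p1, p2   -- position estimates (e.g. per-tag EKF outputs, reduced to
                the drone centre), as 1-D NumPy arrays
    cov1/2   -- their covariance matrices
    Returns the fused position and its covariance. The less certain
    estimate automatically receives the smaller weight.
    """
    w1 = np.linalg.inv(cov1)
    w2 = np.linalg.inv(cov2)
    fused_cov = np.linalg.inv(w1 + w2)
    fused_pos = fused_cov @ (w1 @ p1 + w2 @ p2)
    return fused_pos, fused_cov
```

With equal covariances this reduces to a plain average of the two tag estimates, while the fused covariance is halved, which matches the intuition that two tags give a more accurate and robust fix than one.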

18 pages, 26542 KiB  
Article
Design and Implementation of a UAV-Based Airborne Computing Platform for Computer Vision and Machine Learning Applications
by Athanasios Douklias, Lazaros Karagiannidis, Fay Misichroni and Angelos Amditis
Sensors 2022, 22(5), 2049; https://doi.org/10.3390/s22052049 - 6 Mar 2022
Cited by 14 | Viewed by 6629
Abstract
Visual sensing of the environment is crucial for flying an unmanned aerial vehicle (UAV) and is a centerpiece of many related applications. The ability to run computer vision and machine learning algorithms onboard an unmanned aerial system (UAS) is becoming more of a necessity in an effort to alleviate the communication burden of high-resolution video streaming, to provide flying aids such as obstacle avoidance and automated landing, and to create autonomous machines. Thus, many researchers are interested in developing and validating solutions suitable for deployment on a UAV system, following the general trend of edge processing and airborne computing, which transforms UAVs from moving sensors into intelligent nodes capable of local processing. In this paper, we present the design and implementation of a 12.85 kg UAV system equipped with the computational power and sensors needed to serve as a testbed for image processing and machine learning applications. We explain the rationale behind our decisions, highlight selected implementation details, and showcase the usefulness of the system with an example of how a sample computer vision application can be deployed on our platform.
(This article belongs to the Special Issue UAV Imaging and Sensing)
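The edge-processing rationale in the abstract (transmit compact detection results instead of streaming high-resolution video) can be sketched as a simple onboard loop. This is a generic illustration, not the authors' platform software; `detector` and `publish` are hypothetical placeholders for a real onboard model and telemetry link.

```python
import json

def process_onboard(frames, detector, publish):
    """Onboard edge-processing loop.

    Runs the detector on each captured frame locally and transmits only a
    small JSON message per frame, instead of the raw high-resolution video.
    `detector(frame)` is assumed to return a JSON-serialisable list of
    detections (e.g. [label, bbox, score]); `publish(msg)` sends one
    message over the downlink.
    """
    for idx, frame in enumerate(frames):
        detections = detector(frame)  # local inference, no video uplink
        msg = json.dumps({"frame": idx, "detections": detections})
        publish(msg)                  # only compact results leave the UAV
```

A per-frame JSON message of a few hundred bytes replaces megabytes of video, which is the bandwidth saving that motivates airborne computing in the first place.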
