Article

Fixed-Wing UAV Pose Estimation Using a Self-Organizing Map and Deep Learning

by Nuno Pessanha Santos 1,2,3
1 Portuguese Military Research Center (CINAMIL), Portuguese Military Academy (Academia Militar), R. Gomes Freire 203, 1169-203 Lisbon, Portugal
2 Institute for Systems and Robotics (ISR), Instituto Superior Técnico (IST), Av. Rovisco Pais 1, 1049-001 Lisbon, Portugal
3 Portuguese Navy Research Center (CINAV), Portuguese Naval Academy (Escola Naval), Base Naval de Lisboa, Alfeite, 2800-001 Almada, Portugal
Robotics 2024, 13(8), 114; https://doi.org/10.3390/robotics13080114
Submission received: 19 June 2024 / Revised: 9 July 2024 / Accepted: 26 July 2024 / Published: 27 July 2024
(This article belongs to the Special Issue UAV Systems and Swarm Robotics)

Abstract

In many Unmanned Aerial Vehicle (UAV) operations, accurately estimating the UAV's position and orientation over time is crucial for controlling its trajectory. This is especially important during the landing maneuver, where a ground-based camera system can estimate the UAV's 3D position and orientation. A ground-based monocular Red, Green, and Blue (RGB) approach can be used for this purpose, since processing on the ground allows for more complex algorithms and higher processing power. The proposed method uses a hybrid Artificial Neural Network (ANN) model, incorporating a Kohonen Neural Network (KNN), or Self-Organizing Map (SOM), to identify feature points representing the cluster obtained from a binary image containing the UAV. A Deep Neural Network (DNN) architecture then estimates the UAV pose, including translation and orientation, from a single frame. Using the UAV Computer-Aided Design (CAD) model, the network can be trained on a synthetic dataset and then fine-tuned via transfer learning to handle real data. The experimental results demonstrate that the system achieves high accuracy, with low errors in UAV pose estimation. This implementation paves the way for automating operational tasks such as autonomous landing, a maneuver that is especially hazardous and prone to failure.
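The pipeline described in the abstract can be illustrated with a minimal sketch: a small Self-Organizing Map is fit to the foreground pixel coordinates of a binary segmentation of the UAV, and the resulting node positions (the learned feature points) are flattened and passed to a small fully connected regressor that outputs translation and orientation. The grid size, layer widths, and quaternion output below are illustrative assumptions, not the architecture reported in the paper.

```python
import numpy as np
import torch
import torch.nn as nn

def fit_som(points, grid=(4, 4), iters=2000, lr0=0.5, sigma0=1.5, seed=0):
    """Fit a tiny 2-D SOM to (N, 2) pixel coordinates of the UAV blob.
    Returns the (grid_h * grid_w, 2) node positions used as feature points.
    Grid size and schedules are illustrative choices, not the paper's."""
    rng = np.random.default_rng(seed)
    h, w = grid
    nodes = points[rng.integers(0, len(points), size=h * w)].astype(float)
    grid_rc = np.array([(r, c) for r in range(h) for c in range(w)], float)
    for t in range(iters):
        frac = t / iters
        lr = lr0 * (1.0 - frac)                  # linearly decayed learning rate
        sigma = sigma0 * (1.0 - frac) + 1e-3     # shrinking neighborhood radius
        x = points[rng.integers(0, len(points))]
        bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))   # best-matching unit
        d2 = ((grid_rc - grid_rc[bmu]) ** 2).sum(axis=1)  # grid distance to BMU
        hfun = np.exp(-d2 / (2.0 * sigma ** 2))           # neighborhood function
        nodes += lr * hfun[:, None] * (x - nodes)
    return nodes

class PoseRegressor(nn.Module):
    """Small fully connected network mapping SOM feature points to a pose
    (3-D translation plus a unit quaternion). Layer sizes are assumptions
    for illustration only."""
    def __init__(self, n_points=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_points * 2, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, 7),   # tx, ty, tz, qw, qx, qy, qz
        )
    def forward(self, x):
        out = self.net(x)
        t, q = out[:, :3], out[:, 3:]
        q = q / q.norm(dim=1, keepdim=True)  # normalize the quaternion part
        return torch.cat([t, q], dim=1)

# Usage sketch: binary_mask is an (H, W) boolean image with the UAV as foreground.
binary_mask = np.zeros((480, 640), dtype=bool)
binary_mask[200:260, 300:420] = True                 # placeholder blob
pts = np.argwhere(binary_mask).astype(float)         # (N, 2) pixel coordinates
feature_points = fit_som(pts, grid=(4, 4))           # (16, 2) SOM node positions
model = PoseRegressor(n_points=16)
with torch.no_grad():
    pose = model(torch.tensor(feature_points.reshape(1, -1), dtype=torch.float32))
print(pose.shape)  # torch.Size([1, 7])
```

The synthetic-to-real stage mentioned in the abstract (training on renders of the UAV CAD model and then fine-tuning on real images via transfer learning) is omitted from this sketch.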
Keywords: computer vision; pose estimation; Kohonen neural network; self-organizing map; deep neural network; unmanned aerial vehicles; autonomous vehicles
