Sensors and Perception Systems for Mobile Robot Navigation

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (13 November 2020) | Viewed by 40537

Special Issue Editors


Prof. Dr. Giovanni Muscato
Guest Editor
Dipartimento di Ingegneria Elettrica Elettronica e Informatica, University of Catania, Viale Andrea Doria 6, 95125 Catania (CT), Italy
Interests: service robotics; aerial and ground vehicle cooperation; field robotics

Dr. Luciano Cantelli
Guest Editor
Dipartimento di Ingegneria Elettrica Elettronica e Informatica, University of Catania, Viale Andrea Doria 6, 95125 Catania (CT), Italy
Interests: control and navigation of autonomous mobile robots; aerial and ground robots cooperation; multi-sensor data fusion for robotics

Dr. Dario Calogero Guastella
Guest Editor
Department of Electrical, Electronic and Computer Engineering, University of Catania, 95131 Catania, Italy
Interests: control and navigation of autonomous robots; aerial and ground robots cooperation; artificial intelligence for autonomous navigation in challenging environments

Special Issue Information

Dear Colleagues,

Sensors are a crucial part of a robotic system, since they provide the information needed to make the system aware of its own status and to interact properly with its surrounding environment. The overall sensing system constitutes a block of the general robotic system architecture that is typically referred to as perception. The term perception encompasses a wide variety of solutions, including both the sensing devices and the extraction of information from the raw data.

Perception and the intended application of the robotic system are closely related: the design of the perception block changes depending on whether it is employed on a manipulator or on a mobile platform, and on the type of environment considered, e.g., indoor or outdoor. Conversely, the adoption of improved devices and sensing technologies has opened up a wider range of applications and has made it possible to perform more complex tasks that could not be addressed with earlier solutions.

In this Special Issue, you are invited to submit contributions describing the development of novel perception approaches for navigation in mobile robotics applications. Innovative processing algorithms, theoretical studies, and experimental results on real vehicles are also welcome. Particular emphasis will be given to exteroceptive sensors, which make a robot aware of its surrounding environment.

Potential topics include, but are not limited to, the following:

  • Localization sensors for mobile robots
  • Situation-awareness sensors for robots
  • Terrain reconstruction sensors
  • Sensor fusion algorithms
  • Signal processing algorithms for sensor measurements
  • Microelectromechanical system (MEMS)-based sensors for mobile robotics

Prof. Dr. Giovanni Muscato
Dr. Luciano Cantelli
Dr. Dario Calogero Guastella
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • navigation
  • localization
  • sensor fusion algorithms
  • situation-awareness
  • sensing technologies

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (9 papers)

Research

17 pages, 9068 KiB  
Article
Towards Fast Plume Source Estimation with a Mobile Robot
by Hugo Magalhães, Rui Baptista, João Macedo and Lino Marques
Sensors 2020, 20(24), 7025; https://doi.org/10.3390/s20247025 - 8 Dec 2020
Cited by 9 | Viewed by 2270
Abstract
The estimation of the parameters of an odour source is of high relevance for multiple applications, but it can be a slow and error-prone process. This work proposes a fast particle filter-based method for source term estimation with a mobile robot. Two strategies are implemented to reduce the computational cost of the filter and increase its accuracy: firstly, the sampling process is adapted by the mobile robot in order to optimise the quality of the data provided to the estimation process; secondly, the filter is initialised only after collecting preliminary data that limit the solution space and allow fewer particles to be used than would normally be necessary. The method assumes a Gaussian plume model for odour dispersion. This model describes average odour concentrations, but the particle filter proved adequate for fitting instantaneous concentration measurements to it while the environment was being sampled. The method was validated in an obstacle-free, controlled wind tunnel, and the results show its ability to quickly converge to accurate estimates of the plume's parameters after a small number of plume crossings.
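For readers unfamiliar with source term estimation, the following minimal sketch (not the authors' implementation) illustrates a particle filter whose particles encode a candidate source position and release rate and whose likelihood comes from a simplified Gaussian plume observation model; all function names, noise levels, and plume parameters are assumptions chosen for the example.

```python
import numpy as np

def gaussian_plume(src_x, src_y, rate, sx, sy, wind_u=1.0, spread=0.5):
    """Time-averaged concentration at sensor (sx, sy) for a ground-level source
    advected along +x, with a crude linear growth of the lateral plume spread
    (an illustrative simplification of a Gaussian plume model)."""
    dx, dy = sx - src_x, sy - src_y
    down = np.maximum(dx, 1e-3)                 # downwind distance
    sig_y = spread * down                       # lateral spread grows with distance
    c = rate / (2.0 * np.pi * wind_u * sig_y**2) * np.exp(-dy**2 / (2.0 * sig_y**2))
    return np.where(dx > 0, c, 0.0)

def pf_update(particles, weights, measurement, sensor_xy, meas_std=0.05, rng=None):
    """One measurement update of the particle filter: reweight by the Gaussian
    likelihood of the reading, resample, and jitter the particles slightly."""
    rng = rng or np.random.default_rng()
    pred = gaussian_plume(particles[:, 0], particles[:, 1], particles[:, 2],
                          sensor_xy[0], sensor_xy[1])
    weights = weights * np.exp(-0.5 * ((measurement - pred) / meas_std) ** 2) + 1e-300
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    jitter = rng.normal(0.0, [0.05, 0.05, 0.01], size=particles.shape)
    return particles[idx] + jitter, np.full(len(particles), 1.0 / len(particles))

# Toy usage: particles over (source_x, source_y, release_rate)
rng = np.random.default_rng(0)
particles = np.column_stack([rng.uniform(-5.0, 0.0, 500),
                             rng.uniform(-2.0, 2.0, 500),
                             rng.uniform(0.1, 2.0, 500)])
weights = np.full(500, 1.0 / 500)
particles, weights = pf_update(particles, weights, measurement=0.02,
                               sensor_xy=(3.0, 0.5), rng=rng)
print(particles.mean(axis=0))                   # rough estimate of (x, y, rate)
```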

21 pages, 4471 KiB  
Article
Real-Time Compact Environment Representation for UAV Navigation
by Kaitao Meng, Deshi Li, Xiaofan He, Mingliu Liu and Weitao Song
Sensors 2020, 20(17), 4976; https://doi.org/10.3390/s20174976 - 2 Sep 2020
Cited by 1 | Viewed by 2711
Abstract
Recently, unmanned aerial vehicles (UAVs) have attracted much attention due to their on-demand deployment, high mobility, and low cost. For UAVs navigating in an unknown environment, an efficient environment representation is needed because of the UAVs' storage limitations. Nonetheless, building an accurate and compact environment representation model is highly non-trivial because of the unknown shape of the obstacles and time-consuming operations such as finding and eliminating environmental details. To overcome these challenges, a novel vertical strip extraction algorithm is proposed to analyze the probability density function characteristics of the normalized disparity values and segment the obstacles through an adaptive-size sliding window. In addition, a plane adjustment algorithm is proposed to represent the obstacle surfaces as polygonal prism profiles while minimizing redundant obstacle information. By combining these two algorithms, depth sensor data can be converted into multi-layer polygonal prism models in real time. Furthermore, a drone platform equipped with a depth sensor is developed to build compact environment representation models in the real world. Experimental results demonstrate that the proposed scheme achieves better performance in terms of precision and storage compared to the baseline.
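As an illustrative aside (not the paper's algorithm), the toy sketch below flags blocks of image columns as obstacle strips when a single disparity bin dominates the block's histogram, which is the general flavour of segmenting obstacles from the disparity distribution; the thresholds, window size, and function names are assumptions.

```python
import numpy as np

def extract_vertical_strips(disparity, col_width=8, peak_frac=0.35,
                            min_disp=0.3, n_bins=32):
    """Toy column-wise obstacle segmentation on a normalized disparity image:
    a block of columns is flagged as a vertical obstacle strip when one
    (sufficiently close) disparity bin dominates the block's histogram."""
    h, w = disparity.shape
    strips = []
    for c0 in range(0, w - col_width + 1, col_width):
        block = disparity[:, c0:c0 + col_width].ravel()
        block = block[block > 0.0]                       # drop invalid pixels
        if block.size == 0:
            continue
        hist, edges = np.histogram(block, bins=n_bins, range=(0.0, 1.0))
        k = int(hist.argmax())
        dominant = hist[k] / block.size >= peak_frac     # many pixels at one depth
        close = edges[k] >= min_disp                     # large disparity = nearby
        if dominant and close:
            strips.append((c0, c0 + col_width, float(edges[k])))
    return strips                                        # (col_start, col_end, disparity)

# Toy usage: a distant background with one nearby "wall" in columns 24..48
disparity = np.full((60, 80), 0.1)
disparity[:, 24:48] = 0.8
print(extract_vertical_strips(disparity))                # strips covering the wall
```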

45 pages, 9597 KiB  
Article
Statistical Study of the Performance of Recursive Bayesian Filters with Abnormal Observations from Range Sensors
by Manuel Castellano-Quero, Juan-Antonio Fernández-Madrigal and Alfonso-José García-Cerezo
Sensors 2020, 20(15), 4159; https://doi.org/10.3390/s20154159 - 26 Jul 2020
Viewed by 2615
Abstract
Range sensors are currently present in countless applications related to perception of the environment. In mobile robots, these devices constitute a key part of the sensory apparatus and enable essential operations that are often addressed by applying methods grounded in probabilistic frameworks such as Bayesian filters. Unfortunately, modern mobile robots have to navigate within environments that are challenging from the perspective of their sensory devices, yielding abnormal observations (e.g., biased, missing, etc.) that may compromise these operations. Although previous contributions address either filtering performance or the identification of abnormal sensory observations, they do not provide a complete treatment of both problems at once. In this work, we present a statistical approach that allows us to study and quantify the impact of abnormal observations from range sensors on the performance of Bayesian filters. To this end, we formulate the estimation problem from a generic perspective (abstracting from concrete implementations), analyse the main limitations of common robotic range sensors, and define the factors that potentially affect filtering performance. Rigorous statistical methods are then applied to a set of simulated experiments devised to reproduce a diversity of situations. The obtained results, which we also validate in a real environment, provide novel and relevant conclusions on the effect of abnormal range observations in these filters.
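To make the notion of handling abnormal range observations concrete, here is a minimal 1D Kalman update with an innovation (chi-square) gate that skips missing or outlying readings; this is a generic textbook construction, not the statistical methodology of the paper, and the numeric values are assumptions.

```python
import numpy as np

def kf_range_update(x, P, z, R=0.04, gate=9.0):
    """1D Kalman measurement update for a range reading z of state x, with a
    chi-square innovation gate that skips abnormal observations (missing/NaN
    readings or values far outside what the current estimate predicts)."""
    if z is None or np.isnan(z):               # missing observation
        return x, P
    y = z - x                                   # innovation (H = 1)
    S = P + R                                   # innovation covariance
    if y * y / S > gate:                        # abnormal (e.g. biased) reading
        return x, P
    K = P / S                                   # Kalman gain
    return x + K * y, (1.0 - K) * P

# Toy usage: random-walk prediction, a stream with one outlier and one dropout
x, P = 0.0, 1.0
for z in [2.1, 1.9, 7.5, 2.0, float("nan"), 2.05]:   # 7.5 and NaN are abnormal
    P = P + 0.01                                      # prediction step
    x, P = kf_range_update(x, P, z)
print(round(x, 2))                                    # settles near 2.0
```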

18 pages, 1525 KiB  
Article
Reactive Autonomous Navigation of UAVs for Dynamic Sensing Coverage of Mobile Ground Targets
by Hailong Huang, Andrey V. Savkin and Xiaohui Li
Sensors 2020, 20(13), 3720; https://doi.org/10.3390/s20133720 - 3 Jul 2020
Cited by 23 | Viewed by 3384
Abstract
This paper addresses the problem of autonomous navigation of unmanned aerial vehicles (UAVs) for the surveillance of multiple moving ground targets. The ground can be flat or uneven. A reactive real-time sliding mode control algorithm is proposed that navigates a team of communicating UAVs, equipped with ground-facing video cameras, towards moving targets to increase a measure of sensing coverage of the targets by the UAVs. Moreover, the Voronoi partitioning (VP) technique is adopted to reduce the movement range of the UAVs and decrease the revisit times of the targets. Extensive computer simulations, from the simple case with one UAV and multiple targets to the complex case with multiple UAVs and multiple targets, are conducted to demonstrate the performance of the developed autonomous navigation algorithm. Scenarios where the terrain is uneven are also considered. As shown in the simulation results, although the VP technique introduces some extra computational burden, its use considerably reduces the target revisit time compared to the algorithm without it.
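As a hedged illustration of the two ingredients mentioned in the abstract, the sketch below assigns targets to their nearest UAV (equivalent to partitioning the targets by the UAVs' Voronoi cells) and applies a bang-bang, sliding-mode-style heading-rate law; it is a simplification of the paper's controller, and the gains and names are assumed.

```python
import numpy as np

def voronoi_assign(uav_xy, target_xy):
    """Assign each target to its nearest UAV, i.e. partition the targets by the
    UAVs' Voronoi cells."""
    d = np.linalg.norm(target_xy[:, None, :] - uav_xy[None, :, :], axis=2)
    return d.argmin(axis=1)                      # target i -> index of owning UAV

def sliding_mode_heading_rate(uav_xy, heading, goal_xy, u_max=0.8):
    """Bang-bang (sliding-mode style) heading-rate law: turn at full rate toward
    the bearing of the assigned goal; chattering is ignored in this toy sketch."""
    bearing = np.arctan2(goal_xy[1] - uav_xy[1], goal_xy[0] - uav_xy[0])
    s = np.arctan2(np.sin(bearing - heading), np.cos(bearing - heading))
    return u_max * np.sign(s)

# Toy usage: 2 UAVs, 4 moving targets
uavs = np.array([[0.0, 0.0], [10.0, 0.0]])
targets = np.array([[1.0, 2.0], [2.0, -1.0], [9.0, 3.0], [11.0, 1.0]])
owner = voronoi_assign(uavs, targets)
print(owner)                                      # e.g. [0 0 1 1]
print(sliding_mode_heading_rate(uavs[0], 0.0, targets[0]))
```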

22 pages, 19339 KiB  
Article
Peg-in-Hole Assembly Based on Six-Legged Robots with Visual Detecting and Force Sensing
by Yinan Zhao, Feng Gao, Yue Zhao and Zhijun Chen
Sensors 2020, 20(10), 2861; https://doi.org/10.3390/s20102861 - 18 May 2020
Cited by 13 | Viewed by 5960
Abstract
Manipulators with multiple degrees of freedom (DOF) are widely used for the peg-in-hole task. Compared with manipulators, six-legged robots offer better mobility in addition to completing operational tasks. However, there are almost no previous studies of six-legged robots performing the peg-in-hole task. In this article, a peg-in-hole approach for six-legged robots is studied and experimentally validated on a six-parallel-legged robot. First, we propose a method whereby a vision sensor and a force/torque (F/T) sensor are used to determine the relative location of the hole and the peg. Based on the visual information, the robot approaches the hole. Next, based on the force feedback, the robot plans its trajectory in real time to mate the peg and the hole. Then, during the insertion, admittance control is implemented to guarantee smooth insertion. Throughout the assembly process, the peg is held by the gripper and attached to the robot body; connected to the body, the peg has sufficient workspace and six DOF to perform the assembly task. Finally, experiments were conducted to demonstrate the suitability of the approach.
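For context on admittance control during insertion, a minimal 1-DOF discrete admittance law is sketched below (not the authors' controller); the virtual mass, damping, and stiffness values are arbitrary assumptions.

```python
class AdmittanceController1D:
    """Discrete 1-DOF admittance law  m*a + d*v + k*x = f  that turns a measured
    contact force into a compliant position offset (toy parameter values)."""
    def __init__(self, m=1.0, d=40.0, k=200.0, dt=0.002):
        self.m, self.d, self.k, self.dt = m, d, k, dt
        self.x, self.v = 0.0, 0.0              # offset and its velocity

    def step(self, force):
        a = (force - self.d * self.v - self.k * self.x) / self.m
        self.v += a * self.dt
        self.x += self.v * self.dt
        return self.x                          # offset added to the nominal trajectory

# Toy usage: a constant 5 N lateral contact force during insertion
ctrl = AdmittanceController1D()
for _ in range(2000):
    offset = ctrl.step(5.0)
print(round(offset, 4))                        # converges near 5 / 200 = 0.025 m
```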

22 pages, 8638 KiB  
Article
Improving the Heading Accuracy in Indoor Pedestrian Navigation Based on a Decision Tree and Kalman Filter
by Guanghui Hu, Weizhi Zhang, Hong Wan and Xinxin Li
Sensors 2020, 20(6), 1578; https://doi.org/10.3390/s20061578 - 12 Mar 2020
Cited by 28 | Viewed by 4574
Abstract
In pedestrian inertial navigation, multi-sensor fusion is often used to obtain accurate heading estimates. As a widely distributed signal source, the geomagnetic field is a convenient means of providing sufficiently accurate heading angles. Unfortunately, artificial magnetic perturbations are widespread in indoor environments, making geomagnetic correction difficult. In this paper, by analyzing the spatial distribution model of the magnetic interference field superimposed on the geomagnetic field, two quantitative features are found to be crucial in distinguishing normal magnetic data from anomalies. By leveraging these two features and the classification and regression tree (CART) algorithm, we trained a decision tree that is capable of extracting clean magnetic data from distorted measurements. Furthermore, this well-trained decision tree can be used as a reject gate in a Kalman filter. By combining the decision tree and the Kalman filter, a high-precision indoor pedestrian navigation system based on a magnetically assisted inertial system is proposed. The system is validated in a real indoor environment, and the results show that it delivers state-of-the-art positioning performance, with an improvement of over 70% in positioning accuracy compared to other baseline algorithms.
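As an illustrative sketch of the decision-tree reject-gate idea (not the paper's trained model or its exact features), the example below derives two simple anomaly features from magnetometer samples, fits a scikit-learn decision tree on synthetic labelled data, and uses it to gate a scalar heading update; the reference field values and noise levels are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

REF_NORM, REF_DIP = 50.0, 60.0     # assumed local geomagnetic magnitude (uT) and dip (deg)

def mag_features(m):
    """Two illustrative anomaly features: deviation of the field magnitude and of
    the dip angle from the assumed local reference values."""
    norm = np.linalg.norm(m, axis=1)
    dip = np.degrees(np.arcsin(np.abs(m[:, 2]) / norm))
    return np.column_stack([np.abs(norm - REF_NORM), np.abs(dip - REF_DIP)])

# Train a reject gate on labelled samples (0 = clean field, 1 = disturbed)
rng = np.random.default_rng(1)
clean = rng.normal([25.0, 0.0, 43.3], 0.5, (200, 3))       # ~50 uT, ~60 deg dip
disturbed = clean + rng.normal(0.0, 15.0, (200, 3))        # indoor perturbations
X = np.vstack([mag_features(clean), mag_features(disturbed)])
y = np.r_[np.zeros(200), np.ones(200)]
gate = DecisionTreeClassifier(max_depth=3).fit(X, y)

def heading_update(psi, P, mag_sample, psi_mag, R=np.radians(5.0) ** 2):
    """Fuse a magnetometer heading only if the decision tree accepts the sample."""
    if gate.predict(mag_features(mag_sample[None, :]))[0] == 1:
        return psi, P                                       # rejected: keep gyro estimate
    y_err = np.arctan2(np.sin(psi_mag - psi), np.cos(psi_mag - psi))
    K = P / (P + R)
    return psi + K * y_err, (1.0 - K) * P

print(heading_update(0.10, 0.05, clean[0], 0.05))
```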

18 pages, 1798 KiB  
Article
A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots
by Steffen Müller, Tim Wengefeld, Thanh Quang Trinh, Dustin Aganian, Markus Eisenbach and Horst-Michael Gross
Sensors 2020, 20(3), 722; https://doi.org/10.3390/s20030722 - 28 Jan 2020
Cited by 15 | Viewed by 3577
Abstract
To meet the increasing demands of mobile service robot applications, a dedicated perception module is an essential requirement for interaction with users in real-world scenarios. In particular, multi-sensor fusion and human re-identification are recognized as active research fronts. In this paper, we contribute to this topic and present a modular detection and tracking system that models the position and additional properties of persons in the surroundings of a mobile robot. The proposed system introduces a probability-based data association method that, besides position, can incorporate face- and color-based appearance features in order to re-identify persons when tracking is interrupted. The system combines the results of various state-of-the-art image-based detection systems for person recognition, person identification, and attribute estimation. This allows a stable estimate of a mobile robot's user, even in complex, cluttered environments with long-lasting occlusions. In our benchmark, we introduce a new measure for tracking consistency and show the improvements obtained when face- and appearance-based re-identification are combined. The tracking system was applied in a real-world application with a mobile rehabilitation assistant robot in a public hospital. The estimated states of persons are used for user-centered navigation behaviors, e.g., guiding or approaching a person, but also for realizing socially acceptable navigation in public environments.
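To illustrate probability-based data association in general terms (this is not the paper's method), the sketch below combines a Gaussian position cost with a cosine appearance distance and solves the assignment with the Hungarian algorithm; the weights, threshold, and descriptor size are assumed.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def associate(track_xy, track_app, det_xy, det_app,
              pos_sigma=0.5, app_weight=2.0, max_cost=6.0):
    """Combine a position cost (Gaussian on distance) with an appearance distance
    (e.g. colour-histogram or face-embedding distance) into one cost matrix and
    solve the assignment with the Hungarian algorithm."""
    pos_cost = cdist(track_xy, det_xy) ** 2 / (2.0 * pos_sigma ** 2)
    app_cost = app_weight * cdist(track_app, det_app, metric="cosine")
    cost = pos_cost + app_cost
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < max_cost]

# Toy usage: two tracks and two detections with 8-D appearance descriptors
rng = np.random.default_rng(2)
tracks_xy = np.array([[1.0, 1.0], [4.0, 2.0]])
tracks_app = rng.random((2, 8))
dets_xy = tracks_xy[::-1] + rng.normal(0.0, 0.1, (2, 2))      # detections in swapped order
dets_app = tracks_app[::-1] + rng.normal(0.0, 0.02, (2, 8))
print(associate(tracks_xy, tracks_app, dets_xy, dets_app))    # e.g. [(0, 1), (1, 0)]
```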

18 pages, 9874 KiB  
Article
3D Exploration and Navigation with Optimal-RRT Planners for Ground Robots in Indoor Incidents
by Noé Pérez-Higueras, Alberto Jardón, Ángel Rodríguez and Carlos Balaguer
Sensors 2020, 20(1), 220; https://doi.org/10.3390/s20010220 - 30 Dec 2019
Cited by 18 | Viewed by 5496
Abstract
Navigation and exploration in 3D environments are still challenging tasks for autonomous robots that move on the ground. Robots for search and rescue missions must deal with unstructured and very complex scenarios. This paper presents a path planning system for navigation and exploration of ground robots in such situations. We use (unordered) point clouds as the main sensory input, without building any explicit representation of the environment from them. These 3D points are employed as space samples by an Optimal-RRT planner (RRT*) to compute safe and efficient paths. The use of an objective function for path construction and the natural exploratory behaviour of the RRT* planner make it appropriate for these tasks. The approach is evaluated in different simulations showing the viability of autonomous navigation and exploration in complex 3D scenarios.
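As a toy illustration of planning directly on point-cloud samples (without the rewiring step that makes RRT* asymptotically optimal, and not the paper's planner), the sketch below grows an RRT-style tree using cloud points as sampling targets and a KD-tree clearance test against the same cloud for collision checking; the step size and clearance are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def grow_rrt(cloud, start, goal, n_iter=300, step=0.5, clearance=0.3, seed=0):
    """Toy RRT-style tree growth that draws its samples from the 3D point cloud
    and rejects new nodes that come closer than `clearance` to any cloud point."""
    rng = np.random.default_rng(seed)
    kd = cKDTree(cloud)
    nodes, parent = [np.asarray(start, float)], {0: None}
    goal = np.asarray(goal, float)
    for _ in range(n_iter):
        sample = cloud[rng.integers(len(cloud))] if rng.random() < 0.8 else goal
        i = int(np.argmin([np.linalg.norm(sample - n) for n in nodes]))
        direction = sample - nodes[i]
        if np.linalg.norm(direction) < 1e-9:
            continue
        new = nodes[i] + step * direction / np.linalg.norm(direction)
        if kd.query(new)[0] < clearance:       # too close to the cloud: reject
            continue
        parent[len(nodes)] = i
        nodes.append(new)
        if np.linalg.norm(new - goal) < step:  # goal region reached
            break
    return nodes, parent

# Toy usage: a sparse random cloud between start and goal
cloud = np.random.default_rng(3).uniform(0.0, 10.0, (500, 3))
nodes, parent = grow_rrt(cloud, start=[0, 0, 0], goal=[10, 10, 10])
print(len(nodes))
```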

Review

22 pages, 327 KiB  
Review
Learning-Based Methods of Perception and Navigation for Ground Vehicles in Unstructured Environments: A Review
by Dario Calogero Guastella and Giovanni Muscato
Sensors 2021, 21(1), 73; https://doi.org/10.3390/s21010073 - 25 Dec 2020
Cited by 76 | Viewed by 8258
Abstract
The problem of autonomous navigation of a ground vehicle in unstructured environments is both challenging and crucial for the deployment of this type of vehicle in real-world applications. Several well-established communities in robotics research deal with these scenarios, such as search and rescue robotics, planetary exploration, and agricultural robotics. Perception plays a crucial role in this context, since it provides the information necessary to make the vehicle aware of its own status and its surrounding environment. We present a review of recent contributions in the robotics literature adopting learning-based methods to solve the problem of environment perception and interpretation, with the final aim of autonomous, context-aware navigation of ground vehicles in unstructured environments. To the best of our knowledge, this is the first work providing such a review in this context.
