
Recent Advances in Visual Sensor Networks for Robotics and Automation

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: closed (20 March 2022) | Viewed by 3709

Special Issue Editors


Guest Editor
Department of Information Engineering, University of Padova, Via Gradenigo 6B, 35131 Padova, Italy
Interests: system modeling; control theory and its applications; sensor and actuator networks; multiagent and robotic systems

Guest Editor
Department of Management and Engineering, University of Padova, Stradella San Nicola 3, 36100 Vicenza, Italy
Interests: multiagent systems modeling and control, with particular regard to networked camera systems and formations of aerial vehicles and nanosatellites

Special Issue Information

Dear Colleagues,

The development of low-cost, high-resolution imaging sensors, the widespread adoption of IoT technologies, and renewed methodological interest in networked systems are driving the transformation of traditional vision systems into ubiquitous, pervasive multiagent visual sensor networks, whose sensing capabilities act in synergy with their actuation capabilities.

In particular, by acquiring information-rich data within a sense-to-act, act-to-sense paradigm, these multiagent architectures play a crucial role in robotics and automation. Indeed, they can manage complex, dynamic decision-making processes across a wide variety of applications: localization and navigation in indoor/outdoor environments, exploration and mapping of unstructured areas, scene monitoring and surveillance, industrial and infrastructure inspection, collaborative manipulation, and human-in-the-loop interaction, to cite a few.

This Special Issue on Sensors and Robotics aims to bring together novel contributions on multiagent visual sensor networks, with emphasis on the dynamic characteristics of the information and of the systems, which may be induced by the nature of the observed scene, by that of the observing agents, or by both.

Along this line, possible topics of interest include, but are not limited to:

  • Cooperative robotics;
  • Active and mobile sensing;
  • Vision-based learning for control;
  • Context-aware automation;
  • Multicamera networks.

Authors may alternatively submit to the Joint Special Issue in the journal Automation.

Prof. Dr. Angelo Cenedese
Dr. Giulia Michieletto
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Multicamera active vision
  • Robotic visual sensor networks (VSNs)
  • VSNs in process and industrial automation
  • Multimodal sensor fusion
  • Emerging applications of VSNs in smart environments

Published Papers (2 papers)


Research

18 pages, 8072 KiB  
Article
Indoor Visual Exploration with Multi-Rotor Aerial Robotic Vehicles
by Panagiotis Rousseas, George C. Karras, Charalampos P. Bechlioulis and Kostas J. Kyriakopoulos
Sensors 2022, 22(14), 5194; https://doi.org/10.3390/s22145194 - 11 Jul 2022
Cited by 1 | Viewed by 1346
Abstract
In this work, we develop a reactive algorithm for autonomous exploration of indoor, unknown environments for multiple autonomous multi-rotor robots. The novelty of our approach rests on a two-level control architecture comprised of an Artificial-Harmonic Potential Field (AHPF) for navigation and a low-level tracking controller. Owing to the AHPF properties, the field is provably safe while guaranteeing workspace exploration. At the same time, the low-level controller ensures safe tracking of the field through velocity commands to the drone’s attitude controller, which handles the challenging non-linear dynamics. This architecture leads to a robust framework for autonomous exploration, which is extended to a multi-agent approach for collaborative navigation. The integration of approximate techniques for AHPF acquisition further improves the computational complexity of the proposed solution. The control scheme and the technical results are validated through high-fidelity simulations, where all aspects, from sensing and dynamics to control, are incorporated, demonstrating the capacity of our method in successfully tackling the multi-agent exploration task.
(This article belongs to the Special Issue Recent Advances in Visual Sensor Networks for Robotics and Automation)
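The core idea behind harmonic-potential-field navigation can be illustrated with a minimal sketch. This is not the authors' implementation: the grid discretization, the Jacobi relaxation, and the steepest-descent follower below are illustrative assumptions. What the sketch does show is the key property the abstract relies on: a harmonic field has no spurious local minima, so greedy descent from any free cell provably reaches the goal without entering obstacles.

```python
import numpy as np

def harmonic_field(obstacles, goal, iters=500):
    """Relax Laplace's equation on a grid, with obstacles pinned to the
    maximum potential and the goal pinned to the minimum."""
    phi = np.ones(obstacles.shape)
    phi[goal] = 0.0
    for _ in range(iters):
        # Jacobi iteration: each cell becomes the mean of its 4 neighbors
        phi = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                      np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
        phi[obstacles] = 1.0   # re-pin the boundary conditions
        phi[goal] = 0.0
    return phi

def steepest_descent(phi, start, max_steps=500):
    """Follow the field downhill: move to the lowest-potential
    4-neighbor until the goal (potential 0) is reached."""
    r, c = start
    path = [(r, c)]
    for _ in range(max_steps):
        if phi[r, c] == 0.0:
            break
        r, c = min(((r + dr, c + dc)
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))),
                   key=lambda p: phi[p])
        path.append((r, c))
    return path

# A walled 12x12 room: descent from a corner reaches the central goal
# without ever touching an obstacle cell.
obstacles = np.zeros((12, 12), dtype=bool)
obstacles[0, :] = obstacles[-1, :] = obstacles[:, 0] = obstacles[:, -1] = True
phi = harmonic_field(obstacles, (6, 6))
path = steepest_descent(phi, (1, 1))
```

In the paper, a low-level controller would then track such a path by issuing velocity commands to the drone's attitude controller; here the descent simply returns the grid path.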

28 pages, 2349 KiB  
Article
Visual Sensor Networks for Indoor Real-Time Surveillance and Tracking of Multiple Targets
by Jacopo Giordano, Margherita Lazzaretto, Giulia Michieletto and Angelo Cenedese
Sensors 2022, 22(7), 2661; https://doi.org/10.3390/s22072661 - 30 Mar 2022
Cited by 3 | Viewed by 1498
Abstract
The recent trend toward the development of IoT architectures has entailed the transformation of the standard camera networks into smart multi-device systems capable of acquiring, elaborating, and exchanging data and, often, dynamically adapting to the environment. Along this line, this work proposes a novel distributed solution that guarantees the real-time monitoring of 3D indoor structured areas and also the tracking of multiple targets, by employing a heterogeneous visual sensor network composed of both fixed and Pan-Tilt-Zoom (PTZ) cameras. The fulfillment of the twofold mentioned goal was ensured through the implementation of a distributed game-theory-based algorithm, aiming at optimizing the controllable parameters of the PTZ devices. The proposed solution is able to deal with the possible conflicting requirements of high tracking precision and maximum coverage of the surveilled area. Extensive numerical simulations in realistic scenarios validated the effectiveness of the outlined strategy.
(This article belongs to the Special Issue Recent Advances in Visual Sensor Networks for Robotics and Automation)
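The game-theoretic flavor of such a PTZ controller can be sketched as a best-response loop in which each camera picks the pan angle maximizing its marginal utility: a weighted count of the targets it alone tracks plus the area it alone covers. The 2D sector model, the discretized pan set, and the weight below are illustrative assumptions, not the paper's actual formulation (which is distributed and also optimizes tilt and zoom); the sketch only shows how marginal-contribution utilities trade tracking against coverage. Such utilities form a potential game, so sequential strict best responses terminate at a pure-strategy equilibrium.

```python
import numpy as np

def covered(cam, pan, points, fov=np.pi / 3, rng=6.0):
    """Boolean mask of points inside a camera's sector field of view."""
    d = points - cam
    dist = np.linalg.norm(d, axis=1)
    ang = np.arctan2(d[:, 1], d[:, 0])
    off = np.abs((ang - pan + np.pi) % (2 * np.pi) - np.pi)
    return (dist <= rng) & (off <= fov / 2)

def best_response(cams, pans, targets, area, w_track=10.0):
    """Cameras take turns choosing the pan that maximizes their marginal
    contribution (targets/area not already seen by the other cameras)."""
    pan_opts = np.linspace(0.0, 2 * np.pi, 8, endpoint=False)
    changed = True
    while changed:
        changed = False
        for i, cam in enumerate(cams):
            seen_t = np.zeros(len(targets), dtype=bool)
            seen_a = np.zeros(len(area), dtype=bool)
            for j, other in enumerate(cams):
                if j != i:
                    seen_t |= covered(other, pans[j], targets)
                    seen_a |= covered(other, pans[j], area)
            def util(p):
                # tracking a target is weighted above plain area coverage
                return (w_track * np.sum(covered(cam, p, targets) & ~seen_t)
                        + np.sum(covered(cam, p, area) & ~seen_a))
            best = max(pan_opts, key=util)
            if util(best) > util(pans[i]):
                pans[i] = best
                changed = True
    return pans

# One camera initially facing away from the only target: the weighted
# utility makes it give up a coverage point to turn toward the target.
cam = np.array([0.0, 0.0])
targets = np.array([[3.0, 0.0]])   # target to the east
area = np.array([[-3.0, 0.0]])     # coverage point to the west
pans = best_response([cam], [np.pi], targets, area)
```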
