Sensors and Human-Robot Interaction

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (15 July 2023) | Viewed by 1749

Special Issue Editors

Dr. Lei Wei
Institute for Intelligent Systems Research and Innovation, Deakin University, Geelong, Australia
Interests: haptics; robotics; tactile sensing; human-machine interaction; human-computer interaction

Dr. Lin Shu
School of Future Technology, South China University of Technology, Guangzhou 510641, China
Interests: flexible sensor technology; virtual reality; emotional stimulation scenes; wearable electronic devices; systems and applications; artificial intelligence; sensors

Special Issue Information

Dear Colleagues, 

With the rapid progress of various sensor technologies and algorithms, robots have become more intelligent and approachable, with a more comprehensive understanding of their surroundings and a greater ability to predict human behaviours. This enables richer interaction paradigms with better control precision, responsiveness and usability. This Special Issue focuses on recent advances in various forms of sensor technology, as well as their practical applications in human–robot interaction, including but not limited to the use of multimodal sensations of vision, hearing and touch, along with AI-based algorithms to construct, recognise, track and perform various tasks.

This Special Issue will cover all aspects of sensors for human–robot interaction, including sensor design, sensor fabrication, sensing methodologies, multimodal sensing, system frameworks, tools and applications, as well as analysis and assessment. We welcome submissions on all topics of human–robot interaction using sensors, including, but not limited to, the following:

  • Vision/camera-based sensors;
  • Wearable sensors, devices and electronics;
  • Tactile sensors;
  • Human–robot interaction;
  • Human–machine interaction;
  • Multimodal sensing;
  • Smart/intelligent sensors;
  • MEMS/NEMS;
  • Localization and object tracking;
  • Machine/deep learning and artificial intelligence in sensing and imaging.

Dr. Lei Wei
Dr. Lin Shu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website, then proceeding to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (1 paper)


Research

29 pages, 14199 KiB  
Article
Master–Slave Control System for Virtual–Physical Interactions Using Hands
by Siyuan Liu and Chao Sun
Sensors 2023, 23(16), 7107; https://doi.org/10.3390/s23167107 - 11 Aug 2023
Cited by 1 | Viewed by 1421
Abstract
Among the existing technologies for hand protection, master–slave control technology has been extensively researched and applied within the field of safety engineering to mitigate the occurrence of safety incidents. However, it has been identified through research that traditional master–slave control technologies no longer meet current production and lifestyle needs, and they have even begun to pose new safety risks. To resolve the safety risks exposed by traditional master–slave control, this research fuses master–slave control technology for hands with virtual reality technology, and the design of a master–slave control system for hands based on virtual reality technology is investigated. This study aims to realize the design of a master–slave control system for virtual–physical interactions using hands that captures the position, orientation, and finger joint angles of the user’s hand in real time and synchronizes the motion of the slave interactive device with that of a virtual hand. With amplitude limiting, jitter elimination, and a complementary filtering algorithm, the original motion data collected by the designed glove are turned into a Kalman-filtering-algorithm-based driving database, which drives the synchronous interaction of the virtual hand and a mechanical hand. As for the experimental results, the output data for the roll, pitch, and yaw were in the stable ranges of −0.1° to 0.1°, −0.15° to 0.15°, and −0.15° to 0.15°, respectively, which met the accuracy requirements for the system’s operation under different conditions. More importantly, these data prove that, in terms of accuracy and denoising, the data-processing algorithm was relatively compatible with the hardware platform of the system. 
Based on the algorithm for the virtual–physical interaction model, the authors introduced the concept of an auxiliary hand into the research, put forward an algorithmic process and a judgement condition for the stable grasp of the virtual hand, and solved a model-penetrating problem while enhancing the immersive experience during virtual–physical interactions. In an interactive experiment, a dynamic accuracy test was run on the system. As shown by the experimental data and the interactive effect, the system was satisfactorily stable and interactive. Full article
(This article belongs to the Special Issue Sensors and Human-Robot Interaction)
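The abstract above describes turning raw glove motion data into stable roll/pitch/yaw estimates using amplitude limiting, jitter elimination, and a complementary filtering algorithm. The paper's actual implementation is not reproduced here; as a rough, illustrative sketch of the complementary-filter idea only (the function name and parameter values below are hypothetical), one orientation channel can be fused from gyroscope and accelerometer readings like this:

```python
def complementary_filter(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    # Integrate the gyroscope rate for short-term accuracy, then blend in
    # the accelerometer-derived angle to correct long-term gyro drift.
    return alpha * (prev_angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Toy run: a stationary sensor with a small gyro bias (0.01 deg/s).
# The accelerometer reports the true angle (0 deg), so the fused
# estimate stays bounded instead of drifting without limit.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.01, accel_angle=0.0, dt=0.01)
```

The weight `alpha` trades gyro responsiveness against drift correction; per the abstract, the paper's pipeline additionally feeds the filtered data into a Kalman-filtering-based driving database.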