Assistance Robotics and Sensors

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: closed (31 January 2023) | Viewed by 27957

Special Issue Editors


Prof. Dr. Santiago T. Puente
Guest Editor
Automatics, Robotics and Computer Vision Group, Computer Science Research Institute, University of Alicante, 03690 Alicante, Spain
Interests: dexterous grasping; outdoor manipulation; neuro-robotics; myoelectric control; marine robotics; deep learning; production automation and automatic disassembly

Prof. Dr. Fernando Torres
Guest Editor
Automatics, Robotics and Computer Vision Group, Computer Science Research Institute, University of Alicante, 03690 Alicante, Spain
Interests: intelligent robotic manipulation; visual control of robots; robot perception systems; field robots and advanced automation for industry 4.0; artificial vision engineering and e-learning

Special Issue Information

Dear Colleagues,

In recent years, the use of assistance robotics has grown significantly, driven largely by advances in sensor and processing technologies and by increasing interest in making the interaction between robots and humans more natural. Robots are needed to assist humans in industry and manufacturing workspaces, in rehabilitation, and in medical environments. Robots are also used to achieve ambient assisted living and to help older adults. Furthermore, assistive robots serve in security, search, and rescue operations, and interact with humans with infectious diseases.

This Special Issue focuses on breakthrough developments in the field of assistive robotics, including current scientific progress in machine learning, deep learning, reinforcement learning, and imitation learning to enable assistive robots to help humans in any environment, as well as any supportive sensory system that facilitates interaction between humans and robots at home or in industrial settings. Methods and algorithms that combine sensors to enable assistive robots are also within scope. Papers should address innovative solutions in these fields. Both review articles and original research papers are solicited.

Prof. Dr. Santiago T. Puente
Prof. Dr. Fernando Torres
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Assistive robotics to human operators in industry and in manufacturing workspaces
  • Assistive robotics in the rehabilitation process and the medical environment
  • Robots to achieve ambient assisted living
  • Assistive robots in operations of security, search, or rescue
  • Robotics to interact with humans with infectious diseases
  • Machine learning, deep learning, reinforcement learning, and imitation learning to enable assistive robots to help humans in any environment
  • Methods and algorithms that combine sensors to enable assistive robots
  • Any supportive sensory system that facilitates interaction between humans and robots

Published Papers (9 papers)


Editorial

Jump to: Research

2 pages, 158 KiB  
Editorial
Assistance Robotics and Sensors
by Santiago T. Puente and Fernando Torres
Sensors 2023, 23(9), 4286; https://doi.org/10.3390/s23094286 - 26 Apr 2023
Viewed by 806
Abstract
In recent years, the exploitation of assistive robotics has experienced significant growth, mostly based on the development of sensor and processing technologies with the increasing interest in improving the interactions between robots and humans and making them more natural [...] Full article
(This article belongs to the Special Issue Assistance Robotics and Sensors)

Research

Jump to: Editorial

15 pages, 2999 KiB  
Article
Low Cost Magnetic Field Control for Disabled People
by Daniel Acosta, Bibiana Fariña, Jonay Toledo and Leopoldo Acosta Sanchez
Sensors 2023, 23(2), 1024; https://doi.org/10.3390/s23021024 - 16 Jan 2023
Cited by 2 | Viewed by 1512
Abstract
Our research presents a cost-effective navigation system for electric wheelchairs that utilizes the tongue as a human–machine interface (HMI) for disabled individuals. The user controls the movement of the wheelchair by wearing a small neodymium magnet on their tongue, which is held in place by a suction pad. The system uses low-cost electronics and sensors, including two electronic compasses, to detect the position of the magnet in the mouth. One compass estimates the magnet’s position while the other is used as a reference to compensate for static magnetic fields. A microcontroller processes the data using a computational algorithm that takes the mathematical formulations of the magnetic fields as input in real time. The system has been tested using real data to control an electric wheelchair, and it has been shown that a trained user can effectively use tongue movements as an interface for the wheelchair or a computer. Full article
(This article belongs to the Special Issue Assistance Robotics and Sensors)
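The two-compass scheme in the abstract above can be conveyed with a minimal sketch: the reference compass reading is subtracted from the sensing compass reading to isolate the magnet's field, which is then inverted through a dipole model to recover distance. This is an illustrative simplification assuming an on-axis dipole; the function names, magnetic moment, and field values are hypothetical, and the paper's actual algorithm uses full 3-D magnetic field formulations processed in real time on a microcontroller.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T·m/A)

def isolate_magnet_field(sensing, reference):
    """Subtract the reference compass reading (ambient field) from the
    sensing compass reading to isolate the tongue magnet's field."""
    return np.asarray(sensing) - np.asarray(reference)

def axial_dipole_distance(b_magnitude, moment):
    """Invert the on-axis dipole model |B| = mu0*2m/(4*pi*r^3) for r."""
    return (MU0 * 2.0 * moment / (4.0 * np.pi * b_magnitude)) ** (1.0 / 3.0)

# Example: the sensing compass sees the ambient field plus the magnet
ambient = [20e-6, -5e-6, 40e-6]   # typical Earth field components (T)
magnet = [0.0, 0.0, 2e-4]         # magnet's contribution (T), made up
sensing = [a + m for a, m in zip(ambient, magnet)]

b = isolate_magnet_field(sensing, ambient)
r = axial_dipole_distance(np.linalg.norm(b), moment=0.05)  # m = 0.05 A·m^2
print(round(r * 100, 1), "cm")  # → 3.7 cm, a plausible mouth-scale distance
```

The subtraction step is why the second compass matters: without it, the Earth's field (tens of microtesla) would be indistinguishable from a distant magnet.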

21 pages, 5658 KiB  
Article
Assessment of the Mechanical Support Characteristics of a Light and Wearable Robotic Exoskeleton Prototype Applied to Upper Limb Rehabilitation
by Manuel Andrés Vélez-Guerrero, Mauro Callejas-Cuervo, Juan C. Álvarez and Stefano Mazzoleni
Sensors 2022, 22(11), 3999; https://doi.org/10.3390/s22113999 - 25 May 2022
Cited by 3 | Viewed by 2501
Abstract
Robotic exoskeletons are active devices that assist or counteract the movements of the body limbs in a variety of tasks, including in industrial environments or rehabilitation processes. With the introduction of textile and soft materials in these devices, the effective motion transmission, mechanical support of the limbs, and resistance to physical disturbances are some of the most desirable structural features. This paper proposes an evaluation protocol and assesses the mechanical support properties of a servo-controlled robotic exoskeleton prototype for rehabilitation in upper limbs. Since this prototype was built from soft materials, it is necessary to evaluate the mechanical behavior in the areas that support the arm. Some of the rehabilitation-supporting movements such as elbow flexion and extension, as well as increased muscle tone (spasticity), are emulated. Measurements are taken using the reference supplied to the system’s control stage and then compared with an external high-precision optical tracking system. As a result, it is evidenced that the use of soft materials provides satisfactory outcomes in the motion transfer and support to the limb. In addition, this study lays the groundwork for a future assessment of the prototype in a controlled laboratory environment using human test subjects. Full article
(This article belongs to the Special Issue Assistance Robotics and Sensors)
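The comparison described above between the reference supplied to the control stage and the external optical tracking measurements reduces, in its simplest form, to an error metric over the two angle trajectories. The following is only a sketch under that assumption; the function name and the sample angles are invented, and the paper's evaluation protocol is considerably richer.

```python
import numpy as np

def tracking_rmse(reference_deg, measured_deg):
    """RMS error between the controller's reference elbow angle and the
    angle recovered by an external optical tracking system (degrees)."""
    r = np.asarray(reference_deg, dtype=float)
    m = np.asarray(measured_deg, dtype=float)
    return float(np.sqrt(np.mean((r - m) ** 2)))

# Hypothetical flexion-extension sweep: commanded vs. optically tracked
reference = [0.0, 30.0, 60.0, 90.0]
measured = [1.0, 29.0, 62.0, 88.0]
print(round(tracking_rmse(reference, measured), 2))  # → 1.58
```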

24 pages, 14126 KiB  
Article
Visual Sensor Fusion Based Autonomous Robotic System for Assistive Drinking
by Pieter Try, Steffen Schöllmann, Lukas Wöhle and Marion Gebhard
Sensors 2021, 21(16), 5419; https://doi.org/10.3390/s21165419 - 11 Aug 2021
Cited by 6 | Viewed by 3073
Abstract
People with severe motor impairments like tetraplegia are restricted in activities of daily living (ADL) and are dependent on continuous human assistance. Assistive robots perform physical tasks in the context of ADLs to support people in need of assistance. In this work a sensor fusion algorithm and a robot control algorithm for localizing the user’s mouth and autonomously navigating a robot arm are proposed for the assistive drinking task. The sensor fusion algorithm is implemented in a visual tracking system which consists of a 2-D camera and a single point time-of-flight distance sensor. The sensor fusion algorithm utilizes computer vision to combine camera images and distance measurements to achieve reliable localization of the user’s mouth. The robot control algorithm uses visual servoing to navigate a robot-handled drinking cup to the mouth and establish physical contact with the lips. This system features an abort command that is triggered by turning the head and unambiguous tracking of multiple faces which enable safe human robot interaction. A study with nine able-bodied test subjects shows that the proposed system reliably localizes the mouth and is able to autonomously navigate the cup to establish physical contact with the mouth. Full article
(This article belongs to the Special Issue Assistance Robotics and Sensors)
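The core of the camera-plus-ToF fusion described above can be illustrated with a standard pinhole back-projection: the detected mouth pixel combined with the single-point distance measurement yields a 3-D target for visual servoing. This is only a sketch; the intrinsics and pixel values below are made up, and the paper's sensor fusion additionally handles sensor alignment, multiple faces, and reliability checks.

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project a 2-D mouth detection (u, v) through the pinhole
    camera model, using the ToF distance as depth (metres)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical intrinsics for a 640x480 camera and a 0.55 m ToF reading
mouth_3d = backproject(u=400, v=260, depth=0.55,
                       fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(np.round(mouth_3d, 3))
```

The ToF sensor supplies the one quantity a single 2-D camera cannot: the scale (depth) of the detection.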

26 pages, 6511 KiB  
Article
Design, Development, and Testing of an Intelligent Wearable Robotic Exoskeleton Prototype for Upper Limb Rehabilitation
by Manuel Andrés Vélez-Guerrero, Mauro Callejas-Cuervo and Stefano Mazzoleni
Sensors 2021, 21(16), 5411; https://doi.org/10.3390/s21165411 - 10 Aug 2021
Cited by 17 | Viewed by 5884
Abstract
Neuromotor rehabilitation and recovery of upper limb functions are essential to improve the life quality of patients who have suffered injuries or have pathological sequels, where it is desirable to enhance the development of activities of daily living (ADLs). Modern approaches such as robotic-assisted rehabilitation provide decisive factors for effective motor recovery, such as objective assessment of the progress of the patient and the potential for the implementation of personalized training plans. This paper focuses on the design, development, and preliminary testing of a wearable robotic exoskeleton prototype with autonomous Artificial Intelligence-based control, processing, and safety algorithms that are fully embedded in the device. The proposed exoskeleton is a 1-DoF system that allows flexion-extension at the elbow joint, where the chosen materials render it compact. Different operation modes are supported by a hierarchical control strategy, allowing operation in autonomous mode, remote control mode, or in a leader-follower mode. Laboratory tests validate the proper operation of the integrated technologies, highlighting a low latency and reasonable accuracy. The experimental result shows that the device can be suitable for use in providing support for diagnostic and rehabilitation processes of neuromotor functions, although optimizations and rigorous clinical validation are required beforehand. Full article
(This article belongs to the Special Issue Assistance Robotics and Sensors)

11 pages, 2816 KiB  
Communication
Design of a Payload Adjustment Device for an Unpowered Lower-Limb Exoskeleton
by Junghwan Yun, Ohhyun Kang and Hyun-Min Joe
Sensors 2021, 21(12), 4037; https://doi.org/10.3390/s21124037 - 11 Jun 2021
Cited by 4 | Viewed by 2782
Abstract
This paper proposes a device that can change the payload of an unpowered lower-limb exoskeleton supporting the weights of humans and loads. Our previous exoskeletons used a cam–follower structure with a spring applied to the hip joint. This exoskeleton showed satisfying performance within the payload; however, the performance decreased when the payload was exceeded. Therefore, a payload adjustment device was developed that adjusts the wearer’s required torque and can easily be applied to the cam–follower structure. An exoskeleton dynamic equation was derived that calculates a person’s required joint torque given the required payload and the wearer’s posture. This dynamic equation provides a guideline for designing a device that can adjust the allowable joint torque range of an unpowered exoskeleton. In the Adams simulation environment, the payload adjustment device is applied to the cam–follower structure to show that the payload of the exoskeleton can be changed. User convenience and mass production were taken into account in the design of this device. Because it can quickly change the payload of the exoskeleton, the payload adjustment device can flexibly set the payload to the level desired by the wearer. Full article
(This article belongs to the Special Issue Assistance Robotics and Sensors)
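The role of the dynamic equation described above can be conveyed by a much-simplified static version: the torque the wearer must supply at the hip is the gravitational load torque minus the cam-spring assist. All names and numbers below are hypothetical illustrations, not taken from the paper, which derives a full exoskeleton dynamic equation as a function of payload and posture.

```python
import math

def required_hip_torque(total_mass_kg, com_dist_m, flexion_deg,
                        spring_torque_nm):
    """Static hip torque the wearer must supply: gravitational load
    torque minus the cam-spring assist, clamped at zero (the spring
    cannot demand negative effort in this toy model)."""
    gravity_torque = (total_mass_kg * 9.81 * com_dist_m
                      * math.sin(math.radians(flexion_deg)))
    return max(gravity_torque - spring_torque_nm, 0.0)

# 30 kg supported mass, 0.4 m to its centre of mass, 30° flexion,
# 40 N·m of cam-spring assist (all invented values)
print(round(required_hip_torque(30.0, 0.4, 30.0, 40.0), 2))  # → 18.86
```

Raising the spring assist to match a heavier payload is, in essence, what the proposed adjustment device does mechanically.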

28 pages, 15906 KiB  
Article
Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface
by Lukas Wöhle and Marion Gebhard
Sensors 2021, 21(5), 1798; https://doi.org/10.3390/s21051798 - 05 Mar 2021
Cited by 14 | Viewed by 3452
Abstract
This paper presents a lightweight, infrastructureless head-worn interface for robust and real-time robot control in Cartesian space using head- and eye-gaze. The interface comes at a total weight of just 162 g. It combines a state-of-the-art visual simultaneous localization and mapping algorithm (ORB-SLAM 2) for RGB-D cameras with a magnetic, angular rate, and gravity (MARG) sensor filter. The data fusion process is designed to dynamically switch between magnetic, inertial and visual heading sources to enable robust orientation estimation under various disturbances, e.g., magnetic disturbances or degraded visual sensor data. The interface furthermore delivers accurate eye- and head-gaze vectors to enable precise robot end-effector positioning and employs a head motion mapping technique to effectively control the robot's end-effector orientation. An experimental proof of concept demonstrates that the proposed interface and its data fusion process generate reliable and robust pose estimation. The three-dimensional head- and eye-gaze position estimation pipeline delivers a mean Euclidean error of 19.0±15.7 mm for head-gaze and 27.4±21.8 mm for eye-gaze at a distance of 0.3–1.1 m to the user. This indicates that the proposed interface offers a precise control mechanism for hands-free and full six degree of freedom (DoF) robot teleoperation in Cartesian space by head- or eye-gaze and head motion. Full article
(This article belongs to the Special Issue Assistance Robotics and Sensors)
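The dynamic switching between heading sources described above can be sketched as a priority rule plus a magnitude-based magnetic disturbance check. The thresholds, function names, and example values are illustrative assumptions only; the paper's filter is a full sensor fusion process rather than this bare selector.

```python
def magnetic_disturbance(b_norm_ut, expected_ut=50.0, tol_ut=10.0):
    """Flag a magnetic disturbance when the measured field magnitude
    (microtesla) deviates too far from the expected Earth-field
    magnitude at the user's location (threshold values are invented)."""
    return abs(b_norm_ut - expected_ut) > tol_ut

def select_heading(mag, imu, vis, mag_disturbed, vis_tracking_ok):
    """Priority rule: visual SLAM heading while tracking is good,
    magnetometer heading when the field is undisturbed, and the
    gyro-integrated heading as a fallback (drifts, always available)."""
    if vis_tracking_ok:
        return vis, "visual"
    if not mag_disturbed:
        return mag, "magnetic"
    return imu, "inertial"

# ORB-SLAM tracking is lost and a nearby motor distorts the field:
heading, source = select_heading(
    mag=92.0, imu=88.5, vis=None,
    mag_disturbed=magnetic_disturbance(75.0),  # 75 uT >> ~50 uT expected
    vis_tracking_ok=False)
print(source, heading)  # → inertial 88.5
```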

17 pages, 3961 KiB  
Article
Suboptimal Omnidirectional Wheel Design and Implementation
by Jordi Palacín, David Martínez, Elena Rubies and Eduard Clotet
Sensors 2021, 21(3), 865; https://doi.org/10.3390/s21030865 - 28 Jan 2021
Cited by 16 | Viewed by 3232
Abstract
The optimal design of an omnidirectional wheel usually focuses on minimizing the gap between the free rollers of the wheel, so as to reduce contact discontinuities with the floor and hence the generation of vibrations. However, in practice, a fast, tall, and heavy mobile robot using optimal omnidirectional wheels may also need a suspension system in order to reduce the presence of vibrations and oscillations in the upper part of the mobile robot. This paper empirically evaluates whether a heavy omnidirectional mobile robot can take advantage of its passive suspension system in order to also use non-optimal or suboptimal omnidirectional wheels with a non-optimized inner gap. The main comparative advantages of the proposed suboptimal omnidirectional wheel are its low manufacturing cost and the possibility of taking advantage of the gap to operate outdoors. The experimental part of this paper compares the vibrations generated by the motion system of a versatile mobile robot using optimal and suboptimal omnidirectional wheels. The final conclusion is that a suboptimal wheel with a large gap produces comparable on-board vibration patterns while maintaining the traction and increasing the grip on non-perfect planar surfaces. Full article
(This article belongs to the Special Issue Assistance Robotics and Sensors)
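The inner gap the abstract refers to has a simple geometric reading: on a wheel of radius R carrying n free rollers, the floor-contact gap is roughly the arc allotted per roller slot minus the arc the roller itself subtends. The dimensions below are invented for illustration and are not taken from the paper.

```python
import math

def roller_gap(wheel_radius_m, n_rollers, roller_chord_m):
    """Approximate floor-contact gap between adjacent free rollers:
    arc per roller slot minus the arc subtended by one roller's
    chord (a coarse geometric model, ignoring roller profile)."""
    slot_angle = 2.0 * math.pi / n_rollers
    roller_angle = 2.0 * math.asin(roller_chord_m / (2.0 * wheel_radius_m))
    return wheel_radius_m * (slot_angle - roller_angle)

# Hypothetical 150 mm wheel with 12 rollers of 35 mm chord length
print(round(roller_gap(0.075, 12, 0.035) * 1000, 1), "mm")  # → 3.9 mm
```

Under this model, shrinking the gap means either more rollers or longer rollers, both of which raise manufacturing cost, which is the trade-off the paper's suboptimal design accepts.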

13 pages, 3851 KiB  
Article
Impact of Acoustic and Interactive Disruptive Factors during Robot-Assisted Surgery—A Virtual Surgical Training Model
by Magret Krüger, Johannes Ackermann, Daniar Osmonov, Veronika Günther, Dirk Bauerschlag, Johannes Hensler, Jan-Hendrik Egberts, Sebastian Lippross, Georgios Gitas, Thomas Becker, Nicolai Maass, Klaus-Peter Jünemann and Ibrahim Alkatout
Sensors 2020, 20(20), 5891; https://doi.org/10.3390/s20205891 - 17 Oct 2020
Cited by 5 | Viewed by 2822
Abstract
The use of virtual reality trainers for teaching minimally invasive surgical techniques has been established for a long time in conventional laparoscopy as well as robotic surgery. The aim of the present study was to evaluate the impact of reproducible disruptive factors on the surgeon’s work. In a cross-sectional investigation, surgeons were tested with regard to the impact of different disruptive factors when doing exercises on a robotic-surgery simulator (Mimic Flex VR™). Additionally, we collected data about the participants’ professional experience, gender, age, expertise in playing an instrument, and expertise in playing video games. The data were collected during DRUS 2019 (Symposium of the German Society for Robot-assisted Urology). Forty-two surgeons attending DRUS 2019 were asked to participate in a virtual robotic stress training unit. The surgeons worked in various specialties (visceral surgery, gynecology, and urology) and had different levels of expertise. The time taken to complete the exercise (TTCE), the final score (FSC), and blood loss (BL) were measured. In the basic exercise with an interactive disruption, TTCE was significantly longer (p < 0.01) and FSC significantly lower (p < 0.05). No significant difference in TTCE, FSC, or BL was noted in the advanced exercise with acoustic disruption. Performance during disruption was not dependent on the level of surgical experience, gender, age, expertise in playing an instrument, or playing video games. A positive correlation was registered between self-estimation and surgical experience. Interactive disruptions have a greater impact on the performance of a surgeon than acoustic ones. Disruption affects the performance of experienced as well as inexperienced surgeons. Disruption in daily surgery should be evaluated and minimized in the interest of the patient’s safety. Full article
(This article belongs to the Special Issue Assistance Robotics and Sensors)
