Retraction published on 26 January 2024, see Robotics 2024, 13(2), 21.
Article

RETRACTED: A Newly-Designed Wearable Robotic Hand Exoskeleton Controlled by EMG Signals and ROS Embedded Systems

by Ismail Ben Abdallah 1 and Yassine Bouteraa 2,*

1 Control and Energy Management Laboratory (CEM Lab.), Ecole Nationale d’Ingénieurs de Sfax (ENIS), University of Sfax, Sfax 3038, Tunisia
2 Department of Computer Engineering, College of Computer Engineering and Sciences, Prince Sattam Bin Abdulaziz University, Al-Kharj 11942, Saudi Arabia
* Author to whom correspondence should be addressed.
Robotics 2023, 12(4), 95; https://doi.org/10.3390/robotics12040095
Submission received: 14 June 2023 / Revised: 25 June 2023 / Accepted: 30 June 2023 / Published: 1 July 2023 / Retracted: 26 January 2024
(This article belongs to the Section Medical Robotics and Service Robotics)

Abstract:
One of the most difficult parts of stroke therapy is hand mobility recovery. Indeed, stroke is a serious medical disorder that can severely impair hand and locomotor movement. To improve hand function in stroke patients, new medical technologies, such as various wearable devices and rehabilitation therapies, are being developed. In this study, a new design of an electromyography (EMG)-controlled 3D-printed hand exoskeleton is presented. The exoskeleton was created to help stroke survivors with their gripping abilities. Computer-aided design software was used to create the device’s 3D architecture, which was then printed using a polylactic acid filament. For online classification, the performance of two classifiers, the support vector machine (SVM) and the K-nearest neighbor (KNN), was compared. The Robot Operating System (ROS) connects the various system nodes and generates the decision for the hand exoskeleton. The selected classifiers achieved high accuracy, reaching up to 98% for online classification performed with healthy subjects. These findings imply that the new wearable exoskeleton, which can be controlled in accordance with the subjects’ motion intentions, could aid in hand rehabilitation over a wider motion range and with greater dexterity.

1. Introduction

Stroke is the leading cause of motor impairment, affecting more than 100/100,000 people worldwide each year [1]. Following a stroke, about 80% of patients exhibit upper limb disability, frequently affecting hand function [1]. Since hands are a basic human instrument, the inability to execute several Activities of Daily Living (ADL) has a significant impact on the patient’s autonomy and may lead to permanent incapacity.
Robotic-assisted treatments could help stroke patients restore their motor function, according to the latest developments in hand rehabilitation [2,3]. The exoskeletons that have been presented offer the opportunity to intensify and repeat therapy more frequently, to precisely control motor support, and to monitor patient progress objectively [4]. Robotic therapy could therefore adapt to the patient’s unique movement patterns and progress while performing a personalized and objective follow-up of the patient’s evolution and muscular condition [5]. Moreover, the rehabilitation method can be adapted to the patient’s sensitivity, recovery stage, particular disabilities, and occupational constraints [6].
From a mechanical design point of view, hand exoskeletons can be classified into glove-based, pneumatic, and mechanical linkage-based devices. Although pneumatic exoskeletons are simple to operate, flexible finger movement is challenging to achieve [7]. Glove-based exoskeletons are often less flexible and less comfortable to wear because of the heat and sweat trapped under the glove’s covering; the covering also prevents direct contact with the object and interferes with tactile feedback [8]. In linkage-based exoskeletons, the finger components are connected to the mechanical linkages via either fingertip contact [7] or full hand contact [9].
A further concern is how to operate the hand exoskeleton in accordance with the patient’s intended movements. Surface electromyography (sEMG) is a non-invasive technique that has been extensively employed in clinical evaluations [10]. The sEMG reflects muscular contractions that are controlled by the central nervous system. A hand exoskeleton’s ability to be controlled in real time may be enhanced by the capture and analysis of sEMG signals [11]. Most sEMG-driven exoskeletons adopt the mirror-therapy approach, which extracts the movement intention of stroke survivors from the non-paretic muscles because these muscles largely retain proper function [12]. Various techniques have been developed for sEMG interpretation, movement intention parsing, and robotic exoskeleton actuation. Real-time control and online sEMG analysis for multi-actuator exoskeletons are still difficult to implement. In [13], a novel 3D-printed hand orthosis controlled by electromyography (EMG) signals is presented; however, the investigated feature extraction approaches are limited to a single time-domain feature. Moreover, a new hand exoskeleton with real-time EMG-driven embedded control is proposed, focusing on quantifying hand gesture recognition delays for bilateral rehabilitation [14]. However, because the control is performed bilaterally, the presented approach can only be extrapolated to stroke patients with a single affected hand.
This study presents a newly developed wearable robotic hand exoskeleton that can be controlled by motion intention and has larger ranges of motion (ROMs) for most joints. The mechanical shells come into full contact with the human finger, and a total of five actuators are used to produce strong output forces for each digit. The sEMG was used to interpret the motion intentions of the healthy arm’s muscles as they controlled the exoskeleton of the hand. Two classifiers were implemented in several configurations, and the obtained performances were compared. In the same context, the authors of [15] present a novel electromyography (EMG)-driven hand exoskeleton for bilateral rehabilitation of grasping in stroke. Although the obtained results are acceptable, the total number of healthy subjects for training and stroke subjects for validation is very limited. On the other hand, a new design of a wearable robotic hand exoskeleton with more degrees of freedom (DOFs) and a larger range of motion (ROM) was demonstrated in [16]; however, the authors focused on mechanical features by optimizing the DOFs and the ROM.
The Robot Operating System (ROS) is widely used in the design of the control systems of exoskeleton devices [17,18,19,20,21,22]. In [19], the authors present a new design of the robot’s control software based on the ROS platform to realize basic rehabilitation training of the patient’s shoulder. Moreover, a new framework has been developed for rapid prototyping based on the integration between MATLAB-Simulink and the ROS environment [20]. This framework allows robust position and torque control of the exoskeleton and real-time monitoring. In [21], a PC equipped with ROS is used to simulate a multi-finger dexterous hand in RVIZ (ROS visualization), and the simulation model of the hand is controlled through ROS node communication. In the same context, a set of software components, executed as ROS nodes, is proposed to solve hardware-related problems and implement control strategies [22]. However, the use of ROS-based real-time control architectures still requires further development for human–computer systems.
For this purpose, we have created a prototype system that estimates hand-grip state by monitoring EMG signals from the associated muscles of the healthy limb. The main architecture of the developed system consists of controlling a new hand exoskeleton device via EMG signals measured from the healthy non-paretic side. We used the SVM and KNN classifiers to perform handgrip state estimation. Figure 1 depicts the system’s general architecture. As shown in this figure, the main architecture is composed of the EMG acquisition system, the preprocessing stage, the classification algorithms, and the active hand exoskeleton device. In the processing stage, four time-domain features were extracted from the received signals and considered as inputs for the classification block, including Willison amplitude (WAMP), Mean Absolute Value (MAV), Variance (VAR), and Waveform length (WL). The control system is based on ROS, a well-known operating system with resources for building robotic applications. The primary goal of this study was to create a reliable classification of two hand movements that could be used to control the motion of a hand exoskeleton device with few EMG channels and many degrees of freedom.
The structure of the current work is as follows: Section 2 introduces the materials and methods. Results are presented in Section 3 and discussed in Section 4. We conclude with an overview of the work and perspectives for future research.

2. Materials and Methods

2.1. System Overview and Operating System Selection

The main architecture of the developed system consists of controlling a new hand exoskeleton device via EMG signals measured from the healthy, non-paretic side. As shown in Figure 1, the main architecture is composed of the EMG acquisition system, the preprocessing stage, the classification algorithms, and the active hand exoskeleton device. The acquisition system is built around surface electrodes plugged into the MyoWare sensor, an Arduino-compatible, all-in-one electromyography (EMG) sensor from Advancer Technologies. The sensor measures muscle activity through the electric potential of the muscle, commonly referred to as surface electromyography, and can output a filtered and rectified signal that represents how hard the muscle is being flexed. The board features a single-supply voltage of +3.3 V to +5 V, three output modes, reverse-polarity-protected power pins, and indicator LEDs. Raw EMG signals were acquired by a real-time sbRIO board, ensuring a high-speed acquisition system. The signal pre-processing stage consists of filtering the acquired signals and applying noise cancellation techniques. Four time-domain features were extracted from the received signals and used as inputs for the classification block: Willison amplitude (WAMP), Mean Absolute Value (MAV), Variance (VAR), and Waveform length (WL).
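As a minimal sketch of this feature extraction stage, the following Python/NumPy functions compute the four time-domain features on one window of raw EMG samples; the WAMP threshold value and the function names are illustrative assumptions, since the actual implementation in this work runs inside LabVIEW.

```python
# Minimal sketch of the four time-domain features (WAMP, MAV, VAR, WL)
# computed over one window of raw EMG samples.
import numpy as np

def wamp(x: np.ndarray, threshold: float = 0.02) -> int:
    """Willison amplitude: number of successive-sample differences above a threshold."""
    return int(np.sum(np.abs(np.diff(x)) >= threshold))

def mav(x: np.ndarray) -> float:
    """Mean absolute value of the window."""
    return float(np.mean(np.abs(x)))

def var(x: np.ndarray) -> float:
    """Variance as formulated in Table 2 (sum of squares divided by N-1)."""
    return float(np.sum(x ** 2) / (len(x) - 1))

def wl(x: np.ndarray) -> float:
    """Waveform length: cumulative absolute change between successive samples."""
    return float(np.sum(np.abs(np.diff(x))))

def feature_vector(ch1: np.ndarray, ch2: np.ndarray) -> np.ndarray:
    """Eight features per event: four features x two EMG channels (Table 3 convention)."""
    return np.array([wamp(ch1), wamp(ch2),
                     mav(ch1),  mav(ch2),
                     var(ch1),  var(ch2),
                     wl(ch1),   wl(ch2)])
```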
The control system is based on ROS, a well-known framework that provides resources for building robotic applications. ROS is an open-source software development environment for robots; thanks to its distributed design, it can run simultaneously on multiple machines and connect to a wide range of devices and programs. The suggested control system design is shown in Figure 2. Several nodes were incorporated into the architecture to ensure data collection and execution of the classification algorithms.
Parallel processing is the main focus of the low-level design (Figure 3). Task scheduling for parallel processing is a method for splitting an operation into multiple processes and running them simultaneously on several CPUs or cores. As illustrated in Figure 3, parallel processing is applied to each task that requires running multiple processes, including the multichannel EMG acquisition, the filtering of the acquired signals, the execution of the feature extraction approaches, and the data fusion for decision making. This design offers concurrency, which relaxes timing constraints and enables the solution of larger problems, and additional devices may also be integrated with it. To meet real-time requirements, the system operations must be finished within the sample time T, which depends on the principal components (acquisition, computing, and decision-making). The ROS-based node architecture is shown in Figure 4. The first node deals with the acquisition block based on the NI sbRIO real-time acquisition board, which acquires the signals received from the EMG sensor and passes them to the laptop. A ROS toolkit for LabVIEW, together with Network Streams, is used to exchange data with the sbRIO. Regarding the pre-processing stage, four software nodes were designed for EMG processing in order to extract useful information for classification. The sixth node is also a software node; it runs the machine learning algorithms and returns the decision to the last node, which applies it to the hand exoskeleton device. In this hardware node, an Arduino Due is used to control the hand exoskeleton actuators via the dedicated interfaces. This node uses the serial protocol to communicate with the ROS core (acting as master).
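The following rclpy sketch illustrates the role of the decision node in this architecture: it subscribes to the fused EMG features and publishes the grasp/release decision consumed by the exoskeleton hardware node. The topic names, message types, and the threshold rule are assumptions for illustration only; the real node runs the trained SVM/KNN classifiers.

```python
# Illustrative ROS decision node: subscribes to EMG features, publishes a command.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32MultiArray, String

class DecisionNode(Node):
    def __init__(self):
        super().__init__('emg_decision_node')
        # Feature vector published by the pre-processing nodes (assumed topic name).
        self.sub = self.create_subscription(
            Float32MultiArray, 'emg/features', self.on_features, 10)
        # Command consumed by the Arduino-based hardware node (assumed topic name).
        self.pub = self.create_publisher(String, 'exo/command', 10)

    def on_features(self, msg: Float32MultiArray):
        # Placeholder decision rule; the real node would call the trained SVM/KNN model.
        decision = 'grasp' if sum(msg.data) > 0.5 else 'release'
        self.pub.publish(String(data=decision))

def main():
    rclpy.init()
    rclpy.spin(DecisionNode())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```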
For the classification step, two classifiers were implemented in the LabVIEW interface for intention detection. The classification results were sent to the control system of the hand exoskeleton device via the Arduino Due board. The Arduino board executes the desired motion by running five stepper motors fixed to the related mechanical support. To achieve the desired movement, each stepper motor is driven by a dedicated driver based on the ULN2003 circuit. With this architecture, participants can grasp, hold, and release suitable objects with high flexibility.
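A hypothetical host-side sketch of this serial link is shown below; the single-character command protocol ('G' for grasp, 'R' for release), the port name, and the baud rate are assumptions rather than the authors' actual firmware interface, which communicates through the ROS serial bridge.

```python
# Hypothetical host-side serial command to the Arduino Due driving the five
# ULN2003 stepper drivers. Protocol, port, and baud rate are assumed values.
import serial

def send_decision(decision: str, port: str = '/dev/ttyACM0', baud: int = 115200) -> None:
    command = b'G' if decision == 'grasp' else b'R'
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(command)      # firmware maps this to forward/reverse stepping
        ack = link.readline()    # optional acknowledgement from the firmware
        print('exoskeleton replied:', ack)
```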
Significant advancements in software engineering lifecycle dependability are required due to the growing complexity of autonomous robotic systems. Middleware is one of the most challenging research areas for such systems. Runtime application adaptation, communication across heterogeneous systems at various middleware layers, and runtime safety assurance in the case of a failure are all examples of middleware issues.
Since ROS1 does not provide real-time performance, we switched to the Robot Operating System 2 (ROS2) architecture [23,24]. ROS1 has several other drawbacks, including the lack of a standardized approach for creating a multi-robot system [25,26]. Furthermore, ROS1 lacks a real-time design, which would force us to stretch our design to meet the high real-time performance requirements and tight real-time performance indicators of our control system.
In order to ensure data integrity, ROS1’s distributed strategy also requires a stable network environment, yet its network traffic is unsecured and unencrypted. ROS2 improves network performance for multi-robot communication over ROS1 and adds functionality for multi-robot systems. Real-time control is also supported by ROS2, which can enhance the intended system’s performance as well as the timeliness of control [27,28]. The architecture of ROS2, which is organized into multiple layers, improves fault tolerance because communication is based on the DDS (Data Distribution Service) standard (Figure 5). Moreover, ROS2’s intra-process communication mechanism is more efficient.
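The sketch below shows how a ROS2 publisher can be configured with an explicit DDS Quality-of-Service profile, which is the kind of reliability and history tuning that motivates the move from ROS1; the topic name and QoS settings are illustrative assumptions, not the exact profile used in this work.

```python
# Sketch of a ROS2 publisher with an explicit DDS Quality-of-Service profile.
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy
from std_msgs.msg import String

class CommandPublisher(Node):
    def __init__(self):
        super().__init__('exo_command_publisher')
        qos = QoSProfile(
            reliability=ReliabilityPolicy.RELIABLE,  # delivery guaranteed by DDS
            history=HistoryPolicy.KEEP_LAST,
            depth=1)                                 # only the latest command matters
        self.pub = self.create_publisher(String, 'exo/command', qos)

def main():
    rclpy.init()
    node = CommandPublisher()
    node.pub.publish(String(data='release'))  # example command
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```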

2.2. Hand Exoskeleton Design

In this section, we describe the use of 3D printing technology to realize a brand-new hand exoskeleton design. The hand exoskeleton’s 3D architecture was developed using the computer-aided design program SOLIDWORKS. We used a 3D printer based on fused deposition modeling (FDM). As one of the most popular manufacturing methods in 3D printing, FDM offers the advantage of producing parts quickly and affordably.
This idea aims to develop a poly-articulated hand exoskeleton inspired by the biology of the human hand in terms of size and principle of actuation. Equipped with five geared stepper motors ensuring the movements of each finger, this hand exoskeleton integrates force and position sensors in order to control the individual movement of each finger based on an integrated computer. Figure 6 shows the 3D CAD model of the assembly of the bionic hand. The module pieces were then manufactured using a 3D printer in preparation for assembly. Final assembly of the manufactured components results in the bionic hand illustrated in Figure 7. The hand exoskeleton device was created as an active hand exoskeleton to assist stroke survivors with gripping activities. Five independent degrees of freedom (DOF), one for each finger, are provided by the hand exoskeleton device to aid in gripping various things. As illustrated in Figure 8, the hand exoskeleton is made up of five pulley transmission systems, one for each finger, situated on the front of the hand so as not to obstruct the ability to grasp objects, and powered by five stepper motors. For material selection, the control unit box and the five finger mechanisms are printed in three dimensions using PLA material, also known as polylactic acid. In fact, PLA is a great, simple-to-use 3D printing material. Since it is made from renewable resources like maize starch, tapioca roots, or sugarcane, its major benefit is that it is a renewable and biodegradable resource. Thus, it naturally degrades when exposed to the environment. In addition, it is non-toxic and has a pleasant smell when printed. A large variety of colors are available in PLA filament, which is also very simple to print well owing to its thermal properties.
The assembly step requires several adjustments of the fishing wires attached to the pulley fixed at each stepper motor shaft to ensure flexion and extension of each finger. Table 1 presents the main components of the designed exoskeleton. The control box is designed to support the electronic architecture of the control system. This box is attached to the motor support piece, which is capable of supporting six motors. The guiding-wire part is placed on the carpal zone to adjust the five wires. Regarding performance, the hand exoskeleton was designed to produce a grasping force of 10 N, which was deemed sufficient to safely hold medium-sized objects. Regarding the response time, the actuated hand exoskeleton has less than 2 s of latency to apply the received decision. The device is powered by a 5 V DC supply, and the maximum current consumption is around 2.5 A.
In order to minimize electromagnetic interference, the hand exoskeleton’s control and drive electronics were built into a small piece of hardware integrated inside the control box of the hand exoskeleton. The electronic architecture is fully covered by the plastic box and separated from the power system and the actuators, ensuring minimal electromagnetic interference. For the control architecture, an embedded Arduino board received the control signals from the LabVIEW system once the classification algorithm had been run and its results transformed into control signals. The Arduino microcontroller was used to drive the hand exoskeleton actuators; it transformed the received control commands into Pulse Width Modulation (PWM) signals. Five geared stepper motors were independently controlled in order to move each finger via the coupled fishing wires. Each stepper motor is driven by the popular ULN2003 driver dedicated to stepper motor control. The ULN2003 module is a high-voltage (up to 50 V), high-current (up to 500 mA) Darlington driver comprising seven NPN Darlington pairs; all units feature integral clamp diodes for switching inductive loads, and the device can also be used to drive relays, lamps, and LED displays. When the user wants to perform grasping, each stepper motor is actuated in the forward direction. The fishing wire attached to the motor shaft ensures the flexion of the related finger during the grasping state of the hand. Conversely, the exoskeleton returns to its resting state when the motors are actuated in the reverse direction.

2.3. Data Acquisition System for EMG Measurement

In order to assess the degree of assistance during human-exoskeleton contact, EMG recordings can be used to correlate the state of hand grasping with the activity of the recruited muscles. Two major arm muscles, the extensor digitorum longus (EDL) and the flexor digitorum longus (FDL), were monitored in this research with the aim of assessing the hand grasping state. AgCl electrodes were used to acquire the signals, and skin-electrode contact was verified by following the surface EMG standards for the non-invasive assessment of muscles (SENIAM) [29].
To further improve signal capture, the participant’s skin was cleansed with an alcohol swab and then shaved. The electrode placement followed the recommendations of SENIAM. For each bipolar derivation, pre-gelled electrodes were employed with an inter-electrode distance of 20 mm. For all bipolar derivations, the reference electrode was placed at the wrist. Figure 8 displays the map of the EMG sensor locations. Thirty healthy participants with a mean age of 25 ± 3.5 years and no history of stroke or hand impairment were recruited for the EMG signal recordings dedicated to classifier training. All individuals gave informed consent to take part in the study.
The acquisition system is based on the MyoWare EMG sensor using surface electrodes. To measure muscle activity, this sensor provides two outputs: the raw EMG and the rectified EMG. The rectified EMG output is produced by an integrated analog conditioning block that provides the time-domain average value of the EMG signal. However, this output is not suitable for additional digital processing and can only be used for threshold detection; therefore, no further features can be extracted from the rectified signal. On the other hand, the raw EMG output is more suitable for further digital processing in order to extract the useful features of the generated EMG signals [30,31]. In this investigation, we employed the real-time board sbRIO-9637 from National Instruments, with the raw EMG as the acquisition system’s input.
The captured signal’s amplifier gain is set to 1000. The acquisition board’s sampling clock is set to 100 kHz, and the EMG channels are sampled at 10,000 samples per second [32]. Furthermore, a Butterworth-type 10–2000 Hz band-pass filter is applied [4]. To provide two channels of EMG signals, pre-gelled surface electrodes were attached on both sides of the relevant muscles, with a reference electrode contacting the olecranon. To enhance the received signal’s quality, we created a 50 Hz Butterworth-type notch filter in the LabVIEW block diagram to eliminate power-line noise. This filter was designed using LabVIEW’s Digital Filter Design module.
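The following SciPy sketch reproduces an equivalent pre-processing chain, i.e., a Butterworth 10–2000 Hz band-pass followed by a 50 Hz notch; the filter order and notch quality factor are illustrative assumptions, since the filters in this work were designed with LabVIEW's Digital Filter Design module.

```python
# Illustrative EMG pre-processing: 10-2000 Hz Butterworth band-pass + 50 Hz notch.
import numpy as np
from scipy import signal

FS = 10_000  # samples per second, matching the acquisition rate reported above

def preprocess_emg(raw: np.ndarray) -> np.ndarray:
    # 4th-order Butterworth band-pass, 10-2000 Hz (order is an assumption)
    sos = signal.butter(4, [10, 2000], btype='bandpass', fs=FS, output='sos')
    bandpassed = signal.sosfiltfilt(sos, raw)
    # 50 Hz notch to remove power-line interference (Q factor is an assumption)
    b, a = signal.iirnotch(w0=50, Q=30, fs=FS)
    return signal.filtfilt(b, a, bandpassed)
```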

2.4. Machine Learning for Handgrip State Estimation

A machine learning method was used to look for patterns in the EMG data of two different hand states. In order to extract pertinent properties from the signals, a feature extraction stage was first constructed. As shown in Table 2, once the EMG signal had been filtered, we computed the four time-domain characteristics, including MAV, WAMP, VAR, and WL. These feature extraction approaches are often used for hand pattern recognition algorithms [33,34].
An event is the signal recorded while the subject repeatedly performed the grasping action. Each subject produced 20 occurrences, each lasting three seconds. Over the entire experiment (30 subjects × 20 repetitions), there were a total of 600 occurrences. For each event, we computed the four features indicated above on each of the two synchronously acquired EMG channels, giving each event a total of eight features. The convention for feature identification is shown in Table 3.
From the feature space, we trained two classification methods to recognize the grasping state: the Support Vector Machine (SVM) and k-nearest neighbors (k-NN). These supervised algorithms were optimized in a training phase using 70% of the events and evaluated in a testing phase using the remaining 30%. In addition, we tested various classifier configurations to determine which one performed best. Five kernel functions were examined for the SVM classifier. Although SVM is a binary classifier, one-vs.-one and one-vs.-all multiclass techniques were also considered. For the k-NN classifier, we varied the distance metric and the number of neighbors, as shown in Table 4.
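A compact scikit-learn sketch of this comparison is given below: a stratified 70/30 split of the eight-feature events, followed by training and scoring of SVM and k-NN configurations approximating those of Table 4; the variable names and the exact kernel/neighbor settings are assumptions, since the original classifiers were implemented in LabVIEW.

```python
# Illustrative 70/30 comparison of SVM and k-NN configurations on the 8-feature events.
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

def evaluate(X, y):
    # X: (n_events, 8) feature matrix, y: grasp/release labels (placeholders).
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.30, stratify=y)
    configs = {
        'SVM-linear':    SVC(kernel='linear'),
        'SVM-quadratic': SVC(kernel='poly', degree=2),
        'SVM-cubic':     SVC(kernel='poly', degree=3),
        'SVM-rbf':       SVC(kernel='rbf'),
        'KNN-fine':      KNeighborsClassifier(n_neighbors=1),
        'KNN-weighted':  KNeighborsClassifier(n_neighbors=10, weights='distance'),
    }
    for name, clf in configs.items():
        model = make_pipeline(StandardScaler(), clf)  # scale features before fitting
        model.fit(X_tr, y_tr)
        print(name, model.score(X_te, y_te))          # accuracy on the testing set
```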

3. Results

This section presents the experimental outcomes obtained with machine learning algorithms for the classification of the two relevant hand gestures by adhering to the acquisition technique indicated in Figure 1.
Eight time-domain features were derived from the EMG signal events and tested in ten classifier configurations—five based on SVM and five based on KNN. First, 1200 events were used to train the classifiers in order to identify the best parameters for minimizing the error ratio between each event’s estimated and actual label. The effectiveness of the techniques was then assessed for the testing set’s 580 events.
For classifier evaluation, a 2 × 2 confusion matrix was built, counting the events that were correctly and erroneously categorized. The sensitivity, specificity, and accuracy of the evaluated approaches were computed from the true positive rate, the true negative rate, and the overall rate of correct classifications, respectively. Moreover, Receiver Operating Characteristic (ROC) curves were created by plotting sensitivity (S) versus 1-specificity (FPR) for the best four classifiers, as shown in Figure 9. The average classifier performances were between 0.69 and 0.93 in sensitivity and 0.01 and 0.06 in 1-specificity. In terms of accuracy, the KNN1 and KNN4 classifiers demonstrated the best performance. Overall, as shown in Table 5, we found that the best classifiers had an average sensitivity of 91%, specificity of 97%, and accuracy of 98%.
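For reference, the sketch below shows how these metrics follow from a 2 × 2 confusion matrix; y_true and y_pred are placeholders for the test-set labels and the classifier predictions.

```python
# Sensitivity, specificity, and accuracy derived from a binary 2x2 confusion matrix.
from sklearn.metrics import confusion_matrix

def report(y_true, y_pred):
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    sensitivity = tp / (tp + fn)                  # true positive rate
    specificity = tn / (tn + fp)                  # true negative rate
    accuracy = (tp + tn) / (tp + tn + fp + fn)    # overall correct-classification rate
    return sensitivity, specificity, accuracy
```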
In this study, we developed a novel wearable robotic hand exoskeleton with several joints, more active DOFs, larger ranges of motion for most joints, and the ability to be autonomously manipulated by the motion intention. This hand exoskeleton can independently drive the five fingers and satisfies the requirements of hand function rehabilitation. Additionally, the exoskeleton’s mechanical design makes it possible for the hand to perform pinching and gripping actions. The anatomical and functional qualities of the human hand served as inspiration for this design. According to the results, the classification accuracy of the hand exoskeleton control system based on sEMG signals was high.
The first step consists of assembling the 3D-printed components; the participant is asked to wear the main control box and the finger parts. Secondly, the fishing wires are installed in the guiding-wire components and attached to the pulley plugged onto the stepper motor shaft. This step is repeated five times to set all finger movements. The fishing wires should be well adjusted, ensuring a compact mechanical structure without backlash. The final step deals with installing the surface electrodes and powering the control system. In order to avoid any unwanted movements, the control system is started with some delay before the training begins. At this time, participants can begin training by exercising the healthy forearm. The desired movement is applied in real time to the second forearm by actuating the exoskeleton so that it imitates the healthy side. As illustrated by Figure 10, the preliminary test of the active hand exoskeleton device shows that the user can perform the grasping movement with high flexibility.
Regarding the evaluation of the designed hand exoskeleton, 10 healthy participants (from the 30 participants previously mentioned) took part in this study. The test subjects rated the hand exoskeleton device, ranking each item on a scale from 1 (not at all satisfied) to 5 (extremely satisfied) and considering the exoskeleton’s dimensions, weight, adjustments, safety, simplicity of use, comfort, and effectiveness (Table 6). The majority of participants were satisfied with the exoskeleton’s performance; it was the most highly ranked of the seven items, with an average score of about 4.7. However, several participants complained that the stepper motors made the hand exoskeleton’s dimensions somewhat cumbersome and difficult to adjust manually. The participants were generally satisfied with the developed hand exoskeleton device. Additionally, because the sEMG-based control system was simple to understand, none of the participants requested more time to configure the system. None of the participants expressed any discomfort during the trial, and no side effects, such as pressure ulcers, were observed.

4. Discussion

The overall system contains three main parts: the first is the acquisition system composed of the MyoWare EMG sensor and the NI (National Instruments) real-time board connected to the laptop. The second component is the LabVIEW software running the classification algorithms. The desired movement is sent to the hand exoskeleton device via the USB protocol. The Arduino board receives the desired variables and actuates the hand exoskeleton by running the stepper motors. The current consumed by each stepper motor is acquired, providing a closed-loop control system that guarantees that each finger is fully flexed or fully extended. Although this control method is very conventional, the grasping force shows good performance; however, the use of flexion sensors could be a good alternative for finger angle control. As shown in Figure 11, the user can grasp, hold, and release objects based on the desired intention. The overall system shows good latency (less than 2 s) to apply the desired movement thanks to the real-time acquisition system and the high-speed stepper motors. As shown in Table 7, the latency is mainly due to the mechanical constraints of the structure of the hand exoskeleton. However, if we quantify the maximum latency from the start of acquisition until a decision is generated (including EMG acquisition and processing, classification, and decision-making), the results in Table 7 demonstrate the real-time performance of the designed system, which is consistent with previous studies showing that real-time constraints are the most challenging issue for sEMG-controlled hand exoskeletons [35,36].
The top-performing classifiers (KNN1 and KNN4) had less-than-ideal online classification accuracy, although this is consistent with earlier research [37]. Different classifiers should be chosen and used separately, as the classification accuracy is typically not the same for each person [38,39]. The accuracy of identifying several activities for the same person may likewise vary greatly. Muscle contractions varied from subject to subject, indicating that different people may activate their muscles in different ways to carry out the same function. Depending on the person, different hand gesture predictions may have varying levels of accuracy [10]. Although the implemented classifiers show good accuracy for intention detection, participants can encounter some problems in training after executing several cycles, mainly due to muscle fatigue of the healthy hand. This issue will be addressed in future work by considering muscle fatigue in the classification algorithms.
Comparing the hand exoskeleton device developed during this work to other exoskeleton systems recently developed [40,41], some features and benefits stand out. Indeed, there are certain structural benefits to the developed exoskeleton device. Above all, it was made using 3D printing technology with a straightforward control system, which made it lighter than the majority of recently developed equipment. The exoskeleton that was attached to the hand weighed about 180 g in total (95 g for the 3D-printed components and 80 g for the stepper motors).
Since the majority of hand usage in regular activities is supervised by sight, visual feedback was left in place for the experiment. Moreover, any effect of visual input on the outcomes is limited because all individuals received the same visual feedback during the hand tasks. The purpose of this study was to demonstrate a newly created wearable robotic hand exoskeleton that can be freely directed by movement intention based on the processed EMG signals. However, further research is required before conducting any clinical tests because this is only a preliminary study showcasing a unique exoskeleton design. The sample size was adequate to show that the approaches performed well; however, a larger database would be required for a clinical investigation. EMG signals from stroke patients in particular should be thoroughly examined since they may have altered shapes and more varied behavior [42]. To demonstrate the effectiveness of this new exoskeleton for patients with neuromuscular diseases, such as stroke patients, we plan to conduct clinical research soon.

5. Conclusions

This work presented a new design of a hand exoskeleton device that interacts with human hand motions through EMG signals. Both the parameters used in the classifiers and the features extracted from the EMG signals were adjusted in order to improve classification accuracy for precise control. To maximize the usefulness of the features while minimizing computing complexity, the shape of waveforms collected from various muscle bundles was considered while extracting EMG features. The obtained results demonstrate good performance in terms of real-time requirements and classification accuracy. Although the obtained results are consistent with related previous works, subject-dependent EMG classification should be addressed further. In addition, the designed rehabilitation system was tested only with healthy subjects. Therefore, in order to demonstrate the effectiveness of this new exoskeleton for patients with neuromuscular diseases, such as stroke patients, clinical tests should be conducted in the future.

Author Contributions

Conceptualization, I.B.A. and Y.B.; methodology, I.B.A. and Y.B.; software, I.B.A.; validation, I.B.A.; formal analysis, Y.B.; investigation, Y.B.; resources, Y.B.; data curation, I.B.A.; writing—original draft preparation, I.B.A.; writing—review and editing, I.B.A. and Y.B.; supervision, Y.B.; project administration, Y.B. All authors have read and agreed to the published version of the manuscript.

Funding

This study is supported via funding from Prince Sattam bin Abdulaziz University project number (PSAU/2023/R/1444).

Institutional Review Board Statement

The animal protocols used in this work were evaluated and approved by the Animal Use and Ethic Committee (CEUA) of the Institute Pasteur Montevideo (Protocol 2009_1_3284). They are in accordance with FELASA guidelines and the National Law for Laboratory Animal Experimentation (Law no. 18.611).

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Thrift, A.G.; Cadilhac, D.A.; Thayabaranathan, T.; Howard, G.; Howard, V.J.; Rothwell, P.M.; Donnan, G.A. Global stroke statistics. Int. J. Stroke 2017, 12, 13–32. [Google Scholar] [CrossRef]
  2. Ben Abdallah, I.; Bouteraa, Y.; Rekik, C. Design and development of 3D printed myoelectric robotic exoskeleton for hand rehabilitation. Int. J. Smart Sens. Intell. Syst. 2017, 10, 1–26. [Google Scholar] [CrossRef]
  3. Bouteraa, Y.; Ben Abdallah, I.; Elmogy, A.M. Training of hand rehabilitation using low cost exoskeleton and vision-based game interface. J. Intell. Robot. Syst. 2019, 96, 31–47. [Google Scholar] [CrossRef]
  4. Bouteraa, Y.; Ben Abdallah, I.; Alnowaiser, K.; Islam, M.R.; Ibrahim, A.; Gebali, F. Design and Development of a Smart IoT-Based Robotic Solution for Wrist Rehabilitation. Micromachines 2022, 13, 973. [Google Scholar] [CrossRef] [PubMed]
  5. Bouteraa, Y.; Ben Abdallah, I.; Alnowaiser, K.; Ibrahim, A. Smart solution for pain detection in remote rehabilitation. Alex. Eng. J. 2021, 60, 3485–3500. [Google Scholar] [CrossRef]
  6. Bouteraa, Y.; Ben Abdallah, I.; Ibrahim, A.; Ahanger, T.A. Fuzzy logic-based connected robot for home rehabilitation. J. Intell. Fuzzy Syst. 2021, 40, 4835–4850. [Google Scholar] [CrossRef]
  7. Gerez, L.; Gao, G.; Dwivedi, A.; Liarokapis, M. A hybrid, wearable exoskeleton glove equipped with variable stiffness joints, abduction capabilities, and a telescopic thumb. IEEE Access 2020, 8, 173345–173358. [Google Scholar] [CrossRef]
  8. Sarac, M.; Solazzi, M.; Frisoli, A. Design requirements of generic hand exoskeletons and survey of hand exoskeletons for rehabilitation, assistive, or haptic use. IEEE Trans. Haptics 2019, 12, 400–413. [Google Scholar] [CrossRef]
  9. Gasser, B.W.; Martinez, A.; Sasso-Lance, E.; Kandilakis, C.; Durrough, C.M.; Goldfarb, M. Preliminary assessment of a hand and arm exoskeleton for enabling bimanual tasks for individuals with hemiparesis. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 2214–2223. [Google Scholar] [CrossRef]
  10. Cote-Allard, U.; Fall, C.L.; Drouin, A.; Campeau-Lecours, A.; Gosselin, C.; Glette, K.; Laviolette, F.; Gosselin, B. Deep learning for electromyographic hand gesture signal classification using transfer learning. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 760–771. [Google Scholar] [CrossRef]
  11. Li, K.; Wei, N.; Cheng, M.; Hou, X.; Song, J. Dynamical coordination of hand intrinsic muscles for precision grip in diabetes mellitus. Sci. Rep. 2018, 8, 4365. [Google Scholar] [CrossRef]
  12. Yang, S.-H.; Koh, C.-L.; Hsu, C.-H.; Chen, P.-C.; Chen, J.-W.; Lan, Y.-H.; Yang, Y.; Lin, Y.-D.; Wu, C.-H.; Liu, H.-K.; et al. An instrumented glove-controlled portable hand-exoskeleton for bilateral hand rehabilitation. Biosensors 2021, 11, 495. [Google Scholar] [CrossRef]
  13. Yoo, H.-J.; Lee, S.; Kim, J.; Park, C.; Lee, B. Development of 3D-printed myoelectric hand orthosis for patients with spinal cord injury. J. Neuroeng. Rehabil. 2019, 16, 162. [Google Scholar] [CrossRef]
  14. Cisnal, A.; Perez-Turiel, J.; Fraile, J.-C.; Sierra, D.; de la Fuente, E. RobHand: A hand exoskeleton with real-time EMG-driven embedded Control. Quantifying hand gesture recognition delays for bilateral rehabilitation. IEEE Access 2021, 9, 137809–137823. [Google Scholar] [CrossRef]
  15. Leonardis, D.; Barsotti, M.; Loconsole, C.; Solazzi, M.; Troncossi, M.; Mazzotti, C.; Castelli, V.P.; Procopio, C.; Lamola, G.; Chisari, C.; et al. An EMG-controlled robotic hand exoskeleton for bilateral rehabilitation. IEEE Trans. Haptics 2015, 8, 140–151. [Google Scholar] [CrossRef]
  16. Li, K.; Li, Z.; Zeng, H.; Wei, N. Control of newly-designed wearable robotic hand exoskeleton based on surface Electromyographic signals. Front. Neurorobotics 2021, 15, 711047. [Google Scholar] [CrossRef]
  17. Bassa, M.M. Development of the Communication System for a Lower Limb Human Exoskeleton Using the Ros Middleware. Master’s Thesis, Università Degli Studi di Padova, Padua, Italy, 2015. [Google Scholar]
  18. Zhao, L.; Yang, T.; Yu, P.; Yang, Y. An exoskeleton-based master device for dual-arm robot teleoperation. In Proceedings of the 2020 Chinese Automation Congress (CAC), Shanghai, China, 6–8 November 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 5316–5319. [Google Scholar]
  19. Boheng, W.; Sheng, C.; Bo, Z.; Zhiwei, L.; Xiang, G. Design and implementation of shoulder exoskeleton robot. In Social Robotics: 10th International Conference, ICSR 2018, Qingdao, China, 28–30 November 2018, Proceedings 10; Springer International Publishing: Berlin/Heidelberg, Germany, 2018; pp. 241–252. [Google Scholar]
  20. Abdelbar, M.; Mohamed, I.; Abdellatif, A.; Hegaze, M.M. Towards the Mechatronic Development of a New Upper-Limb Exoskeleton (SAMA). Designs 2022, 6, 80. [Google Scholar] [CrossRef]
  21. Yu, H.; Li, H.; Wang, Y. Teleoperation of a Dexterous Hand Using a Wearable Hand. In Cognitive Systems and Information Processing: 7th International Conference, ICCSIP 2022, Fuzhou, China, 17–18 December 2022, Revised Selected Papers; Springer: Singapore, 2023; pp. 532–544. [Google Scholar]
  22. Pavón-Pulido, N.; López-Riquelme, J.A.; Feliú-Batlle, J.J. Intelligent Control of an Exoskeleton for Rehabilitation Purposes Using a ROS-Based Software Architecture. In Progress in Artificial Intelligence: 19th EPIA Conference on Artificial Intelligence, EPIA 2019, Vila Real, Portugal, 3–6 September 2019, Proceedings, Part I 19; Springer International Publishing: Berlin/Heidelberg, Germany, 2019; pp. 349–360. [Google Scholar]
  23. Stavrinos, G. ROS2 for ROS1 users. In Robot Operating System (ROS) The Complete Reference; Springer International Publishing: Berlin/Heidelberg, Germany, 2021; Volume 5, pp. 31–42. [Google Scholar]
  24. Kay, J.; Tsouroukdissian, A.R. Real-Time Control in ROS and ROS 2.0; ROSCon15; Open Source Robotics Foundation: Mountain View, CA, USA, 2015. [Google Scholar]
  25. Quigley, M.; Conley, K.; Gerkey, B.; Faust, J.; Foote, T.; Leibs, J.; Wheeler, R.; Ng, A.Y. ROS: An open-source Robot Operating System. ICRA Workshop Open Source Softw. 2009, 3, 5. [Google Scholar]
  26. Lentin, J. Robot Operating System for Absolute Beginners; Apress: Berkeley, CA, USA, 2018. [Google Scholar]
  27. Maruyama, Y.; Kato, S.; Azumi, T. Exploring the performance of ROS2. In Proceedings of the 13th International Conference on Embedded Software, Chengdu, China, 13–14 August 2016; pp. 1–10. [Google Scholar]
  28. Macenski, S.; Foote, T.; Gerkey, B.; Lalancette, C.; Woodall, W. Robot Operating System 2: Design, architecture, and uses in the wild. Sci. Robot. 2022, 7, eabm6074. [Google Scholar] [CrossRef]
  29. Stegeman, D.; Hermens, H. Standards for Surface Electromyography: The European Project Surface EMG for Non-Invasive Assessment of Muscles (SENIAM); Roessingh Research and Development: Enschede, The Netherlands, 2007; Volume 10, pp. 8–12. [Google Scholar]
  30. Phinyomark, A.; Hirunviriya, S.; Limsakul, C.; Phukpattaranont, P. Evaluation of EMG feature extraction for hand movement recognition based on Euclidean distance and standard deviation. In Proceedings of the ECTI-CON2010: The 2010 ECTI International Confernce on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, Chiang Mai, Thailand, 19–21 May 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 856–860. [Google Scholar]
  31. Tuncer, T.; Dogan, S.; Subasi, A. Surface EMG signal classification using ternary pattern and discrete wavelet transform based feature extraction for hand movement recognition. Biomed. Signal Process. Control 2020, 58, 101872. [Google Scholar] [CrossRef]
  32. Bouteraa, Y.; Ben Abdallah, I.; Boukthir, K. A New Wrist–Forearm Rehabilitation Protocol Integrating Human Biomechanics and SVM-Based Machine Learning for Muscle Fatigue Estimation. Bioengineering 2023, 10, 219. [Google Scholar] [CrossRef] [PubMed]
  33. Al-Angari, H.M.; Kanitz, G.; Tarantino, S.; Cipriani, C. Distance and mutual information methods for EMG feature and channel subset selection for classification of hand movements. Biomed. Signal Process. Control 2016, 27, 24–31. [Google Scholar] [CrossRef]
  34. Too, J.; Abdullah, A.R.; Saad, N.M.; Tee, W. EMG feature selection and classification using a Pbest-guide binary particle swarm optimization. Computation 2019, 7, 12. [Google Scholar] [CrossRef]
  35. Chen, C.; Chai, G.; Guo, W.; Sheng, X.; Farina, D.; Zhu, X. Prediction of finger kinematics from discharge timings of motor units: Implications for intuitive control of myoelectric prostheses. J. Neural Eng. 2019, 16, 026005. [Google Scholar] [CrossRef]
  36. Parajuli, N.; Sreenivasan, N.; Bifulco, P.; Cesarelli, M.; Savino, S.; Niola, V.; Esposito, D.; Hamilton, T.J.; Naik, G.R.; Gunawardana, U.; et al. Real-time EMG based pattern recognition control for hand prostheses: A review on existing methods, challenges and future implementation. Sensors 2019, 19, 4596. [Google Scholar] [CrossRef]
  37. Furui, A.; Eto, S.; Nakagaki, K.; Shimada, K.; Nakamura, G.; Masuda, A.; Chin, T.; Tsuji, T. A myoelectric prosthetic hand with muscle synergy–based motion determination and impedance model–based biomimetic control. Sci. Robot. 2019, 4, eaaw6339. [Google Scholar] [CrossRef]
  38. Xiloyannis, M.; Gavriel, C.; Thomik, A.A.C.; Faisal, A.A. Gaussian process autoregression for simultaneous proportional multi-modal prosthetic control with natural hand kinematics. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 1785–1801. [Google Scholar] [CrossRef]
  39. Dwivedi, A.; Kwon, Y.; McDaid, A.J.; Liarokapis, M. A learning scheme for EMG based decoding of dexterous, in-hand manipulation motions. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 2205–2215. [Google Scholar] [CrossRef]
  40. Yu, S.; Perez, H.; Barkas, J.; Mohamed, M.; Eldaly, M.; Huang, T.-H.; Yang, X.; Su, H.; Cortes, M.D.M.; Edwards, D.J. A soft high force hand exoskeleton for rehabilitation and assistance of spinal cord injury and stroke individuals. In Frontiers in Biomedical Devices; American Society of Mechanical Engineers: New York, NY, USA, 2019; Volume 41037, p. V001T09A011. [Google Scholar]
  41. Yun, Y.; Dancausse, S.; Esmatloo, P.; Serrato, A.; Merring, C.A.; Agarwal, P.; Deshpande, A.D. Maestro: An EMG-driven assistive hand exoskeleton for spinal cord injury patients. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 2904–2910. [Google Scholar]
  42. Jiang, N.; Dosen, S.; Muller, K.-R.; Farina, D. Myoelectric control of artificial limbs—is there a need to change focus? [In the spotlight]. IEEE Signal Process. Mag. 2012, 29, 152–150. [Google Scholar] [CrossRef]
Figure 1. System components.
Figure 2. High level description of the proposed system.
Figure 3. Task scheduling in parallel processing.
Figure 4. ROS-based hardware architecture.
Figure 5. ROS2 architecture overview.
Figure 6. 3D CAD model of the hand exoskeleton.
Figure 7. Assembled 3D-printed bionic hand.
Figure 8. The sensor location of the EMG electrodes on the human forearm, including the anterior surface and posterior surface.
Figure 9. Receiver Operating Characteristic (ROC) curves for classification: (a) opened hand and (b) closed hand.
Figure 10. Preliminary tests of classification.
Figure 11. Experimental results: (a) grasp object, (b) hold grasping, and (c) release object.
Table 1. Main components of the hand exoskeleton device.

| Pieces | Material | Quantity |
|---|---|---|
| Thumb (3 pcs) | PLA | 1 |
| Index (4 pcs) | PLA | 1 |
| Middle (4 pcs) | PLA | 1 |
| Ring (4 pcs) | PLA | 1 |
| Little (4 pcs) | PLA | 1 |
| Carpal | PLA | 1 |
| Control box | PLA | 1 |
| Cover | PLA | 1 |
| Motor support | PLA | 1 |
| Pulley | PLA | 5 |
Table 2. Time-domain features used in this study.

| Domain | Feature | Formulation | Information |
|---|---|---|---|
| Time domain | Willison amplitude (WAMP) | $WAMP = \sum_{i=1}^{N-1} f(\lvert x_i - x_{i+1} \rvert)$, with $f(x) = 1$ if $x \geq$ threshold and $f(x) = 0$ if $x <$ threshold | Energy and power |
| Time domain | Mean Absolute Value (MAV) | $MAV = \frac{1}{N} \sum_{i=1}^{N} \lvert x_i \rvert$ | Energy and complexity |
| Time domain | Variance (VAR) | $VAR = \frac{1}{N-1} \sum_{i=1}^{N} x_i^2$ | Energy and power |
| Time domain | Waveform length (WL) | $WL = \sum_{i=1}^{N-1} \lvert x_{i+1} - x_i \rvert$ | Frequency |
Table 3. The eight features that were extracted were recorded as follows: four features × two channels.

| Feature | Channel 1 | Channel 2 |
|---|---|---|
| WAMP | Feature 1 | Feature 2 |
| MAV | Feature 3 | Feature 4 |
| VAR | Feature 5 | Feature 6 |
| WL | Feature 7 | Feature 8 |
Table 4. Configuration of classifiers.

| Classifier | Kernel | Neighbors | Distance |
|---|---|---|---|
| SVM-Config-1 | Linear | – | – |
| SVM-Config-2 | Quadratic | – | – |
| SVM-Config-3 | Cubic | – | – |
| SVM-Config-4 | Fine Gaussian | – | – |
| SVM-Config-5 | Medium Gaussian | – | – |
| KNN-Config-1 | – | Fine (1) | Euclidean |
| KNN-Config-2 | – | Medium (10) | Euclidean |
| KNN-Config-3 | – | Medium (10) | Cosine |
| KNN-Config-4 | – | Medium (10) | Weighted |
| KNN-Config-5 | – | Medium (10) | Cubic |
Table 5. Evaluation of the top-performing classifiers.

| Classifier | Sensitivity | Specificity | Accuracy |
|---|---|---|---|
| Fine k-NN (KNN1) | 90% | 98% | 98% |
| Weighted k-NN (KNN4) | 92% | 96% | 98% |
| Average | 91% | 97% | 98% |
Table 6. Results of the survey on the usage of the hand exoskeleton device.

| Item | Satisfaction (1–5) |
|---|---|
| Dimensions | 3.3 |
| Weight | 3.5 |
| Adjustments | 3.6 |
| Safety | 4.6 |
| Simplicity of use | 4.1 |
| Comfort | 3.9 |
| Effectiveness | 4.7 |
Table 7. Timing analysis and real-time performance.

| Task | EMG Acquisition | EMG Pre-Processing | Classification | Data Fusion and Decision Making | Actuating System |
|---|---|---|---|---|---|
| Used hardware | sbRIO-9637 | Computer | Computer | Computer | Arduino Due |
| Hardware performance | 667 MHz dual-core CPU, 512 MB DRAM, Zynq-7020 FPGA | Intel Core i7 12th Gen, 24 GB RAM | Intel Core i7 12th Gen, 24 GB RAM | Intel Core i7 12th Gen, 24 GB RAM | Atmel SAM3X8E ARM Cortex-M3 CPU |
| Communication protocol | Parallel | Ethernet | DDS | DDS | Serial |
| Software / operating system | NI Linux Real-Time | LabVIEW | LabVIEW | ROS2 | Arduino IDE |
| Software configuration | 10 k samples/s | 100 kHz | 10 kHz | 1 kHz | – |
| Maximum latency | 0.1 ms | 0.01 ms | 0.1 ms | 0.1 ms | 1 ms |

Total latency: 1.31 ms (to generate a decision) + 1.92 s (to apply the decision) < 2 s
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
