Article

Experimental Study of Real-Time Classification of 17 Voluntary Movements for Multi-Degree Myoelectric Prosthetic Hand

Department of Robotics and Mechatronics, Tokyo Denki University, Tokyo 120-8551, Japan
* Author to whom correspondence should be addressed.
Appl. Sci. 2017, 7(11), 1163; https://doi.org/10.3390/app7111163
Submission received: 4 October 2017 / Revised: 31 October 2017 / Accepted: 8 November 2017 / Published: 13 November 2017

Featured Application

This work is intended for the development of myoelectric prosthetic hand systems. Furthermore, the outcome of this study may also benefit other electromyography-based human-machine interfaces.

Abstract

The myoelectric prosthetic hand is a powerful tool developed to help people with upper limb loss restore the functions of a biological hand. Recognizing multiple hand motions from only a few electromyography (EMG) sensors is one of the requirements for developing prosthetic hands with a high level of usability. This task is highly challenging because both the classification rate and the misclassification rate worsen as hand motions are added. This paper presents a signal processing technique that uses spectral features and an artificial neural network to classify 17 voluntary movements from EMG signals. The main highlight is the use of a small set of low-cost EMG sensors to classify a reasonably large number of hand movements. The aim of this work is to extend the capability to recognize and produce multiple movements beyond what is currently feasible. This work also shows and discusses how tailoring the number of hand motions to a specific task can help develop a more reliable prosthetic hand system. Online classification experiments were conducted on seven male and five female participants to evaluate the validity of the proposed method. The proposed algorithm achieves an overall correct classification rate of up to 83%, demonstrating the potential to classify 17 movements from 6 EMG sensors. Furthermore, classifying 9 motions with this method achieved an accuracy of up to 92%. These results show that if the prosthetic hand is intended for a specific task, limiting the number of motions can significantly increase performance and usability.

Graphical Abstract

1. Introduction

The hand is one of the most important parts of the human body. It is involved in almost all of the activities of our daily living [1]. The dexterity of the hand not only enables us to perform delicate tasks but also lets us regard a held object as a part of our body and manipulate it to execute new tasks that are not possible with the hand alone. In addition, the fingertip is equipped with fine sensors that allow the sensing of pressure, temperature, pain, etc. For these reasons, amputation of the hand causes both psychological trauma and functional impairment, because the person becomes unable to perform most daily tasks and is troubled by the change in their appearance [2]. According to the 2006 survey on persons with physical disability conducted by the Ministry of Health, Labour and Welfare of Japan, there are approximately 80 thousand upper limb amputees in Japan [3]. Prosthetic arms/hands were developed to help these people by restoring, as much as possible, the functions of a biological hand and its natural appearance. The myoelectric prosthetic hand/arm is a type of externally powered prosthesis controlled by electrical signals generated during voluntary muscle contraction. The measurement of muscle activity via electric potential is referred to as electromyography (EMG). Since EMG signals can be recorded by placing surface electrodes on top of the skin, this electrical potential can provide a control signal for prosthetic limbs that is relatively intuitive and non-invasive.
Over recent decades, extensive research on myoelectric prosthetic hands/arms has been conducted by many research groups across the globe, rapidly improving the overall functionality and reliability of myoelectric hands each year. However, despite these technological advances, the capability of many existing myoelectric hands to recognize and produce extensive movements is still primitive. The limited capability to execute many movements, or to provide the user with control of multiple degrees of freedom (DOF), consequently limits the number of tasks the user can perform, and is one of the causes of rejection or abandonment of a prosthesis [4,5]. Recently, owing to substantial progress in the mechanical development of prosthetic hands, several complex, articulated (multi-DOF) myoelectric hands have become available on the market. The Bebionic hand [6] and the iLimb hand [7] are examples of articulated prosthetic hands. Other multi-DOF prosthetic hands such as the Vanderbilt hand [8], the UNB hand [9], the Yale hand [10], the Smart hand [11], the DLR/HIT hand [12], the Keio hand [13], and the 6-DOF hand [14] are also being developed as research devices.
With the emergence of complex multi-DOF prosthetic hands, a control system that can distinguish and produce a large number of movements is needed to allow the execution of more complex daily tasks. This study presents a signal processing technique to classify 17 forearm motions (with a rest state included) using 6 EMG sensors. The aim is to explore and determine the extent of hand movement recognition capabilities for myoelectric prosthetic hands with currently available hardware. The novelty of this work is in providing the basic methodology to classify not only wrist movements (grasping, opening, flexing, etc.) but also individual flexion and extension of the five fingers. It extends the capability to recognize multiple motions beyond what is currently feasible (3–10 motions) [15,16,17,18,19]. The experiments are conducted on fully-limbed participants, as we believe that this defines the upper bound of performance. Furthermore, this work also shows and discusses how tailoring the number of hand motions to a specific task can help increase the performance and usability of a prosthetic hand system.
This paper is organized as follows. Section 2 gives a literature review of EMG-based pattern recognition and highlights the features of this study. Section 3 describes the system architecture and the procedures used to acquire EMG signals; details of the necessary preprocessing steps are also given in this section. Section 4 explains in detail the feature extraction method based on spectral analysis and the classification algorithm based on an artificial neural network. Section 5 evaluates the proposed algorithm; the classification accuracy of the proposed system is presented and discussed. Section 6 concludes this paper.

2. Related Works

To myoelectrically control a dexterous hand, it is necessary to map EMG signals (corresponding to different muscle contractions) to the available degrees of freedom using pattern recognition based algorithms. To this end, researchers have been working on various aspects of EMG classification, such as electrode placement, selection of suitable feature extraction techniques, and selection of suitable classification algorithms, to improve the classification accuracy and increase the number of controllable functions (motions).
Electromyography is usually performed by placing several electrodes on the skin. Over the past decades, different electrode placement strategies have been investigated. Some researchers study the use of multichannel electrode arrays [20] or high-density EMG (a large number of electrodes) [21,22], while others explore precise anatomical positioning [18,23].
In the pattern recognition based control approach, feature extraction and classification are the two key steps in achieving high classification performance. Feature extraction transforms the raw EMG signals into feature vectors that represent the different motions. Several feature extraction methods have been suggested, and these features can be sorted into time domain features [17], frequency domain features [18], and time-frequency domain features [24]. Time domain features are generally computed from the time-varying signal amplitude. Frequency domain features, on the other hand, capture the power spectral density of the signals and are computed by parametric methods or a periodogram. Time-frequency domain features combine temporal and frequency information; they can characterize varying frequency content at different time locations, providing rich non-stationary information about the EMG signals. A list of specific features in each category can be found in [25]. Classification of arm/hand motions based on the extracted features can be performed by a large variety of methods, such as linear discriminant analysis [26], support vector machines [18], or artificial neural networks [17].
Using these state-of-the-art approaches, many studies concentrate on forearm and wrist movements such as flexion/extension, pronation/supination, and sometimes hand grasping/opening (all fingers simultaneously). However, since these systems can only produce a small number of motions, they have yet to fully mimic the human hand. With the increasing number of multi-DOF prosthetic hands on the market, additional motions are needed to dexterously control these devices in daily activities. In one survey, ten participants with upper limb deficiency were asked what movements or features would be useful in a future prosthetic hand [27]. All participants wanted to point with an extended finger, 90% of them desired individual control of the fingers, and 70% responded that wrist flexion and extension would be beneficial.
Control of the fingers using EMG is extremely challenging compared to forearm/wrist movements, because the amplitudes of EMG signals for finger movements are generally smaller than those of arm/wrist movements. Furthermore, since the muscles controlling finger movements lie in the intermediate and deep layers of the forearm [28], attenuation of EMG signals by the forearm tissues can be observed. Therefore, multiple electrodes are needed to provide enough information to distinguish the intended movement. In recent years, studies on finger movement control have started to appear in the literature, but their number is still relatively small compared to studies on the classification of arm/wrist movements. Moreover, arm/wrist movements and finger movements are rarely investigated together. Both, however, are needed in daily activities, and it is therefore desirable to develop a system that can classify both wrist and finger movements.
Figure 1 shows an overview of recent studies on EMG-based pattern recognition. The number of electrodes used in each study is plotted against the number of classes. As can be seen, the number of electrodes is greater than or equal to the number of classes in nearly all of these studies. Also, almost all of the current works deal with only a small number of classes (fewer than 10) [11,17,18,19,20,23,29,30]. Only a very small number of studies have reported classification results for more than 10 classes, and they require a large number of electrodes [21,31,32,33]. This option may not be feasible for an amputee, because a large number of electrodes requires a large surface on the forearm, which may not be available after amputation. Furthermore, a large number of electrodes increases the cost and complexity of the required hardware and raises the computational load, which slows down the processing speed of the system [34]. As such, it is desirable to reduce the number of electrodes while retaining the ability to classify a large number of movements. It is also worth noting that most existing studies involve sophisticated clinical grade sensor equipment that is expensive, making practical everyday use difficult or prohibitive.
For the purpose of developing a low-cost EMG control system that can classify large numbers of movements, this study presents a signal processing technique to classify 17 wrist and finger movements (with a rest state included) using 6 consumer grade EMG sensors. This study will tackle the following research questions:
  • How well can the state-of-the-art classification scheme classify 17 types of movements?
  • Can a consumer grade EMG sensor serve as a modality in myoelectric prosthetic control?
  • How long can the user control the prosthetic hand before the accuracy drops significantly?
To answer the first question, a classification scheme using spectral features and a multilayer perceptron as the classifier will be evaluated and discussed. To answer the second question, a consumer grade acquisition device (sampling frequency of 200 Hz) will be used in the experiment. The answers to these two questions will indicate the extent of hand movement recognition capabilities with limited hardware. The answer to the third question will give a hint of when classifiers may need to be recalibrated.

3. Data Acquisition and Preprocessing of EMG Signals

The aim of this study is to classify 17 types of motion from EMG signals. The 17 motions include 6 forearm/wrist movements (pronation, supination, flexion, extension, hand grasping, and hand opening) as shown in Figure 2, flexion and extension of the five fingers, and a rest state (no movement). In this work, an EMG-based control system is designed as illustrated in Figure 3 to achieve the aim of classifying these 17 motions. As depicted in Figure 3, the system consists of three subsystems: the data acquisition system, the EMG signal processing system, and the interface system. The data acquisition system records EMG signals in a noninvasive fashion using six EMG sensors (IDPAD series manufactured by Oisaka Electronic Equipment Ltd., Hiroshima, Japan). These signals are sampled at 200 Hz using a 12-bit analog-digital converter (ADC) and then sent to the signal processing system for further processing. The EMG signal processing system processes the EMG signals and classifies the arm/hand motion. The interface system displays the EMG signal waveforms, movement cues, and classification results in real time. The software is developed in the MATLAB/Simulink® (MathWorks, Natick, MA, USA) environment. The proposed algorithm is executed after being built by the Simulink Real-Time Workshop.
In this study, six EMG sensors are positioned on the flexor carpi radialis, flexor carpi ulnaris, flexor digitorum profundus, flexor pollicis longus, extensor carpi radialis longus, and extensor pollicis longus muscles of the left arm, as shown in Figure 4 [35]. The functions of each muscle are summarized in Table 1. Muscles acting on the hand can be divided into two groups: extrinsic and intrinsic muscles. Intrinsic muscles are located within the hand itself, whereas extrinsic muscles are located in the anterior and posterior compartments of the forearm. Extrinsic muscles are responsible for crude movements of the hand, whereas intrinsic muscles control fine movements. Controlling the prosthesis using intrinsic hand muscles has the advantage of providing finger control independent of wrist motion [32], but depending on the level of amputation, the required region may not be available. For this reason, this study concentrates only on extrinsic muscles.
According to the standards for reporting EMG data, significant EMG activity occurs in the 5–450 Hz bandwidth [36]. To analyze EMG signals in this bandwidth, most studies use a clinical grade EMG acquisition device that samples data at 1000 Hz or higher. However, these instruments are expensive, making practical everyday use difficult or prohibitive. To overcome this issue, this study takes on the challenge of using relatively low-cost consumer grade EMG sensors with a lower sampling rate to decode a large number of wrist and finger motions. The results of this work will help determine the validity of low-cost EMG sensors as a modality in myoelectric prosthetic control.
The acquired raw EMG (rEMG) signals come with a DC offset of 2.5 V, which must be removed before any analysis can be performed. The offset is removed by demeaning, i.e., subtracting the average amplitude of the rEMG signal during the first 10 s from the whole signal. The preprocessed EMG (pEMG) signals are later used in the feature extraction process. An example of raw EMG signals during the execution of the 17 motions is shown in Figure 5.
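The demeaning step above can be sketched in a few lines of NumPy. The 10 s calibration window and the 2.5 V offset follow the text, while the function and variable names are our own illustration, not the authors' code.

```python
import numpy as np

FS = 200  # sampling rate of the consumer grade sensors (Hz)

def remove_dc_offset(raw_emg, calib_seconds=10):
    """Demean raw EMG: subtract the per-channel mean of the first
    `calib_seconds` of signal from the whole recording.
    `raw_emg` has shape (n_samples, n_channels)."""
    n_calib = calib_seconds * FS
    offset = raw_emg[:n_calib].mean(axis=0)  # per-channel DC level (~2.5 V)
    return raw_emg - offset
```

Estimating the offset per channel (rather than one global value) accounts for small gain differences between the six sensors.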

4. Signal Processing Algorithm for Classification of 17 Motions

In this section, the proposed signal processing algorithm to classify the 17 motions is described. First, the pEMG signals are transformed from the time domain to the frequency domain by a fast Fourier transform (FFT) with a Hamming window function. Next, power spectral densities (PSDs) are calculated from the FFT results. Feature points that best characterize each motion are then selected from the PSDs of each channel to form feature vectors. Finally, the 17 motions are classified by feeding the feature vectors into an artificial neural network (ANN) classifier. The details of the proposed method are described in the following sections.

4.1. Feature Extraction Based on Spectral Analysis

After removing the DC offset, PSDs are calculated by applying the FFT to the pEMG signals. In this step, a 32-sample Hamming window with 75% overlap is used. Next, feature points for each motion are extracted from the PSDs to construct a feature vector. In this study, the spectral magnitudes at 6 frequencies are used as features. Given that the Nyquist frequency is 100 Hz (the sampling frequency of the equipment is 200 Hz) and the frequency resolution is 6.25 Hz, one frequency is chosen from each of the following ranges: 10–20 Hz, 20–30 Hz, 30–40 Hz, 60–70 Hz, 70–80 Hz, and 80–90 Hz. The 40–50 Hz and 50–60 Hz ranges are not considered because they contain the power-line interference frequencies. Based on this selection, the elements of the feature vectors are the log-transformed spectral magnitudes at 18.75 Hz, 25 Hz, 31.25 Hz, 68.75 Hz, 75 Hz, and 81.25 Hz of each channel. Thus, the feature vector of the jth motion is a 36-dimensional vector and can be expressed as
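As a concrete sketch of this feature extraction step, the snippet below (our illustration, not the authors' code) applies a 32-sample Hamming window, computes the one-sided power spectrum, and keeps the log magnitudes at the six frequencies named above; with a 6.25 Hz resolution these fall on FFT bins 3–5 and 11–13.

```python
import numpy as np

FS, N_FFT = 200, 32                        # sampling rate (Hz), window length
FEATURE_FREQS = [18.75, 25.0, 31.25, 68.75, 75.0, 81.25]    # Hz, from the text
BIN_IDX = [round(f / (FS / N_FFT)) for f in FEATURE_FREQS]  # FFT bin indices

def extract_features(window):
    """Log-PSD features of one 32-sample analysis window.
    `window` has shape (N_FFT, n_channels); returns a flat vector of
    6 features per channel (36 elements for 6 channels)."""
    tapered = window * np.hamming(N_FFT)[:, None]
    psd = np.abs(np.fft.rfft(tapered, axis=0)) ** 2 / N_FFT
    return np.log(psd[BIN_IDX] + 1e-12).ravel(order="F")  # channel-major order
```

The small constant inside the logarithm is our own guard against zero-power bins; the paper does not specify how such cases are handled.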
$\mathbf{f}_j = [\, f_{1j} \ \cdots \ f_{ij} \ \cdots \ f_{36j} \,] \in \mathbb{R}^{1 \times 36},$ (1)
where $f_{ij}$ is the ith element of the feature vector of the jth motion. Figure 6 and Figure 7 show examples of the normalized PSDs of all 17 motions derived from the pEMG signals on channel 2 and channel 5.
Let m be the number of sets of training data used in the training process, with each set containing EMG signals of all 17 motions (1 trial for each motion). Then, from a single set of training data, a matrix consisting of feature vectors of each motion can be derived and defined as
$\mathbf{F}_n = [\, \mathbf{f}_1^{\top} \ \cdots \ \mathbf{f}_j^{\top} \ \cdots \ \mathbf{f}_{17}^{\top} \,]^{\top} \in \mathbb{R}^{17 \times 36}, \qquad n \in \{1, 2, \ldots, m\}.$ (2)
Furthermore, using (2), a feature matrix containing all feature vectors derived from m sets of training data can be expressed as
$\mathbf{F} = [\, \mathbf{F}_1^{\top} \ \cdots \ \mathbf{F}_n^{\top} \ \cdots \ \mathbf{F}_m^{\top} \,]^{\top} = [\, \bar{\mathbf{f}}_1 \ \cdots \ \bar{\mathbf{f}}_i \ \cdots \ \bar{\mathbf{f}}_{36} \,] \in \mathbb{R}^{17m \times 36},$ (3)
where $\bar{\mathbf{f}}_i$ is a 17m-dimensional column vector. Next, the feature matrix (3) is normalized as
$\tilde{\mathbf{F}} = [\, \bar{\mathbf{f}}_1 - \min(\bar{\mathbf{f}}_1) \ \cdots \ \bar{\mathbf{f}}_i - \min(\bar{\mathbf{f}}_i) \ \cdots \ \bar{\mathbf{f}}_{36} - \min(\bar{\mathbf{f}}_{36}) \,] \in \mathbb{R}^{17m \times 36},$ (4)
and used as the input signal to train the ANN classifier. In the online classification stage, the feature vector at the kth sampling instance is given by
$\hat{\mathbf{f}}(k) = [\, \hat{f}_1(k) - \min(\bar{\mathbf{f}}_1) \ \cdots \ \hat{f}_i(k) - \min(\bar{\mathbf{f}}_i) \ \cdots \ \hat{f}_{36}(k) - \min(\bar{\mathbf{f}}_{36}) \,] \in \mathbb{R}^{1 \times 36}.$ (5)
Note that $\min(\bar{\mathbf{f}}_i)$ must be determined beforehand from the m training data sets.
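A minimal sketch of this normalization: the column minima are computed once from the training feature matrix and reused for both training and online classification, as in (4) and (5). Function names are illustrative.

```python
import numpy as np

def fit_column_minima(F):
    """min(f_i) for every feature column of the 17m x 36 training
    matrix, stored for reuse at classification time (Eq. (4))."""
    return F.min(axis=0)

def normalize(features, col_min):
    """Shift a feature matrix or a single 1 x 36 online vector by the
    stored training minima (Eqs. (4) and (5))."""
    return features - col_min
```

Because the minima come from the training data, an online feature can still fall below zero; the classifier simply sees that value as-is.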

4.2. Training of Artificial Neural Network Classifier

A feature vector of a given motion consists of 36 elements, as denoted in (1). Since there are 17 motions, one data set produces 17 feature vectors (one per motion). In the training stage, the feature matrix generated from the m data sets, as defined in (4), is used to train the classifier. Thus, the teaching signal T is given by
$\mathbf{T} = [\, I_{17} \ I_{17} \ \cdots \ I_{17} \,]^{\top} \in \mathbb{R}^{17m \times 17},$ (6)
where $I_{17}$ denotes the $17 \times 17$ identity matrix, so that each row of $\mathbf{T}$ is the one-hot label of the corresponding feature vector.
In this study, an artificial neural network (ANN) is chosen as the classifier to discriminate the 17 motions. Several kinds of ANN have been proposed; in this paper, a multilayer perceptron, which is a feed-forward ANN, is used.
The multilayer perceptron is an ANN consisting of one or more hidden layers. This type of ANN is known as a supervised network because it requires a desired output (teaching signal) in order to learn. A set of learning data consisting of feature vectors and the corresponding teaching signals is used so that the perceptron acquires, by machine learning, appropriate connection weights and threshold values for the hidden and output layers. The multilayer perceptron, like many other ANNs, learns using an algorithm called backpropagation. With this method, the input data (feature vector) are repeatedly presented to the ANN. With each presentation, the output of the ANN is compared to the desired output (teaching signal) and an error is computed. This error is then fed back (back-propagated) to the ANN and used to adjust the weights so that the error decreases with each iteration and the ANN model gets progressively closer to producing the desired output [37,38]. When performing real-time classification, these connection weights and threshold values are fixed and determined beforehand.
To keep the computational cost low, an ANN with only one hidden layer is used in this study. Table 2 shows the number of neurons in the input, hidden, and output layers along with the learning coefficient for each classification scenario (see Section 5.1.2). Based on our experience [35] and as suggested by [39], overfitting may occur in an ANN if training runs for too many iterations (epochs). To prevent this, the training process is terminated when the error reaches 0.001 or when the number of training epochs reaches 2000. It is also worth noting that the number of neurons in the hidden layer for each classification scenario is determined from our experience and by trial and error; specifically, the number of neurons that produces the lowest training error is selected. If the training error reaches 0.001 for two or more neuron arrangements, the one that requires the fewest training epochs is chosen.
Furthermore, the classification label at any given sampling instance is defined as the class with the highest ANN output. Thus, the classification output corresponding to the discriminated motion is set to 1, while the outputs for all other motions are set to 0.
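The training procedure described above can be sketched with a plain NumPy backpropagation loop. This is a simplified illustration under the paper's stated stopping rule (error below 0.001 or 2000 epochs), not the authors' MATLAB implementation; the learning rate, weight initialization, and function names are our assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_mlp(X, T, n_hidden, lr=0.5, max_epochs=2000, target_err=1e-3, seed=0):
    """Minimal one-hidden-layer perceptron trained by backpropagation.
    X: (n_samples, n_inputs) normalized features; T: (n_samples, n_classes)
    one-hot teaching signal. Training stops when the mean squared error
    drops below `target_err` or after `max_epochs` epochs."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, T.shape[1]))
    b2 = np.zeros(T.shape[1])
    for _ in range(max_epochs):
        h = sigmoid(X @ W1 + b1)            # hidden-layer activations
        y = sigmoid(h @ W2 + b2)            # network outputs
        err = y - T
        if np.mean(err ** 2) < target_err:
            break
        d2 = err * y * (1.0 - y)            # output-layer delta
        d1 = (d2 @ W2.T) * h * (1.0 - h)    # back-propagated hidden delta
        W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(axis=0)
        W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0)
    return W1, b1, W2, b2

def classify(x, W1, b1, W2, b2):
    """Classification label = index of the highest ANN output."""
    y = sigmoid(sigmoid(x @ W1 + b1) @ W2 + b2)
    return int(np.argmax(y))
```

In real-time operation only `classify` runs, with weights frozen after training, matching the fixed-parameter deployment described above.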

5. Online Classification Experiment

In this section, experimental conditions necessary to perform online classification of 17 motions are described. The aim of this experiment is to clarify how well the proposed algorithm works based on the following three criteria:
(1) Classification accuracy of 17 motions;
(2) Classification accuracy without re-training the classifier;
(3) Classification accuracy of 9 motions (the motions required to complete most daily tasks).
The results of the experiment are evaluated and discussed.

5.1. Experimental Conditions

5.1.1. Participants

In this study, the experiments were carried out with the cooperation of 12 able-bodied participants (referred to as participants A to L) in their twenties. Seven were male (participants A, D, E, G, H, I, and L) and five were female (participants B, C, F, J, and K). The entire protocol and the aims of the study were fully explained to them before the experiment, and all participants signed written informed consent. All experiments were conducted with the approval of the Tokyo Denki University Human Bioethics Committee. Apart from participants A and B, none of the participants had prior experience with myoelectric control experiments.
During electrode placement, no particular skin preparation technique was applied to the forearm, so as to mimic real-life usage scenarios. Also, hair on the forearms of the male participants was not shaved, in accordance with their request.

5.1.2. Experiment Protocol

In the experiment, participants were seated in front of a monitor and were asked to perform the 17 movements in the order shown in Figure 8. During the experiment, visual and audio cues were presented to the participants. In each experimental run, participants were asked to perform each motion once and sustain it for 5 s. They were also instructed to execute each motion naturally and not forcefully. For flexion and extension of the middle, ring, and little fingers, it is generally difficult to move each finger independently without simultaneously moving the other two; the participants were informed that it was perfectly fine if the other fingers also moved.
The experiment is divided into a training stage and an online testing stage, both conducted on the same day. First, five sets of data were acquired for training the ANN; later, 10 sets of testing data were recorded for evaluation. The same experiment was repeated on a different day for participants A, B, E, and H, with a 4-day gap between day 1 and day 2 for participant A, a 2-day gap for participant B, and a 6-day gap for participants E and H. To address the research questions, four online and pseudo-online classification scenarios were designed as described below. Details of each classification scenario can also be found in Table 2. The results are evaluated and discussed in the next section.
(1) Classification of 17 voluntary movements: This scenario evaluates the classification performance of the proposed system. The result will give a hint of the extent to which a state-of-the-art pattern recognition scheme can classify 17 types of motions, and will also help demonstrate the validity of low-cost hardware as a modality for myoelectric control. In this scenario, the ANN is trained using all five training data sets and the accuracy is evaluated using all 10 testing data sets.
(2) Classification of 17 movements using testing data obtained on a different day: This scenario determines how well the proposed method performs without re-training the classifier. To do so, the ANN parameters from day 1 are used to classify the testing data obtained on day 2, and vice versa.
(3) Classification of 9 movements: This scenario examines how much the accuracy improves when the number of classes is reduced to nine. The 9 motions involved are pronation, supination, hand grasping, hand opening, flexion and extension of the thumb and index finger, and a rest state. These motions were selected because they are the basic hand/arm movements needed to carry out most daily tasks, e.g., turning a door knob, grabbing and relocating an object, or picking up a small object. The same EMG data used in the previous scenarios are used here. Since both the training and testing data sets contain EMG signals for all 17 motions, the epochs of the non-target motions along with the subsequent rest periods were cut out, and the remaining epochs were merged to form a new time series data set lasting 100 s.
(4) Classification of 9 movements using testing data obtained on a different day: Similar to scenario 2, this scenario tests the functionality of the proposed algorithm without re-training the classifier.
In this study, the evaluation of classification accuracy is based on correct and incorrect rates. Since the ANN has the same number of output neurons as the number of motions, the correct and incorrect rates for each motion are calculated from the output neuron belonging to the motion of interest (Figure 9). The correct classification rate is given by
$\text{Correct Rate} = \dfrac{\text{Number of correct classifications}}{\text{Total number of classification samples}} \times 100\%.$ (7)
On the other hand, the incorrect classification rate is calculated as
$\text{Incorrect Rate} = \dfrac{\text{Number of erroneous classifications}}{\text{Total number of classification samples}} \times 100\%.$ (8)
Figure 9 shows an example of a classification output. We will use this example to demonstrate how correct and incorrect rates are calculated using (7) and (8). In this example, the correct classification label of the pronation movement is indicated by the green circle, and three red circles indicate incorrect classification labels. Since the ANN classifier gives a decision every 0.005 s (200 Hz) and each movement is sustained for 5 s, the total number of classification samples for each motion is 1000. In the example shown in Figure 9, 904 of 1000 samples were correctly classified as pronation, so the correct rate is 90.4%. On the other hand, 1/1000 samples (0.1%) were misclassified as supination, 19/1000 samples (1.9%) as middle finger flexion, and 76/1000 samples (7.6%) as the rest state. Adding up these numbers gives an incorrect rate of 9.6% for the pronation movement.
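The rate computation in (7) and (8) reduces to a per-epoch tally of the classifier's 200 Hz decisions. The sketch below reproduces the worked pronation example using a hypothetical decision stream with the same counts, not the actual experimental data.

```python
import numpy as np

def correct_incorrect_rates(decisions, true_label):
    """Eqs. (7) and (8): percentage of decision samples in one sustained
    5 s epoch (200 Hz x 5 s = 1000 decisions) that match / miss the cue."""
    decisions = np.asarray(decisions)
    correct = 100.0 * np.count_nonzero(decisions == true_label) / decisions.size
    return correct, 100.0 - correct

# Hypothetical stream matching the worked example: 904 pronation decisions,
# 1 supination, 19 middle finger flexion, 76 rest, out of 1000 samples.
stream = np.array([0] * 904 + [1] * 1 + [9] * 19 + [16] * 76)
rates = correct_incorrect_rates(stream, true_label=0)  # approx. (90.4, 9.6)
```

The class indices (0 for pronation, 16 for rest, etc.) are an arbitrary labeling for the sketch; only the counts matter for the rates.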

5.2. Results and Discussions

5.2.1. Classification of 17 Voluntary Movements

This classification scenario provides some answers to all three research questions:
(1) How well can the state-of-the-art classification scheme classify 17 types of movements?
(2) Can a consumer grade EMG sensor serve as a modality in myoelectric prosthetic control?
(3) How long can the user control the prosthetic hand before the accuracy drops significantly?
First, let us answer questions 1 and 2. According to Figure 10, when training and testing are done on the same day, an overall correct rate of 63.8% across subjects can be achieved for classification of 17 motions using the proposed method. Depending on the participant and the muscle condition, the system can accomplish a correct rate of up to 82.9% (participant E) or an average accuracy of up to 76.1% (participant H). Furthermore, the statistical results in Figure 11 suggest that the improvement in accuracy may be due to the user having gained experience from the training on the previous day. However, since the current work only compares the accuracy between two days, it is too early to draw a decisive conclusion from these results. As an extension of this study, experiments should be conducted over multiple days; it would be interesting to observe how long this trend persists and how long the training effect lasts. Based on the above results, combined with the fact that the user can promptly correct any erroneous action executed by the prosthetic hand by looking at the actual output, the proposed method is functional and has demonstrated the potential to classify 17 motions using six EMG sensors. Furthermore, the results also suggest that a consumer-grade EMG sensor can serve as a modality in myoelectric prosthetic control and other human-machine interface applications.
Although our results cannot be directly compared with those of other studies, owing to differences in participants, equipment, experimental conditions, types of movement, etc., the results reported in this paper demonstrate some advantages over existing studies. We will compare them with some of the works summarized in Table 3.
The first advantage is that the number of movements involved in this work is much greater than in most studies. The work by Adewuyi et al. [32] is the only study that involves more movements than this work. They report a very high accuracy of up to 96% for the classification of 19 classes. In their work, 19 electrodes are placed on the surface of the hand to capture the EMG signals responsible for fine motor control of the fingers. However, this electrode configuration is limited to amputees who still retain a portion of the hand, and wearing that many electrodes is also inconvenient for able-bodied users. Works by Tenore et al. [21] and Kanitz et al. [33] show that accurate classification of 12 finger motions is possible, but they share the disadvantage of requiring a very large number of electrodes. In contrast, our work uses a smaller number of electrodes on the forearm, which reduces equipment cost and preparation time and hence increases the usability of the system. Another advantage of using a small number of electrodes is that the current setup is easier to fit onto existing prosthetic hands. Furthermore, this work also offers the highest ratio of movements to electrodes (N_m/N_ch).
On the other hand, the limitation of the proposed system is that its accuracy is lower than that of other studies. One reason is that the duration of each motion is too long: many participants reported difficulty in sustaining each movement for 5 s, especially pronation/supination and flexion/extension of the index, middle, and ring fingers. This can be observed in the ANN output signal, where the system outputs correct decisions only for the first 3 s. Muscle fatigue is another reason for the low accuracy. Participants reported muscular fatigue after performing 6–7 continuous classification trials (11–12 trials if including trials from the training stage). The effect of fatigue can be seen in the gradual decline of correct rates shown in Figure 12. The results of a paired t-test show that the decline in accuracy due to fatigue begins at the 8th trial ( p < 0.1 ) and becomes significant at the 9th trial ( p < 0.05 , approximately 1 h of continuous usage). This observation suggests that, in practice, a control strategy that holds the posture of the prosthetic hand without requiring sustained muscular contraction may help relieve fatigue. Also, re-training of the classifier should be performed after 1 h of continuous usage. The above results address research question 3.
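The per-trial fatigue analysis relies on a paired t-test across participants. The test statistic itself can be sketched with the standard library as below; the accuracy values are hypothetical (not the study's data), and 2.201 is the standard two-tailed 5% critical value of the t-distribution for 11 degrees of freedom (12 participants), quoted from a t-table rather than computed here:

```python
import math
from statistics import mean, stdev

def paired_t(baseline, later):
    """Paired t-statistic for matched samples, e.g. each participant's
    correct rate on an early trial vs. a late trial."""
    diffs = [l - b for b, l in zip(baseline, later)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical correct rates (%) for 12 participants: trial 1 vs. trial 9
trial_1 = [70, 65, 80, 75, 60, 68, 72, 66, 74, 71, 63, 69]
drops   = [4, 6, 5, 7, 3, 5, 6, 4, 5, 6, 4, 5]
trial_9 = [a - d for a, d in zip(trial_1, drops)]

t = paired_t(trial_1, trial_9)
# |t| > 2.201 (two-tailed critical value, df = 11) implies p < 0.05
print(t < 0 and abs(t) > 2.201)  # → True
```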
It is also worth mentioning that there is a tradeoff between the number of channels and the accuracy of the system. In practice, the number of electrodes should be configured to suit the demands of each user and the type of application.

5.2.2. Classification of 17 Movements Using Testing Data Obtained on a Different Day

The classification results for 17 motions, where the training and testing data sets are not from the same day, are shown in Figure 11. According to this figure, the accuracy of classification using ANN parameters that are a few days old is around 40%. Depending on the user and their condition, the system can yield an accuracy of 56% at best. Although re-training the classifier is highly recommended in order to restore dexterous control, the results suggest that control of the prosthetic hand is possible to a certain degree without re-training. An extension to this study would be to investigate how accuracy changes over time in order to reduce the number of recalibrations and optimize usability.

5.2.3. Classification of 9 Movements

This scenario is intended to show how much the accuracy improves when the number of classes is reduced to nine. According to Figure 13, when training and testing are done on the same day, an overall correct rate of 72.9% across subjects can be achieved for the classification of 9 motions using the proposed method. Depending on the participant and the muscle condition, the system can reach a correct rate of 92.1% (participant L) or an average accuracy of up to 82.6% (participant F). These results demonstrate that reducing the number of motions improves the overall performance. Therefore, if the prosthetic hand is intended for a specific task, limiting the number of motions can significantly increase the performance and usability of the system. Furthermore, the above results also help strengthen the conclusion that a consumer-grade EMG sensor can serve as a modality in myoelectric prosthetic control.
Compared with the scenario dealing with 17 motions (Figure 11), it is interesting that no significant statistical difference can be observed for participants B, E, and H in this scenario, whereas a statistical difference can now be observed for participant A that was not present in the previous classification scenario with 17 movements. These results suggest that for participants B, E, and H, the accuracies of the selected 9 motions may already have been relatively high to begin with, so the training done on day 1 may not have been sufficient to significantly improve them; the opposite could be implied for participant A. Since the current skill level and the rate of improvement differ across individuals for each type of motion, a training method and menu specifically tailored to each individual may help improve the accuracy significantly.

5.2.4. Classification of 9 Movements Using Testing Data Obtained on a Different Day

The classification results for 9 motions, where the training and testing data sets are not from the same day, are shown in Figure 14. According to this figure, the accuracy of classification using ANN parameters that are a few days old is around 50%. Depending on the user and their condition, the system can yield an accuracy of 67.5% at best. Compared to the scenario involving 17 movements, the accuracy has improved greatly. An extension to this study would be to test this scenario in real time in order to evaluate the feasibility of a system with an accuracy of around 50%.

5.2.5. Erroneous Classification

Further analysis shows that erroneous classification sometimes occurs for motions associated with the fingers. As shown in Figure 15, for example, when the participant flexes the ring finger, the ANN sometimes misclassifies the motion as flexion of the middle finger. This happens because, as the participant tries to flex the ring finger, the middle finger flexes as well. Another example is that when the participant extends the index finger or performs a hand-opening motion, the ANN misclassifies the intended motion as extension of the ring finger. In this case, misclassification occurs because the muscles associated with the movement of these fingers lie very close to each other, which makes it difficult to obtain distinguishable EMG patterns. Interestingly, the motions that are most often misclassified differ from user to user, possibly because each participant executes each motion in a different manner. An extension to this study would be to discuss ways to deal with these erroneous classifications in order to improve the accuracy.
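The confusion matrix of Figure 15 can be tabulated directly from paired true/predicted labels. A minimal sketch follows; the label abbreviations match the figure caption (RF = ring finger flexion, MF = middle finger flexion), but the data shown are illustrative, not the study's results:

```python
from collections import Counter

def confusion_matrix(true_labels, predicted_labels, classes):
    """Rows are true classes, columns are predicted classes."""
    counts = Counter(zip(true_labels, predicted_labels))
    return [[counts[(t, p)] for p in classes] for t in classes]

# Illustrative example: ring-finger flexion (RF) sometimes
# misread as middle-finger flexion (MF)
true = ["RF", "RF", "RF", "MF"]
pred = ["RF", "MF", "MF", "MF"]
print(confusion_matrix(true, pred, ["RF", "MF"]))  # → [[1, 2], [0, 1]]
```

Off-diagonal cells such as the (RF, MF) entry above are exactly the confusions discussed in this subsection.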

6. Conclusions

In this paper, a signal processing technique to classify 17 voluntary movements from EMG signals is proposed. The major novelty of this study is the use of a small set of low-cost (low sampling rate) EMG sensors to classify a reasonably large number of hand movements; only six electrodes were used for EMG recording. In the proposed method, power spectral densities are used as features and the motions are classified using an artificial neural network. Online classification experiments were conducted on 12 participants to evaluate the validity of the proposed method. The results show that the proposed method is capable of achieving 42–83% overall accuracy (average of 63.8% across subjects). Statistical analysis suggests that the improvement in accuracy may stem from the experience the user gained during training on the previous day. Results also indicate that the accuracy drops significantly after one hour of continuous usage, after which re-training of the classifier is recommended. Based on the above results, combined with the fact that users can promptly correct any incorrect classification by looking at the actual output of the prosthetic hand, the proposed algorithm demonstrates the potential to classify 17 voluntary movements from 6 consumer-grade EMG sensors. Furthermore, classifying 9 motions with this method achieved 43–92% accuracy (average of 72.9% across subjects); thus, if the prosthetic hand is intended for a specific task, limiting the number of motions can significantly increase the performance and usability. Overall, the findings reported in this paper give an idea of how well the current state-of-the-art classification scheme can perform with limited hardware, and the discussions have revealed the challenges in developing a practical, multi-functional prosthesis.
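The spectral front end summarized above can be sketched as follows. The six frequency bins listed in Figures 6 and 7 (18.75, 25, 31.25, 68.75, 75, and 81.25 Hz) coincide with bins 3–5 and 11–13 of a 32-point DFT at the 200 Hz sampling rate, and six bins over six channels would account for the 36 input neurons in Table 2; nevertheless, the window length, scaling, and function names here are illustrative assumptions, not the authors' exact implementation:

```python
import cmath, math

FS = 200            # sampling rate (Hz), per Table 3
N = 32              # assumed window length: 200/32 = 6.25 Hz bin spacing
BINS_HZ = [18.75, 25, 31.25, 68.75, 75, 81.25]  # bins from Figures 6 and 7

def log_psd_features(window):
    """Log power at the selected DFT bins for one EMG channel window."""
    feats = []
    for f in BINS_HZ:
        k = round(f * N / FS)  # exact integer bin index for these frequencies
        X = sum(x * cmath.exp(-2j * math.pi * k * i / N)
                for i, x in enumerate(window))
        feats.append(math.log(abs(X) ** 2 / N + 1e-12))
    return feats

# Sanity check: a 25 Hz sinusoid concentrates power in the 25 Hz bin,
# so its feature exceeds the one at 75 Hz
window = [math.sin(2 * math.pi * 25 * i / FS) for i in range(N)]
f = log_psd_features(window)
print(f[1] > f[4])  # → True
```

Concatenating these six values per channel across six channels yields a 36-dimensional feature vector matching the input layer sizes in Table 2.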
An extension to this study would be to raise the overall performance by improving the feature extraction algorithm and by devising a more effective user training method.

Author Contributions

T.J. and E.N. conceived, designed and performed the experiments; T.J. and C.L. analyzed the data; T.J., E.N. and J.I. developed the hardware and software of the myoelectric hand system; T.J., E.N., C.L. and J.I. wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Chao, E.Y.S.; An, K.-N.; Cooney, W.P.; Linscheid, R. Biomechanics of the Hand: A Basic Research Study; World Scientific Publishing: Teaneck, NJ, USA, 1989; pp. 5–75. ISBN 978-9971-5-0103-7.
2. Pillet, J.; Didierjean-Pillet, A. Aesthetic hand prosthesis: Gadget or therapy? Presentation of a new classification. J. Hand Surg. Br. Eur. Vol. 2001, 6, 523–528.
3. Ministry of Health, Labour and Welfare, Social Welfare and War Victims’ Relief Bureau, Department of Health and Welfare for Persons with Disabilities Policy Planning Division. Report on Survey on Persons with Physical Disability 2006; Ministry of Health, Labour and Welfare: Tokyo, Japan, 2006; pp. 3–4. (In Japanese)
4. Pezzin, L.E.; Dillingham, T.R.; Mackenzie, E.J.; Ephraim, P.; Rossbach, P. Use and satisfaction with prosthetic limb devices and related services. Arch. Phys. Med. Rehabil. 2004, 85, 723–729.
5. Biddiss, E.A.; Chau, T.T. Upper limb prosthesis use and abandonment: A survey of the last 25 years. Prosthet. Orthot. Int. 2007, 31, 236–257.
6. Bionic Hand with 14 Grip Patterns That Makes Common Tasks Easy—Bebionic. Available online: http://bebionic.com/the_hand/grip_patterns (accessed on 18 July 2016).
7. I-Limb Ultra|Touch Bionics. Available online: http://www.touchbionics.com/products/active-prostheses/i-limb-ultra (accessed on 18 July 2016).
8. Wiste, T.E.; Dalley, S.A.; Withrow, T.J.; Goldfarb, M. Design of a multifunctional anthropomorphic prosthetic hand with extrinsic actuation. IEEE/ASME Trans. Mechatron. 2009, 14, 699–706.
9. Losier, Y.; Clawson, A.; Wilson, A.; Scheme, E.; Englehart, K.; Kyberd, P.; Hudgins, B. An overview of the UNB hand system. In Proceedings of the 2011 MyoElectric Controls/Powered Prosthetics Symposium, Fredericton, NB, Canada, 14–19 August 2011.
10. Belter, J.T.; Dollar, A.M. Novel differential mechanism enabling two DOF from a single actuator: Application to a prosthetic hand. In Proceedings of the IEEE International Conference on Rehabilitation Robotics (ICORR), Seattle, WA, USA, 24–26 June 2013; pp. 1–6.
11. Cipriani, C.; Controzzi, M.; Carrozza, M.C. The SmartHand transradial prosthesis. J. NeuroEng. Rehabil. 2011, 8, 1–13.
12. Liu, H.; Wu, K.; Meusel, P.; Seitz, N.; Hirzinger, G.; Jin, M.H.; Liu, Y.W.; Fan, S.W.; Lan, T.; Chen, Z.P. Multisensory five-fingered dexterous hand: The DLR/HIT Hand II. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Nice, France, 22–26 September 2008; pp. 3692–3697.
13. Kamikawa, Y.; Maeno, T. Underactuated five-finger prosthetic hand inspired by grasping force distribution of humans. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Nice, France, 22–26 September 2008; pp. 717–722.
14. Krausz, N.E.; Rorrer, R.A.L.; Weir, R.F.F.F. Design and Fabrication of a Six Degree-of-Freedom Open Source Hand. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 24, 562–572.
15. Jahan, M.; Manas, M.; Sharma, B.B.; Gogoi, B.B. Feature extraction and pattern recognition of EMG-based signal for hand movements. In Proceedings of the 2015 International Symposium on Advanced Computing and Communication (ISACC), Silchar, India, 14–15 September 2015.
16. Zhang, H.; Zhao, Y.; Yao, F.; Xu, L.; Shang, P.; Li, G. An adaptation strategy of using LDA classifier for EMG pattern recognition. In Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 4267–4270.
17. Ahsan, M.R.; Ibrahimy, M.I.; Khalifa, O.O. Electromygraphy (EMG) signal based hand gesture recognition using artificial neural network (ANN). In Proceedings of the 2011 4th International Conference on Mechatronics (ICOM), Kuala Lumpur, Malaysia, 17–19 May 2011.
18. Oskoei, M.A.; Hu, H. Support Vector Machine-Based Classification Scheme for Myoelectric Control Applied to Upper Limb. IEEE Trans. Biomed. Eng. 2008, 55, 1956–1965.
19. Benatti, S.; Casamassima, F.; Milosevic, B.; Farella, E.; Schönle, P.; Fateh, S.; Burger, T.; Huang, Q.; Benini, L. A Versatile Embedded Platform for EMG Acquisition and Gesture Recognition. IEEE Trans. Biomed. Circuits Syst. 2015, 9, 620–630.
20. Riillo, F.; Quitadamo, L.R.; Cavrini, F.; Gruppioni, E.; Pinto, C.A.; Cosimo Pastò, N.; Sbernini, L.; Albero, L.; Saggio, G. Optimization of EMG-based hand gesture recognition: Supervised vs. unsupervised data preprocessing on healthy subjects and transradial amputees. Biomed. Signal Process. Control 2014, 14, 117–125.
21. Tenore, F.V.G.; Ramos, A.; Fahmy, A.; Acharya, S.; Etienne-Cummings, R.; Thakor, N.V. Decoding of Individuated Finger Movements Using Surface Electromyography. IEEE Trans. Biomed. Eng. 2009, 56, 1427–1434.
22. Huang, H.; Zhou, P.; Li, G.; Kuiken, T. Spatial Filtering Improves EMG Classification Accuracy Following Targeted Muscle Reinnervation. Ann. Biomed. Eng. 2009, 37, 1849–1857.
23. Jiang, M.W.; Wang, R.C.; Wang, J.Z.; Jin, D.W. A Method of Recognizing Finger Motion Using Wavelet Transform of Surface EMG Signal. In Proceedings of the 27th Annual International Conference of the Engineering in Medicine and Biology Society, Shanghai, China, 1–4 September 2005; pp. 2672–2674.
24. Englehart, K.; Hudgins, B.; Parker, P.A.; Stevenson, M. Classification of the myoelectric signal using time-frequency based representations. Med. Eng. Phys. 1999, 21, 431–438.
25. Nazmi, N.; Rahman, M.A.A.; Yamamoto, S.; Ahmad, S.A.; Zamzuri, H.; Mazlan, S.A. A Review of Classification Techniques of EMG Signals during Isotonic and Isometric Contractions. Sensors 2016, 16, 1304.
26. Huang, Y.; Englehart, K.B.; Hudgins, B.; Chan, A.D.C. A Gaussian mixture model based classification scheme for myoelectric control of powered upper limb prostheses. IEEE Trans. Biomed. Eng. 2005, 52, 1801–1811.
27. Pylatiuk, C.; Schulz, S.; Doderlein, L. Results of an Internet survey of myoelectric prosthetic hand users. Prosthet. Orthot. Int. 2007, 31, 362–370.
28. Drake, R.; Vogl, A.W.; Mitchell, A. Gray’s Anatomy for Students, 2nd ed.; Churchill Livingstone: Amsterdam, The Netherlands, 2004; ISBN 9781455755417.
29. Tsujimura, T.; Yamamoto, S.; Izumi, K. Hand Sign Classification Employing Myoelectric Signals of Forearm. In Computational Intelligence in Electromyography Analysis—A Perspective on Current Applications and Future Challenges; Naik, G.R., Ed.; InTech: Houston, TX, USA, 2012.
30. Fukuda, O.; Bu, N.; Tsuji, T. Control of an Externally Powered Prosthetic Forearm Using Raw-EMG Signals. Trans. Soc. Instrum. Control Eng. 2004, 40, 1124–1131. (In Japanese)
31. Liu, L.; Liu, P.; Clancy, E.A.; Scheme, E.; Englehart, K.B. Electromyogram Whitening for Improved Classification Accuracy in Upper Limb Prosthesis Control. IEEE Trans. Neural Syst. Rehabil. Eng. 2013, 21, 767–774.
32. Adewuyi, A.A.; Hargrove, L.J.; Kuiken, T.A. An Analysis of Intrinsic and Extrinsic Hand Muscle EMG for Improved Pattern Recognition Control. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 24, 485–494.
33. Kanitz, G.R.; Antfolk, C.; Cipriani, C.; Sebelius, F.; Carrozza, M.C. Decoding of Individuated Finger Movements Using Surface EMG and Input Optimization Applying a Genetic Algorithm. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 1608–1611.
34. Hargrove, L.J.; Englehart, K.; Hudgins, B. A comparison of surface and intramuscular myoelectric signal classification. IEEE Trans. Biomed. Eng. 2007, 54, 847–853.
35. Kambayashi, A.; Kuniyasu, K.; Jiralerspong, T.; Ishikawa, J. Identification of voluntary movements of five fingers for myoelectric prosthetic hand. In Proceedings of the 32nd Annual Conference of the Robotics Society of Japan, Tokyo, Japan, 4–6 September 2014; p. 32. (In Japanese)
36. Merletti, R. Standards for Reporting EMG Data. J. Electromyogr. Kinesiol. 2017, 35, 1–2.
37. Ebecken, N.F.F. An Overview on the Use of Neural Networks for Data Mining Tasks. J. Br. Neural Netw. Soc. 2011, 9, 202–212.
38. Kodaka, T. Hajimete no Kikai Gakushu; Ohmsha: Tokyo, Japan, 2011; pp. 172–219. ISBN 978-4-274-06846-1. (In Japanese)
39. Shim, H.; An, H.; Lee, S.; Lee, E.H.; Min, H.; Lee, S. EMG Pattern Classification by Split and Merge Deep Belief Network. Symmetry 2016, 8, 148.
Figure 1. Overview of current electromyography (EMG) based pattern recognition studies illustrating the number of electrodes used and the number of movements classified.
Figure 2. Six forearm motions.
Figure 3. System Architecture.
Figure 4. Electrode positions. The positions of the six electrodes are selected with reference to previous work [35].
Figure 5. An example of raw EMG (rEMG) signal during the execution of 17 motions. The channels are laid out in ascending order starting from the top.
Figure 6. An example of PSDs prior to log transform of 17 motions shown as a function of time. The PSDs are derived from pEMG signals on channel 2. In order from the top, PSDs at 18.75 Hz, 25 Hz, 31.25 Hz, 68.75 Hz, 75 Hz, 81.25 Hz. Note that the PSDs are normalized for better visualization.
Figure 7. An example of PSDs prior to log transform of 17 motions shown as a function of time. The PSDs are derived from pEMG signals on channel 5. In order from the top, PSDs at 18.75 Hz, 25 Hz, 31.25 Hz, 68.75 Hz, 75 Hz, 81.25 Hz. Note that the PSDs are normalized for better visualization.
Figure 8. Execution sequence of 17 motions (including rest state). Each motion is executed for five seconds with a five second resting interval in between each motion. The total duration of each run is 180 s. The feature point for rest state is selected between 170 and 175 s.
Figure 9. Example of a classification output. The blue line indicates the classification signal, and the red line indicates the teaching signal. The green circle marks a correct classification of the pronation movement, and the red circle marks an incorrect one. In this study, the artificial neural network (ANN) classifier gives a decision every 0.005 s (200 Hz).
Figure 10. Average overall correct classification rate of 17 voluntary movements for 12 participants. The red error bars indicate standard deviation. The two numbers on top of the bars indicate the highest and lowest overall correct rates. Note that the average correct rates of participants A, B, E, and H are from the experiment on day 2 (best classification). The mean overall correct rate indicates the overall correct rate across subjects.
Figure 11. Comparison of mean correct rates of each classification scenario (17 Movements). (*) represents p < 0.05 , (**) represents p < 0.01 and (***) represents p < 0.001 .
Figure 12. Correct classification rate of 17 voluntary movements for each trial. Note that the correct rates of participants A, B, E, and H are from the experiment on day 2 (best classification).
Figure 13. Average overall correct classification rate of 9 voluntary movements for 12 participants. The red error bars indicate standard deviation. The two numbers on top of the bars indicate the highest and lowest overall correct rates. Note that the average correct rates of participants A, B, E, and H are from the experiment on day 2 (best classification). The mean overall correct rate indicates the overall correct rate across subjects.
Figure 14. Comparison of mean correct rates of each classification scenario (9 movements). (*) represents p < 0.05 .
Figure 15. Confusion matrix of classification performance across subject. The 17 motions include pronation (pro), supination (sup), flexion (flex), extension (ext), hand grasping (grasp), hand opening (open), thumb flexion (TF) and extension (TE), index finger flexion (IF) and extension (IE), middle finger flexion (MF) and extension (ME), ring finger flexion (RF) and extension (RE), little finger flexion (LF) and extension (LE), and rest state (rest).
Table 1. Muscles of the forearm and their functions.
Channel Number | Muscle | Function
1 | Flexor carpi radialis muscle | Flexion of the hand at the wrist; pronation of the forearm (works along with other muscles which pronate the forearm)
2 | Flexor carpi ulnaris muscle | Flexion of the hand at the wrist
3 | Flexor digitorum profundus muscle | Flexion of the fingers (excluding the thumb)
4 | Flexor pollicis longus muscle | Flexion of the thumb
5 | Extensor carpi radialis longus muscle | Extension of the wrist; assists movements of the digits
6 | Extensor pollicis longus muscle | Extension of the thumb
Table 2. Details of each classification scenario.
Classification Scenario | Number of Motions | Training Data | Testing Data | Participant | Neurons (Input/Hidden/Output) | Learning Coefficient
1 | 17 | Day 1 | Day 1 | A | 36/20/17 | 1.2
1 | 17 | Day 2 | Day 2 | A | 36/20/17 | 1.2
1 | 17 | Day 1 | Day 1 | B | 36/20/17 | 1.2
1 | 17 | Day 2 | Day 2 | B | 36/20/17 | 1.2
1 | 17 | Day 1 | Day 1 | C | 36/24/17 | 1.2
1 | 17 | Day 1 | Day 1 | D | 36/20/17 | 1.2
1 | 17 | Day 1 | Day 1 | E | 36/24/17 | 1.2
1 | 17 | Day 2 | Day 2 | E | 36/20/17 | 1.2
1 | 17 | Day 1 | Day 1 | F | 36/16/17 | 1.2
1 | 17 | Day 1 | Day 1 | G | 36/20/17 | 1.2
1 | 17 | Day 1 | Day 1 | H | 36/18/17 | 1.2
1 | 17 | Day 2 | Day 2 | H | 36/18/17 | 1.2
1 | 17 | Day 1 | Day 1 | I | 36/20/17 | 1.2
1 | 17 | Day 1 | Day 1 | J | 36/22/17 | 1.2
1 | 17 | Day 1 | Day 1 | K | 36/24/17 | 1.2
1 | 17 | Day 1 | Day 1 | L | 36/20/17 | 1.2
2 | 17 | Day 1 | Day 2 | A | 36/20/17 | 1.2
2 | 17 | Day 2 | Day 1 | A | 36/20/17 | 1.2
2 | 17 | Day 1 | Day 2 | B | 36/20/17 | 1.2
2 | 17 | Day 2 | Day 1 | B | 36/20/17 | 1.2
2 | 17 | Day 1 | Day 2 | E | 36/24/17 | 1.2
2 | 17 | Day 2 | Day 1 | E | 36/20/17 | 1.2
2 | 17 | Day 1 | Day 2 | H | 36/18/17 | 1.2
2 | 17 | Day 2 | Day 1 | H | 36/18/17 | 1.2
3 | 9 | Day 1 | Day 1 | A | 36/14/9 | 1.2
3 | 9 | Day 2 | Day 2 | A | 36/12/9 | 1.2
3 | 9 | Day 1 | Day 1 | B | 36/14/9 | 1.2
3 | 9 | Day 2 | Day 2 | B | 36/16/9 | 1.2
3 | 9 | Day 1 | Day 1 | C | 36/20/9 | 1.2
3 | 9 | Day 1 | Day 1 | D | 36/14/9 | 1.2
3 | 9 | Day 1 | Day 1 | E | 36/16/9 | 1.2
3 | 9 | Day 2 | Day 2 | E | 36/14/9 | 1.2
3 | 9 | Day 1 | Day 1 | F | 36/12/9 | 1.2
3 | 9 | Day 1 | Day 1 | G | 36/18/9 | 1.2
3 | 9 | Day 1 | Day 1 | H | 36/20/9 | 1.2
3 | 9 | Day 2 | Day 2 | H | 36/16/9 | 1.2
3 | 9 | Day 1 | Day 1 | I | 36/20/9 | 1.2
3 | 9 | Day 1 | Day 1 | J | 36/16/9 | 1.2
3 | 9 | Day 1 | Day 1 | K | 36/14/9 | 1.2
3 | 9 | Day 1 | Day 1 | L | 36/18/9 | 1.2
4 | 9 | Day 1 | Day 2 | A | 36/14/9 | 1.2
4 | 9 | Day 2 | Day 1 | A | 36/12/9 | 1.2
4 | 9 | Day 1 | Day 2 | B | 36/14/9 | 1.2
4 | 9 | Day 2 | Day 1 | B | 36/16/9 | 1.2
4 | 9 | Day 1 | Day 2 | E | 36/16/9 | 1.2
4 | 9 | Day 2 | Day 1 | E | 36/14/9 | 1.2
4 | 9 | Day 1 | Day 2 | H | 36/20/9 | 1.2
4 | 9 | Day 2 | Day 1 | H | 36/16/9 | 1.2
Table 3. Summary of existing EMG based pattern recognition studies.
Author/s | Year | No. of Electrodes (N_ch) | No. of Movements (N_m) 1 | Sampling Frequency | N_m/N_ch | Accuracy | Number of Subjects
Fukuda et al. [30] | 2004 | 6 | 6 W | 200 Hz | 1 | 92.1% | 3 H and 2 A
Jiang et al. [23] | 2005 | 4 | 6 IF | 2000 Hz | 1.5 | 80% | 10 H
Oskoei & Hu [18] | 2008 | 4 | 6 (5 W + R) | 1000 Hz | 1.5 | 95% | 11 H
Tenore et al. [21] | 2009 | 32 | 12 (10 IF + 2 CF) | 2000 Hz | 0.38 | 90% | 5 H
Tenore et al. [21] | 2009 | 19 | 12 (10 IF + 2 CF) | 2000 Hz | 0.63 | 83% | 1 A
Ahsan et al. [17] | 2011 | 2 | 4 W | 1000 Hz | 2 | 88.4% | 3 H
Cipriani et al. [11] | 2011 | 8 | 7 (4 IF + 2 CF + HG) | 10,000 Hz | 0.88 | 89% | 5 H
Cipriani et al. [11] | 2011 | 8 | 7 (4 IF + 2 CF + HG) | 10,000 Hz | 0.88 | 79% | 5 A
Kanitz et al. [33] | 2011 | 16 | 13 (12 IF + R) | 16,000 Hz | 0.81 | 80% | 5 H and 1 A
Tsujimura et al. [29] | 2012 | 3 | 3 CF | 10,000 Hz | 1 | 97% | 1 H
Benatti et al. [19] | 2015 | 8 | 7 (5 W + IF + R) | 1000 Hz | 0.88 | 90% | 4 H
Adewuyi et al. [32] | 2016 | 19 | 19 (12 IF + 7 HG) | 1000 Hz | 1 | 96% | 9 H
This work | - | 6 | 17 (10 IF + 6 W + R) | 200 Hz | 2.83 | 63.8% | 12 H
This work | - | 6 | 9 (4 IF + 4 W + R) | 200 Hz | 1.5 | 72.9% | 12 H
1 IF: Individual Finger Movement. W: Wrist Movement. R: Rest State. CF: Combined Finger Movement. HG: Hand Grasp Movement. H: Healthy. A: Amputee.
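The N_m/N_ch column of Table 3 is simply the movement count divided by the electrode count, rounded to two decimals; a one-line check (the function name is illustrative):

```python
def movements_per_electrode(n_movements, n_electrodes):
    """Ratio of classified movements to EMG electrodes, as in Table 3."""
    return round(n_movements / n_electrodes, 2)

print(movements_per_electrode(17, 6))  # → 2.83 (this work, 17 motions)
print(movements_per_electrode(9, 6))   # → 1.5  (this work, 9 motions)
```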

Share and Cite

MDPI and ACS Style

Jiralerspong, T.; Nakanishi, E.; Liu, C.; Ishikawa, J. Experimental Study of Real-Time Classification of 17 Voluntary Movements for Multi-Degree Myoelectric Prosthetic Hand. Appl. Sci. 2017, 7, 1163. https://doi.org/10.3390/app7111163
