Article

Development of a Wearable Glove System with Multiple Sensors for Hand Kinematics Assessment

1 College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211100, China
2 Nanjing Research Institute of Electronic Technology, Nanjing 210013, China
3 Department of Mechanical Engineering, City University of Hong Kong, Hong Kong 999077, China
* Author to whom correspondence should be addressed.
Micromachines 2021, 12(4), 362; https://doi.org/10.3390/mi12040362
Submission received: 4 March 2021 / Revised: 22 March 2021 / Accepted: 25 March 2021 / Published: 27 March 2021

Abstract

In traditional hand function assessment, patients and physicians must complete complex activities and rating tasks. This paper proposes a novel wearable glove system for hand function assessment. A sensing system consisting of 12 nine-axis inertial and magnetic unit (IMMU) sensors is used to obtain the acceleration, angular velocity, and geomagnetic orientation of human hand movements. A complementary filter algorithm is applied to calculate the joint angles after sensor calibration. A virtual hand model is also developed in the Unity platform and mapped to the glove system. The experimental results show that this glove system can capture and reproduce human hand motions with high accuracy. This smart glove system is expected to reduce the complexity and time consumption of hand kinematics assessment.

1. Introduction

Hand function assessment plays an essential role in the recovery of stroke patients. Rehabilitation of hand function usually requires long-term training after medical treatment. Traditional hand function assessment methods used by physicians include the Fugl-Meyer assessment (FMA), the action research arm test (ARAT), the Jebsen–Taylor test (JTT), and the Wolf motor function test (WMFT) [1,2,3,4,5]. With these methods, the patients must carry out prescribed actions to demonstrate their hand function, and rating scores are given according to the patients’ performance. However, assessment with these methods often takes several hours to complete and does not reveal enough detail about the rehabilitation process. Therefore, it is important to develop hand function assessment equipment that can help physicians make a quantitative analysis of patients’ rehabilitation status.
One obvious solution for obtaining hand gesture information is a visual camera system [6,7,8]. The advantage of such a system is that it does not require the patients to wear any equipment, and it can capture the gesture information of multiple patients at the same time. The disadvantage is that it works effectively only in good lighting conditions, and its shooting angle is highly restricted by the local environment. IR or radar sensors can be another non-contact alternative for hand gesture capture or recognition [9,10].
In recent years, data gloves have been widely used in human–computer interaction studies [11]. With the advancement of sensor technology, the cost of data gloves has been greatly reduced to suit daily medical rehabilitation needs [12]. Depending on their working mechanism, data gloves can generally be divided into these main types: mechanical, flexible sensing, optical fiber sensing, and inertial sensing. A mechanical data glove can be considered as a type of glove-like haptic device with force feedback. Blake et al. proposed a haptic glove for virtual reality applications based on magnetorheological (MR) brakes in 2009 [13]. Ma et al. developed a five-finger haptic glove using a worm-geared motor in 2015 [14]. Chiri et al. developed a multi-phalange finger module for post-stroke rehabilitation in 2012 [15]. Gu et al. presented a low-cost exoskeleton glove with passive force feedback in 2016 [16]. Unlike that of mechanical data gloves, the performance of flexible sensing data gloves largely depends on the repeatability and linearity of the sensor itself. Tarchanidis et al. presented a data glove with flexible sensors as early as 2003 [17]. Tognetti et al. presented a sensing glove made of conductive elastomer materials in 2006 [18]. Shen et al. also developed a soft stretchable bending data glove that incorporates a sensor based on ethylene propylene rubber (EPR) in 2016 [19]. However, the most successful commercial data glove systems today are mainly based on optical fiber sensors, such as CyberGlove and 5DT data gloves. Innovations have also occurred in the research community for the design of optical fiber data gloves. Fujiwara et al. discussed a low-cost and flexible optical fiber glove to measure joint angles in 2014 [20]. Da Silva et al. developed a sensing glove based on optical fiber Bragg grating (FBG) sensors in 2011 [21].
Although optical fiber-based data gloves such as CyberGlove and 5DT data gloves can provide sufficiently precise joint angle data for rehabilitation applications, they are still too expensive for daily rehabilitation at home. Low-cost MEMS-based inertial sensors have been investigated in a number of studies on data gloves or human interactions due to their ability to provide stable attitudes in a three-dimensional (3D) space and ease of integration with PCBs. Lin et al. presented a data glove system with six-axis inertial sensors for stroke evaluation [22,23]. Choi et al. developed a low-cost inertial measurement unit (IMU) wearable sensing glove [24]. Liu et al. designed a novel inertial and magnetic unit (IMMU)-based data glove with an optimized sensor layout [25].
The data glove system discussed in this paper aims to help physicians evaluate patients’ hand function during rehabilitation. To get the joint angles of hand attitudes, 12 nine-axis inertial sensors are integrated with the Micro Controller Unit (MCU) and Universal Asynchronous Receiver/Transmitter (UART) modules on the glove. Several calibration algorithms are implemented to avoid errors caused by the accelerometer, gyroscope, and magnetometer. In addition, a human–computer interaction module is developed in the Unity platform. A grasping ball experiment is conducted to demonstrate how the glove system can be used to evaluate dynamic hand functions.

2. System Architecture

The workflow of our data glove system is demonstrated in Figure 1. First, the sensor units installed on the data glove obtain the raw information of hand movements. Then, the MCU controller initializes the data acquisition process, in which the sensor data are received over a Serial Peripheral Interface (SPI) bus and transmitted to the computer through a USB cable or a Bluetooth transmitter. Finally, the data processing module verifies the data frames and calculates the joint angles for hand function evaluation. Moreover, the human–computer interaction module uses the joint angle data to drive the hand model in the Unity platform.
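As a concrete illustration of the frame verification step, the sketch below decodes a single frame with an XOR parity check. The frame layout (a 0xAA header byte, the payload, and a trailing parity byte) is a hypothetical example for illustration, not the actual protocol used by the glove:

```python
def decode_frame(frame: bytes):
    """Validate one sensor frame and return its payload.

    Hypothetical layout: 0xAA | payload bytes | parity, where the parity
    byte is the XOR of all payload bytes. Returns None if the header or
    the parity check fails.
    """
    if len(frame) < 3 or frame[0] != 0xAA:
        return None
    payload, parity = frame[1:-1], frame[-1]
    check = 0
    for b in payload:
        check ^= b  # accumulate XOR parity over the payload
    return payload if check == parity else None
```

In the real system one such check runs on every frame before the joint angles are computed; corrupted frames are simply dropped.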

2.1. Hardware Design

The hardware of our data glove system includes several parts: (1) IMMUs for capturing the angles of hand joints; (2) a microcontroller unit for data collection and transmission; (3) a USB connection wire or wireless Bluetooth transmitter; and (4) a PC or laptop for receiving sensor data and reconstructing hand motions. MEMS-based sensors are well suited for wearable applications due to their low power consumption, compactness, and high precision. MPU-9250 units from InvenSense, Inc. are used to capture the joint angles during hand exercises. The nine-axis MPU-9250 can be considered a combination of a six-axis IMU and a three-axis compass: the six-axis IMU MPU-6515 and the AK8963 three-axis magnetometer from Asahi Kasei Microdevices (AKM). The angular velocity range of the gyroscope is set to ±1000°/s and the acceleration range of the accelerometer is set to ±8 g. The sample rate is set to 100 Hz for all the sensors, which is high enough to capture most human hand motions. To achieve better fusion performance, a multi-sensor fusion algorithm is developed to work with the onboard motion fusion processor of the MPU-9250. An Arduino Leonardo microcontroller is used to collect data from the IMMUs. The microcontroller has 21 digital input–output ports, 7 Pulse Width Modulation (PWM) ports, and 6 analog input ports to connect with the 12 sensors via an SPI bus, as shown in Figure 2. Both the wired UART connection and the Bluetooth wireless connection are tested in our glove system. The results show that the wired UART connection is more efficient and reliable than the Bluetooth wireless connection, especially at high transmission rates.

2.2. Analysis of Hand Joints and Layout of Sensors

Human hand motions can be represented by the rotation of hand joints, so the number and position of IMMU sensors can be determined according to the anatomical structure of the hand. A human hand includes several types of bones and tissues such as muscles and ligaments. As shown in Figure 3, the joints of the index finger, middle finger, ring finger, and little finger can be divided into three main categories: the distal interphalangeal joint (DIP), the proximal interphalangeal joint (PIP), and the metacarpophalangeal joint (MP). The thumb has only two joints: the interphalangeal joint (IP) and the MP. The MP is connected to the carpometacarpal joint (CM) for hand rotation. IMMU sensors should be placed near the abovementioned joints to measure the bending status of the fingers. An understanding of the parent–child relationship between joints is a prerequisite for analyzing finger motions: the movement of a parent joint has a knock-on effect on all its child joints. Local x–y–z coordinates should be established to describe the rotation of each joint, as shown in Figure 4. The bending of a finger can be defined as a rotation around the x-axis. Empirically, the bending angle of the DIP joint can be approximated as 1/3 of the bending angle of the PIP joint, which has no obvious influence on the reconstruction of hand gestures.
Therefore, 11 IMMU sensors are placed on the hand phalanges and one IMMU sensor is placed on the back of the hand, as shown in Figure 4. Joint angles and hand gestures can be estimated from the orientations of the 12 IMMU sensors. Two generations of data gloves have been developed in our lab. In the 1st generation glove, all the sensors are connected by soft wires; it is easy to fabricate but may suffer from wire detachment and poor contact. In the 2nd generation data glove, a flexible PCB design is adopted to enhance the system’s stability and reliability, as shown in Figure 5. The IMMU sensors are directly soldered on the flexible PCB without any wires. Moreover, the flexible PCB is coated with a silicone layer to protect the electronic components from corrosion by moisture and sweat. The thickness of the silicone layer is set to 2 mm so as not to impair the flexibility of hand movements. The plug-in connector of the flexible PCB can be easily linked to a controller board. In this paper, the 1st generation data glove is used to demonstrate the feasibility of our algorithm and software.

3. Sensor Data Processing

The data processing is as follows: The microcontroller of the data glove sends the data from the IMMU sensors to the computer frame by frame, and the received data are then preprocessed by parity frame checking and frame decoding. The raw sensor data must be calibrated before they can be fused. Since the environmental influence on accelerometers and gyroscopes is mostly negligible, the calibration parameters of the accelerometer and gyroscope can be used for a long time after the initial calibration. Unlike the accelerometer and gyroscope, the magnetometer needs to be recalibrated because it is subject to disturbance from environmental ferromagnetic materials. The attitude of each IMMU sensor can be estimated with a multi-sensor fusion algorithm using the calibrated acceleration, angular velocity, and geomagnetism data. The orientation of the IMMU sensors on the left-hand glove is defined in Figure 6, following the International Society of Biomechanics (ISB) recommendations on definitions of joint coordinates [26].

3.1. Sensor Calibration

Gyroscopes measure the rotation of hand joints in terms of angular velocity. The joint rotation angle can be calculated by integrating the angular velocity data over time. However, the error of the angular velocity accumulates over a long measurement process, resulting in angle drift. Theoretically, the output of a gyroscope at rest should be zero after calibration. The zero deviation caused by the random error of the gyroscope can be estimated using the average filtering method, and the corrected value of each axis is obtained by subtracting the zero deviation error from the measured value. The error model of the gyroscope can be represented as follows:
$$
\begin{bmatrix} g_{x\_c} \\ g_{y\_c} \\ g_{z\_c} \end{bmatrix}
=
\begin{bmatrix} g_x \\ g_y \\ g_z \end{bmatrix}
-
\begin{bmatrix} g_{x\_e} \\ g_{y\_e} \\ g_{z\_e} \end{bmatrix}
$$
where $g_x$, $g_y$, and $g_z$ denote the raw data of the gyroscope; $g_{x\_c}$, $g_{y\_c}$, and $g_{z\_c}$ the corrected angular velocity data; and $g_{x\_e}$, $g_{y\_e}$, and $g_{z\_e}$ the zero biases of the three axes.
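The average-filtering calibration above can be sketched in a few lines of Python (NumPy is assumed; function names are illustrative): the mean of a block of static readings estimates the zero bias, which is then subtracted from every measurement.

```python
import numpy as np

def calibrate_gyro(static_samples):
    """Estimate the zero bias (g_x_e, g_y_e, g_z_e) from a block of
    static readings (shape N x 3) by average filtering."""
    return np.asarray(static_samples).mean(axis=0)

def correct_gyro(raw, bias):
    """Apply the error model g_c = g - g_e."""
    return np.asarray(raw) - bias
```

In practice the static block is recorded once with the glove at rest, and the resulting bias vector is reused for all subsequent frames.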
Possible errors of the accelerometer include zero deviation error, scale factor error, and non-orthogonal error. The accelerometer error model can be represented as follows [27]:
$$
\begin{bmatrix} a_{x\_c} \\ a_{y\_c} \\ a_{z\_c} \end{bmatrix}
=
\begin{bmatrix} m_{xx} & m_{xy} & m_{xz} \\ m_{xy} & m_{yy} & m_{yz} \\ m_{xz} & m_{yz} & m_{zz} \end{bmatrix}
\cdot
\begin{bmatrix} a_x - a_{x\_e} \\ a_y - a_{y\_e} \\ a_z - a_{z\_e} \end{bmatrix}
$$
where $a_i\,(i = x, y, z)$ denotes the raw data of the accelerometer; $a_{i\_c}\,(i = x, y, z)$ the corrected acceleration data; $a_{i\_e}\,(i = x, y, z)$ the zero biases; $m_{ij}\,(i = j)$ the scale error factors; and $m_{ij}\,(i \neq j)$ the non-orthogonal error factors. The calibration error $e(m_{xx}, \ldots, m_{zz}, a_{x\_e}, \ldots, a_{z\_e})$ is defined as the residual between the squared sum of the corrected acceleration and the squared gravitational acceleration:
$$
e(m_{xx}, \ldots, m_{zz}, a_{x\_e}, \ldots, a_{z\_e}) = a_{x\_c}^2 + a_{y\_c}^2 + a_{z\_c}^2 - g^2
$$
Gauss–Newton’s method is applied to estimate the unknown calibration parameters $\mathbf{x} = (m_{xx}, \ldots, m_{zz}, a_{x\_e}, \ldots, a_{z\_e})$. The parameter update at each iteration can be represented by the following equation:
$$
\mathbf{x}_{k+1} = \mathbf{x}_k + \beta_k d_k
$$
where $\beta_k$ is the damping control factor, which controls the convergence speed of the iterative algorithm: a larger $\beta_k$ means faster convergence but lower accuracy. $d_k$ is defined as the iterative direction:
$$
d_k = -\left(J^T J\right)^{-1} J^T e
$$
where matrix J is defined as the Jacobian matrix of calibration error:
$$
J = \frac{\partial e}{\partial \mathbf{x}}
= \left[ \frac{\partial e}{\partial m_{xx}}, \; \ldots, \; \frac{\partial e}{\partial a_{z\_e}} \right]
$$
When the maximum number of iterations is reached or the convergence condition is satisfied, the unknown calibration parameters are obtained. With $\varepsilon$ defined as the convergence threshold, the convergence condition can be represented as follows:
$$
\left| e(m_{xx}, \ldots, m_{zz}, a_{x\_e}, \ldots, a_{z\_e}) \right| < \varepsilon
$$
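A minimal NumPy sketch of this Gauss–Newton calibration is given below. It uses a numeric Jacobian in place of the analytic one and a fixed damping factor of 1; the parameter packing order, iteration limit, and step size `h` are choices of this sketch, not taken from the paper.

```python
import numpy as np

G = 9.8  # magnitude of local gravity; use 1.0 if acceleration is normalized

def residuals(params, raw):
    """e = |M (a - b)|^2 - g^2 for every static sample (raw: N x 3).
    params packs the symmetric matrix M and the bias b:
    [m_xx, m_xy, m_xz, m_yy, m_yz, m_zz, bx, by, bz]."""
    m_xx, m_xy, m_xz, m_yy, m_yz, m_zz, bx, by, bz = params
    M = np.array([[m_xx, m_xy, m_xz],
                  [m_xy, m_yy, m_yz],
                  [m_xz, m_yz, m_zz]])
    a_c = (raw - np.array([bx, by, bz])) @ M.T
    return (a_c ** 2).sum(axis=1) - G ** 2

def gauss_newton(raw, params0, iters=50, beta=1.0, eps=1e-9, h=1e-6):
    """Estimate the calibration parameters by Gauss-Newton iteration
    with a numeric Jacobian; beta is the damping control factor."""
    params = np.asarray(params0, dtype=float)
    for _ in range(iters):
        e = residuals(params, raw)
        if np.abs(e).max() < eps:      # convergence condition |e| < eps
            break
        J = np.empty((e.size, params.size))
        for j in range(params.size):   # finite-difference Jacobian column
            p = params.copy()
            p[j] += h
            J[:, j] = (residuals(p, raw) - e) / h
        # least-squares step d = -(J^T J)^-1 J^T e
        d = -np.linalg.lstsq(J, e, rcond=None)[0]
        params = params + beta * d
    return params
```

Starting from the identity scale matrix and zero bias, a few dozen static orientations are enough for the residuals to collapse toward zero.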
The external interference of a magnetic field can be divided into hard magnetic interference and soft magnetic interference. The hard magnetic interference would cause the measured data to deviate from the sphere origin as a whole. The soft magnetic interference would cause the shape of data distribution to change from a standard sphere to an ellipsoid. The error model of the magnetometer can be represented as follows:
$$
\begin{bmatrix} m_{x\_c} \\ m_{y\_c} \\ m_{z\_c} \end{bmatrix}
=
\begin{bmatrix} k_x & & \\ & k_y & \\ & & k_z \end{bmatrix}
\begin{bmatrix} m_x \\ m_y \\ m_z \end{bmatrix}
+
\begin{bmatrix} c_x \\ c_y \\ c_z \end{bmatrix}
$$
where $m_i\,(i = x, y, z)$ denotes the raw data of the magnetometer; $m_{i\_c}\,(i = x, y, z)$ the corrected magnetic field data; $c_i\,(i = x, y, z)$ the hard magnetic calibration parameters; and $k_i\,(i = x, y, z)$ the soft magnetic calibration parameters. The magnetometer should be rotated in all directions to find the maximum and minimum values of each axis. The geomagnetic calibration parameters can then be calculated as follows:
$$
c_x = -\frac{X_{max} + X_{min}}{2}, \quad
c_y = -\frac{Y_{max} + Y_{min}}{2}, \quad
c_z = -\frac{Z_{max} + Z_{min}}{2}
$$
$$
k_x = \frac{X_{max} - X_{min}}{X_{max} - X_{min}} = 1, \quad
k_y = \frac{X_{max} - X_{min}}{Y_{max} - Y_{min}}, \quad
k_z = \frac{X_{max} - X_{min}}{Z_{max} - Z_{min}}
$$
where $X_{max}$, $X_{min}$, $Y_{max}$, $Y_{min}$, $Z_{max}$, and $Z_{min}$ are the maximum and minimum values of the three axes.
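The min/max magnetometer calibration can be sketched as follows. This NumPy sketch folds the soft-iron scale into the offset term so that the model $m_c = k \cdot m + c$ centers the ellipsoid exactly; the function names are illustrative.

```python
import numpy as np

def calibrate_mag(samples):
    """Hard/soft-iron parameters from per-axis extremes (samples: N x 3)."""
    mx, mn = samples.max(axis=0), samples.min(axis=0)
    center = (mx + mn) / 2.0          # ellipsoid center (hard-iron offset)
    k = (mx[0] - mn[0]) / (mx - mn)   # soft-iron scale, x-axis as reference
    c = -k * center                   # offset folded with the scale factor
    return k, c

def correct_mag(raw, k, c):
    """Apply the error model m_c = k * m + c (element-wise)."""
    return k * raw + c
```

After correction, readings collected in all directions should lie on a sphere whose radius equals half the x-axis span, matching the "ellipsoid back to sphere" behavior described above.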

3.2. Sensor Fusion

Several methods are available for representing attitudes in space, such as Euler angles, direction cosines, and quaternions. The direction cosine method can be used to solve matrix transformations, but it requires extensive calculation. The Euler angle method is easy to understand, but it suffers from the gimbal lock problem. The quaternion method is more efficient and requires fewer calculations than the other two. Since the update of the quaternion depends on the previous state, the acceleration and geomagnetism data in the static state are used to calculate the initial quaternion attitude. The initial pitch angle $\theta_0$ and roll angle $\gamma_0$ can be calculated using the following equation:
$$
\theta_0 = \arcsin\left(a_y^n\right), \qquad
\gamma_0 = \arctan\left(\frac{a_x^n}{a_z^n}\right)
$$
Specifically, $[a_x^n \; a_y^n \; a_z^n]^T$ is the projection of the gravitational acceleration from the reference frame to the carrier frame. Then, the initial yaw angle can be calculated from $\theta_0$, $\gamma_0$, and the projected magnetic field data $[m_x^n \; m_y^n \; m_z^n]^T$ as follows:
$$
\begin{aligned}
m_y^n &= m_x^b \cos\gamma_0 + m_y^b \sin\gamma_0 \sin\theta_0 + m_z^b \sin\gamma_0 \cos\theta_0 \\
m_x^n &= m_y^b \cos\theta_0 - m_z^b \sin\theta_0 \\
\varphi_0 &= \arctan\left(\frac{m_y^n}{m_x^n}\right)
\end{aligned}
$$
Once $[\theta_0 \; \gamma_0 \; \varphi_0]^T$ is determined, the initial quaternion can be calculated using the following equations:
$$
\begin{aligned}
i\_q_0 &= \cos\frac{\gamma_0}{2}\cos\frac{\theta_0}{2}\cos\frac{\varphi_0}{2} - \sin\frac{\gamma_0}{2}\sin\frac{\theta_0}{2}\sin\frac{\varphi_0}{2} \\
i\_q_1 &= \cos\frac{\gamma_0}{2}\sin\frac{\theta_0}{2}\cos\frac{\varphi_0}{2} - \sin\frac{\gamma_0}{2}\cos\frac{\theta_0}{2}\sin\frac{\varphi_0}{2} \\
i\_q_2 &= \sin\frac{\gamma_0}{2}\cos\frac{\theta_0}{2}\cos\frac{\varphi_0}{2} + \cos\frac{\gamma_0}{2}\sin\frac{\theta_0}{2}\sin\frac{\varphi_0}{2} \\
i\_q_3 &= \cos\frac{\gamma_0}{2}\cos\frac{\theta_0}{2}\sin\frac{\varphi_0}{2} + \sin\frac{\gamma_0}{2}\sin\frac{\theta_0}{2}\cos\frac{\varphi_0}{2}
\end{aligned}
$$
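The initial-quaternion equations above transcribe directly into Python (the function name is illustrative); since each half-angle factor comes from a unit rotation, the result is always a unit quaternion.

```python
import math

def initial_quaternion(theta0, gamma0, phi0):
    """Initial quaternion from pitch (theta0), roll (gamma0), and
    yaw (phi0), following the half-angle product formulas above."""
    ct, st = math.cos(theta0 / 2), math.sin(theta0 / 2)
    cg, sg = math.cos(gamma0 / 2), math.sin(gamma0 / 2)
    cp, sp = math.cos(phi0 / 2), math.sin(phi0 / 2)
    q0 = cg * ct * cp - sg * st * sp
    q1 = cg * st * cp - sg * ct * sp
    q2 = sg * ct * cp + cg * st * sp
    q3 = cg * ct * sp + sg * st * cp
    return q0, q1, q2, q3
```

With all three angles zero, the identity attitude (1, 0, 0, 0) is recovered, which is the expected starting state for a sensor lying flat and at rest.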
As shown by the calculation of the initial Euler angles, the yaw, pitch, and roll angles can be calculated directly from the acceleration, angular velocity, and magnetic field data in the static state. However, when the sensor is in motion, extra linear acceleration is introduced, resulting in an inaccurate estimate of the sensor’s attitude. In addition, integral drift still exists when the angular velocity is used to calculate the attitude after sensor calibration. In this paper, a complementary filter algorithm is applied to solve the multi-sensor fusion problem [28,29]. The essence of the multi-sensor fusion is to use the proportional–integral (PI) method to calculate the error between the estimated and measured values of acceleration and geomagnetism, and to apply this error to correct the updated quaternion. The sensor fusion process is shown in Figure 7.
$[e_{ax} \; e_{ay} \; e_{az}]^T$ is defined as the error derived from the acceleration data in carrier coordinates; it can be expressed as follows:
$$
\begin{bmatrix} e_{ax} \\ e_{ay} \\ e_{az} \end{bmatrix}
= \mathbf{a} \times \mathbf{v}
= \begin{vmatrix} \mathbf{i} & \mathbf{j} & \mathbf{k} \\ a_x & a_y & a_z \\ v_x & v_y & v_z \end{vmatrix}
= \begin{bmatrix} a_y v_z - a_z v_y \\ a_z v_x - a_x v_z \\ a_x v_y - a_y v_x \end{bmatrix}
$$
where $[v_x \; v_y \; v_z]^T$ is the estimated acceleration in carrier coordinates.
$[e_{mx} \; e_{my} \; e_{mz}]^T$ is defined as the error derived from the geomagnetic field data in carrier coordinates; it can be expressed as follows:
$$
\begin{bmatrix} e_{mx} \\ e_{my} \\ e_{mz} \end{bmatrix}
= \mathbf{m} \times \mathbf{n}
= \begin{vmatrix} \mathbf{i} & \mathbf{j} & \mathbf{k} \\ m_x & m_y & m_z \\ n_x & n_y & n_z \end{vmatrix}
= \begin{bmatrix} m_y n_z - m_z n_y \\ m_z n_x - m_x n_z \\ m_x n_y - m_y n_x \end{bmatrix}
$$
where $[n_x \; n_y \; n_z]^T$ is the estimated geomagnetic field in carrier coordinates.
Then, the total error can be defined as:
$$
\begin{bmatrix} e_x \\ e_y \\ e_z \end{bmatrix}
= \begin{bmatrix} e_{ax} \\ e_{ay} \\ e_{az} \end{bmatrix}
+ \begin{bmatrix} e_{mx} \\ e_{my} \\ e_{mz} \end{bmatrix}
= \begin{bmatrix}
a_y v_z - a_z v_y + m_y n_z - m_z n_y \\
a_z v_x - a_x v_z + m_z n_x - m_x n_z \\
a_x v_y - a_y v_x + m_x n_y - m_y n_x
\end{bmatrix}
$$
The angular velocity can then be corrected using the following equations:
$$
\begin{bmatrix} e_{xi} \\ e_{yi} \\ e_{zi} \end{bmatrix}
= \begin{bmatrix} e_{xi} \\ e_{yi} \\ e_{zi} \end{bmatrix}
+ K_i \cdot \begin{bmatrix} e_x \\ e_y \\ e_z \end{bmatrix}, \qquad
\begin{bmatrix} \omega_x' \\ \omega_y' \\ \omega_z' \end{bmatrix}
= \begin{bmatrix} \omega_x \\ \omega_y \\ \omega_z \end{bmatrix}
+ K_p \cdot \begin{bmatrix} e_x \\ e_y \\ e_z \end{bmatrix}
+ \begin{bmatrix} e_{xi} \\ e_{yi} \\ e_{zi} \end{bmatrix}
$$
where $[\omega_x \; \omega_y \; \omega_z]^T$ is the measured angular velocity from the gyroscope, and $[\omega_x' \; \omega_y' \; \omega_z']^T$ is the corrected angular velocity used to update the quaternion. The updated quaternion attitude $[q_0 \; q_1 \; q_2 \; q_3]^T$ can be represented as follows:
$$
\begin{bmatrix} q_0 \\ q_1 \\ q_2 \\ q_3 \end{bmatrix}
= \begin{bmatrix} q_0 \\ q_1 \\ q_2 \\ q_3 \end{bmatrix}
+ \begin{bmatrix}
- q_1 \omega_x' - q_2 \omega_y' - q_3 \omega_z' \\
\phantom{-} q_0 \omega_x' + q_2 \omega_z' - q_3 \omega_y' \\
\phantom{-} q_0 \omega_y' - q_1 \omega_z' + q_3 \omega_x' \\
\phantom{-} q_0 \omega_z' + q_1 \omega_y' - q_2 \omega_x'
\end{bmatrix} \cdot \frac{T}{2}
$$
where $T$ is the sampling period.
Finally, the Euler angle can be obtained using the following equations:
$$
\begin{aligned}
\theta &= \arcsin\left(2\left(q_2 q_3 + q_0 q_1\right)\right) \\
\gamma &= \arctan\left(\frac{-2\left(q_1 q_3 - q_0 q_2\right)}{q_0^2 - q_1^2 - q_2^2 + q_3^2}\right) \\
\varphi &= \arctan\left(\frac{-2\left(q_1 q_2 - q_0 q_3\right)}{q_0^2 - q_1^2 + q_2^2 - q_3^2}\right)
\end{aligned}
$$
where $\theta$, $\gamma$, and $\varphi$ are the pitch, roll, and yaw angles of a single IMMU sensor.
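One full fusion step (error computation, PI correction, and quaternion update) can be sketched as below. Only the gravity error term is shown; the magnetometer term is formed the same way from the estimated field. The gains, function name, and the choice to renormalize the quaternion after each step are details of this sketch, not prescriptions from the paper.

```python
import numpy as np

def complementary_step(q, gyro, acc, e_int, kp, ki, dt):
    """One complementary-filter update.

    q: current attitude (q0, q1, q2, q3); gyro in rad/s; acc is the
    normalized body-frame acceleration; e_int is the accumulated
    integral error (3-vector). Returns the updated (q, e_int).
    """
    q0, q1, q2, q3 = q
    # estimated gravity direction in the carrier frame, from the quaternion
    v = np.array([2 * (q1 * q3 - q0 * q2),
                  2 * (q0 * q1 + q2 * q3),
                  q0 * q0 - q1 * q1 - q2 * q2 + q3 * q3])
    # error between measured and estimated gravity: e_a = a x v
    e = np.cross(acc, v)
    # PI correction of the measured angular velocity
    e_int = e_int + ki * e
    omega = gyro + kp * e + e_int
    wx, wy, wz = omega
    # first-order quaternion integration over the sampling period dt
    dq = 0.5 * dt * np.array([-q1 * wx - q2 * wy - q3 * wz,
                               q0 * wx + q2 * wz - q3 * wy,
                               q0 * wy - q1 * wz + q3 * wx,
                               q0 * wz + q1 * wy - q2 * wx])
    qn = np.asarray(q, dtype=float) + dq
    return qn / np.linalg.norm(qn), e_int
```

When the measured gravity already agrees with the estimate and the gyroscope reads zero, the attitude is left unchanged, which is the expected fixed point of the filter.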

4. Experimental Results of Sensor Calibration and Fusion

Since all the IMMU sensors on the data glove are of the same model (MPU-9250), one of these sensors is used to demonstrate the experimental results of sensor calibration and fusion.
The comparisons of the angular velocity data before and after calibration are shown in Figure 8. It can be seen that the zero bias of the gyroscope is reduced. However, the integral drift of the gyroscope cannot be fully eliminated over a long period of time, so it must be corrected by fusing the acceleration and geomagnetism data.
The comparisons of normalized acceleration data before and after calibration are shown in Figure 9. The theoretical value of normalized gravitational acceleration in the static state should be equal to 1. The accuracy of local gravitational acceleration is improved after calibration.
The geomagnetic field data should lie on the sphere equidistant from the center. The 3D scatter plots of the geomagnetic field data are shown in Figure 10. The comparisons of the geomagnetic field data on X–Y, X–Z, and Y–Z planes are shown in Figure 11. It can be seen that the data are distributed in a nearly elliptical shape before calibration, and in a nearly standard spherical shape after calibration.
As discussed earlier, the attitude angles of the inertial sensor can be calculated using the complementary filter algorithm. The attitude angles of the IMMU sensor in the static state are shown in Figure 12. The static error of the attitude angles is smaller than 1°, which satisfies the application requirements of our data glove system.
The commercial HCM365B E-compass is used to evaluate the dynamic performance of the IMMU sensor, as shown in Figure 13a. The output attitude angle curves of the HCM365B and the IMMU sensor are very close as the roll angle changes from 0° to 200°.

5. Human–Computer Interaction Using Data Glove

Real-time human–computer interaction is also a required function for patients while doing rehabilitation exercises at home. A series of virtual scenes needs to be established by the human–computer interaction module, including a virtual hand, movable virtual items, lights, and a background. After the virtual scenes are built, all the 12 IMMU sensors will be mapped to the joints of the virtual hand by converting sensor attitude angles into finger rotation angles. In this paper, a virtual hand is established in the Unity platform, as shown in Figure 14.
The human–computer interaction module includes the following three parts: (1) Virtual scene design: The virtual hand should be established according to the skeletal structure of the human hand. Some virtual items are also provided for practicing hand-grasping motions. (2) Sensor–joint mapping: The 12 IMMU sensors need to be mapped to the corresponding joints. (3) Scripts: The scripts include finding joint objects in the hierarchical view, mapping sensors to model joints, receiving attitude angles, and displaying animations. The bending angles of the joints of the virtual hand are limited to a range that fits daily human hand motions. A schematic diagram showing the bending angles of different joints is provided in Figure 15.
To validate the joint angles obtained by the data glove, a digital goniometer is used to measure the angle, as shown in Figure 16. One problem with using such a steel goniometer is that the magnetometer inside the IMMU sensor may be disturbed by nearby ferromagnetic materials. Therefore, the digital goniometer should be placed on the finger only after the angle data have been exported from the data glove.
The comparison of the measurement results is presented in Table 1. The angles measured by the goniometer and the data glove are recorded with the index finger held in four different positions. The average error rate is less than 2% and the maximum deviation is about 1.4°. This accuracy is sufficient for most wearable applications.
Hand rehabilitation assessment, for example by the total active range of motion (TAM) method, requires the bending angles of the joints, which can be calculated from the attitude angles of the sensors on the data glove. Since the DIP joint has no sensor placed on it, its bending angle is empirically set to 1/3 that of the PIP joint. The relationship between the bending angles of the joints and the attitude angles of the sensors is presented in Table 2.
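As a sketch of how bending angles could be derived for one finger, the snippet below assumes a simplified version of the sensor-difference mapping (the actual mapping is given in Table 2) together with the empirical DIP = PIP/3 rule; the function name and the TAM sum are illustrative.

```python
def finger_joint_angles(s_mp, s_pip):
    """Bending angles of one finger from sensor x-axis attitude angles.

    Simplified, hypothetical mapping: the MP bend equals the proximal
    sensor's attitude angle s_mp, the PIP bend is the difference between
    the middle-phalange sensor s_pip and s_mp, and the unsensed DIP bend
    is empirically 1/3 of the PIP bend.
    """
    mp = s_mp
    pip = s_pip - s_mp
    dip = pip / 3.0
    tam = mp + pip + dip  # total active range of motion for this finger
    return mp, pip, dip, tam
```

For instance, attitude angles of 30° and 90° on the two sensors give MP, PIP, and DIP bends of 30°, 60°, and 20° and a TAM of 110°.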
Specifically, $S_i\,(i = 1, 2, \ldots, 12)$ represents the attitude angle of each IMMU sensor around the x-axis. The experimental results of three hand gestures showing the numbers "2", "5", and "10" are provided in Figure 17. The hand gestures are captured and reconstructed with the virtual hand in Unity with high accuracy. The gesture data can be saved and exported by physicians for off-line diagnosis, as shown in Table 3.
The dynamic ball-grasping experiment is demonstrated in Figure 18. The initial positions of the virtual ball and hand are set so that their distance equals the distance between the physical ball and hand. The subject needs to pick up the ball and flip the hand several times. The changes in hand posture can be reviewed from the data of Sensor 8 on the back of the hand, as shown in Figure 19. The recovery status can be estimated from the recorded data, for example, the time taken for an action, the accuracy of an action, and the degree of hand tremor.
Four data glove systems are compared with our proposed IMMU data glove, as listed in Table 4. Both optical fiber sensor-based and IMMU sensor-based data glove systems can provide good performance in measuring joint angles. However, the cost of IMMU sensors is much lower than that of optical fiber sensors, which matters for mass-market wearable applications.

6. Conclusions

This paper presents a novel data glove system to assess hand function during hand rehabilitation. Twelve nine-axis inertial sensors are integrated in a data glove to obtain the angles of hand joints. The error models of the accelerometer, gyroscope, and magnetometer are analyzed. The least-squares principle is used to solve for the correction parameters of the accelerometer. The validity of the calibration process is verified by comparing the raw and calibrated sensor data. A quaternion-based complementary filtering algorithm is applied to fuse the acceleration, angular velocity, and geomagnetism data. A commercial E-compass is also used to verify the stability and dynamic performance of our fusion algorithm.
Real-time and high-precision human–computer interaction is realized with the data glove and Unity. A virtual hand model, a ball model, and some virtual scenes are established in Unity, and then the attitude angles of 12 sensors are transformed into the rotation angles of hand joints. The experimental results show that the glove system can effectively measure hand postures and joint angles in motion, and it is expected to be used to develop a variety of rehabilitation scenarios to get patients more engaged in the long-term rehabilitation process.

Author Contributions

F.F. and D.Y. conceptualized the project; F.F. and G.Z. designed the methodology; S.X. and X.X. worked with the software; K.Y. and G.Z. performed validation; C.W. made the formal analysis; F.F. conducted investigation; F.F. provided the resources; F.F. performed data curation; F.F. wrote the original manuscript; D.Y. reviewed and edited the manuscript; S.X. performed visualization; F.F. supervised and administered the project; F.F. acquired the funding. All authors have read and agreed to the published version of the manuscript.

Funding

This research is funded by the National Natural Science Foundation of China under Grant Nos. 61501226, 61803201, and 61673278, the China Postdoctoral Science Foundation under Grant No. 2019M661686, Science and Technology Innovation Commission of Shenzhen Municipality Projects under Grant No. JCYJ20190808181803703, and the Fundamental Research Funds of Nanjing University of Aeronautics and Astronautics under Grant No. NT2020008.

Data Availability Statement

The data presented in this study are available in the article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Platz, T.; Pinkowski, C.; van Wijck, F.; Kim, I.-H.; Di Bella, P.; Johnson, G. Reliability and validity of arm function assessment with standardized guidelines for the Fugl-Meyer Test, Action Research Arm Test and Box and Block Test: A multicentre study. Clin. Rehabil. 2005, 19, 404–411.
  2. Yozbatiran, N.; Der-Yeghiaian, L.; Cramer, S.C. A standardized approach to performing the action research arm test. Neurorehabilit. Neural Repair 2008, 22, 78–90.
  3. Lyle, R.C. A performance test for assessment of upper limb function in physical rehabilitation treatment and research. Int. J. Rehabil. Res. 1981, 4, 483–492.
  4. Sears, E.D.; Chung, K.C. Validity and responsiveness of the Jebsen–Taylor hand function test. J. Hand Surg. 2010, 35, 30–37.
  5. Nijland, R.; van Wegen, E.; Verbunt, J.; van Wijk, R.; van Kordelaar, J.; Kwakkel, G. A comparison of two validated tests for upper limb function after stroke: The Wolf Motor Function Test and the Action Research Arm Test. J. Rehabil. Med. 2010, 42, 694–696.
  6. Ren, Z.; Meng, J.; Yuan, J. Depth camera based hand gesture recognition and its applications in human-computer-interaction. In Proceedings of the 2011 8th International Conference on Information, Communications & Signal Processing, Singapore, 13–16 December 2011; pp. 1–5.
  7. Li, Z.; Jarvis, R. Real time hand gesture recognition using a range camera. In Proceedings of the Australasian Conference on Robotics and Automation, Sydney, Australia, 2–4 December 2009; pp. 21–27.
  8. Wachs, J.P.; Kölsch, M.; Stern, H.; Edan, Y. Vision-based hand-gesture applications. Commun. ACM 2011, 54, 60–71.
  9. Skaria, S.; Huang, D.; Al-Hourani, A.; Evans, R.J.; Lech, M. Deep-Learning for Hand-Gesture Recognition with Simultaneous Thermal and Radar Sensors. In Proceedings of the 2020 IEEE Sensors, Rotterdam, The Netherlands, 25–28 October 2020; pp. 1–4.
  10. Fan, T.; Ma, C.; Gu, Z.; Lv, Q.; Chen, J.; Ye, D.; Huangfu, J.; Sun, Y.; Li, C.; Ran, L. Wireless hand gesture recognition based on continuous-wave Doppler radar sensors. IEEE Trans. Microw. Theory Tech. 2016, 64, 4012–4020.
  11. Sturman, D.J.; Zeltzer, D. A survey of glove-based input. IEEE Comput. Graph. Appl. 1994, 14, 30–39.
  12. Dipietro, L.; Sabatini, A.M.; Dario, P. A survey of glove-based systems and their applications. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2008, 38, 461–482.
  13. Blake, J.; Gurocak, H.B. Haptic glove with MR brakes for virtual reality. IEEE/ASME Trans. Mechatron. 2009, 14, 606–615.
  14. Ma, Z.; Ben-Tzvi, P. Design and optimization of a five-finger haptic glove mechanism. J. Mech. Robot. 2015, 7, 041008.
  15. Chiri, A.; Vitiello, N.; Giovacchini, F.; Roccella, S.; Vecchi, F.; Carrozza, M.C. Mechatronic design and characterization of the index finger module of a hand exoskeleton for post-stroke rehabilitation. IEEE/ASME Trans. Mechatron. 2011, 17, 884–894.
  16. Gu, X.; Zhang, Y.; Sun, W.; Bian, Y.; Zhou, D.; Kristensson, P.O. Dexmo: An inexpensive and lightweight mechanical exoskeleton for motion capture and force feedback in VR. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 1991–1995.
  17. Tarchanidis, K.N.; Lygouras, J.N. Data glove with a force sensor. IEEE Trans. Instrum. Meas. 2003, 52, 984–989.
  18. Tognetti, A.; Carbonaro, N.; Zupone, G.; De Rossi, D. Characterization of a novel data glove based on textile integrated sensors. In Proceedings of the 2006 International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY, USA, 30 August–3 September 2006; pp. 2510–2513.
  19. Shen, Z.; Yi, J.; Li, X.; Lo, M.H.P.; Chen, M.Z.; Hu, Y.; Wang, Z. A soft stretchable bending sensor and data glove applications. Robot. Biomim. 2016, 3, 1–8.
  20. Fujiwara, E.; Ferreira, M.; dos Santos, M.; Suzuki, C.K. Flexible Optical Fiber Bending Transducer for Application in Glove-Based Sensors. IEEE Sens. J. 2014, 14, 3631–3636.
  21. Da Silva, A.F.; Gonçalves, A.F.; Mendes, P.M.; Correia, J.H. FBG sensing glove for monitoring hand posture. IEEE Sens. J. 2011, 11, 2442–2448.
  22. Lin, B.-S.; Hsiao, P.-C.; Yang, S.-Y.; Su, C.-S.; Lee, I.-J. Data glove system embedded with inertial measurement units for hand function evaluation in stroke patients. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 2204–2213.
  23. Lin, B.-S.; Lee, I.; Yang, S.-Y.; Lo, Y.-C.; Lee, J.; Chen, J.-L. Design of an inertial-sensor-based data glove for hand function evaluation. Sensors 2018, 18, 1545.
  24. Choi, Y.; Yoo, K.; Kang, S.J.; Seo, B.; Kim, S.K. Development of a low-cost wearable sensing glove with multiple inertial sensors and a light and fast orientation estimation algorithm. J. Supercomput. 2018, 74, 3639–3652. [Google Scholar] [CrossRef]
  25. Liu, Q.; Qian, G.; Meng, W.; Ai, Q.; Yin, C.; Fang, Z. A new IMMU-based data glove for hand motion capture with optimized sensor layout. Int. J. Intell. Robot. Appl. 2019, 3, 19–32. [Google Scholar] [CrossRef]
26. Wu, G.; van der Helm, F.C.; Veeger, H.E.; Makhsous, M.; Van Roy, P.; Anglin, C.; Nagels, J.; Karduna, A.R.; McQuade, K.; Wang, X.; et al. ISB recommendation on definitions of joint coordinate systems of various joints for the reporting of human joint motion, Part II: Shoulder, elbow, wrist and hand. J. Biomech. 2005, 38, 981–992. [Google Scholar] [CrossRef] [PubMed]
  27. Frosio, I.; Pedersini, F.; Borghese, N.A. Autocalibration of MEMS accelerometers. IEEE Trans. Instrum. Meas. 2008, 58, 2034–2041. [Google Scholar] [CrossRef]
  28. Euston, M.; Coote, P.; Mahony, R.; Kim, J.; Hamel, T. A complementary filter for attitude estimation of a fixed-wing UAV. In Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, 22–26 September 2008; pp. 340–345. [Google Scholar]
  29. Madgwick, S. An efficient orientation filter for inertial and inertial/magnetic sensor arrays. Rep. x-io Univ. Bristol (UK) 2010, 25, 113–118. [Google Scholar]
  30. Cha, Y.; Seo, J.; Kim, J.-S.; Park, J.-M. Human–computer interface glove using flexible piezoelectric sensors. Smart Mater. Struct. 2017, 26, 057002. [Google Scholar] [CrossRef]
  31. Li, K.; Chen, I.M.; Yeo, S.H.; Lim, C.K. Development of finger-motion capturing device based on optical linear encoder. J. Rehabil. Res. Dev. 2011, 48, 69–82. [Google Scholar] [CrossRef] [PubMed]
  32. Kortier, H.G.; Sluiter, V.I.; Roetenberg, D.; Veltink, P.H. Assessment of hand kinematics using inertial and magnetic sensors. J. Neuroeng. Rehabil. 2014, 11, 1–15. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Workflow of the proposed data glove system for hand function evaluation.
Figure 2. Hardware design of the proposed data glove system.
Figure 3. Structure of hand bones and joints.
Figure 4. The 1st generation data glove; (a) layout of inertial and magnetic unit (IMMU) sensors; (b) prototype connected with soft wires.
Figure 5. The 2nd generation data glove. (a) Flexible PCB and mounted IMMU sensors; (b) prototype coated with a silicone protective layer.
Figure 6. Definition of the orientation of IMMU sensors on the glove of the left hand.
Figure 7. Flowchart of sensor fusion using a complementary filter algorithm.
Figure 8. Comparisons of static gyroscope data before and after calibration. (a) X-axis angular velocity; (b) y-axis angular velocity; (c) z-axis angular velocity.
Figure 9. Comparisons of local gravitational acceleration before and after calibration.
Figure 10. Comparisons of geomagnetic field data distribution before and after calibration. (a) X–Y plane; (b) X–Z plane; (c) Y–Z plane; (d) X–Y–Z 3D view.
Figure 11. Comparisons of geomagnetic field data before and after calibration.
Figure 12. Pitch, roll, and yaw angles in the static state (when pitch angle is set to 30°).
Figure 13. Comparisons of measured dynamic attitude angles between MPU-9250 and HCM365B. (a) HCM365B E-compass and MPU-9250 IMMU sensor; (b) dynamic attitude angle changes in motion.
Figure 14. Human hand modeling in the Unity platform. (a) Hierarchy of joints; (b) hand skinned mesh; (c) virtual hand model.
Figure 15. Schematic side view of the bending angles of different joints.
Figure 16. Comparison of the measured angle between data glove and digital goniometer.
Figure 17. Gestures reproduced by the virtual hand in Unity. (a) Gesture “2”; (b) gesture “5”; (c) gesture “10”.
Figure 18. Interactions with a ball using the proposed data glove. (a) Locating the ball; (b) grasping the ball; (c) flipping the hand.
Figure 19. Hand posture changes during ball grasping experiment.
Table 1. Measurement results of goniometer and data glove.
Angle from goniometer: 112.10°, 91.20°, 80.52°, 69.65°
Average angle from data glove: 110.28°, 89.55°, 79.66°, 68.30°
Error rate: 1.6%, 1.8%, 1.1%, 1.9%
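The error rates in Table 1 are the relative deviations of the averaged glove readings from the goniometer reference. A minimal sketch of the calculation, using the values reported in Table 1:

```python
# Reference (goniometer) and averaged glove angles from Table 1, in degrees.
gonio = [112.10, 91.20, 80.52, 69.65]
glove = [110.28, 89.55, 79.66, 68.30]

def error_rate(ref, meas):
    """Relative error of a measurement against its reference, as a fraction."""
    return abs(ref - meas) / ref

# Express as percentages rounded to one decimal place.
rates = [round(100 * error_rate(r, m), 1) for r, m in zip(gonio, glove)]
print(rates)  # → [1.6, 1.8, 1.1, 1.9], matching Table 1
```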
Table 2. Calculation of bending angles of joints using sensor attitudes.
Joint (angle ID): sensor pair used to compute the bending angle
Thumb IP joint (θ_IP_Thumb): S1 and S2
Thumb MP joint (θ_MP_Thumb): S3 and S8
Index finger PIP joint (θ_PIP_Index): S4 and S5
Index finger MP joint (θ_MP_Index): S5 and S8
Middle finger PIP joint (θ_PIP_Middle): S6 and S7
Middle finger MP joint (θ_MP_Middle): S7 and S8
Ring finger PIP joint (θ_PIP_Ring): S9 and S10
Ring finger MP joint (θ_MP_Ring): S10 and S8
Little finger PIP joint (θ_PIP_Little): S11 and S12
Little finger MP joint (θ_MP_Little): S12 and S8
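Each bending angle in Table 2 is derived from the attitudes of the two IMMUs spanning the joint. As a simplified, hypothetical illustration, flexion about a single axis can be treated as the difference between the two sensors' pitch angles; the actual system fuses full nine-axis data with the complementary filter before computing relative orientation. The pitch values below are illustrative, not measured:

```python
def bending_angle(pitch_distal_deg, pitch_proximal_deg):
    """Approximate a joint's flexion as the difference between the pitch
    angles of the distal and proximal IMMUs spanning the joint.
    Simplified 1-DOF sketch: the glove itself estimates each sensor's
    attitude from accelerometer, gyroscope, and magnetometer data."""
    return pitch_distal_deg - pitch_proximal_deg

# Index finger PIP joint from sensors S4 (distal) and S5 (proximal),
# per the sensor pairing in Table 2, with made-up pitch values:
theta_pip_index = bending_angle(75.0, 10.0)  # 65.0 degrees of flexion
```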
Table 3. Bending angles of joints with three gestures.
Gesture: Index finger (θ_PIPJ, θ_MPJ); Middle finger (θ_PIPJ, θ_MPJ); Ring finger (θ_PIPJ, θ_MPJ); Little finger (θ_PIPJ, θ_MPJ)
"2": Index (9.2°, 7.9°); Middle (2.4°, 24.8°); Ring (109.2°, 49.4°); Little (64.4°, 90.7°)
"5": Index (5.5°, 2.2°); Middle (16.4°, 3.9°); Ring (9.5°, 10.5°); Little (24.3°, 20.6°)
"10": Index (110.3°, 70.2°); Middle (108.2°, 67.5°); Ring (148.6°, 62.9°); Little (68.9°, 77.3°)
Table 4. Comparison of proposed data glove and other data glove systems.
Publication: type of sensor; number of sensors; deviation of joint angle
Cha et al. [30]: flexible piezoelectric sensor; 19 (one hand); n/a
da Silva et al. [21]: fiber Bragg grating sensor; 1 (each finger); n/a
Li et al. [31]: optical linear encoder; 3 (each finger); n/a
Kortier et al. [32]: IMMU; 3 (each finger); 1.1°
Proposed glove system: IMMU; 12 (one hand); 1.4°