
Estimation of Relative Hand-Finger Orientation Using a Small IMU Configuration

1 Department of Biomedical Signals and Systems, Technical Medical Centre, University of Twente, 7500 AE Enschede, The Netherlands
2 School of Marine Science and Technology, Northwestern Polytechnical University, Xi’an 710072, China
* Author to whom correspondence should be addressed.
Sensors 2020, 20(14), 4008; https://doi.org/10.3390/s20144008
Submission received: 30 June 2020 / Revised: 14 July 2020 / Accepted: 17 July 2020 / Published: 19 July 2020

Abstract

Relative orientation estimation between the hand and its fingers is important in many applications, such as virtual reality (VR), augmented reality (AR) and rehabilitation. Estimating this orientation using only inertial measurement units (IMUs) remains challenging because of the integration drift that occurs in most approaches. When the hand is used functionally, there are many instances in which the hand and finger tips move together, experiencing almost the same angular velocities and, in some of these cases, almost the same accelerations, measured in different 3D coordinate systems. We therefore hypothesize that relative orientations between the hand and the finger tips can be adequately estimated using 3D IMUs during such designated events (DEs) and in between these events. We fused this extra information from the DEs with the IMU data using an extended Kalman filter (EKF). Our results show that errors in relative orientation can be smaller than five degrees if DEs are constantly present and the linear and angular movements of the whole hand are adequately rich. When DEs are only partially available, as in a functional water-drinking task, the orientation error remains smaller than 10 degrees.

1. Introduction

Hand-finger movement tracking is useful in many areas, such as virtual reality (VR), augmented reality (AR), ergonomic assessment and, especially, medical applications [1,2,3,4,5]. People who have suffered a stroke or spinal cord injury need effective rehabilitation therapy to recover body functions, including hand function. In the hospital, therapists evaluate hand function with traditional assessments such as the Fugl–Meyer or Jebsen–Taylor hand function tests [6,7]. The results of such assessments may be subjective and therapist-dependent. It is therefore essential to provide a quantitative and interpretable measurement that makes the therapist’s diagnosis more objective.

Several sensing systems can be used to trace hand motion; they can be categorized as camera-based, glove-based, magnetic-actuator-based and inertial measurement unit (IMU)-based. Camera-based systems come in two types. The first uses high-speed cameras to trace markers attached to body segments; it is quite accurate and often used as the reference [8]. However, occlusion degrades its accuracy, and the distance between the cameras and the hands must stay below a few meters to measure hand and finger orientations accurately; mitigating occlusion typically requires many cameras (6 to 12). The second type traces objects, including their orientations, by exploiting depth maps to reconstruct the object [9,10]. Its advantage is that no finger or hand attachments are needed, which makes it user-friendly. However, it also suffers from occlusion and only allows hand movements to be evaluated in the vicinity of the cameras [11,12]; in addition, it requires a powerful processor for the image processing.

Glove-based systems place sensors, such as resistive-bend or optical-fiber sensors, on a glove and transduce finger movement into signals from which the relative orientations between hand and finger segments are estimated [13,14]. They are inexpensive, but the glove must be well attached and thoroughly calibrated before use. Magnetic-actuator-based systems also come in two types, active and passive. The active type deploys magnetic actuators on the finger tips and receivers on the dorsal side of the hand [15]. It is accurate and has no occlusion problem, but it requires a different frequency for each degree of freedom (DoF) of the actuator, which typically demands multiple power signal sources and a high-speed processor, increasing the complexity and physical dimensions of the system. The passive type uses magnets as sources: magnets are placed on the finger tips while magnetic sensors are worn on the wrist [16]. It has a simple structure and a low cost, but it is difficult to separate the fields of the different magnets, since only the sum of the fields is measured, especially when the magnets come close together.

Finally, IMU-based systems use inertial sensors to trace the hand [17,18,19]. Compared with the previous methods, they provide raw angular velocity and acceleration data, from which orientation can be estimated by sensor fusion. This estimate drifts, because it involves integration, but the drift can be compensated using magnetometer data, which are easily accessible since magnetometers are often embedded in IMUs.
However, such solutions rely on magnetometers and are therefore vulnerable to external magnetic disturbances, such as those caused by iron objects in indoor environments [20,21]. Teufl et al. and Seel et al. proposed magnetometer-free methods for joint angle estimation [22,23,24,25]. However, such methods assume that the joint rotation is restricted to two DoFs because of anatomical constraints [23]; they therefore cannot be applied to more flexible joints, such as the metacarpophalangeal (MCP) joint of the thumb. Relative orientations between the hand and its fingers are important for reconstructing hand-finger movement, which is essential information for AR, VR and rehabilitation.
Our goal is to estimate relative 3D orientations between the finger tips and the dorsal side of the hand using only IMUs, avoiding magnetic disturbances altogether by not using magnetometers. To reduce the integration drift, we exploit information available during daily-life movements rather than biomechanical constraints. This information is based on the assumption that there are many instances in which the hand and finger tips move together, experiencing almost the same angular velocities and accelerations, represented in different 3D coordinate systems. The method was verified with a small sensor configuration: one sensor on the dorsal side of the hand and one on the most distal segment of each finger of interest.

2. Methods

To estimate the relative orientation, the information from the gyroscopes and accelerometers and the extra information available during DEs need to be combined in an optimal way. Therefore, an extended Kalman filter (EKF) was used to estimate the 3D relative orientation between the dorsal side of the hand and the finger tips, under the assumption that during a DE the angular velocities and accelerations are the same but represented in different coordinate systems. The process model is based on integrating the relative angular velocity; the measurement model is mainly based on the information available during the DE. The quality of the DE is reflected in the measurement variance: when a DE is present and the variance is small, we trust the measurement model more; otherwise, we trust the process model more. In this way, the information from the process and measurement models is optimally fused to estimate relative 3D orientations during functional hand and finger movements.

2.1. Sensor Model

The gain error and non-orthogonality error are assumed to be time-invariant and can be obtained through sensor calibration; the outputs of the calibrated gyroscopes can thus be expressed as

$$y_{gyr,h}^{h} = \omega_h^h + b_h + \zeta_h, \qquad y_{gyr,f}^{f} = \omega_f^f + b_f + \zeta_f \tag{1}$$

where $y_{gyr,h}^{h}$ and $y_{gyr,f}^{f}$ are the gyroscope outputs on the hand and finger tip, each expressed in its own frame, $b_x$ ($x = h, f$) is a slowly varying offset and $\zeta_x$ ($x = h, f$) is Gaussian noise.
For the calibrated accelerometer, the outputs on the hand and finger tips are
$$y_{acc,h}^{h} = a_h^h + g^h + \eta_h^h, \qquad y_{acc,f}^{f} = a_f^f + g^f + \eta_f^f \tag{2}$$

where $g$ is gravity and $\eta_h^h$ and $\eta_f^f$ are Gaussian noise.

2.2. Process Model

The process model is based on integrating the relative angular velocity between the hand and its fingers, expressed in its own frame. We choose the quaternion $q_{hf} = \begin{bmatrix} q_0 & q_1 & q_2 & q_3 \end{bmatrix}^T$, which expresses the relative orientation from a finger tip to the dorsal side of the hand, as the state vector $x = q_{hf}$. The relative orientation $x_k$ is updated as

$$x_k = x_{k-1} \otimes \begin{bmatrix} 1 \\ \tfrac{1}{2}\,\omega_k\, dt \end{bmatrix} + m \tag{3}$$

where $m$ is the process error, $\omega_k$ is the relative angular velocity between the hand and fingers, and $\otimes$ denotes quaternion multiplication:

$$\omega_k = (\omega_h^h)_k - x_{k-1} \otimes (\omega_f^f)_k \otimes x_{k-1}^* \tag{4}$$

where $\omega_h^h$ and $\omega_f^f$ are the hand and finger angular velocities.
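To make the prediction step concrete, the following Python sketch implements one step of Equations (3) and (4). It is our illustration, not the authors' implementation; a Hamilton quaternion convention with components ordered [w, x, y, z] is assumed, and all function names are hypothetical.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product p ⊗ q for quaternions ordered [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([pw*qw - px*qx - py*qy - pz*qz,
                     pw*qx + px*qw + py*qz - pz*qy,
                     pw*qy - px*qz + py*qw + pz*qx,
                     pw*qz + px*qy - py*qx + pz*qw])

def quat_conj(q):
    """Conjugate q*; equals the inverse for unit quaternions."""
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def process_step(x_prev, gyr_h, gyr_f, dt):
    """One prediction step of the process model (Equations (3) and (4))."""
    # Rotate the finger-frame angular velocity into the hand frame and
    # subtract it from the hand angular velocity, Equation (4).
    w_f_in_h = quat_mult(quat_mult(x_prev, np.r_[0.0, gyr_f]),
                         quat_conj(x_prev))[1:]
    w_rel = gyr_h - w_f_in_h
    # First-order quaternion integration of the relative rate, Equation (3).
    x = quat_mult(x_prev, np.r_[1.0, 0.5 * dt * w_rel])
    return x / np.linalg.norm(x)  # re-normalize to stay on the unit sphere
```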

2.3. Measurement Model

The measurement update of the EKF is based on the DE. During a DE, the hand and fingers share the same angular velocity, expressed in different coordinate frames:

$$\omega_h^h = q_{hf} \otimes \omega_f^f \otimes q_{hf}^* \tag{5}$$

where $\omega_x^y$ ($x, y \in \{h, f\}$) is the angular velocity of segment $x$ expressed in the coordinate frame of segment $y$; $h$ denotes the hand and $f$ the finger tip. Combining Equations (1) and (5), we find:

$$y_{gyr,h}^{h} = q_{hf} \otimes y_{gyr,f}^{f} \otimes q_{hf}^* + b_h - q_{hf} \otimes b_f \otimes q_{hf}^* + \zeta_h - q_{hf} \otimes \zeta_f \otimes q_{hf}^* = q_{hf} \otimes y_{gyr,f}^{f} \otimes q_{hf}^* + d_{gyr} \tag{6}$$
where the combined gyroscope error $d_{gyr}$ is

$$d_{gyr} = \left( b_h - q_{hf} \otimes b_f \otimes q_{hf}^* \right) + \left( \zeta_h - q_{hf} \otimes \zeta_f \otimes q_{hf}^* \right) \tag{7}$$
Unlike the angular velocities, the accelerations at different positions differ; they are related by

$$a_h^h = q_{hf} \otimes a_f^f \otimes q_{hf}^* + \omega_h^h \times (\omega_h^h \times r_{fh}^h) + \dot{\omega}_h^h \times r_{fh}^h = q_{hf} \otimes a_f^f \otimes q_{hf}^* + \left( \omega_h^{h\times} \omega_h^{h\times} + \dot{\omega}_h^{h\times} \right) r_{fh}^h \tag{8}$$

where $a_x^y$ ($x, y \in \{h, f\}$) is the acceleration of segment $x$ expressed in the coordinate frame of segment $y$, $\dot{\omega}_h^h$ is the angular acceleration of the hand in its own frame, $r_{fh}^h$ is the position vector between the hand and finger sensors expressed in the hand frame, and $(\cdot)^\times$ denotes the skew-symmetric matrix

$$a^\times = \begin{bmatrix} 0 & -a_z & a_y \\ a_z & 0 & -a_x \\ -a_y & a_x & 0 \end{bmatrix} \tag{9}$$
If the second term $\left( \omega_h^{h\times} \omega_h^{h\times} + \dot{\omega}_h^{h\times} \right) r_{fh}^h$ is small compared with the first term $q_{hf} \otimes a_f^f \otimes q_{hf}^*$, Equation (8) can be approximated as

$$a_h^h \approx q_{hf} \otimes a_f^f \otimes q_{hf}^* \tag{10}$$
Combining Equations (2) and (8), we find:

$$y_{acc,h}^{h} = q_{hf} \otimes y_{acc,f}^{f} \otimes q_{hf}^* + \left( \omega_h^{h\times} \omega_h^{h\times} + \dot{\omega}_h^{h\times} \right) r_{fh}^h + \eta_c \tag{11}$$

where the combined accelerometer error $\eta_c$ is

$$\eta_c = \eta_h^h - q_{hf} \otimes \eta_f^f \otimes q_{hf}^* \tag{12}$$
Finally, the overall relation between the hand and finger measurements, based on Equations (6) and (11), is

$$\begin{aligned} y_{gyr,h}^{h} &= q_{hf} \otimes y_{gyr,f}^{f} \otimes q_{hf}^* + d_{gyr} \\ y_{acc,h}^{h} &= q_{hf} \otimes y_{acc,f}^{f} \otimes q_{hf}^* + \left( \omega_h^{h\times} \omega_h^{h\times} + \dot{\omega}_h^{h\times} \right) r_{fh}^h + \eta_c \end{aligned} \tag{13}$$
Subsequently, we obtain the measurement model from the sensor model and the quaternion unit-norm constraint:

$$y_k = f(x_k) + v \tag{14}$$

where $y$ and $f$ are given by

$$y_k = \begin{bmatrix} (y_{acc,h}^{h})^T & (y_{gyr,h}^{h})^T & 0 \end{bmatrix}^T \tag{15}$$

$$f(x_k) = \begin{bmatrix} x_k \otimes y_{acc,f}^{f} \otimes x_k^* \\ x_k \otimes y_{gyr,f}^{f} \otimes x_k^* \\ q_0^2 + q_1^2 + q_2^2 + q_3^2 - 1 \end{bmatrix} \tag{16}$$
As shown in Equations (3) and (16), the process and measurement models are both nonlinear in $x_k$. To update the covariance matrix of $x_k$, the models are linearized and the Jacobian matrices $F$ and $H$ of the process and measurement models are calculated; the details can be found in Appendix A.
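As an illustration of the measurement side, the sketch below evaluates $f(x_k)$ from Equation (16) and the corresponding EKF innovation $y_k - f(x_k)$, with $y_k$ stacked as in Equation (15). This is a hedged reconstruction under the same Hamilton [w, x, y, z] convention as above; the function names are ours.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product p ⊗ q for quaternions ordered [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([pw*qw - px*qx - py*qy - pz*qz,
                     pw*qx + px*qw + py*qz - pz*qy,
                     pw*qy - px*qz + py*qw + pz*qx,
                     pw*qz + px*qy - py*qx + pz*qw])

def rotate(q, v):
    """Rotate 3-vector v by unit quaternion q: q ⊗ [0, v] ⊗ q*."""
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mult(quat_mult(q, np.r_[0.0, v]), q_conj)[1:]

def f_meas(x, acc_f, gyr_f):
    """f(x_k) from Equation (16): finger accelerometer and gyroscope
    readings rotated into the hand frame, stacked with the unit-norm
    constraint q0^2 + q1^2 + q2^2 + q3^2 - 1."""
    return np.r_[rotate(x, acc_f), rotate(x, gyr_f), x @ x - 1.0]

def innovation(x, acc_h, gyr_h, acc_f, gyr_f):
    """EKF residual y_k - f(x_k), with y_k stacked as in Equation (15);
    the last element enforces the quaternion norm (target value 0)."""
    y = np.r_[acc_h, gyr_h, 0.0]
    return y - f_meas(x, acc_f, gyr_f)
```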

2.4. Uncertainty Error Variance

To assess the relative confidence in the measurement model (based on our DE assumptions) versus the process model, the measurement variance is determined. Under the assumption that the hand and finger share approximately the same angular velocity and acceleration (Equation (13)), the measurement variance is driven by the differences between the hand and finger angular velocities and accelerations measured by the IMUs. From Equation (7), the gyroscope error is related to the offset error, the white noise and the relative orientation; from Equation (6), $d_{gyr}$ can be expressed as

$$d_{gyr} = y_{gyr,h}^{h} - q_{hf} \otimes y_{gyr,f}^{f} \otimes q_{hf}^* \tag{17}$$

We approximate the distribution of $d_{gyr}$ as Gaussian with zero mean and standard deviation $\sigma_g \begin{bmatrix} 1 & 1 & 1 \end{bmatrix}^T$ (rad/s), with

$$\sigma_g = \left\| y_{gyr,h}^{h} - q_{hf} \otimes y_{gyr,f}^{f} \otimes q_{hf}^* \right\|_2 \tag{18}$$
From Equation (13), the accelerometer error $d_{acc}$ can be expressed as

$$d_{acc} = \left( \omega_h^{h\times} \omega_h^{h\times} + \dot{\omega}_h^{h\times} \right) r_{fh}^h + \eta_c \tag{19}$$

or, equivalently, from Equation (11):

$$d_{acc} = y_{acc,h}^{h} - q_{hf} \otimes y_{acc,f}^{f} \otimes q_{hf}^* \tag{20}$$

Similarly to the gyroscope, we assume that $d_{acc}$ has an approximately Gaussian distribution with zero mean and standard deviation $\sigma_a \begin{bmatrix} 1 & 1 & 1 \end{bmatrix}^T$, with

$$\sigma_a = \left\| y_{acc,h}^{h} - q_{hf} \otimes y_{acc,f}^{f} \otimes q_{hf}^* \right\|_2 \tag{21}$$
As described in Equations (17) and (20), the rotation quaternion $q_{hf}$ must be known before the variances can be computed. However, $q_{hf}$ is exactly the variable we are trying to estimate. Since we assume no or only a slow change of orientation between the hand and finger tips, the estimate at time $k-1$ is used as the true relative orientation at time $k$:

$$q_{hf,k} = \hat{q}_{hf,k-1} \tag{22}$$

where $q_{hf,k}$ is the “true” rotation quaternion used to estimate the variance at time $k$ and $\hat{q}_{hf,k-1}$ is the estimated rotation quaternion at time $k-1$. The measurement covariance is determined as

$$R_m = \begin{bmatrix} \sigma_g I_{3\times3} & 0 & 0 \\ 0 & \sigma_a I_{3\times3} & 0 \\ 0 & 0 & 0 \end{bmatrix} \tag{23}$$

The state vector of relative orientation $x_k$ was initialized as $\begin{bmatrix} 1 & 0 & 0 & 0 \end{bmatrix}^T$.
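The adaptive weighting can be written compactly. The sketch below computes $\sigma_g$, $\sigma_a$ and $R_m$ from Equations (18), (21) and (23), using the previous estimate in place of the true orientation as in Equation (22). The small floor value `eps` is our own addition to avoid a singular covariance, and the block ordering follows the measurement vector of Equation (15); both are assumptions, not taken from the paper.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product p ⊗ q for quaternions ordered [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([pw*qw - px*qx - py*qy - pz*qz,
                     pw*qx + px*qw + py*qz - pz*qy,
                     pw*qy - px*qz + py*qw + pz*qx,
                     pw*qz + px*qy - py*qx + pz*qw])

def rotate(q, v):
    """Rotate 3-vector v by unit quaternion q: q ⊗ [0, v] ⊗ q*."""
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mult(quat_mult(q, np.r_[0.0, v]), q_conj)[1:]

def measurement_covariance(q_prev, acc_h, gyr_h, acc_f, gyr_f, eps=1e-6):
    """R_m from Equation (23), with q_prev standing in for the unknown
    true q_hf (Equation (22)). Large norm mismatches give large sigmas,
    so the EKF trusts the process model more; during a good DE the
    sigmas shrink and the measurement update dominates."""
    sigma_g = np.linalg.norm(gyr_h - rotate(q_prev, gyr_f))  # Equation (18)
    sigma_a = np.linalg.norm(acc_h - rotate(q_prev, acc_f))  # Equation (21)
    R = np.zeros((7, 7))
    R[:3, :3] = max(sigma_a, eps) * np.eye(3)    # accelerometer block
    R[3:6, 3:6] = max(sigma_g, eps) * np.eye(3)  # gyroscope block
    R[6, 6] = eps  # unit-norm pseudo-measurement: (almost) fully trusted
    return R
```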

3. Experiments

3.1. Experiment Setup

The sensor system includes three IMUs, fixed on the most distal segments of the thumb and index finger and on the dorsal side of the hand, as shown in Figure 1. The MPU9250 (InvenSense) was chosen as the IMU; it contains a tri-axis accelerometer and a tri-axis gyroscope (it also contains a tri-axis magnetometer, which was not used in the current study). All IMUs were sampled synchronously; the sample frequencies of the gyroscope and accelerometer were 200 Hz and 100 Hz, respectively. All data were collected by a master micro-controller (Atmel XMEGA) and transmitted to a PC via a USB connection. Prior to the experiment, the accelerometer was calibrated against local gravity and the gyroscope was calibrated using the calibrated accelerometer [26]. An optical Vicon system with eight cameras was used to perform the 3D orientation reference measurements; for this purpose, three optical markers were attached to each IMU. The sampling frequency of the Vicon system was 100 Hz.

3.2. Alignment of the IMU and Reference Marker Frame for the Validation Experiment

To evaluate the IMU-based 3D relative orientation estimation against the optical system, it is essential to calibrate the relative orientation between the sensor frame and the marker-based reference frame. Here, we used the accelerometer for this marker-to-IMU calibration. Holding the system static, we obtained gravity in the IMU frame from the accelerometer readings. Meanwhile, we obtained the orientation from the global Vicon frame to the marker frame, $q_{mg}$. Gravity in the marker frame is $q_{mg} \otimes g \otimes q_{mg}^*$, where $g$ is gravity in the global Vicon frame (the z-axis of the global Vicon frame pointed vertically upward, so gravity in this frame was $g = \begin{bmatrix} 0 & 0 & -g \end{bmatrix}^T$, with $g$ the local gravity value). With at least two poses, we obtain two or more vectors expressed in both the marker frame and the IMU frame, which is enough to determine the relative orientation between the IMU and marker frames.
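The rotation that best aligns the gravity vectors observed in the IMU frame with those computed in the marker frame can be obtained with the standard SVD-based (Kabsch/Wahba) solution. The sketch below is our illustration of this alignment step under that assumption; it is not taken from the authors' processing pipeline, and the example values are hypothetical.

```python
import numpy as np

def rotation_from_vector_pairs(v_imu, v_marker):
    """Least-squares rotation R with v_marker[i] ≈ R @ v_imu[i]
    (Kabsch/Wahba solution). Requires at least two non-collinear pairs,
    e.g., gravity observed in two different static poses."""
    P = np.asarray(v_imu)       # N x 3, vectors in the IMU frame
    Q = np.asarray(v_marker)    # N x 3, same vectors in the marker frame
    H = P.T @ Q                 # 3 x 3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

# Hypothetical example: gravity observed in two static poses.
g_imu = np.array([[0.0, 0.0, 9.81], [9.81, 0.0, 0.0]])
g_marker = np.array([[0.0, 9.81, 0.0], [9.81, 0.0, 0.0]])
R_imu_to_marker = rotation_from_vector_pairs(g_imu, g_marker)
```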

3.3. Sensor to Segment Calibration

Before the experiment, the IMU errors were calibrated according to the methods of Tedaldi et al. and Fong et al. [26,27], including the sensitivity error, offset error, non-orthogonality error and the misalignment between the accelerometer and gyroscope. After the IMUs were fixed on the hand and fingers, the relative orientations between the IMU sensors and the body segments were calibrated. The accelerometer was used for this alignment by exploiting static accelerometer measurements of gravity: holding the hand sequentially horizontal and vertical yields the 3D relative orientation between the two frames. More details can be found in Kortier et al. [28].

3.4. Synchronization of Vicon and IMU System

The two measurement systems were synchronized by recording their responses to an induced impact at the start and end of each experiment: we hit the IMU on a desk, which produced an acceleration peak in the IMU data and, simultaneously, a minimum in the vertical position of the Vicon markers. These coincident events were used to synchronize the two systems.
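One simple way to turn such an impact event into a time offset is sketched below; the array names are hypothetical and the real processing may differ (for example, using both the start and end impacts to also correct for clock drift).

```python
import numpy as np

def sync_offset(imu_t, acc_norm, vicon_t, marker_z):
    """Time offset between the IMU and Vicon clocks from a single desk
    impact: the IMU shows an acceleration-norm peak while the optical
    markers reach their lowest vertical position at (nearly) the same
    instant."""
    t_impact_imu = imu_t[np.argmax(acc_norm)]      # impact in the IMU stream
    t_impact_vicon = vicon_t[np.argmin(marker_z)]  # impact in the Vicon stream
    return t_impact_vicon - t_impact_imu  # add to IMU timestamps to align
```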

3.5. Protocols for the Experiment

To demonstrate the feasibility of our approach, an experiment was designed to estimate the accuracy of the algorithm against the optical system. The feasibility experiment involved three participants. The protocol was reviewed, approved and conducted under the auspices of the Ethics Committee EEMCS, University of Twente. The following tasks were performed:
Task 1: Movements and rotations of the hand without varying the relative orientations between hand and fingers. IMUs were fixed on the fingers and the dorsal side of the hand. The participant then performed pronation and supination movements of the arm while the axis of pronation and supination was continuously changed; the orientation changed over approximately 160° around the rotation axis (see Figure 2). Furthermore, we varied the angular velocity by performing these cyclical movements at different repetition rates (60, 120 and 240 cycles/min), paced by a metronome, in order to test the performance of the algorithm under different conditions. Throughout, the subject was asked to keep the hand closed and not to change the relative orientations between the hand and fingers while displacing or rotating the hand.
Task 2: Simple functional task. The subject was asked to place the hand on the desk, raise the hand and grasp a cup, drink some water and place the cup back, and finally return the hand to its original position. The movement is illustrated in Figure 3.
For task 1, the orientation reference was derived directly from the IMUs, because the relative orientation was imposed by the closed hand and was therefore known and constant. For task 2, the reference measurement was performed using the optical Vicon system (software version 2.8.2).

4. Results

4.1. Movements and Rotations of the Hand, While Not Varying Relative Orientations between the Hand and Finger (Task 1)

The error angle used was derived via the arccosine of the first component of the quaternion error $q_{err}$ [29]:

$$q_{err} = q_{est}^{-1} \otimes q_{ref} \approx \begin{bmatrix} 1 \\ \tfrac{1}{2}\,\theta_{err} \end{bmatrix} \tag{24}$$

where $q_{est}$ is the estimated relative orientation and $q_{ref}$ is the orientation reference. During 3D movements we obtained more than two independent vectors from the gyroscope, the accelerometer, or both. The error angle estimated when the DE is available is shown in Figure 4. The orientation error is smallest when the gyroscope and accelerometer are combined and largest with accelerometer data only.
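For reference, a common way to compute this error angle from two unit quaternions is sketched below (our code, using the standard factor of two that maps the half-angle in the quaternion's scalar part to the full rotation angle):

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product p ⊗ q for quaternions ordered [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([pw*qw - px*qx - py*qy - pz*qz,
                     pw*qx + px*qw + py*qz - pz*qy,
                     pw*qy - px*qz + py*qw + pz*qx,
                     pw*qz + px*qy - py*qx + pz*qw])

def error_angle_deg(q_est, q_ref):
    """|theta_err| between two unit quaternions (cf. Equation (24)): the
    rotation angle of q_err = q_est^{-1} ⊗ q_ref. For unit quaternions the
    inverse is the conjugate, and |w| handles the q / -q double cover."""
    q_conj = q_est * np.array([1.0, -1.0, -1.0, -1.0])
    q_err = quat_mult(q_conj, q_ref)
    w = np.clip(abs(q_err[0]), 0.0, 1.0)
    return np.degrees(2.0 * np.arccos(w))
```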

Influence of Repetition Rate of Movement

The estimation may be influenced by the repetition rate of the movement. Figure 5 and Figure 6 show the relation between the gyroscope and accelerometer output norms on the hand and finger for several repetition rates. Ideally, for the measurement update, the gyroscope output norms $\|y_{gyr,h}\|$ and $\|y_{gyr,f}\|$ should be equal, and likewise for the accelerometer. Differences between the output norms cause estimation errors, as shown in Equation (13). For the accelerometer, the output norm differences $|\,\|y_{acc,h}\| - \|y_{acc,f}\|\,|$ were 29.3 m/s², 66.4 m/s² and 370.2 m/s² at repetition rates of 60, 120 and 240 cycles/min, respectively; the corresponding gyroscope output norm differences were 2.2 rad/s, 2.7 rad/s and 4.4 rad/s. As shown in Figure 7b,c, the orientation error estimated from the accelerometer grew as the repetition rate increased, while the gyroscope-based orientation error changed little. As shown in Figure 7a, the estimate based on the gyroscope and accelerometer together trusted the gyroscope more than the accelerometer because it contained less error; it was therefore also insensitive to the repetition rate.

4.2. Simple Functional Task (Task 2)

According to Figure 3, the whole process was divided into several phases; the estimated orientation errors with respect to the optical system in the different phases are shown in Figure 8. The quaternion-based orientations estimated by the IMU and optical systems can be seen in Figure A3 in Appendix B. The error during the drinking phase was relatively low because the cup imposed a constant relative orientation between the hand and fingers while the whole hand moved with varying position and orientation, as shown in Figure 8. Since the angular velocity and acceleration norms were close to each other, the standard deviations $\sigma_a$ and $\sigma_g$ of the measurement noise were small, as shown in subfigure (b); under this condition the measurement model was trusted relatively more than the process model. In the other phases of this functional task, the differences between the gyroscope and accelerometer norms on the hand and fingers were larger; thus $\sigma_a$ and $\sigma_g$ were larger and the trust in the process model was relatively high. A good estimate of the relative orientation was achieved by choosing a suitable standard deviation for the process error (see Figure 8c). The results of the other two participants can be seen in Figure A1 and Figure A2 in Appendix B.

5. Discussion

We proposed and evaluated an IMU-based setup for estimating the 3D relative orientation between the hand and finger tips. Compared with the IMU-based data glove systems described by Salchow-Hömmen et al. [19] and Kortier et al. [28], we reduced the number of IMUs as much as possible and avoided magnetic disturbances, while still obtaining comparable orientation accuracy. In reference [19], the orientation error magnitude is approximately five to ten degrees. In our study, the orientation error is related to the movement quality: when the hand and fingers move together, the median orientation error can be smaller than five degrees. For the water-drinking experiment, the estimated error is less than ten degrees when the hand and fingers approximately move together, and around ten degrees during the remaining periods. In our view, this is a promising method for hand-finger orientation estimation with a small IMU configuration, applicable when rich whole-hand movements occur and the relative orientation between the hand and finger tips changes slowly and remains relatively small. The standard deviations $\sigma_g$ and $\sigma_a$ can be used to assess whether such DEs occur regularly during a specific movement.
Most previous IMU-based systems for finger orientation estimation [17,28] require a magnetometer to reduce the drift of gyroscope integration, which makes them suffer from magnetic disturbances in indoor environments. To our knowledge, methods described in the literature that remove the magnetometer while still suppressing drift rely on an additional biomechanical model (e.g., [17,28]). We did not apply such additional biomechanical information in the current study, although it could be added. It should be noted, however, that biomechanical constraints usually assume finger movements restricted to two DoFs. In contrast, our method can be implemented without biomechanical constraints and can estimate three-DoF relative orientation during 3D hand movements.
For task 1, the relative orientation estimate is less sensitive to an increase in the repetition rate of the same movement when using gyroscopes, or gyroscopes plus accelerometers, than when using accelerometers only. This is because the difference between the accelerometer signals of the hand and finger grows with the repetition rate: the non-gravitational acceleration, caused by the increasing angular velocity and angular acceleration, becomes more important relative to the gravity component.
Position estimation based only on inertial sensors is challenging and limited by integration drift. Our further research will concentrate on relative position estimation based on IMUs combined with sensing the magnetic field of a magnetic source. For this to be feasible, an adequate estimate of the relative orientation is required, so that the 3D magnetic field measurement can be expressed in the coordinate system of the magnetic source; this is an essential first step towards estimating relative positions. In this research, only a few healthy participants were involved, since we mainly concentrated on verifying the performance of the algorithm. The proposed relative orientation and position estimation methods for the hand and fingers using a small sensing configuration subsequently need to be evaluated in healthy subjects and patients during more complex daily tasks, in order to assess their applicability in clinical and daily-life settings. To make the system more user-friendly, it could be made wireless in the future.

6. Conclusions

In conclusion, IMUs can be used to estimate the relative orientation between the hand and fingers without using magnetometers. Compared with previous systems, we only use IMUs on the finger tips and the dorsal side of the hand rather than on every segment. The performance depends on how well the hand and fingers move together, which determines the accuracy of the estimate. The median estimation error can be smaller than five degrees when the relative orientation between the hand and fingers does not vary over time while the hand or a held object is moving. During the water-drinking task, the estimation error can be smaller than 10 degrees in the periods when the hand and fingers approximately move together, which may be accurate enough to provide useful information to clinicians.

Author Contributions

Conceptualization, Z.Y. and P.H.V.; methodology, Z.Y., P.H.V., B.-J.F.v.B., S.Y. and B.L.; validation, Z.Y.; formal analysis, Z.Y.; writing—original draft preparation, Z.Y. and P.H.V.; writing—review and editing, Z.Y., P.H.V., B.-J.F.v.B., S.Y. and B.L.; supervision, P.H.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received funding from the China Scholarship Council (CSC).

Acknowledgments

The authors would like to thank Roessingh Research and Development (Enschede, The Netherlands) for sharing the gait laboratory, and the laboratory manager, Leendert Schaake, for helping with the optical system and the processing of data. Thanks to A. Droog and G.J.W. Wolterink from Biomedical Signals and Systems, University of Twente, for providing the inertial sensor setup and the 3D-printed casings for the inertial sensors.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

By linearizing the nonlinear functions of the process and measurement models, we obtain the Jacobian matrices $F$ and $H$, respectively, which are used in the EKF covariance update. Based on Equation (3):
$$F = \frac{\partial x_k}{\partial x_{k-1}} = \frac{1}{2} \begin{bmatrix} a_{11}\,\omega_2 & a_{12}\,\omega_2 & a_{13}\,\omega_2 & a_{14}\,\omega_2 \\ a_{21}\,\omega_2 & a_{22}\,\omega_2 & a_{23}\,\omega_2 & a_{24}\,\omega_2 \\ a_{31}\,\omega_2 & a_{32}\,\omega_2 & a_{33}\,\omega_2 & a_{34}\,\omega_2 \\ a_{41}\,\omega_2 & a_{42}\,\omega_2 & a_{43}\,\omega_2 & a_{44}\,\omega_2 \end{bmatrix} dt + \frac{1}{2} \begin{bmatrix} 0 & -\omega_{1x} & -\omega_{1y} & -\omega_{1z} \\ \omega_{1x} & 0 & \omega_{1z} & -\omega_{1y} \\ \omega_{1y} & -\omega_{1z} & 0 & \omega_{1x} \\ \omega_{1z} & \omega_{1y} & -\omega_{1x} & 0 \end{bmatrix} dt + I_{4\times4} \tag{A1}$$

where each $a_{ij}$ is a $1 \times 3$ row vector:

$$\begin{aligned} a_{11} &= 2\begin{bmatrix} q_0q_1 & q_0q_2 & q_0q_3 \end{bmatrix} & a_{12} &= \begin{bmatrix} 1+2q_1^2 & 2q_1q_2 & 2q_1q_3 \end{bmatrix} \\ a_{13} &= \begin{bmatrix} 2q_1q_2 & 1+2q_2^2 & 2q_2q_3 \end{bmatrix} & a_{14} &= \begin{bmatrix} 2q_1q_3 & 2q_2q_3 & 1+2q_3^2 \end{bmatrix} \\ a_{21} &= \begin{bmatrix} 1+2q_0^2 & 2q_0q_3 & 2q_0q_2 \end{bmatrix} & a_{22} &= 2\begin{bmatrix} q_0q_1 & q_1q_3 & q_1q_2 \end{bmatrix} \\ a_{23} &= \begin{bmatrix} 2q_0q_2 & 2q_2q_3 & 1+2q_2^2 \end{bmatrix} & a_{24} &= \begin{bmatrix} 2q_0q_3 & 1+2q_3^2 & 2q_2q_3 \end{bmatrix} \\ a_{31} &= \begin{bmatrix} 2q_0q_3 & 1+2q_0^2 & 2q_0q_1 \end{bmatrix} & a_{32} &= \begin{bmatrix} 2q_1q_3 & 2q_0q_1 & 1+2q_1^2 \end{bmatrix} \\ a_{33} &= 2\begin{bmatrix} q_2q_3 & q_0q_2 & q_1q_2 \end{bmatrix} & a_{34} &= \begin{bmatrix} 1+2q_3^2 & 2q_0q_3 & 2q_1q_3 \end{bmatrix} \\ a_{41} &= \begin{bmatrix} 2q_0q_2 & 2q_0q_1 & 1+2q_0^2 \end{bmatrix} & a_{42} &= \begin{bmatrix} 2q_1q_2 & 1+2q_1^2 & 2q_0q_1 \end{bmatrix} \\ a_{43} &= \begin{bmatrix} 1+2q_2^2 & 2q_1q_2 & 2q_0q_2 \end{bmatrix} & a_{44} &= 2\begin{bmatrix} q_2q_3 & q_1q_3 & q_0q_3 \end{bmatrix} \end{aligned} \tag{A2}$$
Based on Equation (16):
$$H = \frac{\partial f}{\partial x_k} = \begin{bmatrix} \dfrac{\partial \left( x_k \otimes y_{acc,f}^{f} \otimes x_k^* \right)}{\partial x_k} \\[6pt] \dfrac{\partial \left( x_k \otimes y_{gyr,f}^{f} \otimes x_k^* \right)}{\partial x_k} \\[6pt] \dfrac{\partial \left( q_0^2+q_1^2+q_2^2+q_3^2 \right)}{\partial x_k} \end{bmatrix} = 2 \begin{bmatrix} b_{11}\,y_{acc,f}^{f} & b_{12}\,y_{acc,f}^{f} & b_{13}\,y_{acc,f}^{f} & b_{14}\,y_{acc,f}^{f} \\ b_{21}\,y_{acc,f}^{f} & b_{22}\,y_{acc,f}^{f} & b_{23}\,y_{acc,f}^{f} & b_{24}\,y_{acc,f}^{f} \\ b_{31}\,y_{acc,f}^{f} & b_{32}\,y_{acc,f}^{f} & b_{33}\,y_{acc,f}^{f} & b_{34}\,y_{acc,f}^{f} \\ b_{11}\,y_{gyr,f}^{f} & b_{12}\,y_{gyr,f}^{f} & b_{13}\,y_{gyr,f}^{f} & b_{14}\,y_{gyr,f}^{f} \\ b_{21}\,y_{gyr,f}^{f} & b_{22}\,y_{gyr,f}^{f} & b_{23}\,y_{gyr,f}^{f} & b_{24}\,y_{gyr,f}^{f} \\ b_{31}\,y_{gyr,f}^{f} & b_{32}\,y_{gyr,f}^{f} & b_{33}\,y_{gyr,f}^{f} & b_{34}\,y_{gyr,f}^{f} \\ q_0 & q_1 & q_2 & q_3 \end{bmatrix} \tag{A3}$$

where each $b_{ij}$ is a $1 \times 3$ row vector:

$$\begin{aligned} b_{11} &= \begin{bmatrix} q_0 & -q_3 & q_2 \end{bmatrix} & b_{12} &= \begin{bmatrix} q_1 & q_2 & q_3 \end{bmatrix} & b_{13} &= \begin{bmatrix} -q_2 & q_1 & q_0 \end{bmatrix} & b_{14} &= \begin{bmatrix} -q_3 & -q_0 & q_1 \end{bmatrix} \\ b_{21} &= \begin{bmatrix} q_3 & q_0 & -q_1 \end{bmatrix} & b_{22} &= \begin{bmatrix} q_2 & -q_1 & -q_0 \end{bmatrix} & b_{23} &= \begin{bmatrix} q_1 & q_2 & q_3 \end{bmatrix} & b_{24} &= \begin{bmatrix} q_0 & -q_3 & q_2 \end{bmatrix} \\ b_{31} &= \begin{bmatrix} -q_2 & q_1 & q_0 \end{bmatrix} & b_{32} &= \begin{bmatrix} q_3 & q_0 & -q_1 \end{bmatrix} & b_{33} &= \begin{bmatrix} -q_0 & q_3 & -q_2 \end{bmatrix} & b_{34} &= \begin{bmatrix} q_1 & q_2 & q_3 \end{bmatrix} \end{aligned} \tag{A4}$$
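A practical way to check the analytic Jacobian $H$ above is to compare it against central finite differences of the measurement function. The sketch below is our code (same Hamilton [w, x, y, z] convention as in Section 2), not part of the published method.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product p ⊗ q for quaternions ordered [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([pw*qw - px*qx - py*qy - pz*qz,
                     pw*qx + px*qw + py*qz - pz*qy,
                     pw*qy - px*qz + py*qw + pz*qx,
                     pw*qz + px*qy - py*qx + pz*qw])

def rotate(q, v):
    """Rotate 3-vector v by unit quaternion q: q ⊗ [0, v] ⊗ q*."""
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mult(quat_mult(q, np.r_[0.0, v]), q_conj)[1:]

def f_meas(x, acc_f, gyr_f):
    """Measurement function f(x_k) from Equation (16)."""
    return np.r_[rotate(x, acc_f), rotate(x, gyr_f), x @ x - 1.0]

def numerical_H(x, acc_f, gyr_f, eps=1e-6):
    """Central-difference approximation of H = df/dx (7 x 4), useful for
    validating the analytic expression element by element."""
    H = np.zeros((7, 4))
    for j in range(4):
        dx = np.zeros(4)
        dx[j] = eps
        H[:, j] = (f_meas(x + dx, acc_f, gyr_f)
                   - f_meas(x - dx, acc_f, gyr_f)) / (2.0 * eps)
    return H
```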

Appendix B

In this appendix, three figures are shown. Figure A1 and Figure A2 are the task 2 results of the other two participants. Compared with the first participant, the drinking phase (see subfigure (d) of Figure 3) was replaced by a displacing phase; that is, the cup was not brought to the mouth but moved to another position on the desk. Figure A3 shows the relative orientation between the hand and fingers estimated by the IMU and optical systems, from which the orientation error in subfigure (c) of Figure 8 was obtained.
Figure A1. Relative orientation between the hand and thumb during the water-drinking process. Subfigure (a) shows the output norms of the two gyroscopes (on the hand and finger tip, respectively). Subfigure (b) shows the normalized SDs $\sigma_a$ and $\sigma_g$ from Equations (18) and (21); larger $\sigma_a$ and $\sigma_g$ mean a larger measurement error, so the EKF trusts the process model more and the measurement model less. Subfigure (c) shows the estimated results for different SDs of the process model; the variance of the process error $Q$ was set to $\sigma_p I_4$.
Figure A2. Relative orientation between the hand and thumb during the water-drinking process. Subfigure (a) shows the output norms of the two gyroscopes (on the hand and finger tip, respectively). Subfigure (b) shows the normalized SDs $\sigma_a$ and $\sigma_g$ from Equations (18) and (21); larger $\sigma_a$ and $\sigma_g$ mean a larger measurement error, so the EKF trusts the process model more and the measurement model less. Subfigure (c) shows the estimated results for different SDs of the process model; the variance of the process error $Q$ was set to $\sigma_p I_4$.
Figure A3. Relative orientation between the hand and thumb estimated by the IMU system and the optical system. Orientations are expressed as quaternions; from these results the orientation error in subfigure (c) of Figure 8 was obtained. $\sigma_p$ has the same meaning as in Figure 8.

References

  1. Oka, K.; Sato, Y.; Koike, H. Real-time fingertip tracking and gesture recognition. IEEE Comput. Graph. Appl. 2002, 22, 64–71.
  2. Tunik, E.; Saleh, S.; Adamovich, S.V. Visuomotor discordance during visually-guided hand movement in virtual reality modulates sensorimotor cortical activity in healthy and hemiparetic subjects. IEEE Trans. Neural Syst. Rehabil. Eng. 2013, 21, 198–207.
  3. Vignais, N.; Miezal, M.; Bleser, G.; Mura, K.; Gorecky, D.; Marin, F. Innovative system for real-time ergonomic feedback in industrial manufacturing. Appl. Ergon. 2013, 44, 566–574.
  4. Szturm, T.; Peters, J.F.; Otto, C.; Kapadia, N.; Desai, A. Task-specific rehabilitation of finger-hand function using interactive computer gaming. Arch. Phys. Med. Rehabil. 2008, 89, 2213–2217.
  5. Ahmad, N.; Ghazilla, R.A.R.; Khairi, N.M.; Kasi, V. Reviews on various inertial measurement unit (IMU) sensor applications. Int. J. Signal Process. Syst. 2013, 1, 256–262.
  6. Jebsen, R.H.; Taylor, N.; Trieschmann, R.B.; Trotter, M.J.; Howard, L.A. An objective and standardized test of hand function. Arch. Phys. Med. Rehabil. 1969, 50, 311–319.
  7. Platz, T.; Pinkowski, C.; van Wijck, F.; Kim, I.H.; Di Bella, P.; Johnson, G. Reliability and validity of arm function assessment with standardized guidelines for the Fugl-Meyer Test, Action Research Arm Test and Box and Block Test: A multicentre study. Clin. Rehabil. 2005, 19, 404–411.
  8. Kapur, A.; Virji-Babul, N.; Tzanetakis, G.; Driessen, P.F. Gesture-based affective computing on motion capture data. In Proceedings of the International Conference on Affective Computing and Intelligent Interaction, Beijing, China, 22–24 October 2005; Springer: Berlin/Heidelberg, Germany, 2005; Volume 3784, pp. 1–7.
  9. Guna, J.; Jakus, G.; Pogačnik, M.; Tomažič, S.; Sodnik, J. An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking. Sensors 2014, 14, 3702–3720.
  10. Smeragliuolo, A.H.; Hill, N.J.; Disla, L.; Putrino, D. Validation of the Leap Motion Controller using markered motion capture technology. J. Biomech. 2016, 49, 1742–1750.
  11. Lu, W.; Tong, Z.; Chu, J. Dynamic hand gesture recognition with leap motion controller. IEEE Signal Process. Lett. 2016, 23, 1188–1192.
  12. Mohandes, M.; Aliyu, S.; Deriche, M. Arabic sign language recognition using the leap motion controller. In Proceedings of the 2014 IEEE 23rd International Symposium on Industrial Electronics (ISIE), Istanbul, Turkey, 1–4 June 2014; pp. 960–965.
  13. Borghetti, M.; Sardini, E.; Serpelloni, M. Sensorized glove for measuring hand finger flexion for rehabilitation purposes. IEEE Trans. Instrum. Meas. 2013, 62, 3308–3314.
  14. Chen, S.; Lou, Z.; Chen, D.; Jiang, K.; Shen, G. Polymer-enhanced highly stretchable conductive fiber strain sensor used for electronic data gloves. Adv. Mater. Technol. 2016, 1, 1600136.
  15. Chen, K.Y.; Patel, S.N.; Keller, S. Finexus. In CHI 2016; Kaye, J., Druin, A., Lampe, C., Morris, D., Hourcade, J.P., Eds.; The Association for Computing Machinery: New York, NY, USA, 2016; pp. 1504–1514.
  16. Ma, Y.; Mao, Z.H.; Jia, W.; Li, C.; Yang, J.; Sun, M. Magnetic hand tracking for human-computer interface. IEEE Trans. Magn. 2011, 47, 970–973.
  17. Lin, B.S.; Hsiao, P.C.; Yang, S.Y.; Su, C.S.; Lee, I.J. Data glove system embedded with inertial measurement units for hand function evaluation in stroke patients. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 2204–2213.
  18. Chang, H.T.; Chang, J.Y. Sensor glove based on novel inertial sensor fusion control algorithm for 3D real-time hand gestures measurements. IEEE Trans. Ind. Electron. 2019, 1.
  19. Salchow-Hömmen, C.; Callies, L.; Laidig, D.; Valtin, M.; Schauer, T.; Seel, T. A tangible solution for hand motion tracking in clinical applications. Sensors 2019, 19, 208.
  20. Seel, T.; Ruppin, S. Eliminating the effect of magnetic disturbances on the inclination estimates of inertial sensors. IFAC-PapersOnLine 2017, 50, 8798–8803.
  21. Madgwick, S.O.; Harrison, A.J.; Vaidyanathan, R. Estimation of IMU and MARG orientation using a gradient descent algorithm. In Proceedings of the 2011 IEEE International Conference on Rehabilitation Robotics, Zurich, Switzerland, 29 June–1 July 2011; pp. 1–7.
  22. Teufl, W.; Miezal, M.; Taetz, B.; Fröhlich, M.; Bleser, G. Validity, test-retest reliability and long-term stability of magnetometer free inertial sensor based 3D joint kinematics. Sensors 2018, 18, 1980.
  23. Seel, T.; Schauer, T.; Raisch, J. Joint axis and position estimation from inertial measurement data by exploiting kinematic constraints. In Proceedings of the IEEE International Conference on Control Applications (CCA), Dubrovnik, Croatia, 3–5 October 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 45–49.
  24. Laidig, D.; Lehmann, D.; Bégin, M.A.; Seel, T. Magnetometer-free realtime inertial motion tracking by exploitation of kinematic constraints in 2-DoF joints. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 1233–1238.
  25. Lehmann, D.; Laidig, D.; Seel, T. Magnetometer-free motion tracking of one-dimensional joints by exploiting kinematic constraints. Proc. Autom. Med. Eng. 2020, 1, 027.
  26. Tedaldi, D.; Pretto, A.; Menegatti, E. A robust and easy to implement method for IMU calibration without external equipments. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–5 June 2014; pp. 3042–3049.
  27. Fong, W.; Ong, S.; Nee, A. Methods for in-field user calibration of an inertial measurement unit without external equipment. Meas. Sci. Technol. 2008, 19, 085202.
  28. Kortier, H.G.; Sluiter, V.I.; Roetenberg, D.; Veltink, P.H. Assessment of hand kinematics using inertial and magnetic sensors. J. NeuroEng. Rehabil. 2014, 11, 70.
  29. Shepperd, S.W. Quaternion from rotation matrix. J. Guid. Control 1978, 1, 223–224.
Figure 1. IMUs on the dorsal side of the hand and fingertips. The inset shows the cluster of optical markers mounted on top of each IMU for the reference measurement of segment orientations using the optical Vicon system. Every cluster contains three markers, which determine a 3D coordinate frame.
Figure 2. Movement for task 1: rotations of the hand without varying the relative orientations between the hand and fingers. Subfigures (a,b) show one set of pronation and supination movements; subfigures (c,d) show another set with a different rotation axis. During this task, the pronation and supination movements were performed around different rotation axes.
Figure 3. Movement for task 2, the simple functional task, divided into several phases: (a) put the hand static on the desk; (b) raise the hand; (c) grasp the cup; (d) drink the water; (e) release the cup; (f) withdraw the hand.
Figure 4. Estimated orientation error $|\theta_{err}|$ with gyroscope and accelerometer (values within 99.3 percent coverage are shown in the boxplots). “G”, “A” and “G+A” denote results based on the gyroscope, the accelerometer, and the gyroscope plus accelerometer, respectively.
Figure 5. Relation between the output norms of the gyroscopes on the dorsal side of the hand and on the finger tip at different repetition rates.
Figure 6. Relation between the output norms of the accelerometers on the dorsal side of the hand and on the finger tip at different repetition rates.
Figure 7. Estimation error $|\theta_{err}|$ at different repetition rates (values within 99.3 percent coverage are shown in the boxplots). Subfigures (a–c) show the estimates based on the gyroscope plus accelerometer, the gyroscope only and the accelerometer only, respectively.
Figure 8. Relative orientation between the hand and thumb during the water-drinking process. Subfigure (a) shows the output norms of the two gyroscopes (on the hand and finger tip, respectively). Subfigure (b) shows the normalized SDs $\sigma_a$ and $\sigma_g$ from Equations (18) and (21); larger $\sigma_a$ and $\sigma_g$ mean a larger measurement error, so the EKF trusts the process model more and the measurement model less. Subfigure (c) shows the estimated results for different SDs of the process model; the variance of the process error $Q$ was set to $\sigma_p I_4$.
