Article

Real-Time Hand Position Sensing Technology Based on Human Body Electrostatics

State Key Laboratory of Mechatronics Engineering and Control, Beijing Institute of Technology, Beijing 100081, China
*
Author to whom correspondence should be addressed.
Sensors 2018, 18(6), 1677; https://doi.org/10.3390/s18061677
Submission received: 10 April 2018 / Revised: 17 May 2018 / Accepted: 17 May 2018 / Published: 23 May 2018
(This article belongs to the Special Issue Innovative Sensor Technology for Intelligent System and Computing)

Abstract

Non-contact human–computer interaction (HCI) based on hand gestures has been widely investigated. Here, we present a novel method for locating the real-time position of the hand using the electrostatics of the human body. The method has many advantages, including a delay of less than one millisecond, low cost, and no need for a camera or wearable devices. A formula for the sensing signals of an array of five spherical electrodes is first derived. Next, a solving algorithm for the real-time measured hand position is introduced, and the solving equations for the three-dimensional coordinates of the hand position are obtained. A non-contact real-time hand position sensing system was built for verification experiments, and the principle error of the algorithm and the systematic noise were also analyzed. The results show that this novel technology can determine the dynamic parameters of hand movements with good robustness and can meet the requirements of complicated HCI.

1. Introduction

Natural, harmonious, and highly efficient human–computer interaction (HCI) has become a trend in HCI research, and interaction based on human hand motion is one of its most important methods [1]. Generally, hand motion sensing systems can be divided into data glove-based, attached force-based, surface electromyography (SEMG)-based, optical marker-based, and vision-based capturing [2], which fall broadly into wearable and vision-based hand motion sensing methods [1,3]. The two approaches have their own characteristics. Wearable gesture recognition has high recognition accuracy and can capture the movement details of the hand. Vision-based gesture recognition obtains gesture information in a non-contact manner and can be applied to a wider range of fields. However, both wearable and vision-based gesture recognition methods have disadvantages, including poor user experience and vulnerability to environmental factors such as illumination [1,4]. Wearable electromyography (EMG) control systems require electrodes to be attached to the forearm or wrist, which demands clean skin and adequate electrode performance and imposes certain use restrictions [5]. Wearable data gloves are not only inconvenient to use, but shifts of the glove on the hand can easily introduce errors into the system [1,3]. Vision-based gesture interaction methods, in addition to imposing certain requirements on lighting and background, have relatively high computational complexity, perform poorly in real time, and have other shortcomings in target recognition and processing, such as very complex feature representations and inaccurate feature extraction caused by incorrect human segmentation [6]. To reach an accuracy of 90% or higher, the average processing time is longer than 30 ms [7]. A gesture recognition method based on terahertz radar and Doppler signatures avoids the disadvantages of the above two methods; however, the terahertz radar is large and expensive and is not suitable for widespread deployment [8].
Because the human body becomes electrostatically charged in many environments for a variety of reasons, the electrostatic charge carried by the hand disturbs the surrounding electric field when the hand moves. The disturbed field induces charge on electrodes placed in the space, so hand movement can be sensed by detecting the electrostatic induction signals on those electrodes. Detecting the hand position using the human body’s own electrostatic field is a non-contact method for recognizing hand movements that is not influenced by environmental factors such as lighting [9]. Therefore, real-time hand position sensing based on human body electrostatics can obtain hand movement trajectories in a non-contact manner, avoiding the light sensitivity and background interference of vision-based methods and the poor user experience of wearable devices. This research lays a theoretical and practical foundation for new non-contact human–computer interaction methods and is therefore of practical significance.
In the field of human body electrostatic detection, the Center for Physical Electronics and Quantum Technology (CPEQT) at the University of Sussex in the U.K. conducted an in-depth study of the electrostatic signal detection of the human heartbeat that was reported in Nature [10,11,12]. In the field of human movement detection and human–computer interaction, Takiguchi introduced a non-contact electrostatic detection system and applied it to detecting a moving human body [13,14]. Kurita detected human foot movements with a non-contact electrostatic detection method and applied it to throwing-motion analysis [15,16,17,18,19]. Four detection electrodes have also been used to measure hand movement in eight directions and applied to human–computer interaction. In 2015, we investigated sensing the direction and velocity of human hand movements with an electrostatic electrode array. Theoretical and experimental studies confirmed that the direction and velocity of hand movement can be obtained from the human body’s electrostatic information and applied to simple interactive games [9].
In our previous work, only the direction and velocity of hand movements were obtained with the electrostatic detection method; the spatial position of the operator’s hand was not obtained in real time, which restricted the application of this technology. With the rapid development of human–computer interaction technology, there is an increasing demand for complex operations in three-dimensional (3D) space, such as interactive operation in Virtual Reality/Augmented Reality (VR/AR) environments and 3D modeling, which require precise positioning of the operator’s hand. To this end, we developed a real-time hand position algorithm based on the electrostatic information of the human body, designed a new electrostatic electrode array structure, and devised a new method for solving the position of the charge source, thereby achieving real-time hand positioning.

2. Sensing Principle and Positioning Algorithm

In this paper, the sensing signals of a five-spherical-electrode array are formulated, and a method for calculating the position of the hand in real time is studied. The solution for the 3D coordinates of the hand position is derived, and the hand movement is obtained using the constrained scanning positioning method: by locating the charge source, the real-time position of the hand can be obtained. The algorithm is verified experimentally in Section 3.
This section mainly analyzes the induced-signal expressions of the five-spherical-electrode detection array and the principle of the constrained scanning positioning method.

2.1. Electrostatic Field Sensing Principle

The human body is a charged body with the hand as its extremity, so hand motion disturbs the electric field in the surrounding space. According to electrodynamics, as the moving hand changes the electric field in space, the induced charge on the surface of a conductor redistributes, generating an induced current in the conductor.
In this study, the induction signals of the spherical electrodes were quantitatively analyzed. Figure 1 shows the positional relationship between a spherical electrode and the hand target T. The spherical electrode is at the origin of the coordinate system, the hand carries a charge Q and moves along a certain trajectory, and the center of the hand is at a distance r from the center of the spherical electrode.
According to our previous research, when the human hand moves, the induced current signal of the spherical electrode can be expressed as [9]:
$$ i = \frac{dQ_E}{dt} = \frac{d}{dt}\!\left(\frac{R_0}{r}\,Q\right) = -\frac{Q R_0}{r^2}\frac{dr}{dt} \tag{1} $$
where Q is the target (hand) charge quantity, R0 is the spherical electrode radius, and r is the distance from the hand to the center of the spherical electrode.
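As a rough numerical illustration of Equation (1), the following Python sketch (ours, not from the paper; all numeric values are assumed examples) evaluates the induced current for a hand receding from the electrode at constant speed:

```python
# Illustrative sketch of Equation (1); all numeric values are assumed examples.
def induced_current(Q, R0, r, dr_dt):
    """Induced current of a single spherical electrode, i = d(R0*Q/r)/dt."""
    return -(Q * R0 / r ** 2) * dr_dt  # differentiating R0*Q/r with respect to time

# Example: hand charge 1 nC, electrode radius 2.5 cm, hand 0.5 m away, receding at 1 m/s
print(induced_current(Q=1e-9, R0=0.025, r=0.5, dr_dt=1.0))  # about -1e-10 A
```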
The real-time 3D position of the moving hand was detected with a five-spherical-electrode sensing array, and a Cartesian coordinate model of the detection array was established, as shown in Figure 2.
The coordinates of the charged target T are (x, y, z), with a charge quantity of Q0. Electrodes S1, S2, S3, and S4 are located at the four vertices of a square centered on the origin O with side length 2l in the XOY plane; their coordinates are S1(−l, −l, 0), S2(l, −l, 0), S3(l, l, 0), and S4(−l, l, 0). Electrode S5 is set on the z-axis at coordinates (0, 0, −h).
Let S be the effective sensing area of the spherical electrode and R0 the spherical electrode radius. According to Gauss's theorem, the induced current of electrode i (i = 1, 2, 3, 4, 5) under the electric field of the charge source T is:
$$
\begin{aligned}
i_1 &= \frac{Q_0 S}{4\pi}\left\{\frac{dz/dt}{\left[(x+l)^2+(y+l)^2+z^2\right]^{3/2}} - 3\left[\frac{(x+l)\,\dfrac{dx}{dt} + (y+l)\,\dfrac{dy}{dt} + z\,\dfrac{dz}{dt}}{\sqrt{(x+l)^2+(y+l)^2+z^2}}\right]\frac{z}{\left[(x+l)^2+(y+l)^2+z^2\right]^{2}}\right\}\\[2mm]
i_2 &= \frac{Q_0 S}{4\pi}\left\{\frac{dz/dt}{\left[(x-l)^2+(y+l)^2+z^2\right]^{3/2}} - 3\left[\frac{(x-l)\,\dfrac{dx}{dt} + (y+l)\,\dfrac{dy}{dt} + z\,\dfrac{dz}{dt}}{\sqrt{(x-l)^2+(y+l)^2+z^2}}\right]\frac{z}{\left[(x-l)^2+(y+l)^2+z^2\right]^{2}}\right\}\\[2mm]
i_3 &= \frac{Q_0 S}{4\pi}\left\{\frac{dz/dt}{\left[(x-l)^2+(y-l)^2+z^2\right]^{3/2}} - 3\left[\frac{(x-l)\,\dfrac{dx}{dt} + (y-l)\,\dfrac{dy}{dt} + z\,\dfrac{dz}{dt}}{\sqrt{(x-l)^2+(y-l)^2+z^2}}\right]\frac{z}{\left[(x-l)^2+(y-l)^2+z^2\right]^{2}}\right\}\\[2mm]
i_4 &= \frac{Q_0 S}{4\pi}\left\{\frac{dz/dt}{\left[(x+l)^2+(y-l)^2+z^2\right]^{3/2}} - 3\left[\frac{(x+l)\,\dfrac{dx}{dt} + (y-l)\,\dfrac{dy}{dt} + z\,\dfrac{dz}{dt}}{\sqrt{(x+l)^2+(y-l)^2+z^2}}\right]\frac{z}{\left[(x+l)^2+(y-l)^2+z^2\right]^{2}}\right\}\\[2mm]
i_5 &= \frac{Q_0 S}{4\pi}\left\{\frac{dz/dt}{\left[x^2+y^2+(z+h)^2\right]^{3/2}} - 3\left[\frac{x\,\dfrac{dx}{dt} + y\,\dfrac{dy}{dt} + (z+h)\,\dfrac{dz}{dt}}{\sqrt{x^2+y^2+(z+h)^2}}\right]\frac{z}{\left[x^2+y^2+(z+h)^2\right]^{2}}\right\}
\end{aligned}
\tag{2}
$$
Equation (2) is the induced current output of each element of the five-spherical-electrode detection array.
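The sketch below illustrates how Equation (2) could be evaluated numerically for a point charge moving past the array. It follows our reconstruction of Equation (2) above; the geometry constants l and h, the charge Q0, and the effective area S are assumed example values, and the function and variable names are our own:

```python
import numpy as np

# Sketch of Equation (2): induced currents of the five-electrode array for a point
# charge at p = (x, y, z) moving with velocity v. Geometry follows Figure 2; the
# numeric values of l, h, Q0 and S below are assumed for illustration only.
l, h = 0.15, 0.2          # half side length and S5 offset (m), assumed
Q0, S = 1e-9, 2e-3        # hand charge (C) and effective electrode area (m^2), assumed

ELECTRODES = np.array([
    [-l, -l, 0.0],        # S1
    [ l, -l, 0.0],        # S2
    [ l,  l, 0.0],        # S3
    [-l,  l, 0.0],        # S4
    [0.0, 0.0, -h],       # S5
])

def induced_currents(p, v):
    """i_k = (Q0*S/4pi) * [ vz/r_k^3 - 3*z*(dr_k/dt)/r_k^4 ], k = 1..5."""
    p, v = np.asarray(p, float), np.asarray(v, float)
    rel = p - ELECTRODES                   # charge position relative to each electrode
    r = np.linalg.norm(rel, axis=1)        # distances r_1 .. r_5
    dr_dt = rel @ v / r                    # radial velocity relative to each electrode
    return Q0 * S / (4 * np.pi) * (v[2] / r**3 - 3 * p[2] * dr_dt / r**4)
```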

2.2. Hand Position Solving Algorithm Based on Human Body Electrostatic Forces

By integrating the induced current of each element of the five-spherical-electrode detection array, the induced charge quantity of every electrode can be obtained. The induced charge ratios Q2/Q1, Q3/Q2, Q4/Q3, and Q5/Q1 are then calculated, eliminating the charge of the source and yielding equations that contain only the position parameters. Geometric constraints then confine the charge source to a circle. Finally, the position of the charge source is scanned along the circle, and its 3D coordinates are determined using the charge quantity of electrode 5 on the z-axis. We call this the constrained scanning positioning method.
Step 1: Current integration to obtain the charge quantity
Integrating the detected currents from Equation (2), Q1–Q5 can be obtained:
$$ Q_i = \int_0^{t} i_i\,dt \quad (i = 1, 2, 3, 4, 5) \tag{3} $$
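A minimal sketch of Step 1, assuming the five current channels are sampled at a fixed rate; the cumulative trapezoidal integration and the names used here are our own choices, not the paper's implementation:

```python
import numpy as np

def charges_from_currents(currents, dt):
    """Step 1: integrate each current channel over time (Equation (3)).

    currents -- array of shape (N, 5), one sampled current waveform per electrode
    dt       -- sampling interval in seconds
    Returns the cumulative induced charge, shape (N, 5).
    """
    # cumulative trapezoidal rule, starting from zero charge
    mid = 0.5 * (currents[1:] + currents[:-1]) * dt
    return np.vstack([np.zeros((1, currents.shape[1])), np.cumsum(mid, axis=0)])
```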
Step 2: Ratio elimination
Calculating the ratios of the induced charge quantities of the electrodes eliminates the charge quantity of the source, giving:
$$ \frac{Q_2}{Q_1} = \left(\frac{r_1}{r_2}\right)^2 = k_1,\quad \frac{Q_3}{Q_2} = \left(\frac{r_2}{r_3}\right)^2 = k_2,\quad \frac{Q_4}{Q_3} = \left(\frac{r_3}{r_4}\right)^2 = k_3,\quad \frac{Q_5}{Q_1} = \left(\frac{r_1}{r_5}\right)^2 = k_4 \tag{4} $$
Step 3: Spherical constraint
Taking the square root of $\left(\frac{r_1}{r_2}\right)^2 = k_1$ in Equation (4) gives:
$$ \frac{r_1}{r_2} = \sqrt{k_1} \tag{5} $$
where r1 is the distance between the charge source T and electrode S1, and r2 is the distance between T and S2. According to the definition of a sphere as "the locus of all points with a constant ratio of distances to two fixed points" [20], all points satisfying Equation (5) form a sphere, so the charge source T is constrained to this sphere. The center and radius of the sphere are calculated below.
To determine the spherical center and radius more easily, a coordinate transformation is first performed, as shown in Figure 3. Point T and the electrodes S1 and S2 are transformed from the 3D coordinate system to the plane coordinate system defined by T, S1, and S2. Because the coordinate transformation is rigid, it does not change the distance between any two points.
Suppose point T is (x′, y′), S1 is (−l, 0), and S2 is (l, 0) in the plane coordinate system X′O′Y′. If k1 = 1, point T lies on the perpendicular bisector plane of segment S1S2. If k1 ≠ 1, the squared ratio of the distances from T to S1 and S2 can be written as:
$$ \frac{(x'+l)^2 + y'^2}{(x'-l)^2 + y'^2} = k_1 \tag{6} $$
Rearranging Equation (6) into the standard form of a circle gives:
$$ \left(x' + \frac{1+k_1}{1-k_1}\,l\right)^2 + y'^2 = \left[\left(\frac{1+k_1}{1-k_1}\right)^2 - 1\right] l^2 \tag{7} $$
That is, in the plane X′O′Y′, point T lies on the circle with center $A'\left(-\frac{1+k_1}{1-k_1}l,\ 0\right)$ and radius $r'_1 = l\sqrt{\left(\frac{1+k_1}{1-k_1}\right)^2 - 1}$. The center A′ is collinear with S1 and S2.
When transforming back to the XYZ coordinate system, the center position and the radius are unchanged, and since A′ is collinear with S1 and S2, the center lies in the XOY plane. Therefore, in the XYZ coordinate system, the distance between point T and $A\left(-\frac{1+k_1}{1-k_1}l,\ -l,\ 0\right)$ is $r'_1 = l\sqrt{\left(\frac{1+k_1}{1-k_1}\right)^2 - 1}$. All points T on the sphere with center A and radius $r'_1$ satisfy Equation (5).
Similarly, in the plane TS2S3, the distance between point T and $B'\left(-\frac{1+k_2}{1-k_2}l,\ 0\right)$ is $r'_2 = l\sqrt{\left(\frac{1+k_2}{1-k_2}\right)^2 - 1}$. When transforming to XYZ space, the distance between point T and $B\left(l,\ -\frac{1+k_2}{1-k_2}l,\ 0\right)$ is $r'_2$, so point T is constrained to the sphere with center B and radius $r'_2$.
In the plane TS3S4, the distance between T and $C'\left(-\frac{1+k_3}{1-k_3}l,\ 0\right)$ is $r'_3 = l\sqrt{\left(\frac{1+k_3}{1-k_3}\right)^2 - 1}$. When transforming to XYZ space, the distance between point T and $C\left(\frac{1+k_3}{1-k_3}l,\ l,\ 0\right)$ is $r'_3$, so point T is constrained to the sphere with center C and radius $r'_3$.
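The spherical constraint for the S1–S2 pair can be summarized in a short helper. This is our own illustrative formulation of the center A and radius derived above (the spheres for the S2–S3 and S3–S4 pairs follow analogously):

```python
import numpy as np

def apollonius_sphere_s1s2(k1, l):
    """Step 3 for the S1-S2 pair: center A and radius of the constraint sphere.

    k1 -- charge ratio Q2/Q1 (must not equal 1)
    l  -- half of the square side length (m)
    """
    c = (1.0 + k1) / (1.0 - k1)
    center = np.array([-c * l, -l, 0.0])   # A lies on line S1S2 (y = -l) in the XOY plane
    radius = l * np.sqrt(c**2 - 1.0)       # from Equation (7)
    return center, radius
```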
Step 4: Circular constraint
Since S1, S2, S3, and S4 are coplanar, it is easy to prove that $A\left(-\frac{1+k_1}{1-k_1}l,\ -l,\ 0\right)$, $B\left(l,\ -\frac{1+k_2}{1-k_2}l,\ 0\right)$, and $C\left(\frac{1+k_3}{1-k_3}l,\ l,\ 0\right)$ are collinear. The three spherical surfaces with A, B, and C as centers obtained in the spherical constraint step therefore intersect in a common circle, so one of the spheres is redundant. Taking the two spherical surfaces with A and B as centers as an example, we implement the circular constraint.
As shown in Figure 4, using the two centers A and B, point T is constrained to two spherical surfaces that intersect in a circle; that is, point T is further constrained to a circle of radius r0 and center W. Since W lies on line AB, it lies in the XOY plane, and its coordinates are (x0, y0, 0). The specific values of x0, y0, and r0 can be obtained from the coordinates of A and B and the radii r′1 and r′2 through geometric relations.
Mathematically, the two spherical surfaces with A and B as centers suffice to constrain the charge source to a circle, but in actual use of the system, systematic errors may cause some deviation in the positions of A and B. The accuracy of the circular constraint can therefore be improved by fitting a straight line through A, B, and C.
Fitting a straight line through A, B, and C:
$$ y = a x + b \tag{8} $$
According to the positions of A, B, and C and the radii r′1, r′2, and r′3, we can determine that point T is located on the circle with radius r0 and center W(x0, y0, 0). The line in Equation (8) passes through the center W and is perpendicular to the plane containing the circle.
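A sketch of the circular constraint using standard sphere–sphere intersection geometry; this is our own formulation of how W and r0 could be computed from two constraint spheres, not code from the paper:

```python
import numpy as np

def intersection_circle(A, rA, B, rB):
    """Step 4: circle in which the two constraint spheres intersect.

    A, B   -- sphere centers (3-vectors); rA, rB -- their radii.
    Returns (W, r0, n): circle center, circle radius, and unit normal of the circle plane.
    Standard sphere-sphere intersection; assumes the spheres do intersect.
    """
    A, B = np.asarray(A, float), np.asarray(B, float)
    d = np.linalg.norm(B - A)                 # distance between the centers
    n = (B - A) / d                           # circle axis, along line AB
    a = (d**2 + rA**2 - rB**2) / (2.0 * d)    # signed distance from A to the circle plane
    W = A + a * n                             # circle center, on line AB (here in the XOY plane)
    r0 = np.sqrt(rA**2 - a**2)                # circle radius
    return W, r0, n
```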
Step 5: Angle Scanning
Represent the charge source T in polar form on the circle determined in Step 4. Let θ be the angle between line TW and the XOY plane; by scanning θ, the position of T as it moves along the circle can be expressed and then matched against the ratio k4 in Equation (4).
As shown in Figure 5, we set 0 < θ < π, because the hand is always on the positive z-axis side in human–computer interaction. The projection of TW onto the XOY plane is a straight line y = a1x + b1 with a1 = −1/a, where a is the slope of the line in Equation (8). The coordinates of point T are then expressed in terms of θ:
$$
\begin{cases}
x_T = \dfrac{r_0\cos\theta}{\sqrt{a_1^2+1}} + x_0\\[2mm]
y_T = \dfrac{a_1\,r_0\cos\theta}{\sqrt{a_1^2+1}} + y_0\\[2mm]
z_T = r_0\sin\theta
\end{cases}
\tag{9}
$$
Scan θ over the range (0, π) with a fixed step size until:
$$ \frac{(x_T - x_1)^2 + (y_T - y_1)^2 + (z_T - z_1)^2}{(x_T - x_5)^2 + (y_T - y_5)^2 + (z_T - z_5)^2} = \frac{Q_5}{Q_1} \tag{10} $$
where $(x_1, y_1, z_1)$ and $(x_5, y_5, z_5)$ are the positions of electrodes 1 and 5, respectively; in our system, $(x_1, y_1, z_1) = (-l, -l, 0)$ and $(x_5, y_5, z_5) = (0, 0, -h)$. The $(x_T, y_T, z_T)$ of Equation (9) that satisfies Equation (10) gives the actual coordinates of the hand.
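The angle-scanning step can be sketched as follows, implementing Equations (9) and (10) over a discrete grid of θ; the step count, the slope convention a1 = −1/a, and all names are our own assumptions:

```python
import numpy as np

def scan_hand_position(W, r0, a, ratio_q5_q1, S1, S5, n_steps=180):
    """Step 5: scan theta over (0, pi) and pick the point best satisfying Equation (10).

    W            -- circle center (x0, y0, 0)
    r0           -- circle radius
    a            -- slope of the fitted line through A, B, C (Equation (8))
    ratio_q5_q1  -- measured charge ratio Q5/Q1
    S1, S5       -- electrode positions (x1, y1, z1) and (x5, y5, z5)
    """
    S1, S5 = np.asarray(S1, float), np.asarray(S5, float)
    a1 = -1.0 / a                                        # slope of the projection of TW onto XOY
    ux, uy = 1.0 / np.sqrt(a1**2 + 1), a1 / np.sqrt(a1**2 + 1)
    best, best_err = None, np.inf
    for theta in np.linspace(1e-3, np.pi - 1e-3, n_steps):
        T = np.array([W[0] + r0 * np.cos(theta) * ux,    # Equation (9)
                      W[1] + r0 * np.cos(theta) * uy,
                      r0 * np.sin(theta)])
        ratio = np.sum((T - S1)**2) / np.sum((T - S5)**2)  # left-hand side of Equation (10)
        err = abs(ratio - ratio_q5_q1)
        if err < best_err:
            best, best_err = T, err
    return best
```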
The real-time position of the hand can be obtained by calculating the spatial coordinates of the charge source with a small amount of computation and high accuracy.
The calculated hand position is affected by system noise, which introduces some error. Since hand motion is a continuous trajectory, Kalman filtering can be used to reduce the position-solving error and improve system accuracy. At a sampling rate of 1 kHz, the calculated distance between adjacent hand positions is on the millimeter scale, so the motion between adjacent positions can be approximated as rectilinear. We therefore use a rectilinear (constant-velocity) motion model and set the velocity and acceleration covariances for the Kalman filter.
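A minimal sketch of the constant-velocity (rectilinear-motion) Kalman filter described above; the state layout and the covariance magnitudes are assumptions for illustration, not values from the paper:

```python
import numpy as np

def kalman_smooth(positions, dt=1e-3, q_acc=5.0, r_meas=1e-3):
    """Constant-velocity Kalman filter over a sequence of 3D position estimates.

    positions -- array of shape (N, 3) of raw solved hand positions
    dt        -- sampling interval (1 kHz sampling assumed)
    q_acc     -- assumed acceleration (process) noise variance
    r_meas    -- assumed measurement noise variance
    """
    F = np.eye(6)                      # state: [x y z vx vy vz]
    F[:3, 3:] = dt * np.eye(3)
    H = np.hstack([np.eye(3), np.zeros((3, 3))])
    Q = q_acc * np.diag([dt**4 / 4] * 3 + [dt**2] * 3)   # simple process-noise model
    R = r_meas * np.eye(3)

    x = np.hstack([positions[0], np.zeros(3)])
    P = np.eye(6)
    out = [positions[0]]
    for z in positions[1:]:
        x = F @ x                                        # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                              # update with new measurement z
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(6) - K @ H) @ P
        out.append(x[:3])
    return np.array(out)
```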

2.3. Design and Experiment of Non-Contact Interactive System Based on Human Body Electrostatic Forces

Using the electrostatic array layout and signal processing described above, the human–computer interactive system based on human body electrostatics can obtain hand movement parameters such as angle, direction, speed, and real-time 3D position; reconstruct the hand movement trajectory; judge the operator’s intentions; control and operate software running on the computer; and complete the human–machine interactive function. Figure 6 shows the application schematic of our human–machine interactive system based on human body electrostatics. With display devices (a display screen or projector) and a computer, the system allows operators to perform gesture-based interaction with various 3D software packages, such as virtual assembly, virtual experiment, and 3D modeling software, under a variety of lighting conditions and complex backgrounds without wearing any sensing equipment.
We designed a top-level human–computer interaction system based on human body electrostatics, built its hierarchical architecture, and divided the system into physical, information, and application layers. The physical layer is the physical basis of the whole system, including the sensing circuits and signal-processing hardware. The information layer processes the signals from the physical layer and mainly performs model and algorithm research, including solving for the position of the charge source according to the source azimuth model of the electric field. The information layer also performs pattern recognition and matching calculations for gesture information and finally outputs standard hand-movement identification information. The application layer uses the output of the information layer, constructs the human–machine interaction mode through the definition of hand gestures, and designs the human–machine interface to complete the HCI function based on electrostatic detection. The overall scheme is shown in Figure 7.
We built a real-time position measurement system for non-contact HCI based on electrostatic signals, as shown in Figure 8a. The system mainly consists of a five-spherical-electrode array, electrostatic sensing circuits, and data acquisition and processing units. The five spherical electrodes are arranged as shown in Figure 8a, and the vertical plane is defined as the XOY plane. The four spherical electrodes S1–S4 are arranged at the corners of a square with a side length of 0.3 m in the XOY plane, with the square center as the coordinate origin. The fifth spherical electrode, S5, serves as the center electrode and is arranged on the z-axis 0.2 m away from the XOY plane. We also built an otherwise identical experimental device using round planar electrodes instead of spherical electrodes. As shown in Figure 8b, this device has the same performance parameters as the spherical-electrode device and a more compact structure. In actual use, the electrodes can be made of transparent conductive materials such as Indium Tin Oxide (ITO) and integrated with a display screen to accommodate more application scenarios.

3. Results

3.1. Hand Real-Time Position Acquisition

A real-time position sensing system for non-contact HCI based on human body electrostatics was built and verified by experiments. The operator stood 1.5 m away from the electrode plane and moved one hand in the air. The trajectory of the hand movement was calculated using the induced current data measured by the electrode.
The results are shown in Figure 9 and Figure 10. The operator performed a spiral gesture; the detected induced current curves are shown in Figure 9a, and the induced charge curves obtained after processing are shown in Figure 9b.
We calculated the position points of the hand movement trajectory; the calculated hand positions are indicated by the red dots in Figure 10. The smooth hand movement trajectory obtained through Kalman filtering is indicated by the blue line, and the time delay was less than one millisecond.

3.2. System Accuracy Analysis and Verification

We used a Leap Motion controller as the reference detection system to verify the accuracy of the electrostatic detection system. The Leap Motion was placed in front of the spherical-electrode plane so that both the Leap Motion and the five-electrode sensing array could sense the hand motion. When the hand moves, the Leap Motion acquires its trajectory with a positioning accuracy better than a millimeter, while our system simultaneously acquires the hand movement trajectory. We adjusted the coordinate systems and sampling rates of the two measurement systems so that they shared the same measurement reference, and took the distance between corresponding measurement points of the two systems as the measurement error of our system. In this way, the accuracy of our system was verified.
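Assuming the two trajectories have already been resampled onto common time stamps and expressed in the same coordinate frame as described above, the per-sample error could be computed as in this short sketch (ours, for illustration):

```python
import numpy as np

def trajectory_error(estimated, reference):
    """Per-sample Euclidean distance between our estimate and the reference trajectory.

    Both arrays have shape (N, 3) and are assumed to be already aligned in time
    and expressed in the same coordinate frame.
    """
    err = np.linalg.norm(estimated - reference, axis=1)
    return err, err.max(), err.mean()
```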
The experimental scenario is shown in Figure 11. A person at a distance of 1.5 m from the electrode plane drew circles in the air, and the hand movement trajectory was solved using the induced current data measured by the electrode.
The solving result is shown in Figure 12, in which red indicates the calculated hand positions, blue indicates the hand movement trajectory after fitting, and green indicates the hand movement trajectory from the Leap Motion. As Figure 12 shows, the measured and calculated hand positions were generally distributed around a circular trajectory; after fitting the calculated positions with the Kalman filtering method, the fitted trajectory follows the circular movement of the hand well.
After testing by eight experimenters of different heights, genders, and clothing, the detection range of the system was found to be 0.1–2.3 m, and the system can respond to hand movements of 1–2 cm. The detection range of the Leap Motion, however, is only 0.025–0.6 m. We therefore moved the Leap Motion between measurements to verify the accuracy of the electrostatic detection system over its full 0.1–2.3 m detection range.
We determined the accuracy of the system using 160 groups of hand-circling data collected from the eight experimenters.
We analyzed the hand position errors of these measurements, and the results are shown in Figure 13, in which the horizontal axis is the sample index n and the vertical axis is the solved hand position error in meters. The maximum error is 0.043 m, which meets the needs of 3D human–computer interaction tasks such as playing games or handwriting.

4. Discussion

According to the experimental results, an accurate hand movement trajectory curve was obtained using our real-time position sensing system based on the hand’s electrostatic signals. However, the obtained hand positions had a certain error and were distributed around the hand’s actual position. A model of the real-time hand position sensing system was therefore built in MATLAB, and the hand position was calculated both under ideal conditions and with measurement error added. The precision of the hand trajectory resolution was analyzed for different system measurement errors.

4.1. Situation with Measurement Error of Zero

In the simulation, the charge of the source was set to 10⁻⁹ C, the electrode spacing to 0.3 m, and the electrode radius to 0.025 m, and the charge source T moved at a velocity of 1 m/s along a preset path. The induced current and induced charge of each electrode were calculated, and the 3D coordinates of the charge source were solved in real time with the constrained scanning positioning method. The positioning error was obtained by comparing the calculated coordinates with the actual coordinates of the charge source.
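A sketch of how the ideal-case simulation could be assembled from the earlier sketches (induced_currents and charges_from_currents are the hypothetical helpers defined above); the charge, electrode spacing, speed, and sampling rate follow the text, while the path itself and the scaffolding are our own:

```python
import numpy as np

# Ideal-case simulation sketch: a 1 nC charge moving at 1 m/s along a straight path,
# sampled at 1 kHz; reuses the induced_currents / charges_from_currents sketches
# given earlier (our own helpers, not the paper's code).
dt = 1e-3
t = np.arange(0.0, 1.0, dt)
path = np.stack([0.1 * np.ones_like(t),          # constant x
                 -0.2 + 1.0 * t,                 # moving along +y at 1 m/s
                 0.5 * np.ones_like(t)], axis=1) # constant height z
velocity = np.array([0.0, 1.0, 0.0])

currents = np.array([induced_currents(p, velocity) for p in path])  # Equation (2)
charges = charges_from_currents(currents, dt)                       # Equation (3)
# Each row of `charges` then feeds Steps 2-5 to recover the position in real time,
# and the recovered trajectory is compared with `path` to obtain the positioning error.
```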
The simulation results are shown in Figure 14, in which green represents the actual position of the charge source and red represents the calculated position. The solved position and the actual position coincide exactly.
From the above two simulations (the linear and circular trajectories in Figure 14), we conclude that in an ideal, noise-free case, the real-time solution accuracy of the charge source position is very high, meaning the algorithm has no principle error.

4.2. Adding Measurement Error

Since the system uses a high-gain amplifier circuit to detect the weak current signal, thermal noise and shot noise are the dominant system noise and are considered the main sources of measurement error. The thermal and shot noise of the measurement circuit were simulated by adding Gaussian white noise to the simulated signals, and the positioning error caused by this measurement error was analyzed.
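One way the 10% noise could be added is sketched below; interpreting "10% noise" as Gaussian white noise with a standard deviation equal to 10% of each channel's RMS amplitude is our assumption, since the exact definition is not given in the text:

```python
import numpy as np

def add_noise(currents, level=0.10, rng=None):
    """Add Gaussian white noise to the simulated current channels.

    `level` is interpreted here as the noise standard deviation relative to the
    RMS amplitude of each channel (our assumption for "10% noise").
    """
    rng = np.random.default_rng() if rng is None else rng
    rms = np.sqrt(np.mean(currents**2, axis=0))
    return currents + rng.normal(0.0, level * rms, size=currents.shape)
```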
Figure 15 shows the solved position of the charge source with 10% noise added to the signal, in which blue represents the actual position of the charge source and red denotes the position calculated from the induced charges of the five electrodes.
Figure 16 shows the position error and velocity distribution of the charge source with 10% noise. Figure 16a shows the position error analysis, with the sample index n on the abscissa and the calculated position error in meters on the ordinate; the maximum error was 0.045 m. Figure 16b compares the actual and calculated moving velocities of the charge source, with the sample index n on the abscissa and the velocity in m/s on the ordinate. The red line represents the actual velocity of 1 m/s, and the blue line represents the calculated velocity, with a maximum of 1.08 m/s, i.e., an error of less than 8%.
The simulation results show that, when positioning the human hand with the body’s electrostatic signal, noise has some influence on the solved hand position and velocity, but with 10% noise the maximum distance error was less than 0.045 m and the velocity error did not exceed 8%.
The position error caused by the system measurement error is a random error, and no sudden change occurs in the hand position during movement because the hand moves along a continuous trajectory. Therefore, the influence of the random error can be reduced, and the positioning accuracy improved, by center-fitting adjacent position points or by Kalman filtering.

5. Conclusions

The method we introduced for solving the real-time hand position was able to obtain the position of the hand with high accuracy from the hand’s electrostatic signal, and the method has no principle error. The real-time hand position sensing system has some positioning error due to systematic measurement error: with 10% noise, the maximum distance error was less than 0.045 m and the velocity error did not exceed 8%. This error can be further reduced by multi-point fitting.
After the subject’s movement trajectory is further identified, the function of mouse-like interactive functions such as selecting, dragging, opening, closing through the combination of hand positions and hand motion trajectories, and handwriting input can also be achieved. We have already started this research and will present it in a later paper.
Therefore, real-time tracking and calculation of the hand position by measuring the electrostatic signal of the moving hand can determine the hand movement trajectory well, meeting the demands of higher accuracy, more complex trajectory tracking, and richer human–computer interaction.

6. Patents

Xi Chen; Kai Tang; Pengfei Li; Wei Wang. Motion charge source real-time location sensing method. 201610516862.9, 2016.11.16.
Pengfei Li; Xi Chen; Kai Tang; Chuang Wang. Motion charge source movement speed and direction sensing method. 201610516790.8, 2016.10.26.

Author Contributions

Conceptualization, K.T. and X.C.; Methodology, K.T. and X.C.; Software, K.T. and Y.W.; Validation, K.T. and C.W.; Formal Analysis, Y.W. and P.L.; Investigation, C.W.; Resources, K.T., X.C. and P.L.; Data Curation, K.T. and X.C.; Writing-Original Draft Preparation, K.T. and X.C.; Visualization, K.T.; Supervision, X.C.; Project Administration, X.C.; Funding Acquisition, K.T., X.C. and P.L.

Funding

This work was financially supported by grants from National Natural Science Foundation of China (#51777010, #51707008, #U1630130, #51407009).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cheng, J.; Xie, C.; Bian, W.; Tao, D. Feature fusion for 3D hand gesture recognition by learning a shared hidden space. Pattern Recognit. Lett. 2012, 33, 476–484. [Google Scholar] [CrossRef]
  2. Xue, Y.; Ju, Z.; Xiang, K.; Chen, J.; Liu, H. Multimodal Human Hand Motion Sensing and Analysis—A Review. IEEE Trans. Cogn. Dev. Syst. 2018, 8920, 1–14. [Google Scholar] [CrossRef]
  3. Liu, J.; Pan, Z.; Li, X. An accelerometer-based gesture recognition algorithm and its application for 3D interaction. Comput. Sci. Inf. Syst. 2010, 7, 177–188. [Google Scholar] [CrossRef]
  4. Ogino, H.; Arita, J.; Tsuji, T. A wearable pointing device using EMG signal. J. Robot. Mechatron. 2005, 17, 173–180. [Google Scholar] [CrossRef]
  5. Geng, W.; Du, Y.; Jin, W. Gesture recognition by instantaneous surface EMG images. Sci. Rep. 2016, 11. [Google Scholar] [CrossRef] [PubMed]
  6. Ji, X.; Wang, C.; Ju, Z. A New Framework of Human Interaction Recognition Based on Multiple Stage Probability Fusion. Appl. Sci. 2017, 7, 567. [Google Scholar] [CrossRef]
  7. Chevtchenko, S.F.; Vale, R.F.; Macario, V. Multi-objective optimization for hand posture recognition. Expert Syst. Appl. 2018, 92, 170–181. [Google Scholar] [CrossRef]
  8. Zhou, Z.; Cao, Z.; Pi, Y. Dynamic Gesture Recognition with a Terahertz Radar Based on Range Profile Sequences and Doppler Signatures. Sensors 2018, 18. [Google Scholar] [CrossRef]
  9. Tang, K.; Chen, X.; Zheng, W.; Han, Q.; Li, P. A Non-contact Technique Using Electrostatics to Sense Three-dimensional Hand Movement for Human Computer Interaction. J. Electrost. 2015, 77, 101–109. [Google Scholar] [CrossRef]
  10. Harland, C.J.; Clark, T.D. Remote Detection of Human Electroencephalograms Using Ultrahigh Input Impedance Electric Potential Sensors. Appl. Phys. Lett. 2002, 81, 3284–3286. [Google Scholar] [CrossRef]
  11. Harland, C.J.; Clark, T.D.; Prance, R.J. Electric Potential Probes-New Directions in the Remote Sensing of the Human Body. Meas. Sci. Technol. 2002, 13, 163–169. [Google Scholar] [CrossRef]
  12. Harland, C.J.; Clark, T.D.; Prance, R.J. High Resolution Ambulatory Electrocardiograph Monitoring Using Wrist Mounted Electric Potential Sensors. Meas. Sci. Technol. 2003, 14, 923–928. [Google Scholar] [CrossRef]
  13. Takiguchi, K.; Wada, T.; Toyama, S. Human Body Detection that Uses Electric Field by Walking. J. Adv. Mech. Des. Syst. Manuf. 2007, 1, 294–305. [Google Scholar] [CrossRef]
  14. Takiguchi, K.; Wada, T.; Toyama, S. Rhythm Pattern of Sole through Electrification of the Human Body When Walking. J. Adv. Mech. Des. Syst. Manuf. 2008, 2, 429–440. [Google Scholar] [CrossRef]
  15. Kurita, K. Detection of Human Respiration Based on Measurement of Current Generated by Electrostatic Induction. Artif. Life Robot. 2010, 15, 181–184. [Google Scholar] [CrossRef]
  16. Kurita, K. Human Heartbeat Measurement on the Basis of Current Generated by Electrostatic Induction. Rev. Sci. Instrum. 2011, 82, 026105. [Google Scholar] [CrossRef] [PubMed]
  17. Kurita, K. Novel Non-contact and Non-attached Technique for Detecting Sports Motion. Measurement 2011, 44, 1361–1366. [Google Scholar] [CrossRef]
  18. Kurita, K.; Ueta, S. A New Control Method for Bipedal Robot Based on Noncontact and Nonattached Human Movement Sensing Technique. IEEE Trans. Ind. Appl. 2011, 47, 1022–1027. [Google Scholar] [CrossRef]
  19. Prance, R.J.; Debray, A.; Clark, T.D.; Prance, H.; Nock, M.; Harland, C.J.; Clippingdale, A.J. An Ultra-low-noise Electric-potential Probe for Human-body Scanning. Meas. Sci. Technol. 2000, 11, 1–7. [Google Scholar] [CrossRef]
  20. Feynman, R.P.; Leighton, R.B.; Sands, M. The Feynman Lectures on Physics; The New Millennium Edition; Basic Books: New York, NY, USA, 2011; Volume 2. [Google Scholar]
Figure 1. Schematic of the spherical electrode.
Figure 2. Planar electrode layout.
Figure 3. Schematic of coordinate transformation: (a) three-dimensional (3D) coordinates and (b) two-dimensional (2D) coordinates.
Figure 4. Schematic of the circular constraint.
Figure 5. Schematic of angle scanning.
Figure 6. Schematic of the human–computer interaction (HCI) system based on the electrostatics of the human body.
Figure 7. Architecture diagram depicting the overall scheme.
Figure 8. Photograph of our experimental equipment: (a) spherical electrode device and (b) round planar electrode device.
Figure 9. (a) Induced current curve and (b) induced charge curve after processing.
Figure 10. Hand position and trajectory curve obtained by the constrained scanning positioning method.
Figure 11. Experimental scene.
Figure 12. Measured results: (a) side view and (b) top view.
Figure 13. Position error result.
Figure 14. Charge source real-time position solution results for (a) a linear trajectory and (b) a circular trajectory.
Figure 15. Position solution results with 10% noise added to the signal for (a) a linear trajectory and (b) a circular trajectory.
Figure 16. (a) Position and (b) velocity deviation results with 10% noise in the signal.
