Data Descriptor

Dataset on Force Myography for Human–Robot Interactions

Umme Zakia and Carlo Menon
1 Menrva Research Group, School of Mechatronic Systems Engineering and Engineering Science, Simon Fraser University, Metro Vancouver, BC V5A 1S6, Canada
2 Biomedical and Mobile Health Technology Laboratory, Department of Health Sciences and Technology, ETH Zurich, Lengghalde 5, 8008 Zurich, Switzerland
* Author to whom correspondence should be addressed.
Data 2022, 7(11), 154; https://doi.org/10.3390/data7110154
Submission received: 29 June 2022 / Revised: 3 November 2022 / Accepted: 4 November 2022 / Published: 8 November 2022

Abstract

Force myography (FMG) is a contemporary, non-invasive, wearable technology that can read underlying muscle volumetric changes during muscle contraction and expansion. The FMG technique can be used to recognize human-applied hand forces during physical human–robot interaction (pHRI) via data-driven models. Several FMG-based pHRI studies were conducted in 1D, 2D, and 3D during dynamic interactions between a human participant and a robot to recognize the forces applied by the human in intended directions during certain tasks. Raw FMG signals were collected via 16-channel (forearm) and 32-channel (forearm and upper arm) FMG bands while interacting with a biaxial stage (linear robot) and a serial manipulator (Kuka robot). In this paper, we present the datasets and their structure, the pHRI environments, and the collaborative tasks performed during the studies. We believe these datasets can be useful in future studies on FMG-biosignal-based pHRI control design.
Dataset License: CC BY-NC-ND 4.0

1. Summary

In industrial physical human–robot interaction (pHRI), a human worker mainly interacts using hand forces to perform a collaborative task. Commercially available force sensors attached to the robot can read the applied force, but they require the worker to apply force directly on the sensor and confine the worker’s movement within the workspace. On the other hand, measuring human-applied force via wearable sensors can be advantageous, allowing unrestricted movement while including human biofeedback in the control loop. Among many wearables, force myography (FMG) is a contemporary, non-invasive, affordable technique that can map exerted forces from muscle contractions via force sensing resistors (FSRs) [1,2]. An FMG band wrapped around the upper extremities can read voluntary muscle changes during isometric hand force, grasping force, or interactive applied hand force during dynamic arm movements [3,4,5,6,7,8,9,10].
In the literature, studies on human–machine interaction (HMI) or human–robot interaction (HRI) are mainly monitored using vision systems, proximity detectors, wideband/radio frequencies, and invasive or non-invasive sensory systems. Only a few HRI datasets are publicly available, such as: PinSoRo [11] and DREAM [12] for understanding social constructions of childhood; CADDY [13] and Aquaticus [14] for underwater interactions with machines; MAHNOB-HCI [15] for understanding emotions; MHHRI [16], P2PSTORY [17] and UE-HRI [18] for understanding behavioral and socio-emotional profiles; and TacAct [19] for tactile information during interactions with robots. These datasets contain video and audio recordings, image depth sensing, skin temperature, eye-gaze tracking, physiological sensors, tactile sensors and acceleration sensors.
Including human biosignals to control or interact with machines is an ongoing research area. Many studies used biosignals such as electroencephalography (EEG), electrooculography (EOG) and electromyography (EMG) for neuronal association with machines. These biosignals are mostly acquired from specialized tissue, an organ, or the nervous system. Research has shown that including human feedback via one or more of these biosignals is feasible [20,21,22,23,24,25,26]. However, implementing human–machine interaction via these signals is sometimes impractical because of poor signal quality, bulky signal processing equipment, and restricted human movement. Alternatively, many HRI studies [27,28,29,30,31,32,33,34] utilized the traditional, non-invasive, wearable surface electromyography (sEMG) technique, which measures the electrical activity of underlying muscle bundles. However, none of these myography-based HRI datasets are publicly available. Interestingly, the FMG technique has been found comparable to the conventional sEMG technique in hand gesture recognition, rehabilitation, and prosthetic control applications for human–machine interaction studies. Research has shown that FMG has advantages over sEMG, including lower cost, smaller Bluetooth-enabled signal processing units, and ease of wearing the band, making it a strong choice for HMI [35,36,37]. Recently, FMG-based human–robot collaboration (HRC) tasks were conducted in which an industrial YuMi robot avoided collision with a human participant by recognizing intentional or random hand movements [38]. In separate HRC studies, grasping forces derived from FMG data were used to recognize the tool (object) grasped by the human worker during a shared task [39,40].
Very recently, a few FMG-based pHRI studies were conducted by the authors of this article, in which the human-applied interaction force in dynamic motion was predicted to perform a collaborative task. To our knowledge, there is no other similar work on predicting human interactive forces for HRC tasks, partly because of the uncertain and dynamic HRI environment and the research funding and trained personnel such studies require. Understanding human intentions during interaction can be vital in developing safe collaboration between a human worker and a manipulator. Hence, this paper releases a dataset, under the CC BY-NC-ND 4.0 license, collected during the studies conducted in [6,7,8,9,10], in which several human participants interacted dynamically with a biaxial stage and a Kuka robot. These experimental data were collected using the setups described below. A participant interacted with the robot, applying hand forces in several dynamic arm movements that represented common activities during human–robot interaction. FMG bands wrapped around the forearm, or the upper arm, captured the muscle readings of a participant during an activity with the robot, and these readings were mapped to an applied force in a certain motion via a trained model. To our knowledge, there is no other publicly available FMG dataset for pHRI applications. Hence, this dataset provides a starting point for future research and reduces the need for new data collection. These data will also help researchers understand the nature of the sensory signals that capture muscle activity during certain pHRI interactions. Our goal is to provide insight into FMG signals and their applicability as a safety band in human–robot interaction, and to inspire other researchers to work on this dataset. As the studies on FMG-based pHRI have demonstrated its viability [6,7,8,9,10], we hope that this release will help fill the gap in available datasets and facilitate future research in biosignal-based HRI control design.

2. Dataset Collection Instrumentation

This dataset contains FMG data collected from two separate pHRI setups in which a participant interacted with (i) a biaxial stage (2-DoF linear robot) and (ii) a serial manipulator (7-DoF Kuka robot). FMG data were collected from (a) the upper arm and forearm during interactions with the biaxial stage, and (b) the forearm only during interactions with the Kuka robot. Each row of FMG data has a corresponding true force reading (N) from a 6-axis force–torque (FT) sensor.
Participants in these studies were healthy and right-handed, with an average age of 33 ± 8 years. They acknowledged the study protocol and signed the consent forms approved by the Office of Research Ethics at Simon Fraser University, Canada. In total, data from 18 participants are presented in this repository. Each participant is identified only by a subject ID (SubID: S1, S2, …, S18); no personal data that could identify them are included in this public release.
  • FMG Bands
Data presented in this paper were collected using two custom-made wearable FMG bands worn on the upper arm and/or forearm muscle bellies during pHRI interactions, as shown in Figure 1. The bands were made of force sensing resistors (FSRs) whose resistance changes when muscles contract. During interactions, the resistance of these FSRs decreased as pressure increased (muscle contraction) and vice versa. Each band had 16 FSRs (TPE 502C, Tangio Printed Electronics, Vancouver, BC, Canada) and a length of roughly 30 cm. Data acquisition devices from National Instruments (NI USB 6259 and 6341, National Instruments, Austin, TX, USA) were used to collect data from these bands at 10–50 Hz. The FMG data presented in the .csv files are the measured voltage drops across the voltage divider (10–20 kΩ base resistor) against each FSR. A more detailed description of the FMG band can be found in [1]. Each row in a file corresponds to 10–100 ms of the time series, depending on the setup used.
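For orientation, if each channel is read across the base resistor of a simple voltage divider (an assumption made here for illustration; the exact circuit is described in [1]), the recorded voltage relates to the FSR resistance roughly as V_out = V_cc · R_base / (R_base + R_FSR), so a muscle contraction that lowers R_FSR raises the recorded channel voltage.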
  • The Biaxial Stage (2-DoF Linear Robot)
This 2-DoF linear robot consisted of two linear stages (X-LSQ450B, Zaber Technologies, Vancouver, BC, Canada) providing 450 × 450 mm of translational travel in the X–Y plane. The bottom linear stage moved in the X direction and the upper stage in the Y direction, as shown in Figure 2. A customized 3D-printed, knob-like gripper was mounted on top of the biaxial stage. Admittance control allowed compliant collaboration, enabling participants to grab the gripper and apply forces to slide it in any intended direction in real time. A 6-axis FT sensor (NI DAQ 6210, National Instruments, Austin, TX, USA) was placed inside the gripper as the true label generator. Detailed information on this setup can be found in [6].
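As a rough illustration of the admittance idea (a minimal sketch only, not the controller used in [6]; the virtual mass and damping values below are hypothetical), the measured hand force can be mapped to a commanded stage velocity as follows:

```python
# Minimal admittance-control sketch (illustrative only; the virtual mass and
# damping values below are hypothetical, not those used in the original study).
def admittance_step(force_n, prev_velocity=0.0, virtual_mass=2.0,
                    virtual_damping=15.0, dt=0.02):
    """One forward-Euler step of m*dv/dt + b*v = F; returns the new velocity (m/s)."""
    accel = (force_n - virtual_damping * prev_velocity) / virtual_mass
    return prev_velocity + accel * dt

# Example: a steady 5 N push gradually accelerates the gripper until damping balances it.
v = 0.0
for _ in range(10):
    v = admittance_step(5.0, prev_velocity=v)
print(f"commanded velocity after 10 steps: {v:.3f} m/s")
```

Under such a law the stage yields in whichever direction the participant pushes, which is what permits the free-form interactions described in this section.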
  • The Serial Manipulator (7-DoF Kuka Robot)
The advanced KUKA LBR IIWA 14 R820 collaborative robot featured a 14 kg payload with an 820 mm reach. It came with built-in torque sensors in all joints except the end-effector and had its own controller, the Kuka Sunrise Cabinet. Torque control ensured compliant collaboration such that the displacements and trajectories of the robot were governed by the hand forces applied at its end-effector.
Figure 2. A biaxial stage with a knob-like gripper mounted on its top.
For pHRI, a custom-made cylindrical gripper was attached as the end-effector via a customized adapter. A 6-axis FT sensor (NI DAQ 6210, National Instruments, Austin, TX, USA) was placed between the gripper and the adapter for true label generation. The orientation of the gripper was kept at {0, π, 0} for ease of grasping, as shown in Figure 3a.
For an object transportation task during human–robot collaboration (HRC), a 45 cm rectangular wooden rod was attached to the end-effector of the robot via a custom-made adapter, as shown in Figure 3b. The rod lay horizontally along the X axis, with one end free for the participant to grasp and apply force to move it from point A to point B in 3D space. The 6-axis FT sensor, placed between the adapter and the end-effector, served as the true force label generator. Further details of both setups can be found in [9].
Figure 3. Kuka robot with different end-effectors for a participant to grab and interact with (a) a cylindrical gripper, and (b) a wooden rod.

3. Dataset Association

3.1. Dataset 1: pHRI between Human Participants and the Biaxial Stage

A total of 17 participants’ (subjects’) pHRI data with the biaxial stage are presented in Dataset 1 as ‘pHRI_Biaxial_Stage’. Five dynamic arm motions in Cartesian space, namely “x-direction (X)”, “y-direction (Y)”, “diagonal (DG)”, “square (SQ)”, and “diamond (DM)”, were considered as the intended path trajectories for a participant to follow. During interactions, the participant, wearing upper arm and forearm FMG bands, grasped the gripper and interacted continuously for a certain time in a sinusoidal fashion, as shown in Figure 4 and Table 1. The detailed study protocol and analysis are available in [6].
For model generalization, a multiple-source dataset was constructed from participants S5, S6, S7, S8, and S10. The trained model was evaluated on returning participants (S3, S4, and S9) and new participants (S11–S17). Detailed descriptions can be found in [7].
For domain adaptation and generalization, data were collected from participant S6 while interacting in square motions of different sizes (‘SQ-diffSize’) [8]. For data collection and evaluation, interactions were performed continuously for a certain time in a sinusoidal motion on the planar surface, in the directions indicated by the arrows in Table 1.

3.2. Dataset 2: pHRI between a Human Participant and a Manipulator

In this dataset, raw FMG signals from a 16-channel forearm band are presented as ‘pHRI_Manipulator’. It contains data from interactions between participant S18 and a manipulator (Kuka robot) in 1D, 2D, and 3D. During pHRI, the Kuka robot had a cylindrical gripper as the end-effector, as shown in Figure 5a [9,10]. The participant grabbed the cylindrical gripper, applied hand forces, and moved the gripper in 1D (X, Y, Z directions), 2D (XY, YZ, XZ planes) and 3D (XYZ space). The robot followed the directions and trajectories of the human participant during interactions.
For collaborative tasks, a wooden rod was attached to the robot’s flange. Participant S18 grabbed the open end of the rod and moved it along half-circle trajectories in 3D from point A to point B, as shown in Figure 5b [9]. Detailed descriptions of these study protocols and the data analysis are available in [9,10].

4. Dataset Description

4.1. Dataset 1: pHRI_Biaxial_Stage

Raw FMG signals from the 32-channel bands (two bands), along with the true force label(s), are saved as .csv files with the corresponding subject identification (SubID: S1, S2, …, S17) and presented in tabular format. For the first ten participants (S1–S10), 5 repetitions (Rep0–Rep4) of training data are included for each arm motion (1D-X, 1D-Y, 2D-DG, 2D-SQ, and 2D-DM), collected during the study carried out in [6]. For the 1D-X and 1D-Y directions, there are 33 columns in each data file, of which the first column is the true force label (Fx or Fy) in newtons (N). Positive force values (+N) indicate interactive forces in one direction, while negative values (−N) indicate the opposite direction because of the continuous sinusoidal movements. The remaining 32 columns are the 32-channel FMG data (32-feature space) collected from the upper arm and forearm bands. Data files for the 2D-DG, 2D-SQ, and 2D-DM interactions have 34 columns, the first two of which are force readings from the FT sensor in the X and Y directions (Fx, Fy). File names follow a notation such as ‘pHRI_BiaxialStage_S5_2D_DG_Rep3.csv’, indicating the 4th repetition of FMG data during interactions between participant S5 and the biaxial stage in the diagonal direction. A total of 100,000 × 34 data samples were collected during this study.
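For readers who want to inspect these files programmatically, a minimal loading sketch is given below; it assumes the .csv files have no header row and follow the column layout described above (the file name is one of the examples quoted in the text, and the local path is hypothetical).

```python
import pandas as pd

# Hypothetical local path; the name follows the convention described above.
path = "pHRI_BiaxialStage_S5_2D_DG_Rep3.csv"

# 2D files: columns 1-2 hold the true force labels (Fx, Fy) in newtons,
# columns 3-34 hold the 32-channel FMG readings (upper arm + forearm bands).
df = pd.read_csv(path, header=None)   # assumes no header row
forces = df.iloc[:, :2].to_numpy()    # shape: (n_samples, 2)
fmg = df.iloc[:, 2:].to_numpy()       # shape: (n_samples, 32)
print(forces.shape, fmg.shape)
```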
In the study conducted in [7], two repetitions (Rep0–Rep1) of interactive data in 1D-X and 2D-DG were collected from seven new participants (S11–S17) and were used to calibrate a long-term generalized model. As existing users, participants S3, S4, and S9 also provided 2 repetitions of 1D-X and 2D-DG data. The naming convention for these returning participants is ‘pHRI_BiaxialStage_S9_1D_X_Session2_Rep0.csv’, with the additional tag ‘Session2’ indicating a second data-collection session. Each file has 400 rows of data. In addition to the existing dataset from the previous study [6], a total of 16,000 × 34 samples were collected for zero-shot learning.
The study conducted in [8] collected pHRI data between participant S6 and the biaxial stage in the ‘2D-SQ-diffSize’ dynamic motion. A total of 16 repetitions of data, collected in four separate sessions, are included in the dataset. Data from the first three sessions (14 repetitions) were used for pretraining a deep learning model, and the final session’s data were used for fine-tuning [8]. These files have 600 rows and 34 columns, where the first two columns are the true force labels (Fx, Fy) and the rest are the FMG feature space. A total of 9600 × 34 samples were collected for pretraining and fine-tuning. The files follow a naming convention such as ‘pHRI_BiaxialStage_S6_2D_SQ_diffSize_Rep0.csv’, corresponding to the first repetition of interactions in the 2D-SQ-diffSize motion with the biaxial stage. For model generalization, the pretrained model was evaluated on five participants (S1–S5) during interactions in the ‘2D-SQ’ motion [8]. Figure 6 shows the FMG signals capturing interactions with the biaxial stage in diagonal motion for a male and a female participant. The plots show that the participants interacted with the stage at their own pace and level of applied force.

4.2. Dataset 2: pHRI_Manipulator

All FMG data from interactions with the serial manipulator in the studies conducted in [9,10] are gathered in this dataset, saved as .csv files and presented in tabular format. As before, the FT sensor data (true labels) have positive force values (+N) indicating applied force in one direction and negative values (−N) in the opposite direction. To obtain clean interaction data, the first and last 50 rows of each file can be discarded.
Five (5) repetitions (Rep0–Rep4) of data were collected for each dimension (1D, 2D, and 3D), and the naming format follows the notation starting with ‘pHRI_Manipulator’. Each file has 19 columns, the first three of which are the true force labels (Fx, Fy, Fz) in newtons (N) and the last 16 of which are the 16-channel FMG feature space. Data files have names such as ‘pHRI_Manipulator_S18_1D_X_Rep0.csv’, indicating the first repetition of interaction data in the 1D-X direction between participant S18 and the manipulator. Likewise, ‘pHRI_Manipulator_S18_2D_XZ_Rep4.csv’ and ‘pHRI_Manipulator_S18_3D_XYZ_Rep2.csv’ indicate the final repetition of interactions between S18 and the Kuka robot in the 2D-XZ plane and the third repetition in the 3D-XYZ space, respectively. Figure 7 shows a few plots of the FMG signals during interactions with the Kuka robot in certain directions, such as 1D-X, 1D-Y, and 3D-XYZ.
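A similar sketch for Dataset 2 is shown below, including the trimming of the first and last 50 rows suggested above (again assuming no header row; the path is hypothetical).

```python
import pandas as pd

path = "pHRI_Manipulator_S18_3D_XYZ_Rep2.csv"   # hypothetical local path

df = pd.read_csv(path, header=None)   # assumes no header row
df = df.iloc[50:-50]                  # discard the first and last 50 rows as suggested above
forces = df.iloc[:, :3].to_numpy()    # Fx, Fy, Fz in newtons
fmg = df.iloc[:, 3:].to_numpy()       # 16-channel forearm FMG features
print(forces.shape, fmg.shape)
```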
For the collaborative task of moving the wooden rod in 3D, five repetitions of FMG data were collected, with names such as ‘pHRI_Manipulator_S18_3D_XYZ_HRC_Rep3.csv’, where ‘HRC’ denotes human–robot collaboration, ‘Manipulator’ denotes the Kuka robot, and ‘Rep3’ indicates the 4th repetition of interactive forces as participant S18 moved the rod collaboratively with the robot in 3D.
The file naming conventions and data repetitions collected for each category of pHRI are shown in Table 2.
Table 3 summarizes the dataset, the format of each file, and the overall repository.

5. Discussion

This article describes the datasets that were collected during the studies conducted in [6,7,8,9,10]. A brief description of each study is provided below for readers’ clarity and ease of understanding the relevance of the datasets.
In [6], interactive force estimation for manipulating the linear robot/biaxial stage was derived from 32 FMG channels in a 2D planar workspace. Interactions occurred in five different dynamic motions (1D-X, 1D-Y, 2D-diagonal/DG, 2D-square/SQ and 2D-diamond/DM) to examine human intentions of manipulating the robot in an intended direction; the 2D motions required progressively more complex muscle activity and arm movements. Real-time evaluations were conducted with intra-session (i.e., trained and tested in the same session) supervised models (support vector regression, SVR, and kernel ridge regression, KRR), which were found effective in real-time force estimation (R2: 90–94% in 1D and 82–91% in 2D motions). In this study, a separate trained model was required to estimate forces for each participant interacting in each motion.
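As an illustration of this intra-session approach (a sketch only; the preprocessing, hyperparameters, and evaluation protocol in [6] may differ), an SVR model can be trained per force axis on the 32-channel FMG features:

```python
import pandas as pd
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical file choice; assumes no header row, columns 1-2 = (Fx, Fy), 3-34 = FMG.
df = pd.read_csv("pHRI_BiaxialStage_S5_2D_DG_Rep0.csv", header=None)
X = df.iloc[:, 2:].to_numpy()     # 32-channel FMG features
y_fx = df.iloc[:, 0].to_numpy()   # force along X (N)

# Chronological split (no shuffling) to respect the time-series nature of the data.
X_tr, X_te, y_tr, y_te = train_test_split(X, y_fx, test_size=0.3, shuffle=False)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X_tr, y_tr)
print("held-out R2 (Fx):", r2_score(y_te, model.predict(X_te)))
```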
For industrial pHRI applications, the general applicability of a trained model to all workers was investigated in [7,8]. A large volume of 32-channel FMG source data, collected during real-time interactions between several participants and the linear robot in selected motions (1D-X, 2D-diagonal/DG, 2D-square/SQ and 2D-square of different sizes/SQ-diffSize), was used for training. Real-time evaluations of a generalized model recognizing applied forces from 32-channel, out-of-distribution (OOD) target data were then conducted. In [7], a supervised generalized model based on support vector regression (SVR) was evaluated for recognizing interactive forces in a new intended motion, or for a new participant (R2: 90–94% [1D-X], 80–85% [2D-DG]). In [8], a generalized model based on a convolutional neural network (CNN) was found effective in recognizing an unseen but similar intended motion, or a participant interacting daily in the same intended motion (R2: 88% [2D-SQ], 89% [2D-SQ-diffSize]).
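A corresponding cross-subject sketch is given below: a model trained on several source participants is evaluated on a held-out target participant. File names follow the dataset conventions described in Section 4.1, but the exact source/target split, preprocessing, and model in [7] may differ.

```python
import glob

import pandas as pd
from sklearn.metrics import r2_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def load_2d_dg(paths):
    """Stack 2D-DG files; assumes no header row, columns 1-2 = (Fx, Fy), 3-34 = FMG."""
    data = pd.concat([pd.read_csv(p, header=None) for p in paths], ignore_index=True)
    return data.iloc[:, 2:].to_numpy(), data.iloc[:, 0].to_numpy()

# Source participants as described above; the target is one new participant (S11).
source_files = [p for s in ("S5", "S6", "S7", "S8", "S10")
                for p in glob.glob(f"pHRI_BiaxialStage_{s}_2D_DG_Rep*.csv")]
target_files = glob.glob("pHRI_BiaxialStage_S11_2D_DG_Rep*.csv")

X_src, y_src = load_2d_dg(source_files)
X_tgt, y_tgt = load_2d_dg(target_files)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X_src, y_src)
print("cross-subject R2 (Fx):", r2_score(y_tgt, model.predict(X_tgt)))
```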
In [9], a 3D HRC task of moving a wooden rod in collaboration with the 7-DoF Kuka LBR IIWA 14 robot was investigated via a 16-channel FMG forearm band. Additionally, interactions with the Kuka robot were investigated to estimate grasping forces in dynamic motion in the 1D, 2D and 3D workspaces via an intra-session CNN model. To improve model performance and to generalize with adequate training data, a large volume of source data (long-term data) collected during interactions with the biaxial stage was used. A cross-domain generalization (CDG) method was implemented to transfer knowledge between this unrelated source (2D pHRI platform) and the target data (3D pHRI platform). The pretrained CNN model performed better in simple 1D grasping interactions (R2: 79–87%), while its performance improved only slightly for the collaborative task of moving the rod in 3D (R2: ≈60–63%).
In [10], a study was conducted to address the real-world problem of unlabeled or inadequate training data. Obtaining enough training data, recruiting more participants, or labeling all data was not possible with the Kuka robot. Therefore, the study focused on synthetic FMG data generation, implementing a domain randomization technique with a CNN-based generative adversarial network (GAN). Knowledge learnt from the latent feature distributions was transferred via semi-supervised learning during intra-session test data evaluation. For this investigation, pHRI with the Kuka robot in 1D (X, Y and Z directions) was examined using 16-channel forearm FMG signals. The proposed model performed comparably (R2: 77–84%) to the supervised model (R2: 78–88%) with fewer labeled training data (only 25% were labeled) and a large volume of unlabeled synthetic FMG data (2.5 times more than the real data).
These studies revealed that FMG-based model generalization, domain adaptation, and cross-domain generalization are possible, with a pretrained model evaluated to estimate interactive forces in dynamic motions [7,8,9]. In [5], we also found that recognizing hand grasps from FMG data was feasible via transfer learning even from an unrelated domain, i.e., using a pretrained AlexNet model. Hence, in the future, these data can be used to pretrain a transfer learning model for research or industrial applications of either FMG-based or other sensory-based pHRI activities. In [10], we generated synthetic FMG data from a small amount of real FMG data in this dataset using domain randomization; such transformation techniques can be utilized to generate realistic FMG data for research when collecting data is not possible. There was room for improvement in model performance during the collaborative task with the Kuka robot. Therefore, the use of this dataset by others can enhance pHRI quality and safe collaboration with either 16-channel or 32-channel FMG signals.
These data were collected over a few years with different setups; the corresponding studies were conducted and the results published. We did not include the data with each article because each portion would represent only a fraction of the whole dataset and would require describing the data repeatedly. Collecting these human–robot interaction data required an expensive setup, experienced research personnel, time and effort in recruiting participants, and long hours of data gathering. We therefore expect this release to have a significant impact on the research field.
The dataset discussed in Section 4.1 was collected before the pandemic, when several participants volunteered. The pandemic started before we could collect the interaction data described in Section 4.2 with the serial manipulator; those data were collected once restricted access to the research areas reopened. Working with human participants was strictly monitored to avoid health hazards, and it became difficult to recruit volunteers at that time. We therefore recruited only one participant to collect interactive data with the Kuka robot. As the project timeline had ended by the time the pandemic restrictions were lifted, we were unable to engage more human participants.

6. Conclusions

Implementing pHRI with FMG data by learning human intentions is a state-of-the-art research area for industry. With traditional machine learning and recent deep learning techniques, FMG-based human interaction with robots shows potential for industrial applications. Because of limited resources, collecting HRI data is expensive and time consuming. As it is hard to find datasets or repositories of myographic signals, or of any other biosignals, related to HRI applications, we expect to fill a void in the field with the published research works and the corresponding data. Therefore, the release of these FMG-based pHRI data involving two different robots will be useful in future studies of human movement intent during collaborative tasks and will benefit the research community.

7. User Notes

The dataset is readily available on Zenodo and can be downloaded at: https://doi.org/10.5281/zenodo.6632020 (accessed on 28 June 2022).

Author Contributions

U.Z. investigated and developed methodologies, designed the protocol, collected the FMG-based pHRI datasets, prepared the dataset and wrote the manuscript. C.M. supervised and conceptualized the project, contributed to the design of the protocol and methods, and participated in manuscript revisions. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Natural Sciences and Engineering Research Council of Canada (NSERC), the Canadian Institutes of Health Research (CIHR), and the Canada Research Chair (CRC) program.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Office of the Research Ethics of Simon Fraser University, Burnaby, BC, Canada.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data can be freely downloaded at: https://doi.org/10.5281/zenodo.6632020 (accessed on 7 November 2022) under Creative Commons Attribution 4.0 International License. The corresponding author can be contacted in case of need.

Acknowledgments

We would like to thank the participants for their voluntary contributions and members of Menrva Research Group for assisting in this project.

Conflicts of Interest

The authors declare no conflict of interest. The principal investigator, Carlo Menon, and members of his research team have a vested interest in commercializing the technology tested in this study if it is proven to be successful, and may benefit financially from its potential commercialization.

References

  1. Xiao, Z.G.; Menon, C. Towards the development of a wearable feedback system for monitoring the activities of the upper-extremities. J. NeuroEng. Rehabil. 2014, 11, 2. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Xiao, Z.G.; Menon, C. A Review of Force Myography Research and Development. Sensors 2019, 19, 4557. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Jiang, X.; Merhi, L.-K.; Menon, C. Force exertion affects grasp classification using force myography. IEEE Trans. Hum.-Mach. Syst. 2018, 48, 219–226. [Google Scholar] [CrossRef]
  4. Sakr, M.; Menon, C. On the estimation of isometric wrist/forearm torque about three axes using force myography. In Proceedings of the 2016 6th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob), Singapore, 26–29 June 2016; pp. 827–832. [Google Scholar]
  5. Zakia, U.; Jiang, X.; Menon, C. Deep learning technique in recognizing hand grasps using FMG signals. In Proceedings of the 2020 11th IEEE Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON), Vancouver, BC, Canada, 4–7 November 2020; pp. 0546–0552. [Google Scholar] [CrossRef]
  6. Zakia, U.; Menon, C. Estimating Exerted Hand Force via Force Myography to Interact with a Biaxial Stage in Real-Time by Learning Human Intentions: A Preliminary Investigation. Sensors 2020, 20, 2104. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Zakia, U.; Menon, C. Toward Long-Term FMG Model-Based Estimation of Applied Hand Force in Dynamic Motion during Human–Robot Interactions. IEEE Trans. Hum.-Mach. Syst. 2021, 51, 310–323. [Google Scholar] [CrossRef]
  8. Zakia, U.; Menon, C. Force Myography-Based Human Robot Interactions via Deep Domain Adaptation and Generalization. Sensors 2022, 22, 211. [Google Scholar] [CrossRef]
  9. Zakia, U.; Menon, C. Human Robot Collaboration in 3D via Force Myography based Interactive Force Estimations using Cross-Domain Generalization. IEEE Access 2022, 10, 35835–35845. [Google Scholar] [CrossRef]
  10. Zakia, U.; Barua, A.; Jiang, X.; Menon, C. Unsupervised, Semi-supervised Interactive Force Estimations during pHRI via Generated Synthetic Force Myography Signals. IEEE Access 2022, 10, 69910–69921. [Google Scholar] [CrossRef]
  11. Lemaignan, S.; Edmunds, C.E.; Senft, E.; Belpaeme, T. The PInSoRo dataset: Supporting the data-driven study of child-child and child-robot social dynamics. PLoS ONE 2018, 13, e0205999. [Google Scholar] [CrossRef]
  12. Billing, E.; Belpaeme, T.; Cai, H.; Cao, H.L.; Ciocan, A.; Costescu, C.; David, D.; Homewood, R.; Hernandez Garcia, D.; Gómez Esteban, P.; et al. The DREAM Dataset: Supporting a data-driven study of autism spectrum disorder and robot enhanced therapy. PLoS ONE 2020, 15, e0236939. [Google Scholar] [CrossRef]
  13. CADDY Underwater Gestures Dataset. Available online: http://www.caddian.eu/ (accessed on 9 June 2022).
  14. Novitzky, M.; Robinette, P.; Benjamin, M.R.; Fitzgerald, C.; Schmidt, H. Aquaticus: Publicly available datasets from a marine human-robot teaming testbed. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI ‘19), Daegu, Korea, 11–14 March 2019; IEEE Press: Piscataway, NJ, USA, 2019; pp. 392–400. [Google Scholar]
  15. MAHNOB-HCI Tagging DATABASE. HCI Tagging Database–Home. Available online: mahnob-db.eu/hci-tagging/ (accessed on 9 June 2022).
  16. Celiktutan, O.; Skordos, E.; Gunes, H. Multimodal Human-Human-Robot Interactions (MHHRI) Dataset for Studying Personality and Engagement. IEEE Trans. Affect. Comput. 2017, 10, 484–497. [Google Scholar] [CrossRef] [Green Version]
  17. Singh, N.; Lee, J.J.; Grover, I.; Breazeal, C. P2PSTORY: Dataset of Children Storytelling and Listening in Peer-to-Peer Interactions. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018. [Google Scholar]
  18. Ben-Youssef, A.; Clavel, C.; Essid, S.; Bilac, M.; Chamoux, M.; Lim, A. UE-HRI: A new dataset for the study of user engagement in spontaneous human-robot interactions. In Proceedings of the 19th ACM International Conference on Multimodal Interaction (ICMI ‘17), Glasgow, UK, 13–17 November 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 464–472. [Google Scholar] [CrossRef]
  19. Peng, W.; Dicai, C. A Physical Human-Robot Interaction Dataset-TacAct [Data set]. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021), Prague, Czech Republic, 27 September–1 October 2021. [Google Scholar]
  20. Ludovico, M.; Yoshimura, N.; Koike, Y. Hybrid control of a vision-guided robot arm by EOG, EMG, EEG biosignals and head movement acquired via a consumer-grade wearable device. IEEE Access 2016, 4, 9528–9541. [Google Scholar]
  21. Maki, Y.; Sano, G.; Kobashi, Y.; Nakamura, T.; Kanoh, M.; Yamada, K. Estimating subjective assessments using a simple biosignal sensor. In Proceedings of the 2012 IEEE International Conference on Fuzzy Systems, Brisbane, QLD, Australia, 10–15 June 2012; pp. 1–6. [Google Scholar]
  22. Cao, T.; Sun, J.; Liu, D.; Wang, Q.; Wang, H. Wireless Collaboration Technology Based on Electroencephalograph and Electromyography. In Proceedings of the IEEE 4th International Conference on Power, Intelligent Computing and Systems (ICPICS), Shenyang, China, 29–31 July 2022; pp. 921–925. [Google Scholar]
  23. Liu, Y.; Yang, C.; Wang, M. Human-Robot Interaction Based on Biosignals. In Proceedings of the IEEE International Symposium on Autonomous Systems (ISAS), Guangzhou, China, 6–8 December 2020; pp. 58–63. [Google Scholar]
  24. Specht, B.; Tayeb, Z.; Dean, E.; Soroushmojdehi, R.; Cheng, G. Real-Time Robot Reach-To-Grasp Movements Control Via EOG and EMG Signals Decoding. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 3812–3817. [Google Scholar]
  25. Perdiz, J.; Pires, G.; Nunes, U.J. Emotional state detection based on EMG and EOG biosignals: A short survey. In Proceedings of the IEEE 5th Portuguese Meeting on Bioengineering (ENBENG), Coimbra, Portugal, 16–18 February 2017; pp. 1–4. [Google Scholar]
  26. Al-Qaysi, Z.T.; Zaidan, B.B.; Zaidan, A.A.; Suzani, M.S. A review of disability EEG based wheelchair control system: Coherent taxonomy, open challenges and recommendations. Comput. Methods Programs Biomed. 2018, 164, 221–237. [Google Scholar] [CrossRef]
  27. Geethanjali, P.; Ray, K.K. A low-cost real-time research platform for EMG pattern recognition-based prosthetic hand. IEEE/ASME Trans. Mechatron. 2015, 20, 1948–1955. [Google Scholar] [CrossRef]
  28. Gui, K.; Liu, H.; Zhang, D. A Practical and Adaptive Method to Achieve EMG-Based Torque Estimation for a Robotic Exoskeleton. IEEE/ASME Trans. Mechatron. 2019, 24, 483–494. [Google Scholar] [CrossRef]
  29. Yokoyama, M.; Koyama, R.; Yanagisawa, M. An evaluation of hand-force prediction using artificial neural-network regression models of surface EMG signals for handwear devices. J. Sens. 2017, 2017, 3980906. [Google Scholar] [CrossRef] [Green Version]
  30. Zhang, Q.; Hayashibe, M.; Fraisse, P.; Guiraud, D. FES-Induced torque prediction with evoked EMG sensing for muscle fatigue tracking. IEEE/ASME Trans. Mechatron. 2011, 16, 816–826. [Google Scholar] [CrossRef] [Green Version]
  31. Duan, F.; Dai, L.; Chang, W.; Chen, Z.; Zhu, C.; Li, W. sEMG-based identification of hand motion commands using wavelet neural network combined with discrete wavelet transform. IEEE Trans. Ind. Electron. 2016, 63, 1923–1934. [Google Scholar] [CrossRef]
  32. Allard, U.C.; Nougarou, F.; Fall, C.L.; Giguère, P.; Gosselin, C.; Laviolette, F.; Gosselin, B. A convolutional neural network for robotic arm guidance using sEMG based frequency-features. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 2464–2470. [Google Scholar]
  33. Meattini, R.; Benatti, S.; Scarcia, U.; De Gregorio, D.; Benini, L.; Melchiorri, C. An sEMG-based human–robot interface for robotic hands using machine learning and synergies. IEEE Trans. Compon. Packag. Manuf. Technol. 2018, 8, 1149–1158. [Google Scholar] [CrossRef]
  34. Oskoei, M.A.; Hu, H. Myoelectric control systems—A survey. Biomed. Signal Process. Control 2007, 2, 275–294. [Google Scholar] [CrossRef]
  35. Sanford, J.; Patterson, R.; Popa, D.O. Concurrent surface electromyography and force myography classification during times of prosthetic socket shift and user fatigue. J. Rehabil. Assist. Technol. Eng. 2017, 4, 1–13. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Jiang, X.; Merhi, L.K.; Xiao, Z.G.; Menon, C. Exploration of force myography and surface electromyography in hand gesture classification. Med. Eng. Phys. 2017, 41, 63–73. [Google Scholar] [CrossRef] [PubMed]
  37. Belyea, A.; Englehart, K.; Scheme, E. FMG Versus EMG: A comparison of usability for real-time pattern recognition based control. IEEE Trans. Biomed. Eng. 2019, 66, 3098–3104. [Google Scholar] [CrossRef] [PubMed]
  38. Anvaripour, M.; Khoshnam, M.; Menon, C.; Saif, M. FMG- and RNN-Based Estimation of Motor Intention of Upper-Limb Motion in Human-Robot Collaboration. Front. Robot. AI 2020, 7, 183. [Google Scholar] [CrossRef]
  39. Kahanowich, N.D.; Sintov, A. Robust Classification of Grasped Objects in Intuitive Human-Robot Collaboration Using a Wearable Force-Myography Device. IEEE Robot. Autom. Lett. 2021, 6, 1192–1199. [Google Scholar] [CrossRef]
  40. Bamani, E.; Kahanowich, N.D.; Ben-David, I.; Sintov, A. Robust Multi-User In-Hand Object Recognition in Human-Robot Collaboration Using a Wearable Force-Myography Device. IEEE Robot. Autom. Lett. 2022, 7, 104–111. [Google Scholar] [CrossRef]
Figure 1. Two customized force myography (FMG) bands worn on the forearm and upper arm muscle belly of a participant to read muscle contraction.
Figure 4. Participant wearing FMG bands on upper-arm and forearm interacts with the biaxial stage by grasping the gripper/knob.
Figure 5. (a) pHRI between participant S18 and the Kuka robot, and (b) HRC between participant S18 and the manipulator during moving a wooden rod in 3D workspace from point A to point B in a half-circular path.
Figure 6. 32-channel FMG signals capturing muscle readings during interactions with biaxial stage in DG motion: (a) A male participant, and (b) A female participant.
Figure 7. 16-channel FMG signal readings during pHRI between participant S18 and the Kuka robot: (a) in 1D-X, and (b) in 1D-Y, and (c) in 3D for HRC task. Adapted with permission from Ref. [9]. Copyright 2022 IEEE.
Table 1. Five interactive arm motion patterns.
[Table cells are pattern illustrations in the original article: X, Y, DG, SQ, SQ-diffSize, and DM.]
Table 2. Naming conventions followed.

Dataset 1: pHRI_BiaxialStage
  1D-X: pHRI_BiaxialStage_SubID_1D_X_Rep0:Rep4.csv; pHRI_BiaxialStage_SubID_1D_X_Session2_Rep0:Rep1.csv
  1D-Y: pHRI_BiaxialStage_SubID_1D_Y_Rep0:Rep4.csv
  2D-DG: pHRI_BiaxialStage_SubID_2D_DG_Rep0:Rep4.csv; pHRI_BiaxialStage_SubID_2D_DG_Session2_Rep0:Rep1.csv
  2D-SQ: pHRI_BiaxialStage_SubID_2D_SQ_Rep0:Rep4.csv
  2D-DM: pHRI_BiaxialStage_SubID_2D_DM_Rep0:Rep4.csv
  2D-SQ-diffSize: pHRI_BiaxialStage_SubID_2D_SQ_diffSize_Rep0:Rep15.csv

Dataset 2: pHRI_Manipulator
  1D-X: pHRI_Manipulator_SubID_1D_X_Rep0:Rep4.csv
  1D-Y: pHRI_Manipulator_SubID_1D_Y_Rep0:Rep4.csv
  2D-XY: pHRI_Manipulator_SubID_2D_XY_Rep0:Rep4.csv
  2D-YZ: pHRI_Manipulator_SubID_2D_YZ_Rep0:Rep4.csv
  2D-XZ: pHRI_Manipulator_SubID_2D_XZ_Rep0:Rep4.csv
  3D-XYZ: pHRI_Manipulator_SubID_3D_XYZ_Rep0:Rep4.csv
  HRC in 3D-XYZ: pHRI_Manipulator_SubID_3D_XYZ_HRC_Rep0:Rep4.csv
Table 3. Dataset summary.

Biaxial stage (participants S1–S17; upper arm and forearm FMG data):
  1D: 120 files; column 1 = Fx or Fy, columns 2–33 = 32-channel FMG data
  2D: 186 files; columns 1–2 = Fx, Fy, columns 3–34 = 32-channel FMG data
  3D: not applicable

Manipulator (participant S18; forearm FMG data):
  1D: 15 files; columns 1–3 = Fx, Fy, Fz, columns 4–19 = 16-channel FMG data
  2D: 15 files; columns 1–3 = Fx, Fy, Fz, columns 4–19 = 16-channel FMG data
  3D: 10 files; columns 1–3 = Fx, Fy, Fz, columns 4–19 = 16-channel FMG data
