Data Descriptor

An EEG Dataset of Subject Pairs during Collaboration and Competition Tasks in Face-to-Face and Online Modalities

by María A. Hernández-Mustieles, Yoshua E. Lima-Carmona, Axel A. Mendoza-Armenta, Ximena Hernandez-Machain, Diego A. Garza-Vélez, Aranza Carrillo-Márquez, Diana C. Rodríguez-Alvarado, Jorge de J. Lozoya-Santos and Mauricio A. Ramírez-Moreno *

Mechatronics Department, School of Engineering and Sciences, Tecnologico de Monterrey, Monterrey 64700, NL, Mexico

* Author to whom correspondence should be addressed.
Submission received: 17 February 2024 / Revised: 9 March 2024 / Accepted: 13 March 2024 / Published: 27 March 2024

Abstract

This dataset was acquired during collaboration and competition tasks performed by sixteen subject pairs (N = 32), each composed of one female and one male, under two modalities (face-to-face and online). The collaborative task consisted of cooperating to put together a 100-piece puzzle, while the competition task consisted of playing against each other in a one-on-one classic 28-piece dominoes game. In the face-to-face modality, all interactions between the pair occurred in person. In the online modality, participants were physically separated, and interaction was only allowed through Zoom software with an active microphone and camera. Electroencephalography (EEG) data of the two subjects were acquired simultaneously while they performed the tasks. This article describes the experimental setup, the processing of the data streams acquired during the tasks, and the assessment of data quality.
Dataset License: CC0

1. Summary

Hyperscanning is a technique that involves analyzing synchronized EEG signals from different subjects. It allows for the study of brain activity in multiple users and opens opportunities for studying social interactions [1,2,3,4,5]. The dynamics of learning with human interaction are complex and imply behavioral synchronization [6], but a comparison between competitive and collaborative paradigms demonstrates the replication of social dynamics and the occurrence of inter-brain neural relations [7].
Collaboration and competition are complex social interactions, and the first study aimed at understanding the neural bases behind them used functional Magnetic Resonance Imaging (fMRI) to measure the hemodynamic response of participants while they played a personalized computer game [8]. Other studies with a similar purpose instead used the hyperscanning technique with EEG to analyze inter-brain coupling and intra-brain spectral power [9], inter-brain synchronization [10], or functional connectivity patterns based on phase [11]. Considering that hyperscanning extracts EEG data simultaneously, multiple systems or devices are needed, but this is an effective approach to studying functional connectivity between brain regions of different individuals, a metric related to interpersonal coupling during social interactions that also serves as a predictor of collective educational performance in online and face-to-face scenarios [5].
A dataset recorded under the conditions described above is presented, with the purpose of enabling more robust analyses of subject pairs' synchronized EEG data and informing better learning strategies in the future. It consists of eight subject pairs performing collaborative and competitive tasks in a face-to-face modality, and eight pairs performing identical tasks in an online modality. The dataset was recorded over the course of two months and contains the synchronized electroencephalography (EEG) signals of the subject pairs while performing the tasks.
During the collaborative task, participants were asked to cooperate to put together a 100-piece puzzle. They were instructed to talk to each other, to interact as much as possible between them, to come to agreements, and to work as a team to attempt to complete as much of the puzzle as possible in a time duration of ten minutes. On the other hand, for the competition task, participants were asked to play against each other in a classic one-on-one 28-piece domino game. In this case, the instruction was to avoid any type of interaction or communication between them, with the purpose of focusing only on winning as many matches as possible against the other participant in ten minutes. In the face-to-face modality, during the collaborative task, participants were seated next to each other (Figure 1A) or in front of each other during the competition task (Figure 1B), and all interactions between the pair occurred in person. Meanwhile, for the online modality, participants were seated in the same room, but far from each other, and interaction was only allowed through computers using Zoom software with an active microphone and camera. Tasks were performed using online applications on the computers, and the screen was shared via Zoom. EEG data from each of the participants were recorded synchronously during each task.

2. Data Description

This dataset consists of EEG recordings from 32 healthy subjects when performing different tasks in different modalities, with four recordings per subject (a total of 128 recordings). The first two recordings per subject (1 min each) correspond to the resting state of Eyes Open (EO) and Eyes Closed (EC), while the third and fourth recordings correspond to the collaboration and the competition tasks, respectively (10 min each). Half of the recordings correspond to tasks performed in the face-to-face modality and half to the online modality. A detailed explanation of the experimental procedure followed for each of the tasks and each of the modalities (face-to-face and online) can be found in Section 3.2.

2.1. Participants

A total of 32 healthy subjects (16 female and 16 male) aged 18–24 (μ = 21.90 ± 1.52) participated in the study, making up eight female–male pairs participating in the face-to-face modality and eight pairs participating in the online modality. Similar sample sizes have been reported in other hyperscanning studies in the literature. In [12], EEG signals of six pairs of civil pilots were acquired to study collaborative interactions during a simulated flight. Similarly, in [13], nine subject pairs played collaborative and competitive computer games, using EEG to analyze team neurodynamics. Other studies delve into brain synchronization during creative performances: ref. [14] studied inter-brain synchrony between three subjects forming three pairs during a musical performance, while ref. [15] studied the same synchrony with five subjects and three pairs during a dance performance.
Participants were excluded from this study if they:
  • Presented, or had previously presented, any mental disability.
  • Had been diagnosed with a neurological disorder, such as autism, Parkinson’s, cerebral palsy, and/or attention deficit disorder.
  • Were currently under any type of medication.
The experimental protocol and the informed consent forms were reviewed and approved by the Institutional Research Ethics Committee of Tecnologico de Monterrey (protocol code: EHE-2023-03; date of approval: 11 August 2023), and an informed consent form was handed to each participant for review and signing, ensuring the informed and voluntary participation of each volunteer.

2.2. EEG Recordings and Structure

The EEG signals were obtained using four dry electrodes (A1, A2, C3, and C4), placed according to the 10–20 international system and with a sampling rate of 250 Hz. EO and EC resting state recordings lasted for 1 min, while recordings during the collaboration and competition tasks in both face-to-face and online modalities lasted for 10 min.
The nomenclature of each of the files in the dataset is as follows:
  • ALAS: acronym stands for Advanced Learner Assistance System.
  • Recording01: Tasks performed in the face-to-face modality (first EEG recording of the experiment).
  • Recording02: Tasks performed in the online modality (second EEG recording of the experiment).
  • P01: Female subject.
  • P02: Male subject.
  • Dyad0X: Data collected for different dyads.
  • Task0X: Data collected for different tasks:
    • Task01: Eyes Open.
    • Task02: Eyes Closed.
    • Task03: Puzzle (collaborative task).
    • Task04: Domino (competitive task).
  • _suffix: Type of data.
Each EEG recording was saved in a .csv file containing five columns. The first column corresponds to the timestamps (in UNIX format) of all samples, which are essential for later data synchronization. The following four columns contain the measurements of each of the four electrodes (A2, A1, C4, and C3, respectively). The .csv files were converted to .mat files using MATLAB.
The _EEG.mat file contains a 1 × 2 structure with electroencephalographic data collected by the Enophones. The first element in the structure corresponds to Recording01 (face-to-face modality); the second element corresponds to Recording02 (online modality). The structure contains multiple fields corresponding to each dyad (Dyad01–Dyad08) and each task (Task01–Task04). Table 1 breaks down the information contained within these fields.
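As a minimal sketch of working with the per-subject .csv layout described above (UNIX timestamps followed by the A2, A1, C4, and C3 electrode columns), the snippet below parses a few synthetic rows with pandas; the column names and sample values are illustrative assumptions, not taken from the dataset files themselves.

```python
import io
import pandas as pd

# Assumed column layout, matching the description in the text:
# UNIX timestamp followed by the four electrode measurements.
COLUMNS = ["timestamp", "A2", "A1", "C4", "C3"]

# A few synthetic rows standing in for one of the dataset's .csv files
# (250 Hz sampling, i.e. 4 ms between consecutive samples).
raw_csv = io.StringIO(
    "1697731200.000,10.1,9.8,11.2,10.5\n"
    "1697731200.004,10.3,9.9,11.0,10.4\n"
    "1697731200.008,10.2,9.7,11.1,10.6\n"
)

df = pd.read_csv(raw_csv, header=None, names=COLUMNS)

# Convert UNIX timestamps to an elapsed-time axis in seconds.
df["t"] = df["timestamp"] - df["timestamp"].iloc[0]
print(df[["t", "C3", "C4"]])
```

The timestamp column is kept as-is so that recordings from the two devices of a dyad can later be paired sample-by-sample, as described in Section 3.4.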

3. Methods

3.1. Instrumentation and Data Collection

For the recording of EEG data, an Enophones (Eno, Montreal, QC, Canada) wearable device was used, shown in Figure 2A. Its selection as a wearable, non-invasive device for EEG signal acquisition addresses the demand for simplicity and safety in obtaining signals. The device is equipped with four dry EEG electrodes placed according to the 10–20 system: two in the top band for C3 and C4, and two in the earcups for A1 and A2, as depicted in Figure 2B. These noise-canceling headphones offer an affordable solution (approximately USD 400 at the time of purchase) for wireless and comfortable EEG data acquisition. Their practicality and convenience make them ideal for conducting experiments in realistic settings.
Even though the channels C3 and C4 on the top band of the Enophones are related to the primary motor cortex, research indicates that this area is also involved in cognitive processes. For instance, the premotor cortex processes spatial information and its rostral sector is related to cognitive manipulation, while the dorsolateral prefrontal cortex is in charge of redirecting mental resources and attention to relevant information for execution [16]. Another study implies that the evolution and the increase in complexity of motor cortical areas have resulted in the development of high cognitive abilities in human and nonhuman primates, encompassing abilities such as action imitation and recognition, perceiving and producing speech, and rhythm execution and appreciation in music [17]. Moreover, research suggests that action plans in the motor cortex may be related to predictive eye movements [18], which play a key role in collaboration and competition scenarios to understand the goal of other peoples’ actions [19].
Despite the primary motor cortex not being the region best suited to exploring collaboration and competition, it has been shown to be associated with various cognitive processes relevant to the activities performed during the tests, which is why these channels were selected for this study. Additionally, the Enophones wearable device is highly user-friendly and comfortable for the subjects during signal acquisition and testing.
For the tests, a domino game and a 100-piece puzzle were used. In the online modality, we made use of the “Cats in Art II” puzzle found on the website www.jigsawplanet.com (accessed on 19 October 2023). Zoom’s desktop control sharing function enabled both test subjects to arrange the pieces. On the other hand, the domino game was obtained online (www.Ludoteka.com, accessed on 19 October 2023), and players were connected through an online interface, allowing them to play several rounds continuously.
EEG data collection was achieved with a custom Python algorithm utilizing the multiprocessing module, which enables parallel operation of all the functions declared within the script. Each Enophone establishes a connection to a single computer, allowing simultaneous data extraction from two EEG devices. The algorithm acquires data in time windows of four seconds; given the 250 Hz sampling rate of the Enophones, each time window yields a total of 1000 samples per electrode (also referred to as channels). The algorithm appends these raw data to a .csv file for each subject at the end of each time window; in this way, further offline processing can be applied over the entire data structure to estimate brain-to-brain synchronization. The algorithm's architecture ensures that the data remain unmanipulated, facilitating trustworthy future comparisons.
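The windowed-storage scheme above can be sketched as follows. This is a simplified stand-in, not the authors' acquisition code: the device read is faked with random data, and the file name is a hypothetical example of the naming convention, but the window arithmetic (250 Hz × 4 s = 1000 samples per channel) and the append-only .csv storage follow the description.

```python
import csv
import os
import random
import tempfile
import time

FS = 250                  # Enophones sampling rate (Hz)
WINDOW_S = 4              # acquisition window length (s)
SAMPLES = FS * WINDOW_S   # 1000 samples per channel per window

def fake_read_window():
    """Stand-in for the device read: rows of (timestamp, A2, A1, C4, C3)."""
    t0 = time.time()
    return [
        (t0 + i / FS, *(random.gauss(0, 1) for _ in range(4)))
        for i in range(SAMPLES)
    ]

def append_window(path, rows):
    """Append one raw window to the subject's .csv, leaving data unmanipulated."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerows(rows)

# Hypothetical file name following the dataset's nomenclature.
path = os.path.join(tempfile.mkdtemp(), "ALAS_demo_P01.csv")
window = fake_read_window()
append_window(path, window)

with open(path) as f:
    n_rows = sum(1 for _ in f)
```

Appending at the end of each window, rather than buffering the whole session in memory, means a crash mid-session loses at most the last four seconds of data.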
Another crucial characteristic of the stored data is the inclusion of timestamps. Each Enophone has a timestamp that is stored in tandem with the data from its four channels. Access to this information allows for verification of whether the two Enophones are genuinely synchronized. The algorithm ensures correct functioning by implementing locking functions that force processes to run over a single timer, governed by the manager. However, minor differences may be reflected due to technological constraints or efficiency issues during runtime. Once the raw data and timestamps are analyzed offline, data can be precisely paired before applying any further processes.

3.2. Experimental Protocol

The experiment was conducted over the course of twelve weeks. Upon arriving at the test site, the general procedure of the experiment was first explained to the participants. They were handed the informed consent form for signing and were encouraged to ask any questions that may be raised about the experiment. Next, the areas of the skin where the electrodes would make contact were cleaned with alcohol, and a pair of Enophones were placed on the head of each participant. Any earrings or jewelry worn on the ears were removed before the placement of the Enophones, and the participants were asked beforehand to avoid the use of gel, spray, or any other hair product that would interfere with the signal acquisition, as well as refrain from consuming any form of caffeine or alcohol the day of the experiment. Finally, the participants were seated on chairs with their backs against each other to start with the experimental protocol.
Figure 3 shows the experimental protocol followed for the acquisition of the EEG signals. Baseline EEG data of the participants were first acquired with their eyes open for sixty seconds, followed by a one-minute break, and then by another similar recording, this time with their eyes closed. Participants were instructed to remain calm at all times, keeping their minds clear and their bodies as still as possible. After a five-minute break, participants performed the collaborative task followed by the competitive task in their assigned modality.
For the face-to-face modality, during the collaboration task, the participants were seated next to each other at a table with the puzzle pieces scattered in front of them. They were allowed and encouraged to talk to each other in order to try to put the puzzle together over ten minutes while EEG data were being recorded. Once the time was up, the participants were given a five-minute break before continuing with the competition task. For this, participants were seated facing each other and instructed to refrain from speaking or interacting with each other in any way while playing dominoes. The goal was to win as many matches as possible against the other participant in a time of ten minutes while their EEG data were being recorded. Once the participants finished this task, the visit ended. A depiction of the face-to-face tasks can be observed in Figure 4A for the competition (left) and collaboration (right) tasks.
For the online modality, in both tasks, the participants were seated in front of each other with a physical barrier between them, and each one used a computer to join a video call using the Zoom program. In the collaborative task, they tried to solve an online puzzle on the website Jigsaw Planet. For this task, one participant entered the website, shared their screen, and activated the remote control function so that both participants could control the mouse and move the pieces of the puzzle through Zoom. They attempted to arrange the pieces for ten minutes while EEG data were being recorded. When the ten minutes were up, the participants were given a five-minute break, as in the face-to-face modality, and then continued with the competitive task. In this task, they could not interact with each other and played dominoes on the website Ludoteka. Each participant tried to win the highest number of rounds in ten minutes while EEG data were being recorded. When the time ran out, the task was over; an example of an online modality test is shown in Figure 4B. Different views are represented to reflect that there was no physical interaction between participants.
Overall, in both modalities, during the collaboration task, participants were encouraged to communicate and interact with each other, while in the competition task, they were explicitly instructed to avoid any type of communication. As the experiment was designed to study brain synchronization under the conditions of collaboration and competition, it was also of interest to evaluate different levels of communication (with and without communication, respectively) to further highlight the differences between the two types of interactions.

3.3. Assessment of Data Quality

In order to assess the quality of the obtained data, an algorithm was implemented in MATLAB R2022b (The MathWorks Inc., Natick, MA, USA) using functions from EEGLAB [20] (https://sccn.ucsd.edu/eeglab/index.php, accessed on 1 October 2023). Figure 5 shows the preprocessing steps taken, with their specified parameters, for each dataset.
EEG data were preprocessed using EEGLAB tools [20]. The time units were converted automatically from UNIX format to samples by EEGLAB. First, the 60 Hz power line artifact was removed from the EEG data, and the Zapline plugin [21] was then applied to ensure the removal of most of this artifact. EEG data were then re-referenced to the average using the PREP pipeline [22] and band-pass filtered in the [0.01–50] Hz range using a fifth-order Butterworth filter.
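The band-pass step can be replicated outside EEGLAB. The sketch below, assuming scipy in place of the original MATLAB tooling, designs a fifth-order Butterworth band-pass with the [0.01–50] Hz edges from the pipeline and applies it zero-phase to a synthetic signal containing an in-band 10 Hz component and 60 Hz line noise.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250.0  # sampling rate of the recordings (Hz)

# Fifth-order Butterworth band-pass over the range used in the pipeline,
# designed in second-order sections for numerical stability.
sos = butter(5, [0.01, 50.0], btype="bandpass", fs=FS, output="sos")

# Synthetic test signal: 10 Hz in-band component plus 60 Hz line noise.
t = np.arange(0, 10, 1 / FS)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)

# Zero-phase filtering (forward and backward) avoids phase distortion,
# which matters for later phase-based synchronization metrics.
y = sosfiltfilt(sos, x)
```

The 60 Hz component lies above the pass band, so it is strongly attenuated while the 10 Hz component passes essentially unchanged.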
In addition, the Artifact Subspace Reconstruction (ASR) algorithm was applied to the data [23,24]. It was used to reconstruct data periods on each channel that were contaminated by an artifact with an amplitude higher than  κ  = 10 standard deviations of a clean portion of the signals in the principal components analysis (PCA) space.
Finally, an independent component analysis (ICA) [25] was applied to detect the independent components of artifacts mixed with the EEG signals, such as muscle, electrocardiography, ocular, and power line artifacts. The independent components were manually inspected and those identified as artifacts were removed. Despite ICA being mainly beneficial in multi-channel signal recordings, research indicates that even in cases where only three channels are used, an ICA manages to increase the signal-to-noise ratio for components of signals obtained from C3 and C4 channels [25].
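The ICA step can be illustrated with a toy example. The sketch below, assuming scikit-learn's FastICA rather than EEGLAB's implementation, mixes two synthetic sources (an alpha-like rhythm and a slow "ocular" drift) into two channels, unmixes them, zeroes one component as if it had been flagged as an artifact on inspection, and projects back; which component index corresponds to the artifact is arbitrary here and must be identified by inspection in practice.

```python
import numpy as np
from sklearn.decomposition import FastICA

fs, dur = 250, 10
t = np.arange(0, dur, 1 / fs)

# Two hypothetical sources: an alpha-like 10 Hz rhythm and a slow,
# square-wave "ocular" drift, mixed into two channels standing in for C3/C4.
sources = np.c_[np.sin(2 * np.pi * 10 * t),
                np.sign(np.sin(2 * np.pi * 0.5 * t))]
mixing = np.array([[1.0, 0.6],
                   [0.8, 1.0]])
channels = sources @ mixing.T

# Unmix into independent components; in a real pipeline, the components
# flagged as artifacts after visual inspection are zeroed before projecting
# back to channel space.
ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(channels)
components[:, 1] = 0.0  # pretend component 1 was identified as the artifact
cleaned = ica.inverse_transform(components)
```

The cleaned array has the same channel layout as the input, so it can be dropped back into the rest of the preprocessing chain.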
ICA and ASR both contribute to artifact control in noisy EEG signals. ICA suppresses muscle artifacts by removing such artifact-related components and obtains the denoised EEG signals after projecting back from component space [26]. Meanwhile, research has shown that ASR removes eye and muscle components and reduces their temporal power in the case that some of them are retained; using ASR before ICA also leads to an enhanced quality of decomposition [27].
A comparison of the EEG data in time and frequency before preprocessing and after going through the proposed artifact-removal pipeline can be seen in Figure 6. Also, Figure 7 shows a sample output of the EEG preprocessing sequence, with raw data shown as a shaded line and preprocessed data as a bold line.

3.4. Data Synchronization

To acquire the EEG signals, an algorithm capable of establishing a simultaneous connection with two Enophones was implemented. By using Python’s multiprocessing module, a library that allows the execution of multiple functions and processes at the same time, it is possible to enable connection with each Enophone separately but concurrently. This module creates variables that can be shared between the different functions, and this allows for the same timer to be applied throughout the complete algorithm. Multiprocessing implements techniques such as locking, inter-process communication (IPC), and atomic operations that are employed to ensure proper synchronization and to maintain data integrity [28].
While the multiprocessing module inherently ensures parallel data extraction, synchronized connections for both Enophones were established using Brainflow; refer to its website https://brainflow.readthedocs.io/en/stable/SupportedBoards.html (accessed on 1 October 2023) for the specific details about procedures to initialize data extraction with paired Bluetooth devices. This library facilitates the establishment of the connection and the use of a function to extract timestamps from connected devices. Consequently, the signals are stored with corresponding timestamps in a .csv file, as shown in Algorithm 1.
Synchronized multiconnection to both Enophones was achieved in real time, enabling data storage immediately upon capturing the EEG signals. Subsequently, data curation was performed by supervising the timestamps of the two files containing the raw data of each dyad. When parallelization is implemented in any algorithm, certain constraints may arise during execution, preventing perfectly aligned concurrency. Obtaining individual timestamp records for each connected device is therefore crucial, as they serve as a parameter to match records precisely at the beginning of the tests. The alignment of these timestamps makes high-precision offline processing with truly synchronized data feasible.
Algorithm 1: Simplified explanation of the operation of the algorithm implemented for synchronized acquisition of EEG data from two different Enophone devices
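The offline pairing step can be sketched as follows: given the timestamp columns of a dyad's two recordings, find the sample index in each at which both streams have started, so that all later processing works on aligned data. This is a simplified stand-in for the curation described above; the 0.1 s start offset is a hypothetical example.

```python
import numpy as np

def align_start(ts_a, ts_b):
    """Return the index into each recording at which the later-starting
    device's first sample lines up; a simple stand-in for the offline
    timestamp pairing described in the text."""
    start = max(ts_a[0], ts_b[0])
    return int(np.searchsorted(ts_a, start)), int(np.searchsorted(ts_b, start))

fs = 250.0
# Device B hypothetically starts 0.1 s (about 25 samples) after device A.
ts_a = np.arange(0.0, 10.0, 1 / fs)
ts_b = np.arange(0.1, 10.1, 1 / fs)

ia, ib = align_start(ts_a, ts_b)
# ts_a[ia] and ts_b[ib] now differ by less than one sampling period,
# so ts_a[ia:] and ts_b[ib:] can be treated as synchronized streams.
```

Because both devices share one wall-clock timer, aligning on timestamps rather than on sample counts absorbs any small start-up lag between the two acquisition processes.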

3.5. Suggested Data Analysis

The experimental protocol followed to obtain this dataset is well suited to the study of brain synchronization, since it involves simultaneously recording brain activity from multiple individuals while they interact with each other in different scenarios. Hyperscanning is an analysis technique used to explore brain activity between subjects interacting simultaneously and in synchrony [29]. It allows researchers to investigate the neural basis of social interactions, cooperation, competition, and communication in real time [6]. There are multiple metrics that can be used to study brain synchronization.
The bispectrum is a recently adopted metric that provides information at the temporal, spatial, and spectral levels of a signal; unlike other metrics of brain synchronization, it is a higher-order spectrum that quantifies the degree of temporal synchronization, phase coupling, and nonlinear interactions between any pair of signals analyzed in different frequency bands [14].
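A minimal direct bispectrum estimator can make the idea concrete: averaging X(f1)X(f2)X*(f1+f2) over segments yields a surface that peaks where two frequency components are phase-coupled with their sum. This is an illustrative sketch, not the estimator used in [14]; the segment count, window, and test frequencies are arbitrary choices.

```python
import numpy as np

def bispectrum(x, nseg=16):
    """Naive direct bispectrum estimate: average X(f1)X(f2)X*(f1+f2) over
    non-overlapping windowed segments. Illustrative only; a real analysis
    would use a dedicated higher-order-spectra toolbox."""
    seg_len = len(x) // nseg
    n = seg_len // 2  # keep f1 + f2 inside the FFT length
    f1, f2 = np.meshgrid(np.arange(n), np.arange(n))
    win = np.hanning(seg_len)
    acc = np.zeros((n, n), dtype=complex)
    for k in range(nseg):
        X = np.fft.fft(x[k * seg_len:(k + 1) * seg_len] * win)
        acc += X[f1] * X[f2] * np.conj(X[f1 + f2])
    return np.abs(acc) / nseg

fs = 256
t = np.arange(0, 16, 1 / fs)
# Quadratically phase-coupled components: 10 Hz, 22 Hz, and their sum (32 Hz).
x = np.cos(2*np.pi*10*t) + np.cos(2*np.pi*22*t) + 0.5*np.cos(2*np.pi*32*t)
B = bispectrum(x)  # peaks at the coupled bifrequency (10 Hz, 22 Hz)
```

With 16 one-second segments at 256 Hz, each frequency bin spans 1 Hz, so the peak location can be read directly in hertz.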
Additionally, the power spectral density can be calculated to estimate the spectral band power and calculate phase synchronization between brains [30]. Statistical and temporal analyses of the signal can also be performed to further strengthen the analysis.
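As a starting point for such an analysis, the sketch below estimates the power spectral density of a synthetic channel with scipy's Welch method and integrates it over canonical EEG bands; the band edges and the simulated 10 Hz alpha component are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

fs = 250
rng = np.random.default_rng(1)
t = np.arange(0, 60, 1 / fs)
# Synthetic channel: an alpha-band 10 Hz oscillation buried in noise.
x = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

# Welch PSD with 2 s segments gives 0.5 Hz frequency resolution.
freqs, psd = welch(x, fs=fs, nperseg=2 * fs)

def band_power(freqs, psd, lo, hi):
    """Approximate band power by summing the PSD over [lo, hi] Hz."""
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

alpha = band_power(freqs, psd, 8, 13)   # dominated by the 10 Hz component
beta = band_power(freqs, psd, 13, 30)   # broadband noise only
```

Computing such band powers per subject and per task window is one simple route to comparing spectral engagement across the collaboration and competition conditions before moving to pairwise phase-synchronization metrics.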

Author Contributions

Conceptualization, M.A.R.-M. and J.d.J.L.-S.; methodology, Y.E.L.-C. and M.A.H.-M.; software, Y.E.L.-C. and M.A.H.-M.; validation, M.A.R.-M.; formal analysis, Y.E.L.-C.; investigation, X.H.-M., A.C.-M., D.A.G.-V., A.A.M.-A. and D.C.R.-A.; resources, M.A.R.-M. and J.d.J.L.-S.; data curation, Y.E.L.-C., D.C.R.-A. and D.A.G.-V.; writing—original draft preparation, M.A.H.-M., Y.E.L.-C., X.H.-M., A.C.-M., D.A.G.-V., A.A.M.-A. and D.C.R.-A.; writing—review and editing, M.A.H.-M. and M.A.R.-M.; visualization, M.A.R.-M.; supervision, M.A.R.-M. and J.d.J.L.-S.; project administration, M.A.H.-M.; funding acquisition, M.A.R.-M. and J.d.J.L.-S. All authors have read and agreed to the published version of the manuscript.

Funding

The APC of this work was funded by the Campus City initiative from Tecnologico de Monterrey.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Research Ethics Committee of Tecnológico de Monterrey (protocol code: EHE-2023-03; date of approval: 11 August 2023).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The dataset is available at https://doi.org/10.6084/m9.figshare.c.7062272 (accessed on 9 February 2024).

Acknowledgments

The authors would like to acknowledge the Conscious Technologies research group and the International IUCRC BRAIN Affiliate Site at Tecnologico de Monterrey for their support in the development of this work.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Davidesco, I.; Laurent, E.; Valk, H.; West, T.; Dikker, S.; Milne, C.; Poeppel, D. Brain-to-brain synchrony between students and teachers predicts learning outcomes. bioRxiv 2019, 644047. [Google Scholar] [CrossRef]
  2. Nam, C.S.; Choo, S.; Huang, J.; Park, J. Brain-to-brain neural synchrony during social interactions: A systematic review on hyperscanning studies. Appl. Sci. 2020, 10, 6669. [Google Scholar] [CrossRef]
  3. Liu, D.; Liu, S.; Liu, X.; Zhang, C.; Li, A.; Jin, C.; Chen, Y.; Wang, H.; Zhang, X. Interactive Brain Activity: Review and Progress on EEG-Based Hyperscanning in Social Interactions. Front. Psychol. 2018, 9, 411913. [Google Scholar] [CrossRef] [PubMed]
  4. Tan, S.J.; Wong, J.N.; Teo, W.P. Is Neuroimaging Ready for the Classroom? A Systematic Review of Hyperscanning Studies in Learning. NeuroImage 2023, 281, 120367. [Google Scholar] [CrossRef] [PubMed]
  5. Balconi, M.; Angioletti, L.; Cassioli, F. Hyperscanning EEG Paradigm Applied to Remote vs. Face-To-Face Learning in Managerial Contexts: Which Is Better? Brain Sci. 2023, 13, 356. [Google Scholar] [CrossRef] [PubMed]
  6. Czeszumski, A.; Eustergerling, S.; Lang, A.; Menrath, D.; Gerstenberger, M.; Schuberth, S.; Schreiber, F.; Rendon, Z.Z.; König, P. Hyperscanning: A valid method to study neural inter-brain underpinnings of social interaction. Front. Hum. Neurosci. 2020, 14, 39. [Google Scholar] [CrossRef] [PubMed]
  7. Balconi, M.; Vanutelli, M.E. Cooperation and competition with hyperscanning methods: Review and future application to emotion domain. Front. Comput. Neurosci. 2017, 11, 86. [Google Scholar] [CrossRef] [PubMed]
  8. Decety, J.; Jackson, P.L.; Sommerville, J.A.; Chaminade, T.; Meltzoff, A.N. The neural bases of cooperation and competition: An fMRI investigation. Neuroimage 2004, 23, 744–751. [Google Scholar] [CrossRef] [PubMed]
  9. Liu, H.; Zhao, C.; Wang, F.; Zhang, D. Inter-brain amplitude correlation differentiates cooperation from competition in a motion-sensing sports game. Soc. Cogn. Affect. Neurosci. 2021, 16, 552–564. [Google Scholar] [CrossRef]
  10. Sinha, N.; Maszczyk, T.; Wanxuan, Z.; Tan, J.; Dauwels, J. EEG hyperscanning study of inter-brain synchrony during cooperative and competitive interaction. In Proceedings of the 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Budapest, Hungary, 9–12 October 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 004813–004818. [Google Scholar]
  11. Susnoschi Luca, I.; Putri, F.D.; Ding, H.; Vuckovič, A. Brain synchrony in competition and collaboration during multiuser neurofeedback-based gaming. Front. Neuroergon. 2021, 2, 29. [Google Scholar] [CrossRef]
  12. Toppi, J.; Borghini, G.; Petti, M.; He, E.J.; De Giusti, V.; He, B.; Astolfi, L.; Babiloni, F. Investigating cooperative behavior in ecological settings: An EEG hyperscanning study. PLoS ONE 2016, 11, e0154236. [Google Scholar] [CrossRef] [PubMed]
  13. Chen, Q. EEG Hyperscanning Study of Team Neurodynamics Analysis during Cooperative and Competitive Interaction. Master’s Thesis, University of Twente, Enschede, The Netherlands, 2019. [Google Scholar]
  14. Ramírez-Moreno, M.A.; Cruz-Garza, J.G.; Acharya, A.; Chatufale, G.; Witt, W.; Gelok, D.; Reza, G.; Contreras-Vidal, J.L. Brain-to-brain communication during musical improvisation: A performance case study. F1000Research 2023, 11, 989. [Google Scholar] [CrossRef] [PubMed]
  15. Theofanopoulou, C.; Paez, S.; Huber, D.; Todd, E.; Ramírez-Moreno, M.A.; Khaleghian, B.; Sánchez, A.M.; Barceló, L.; Gand, V.; Contreras-Vidal, J.L. Mobile brain imaging in butoh dancers: From rehearsals to public performance. bioRxiv 2023. [Google Scholar] [CrossRef]
  16. Abe, M.; Hanakawa, T. Functional coupling underlying motor and cognitive functions of the dorsal premotor cortex. Behav. Brain Res. 2009, 198, 13–23. [Google Scholar] [CrossRef] [PubMed]
  17. Mendoza, G.; Merchant, H. Motor system evolution and the emergence of high cognitive functions. Prog. Neurobiol. 2014, 122, 73–93. [Google Scholar] [CrossRef] [PubMed]
18. Elsner, C.; D’Ausilio, A.; Gredebäck, G.; Falck-Ytter, T.; Fadiga, L. The motor cortex is causally related to predictive eye movements during action observation. Neuropsychologia 2013, 51, 488–492.
19. Falck-Ytter, T.; Gredebäck, G.; Von Hofsten, C. Infants predict other people’s action goals. Nat. Neurosci. 2006, 9, 878–879.
20. Delorme, A.; Makeig, S. EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods 2004, 134, 9–21.
21. Klug, M.; Kloosterman, N.A. Zapline-plus: A Zapline extension for automatic and adaptive removal of frequency-specific noise artifacts in M/EEG. Hum. Brain Mapp. 2022, 43, 2743–2758.
22. Bigdely-Shamlo, N.; Mullen, T.; Kothe, C.; Su, K.M.; Robbins, K.A. The PREP pipeline: Standardized preprocessing for large-scale EEG analysis. Front. Neuroinform. 2015, 9, 16.
23. Kothe, C.A.E.; Jung, T.P. Artifact Removal Techniques with Signal Reconstruction. U.S. Patent Application 14/895,440, 3 June 2016.
24. Mullen, T.R.; Kothe, C.A.; Chi, Y.M.; Ojeda, A.; Kerth, T.; Makeig, S.; Jung, T.P.; Cauwenberghs, G. Real-time neuroimaging and cognitive monitoring using wearable dry EEG. IEEE Trans. Biomed. Eng. 2015, 62, 2553–2567.
25. Rejer, I.; Gorski, P. Benefits of ICA in the Case of a Few Channel EEG. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS, Milan, Italy, 25–29 August 2015; pp. 7434–7437.
26. Chen, X.; Xu, X.; Liu, A.; Lee, S.; Chen, X.; Zhang, X.; McKeown, M.J.; Wang, Z.J. Removal of muscle artifacts from the EEG: A review and recommendations. IEEE Sens. J. 2019, 19, 5353–5368.
27. Chang, C.Y.; Hsu, S.H.; Pion-Tonachini, L.; Jung, T.P. Evaluation of artifact subspace reconstruction for automatic artifact components removal in multi-channel EEG recordings. IEEE Trans. Biomed. Eng. 2019, 67, 1114–1121.
28. Aziz, Z.A.; Abdulqader, D.N.; Sallow, A.B.; Omer, H.K. Python parallel processing and multiprocessing: A review. Acad. J. Nawroz Univ. 2021, 10, 345–354.
29. Douglas, C.L.; Tremblay, A.; Newman, A.J. A two for one special: EEG hyperscanning using a single-person EEG recording setup. MethodsX 2023, 10, 102019.
30. Ahn, S.; Cho, H.; Kwon, M.; Kim, K.; Kwon, H.; Kim, B.S.; Chang, W.S.; Chang, J.W.; Jun, S.C. Interbrain phase synchronization during turn-taking verbal interaction—A hyperscanning study using simultaneous EEG/MEG. Hum. Brain Mapp. 2018, 39, 171–188.
Figure 1. (A) Two subjects performing the collaboration task (putting together a 100-piece puzzle) in the face-to-face modality. (B) Two subjects performing the competition task (playing a one-on-one classic domino game) in the face-to-face modality.
Figure 2. (A) A picture of the Enophones wearable device used to acquire the EEG signals for this dataset. (B) The positions of the Enophones electrodes according to the international 10–20 system (A1, A2, C3, and C4) are marked in red.
Figure 3. A breakdown of the experimental protocol followed to obtain the EEG signals. First, EEG signals were acquired during Eyes Open and Eyes Closed resting states (1 min each with a 1 min break in between), followed by a 5 min break. Then, the pairs participated in either the face-to-face or the online modality, each consisting of a 10 min window to put a puzzle together (collaboration task), a 5 min break, and, finally, another window of 10 min to play a one-on-one domino game (competition task).
Figure 4. (A) A graphical representation of the testing space when the volunteers participated in the face-to-face modality during the competitive task ((left), playing a one-on-one classical dominoes game) and the collaborative task ((right), putting together a 100-piece puzzle). (B) A top view of the testing space when the volunteers participated in the online modality (left), and a close-up of the cubicles of one of the participants during the competitive task (right). Notice how, for this modality, the participants are separated by a small wall between the cubicles and can only see each other through the Zoom software on their laptops.
Figure 5. Steps followed to assess data quality. First, the 60 Hz power-line artifact was removed from the EEG data using the Zapline-plus plugin. The data were then re-referenced to the average using the PREP pipeline and band-pass filtered with a fifth-order Butterworth filter in the [0.01–50] Hz range. Finally, the ASR algorithm was applied to reconstruct data periods contaminated by artifacts, and ICA was performed to remove visually identified artifact components.
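The band-pass stage of this pipeline can be sketched in Python with SciPy. This is a minimal illustration of a fifth-order Butterworth filter applied to a synthetic signal, not the authors' actual implementation; the trace below is simulated for demonstration only:

```python
import numpy as np
from scipy import signal

fs = 250.0  # Enophones sampling rate (Hz)

# Fifth-order Butterworth band-pass in the [0.01-50] Hz range,
# designed as second-order sections for numerical stability.
sos = signal.butter(5, [0.01, 50.0], btype="bandpass", fs=fs, output="sos")

# Synthetic 10 s trace: a 10 Hz alpha-like component plus 60 Hz line noise.
t = np.arange(0, 10, 1 / fs)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)

# Zero-phase (forward-backward) filtering, as is typical in offline EEG work.
clean = signal.sosfiltfilt(sos, raw)
```

In the frequency domain, the 10 Hz component passes essentially unchanged while the 60 Hz component is strongly attenuated. Note that in the published pipeline the line noise itself is handled by Zapline-plus; this sketch covers only the band-pass step.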
Figure 6. Response of the EEG signals to the proposed preprocessing pipeline in the frequency and time domains. Comparing the example raw and clean EEG signals shows that the 60 Hz line noise present throughout the test is eliminated. Additionally, power in the high frequencies ([51–100] Hz) is reduced while the characteristics of the EEG signals are preserved.
Figure 7. (Left) Example of the collected four-channel EEG data before and after preprocessing, corresponding to P01-Dyad01 during the Eyes Open task in the face-to-face modality. (Right) Expanded view of the clean EEG signals shown in the left subfigure.
Table 1. Description of the fields contained within the _EEG.mat file structure.
Field: Description
EEG.setname: Name defined for the dataset.
EEG.filename: Name defined for the file.
EEG.subject: Subject ID according to the nomenclature (P01, P02).
EEG.group: Dyad ID according to the nomenclature (Dyad01–Dyad08).
EEG.condition: Recording ID according to the nomenclature (Recording01, Recording02).
EEG.session: Task ID according to the nomenclature (Task01–Task04).
EEG.nbchan: Scalar indicating the number of channels used in the EEG acquisition.
EEG.trials: Number of times the experiment was performed.
EEG.pnts: Scalar indicating the length (in samples) of the EEG signal acquired by each channel.
EEG.srate: Scalar indicating the sampling rate of the Enophones (250 Hz).
EEG.xmin: Scalar indicating the start time of the data recording.
EEG.xmax: Scalar indicating the end time of the data recording.
EEG.times: 1 × N vector containing UNIX timestamps (in seconds) at each of the N time points of the EEG data.
EEG.data: 4 × N matrix containing the EEG data of each channel at each of the N time points (units: microvolts).
EEG.chanlocs: 1 × 4 structure containing the spatial location of each channel according to the international 10–20 system.
EEG.ref: Channel reference (common average).
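As a hypothetical sketch of how the fields above fit together, the structure can be mirrored as a Python dictionary (reading an actual _EEG.mat file, e.g. with scipy.io.loadmat, would yield analogous fields; every name and value below is illustrative, not taken from the dataset):

```python
import numpy as np

fs = 250            # EEG.srate: Enophones sampling rate (Hz)
n_points = fs * 60  # one minute of data, for illustration

# Synthetic stand-in for the _EEG.mat structure described in Table 1.
# Field names follow the EEGLAB convention used by the dataset; the
# setname and all values are invented for this example.
eeg = {
    "setname": "Dyad01_Recording01_Task01",       # assumed naming
    "subject": "P01",
    "group": "Dyad01",
    "nbchan": 4,                                  # A1, A2, C3, C4
    "srate": fs,
    "pnts": n_points,
    "times": 1.7e9 + np.arange(n_points) / fs,    # UNIX timestamps (s)
    "data": np.random.randn(4, n_points),         # 4 x N, in microvolts
    "chanlocs": ["A1", "A2", "C3", "C4"],         # labels standing in for chanlocs
    "ref": "common average",
}

# Recording duration recovered from the number of points and sampling rate:
duration_s = eeg["pnts"] / eeg["srate"]
```

The same relationship (pnts = srate × duration) holds for the real files, so EEG.data can always be aligned against EEG.times channel by channel.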
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Hernández-Mustieles, M.A.; Lima-Carmona, Y.E.; Mendoza-Armenta, A.A.; Hernandez-Machain, X.; Garza-Vélez, D.A.; Carrillo-Márquez, A.; Rodríguez-Alvarado, D.C.; Lozoya-Santos, J.d.J.; Ramírez-Moreno, M.A. An EEG Dataset of Subject Pairs during Collaboration and Competition Tasks in Face-to-Face and Online Modalities. Data 2024, 9, 47. https://doi.org/10.3390/data9040047

