Proceeding Paper

CameraEEG: Synchronous Recording of Electroencephalogram and Video Data for Neuroergonomics Applications †

Neural Engineering Lab, Department of Biosciences and Bioengineering, Indian Institute of Technology Guwahati, Guwahati 781039, India
* Authors to whom correspondence should be addressed.
Presented at the IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability, Tainan, Taiwan, 2–4 June 2023.
Eng. Proc. 2023, 55(1), 46; https://doi.org/10.3390/engproc2023055046
Published: 4 December 2023

Abstract

Lab-confined electroencephalogram experiments generally restrict the subject’s mobility. Hence, we provide a wearable solution that enables the monitoring of human brain activity during everyday activities, especially for neuroergonomics. This paper introduces CameraEEG, a new Android application that allows for synchronized smartphone acquisition of electroencephalogram (EEG) and camera data. Using a button on the app, the subject can mark witnessed audio-visual events of interest. Android SDK version 28 and mBrainTrain’s Smarting Mobi SDK were used to develop the app. The app can be used on all Android smartphones running at least Android OS Lollipop. In this paper, we used the app to record synchronized video and EEG data from four subjects during two tasks (namely, eyes closed and eyes open), each under sitting conditions. We used the POz electrode data for analysis. There was a visible difference between the power spectra of the two tasks, with the eyes-closed task showing an alpha band peak. The obtained video and EEG data also showed accurate synchronization. A download weblink for the .apk file, along with a detailed help document for the developed app, is provided for further testing.

1. Introduction

With the advancements in cognitive psychology, the present requirement is to investigate the neural responses that are naturally triggered by a dynamic environment in synchrony with the participant’s task [1]. Applications using restricted stimuli in natural settings have also been developed in order to examine variations in brain responses [2]. Electroencephalogram (EEG) and smartphone integration have made brain activity measurement possible on the move [3]. Attention studies that gauge the neural activity corresponding to different contexts within a classroom also employ portable electroencephalogram systems [4]. Recently, a study on the neuronal dynamics of interactive synchrony among musicians reported using smartphone-based hyper-scanning [5]. The above investigations might have been more effective at locating events of interest if camera data had also been recorded alongside the EEG.
Synchronized EEG and camera data streams are of current interest for natural environment experiments. Hence, we set out to develop a smartphone Android app called “CameraEEG” that enables this. Since the app was built for natural environment cognitive experiments, the EEG and rear camera data streams are synchronized. We used a transparent mobile holder pouch for placing the smartphone to enable hands-free video recording. Eventually, we tested the app’s performance with four subjects performing eyes-closed and eyes-open tasks while sitting. Offline analysis was carried out to verify the synchronization between the camera and electroencephalogram data streams. We then studied each task’s power spectral density (PSD) from the EEG signals. We believe that the CameraEEG app is the first of its kind to enable synchronized video and brain signals to be recorded on an Android platform.
Wascher et al. reviewed mobile EEG in neuroergonomics for understanding mental states in workplace environments [6]. Our CameraEEG app will play a vital role in such scenarios by providing the opportunity to obtain EEG as well as video data. A portable solution for examining cognitive states in workplace and daily life situations is becoming increasingly important for the neuroergonomics community.

2. Materials and Methodology

2.1. Software Architecture of Android Application

The Java Development Kit (JDK) was used for app development in Android Studio (AS). The EEG device manufacturer’s software development kit (SDK, 2019 version) was also used for data acquisition, thus restricting the app to a particular EEG device, namely, mBrainTrain. However, the performance was robust. Two modules, related to the camera and EEG, made up the app. An application programming interface (API) named camera2 was used from AS to develop the camera module. The built-in camera unit captures and stores image frames through a buffer. Google’s “Camera2Basic” functions were used to create the camera preview of the app (https://github.com/googlearchive/android-Camera2Video, accessed on 1 December 2023). These functions use a background handler thread to run the camera preview and video recording in parallel on the smartphone. Overlapping opening and closing of the camera device object was prevented by a binary semaphore. The resolution of the video recorded through the back camera was kept at 480p. The camera preview feed is automatically scaled to fit the surface by rescaling its dimensions through a modified SurfaceTexture class called “AutofitTexture”.
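To illustrate the semaphore guard and background handler thread mentioned above, a minimal Java sketch in the style of Google’s Camera2Video sample is given below. The class, member and method names follow the sample and are assumptions for illustration; they are not taken from the CameraEEG source.

```java
// Minimal sketch (not the CameraEEG source) of the camera2 open/close guard and
// background handler thread pattern from Google's Camera2Video sample.
import android.content.Context;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.os.Handler;
import android.os.HandlerThread;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class CameraModuleSketch {

    // Binary semaphore preventing overlapping open/close calls on the camera device object.
    private final Semaphore mCameraOpenCloseLock = new Semaphore(1);
    private HandlerThread mBackgroundThread;
    private Handler mBackgroundHandler;
    private CameraDevice mCameraDevice;

    // Background thread that receives camera callbacks, keeping preview and
    // recording work off the UI thread.
    private void startBackgroundThread() {
        mBackgroundThread = new HandlerThread("CameraBackground");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }

    // Opens the given camera; the CAMERA permission must already have been granted.
    private void openCamera(Context context, String cameraId) throws Exception {
        startBackgroundThread();
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
            throw new RuntimeException("Timed out waiting to lock camera opening.");
        }
        manager.openCamera(cameraId, new CameraDevice.StateCallback() {
            @Override
            public void onOpened(CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice = camera; // preview/recording session is configured from here
            }

            @Override
            public void onDisconnected(CameraDevice camera) {
                mCameraOpenCloseLock.release();
                camera.close();
            }

            @Override
            public void onError(CameraDevice camera, int error) {
                mCameraOpenCloseLock.release();
                camera.close();
            }
        }, mBackgroundHandler); // state callbacks are delivered on the background thread
    }
}
```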

2.2. Operation of Android Application

The steps for using the app to record data are shown in Figure 1. First, the app must be connected to the mBrainTrain Smarting EEG device via Bluetooth. The app then activates the recording button (Figure 2), which records the synchronized EEG data as well as the video feed onto the smartphone as .bdf and .mp4 files, respectively.
Stopping and re-recording can be carried out using the same button on the app (Figure 2d). An event marker button is also available for the subject to mark any unique events during the experiment (Figure 2c). The app’s .apk file is available for download at the following link: https://github.com/NeuralLabIITGuwahati/CameraEEG, accessed on 1 December 2023.
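As context for the .mp4 output, the sketch below shows how a 480p recording is typically configured with Android’s standard MediaRecorder API. The helper name and the encoder/bit-rate choices are illustrative assumptions rather than the exact CameraEEG settings, and the .bdf EEG file is written separately through the manufacturer’s SDK (not shown).

```java
// Illustrative 480p .mp4 MediaRecorder configuration using Android's standard API;
// the helper name and encoder settings are assumptions, not CameraEEG's exact values.
import android.media.MediaRecorder;

public class VideoRecorderSketch {

    public static MediaRecorder create480pRecorder(String outputPath) throws Exception {
        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);      // record ambient audio
        recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);  // fed by the camera2 capture session
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4); // .mp4 container
        recorder.setOutputFile(outputPath);                          // session video file path
        recorder.setVideoSize(640, 480);                             // 480p resolution, as in the paper
        recorder.setVideoFrameRate(30);
        recorder.setVideoEncodingBitRate(2_000_000);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.prepare();  // start() is invoked when the record button is pressed
        return recorder;
    }
}
```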

2.3. Synchronized EEG and Video Data Acquisition

Approval was obtained from our institute’s ethics committee at IIT Guwahati for the collection of data from human participants. Data were recorded from four subjects (three male and one female; mean age 25 years, standard deviation 2.16) for this study. We used the developed app with a 24-channel Easycap head cap of the mBrainTrain Smarting device (https://mbraintrain.com/smarting-wireless-eeg/, accessed on 1 December 2023). However, the app is also compatible with 20-channel concealed-EEG (cEEGrid) electrodes. Before starting the synchronized EEG and camera recording, the impedance level of each electrode was checked using the SMARTING app. The Android smartphone was placed in a clear mobile holder pouch, as in Figure 3, which permitted recording of the video stream with the rear camera while enabling touch-screen operation. The cell phone was not held in hand by the subject but worn as in Figure 3, with the rear camera pointed away from the subject. The wearable experimental setup for the CameraEEG app is shown in Figure 3 and can include 20-channel cEEGrid electrode or head cap EEG setups.
Synchronization of the video and EEG streams, as well as data fidelity, was checked while the subjects performed two tasks, namely, with their eyes open and with their eyes closed. Each task lasted for about 5 min. The app’s event marker button, intended for recording situations of interest, was not used in this study.

3. Results

The video and EEG from the synchronized recordings were analyzed for any information loss and mismatch. When we evaluated the recorded sessions for synchronization, the EEG and video data streams matched to within ±5 ms for all four subjects.
The EEG time series recorded during the two tasks were partitioned into two segments using MATLAB (Version 2021a) and EEGLAB (Version 2021.1) for further analysis. Custom scripts were written to analyze the data from the two tasks offline. The POz channel from the 24-channel EEG electrode setup was used in this study. Following this, the change in alpha activity was obtained as reported in [7]. Power spectral analysis showed a peak at around 10 Hz during the eyes-closed task (Figure 4) across all subjects, indicating, as expected, higher alpha power.

4. Discussion

The app’s performance was assessed from the data fidelity of the eyes-closed and eyes-open sessions, as performed in earlier works [8]. We observed an alpha peak in the eyes-closed sessions across all subjects, showing that the built app correctly acquired EEG data. Our CameraEEG app offers a straightforward integrated application to record unrestricted, natural electroencephalogram and video data for neuroergonomics applications. These data can also be utilized as a biomarker in intelligent building applications [9]. They also find applications in understanding the mental implications for people living in urban environments [10,11]. Cruz-Garza et al. found that high-beta power PSDs from prefrontal and frontal electrodes appeared during a museum walk compared with the baseline condition (staring at a wall) [12]. They also observed high-alpha power PSDs in central, parietal and occipital electrodes for the baseline condition. Hence, through this work, we demonstrate that good-quality synchronized video–electroencephalogram data can be acquired using an Android smartphone. This cost-effective solution also has numerous biomedical applications, such as monitoring epileptic, Alzheimer’s and sleep disorder patients.
We show that a single Android app can record synchronized EEG and video streams, in contrast to other research works, which appear to use several apps [1,13]. This configuration makes it simple to re-engineer the app for different applications by adding extra modules, while decreasing the smartphone’s load compared with running multiple apps at once. However, our app is only compatible with mBrainTrain-based EEG devices, because it uses libraries from that specific EEG device manufacturer.
Although the recording sessions provided primary validation of the application’s operation, longer-duration recordings are necessary to thoroughly assess the app’s robustness. Running the CameraEEG application on Android phone models from later than 2020 (checked on the Redmi Note 7 Pro with 4 GB RAM and the Realme 5 Pro with 6 GB RAM) showed very little heating. However, an older Android smartphone may heat up during extended recording sessions. Furthermore, since only 480p camera recordings were used, higher resolutions could degrade the smartphone’s performance by causing the app’s toolbar to stutter. However, enhanced video quality at higher resolutions may be desirable for observing specific natural environment stimuli. Going forward, future challenges for this work include exploring the possibilities of online EEG analysis on smartphones, such as artefact removal and task classification.

Author Contributions

D.H. and S.M. contributed equally. Discussions between S.M. and C.N.G. led to this idea. S.M. developed the app. D.H. tested the app and carried out data analysis. The paper was written by all three authors. A major portion of this work was submitted for the M. Tech degree (Biotechnology) of S.M. at IIT Guwahati, Assam, India. All authors have read and agreed to the published version of the manuscript.

Funding

D.H. received funding from an MHRD Doctoral Fellowship and the NEWGEN IEDC Grant, DST, Govt. of India. S.M. was funded by an MHRD master’s program fellowship. C.N.G.’s time was also funded by the NEWGEN IEDC Grant, DST, Govt. of India.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board (or Ethics Committee) of Indian Institute of Technology Guwahati, India (Date of approval: 27 March 2019).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data can be made available upon request to the authors.

Acknowledgments

We thank Dasari Shivakumar and Nanaki Singh for their time and help during the experiments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hölle, D.; Blum, S.; Kissner, S.; Debener, S.; Bleichner, M.G. Real-Time Audio Processing of Real-Life Soundscapes for EEG Analysis: ERPs Based on Natural Sound Onsets. Front. Neuroergonomics 2022, 3, 793061. [Google Scholar] [CrossRef]
  2. Liebherr, M.; Corcoran, A.W.; Alday, P.M.; Coussens, S.; Bellan, V.; Howlett, C.A.; Immink, M.A.; Kohler, M.; Schlesewsky, M. EEG and behavioral correlates of attentional processing while walking and navigating naturalistic environments. Sci. Rep. 2021, 11, 22325. [Google Scholar] [CrossRef] [PubMed]
  3. Blum, S.; Debener, S.; Emkes, R.; Volkening, N.; Fudickar, S.; Bleichner, M.G. EEG Recording and Online Signal Processing on Android: A Multiapp Framework for Brain-Computer Interfaces on Smartphone. BioMed Res. Int. 2017, 2017, 3072870. [Google Scholar] [CrossRef] [PubMed]
  4. Grammer, J.K.; Xu, K.; Lenartowicz, A. Effects of context on the neural correlates of attention in a college classroom. NPJ Sci. Learn. 2021, 6, 15. [Google Scholar] [CrossRef] [PubMed]
  5. Zamm, A.; Palmer, C.; Bauer, A.-K.R.; Bleichner, M.G.; Demos, A.P.; Debener, S. Behavioral and Neural Dynamics of Interpersonal Synchrony Between Performing Musicians: A Wireless EEG Hyperscanning Study. Front. Hum. Neurosci. 2021, 15, 476. [Google Scholar] [CrossRef] [PubMed]
  6. Wascher, E.; Reiser, J.; Rinkenauer, G.; Larrá, M.; Dreger, F.A.; Schneider, D.; Karthaus, M.; Getzmann, S.; Gutberlet, M.; Arnau, S. Neuroergonomics on the Go: An Evaluation of the Potential of Mobile EEG for Workplace Assessment and Design. Hum. Factors 2023, 65, 86–106. [Google Scholar] [CrossRef] [PubMed]
  7. Hohaia, W.; Saurels, B.W.; Johnston, A.; Yarrow, K.; Arnold, D.H. Occipital alpha-band brain waves when the eyes are closed are shaped by ongoing visual processes. Sci. Rep. 2022, 12, 1194. [Google Scholar] [CrossRef] [PubMed]
  8. Bateson, A.D.; Asghar, A.U.R. Development and Evaluation of a Smartphone-Based Electroencephalography (EEG) System. IEEE Access 2021, 9, 75650–75667. [Google Scholar] [CrossRef]
  9. Guan, H.; Hu, S.; Lu, M.; He, M.; Zhang, X.; Liu, G. Analysis of human electroencephalogram features in different indoor environments. Build. Environ. 2020, 186, 107328. [Google Scholar] [CrossRef]
  10. Smith, N.; Georgiou, M.; King, A.C.; Tieges, Z.; Webb, S.; Chastin, S. Urban blue spaces and human health: A systematic review and meta-analysis of quantitative studies. Cities 2021, 119, 103413. [Google Scholar] [CrossRef]
  11. Norwood, M.F.; Lakhani, A.; Maujean, A.; Zeeman, H.; Creux, O.; Kendall, E. Brain activity, underlying mood and the environment: A systematic review. J. Environ. Psychol. 2019, 65, 101321. [Google Scholar] [CrossRef]
  12. Cruz-Garza, J.G.; Brantley, J.A.; Nakagome, S.; Kontson, K.; Megjhani, M.; Robleto, D.; Contreras-Vidal, J.L. Deployment of Mobile EEG Technology in an Art Museum Setting: Evaluation of Signal Quality and Usability. Front. Hum. Neurosci. 2017, 11, 527. Available online: https://www.frontiersin.org/articles/10.3389/fnhum.2017.00527 (accessed on 24 February 2023). [CrossRef] [PubMed]
  13. Scanlon, J.E.M.; Townsend, K.A.; Cormier, D.L.; Kuziek, J.W.P.; Mathewson, K.E. Taking off the training wheels: Measuring auditory P3 during outdoor cycling using an active wet EEG system. Brain Res. 2019, 1716, 50–61. [Google Scholar] [CrossRef] [PubMed]
Figure 1. CameraEEG application’s operational flow.
Figure 2. Graphical user interface of the app: (a) preview of video recording, (b) button for connecting/disconnecting to the EEG device, (c) button to mark events, (d) start/stop record button for video–EEG, (e) chronometer for tracking the recording time.
Figure 3. CameraEEG experimental setup with cEEGrid electrodes.
Figure 4. Comparison of eyes-open and eyes-closed multisubject EEG recordings across POz channels.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
