Proceeding Paper

Unraveling Imaginary and Real Motion: A Correlation Indices Study in BCI Data †

by Stavros T. Miloulis 1,2, Ioannis Zorzos 1,2, Ioannis Kakkos 1,2,*, Aikaterini Karampasi 1,2, Errikos C. Ventouras 1, Ioannis Kalatzis 1, Charalampos Papageorgiou 3, Panteleimon Asvestas 1 and George K. Matsopoulos 2
1 Department of Biomedical Engineering, University of West Attica, 12243 Athens, Greece
2 Biomedical Engineering Laboratory, National Technical University of Athens, 15773 Athens, Greece
3 Department of Psychiatry, School of Medicine, National & Kapodistrian University of Athens, 11527 Athens, Greece
* Author to whom correspondence should be addressed.
Presented at the Advances in Biomedical Sciences, Engineering and Technology (ABSET) Conference, Athens, Greece, 10–11 June 2023.
Eng. Proc. 2023, 50(1), 11; https://doi.org/10.3390/engproc2023050011
Published: 7 November 2023

Abstract

The efficient translation of brain signals into an output device is an essential characteristic for establishing a Brain-computer Interface (BCI) link. This research investigates the applicability of diverse correlation indices for the differentiation of specific movements (left, right, both, or none) and states (real or imaginary) in a private BCI dataset, including EEG recordings of 32 participants. As such, the recorded brain activation data were employed to illustrate the differences between visual- and auditory-event-related responses during task performance. Our methodology involved a two-pronged approach. Firstly, EEG data were collected, capturing both the visual- and auditory-event-related signals that corresponded to each of the four movement classes. Secondly, we performed a comparative analysis of the collected dataset using various correlation algorithms, such as Pearson, Spearman, and Kendall, among others, to evaluate their effectiveness in differentiating between movements and states. The results demonstrated distinctive correlation patterns, as the selected indices effectively distinguished between real and imaginary movements, as well as between different lower limb movements in most cases. Moreover, the correlation schemas of certain individuals presented greater sensitivity in discerning nuances within the dataset. In this regard, it can be inferred that the chosen correlation indices can provide valuable insights into the aforementioned differentiation in EEG data. The results open up potential paths for improving BCI interfaces and contributing to more accurate prediction models.

1. Introduction

By creating a direct communication channel between a wired or augmented brain and an external device, Brain-computer Interface (BCI) technology is expanding the field of neuroscience and enabling novel therapeutic uses [1]. BCIs are used in a wide range of industries, including gaming and the military, as well as medical applications that restore lost hearing, sight, and mobility. Although the technology has made significant strides in recent years, many elements of BCI functioning remain unexplored, especially in the area of understanding and interpreting brain signals [2]. Understanding the distinctions between real and imaginary movements is a key component of BCI research. Movement imagination, also known as motor imagery, is the practice of a physical motion in the mind without actual bodily movement. Real and imagined actions produce different brainwave patterns in BCI, which may be converted into precise commands for the interface [3,4]. For the practical use of BCIs, the capacity to distinguish between these motions is essential because it can expand the range of instructions, enhancing the functionality and flexibility of the interface. In addition, the study of Event-related Potentials (ERPs) offers a time-locked illustration of the reactions the brain and nervous system undergo in response to external stimuli. These potentials can be utilized in the context of BCIs to comprehend how the brain responds to particular stimuli and may subsequently be used to develop more accurate and responsive interfaces [5]. The reactions to sound and visual stimuli are known as auditory and visual ERPs, respectively, allowing EEG recordings to provide a window into the brain's specific activity and its reaction to image or sound stimuli [6,7].
Taking the above into account, we designed our study using a multimodal approach to probe the different facets of Brain-computer interactions. Our objective is to identify patterns that may be used to enhance the functionality and usability of BCIs. As such, we recorded electroencephalography (EEG) data from subjects performing both real and imaginary leg movements in response to visual and auditory stimuli, offering a rich dataset for analysis. The recorded ERPs were then used to compute Pearson and Spearman correlation coefficients, with the aim of identifying any discernible patterns or differences between various states and movements. By examining ERPs, this work aims to further explore the distinctions between real and imagined motions in BCI applications. Our study focuses more explicitly on investigating the correlation indices between four different states of our BCI dataset: real and imagined left- and right-leg motions, simultaneous movement of both legs, and no movement. This methodology enabled us to offer fresh insights to the research community regarding the relationship between different states (real and imaginary) and movements in a BCI context. Our approach of combining different sensory modalities and movement types extends the understanding of how different neural patterns are elicited and can be differentiated, thereby contributing to the optimization of BCI technology. This study is especially significant given the increasing role of BCIs in fields ranging from neuroscience to rehabilitative medicine and beyond. Through our rigorous approach and detailed analysis, we aim to provide a robust basis for further research in this rapidly evolving field.

2. Materials and Methods

2.1. Dataset

2.1.1. Participants and Data Collection

Our study involved a total of thirty-three subjects (seven females), with a mean age of 25 ± 3.7 years. All subjects were healthy individuals without any known neurological or psychiatric disorders. All participants had normal (or corrected-to-normal) vision and normal hearing. Each subject provided informed consent prior to participating in the study, and all protocols adhered to the ethical guidelines set by the institutional review board of the National Technical University of Athens (10 November 2021). Data were collected with a 64-channel Ag/AgCl electroencephalographic (EEG) system (ActiveTwo, BioSemi, Amsterdam, The Netherlands), using the standard 10–20 system.

2.1.2. Experimental Design

The experimental protocol consisted of two sessions, one utilizing visual and one utilizing auditory stimuli. In each session, each subject was presented with a sequence of data items. In the visual session, the data item was an image of shoe soles, highlighting the left, the right, or both shoe soles, while the 'none' item presented non-highlighted soles. In the auditory session, items were delivered through headphones as 1 s spoken commands: 'right', 'left', 'both', or 'none'. Each subject was requested to perform each session twice: once for real motion (where participants had to move, or not move, each limb based on the data item) and once for imaginary motion (where participants had to imagine the four different movements). This resulted in two states for each movement: real and imaginary. Both tasks were implemented in the Python programming language using the PsychoPy framework.
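The task code is not included in the paper; the following is a minimal sketch of how one visual-session trial could be presented with PsychoPy, under assumed file names, timings, and trial structure (the auditory session would substitute a spoken-command sound for the image).

```python
# Minimal sketch of one visual-session trial in PsychoPy.
# Image paths, durations, and trial structure are assumptions for illustration;
# the original task code is not distributed with the paper.
from psychopy import visual, core

win = visual.Window(fullscr=True, color="black")

# One image per cue class: left, right, both, or no highlighted shoe sole (hypothetical files).
cue_images = {
    "left": "soles_left.png",
    "right": "soles_right.png",
    "both": "soles_both.png",
    "none": "soles_none.png",
}

def run_visual_trial(cue, stim_duration=1.0, iti=2.0):
    """Show the cue image, leave time for the (real or imagined) movement, then pause."""
    stim = visual.ImageStim(win, image=cue_images[cue])
    stim.draw()
    win.flip()                 # stimulus onset; an EEG trigger would be sent here
    core.wait(stim_duration)
    win.flip()                 # clear the screen
    core.wait(iti)

for cue in ["left", "right", "both", "none"]:
    run_visual_trial(cue)

win.close()
core.quit()
```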

2.1.3. Preprocessing

Collected data were preprocessed utilizing a previously validated methodology to ensure signal quality and consistency [8]. Firstly, a bandpass filter (0.5–40 Hz) was applied to eliminate high-frequency noise and DC drifts in the data. Then, signals were re-referenced to the average signal of all electrodes. Artifacts corresponding to eye blinks were removed manually utilizing Independent Component Analysis. The resulting EEG data were then segmented into epochs, time-locked to the onset of the stimuli. The epochs spanned from −200 ms to 800 ms relative to stimulus onset, where 0 ms marked the onset of the evoking stimulus. Epochs were baseline-corrected using a 150 ms pre-stimulus period. In this project, we used the 'Evoked' objects from the MNE-Python module [9]. This module provides tools for visualizing, analyzing, and decoding these signals, allowing us to compare brain activity during real and imaginary movements.
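A condensed sketch of this pipeline in MNE-Python is shown below; the file name, event codes, number of ICA components, and the excluded blink component are assumptions, since these details are not specified in the text.

```python
import mne
from mne.preprocessing import ICA

# Hypothetical file name; the private dataset is not distributed with the paper.
raw = mne.io.read_raw_bdf("subject01.bdf", preload=True)  # BioSemi ActiveTwo recording

# Band-pass filter (0.5-40 Hz) to remove DC drifts and high-frequency noise.
raw.filter(l_freq=0.5, h_freq=40.0)

# Re-reference to the average of all electrodes.
raw.set_eeg_reference("average", projection=False)

# ICA for eye-blink artifacts; components were selected manually in the study,
# so the excluded index below is purely illustrative.
ica = ICA(n_components=20, random_state=42)
ica.fit(raw)
ica.exclude = [0]            # e.g., the blink component identified by inspection
raw = ica.apply(raw)

# Epoch from -200 ms to 800 ms around stimulus onset, baseline-corrected
# over the 150 ms pre-stimulus window (event codes are assumptions).
events = mne.find_events(raw)
event_id = {"left": 1, "right": 2, "both": 3, "none": 4}
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.8, baseline=(-0.15, 0.0), preload=True)
```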

2.2. Computation

2.2.1. ERP Creation and Feature Extraction

Feature extraction involved the computation of the ERPs for each condition. Each epoch was averaged across trials for each condition, resulting in ERPs for each of the eight conditions (four movements in two states—real and imaginary) for every subject (Figure 1). This method facilitated the comparison between real and imaginary states for each movement, serving as an effective means to examine the brain’s response to different stimuli and conditions.
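In MNE-Python terms, the per-condition ERPs correspond to averaging the epochs of each condition into an Evoked object; a brief sketch follows, with condition labels assumed from the preprocessing example above.

```python
# Average epochs per condition to obtain one Evoked (ERP) per movement class;
# in the full analysis this is repeated per state (real/imaginary) and per
# modality (visual/auditory), yielding the eight conditions per subject.
conditions = ["left", "right", "both", "none"]
evokeds = {cond: epochs[cond].average() for cond in conditions}

# Butterfly plot of one ERP, comparable to Figure 1 (each line is one EEG channel).
evokeds["right"].plot()
```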

2.2.2. Correlation Computation

The correlation between ERPs was computed using Pearson's correlation coefficient, which measures the linear relationship between two datasets, and Spearman's correlation, which assesses monotonic relationships (whether linear or not). The Pearson correlation coefficient was used to identify and quantify the strength of the association between the real and imagined states of the same movement. Its sensitivity to the linear relationship makes it ideal for our dataset (given the nature of our experiment), capturing the existence of linear associations between the signals [4,10]. On the other hand, the Spearman correlation coefficient, a nonparametric measure of rank correlation, was used to assess the similarity in the orderings of data when ranked by each of the variables. This was implemented for the comparison between different types of movement across the real and imaginary states, where a linear relationship may not be present [11]. Both methods were employed to provide a comprehensive overview of the relationships between different conditions (linear and non-linear relationships), accommodating the complexity and diversity of neural data [12]. Each pair of conditions (real/imaginary and left/right/both/none) was analyzed for each subject, resulting in a correlation matrix for each individual.
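A sketch of the pairwise computation is given below: each ERP is flattened into a single vector (channels × time points), and Pearson and Spearman coefficients are computed for every pair of conditions. The flattening strategy and function names are assumptions; the paper does not specify how channels and time points were combined.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def correlation_matrix(evokeds, method="pearson"):
    """Pairwise correlations between ERPs; each ERP is flattened to one vector
    (channels x time points), which is an assumed simplification."""
    names = list(evokeds.keys())
    vectors = [evokeds[name].data.ravel() for name in names]
    n = len(names)
    corr = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if method == "pearson":
                corr[i, j], _ = pearsonr(vectors[i], vectors[j])
            else:
                corr[i, j], _ = spearmanr(vectors[i], vectors[j])
    return names, corr

# One matrix per subject and per correlation method (cf. Figures 2-5).
labels, pearson_mat = correlation_matrix(evokeds, method="pearson")
labels, spearman_mat = correlation_matrix(evokeds, method="spearman")
```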

3. Results

In the pursuit of exploring the relationship between various states and movements, we computed correlation indices for all combinations within our experimental design. Figure 2 presents the average Pearson's and Spearman correlation coefficients across all subjects for each pair of conditions.
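For reference, the group-level matrices in Figure 2 amount to an element-wise average of the per-subject correlation matrices; a minimal sketch, with variable names assumed, is:

```python
import numpy as np

# subject_mats: list of per-subject correlation matrices (same condition ordering),
# e.g., as returned by correlation_matrix() above; the variable name is illustrative.
group_mean = np.mean(np.stack(subject_mats), axis=0)
```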
Our analysis revealed several interesting patterns. Firstly, higher correlations were observed between similar types of movements, irrespective of whether they were real or imaginary. This was consistent across both the Pearson and Spearman measures of correlation. For example, the correlation between the real and imaginary states when moving the right leg was significantly higher compared to the correlation between the right leg and both legs, for both real and imaginary states (Figure 2). To highlight individual differences, Figure 3 and Figure 4 present the distribution of correlation coefficients for specific subjects and comparisons.
On the other hand, differences emerged when comparing the correlations between the real and imaginary states of the same movement. The imaginary state exhibited a tendency for higher correlation when the left and right movements were compared with the no-movement condition. This pattern was consistent across the majority of subjects and was further reinforced by the Spearman correlation coefficients (Figure 2b). In terms of visual and auditory stimuli comparisons, our results showed no significant difference in the correlation indices. However, a trend was observed where visual ERPs had marginally higher correlation values when compared to the auditory ERPs within the same state and movement (Figure 3).
Finally, across the majority of subjects, the correlations from the visual and auditory stimuli for the "both legs" movement presented the highest values compared to the other movement conditions (Figure 2, Figure 3, Figure 4 and Figure 5). This may suggest a similar cognitive processing mechanism during both real and imagined bilateral leg movements. These results provide insight into the relationship between different states and movements within a BCI context. By understanding these relationships, we can refine the design and performance of BCI devices for improved user adaptability and responsiveness [2].

4. Discussion

Our study aimed to discern the differences between real and imaginary movements in a Brain-computer Interface (BCI) context, focusing specifically on different lower limb movements and the resulting ERPs. Our findings have provided notable insights into the complex relationships among these factors and are described below.
One of the key findings was the higher correlation between similar types of movements, whether real or imaginary. This suggests that the neural representations for similar physical and imagined movements share commonalities, an observation that aligns with previous research emphasizing the overlapping neural mechanisms activated during real and imagined movements [13]. Such findings substantiate the viability of motor imagery as a reliable means of command input in BCIs. However, contrasting correlation values were also observed when comparing the real and imaginary states of the same movement. The imaginary state showed higher correlation with the no-movement condition when comparing left and right movements. This might be indicative of the cognitive effort and control required in motor imagery, which could bear similarities to the resting or no-movement state, a concept that has been highlighted in earlier studies [14,15]. Interestingly, no significant difference in correlation values between visual and auditory ERPs was found. This could suggest that the type of stimulus, be it auditory or visual, may not significantly influence the brain's response during either real or imagined movement [16]. However, a trend of marginally higher correlation values was observed for visual ERPs, encouraging further investigation.
The results present several implications regarding the nascent field of neuroscience and brain signal analysis, opening up several interesting avenues. From a BCI technology perspective, our findings may contribute to enhancing signal classification algorithms. The distinct patterns identified between real and imaginary movements can be used to improve the accuracy of BCIs that rely on differentiating these states. Furthermore, understanding the neural similarity between no-movement and certain imagined movements might inform user-training strategies, assisting users in generating more distinguishable neural patterns during different tasks [17,18]. Future research could apply different correlation or similarity measures, further identifying non-linear relationships or other complex interactions between evoked signals. In addition, the inclusion of other types of movements or stimuli could provide a more comprehensive understanding of brain-behavior relationships within a BCI framework. Moreover, the differentiations could be incorporated into machine learning fusion techniques [19], allowing for resource optimization for potential application in wearable devices [20]. Lastly, the impact of user experience or training on the differentiation between real and imaginary movements suggests a worthwhile direction for investigation, considering the practical significance of BCIs in rehabilitation and assistive technology [21,22].
However, it should be taken into account that the limited number of subjects (while not insignificant) might influence the generalizability of our findings. Future studies could benefit from a larger sample size and from including diverse demographic groups to account for potential inter-individual differences [23,24]. Furthermore, while we applied rigorous preprocessing steps to our data, the potential for noise or bias inherent in EEG recordings cannot be completely ruled out [25].

5. Conclusions

Our study provided a thorough exploration of the correlations between real and imagined leg movements within a Brain-computer Interface (BCI) context, utilizing both visual and auditory ERPs. Our key findings highlighted distinct correlation patterns between similar types of movements, irrespective of their real or imagined state, and contrasting correlations when comparing real and imaginary states of the same movement. Interestingly, we observed a trend towards higher correlations for visual ERPs, although the difference was not statistically significant.
These findings have important implications for BCI technology. First, they underscore the viability of motor imagery as a control input, as the neural representations for similar physical and imagined movements share commonalities. Second, the distinct patterns identified between real and imaginary movements can be utilized to improve the accuracy of signal classification algorithms in BCIs. Lastly, the observed trends between the different states and types of ERPs offer insights that can potentially inform user-training strategies, thereby enhancing the overall performance of BCI systems.

Author Contributions

Conceptualization, G.K.M., I.K. (Ioannis Kalatzis) and C.P.; methodology, I.Z., S.T.M. and I.K. (Ioannis Kakkos); software, I.Z. and A.K.; validation, E.C.V., P.A. and C.P.; formal analysis, G.K.M., A.K. and I.K. (Ioannis Kakkos); investigation, I.Z. and S.T.M.; resources, I.K. (Ioannis Kakkos) and G.K.M.; data curation, I.K. (Ioannis Kalatzis) and E.C.V.; writing—original draft preparation, S.T.M. and I.Z.; writing—review and editing, P.A., C.P. and S.T.M.; visualization, I.K. (Ioannis Kakkos) and A.K.; supervision, G.K.M. and I.K. (Ioannis Kalatzis); project administration, G.K.M. and P.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research work was supported by the Hellenic Foundation for Research and Innovation (H.F.R.I.) under the “First Call for H.F.R.I. Research Projects to support Faculty members and Researchers and the procurement of high-cost research equipment grant” (Project Number: 1540, MIS 80747).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of the National Technical University of Athens (10 November 2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available upon request. The data are not publicly available due to privacy reasons.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kakkos, I.; Miloulis, S.-T.; Gkiatis, K.; Dimitrakopoulos, G.N.; Matsopoulos, G.K. Human–Machine Interfaces for Motor Rehabilitation. In Advanced Computational Intelligence in Healthcare-7: Biomedical Informatics; Maglogiannis, I., Brahnam, S., Jain, L.C., Eds.; Studies in Computational Intelligence; Springer: Berlin/Heidelberg, Germany, 2020; pp. 1–16. ISBN 978-3-662-61114-2. [Google Scholar]
  2. Saha, S.; Mamun, K.A.; Ahmed, K.; Mostafa, R.; Naik, G.R.; Darvishi, S.; Khandoker, A.H.; Baumert, M. Progress in Brain Computer Interface: Challenges and Opportunities. Front. Syst. Neurosci. 2021, 15, 578875. [Google Scholar] [CrossRef] [PubMed]
  3. Lee, Y.; Lee, H.J.; Tae, K.S. Classification of EEG Signals Related to Real and Imagery Knee Movements Using Deep Learning for Brain Computer Interfaces. Technol. Health Care Off. J. Eur. Soc. Eng. Med. 2023, 31, 933–942. [Google Scholar] [CrossRef] [PubMed]
  4. Sugata, H.; Hirata, M.; Yanagisawa, T.; Matsushita, K.; Yorifuji, S.; Yoshimine, T. Common Neural Correlates of Real and Imagined Movements Contributing to the Performance of Brain–Machine Interfaces. Sci. Rep. 2016, 6, 24663. [Google Scholar] [CrossRef] [PubMed]
  5. Gutierrez-Martinez, J.; Mercado-Gutierrez, J.A.; Carvajal-Gámez, B.E.; Rosas-Trigueros, J.L.; Contreras-Martinez, A.E. Artificial Intelligence Algorithms in Visual Evoked Potential-Based Brain-Computer Interfaces for Motor Rehabilitation Applications: Systematic Review and Future Directions. Front. Hum. Neurosci. 2021, 15, 772837. [Google Scholar] [CrossRef] [PubMed]
  6. Backer, K.C.; Kessler, A.S.; Lawyer, L.A.; Corina, D.P.; Miller, L.M. A Novel EEG Paradigm to Simultaneously and Rapidly Assess the Functioning of Auditory and Visual Pathways. J. Neurophysiol. 2019, 122, 1312–1329. [Google Scholar] [CrossRef] [PubMed]
  7. Kakkos, I.; Ventouras, E.M.; Asvestas, P.A.; Karanasiou, I.S.; Matsopoulos, G.K. A Condition-Independent Framework for the Classification of Error-Related Brain Activity. Med. Biol. Eng. Comput. 2020, 58, 573–587. [Google Scholar] [CrossRef]
  8. Dimitrakopoulos, G.N.; Kakkos, I.; Vrahatis, A.G.; Sgarbas, K.; Li, J.; Sun, Y.; Bezerianos, A. Driving Mental Fatigue Classification Based on Brain Functional Connectivity. In Proceedings of the Engineering Applications of Neural Networks; Boracchi, G., Iliadis, L., Jayne, C., Likas, A., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 465–474. [Google Scholar]
  9. Gramfort, A.; Luessi, M.; Larson, E.; Engemann, D.; Strohmeier, D.; Brodbeck, C.; Goj, R.; Jas, M.; Brooks, T.; Parkkonen, L.; et al. MEG and EEG Data Analysis with MNE-Python. Front. Neurosci. 2013, 7, 276. [Google Scholar] [CrossRef]
  10. Pawan; Dhiman, R. Electroencephalogram Channel Selection Based on Pearson Correlation Coefficient for Motor Imagery-Brain-Computer Interface. Meas. Sens. 2023, 25, 100616. [Google Scholar] [CrossRef]
  11. Vidaurre, C.; Haufe, S.; Jorajuría, T.; Müller, K.-R.; Nikulin, V.V. Sensorimotor Functional Connectivity: A Neurophysiological Factor Related to BCI Performance. Front. Neurosci. 2020, 14, 575081. [Google Scholar] [CrossRef]
  12. Oken, B.; Memmott, T.; Eddy, B.; Wiedrick, J.; Fried-Oken, M. Vigilance State Fluctuations and Performance Using Brain-Computer Interface for Communication. Brain Comput. Interfaces Abingdon Engl. 2018, 5, 146–156. [Google Scholar] [CrossRef]
  13. Roberts, J.W.; Wood, G.; Wakefield, C.J. Examining the Equivalence between Imagery and Execution within the Spatial Domain—Does Motor Imagery Account for Signal-Dependent Noise? Exp. Brain Res. 2020, 238, 2983–2992. [Google Scholar] [CrossRef]
  14. Jacquet, T.; Lepers, R.; Poulin-Charronnat, B.; Bard, P.; Pfister, P.; Pageaux, B. Mental Fatigue Induced by Prolonged Motor Imagery Increases Perception of Effort and the Activity of Motor Areas. Neuropsychologia 2021, 150, 107701. [Google Scholar] [CrossRef]
  15. Moran, A.; O’Shea, H. Motor Imagery Practice and Cognitive Processes. Front. Psychol. 2020, 11, 394. [Google Scholar] [CrossRef] [PubMed]
  16. Miloulis, S.T.; Kakkos, I.; Karampasi, A.; Zorzos, I.; Ventouras, E.-C.; Matsopoulos, G.K.; Asvestas, P.; Kalatzis, I. Stimulus Effects on Subject-Specific BCI Classification Training Using Motor Imagery. In Proceedings of the 2021 International Conference on e-Health and Bioengineering (EHB), Iasi, Romania, 18–19 November 2021; pp. 1–4. [Google Scholar]
  17. Rasheed, S. A Review of the Role of Machine Learning Techniques towards Brain–Computer Interface Applications. Mach. Learn. Knowl. Extr. 2021, 3, 835–862. [Google Scholar] [CrossRef]
  18. Roc, A.; Pillette, L.; Mladenović, J.; Benaroch, C.; N’Kaoua, B.; Jeunet, C.; Lotte, F. A Review of User Training Methods in Brain Computer Interfaces Based on Mental Tasks. J. Neural Eng. 2020, 18, 011002. [Google Scholar] [CrossRef] [PubMed]
  19. Kakkos, I.; Dimitrakopoulos, G.N.; Sun, Y.; Yuan, J.; Matsopoulos, G.K.; Bezerianos, A.; Sun, Y. EEG Fingerprints of Task-Independent Mental Workload Discrimination. IEEE J. Biomed. Health Inform. 2021, 25, 3824–3833. [Google Scholar] [CrossRef] [PubMed]
  20. Miloulis, S.-T.; Kakkos, I.; Dimitrakopoulos, G.Ν.; Sun, Y.; Karanasiou, I.; Asvestas, P.; Ventouras, E.-C.; Matsopoulos, G. Evaluating Memory and Cognition via a Wearable EEG System: A Preliminary Study. In Proceedings of the Wireless Mobile Communication and Healthcare; Ye, J., O’Grady, M.J., Civitarese, G., Yordanova, K., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 52–66. [Google Scholar]
  21. Ron-Angevin, R.; Medina-Juliá, M.T.; Fernández-Rodríguez, Á.; Velasco-Álvarez, F.; Andre, J.-M.; Lespinet-Najib, V.; Garcia, L. Performance Analysis with Different Types of Visual Stimuli in a BCI-Based Speller Under an RSVP Paradigm. Front. Comput. Neurosci. 2021, 14, 587702. [Google Scholar] [CrossRef] [PubMed]
  22. Škola, F.; Tinková, S.; Liarokapis, F. Progressive Training for Motor Imagery Brain-Computer Interfaces Using Gamification and Virtual Reality Embodiment. Front. Hum. Neurosci. 2019, 13, 329. [Google Scholar] [CrossRef] [PubMed]
  23. Kakkos, I.; Dimitrakopoulos, G.N.; Gao, L.; Zhang, Y.; Qi, P.; Matsopoulos, G.K.; Thakor, N.; Bezerianos, A.; Sun, Y. Mental Workload Drives Different Reorganizations of Functional Cortical Connectivity Between 2D and 3D Simulated Flight Experiments. IEEE Trans. Neural Syst. Rehabil. Eng. Publ. IEEE Eng. Med. Biol. Soc. 2019, 27, 1704–1713. [Google Scholar] [CrossRef] [PubMed]
  24. Stieger, J.R.; Engel, S.A.; He, B. Continuous Sensorimotor Rhythm Based Brain Computer Interface Learning in a Large Population. Sci. Data 2021, 8, 98. [Google Scholar] [CrossRef]
  25. Larson, E.; Taulu, S. Reducing Sensor Noise in MEG and EEG Recordings Using Oversampled Temporal Projection. IEEE Trans. Biomed. Eng. 2018, 65, 1002–1013. [Google Scholar] [CrossRef]
Figure 1. ERPs (averaged epochs) in a butterfly mode. Each color tone represents a different EEG channel, shown in the upper left.
Figure 2. The average correlation coefficient across all subjects for each pair of conditions for: (a) Pearson’s correlation; and (b) Spearman correlation. Each state is presented in the following format: (Real/Imaginary)/(Visual/Auditory)/(Left, Right, Both, None).
Figure 3. Correlation between real and imaginary motion of subject 7: (a) Pearson’s and (b) Spearman.
Figure 4. Correlation between real and imaginary motion of subject 25: (a) Pearson’s and (b) Spearman.
Figure 5. Pearson correlation between movements from visual and auditory stimuli for: (a) subject 18; (b) subject 17; and (c) subject 29.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
