Using Brain Activity Patterns to Differentiate Real and Virtual Attended Targets during Augmented Reality Scenarios
Abstract
1. Introduction
1.1. Augmented Reality Technology
1.2. Related Work
1.3. Hypotheses
2. Experimental Design
3. Methods
3.1. Experiment Session
3.2. Data Recording
3.3. Analysis
3.3.1. Trial-Oblivious Approach
3.3.2. Trial-Sensitive Approach
3.3.3. BCI-Approach
3.3.4. Eye Tracking
3.3.5. Person-Independent Approach
3.3.6. Evaluation
4. Results
4.1. Person-Dependent Classification
4.2. Eye Tracking Classification
4.3. Person-Independent Classification
4.4. Feature Analysis
5. Discussion
6. Conclusions
Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
AR | Augmented Reality
BCI | Brain–Computer Interface
CNN | Convolutional Neural Network
EEG | Electroencephalography
EOG | Electrooculography
FBCSP | Filter-Bank Common Spatial Pattern
HMD | Head-Mounted Display
LSL | Lab Streaming Layer
PSD | Power Spectral Density
SSVEP | Steady-State Visually Evoked Potential
Overview of the evaluated classification approaches: data source (EEG, eye tracking (ET), or both), train:test ratio, split restriction, and classifier (neural network (NN) or linear discriminant analysis (LDA)).

Approach | Data | Train:Test | Split Restriction | Classifier
---|---|---|---|---
Trial-Oblivious | EEG | 70:30 | Stratified | NN
Trial-Sensitive | EEG | 70:30 | Stratified; windows from the same trial are either all in the training set or all in the test set | NN
BCI-Approach | EEG | 70:30 | First 70% of the trials of each label form the training set; the last 30% form the test set | NN
Eye Tracking | ET | 70:30 | Stratified; windows from the same trial are either all in the training set or all in the test set | LDA
Late Fusion | ET + EEG | 70:30 | Stratified; windows from the same trial are either all in the training set or all in the test set | NN + LDA (threshold-based fusion)
Person-Independent | EEG | Leave-one-subject-out | No data of the test subject is in the training set | NN
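The three person-dependent split restrictions map onto standard scikit-learn utilities. The following is a minimal sketch, not the authors' code; `windows`, `labels`, and `trial_ids` are hypothetical stand-ins for the windowed EEG features, their class labels, and the trial each window was cut from.

```python
import numpy as np
from sklearn.model_selection import train_test_split, GroupShuffleSplit

rng = np.random.default_rng(0)
windows = rng.normal(size=(600, 32))      # placeholder: 600 EEG windows x 32 features
labels = np.repeat([0, 1], 300)           # 0 = real target, 1 = virtual target
trial_ids = np.repeat(np.arange(60), 10)  # 10 windows per trial, 60 trials

# Trial-oblivious: stratified split over single windows; windows from one
# trial can land on both sides of the split.
X_train, X_test, y_train, y_test = train_test_split(
    windows, labels, test_size=0.3, stratify=labels, random_state=0)

# Trial-sensitive: group-aware split; all windows of a trial stay together.
# (GroupShuffleSplit itself does not stratify, so the label balance of the
# resulting split would have to be checked separately.)
gss = GroupShuffleSplit(n_splits=1, test_size=0.3, random_state=0)
train_idx, test_idx = next(gss.split(windows, labels, groups=trial_ids))

# BCI-approach: chronological split; per label, the first 70% of trials
# train the classifier and the last 30% test it (no shuffling).
train_trials, test_trials = [], []
for label in (0, 1):
    trials = np.unique(trial_ids[labels == label])  # trials in recording order
    cut = int(0.7 * len(trials))
    train_trials.extend(trials[:cut])
    test_trials.extend(trials[cut:])
train_mask = np.isin(trial_ids, train_trials)
test_mask = np.isin(trial_ids, test_trials)
```

The chronological BCI split is the most realistic of the three for online use, since a deployed classifier can only ever be trained on data recorded before the data it has to classify.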
Classification accuracies per participant for the four EEG-based approaches. An asterisk (*) marks accuracies significantly above chance level; the bottom row n gives the number of test samples per participant.

Participant | Trial-Oblivious | Trial-Sensitive | BCI-Approach | Person-Independent
---|---|---|---|---
1 | 0.96 * | 0.69 * | 0.55 | 0.48
2 | 0.91 * | 0.70 * | 0.63 * | 0.53
3 | 0.97 * | 0.86 * | 0.83 * | 0.50
4 | 0.94 * | 0.79 * | 0.55 | 0.63 *
5 | 0.92 * | 0.71 * | 0.65 * | 0.54
6 | 1.00 * | 0.60 | 0.59 | 0.54
7 | 1.00 * | 0.71 * | 0.58 | 0.56
8 | 1.00 * | 0.90 * | 0.89 * | 0.70 *
9 | 0.87 * | 0.65 * | 0.49 | 0.49
10 | 0.64 * | 0.76 * | 0.89 * | 0.47
11 | 0.99 * | 0.57 | 0.40 | 0.64 *
12 | 0.97 * | 0.65 * | 0.54 | 0.58 *
13 | 0.96 * | 0.73 * | 0.80 * | 0.62 *
14 | 0.98 * | 0.64 * | 0.54 | 0.48
15 | 0.86 * | 0.71 * | 0.76 * | 0.51
16 | 0.80 * | 0.77 * | 0.69 * | 0.51
17 | 0.97 * | 0.83 * | 0.83 * | 0.59 *
18 | 0.97 * | 0.62 * | 0.58 | 0.53
19 | 0.92 * | 0.86 * | 0.90 * | 0.53
20 | 0.99 * | 0.71 * | 0.63 * | 0.52
Mean | 0.93 * | 0.72 * | 0.66 * | 0.54
Std | 0.08 | 0.09 | 0.15 | 0.06
n | 60 | 60 | 60 | 200
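The significance markers are consistent with an exact one-sided binomial test against the 0.5 chance level of a balanced two-class problem, a common convention in BCI evaluation. The sketch below reproduces the pattern in the table under that assumption; it is not necessarily the authors' exact procedure.

```python
from scipy.stats import binomtest

def above_chance(accuracy, n, chance=0.5, alpha=0.05):
    """One-sided exact binomial test: does `accuracy` over `n` test
    samples exceed `chance` significantly?"""
    n_correct = round(accuracy * n)
    return binomtest(n_correct, n, p=chance, alternative="greater").pvalue < alpha

print(above_chance(0.62, 60))   # True:  37/60 correct, p < 0.05 (starred in the table)
print(above_chance(0.60, 60))   # False: 36/60 correct, p > 0.05 (unstarred)
print(above_chance(0.58, 200))  # True:  116/200 correct (starred at n = 200)
```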
Class-wise precision, recall, and F1 scores per participant for the real and virtual target classes.

Participant | Precision (Real) | Recall (Real) | F1 (Real) | Precision (Virtual) | Recall (Virtual) | F1 (Virtual)
---|---|---|---|---|---|---
1 | 0.71 | 0.65 | 0.68 | 0.68 | 0.73 | 0.70
2 | 0.67 | 0.81 | 0.73 | 0.76 | 0.60 | 0.67
3 | 0.91 | 0.78 | 0.84 | 0.81 | 0.93 | 0.86
4 | 0.84 | 0.72 | 0.77 | 0.75 | 0.87 | 0.81
5 | 0.72 | 0.69 | 0.71 | 0.70 | 0.73 | 0.72
6 x | 0.61 | 0.57 | 0.59 | 0.59 | 0.63 | 0.61
7 | 0.72 | 0.69 | 0.71 | 0.70 | 0.73 | 0.72
8 | 0.85 | 0.94 | 0.89 | 0.94 | 0.86 | 0.90
9 | 0.64 | 0.67 | 0.65 | 0.65 | 0.62 | 0.64
10 | 0.74 | 0.75 | 0.75 | 0.78 | 0.77 | 0.78
11 x | 0.57 | 0.61 | 0.59 | 0.58 | 0.53 | 0.56
12 | 0.62 | 0.80 | 0.69 | 0.71 | 0.50 | 0.59
13 | 0.68 | 0.79 | 0.73 | 0.79 | 0.69 | 0.74
14 | 0.63 | 0.68 | 0.65 | 0.65 | 0.61 | 0.63
15 | 0.70 | 0.74 | 0.72 | 0.72 | 0.68 | 0.70
16 | 0.78 | 0.76 | 0.77 | 0.76 | 0.79 | 0.78
17 | 0.79 | 0.87 | 0.83 | 0.88 | 0.81 | 0.84
18 x | 0.62 | 0.64 | 0.63 | 0.63 | 0.60 | 0.61
19 | 0.79 | 0.94 | 0.86 | 0.94 | 0.78 | 0.85
20 | 0.72 | 0.69 | 0.70 | 0.70 | 0.73 | 0.72
Mean | 0.71 | 0.74 | 0.72 | 0.74 | 0.71 | 0.72
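As a consistency check, each F1 value in the table is the harmonic mean of the corresponding precision and recall; for participant 1's real class:

$$F_1 = \frac{2PR}{P + R} = \frac{2 \cdot 0.71 \cdot 0.65}{0.71 + 0.65} \approx 0.68$$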
Confusion of real and virtual targets (mean ± standard deviation), normalized over all test windows so that the four entries sum to one; rows give the true class, columns the predicted class.

True \ Predicted | Real | Virtual
---|---|---
Real | 0.36 ± 0.04 | 0.15 ± 0.05
Virtual | 0.13 ± 0.05 | 0.36 ± 0.06
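A matrix in this layout can be computed with scikit-learn's `normalize="all"` option. The arrays below are illustrative placeholders, not the study's data.

```python
from sklearn.metrics import confusion_matrix

# Placeholder predictions, purely illustrative.
y_true = ["real", "real", "virtual", "virtual", "real", "virtual"]
y_pred = ["real", "virtual", "virtual", "real", "real", "virtual"]

cm = confusion_matrix(y_true, y_pred, labels=["real", "virtual"], normalize="all")
print(cm)  # rows: true class, columns: predicted class; entries sum to 1
```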
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite

Vortmann, L.-M.; Schwenke, L.; Putze, F. Using Brain Activity Patterns to Differentiate Real and Virtual Attended Targets during Augmented Reality Scenarios. Information 2021, 12, 226. https://doi.org/10.3390/info12060226