Human–Computer Interaction Multi-Task Modeling Based on Implicit Intent EEG Decoding
Abstract
1. Introduction
2. Related Research
2.1. Research on Implicit Intent in HCI
2.2. Research on Algorithm Model Construction
3. Methods
3.1. Extract Typical HCI Implicit Intent
3.2. Feature Extraction
1. Calculate the mixed-space mean covariance matrix.
2. Construct hybrid spatial filters for each pairwise combination of intentions.
3. Generate the intention features.
4. Concatenate the intention features column-wise.

After the different spatial filters are applied to the original intention EEG data, the resulting intention features are concatenated column-wise. This yields 12-dimensional CSP feature data for each intention type (with n = 3, chosen after repeated experiments), which is used for the subsequent multi-intention, multi-class modeling and recognition.
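The steps above can be sketched as follows. This is a minimal illustration of a standard one-versus-one CSP pipeline under common conventions (the function names, trace normalization, and log-variance feature are assumptions, not the authors' exact implementation):

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=3):
    """CSP spatial filters for one pair of intention classes.

    trials_* : arrays of shape (n_trials, n_channels, n_samples).
    Returns a (2 * n_pairs, n_channels) filter matrix W.
    """
    def mean_cov(trials):
        # Trace-normalized spatial covariance, averaged over trials
        covs = [(x @ x.T) / np.trace(x @ x.T) for x in trials]
        return np.mean(covs, axis=0)

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigendecomposition of Ca against the composite covariance;
    # eigenvalues near 0 and 1 give the most discriminative filters
    vals, vecs = eigh(ca, ca + cb)
    order = np.argsort(vals)
    picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return vecs[:, picks].T

def csp_features(trial, W):
    """Log-variance features of one trial projected through the filters W."""
    z = W @ trial                    # (2*n_pairs, n_samples)
    var = np.var(z, axis=1)
    return np.log(var / var.sum())   # normalized log-variance vector
```

Applying `csp_features` with the filters from every pairwise combination of intentions, and concatenating the results column-wise, produces the final per-trial feature vector described above.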
3.3. Machine Learning Classifier
4. Experiment
4.1. Subjects
4.2. Experiment Setup
4.3. Procedures
4.4. Preprocessing of the EEG Data
5. Results and Analysis
5.1. Classification Results
5.2. Effectiveness of the CSP Intention Feature
6. Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
| Subject | SVM (RBF) %Acc | SVM c | SVM g | KNN %Acc | KNN k | GaussianNB %Acc |
|---|---|---|---|---|---|---|
| S1 | 99.10 | 0.25 | 8 | 86.75 | 5 | 84.52 |
| S2 | 99.08 | 0.25 | 8 | 85.77 | 4 | 82.11 |
| S3 | 99.54 | 0.25 | 8 | 88.22 | 4 | 85.21 |
| S4 | 99.09 | 0.125 | 4 | 85.87 | 4 | 85.14 |
| S5 | 98.94 | 0.25 | 8 | 87.01 | 3 | 84.71 |
| S6 | 99.08 | 32 | 0.0625 | 87.62 | 4 | 85.00 |
| S7 | 99.42 | 0.25 | 8 | 85.19 | 7 | 81.56 |
| S8 | 99.04 | 0.25 | 8 | 84.15 | 3 | 84.43 |
| S9 | 99.34 | 0.5 | 8 | 87.36 | 4 | 85.09 |
| S10 | 99.26 | 0.125 | 4 | 86.90 | 4 | 75.96 |
| S11 | 99.03 | 0.25 | 8 | 86.85 | 3 | 85.11 |
| S12 | 99.08 | 0.25 | 8 | 91.03 | 4 | 85.49 |
| S13 | 98.13 | 1 | 8 | 87.30 | 3 | 85.74 |
| S14 | 99.00 | 0.5 | 8 | 88.32 | 3 | 83.63 |
| S15 | 99.03 | 2 | 2 | 87.64 | 3 | 81.62 |
| S16 | 99.05 | 0.25 | 8 | 89.64 | 3 | 83.76 |
| S17 | 99.06 | 0.125 | 2 | 87.60 | 5 | 82.17 |
| S18 | 99.89 | 0.125 | 2 | 87.18 | 3 | 83.32 |
| S19 | 99.12 | 2 | 0.25 | 88.79 | 3 | 92.50 |
| S20 | 99.42 | 0.25 | 8 | 87.28 | 3 | 87.95 |
| Mean | 99.135 ± 0.33 | – | – | 87.32 ± 1.51 | – | 88.26 ± 1.11 |
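The per-subject hyperparameters in the table (cost c and kernel width g for the RBF SVM, neighborhood size k for KNN) suggest a per-subject grid search. A sketch of that protocol with scikit-learn is shown below on synthetic stand-in features; the data, grids, and variable names are illustrative assumptions, not the study's actual pipeline:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import GridSearchCV, cross_val_score

# Synthetic stand-in for one subject's CSP features:
# 12-D vectors, 5 intention classes, 40 trials per class
rng = np.random.default_rng(0)
y = np.repeat(np.arange(5), 40)
X = rng.standard_normal((200, 12)) + y[:, None]

# Grid search over the RBF-SVM cost C and kernel width gamma,
# mirroring the per-subject (c, g) values reported in the table
svm = GridSearchCV(
    SVC(kernel="rbf"),
    {"C": [0.125, 0.25, 0.5, 1, 2, 32],
     "gamma": [0.0625, 0.25, 2, 4, 8]},
    cv=5,
)
svm.fit(X, y)

# Compare the tuned SVM against KNN and Gaussian naive Bayes
knn = KNeighborsClassifier(n_neighbors=4)
gnb = GaussianNB()
for name, clf in [("SVM", svm.best_estimator_), ("KNN", knn), ("GNB", gnb)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```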
| Subject | SVM %Acc | KNN %Acc | GNB %Acc |
|---|---|---|---|
| S1 | 87.27 | 81 | 83.27 |
| S2 | 100 | 79.19 | 78.95 |
| S3 | 84.2 | 84.2 | 84.2 |
| S4 | 79.66 | 80.05 | 84.03 |
| S5 | 76.34 | 80.85 | 82.7 |
| S6 | 70.86 | 79.75 | 79.51 |
| S7 | 92.31 | 82.44 | 80.27 |
| S8 | 100 | 79.8 | 79.6 |
| S9 | 92.24 | 69.21 | 81.63 |
| S10 | 96.68 | 84.19 | 57.4 |
| S11 | 100 | 85.35 | 83.29 |
| S12 | 85.96 | 62.52 | 74.73 |
| S13 | 82.42 | 82.42 | 43.33 |
| S14 | 100 | 87.6 | 76.5 |
| S15 | 98.41 | 82.25 | 78.89 |
| S16 | 98.62 | 87.38 | 78.15 |
| S17 | 99.87 | 87.37 | 73.85 |
| S18 | 100 | 80.25 | 82.79 |
| S19 | 79 | 78.43 | 67.79 |
| S20 | 92.31 | 83.24 | 82.36 |
| | SVM | KNN | GNB |
|---|---|---|---|
| Recognition average | 90.81 | 80.87 | 76.66 |
| Standard deviation | 9.42 | 5.94 | 10.12 |
| Pair | Mean Diff. | Std. | SEM | 95% CI Lower | 95% CI Upper | t | df | p (Two-Tailed) |
|---|---|---|---|---|---|---|---|---|
| CSP − PSD | 59.75 | 11.20 | 2.51 | 54.50 | 69.99 | 23.85 | 19 | 0.00 |
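A paired-samples t-test of this form is straightforward to reproduce with SciPy. The sketch below uses synthetic per-subject accuracies as a stand-in for the study's 20 subjects (the generated values are illustrative assumptions, not the reported data):

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject accuracies (%) for CSP vs. PSD features
rng = np.random.default_rng(1)
acc_csp = 99 + rng.normal(0, 0.3, 20)
acc_psd = acc_csp - rng.normal(60, 11, 20)  # illustrative CSP advantage

# Paired t-test on the per-subject differences
t, p = stats.ttest_rel(acc_csp, acc_psd)

# 95% confidence interval of the mean paired difference
diff = acc_csp - acc_psd
ci = stats.t.interval(0.95, df=len(diff) - 1,
                      loc=diff.mean(), scale=stats.sem(diff))
print(f"t({len(diff) - 1}) = {t:.2f}, p = {p:.3g}, "
      f"95% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")
```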
[Table: experimental tasks for subjects S1–S20 — Task 1 Generalized Browsing, Task 2 Visual Search, Task 3 Click Control, Task 4 Table Inquire, Task 5 Complex Dispose; per-subject cell entries not recoverable from the extraction.]
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Miao, X.; Hou, W. Human–Computer Interaction Multi-Task Modeling Based on Implicit Intent EEG Decoding. Appl. Sci. 2024, 14, 368. https://doi.org/10.3390/app14010368