Emotion Analysis AI Model for Sensing Architecture Using EEG
Abstract
1. Introduction
1.1. Background and Goals of Study
1.2. Methods and Scope of Study
1.2.1. Dataset Overview and Preprocessing
1.2.2. Development of the AI Model
1.2.3. Model Training and Optimisation
1.2.4. Application of the AI Model in Architectural Environments
2. Literature Review
2.1. EEG-Based Emotion Recognition
2.2. AI-Driven Emotion Recognition Models and Fine-Tuning Approaches
2.3. Emotion Recognition in Architectural Spaces
2.4. Research Gaps and Contributions of This Study
3. Emotion Analysis AI Model Using EEG
3.1. Selection and Use of Dataset for Fine-Tuning Training
3.2. Fine-Tuned Model Construction
3.3. Fine-Tuned Model Development and Training
3.3.1. Data Pre-Processing
3.3.2. Fine-Tuned Model Training
4. Application and Usage of Brainwave-Based Fine-Tuned Model in Architectural Spaces
4.1. Fine-Tuned Model Validation and Performance Evaluation
4.1.1. Validation Dataset and JSONL File Loading
```python
import json

def load_jsonl(file_path):
    """Loads and parses a JSONL file containing EEG validation data."""
    data = []
    with open(file_path, 'r', encoding='utf-8') as f:
        for line in f:
            data.append(json.loads(line))
    return data
```
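For illustration, the loader can be exercised as follows; the file name `validation.jsonl` is a placeholder, since the actual path is not stated in this section:

```python
# "validation.jsonl" is a placeholder path for illustration.
records = load_jsonl("validation.jsonl")
print(f"Loaded {len(records)} validation samples")
```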
4.1.2. Generating Emotion Predictions Using OpenAI API
```python
import openai

MODEL_NAME = "ft:gpt-4o-mini-2024-07-18:personal:eeg-emotion-ver7:AyoeRwUt"
client = openai.OpenAI()

def get_prediction(eeg_data):
    """Queries the fine-tuned OpenAI model to classify EEG-based emotion."""
    response = client.chat.completions.create(
        model=MODEL_NAME,
        messages=[
            {"role": "system", "content": "You are an EEG emotion classification model."},
            {"role": "user", "content": f"EEG data: {eeg_data}\nClassify the emotion as -1 negative, 0 neutral, 1 positive."},
        ],
        timeout=30,
    )
    return response.choices[0].message.content.strip().lower()
```
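As a quick sanity check, a single sample can be classified as follows; the serialized EEG feature string is purely illustrative, since the exact input format is defined by the fine-tuning data rather than shown here:

```python
# Illustrative EEG feature string; the real serialization used for
# fine-tuning is defined by the training JSONL, not shown here.
sample_eeg = "delta:0.42, theta:0.31, alpha:0.18, beta:0.06, gamma:0.03"
print(get_prediction(sample_eeg))  # expected: "-1", "0", or "1"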
4.1.3. Comparing Predictions Against Ground-Truth Labels
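A minimal sketch of this step, assuming each JSONL record stores its EEG features under an `eeg` key and its ground-truth class under a `label` key (both field names are assumptions, as the validation schema is not reproduced here):

```python
def compare_predictions(records, eeg_key="eeg", label_key="label"):
    """Collects ground-truth labels and model predictions, pairwise.

    The `eeg_key` and `label_key` field names are illustrative; adjust
    them to match the actual validation JSONL schema.
    """
    y_true, y_pred = [], []
    for record in records:
        y_true.append(str(record[label_key]).strip().lower())
        y_pred.append(get_prediction(record[eeg_key]))
    return y_true, y_pred
```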
4.1.4. Performance Metrics Computation and Results
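The metrics reported below (per-class precision, recall, and F1, plus overall accuracy and Cohen's kappa) can be computed with scikit-learn; a sketch under the assumption that labels are the strings "-1", "0", and "1":

```python
from sklearn.metrics import (accuracy_score, cohen_kappa_score,
                             precision_recall_fscore_support)

def evaluate(y_true, y_pred, labels=("-1", "0", "1")):
    """Returns per-class precision/recall/F1 plus accuracy and Cohen's kappa."""
    precision, recall, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, labels=list(labels), zero_division=0
    )
    return {
        "per_class": {
            lab: {"precision": p, "recall": r, "f1": f}
            for lab, p, r, f in zip(labels, precision, recall, f1)
        },
        "accuracy": accuracy_score(y_true, y_pred),
        "cohen_kappa": cohen_kappa_score(y_true, y_pred),
    }
```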
4.1.5. Discussion and Implications
4.2. Application of Fine-Tuned Model to Architectural Space: 360° VR Experience
4.2.1. Experimental Setup and Equipment
4.2.2. Application of Fine-Tuned Model Version 8
4.2.3. Exploratory Analysis of Results
4.2.4. Implications and Future Directions
4.2.5. Conclusions
5. Conclusions and Future Research
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Emotion | Number of Video Clips | Total Video Length (s) | Number of Data Rows (200 Hz) |
---|---|---|---|
Sadness | 4 | 927 | 185,400 |
Happiness | 5 | 1186 | 237,200 |
Neutral | 5 | 1107 | 221,400 |
Total | 14 | 3220 | 644,000 |
Name of the Clip | Label | Start Time | End Time |
---|---|---|---|
Lost in Thailand | 2 (happy) | 0:06:13 | 0:10:11 |
World Heritage in China | 1 (neutral) | 0:00:50 | 0:04:36 |
Aftershock | 0 (sad) | 0:20:10 | 0:23:35 |
Back to 1942 | 0 (sad) | 0:49:58 | 0:54:00 |
World Heritage in China | 1 (neutral) | 0:10:40 | 0:13:44 |
Lost in Thailand | 2 (happy) | 1:05:10 | 1:08:29 |
Back to 1942 | 0 (sad) | 2:01:21 | 2:05:21 |
World Heritage in China | 1 (neutral) | 0:02:55 | 0:06:35 |
Flirting Scholar | 2 (happy) | 1:18:57 | 1:23:23 |
Just Another Pandora’s Box | 2 (happy) | 0:11:32 | 0:15:33 |
World Heritage in China | 1 (neutral) | 0:10:41 | 0:14:38 |
Back to 1942 | 0 (sad) | 2:16:37 | 2:20:37 |
World Heritage in China | 1 (neutral) | 0:05:36 | 0:09:36 |
Just Another Pandora’s Box | 2 (happy) | 0:35:00 | 0:39:02 |
Category | Count | Percentage |
---|---|---|
Sad (−1 negative) | 88,802 | 33.3% |
Neutral (0 neutral) | 88,802 | 33.3% |
Happy (1 positive) | 88,802 | 33.3% |
| Version | Model | Emotion | Precision | Recall | F1 Score | Accuracy | Cohen’s Kappa |
|---|---|---|---|---|---|---|---|
| Ver. 1 | ft:gpt-4o-2024-08-06:personal:eeg-emotion:A5phbEb | Sad (−1 negative) | 0.5 | 0.04 | 0.07 | 0.34 | 0.1 |
| | | Neutral (0 neutral) | 0.38 | 0.13 | 0.2 | | |
| | | Positive (1 positive) | 0.33 | 0.86 | 0.47 | | |
| | | Overall accuracy | - | - | - | 0.34 | |
| Ver. 2 | ft:gpt-4o-2024-08-06:personal:eeg-emotion-ver2:A5uctXW | Sad (−1 negative) | 1.0 | 0.01 | 0.02 | 0.33 | 0.1 |
| | | Neutral (0 neutral) | 0 | 0 | 0 | | |
| | | Positive (1 positive) | 0.32 | 0.99 | 0.49 | | |
| | | Overall accuracy | - | - | - | 0.33 | |
| Ver. 3 | ft:gpt-4o-2024-08-06:personal:eeg-emotion-ver3:AArwL8j | Sad (−1 negative) | 0.34 | 0.37 | 0.36 | 0.34 | 0.1 |
| | | Neutral (0 neutral) | 0.37 | 0.33 | 0.32 | | |
| | | Positive (1 positive) | 0.31 | 0.32 | 0.32 | | |
| | | Overall accuracy | - | - | - | 0.34 | |
| Ver. 4 | ft:gpt-4o-mini-2024-07-18:personal:eeg-emotion-ver4:AB1mUNZ | Sad (−1 negative) | 0.32 | 0.63 | 0.43 | 0.32 | 0.1 |
| | | Neutral (0 neutral) | 0.31 | 0.30 | 0.31 | | |
| | | Positive (1 positive) | 0.33 | 0.02 | 0.04 | | |
| | | Overall accuracy | - | - | - | 0.32 | |
| Ver. 5 | ft:gpt-4o-mini-2024-07-18:personal:eeg-emotion-ver5:AB42P4s | Sad (−1 negative) | 0.39 | 0.17 | 0.23 | 0.38 | 0.07 |
| | | Neutral (0 neutral) | 0.40 | 0.53 | 0.46 | | |
| | | Positive (1 positive) | 0.35 | 0.43 | 0.39 | | |
| | | Overall accuracy | - | - | - | 0.38 | |
| Ver. 6 | ft:gpt-4o-mini-2024-07-18:personal:eeg-emotion-ver6:AB5S5vY | Sad (−1 negative) | 0.32 | 0.52 | 0.40 | 0.31 | 0.1 |
| | | Neutral (0 neutral) | 0.30 | 0.36 | 0.32 | | |
| | | Positive (1 positive) | 0.31 | 0.05 | 0.08 | | |
| | | Overall accuracy | - | - | - | 0.31 | |
| Ver. 7 | ft:gpt-4o-mini-2024-07-18:personal:eeg-emotion-ver7:AyoeRwU | Sad (−1 negative) | 0.35 | 0.44 | 0.39 | 0.32 | 0.1 |
| | | Neutral (0 neutral) | 0.31 | 0.39 | 0.35 | | |
| | | Positive (1 positive) | 0.28 | 0.13 | 0.18 | | |
| | | Overall accuracy | - | - | - | 0.32 | |
| Ver. 8 | ft:gpt-4o-mini-2024-07-18:personal:eeg-emotion-ver8:Az2SEgt | Sad (−1 negative) | 0.6 | 0.52 | 0.56 | 0.603 | 0.46 |
| | | Neutral (0 neutral) | 0.55 | 0.47 | 0.51 | | |
| | | Positive (1 positive) | 0.62 | 0.81 | 0.70 | | |
| | | Overall accuracy | - | - | - | 0.603 | |
| User | Connectivity Diagram (DDP, A) | Emotion | Fine-Tuning (%) | Connectivity Diagram (Long Island, B) | Emotion | Fine-Tuning (%) |
|---|---|---|---|---|---|---|
| #1 | Frontal and parietal lobe activity | Happy | 57.42 | Frontal and temporal lobe activity | Happy | 40.23 |
| | | Neutral | 32.18 | | Neutral | 39.34 |
| | | Sad | 10.40 | | Sad | 20.43 |
| #2 | Occipital and temporal lobe activity | Happy | 35.13 | Occipital lobe activity | Happy | 30.48 |
| | | Neutral | 50.02 | | Neutral | 55.23 |
| | | Sad | 14.85 | | Sad | 14.29 |
| #3 | Parietal lobe activity | Happy | 48.96 | Frontal and parietal lobe activity | Happy | 44.60 |
| | | Neutral | 33.14 | | Neutral | 35.10 |
| | | Sad | 17.90 | | Sad | 20.30 |
| #4 | Temporal and occipital lobe activity | Happy | 42.33 | Frontal, temporal, and occipital lobe activity | Happy | 34.78 |
| | | Neutral | 41.55 | | Neutral | 36.45 |
| | | Sad | 16.12 | | Sad | 28.77 |