Multimodal and Multidomain Feature Fusion for Emotion Classification Based on Electrocardiogram and Galvanic Skin Response Signals
Abstract
1. Introduction
- Developing an algorithm that utilizes suitable preprocessing, feature extraction, feature selection, and classification techniques to accurately classify emotions using ECG data.
- Developing an algorithm that utilizes suitable preprocessing, feature extraction, feature selection, and classification techniques to accurately classify emotions using GSR data.
- Emotion classification through the early fusion of ECG and GSR features.
2. Related Works
3. Methodology
- Scenario 1: Classifying emotions based on ECG data.
- Scenario 2: Classifying emotions based on GSR data.
- Scenario 3: Classifying emotions based on the fusion of ECG and GSR features.
3.1. Database
3.2. Preprocessing
3.2.1. Scenario 1: ECG Signal Preprocessing
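The cited preprocessing literature on baseline wander removal and QRS detection [74,75] motivates the following minimal Python sketch: a high-pass Butterworth filter to suppress baseline drift and a notch filter for powerline interference. The filter orders, cutoff frequencies, notch frequency, and 256 Hz sampling rate are illustrative assumptions, not parameters taken from this paper.

```python
# Illustrative ECG preprocessing sketch: baseline-wander and powerline-noise removal.
# All filter settings and the sampling rate are assumed values for illustration.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 256  # assumed ECG sampling rate (Hz)

def preprocess_ecg(ecg: np.ndarray, fs: int = FS) -> np.ndarray:
    # High-pass Butterworth filter (~0.5 Hz cutoff) to suppress baseline wander.
    b_hp, a_hp = butter(N=4, Wn=0.5, btype="highpass", fs=fs)
    ecg = filtfilt(b_hp, a_hp, ecg)
    # Notch filter at 50 Hz to attenuate powerline interference.
    b_n, a_n = iirnotch(w0=50.0, Q=30.0, fs=fs)
    return filtfilt(b_n, a_n, ecg)

if __name__ == "__main__":
    t = np.arange(0, 10, 1 / FS)
    noisy = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 0.2 * t)
    print(preprocess_ecg(noisy).shape)
```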
3.2.2. Scenario 2: GSR Signal Preprocessing
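For the GSR channel, a comparable cleaning step can be sketched as simple low-pass smoothing before feature extraction; the 128 Hz sampling rate and 1 Hz cutoff below are assumptions for illustration and do not reproduce the paper's settings.

```python
# Illustrative GSR preprocessing sketch: low-pass filtering to smooth the signal.
# Cutoff frequency and sampling rate are assumed for this example.
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_gsr(gsr: np.ndarray, fs: int = 128, cutoff_hz: float = 1.0) -> np.ndarray:
    # 4th-order low-pass Butterworth filter applied forward and backward (zero phase).
    b, a = butter(N=4, Wn=cutoff_hz, btype="lowpass", fs=fs)
    return filtfilt(b, a, gsr)
```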
3.3. Feature Extraction
3.3.1. Scenario 1: ECG Feature Extraction
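The paper defines its own multidomain ECG feature set; purely as a generic illustration of combining time-domain statistics from an R-peak sequence with a frequency-domain descriptor, consider the hypothetical helper below. The peak-detection thresholds and the four features are assumptions, not the authors' feature list.

```python
# Hypothetical multidomain ECG feature sketch: detect R-peaks, then derive
# simple time-domain HRV statistics and a frequency-domain power estimate.
import numpy as np
from scipy.signal import find_peaks, welch

def ecg_features(ecg: np.ndarray, fs: int = 256) -> np.ndarray:
    # R-peak detection with an assumed minimum beat spacing of 0.4 s.
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), height=np.percentile(ecg, 90))
    rr = np.diff(peaks) / fs                      # RR intervals in seconds
    if rr.size < 2:                               # not enough beats detected
        return np.zeros(4)
    mean_rr, sdnn = np.mean(rr), np.std(rr)       # time-domain descriptors
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))
    # Frequency-domain descriptor: total spectral power via Welch's PSD.
    f, pxx = welch(ecg, fs=fs, nperseg=min(len(ecg), 1024))
    total_power = np.trapz(pxx, f)
    return np.array([mean_rr, sdnn, rmssd, total_power])
```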
3.3.2. Scenario 2: GSR Feature Extraction
3.4. Feature Selection
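The comparison tables in the Discussion list mutual information as the feature-selection criterion used in the present work [77]. A minimal scikit-learn sketch of that idea follows; the feature matrix, labels, and the number of retained features (k = 10) are hypothetical.

```python
# Mutual-information-based feature selection (sketch); k is an assumed value.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 30))        # hypothetical per-trial feature matrix
y = rng.integers(0, 2, size=40)      # hypothetical binary valence labels

selector = SelectKBest(score_func=mutual_info_classif, k=10)
X_selected = selector.fit_transform(X, y)
print(X_selected.shape)              # (40, 10): the 10 highest-MI features kept
```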
3.5. Feature Fusion
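The fusion comparison table later in the paper describes the present approach as early (feature-level) fusion, i.e., the selected ECG and GSR feature vectors are combined before classification. A minimal sketch, assuming both modalities yield per-trial feature matrices with matched rows and illustrative dimensions:

```python
# Early (feature-level) fusion sketch: concatenate per-trial ECG and GSR
# feature vectors column-wise before classification. Shapes are illustrative.
import numpy as np

ecg_feats = np.random.rand(40, 10)   # hypothetical selected ECG features
gsr_feats = np.random.rand(40, 12)   # hypothetical selected GSR features

fused = np.hstack([ecg_feats, gsr_feats])
print(fused.shape)                   # (40, 22): one fused vector per trial
```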
3.6. Classification
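The results report SVM, KNN, random forest, and decision tree classifiers evaluated with 5-fold cross-validation and summarized by accuracy, precision, recall, and F1 score. The sketch below shows that evaluation loop in scikit-learn; the classifiers use library defaults and random placeholder data, not the authors' hyperparameters or features.

```python
# 5-fold cross-validated evaluation of the four classifier families reported in
# the results tables. Hyperparameters are scikit-learn defaults, data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_validate
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 22))        # hypothetical fused feature matrix
y = rng.integers(0, 2, size=40)      # hypothetical binary labels

classifiers = {
    "SVM": SVC(),
    "KNN": KNeighborsClassifier(),
    "RF": RandomForestClassifier(random_state=0),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
}
scoring = ["accuracy", "precision", "recall", "f1"]

for name, clf in classifiers.items():
    scores = cross_validate(clf, X, y, cv=5, scoring=scoring)
    print(name,
          [round(a, 2) for a in scores["test_accuracy"]],   # per-fold accuracy
          round(scores["test_f1"].mean(), 2))               # mean F1 across folds
```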
4. Results
4.1. Scenario 1: Emotion Classification Using ECG Data
4.2. Scenario 2: Emotion Classification Using GSR Data
4.3. Scenario 3: Emotion Classification via the Fusion of ECG and GSR Features
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Egger, M.; Ley, M.; Hanke, S. Emotion recognition from physiological signal analysis: A review. Electron. Notes Theor. Comput. Sci. 2019, 343, 35–55. [Google Scholar] [CrossRef]
- Bulagang, A.F.; Weng, N.G.; Mountstephens, J.; Teo, J. A review of recent approaches for emotion classification using electrocardiography and electrodermography signals. Inform. Med. Unlocked 2020, 20, 100363. [Google Scholar] [CrossRef]
- Sepúlveda, A.; Castillo, F.; Palma, C.; Rodriguez-Fernandez, M. Emotion recognition from ECG signals using wavelet scattering and machine learning. Appl. Sci. 2021, 11, 4945. [Google Scholar] [CrossRef]
- Dessai, A.; Virani, H. Emotion Classification using Physiological Signals: A Recent Survey. In Proceedings of the 2022 IEEE International Conference on Signal Processing, Informatics, Communication and Energy Systems (SPICES), Trivandrum, India, 10–12 March 2022; IEEE: Piscataway, NJ, USA, 2022; Volume 1, pp. 333–338. [Google Scholar]
- Li, K.; Shen, X.; Chen, Z.; He, L.; Liu, Z. Effectiveness of Emotion Eliciting of Video Clips: A Self-report Study. In The International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery; Springer International Publishing: Cham, Switzerland, 2020; pp. 523–542. [Google Scholar]
- Bhangale, K.; Kothandaraman, M. Speech Emotion Recognition Based on Multiple Acoustic Features and Deep Convolutional Neural Network. Electronics 2023, 12, 839. [Google Scholar] [CrossRef]
- Velu, S.R.; Ravi, V.; Tabianan, K. Multi-Lexicon Classification and Valence-Based Sentiment Analysis as Features for Deep Neural Stock Price Prediction. Sci 2023, 5, 8. [Google Scholar] [CrossRef]
- Alonazi, M.; Alshahrani, H.J.; Alotaibi, F.A.; Maray, M.; Alghamdi, M.; Sayed, A. Automated Facial Emotion Recognition Using the Pelican Optimization Algorithm with a Deep Convolutional Neural Network. Electronics 2023, 12, 4608. [Google Scholar] [CrossRef]
- Hasnul, M.A.; Aziz, N.A.A.; Alelyani, S.; Mohana, M.; Aziz, A.A. Electrocardiogram-based emotion recognition systems and their applications in healthcare—A review. Sensors 2021, 21, 5015. [Google Scholar] [CrossRef]
- Tan, C.; Ceballos, G.; Kasabov, N.; Puthanmadam Subramaniyam, N. Fusionsense: Emotion classification using feature fusion of multimodal data and deep learning in a brain-inspired spiking neural network. Sensors 2020, 20, 5328. [Google Scholar] [CrossRef] [PubMed]
- Shahzad, H.F.; Saleem, A.A.; Ahmed, A.; Ur KS, H.; Siddiqui, R. A Review on Physiological Signal Based Emotion Detection. Ann. Emerg. Technol. Comput. 2021, 5. [Google Scholar] [CrossRef]
- Saganowski, S. Bringing emotion recognition out of the lab into real life: Recent advances in sensors and machine learning. Electronics 2022, 11, 496. [Google Scholar] [CrossRef]
- Dessai, A.U.; Virani, H.G. Emotion Detection and Classification Using Machine Learning Techniques. In Multidisciplinary Applications of Deep Learning-Based Artificial Emotional Intelligence; IGI Global: Hershey, PA, USA, 2023; pp. 11–31. [Google Scholar]
- DevTeam, Shimmer. Shimmer Solicits Clinical Research Community Input on Expanded Open Wearables Initiative (OWEAR). Shimmer Wearable Sensor Technology. Available online: https://shimmersensing.com/shimmer-solicits-clinical-research-community-input-on-expanded-open-wearables-initiative-owear/ (accessed on 24 August 2021).
- Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Patras, I. Deap: A database for emotion analysis; using physiological signals. IEEE Trans. Affect. Comput. 2011, 3, 18–31. [Google Scholar] [CrossRef]
- Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A multimodal database for affect recognition and implicit tagging. IEEE Trans. Affect. Comput. 2011, 3, 42–55. [Google Scholar] [CrossRef]
- Subramanian, R.; Wache, J.; Abadi, M.K.; Vieriu, R.L.; Winkler, S.; Sebe, N. ASCERTAIN: Emotion and personality recognition using commercial sensors. IEEE Trans. Affect. Comput. 2016, 9, 147–160. [Google Scholar] [CrossRef]
- Miranda-Correa, J.A.; Abadi, M.K.; Sebe, N.; Patras, I. Amigos: A dataset for affect, personality and mood research on individuals and groups. IEEE Trans. Affect. Comput. 2018, 12, 479–493. [Google Scholar] [CrossRef]
- Dessai, A.; Virani, H. Emotion detection using physiological signals. In Proceedings of the 2021 International Conference on Electrical, Computer and Energy Technologies (ICECET), Cape Town, South Africa, 9–10 December 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–4. [Google Scholar]
- Dar, M.N.; Akram, M.U.; Khawaja, S.G.; Pujari, A.N. CNN and LSTM-based emotion charting using physiological signals. Sensors 2020, 20, 4551. [Google Scholar] [CrossRef] [PubMed]
- Ismail, S.N.M.S.; Aziz, N.A.A.; Ibrahim, S.Z.; Nawawi, S.W.; Alelyani, S.; Mohana, M.; Chun, L.C. Evaluation of electrocardiogram: Numerical vs. image data for emotion recognition system. F1000Research 2021, 10, 1114. [Google Scholar] [CrossRef]
- Romeo, L.; Cavallo, A.; Pepa, L.; Bianchi-Berthouze, N.; Pontil, M. Multiple instance learning for emotion recognition using physiological signals. IEEE Trans. Affect. Comput. 2019, 13, 389–407. [Google Scholar] [CrossRef]
- Bulagang, A.F.; Mountstephens, J.; Teo, J. Multiclass emotion prediction using heart rate and virtual reality stimuli. J. Big Data 2021, 8, 12. [Google Scholar] [CrossRef]
- Katsigiannis, S.; Ramzan, N. DREAMER: A database for emotion recognition through EEG and ECG signals from wireless low-cost off-the-shelf devices. IEEE J. Biomed. Health Inform. 2017, 22, 98–107. [Google Scholar] [CrossRef]
- Shukla, J.; Barreda-Angeles, M.; Oliver, J.; Nandi, G.C.; Puig, D. Feature extraction and selection for emotion recognition from electrodermal activity. IEEE Trans. Affect. Comput. 2019, 12, 857–869. [Google Scholar] [CrossRef]
- Santamaria-Granados, L.; Munoz-Organero, M.; Ramirez-Gonzalez, G.; Abdulhay, E.; Arunkumar NJ, I.A. Using deep convolutional neural network for emotion detection on a physiological signals dataset (AMIGOS). IEEE Access 2018, 7, 57–67. [Google Scholar] [CrossRef]
- Hammad, D.S.; Monkaresi, H. Ecg-based emotion detection via parallel-extraction of temporal and spatial features using convolutional neural network. Trait. Du Signal 2022, 39, 43. [Google Scholar] [CrossRef]
- Lee, M.; Lee, Y.K.; Lim, M.T.; Kang, T.K. Emotion recognition using convolutional neural network with selected statistical photoplethysmogram features. Appl. Sci. 2020, 10, 3501. [Google Scholar] [CrossRef]
- Aslan, M. CNN based efficient approach for emotion recognition. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 7335–7346. [Google Scholar] [CrossRef]
- Han, E.-G.; Kang, T.-K.; Lim, M.-T. Physiological Signal-Based Real-Time Emotion Recognition Based on Exploiting Mutual Information with Physiologically Common Features. Electronics 2023, 12, 2933. [Google Scholar] [CrossRef]
- Lee, M.S.; Lee, Y.K.; Pae, D.S.; Lim, M.T.; Kim, D.W.; Kang, T.K. Fast emotion recognition based on single pulse PPG signal with convolutional neural network. Appl. Sci. 2019, 9, 3355. [Google Scholar] [CrossRef]
- Filippini, C.; Di Crosta, A.; Palumbo, R.; Perpetuini, D.; Cardone, D.; Ceccato, I.; Di Domenico, A.; Merla, A. Automated Affective Computing Based on Bio-Signals Analysis and Deep Learning Approach. Sensors 2022, 22, 1789. [Google Scholar] [CrossRef] [PubMed]
- Dessai, A.; Virani, H. Emotion Classification Based on CWT of ECG and GSR Signals Using Various CNN Models. Electronics 2023, 12, 2795. [Google Scholar] [CrossRef]
- Al Machot, F.; Elmachot, A.; Ali, M.; Al Machot, E.; Kyamakya, K. A deep-learning model for subject-independent human emotion recognition using electrodermal activity sensors. Sensors 2019, 19, 1659. [Google Scholar] [CrossRef] [PubMed]
- Ahmad, Z.; Khan, N. A survey on physiological signal-based emotion recognition. Bioengineering 2022, 9, 688. [Google Scholar] [CrossRef] [PubMed]
- Khateeb, M.; Anwar, S.M.; Alnowami, M. Multi-domain feature fusion for emotion classification using DEAP dataset. IEEE Access 2021, 9, 12134–12142. [Google Scholar] [CrossRef]
- Wei, W.; Jia, Q.; Feng, Y.; Chen, G. Emotion recognition based on weighted fusion strategy of multichannel physiological signals. Comput. Intell. Neurosci. 2018, 2018, 5296523. [Google Scholar] [CrossRef] [PubMed]
- Bota, P.; Wang, C.; Fred, A.; Silva, H. Emotion assessment using feature fusion and decision fusion classification based on physiological data: Are we there yet? Sensors 2020, 20, 4723. [Google Scholar] [CrossRef]
- Kaur, M.; Singh, B.; Seema. Comparisons of Different Approaches for Removal of Baseline Wander from ECG Signal. In Proceedings of the International Conference and Workshop on Emerging Trends in Technology (ICWET), Mumbai, India, 25–26 February 2011; Volume 5, pp. 30–34. [Google Scholar]
- Friesen, G.M.; Jannett, T.C.; Jadallah, M.A.; Yates, S.L.; Quint, S.R.; Nagle, H.T. A comparison of the noise sensitivity of nine QRS detection algorithms. IEEE Trans. Biomed. Eng. 1990, 37, 85–98. [Google Scholar] [CrossRef]
- Galvanic Skin Response (GSR): The Complete Pocket Guide—Imotions. 2020. Available online: https://imotions.com/blog/learning/research-fundamentals/galvanic-skin-response/ (accessed on 25 February 2020).
- Available online: https://guhanesvar.medium.com/feature-selection-based-on-mutual-information-gain-for-classification-and-regression (accessed on 20 November 2023).
- Gupta, P. Cross-Validation in Machine Learning. Towards Data Science, 2017. Available online: https://towardsdatascience.com/cross-validation-in-machine-learning-72924a69872f (accessed on 20 November 2023).
- Available online: https://www.ibm.com/topics/knn (accessed on 20 November 2023).
- Available online: https://towardsdatascience.com/a-complete-view-of-decision-trees-and-svm-in-machine-learning-f9f3d19a337b (accessed on 20 November 2023).
- Available online: https://builtin.com/data-science/random-forest-algorithm (accessed on 20 November 2023).
Sr. No. | ECG Valence Classifier | 5-Fold Accuracy | ECG Valence Accuracy (%) | Precision | Recall | F1 Score |
---|---|---|---|---|---|---|
1 | SVM | [0.60, 0.60, 0.67, 0.46, 0.63] | 60 | 0.56 | 0.89 | 0.68 |
2 | KNN | [0.64, 0.71, 0.68, 0.75, 0.66] | 69 | 0.69 | 0.68 | 0.68 |
3 | RF | [0.57, 0.53, 0.78, 0.64, 0.63] | 63 | 0.65 | 0.59 | 0.62 |
4 | DECISION TREE | [0.57, 0.53, 0.86, 0.60, 0.63] | 64 | 0.64 | 0.62 | 0.63 |
Sr. No. | ECG Arousal Classifier | 5-Fold Accuracy | ECG Arousal Accuracy (%) | Precision | Recall | F1 Score |
---|---|---|---|---|---|---|
1 | SVM | [0.78, 0.53, 0.46, 0.64, 0.66] | 62 | 0.66 | 0.54 | 0.59 |
2 | KNN | [0.78, 0.64, 0.71, 0.71, 0.63] | 70 | 0.70 | 0.74 | 0.72 |
3 | RF | [0.78, 0.71, 0.68, 0.68, 0.74] | 72 | 0.71 | 0.77 | 0.74 |
4 | DECISION TREE | [0.71, 0.75, 0.71, 0.68, 0.70] | 71 | 0.68 | 0.80 | 0.73 |
Sr. No. | GSR Valence Classifier | 5-Fold Accuracy | GSR Valence Accuracy (%) | Precision | Recall | F1 Score |
---|---|---|---|---|---|---|
1 | SVM | [1.0, 0.96, 0.96, 0.89, 1.0] | 96 | 0.94 | 0.99 | 0.96 |
2 | KNN | [1.0, 0.96, 0.96, 0.89, 1.0] | 96 | 0.94 | 0.99 | 0.96 |
3 | RF | [0.98, 0.96, 0.96, 0.89, 0.98] | 95 | 0.93 | 0.98 | 0.95 |
4 | DECISION TREE | [0.98, 0.96, 0.96, 0.89, 0.98] | 95 | 0.93 | 0.98 | 0.95 |
Sr. No. | GSR Arousal Classifier | 5-Fold Accuracy | GSR Arousal Accuracy (%) | Precision | Recall | F1 Score |
---|---|---|---|---|---|---|
1 | SVM | [0.89, 0.93, 0.96, 0.928, 1.0] | 94 | 0.92 | 0.97 | 0.94 |
2 | KNN | [0.92, 0.93, 0.94, 0.96, 0.96] | 94 | 0.92 | 0.96 | 0.94 |
3 | RF | [0.89, 0.85, 0.96, 0.93, 0.96] | 92 | 0.92 | 0.93 | 0.92 |
4 | DECISION TREE | [0.89, 0.85, 0.96, 0.93, 0.96] | 92 | 0.92 | 0.93 | 0.92 |
Sr. No. | Classifier | 5-Fold Accuracy | Fusion Valence Accuracy (%) | Precision | Recall | F1 Score |
---|---|---|---|---|---|---|
1 | SVM | [1.0, 0.96, 0.96, 0.89, 1.0] | 96 | 0.94 | 0.99 | 0.96 |
2 | KNN | [1.0, 0.96, 0.96, 0.89, 1.0] | 96 | 0.94 | 0.99 | 0.96 |
3 | RF | [0.98, 0.96, 0.96, 0.89, 0.98] | 95 | 0.93 | 0.98 | 0.95 |
4 | DECISION TREE | [0.98, 0.96, 0.96, 0.89, 0.98] | 95 | 0.93 | 0.98 | 0.95 |
Sr. No. | Classifier | 5-Fold Accuracy | Fusion Arousal Accuracy (%) | Precision | Recall | F1 Score |
---|---|---|---|---|---|---|
1 | SVM | [0.89, 0.93, 0.96, 0.93, 1.0] | 94 | 0.93 | 0.96 | 0.94 |
2 | KNN | [0.93, 0.93, 0.93, 0.93, 0.96] | 94 | 0.92 | 0.96 | 0.94 |
3 | RF | [0.88, 0.88, 0.96, 0.93, 1.0] | 94 | 0.94 | 0.93 | 0.93 |
4 | DECISION TREE | [0.89, 0.93, 1.0, 0.93, 1.0] | 95 | 0.96 | 0.93 | 0.94 |
Sr. No. | Classifier | ECG Valence (%) | ECG Arousal (%) | GSR Valence (%) | GSR Arousal (%) | Fusion Valence (%) | Fusion Arousal (%) |
---|---|---|---|---|---|---|---|
1 | SVM | 60 | 62 | 96 | 94 | 96 | 94 |
2 | KNN | 69 | 70 | 96 | 94 | 96 | 94 |
3 | RF | 63 | 72 | 95 | 92 | 95 | 94 |
4 | DECISION TREE | 64 | 71 | 95 | 92 | 95 | 95 |
Sr. No. | Reference No. | Database | Feature Selection | Cross Validation Technique | Classifier | Accuracy
---|---|---|---|---|---|---
1 | Present work | AMIGOS | Mutual information | K-fold | KNN | Valence: 69% Arousal: 70%
2 | [3] | AMIGOS | _ | K-fold | Decision Tree | Valence: 59.2% Arousal: 60.6%
3 | [18] | AMIGOS | Fisher’s linear discriminant | Leave one participant out | Linear SVM | Valence: 57.6% Arousal: 59.2%
Sr. No. | Reference | Database | Feature Selection | Classifier | Accuracy |
---|---|---|---|---|---|
1 | Present work | AMIGOS | Mutual information | KNN | Valence: 96% Arousal: 94% |
2 | [18] | AMIGOS | Fisher’s linear discriminant | Linear SVM | Valence: 53.1% Arousal: 54.8% |
3 | [25] | AMIGOS | Mutual information | Non-linear SVM | Valence: 83.9% Arousal: 85.71% |
Sr. No. | Reference | Database | Feature Fusion Technique | Feature Selection | Classifier | Accuracy |
---|---|---|---|---|---|---|
1 | Present work | AMIGOS | Early fusion | Mutual information | KNN | Valence: 96% Arousal: 94% |
2 | [18] | AMIGOS | Decision-level fusion | Fisher’s linear discriminant | Linear SVM | Valence: 57% Arousal: 58.5% |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).