Emotion Assessment Using Feature Fusion and Decision Fusion Classification Based on Physiological Data: Are We There Yet?
Abstract
1. Introduction
2. State of the Art
3. Methods
3.1. Feature Fusion
3.2. Decision Fusion
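Sections 3.1 and 3.2 contrast combining information at the feature level (concatenating per-modality feature vectors into a single input) with combining it at the decision level (training one classifier per modality and fusing their outputs). As a minimal sketch of this contrast, the snippet below uses synthetic feature matrices, a Random Forest base classifier, and an unweighted majority vote; all of these choices are illustrative assumptions, not the exact pipeline evaluated in this paper.

```python
# Minimal sketch: feature fusion vs. decision fusion on synthetic data.
# The feature matrices, Random Forest classifier, and majority-vote rule
# are illustrative assumptions, not the paper's exact pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 300
# One synthetic feature matrix per modality (e.g., EDA, ECG, RESP).
X_mods = [rng.normal(size=(n, 10)) for _ in range(3)]
y = rng.integers(0, 2, size=n)  # binary low (0) / high (1) labels

idx_tr, idx_te = train_test_split(np.arange(n), test_size=0.3, random_state=0)

# Feature fusion: concatenate all modality features, train one classifier.
X_ff = np.hstack(X_mods)
ff = RandomForestClassifier(random_state=0).fit(X_ff[idx_tr], y[idx_tr])
print("feature fusion acc:", accuracy_score(y[idx_te], ff.predict(X_ff[idx_te])))

# Decision fusion: one classifier per modality; fuse labels by majority vote.
models = [RandomForestClassifier(random_state=0).fit(X[idx_tr], y[idx_tr])
          for X in X_mods]
votes = np.stack([m.predict(X[idx_te]) for m, X in zip(models, X_mods)])
y_df = (votes.mean(axis=0) > 0.5).astype(int)  # majority of the modality votes
print("decision fusion acc:", accuracy_score(y[idx_te], y_df))
```

In practice, the vote can also be weighted by per-modality confidence or validation performance; the unweighted majority shown here is the simplest variant.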
3.3. Classifier
4. Experimental Results
4.1. Datasets
- IT Multimodal Dataset for Emotion Recognition (ITMDER) [7]: contains the physiological signals of interest to our work (EDA, RESP, ECG, and BVP) from 18 individuals, acquired with two devices based on the BITalino system [50,51] (one placed on the arm and the other on the chest), while the subjects watched seven VR videos designed to elicit Boredom, Joyfulness, Panic/Fear, Interest, Anger, Sadness, and Relaxation. The ground-truth annotations were obtained from each subject's self-report per video using the Self-Assessment Manikin (SAM), in the Valence-Arousal space. For more information regarding the dataset, we refer the reader to [7].
- Multimodal Dataset for Wearable Stress and Affect Detection (WESAD) [6]: contains EDA, ECG, BVP, and RESP data collected from 15 participants using a chest-worn and a wrist-worn device: a RespiBAN Professional (biosignalsplux.com/index.php/respiban-professional) and an Empatica E4 (empatica.com/en-eu/research/e4), under four conditions: Baseline (reading neutral magazines); Amusement (funny video clips); Stress (the Trier Social Stress Test (TSST), consisting of public speaking and a mental arithmetic task); and Meditation. The annotations were obtained using four self-reports: the Positive and Negative Affect Schedule (PANAS); SAM in the Valence-Arousal space; the State-Trait Anxiety Inventory (STAI); and the Short Stress State Questionnaire (SSSQ). For more information regarding the dataset, we refer the reader to [6].
- Dataset for Emotion Analysis using Physiological Signals (DEAP) [8]: contains EEG and peripheral (EDA, BVP, and RESP) physiological data from 32 participants, recorded as each watched 40 one-minute-long excerpts of music videos. The participants rated each video in terms of Arousal, Valence, like/dislike, dominance, and familiarity. For more information regarding the dataset, we refer the reader to [8].
- Multimodal Dataset for Affect Recognition and Implicit Tagging (MAHNOB-HCI) [52]: contains face videos, audio signals, eye-gaze data, and peripheral physiological data (EDA, ECG, RESP) from 27 participants watching 20 emotional videos, self-reported in terms of Arousal, Valence, dominance, predictability, and additional emotional keywords. For more information regarding the dataset, we refer the reader to [52].
- Eight-Emotion Sentics Data (EESD) [9]: contains physiological data (EMG, BVP, EDA, and RESP) from a single actress during deliberate emotional expressions of Neutral, Anger, Hate, Grief, Platonic Love, Romantic Love, Joy, and Reverence. For more information regarding the dataset, we refer the reader to [9].
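All five datasets provide, or are mapped to, binary low/high labels in the Valence-Arousal space (see the dataset overview table below, where 0 denotes low and 1 denotes high). A minimal sketch of such a binarization, assuming a 1–9 SAM scale split at its midpoint (the exact rating scales and thresholds vary per dataset):

```python
# Hedged sketch: mapping SAM self-report ratings to the binary low/high
# Arousal/Valence classes used throughout this work. The 1-9 scale and the
# midpoint threshold are assumptions; each dataset's exact scale may differ.
import numpy as np

def binarize_sam(ratings, threshold=5.0):
    """Return 0 (low) / 1 (high) labels for SAM ratings above `threshold`."""
    return (np.asarray(ratings, dtype=float) > threshold).astype(int)

print(binarize_sam([2, 5, 7, 9]))  # -> [0 0 1 1]
```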
4.2. Signal Pre-Processing
- Electrocardiography (ECG): Finite impulse response (FIR) band-pass filter of order 300 and 3–45 Hz cut-off frequency.
- Electrodermal Activity (EDA): Butterworth low-pass filter of order 4 and 1 Hz cut-off frequency.
- Respiration (RESP): Butterworth band-pass filter of order 2 and 0.1–0.35 Hz cut-off frequency.
- Blood Volume Pulse (BVP): Butterworth band-pass filter of order 4 and 1–8 Hz cut-off frequency.
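A minimal sketch of these four filters using scipy.signal follows; the zero-phase application via filtfilt is an assumption, since only the filter types, orders, and cut-off frequencies are specified above.

```python
# Sketch of the pre-processing filters listed above, via scipy.signal.
# Zero-phase filtering (filtfilt) is an assumption; the paper specifies
# only filter type, order, and cut-offs. `fs` is the sampling rate (Hz)
# of the corresponding recording.
from scipy import signal

def filter_ecg(x, fs):
    # FIR band-pass, order 300 (301 taps), 3-45 Hz pass-band.
    b = signal.firwin(301, [3.0, 45.0], pass_zero=False, fs=fs)
    return signal.filtfilt(b, [1.0], x)

def filter_eda(x, fs):
    # Butterworth low-pass, order 4, 1 Hz cut-off.
    b, a = signal.butter(4, 1.0, btype="lowpass", fs=fs)
    return signal.filtfilt(b, a, x)

def filter_resp(x, fs):
    # Butterworth band-pass, order 2, 0.1-0.35 Hz pass-band.
    b, a = signal.butter(2, [0.1, 0.35], btype="bandpass", fs=fs)
    return signal.filtfilt(b, a, x)

def filter_bvp(x, fs):
    # Butterworth band-pass, order 4, 1-8 Hz pass-band.
    b, a = signal.butter(4, [1.0, 8.0], btype="bandpass", fs=fs)
    return signal.filtfilt(b, a, x)
```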
4.3. Supervised Learning Using Single Modality Classifiers
4.4. Decision Fusion vs. Feature Fusion
5. Conclusions and Future Work
Author Contributions
Funding
Conflicts of Interest
Appendix A
References
1. Greenberg, L.S.; Safran, J. Emotion, Cognition, and Action. In Theoretical Foundations of Behavior Therapy; Springer: Boston, MA, USA, 1987; pp. 295–311.
2. Shu, L.; Xie, J.; Yang, M.; Li, Z.; Li, Z.; Liao, D.; Xu, X.; Yang, X. A Review of Emotion Recognition Using Physiological Signals. Sensors 2018, 18, 2074.
3. Ekman, P. An argument for basic emotions. Cogn. Emot. 1992, 6, 169–200.
4. Damasio, A.R. Descartes’ Error: Emotion, Reason, and the Human Brain; G.P. Putnam: New York, NY, USA, 1994.
5. Lang, P.J. The emotion probe: Studies of motivation and attention. Am. Psychol. 1995, 50, 372–385.
6. Schmidt, P.; Reiss, A.; Duerichen, R.; Marberger, C.; Van Laerhoven, K. Introducing WESAD, a Multimodal Dataset for Wearable Stress and Affect Detection. In Proceedings of the International Conference on Multimodal Interaction, Boulder, CO, USA, 16–20 October 2018; pp. 400–408.
7. Pinto, J. Exploring Physiological Multimodality for Emotional Assessment. Master’s Thesis, Instituto Superior Técnico, Lisboa, Portugal, 2019.
8. Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A Database for Emotion Analysis using Physiological Signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31.
9. Picard, R.W.; Vyzas, E.; Healey, J. Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Trans. Pattern Anal. Mach. Intell. 2001, 23, 1175–1191.
10. Schmidt, P.; Reiss, A.; Duerichen, R.; Van Laerhoven, K. Wearable affect and stress recognition: A review. arXiv 2018, arXiv:1811.08854.
11. Bota, P.J.; Wang, C.; Fred, A.L.N.; Plácido da Silva, H. A Review, Current Challenges, and Future Possibilities on Emotion Recognition Using Machine Learning and Physiological Signals. IEEE Access 2019, 7, 140990–141020.
12. Liu, C.; Rani, P.; Sarkar, N. An empirical study of machine learning techniques for affect recognition in human–robot interaction. In Proceedings of the International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, 2–6 August 2005; pp. 2662–2667.
13. Kim, S.M.; Valitutti, A.; Calvo, R.A. Evaluation of Unsupervised Emotion Models to Textual Affect Recognition. In Proceedings of the NAACL HLT Workshop on Computational Approaches to Analysis and Generation of Emotion in Text, Los Angeles, CA, USA, 5 June 2010; pp. 62–70.
14. Zhang, Z.; Han, J.; Deng, J.; Xu, X.; Ringeval, F.; Schuller, B. Leveraging Unlabeled Data for Emotion Recognition with Enhanced Collaborative Semi-Supervised Learning. IEEE Access 2018, 6, 22196–22209.
15. Alhagry, S.; Fahmy, A.A.; El-Khoribi, R.A. Emotion Recognition based on EEG using LSTM Recurrent Neural Network. Int. J. Adv. Comput. Sci. Appl. 2017, 8.
16. Zhang, J.; Chen, M.; Hu, S.; Cao, Y.; Kozma, R. PNN for EEG-based Emotion Recognition. In Proceedings of the International Conference on Systems, Man, and Cybernetics, Budapest, Hungary, 9–12 October 2016; pp. 2319–2323.
17. Salari, S.; Ansarian, A.; Atrianfar, H. Robust emotion classification using neural network models. In Proceedings of the Iranian Joint Congress on Fuzzy and Intelligent Systems, Kerman, Iran, 28 February–2 March 2018; pp. 190–194.
18. Vanny, M.; Park, S.M.; Ko, K.E.; Sim, K.B. Analysis of Physiological Signals for Emotion Recognition Based on Support Vector Machine. In Robot Intelligence Technology and Applications 2012; Kim, J.H., Matson, E.T., Myung, H., Xu, P., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; pp. 115–125.
19. Cheng, B. Emotion Recognition from Physiological Signals Using Support Vector Machine; Springer: Berlin/Heidelberg, Germany, 2012; Volume 114, pp. 49–52.
20. He, C.; Yao, Y.J.; Ye, X.S. An Emotion Recognition System Based on Physiological Signals Obtained by Wearable Sensors; Springer: Singapore, 2017; pp. 15–25.
21. Meftah, I.T.; Le Thanh, N.; Ben Amar, C. Emotion Recognition Using KNN Classification for User Modeling and Sharing of Affect States. In Proceedings of Neural Information Processing, Doha, Qatar, 12–15 November 2012; Huang, T., Zeng, Z., Li, C., Leung, C.S., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; pp. 234–242.
22. Li, M.; Xu, H.; Liu, X.; Lu, S. Emotion recognition from multichannel EEG signals using K-nearest neighbor classification. Technol. Health Care 2018, 26, 509–519.
23. Kolodyazhniy, V.; Kreibig, S.D.; Gross, J.J.; Roth, W.T.; Wilhelm, F.H. An affective computing approach to physiological emotion specificity: Toward subject-independent and stimulus-independent classification of film-induced emotions. Psychophysiology 2011, 48, 908–922.
24. Zhang, X.; Xu, C.; Xue, W.; Hu, J.; He, Y.; Gao, M. Emotion Recognition Based on Multichannel Physiological Signals with Comprehensive Nonlinear Processing. Sensors 2018, 18, 3886.
25. Gong, P.; Ma, H.T.; Wang, Y. Emotion recognition based on the multiple physiological signals. In Proceedings of the International Conference on Real-time Computing and Robotics, Angkor Wat, Cambodia, 6–9 June 2016; pp. 140–143.
26. Ayata, D.; Yaslan, Y.; Kamasak, M.E. Emotion Recognition from Multimodal Physiological Signals for Emotion Aware Healthcare Systems. J. Med. Biol. Eng. 2020, 40, 149–157.
27. Chen, J.; Hu, B.; Wang, Y.; Moore, P.; Dai, Y.; Feng, L.; Ding, Z. Subject-independent emotion recognition based on physiological signals: A three-stage decision method. BMC Med. Inform. Decis. Mak. 2017, 17, 167.
28. Zhuang, N.; Zeng, Y.; Tong, L.; Zhang, C.; Zhang, H.; Yan, B. Emotion Recognition from EEG Signals Using Multidimensional Information in EMD Domain. BioMed Res. Int. 2017, 2017, 8317357.
29. Lahane, P.; Sangaiah, A.K. An Approach to EEG Based Emotion Recognition and Classification Using Kernel Density Estimation. Procedia Comput. Sci. 2015, 48, 574–581.
30. Qing, C.; Qiao, R.; Xu, X.; Cheng, Y. Interpretable Emotion Recognition Using EEG Signals. IEEE Access 2019, 7, 94160–94170.
31. Xianhai, G. Study of Emotion Recognition Based on Electrocardiogram and RBF neural network. Procedia Eng. 2011, 15, 2408–2412.
32. Xiefeng, C.; Wang, Y.; Dai, S.; Zhao, P.; Liu, Q. Heart sound signals can be used for emotion recognition. Sci. Rep. 2019, 9, 6486.
33. Dissanayake, T.; Rajapaksha, Y.; Ragel, R.; Nawinne, I. An Ensemble Learning Approach for Electrocardiogram Sensor Based Human Emotion Recognition. Sensors 2019, 19, 4495.
34. Shukla, J.; Barreda-Angeles, M.; Oliver, J.; Nandi, G.C.; Puig, D. Feature Extraction and Selection for Emotion Recognition from Electrodermal Activity. IEEE Trans. Affect. Comput. 2019.
35. Udovičić, G.; Ðerek, J.; Russo, M.; Sikora, M. Wearable Emotion Recognition System Based on GSR and PPG Signals. In Proceedings of the 2nd International Workshop on Multimedia for Personal Health and Health Care, Mountain View, CA, USA, 23–27 October 2017; pp. 53–59.
36. Liu, M.; Fan, D.; Zhang, X.; Gong, X. Human Emotion Recognition Based on Galvanic Skin Response Signal Feature Selection and SVM. In Proceedings of the 2016 International Conference on Smart City and Systems Engineering, Hunan, China, 25–26 November 2016; pp. 157–160.
37. Wei, W.; Jia, Q.; Yongli, F.; Chen, G. Emotion Recognition Based on Weighted Fusion Strategy of Multichannel Physiological Signals. Comput. Intell. Neurosci. 2018, 2018, 1–9.
38. Chen, J.; Hu, B.; Xu, L.; Moore, P.; Su, Y. Feature-level fusion of multimodal physiological signals for emotion recognition. In Proceedings of the International Conference on Bioinformatics and Biomedicine (BIBM), Washington, DC, USA, 9–12 November 2015; pp. 395–399.
39. Canento, F.; Fred, A.; Silva, H.; Gamboa, H.; Lourenço, A. Multimodal biosignal sensor data handling for emotion recognition. In Proceedings of the 2011 IEEE Sensors Conference, Limerick, Ireland, 28–31 October 2011; pp. 647–650.
40. Xie, J.; Xu, X.; Shu, L. WT Feature Based Emotion Recognition from Multi-channel Physiological Signals with Decision Fusion. In Proceedings of the Asian Conference on Affective Computing and Intelligent Interaction, Beijing, China, 20–22 May 2018; pp. 1–6.
41. Subramanian, R.; Wache, J.; Abadi, M.K.; Vieriu, R.L.; Winkler, S.; Sebe, N. ASCERTAIN: Emotion and Personality Recognition Using Commercial Sensors. IEEE Trans. Affect. Comput. 2018, 9, 147–160.
42. Aguileta, A.A.; Brena, R.F.; Mayora, O.; Molino-Minero-Re, E.; Trejo, L.A. Multi-Sensor Fusion for Activity Recognition—A Survey. Sensors 2019, 19, 3808.
43. Egger, M.; Ley, M.; Hanke, S. Emotion Recognition from Physiological Signal Analysis: A Review. Electron. Notes Theor. Comput. Sci. 2019, 343, 35–55.
44. Doma, V.; Pirouz, M. A comparative analysis of machine learning methods for emotion recognition using EEG and peripheral physiological signals. J. Big Data 2020, 7, 18.
45. Dzedzickis, A.; Kaklauskas, A.; Bucinskas, V. Human Emotion Recognition: Review of Sensors and Methods. Sensors 2020, 20, 592.
46. Marechal, C.; Mikołajewski, D.; Tyburek, K.; Prokopowicz, P.; Bougueroua, L.; Ancourt, C.; Węgrzyn-Wolska, K. High-Performance Modelling and Simulation for Big Data Applications: Selected Results of the COST Action IC1406 cHiPSet; Springer International Publishing: Cham, Switzerland, 2019; pp. 307–324.
47. Zhang, J.; Yin, Z.; Chen, P.; Nichele, S. Emotion recognition using multi-modal data and machine learning techniques: A tutorial and review. Inf. Fusion 2020, 59, 103–126.
48. Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern Classification, 2nd ed.; Wiley-Interscience: New York, NY, USA, 2000.
49. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830.
50. Da Silva, H.P.; Fred, A.; Martins, R. Biosignals for Everyone. IEEE Pervasive Comput. 2014, 13, 64–71.
51. Alves, A.P.; Plácido da Silva, H.; Lourenço, A.; Fred, A. BITalino: A Biosignal Acquisition System based on Arduino. In Proceedings of the International Conference on Biomedical Electronics and Devices (BIODEVICES), Barcelona, Spain, 11–14 February 2013.
52. Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A Multimodal Database for Affect Recognition and Implicit Tagging. IEEE Trans. Affect. Comput. 2012, 3, 42–55.
53. Wiem, M.; Lachiri, Z. Emotion Classification in Arousal Valence Model using MAHNOB-HCI Database. Int. J. Adv. Comput. Sci. Appl. 2017, 8.
Overview of the datasets: class proportions (label 0 = low, 1 = high), participants, acquisition device, and sampling rate.

Dataset | Classes | Proportion of Samples per Class | Participants (N; Age; Gender) | Device | Sampling Rate (Hz)
---|---|---|---|---|---
ITMDER | Low/high Arousal and Valence | Arousal: 0.54 (0), 0.46 (1); Valence: 0.12 (0), 0.88 (1) | 18; 23 ± 3.7; 10 (F)–13 (M) | Chest strap and armband based on BITalino a | 1000
WESAD | Neutral, Stress, Amusement + 4 questionnaires | Arousal: 0.86 (0), 0.14 (1); Valence: 0.07 (0), 0.93 (1) | 15; 27.5 ± 2.4; 3 (F)–12 (M) | RespiBAN Professional b, Empatica E4 | ECG and RESP: 700; EDA: 4; BVP: 64
DEAP | Arousal, Valence, Like/dislike, Dominance, Familiarity | Arousal: 0.41 (0), 0.59 (1); Valence: 0.43 (0), 0.57 (1) | 32; 19–37; 16 (F)–16 (M) | Biosemi ActiveTwo system c | 128
MAHNOB-HCI | Arousal, Valence, Dominance | Arousal: 0.48 (0), 0.52 (1); Valence: 0.47 (0), 0.53 (1) | 27; 26.06 ± 4.39; 17 (F)–13 (M) | Biosemi ActiveTwo system | 256
EESD | Neutral, Anger, Hate, Grief, Platonic Love, Romantic Love, Joy, Reverence | Arousal: 0.5 (0), 0.5 (1); Valence: 0.5 (0), 0.5 (1) | 1; 1 (F) | Thought Technologies ProComp prototype d | 256
Single-modality classification results per dataset and dimension: accuracy (Acc) and F1-score (mean ± SD, %), with state-of-the-art (SOA) reference values where available; empty cells indicate the modality is not available in that dataset.

Modality | Metric | ITMDER Arousal | SOA | ITMDER Valence | SOA | WESAD Arousal | WESAD Valence | DEAP Arousal | DEAP Valence | MAHNOB-HCI Arousal | MAHNOB-HCI Valence | EESD Arousal | EESD Valence
---|---|---|---|---|---|---|---|---|---|---|---|---|---
EDA (hand) | Acc | 59.65 ± 13.46 | 0.572 | 89.26 ± 17.3 | 0.721 | 85.78 ± 16.55 | 92.86 ± 11.96 | 58.91 ± 15.21 | 56.56 ± 9.07 | 50.61 ± 21.84 | 56.43 ± 34.84 | 59.38 ± 16.24 | 68.75 ± 18.75
EDA (hand) | F1 | 40.74 ± 26.0 | | 93.2 ± 12.37 | | 0.0 ± 0.0 | 95.86 ± 6.99 | 72.91 ± 12.92 | 71.83 ± 7.42 | 47.53 ± 31.47 | 64.63 ± 34.57 | 56.82 ± 20.8 | 66.71 ± 23.1
EDA (finger) | Acc | 56.03 ± 11.0 | 0.572 | 90.91 ± 11.29 | 0.721 | | | | | | | |
EDA (finger) | F1 | 45.67 ± 20.01 | | 91.24 ± 18.75 | | | | | | | | |
ECG | Acc | 68.33 ± 5.58 | 0.656 | 89.26 ± 17.3 | 0.7 | 85.75 ± 16.61 | 92.86 ± 11.96 | | | 49.36 ± 37.5 | 59.15 ± 24.5 | |
ECG | F1 | 58.79 ± 21.54 | | 93.2 ± 12.37 | | 0.0 ± 0.0 | 95.86 ± 6.99 | | | 53.0 ± 39.62 | 56.58 ± 32.61 | |
BVP | Acc | 58.44 ± 12.69 | 0.660 | 89.35 ± 17.23 | 0.695 | 85.78 ± 16.55 | 94.39 ± 9.98 | 58.88 ± 15.19 | 56.56 ± 9.07 | | | 67.5 ± 13.35 | 66.25 ± 16.35
BVP | F1 | 45.91 ± 25.24 | | 93.25 ± 12.34 | | 0.0 ± 0.0 | 96.68 ± 6.01 | 72.9 ± 12.91 | 71.83 ± 7.42 | | | 66.98 ± 15.95 | 64.49 ± 22.07
RESP | Acc | 62.37 ± 16.83 | 0.585 | 89.26 ± 17.3 | 0.629 | 85.78 ± 16.55 | 92.86 ± 11.96 | 58.83 ± 14.78 | 56.56 ± 9.07 | 50.62 ± 21.25 | 46.57 ± 20.67 | 72.5 ± 12.87 | 67.5 ± 10.0
RESP | F1 | 51.79 ± 23.16 | | 93.2 ± 12.37 | | 0.0 ± 0.0 | 95.86 ± 6.99 | 72.6 ± 12.74 | 71.83 ± 7.42 | 44.28 ± 31.66 | 48.27 ± 28.44 | 70.12 ± 15.72 | 57.92 ± 15.12
Best-performing classifier per modality, dataset, and dimension (empty cells: modality not available).

Modality | ITMDER Arousal | ITMDER Valence | WESAD Arousal | WESAD Valence | DEAP Arousal | DEAP Valence | MAHNOB-HCI Arousal | MAHNOB-HCI Valence | EESD Arousal | EESD Valence
---|---|---|---|---|---|---|---|---|---|---
EDA (hand) | DT | RF | RF | RF | SVM | SVM | AdaBoost | SVM | AdaBoost | AdaBoost
EDA (finger) | AdaBoost | QDA | | | | | | | |
ECG | AdaBoost | RF | QDA | RF | | | RF | AdaBoost | |
BVP | QDA | RF | AdaBoost | RF | RF | RF | | | AdaBoost | AdaBoost
RESP | AdaBoost | RF | RF | RF | AdaBoost | RF | QDA | AdaBoost | AdaBoost | QDA
 | ITMDER | WESAD | DEAP | MAHNOB-HCI | EESD
---|---|---|---|---|---
EDA (hand) | peaksOnVol_minpeaks, EDRVolRatio_iqr, onsets_temp_dev | EDA_onsets_spectrum_mean | onsets_spectrum_mean | half_rec_minAmp, half_rec_rms, amplitude_dist, onsets_spectrum_statistic_hist43, rise_ts_temp_curve_distance, phasic_rate_maxpeaks, onsets_spectrum_meddiff, EDRVolRatio_zero_cross | phasic_rate_abs_dev, onsetspeaksVol_minpeaks
EDA (finger) | onsets_spectrum_statistic_hist81, peaksOnVol_iqr, six_rise_autocorr | | | |
ECG | statistic_hist73, statistic_hist115, hr_sadiff, statistic_hist7, statistic_hist137 | mean, rpeaks_medadev, hr_meandiff | | hr_mindiff |
BVP | hr_max, hr_meandiff | mean | mean | | spectral_skewness, temp_curve_distance, statistic_hist18, statistic_hist13, statistic_hist15, meddiff
RESP | exhale_counter, inhExhRatio_iqr | statistic_hist0 | mean | hr_total_energy, meandiff, statistic_hist95, inhale_dur_temp_curve_distance, statistic_hist27, hr_meandiff | exhale_meanadiff, max, zeros_mean
 | ITMDER | WESAD | DEAP | MAHNOB-HCI | EESD
---|---|---|---|---|---
EDA (hand) | onsets_spectrum_mean, rise_ts_temp_curve_distance, rise_ts_medadev | onsets_spectrum_mean | onsets_spectrum_mean | onsets_spectrum_mean | amplitude_mean, onsets_spectrum_meanadev, half_rise_medadev, onsets_spectrum_statistic_hist9, EDRVolRatio_medadiff, half_rec_minpeaks
EDA (finger) | onset_peaks_Vol_max, half_rise_mean, peaks_max, onsets_spectrum_statistic_hist120, half_rec_meandiff, onsets_spectrum_statistic_hist91, half_rise_var, peaks_Onset_Vol_skewness | | | |
ECG | nni_minpeaks | nni_minpeaks, statistic_hist95 | | rpeaks_meandiff, max, mindiff |
BVP | statistic_hist44, meanadiff, hr_meanadiff, onsets_mean, hr_meandiff | median, minAmp | mean | | mean, statistic_hist16, statistic_hist5, statistic_hist31, meddiff
RESP | mean, exhale_median, statistic_hist196 | mean | mean | hr_maxpeaks, statistic_hist55, zeros_skewness, statistic_hist36 | iqr
Decision fusion (DF) vs. feature fusion (FF): accuracy (Acc) and F1-score (mean ± SD, %) and computation time T (s), with state-of-the-art (SOA) reference values where reported.

Method | Metric | ITMDER Arousal | SOA | ITMDER Valence | SOA | WESAD Arousal | WESAD Valence | DEAP Arousal | SOA | DEAP Valence | SOA | MAHNOB-HCI Arousal | SOA | MAHNOB-HCI Valence | SOA | EESD Arousal | EESD Valence
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
DF | Acc | 66.7 ± 9.0 | 58.1 | 89.3 ± 17.3 | 57.12 | 85.8 ± 16.5 | 92.9 ± 12.0 | 58.9 ± 15.2 | | 56.6 ± 9.1 | | 54.7 ± 13.3 | | 58.1 ± 6.1 | | 75.0 ± 14.8 | 75.6 ± 17.9
DF | F1 | 50.9 ± 23.5 | | 93.2 ± 12.4 | | 0.0 ± 0.0 | 95.9 ± 7.0 | 72.9 ± 12.9 | | 71.8 ± 7.4 | | 63.8 ± 15.8 | | 68.1 ± 8.9 | | 73.4 ± 16.4 | 72.4 ± 22.5
DF | T | 1.5 ± 0.0 | | 1.35 ± 0.0 | | 2.04 ± 0.0 | 2.0 ± 0.0 | 1.58 ± 0.0 | | 1.73 ± 0.0 | | 1.1 ± 0.0 | | 1.35 ± 0.0 | | 0.6 ± 0.0 | 0.7 ± 0.0
FF | Acc | 87.6 ± 16.7 | | 89.26 ± 17.3 | | 87.6 ± 16.7 | 92.9 ± 12.0 | 60.0 ± 13.9 | 57.0 | 56.9 ± 8.2 | 62.7 | 55.2 ± 15.4 | 64.2; 57; 55 ± 3.9 | 56.0 ± 10.2 | 68.7; 62.7 ± 3.9; 57.5 | 60.0 ± 18.4 | 68.7 ± 22.2
FF | F1 | 19.4 ± 34.4 | | 93.2 ± 12.4 | | 19.4 ± 34.4 | 95.9 ± 7.0 | 67.3 ± 23.8 | 53.3 | 70.7 ± 7.6 | 60.8 | 67.5 ± 16.6 | | 59.0 ± 15.1 | | 56.7 ± 22.5 | 67.7 ± 24.7
FF | T | 0.02 ± 0.0 | | 0.02 ± 0.0 | | 0.02 ± 0.0 | 0.07 ± 0.01 | 0.02 ± 0.01 | | 0.02 ± 0.0 | | 0.01 ± 0.0 | | 0.01 ± 0.0 | | 0.0 ± 0.0 | 0.01 ± 0.0
 | ITMDER Arousal | ITMDER Valence | WESAD Arousal | WESAD Valence | DEAP Arousal | DEAP Valence | MAHNOB-HCI Arousal | MAHNOB-HCI Valence | EESD Arousal | EESD Valence
---|---|---|---|---|---|---|---|---|---|---
Classifier | SVM | RF | QDA | SVM | QDA | GNB | GNB | QDA | DT | RF
Features selected for the fusion model per dataset and dimension.

 | ITMDER | WESAD | DEAP | MAHNOB-HCI | EESD
---|---|---|---|---|---
Arousal | EDA_H_onsets_spectrum_mean | BVP_median, ECG_min, Resp_statistic_hist64 | Resp_zeros_sadiff, BVP_statistic_hist29, EDA_phasic_rate_total_energy, EDA_rise_ts_mindiff, Resp_statistic_hist25 | Resp_inhExhRatio_maxpeaks, EDA_phasic_rate_iqr, Resp_inhExhRatio_zero_cross, Resp_inhExhRatio_skewness, ECG_rpeaks_meanadiff, ECG_minpeaks, Resp_meandiff, EDA_onsets_spectrum_minAmp, EDA_onsets_spectrum_statistic_hist22, ECG_hr_dist, EDA_onsets_spectrum_statistic_hist62 | Resp_exhale_max, EDA_amplitude_kurtosis
Valence | EDA_H_peaksOnVol_minAmp, BVP_mean, EDA_F_EDRVolRatio_total_energy, EDA_H_onsets_spectrum_statistic_hist112 | BVP_median, ECG_dist, ECG_zero_cross, ECG_statistic_hist143 | Resp_statistic_hist60, EDA_half_rise_dist, BVP_statistic_hist10, BVP_statistic_hist39, EDA_half_rise_temp_curve_distance, BVP_hr_maxAmp | ECG_meanadiff, EDA_rise_ts_meandiff, Resp_inhale_dur_dist, EDA_onsets_spectrum_statistic_hist5 | EDA_amplitude_mean, BVP_statistic_hist35, Resp_rms, Resp_zeros_meandiff, EDA_onsets_spectrum_statistic_hist22
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).