A Globally Generalized Emotion Recognition System Involving Different Physiological Signals
Abstract
1. Introduction
2. Background
2.1. Definition of Emotion
- Subjective experience: several works have categorized emotions into a set of basic states that all humans, regardless of culture and race, can experience. However, the way these emotions are experienced is highly subjective [9].
- Emotion expressions: most expressions are observable, nonverbal behaviors that indicate an affective or internal emotional state. For example, happiness and pleasure can be expressed by a smile, whereas sadness or displeasure can be expressed by a frown. In general, emotion expressions include audiovisual human activities such as gesture, posture, voice intonation, breathing noise, etc.
2.2. Related Works
2.3. The Present Work
- An automatic feature calibration that adaptively adjusts the extracted features by translating them toward the most correlated subject in the training set. Here, we use the collaborative filtering concept of [27] to calculate the adjustment weight for the features extracted from a new subject by finding that subject’s most correlated counterpart in the training data.
- A novel machine learning model based on Cellular Neural Networks (CNN) that delivers promising results. Here, we improve the performance of the CNN processor by using a hyperbolic tangent sigmoid transfer function [28] as the output nonlinearity of the CNN states and the echo state network (ESN) paradigm [29] for an efficient training of the CNN processor model.
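The calibration idea can be sketched minimally as follows. The function and the choice of the correlation itself as interpolation weight are our illustrative assumptions; the paper derives the actual adjustment weight from the collaborative filtering scheme of [27].

```python
import numpy as np

def calibrate_features(new_feats, train_subject_feats):
    """Shift a new subject's feature vector toward its most Pearson-
    correlated subject in the training set (illustrative sketch; the
    paper computes the adjustment weight via collaborative filtering [27])."""
    new_feats = np.asarray(new_feats, dtype=float)
    # Pearson correlation between the new subject and each training subject
    corrs = [np.corrcoef(new_feats, s)[0, 1] for s in train_subject_feats]
    best = int(np.argmax(corrs))
    ref = np.asarray(train_subject_feats[best], dtype=float)
    # here the correlation itself serves as the interpolation weight
    w = corrs[best]
    calibrated = new_feats + w * (ref - new_feats)
    return calibrated, best
```

A strongly correlated training subject (w close to 1) pulls the new features almost onto that subject's features, while a weakly correlated one leaves them nearly unchanged.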
3. Physiological Signals Involved in This Study
- Electrodermal Activity (EDA): It refers to skin conductivity (SC), which measures the skin’s ability to conduct electricity; the conductivity increases when the skin is sweaty. During the experience of physical arousal, the sympathetic nervous system is activated and sweat is produced by the sweat glands, which measurably changes the conductivity of the skin [30]. EDA consists of a slowly changing part called the Skin Conductance Level (SCL), which is overlaid by short and fast conductance changes called phasic components. The phasic components can be separated into two types. The first is the Skin Conductance Response (SCR), a peak that occurs in reaction to a stimulus. The second is the Non-Specific Skin Conductance Response (NS.SCR), which is empirically very similar to the SCR but occurs spontaneously, without any stimulus [6]. In our study, the EDA signals are measured at a sampling rate of 4 Hz using a wearable wireless device (Empatica E4 [31]) placed on the wrist.
- Electrocardiogram (ECG): It refers to a measurement of the electrical activity of the heart over a period of time. In general, an ECG cycle consists of three main waves. The first is the P wave, which indicates the depolarization of the atria. The second is the QRS complex, which corresponds to the start of the ventricular contraction. After the ventricles have stayed contracted for a few milliseconds, the third wave, the T wave, appears; it occurs when the ventricles repolarize [32]. The wearable wireless BioRadio™ device [33] (Great Lakes NeuroTechnologies, OH, USA) is used to measure the ECG signal at a sampling rate of 500 Hz with three electrodes (plus one ground electrode) placed on the body.
- Skin Temperature (ST): The skin temperature is recorded with an optical infrared thermometer. The ST signals are measured at a sampling rate of 4 Hz using the same wearable wireless device (Empatica E4 [31]) that performs the EDA measurement and is placed on the wrist.
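The tonic/phasic EDA decomposition described above (SCL versus SCR) can be illustrated with a deliberately crude split: a moving average approximates the slowly varying SCL and the residual carries the fast SCR/NS.SCR components. This is only an illustration; proper decomposition uses nonnegative deconvolution [37], and the window length here is an arbitrary choice.

```python
import numpy as np

def split_eda(eda, fs=4.0, win_s=10.0):
    """Crude tonic/phasic split of an EDA trace sampled at fs Hz.
    The moving average stands in for the SCL; the residual holds the
    phasic SCR/NS.SCR peaks. Illustrative only, not the method of [37]."""
    n = max(1, int(win_s * fs))
    kernel = np.ones(n) / n
    scl = np.convolve(eda, kernel, mode="same")  # slowly varying tonic level
    phasic = eda - scl                           # fast conductance changes
    return scl, phasic
```

On a constant conductance trace the phasic residual is zero away from the window edges, which matches the intuition that SCRs only appear as deviations from the tonic level.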
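The R peaks of the QRS complex described above are the anchor points for ECG feature extraction. A naive detector for a clean trace might look like the sketch below; the threshold and refractory values are illustrative assumptions, and the study itself relies on Kubios HRV [38] for the actual analysis.

```python
import numpy as np

def r_peaks(ecg, fs=500.0, thresh_ratio=0.6, refractory_s=0.25):
    """Naive threshold-based R-peak detector for a clean ECG trace
    sampled at fs Hz (illustration only; see Kubios HRV [38])."""
    thresh = thresh_ratio * np.max(ecg)
    refractory = int(refractory_s * fs)  # minimum sample gap between beats
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        # local maximum above threshold, outside the refractory window
        if (ecg[i] > thresh and ecg[i] >= ecg[i - 1]
                and ecg[i] >= ecg[i + 1] and i - last >= refractory):
            peaks.append(i)
            last = i
    return np.asarray(peaks)
```

From the detected peak indices, inter-beat intervals and heart-rate variability features follow directly as successive differences divided by fs.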
4. Research Methodology
4.1. Data Collection and Experiment Procedure
4.2. Data Synchronization and Target Classes
4.3. Feature Extraction
4.3.1. EDA Features
4.3.2. ECG Features
4.3.3. Skin Temperature Features
4.4. Automatic Calibration Model
4.4.1. Preparation (i.e., Offline) Phase
Algorithm 1 G-means algorithm.
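The core step of the G-means algorithm [41], keeping a cluster only if its points look Gaussian along their principal axis and splitting it otherwise, can be sketched as follows. The SVD projection and the 5% Anderson-Darling level are our illustrative choices, not necessarily the paper's settings.

```python
import numpy as np
from scipy.stats import anderson

def cluster_is_gaussian(points, alpha_idx=2):
    """G-means acceptance test sketch [41]: project a cluster onto its
    principal axis and Anderson-Darling-test the projection for
    normality. alpha_idx=2 selects the 5% critical value."""
    points = np.asarray(points, dtype=float)
    centered = points - points.mean(axis=0)
    # leading right singular vector = principal direction of the cluster
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    proj = centered @ vt[0]
    res = anderson(proj)
    return bool(res.statistic < res.critical_values[alpha_idx])
```

A clearly bimodal cluster fails this test and would therefore be split into two centers by G-means, which is how the algorithm learns the number of clusters without it being fixed in advance.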
4.4.2. Online Phase
4.5. Classification
4.5.1. Learning Phase
- The reservoir weight matrix is generated as a normally distributed sparse symmetric matrix with a prescribed sparseness measure; the resultant matrix is then divided by its own largest absolute eigenvalue. These generating constraints are important to respect the echo-state properties (sparsity and a spectral radius below one) that give stability to the network, as suggested by [46].
- The remaining weight matrices are generated randomly with a standard normal distribution and scaled by a fixed factor.
- Solver type: fixed-step,
- Solver: ode1 (Euler’s method [47]),
- Step size: 0.5, holding the final value,
- Initial condition of the CNN cells: zero initial state.
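Under the constraints listed above, the reservoir generation and a single ode1 (Euler) state update can be sketched as follows. The size, density, and 0.95 spectral radius are illustrative placeholders for the unspecified values, and the state equation is the standard tanh-nonlinearity form rather than the paper's exact CNN template.

```python
import numpy as np

def make_reservoir(n, density=0.1, spectral_radius=0.95, seed=0):
    """Sparse, normally distributed, symmetric reservoir matrix rescaled
    by its largest absolute eigenvalue so the echo-state property holds
    [46]. Parameter values are illustrative, not the paper's."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n, n))
    w *= rng.random((n, n)) < density     # enforce sparseness
    w = (w + w.T) / 2                     # symmetric, as stated above
    eig = np.max(np.abs(np.linalg.eigvals(w)))
    return spectral_radius * w / eig

def euler_step(x, u, w, w_in, h=0.5):
    """One fixed-step Euler (ode1) update of the CNN/ESN state with a
    hyperbolic tangent output nonlinearity and step size 0.5."""
    dx = -x + w @ np.tanh(x) + w_in @ u
    return x + h * dx
```

Dividing by the largest absolute eigenvalue and rescaling pins the spectral radius at the chosen value, which is exactly the stability constraint the bullet points describe.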
Algorithm 2 The learning algorithm of the CNN.
4.5.2. Testing Phase
Algorithm 3 The testing algorithm of the CNN.
5. Obtained Results
5.1. Overall System Performance While Using the Reference Database MAHNOB
5.2. Overall Performance Evaluation While Using Both Training and Testing Data from Our Experiment
5.3. The Overall Performance Using the MAHNOB Reference Database for Training and Data from Our Experiment for Testing
6. Discussion
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Burns, P.; Lansdown, T. E-distraction: The challenges for safe and usable internet services in vehicles. In Proceedings of the Internet Forum on the Safety Impact of Driver Distraction When Using In-Vehicle Technologies, Washington, DC, USA, 5 July–11 August 2000. [Google Scholar]
- Mühlbacher-Karrer, S.; Mosa, A.H.; Faller, L.M.; Ali, M.; Hamid, R.; Zangl, H.; Kyamakya, K. A Driver State Detection System Combining a Capacitive Hand Detection Sensor with Physiological Sensors. IEEE Trans. Instrum. Meas. 2017, 66, 624–636. [Google Scholar] [CrossRef]
- Ali, M.; Mosa, A.H.; Al Machot, F.; Kyamakya, K. EEG-based emotion recognition approach for e-healthcare applications. In Proceedings of the 2016 Eighth International Conference on IEEE Ubiquitous and Future Networks (ICUFN), Vienna, Austria, 5–8 July 2016; pp. 946–950. [Google Scholar]
- Verschuere, B.; Crombez, G.; Koster, E.; Uzieblo, K. Psychopathy and physiological detection of concealed information: A review. Psychol. Belg. 2006, 46, 1–2. [Google Scholar] [CrossRef]
- Mandryk, R.L.; Inkpen, K.M.; Calvert, T.W. Using psychophysiological techniques to measure user experience with entertainment technologies. Behav. Inf. Technol. 2006, 25, 141–158. [Google Scholar] [CrossRef]
- Cowie, R.; Douglas-Cowie, E.; Tsapatsoulis, N.; Votsis, G.; Kollias, S.; Fellenz, W.; Taylor, J.G. Emotion recognition in human-computer interaction. IEEE Signal Process. Mag. 2001, 18, 32–80. [Google Scholar] [CrossRef]
- Al Machot, F.; Mosa, A.H.; Fasih, A.; Schwarzlmüller, C.; Ali, M.; Kyamakya, K. A novel real-time emotion detection system for advanced driver assistance systems. In Autonomous Systems: Developments and Trends; Springer: Berlin, Germany, 2012; pp. 267–276. [Google Scholar]
- Katsis, C.D.; Katertsidis, N.; Ganiatsas, G.; Fotiadis, D.I. Toward emotion recognition in car-racing drivers: A biosignal processing approach. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2008, 38, 502–512. [Google Scholar] [CrossRef]
- Kim, J. Bimodal Emotion Recognition Using Speech and Physiological Changes; Robust Speech Recognition and Understanding; INTECH Open Access Publisher: London, UK, 2007. [Google Scholar]
- Essa, I.A.; Pentland, A. A vision system for observing and extracting facial action parameters. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’94), Seattle, WA, USA, 21–23 June 1994; pp. 76–83. [Google Scholar]
- Kim, J.; André, E. Emotion recognition based on physiological changes in music listening. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 2067–2083. [Google Scholar] [CrossRef] [PubMed]
- Wen, W.-H.; Qiu, Y.-H.; Liu, G.-Y. Electrocardiography recording, feature extraction and classification for emotion recognition. In Proceedings of the 2009 WRI World Congress on Computer Science and Information Engineering, Los Angeles, CA, USA, 31 March–2 April 2009; Volume 4, pp. 168–172. [Google Scholar]
- Ali, M.; Al Machot, F.; Mosa, A.H.; Kyamakya, K. CNN Based Subject-Independent Driver Emotion Recognition System Involving Physiological Signals for ADAS. In Advanced Microsystems for Automotive Applications 2016; Springer: Berlin, Germany, 2016; pp. 125–138. [Google Scholar]
- Lisetti, C.L.; Nasoz, F. Using noninvasive wearable computers to recognize human emotions from physiological signals. EURASIP J. Adv. Signal Process. 2004, 2004, 929414. [Google Scholar] [CrossRef]
- Thoits, P.A. The sociology of emotions. Annu. Rev. Sociol. 1989, 15, 317–342. [Google Scholar] [CrossRef]
- Krause, R. Universals and Cultural Differences in the Judgments of Facial Expressions of Emotion. J. Personal. Soc. Psychol. 1987, 5, 4–712. [Google Scholar]
- Lang, P.J. The emotion probe: Studies of motivation and attention. Am. Psychol. 1995, 50, 372. [Google Scholar] [CrossRef] [PubMed]
- Haag, A.; Goronzy, S.; Schaich, P.; Williams, J. Emotion Recognition Using Bio-Sensors: First Steps towards an Automatic System. Tutorial and Research Workshop on Affective Dialogue Systems; Springer: Berlin, Germany, 2004; pp. 36–48. [Google Scholar]
- Maaoui, C.; Pruski, A. Emotion Recognition through Physiological Signals for Human-Machine Communication; INTECH Open Access Publisher: London, UK, 2010. [Google Scholar]
- De Santos Sierra, A.; Ávila, C.S.; Casanova, J.G.; del Pozo, G.B. A stress-detection system based on physiological signals and fuzzy logic. IEEE Trans. Ind. Electron. 2011, 58, 4857–4865. [Google Scholar] [CrossRef] [Green Version]
- Kulic, D.; Croft, E.A. Affective state estimation for human–robot interaction. IEEE Trans. Robot. 2007, 23, 991–1000. [Google Scholar] [CrossRef]
- Wen, W.; Liu, G.; Cheng, N.; Wei, J.; Shangguan, P.; Huang, W. Emotion recognition based on multi-variant correlation of physiological signals. IEEE Trans. Affect. Comput. 2014, 5, 126–140. [Google Scholar] [CrossRef]
- Lin, Y.P.; Wang, C.H.; Jung, T.P.; Wu, T.L.; Jeng, S.K.; Duann, J.R.; Chen, J.H. EEG-based emotion recognition in music listening. IEEE Trans. Biomed. Eng. 2010, 57, 1798–1806. [Google Scholar] [PubMed]
- Coan, J.A.; Allen, J.J. Handbook of Emotion Elicitation and Assessment; Oxford University Press: Oxford, UK, 2007. [Google Scholar]
- Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. Deap: A database for emotion analysis; using physiological signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31. [Google Scholar] [CrossRef]
- Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A multimodal database for affect recognition and implicit tagging. IEEE Trans. Affect. Comput. 2012, 3, 42–55. [Google Scholar] [CrossRef]
- Linden, G.; Smith, B.; York, J. Amazon. com recommendations: Item-to-item collaborative filtering. IEEE Internet Comput. 2003, 7, 76–80. [Google Scholar] [CrossRef]
- Vogl, T.P.; Mangis, J.; Rigler, A.; Zink, W.; Alkon, D. Accelerating the convergence of the back-propagation method. Biol. Cybern. 1988, 59, 257–263. [Google Scholar] [CrossRef]
- Jaeger, H. The “Echo State” Approach to Analysing and Training Recurrent Neural Networks-With an Erratum Note; GMD Technical Report; German National Research Center for Information Technology: Bonn, Germany, 2001; Volume 148, p. 13. [Google Scholar]
- Boucsein, W. Electrodermal Activity; Springer Science & Business Media: Berlin, Germany, 2012. [Google Scholar]
- Wristband, E.E. EDA Sensor. Available online: https://www.empatica.com/e4-wristband (accessed on 6 January 2017).
- Silverthorn, D.U.; Ober, W.C.; Garrison, C.W.; Silverthorn, A.C.; Johnson, B.R. Human Physiology: An Integrated Approach; Pearson/Benjamin Cummings: San Francisco, CA, USA, 2009. [Google Scholar]
- BioRadio. ECG Sensor. Available online: https://glneurotech.com/bioradio/ (accessed on 6 January 2017).
- Bradley, M.M.; Lang, P.J. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 1994, 25, 49–59. [Google Scholar] [CrossRef]
- Schaefer, A.; Nils, F.; Sanchez, X.; Philippot, P. Assessing the effectiveness of a large database of emotion-eliciting films: A new tool for emotion researchers. Cognit. Emot. 2010, 24, 1153–1172. [Google Scholar] [CrossRef]
- Rottenberg, J.; Ray, R.; Gross, J. Emotion elicitation using films. In Handbook of Emotion Elicitation and Assessment; Coan, J.A., Allen, J.J.B., Eds.; Oxford University Press: Oxford, UK, 2007. [Google Scholar]
- Benedek, M.; Kaernbach, C. Decomposition of skin conductance data by means of nonnegative deconvolution. Psychophysiology 2010, 47, 647–658. [Google Scholar] [CrossRef] [PubMed]
- Tarvainen, M.P.; Niskanen, J.P.; Lipponen, J.A.; Ranta-Aho, P.O.; Karjalainen, P.A. Kubios HRV–heart rate variability analysis software. Comput. Methods Programs Biomed. 2014, 113, 210–220. [Google Scholar] [CrossRef] [PubMed]
- Hartigan, J.A.; Wong, M.A. Algorithm AS 136: A k-means clustering algorithm. J. R. Stat. Soc. Ser. C 1979, 28, 100–108. [Google Scholar] [CrossRef]
- Arthur, D.; Vassilvitskii, S. k-means++: The advantages of careful seeding. In Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms, New Orleans, Louisiana, 7–9 January 2007; Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 2007; pp. 1027–1035. [Google Scholar]
- Hamerly, G.; Elkan, C. Learning the k in k-means. NIPS 2003, 3, 281–288. [Google Scholar]
- Ahlgren, P.; Jarneving, B.; Rousseau, R. Requirements for a cocitation similarity measure, with special reference to Pearson’s correlation coefficient. J. Am. Soc. Inf. Sci. Technol. 2003, 54, 550–560. [Google Scholar] [CrossRef]
- Chua, L.O.; Yang, L. Cellular neural networks: Applications. IEEE Trans. Circuits Syst. 1988, 35, 1273–1290. [Google Scholar] [CrossRef]
- MATLAB, version R2015b; The MathWorks Inc.: Natick, MA, USA, 2015.
- Kennedy, J. Particle swarm optimization. In Encyclopedia of Machine Learning; Springer: Berlin, Germany, 2011; pp. 760–766. [Google Scholar]
- Lukoševičius, M. A practical guide to applying echo state networks. In Neural Networks: Tricks of the Trade; Springer: Berlin, Germany, 2012; pp. 659–686. [Google Scholar]
- Cryer, C.W.; Tavernini, L. The numerical solution of Volterra functional differential equations by Euler’s method. SIAM J. Numer. Anal. 1972, 9, 105–129. [Google Scholar] [CrossRef]
- Marconato, A.; Hu, M.; Boni, A.; Petri, D. Dynamic compensation of nonlinear sensors by a learning-from-examples approach. IEEE Trans. Instrum. Meas. 2008, 57, 1689–1694. [Google Scholar] [CrossRef]
- Srivastava, S.; Gupta, M.R.; Frigyik, B.A. Bayesian quadratic discriminant analysis. J. Mach. Learn. Res. 2007, 8, 1277–1305. [Google Scholar]
- Fukunaga, K.; Narendra, P.M. A branch and bound algorithm for computing k-nearest neighbors. IEEE Trans. Comput. 1975, 100, 750–753. [Google Scholar] [CrossRef]
- Yan, W. Toward automatic time-series forecasting using neural networks. IEEE Trans. Neural Netw. Learn. Syst. 2012, 23, 1028–1039. [Google Scholar] [PubMed]
- Köhler, R.; Metrology, F.C. The international vocabulary of metrology: Basic and general concepts and associated terms. Why? How? Transverse Discip. Metrol. 2009, 233–238. [Google Scholar] [CrossRef]
- Hand, D.J. Measuring classifier performance: A coherent alternative to the area under the ROC curve. Mach. Learn. 2009, 77, 103–123. [Google Scholar] [CrossRef]
- Al Machot, F.; Ali, M.; Mosa, A.H.; Schwarzlmüller, C.; Gutmann, M.; Kyamakya, K. Real-time raindrop detection based on cellular neural networks for ADAS. J. Real-Time Image Process. 2016. [Google Scholar] [CrossRef] [Green Version]
- Perfetti, R.; Ricci, E.; Casali, D.; Costantini, G. Cellular neural networks with virtual template expansion for retinal vessel segmentation. IEEE Trans. Circuits Syst. II Express Br. 2007, 54, 141–145. [Google Scholar] [CrossRef]
- Milanova, M.; Büker, U. Object recognition in image sequences with cellular neural networks. Neurocomputing 2000, 31, 125–141. [Google Scholar] [CrossRef] [Green Version]
- Yan, L.; Bae, J.; Lee, S.; Roh, T.; Song, K.; Yoo, H.J. A 3.9 mW 25-electrode reconfigured sensor for wearable cardiac monitoring system. IEEE J. Solid-State Circuits 2011, 46, 353–364. [Google Scholar] [CrossRef]
- Yapici, M.K.; Alkhidir, T.E. Intelligent medical garments with graphene-functionalized smart-cloth ECG sensors. Sensors 2017, 17, 875. [Google Scholar] [CrossRef] [PubMed]
Ref. No. | Signals | Features | Classifiers | Emotion Parameters | Stimuli | No. of Subjects | Accuracy in % |
---|---|---|---|---|---|---|---|
[11] | EMG ECG EDA RSP | Statistical, Energy, Sub-band Spectrum, Entropy | Linear Discriminant Analysis | Joy, Anger, Sadness, Pleasure | Music | 3 (MIT database) | 95 (Subject-Dependent) 70 (Subject-Independent) |
[14] | EDA HR ST | No specific features stated | KNN, Discriminant Function Analysis, Marquardt Backpropagation | Sadness, Anger, Fear, Surprise, Frustration, Amusement | Movies | 14 | 91.7 (Subject-Dependent) |
[18] | EMG EDA BVP ECG RSP | Running mean, Running standard deviation, Slope | NN | Arousal, Valence | IAPS (International Affective Picture System) | 1 | 96.58 Arousal, 89.93 Valence (Subject-Dependent) |
[12] | ECG | Fast Fourier Transform | Tabu Search | Joy, Sadness | Movies | 154 | 86 (Subject-Independent) |
[20] | EDA HR | No specific features stated | Fuzzy Logic | Stress | Hyperventilation, Talk preparation | 80 | 99.5 (Subject-Independent) |
[19] | BVP EMG ST EDA RSP | Statistical Features | SVM, Fisher LDA | Amusement, Contentment, Disgust, Fear, Sadness, Neutral | IAPS | 10 | 90 (Subject-Dependent) |
[9] | EMG EDA ECG BVP ST RSP SPEECH | Statistical Features, BRV, Zero-crossing, MFCCs | KNN | Arousal, Valence | Quiz dataset | 3 | 92 (Subject-Dependent) 55 (Subject-Independent) |
[21] | EDA HR EMG | No specific features stated | HMM | Arousal, Valence | Robot Actions | 36 | 81 (Subject-Dependent) 66 (Subject-Independent) |
[13] | EDA ECG ST | Statistical Features, average power, SCL, SCR | CNN | Arousal, Valence | Movies | 10 | 82.35 (Subject-Independent) |
Classifier | Type | Parameters |
---|---|---|
RBSVM [48] | C-SVC | KernelType = radial basis function, eps = 0.001, gamma = 0.0001 |
NB [49] | NaiveBayes -k | UseKernelEstimator = True |
KNN [50] | Default | |
ANN [51] | Multilayer Perceptron | |
CNN | Echo State | |
 | Physiological Sensor | KNN | NB | ANN | SVM | CNN |
---|---|---|---|---|---|---|
Single sensor | ECG | 61.2 | 53.58 | 53.92 | 62.91 | 56.41 |
 | EDA | 63.73 | 53.1 | 60.32 | 68.4 | 75.34 |
 | ST | 33.12 | 35.7 | 42.64 | 41.8 | 42.6 |
Multi sensors | EDA + ECG | 71.12 | 55.4 | 60.78 | 72.64 | 83.43 |
 | ST + ECG | 68.45 | 54.53 | 55.86 | 70 | 68.63 |
 | ST + EDA | 69.13 | 55.34 | 58.43 | 69.64 | 78.5 |
 | ST + EDA + ECG | 76.88 | 56.88 | 62.5 | 77.5 | 89.38 |
Measure | KNN | NB | ANN | SVM | CNN |
---|---|---|---|---|---|
Accuracy | 76.88% | 56.88% | 62.5% | 77.5% | 89.38% |
Specificity | 95% | 86.67% | 85.84% | 95% | 97.5% |
Precision | 81.82% | 57.9% | 57.5% | 82.86% | 92.11% |
Recall | 67.5% | 55% | 57.5% | 72.5% | 87.5% |
Measure | KNN | NB | ANN | SVM | CNN |
---|---|---|---|---|---|
Accuracy | 56.25% | 25.63% | 45.63% | 71.88% | 81.88% |
Specificity | 80% | 72.5% | 83.34% | 89.17% | 95% |
Precision | 50% | 25% | 42.86% | 67.5% | 82.86% |
Recall | 60% | 27% | 37.5% | 67.5% | 72.5% |
Subject | KNN | NB | ANN | SVM | CNN |
---|---|---|---|---|---|
Subject1 | 30.54% | 26.43% | 31.21% | 14.31% | 58.76% |
Subject2 | 44.15% | 22.23% | 30.76% | 20.87% | 60% |
Subject3 | 32.25% | 13.15% | 34.89% | 25.90% | 54.38% |
Subject4 | 29.64% | 19.22% | 28.33% | 20.33% | 57.5% |
Subject5 | 29.90% | 25.21% | 12.85% | 22.58% | 59.38% |
Subject6 | 27.45% | 10.89% | 25.07% | 17.32% | 53.13% |
All Subjects | 32.33% | 19.53% | 27.18% | 20.22% | 57.19% |
Subject | KNN | NB | ANN | SVM | CNN |
---|---|---|---|---|---|
Subject1 | 44.58% | 31.88% | 55.63% | 48.54% | 70.63% |
Subject2 | 53.93% | 29.12% | 58.38% | 50.82% | 81.26% |
Subject3 | 48.62% | 40.19% | 60.45% | 42.73% | 71.88% |
Subject4 | 35.55% | 27.45% | 33.77% | 44.21% | 72.5% |
Subject5 | 24.29% | 30.83% | 22.36% | 43.16% | 63.13% |
Subject6 | 25.23% | 23.28% | 30.83% | 26.87% | 66.88% |
All Subjects | 38.7% | 30.46% | 43.57% | 42.73% | 71.05% |
Measure | CNN without Calibration Model | CNN with Calibration Model |
---|---|---|
Accuracy | 57.19% | 71.05% |
Specificity | 84.31% | 89.87% |
Precision | 55.86% | 69.84% |
Recall | 59.59% | 70.42% |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Ali, M.; Al Machot, F.; Haj Mosa, A.; Jdeed, M.; Al Machot, E.; Kyamakya, K. A Globally Generalized Emotion Recognition System Involving Different Physiological Signals. Sensors 2018, 18, 1905. https://doi.org/10.3390/s18061905