Article

Non-Contact Measurement of Motion Sickness Using Pupillary Rhythms from an Infrared Camera

1 Center for Bionics, Korea Institute of Science and Technology, Seoul 02792, Korea
2 Department of Industrial Engineering, Jeonju University, Jeonju 55069, Korea
3 Department of Biomedical Engineering, Hanyang University, Seoul 04673, Korea
* Author to whom correspondence should be addressed.
Sensors 2021, 21(14), 4642; https://doi.org/10.3390/s21144642
Submission received: 25 May 2021 / Revised: 25 June 2021 / Accepted: 1 July 2021 / Published: 6 July 2021
(This article belongs to the Section Sensing and Imaging)

Abstract
Both physiological and neurological mechanisms are reflected in pupillary rhythms via neural pathways between the brain and the pupil nerves. This study aims to interpret the phenomena of motion sickness, such as fatigue, anxiety, nausea and disorientation, using these mechanisms, and to develop an advanced non-contact measurement method based on an infrared webcam. Twenty-four volunteers (12 females) experienced virtual reality content in both a two-dimensional and a head-mounted device presentation. An irregular pattern of the pupillary rhythms, demonstrated by an increased mean and standard deviation of the pupil diameter and a decreased pupillary rhythm coherence ratio, was revealed after the participants experienced motion sickness. Motion sickness was induced by the head-mounted device rather than by the two-dimensional virtual reality, and was strongly related to the visual information processing load. In addition, the proposed method was verified on a new experimental dataset of 23 participants (11 females), with a classification performance of 89.6% (n = 48) on the training set and 80.4% (n = 46) on the test set using a support vector machine with a radial basis function kernel. The proposed method was proven to be capable of quantitatively measuring and monitoring motion sickness in real time in a simple, economical and contactless manner using an infrared camera.

1. Introduction

The development and generalization of head-mounted devices (HMDs) have made virtual reality (VR) a real-life experience. VR technology has been extended to various applications, such as architecture, education, training, mobile devices, medical visualization, interactions, entertainment and manufacturing [1,2,3]. Positive effects have been reported regarding the increase in efficiency of work tasks and the ability to experience a real presence and coexistence [4,5,6,7]. However, the side effects of motion sickness have been widely reported by some users who have used flight and driving simulators and many other virtual environments [8,9,10,11,12], with common symptoms including visual fatigue, anxiety, nausea and disorientation apart from abdominal and oculomotor symptoms [13,14,15,16]. Visually induced motion sickness is caused by incongruities in the spatiotemporal relationships between actions (such as hand movements) and perceptions such as corresponding visual feedback, which leads to distortions and delays in the visual information system [17]. As these issues are a major obstacle for the further development of the VR industry, research to understand and resolve these issues is required to improve the VR experience for viewers [18,19,20].
The symptoms of motion sickness in VR are known to be caused by a variety of factors, such as gaze angle, fixation, retinal slip and the field of view of the HMD [21,22,23,24]. The relationship between these causal factors and motion sickness needs to be verified, and there is a need to provide guidelines for content/device developers and users to minimize the symptoms so that users may be comfortable and enjoy the VR content. Many previous studies have tried to measure motion sickness using various human responses with the following measurement tools: (1) subjective ratings such as a simulator sickness questionnaire (SSQ) [13,25,26,27], a motion sickness susceptibility questionnaire (MSSQ) [21,28,29,30,31,32], a Coriolis test [33,34] and a questionnaire developed by Graybiel and Hamilton [35]; (2) behavioral responses such as head motions [25], body movement [21,36] and eye blinking [28]; (3) autonomic nervous system (ANS) responses such as heart rate (HR) [28,30,33,34,37], autonomic balance [21,26,32,33,34,35], skin temperature (SKT) [28], galvanic skin response (GSR) [28], respiration (RR) [26,28] and blood pressure (BP) [26]; (4) central nervous system (CNS) responses such as an electroencephalogram (EEG) spectrum [31] and functional magnetic resonance imaging (fMRI) [32].
However, each of these measurements has limitations. The subjective measurement of experiences using questionnaires can be impacted by individual differences, depending on personal interpretations and experiences [38,39], meaning that other measures are required to account for these individual differences. Measurement using behavioral responses faces the challenge of identifying a physiological mechanism for motion sickness that does not also trigger a more general response. Physiological and neurological responses such as electrocardiogram (ECG), photoplethysmography (PPG), GSR, SKT and EEG have significant disadvantages, both in the measurement burden of sensor contact with the skin and in the need for an additional device for data acquisition. In our study, a non-contact measurement of motion sickness was developed by processing the pupillary responses obtained using an infrared (IR) camera. Camera-based pupillary measurement is practically applicable in HMDs to measure motion sickness without requiring additional devices. Our previous study confirmed that motion sickness from HMDs causes significant changes in pupillary rhythms. After experiencing motion sickness, the pupillary rhythms revealed irregular patterns, with increased mean and standard deviation of the pupil diameter. This phenomenon can be interpreted as the cognitive load caused by the increasing volume of visual information and the sensory conflict [40]. The purpose of this work is to develop a real-time system that can monitor motion sickness based on new features.
The cause of motion sickness can be interpreted by the “sensory conflict theory”. According to this theory, motion sickness is caused by a conflict, or inconsistency, between different sensory modalities, such as vestibular and visual information [41]. For example, users experienced motion sickness when there was a conflict or inconsistency between the visual information of the VR content and the corresponding bodily feedback. This motion sickness has been strongly correlated with the decay of information processing in the brain, such as cognitive load or mental workload. Previous studies have reported that three-dimensional (3D) visual fatigue is related to cognitive load rather than visual discomfort or eye strain. As 3D VR imaging involves more visual information, such as depth, than 2D images, it requires greater brain capacity, or resources, to process the visual information [19,20,42,43]. In the case of motion sickness, experiencing VR content using an HMD may require more neural resources because the VR content involves more visual information than 2D content. In addition, because motion sickness is related to the inconsistency between visual and vestibular information, this phenomenon can accelerate the load of visual information processing. Thus, motion sickness is attributed to an increase in the amount of visual information to be processed and to the loss of neural resources caused by the inconsistency among different sensory inputs, which is interpreted as a high-level cognitive load.
Both physiological mechanisms (i.e., the sympathetic and parasympathetic nervous systems) and neurological mechanisms (i.e., brain functions such as memory, attention, cognition, perception and affective processing) are reflected in the pupillary rhythm via neural pathways (both afferent and efferent) between the brain and the pupil nerves [44,45,46,47,48]. In particular, among neurological mechanisms, the pupillary rhythm has been observed to be involved in cognitive functions such as cognitive load or mental workload [49,50,51,52], attention [53,54] and working memory [49,50]. Thus, this study aimed to interpret the phenomenon of motion sickness using the cognitive load mechanisms reflected by the pupillary rhythm and to develop an advanced non-contact method, via an infrared camera, for measuring motion sickness.

2. Materials and Methods

2.1. Experimental Design

Thirty-two volunteers applied for this experiment, participating in a pretask to measure their sensitivity to motion sickness. The participants focused on the VR content “Ultimate Booster Experience” (GexagonVR, 2016) through an HTC VIVE (HTC Inc., Taoyuan City, Taiwan) for 10 min and were then asked to report their motion sickness. Eight volunteers who did not experience motion sickness were excluded from the main experiment. Twenty-four healthy subjects participated in the experiments (12 females, all right-handed, average age of 24.34 ± 2.06 years). Volunteers were recruited under the following conditions: (1) normal or corrected-to-normal visual acuity (i.e., over 0.8) and (2) no medical or family history associated with visual function or the autonomic or central nervous system. They were required to get enough sleep the day before the experiment and to abstain from alcohol, cigarettes and caffeine for 12 h to minimize the negative effects on fatigue and on autonomic and central nervous function. The study was approved by the Institutional Review Board of Sangmyung University, Seoul (BE2017-21). All participants signed informed consent forms before their participation.
This study used a within-subjects design to compare the viewer’s experience of the VR content in 2D (non-motion sickness) and HMD (motion sickness) versions. Participants experienced the VR content using either the 2D or the HMD version of “NoLimits 2 Roller Coaster Simulation” (Ole Lange, Mad Data GmbH & Co. KG, Erkrath, NRW, Germany, 2014) for 15 min on the first day, and on the next day they watched the other version (e.g., HMD on the first day and 2D on the second, with the order randomized across subjects). For the 2D and HMD versions, each participant used a 27-inch LED monitor (27MP68HM, LG) and an HTC VIVE (HTC Inc., New Taipei City, Taiwan, and Valve Inc., Bellevue, WA, USA), respectively. Before and after viewing the VR content, subjective ratings based on an SSQ were collected, and the participants’ pupil images were recorded for 5 min. The subjective ratings and pupillary responses before and after the simulation were compared. The setup of the experimental procedure and environment is shown in Figure 1.
The SSQ consists of 16 items selected for motion sickness from the motion sickness questionnaire (MSQ) and categorized, using factor analysis of the relationships between SSQ items, into three non-mutually exclusive factors: nausea (N), oculomotor (O) and disorientation (D). Each factor comprises seven items. Nausea consists of general discomfort, increased salivation, sweating, nausea, difficulty concentrating, stomach awareness and burping. Oculomotor items consist of general discomfort, fatigue, headache, eyestrain, difficulty focusing, difficulty concentrating and blurred vision, and disorientation consists of difficulty focusing, nausea, fullness of head, blurred vision, dizziness (eyes open), dizziness (eyes closed) and vertigo [17]. Examples of the SSQ are shown in Appendix B. Participants reported their experience of motion sickness on a 4-point scale (0–3) for the 16 items, and the total SSQ score was calculated using Equation (1) [17], where the values of N, O and D were defined by summing the rating values of the items for nausea, oculomotor and disorientation, respectively.
Total SSQ score = {(N × 9.54) + (O × 7.58) + (D × 13.92)} × 3.74
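As a concrete illustration, Equation (1) can be computed directly from the three summed factor ratings; the function below is a minimal sketch (the function name is illustrative, not from the study).

```python
def total_ssq_score(n: int, o: int, d: int) -> float:
    """Total SSQ score per Equation (1); n, o and d are the summed
    0-3 ratings of the nausea, oculomotor and disorientation items."""
    return ((n * 9.54) + (o * 7.58) + (d * 13.92)) * 3.74
```

For example, a participant who rated every item as 0 scores 0, while higher factor sums scale the total linearly through the factor weights.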

2.2. Data Acquisition and Signal Processing

The pupil images were recorded at 30 fps with a resolution of 960 × 400 pixels using a GS3-U3-23S6M-C IR camera (Point Grey Research Inc., Richmond, BC, Canada). In addition, an infrared lamp (Genie Compact IR LED Illuminator, 30-degree, 850 nm wavelength and 20 m IR range) was used to detect the pupil area. Since changes in ambient light can affect the pupillary response, the ambient light of the experimental room was controlled from 150 to 170 lx (163.42 ± 7.14 lx), measured by a Visible Light SD Card Logger (Sper Scientific Meters Ltd., Scottsdale, AZ, USA) at a 2 Hz sampling rate. Signal processing to extract the pupillary response was conducted based on methods from previous studies [55,56,57,58], as follows. First, the input eye images (gray scale) from the IR camera were binarized using a threshold value, reported in previous studies, established using a linear regression model between the mean and maximum brightness values of the entire image [56,57,58], as shown in Equation (2).
Threshold value = (0.418 × Bmean) + (1.051 × Bmax) + 7.973
where Bmean and Bmax denote the mean and maximum brightness values of the entire gray-scale image, respectively. Second, the pupil area was detected using a circular edge detection (CED) algorithm [55], as shown in Equation (3). If multiple candidate pupil positions were detected, the position closest to the reflection of the infrared lamp was selected to accurately locate the pupil.
Max(r, x0, y0) | Gσ(r) ∂/∂r ∮(r, x0, y0) I(x, y)/(2πr) ds |
where I(x, y) indicates the gray level at position (x, y), and (x0, y0) and r represent the center position and radius of the pupil, respectively. Finally, the pupil diameter (in pixels) was extracted from the detected pupil area, as shown in Figure 2.
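To make the detection step concrete, the following is a minimal discrete sketch in the spirit of Equation (3): the gray level is averaged along circular contours of increasing radius, the radial derivative of that profile is smoothed with a Gaussian kernel, and the radius with the strongest response is taken as the pupil boundary. For brevity the center (x0, y0) is assumed known; the function name, contour sampling density and kernel width are illustrative assumptions, and a full implementation would also search over candidate centers.

```python
import numpy as np

def circular_edge_radius(image, x0, y0, radii, sigma=1.0):
    """Return the radius in `radii` that maximizes the smoothed radial
    derivative of the contour-averaged gray level around (x0, y0)."""
    h, w = image.shape
    angles = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
    profile = []
    for r in radii:
        xs = np.clip((x0 + r * np.cos(angles)).astype(int), 0, w - 1)
        ys = np.clip((y0 + r * np.sin(angles)).astype(int), 0, h - 1)
        profile.append(image[ys, xs].mean())  # mean gray level on the contour
    deriv = np.diff(profile)                  # discrete d/dr of the profile
    k = np.arange(-3, 4)
    gauss = np.exp(-(k ** 2) / (2.0 * sigma ** 2))
    gauss /= gauss.sum()                      # normalized Gaussian kernel
    response = np.convolve(deriv, gauss, mode="same")
    return radii[int(np.argmax(np.abs(response))) + 1]
```

On a synthetic image with a dark disk on a bright background, the strongest response occurs where the contour crosses the disk boundary, i.e., at the disk radius.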
The procedure of signal processing and the indicator definitions for the pupillary rhythm for detecting motion sickness are shown in Figure 2. (1) After detecting the pupil area, the pupil diameter was calculated from the number of pixels, see Figure 2A,B. (2) The pupil diameter signal was processed with a sliding moving average (a 1 s window size and a 1 s resolution), downsampling it from 30 to 1 fps to minimize the effect of eye closure; this signal is defined as the pupillary rhythm in this study, see Figure 2C. The pupil diameter was not calculated when the pupil area was not detected (i.e., when the eyes were closed). Owing to the sliding moving average, the pupil diameter could still be acquired when a blink lasted less than one second. (3) To determine whether the patterns of the pupillary rhythm were regular or irregular, the mean (mPD) and standard deviation (sPD) of the pupil diameter signal were calculated from the pupillary rhythm and defined as indicators of motion sickness, see Figure 2D. (4) The pupillary rhythm was then processed by a fast Fourier transform (FFT) with a Hanning window to extract the spectral information, see Figure 2E. (5) The ratio of the power of the dominant peak to that of the entire band was calculated and defined as the pupillary rhythm coherence (PRC) ratio, based on metrics used in previous research [59], as shown in Equation (4). A higher PRC value indicates that the pupillary rhythm is stable at a certain (dominant peak) frequency band, and vice versa. The dominant peak was identified in the range of 0–0.5 Hz and its power extracted. The total power was the sum of all power values in the range of 0–0.5 Hz, see Figure 2F.
PRC ratio = Power of Dominant Peak Band / (Power of Total Band − Power of Dominant Peak Band)
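The frequency-domain steps above can be sketched end-to-end. The snippet below computes a PRC-style ratio under two stated simplifications: the “dominant peak band” is reduced to the single peak bin, and the input is assumed to already be the 1 Hz moving-averaged pupillary rhythm; the function name is illustrative.

```python
import numpy as np

def prc_ratio(pupil_rhythm, fs=1.0):
    """Ratio of the dominant-peak power to the remaining power in
    0-0.5 Hz, after a Hanning window and FFT (cf. Equation (4))."""
    x = np.asarray(pupil_rhythm, dtype=float)
    x = x - x.mean()                          # remove DC before the FFT
    windowed = x * np.hanning(len(x))
    power = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = power[(freqs > 0) & (freqs <= 0.5)]
    peak = band.max()                         # single-bin dominant peak
    return peak / (band.sum() - peak)
```

A stable rhythm concentrated at one frequency yields a high ratio; an irregular rhythm whose power is dispersed across the band yields a low ratio.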

2.3. Statistical Analysis

This study used a within-subjects design, and the motion sickness responses of individual subjects to the 2D and HMD content were compared. Thus, in the statistical analysis, a paired t-test was selected, based on a normality test, to compare the pupillary response before and after viewing in each condition. In addition, an analysis of covariance (ANCOVA) was applied to compare the pupillary responses in the 2D and HMD conditions, because an independent t-test could not account for the viewer’s state before watching the VR content. The ANCOVA compared the dependent variables (post-viewing) between groups, with the pre-viewing baseline as a covariate [19,43,60]. A partial correlation was used to analyze the correlation between SSQ scores and pupillary responses (post-viewing) in both conditions (2D and HMD), with the pre-viewing values as covariates [61]. A Bonferroni correction was applied to address type I errors caused by multiple comparisons, with the statistical significance level adjusted for the number of individual hypotheses (i.e., α = 0.05/n) [62,63]. The significance level for the pupillary response indicators was set to 0.0167 (mPD, sPD and PRC ratio; α = 0.05/3). In addition, effect sizes were used to verify practical significance, based on Cohen’s d for t-tests and the partial eta-squared value (ηp²) for F-tests. The standard values of 0.10/0.01, 0.25/0.06 and 0.40/0.14 (Cohen’s d/partial eta-squared) are generally regarded as small, medium and large effects, respectively [64]. All statistical analyses (i.e., paired-samples t-test, ANCOVA and partial correlation) were conducted using IBM SPSS Statistics 21.0 for Windows (SPSS Inc., Chicago, IL, USA).
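The paired comparison and effect-size conventions above can be sketched as follows. This is an illustrative computation (not the study’s SPSS pipeline), using the standard paired t statistic, Cohen’s d for paired samples (mean difference over the standard deviation of the differences) and the Bonferroni-corrected alpha; the function names and any sample data are hypothetical.

```python
import numpy as np

def paired_t_and_cohens_d(pre, post):
    """Paired t statistic and Cohen's d for pre/post measurements."""
    diff = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    sd = diff.std(ddof=1)                       # sample SD of the differences
    t = diff.mean() / (sd / np.sqrt(len(diff))) # paired t statistic
    d = diff.mean() / sd                        # Cohen's d (paired)
    return t, d

def bonferroni_alpha(alpha=0.05, n_tests=3):
    """Corrected significance level, e.g., 0.05/3 for mPD, sPD and PRC."""
    return alpha / n_tests
```

Note that for paired data the t statistic equals Cohen’s d multiplied by the square root of the sample size.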

2.4. Classification

Four basic machine learning algorithms were used to classify the motion sickness (HMD) and normal (2D) states: linear discriminant analysis (LDA) (data standardization), decision tree (DT) (split criterion: maximum deviance reduction; number of splits: 4), linear kernel support vector machine (linear SVM) (data standardization, box constraint: 7.7) and radial basis function kernel support vector machine (RBF-SVM) (data standardization, box constraint: 0.09) [65,66,67,68]. Three pupillary features, namely the mPD, the sPD and the PRC ratio, were extracted from the experimental data, and these three statistically significant features were used to train the four classification algorithms on the 24-subject dataset with ten-fold cross-validation. The classification performance of the four algorithms was evaluated based on the area under the curve (AUC) of the receiver operating characteristic (ROC), accuracy, sensitivity and specificity [69,70]. A new dataset of 23 subjects (11 females), with ages ranging from 23 to 29 years (mean age 25.02 ± 3.14), was also applied to the trained classification models to evaluate practical performance. The Statistics and Machine Learning Toolbox of Matlab (2019b, Mathworks Inc., Natick, MA, USA) was used for classification and cross-validation. The classification measures are defined as follows.
  • Accuracy is used to calculate the proportion of the total number of predictions that are correct.
Accuracy (%) = (TP + TN)/(TP + FN + TN + FP) × 100
  • Sensitivity is used to measure the proportion of actual positives that are correctly identified.
Sensitivity (%) = TP/(TP + FN) × 100
  • Specificity is used to measure the proportion of actual negatives that are correctly identified.
Specificity (%) = TN/(TN + FP) × 100
  • AUC: area under the receiver operating characteristic curve. The AUC value lies between 0.5 and 1, where 0.5 denotes a bad classifier and 1 denotes an excellent classifier.
Here, TP is the number of correctly classified motion sickness (HMD) cases, FN is the number of incorrectly classified motion sickness (HMD) cases, TN is the number of correctly classified normal (2D) cases and FP is the number of incorrectly classified normal (2D) cases.
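The measures above follow directly from the confusion-matrix counts, with motion sickness (HMD) as the positive class; the sketch below computes them (the counts in the usage example are illustrative, not results from the study).

```python
def classification_measures(tp, fn, tn, fp):
    """Accuracy, sensitivity and specificity (%) from confusion counts."""
    accuracy = (tp + tn) / (tp + fn + tn + fp) * 100
    sensitivity = tp / (tp + fn) * 100  # true-positive rate
    specificity = tn / (tn + fp) * 100  # true-negative rate
    return accuracy, sensitivity, specificity
```

For instance, classification_measures(20, 4, 19, 5) yields an accuracy of 81.25%.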

3. Results

3.1. SSQ Scores

The paired-samples t-test showed a significant difference (an increase post-viewing) in the total SSQ score between the pre- and post-viewing measurements in the HMD condition (t(46) = −14.640, p = 0.0000, with a large effect size (Cohen’s d = 4.317)). In the 2D viewing condition, no significant difference was found between the pre- and post-viewing measurements (t(46) = −0.805, p = 0.4041). The ANCOVA showed a significant difference (an increase in the HMD condition) in the post-viewing total SSQ score with the pre-viewing score as a covariate (F(1,46) = 149.035, p = 0.0000, with a large effect size (ηp² = 0.768)), as shown in Figure 3.

3.2. Pupillary Response: Time Domain Index

As shown in Figure 4, the pupillary rhythms were fairly regular and stable in the pre- and post-viewing 2D conditions, with nearly identical pupil diameters. After exposure to the 2D condition, the mean pupil diameters (mPDs) for participants 1, 10 and 24 changed from 35.556, 34.853 and 37.219 to 34.004, 34.872 and 38.467 pixels, with the standard deviations (sPDs) changing from 1.276, 0.968 and 1.549 to 1.264, 1.001 and 1.332 pixels, respectively. In contrast, the participants’ pupillary rhythms were fairly regular and stable before the HMD viewing condition but became irregular and unstable afterward, with significantly increased pupil diameters. After the HMD viewing condition, the mPD values for participants 1, 10 and 24 changed from 35.168, 35.465 and 36.342 to 42.522, 44.885 and 43.620 pixels, with the sPD values changing from 1.276, 1.046 and 1.564 to 2.889, 2.649 and 2.417 pixels. Similar results were observed for most of the participants, except for participant 6, who showed no significant differences in the mPD and sPD values between the 2D and HMD viewing conditions. The mPD and sPD values for participant 6 changed from 39.902 and 0.854 to 38.200 and 0.798 pixels after the 2D condition, and from 39.543 and 0.798 to 38.899 and 0.902 pixels after the HMD condition.
A paired-samples t-test showed significant differences (increases post-viewing) in the mPD and sPD between the pre- and post-viewing measurements in the HMD condition (mPD: t(46) = −11.544, p = 0.0000, with a large effect size (Cohen’s d = 3.404); sPD: t(46) = −8.265, p = 0.0000, with a large effect size (Cohen’s d = 2.437)). In the 2D viewing condition, no significant differences were found between the pre- and post-viewing measurements (mPD: t(46) = −0.645, p = 0.5251; sPD: t(46) = −2.156, p = 0.0418) based on the Bonferroni correction. The ANCOVA showed significant differences (increases after the HMD condition) in the post-viewing mPD and sPD with the pre-viewing values as covariates (mPD: F(1,46) = 90.793, p = 0.0000, with a large effect size (ηp² = 0.669); sPD: F(1,46) = 37.248, p = 0.0000, with a large effect size (ηp² = 0.453)), as shown in Figure 5 and Appendix A.

3.3. Pupillary Response: Frequency Domain Index

As seen in Figure 6, the spectral power of the pupillary rhythms was concentrated in a specific frequency band before and after viewing in the 2D condition. For example, the PRC ratios for participants 1, 10 and 24 changed from 0.530, 0.467 and 0.579 to 0.556, 0.429 and 0.500, respectively, after experiencing the 2D condition. In contrast, in the HMD condition the spectral power of the pupillary rhythms was concentrated in a specific frequency band in the pre-viewing measurement but dispersed across the entire frequency band after viewing. The PRC ratios for participants 1, 10 and 24 changed from 0.595, 0.473 and 0.605 to 0.092, 0.043 and 0.072, respectively, after experiencing the HMD condition. These results were observed for most of the participants; however, participant 6 again showed no significant difference between the 2D and HMD viewing conditions. The PRC ratio for participant 6 changed from 0.505 to 0.481 after the 2D condition and from 0.505 to 0.476 after the HMD condition.
A paired-samples t-test showed a significant difference (a decrease post-viewing) in the PRC ratio between the pre- and post-viewing measurements in the HMD condition (t(46) = 10.483, p = 0.0000, with a large effect size (Cohen’s d = 3.091)). In the 2D viewing condition, no significant difference was found between the pre- and post-viewing measurements (t(46) = 2.341, p = 0.0283) based on the Bonferroni correction. The ANCOVA showed a significant difference (a decrease after the HMD condition) in the post-viewing PRC ratio with the pre-viewing value as a covariate (F(1,46) = 75.358, p = 0.0000, with a large effect size (ηp² = 0.629)), as shown in Figure 7 (see Table A1).

3.4. Correlation Analysis and Classification

A multiple regression analysis was conducted for the partial correlation and to calculate the residuals with the covariates (SSQ scores and pupillary responses in the pre-viewing conditions). Figure 8 plots the residuals of the SSQ scores against those of the pupillary responses (mPD, sPD and PRC ratio), with linear regression lines. The correlation coefficients between the SSQ scores and each pupillary response in the post-viewing condition were statistically significant (mPD: r = 0.751, p = 0.0000; sPD: r = 0.559, p = 0.0000; PRC ratio: r = −0.756, p = 0.0000).
Table 1 presents the classification measures (accuracy, sensitivity, specificity and AUC) for the training dataset with 10-fold cross-validation and for the new dataset, according to the four classifiers (LDA/decision tree/linear SVM/RBF SVM). ROC curves for the training and test datasets are shown in Figure 9A and Figure 9B, respectively, and in Appendix A.

3.5. Non-Contact Measurement System of Motion Sickness in Real Time

The real-time system for the non-contact measurement of motion sickness in this study consisted of an HMD, an add-on IR camera (HTC Vive Eye Tracking Add-On from Pupil Labs, 120 fps with 640 × 480 resolution), an add-on IR lamp and a personal computer for analysis; it classifies the viewer’s state as motion sickness or non-motion sickness using non-contact measurement, as shown in Figure 10. The system was developed using Visual C++ 2010 and OpenCV 2.4.3, and signal processing was performed using LabVIEW 2010 (National Instruments Inc., Austin, TX, USA). The flowchart and the non-contact real-time system for motion sickness are shown in Figure 10 and Figure 11, respectively.

4. Discussion

The aim of this study was to determine a method for measuring the motion sickness that appears as a side effect of experiencing VR content (HMD) using the pupillary rhythm and to propose a new indicator for evaluating motion sickness (high-level cognitive load). VR content from an HMD was presented to participants with the goal of causing motion sickness, and the pupillary responses of the participants were compared to the responses after a 2D experience. Participants’ responses to an SSQ confirmed their experience of motion sickness from the HMD; such confirmation verified that the changes in their pupillary response were related to motion sickness.
Overall, the study yielded two significant findings: firstly, the pupil diameters significantly increased during motion sickness. Many previous studies have reported that an increased pupil diameter is closely related to a decay in information processing by the brain [49,50,51,71,72]. The increase in pupil diameter in this study provides evidence that the experience of motion sickness is associated with physiological changes in cognitive load.
Second, the standard deviation of the pupillary rhythm significantly increased, and the PRC ratio significantly decreased after experiencing the HMD condition as compared to the 2D condition. An increase in the sPD and a decrease in the PRC ratio revealed irregular changes in pupil size due to the power of the pupillary rhythm spectrum being dispersed across various spectral bands and the deviation of the pupillary rhythms being increased. These results indicate that fluctuations in pupillary rhythms became irregular after experiencing motion sickness. In previous research, cognitive load has been related to heart rhythm patterns (HRPs) with one study reporting that increasing cognitive load leads to a pattern of irregular and unstable heart rhythms [19]. As the heart responds to external sensory inputs such as visual information transmitted to the brain through afferent pathways, cognitive processes occur not only in the brain, but also through brain–heart connectivity, which influences the cognitive function [59,73]. Pupillary rhythms (i.e., change in pupil size) are strongly affected by the regulation of the sympathetic and parasympathetic nervous system (autonomic balance) based on the contraction function of the sphincter and dilator muscles, and the autonomic balance is determined by the HRP [45,47,48,74]. If a pattern of irregular and unstable heart rhythms is related to cognitive load, the irregular rhythm of the pupil can also be interpreted as being related to cognitive load. Additionally, the pupils are known to be closely related to the central nervous system [44,45,46,47,48], and many studies have reported that they are indicators of cognitive load [49,50,51,71,72].
Changes in pupillary rhythms are correlated with functional brain processing, such as cognitive load or mental workload, attention and working memory, via neural pathways in the midbrain. Many previous studies have demonstrated that changes in pupillary rhythms are correlated with neural activity in the locus coeruleus–norepinephrine (LC–NE) system [75,76,77,78,79], the dorsal attention network (DAN) (i.e., activity in the superior colliculus and the right thalamus) [79,80,81] and the cingulate cortex [79,82]. These regions are known to be related to cognitive and attentional functions. Thus, the neural resources needed to process visual information in the brain are reflected in changes of the pupillary rhythm, and these results support the findings of an increased pupil diameter and an irregular pattern of pupillary rhythms.
From these two significant findings, the main contributions of this work can be summarized as follows: firstly, an increase in pupil diameter and an irregular rhythm of the pupil are strongly related to motion sickness, which can be interpreted as a decay in the human vision system. Many previous studies reported that 3D visual fatigue is related to the degradation of the human vision system caused by information processing rather than to visual discomfort, because 3D content involves more visual information, such as image depth, than 2D content [14,19,42,83]. Experience of VR content using HMD should also be interpreted as consuming the neural resources to process the massive visual information. Other research has shown that motion sickness from HMD devices is caused by incongruities in the spatiotemporal relationships between actions and perceptions of visual information, which can lead to distortions and delays in the visual information system [17,83]. Thus, motion sickness is related to an increase in visual information to be processed and to the loss of neural resources caused by the inconsistency or conflict among different sensory information, that is, the high-level cognitive load caused by the massive and inefficient information processing. Results show that evaluating the pupils can be an appropriate way to measure motion sickness rather than interpreting symptoms such as dizziness, fatigue and nausea.
Among the algorithms for classifying motion sickness, the RBF–SVM in this study achieved the highest average recognition accuracy (89.6% for the training set and 80.4% for the test set). To put these findings in context, this study compared its methods and results with those of past studies, whose accuracy in recognizing motion sickness ranged from 79.6–99.6% on training sets and reached 72.7% on a test set, as shown in Table 2. The majority of previous studies have reported measurement methods for motion sickness using neurophysiological responses such as the electroencephalogram (EEG); however, these methods have limitations such as complex and expensive equipment, inconvenience and the burden of sensor attachment [56,57,58]. In terms of accuracy, sample size, validation data set and usability, the proposed method outperformed existing state-of-the-art classification methods for motion sickness detection. In this way, motion sickness can be measured by an infrared webcam through a simple, low-cost and non-contact method based on pupillary rhythms.
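The classification step described above can be sketched as follows. This is a minimal illustration using scikit-learn (an assumption; the paper does not name its implementation), with synthetic stand-in values for the three pupillary features rather than the study data; the feature ranges loosely follow the direction of the reported effects.

```python
# RBF-kernel SVM on three pupillary features: mPD, sPD, PRC ratio.
# All feature values below are synthetic stand-ins, not the study data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic "normal" samples: smaller mPD/sPD, higher PRC ratio.
normal = np.column_stack([
    rng.normal(36.0, 2.0, 48),   # mPD
    rng.normal(1.2, 0.3, 48),    # sPD
    rng.normal(0.46, 0.09, 48),  # PRC ratio
])
# Synthetic "motion sickness" samples: larger mPD/sPD, lower PRC ratio.
sick = np.column_stack([
    rng.normal(43.0, 2.7, 48),
    rng.normal(2.4, 0.7, 48),
    rng.normal(0.19, 0.09, 48),
])
X = np.vstack([normal, sick])
y = np.array([0] * 48 + [1] * 48)

# Standardize the features, then fit the RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(f"training accuracy: {clf.score(X, y):.3f}")
```

Standardizing before the RBF kernel matters here because mPD is roughly an order of magnitude larger than the other two features and would otherwise dominate the kernel distance.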

5. Conclusions

The aim of this study was to develop an accurate non-contact method for detecting motion sickness using pupillary rhythms measured with an infrared webcam. This study found that motion sickness was significantly related to an irregular pattern of pupillary rhythms, as demonstrated by increasing mPDs and sPDs and a decreasing PRC ratio. These phenomena can be interpreted as a decay in visual information processing (i.e., a high-level cognitive load). In addition, for VR using HMDs, monitoring pupillary responses in real time was proven to be more appropriate than examining other behavioral responses because the user's face is covered by the device. The proposed method can be adopted to quantitatively measure motion sickness using various parameters such as gaze angle, fixation, retinal slip and field of view, and consequently improve the viewing environment of viewer-friendly VR. A list of the abbreviations used in this manuscript can be found in the Abbreviations section.

Author Contributions

Conceptualization, S.P. and L.K.; methodology, S.P. and J.H.; software, S.M. and J.H.; validation, S.P. and J.H.; investigation, S.P. and J.H.; data curation, S.M. and J.H.; writing—original draft preparation, S.P.; writing—review and editing, S.P. and L.K.; visualization, S.M. and J.H.; supervision, L.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partly supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2017-0-00432, Development of non-invasive integrated BCI SW platform to control home appliances and external devices by user’s thoughts via AR/VR interface).

Institutional Review Board Statement

This experimental study was reviewed and approved by the Institutional Review Board of Sangmyung University (approval number: BE2017-21).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

Abbreviation | Definition
HMDs | head-mounted devices
VR | virtual reality
SSQ | simulator sickness questionnaire
MSSQ | motion sickness susceptibility questionnaire
ANS | autonomic nervous system
HR | heart rate
SKT | skin temperature
GSR | galvanic skin response
RR | respiration
BP | blood pressure
CNS | central nervous system
EEG | electroencephalogram
fMRI | functional magnetic resonance imaging
ECG | electrocardiogram
PPG | photoplethysmography
IR | infrared
CED | circular edge detection
FFT | fast Fourier transform
PRC | pupillary rhythm coherence
ANCOVA | analysis of covariance
LDA | linear discriminant analysis
DT | decision tree
SVM | support vector machine
RBF–SVM | radial basis function kernel SVM
AUC | area under the curve
ROC | receiver operating characteristics
HRPs | heart rhythm patterns
LC–NE | locus coeruleus–norepinephrine
DAN | dorsal attention network
LCD | liquid crystal display
EOG | electrooculography
COP | center of pressure (force plate)
PCA | principal component analysis
SONFIN | self-organizing neural fuzzy inference network

Appendix A

Table A1. Results of the main experiments (training data set) for 24 participants.
mPD = mean of pupil diameter; sPD = standard deviation of pupil diameter; PRC ratio = pupillary rhythm coherence ratio. Each measure is reported pre- and post-viewing for the 2D and HMD conditions.

| Participant | mPD 2D pre | mPD 2D post | mPD HMD pre | mPD HMD post | sPD 2D pre | sPD 2D post | sPD HMD pre | sPD HMD post | PRC 2D pre | PRC 2D post | PRC HMD pre | PRC HMD post |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| P1 | 35.556 | 34.004 | 35.168 | 42.522 | 1.276 | 1.264 | 1.276 | 2.889 | 0.530 | 0.556 | 0.595 | 0.092 |
| P2 | 37.222 | 38.200 | 36.483 | 45.101 | 1.337 | 1.440 | 1.172 | 2.353 | 0.288 | 0.309 | 0.295 | 0.177 |
| P3 | 37.910 | 35.546 | 35.984 | 46.919 | 0.766 | 0.815 | 1.011 | 2.735 | 0.504 | 0.578 | 0.483 | 0.220 |
| P4 | 35.219 | 34.681 | 33.051 | 43.558 | 1.385 | 1.164 | 1.346 | 2.495 | 0.471 | 0.519 | 0.472 | 0.275 |
| P5 | 36.593 | 36.259 | 35.295 | 46.890 | 1.228 | 1.321 | 1.266 | 4.679 | 0.308 | 0.298 | 0.381 | 0.203 |
| P6 | 39.902 | 38.200 | 39.543 | 38.899 | 0.854 | 1.034 | 0.798 | 1.002 | 0.505 | 0.481 | 0.505 | 0.476 |
| P7 | 35.779 | 34.193 | 34.413 | 41.298 | 1.133 | 1.247 | 1.235 | 2.006 | 0.526 | 0.531 | 0.561 | 0.207 |
| P8 | 38.209 | 39.079 | 37.762 | 45.238 | 1.069 | 0.950 | 0.909 | 2.555 | 0.396 | 0.375 | 0.348 | 0.061 |
| P9 | 35.034 | 36.221 | 36.716 | 45.841 | 1.157 | 1.263 | 1.308 | 2.385 | 0.391 | 0.379 | 0.361 | 0.151 |
| P10 | 34.853 | 34.872 | 35.465 | 44.885 | 0.968 | 1.001 | 1.046 | 2.649 | 0.467 | 0.429 | 0.473 | 0.043 |
| P11 | 32.706 | 38.702 | 32.450 | 40.274 | 1.265 | 2.817 | 1.326 | 2.329 | 0.294 | 0.189 | 0.305 | 0.142 |
| P12 | 36.480 | 36.091 | 37.137 | 41.481 | 1.374 | 1.469 | 1.399 | 3.503 | 0.535 | 0.510 | 0.525 | 0.223 |
| P13 | 40.176 | 38.140 | 40.469 | 43.943 | 0.903 | 1.089 | 0.906 | 3.237 | 0.407 | 0.384 | 0.414 | 0.315 |
| P14 | 34.963 | 35.202 | 34.057 | 45.615 | 1.619 | 1.675 | 1.641 | 2.229 | 0.461 | 0.421 | 0.465 | 0.106 |
| P15 | 39.275 | 38.859 | 37.092 | 45.010 | 1.657 | 1.799 | 1.855 | 2.130 | 0.399 | 0.350 | 0.387 | 0.128 |
| P16 | 36.857 | 36.437 | 35.127 | 41.716 | 0.733 | 0.789 | 0.501 | 2.015 | 0.431 | 0.413 | 0.455 | 0.207 |
| P17 | 35.758 | 37.403 | 35.374 | 40.396 | 1.051 | 1.339 | 1.062 | 1.987 | 0.504 | 0.461 | 0.505 | 0.247 |
| P18 | 40.505 | 39.580 | 38.098 | 44.726 | 1.523 | 1.844 | 1.246 | 2.493 | 0.505 | 0.485 | 0.499 | 0.206 |
| P19 | 39.858 | 41.734 | 38.038 | 44.355 | 1.368 | 1.468 | 1.211 | 1.336 | 0.613 | 0.502 | 0.614 | 0.156 |
| P20 | 32.395 | 33.270 | 34.293 | 40.858 | 1.148 | 1.437 | 1.073 | 2.162 | 0.525 | 0.596 | 0.527 | 0.271 |
| P21 | 33.973 | 38.319 | 34.214 | 36.064 | 0.743 | 1.927 | 0.941 | 2.104 | 0.297 | 0.151 | 0.296 | 0.115 |
| P22 | 39.456 | 36.281 | 39.079 | 44.876 | 0.995 | 0.979 | 0.764 | 2.055 | 0.580 | 0.531 | 0.564 | 0.149 |
| P23 | 33.016 | 35.834 | 35.664 | 40.332 | 1.280 | 1.120 | 1.215 | 1.758 | 0.384 | 0.348 | 0.381 | 0.196 |
| P24 | 37.219 | 38.467 | 36.342 | 43.620 | 1.549 | 1.332 | 1.564 | 2.417 | 0.579 | 0.500 | 0.605 | 0.072 |
| Avg. | 36.621 | 36.899 | 36.138 | 43.101 | 1.183 | 1.357 | 1.170 | 2.396 | 0.454 | 0.429 | 0.459 | 0.185 |
| SD | 2.373 | 2.020 | 1.968 | 2.662 | 0.266 | 0.427 | 0.289 | 0.704 | 0.093 | 0.113 | 0.096 | 0.092 |
| SE | 0.484 | 0.412 | 0.402 | 0.543 | 0.054 | 0.087 | 0.059 | 0.144 | 0.019 | 0.023 | 0.020 | 0.019 |
Table A2. Results of the validation experiments (test data set) for 23 participants.
mPD = mean of pupil diameter; sPD = standard deviation of pupil diameter; PRC ratio = pupillary rhythm coherence ratio. Each measure is reported pre- and post-viewing for the 2D and HMD conditions.

| Participant | mPD 2D pre | mPD 2D post | mPD HMD pre | mPD HMD post | sPD 2D pre | sPD 2D post | sPD HMD pre | sPD HMD post | PRC 2D pre | PRC 2D post | PRC HMD pre | PRC HMD post |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| P1 | 33.241 | 34.062 | 34.142 | 39.643 | 1.144 | 1.126 | 1.212 | 1.872 | 0.512 | 0.506 | 0.492 | 0.212 |
| P2 | 35.112 | 35.463 | 35.412 | 38.461 | 1.241 | 1.316 | 1.301 | 1.442 | 0.442 | 0.412 | 0.516 | 0.336 |
| P3 | 34.024 | 38.127 | 34.781 | 37.034 | 0.972 | 1.107 | 1.042 | 1.136 | 0.372 | 0.320 | 0.416 | 0.310 |
| P4 | 39.142 | 40.032 | 38.162 | 40.372 | 0.892 | 1.142 | 0.982 | 1.047 | 0.518 | 0.242 | 0.502 | 0.464 |
| P5 | 34.117 | 35.012 | 35.172 | 42.174 | 1.112 | 1.146 | 1.047 | 1.562 | 0.482 | 0.446 | 0.572 | 0.312 |
| P6 | 36.117 | 36.042 | 36.047 | 41.174 | 1.412 | 1.406 | 1.362 | 1.745 | 0.502 | 0.460 | 0.482 | 0.141 |
| P7 | 33.192 | 35.174 | 34.562 | 35.149 | 0.872 | 1.121 | 0.946 | 1.088 | 0.432 | 0.116 | 0.502 | 0.420 |
| P8 | 35.002 | 35.176 | 35.722 | 39.874 | 0.972 | 1.002 | 1.042 | 1.392 | 0.616 | 0.548 | 0.582 | 0.333 |
| P9 | 38.162 | 38.472 | 37.663 | 42.164 | 1.212 | 1.244 | 1.198 | 1.562 | 0.441 | 0.402 | 0.482 | 0.206 |
| P10 | 33.205 | 37.624 | 34.182 | 38.114 | 1.107 | 1.192 | 0.982 | 1.399 | 0.482 | 0.441 | 0.522 | 0.304 |
| P11 | 36.104 | 38.142 | 35.066 | 42.179 | 1.206 | 1.227 | 1.117 | 1.824 | 0.642 | 0.612 | 0.663 | 0.318 |
| P12 | 34.112 | 34.922 | 35.016 | 40.327 | 1.466 | 1.432 | 1.372 | 1.590 | 0.381 | 0.388 | 0.415 | 0.224 |
| P13 | 33.004 | 35.132 | 34.142 | 36.832 | 1.112 | 1.414 | 1.032 | 1.201 | 0.382 | 0.282 | 0.444 | 0.389 |
| P14 | 37.109 | 40.442 | 36.492 | 39.032 | 1.142 | 1.166 | 1.002 | 1.312 | 0.446 | 0.412 | 0.496 | 0.182 |
| P15 | 35.198 | 35.642 | 34.824 | 40.006 | 0.982 | 1.001 | 1.004 | 1.414 | 0.382 | 0.396 | 0.412 | 0.264 |
| P16 | 36.121 | 36.897 | 35.002 | 39.446 | 0.876 | 0.924 | 0.882 | 1.221 | 0.702 | 0.664 | 0.641 | 0.344 |
| P17 | 34.102 | 35.116 | 34.442 | 38.806 | 1.116 | 1.202 | 1.241 | 1.493 | 0.442 | 0.476 | 0.492 | 0.218 |
| P18 | 38.166 | 38.442 | 38.264 | 44.162 | 0.882 | 0.896 | 0.942 | 1.411 | 0.392 | 0.364 | 0.382 | 0.164 |
| P19 | 32.172 | 36.442 | 33.008 | 36.172 | 0.924 | 1.232 | 1.015 | 1.128 | 0.442 | 0.206 | 0.446 | 0.312 |
| P20 | 36.045 | 38.323 | 35.414 | 37.032 | 1.142 | 1.362 | 1.624 | 1.832 | 0.366 | 0.227 | 0.516 | 0.412 |
| P21 | 37.122 | 37.462 | 36.882 | 42.187 | 1.112 | 1.146 | 0.986 | 1.387 | 0.392 | 0.402 | 0.422 | 0.232 |
| P22 | 34.562 | 35.032 | 35.003 | 39.556 | 0.941 | 1.002 | 1.065 | 1.475 | 0.485 | 0.444 | 0.396 | 0.246 |
| P23 | 38.022 | 38.146 | 37.068 | 42.323 | 0.877 | 0.902 | 0.911 | 1.302 | 0.396 | 0.442 | 0.472 | 0.231 |
| Avg. | 35.354 | 36.753 | 35.499 | 39.662 | 1.075 | 1.161 | 1.100 | 1.428 | 0.463 | 0.400 | 0.490 | 0.286 |
| SD | 1.892 | 1.745 | 1.331 | 2.240 | 0.163 | 0.156 | 0.176 | 0.233 | 0.087 | 0.125 | 0.071 | 0.084 |
| SE | 0.386 | 0.356 | 0.272 | 0.457 | 0.033 | 0.032 | 0.036 | 0.048 | 0.018 | 0.026 | 0.015 | 0.017 |

Appendix B. (Example of Simulator Sickness Questionnaire, Kennedy et al., 1993)

No__________________ Date_________________
          SIMULATOR SICKNESS QUESTIONNAIRE
  Robert S. Kennedy, Norman E. Lane, Kevin S. Berbaum and Michael G. Lilienthal (1993)
Instruction: Circle how much each symptom below is affecting you right now.
1. General discomfort: None / Slight / Moderate / Severe
2. Fatigue: None / Slight / Moderate / Severe
3. Headache: None / Slight / Moderate / Severe
4. Eyestrain: None / Slight / Moderate / Severe
5. Difficulty focusing: None / Slight / Moderate / Severe
6. Increased salivation: None / Slight / Moderate / Severe
7. Sweating: None / Slight / Moderate / Severe
8. Nausea: None / Slight / Moderate / Severe
9. Difficulty concentrating: None / Slight / Moderate / Severe
10. Fullness of head: None / Slight / Moderate / Severe
11. Blurred vision: None / Slight / Moderate / Severe
12. Dizziness with eyes open: None / Slight / Moderate / Severe
13. Dizziness with eyes closed: None / Slight / Moderate / Severe
14. Vertigo: None / Slight / Moderate / Severe
15. Stomach awareness: None / Slight / Moderate / Severe
16. Burping: None / Slight / Moderate / Severe
Original version: Kennedy, R.S.; Lane, N.E.; Berbaum, K.S.; Lilienthal, M.G. Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. Int. J. Aviat. Psychol. 1993, 3, 203–220.

References

1. Pan, Z.; Cheok, A.D.; Yang, H.; Zhu, J.; Shi, J. Virtual reality and mixed reality for virtual learning environments. Comput. Graph. 2006, 30, 20–28.
2. Nee, A.; Ong, S.K. Virtual and Augmented Reality Applications in Manufacturing; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2013.
3. Kesim, M.; Ozarslan, Y. Augmented Reality in Education: Current Technologies and the Potential for Education. Procedia Soc. Behav. Sci. 2012, 47, 297–302.
4. Riva, G. Virtual reality: An experiential tool for clinical psychology. Br. J. Guid. Couns. 2009, 37, 337–345.
5. Raajan, N.; Suganya, S.; Priya, M.; Ramanan, S.V.; Janani, S.; Nandini, N.S.; Hemanand, R.; Gayathri, S. Augmented Reality Based Virtual Reality. Procedia Eng. 2012, 38, 1559–1565.
6. Kim, C.J.; Park, S.; Won, M.J.; Whang, M.; Lee, E.C. Autonomic Nervous System Responses Can Reveal Visual Fatigue Induced by 3D Displays. Sensors 2013, 13, 13054–13062.
7. Kim, M.; Jeon, C.; Kim, J. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality. Sensors 2017, 17, 1141.
8. Kennedy, R.S.; Drexler, J. Research in visually induced motion sickness. Appl. Ergon. 2010, 41, 494–503.
9. Van Krevelen, D.; Poelman, R. A Survey of Augmented Reality Technologies, Applications and Limitations. Int. J. Virtual Real. 2010, 9, 1–20.
10. Naqvi, S.A.A.; Badruddin, N.; Malik, A.; Hazabbah, W.; Abdullah, B. Does 3D produce more symptoms of visually induced motion sickness? In Proceedings of the 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 6405–6408.
11. Rebenitsch, L.; Owen, C.B. Review on cybersickness in applications and visual displays. Virtual Real. 2016, 20, 101–125.
12. Clifton, J.; Palmisano, S. Effects of steering locomotion and teleporting on cybersickness and presence in HMD-based virtual reality. Virtual Real. 2020, 24, 453–468.
13. Sharples, S.; Cobb, S.; Moody, A.; Wilson, J.R. Virtual reality induced symptoms and effects (VRISE): Comparison of head mounted display (HMD), desktop and projection display systems. Displays 2008, 29, 58–69.
14. Lambooij, M.M.; Ijsselsteijn, W.W.; Fortuin, M.M.; Heynderickx, I.I. Visual Discomfort and Visual Fatigue of Stereoscopic Displays: A Review. J. Imaging Sci. Technol. 2009, 53.
15. Bouchard, S.; Robillard, G.; Renaud, P.; Bernier, F. Exploring new dimensions in the assessment of virtual reality induced side effects. J. Comput. Inf. Technol. 2011, 1, 20–32.
16. Carnegie, K.; Rhee, T. Reducing Visual Discomfort with HMDs Using Dynamic Depth of Field. IEEE Eng. Med. Biol. Mag. 2015, 35, 34–41.
17. Kennedy, R.S.; Lane, N.E.; Berbaum, K.; Lilienthal, M.G. Simulator Sickness Questionnaire: An Enhanced Method for Quantifying Simulator Sickness. Int. J. Aviat. Psychol. 1993, 3, 203–220.
18. Lee, E.C.; Heo, H.; Park, K.R. The comparative measurements of eyestrain caused by 2D and 3D displays. IEEE Trans. Consum. Electron. 2010, 56, 1677–1683.
19. Park, S.; Won, M.; Mun, S.; Lee, E.; Whang, M. Does visual fatigue from 3D displays affect autonomic regulation and heart rhythm? Int. J. Psychophysiol. 2014, 92, 42–48.
20. Park, S.; Won, M.J.; Lee, E.C.; Mun, S.; Park, M.-C.; Whang, M. Evaluation of 3D cognitive fatigue using heart–brain synchronization. Int. J. Psychophysiol. 2015, 97, 120–130.
21. Yokota, Y.; Aoki, M.; Mizuta, K.; Ito, Y.; Isu, N. Motion sickness susceptibility associated with visually induced postural instability and cardiac autonomic responses in healthy subjects. Acta Otolaryngol. 2005, 125, 280–285.
22. Diels, C.; Ukai, K.; Howarth, P.A. Visually induced motion sickness with radial displays: Effects of gaze angle and fixation. Aviat. Space Environ. Med. 2007, 78, 659–665.
23. Bos, J.E.; de Vries, S.C.; van Emmerik, M.L.; Groen, E.L. The effect of internal and external fields of view on visually induced motion sickness. Appl. Ergon. 2010, 41, 516–521.
24. Moss, J.D.; Muth, E.R. Characteristics of Head-Mounted Displays and Their Effects on Simulator Sickness. Hum. Factors J. Hum. Factors Ergon. Soc. 2011, 53, 308–319.
25. Merhi, O.; Faugloire, E.; Flanagan, M.; Stoffregen, T.A. Motion sickness, console video games, and head-mounted displays. Hum. Factors J. Hum. Factors Ergon. Soc. 2007, 49, 920–934.
26. Kiryu, T.; Tada, G.; Toyama, H.; Iijima, A. Integrated evaluation of visually induced motion sickness in terms of autonomic nervous regulation. In Proceedings of the 2008 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–24 August 2008; pp. 4597–4600.
27. Palmisano, S.; Mursic, R.; Kim, J. Vection and cybersickness generated by head-and-display motion in the Oculus Rift. Displays 2017, 46, 1–8.
28. Kim, Y.Y.; Kim, H.J.; Kim, E.N.; Ko, H.D. Characteristic changes in the physiological components of cybersickness. Psychophysiology 2005, 42, 616–625.
29. Lin, C.-T.; Tsai, S.-F.; Ko, L.-W. EEG-Based Learning System for Online Motion Sickness Level Estimation in a Dynamic Vehicle Environment. IEEE Trans. Neural Netw. Learn. Syst. 2013, 24, 1689–1700.
30. Nalivaiko, E.; Davis, S.L.; Blackmore, K.; Vakulin, A.; Nesbitt, K. Cybersickness provoked by head-mounted display affects cutaneous vascular tone, heart rate and reaction time. Physiol. Behav. 2015, 151, 583–590.
31. Chuang, S.-W.; Chuang, C.-H.; Yu, Y.-H.; King, J.-T.; Lin, C.-T. EEG Alpha and Gamma Modulators Mediate Motion Sickness-Related Spectral Responses. Int. J. Neural Syst. 2016, 26, 1650007.
32. Toschi, N.; Kim, J.; Sclocco, R.; Duggento, A.; Barbieri, R.; Kuo, B.; Napadow, V. Motion sickness increases functional connectivity between visual motion and nausea-associated brain regions. Auton. Neurosci. 2017, 202, 108–113.
33. Zużewicz, K.; Saulewicz, A.; Konarska, M.; Kaczorowski, Z. Heart Rate Variability and Motion Sickness During Forklift Simulator Driving. Int. J. Occup. Saf. Ergon. 2011, 17, 403–410.
34. Malińska, M.; Zużewicz, K.; Bugajska, J.; Grabowski, A. Heart rate variability (HRV) during virtual reality immersion. Int. J. Occup. Saf. Ergon. 2015, 21, 47–54.
35. Ohyama, S.; Nishiike, S.; Watanabe, H.; Matsuoka, K.; Akizuki, H.; Takeda, N.; Harada, T. Autonomic responses during motion sickness induced by virtual reality. Auris Nasus Larynx 2007, 34, 303–306.
36. Chardonnet, J.-R.; Mirzaei, M.A.; Mérienne, F. Features of the Postural Sway Signal as Indicators to Estimate and Predict Visually Induced Motion Sickness in Virtual Reality. Int. J. Hum. Comput. Interact. 2017, 33, 771–785.
37. Gianaros, P.J.; Quigley, K.S.; Muth, E.R.; Levine, M.E.; Vasko, J.R.C.; Stern, R.M. Relationship between temporal changes in cardiac parasympathetic activity and motion sickness severity. Psychophysiology 2003, 40, 39–44.
38. Annett, J. Subjective rating scales: Science or art? Ergonomics 2002, 45, 966–987.
39. Cain, B. A Review of the Mental Workload Literature; Defence Research and Development Toronto: Toronto, ON, Canada, 2007.
40. Park, S.; Lee, D.W.; Mun, S.; Kim, H.-I.; Whang, M. Effect of Simulator Sickness Caused by Head-mounted Display on the Stability of the Pupillary Rhythm. Korean Soc. Emot. Sensib. 2018, 21, 43–54.
41. Oman, C.M. Motion sickness: A synthesis and evaluation of the sensory conflict theory. Can. J. Physiol. Pharmacol. 1990, 68, 294–303.
42. Mun, S.; Park, M.-C.; Park, S.; Whang, M. SSVEP and ERP measurement of cognitive fatigue caused by stereoscopic 3D. Neurosci. Lett. 2012, 525, 89–94.
43. Mun, S.; Kim, E.-S.; Park, M.-C. Effect of mental fatigue caused by mobile 3D viewing on selective attention: An ERP study. Int. J. Psychophysiol. 2014, 94, 373–381.
44. Fotiou, F.; Fountoulakis, K.; Tsolaki, M.; Goulas, A.; Palikaras, A. Changes in pupil reaction to light in Alzheimer’s disease patients: A preliminary report. Int. J. Psychophysiol. 2000, 37, 111–120.
45. Partala, T.; Surakka, V. Pupil size variation as an indication of affective processing. Int. J. Hum. Comput. Stud. 2003, 59, 185–198.
46. Kojima, M.; Shioiri, T.; Hosoki, T.; Kitamura, H.; Bando, T.; Someya, T. Pupillary light reflex in panic disorder. Eur. Arch. Psychiatry Clin. Neurosci. 2004, 254, 242–244.
47. Kozicz, T.; Bittencourt, J.; May, P.J.; Reiner, A.; Gamlin, P.D.; Palkovits, M.; Horn, A.; Toledo, C.A.; Ryabinin, A.E. The Edinger-Westphal nucleus: A historical, structural, and functional perspective on a dichotomous terminology. J. Comp. Neurol. 2011, 519, 1413–1434.
48. Júnior, E.D.D.S.; Da Silva, A.V.; Da Silva, K.R.; Haemmerle, C.; Batagello, D.S.; Da Silva, J.M.; Lima, L.B.; Da Silva, R.J.; Diniz, G.; Sita, L.V.; et al. The centrally projecting Edinger–Westphal nucleus—I: Efferents in the rat brain. J. Chem. Neuroanat. 2015, 68, 22–38.
49. Just, M.; Carpenter, P.A. The intensity dimension of thought: Pupillometric indices of sentence processing. Can. J. Exp. Psychol. 1993, 47, 310–339.
50. Just, M.A.; Carpenter, P.A.; Keller, T.A.; Eddy, W.F.; Thulborn, K.R. Brain activation modulated by sentence comprehension. Science 1996, 274, 114–116.
51. Klingner, J.; Kumar, R.; Hanrahan, P. Measuring the task-evoked pupillary response with a remote eye tracker. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications—ETRA ’08, Savannah, GA, USA, 26–28 March 2008; pp. 69–72.
52. Jimenez-Molina, A.; Retamal, C.; Lira, H. Using Psychophysiological Sensors to Assess Mental Workload During Web Browsing. Sensors 2018, 18, 458.
53. Hink, R.; Van Voorhis, S.; Hillyard, S.; Smith, T. The division of attention and the human auditory evoked potential. Neuropsychologia 1977, 15, 597–605.
54. Beatty, J. Pupillometric signs of selective attention in man. In Neurophysiology and Psychophysiology: Experimental and Clinical Applications; Academic Press: New York, NY, USA, 1988; pp. 138–143.
55. Lee, E.C.; Park, K.R.; Whang, M.; Min, K. Measuring the degree of eyestrain caused by watching LCD and PDP devices. Int. J. Ind. Ergon. 2009, 39, 798–806.
56. Park, S.; Whang, M. Infrared Camera-Based Non-contact Measurement of Brain Activity from Pupillary Rhythms. Front. Physiol. 2018, 9, 1400.
57. Park, S.; Won, M.J.; Lee, D.W.; Whang, M. Non-contact measurement of heart response reflected in human eye. Int. J. Psychophysiol. 2018, 123, 179–198.
58. Park, S.; Mun, S.; Lee, D.W.; Whang, M. IR-camera-based measurements of 2D/3D cognitive fatigue in 2D/3D display system using task-evoked pupillary response. Appl. Opt. 2019, 58, 3467–3480.
59. McCraty, R.; Atkinson, M.; Tomasino, D.; Bradley, R.T. The coherent heart: Heart-brain interactions, psychophysiological coherence, and the emergence of system-wide order. Integral Rev. 2009, 5, 10–115.
60. Keselman, H.J.; Huberty, C.J.; Lix, L.M.; Olejnik, S.; Cribbie, R.A.; Donahue, B.; Kowalchuk, R.K.; Lowman, L.L.; Petoskey, M.D.; Keselman, J.C.; et al. Statistical Practices of Educational Researchers: An Analysis of their ANOVA, MANOVA, and ANCOVA Analyses. Rev. Educ. Res. 1998, 68, 350–386.
61. Jung, H.; Kim, H.S.; Kim, J.Y.; Sun, J.-M.; Ahn, J.S.; Ahn, M.-J.; Park, K.; Esteller, M.; Lee, S.-H.; Choi, J.K. DNA methylation loss promotes immune evasion of tumours with high mutation and copy number load. Nat. Commun. 2019, 10, 4278.
62. Dunnett, C.W. A multiple comparison procedure for comparing several treatments with a control. J. Am. Stat. Assoc. 1955, 50, 1096–1121.
63. Armstrong, R.A. When to use the Bonferroni correction. Ophthalmic Physiol. Opt. 2014, 34, 502–508.
64. Huck, S.W.; Cormier, W.H.; Bounds, W.G. Reading Statistics and Research; Harper & Row: New York, NY, USA, 1974.
65. Narsky, I.; Porter, F.C. Statistical Analysis Techniques in Particle Physics; Wiley: Hoboken, NJ, USA, 2013.
66. Myszczynska, M.A.; Ojamies, P.N.; Lacoste, A.M.B.; Neil, D.; Saffari, A.; Mead, R.; Hautbergue, G.M.; Holbrook, J.D.; Ferraiuolo, L. Applications of machine learning to diagnosis and treatment of neurodegenerative diseases. Nat. Rev. Neurol. 2020, 16, 440–456.
67. Alloghani, M.; Al-Jumeily, D.; Hussain, A.; Liatsis, P.; Aljaaf, A. Performance-Based Prediction of Chronic Kidney Disease Using Machine Learning for High-Risk Cardiovascular Disease Patients. In Nature-Inspired Computation in Data Mining and Machine Learning; Studies in Computational Intelligence; Springer: Berlin/Heidelberg, Germany, 2020; pp. 187–206.
68. Lindholm, A.; Wahlström, N.; Lindsten, F.; Schön, T.B. Supervised machine learning. 2020, unpublished.
69. James, G.; Witten, D.; Hastie, T.; Tibshirani, R. An Introduction to Statistical Learning; Springer: Cham, Switzerland, 2013; Volume 112.
70. Saito, T.; Rehmsmeier, M. Precrec: Fast and accurate precision–recall and ROC curve calculations in R. Bioinformatics 2017, 33, 145–147.
71. Kahneman, D.; Beatty, J. Pupil Diameter and Load on Memory. Science 1966, 154, 1583–1585.
72. Ahern, S.; Beatty, J. Pupillary responses during information processing vary with Scholastic Aptitude Test scores. Science 1979, 205, 1289–1292.
73. Hansen, A.L.; Johnsen, B.H.; Thayer, J.F. Vagal influence on working memory and attention. Int. J. Psychophysiol. 2003, 48, 263–274.
74. McGuigan, F.J.; Andreassi, J.L. Psychophysiology—Human Behavior and Physiological Response. Am. J. Psychol. 1981, 94, 359.
75. Gabay, S.; Pertzov, Y.; Henik, A. Orienting of attention, pupil size, and the norepinephrine system. Atten. Percept. Psychophys. 2010, 73, 123–129.
76. Geva, R.; Zivan, M.; Warsha, A.; Olchik, D. Alerting, orienting or executive attention networks: Differential patterns of pupil dilations. Front. Behav. Neurosci. 2013, 7, 145.
77. Murphy, P.R.; O’Connell, R.G.; O’Sullivan, M.; Robertson, I.H.; Balsters, J.H. Pupil diameter covaries with BOLD activity in human locus coeruleus. Hum. Brain Mapp. 2014, 35, 4140–4154.
78. Hong, L.; Walz, J.; Sajda, P. Your Eyes Give You Away: Prestimulus Changes in Pupil Diameter Correlate with Poststimulus Task-Related EEG Dynamics. PLoS ONE 2014, 9, e91321.
79. Joshi, S.; Li, Y.; Kalwani, R.M.; Gold, J.I. Relationships between Pupil Diameter and Neuronal Activity in the Locus Coeruleus, Colliculi, and Cingulate Cortex. Neuron 2016, 89, 221–234.
80. Alnaes, D.; Sneve, M.H.; Espeseth, T.; Endestad, T.; Van De Pavert, S.H.P.; Laeng, B. Pupil size signals mental effort deployed during multiple object tracking and predicts brain activity in the dorsal attention network and the locus coeruleus. J. Vis. 2014, 14, 1.
81. Wang, C.-A.J.; Munoz, D.P. A circuit for pupil orienting responses: Implications for cognitive modulation of pupil size. Curr. Opin. Neurobiol. 2015, 33, 134–140.
82. Ebitz, R.B.; Platt, M.L. Neuronal Activity in Primate Dorsal Anterior Cingulate Cortex Signals Task Conflict and Predicts Adjustments in Pupil-Linked Arousal. Neuron 2015, 85, 628–640.
83. Mittelstaedt, J.M.; Wacker, J.; Stelling, D. VR aftereffect and the relation of cybersickness and cognitive performance. Virtual Real. 2019, 23, 143–154.
Figure 1. The experimental procedure and environment.
Figure 2. Signal processing for detecting motion sickness from the pupillary rhythm. (A) Procedure to detect the pupil area: (a) a raw image (gray scale) from the IR camera; (b) the binarized image based on an automatic threshold; (c) detection of the light reflected from the infrared lamp; (d) detection of the pupil area using the CED algorithm. (B) Pupil diameter signals at 30 fps. (C) Resampled pupil diameter (pupillary rhythm) at 1 Hz based on a sliding moving average (window size: 30 frames; resolution: 30 frames). (D) Definition of the mean and standard deviation (SD) of pupil diameter. (E) Spectral signals of the pupillary rhythm using fast Fourier transform (FFT) analysis. (F) Definition of pupillary rhythm coherence (PRC).
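The Figure 2 pipeline (panels B–F) can be sketched in a few lines of NumPy: resample the 30 fps pupil-diameter signal to 1 Hz with a moving average, take the mean (mPD) and standard deviation (sPD), and derive a spectral coherence measure with the FFT. Note that the PRC ratio below is formalized as the power around the dominant non-DC spectral peak divided by total non-DC power; this is an illustrative assumption, as the paper's exact definition is given in its Methods rather than restated here.

```python
# Sketch of pupillary-rhythm feature extraction (mPD, sPD, PRC ratio).
import numpy as np

def pupil_features(diam_30fps, fps=30):
    x = np.asarray(diam_30fps, dtype=float)
    # Resample to 1 Hz: average over consecutive 30-frame (1 s) windows.
    n = len(x) // fps
    rhythm = x[: n * fps].reshape(n, fps).mean(axis=1)
    mpd = rhythm.mean()                      # mean pupil diameter
    spd = rhythm.std()                       # SD of pupil diameter
    # Power spectrum of the de-meaned 1 Hz pupillary rhythm.
    power = np.abs(np.fft.rfft(rhythm - mpd)) ** 2
    peak = np.argmax(power[1:]) + 1          # dominant non-DC bin
    lo, hi = max(peak - 1, 1), min(peak + 1, len(power) - 1)
    prc_ratio = power[lo:hi + 1].sum() / power[1:].sum()
    return mpd, spd, prc_ratio
```

A regular (coherent) rhythm concentrates its power near a single peak and yields a ratio near 1, whereas the irregular rhythms observed after motion sickness spread power across the spectrum and push the ratio toward 0, matching the direction of the reported PRC results.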
Figure 3. Representation of the total SSQ scores for motion sickness between the 2D and HMD conditions. There was a significant difference based on a paired t-test and ANCOVA (*** p < 0.001). (a) The ANCOVA test between the 2D and HMD viewing condition. (b) A paired t-test between pre- and post-viewing in the HMD condition.
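The pre/post comparisons reported in Figures 3, 5 and 7 rest on paired (repeated-measures) t-tests. A minimal illustration with SciPy, using made-up SSQ-like scores rather than the study data:

```python
# Paired t-test on synthetic pre-/post-viewing scores (not study data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre = rng.normal(10.0, 3.0, 24)          # scores before HMD viewing
post = pre + rng.normal(25.0, 8.0, 24)   # scores after: clearly elevated

t, p = stats.ttest_rel(post, pre)        # paired test on the same subjects
print(f"t = {t:.2f}, p = {p:.2e}")
```

The paired form is the right choice here because each participant serves as their own control, removing between-subject variability from the comparison.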
Figure 4. (A) Clear examples (for participants 1, 10 and 24) and (B) unclear example (for participant 6) of changes in pupillary rhythms (mPD and sPD) pre- and post-viewing 2D and HMD.
Figure 5. Representation of the mPD and sPD for motion sickness between the 2D and HMD conditions. There was a significant difference based on a paired t-test and ANCOVA (* p < 0.05; *** p < 0.001). (a) The ANCOVA test between the 2D and HMD viewing condition. (b) A paired t-test between pre- and post-viewing in the 2D condition. (c) A paired t-test between pre- and post-viewing in the HMD condition.
Figure 6. (A) Clear (for participants 1, 10 and 24) and (B) unclear examples (for participant 6) of changes in the spectrum of pupillary rhythms (PRC ratio) pre- and post-viewing 2D and HMD.
Figure 7. Representation of the PRC ratio for motion sickness between the 2D and HMD conditions. There was a significant difference based on a paired t-test and ANCOVA (* p < 0.05; *** p < 0.001). (a) The ANCOVA test between the 2D and HMD viewing condition. (b) A paired t-test between pre- and post-viewing in the 2D condition. (c) A paired t-test between pre- and post-viewing in the HMD condition.
Figure 7. Representation of the PRC ratio for motion sickness between the 2D and HMD conditions. There was a significant difference based on a paired t-test and ANCOVA (* p < 0.05; *** p < 0.001). (a) The ANCOVA test between the 2D and HMD viewing condition. (b) A paired t-test between pre- and post-viewing in the 2D condition. (c) A paired t-test between pre- and post-viewing in the HMD condition.
Figure 8. Results of the correlation analysis between SSQ scores and significant features of pupillary rhythms.
Figure 9. Receiver operating characteristic (ROC) curves for the (A) training dataset and (B) test dataset for the four classifiers.
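The AUC values behind these ROC comparisons can be reproduced directly from raw classifier scores. Below is a minimal numpy-only sketch of the rank-based (Mann-Whitney) AUC estimate; the labels and scores are synthetic examples, not the study's data:

```python
import numpy as np

def auc_score(labels, scores):
    """AUC as the probability that a randomly chosen positive case
    scores above a randomly chosen negative case (ties count 0.5)."""
    labels = np.asarray(labels)
    scores = np.asarray(scores)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = np.sum(pos[:, None] > neg[None, :])
    ties = np.sum(pos[:, None] == neg[None, :])
    return (greater + 0.5 * ties) / (pos.size * neg.size)

# synthetic example: two negatives, two positives
print(auc_score([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

A perfect ranking (every sickness window scored above every normal window) gives an AUC of 1.0; chance-level ranking gives 0.5.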
Figure 10. Flowchart of the offline training and online testing stages in the non-contact motion-sickness measurement system.
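The two-stage flow in Figure 10 — train a classifier offline, persist it, then apply it to each incoming pupil-feature window online — can be sketched as follows. This is an illustrative stand-in, not the paper's implementation: it uses a simple nearest-centroid rule on synthetic [mPD, sPD, PRC ratio] feature vectors, with pickle standing in for model persistence.

```python
import pickle
import numpy as np

# ---- offline training stage ----
# rows of [mPD, sPD, PRC ratio] features; 0 = normal, 1 = motion sickness
# (the feature means below are made up for illustration)
rng = np.random.default_rng(0)
X_normal = rng.normal([3.5, 0.20, 1.2], 0.05, size=(20, 3))
X_sick = rng.normal([3.9, 0.35, 0.8], 0.05, size=(20, 3))
X = np.vstack([X_normal, X_sick])
y = np.array([0] * 20 + [1] * 20)

centroids = {c: X[y == c].mean(axis=0) for c in (0, 1)}
model_blob = pickle.dumps(centroids)      # persisted after offline training

# ---- online testing stage ----
model = pickle.loads(model_blob)          # loaded by the real-time system

def classify(features, model):
    """Assign a feature window to the class with the nearest centroid."""
    dists = {c: np.linalg.norm(features - mu) for c, mu in model.items()}
    return min(dists, key=dists.get)

print(classify(np.array([3.88, 0.33, 0.85]), model))  # near the sickness centroid
```

In the real system the online stage would run once per analysis window, feeding each new feature vector from the IR webcam pipeline into the loaded model.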
Figure 11. Non-contact measurement system of motion sickness using an infrared (IR) webcam. (A) Introduction to the measuring software: (a) protocol for detecting the pupil area; (b) raw signals of pupillary diameter; (c) filtered pupil diameter signals in time domain and detecting MPD and SPD; (d) power spectral density of pupillary rhythms in the frequency domain and detecting the PRC ratio; (e) binary decision for the motion sickness state. (B) Configuration of the measuring device including the HMD device, add-on IR webcam and lamp. (C) Overview of the real-time system.
Figure 11. Non-contact measurement system of motion sickness using an infrared (IR) webcam. (A) Introduction to the measuring software: (a) protocol for detecting the pupil area; (b) raw signals of pupillary diameter; (c) filtered pupil diameter signals in time domain and detecting MPD and SPD; (d) power spectral density of pupillary rhythms in the frequency domain and detecting the PRC ratio; (e) binary decision for the motion sickness state. (B) Configuration of the measuring device including the HMD device, add-on IR webcam and lamp. (C) Overview of the real-time system.
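The quantities extracted in panels (c) and (d) — mPD, sPD and a spectral PRC ratio — can be computed from a pupil-diameter trace along the following lines. The 30 Hz sampling rate and the 0.1–0.5 Hz "coherent" band are assumptions for illustration only; the paper's exact PRC band definition should be taken from its Methods section.

```python
import numpy as np

FS = 30  # assumed IR-webcam sampling rate, Hz

def pupil_features(diameter, fs=FS):
    """mPD, sPD and a PRC-style band-power ratio from a pupil-diameter trace."""
    d = np.asarray(diameter, dtype=float)
    mpd = d.mean()
    spd = d.std(ddof=1)
    # one-sided power spectrum of the de-meaned signal
    psd = np.abs(np.fft.rfft(d - mpd)) ** 2
    freqs = np.fft.rfftfreq(len(d), 1 / fs)
    # band limits below are illustrative assumptions, not the paper's definition
    coherent = psd[(freqs >= 0.1) & (freqs < 0.5)].sum()
    total = psd[(freqs > 0) & (freqs < 5.0)].sum()
    return mpd, spd, coherent / total
```

For a pure 0.3 Hz pupillary oscillation the ratio approaches 1; as the rhythm becomes irregular and power spreads across the spectrum, the ratio falls, mirroring the drop in PRC ratio the study reports after motion sickness.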
Table 1. Performance of the different classifiers on the training and test datasets.
10-Fold Cross Validation:

| Classifier | Accuracy | Sensitivity | Specificity | AUC |
|---|---|---|---|---|
| LDA | 0.88 | 0.83 | 0.92 | 0.93 |
| Decision Tree | 0.83 | 0.75 | 0.92 | 0.95 |
| Linear SVM | 0.90 | 0.88 | 0.92 | 0.92 |
| RBF SVM | 0.90 | 0.88 | 0.92 | 0.92 |

Test Set Validation:

| Classifier | Accuracy | Sensitivity | Specificity | AUC |
|---|---|---|---|---|
| LDA | 0.80 | 0.74 | 0.87 | 0.90 |
| Decision Tree | 0.76 | 0.65 | 0.87 | 0.76 |
| Linear SVM | 0.80 | 0.74 | 0.87 | 0.90 |
| RBF SVM | 0.80 | 0.74 | 0.87 | 0.89 |
Note: LDA: linear discriminant analysis; Linear SVM: linear kernel support vector machine; RBF SVM: radial basis function kernel support vector machine.
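The accuracy, sensitivity and specificity columns above follow directly from a confusion matrix over the sickness (positive) and normal (negative) classes. A small sketch with made-up predictions, not the study's per-fold outputs:

```python
import numpy as np

def classification_metrics(y_true, y_pred):
    """Accuracy, sensitivity (recall on the sickness class) and specificity."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))  # sickness correctly detected
    tn = np.sum((y_true == 0) & (y_pred == 0))  # normal correctly detected
    fp = np.sum((y_true == 0) & (y_pred == 1))  # false alarms
    fn = np.sum((y_true == 1) & (y_pred == 0))  # missed sickness
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# one missed sickness case and one false alarm out of eight windows
print(classification_metrics([1, 1, 1, 1, 0, 0, 0, 0],
                             [1, 1, 1, 0, 0, 0, 0, 1]))  # (0.75, 0.75, 0.75)
```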
Table 2. Performance comparison of the proposed method and previous motion-sickness measurement methods.
| # | Study | Device | Feature | Classifier | Train Set Accuracy | n | Test Set Accuracy | n |
|---|---|---|---|---|---|---|---|---|
| 1 | Lin et al., 2013 | HMD | EEG | PCA + SONFIN | 0.82 | 17 | -- | -- |
| 2 | Pane et al., 2018 | LCD | EEG | CN2 Rules | 0.89 | 9 | -- | -- |
| 3 | Mawalid et al., 2018 | LCD | EEG | Naïve Bayes | 0.84 | 9 | -- | -- |
| 4 | Li et al., 2020 | HMD | EEG | SVM | 0.79 | 18 | -- | -- |
| 5 | Dennison Jr et al., 2019 | HMD | EEG, EOG, RSP, etc. | Tree Bagger | 0.95 | 18 | -- | -- |
| 6 | Li et al., 2019 | HMD | EEG and COP | Voting Classifier | 0.76 | 20 | -- | -- |
| 7 | Present study | HMD | Pupillary response | SVM | 0.90 | 48 | 0.80 | 46 |
Note: HMD: head-mounted display; LCD: liquid crystal display; EEG: electroencephalography; EOG: electrooculography; RSP: respiration; COP: center of pressure (force plate); PCA: principal component analysis; SONFIN: self-organizing neural fuzzy inference network; SVM: support vector machine.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Park, S.; Mun, S.; Ha, J.; Kim, L. Non-Contact Measurement of Motion Sickness Using Pupillary Rhythms from an Infrared Camera. Sensors 2021, 21, 4642. https://doi.org/10.3390/s21144642
