Article

The Effect of Task Complexity on Time Estimation in the Virtual Reality Environment: An EEG Study

Department of Industrial and Systems Engineering, University of Washington, Seattle, WA 98195, USA
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(20), 9779; https://doi.org/10.3390/app11209779
Submission received: 10 September 2021 / Revised: 2 October 2021 / Accepted: 11 October 2021 / Published: 19 October 2021

Abstract

This paper investigated the effect of task complexity on time estimation in the virtual reality environment (VRE) using behavioral, subjective, and physiological measurements. Virtual reality (VR) is not a perfect copy of the real world, and individuals perceive time duration differently in the VRE than they do in reality. Though many researchers have found a connection between task complexity and time estimation under non-VR conditions, the influence of task complexity on time estimation in the VRE remains unknown. In this study, twenty-nine participants performed a VR jigsaw puzzle task at two levels of task complexity. We observed that as task complexity increased, participants showed larger time estimation errors, reduced relative beta-band power at Fz and Pz, and higher NASA-Task Load Index scores. Our findings indicate the importance of controlling task complexity in the VRE and demonstrate the potential of using electroencephalography (EEG) as a real-time indicator of complexity level.

1. Introduction

Virtual reality (VR) technology provides an immersive computer-generated environment in which individuals are able to interact with multisensory input [1,2]. As VR technology has become increasingly portable and inexpensive in recent decades, VR applications have expanded into diverse education and training areas, where they are used to reproduce scenarios that may be expensive or hard to mimic in real life [3]. However, because the head-mounted display (HMD) that is part of VR technology blocks real-world external stimuli, it is challenging for VR users to estimate the amount of time they have spent inside the virtual reality environment (VRE) [1].
Time estimation refers to the process by which an individual subjectively judges duration. Two paradigms of time estimation have been defined. In the prospective time estimation paradigm, participants are informed before starting a task that they will be asked to estimate its duration; in the retrospective paradigm, participants receive no such advance notice. Both paradigms ask participants to judge duration after the task [4,5]. Because participants in retrospective settings may surmise that they will be asked to judge duration, prospective time estimation has been used more widely in studies than retrospective time estimation [5]. Prospective time estimation is explained by the attention allocation model, which holds that fewer resources are available for processing time estimation when other cognitive tasks require more attentional resources [6,7,8].
Multiple studies have underlined the importance of time estimation in indicating task performance and users’ satisfaction in diverse task settings [9,10]. Accurate time estimation is known to be an indicator of enhanced user experience, high decision-making quality, and high academic performance. For example, users were likely to overestimate the time they spent on webpage loading and downloading tasks, and their user experience ratings decreased correspondingly [11]. In addition, inaccurate time estimation has been shown to result in customers’ dissatisfaction with waiting in line [12,13]. Previous studies have also paid attention to how time estimation influences decision-making. Klapproth’s study [14] indicated that when individuals made larger time estimation errors, they tended to make impulsive decisions without evaluating the value of the options. Another study, too, found that individuals made impulsive decisions, choosing instant rewards over delayed rewards with greater value [15], when they overestimated the time. In the academic arena, Aitken [16] indicated that students who underestimated the amount of time they had spent reading textbooks and answering exercise questions were more likely to delay starting coursework and to have lower academic performance. Josephs and Hahn [17] observed that students who made accurate time estimations when reading a psychology manuscript tended to have higher grade-point averages (GPAs).
Though researchers have realized the importance of time estimation, few studies have investigated time estimation in the VRE, and those that have report mixed results. One stream of research showed that during chemotherapy, most individuals estimated the time duration in the VRE to be shorter than the actual time elapsed [1,18]. In other studies, participants estimated that the time elapsed in the VRE was longer than the actual time they had spent playing a music game and a shooting game [19], and walking according to the direction of a virtual marker [20]. One of the main reasons for this inconsistency may be that these studies investigated time estimation in the VRE without taking task complexity into account.
In Section 1.1 and Section 1.2, we provide a literature review (the first and second paragraphs). Then, we discuss the limits of the existing studies in terms of their experimental designs (the third paragraph of Section 1.1 and Section 1.2) and assessment tools (Section 1.3). Such limitations led us to develop the three research questions that appear at the end of each section.

1.1. Task Complexity and Time Estimation

One approach to defining task complexity is to consider the number of elements comprising a task [21,22]. A high-complexity task includes many elements; a low-complexity task is comprised of few elements. As a task’s complexity level increases, an individual needs to process more information, leading to a higher demand for attentional resources [22,23,24].
Several studies have revealed that in non-VREs, individuals estimate time more inaccurately as tasks become more complex. For example, Zakay and colleagues [25] designed verbal tasks with three levels of complexity—reading words (low complexity), naming objects (mid complexity), and giving synonyms for the words on the cards (high complexity). Of the three conditions, the participants made the largest time estimation errors during the synonym-providing task, the condition with the highest complexity. More recently, Chan and Hoffmann [26] found that participants estimated the elapsed time to be longer than the actual time as the complexity of two tasks—a pin-to-hole assembly task and a task that required moving an object to a target area—increased. Brown [27] used a figure-tracing task in which participants were instructed to trace a six-point star within a double boundary line. At the low complexity level, the participants performed the task with a pencil; at the high complexity level, they used a mirror-drawing apparatus. The study revealed that the subjects tended to make larger errors in both prospective and retrospective time estimations during high-complexity tasks than during low-complexity ones.
Although many researchers have discussed the effect of task complexity on time estimation under non-VRE conditions, few studies have considered VREs. A recent study by Schatzschneider et al. [28] used the VRE to investigate the impact of cognitive task types—verbal vs. spatial memory tasks—on time estimation. They observed that cognitive tasks significantly impacted participants’ time estimation in the VRE. However, we noticed that the researchers did not consider the complexity of each task type. As the effect of task complexity on time estimation in the VRE remains unknown, our first research question (RQ) was as follows:
RQ1: 
Does task complexity influence time estimation in the VRE?

1.2. Task Complexity and Electroencephalography

Electroencephalography (EEG) is a recording of electrical changes in a population of neurons of the brain [29]. Neuronal oscillations in different frequency bands of EEG are related to specific cognitive activities [30,31,32]. Among the five major frequency bands—delta (1–4 Hz), theta (4–8 Hz), alpha (8–12 Hz), beta (12–30 Hz), and gamma (30–100 Hz)—the beta-band is associated with attention-related mental activities [33,34,35].
Although the beta-band is thought to be sensitive to task complexity [36,37,38], there is no consensus as to the direction of the relationship. On the one hand, beta-band power is known to increase with task complexity. For example, Wilson et al. [39] observed higher beta-band power when the number of characters to be memorized increased from one to five. Murata [35] reported that the average beta-band power at Cz, Fz, and Pz increased with the complexity of the N-back task where participants were asked to respond when the letter in the current trial was the same as the one shown N trials before. In a word discrimination experiment, researchers found enhanced beta-band power when the semantic complexity increased [40]. On the other hand, other studies have reported opposite results. Fernández et al. [41] indicated that beta-band power trended higher on a simple number-reading task than on arithmetic tasks. Bočková et al. [42] showed a higher beta-band power on simple letter-copying tasks than on a complex task in which participants were instructed to write a letter other than the one shown.
Such inconsistency in the relationship between task complexity and beta-band power has not yet been examined in the VRE. Recently, researchers have successfully applied EEG systems together with consumer-level VR HMDs [43,44]. For example, Dey and colleagues employed a 32-channel portable EEG system with the HTC VIVE VR headset to build an adaptive VR training system by collecting alpha-band activity in real-time [44]. Taking advantage of recent technological advancements in EEG devices that are portable and provide data in real-time [45,46], our second research question asked whether brain signals indicate task complexity in the VRE.
RQ2: 
Do changes in brain signals indicate task complexity in the VRE?

1.3. NASA-Task Load Index

The NASA-Task Load Index (NASA-TLX), a widely applied workload assessment tool [47], has proven its usefulness in measuring task complexity from a subjective perspective [48,49], but it has some disadvantages. NASA-TLX is normally applied after an individual finishes a task, meaning that the result depends on the individual’s memory and willingness to answer the questions without any bias [50]. Given the limitations of NASA-TLX, researchers have considered estimating perceived workload using responses such as time estimation [50,51] and EEG signals [37,45]. To further verify in the VRE that time estimation and EEG signals can work as workload assessment tools, our third research goal was to use correlation analysis to discover the relationship between time estimation error, EEG band power, and the NASA-TLX score.
RQ3: 
Are there relationships among time estimation, brain signals, and perceived workload in the VRE?
To answer the three research questions, we designed controlled experiments with independent and dependent variables and analyzed collected experimental data, which are described in Section 2. Section 3 presents the descriptive statistics of dependent variables and the results from the analysis of variance (ANOVA) that show the effects of the independent variables on the dependent variables (RQs 1 and 2). The results from the correlation analysis are also provided to show the relationship between various dependent variables (RQ3).

2. Methods

2.1. Participants

Twenty-nine subjects (13 males and 16 females) aged 21–29 years (mean = 23.72, SD = 2.09) at the University of Washington participated in the study. Participants had either normal or corrected vision without visual or auditory impairments. Each participant signed a consent form prior to the experiment and received USD 20 in compensation for their participation. The study was reviewed and approved by the University of Washington’s Institutional Review Board (IRB) before recruiting participants.

2.2. Apparatus

We used an HTC VIVE head-mounted VR device with a refresh rate of 90 Hz. A 90 Hz refresh rate is sufficient to eliminate flicker, a contributing factor to motion sickness [52,53]. A single-player commercial VR software program, Jigsaw360 (Head Start Design), was used to deliver the main task in the study. To minimize the potential effect of visual complexity on task performance, apart from the different number of jigsaw pieces, we kept all experimental settings identical between the high- and low-task-complexity conditions. We used the same background color, the same jigsaw puzzle design (i.e., solid gray color), and the same contrast between the jigsaw pieces. Likewise, the virtual positions of the jigsaw pieces were consistent across task complexity conditions, as was the virtual position of the participants. All participants were required to sit in a chair to minimize their body movements. Thus, we assume that there were minimal differences in the spatial components of the high- and low-task-complexity conditions in the VRE.
Auditory and visual effects in the background of the game were eliminated to minimize any nuisance effect on the EEG signals. Participants’ brain signals were collected using a wireless EEG (Epoc Flex by Emotiv) with a sampling rate of 128 Hz. All raw data were recorded using EmotivPro software (Emotiv). The experiment was performed on a Lenovo X1 laptop (Lenovo) equipped with an Intel i7 processor and Intel UHD 620 Graphics. We used R version 3.6.1. to perform statistical analysis [54].

2.3. Experimental Design

We employed a 2 × 3 within-subjects design. Table 1 summarizes the independent and dependent variables. The independent variables were task complexity and block sequence. The jigsaw game had two levels of complexity—8 pieces (low task complexity) and 18 pieces (high task complexity). Figure 1 shows the interface of the low and high complexity levels in the VR jigsaw game. There were three blocks in the experiment, and each block consisted of two levels of task complexity; each participant was thus required to complete six trials in total. We used a randomized block design in which the order of the two trials (i.e., the low- and high-complexity tasks) within a block was randomized to avoid any order effect. The dependent variables were absolute time estimation error; the relative beta-band power at the Cz, Fz, and Pz electrodes; and the NASA-TLX score. The Cz, Fz, and Pz electrodes are located along the midline over the central, frontal, and parietal regions according to the International 10-20 system [55]. A number of studies have reported that the beta-band power at electrodes Cz, Fz, and Pz is sensitive to task complexity [35,36,39].
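The randomized block design described above can be sketched in Python; the function name and seed handling are illustrative assumptions, not part of the study's software:

```python
import random

def block_order(n_blocks=3, conditions=("low", "high"), seed=None):
    """Build a randomized-block trial order: each block contains every
    task-complexity condition exactly once, in shuffled order, so the
    order of the two trials within a block is randomized."""
    rng = random.Random(seed)
    order = []
    for _ in range(n_blocks):
        block = list(conditions)
        rng.shuffle(block)  # randomize low/high order within this block
        order.extend(block)
    return order  # six trials in total for the default settings
```

Each participant would receive one such six-trial sequence, which counters order effects while keeping both complexity levels present in every block.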

2.4. Procedure

Figure 2 presents the procedure of the experiment. Before the experiment, participants completed an online survey with screening questions about their demographic information and previous VR experiences. The experiment lasted about 105 min per participant. At the beginning, each participant was asked to sign a consent form and read a brief set of instructions. Participants were informed that if they experienced any discomfort such as motion sickness, they were free to withdraw from the study immediately. Each participant then performed a practice trial with the VR headset to become familiar with the VR jigsaw game. Participants were informed that in the main study, they would be asked a time estimation question verbally each time they finished a jigsaw puzzle. After a participant completed the practice trial, we set up the Epoc Flex EEG device. The Cz, Fz, and Pz electrodes were attached to the participant’s head with conductive electrode gel.
Each trial during the main experiment required the participants to put every jigsaw piece in the right place on a blank tray with a white border, as shown in Figure 1. To pick up a jigsaw piece, the participants targeted it with the virtual laser pointer and pressed the trigger button on the controller. They then held the button down and moved the controller to drag the piece to the desired location; releasing the button placed the piece. The participants were instructed to move only their necks and wrists and to keep other body parts still, to minimize movement artifacts in the EEG signal recording. After completing each trial, the participants were instructed to estimate the time, in seconds, that the trial had taken them by answering a question: “How long do you think you spent on the previous jigsaw game?” Before moving on to the next trial, the participants also assessed their subjective workloads using the NASA-TLX questionnaire [47]. Each participant gave their answers verbally while wearing the VR headset. There was a one-minute break between trials. The participants were instructed to play the game at a comfortable pace. During the experiment, participants’ brain signals, time estimations, actual completion times, and NASA-TLX scores were recorded.

2.5. Data Analysis

To investigate the effect of task complexity on time estimation, the study explored behavioral, physiological, and subjective responses in the VRE, as described in Section 2.4. As shown in Table 1, three dependent variables—absolute time estimation error, relative beta-band power, and NASA-TLX—represent behavioral, physiological, and subjective responses, respectively. The following sections provide detailed information about the data analyses related to participants’ behavioral (Section 2.5.1), physiological (Section 2.5.2), and subjective responses (Section 2.5.3).

2.5.1. Absolute Time Estimation Error

In this study, we used the absolute value of the time estimation error [27,56] to describe the accuracy of the time estimation results; taking the absolute value avoids the artificially high estimation accuracy that results when overestimations and underestimations cancel each other out [27,57]. As shown in Equation (1), the absolute time estimation error (AE) was specified as the absolute difference between a participant’s estimated time (ET) and the actual task completion time (CT):
AE = |ET − CT|  (1)
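Equation (1) translates directly into code; this one-line function is only an illustration (the function name is ours, not the study's):

```python
def absolute_time_estimation_error(estimated_s, actual_s):
    """Equation (1): AE = |ET - CT|, where ET is the participant's
    estimated duration and CT the actual completion time (seconds)."""
    return abs(estimated_s - actual_s)
```

An estimate of 90 s for a trial that actually took 120 s and an estimate of 150 s for the same trial both yield an AE of 30 s, which is exactly the cancellation-avoiding property described above.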

2.5.2. Relative Beta-Band Power

To analyze EEG band power, we first applied the fast Fourier transform (FFT) based on Welch’s method [58] to decompose the voltage signals into the power spectrum. The FFT is widely used to calculate power spectral density with high computational efficiency [59]. Welch’s method separates the raw data into overlapping sample segments, computes a periodogram for each segment, and derives the power spectral density by averaging the periodograms of all segments [58]; this reduces the variance in the estimated band power [60]. We performed Welch’s method and data filtering using the SciPy package [61] in Python 3.7.3. We filtered the data based on the frequency ranges of the delta (1–4 Hz), theta (4–8 Hz), alpha (8–12 Hz), beta (12–30 Hz), and gamma (30–100 Hz) bands [62,63]. We then transformed frequency band power into a relative form: the ratio of a specific band power to the total power of the signal. For example, when calculating the relative beta-band power, we used the beta-band power as the numerator and the total power of the signal as the denominator. We preferred relative band power to absolute band power because the former has a smaller variance among subjects and is less influenced by individuals’ electrical properties [30,64,65].
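A minimal sketch of this pipeline using `scipy.signal.welch`; the segment length (2 s, with SciPy's default 50% overlap) and the 1 Hz-to-Nyquist range used for the total power are our assumptions, not the study's exact parameters. Note that at a 128 Hz sampling rate the gamma band is truncated at the 64 Hz Nyquist frequency.

```python
import numpy as np
from scipy.signal import welch

# Frequency bands as defined in the text [62,63]
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 30), "gamma": (30, 100)}

def relative_band_power(signal, fs=128, band="beta"):
    """Welch PSD, then the ratio of the power in `band` to the total
    power from 1 Hz up to the Nyquist frequency. Because the PSD bin
    width is uniform, summing bins gives the same ratio as integrating."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)  # 2 s segments
    lo, hi = BANDS[band]
    band_power = psd[(freqs >= lo) & (freqs < hi)].sum()
    total_power = psd[freqs >= 1].sum()  # exclude DC and sub-1 Hz drift
    return band_power / total_power
```

For a pure 20 Hz test tone sampled at 128 Hz, nearly all the measured power falls in the beta-band, so the function returns a value close to 1.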

2.5.3. NASA-TLX

NASA-TLX records individuals’ mental, physical, and temporal demands, frustrations, efforts, and performance to gauge their subjective workload. In practical usage, many researchers have modified the NASA-TLX to suit different purposes and situations. The most common modification is the elimination of the weighting process in the NASA-TLX [47]. The same modification was made in this study. To make it easier for participants to answer the questions verbally, we simplified the scale from 0 to 100 with 5-point increments to 21 gradations.
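Assuming the 21 gradations map back onto the conventional 0–100 scale in 5-point steps (our assumption about the mapping; the paper does not spell it out), the unweighted "raw" NASA-TLX score is simply the mean of the six rescaled subscale ratings:

```python
SUBSCALES = ("mental demand", "physical demand", "temporal demand",
             "performance", "effort", "frustration")

def raw_tlx(ratings_0_to_20):
    """Unweighted (raw) NASA-TLX: rescale each 21-gradation verbal
    response (0-20) to 0-100 by multiplying by 5, then average the six
    subscales, skipping the original pairwise weighting step."""
    if len(ratings_0_to_20) != len(SUBSCALES):
        raise ValueError("expected one rating per subscale")
    return sum(r * 5 for r in ratings_0_to_20) / len(SUBSCALES)
```

Dropping the weighting step, as noted above, is the most common modification of the instrument in practice.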

3. Results

3.1. Descriptive Statistics

Table 2 summarizes the descriptive statistics of the behavioral response (i.e., time estimation error), subjective response (i.e., NASA-TLX score), and physiological responses (i.e., the relative beta-band power at Cz, Fz, and Pz) under the low- and high-task-complexity conditions. Participants tended to make larger estimation errors under the high-task-complexity conditions than under the low-task-complexity conditions. In addition, the mean NASA-TLX score was greater under the high-task-complexity conditions. However, the average relative beta-band power at Cz, Fz, and Pz was lower under the high-complexity conditions than under the low-complexity conditions. We also present the average task completion time and estimated time under both conditions in Table 2. In general, participants spent more time on the jigsaw game and made longer time estimates under the high-complexity conditions than under the low-complexity conditions.

3.2. The Effect of Task Complexity and Block Sequence on Dependent Variables

3.2.1. Absolute Time Estimation Error

To investigate the impact of the independent variables on the absolute time estimation error, we conducted a repeated-measures ANOVA. Figure 3 presents the interaction plot with the absolute time estimation error. We found a significant effect of task complexity on the absolute time estimation error (F (1, 140) = 82.02, p < 0.001): the more complex the task, the larger the absolute error in time estimation. The effect of block sequence on the absolute time estimation error was also significant (F (2, 140) = 4.60, p < 0.05). Fisher’s least significant difference (LSD) post hoc analysis indicated that all three blocks differed significantly from one another: time estimation errors were largest in the first block, smaller in the second, and smallest in the third. There was no significant interaction between task complexity and block sequence on the absolute time estimation error.
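The study ran its repeated-measures ANOVAs in R [54]. Purely to illustrate the underlying computation, a one-factor repeated-measures ANOVA can be sketched in a few lines of Python; this simplified version omits the block-sequence factor and the interaction term of the full model:

```python
import numpy as np
from scipy import stats

def rm_anova_oneway(data):
    """One-way repeated-measures ANOVA on a (n_subjects, k_conditions)
    array. Partitions total variability into condition, subject, and
    error sums of squares; returns (F, df_effect, df_error, p)."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()  # conditions
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()  # subjects
    ss_total = ((data - grand) ** 2).sum()
    ss_err = ss_total - ss_cond - ss_subj                   # residual
    df_cond, df_err = k - 1, (n - 1) * (k - 1)
    f_stat = (ss_cond / df_cond) / (ss_err / df_err)
    p = stats.f.sf(f_stat, df_cond, df_err)
    return f_stat, df_cond, df_err, p
```

Removing the subject sum of squares from the error term is what distinguishes the repeated-measures analysis from a between-subjects ANOVA: stable individual differences do not inflate the error variance.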

3.2.2. Relative Beta-Band Power of EEG

The repeated-measures ANOVA results revealed a significant effect of task complexity on the relative beta-band power at Fz (F (1, 98) = 5.26, p < 0.05) and Pz (F (1, 98) = 5.85, p < 0.05) and a marginal effect on the relative beta-band power at Cz (F (1, 98) = 2.92, p = 0.09). Figure 4 shows the interaction plots of the two independent variables with the relative beta-band power at Fz (a), Pz (b), and Cz (c). The relative beta-band power at the three electrodes decreased as task complexity increased. The block sequence showed no significant effect, indicating that task repetition did not affect the fraction of beta-band power. There was no significant interaction between task complexity and block sequence for any signal.

3.2.3. Subjective Workload

We conducted a similar repeated-measures ANOVA on the subjective workload scores. We found a significant main effect of task complexity on the NASA-TLX score (F (1, 140) = 123.89, p < 0.001). As illustrated in Figure 5, high task complexity generated a higher NASA-TLX score: participants perceived a higher workload in the high-complexity task than the low-complexity task. The sequence of blocks also significantly impacted the NASA-TLX score (F (2, 140) = 37.36, p < 0.001). Fisher’s LSD post hoc analysis showed that all three blocks differed significantly. In our study, participants’ perceived workload gradually decreased from the first block to the third block.

3.3. Correlations among Responses

Table 3 summarizes the relationships between the absolute time estimation error, relative beta-band power at the three electrodes, and the NASA-TLX score by using the repeated-measures correlation analysis [66]. We found a significant negative correlation between the absolute time estimation error and the relative beta-band power at Cz. This correlation indicated that the proportion of beta-band power at Cz showed an overall decreasing trend when individuals made larger time estimation errors. In addition, we observed marginal correlations between the absolute time estimation error and the relative beta-band power at Fz and Pz. We also observed a significant correlation between the NASA-TLX score and the absolute time estimation error, indicating that larger time estimation errors were associated with increasing perceived workload.
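The repeated-measures correlation [66] removes between-subject variance so that the resulting coefficient reflects the common within-subject association. A minimal sketch, assuming the within-subject-centering formulation of Bakdash and Marusich's rmcorr; the function name and data layout are illustrative, and the study itself used dedicated statistical software:

```python
import numpy as np
from scipy import stats

def rm_corr(subjects, x, y):
    """Repeated-measures correlation: center x and y within each
    subject, correlate the centered values, and test with
    dof = N - n_subjects - 1 (one parameter per subject plus the slope)."""
    subjects = np.asarray(subjects)
    xc = np.asarray(x, dtype=float).copy()
    yc = np.asarray(y, dtype=float).copy()
    for s in np.unique(subjects):
        m = subjects == s
        xc[m] -= xc[m].mean()  # remove this subject's mean level
        yc[m] -= yc[m].mean()
    r = np.corrcoef(xc, yc)[0, 1]
    dof = len(xc) - len(np.unique(subjects)) - 1
    t_stat = r * np.sqrt(dof / (1.0 - r ** 2))
    p = 2 * stats.t.sf(abs(t_stat), dof)
    return r, dof, p
```

Two subjects can have very different overall error levels yet show the same trial-to-trial coupling between, say, time estimation error and beta-band power; the within-subject centering is what lets that coupling surface.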

4. Discussion and Conclusions

To the best of our knowledge, this study is the first to investigate how time estimation is influenced by task complexity in the VRE. The results of this study revealed three major findings that answer the three questions raised. First, we found that the participants made greater time estimation errors as the task complexity in the VRE increased. Second, in the VRE, the relative beta-band power of EEG was greater for the low task-complexity condition than it was for the high task-complexity condition. Third, we found a negative relationship between time estimation error and relative beta-band power and a positive relationship between time estimation error and mental workload.
We explored the impact of task complexity on time estimation in a new task setting, i.e., the VRE. In addition to considering the participants’ behavioral and subjective responses, we employed EEG to capture the participants’ physiological responses in real-time. The composite of behavioral, physiological, and subjective responses enabled us to overcome the limitations of the self-reported questionnaire of NASA-TLX that is available only after the completion of tasks and may produce self-reported biases. Our findings in the VRE provided new evidence of the direction of the relationship between the beta-band power of EEG and task complexity.
Our finding that the absolute time estimation error was negatively influenced by task complexity in the VRE substantiates the attentional resource allocation theory [67,68]. Based on this theory, estimating time and conducting cognitive tasks both require attentional resources, of which each person has only a limited capacity. In line with the structuralist point of view [21,22], this study provided a smaller number of elements to be completed during the low-complexity task and a greater number of elements to be completed during the high-complexity task. As a result, the participants needed to allocate greater attention to the high-complexity task and paid less attention to the process of time estimation, leading to less accurate time estimations than seen during the low-complexity task [15,67,69].
We also observed the effect of block sequences on the absolute time estimation error, demonstrating that task repetition significantly improved the accuracy of time estimation results. We note that the practice trial was designed to familiarize participants with the VR environment, task procedure, and device usage with the HMD and VR controllers. The screening survey showed that 38% of participants did not have VR experience. If we had not included the practice trial, the participants likely would have made many task errors due to task misunderstanding and their lack of familiarity with the VR system. These errors would have been reflected in the time estimation. As we designed the experiment to investigate the effect of task complexity only on time estimation, the practice trial was necessary. While the practice trial minimized the impact of previous VR experience, we found an effect of task repetition on the absolute time estimation error. Without the practice trial, we would have expected the effects of block sequences to increase dramatically. Throughout the experiment, we purposefully avoided providing any feedback to the participants on their estimation results. The participants were thus unable to adjust their estimations based on their previous results. Task repetition decreases the attentional resources assigned to the given task [70,71] and leaves more attentional resources available for time estimation, thereby contributing to more accurate time estimation results [70].
In addition to relative beta-band power, we examined changes in relative theta-band power in the frontal and parietal regions. We did not find significant results when we tested the relative theta-band power as the dependent variable. Given that previous studies have reported increased theta-band power with increased task complexity [35,72,73], beta-band power appears to be more sensitive than theta-band power for detecting changes in task complexity in the VRE.
The significant correlation between the relative beta-band power at Cz and time estimation error indicates the connection between beta-band activities in the midline central region and time estimation, whether inside or outside of the VRE. In a previous study conducted in a non-VRE setting, Kulashekhar and colleagues [74] observed a suppression in beta-band activities in the midline central region during time estimation. Ghaderi et al. [75] also reported a significant change in beta-band power at Cz when participants estimated elapsed time. As previous researchers have suggested, the beta-band plays an essential role in time estimation mechanisms [74,75,76], and our findings further support the theory in the VRE.
Additionally, our correlation results between the absolute time estimation error and the NASA-TLX score corroborate previous findings indicating that time estimation is a measure of mental workload in surgery training [49] and in 2D video games [77]. Our results also validate time estimation usage in the VRE. When the environment limits the use of the NASA-TLX questionnaire, time estimation can serve as a valid and quick mental workload evaluation tool.
Given that the age distribution of the participants was not diverse in that most of the participants were college-aged young adults, we did not originally consider age to be the main factor in this study. However, we acknowledge that age plays a significant role in time estimation tasks [56,78]. Previous studies found noticeable differences between age groups such that the elderly group made larger time estimation errors than the younger groups did [56,78]. To test the effect of age on our dependent variables, we divided age into two levels (i.e., older than or equal to 24 and younger than 24) using a median split. However, the ANOVA result showed that age did not have a significant effect on absolute estimation error (F (1, 27) = 0.16, p = 0.70), relative beta-band power at Cz (F (1,20) = 0.08, p = 0.78), Fz (F (1,20) = 0.04, p = 0.85), Pz (F (1,20) = 2.05, p = 0.17), or NASA-TLX score (F (1, 27) = 2.94, p = 0.10). To increase the generalizability of the results, future studies might aim to recruit participants from different age groups.
Our study has several limitations that merit future research. First, this study included factors that may have imperceptibly influenced the quality of EEG signals, such as the vibration of the VR controller and the electrical interference between the VR HMD and the EEG headset. To eliminate noise, future research might perform further data filtering by comparing the EEG signals at the baseline and during the experiment. Second, the statistical power of the analysis was 0.6, indicating a 40% probability of having a Type II error in the results, which could be attributed to the small sample size. Third, only three EEG electrodes were used in the experiment. In the future, brain signals from multiple electrodes might be analyzed synchronously to further investigate brain connectivity and its relationship with time estimation in the VRE. Fourth, our study did not focus on the event-related potential (ERP). The error-related negativity (ERN) is an ERP component that is found about 100 ms after individuals make and observe incorrect responses [79]. For example, Pezzetta et al. [80] reported an ERN on the frontal lobe when participants failed to grasp a virtual glass in the VRE. By recording the ERN when participants make errors in the VR puzzle game, future studies might consider variations in ERN under different complexity levels and the correlation between ERN amplitudes and time estimation error. In addition, future research may consider the inclusion of conditions in the real world. A comparative study between the VRE and the real world will provide evidence to understand how the VRE affects human time estimation.
Our study has implications for applying EEG systems to assist task design in VR applications. The finding that higher task complexity resulted in larger time estimation errors underscores the importance of controlling complexity levels in VR applications. Current EEG systems can perform FFT automatically and export the power of frequency bands in real time. Taking advantage of this capability, designers could monitor changes in time estimation errors in real time during the use of VR applications, inferred from changes in EEG responses. Compared with a laboratory environment, practical VR design settings pose more challenges for manipulating task complexity because of the diversity of functional requirements associated with VR design [81]. The relationship between task complexity and EEG signals suggests that relative beta-band power could serve as an index to help developers adjust complexity levels in VR applications.
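The relative beta-band power that such a system would export can be computed with Welch's method [58], as provided by SciPy [61]. A minimal sketch follows; the sampling rate, segment length, and band edges are illustrative assumptions rather than the exact processing parameters used in the study.

```python
import numpy as np
from scipy.signal import welch

def relative_beta_power(eeg, fs=256.0, beta=(13.0, 30.0), broadband=(1.0, 50.0)):
    """Relative beta-band power: beta-band power divided by broadband power.

    The sampling rate and band edges here are illustrative assumptions.
    """
    # Welch's method with 2 s segments gives 0.5 Hz frequency resolution.
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    in_beta = (freqs >= beta[0]) & (freqs <= beta[1])
    in_broad = (freqs >= broadband[0]) & (freqs <= broadband[1])
    return psd[in_beta].sum() / psd[in_broad].sum()

# Example: 10 s of synthetic noise standing in for one electrode (e.g., Fz).
rng = np.random.default_rng(1)
fz = rng.normal(size=2560)
rel_beta = relative_beta_power(fz)
print(f"Relative beta-band power: {rel_beta:.3f}")
```

For white noise the ratio approaches the beta band's share of the broadband range (roughly 17 Hz of 49 Hz); real EEG would deviate from this according to the wearer's state.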

Author Contributions

Conceptualization, J.L. and J.-E.K.; experimental design, J.L. and J.-E.K.; data analysis, J.L.; writing—original draft preparation, J.L.; writing—review and editing, J.-E.K.; supervision, J.-E.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This research complied with the American Psychological Association Code of Ethics and was approved by the Institutional Review Board at the University of Washington (protocol code: STUDY00008244, date of approval: 4 September 2019).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Schneider, S.; Kisby, M.; Flint, C. Effect of virtual reality on time perception in patients receiving chemotherapy. Supportive Care Cancer 2011, 19, 555–564. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Wiederhold, M.D.; Wiederhold, B.K. Virtual reality and interactive simulation for pain distraction. Pain Med. 2007, 8, 182–188. [Google Scholar] [CrossRef]
  3. Greengard, S. Virtual Reality; MIT Press: Cambridge, MA, USA, 2019. [Google Scholar]
  4. Bisson, N.; Tobin, S.; Grondin, S. Prospective and retrospective time estimates of children: A comparison based on ecological tasks. PLoS ONE 2012, 7, e33049. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  5. Block, R.A.; Zakay, D. Prospective and retrospective duration judgments: A meta-analytic review. Psychon. Bull. Rev. 1997, 4, 184–197. [Google Scholar] [CrossRef]
  6. Brown, S.W. Attentional resources in timing: Interference effects in concurrent temporal and nontemporal working memory tasks. Percept. Psychophys. 1997, 59, 1118–1140. [Google Scholar] [CrossRef] [Green Version]
  7. Grondin, S. Timing and time perception: A review of recent behavioral and neuroscience findings and theoretical directions. Atten. Percept. Psychophys. 2010, 72, 561–582. [Google Scholar] [CrossRef]
  8. Zakay, D.; Block, R.A. Temporal cognition. Curr. Dir. Psychol. Sci. 1997, 6, 12–16. [Google Scholar] [CrossRef]
  9. Eagleman, D.M.; Pariyadath, V. Is subjective duration a signature of coding efficiency? Philos. Trans. R. Soc. B Biol. Sci. 2009, 364, 1841–1851. [Google Scholar] [CrossRef]
  10. Zakay, D. Psychological time as information: The case of boredom. Front. Psychol. 2014, 5, 917. [Google Scholar] [CrossRef] [Green Version]
  11. Egger, S.; Reichl, P.; Hoßfeld, T.; Schatz, R. “Time is bandwidth”? Narrowing the gap between subjective time perception and Quality of Experience. In Proceedings of the 2012 IEEE International Conference on Communications (ICC), Ottawa, ON, Canada, 10–15 June 2012; pp. 1325–1330. [Google Scholar] [CrossRef]
  12. Davis, M.; Heineke, J. How disconfirmation, perception and actual waiting times impact customer satisfaction. Int. J. Serv. Ind. Manag. 1998, 9, 64–73. [Google Scholar] [CrossRef]
  13. East, R.; Singh, J.; Wright, M.; Vanhuele, M. Consumer Behaviour: Applications in Marketing; Sage: Thousand Oaks, CA, USA, 2016. [Google Scholar]
  14. Klapproth, F. Time and decision making in humans. Cogn. Affect. Behav. Neurosci. 2008, 8, 509–524. [Google Scholar] [CrossRef] [Green Version]
  15. Wittmann, M.; Paulus, M.P. Decision making, impulsivity and time perception. Trends Cogn. Sci. 2008, 12, 7–12. [Google Scholar] [CrossRef]
  16. Aitken, M.E. Personality Profile of the College Student Procrastinator. Dissertation Abstracts International: Section A. Humanities and Social Sciences. Ph.D. Thesis, University of Pittsburgh, Pittsburgh, PA, USA, 1982; pp. 722–723. Available online: https://psycnet.apa.org/record/1983-52736-001 (accessed on 9 September 2021).
  17. Josephs, R.A.; Hahn, E.D. Bias and accuracy in estimates of task duration. Organ. Behav. Hum. Decis. Process. 1995, 61, 202–213. [Google Scholar] [CrossRef]
  18. Chirico, A.; D'Aiuto, M.; Pinto, M.; Milanese, C.; Napoli, A.; Avino, F.; Iodice, G.; Russo, G.; De Laurentiis, M.; Ciliberto, G.; et al. The elapsed time during a virtual reality treatment for stressful procedures: A pool analysis on breast cancer patients during chemotherapy. In Intelligent Interactive Multimedia Systems and Services; De Pietro, G., Gallo, L., Howlett, R.J., Jain, L.C., Eds.; Springer: Cham, Switzerland, 2016; pp. 731–738. [Google Scholar] [CrossRef]
  19. Volante, W.G.; Cruit, J.; Tice, J.; Shugars, W.; Hancock, P.A. Time flies: Investigating duration judgments in virtual reality. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Thousand Oaks, CA, USA, 27 September 2018; Volume 62, pp. 1777–1781. [Google Scholar] [CrossRef]
  20. Bruder, G.; Steinicke, F. Time perception during walking in virtual environments. In Proceedings of the 2014 IEEE Virtual Reality, Minneapolis, MN, USA, 29 March–2 April 2014; pp. 67–68. [Google Scholar] [CrossRef]
  21. Liu, P.; Li, Z. Task complexity: A review and conceptualization framework. Int. J. Ind. Ergon. 2012, 42, 553–568. [Google Scholar] [CrossRef]
  22. Wood, R.E. Task complexity: Definition of the construct. Organ. Behav. Hum. Decis. Process. 1986, 37, 60–82. [Google Scholar] [CrossRef]
  23. Bedny, G.Z.; Karwowski, W.; Bedny, I.S. Complexity evaluation of computer-based tasks. Int. J. Hum. Comput. Interact. 2012, 28, 236–257. [Google Scholar] [CrossRef]
  24. Robinson, P. Task complexity, task difficulty, and task production: Exploring interactions in a componential framework. Appl. Linguist. 2001, 22, 27–57. [Google Scholar] [CrossRef]
  25. Zakay, D.; Nitzan, D.; Glicksohn, J. The influence of task difficulty and external tempo on subjective time estimation. Percept. Psychophys. 1983, 34, 451–456. [Google Scholar] [CrossRef] [Green Version]
  26. Chan, A.; Hoffmann, E. Subjective estimation of task time and task difficulty of simple movement tasks. J. Mot. Behav. 2016, 49, 185–199. [Google Scholar] [CrossRef]
  27. Brown, S.W. Time perception and attention: The effects of prospective versus retrospective paradigms and task demands on perceived duration. Percept. Psychophys. 1985, 38, 115–124. [Google Scholar] [CrossRef]
  28. Schatzschneider, C.; Bruder, G.; Steinicke, F. Who turned the clock? Effects of manipulated zeitgebers, cognitive load and immersion on time estimation. IEEE Trans. Vis. Comput. Graph. 2016, 22, 1387–1395. [Google Scholar] [CrossRef] [PubMed]
  29. Binnie, C.D.; Prior, P.F. Electroencephalography. J. Neurol. Neurosurg. Psychiatry 1994, 57, 1308–1319. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  30. Choi, M.K.; Lee, S.M.; Ha, J.S.; Seong, P.H. Development of an EEG-based workload measurement method in nuclear power plants. Ann. Nucl. Energy 2018, 111, 595–607. [Google Scholar] [CrossRef]
  31. Kamiński, J.; Brzezicka, A.; Gola, M.; Wróbel, A. Beta band oscillations engagement in human alertness process. Int. J. Psychophysiol. 2012, 85, 125–128. [Google Scholar] [CrossRef]
  32. Klimesch, W. EEG alpha and theta oscillations reflect cognitive and memory performance: A review and analysis. Brain Res. Rev. 1999, 29, 169–195. [Google Scholar] [CrossRef]
  33. Engel, A.K.; Fries, P. Beta-band oscillations—signalling the status quo? Curr. Opin. Neurobiol. 2010, 20, 156–165. [Google Scholar] [CrossRef]
  34. Howells, F.M.; Stein, D.J.; Russell, V.A. Perceived Mental Effort Correlates with Changes in Tonic Arousal During Attentional Tasks. Behav. Brain Funct. 2010, 6, 39. Available online: http://www.behavioralandbrainfunctions.com/content/6/1/39 (accessed on 30 September 2021). [CrossRef] [Green Version]
  35. Murata, A. An attempt to evaluate mental workload using wavelet transform of EEG. Hum. Factors 2005, 47, 498–508. [Google Scholar] [CrossRef]
  36. Chen, Y.; Huang, X. Modulation of alpha and beta oscillations during an n-back task with varying temporal memory load. Front. Psychol. 2016, 6, 2031. [Google Scholar] [CrossRef] [Green Version]
  37. Brookings, J.B.; Wilson, G.F.; Swain, C.R. Psychophysiological responses to changes in workload during simulated air traffic control. Biol. Psychol. 1996, 42, 361–377. [Google Scholar] [CrossRef]
  38. Fairclough, S.H.; Venables, L.; Tattersall, A. The influence of task demand and learning on the psychophysiological response. Int. J. Psychophysiol. 2005, 56, 171–184. [Google Scholar] [CrossRef] [PubMed]
  39. Wilson, G.F.; Swain, C.R.; Ullsperger, P. EEG power changes during a multiple level memory retention task. Int. J. Psychophysiol. 1999, 32, 107–118. [Google Scholar] [CrossRef]
  40. Micheloyannis, S.; Vourkas, M.; Bizas, M.; Simos, P.; Stam, C.J. Changes in linear and nonlinear EEG measures as a function of task complexity: Evidence for local and distant signal synchronization. Brain Topogr. 2003, 15, 239–247. [Google Scholar] [CrossRef] [PubMed]
  41. Fernández, T.; Harmony, T.; Rodriguez, M.; Bernal, J.; Silva, J.; Reyes, A.; Marosi, E. EEG activation patterns during the performance of tasks involving different components of mental calculation. Electroencephalogr. Clin. Neurophysiol. 1995, 94, 175–182. [Google Scholar] [CrossRef]
  42. Bočková, M.; Chládek, J.; Jurák, P.; Halámek, J.; Rektor, I. Executive functions processed in the frontal and lateral temporal cortices: Intracerebral study. Clin. Neurophysiol. 2007, 118, 2625–2636. [Google Scholar] [CrossRef]
  43. Dey, A.; Chatburn, A.; Billinghurst, M. Exploration of an EEG-based cognitively adaptive training system in virtual reality. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 220–226. [Google Scholar] [CrossRef]
  44. Fadeev, K.A.; Smirnov, A.S.; Zhigalova, O.P.; Bazhina, P.S.; Tumialis, A.V.; Golokhvast, K.S. Too real to be virtual: Autonomic and EEG responses to extreme stress scenarios in virtual reality. Behav. Neurol. 2020, 2020. [Google Scholar] [CrossRef]
  45. Berka, C.; Levendowski, D.J.; Cvetinovic, M.M.; Petrovic, M.M.; Davis, G.; Lumicao, M.N.; Olmstead, R. Real-time analysis of EEG indexes of alertness, cognition, and memory acquired with a wireless EEG headset. Int. J. Hum. Comput. Interact. 2004, 17, 151–170. [Google Scholar] [CrossRef]
  46. Borghini, G.; Astolfi, L.; Vecchiato, G.; Mattia, D.; Babiloni, F. Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness. Neurosci. Biobehav. Rev. 2014, 44, 58–75. [Google Scholar] [CrossRef]
  47. Hart, S.G. NASA-task load index (NASA-TLX); 20 years later. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2006, 50, 904–908. [Google Scholar] [CrossRef] [Green Version]
  48. Fournier, L.R.; Wilson, G.F.; Swain, C.R. Electrophysiological, behavioral, and subjective indexes of workload when performing multiple tasks: Manipulations of task difficulty and training. Int. J. Psychophysiol. 1999, 31, 129–145. [Google Scholar] [CrossRef]
  49. Jaquess, K.J.; Lo, L.C.; Oh, H.; Lu, C.; Ginsberg, A.; Tan, Y.Y.; Gentili, R.J. Changes in mental workload and motor performance throughout multiple practice sessions under various levels of task difficulty. Neuroscience 2018, 393, 305–318. [Google Scholar] [CrossRef]
  50. Carswell, C.M.; Clarke, D.; Seales, W.B. Assessing mental workload during laparoscopic surgery. Surg. Innov. 2005, 12, 80–90. [Google Scholar] [CrossRef]
  51. Liu, Y.; Wickens, C.D. Mental workload and cognitive task automaticity: An evaluation of subjective and time estimation metrics. Ergonomics 1994, 37, 1843–1854. [Google Scholar] [CrossRef]
  52. LaViola, J.J., Jr. A discussion of cybersickness in virtual environments. ACM Sigchi Bull. 2000, 32, 47–56. [Google Scholar] [CrossRef]
  53. Davis, S.; Nesbitt, K.; Nalivaiko, E. A systematic review of cybersickness. In Proceedings of the 2014 Conference on Interactive Entertainment, Newcastle, Australia, 2–3 December 2014; pp. 1–9. [Google Scholar] [CrossRef]
  54. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2013. Available online: https://www.R-project.org/ (accessed on 9 September 2021).
  55. Koessler, L.; Maillard, L.; Benhadid, A.; Vignal, J.P.; Felblinger, J.; Vespignani, H.; Braun, M. Automated cortical projection of EEG sensors: Anatomical correlation via the international 10–10 system. Neuroimage 2009, 46, 64–72. [Google Scholar] [CrossRef]
  56. Carrasco, M.C.; Bernal, M.C.; Redolat, R. Time estimation and aging: A comparison between young and elderly adults. Int. J. Aging Hum. Dev. 2001, 52, 91–101. [Google Scholar] [CrossRef]
  57. Meaux, J.B.; Chelonis, J.J. Time perception differences in children with and without ADHD. J. Pediatr. Heal. Care 2003, 17, 64–71. [Google Scholar] [CrossRef]
  58. Welch, P. The use of fast Fourier transform for the estimation of power spectra: A method based on time averaging over short, modified periodograms. IEEE Trans. Audio Electroacoust. 1967, 15, 70–73. [Google Scholar] [CrossRef] [Green Version]
  59. Nussbaumer, H.J. The fast Fourier transform. In Fast Fourier Transform and Convolution Algorithms; Springer: Berlin/Heidelberg, Germany, 1981; pp. 80–111. [Google Scholar] [CrossRef]
  60. Alkan, A.; Kiymik, M.K. Comparison of AR and Welch methods in epileptic seizure detection. J. Med. Syst. 2006, 30, 413–419. [Google Scholar] [CrossRef]
  61. Virtanen, P.; Gommers, R.; Oliphant, T.E.; Haberland, M.; Reddy, T.; Cournapeau, D.; van Mulbregt, P. SciPy 1.0: Fundamental algorithms for scientific computing in Python. Nat. Methods 2020, 17, 261–272. [Google Scholar] [CrossRef] [Green Version]
  62. Lee, J.C.; Tan, D.S. Using a low-cost electroencephalograph for task classification in HCI research. In Proceedings of the 19th Annual ACM Symposium on User Interface Software and Technology, Montreux, Switzerland, 15–18 October 2006; pp. 81–90. [Google Scholar] [CrossRef]
  63. Grimes, D.; Tan, D.S.; Hudson, S.E.; Shenoy, P.; Rao, R.P. Feasibility and pragmatics of classifying working memory load with an electroencephalograph. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Florence, Italy, 5–10 April 2008; pp. 835–844. [Google Scholar] [CrossRef]
  64. Fernández, T.; Harmony, T.; Rodríguez, M.; Reyes, A.; Marosi, E.; Bernal, J. Test-retest reliability of EEG spectral parameters during cognitive tasks: I absolute and relative power. Int. J. Neurosci. 1993, 68, 255–261. [Google Scholar] [CrossRef] [PubMed]
  65. Moretti, D.V.; Babiloni, C.; Binetti, G.; Cassetta, E.; Dal Forno, G.; Ferreric, F.; Rossini, P.M. Individual analysis of EEG frequency and band power in mild Alzheimer's disease. Clin. Neurophysiol. 2004, 115, 299–308. [Google Scholar] [CrossRef] [Green Version]
  66. Bakdash, J.Z.; Marusich, L.R. Repeated measures correlation. Front. Psychol. 2017, 8, 456. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  67. Thomas, E.A.C.; Weaver, W.B. Cognitive processing and time perception. Percept. Psychophys. 1975, 17, 363–367. [Google Scholar] [CrossRef]
  68. Zakay, D. Subjective time and attentional resource allocation: An integrated model of time estimation. Adv. Psychol. 1989, 59, 365–397. [Google Scholar] [CrossRef]
  69. Polti, I.; Martin, B.; van Wassenhove, V. The effect of attention and working memory on the estimation of elapsed time. Sci. Rep. 2018, 8, 6690. [Google Scholar] [CrossRef] [Green Version]
  70. Brown, S.W. Timing, Resources, and Interference: Attentional Modulation of Time Perception. In Attention and Time; Oxford University Press: Oxford, UK, 2010; pp. 107–121. [Google Scholar]
  71. Lelis-Torres, N.; Ugrinowitsch, H.; Apolinário-Souza, T.; Benda, R.N.; Lage, G.M. Task engagement and mental workload involved in variation and repetition of a motor skill. Sci. Rep. 2017, 7, 14764. [Google Scholar] [CrossRef] [Green Version]
  72. Gevins, A.; Smith, M.E.; Leong, H.; McEvoy, L.; Whitfield, S.; Du, R.; Rush, G. Monitoring working memory load during computer-based tasks with EEG pattern recognition methods. Hum. Factors 1998, 40, 79–91. [Google Scholar] [CrossRef]
  73. Puma, S.; Matton, N.; Paubel, P.V.; Raufaste, É.; El-Yagoubi, R. Using theta and alpha band power to assess cognitive workload in multitasking environments. Int. J. Psychophysiol. 2018, 123, 111–120. [Google Scholar] [CrossRef]
  74. Kulashekhar, S.; Pekkola, J.; Palva, J.M.; Palva, S. The role of cortical beta oscillations in time estimation. Hum. Brain Mapp. 2016, 37, 3262–3281. [Google Scholar] [CrossRef] [Green Version]
  75. Ghaderi, A.H.; Moradkhani, S.; Haghighatfard, A.; Akrami, F.; Khayyer, Z.; Balcı, F. Time estimation and beta segregation: An EEG study and graph theoretical approach. PLoS ONE 2018, 13, e0195380. [Google Scholar] [CrossRef]
  76. Wiener, M.; Parikh, A.; Krakow, A.; Coslett, H.B. An intrinsic role of beta oscillations in memory for time estimation. Sci. Rep. 2018, 8, 7992. [Google Scholar] [CrossRef] [Green Version]
  77. Lecoutre, L.A.; Lini, S.; Lebour, Q.; Bey, C.; Favier, P. Evaluating EEG measures as a workload assessment in an operational video game setup. In Proceedings of the PhyCS 2015—2nd International Conference on Physiological Computing Systems, Loire Valley, France, 11–13 February 2015; pp. 112–117. Available online: https://www.scitepress.org/papers/2015/53189/53189.pdf (accessed on 30 September 2021).
  78. Espinosa-Fernández, L.; Miró, E.; Cano, M.; Buela-Casal, G. Age-related changes and gender differences in time estimation. Acta Psychol. 2003, 112, 221–232. [Google Scholar] [CrossRef]
  79. Wessel, J.R. Error awareness and the error-related negativity: Evaluating the first decade of evidence. Front. Hum. Neurosci. 2012, 6, 88. [Google Scholar] [CrossRef] [Green Version]
  80. Pezzetta, R.; Nicolardi, V.; Tidoni, E.; Aglioti, S.M. Error, rather than its probability, elicits specific electrocortical signatures: A combined EEG-immersive virtual reality study of action observation. J. Neurophysiol. 2018, 120, 1107–1118. [Google Scholar] [CrossRef]
  81. Sutcliffe, A.G.; Poullis, C.; Gregoriades, A.; Katsouri, I.; Tzanavari, A.; Herakleous, K. Reflecting on the design process for virtual reality applications. Int. J. Hum. Comput. Interact. 2019, 35, 168–179. [Google Scholar] [CrossRef]
Figure 1. (a) The low-complexity task with 8 pieces of the VR Jigsaw game; (b) the high-complexity task with 18 pieces of the VR Jigsaw game.
Figure 2. The diagram of the experiment (duration of each step is approximate).
Figure 3. Interaction plot with the absolute time estimation error (task complexity x block sequence). Error bars denote 95% confidence intervals.
Figure 4. Interaction plots with the relative beta-band power at (a) Fz, (b) Pz, and (c) Cz (task complexity x block sequence). Error bars denote 95% confidence intervals.
Figure 5. Interaction plot with NASA-TLX scores (task complexity x block sequence). Error bars denote 95% confidence intervals.
Table 1. Independent and dependent variables.

Variables             | Levels
Independent variables | Task complexity (low, high); block sequence (first, second, third)
Dependent variables   | Absolute time estimation error; relative beta-band power; NASA-TLX score
Table 2. Descriptive statistics of the measurements.

Responses                      | Low Complexity, Mean [95% CI] | High Complexity, Mean [95% CI]
Task completion time           | 47.74 [45.00, 50.47]          | 140.69 [130.98, 150.40]
Estimated time                 | 56.78 [50.32, 63.24]          | 168.07 [151.70, 184.44]
Absolute time estimation error | 18.95 [14.92, 22.99]          | 56.87 [46.86, 66.89]
NASA-TLX score                 | 3.35 [2.91, 3.79]             | 5.56 [5.04, 6.08]
Relative beta-band power at Cz | 0.20 [0.19, 0.20]             | 0.19 [0.18, 0.20]
Relative beta-band power at Fz | 0.18 [0.17, 0.19]             | 0.17 [0.16, 0.18]
Relative beta-band power at Pz | 0.20 [0.19, 0.21]             | 0.19 [0.19, 0.20]
Table 3. Correlation coefficients between dependent variables.

Variables                                                         | Coefficient | p-Value
Absolute time estimation error and relative beta-band power at Cz | −0.25       | 0.01 *
Absolute time estimation error and relative beta-band power at Fz | −0.16       | 0.11
Absolute time estimation error and relative beta-band power at Pz | −0.17       | 0.08
Absolute time estimation error and NASA-TLX score                 | 0.55        | <0.001 ***
* p < 0.05, *** p < 0.001.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Li, J.; Kim, J.-E. The Effect of Task Complexity on Time Estimation in the Virtual Reality Environment: An EEG Study. Appl. Sci. 2021, 11, 9779. https://doi.org/10.3390/app11209779

