Article

User Experience in Neurofeedback Applications Using AR as Feedback Modality

by Lisa Maria Berger 1, Guilherme Wood 1,2 and Silvia Erika Kober 1,2,*
1 Department of Psychology, University of Graz, Universitaetsplatz 2/III, 8010 Graz, Austria
2 BioTechMed-Graz, 8010 Graz, Austria
* Author to whom correspondence should be addressed.
Computers 2024, 13(5), 110; https://doi.org/10.3390/computers13050110
Submission received: 30 December 2023 / Revised: 19 March 2024 / Accepted: 16 April 2024 / Published: 23 April 2024
(This article belongs to the Special Issue Extended or Mixed Reality (AR + VR): Technology and Applications)

Abstract:
Neurofeedback (NF) is a brain–computer interface in which users can learn to modulate their own brain activation while receiving real-time feedback thereof. To increase motivation and adherence to training, virtual reality has recently been used as a feedback modality. In the present study, we focused on the effects of augmented reality (AR)-based visual feedback on subjective user experience, including positive/negative affect, cybersickness, flow experience, and experience with the use of this technology, and compared it with a traditional 2D feedback modality. Also, half of the participants received real feedback and the other half received sham feedback. All participants performed one NF training session, in which they tried to increase their sensorimotor rhythm (SMR, 12–15 Hz) over central brain areas. Forty-four participants received conventional 2D visual feedback (moving bars on a conventional computer screen) about real-time changes in SMR activity, while 45 participants received AR feedback (3D virtual flowers grew out of a real pot). The subjective user experience differed in several points between the groups. Participants from the AR group reported a tendentially higher flow score, and the AR sham group perceived a tendentially higher feeling of flow than the 2D sham group. Further, participants from the AR group reported higher technology usability, experienced a higher feeling of control, and perceived themselves as more successful than those from the 2D group. Psychological factors like these are crucial for NF training motivation and success. In the 2D group, participants reported more concern related to their performance, a tendentially higher technology anxiety, and also more physical discomfort. These results show the potential advantage of AR-based feedback in NF applications over traditional feedback modalities.

1. Introduction

Neurofeedback (NF) is a technique to teach users how to modulate neuronal signals, i.e., their own brain activation. Most commonly, the electroencephalogram (EEG) is used to record brain activation, which is then pre-processed in real time to extract relevant features of the brain signal that should be modulated in a desired direction during NF training. For instance, the sensorimotor rhythm (SMR) between 12 and 15 Hz, which is strongest over central brain areas, is often used as a feedback frequency in NF applications since this EEG rhythm is associated with optimal cognitive processing [1,2]. So, if the aim of the NF training is to improve cognitive functions (e.g., memory functions), the SMR can be extracted in real time from the ongoing EEG of the NF user, and finally, changes in SMR activity are fed back visually, auditorily or in a tactile manner to the users. When obtaining, for instance, visual feedback of changes in one’s own SMR activity, NF users can try different mental strategies to voluntarily increase this brain signal [3]. If the users are successful in up-regulating their SMR rhythm, cognitive improvements may follow [1].
Most NF applications use visual feedback. Conventional visual feedback designs are rather simple and restricted to presentation on a computer screen. Although such simple feedback designs clearly present the training task without any distraction (e.g., the circle in the middle of the screen should become larger), participants might become bored over the course of several training sessions. It has been shown in previous NF studies that psychological factors such as motivation and attention are paramount for NF training success [4,5,6]. Thus, more and more approaches using more entertaining or visually appealing feedback, such as game-like feedback scenarios or virtual reality (VR)-based 3D feedback, can be found in the literature [7,8]. Instead of regulating the height of objects on a 2D screen, participants alter features in a whole virtual environment, move through a 3D forest environment [8], modify adaptive VR games with NF [9], or are placed in an inviting virtual beach environment [10]. Results of VR-based feedback are promising: in a variety of populations, such as stroke patients and healthy young adults, participants show higher training motivation and better NF training results in VR groups compared to conventional 2D groups. However, there are still unanswered questions, such as who is the right target group for VR-based feedback [7,11], whether VR-based NF can be used for home-based training [12], or what the optimal VR technology is (e.g., head-mounted display vs. 3D presentation on a screen). Also, VR poses problems of transferring training results to real-world settings on the one hand, and on the other hand, it entails the problem of cybersickness (CS) for the users.
Using visually rich VR scenarios as feedback screens, such as a forest environment, might also distract the attention of the NF users from the NF task or lead to cognitive overload, which might be counterproductive for the modulation of the neuronal signals during NF training [13]. However, the majority of VR-based NF training studies report positive effects [8,14]. Also, in BCI training settings, fewer classification errors are shown when VR paradigms are used [15]. Generally, there are lots of interesting and creative ways to realize BCI and neurofeedback paradigms in a fun and game-like manner. Suggestions are proposed in previous research [16,17].
While the number of VR-based NF training studies is steadily increasing, augmented reality-based NF is still very new in this field. Augmented reality (AR) describes the extension of the real world by adding virtual objects [18]. It is closer to reality than VR approaches [19,20], which is why there are fewer visual distractions compared to VR. This can lead to better focus on the NF task at hand when working memory is not overloaded by irrelevant virtual stimuli, which has been found multiple times in the literature on game-based learning [13]. AR-based feedback can also be easily implemented in home-based training settings, as many smartphones and tablets already support AR, see list: https://developers.google.com/ar/devices (accessed on 15 April 2024). Some research groups have focused on the first development and implementation of AR features. A group focusing on autism spectrum disorder built an AR smartphone app, called Eggly, in a dynamic participant-driven process. It works with consumer-grade EEG systems for neurofeedback purposes and has been tested on five children with autism [21]. It was effectively used to improve impaired social attention functions in children with autism. Other research groups tested AR feedback effects on meditation with an iPhone AR application. Participants would hatch butterflies in the room by meditating [22], either with or without EEG-NF.
However, there are only a few studies using AR in NF research and comparing it with traditional feedback. One previous study used a webcam to overlay virtual brains on the participants as an AR paradigm, and as a control condition, a temporal gauge served as 2D feedback [23]. Participants reported the AR paradigm to be more engaging than the traditional feedback but, at the same time, more complex. Further, a steady-state visual evoked potential (SSVEP) AR-NF set-up has been used to influence affect-biased attention in five adolescent girls. Here, a Gabor patch was projected onto a wall that participants would see through AR glasses [24].
Several researchers propose the need for more research in this field, especially when it comes to user experience [25]. What is still lacking are studies including sham feedback and the inclusion of a broader set of questionnaires to evaluate the subjective user experience more thoroughly.
Prior VR studies showed that a VR-based feedback condition can lead to negative emotional reactions such as fear or lower values in mastery confidence compared to simple 2D feedback screens [11]. The appearance of such negative emotional responses when using VR as a feedback modality in NF training might be related to prior experience with this technology, age, or medical conditions. For instance, in the study by Kober et al., stroke patients showed such negative emotional reactions when using VR-based NF training [11]. Further, CS can also be a problem in AR environments. Previous research using the HoloLens showed an association with oculomotor problems, i.e., higher experiences of eyestrain, blurred vision, etc. [26]. It remains open whether this is transferable to other AR systems as well as to other, more static paradigms. Hence, when using a new technology, such as AR as a feedback technology in NF applications, it is mandatory to evaluate the subjective user experience in close detail.
Therefore, we conducted a study comparing the user experience of participants undergoing one session of SMR-based NF training, either receiving conventional 2D feedback or AR feedback, as well as real or sham feedback. Here, our main focus was to see whether AR feedback in general would lead to a different user experience than 2D feedback, independently from more possible adapted or complex designs. Therefore, we chose a paradigm related to the traditional 2D paradigms, showing virtual plants growing and shrinking out of actual plant pots.
We expected to observe group differences in the user experience of how the technology and individual performance after training are perceived, subjective feelings of flow or presence, and CS. In line with previous VR-based NF training studies [27], we assessed mood, CS, flow, participants’ attitude (e.g., technology acceptance, use, fear, curiosity, interest, skepticism) towards VR technology, subjective VR experience (immersion), subjectively experienced level of control and concentration during NF training, perceived success, fun, as well as physical conditions such as pressure on the head that can occur when wearing the VR headset over the EEG electrodes [27]. In the present AR-based NF study, we also compared the user experience of a group receiving one session of AR-based NF training with that of a group receiving one session of 2D conventional NF training.
Finally, in order to investigate whether the results of subjective questionnaire data are associated with NF performance, we calculated correlations between the subjective experience and the SMR power over the six feedback runs.

2. Methods

2.1. Participants

Ninety-nine healthy young adults participated in this study. Due to poor EEG data quality and problems with the paradigm, we had to exclude ten datasets. Eighty-nine participants remained for analysis. N = 44 participants performed the 2D NF task, and N = 45 participants performed the AR NF task (see Table 1). Participants from both groups had comparable previous VR experience. All volunteers had normal or corrected-to-normal vision, no neurological, psychological, or other severe diseases, and no reflex epilepsy. Other inclusion criteria were right-handedness and an age range between 18 and 34 years. Participants were recruited through e-mail distribution lists at our university as well as student forums. They gave written informed consent and were either paid for their participation (EUR 16) or received research credit hours for their Psychology Bachelor program.

2.2. Neurofeedback Training: 2D vs. AR Feedback

All participants performed SMR-based EEG-NF training, where half of the participants received their actual brain activation fed back during training and the other half received sham feedback. Here, brain activation from participants of another study undergoing similar VR-NF training was fed back [8]. Also, half of the participants received 2D feedback on a computer screen, and the other half received AR feedback. In the 2D feedback, participants saw three green bars on a dark blue background moving up and down according to changes in brain activation (see Figure 1A). The two outer bars (representing Theta, 4–7 Hz, and Beta, 16–30 Hz frequency bands) needed to be held as low as possible, and the middle bar (representing SMR, 12–15 Hz) should be held as high as possible. In the AR group, three real plant pots were placed on a table in front of the participants, who should let virtual flowers grow out of those real plant pots with their brain activation (see Figure 1B). The two outer plants, representing Theta and Beta, were shown as simple tendrils, and the plant in the middle, representing SMR, showed a rose bush; the instructions were the same as in the 2D group. Participants were randomly assigned to one of the four groups.
The NF training consisted of one initial baseline run of 3 min and six subsequent training runs of 3 min each. During the baseline run, participants were instructed to relax and watch the feedback object move automatically without trying to control it. After the baseline run, individual thresholds for the subsequent NF runs were calculated. The SMR threshold was calculated as the median SMR power value during the three minutes of the baseline run. The Beta and Theta thresholds were calculated to be the median plus the median absolute deviation and were also fed back to the users to control artifacts such as blinking and eye movements (Theta) or facial movements (Beta) [11,28]. The electrode Cz was used as the feedback electrode. During training, participants were instructed to increase the bar/flower in the middle as much as possible while at the same time keeping the two outer bars/flowers as low as possible. This state can be achieved by being physically relaxed and mentally focused, which is the aim of the training program. The sham group got feedback on changes in brain activation from another person from another study [8] with the same instructions, but only for SMR. Changes in Beta and Theta were linked to their own EEG activity to make sure artifacts were controlled and participants got the feeling the feedback was real.
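The threshold rule described above can be sketched as follows. This is a minimal numpy sketch, assuming the baseline band-power values are available as plain arrays; function and variable names are illustrative, not taken from the original training software:

```python
import numpy as np

def nf_thresholds(baseline_smr, baseline_theta, baseline_beta):
    """Compute per-participant feedback thresholds from the 3-min baseline run.

    SMR threshold: median baseline SMR power.
    Theta/Beta thresholds: median plus median absolute deviation (MAD),
    used to flag eye (Theta) and facial-muscle (Beta) artifacts.
    """
    def mad(x):
        # median absolute deviation around the median
        return np.median(np.abs(x - np.median(x)))

    smr_thr = np.median(baseline_smr)
    theta_thr = np.median(baseline_theta) + mad(baseline_theta)
    beta_thr = np.median(baseline_beta) + mad(baseline_beta)
    return smr_thr, theta_thr, beta_thr
```

The median/MAD combination makes the thresholds robust to transient artifacts during the baseline run, which would inflate mean-based thresholds.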

2.3. AR-Technology

The AR paradigm was presented via the HTC Vive Pro VR system and the ZED Mini stereoscopic camera from Stereolabs, SDK Unity plugin version 3.8.0. VR environments were created in Unity, version 2020.3.30f1 (see Figure 2). For superimposing the virtual objects in the AR setting via QR codes, the free Unity trial version of OpenCV (version 2.4.8) was used. For real-time EEG data streaming, the LSL4Unity plugin, freely available at https://github.com/labstreaminglayer/LSL4Unity (accessed on 15 April 2023), was used in combination with OpenViBE, version 3.3.0. OpenViBE is a software program to stream and preprocess EEG data in real time. The frame rate of both the camera and the computer screen was set to 60 FPS. Even though the 2D group got their paradigm presented on a computer screen, they also had to wear the VR-AR system to rule out any group differences related to wearing the system, such as headaches or pressure sensations. Here, the camera was simply switched on, so participants would also see their surroundings through the camera. Thus, the only difference was that the AR group would see virtual objects appearing in the lab environment, whereas participants in the 2D group would see the paradigm on the screen.

2.4. EEG Data Recording and Analysis

EEG data were recorded with the g.tec USBamp RESEARCH EEG amplifier from g.tec medical engineering GmbH and the open-source software OpenViBE, version 3.3.0. Sixteen sintered Ag/AgCl passive ring electrodes were placed according to the standardized 10–20 EEG system, and the Cz electrode was used to give feedback. The ground was placed at FPz, and electrodes were directly referenced against the left mastoid. Offline data processing was performed in Brain Vision Analyzer (version 2.2, Brain Products GmbH, Munich, Germany). A 50 Hz notch filter was applied, as well as a low cut-off filter of 0.01 Hz and a high cut-off filter of 100 Hz. During the raw data inspection, large muscle artifacts and heavy drifts were excluded. A linked mastoid reference was calculated to rule out hemisphere artifacts, as data were referenced to the left mastoid during recording. In the next step, blinks and eye movements were eliminated via a semi-automatic independent component analysis (ICA). Remaining artifacts were excluded in a second semi-automatic data inspection (criteria for rejection: maximum allowed voltage step of 50 µV/ms, maximum allowed difference between values in a segment of 200 µV, amplitudes ± 120 µV, lowest allowed activity in 100 ms intervals of 0.5 µV; artifacts were marked 200 ms before and after emergence). Lastly, the frequency power band in the range of 12–15 Hz (SMR) was extracted using complex demodulation.
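The complex demodulation step can be illustrated as follows: the band centre is shifted to 0 Hz by multiplication with a complex exponential, low-pass filtered at half the bandwidth, and squared. This is a sketch of the general technique only; the filter length, sampling rate, and scaling here are illustrative assumptions, not the actual Brain Vision Analyzer settings:

```python
import numpy as np

def band_power_complex_demod(x, fs, f_lo=12.0, f_hi=15.0, n_taps=501):
    """Instantaneous SMR-band (12-15 Hz) power via complex demodulation."""
    t = np.arange(len(x)) / fs
    f0 = 0.5 * (f_lo + f_hi)                  # 13.5 Hz band centre
    demod = x * np.exp(-2j * np.pi * f0 * t)  # shift band centre to 0 Hz
    # Hamming-windowed sinc low-pass, cutoff = half the bandwidth (1.5 Hz)
    fc = 0.5 * (f_hi - f_lo)
    n = np.arange(n_taps) - (n_taps - 1) / 2
    h = np.sinc(2 * fc / fs * n) * np.hamming(n_taps)
    h /= h.sum()                              # unit DC gain
    y = np.convolve(demod, h, mode="same")
    return 2.0 * np.abs(y) ** 2               # band power over time
```

For a pure 13.5 Hz sine of amplitude 1, the estimate settles near 0.5 (half the squared amplitude), while out-of-band activity is suppressed by the low-pass stage.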

2.5. Questionnaires

2.5.1. PANAS

To assess positive and negative affect before and after the NF task, we used the German version of the Positive and Negative Affect Schedule (PANAS) [29]. Ten items assess positive affect (e.g., excited, inspired), and ten items assess negative affect (e.g., upset, afraid) on a five-point Likert scale, ranging from 1 = “Not at all” to 5 = “Extremely”, of which the average values were calculated per scale, respectively. For statistical analyses, we calculated the difference values between the PANAS post (assessed after the NF task) minus the PANAS pre (assessed before the NF task) values. Positive difference values indicate an increase in affective responses during the NF task. Several previous studies used the PANAS to evaluate positive and negative mood as well as mood changes in their participants during a BCI-VR task [30,31].
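The scoring convention above (scale means, then post minus pre) can be sketched as follows; the item ordering is a hypothetical layout for illustration, not the actual PANAS item order:

```python
import numpy as np

def panas_change(pre_items, post_items):
    """Score a pre/post PANAS pair: 20 Likert ratings (1-5) each.

    By assumption here, the first 10 items form the positive-affect
    scale and the last 10 the negative-affect scale. Scale scores are
    item means; positive differences mean an increase during the task.
    """
    pre = np.asarray(pre_items, dtype=float)
    post = np.asarray(post_items, dtype=float)
    pa_change = post[:10].mean() - pre[:10].mean()
    na_change = post[10:].mean() - pre[10:].mean()
    return pa_change, na_change
```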

2.5.2. SSQ

CS symptoms were assessed with the Simulator Sickness Questionnaire (SSQ), a self-report questionnaire [32], before and after the NF training. Possible sickness symptoms such as nausea, sweating, eyestrain, fatigue, etc. have to be rated on a four-point Likert scale (0 = “not at all” to 3 = “very strong”). Items were merged into three different sub-scales: Nausea (e.g., sweating, nausea), oculomotor problems (e.g., eyestrain), and disorientation (e.g., dizziness). The total score includes all 16 items. The average of the raw scores for the items per sub-scale was then multiplied by a constant, which differs for each sub-factor [32]. For statistical analyses, we calculated the difference value between the SSQ post (assessed after the NF task) minus the SSQ pre (assessed before the NF task) values. Positive difference values indicate an increase in CS symptoms during the NF task. The SSQ is a standard inventory in VR and AR studies [33].
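The weighting step can be sketched as follows, assuming the standard SSQ constants from Kennedy et al. [32] (Nausea 9.54, Oculomotor 7.58, Disorientation 13.92); as described above, the constants are applied to the item means per sub-scale:

```python
# Standard SSQ weighting constants (Kennedy et al., 1993); applied here
# to item means per sub-scale, following the scoring described in the text.
SSQ_WEIGHTS = {"nausea": 9.54, "oculomotor": 7.58, "disorientation": 13.92}

def ssq_scores(item_means):
    """item_means: dict mapping sub-scale name -> mean raw rating (0-3)."""
    return {k: item_means[k] * w for k, w in SSQ_WEIGHTS.items()}

def ssq_change(pre, post):
    """Post-minus-pre difference per sub-scale; positive = symptoms increased."""
    s_pre, s_post = ssq_scores(pre), ssq_scores(post)
    return {k: s_post[k] - s_pre[k] for k in SSQ_WEIGHTS}
```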

2.5.3. FKS

The Flow Short Scale (original German title: “Flow-Kurzskala”, FKS) was used to assess participants’ flow experience during the NF training in accordance with Csikszentmihalyi’s flow theory [34]. It measures the perceived fluency of a task (subscale “fluency”, 6 items, α = 0.92), immersion and absorption (subscale “absorption”, 4 items, α = 0.80), and concern about the task (subscale “concern”, 3 items, α = 0.80 to 0.90) on a 7-point Likert scale ranging from 1 = “strong disagreement” to 7 = “strong agreement” with the statement (averages of the items per sub-scale were calculated for statistical analyses). Additionally, the average of the sub-scales “fluency” and “absorption” was calculated to obtain a general flow factor [35]. Participants were asked, among other things, whether they had a hard time focusing and whether they forgot their surroundings during the training. The questionnaire was chosen because flow was found to be positively associated with NF performance [14].

2.5.4. TUI

To assess the participants’ attitude (e.g., technology acceptance, use) towards VR technology, we used a shortened version of the Technology Usage Inventory (TUI) [36], which consists of 23 items. The items are merged (average of all items per sub-scale) into six sub-scales: curiosity, anxiety, interest, immersion, skepticism, and user-friendliness. Curiosity and anxiety were assessed before the NF training; the other sub-scales were assessed after the NF training. Participants answered each item on a 7-point Likert scale ranging from 1 = “not at all” to 7 = “totally”. Example questions were whether participants worry that newer technological devices could be overwhelming, whether they are curious to use an AR system, and whether they have a hard time trusting technical devices. This questionnaire was chosen because factors such as technology anxiety can lead to worse training performance [6], whereas immersion is positively associated with it [37].

2.5.5. VAS

To evaluate subjective well-being during the VR-EEG measurements, we used a self-constructed visual analog scale to evaluate whether any pain appeared due to the VR goggles and electrodes during the experiment. Participants had to set a point on a continuous line between 0 and 100, where higher values mean a higher experience of pain/discomfort. For example, it was asked whether pressure on the head or headache was experienced throughout the experiment, with the opportunity to explain it in more detail. This questionnaire has already been used in several previous studies [27].

2.5.6. Statistical Analysis

Statistical analyses were performed using R version 4.2.1. To calculate the group differences of feedback group (AR vs. 2D) and condition (real vs. sham) concerning the respective questionnaire outcomes, ANOVAs with two between-subject factors were calculated. For significant interaction effects, Tukey HSD post hoc tests were calculated.
To investigate whether results of subjective questionnaire data were associated with NF performance, we calculated correlations for all four groups between questionnaire data and NF performance (individual regression slopes with feedback run number as a predictor and SMR power as a criterion) for questionnaire data where significant group differences emerged to keep the number of single correlation analyses to a minimum. For the individual regression slopes, positive values indicate a linear increase in SMR power, and negative values indicate a linear decrease in SMR power across the feedback runs.
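The per-participant performance measure and the correlation step can be sketched as follows. The original analyses were run in R; this numpy sketch with illustrative variable names mirrors the same computation:

```python
import numpy as np

def nf_performance_slope(smr_power_per_run):
    """Individual NF performance: slope of a linear regression with
    feedback run number (1..6) as predictor and SMR power as criterion.
    Positive slope = linear SMR increase across the session."""
    runs = np.arange(1, len(smr_power_per_run) + 1)
    slope, _intercept = np.polyfit(runs, smr_power_per_run, 1)
    return slope

def slope_questionnaire_correlation(smr_runs_by_subject, questionnaire_scores):
    """Pearson correlation between per-subject slopes and a questionnaire
    score, computed within one group."""
    slopes = np.array([nf_performance_slope(s) for s in smr_runs_by_subject])
    return np.corrcoef(slopes, questionnaire_scores)[0, 1]
```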

3. Results

Table 2 summarizes the means and SD of questionnaire data and the results of the statistical comparisons.
PANAS. Participants from all four groups had comparable values of positive affect. There was a significant group difference between real and sham feedback concerning negative affect, with a lower decrease in values pre- and post-training in the real feedback group (F(1,81) = 5.87, p = 0.018, η2p = 0.04) compared to the sham feedback group.
SSQ. There was a tendential difference between real and sham feedback in nausea values (F(1,82) = 3.22, p = 0.077, η2p = 0.05), with higher values in the real group compared to the sham group. Also, there was a tendential interaction effect between task and condition (F(1,82) = 3.65, p = 0.060, η2p = 0.04), with higher nausea values in the AR real feedback group compared to the AR sham feedback group (see Table 2).
Flow. Participants in the AR group reported tendentially higher flow than the 2D group (F(1,85) = 3.37, p = 0.070, η2p = 0.04), and there was a tendential interaction effect between feedback group and condition, with a higher subjective experience of fluency during the NF training in the AR sham group compared to the 2D sham group (F(1,85) = 3.94, p = 0.050, η2p = 0.04). Participants from the 2D group reported higher concerns during the training compared to the AR group (F(1,82) = 4.24, p = 0.043, η2p = 0.05). Also, there was a tendentially higher total flow score in the AR group (F(1,84) = 3.95, p = 0.050, η2p = 0.04). There were no differences between sham and real feedback.
TUI. Results show that participants in the 2D group experienced tendentially higher technology anxiety compared to the AR group (F(1,84) = 3.78, p = 0.055, η2p = 0.04). Further, the AR group rated the usability of the technology higher than the 2D group (F(1,83) = 9.73, p = 0.003, η2p = 0.10). However, they rated the technology as significantly less accessible than the 2D group (F(1,83) = 5.20, p = 0.025, η2p = 0.06). There were no differences between sham and real feedback.
Questionnaire on Experience. Participants of the AR group reported a significantly higher subjective feeling of control during the NF task (F(1,85) = 10.17, p = 0.002, η2p = 0.11), as well as a higher perceived success compared to the 2D group (F(1,85) = 8.26, p = 0.005, η2p = 0.09). There were no differences between sham and real feedback.
Pain/Discomfort. Participants in the 2D group reported higher feelings of discomfort. They experienced more headaches compared to the participants of the AR group (F(1,74) = 4.83, p = 0.031, η2p = 0.06). There were no differences between sham and real feedback.
For questionnaire data, where the ANOVAs revealed significant group differences, correlation analyses were performed between the NF performance (SMR regression slopes) and the questionnaire data for all four groups. A positive association in the AR real feedback group was found for NF performance and accessibility (r = 0.57, p = 0.003).

4. Discussion

In the present study, we evaluated the differences in user experience during a 2D vs. AR NF training session. The user experience of the participants differed in several points. Participants from the AR group reported a higher subjective feeling of flow, perceived a higher technology usability, experienced a higher feeling of control, and perceived themselves as more successful than those from the 2D group. Psychological factors like these are crucial for NF training motivation and success. In the 2D group, participants reported more concern related to their performance, a higher level of technology anxiety, and also more physical discomfort.
Participants from the real feedback group experienced tendentially more nausea symptoms than those from the sham feedback group. However, it is not the case that participants from the real feedback group showed a strong increase in nausea symptoms; rather, they had comparable levels pre- and post-training, whereas the sham feedback group showed a slight decrease in symptoms. Participants receiving real feedback also had more comparable negative affect pre- and post-training, while those receiving sham feedback showed a slight decrease after the training. This could also be a result of the slightly higher nausea values of participants in the real feedback group. Also, there is a tendential interaction effect with the feedback group (AR vs. 2D), with higher values in the AR real group compared to the AR sham group. This is contrary to previous studies showing that task control can reduce sickness symptoms [38]. It could be that participants from the sham group became more passive during the training because they could not control the paradigm, whereas participants from the AR paradigm were more engaged and hence more affected by nausea symptoms. However, this interaction is only a tendency and should not be over-interpreted. The groups (AR vs. 2D) did not differ in their experience of CS. CS is a major problem in VR, where up to 80% of users are affected [33,39]. AR has the advantage that CS is less prevalent among users. Moreover, the symptom profiles of VR and AR differ: for VR, Disorientation > Nausea > Oculomotor problems is the most apparent pattern, whereas in AR studies, users describe a pattern of Oculomotor problems > Disorientation > Nausea [26]. We also found this pattern in our groups, and as both groups wore the AR set-up, the experiences were comparable between groups. This means that symptoms such as eye strain and fatigue are more prevalent than disorientation and sickness in AR systems.
AR cameras can vary in their frame rates, so that participants experience a small time lag when moving the head, which can be especially straining for the eyes and result in a feeling of sickness [40]. However, in the present NF task, participants did not have to move their heads. They only had to watch objects moving up and down. This might explain why CS symptoms were relatively low in the present study and comparable between both groups. However, as the AR plants might have wiggled when participants moved their heads, which in turn could lead to eye strain or dizziness, we nevertheless compared both groups. The exposure time was also only moderate, at about 21 min in total (7 × 3 min). Previous studies on AR and CS suggest an exposure time of about 20 min for a good user experience [26]. In summary, only minor CS symptoms occurred with the current AR set-up.
Participants in the 2D group reported more physical discomfort during the NF training compared to the AR group. The 2D group reported a stronger headache than the AR group. In our study, both groups wore the VR headset above the EEG electrodes during NF training to reach comparable technical conditions. Prior VR-based NF studies using the same VR headset mounted over EEG electrodes also reported some sort of physical discomfort during the NF training. Berger et al. found that mainly female participants report such negative physical conditions while wearing the VR headset during NF training [27]. In the present study, a lower subjectively experienced physical discomfort in the AR group during NF training compared to the 2D group might be related to a stronger involvement or engagement in the task [41,42], which is also reflected in a tendentially higher flow experience in the AR group and a higher feeling of control and perceived success in the AR group compared to the 2D group. Control beliefs have been shown to be important for good NF performance [43], and flow has also been associated with faster learning progress in NF tasks [14]. Such increased task engagement may have reduced participants’ attention to the physical conditions [41,42]. Increased attention to the NF task has already been shown to have a positive effect on NF task performance [4], training motivation [5], and training adherence, suggesting that AR-based NF training could have positive effects on future NF applications.
Although not statistically significant, flow experience during NF training was tendentially higher in the AR group than in the 2D group. The AR feedback was experienced as more fluent than the 2D condition, showing that the movement of the virtual flower could be pursued more smoothly than the movement of the 2D bars. A smooth pursuit of the visual feedback might be beneficial for NF control, which again might be related to the higher level of control subjectively perceived during NF training in the AR group. Previous research has shown that this also corresponds to improved NF training performance. Gruzelier et al. compared a VR to a 2D setting in SMR NF training in order to improve acting performance [14]. They reported higher subjective feelings of flow in the VR training group and also a quicker SMR increase during the NF training, which in turn had beneficial effects on the acting performance of the participants.
The 2D condition led to more negative reactions, such as a trend toward higher technology fear, a lower usability rating, and greater concern about the technology. Prior VR-based NF studies report such increased negative reactions especially in older patient populations with limited prior experience with VR technology. In the present study, we tested healthy young adults with comparable prior VR experience, so we can only speculate about the reasons for the group differences in fear and concern. The 2D group saw the moving bars on a computer screen through the VR headset and wore the headset only to keep the technical conditions comparable between groups during NF training. Hence, the VR headset had no function in the 2D condition, and some participants asked why they were wearing it at all. This could have irritated the 2D group, contributed to the increased reports of fear and anxiety and the reduced usability rating, and thereby inflated the observed group differences. Future AR-based NF training studies should therefore compare the AR condition to a traditional 2D feedback set-up without a VR headset. Also, as participants in the 2D group felt less control over the paradigm and tended to experience less flow, this might further explain why they rated the technology as less usable than the AR group, who felt more in control.
Although the 2D group seemed irritated by wearing the VR headset, they rated the accessibility of the 2D NF task higher than the AR group rated theirs, presumably because AR-based feedback appears more complex and time-consuming to develop and implement than 2D feedback.
In contrast, the AR condition led to more positive reactions, such as a higher degree of subjective control during NF training and a greater perception of success. These results are in line with previous research on this topic: a meditation study with participants suffering from anxiety and/or depression showed increased positive mood after an AR meditation intervention [22]. However, the AR and 2D groups did not differ in positive or negative affect as assessed with the PANAS questionnaire.
For questionnaire scales on which the ANOVAs revealed significant group differences, correlation analyses were performed between NF performance (SMR regression slopes) and the questionnaire data. We found a significant positive correlation between NF performance and accessibility in the AR group with real feedback. It has previously been shown that a positive attitude towards the paradigms used, such as interest and motivation, can be beneficial for NF performance [11]. The more accessible a technology appears, the more relevant it may feel to its users. Hence, it is important to make participants comfortable with the set-up before training to enable good NF performance.

5. Limitations

The study design poses some limitations that need to be considered in future work. First, both groups wore the VR headset with the attached AR camera. While this rules out group differences related to wearing a system that becomes heavy over time, it also poses problems: some participants in the 2D non-AR group reported being irritated by wearing a system that was not needed for the training, which could have influenced their user experience in an unnatural way. In future studies, only the groups that actually need the system for the task should wear it.
Further, we did not ask about previous AR experience. Since prior usage experience can be associated with subjective user experience, such as sickness [44], this additional information would have been useful for interpreting the results.

6. Conclusions and Implications

In the present study, we showed differences in the user experience of participants undergoing either 2D or AR NF training. Participants in the AR group perceived themselves as more successful and more in control, whereas those in the 2D group reported more technology anxiety, rated the technology as less usable, and verbalized more irritation, as they also wore the AR headset for the sake of comparability. This aspect of the experimental design has to be taken into account when interpreting the results; in future studies, it would be interesting to let the 2D group train without the full set-up to see whether the negative experiences are bound to this irritation. It seems, however, that AR technology is a promising tool for NF training: participants experienced only little cybersickness and reported more of the positive psychological factors that are in turn associated with higher NF training success. Whether AR-supported training actually leads to better NF performance and training outcomes needs to be investigated in further studies.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/computers13050110/s1, Supplementary Material A: R Code for statistical analysis of questionnaire data.

Author Contributions

S.E.K. and L.M.B. were involved in designing the research, conceptualization, and methodology; S.E.K. was responsible for project administration; S.E.K., G.W. and L.M.B. contributed resources, software, and/or analytic tools; S.E.K. was involved in formal analysis and validation; S.E.K. and L.M.B. were involved in data interpretation; S.E.K. performed visualization of the data; S.E.K. and L.M.B. wrote the original draft. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding. Open access funding was provided by the University of Graz.

Informed Consent Statement

All participants gave written informed consent before the start of the measurement. The ethics committee of the University of Graz, Austria, approved all aspects of the present study in accordance with the Declaration of Helsinki (GZ. 39/119/63 ex 2021/22).

Data Availability Statement

The data that support the findings of this study are provided as Supplementary Materials.

Acknowledgments

The authors acknowledge the financial support of the University of Graz. This work was supported by the Field of Excellence COLIBRI (Complexity of Life in Basic Research and Innovation, University of Graz). The authors also thank Johanna Krauss and Christa Hoefler for data curation.

Conflicts of Interest

The authors declare that they have no competing interests.

References

  1. Gruzelier, J.H. EEG-neurofeedback for optimising performance. I: A review of cognitive and affective outcome in healthy participants. Neurosci. Biobehav. Rev. 2014, 44, 124–141.
  2. Sterman, M.B. Physiological origins and functional correlates of EEG rhythmic activities: Implications for self-regulation. Appl. Psychophysiol. Biofeedback 1996, 21, 3–33.
  3. Kober, S.E.; Witte, M.; Ninaus, M.; Neuper, C.; Wood, G. Learning to modulate one’s own brain activity: The effect of spontaneous mental strategies. Front. Hum. Neurosci. 2013, 7, 695.
  4. Hammer, E.M.; Halder, S.; Blankertz, B.; Sannelli, C.; Dickhaus, T.; Kleih, S.; Müller, K.-R.; Kübler, A. Psychological predictors of SMR-BCI performance. Biol. Psychol. 2012, 89, 80–86.
  5. Kleih, S.C.; Nijboer, F.; Halder, S.; Kübler, A. Motivation modulates the P300 amplitude during brain-computer interface use. Clin. Neurophysiol. 2010, 121, 1023–1031.
  6. Kadosh, K.C.; Staunton, G. A systematic review of the psychological factors that influence neurofeedback learning outcomes. Neuroimage 2019, 185, 545–555.
  7. Ninaus, M.; Kober, S.E.; Friedrich, E.V.; Dunwell, I.; De Freitas, S.; Arnab, S.; Ott, M.; Kravcik, M.; Lim, T.; Louchart, S.; et al. Neurophysiological methods for monitoring brain activity in serious games and virtual environments: A review. IJTEL 2014, 6, 78.
  8. Berger, L.M.; Wood, G.; Kober, S.E. Effects of virtual reality-based feedback on neurofeedback training performance-A sham-controlled study. Front. Hum. Neurosci. 2022, 16, 952261.
  9. Abdessalem, H.B.; Frasson, C. Real-time Brain Assessment for Adaptive Virtual Reality Game: A Neurofeedback Approach. In Brain Function Assessment in Learning; Frasson, C., Kostopoulos, G., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 133–143.
  10. Vourvopoulos, A.; Pardo, O.M.; Lefebvre, S.; Neureither, M.; Saldana, D.; Jahng, E.; Liew, S.-L. Effects of a Brain-Computer Interface with Virtual Reality (VR) Neurofeedback: A Pilot Study in Chronic Stroke Patients. Front. Hum. Neurosci. 2019, 13, 210.
  11. Kober, S.E.; Reichert, J.L.; Schweiger, D.; Neuper, C.; Wood, G. Effects of a 3D Virtual Reality Neurofeedback Scenario on User Experience and Performance in Stroke Patients. In Games and Learning Alliance; Bottino, R., Jeuring, J., Veltkamp, R.C., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 83–94.
  12. Arpaia, P.; Coyle, D.; Esposito, A.; Natalizio, A.; Parvis, M.; Pesola, M.; Vallefuoco, E. Paving the Way for Motor Imagery-Based Tele-Rehabilitation through a Fully Wearable BCI System. Sensors 2023, 23, 5836.
  13. Rey, G.D. A review of research and a meta-analysis of the seductive detail effect. Educ. Res. Rev. 2012, 7, 216–237.
  14. Gruzelier, J.; Inoue, A.; Smart, R.; Steed, A.; Steffert, T. Acting performance and flow state enhanced with sensory-motor rhythm neurofeedback comparing ecologically valid immersive VR and training screen scenarios. Neurosci. Lett. 2010, 480, 112–116.
  15. Ron-Angevin, R.; Díaz-Estrella, A. Brain–computer interface: Changes in performance using virtual reality techniques. Neurosci. Lett. 2009, 449, 123–127.
  16. Lécuyer, A. BCIs and Video Games: State of the Art with the OpenViBE2 Project. In Brain–Computer Interfaces 2; Clerc, M., Bougrain, L., Lotte, F., Eds.; Wiley: Hoboken, NJ, USA, 2016; pp. 85–99.
  17. Marshall, D.; Coyle, D.; Wilson, S.; Callaghan, M. Games, Gameplay, and BCI: The State of the Art. IEEE Trans. Comput. Intell. AI Games 2013, 5, 82–99.
  18. Carmigniani, J.; Furht, B.; Anisetti, M.; Ceravolo, P.; Damiani, E.; Ivkovic, M. Augmented reality technologies, systems and applications. Multimed. Tools Appl. 2011, 51, 341–377.
  19. Milgram, P.; Kishino, F. A Taxonomy of Mixed Reality Visual Displays. IEICE Trans. Inf. Syst. 1994, 12, 1–15.
  20. Skarbez, R.; Smith, M.; Whitton, M.C. Revisiting Milgram and Kishino’s Reality-Virtuality Continuum. Front. Virtual Real. 2021, 2, 647997.
  21. Lyu, Y.; An, P.; Xiao, Y.; Zhang, Z.; Zhang, H.; Katsuragawa, K.; Zhao, J. Eggly: Designing Mobile Augmented Reality Neurofeedback Training Games for Children with Autism Spectrum Disorder. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2023, 7, 1–29.
  22. Viczko, J.; Tarrant, J.; Jackson, R. Effects on Mood and EEG States After Meditation in Augmented Reality with and without Adjunctive Neurofeedback. Front. Virtual Real. 2021, 2, 618381.
  23. Mercier-Ganady, J.; Lotte, F.; Loup-Escande, E.; Marchal, M.; Lecuyer, A. The Mind-Mirror: See your brain in action in your head using EEG and augmented reality. In Proceedings of the 2014 IEEE Virtual Reality (VR), Minneapolis, MN, USA, 29 March–2 April 2014; IEEE: Minneapolis, MN, USA, 2014; pp. 33–38.
  24. Huang, X.; Mak, J.; Wears, A.; Price, R.B.; Akcakaya, M.; Ostadabbas, S.; Woody, M.L. Using Neurofeedback from Steady-State Visual Evoked Potentials to Target Affect-Biased Attention in Augmented Reality. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. 2022, 2022, 2314–2318.
  25. Gorman, C.; Gustafsson, L. The use of augmented reality for rehabilitation after stroke: A narrative review. Disabil. Rehabil. Assist. Technol. 2022, 17, 409–417.
  26. Hughes, C.L.; Fidopiastis, C.; Stanney, K.M.; Bailey, P.S.; Ruiz, E. The Psychometrics of Cybersickness in Augmented Reality. Front. Virtual Real. 2020, 1, 602954.
  27. Berger, L.M.; Wood, G.; Neuper, C.; Kober, S.E. Sex Differences in User Experience in a VR EEG Neurofeedback Paradigm. In Games and Learning Alliance; Rosa, F.d., Marfisi Schottman, I., Baalsrud Hauge, J., Bellotti, F., Dondio, P., Romero, M., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 111–120.
  28. Tatum, W.O.; Dworetzky, B.A.; Schomer, D.L. Artifact and recording concepts in EEG. J. Clin. Neurophysiol. 2011, 28, 252–263.
  29. Breyer, B.; Bluemke, M. Deutsche Version der Positive and Negative Affect Schedule (PANAS, GESIS Panel); ZIS-GESIS Leibniz Institute for the Social Sciences: Mannheim, Germany, 2016.
  30. Tarrant, J.; Cope, H. Combining frontal gamma asymmetry neurofeedback with virtual reality: A proof of concept case study. NeuroRegulation 2018, 5, 57–66.
  31. Libriandy, E.; Arlini Puspasari, M. Immersive Virtual Reality and Gamification Evaluation on Treadmill Exercise by Using Electrophysiological Monitoring Device. In Proceedings of the ICIBE 2020: 2020 The 6th International Conference on Industrial and Business Engineering, Macau, China, 27–29 September 2020; ACM: New York, NY, USA, 2020; pp. 191–195.
  32. Kennedy, R.S.; Lane, N.E.; Berbaum, K.S.; Lilienthal, M.G. Simulator Sickness Questionnaire: An Enhanced Method for Quantifying Simulator Sickness. Int. J. Aviat. Psychol. 1993, 3, 203–220.
  33. Rebenitsch, L.; Owen, C. Review on cybersickness in applications and visual displays. Virtual Real. 2016, 20, 101–125.
  34. Nakamura, J.; Csikszentmihalyi, M. The concept of flow. In Handbook of Positive Psychology; Snyder, C.R., Lopez, S.J., Eds.; Oxford University Press: Oxford, UK, 2002; pp. 89–105.
  35. Rheinberg, F.; Vollmeyer, R.; Engeser, S. FKS-Flow-Kurzskala: ZPID (Leibniz Institute for Psychology)–Open Test Archive; ZPID: Trier, Germany, 2019.
  36. Kothgassner, O.D.; Felnhofer, A.; Hauk, N.; Kastenhofer, E.; Gomm, J.; Kryspin-Exner, I. Technology Usage Inventory (TUI): Manual; FFG: Wien, Austria, 2012.
  37. Magosso, E.; De Crescenzio, F.; Piastra, S.; Ursino, M. EEG Alpha Power Is Modulated by Attentional Changes during Cognitive Tasks and Virtual Reality Immersion. Comput. Intell. Neurosci. 2019, 2019, 7051079.
  38. Lee, J.; Kim, M.; Kim, J. A Study on Immersion and VR Sickness in Walking Interaction for Immersive Virtual Reality Applications. Symmetry 2017, 9, 78.
  39. Sharples, S.; Cobb, S.; Moody, A.; Wilson, J.R. Virtual reality induced symptoms and effects (VRISE): Comparison of head mounted display (HMD), desktop and projection display systems. Displays 2008, 29, 58–69.
  40. Fang, W.; Zheng, L.; Deng, H.; Zhang, H. Real-Time Motion Tracking for Mobile Augmented/Virtual Reality Using Adaptive Visual-Inertial Fusion. Sensors 2017, 17, 1037.
  41. McCaul, K.D.; Malott, J.M. Distraction and coping with pain. Psychol. Bull. 1984, 95, 516–533.
  42. Verhoeven, K.; van Damme, S.; Eccleston, C.; van Ryckeghem, D.M.L.; Legrain, V.; Crombez, G. Distraction from pain and executive functioning: An experimental investigation of the role of inhibition, task switching and working memory. Eur. J. Pain 2011, 15, 866–873.
  43. Witte, M.; Kober, S.E.; Ninaus, M.; Neuper, C.; Wood, G. Control beliefs can predict the ability to up-regulate sensorimotor rhythm during neurofeedback training. Front. Hum. Neurosci. 2013, 7, 478.
  44. Weech, S.; Kenny, S.; Lenizky, M.; Barnett-Cowan, M. Narrative and Gaming Experience Interact to Affect Presence and Cybersickness in Virtual Reality. Int. J. Hum.-Comput. Stud. 2019, 138, 102398.
Figure 1. The two different paradigms participants were pseudo-randomly assigned to. (A) 2D paradigm that was presented on a computer screen. (B) AR paradigm with virtual plants growing out of real plant pots.
Figure 2. Participant wearing the VR-AR set-up used in this study.
Table 1. Description of the sample per group.
|               | 2D Sham        | 2D Real       | AR Sham        | AR Real        |
|---------------|----------------|---------------|----------------|----------------|
| N             | 24 (17 female) | 20 (9 female) | 20 (13 female) | 25 (14 female) |
| Mean age (SD) | 21.79 (1.74)   | 24.65 (4.06)  | 25.20 (3.39)   | 23.68 (3.42)   |
Table 2. User experience data (means and SD) and results of statistical comparison (2D vs. AR).
| Scale                  | 2D Sham M (SD) | 2D Real M (SD) | AR Sham M (SD) | AR Real M (SD) |
|------------------------|----------------|----------------|----------------|----------------|
| Positive affect        | −0.03 (0.65)   | 0.07 (0.59)    | −0.02 (0.57)   | −0.10 (0.60)   |
| Negative affect        | −0.10 (0.28)   | 0.02 (0.20)    | −0.17 (0.16)   | −0.07 (0.13)   |
| Nausea                 | −0.25 (1.41)   | −0.30 (1.47)   | −0.99 (1.85)   | 0.16 (1.03)    |
| Oculomotor disturbance | 1.31 (2.26)    | 1.20 (2.65)    | 0.80 (2.93)    | 1.58 (2.63)    |
| Disorientation         | 1.21 (1.77)    | 1.57 (3.15)    | 0.77 (1.82)    | 1.36 (1.88)    |
| Total Score            | 0.33 (0.50)    | 0.30 (0.68)    | 0.38 (0.35)    | 0.28 (0.45)    |
| Fluency                | 3.65 (1.18)    | 3.93 (1.06)    | 4.66 (1.21)    | 3.92 (1.35)    |
| Absorption             | 4.25 (0.95)    | 4.13 (1.37)    | 4.61 (1.06)    | 4.48 (0.98)    |
| Concern                | 2.82 (1.41)    | 2.35 (1.35)    | 2.22 (1.20)    | 1.94 (0.87)    |
| General factor         | 3.89 (0.92)    | 4.01 (1.02)    | 4.64 (1.01)    | 4.15 (1.12)    |
| Curiosity              | 4.48 (1.38)    | 4.24 (1.20)    | 4.78 (1.22)    | 4.72 (1.37)    |
| Technology fear        | 1.80 (0.63)    | 2.03 (0.74)    | 1.73 (0.80)    | 1.53 (0.59)    |
| Interest               | 3.32 (1.42)    | 3.49 (1.20)    | 3.88 (1.68)    | 3.69 (1.49)    |
| Usability              | 4.10 (0.54)    | 4.38 (0.64)    | 4.55 (0.49)    | 4.62 (0.49)    |
| Immersion              | 3.80 (1.50)    | 4.06 (1.56)    | 4.19 (1.38)    | 3.91 (1.43)    |
| Usefulness             | 2.57 (1.54)    | 3.08 (1.25)    | 3.04 (1.06)    | 2.93 (1.33)    |
| Skepticism             | 2.81 (1.13)    | 3.05 (1.10)    | 2.90 (1.34)    | 2.51 (1.13)    |
| Accessibility          | 2.57 (0.96)    | 3.18 (1.47)    | 2.37 (1.22)    | 2.18 (1.15)    |
| Control                | 3.38 (1.79)    | 3.85 (2.28)    | 5.05 (2.52)    | 5.28 (2.70)    |
| Concentration          | 5.96 (2.27)    | 5.50 (2.65)    | 6.85 (1.66)    | 6.28 (2.75)    |
| Perceived success      | 4.42 (2.41)    | 4.75 (2.75)    | 6.55 (2.01)    | 5.88 (3.15)    |
| Fun                    | 7.30 (2.53)    | 7.61 (2.06)    | 7.85 (2.36)    | 7.48 (3.15)    |
| Pain                   | 3.14 (3.40)    | 2.22 (3.00)    | 1.21 (1.96)    | 2.29 (2.94)    |
| Pressure on head       | 20.88 (13.19)  | 20.50 (17.54)  | 14.61 (15.30)  | 16.48 (15.40)  |
| Headache               | 3.45 (4.65)    | 3.79 (3.72)    | 2.37 (4.59)    | 0.89 (1.97)    |
| Eye burning            | 8.61 (11.34)   | 5.88 (8.20)    | 3.69 (6.44)    | 4.61 (7.37)    |

Berger, L.M.; Wood, G.; Kober, S.E. User Experience in Neurofeedback Applications Using AR as Feedback Modality. Computers 2024, 13, 110. https://doi.org/10.3390/computers13050110
