Review

Cognitive Assessment and Training in Extended Reality: Multimodal Systems, Clinical Utility, and Current Challenges

by Palmira Victoria González-Erena 1,2, Sara Fernández-Guinea 2 and Panagiotis Kourtesis 1,2,3,4,5,*

1 Department of Psychology, The American College of Greece, 15342 Athens, Greece
2 Department of Experimental Psychology, Cognitive Processes and Speech Therapy, Complutense University of Madrid, 28223 Madrid, Spain
3 Department of Informatics & Telecommunications, National and Kapodistrian University of Athens, 16122 Athens, Greece
4 Department of Psychology, National and Kapodistrian University of Athens, 15784 Athens, Greece
5 Department of Psychology, The University of Edinburgh, Edinburgh EH8 9Y, UK
* Author to whom correspondence should be addressed.
Encyclopedia 2025, 5(1), 8; https://doi.org/10.3390/encyclopedia5010008
Submission received: 10 November 2024 / Revised: 3 January 2025 / Accepted: 10 January 2025 / Published: 13 January 2025
(This article belongs to the Section Behavioral Sciences)

Abstract: Extended reality (XR) technologies—encompassing virtual reality (VR), augmented reality (AR), and mixed reality (MR)—are transforming cognitive assessment and training by offering immersive, interactive environments that simulate real-world tasks. XR enhances ecological validity while enabling real-time, multimodal data collection through tools such as galvanic skin response (GSR), electroencephalography (EEG), eye tracking (ET), hand tracking, and body tracking. This allows for a more comprehensive understanding of cognitive and emotional processes, as well as adaptive, personalized interventions for users. Despite these advancements, current XR applications often underutilize the full potential of multimodal integration, relying primarily on visual and auditory inputs. Challenges such as cybersickness, usability concerns, and accessibility barriers further limit the widespread adoption of XR tools in cognitive science and clinical practice. This review examines XR-based cognitive assessment and training, focusing on its advantages over traditional methods, including ecological validity, engagement, and adaptability. It also explores unresolved challenges such as system usability, cost, and the need for multimodal feedback integration. The review concludes by identifying opportunities for optimizing XR tools to improve cognitive evaluation and rehabilitation outcomes, particularly for diverse populations, including older adults and individuals with cognitive impairments.

1. Introduction

Extended reality (XR), encompassing virtual reality (VR), augmented reality (AR), and mixed reality (MR), has transformed cognitive assessment and training by offering immersive, dynamic environments that simulate real-world tasks [1]. Traditional neuropsychological tests—such as paper-and-pencil tasks or static computerized exercises—often isolate cognitive functions under artificial conditions [2]. XR, in contrast, integrates real-world complexity into cognitive assessments and training [3]. For example, while a traditional memory test might involve recalling a list of words, XR can simulate a virtual shopping mall where participants must locate items on a list, recall their positions, and manage realistic distractions [4]. This not only evaluates memory but also incorporates attention, spatial navigation, and decision-making, offering a more ecologically valid reflection of real-world cognitive performance [4,5].
A key innovation of XR is its ability to combine immersive, interactive experiences with multimodal feedback systems such as eye tracking (ET), galvanic skin response (GSR), electroencephalography (EEG), and body tracking [6,7]. These technologies enable the real-time collection of behavioral, physiological, and neural data, providing deeper insights into cognitive and emotional states during task performance [7,8]. For instance, in an XR-based attention task, eye-tracking data can reveal visual attention patterns, while EEG signals can indicate changes in cognitive load or mental fatigue [6,9]. This continuous, multimodal data collection represents a significant advancement over traditional methods, which often capture only static performance snapshots [1].
XR also addresses challenges such as disengagement and the learning effect observed in repetitive cognitive tasks [3]. Immersive XR environments enhance user engagement and motivation, particularly for older adults and individuals with cognitive impairments, where adherence to training programs is often a concern [10,11]. Additionally, XR systems can adapt task difficulty dynamically in real time based on user performance and cognitive load, ensuring personalized assessments and training programs aligned with individual abilities [9,12].
In clinical contexts, XR demonstrates utility in assessing and rehabilitating cognitive impairments associated with neurodegenerative diseases, brain injuries, and neurodevelopmental disorders [13,14]. For example, XR-based neuropsychological batteries can simulate daily activities, such as navigating virtual cities or managing household responsibilities, providing clinicians with ecologically valid insights into cognitive performance [15,16]. XR’s ability to collect longitudinal, personalized data further enhances its role in monitoring progress and tailoring interventions for improved cognitive outcomes [17].
Despite its potential, XR technologies face several challenges, including the underutilization of sensory modalities beyond visual and auditory feedback, issues such as cybersickness, and barriers to accessibility and usability [1,18]. Addressing these limitations is essential for XR to fulfill its promise as a transformative tool in cognitive science.

Aim and Approach of This Narrative Review

This narrative review aims to explore the applications, benefits, and challenges of XR technologies in cognitive assessment and training. Specifically, it examines XR’s contributions to enhancing ecological validity, integrating ET, EEG, GSR, and body tracking, and supporting clinical interventions. The review also identifies unresolved challenges, including usability concerns, cybersickness, and regulatory barriers, while proposing future directions for research and implementation.
To ensure a comprehensive and focused analysis, the literature was identified and selected based on the following approach:
  • Databases searched: PubMed, IEEE Xplore, and Scopus.
  • Keywords: “Extended Reality”, “Cognitive Assessment”, “Adaptive Systems”, “Neuropsychological Testing”, “XR-based Cognitive Training”, “Eye Tracking”, “EEG”, “GSR”, and “Body Tracking”.
  • Inclusion criteria: Experimental studies, theoretical frameworks, and applications of XR technologies in cognitive science published within the last 10 years.
  • Focus: The review prioritizes studies on XR-based tools for cognitive assessment and rehabilitation, emphasizing the integration of ET, EEG, GSR, and body tracking data streams across diverse populations, including healthy individuals, older adults, and clinical groups with neurological or neurodevelopmental conditions.
By synthesizing findings from these sources, this review highlights XR’s comparative advantages over traditional cognitive tools, its role in creating ecologically valid and adaptive assessments, and its potential for advancing cognitive training and rehabilitation.

2. Ecological Validity in XR-Based Cognitive Assessment and Training

Ecological validity refers to the degree to which cognitive assessments reflect real-world scenarios, ensuring that findings gathered in controlled environments generalize to everyday functioning [19]. Traditional assessments, such as paper-and-pencil or static computerized tasks, often isolate cognitive functions like memory, attention, and problem-solving but fail to replicate the complexity and unpredictability of real-world tasks [2]. Cognitive performance is context-dependent and influenced by environmental, social, and situational factors [5]. This is particularly important in neuropsychology and cognitive rehabilitation, where assessments guide interventions aimed at improving real-world outcomes [20].
XR, including VR and MR, enhances ecological validity by simulating dynamic, real-world-like conditions that mirror daily challenges [3]. Unlike traditional assessments, XR allows participants to engage in tasks that require the simultaneous use of multiple cognitive functions—such as attention, memory, executive function, and visuospatial reasoning—within realistic scenarios [4,21]. For example, a participant navigating a virtual city must make decisions, solve problems, and interact with the environment [4,22]. This holistic approach replicates cognitive demands encountered in everyday life, making XR a more relevant and functional tool for assessment [3].

Examples of XR-Based Cognitive Tasks with Real-World Relevance

Several XR-based tasks have been developed to assess cognitive abilities in immersive, contextually relevant settings [23,24]. These tasks include:
  • Memory: XR environments evoke autobiographical memories and enhance episodic memory by embedding tasks within realistic contexts, such as navigating a virtual home or completing a shopping list [25,26].
  • Prospective Memory: XR tasks replicate real-world challenges, such as remembering to perform time- or event-based actions (e.g., managing a virtual household or navigating a virtual shopping trip), offering a more naturalistic approach than lab-based button-press tasks [27,28,29,30].
  • Executive Function: Tasks like planning routes through virtual environments or multitasking in a simulated workspace mirror real-world problem-solving and adaptability demands [4,22].
  • Language: XR assessments integrate virtual avatars and multimodal feedback (e.g., eye gaze, speech, and facial expressions) within interactive social scenarios, providing insights into natural language comprehension and production [31,32].
  • Attention and Spatial Cognition: XR simulates complex tasks such as driving, cooking, or office work, requiring users to sustain attention, shift focus, and navigate spatially challenging environments [33,34].
By replicating real-world cognitive demands in a controlled environment, XR enhances both the ecological validity and accuracy of cognitive assessments, making them more reflective of daily life challenges [4].

3. Usability, Acceptability, and User Experience in XR

3.1. Usability of XR Devices and Systems for Cognitive Assessment

Usability is a critical factor for the successful implementation of XR systems in cognitive assessment, as tasks require sustained engagement and active participation [3]. Usability encompasses the ease with which individuals interact with XR hardware, such as headsets and sensors, as well as software interfaces that enable immersive experiences [35]. Poor usability—manifesting as unintuitive interfaces or uncomfortable hardware—can increase cognitive load and negatively impact task performance, compromising the accuracy and validity of assessment [36,37,38].
Significant advancements in XR usability have been achieved in recent years. Modern headsets are lighter, more ergonomic, and feature improved interactions through hand tracking, voice commands, and eye-tracking technologies [1,39]. These advancements reduce the need for users to learn complex controls, allowing them to focus on tasks rather than managing system intricacies [40,41]. Nevertheless, challenges remain, particularly for older adults and inexperienced users, who may find XR systems intimidating or difficult to navigate [42]. This intimidation can reduce their willingness to engage with XR-based assessments or training [35]. Additionally, individuals with cognitive impairments face unique barriers, as complex interfaces can pose significant challenges to their effective participation [43]. Addressing these usability concerns requires designing XR systems that prioritize simplicity, inclusivity, and accessibility, ensuring broader acceptance across diverse populations [1].

3.2. User Acceptability and Experience Across Various Populations

User acceptability refers to the extent to which XR systems are adopted and embraced across different populations, which is strongly influenced by user experience (UX)—the overall comfort, satisfaction, and engagement individuals derive from interacting with XR devices and applications [38]. A positive UX is crucial for the success of XR-based cognitive assessments and training [3,44,45]. If users find the system disorienting, uncomfortable, or overly complex, it can diminish their willingness to participate and decrease the technology’s overall acceptability [42,43].
Different user groups have unique needs and challenges when engaging with XR technologies. For children, XR provides an engaging and interactive learning environment but raises concerns regarding safety, content appropriateness, and the effects of prolonged exposure to immersive experiences [46]. Middle-aged adults often benefit from XR-based cognitive training and rehabilitation but may face challenges integrating time-intensive XR programs into their daily responsibilities, such as work and family obligations [47]. Flexible and time-efficient XR applications can help address these barriers, improving adherence and overall acceptability [47].
Older adults present another distinct challenge [48]. While XR has shown significant promise for cognitive enhancement in this population, factors such as limited technological familiarity, physical discomfort caused by headsets, and cognitive overload may reduce usability and engagement [35,42]. Additional support, such as training sessions and user-friendly designs, is often necessary to improve acceptance and ensure meaningful participation [42]. Similarly, individuals with cognitive impairments, including those with dementia or acquired brain injuries, stand to benefit from XR-based cognitive interventions [20,49]. However, these individuals are often more susceptible to disorientation, cognitive strain, and accessibility challenges, underscoring the importance of carefully designed, adaptive XR systems that cater to their specific needs [48,50]. Ensuring that XR systems are designed with the specific needs of these populations in mind is critical for promoting user acceptability [38,45].

3.3. Case Studies on UX in Cognitive Training

Case studies have demonstrated the potential of XR technologies to enhance cognitive training outcomes across various populations [46,51]. In studies involving older adults, immersive VR-based cognitive training programs have shown improvements in memory, attention, and executive function [10,49]. Participants frequently report higher levels of engagement and enjoyment compared to traditional methods, although challenges related to physical discomfort and usability persist [52,53].
In the case of neurodevelopmental disorders, transdiagnostic approaches based on functional outcomes are crucial for identifying the specific needs of each individual, regardless of diagnostic labels and discrete categories [54]. Case studies implementing this approach focus on personalized, adaptive treatment adjusted to each individual’s needs [55,56]. For example, in studies involving children with Autism Spectrum Disorder (ASD), XR environments have been used to simulate real-world scenarios, such as visiting a store or attending a movie, in which participants can practice social and cognitive skills in a safe, controlled setting [56,57]. Similarly, XR-based cognitive training has demonstrated promising results for children with Attention Deficit Hyperactivity Disorder (ADHD), improving both cognitive and social functioning through repeated immersive practice [46,58].
In clinical settings, XR has been employed to develop personalized cognitive training programs for individuals with neurodegenerative conditions or traumatic brain injuries [20,53]. Participants often report that the immersive nature of XR helps them remain focused and motivated, leading to notable cognitive improvements [59]. However, these studies also highlight the need for simplified systems and additional support to address accessibility and usability concerns [51,60].

4. Multimodal Systems in XR Cognitive Applications

4.1. Overview of Multimodalities: GSR, EEG, ET, Hand Tracking, Body Tracking

Multimodal systems in XR environments integrate various sensory and physiological inputs—such as GSR, EEG, ET, hand tracking, and body tracking—to create a comprehensive understanding of users’ cognitive, emotional, and motor states [61]. These systems enable XR applications to capture real-time data across multiple dimensions of human behavior, enhancing the precision, adaptability, and effectiveness of cognitive assessment and training [62].
By leveraging multimodal inputs, XR systems provide deeper insights into user engagement, stress, and cognitive load, facilitating personalized and dynamic experiences [9,63]. For instance, EEG-based systems can detect mental fatigue and adjust task difficulty accordingly, while GSR data reveal emotional responses that influence performance [41,64]. Similarly, ET and hand tracking allow for intuitive interactions with virtual environments, improving immersion and usability [65,66]. A detailed breakdown of key modalities, their descriptions, and applications in XR is presented in Table 1 below.

4.2. Integration of These Modalities in XR Environments

The integration of multimodal systems into XR environments revolutionizes cognitive assessments and training by enabling continuous, real-time data collection across multiple dimensions of behavior [61]. Unlike traditional methods, which gather isolated data points, XR systems equipped with EEG, GSR, ET, and motion tracking provide a synchronized, dynamic understanding of cognitive and emotional processes as they unfold during task performance [67,68].
For instance, during a cognitive task, EEG monitors cognitive load, GSR tracks emotional arousal, and ET analyzes attentional focus, while hand and body tracking capture motor performance [66,69]. This integrated approach offers a holistic view of participants’ states, particularly when multiple cognitive functions, such as decision-making and problem-solving, must operate simultaneously [70].
A key innovation enabled by multimodal integration is adaptive feedback. XR environments can dynamically respond to user states by adjusting task parameters in real time [71]. For example, if EEG data indicate cognitive overload or GSR detects elevated stress levels, the system can reduce task complexity to optimize performance and maintain engagement [9,72]. These adaptive capabilities make XR environments highly responsive and personalized, which enhances their utility in both research and clinical settings [71].
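As a concrete illustration, the adaptive logic described above can be sketched as a simple rule-based controller. The following Python sketch is hypothetical: it assumes that normalized workload and arousal indices have already been derived from calibrated EEG and GSR pipelines, and the thresholds and level ranges are placeholders rather than values drawn from any cited study.

```python
# Minimal sketch of a rule-based adaptive feedback loop (hypothetical names and
# thresholds; a real system would derive these indices from calibrated EEG/GSR pipelines).

from dataclasses import dataclass

@dataclass
class UserState:
    eeg_workload: float   # normalized 0-1 index of cognitive load from EEG
    gsr_arousal: float    # normalized 0-1 index of arousal/stress from GSR

def adjust_difficulty(current_level: int, state: UserState,
                      min_level: int = 1, max_level: int = 10) -> int:
    """Lower task difficulty under overload or stress; raise it when the user is under-challenged."""
    if state.eeg_workload > 0.8 or state.gsr_arousal > 0.8:
        return max(min_level, current_level - 1)   # back off to prevent overload
    if state.eeg_workload < 0.3 and state.gsr_arousal < 0.4:
        return min(max_level, current_level + 1)   # increase challenge to maintain engagement
    return current_level                           # otherwise keep the task as is

# Example: one update step during a task
level = adjust_difficulty(current_level=5, state=UserState(eeg_workload=0.85, gsr_arousal=0.6))
print(level)  # -> 4
```

In practice, such a controller would run periodically within the XR task loop, but the core idea of mapping physiological indices to difficulty changes is the same.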

4.3. Applications of Multimodal Systems in Cognitive Assessment

Multimodal systems in XR environments enable novel applications for cognitive assessment by combining data from various physiological and behavioral modalities [61]. For instance, emotion recognition benefits from the integration of GSR, EEG, and facial expression analysis, allowing researchers to identify emotional states like stress, frustration, or excitement [63]. Such insights are particularly valuable for mental health assessments, where understanding emotional responses informs personalized therapeutic interventions [1].
Attention monitoring represents another key application. By integrating EEG and ET, XR systems can evaluate attentional control and detect lapses in focus more accurately than traditional assessments [71]. For example, in virtual driving simulations, ET monitors visual focus on key environmental cues, while EEG measures the cognitive effort required to sustain attention [64,73].
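One simple way gaze data of this kind can be summarized is as the proportion of gaze samples that fall within task-relevant areas of interest (AOIs). The short Python sketch below is purely illustrative; the AOI names, coordinates, and gaze points are invented for the example and do not come from the cited driving studies.

```python
# Toy computation of an attention metric from eye-tracking samples: the fraction of
# gaze points landing inside predefined areas of interest (AOIs). AOI names and
# coordinates are illustrative placeholders.

aois = {
    "rear_mirror": (0.40, 0.85, 0.60, 0.95),   # (x_min, y_min, x_max, y_max), normalized screen coords
    "speedometer": (0.10, 0.05, 0.25, 0.20),
}

gaze_points = [(0.45, 0.90), (0.50, 0.88), (0.70, 0.50), (0.15, 0.10)]

def in_aoi(point, box):
    x, y = point
    x_min, y_min, x_max, y_max = box
    return x_min <= x <= x_max and y_min <= y <= y_max

on_target = sum(any(in_aoi(p, box) for box in aois.values()) for p in gaze_points)
print(f"Proportion of gaze on task-relevant cues: {on_target / len(gaze_points):.2f}")  # 0.75
```

A metric like this could then be interpreted alongside EEG-derived effort indices to distinguish, for example, sustained attention from effortful but poorly directed focus.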
Cognitive load assessment is significantly enhanced through multimodal integration. EEG data, combined with behavioral metrics from hand and body tracking, reveal how individuals manage cognitive demands during complex tasks [74]. This is particularly beneficial in educational and training contexts, where real-time adjustments to task difficulty help optimize learning outcomes and prevent cognitive overload [75].
Finally, multimodal systems are valuable for assessing visuomotor coordination and spatial cognition [76]. By tracking hand and body movements during virtual tasks, XR can evaluate motor–cognitive integration in realistic settings, such as virtual rehabilitation exercises or training simulations [41]. These applications demonstrate how multimodal systems expand the capabilities of XR technologies, enabling assessments that are both more immersive and ecologically valid [77].

4.4. Advantages and Limitations of Multimodal Systems

The primary advantage of multimodal systems in XR environments lies in their ability to provide a rich, multidimensional view of cognitive, emotional, and motor processes [61]. By integrating data from EEG, GSR, ET, and motion sensors, these systems offer more accurate assessments and enable personalized interventions that adapt dynamically to users’ needs [78]. This holistic approach is particularly beneficial in cognitive training and rehabilitation, where understanding the interplay between cognitive load, emotional states, and motor responses is critical for achieving meaningful outcomes [79].
However, the adoption of multimodal systems is limited by technical complexity and resource demands [3,80]. Integrating and synchronizing data streams requires sophisticated hardware, software, and computational techniques, which can be costly and inaccessible for resource-limited institutions [81]. Moreover, the sheer volume of data generated can lead to data overload, necessitating advanced analytical methods and expertise for effective interpretation [80].
Physical discomfort caused by wearable sensors and prolonged headset use remains another challenge, particularly for populations with physical or cognitive impairments [82]. Ensuring that systems are lightweight, ergonomic, and inclusive is essential for enhancing UX and expanding the applicability of XR-based multimodal systems [83]. Despite these limitations, the integration of multimodal systems represents a major step forward in cognitive science, offering unparalleled opportunities for real-time, ecologically valid assessments and interventions [69].

5. XR Applications in Cognitive Assessment

5.1. Review of Current XR-Based Cognitive Assessment Tools

XR technologies have introduced a new dimension to cognitive assessment by combining immersive realism and dynamic interactivity, allowing researchers to evaluate multiple cognitive domains within real-world-like environments [3]. These tools assess critical cognitive functions, including memory, executive functions, attention, and spatial cognition, through tailored tasks that replicate the complexity of everyday scenarios [23,24] (see Table 2).
For memory assessment, XR-based tools immerse participants in tasks that require them to remember objects, locations, or sequences within virtual spaces [24]. For instance, the VR Everyday Assessment Lab (VR-EAL) evaluates episodic and prospective memory through tasks like recalling items from a shopping list or remembering to complete actions triggered by time cues [4]. By embedding these tasks in realistic contexts, VR-EAL captures naturalistic memory use, far exceeding the ecological validity of traditional list-based recall tests [4].
In assessing executive functions, XR tools simulate complex multi-step activities that mirror real-world demands [23]. In one example, participants manage tasks within a Virtual Office Simulation by prioritizing assignments, responding to interruptions, and making strategic decisions [22]. Another study used a virtual classroom to evaluate inhibitory control in adults with ASD, demonstrating sensitivity to everyday challenges not captured by conventional tasks [84]. These XR tools provide insights into cognitive flexibility, planning, and adaptability under realistic, high-pressure conditions [23].
Attention assessment in XR environments introduces a higher level of realism by placing participants in immersive, distracting settings [4]. For instance, participants might focus on instructions in a virtual classroom while filtering out noise from peers or other environmental distractions [85,86]. Such tasks capture attentional control in ways that static, low-stimulation environments cannot [3].
For spatial cognition, XR tasks require participants to navigate virtual cities, plan routes, and interact with 3D objects to evaluate spatial memory and visuospatial reasoning [34,87]. These tools are particularly useful for assessing populations with spatial deficits, such as individuals with neurodegenerative conditions or brain injuries, offering dynamic, real-world-like scenarios that traditional paper-and-pencil tests fail to replicate [16].
By offering highly realistic, adaptive, and interactive tasks, XR tools provide a deeper, ecologically valid understanding of cognitive abilities across multiple domains, bridging the gap between laboratory settings and real-world functioning [3] (see Table 2).
Table 2. XR cognitive assessment tools and studies.

Cognitive Domain | XR-Based Assessment Tool and Study | Description of Method | Key Findings and Implications
Memory | VR-EAL (Kourtesis et al., 2021) [4] | Participants engage in tasks like remembering a shopping list or recalling sequences in a realistic virtual environment. | Enhanced ecological validity compared to traditional tests, accurately reflecting real-world memory usage.
Memory | Spatial Recall Task (Sauzéon et al., 2016) [26] | Participants memorize and recall spatial information in a virtual environment. | Increased realism leads to better memory performance measurements, compared to static tests.
Memory | Context-rich Memory Tasks (Pflueger et al., 2023) [88] | Memory tasks incorporate environmental and situational cues in VR settings. | Contextual elements enhance memory assessment and provide a more realistic understanding of memory function.
Memory | VR-EAL (Kourtesis and MacPherson, 2023) [28] | VR-EAL simulates everyday situations that demand prospective memory, where users must remember tasks triggered by specific times or events, like taking virtual medication after breakfast or at scheduled intervals. | XR methods outperform traditional approaches in capturing prospective memory in real-life situations.
Executive Functions | Virtual Office Simulation (Jansari et al., 2014) [22] | Participants manage tasks, handle unexpected events, and make strategic decisions in a virtual office setting. | Effectively assesses planning, adaptability, and decision-making, mirroring real-world complexities.
Executive Functions | Inhibitory Control in ASD (Parsons and Carlew, 2016) [84] | VR classroom simulation to measure inhibitory control in adults with ASD. | Captures real-world executive dysfunction in a way that traditional tests cannot.
Executive Functions | VR-EAL (Kourtesis and MacPherson, 2021) [4] | Tasks simulate planning and adaptability challenges, like running errands in a virtual city; a cooking task additionally requires multitasking skills. | Provides insights into strategic planning and adaptability under realistic conditions.
Attention | High-Stimulation Attention Task (Coleman et al., 2019) [85] | Participants focus on instructions amid distractions in a virtual classroom. | More accurate assessment of attention control compared to lab-based tests.
Attention | Naturalistic Attention (Iriarte et al., 2016) [86] | Participants filter out distractions in an immersive VR class. | XR tasks simulate real-world attentional demands, offering more applicable results.
Attention | VR-EAL (Kourtesis et al., 2021) [4] | Detecting visual/auditory cues amid distractions while on the road. | Comprehensive assessment of attentional processes, enhancing real-world applicability.
Visuospatial Skills | Virtual City Navigation (Grübel et al., 2017) [87] | Participants plan routes and remember landmarks in a virtual city. | Detailed data on spatial memory and reasoning that traditional 2D tests cannot offer.
Visuospatial Skills | Spatial Deficit Assessment (Howett et al., 2019) [16] | XR tasks assess navigation skills in individuals with mild cognitive impairment or brain injuries. | Valuable for clinical applications, as XR provides a realistic measure of spatial impairments.
Visuospatial Skills | VR-EAL (Kourtesis and MacPherson, 2021) [4] | Route planning and landmark recall in immersive VR settings. | Offers an ecologically valid measure of spatial reasoning, closely reflecting real-world challenges.
Visuospatial Skills | 3D Interaction Tasks (Cogné et al., 2018) [89] | Participants interact with 3D objects in virtual environments to test coordination and movement patterns. | XR captures coordination skills in a dynamic setting, revealing nuances not measurable by traditional tests.
Visuospatial Skills | Object Manipulation (Wen et al., 2023b) [79] | Tasks involving manipulation of objects and solving spatial puzzles. | Provides a comprehensive understanding of visuomotor skills in realistic, engaging scenarios.

5.2. XR Assessments Compared to Traditional Methods

The key distinction between XR-based cognitive assessments and traditional methods lies in XR’s ability to simulate real-world complexity and provide an engaging, immersive testing environment [3]. Traditional cognitive assessments, such as computerized tasks, structured interviews, or paper-and-pencil tests, typically isolate specific cognitive functions—such as memory, attention, or problem-solving—in controlled, low-stimulation settings [2]. While these standardized methods provide valuable, quantifiable data, they fail to replicate the dynamic and contextual demands of real-world cognition, where tasks are rarely performed in isolation or under ideal conditions [3,4].

5.2.1. Ecological Validity and Realism

XR-based assessments excel in ecological validity by immersing participants in realistic, interactive scenarios that simulate everyday tasks [5]. For example, while traditional memory tasks might involve recalling a list of words or numbers, XR environments require participants to remember object locations or execute sequential instructions within immersive, dynamic spaces [26,88]. Tasks such as navigating through a virtual house, remembering the placement of items, or recalling event-based triggers provide a naturalistic evaluation of memory processes, making the findings more transferable to real-world settings [24,79].

5.2.2. Visuomotor Coordination and Spatial Cognition

Another significant advantage of XR assessments is their ability to measure visuomotor coordination and spatial cognition more effectively than traditional methods [76]. Standard tests might involve drawing figures or solving two-dimensional mazes [21]. In contrast, XR tasks allow participants to interact with 3D objects, solve spatial puzzles, and navigate immersive environments that replicate real-world spatial challenges [89]. This dynamic approach captures nuances of visuospatial reasoning and coordination that are difficult to assess using static, two-dimensional tools [79]. Additionally, XR tasks often provide real-time feedback, enabling researchers to observe how participants adjust their strategies in response to task demands [1].

5.2.3. Continuous and Dynamic Data Collection

A critical advantage of XR-based assessments is their capacity for continuous data collection, which offers a more detailed and dynamic understanding of cognitive performance [71,79]. Traditional assessments typically measure outcomes at discrete time points—such as reaction times, accuracy, or task completion scores—providing limited insight into the processes underlying performance [2]. In contrast, XR systems collect continuous streams of behavioral, physiological, and neural data throughout the task [41].
For instance, an XR-based attention task can simultaneously track participants’ gaze (using ET), their emotional responses (via GSR), and their cognitive load (via EEG) [9]. This approach reveals not only whether participants complete the task successfully but also how their focus shifts, how cognitive demands fluctuate, and how emotional states impact performance [69]. Such continuous, multimodal data enable researchers to derive nuanced insights into the dynamic interplay of cognitive and emotional processes [71].

5.2.4. Complementarity with Traditional Methods

Despite their advantages, XR-based assessments do not aim to replace traditional cognitive testing methods but rather to complement them [3]. Traditional tools remain valuable for standardization and benchmarking, particularly in clinical and research settings where consistency and simplicity are critical [2]. The integration of XR with traditional assessments can provide a comprehensive evaluation of cognitive abilities, combining the strengths of standardized methods with the ecological validity and richness of XR technologies [90]. By offering ecologically valid, dynamic, and multimodal assessments, XR tools address key limitations of traditional methods, providing a deeper and more contextually relevant understanding of cognitive functioning in everyday life [4].

5.3. XR’s Potential for Real-Time Data Collection and Analysis

XR-based cognitive assessments offer a significant advantage through real-time, multimodal data collection [1]. Unlike traditional assessments, which rely on isolated performance metrics, XR systems continuously collect synchronized data streams, including:
  • Behavioral data: Eye movements, reaction times, and body posture [66,69].
  • Physiological data: Stress or arousal measured via GSR [72].
  • Neural data: Cognitive states assessed through EEG signals [17].
This multimodal approach enables a deeper understanding of cognitive and emotional processes as they unfold during task performance [63]. For instance, an XR-based decision-making task can simultaneously track attention (ET), emotional responses (GSR), and cognitive load (EEG), revealing how these processes interact under complex conditions [69,91].
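To make the idea of synchronized data streams concrete, the sketch below shows one possible in-memory representation of a time-aligned multimodal sample and a trivial query across modalities. The field names, units, and values are hypothetical and serve only to illustrate the kind of structure such a pipeline might use.

```python
# Illustrative data structure for time-aligned multimodal samples (field names and
# values are hypothetical; actual XR platforms expose their own stream formats).

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MultimodalSample:
    timestamp: float                             # seconds since task onset
    gaze_target: str                             # object currently fixated (from eye tracking)
    gsr_microsiemens: float                      # skin conductance level
    eeg_workload_index: float                    # derived cognitive-load index, 0-1
    head_position: Tuple[float, float, float]    # head/body tracking in metres

def high_load_fixations(samples: List[MultimodalSample], threshold: float = 0.7) -> List[str]:
    """Return the objects a user fixated while the EEG-derived load index exceeded a threshold."""
    return [s.gaze_target for s in samples if s.eeg_workload_index > threshold]

stream = [
    MultimodalSample(0.0, "instruction_panel", 2.1, 0.35, (0.0, 1.6, 0.0)),
    MultimodalSample(0.1, "shopping_list", 2.4, 0.75, (0.0, 1.6, 0.1)),
]
print(high_load_fixations(stream))  # -> ['shopping_list']
```

The value of such a representation is that behavioral, physiological, and neural measures can be queried jointly rather than analyzed as separate, unsynchronized files.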
Additionally, XR systems enable adaptive assessments by dynamically adjusting task difficulty based on real-time feedback [92]. For example, if EEG or GSR data detect cognitive overload, the system can lower task complexity to prevent frustration [72]. Conversely, if the task is too easy, complexity can increase to maintain engagement [92]. This responsive, personalized feedback ensures that XR tools remain both engaging and effective [80].

5.4. Challenges in Implementing XR in Large-Scale Cognitive Assessments

Despite their transformative potential, XR-based cognitive assessments face several challenges that hinder their widespread implementation, particularly at scale [3]. One of the most significant barriers is the cost and resource requirements associated with XR systems [3]. High-quality hardware, such as VR and AR headsets, sensors, and the powerful computers needed to run immersive environments, remains relatively expensive compared to traditional cognitive assessment tools [1]. Moreover, developing scientifically valid and engaging XR-based applications requires specialized expertise in cognitive science, software development, and XR design, which can be both costly and resource-intensive [81]. For institutions with limited financial and technical resources, these requirements can present substantial obstacles to adoption and maintenance [93].
Another major challenge is usability and accessibility, particularly for specific populations, including older adults, individuals with physical impairments, or those with cognitive disabilities [11,37]. While XR systems provide unparalleled immersion, prolonged use can lead to cybersickness, eye strain, and physical discomfort, which can negatively impact user engagement and data reliability [94,95,96]. These limitations are particularly concerning for long-duration tasks or assessments targeting vulnerable populations [94,97]. Designing XR tools that minimize these side effects and are accessible to users with diverse needs is essential for broader adoption [43,48,97].
Scalability also poses a significant hurdle for XR technologies, particularly when transitioning from controlled research settings to larger, real-world applications [3]. Administering XR-based cognitive assessments to large populations requires not only an adequate number of XR systems but also the technical infrastructure and support necessary to manage and troubleshoot these devices during testing sessions [81]. Furthermore, XR assessments generate large and complex data sets that require advanced analytical tools and expertise to process effectively [98]. Institutions with limited infrastructure may struggle to manage these demands, further complicating efforts to scale XR tools for widespread use [81,93].
Nevertheless, advancements in XR hardware and software are expected to alleviate some of these challenges [1]. As the cost of XR devices continues to decline and technologies become lighter, more ergonomic, and user-friendly, broader accessibility will likely follow [3]. Additionally, emerging cloud-based and AI solutions for XR applications can help reduce on-site computational requirements, making these systems more feasible for resource-limited settings [80]. Addressing these barriers—through affordable hardware, inclusive design, and scalable infrastructure—will be crucial for realizing XR’s full potential as an innovative tool for large-scale cognitive assessment [3,99].

6. XR Applications in Cognitive Training

6.1. Review of Current XR-Based Cognitive Training Interventions

XR technologies have emerged as powerful tools for cognitive training, offering immersive and interactive environments that enhance engagement, personalization, and ecological validity (see Table 3) [1]. Unlike traditional training methods, which often rely on repetitive and static tasks, XR enables the creation of dynamic, real-world scenarios tailored to train specific cognitive functions such as memory, attention, and executive function (see Table 3) [51].
Memory enhancement is a major focus in XR-based cognitive training [100]. For example, XR tasks immerse participants in context-rich scenarios that simulate real-world challenges, such as remembering object locations within a virtual home or recalling action sequences in a virtual kitchen [39,100]. These tasks engage memory processes in a way that reflects daily-life challenges, making them more applicable than traditional route learning or list-recall exercises [101]. Studies have shown that such immersive tasks lead to significant improvements in memory performance, particularly among older adults and individuals with mild cognitive impairments (MCI) [102].
Attention training has also benefitted from XR’s ability to simulate complex, high-stimulation environments [103]. XR interventions challenge users to sustain attention, filter distractions, and respond to multiple sources of stimuli—skills critical for real-world functioning [104]. For instance, tasks set in virtual marketplaces or classrooms require participants to focus on key details while ignoring irrelevant inputs, providing a more nuanced training experience than simple reaction-time exercises [60,103]. Additionally, XR systems can incorporate adaptive difficulty and real-time feedback, ensuring that users are continuously engaged at an optimal level [81].
XR has also proven effective in training executive functions, such as problem-solving, planning, and task-switching [49]. Participants may engage in tasks that require navigating a virtual city, managing a simulated workplace, or solving dynamic problems in changing scenarios [49,105]. These immersive tasks challenge participants to prioritize actions, adapt to unexpected events, and execute strategic decisions—skills that are often difficult to isolate and train in static environments [15]. By providing engaging, interactive, and ecologically valid scenarios, XR-based cognitive training offers a more holistic approach to improving cognitive functions across diverse populations [51].
Table 3. XR cognitive training tools and studies.

Training Focus | Population (Study) | Method Description | Key Findings and Implications
Memory Training | Older Adults (Varela-Aldás et al., 2022) [39] | Real-life simulated memory tasks, like recalling sequences of actions in a virtual kitchen. | Enhanced user engagement and better real-world applicability compared to static recall exercises.
Memory Training | Individuals with Cognitive Decline (Mondellini et al., 2018) [102] | Context-rich scenarios replicating everyday memory challenges. | Memory performance showed marked improvements, especially in older adults and those with MCI.
Attention Training | General Population and Stroke Patients (Huygelier et al., 2022) [104] | Dynamic tasks in XR requiring sustained attention in realistic, immersive settings. | Improved attentional control, better reflecting real-world demands compared to simple reaction-time exercises.
Attention Training | General Population, Children (Wang et al., 2020) [103] | Tasks designed with adaptive difficulty and real-time feedback to sustain attention. | Participants maintained engagement and showed greater attentional improvements that generalized to daily activities.
Attention Training | Older Adults (Lorentz et al., 2023) [60] | Immersive attention training tasks set in complex environments, like virtual markets. | Enhanced focus and attentional resource management in high-stimulation scenarios.
Attention Training | General Population and Adults with ADHD (Selaskowski et al., 2023) [81] | XR-based interventions with personalized difficulty adjustments. | Greater effectiveness in training attention skills compared to non-adaptive methods.
Executive Functions Training | General Population and MCI (Liao et al., 2019) [49] | Participants navigate a virtual city or manage tasks in a simulated workplace, engaging executive functions. | XR tasks provided a more realistic training experience, leading to better problem-solving and adaptability.
Social Cognition Training | Adults with ASD (Kourtesis et al., 2023) [57] | Simulations of daily life tasks, like job interviews and shopping, for real-world social skill practice. | XR provided a safe space to learn and adapt, enhancing social interactions and everyday functioning.
Social Cognition Training | Children with ASD (Bekele et al., 2016) [55] | XR scenarios focusing on social interactions, like making eye contact and understanding social cues. | Effective at reducing social anxiety and improving communication skills in a safe, controlled environment.
Social Cognition Training | Children with ASD (Ip et al., 2018) [56] | Virtual practice of social tasks, tailored to individual needs, with repeated exposure. | Personalized training showed significant improvements in social cognition and adaptive behavior.
Multiple Cognitive Domains Training | TBI Patients (Masoumzadeh and Moussavi, 2020) [33] | Gradually increasing task complexity in XR settings to support cognitive skill recovery. | Effective in enhancing spatial memory and task-switching, critical for neurological recovery.
Multiple Cognitive Domains Training | Children with Attention or Learning Challenges (Coleman et al., 2019) [85] | Game-like XR scenarios for working memory, problem-solving, and attention training. | High engagement and sustained interest, resulting in cognitive gains and improved academic skills.
Multiple Cognitive Domains Training | Children (Araiza-Alba et al., 2021) [106] | Interactive missions and virtual puzzles that require strategic thinking and memory use. | Enhanced cognitive skill development and positive behavioral outcomes in young learners.
Multiple Cognitive Domains Training | Children with ADHD (Ou et al., 2020) [107] | XR-based training that focuses on attention and strategic thinking through playful scenarios. | XR tasks promoted adaptability, patience, and academic success.
Multiple Cognitive Domains Training | Children with ADHD (Wong et al., 2023) [46] | Engaging XR tasks for attention, social cognition, and executive function, with adaptable challenges. | Increased focus, better task management, and improved social skills, proving XR to be a highly effective therapeutic tool.

6.2. Population-Specific XR Training Programs

One of XR’s greatest strengths lies in its ability to adapt to the unique needs of specific populations [77]. This flexibility enables the design of tailored interventions that address the cognitive challenges faced by children, older adults, and individuals with neurodevelopmental or neurological conditions [49,56].
For children, XR-based cognitive training focuses on enhancing attention, working memory, and problem-solving skills through game-like, interactive scenarios [85,106]. Tasks such as navigating virtual mazes, solving puzzles, or completing missions not only improve cognitive performance but also foster persistence, patience, and adaptability—skills essential for academic success [107]. The engaging and playful nature of XR makes it particularly effective for sustaining children’s motivation and participation [108].
For aging adults, XR-based training addresses cognitive decline by targeting memory, attention, and executive functions through tasks that replicate daily-life scenarios [109]. Examples include remembering shopping lists in virtual stores, preparing meals, or navigating public transport systems [110]. These practical tasks not only improve cognitive performance but also enhance confidence and quality of life in older adults [111]. XR environments can also accommodate physical limitations, ensuring safety and comfort for participants with reduced mobility [112].
For individuals with brain injuries or neurodevelopmental disorders, XR provides a safe and controlled space to practice cognitive skills critical for recovery and daily functioning [77]. In traumatic brain injury (TBI) rehabilitation, XR environments can simulate real-world scenarios—like driving or workplace tasks—that challenge cognitive abilities while allowing gradual increases in task complexity [33]. Similarly, for individuals with ASD, XR-based interventions focus on improving social cognition, executive function, and adaptive skills [55,56]. Virtual environments simulate real-world social interactions, such as job interviews or shopping, offering a safe space for practice without fear of judgment or consequences [57].
XR’s adaptability also benefits children with ADHD. Engaging XR tasks that train attention, social cognition, and executive function have been shown to enhance focus, task management, and behavioral outcomes [46]. The structured yet flexible nature of XR makes it an effective, user-centered therapeutic tool that can be tailored to individual progress and needs [46]. Overall, XR technologies allow for the development of personalized and contextually relevant interventions that address the unique challenges faced by diverse populations, improving cognitive and adaptive skills in ways that are highly transferable to real-world settings [99].

6.3. Long-Term Effects and Retention of Cognitive Skills in XR Training

The long-term effects and retention of cognitive improvements following XR-based cognitive training are key areas of ongoing research [113]. XR technologies have demonstrated strong potential to produce lasting gains by simulating real-world challenges, enhancing engagement, and fostering deeper learning processes [15,39]. However, the degree to which these skills transfer to daily tasks and remain stable over time without reinforcement remains a central question [77].
A primary contributor to retention is XR’s ecological validity—its ability to replicate realistic tasks [109]. Unlike traditional training, XR tasks simulate daily activities, such as navigating environments, recalling object locations, or solving multi-step problems [114]. These context-rich experiences make cognitive improvements more likely to generalize to real-world situations, as shown in studies where older adults sustained memory gains weeks after completing XR-based training [77,102].
User engagement also plays a pivotal role in reinforcing long-term cognitive benefits [44]. XR environments provide immersive, interactive experiences that sustain motivation and emotional arousal, key factors for durable learning [111,115]. Features such as adaptive task difficulty and real-time feedback ensure participants remain appropriately challenged, enhancing both skill acquisition and retention [116].
Nevertheless, challenges remain. While XR training shows promise, some skills may not fully transfer to non-virtual settings, particularly without follow-up reinforcement [109]. Additionally, the optimal duration, frequency, and reinforcement strategies for maintaining long-term gains are still under investigation [77]. Research into XR’s role in neuroplasticity—the brain’s ability to reorganize and adapt—suggests that engaging in adaptive, cognitively demanding XR tasks can strengthen neural connections, particularly in aging populations or individuals recovering from brain injuries [117]. Future studies should explore these mechanisms to better understand the long-term biological and behavioral impacts of XR-based training [77].

6.4. Future Directions in XR-Based Cognitive Training

The future of XR-based cognitive training will be shaped by advancements in technology, personalization, and accessibility, offering new possibilities for enhancing cognitive health across diverse populations [77].
One promising direction is the integration of XR with neuroimaging techniques such as EEG [79]. Real-time monitoring of brain activity can provide insights into users’ cognitive states, allowing systems to dynamically adapt task difficulty based on cognitive load or fatigue [80,118]. This combination of XR and neuroimaging represents a significant step toward creating tailored interventions that optimize learning outcomes [17].
Advances in socially interactive XR environments will further expand the scope of cognitive training [57]. Virtual environments featuring avatars and collaborative scenarios can simulate real-world challenges that require users to develop problem-solving, emotional regulation, and social cognition skills [119]. These interactive platforms are particularly beneficial for individuals with ASD or social anxiety, offering safe, repeatable spaces to practice and refine social behaviors [55,57].
Artificial intelligence (AI) will also revolutionize XR training. AI-driven algorithms can analyze user performance data to predict cognitive states, personalize training trajectories, and adjust difficulty in real time, ensuring participants remain challenged and engaged [80]. AI can further enhance XR environments by enabling dynamic, naturalistic responses to user actions, creating immersive experiences that closely replicate real-world cognitive demands [118].
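As a rough sketch of how such AI-driven adaptation could be prototyped, the example below trains a simple classifier to flag "overload" trials from multimodal features and exposes class probabilities that a training system might use to pace difficulty. The data are synthetic and the feature set is an assumption made for illustration, not a validated model from the cited literature.

```python
# Minimal sketch: predicting a binary "overloaded" state from multimodal features
# with a simple classifier (synthetic data; feature choices are illustrative only).

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Features per trial: [EEG workload index, GSR level, response time (s)], all scaled 0-1
X = rng.random((200, 3))
# Synthetic labels: call a trial "overloaded" when workload and response time are both high
y = ((X[:, 0] > 0.6) & (X[:, 2] > 0.5)).astype(int)

model = LogisticRegression().fit(X, y)

new_trial = np.array([[0.8, 0.4, 0.7]])
print(model.predict(new_trial))        # predicted state for a new trial (1 = overloaded)
print(model.predict_proba(new_trial))  # class probabilities a system could use to pace difficulty
```

In a deployed system, a predictive model of this kind would feed the same difficulty-adjustment logic described in Section 4.2, replacing fixed thresholds with learned, user-specific predictions.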
Finally, as XR technologies become more affordable and accessible, their adoption will expand into resource-limited settings such as schools, clinics, and homes [1]. The development of lightweight, ergonomic hardware and cloud-based platforms will enable remote delivery of XR training programs, democratizing access to high-quality cognitive interventions for underserved populations [1,50].
These advancements will transform XR-based cognitive training into a scalable, personalized, and effective solution for addressing cognitive challenges. Future research should prioritize validating these approaches through longitudinal studies and exploring their potential for lifelong cognitive health, rehabilitation, and development [44,113].

7. Clinical Utility of XR in Cognitive Assessment and Training

7.1. Benefits of XR-Based Cognitive Tools in Clinical Settings

XR technologies—VR, AR, and MR—offer unique advantages in clinical settings for cognitive assessment and training [1]. XR provides immersive, interactive, and flexible platforms that enable the evaluation and improvement of cognitive functions in ways that traditional methods cannot [1,51]. By simulating real-world scenarios, XR enhances ecological validity, bridging the gap between laboratory-based testing and real-life performance [3,62].
In clinical practice, XR tools allow clinicians to observe cognitive processes, such as memory, attention, and executive functions, in dynamic, controlled environments [23,100]. For instance, memory assessments can involve navigating virtual spaces and recalling item locations, providing richer insights compared to static tasks [36,88]. Similarly, XR-based tasks can simulate multitasking challenges, enabling clinicians to evaluate decision-making and problem-solving under realistic conditions [4,23]. This comprehensive approach is particularly valuable for individuals with cognitive impairments, such as dementia or TBI, as it reveals how they manage complex tasks, distractions, and spatial challenges in real time [14].
Another major advantage of XR tools is their customizability and adaptability [115]. Task difficulty can be adjusted in real time, allowing clinicians to personalize assessments and training to meet individual needs [71,115]. This adaptability enhances patient engagement and adherence, which are critical for achieving meaningful improvements, particularly among older adults and children [43,108].

7.2. Comparative Analysis of Traditional vs. XR-Based Cognitive Assessments and Training

Traditional cognitive assessments—such as paper-and-pencil tests or computerized tasks—are valuable for standardized evaluation but often lack ecological validity and fail to reflect real-world cognitive demands [2,5]. XR-based assessments, in contrast, immerse participants in realistic, interactive environments that engage multiple cognitive processes simultaneously [3].
For example, traditional memory tasks may involve recalling word lists, whereas XR tools place participants in virtual environments—like grocery stores—where they must remember and retrieve items, integrating attention, memory, and spatial navigation [4,5]. Similarly, XR enables real-time collection of multimodal data—ET, motion tracking, and physiological signals—providing deeper insights into participants’ cognitive and emotional states during task performance [120].
In cognitive training, XR tools outperform traditional repetitive exercises by offering engaging, context-rich tasks [77]. For instance, XR-based interventions may simulate driving, workplace activities, or social interactions, enhancing motivation and facilitating skill transfer to real-world situations [33,56]. Unlike static methods, XR dynamically adapts difficulty based on user performance, ensuring continuous engagement and personalized progression [105].

7.3. Case Examples of Clinical Applications

XR has demonstrated significant clinical utility across various populations and conditions [13,51]. For individuals with MCI or dementia, XR tools simulate daily activities—such as navigating virtual cities or managing household tasks—to assess spatial memory, attention, and executive function [16,121]. These immersive assessments provide accurate, ecologically valid insights that aid in early diagnosis and personalized care plans [121].
In rehabilitation settings, XR has been employed to improve cognitive and motor recovery for stroke or TBI patients [14]. Tasks such as reaching for virtual objects, navigating spaces, or multitasking allow clinicians to gradually increase task complexity while tracking progress in real time [49,122].
For individuals with ASD, XR enables safe and structured practice of social cognition and communication skills [56]. Virtual avatars and interactive environments help individuals engage in realistic social tasks—such as making eye contact, participating in conversations, or attending job interviews—without the pressures of real-world interactions [57].
XR tools have also shown promise for older adults, offering cognitive training scenarios like remembering appointments or managing finances within virtual environments [123]. These tasks improve cognitive function, build confidence, and provide a non-invasive, engaging approach to delaying cognitive decline [36,123].

7.4. Barriers to XR Adoption: Hardware, Software, and Accessibility Challenges

While XR technologies hold immense potential for cognitive assessment and training, several barriers continue to hinder their widespread adoption [113]. A major challenge is the cost of XR hardware, including high-quality headsets, sensors, and computing systems, which remain prohibitively expensive for many educational, clinical, and research institutions, particularly in low-resource settings [1,3]. Although hardware prices have decreased in recent years, the financial burden associated with acquiring and maintaining XR systems remains significant [81].
Developing XR software is another resource-intensive barrier [18]. Creating engaging, user-friendly applications requires substantial expertise, financial investment, and continuous updates to ensure compatibility with rapidly evolving hardware technologies [1,99]. Institutions lacking the necessary infrastructure or funding may struggle to integrate XR effectively into cognitive programs [81].
Accessibility also poses significant challenges. Operating XR systems requires a degree of technological literacy that older adults and individuals with limited experience of digital technologies may lack [42]. Physical barriers, such as discomfort caused by prolonged headset use or difficulties interacting with virtual environments due to motor impairments, further complicate adoption [18,124]. Individuals with visual or hearing impairments may also face challenges if XR systems are not designed with adaptive accessibility features [43,83].
Beyond accessibility, the technical complexity of XR implementation—developing, maintaining, and integrating these systems into existing clinical workflows—requires specialized expertise in cognitive science and software engineering [80]. Clinician training and system compatibility with other diagnostic tools are further obstacles to adoption [81].
Regulatory issues surrounding data privacy and security also pose challenges. XR systems collect sensitive biometric and behavioral data, necessitating clear frameworks to ensure compliance with clinical standards for patient safety and data protection [11,90].
Overcoming these barriers requires a concerted effort to make XR systems more affordable, user-friendly, and inclusive [1,3]. Prioritizing streamlined software, lightweight hardware, and adaptive design will be key to ensuring that XR technologies are accessible and acceptable across diverse populations [124].

7.5. Current Issues with Using XR for Cognitive Assessment and Training

7.5.1. Cybersickness and Immersion Fatigue

One of the primary challenges of XR technologies is cybersickness, a phenomenon similar to motion sickness caused by mismatches between visual input and vestibular signals [113,125]. Symptoms such as nausea, dizziness, and eye strain can limit how long participants can engage with XR tasks and negatively impact cognitive performance during assessments and training [94,95]. Cybersickness is particularly problematic in applications requiring prolonged immersion or rapid movements, such as navigation tasks or VR-based simulations [94,126].
In addition to cybersickness, immersion fatigue can occur when participants are exposed to highly immersive environments for extended periods [94]. While XR’s immersive nature enhances engagement, excessive cognitive and sensory stimulation may cause mental fatigue, visual discomfort, and declining task performance over time [9,98]. This challenge is especially relevant for people with cognitive impairments and for older adults, whose capacity for sustained focus may already be limited [13]. Optimizing session durations, implementing periodic breaks, and managing the intensity of tasks are essential strategies for mitigating these effects [94,122].

7.5.2. Underutilization of XR Technologies

Despite technological advancements, many XR applications for cognitive assessment and training fail to fully exploit XR’s capabilities [1]. A significant area of underutilization lies in the integration of multimodal data streams, such as combining visual, auditory, and haptic feedback with physiological measures like EEG, GSR, and heart rate variability [8,80]. Multimodal integration allows for more adaptive, personalized experiences and provides deeper insights into users’ cognitive and emotional states [69].
For example, incorporating haptic feedback can make interactions with virtual objects more natural [127,128], while real-time monitoring of physiological responses can dynamically adjust task difficulty to maintain an optimal cognitive load [72]. However, many current XR systems focus primarily on visual and auditory inputs, missing opportunities to enhance immersion, interactivity, and the granularity of cognitive assessments [129]. Expanding the use of these modalities could significantly improve both the UX and the accuracy of XR-based cognitive assessment and training tools [80,129].
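The closed loop described above can be sketched minimally as a smoothed skin-conductance signal nudging the scene’s visual complexity up or down. The signal handling, thresholds, and parameter values are illustrative assumptions and are not taken from the cited studies.

```python
# Illustrative sketch (assumed thresholds and smoothing): adjust a normalized
# visual-complexity parameter (0-1) from baseline-relative skin conductance.

def update_visual_complexity(complexity, gsr_sample, baseline, ema,
                             alpha=0.1, high=1.3, low=0.9, step=0.05):
    """Return (new_complexity, new_ema) after one GSR sample."""
    ratio = gsr_sample / baseline                 # arousal relative to resting baseline
    ema = alpha * ratio + (1 - alpha) * ema       # exponential smoothing damps phasic spikes
    if ema > high:
        complexity = max(0.0, complexity - step)  # apparent overload: simplify the scene
    elif ema < low:
        complexity = min(1.0, complexity + step)  # under-stimulation: enrich the scene
    return complexity, ema

# Typical use: initialize ema = 1.0 after a resting-baseline recording, then call
# update_visual_complexity once per second with the latest tonic GSR value.
```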

7.5.3. Population-Specific Challenges

The effectiveness and usability of XR technologies can vary significantly across different populations, creating unique challenges for their widespread adoption.
  • Children: While XR holds great promise for cognitive training in younger populations, children may be more susceptible to cybersickness due to their developing vestibular systems [108]. Additionally, children require highly interactive, engaging content to sustain their attention, making careful design of XR tasks essential [46,108].
  • Older Adults: Older populations often face barriers related to technology adoption. Physical discomfort caused by heavy or poorly balanced headsets, visual fatigue, and unfamiliarity with immersive interfaces can limit their participation and effectiveness [43,48]. Tailoring XR experiences to accommodate physical and cognitive limitations is necessary to make these tools accessible to aging adults [11].
  • Cognitive Impairment: Individuals with dementia, TBI, or other neurological conditions may find XR environments overwhelming or disorienting due to their immersive nature and the cognitive load required to navigate these systems [38,50]. Designing XR tools with simplified interfaces, adjustable immersion levels, and clear guidance can help address these challenges [13].
Addressing population-specific barriers requires thoughtful, user-centered design that prioritizes accessibility, comfort, and usability across diverse user groups [1,113].

7.5.4. Hardware Limitations and Their Impact on Immersive Experience

The quality of the XR experience is heavily dependent on the underlying hardware, and current limitations remain a significant hurdle for clinical applications [93]. High-quality XR systems rely on high-resolution displays, wide fields of view, and responsive motion tracking, but limitations in these areas can reduce immersion and user comfort [1]. For example, low-resolution visuals can cause pixelation, while latency in motion tracking can disrupt natural interactions with virtual objects, reducing the accuracy and realism of cognitive tasks [74].
The weight and ergonomics of VR headsets are also critical concerns [48]. Heavy or poorly balanced headsets can lead to physical discomfort, particularly during prolonged use, which limits session duration and may reduce user engagement [43]. These challenges are exacerbated for populations such as older adults and individuals with physical impairments [42]. Future advancements in hardware design—such as lighter, more ergonomic devices with improved motion tracking and display performance—are essential for enhancing usability and expanding XR’s clinical utility [1,83].

7.5.5. Strategies to Mitigate Challenges and Improve XR Implementation

Several strategies can address these challenges to maximize the effectiveness of XR technologies in cognitive assessment and training [37]. To reduce cybersickness, techniques such as dynamic field-of-view adjustment, optimizing frame rates, and improving motion tracking accuracy can minimize sensory mismatches [94,130]. For managing immersion fatigue, task designs that incorporate breaks, natural transitions, and varied cognitive loads can help sustain engagement without overwhelming users [122].
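As an example of the first of these techniques, dynamic field-of-view restriction can be sketched as a simple mapping from locomotion speed to a comfortable visible field of view; the angles and speed range below are assumptions for illustration, not values recommended by the cited work.

```python
# Illustrative sketch (assumed angles and speed range): narrow the field of view
# as virtual locomotion speed, and hence visual-vestibular mismatch, increases.

def restricted_fov(speed_m_s, full_fov_deg=100.0, min_fov_deg=60.0, max_speed_m_s=3.0):
    """Map current locomotion speed (m/s) to a displayed field of view in degrees."""
    t = min(max(speed_m_s / max_speed_m_s, 0.0), 1.0)   # 0 when still, 1 at top speed
    return full_fov_deg - t * (full_fov_deg - min_fov_deg)

# Example: at a walking speed of about 1.4 m/s, the field of view would be
# narrowed to roughly 81 degrees and restored to 100 degrees when stationary.
```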
Addressing underutilization of XR technologies requires expanding the integration of multimodal data streams and sensory feedback, such as EEG, GSR, and haptics, to provide more adaptive, personalized cognitive tools [72,80]. Thoughtful design that accounts for population-specific needs—including intuitive interfaces, adjustable immersion levels, and physical accommodations—will enhance accessibility and usability for diverse groups [48,108].
Finally, continued advancements in hardware—including lighter, more ergonomic headsets with improved resolution and tracking capabilities—will be critical for delivering immersive, comfortable experiences that support effective cognitive assessment and training [1,83].

7.5.6. Guidelines to Maximize the Effectiveness of XR Tools for Assessment and Training

XR experts and working groups are developing guidelines and checklists to ensure that XR assessment and training applications meet criteria of quality and research rigor [37,44,131]. For cognitive assessment, developers can apply a multidimensional checklist that emphasizes ecological relevance, task adaptability, and the anticipation of predictable pitfalls [37,131]. Recent guidelines also highlight the need for multidisciplinary teams in the development of XR applications and emphasize the integration of multimodal techniques so that VR applications deliver adequate cognitive assessments [18,122,131].
Growing efforts also focus on guidelines for evaluating XR rehabilitation applications in the early stages of development, ensuring sound design, rigorous protocol testing, and comprehensive evaluation of human factors such as acceptability, usability, cybersickness, and safety [44]. For example, a recently proposed framework introduces AI techniques that adapt task difficulty and personalize the training process within an adaptive VR application [122]. Guidelines and frameworks must be updated continually to keep pace with ongoing technological advances and software development [37,44].

8. Conclusions

This review underscores the transformative potential of XR technologies in cognitive assessment and training, particularly in their ability to provide ecologically valid evaluations of real-world cognitive skills. Unlike traditional methods, XR immerses participants in dynamic, interactive environments, enabling a comprehensive analysis of memory, attention, decision-making, and problem-solving as they occur in everyday contexts.
A defining strength of XR is its capacity for multimodal integration. Systems combining GSR, EEG, ET, hand tracking, and body tracking allow for a deeper, holistic understanding of users’ cognitive and emotional states. These innovations have the potential to drive adaptive and personalized interventions, yet many current XR applications fail to fully exploit this capability. Future advancements in multimodal systems will be critical for enhancing both engagement and therapeutic outcomes.
XR’s engaging and interactive environments are particularly advantageous for cognitive training programs, fostering motivation and adherence that are often lacking in traditional approaches. However, challenges such as cybersickness, immersion fatigue, and hardware constraints remain significant barriers to widespread adoption. Addressing these issues through ergonomic hardware design, intuitive interfaces, and enhanced software optimization will be essential for ensuring accessibility and comfort for diverse users.
The validation of XR-based rehabilitation programs through rigorous randomized controlled trials (RCTs) is another pressing requirement. Such trials must be supported by multicenter collaborations, standardized protocols, and iterative testing methodologies to establish both scientific reliability and practical applicability. Early-phase studies and continuous feedback from diverse user groups can further refine these technologies for specific clinical and educational settings.
Despite these advances, XR technologies face substantial limitations. These include the underutilization of multimodal features, usability challenges for older adults and cognitively impaired populations, and regulatory hurdles concerning data privacy and ethical standards. Resolving these issues is essential for achieving the broader acceptance of XR in healthcare and education.
Looking ahead, the future of XR lies in its ability to seamlessly integrate cutting-edge technologies, enhance user-centric designs, and expand its applicability to both general and specialized populations. By overcoming current limitations, XR can solidify its role as a transformative tool in neuropsychology and cognitive science, offering innovative and impactful solutions for assessment, training, and rehabilitation.

Future Directions

To realize the full potential of XR in cognitive science, future advancements must address critical areas to overcome existing limitations and expand its transformative capabilities. One essential focus is the enhancement of multimodal integration. By incorporating diverse biometric data streams—including EEG, GSR, and ET—XR systems can enable real-time task adaptations and provide a more immersive and responsive user experience. This integration is pivotal for improving the accuracy and effectiveness of cognitive training and assessments.
Equally significant is the development of personalized and adaptive solutions tailored to the unique cognitive and emotional needs of individual users. Such customization is particularly valuable for vulnerable or clinical populations, where generic approaches often fall short. By leveraging real-time feedback and data analysis, XR systems can dynamically adjust their content to align with users’ specific requirements, enhancing both engagement and outcomes.
Advances in hardware design are also paramount. The development of lighter, more ergonomic headsets and more precise motion tracking systems will not only improve usability but will also expand accessibility to broader audiences. These innovations are crucial for mitigating issues such as physical discomfort and usability barriers, making XR technologies more viable for prolonged use in educational, clinical, and domestic settings.
Additionally, XR’s ecological validity opens new opportunities for its application in diverse, real-world scenarios. From navigation and workplace simulations to household management, XR tools can provide invaluable insights into users’ cognitive abilities in practical contexts. These applications extend XR’s utility beyond traditional domains, offering solutions that are both innovative and pragmatic.
Rehabilitation is another area where XR’s flexibility and adaptability shine. By creating safe, controlled environments, XR enables individuals to practice real-world tasks critical to their recovery. This capability is particularly beneficial for patients managing neurological injuries or cognitive impairments, as it fosters consistent engagement and improves therapeutic outcomes. The ability to adapt these environments to individual needs ensures that rehabilitation programs are both effective and accessible.
Ultimately, addressing these priorities will drive the evolution of XR technologies into inclusive, robust, and impactful tools. By overcoming current limitations, XR can solidify its role as a cornerstone in neuropsychology and cognitive science, facilitating groundbreaking advancements in research, education, and clinical practice.

Author Contributions

Conceptualization, P.V.G.-E. and P.K.; validation, P.V.G.-E., S.F.-G. and P.K.; writing—original draft preparation, P.V.G.-E. and P.K.; writing—review and editing, P.V.G.-E., S.F.-G. and P.K.; supervision, S.F.-G. and P.K.; project administration, S.F.-G. and P.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing is not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Pons, P.; Navas-Medrano, S.; Soler-Dominguez, J.L. Extended Reality for Mental Health: Current Trends and Future Challenges. Front. Comput. Sci. 2022, 4, 1034307. [Google Scholar] [CrossRef]
  2. Howieson, D. Current Limitations of Neuropsychological Tests and Assessment Procedures. Clin. Neuropsychol. 2019, 33, 200–208. [Google Scholar] [CrossRef]
  3. Pieri, L.; Tosi, G.; Romano, D. Virtual Reality Technology in Neuropsychological Testing: A Systematic Review. J. Neuropsychol. 2023, 17, 382–399. [Google Scholar] [CrossRef] [PubMed]
  4. Kourtesis, P.; Collina, S.; Doumas, L.A.A.; MacPherson, S.E. Validation of the Virtual Reality Everyday Assessment Lab (VR-EAL): An Immersive Virtual Reality Neuropsychological Battery with Enhanced Ecological Validity. J. Int. Neuropsychol. Soc. 2021, 27, 181–196. [Google Scholar] [CrossRef] [PubMed]
  5. Parsons, T.D.; Carlew, A.R.; Magtoto, J.; Stonecipher, K. The Potential of Function-Led Virtual Environments for Ecologically Valid Measures of Executive Function in Experimental and Clinical Neuropsychology. Neuropsychol. Rehabil. 2017, 27, 777–807. [Google Scholar] [CrossRef]
  6. Baceviciute, S.; Lucas, G.; Terkildsen, T.; Makransky, G. Investigating the Redundancy Principle in Immersive Virtual Reality Environments: An Eye-Tracking and EEG Study. J. Comput. Assist. Learn. 2022, 38, 120–136. [Google Scholar] [CrossRef]
  7. Kalantari, S.; Rounds, J.D.; Kan, J.; Tripathi, V.; Cruz-Garza, J.G. Comparing Physiological Responses during Cognitive Tests in Virtual Environments vs. in Identical Real-World Environments. Sci. Rep. 2021, 11, 10227. [Google Scholar] [CrossRef] [PubMed]
  8. Mishra, S.; Kumar, A.; Padmanabhan, P.; Gulyás, B. Neurophysiological Correlates of Cognition as Revealed by Virtual Reality: Delving the Brain with a Synergistic Approach. Brain Sci. 2021, 11, 51. [Google Scholar] [CrossRef]
  9. Souchet, A.D.; Philippe, S.; Lourdeaux, D.; Leroy, L. Measuring Visual Fatigue and Cognitive Load via Eye Tracking While Learning with Virtual Reality Head-Mounted Displays: A Review. Int. J. Hum.–Comput. Interact. 2022, 38, 801–824. [Google Scholar] [CrossRef]
  10. Riva, G.; Mancuso, V.; Cavedoni, S.; Stramba-Badiale, C. Virtual Reality in Neurorehabilitation: A Review of Its Effects on Multiple Cognitive Domains. Expert Rev. Med. Devices 2020, 17, 1035–1061. [Google Scholar] [CrossRef]
  11. Skurla, M.D.; Rahman, A.T.; Salcone, S.; Mathias, L.; Shah, B.; Forester, B.P.; Vahia, I.V. Virtual Reality and Mental Health in Older Adults: A Systematic Review. Int. Psychogeriatr. 2021, 34, 143–145. [Google Scholar] [CrossRef] [PubMed]
  12. Dey, A.; Chatburn, A.; Billinghurst, M. Exploration of an EEG-Based Cognitively Adaptive Training System in Virtual Reality. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 220–226. [Google Scholar]
  13. Clay, F.; Howett, D.; Fitzgerald, J.; Fletcher, P.; Chan, D.; House, D. Use of Immersive Virtual Reality in the Assessment and Treatment of Alzheimer’s Disease: A Systematic Review. J. Alzheimer’s Dis. 2020, 75, 23–43. [Google Scholar] [CrossRef] [PubMed]
  14. Shen, J.; Lundine, J.P.; Koterba, C.; Udaipuria, S.; Busch, T.; Rausch, J.; Yeates, K.O.; Crawfis, R.; Xiang, H.; Taylor, H.G. VR-Based Cognitive Rehabilitation for Children with Traumatic Brain Injuries: Feasibility and Safety. Rehabil. Psychol. 2022, 67, 474–483. [Google Scholar] [CrossRef] [PubMed]
  15. Howard, M.C. A Meta-Analysis and Systematic Literature Review of Virtual Reality Rehabilitation Programs. Comput. Hum. Behav. 2017, 70, 317–327. [Google Scholar] [CrossRef]
  16. Howett, D.; Castegnaro, A.; Krzywicka, K.; Hagman, J.; Marchment, D.; Henson, R.; Rio, M.; King, J.A.; Burgess, N.; Chan, D. Differentiation of Mild Cognitive Impairment Using an Entorhinal Cortex-Based Test of Virtual Reality Navigation. Brain 2019, 142, 1751–1766. [Google Scholar] [CrossRef] [PubMed]
  17. Wen, D.; Li, R.; Jiang, M.; Li, J.; Liu, Y.; Dong, X.; Saripan, M.I.; Song, H.; Han, W.; Zhou, Y. Multi-Dimensional Conditional Mutual Information with Application on the EEG Signal Analysis for Spatial Cognitive Ability Evaluation. Neural Netw. 2022, 148, 23–36. [Google Scholar] [CrossRef] [PubMed]
  18. Kourtesis, P.; Korre, D.; Collina, S.; Doumas, L.A.A.; MacPherson, S.E. Guidelines for the Development of Immersive Virtual Reality Software for Cognitive Neuroscience and Neuropsychology: The Development of Virtual Reality Everyday Assessment Lab (VR-EAL), a Neuropsychological Test Battery in Immersive Virtual Reality. Front. Comput. Sci. 2020, 1, e00012. [Google Scholar] [CrossRef]
  19. Dawson, D.R.; Marcotte, T.D. Special Issue on Ecological Validity and Cognitive Assessment. Neuropsychol. Rehabil. 2017, 27, 599–602. [Google Scholar] [CrossRef] [PubMed]
  20. Faria, A.L.; Andrade, A.; Soares, L.; I Badia, S.B. Benefits of Virtual Reality Based Cognitive Rehabilitation through Simulated Activities of Daily Living: A Randomized Controlled Trial with Stroke Patients. J. NeuroEng. Rehabil. 2016, 13, 96. [Google Scholar] [CrossRef] [PubMed]
  21. Diaz-Orueta, U.; Rogers, B.M.; Blanco-Campal, A.; Burke, T. The Challenge of Neuropsychological Assessment of Visual/Visuo-Spatial Memory: A Critical, Historical Review, and Lessons for the Present and Future. Front. Psychol. 2022, 13, 962025. [Google Scholar] [CrossRef] [PubMed]
  22. Jansari, A.S.; Devlin, A.; Agnew, R.; Akesson, K.; Murphy, L.; Leadbetter, T. Ecological Assessment of Executive Functions: A New Virtual Reality Paradigm. Brain Impair. 2014, 15, 71–87. [Google Scholar] [CrossRef]
  23. Borgnis, F.; Baglio, F.; Pedroli, E.; Rossetto, F.; Uccellatore, L.; Oliveira, J.A.G.; Riva, G.; Cipresso, P. Available Virtual Reality-Based Tools for Executive Functions: A Systematic Review. Front. Psychol. 2022, 13, 833136. [Google Scholar] [CrossRef]
  24. Mancuso, V.; Sarcinella, E.D.; Bruni, F.; Arlati, S.; Di Santo, S.G.; Cavallo, M.; Cipresso, P.; Pedroli, E. Systematic Review of Memory Assessment in Virtual Reality: Evaluating Convergent and Divergent Validity with Traditional Neuropsychological Measures. Front. Hum. Neurosci. 2024, 18, 1380575. [Google Scholar] [CrossRef] [PubMed]
  25. Kisker, J.; Gruber, T.; Schöne, B. Virtual Reality Experiences Promote Autobiographical Retrieval Mechanisms: Electrophysiological Correlates of Laboratory and Virtual Experiences. Psychol. Res. 2021, 85, 2485–2501. [Google Scholar] [CrossRef]
  26. Sauzéon, H.; N’Kaoua, B.; Arvind Pala, P.; Taillade, M.; Guitton, P. Age and Active Navigation Effects on Episodic Memory: A Virtual Reality Study. Br. J. Psychol. 2016, 107, 72–94. [Google Scholar] [CrossRef] [PubMed]
  27. Canty, A.L.; Fleming, J.; Patterson, F.; Green, H.J.; Man, D.; Shum, D.H.K. Evaluation of a Virtual Reality Prospective Memory Task for Use with Individuals with Severe Traumatic Brain Injury. Neuropsychol. Rehabil. 2014, 24, 238–265. [Google Scholar] [CrossRef] [PubMed]
  28. Kourtesis, P.; MacPherson, S.E. An Ecologically Valid Examination of Event-Based and Time-Based Prospective Memory Using Immersive Virtual Reality: The Influence of Attention, Memory, and Executive Function Processes on Real-World Prospective Memory. Neuropsychol. Rehabil. 2023, 33, 255–280. [Google Scholar] [CrossRef]
  29. Lecouvey, G.; Morand, A.; Gonneaud, J.; Piolino, P.; Orriols, E.; Pélerin, A.; Ferreira Da Silva, L.; De La Sayette, V.; Eustache, F.; Desgranges, B. An Impairment of Prospective Memory in Mild Alzheimer’s Disease: A Ride in a Virtual Town. Front. Psychol. 2019, 10, 241. [Google Scholar] [CrossRef]
  30. Kourtesis, P.; Collina, S.; Doumas, L.A.A.; MacPherson, S.E. An Ecologically Valid Examination of Event-Based and Time-Based Prospective Memory Using Immersive Virtual Reality: The Effects of Delay and Task Type on Everyday Prospective Memory. Memory 2021, 29, 486–506. [Google Scholar] [CrossRef] [PubMed]
  31. Huizeling, E.; Alday, P.M.; Peeters, D.; Hagoort, P. Combining EEG and 3D-Eye-Tracking to Study the Prediction of Upcoming Speech in Naturalistic Virtual Environments: A Proof of Principle. Neuropsychologia 2023, 191, 108730. [Google Scholar] [CrossRef] [PubMed]
  32. Peeters, D. Virtual Reality: A Game-Changing Method for the Language Sciences. Psychon. Bull. Rev. 2019, 26, 894–900. [Google Scholar] [CrossRef] [PubMed]
  33. Masoumzadeh, S.; Moussavi, Z. Does Practicing with a Virtual Reality Driving Simulator Improve Spatial Cognition in Older Adults? A Pilot Study. Neurosci. Insights 2020, 15, 2633105520967930. [Google Scholar] [CrossRef]
  34. Tuena, C.; Mancuso, V.; Stramba-badiale, C.; Pedroli, E. Egocentric and Allocentric Spatial Memory in Mild Cognitive Impairment with Real-World and Virtual Navigation Tasks: A Systematic Review. J. Alzheimer’s Dis. 2021, 79, 95–116. [Google Scholar] [CrossRef]
  35. Coldham, G.; Cook, D.M. VR Usability from Elderly Cohorts: Preparatory Challenges in Overcoming Technology Rejection. In Proceedings of the 2017 National Information Technology Conference (NITC), Colombo, Sri Lanka, 13–15 September 2017; pp. 131–135. [Google Scholar]
  36. Ijaz, K.; Ahmadpour, N.; Naismith, S.L.; Calvo, R.A. An Immersive Virtual Reality Platform for Assessing Spatial Navigation Memory in Predementia Screening: Feasibility and Usability Study. JMIR Ment. Health 2019, 6, e13887. [Google Scholar] [CrossRef] [PubMed]
  37. Krohn, S.; Tromp, J.; Quinque, E.M.; Belger, J.; Klotzsche, F.; Rekers, S.; Chojecki, P.; de Mooij, J.; Akbal, M.; McCall, C.; et al. Multidimensional Evaluation of Virtual Reality Paradigms in Clinical Neuropsychology: Application of the VR-Check Framework. J. Med. Internet Res. 2020, 22, 16724. [Google Scholar] [CrossRef] [PubMed]
  38. Arlati, S.; Di Santo, S.G.; Franchini, F.; Mondellini, M.; Filiputti, B.; Luchi, M.; Ratto, F.; Ferrigno, G.; Sacco, M.; Greci, L. Acceptance and Usability of Immersive Virtual Reality in Older Adults with Objective and Subjective Cognitive Decline. J. Alzheimer’s Dis. 2021, 80, 1025–1038. [Google Scholar] [CrossRef] [PubMed]
  39. Varela-Aldás, J.; Buele, J.; Amariglio, R.; García-Magariño, I.; Palacios-Navarro, G. The Cupboard Task: An Immersive Virtual Reality-Based System for Everyday Memory Assessment. Int. J. Hum.-Comput. Stud. 2022, 167, 102885. [Google Scholar] [CrossRef]
  40. Li, F.; Lee, C.-H.; Feng, S.; Trappey, A.; Gilani, F. Prospective on Eye-Tracking-Based Studies in Immersive Virtual Reality. In Proceedings of the 2021 IEEE 24th International Conference on Computer Supported Cooperative Work in Design (CSCWD), Dalian, China, 5 May 2021; pp. 861–866. [Google Scholar]
  41. Tada, K.; Sorimachi, Y.; Kutsuzawa, K.; Owaki, D.; Hayashibe, M. Integrated Quantitative Evaluation of Spatial Cognition and Motor Function with HoloLens Mixed Reality. Sensors 2024, 24, 528. [Google Scholar] [CrossRef]
  42. Ijaz, K.; Tran, T.T.M.; Kocaballi, A.B.; Calvo, R.A.; Berkovsky, S.; Ahmadpour, N. Design Considerations for Immersive Virtual Reality Applications for Older Adults: A Scoping Review. Multimodal Technol. Interact. 2022, 6, 60. [Google Scholar] [CrossRef]
  43. Pardini, S.; Gabrielli, S.; Gios, L.; Dianti, M.; Mayora-Ibarra, O.; Appel, L.; Olivetto, S.; Torres, A.; Rigatti, P.; Trentini, E.; et al. Customized Virtual Reality Naturalistic Scenarios Promoting Engagement and Relaxation in Patients with Cognitive Impairment: A Proof-of-Concept Mixed-Methods Study. Sci. Rep. 2023, 13, 20516. [Google Scholar] [CrossRef]
  44. Vlake, J.H.; Drop, D.L.Q.; Van Bommel, J.; Riva, G.; Wiederhold, B.K.; Cipresso, P.; Rizzo, A.S.; Rothbaum, B.O.; Botella, C.; Hooft, L.; et al. Reporting Guidelines for the Early-Phase Clinical Evaluation of Applications Using Extended Reality: RATE-XR Qualitative Study Guideline. J. Med. Internet Res. 2024, 26, e56790. [Google Scholar] [CrossRef] [PubMed]
  45. Giatzoglou, E.; Vorias, P.; Kemm, R.; Karayianni, I.; Nega, C.; Kourtesis, P. The Trail Making Test in Virtual Reality (TMT-VR): The Effects of Interaction Modes and Gaming Skills on Cognitive Performance of Young Adults. Appl. Sci. 2024, 14, 10010. [Google Scholar] [CrossRef]
  46. Wong, K.-P.; Zhang, B.; Qin, J. Unlocking Potential: The Development and User-Friendly Evaluation of a Virtual Reality Intervention for Attention-Deficit/Hyperactivity Disorder. Appl. Syst. Innov. 2023, 6, 110. [Google Scholar] [CrossRef]
  47. Doniger, G.M.; Beeri, M.S.; Bahar-Fuchs, A.; Gottlieb, A.; Tkachov, A.; Kenan, H.; Livny, A.; Bahat, Y.; Sharon, H.; Ben-Gal, O.; et al. Virtual Reality-based Cognitive-motor Training for Middle-aged Adults at High Alzheimer’s Disease Risk: A Randomized Controlled Trial. Alzheimer’s Dement. Transl. Res. Clin. Interv. 2018, 4, 118–129. [Google Scholar] [CrossRef] [PubMed]
  48. Healy, D.; Flynn, A.; Conlan, O.; McSharry, J.; Walsh, J. Older Adults’ Experiences and Perceptions of Immersive Virtual Reality: Systematic Review and Thematic Synthesis. JMIR Serious Games 2022, 10, e35802. [Google Scholar] [CrossRef] [PubMed]
  49. Liao, Y.-Y.; Chen, I.-H.; Lin, Y.-J.; Chen, Y.; Hsu, W.-C. Effects of Virtual Reality-Based Physical and Cognitive Training on Executive Function and Dual-Task Gait Performance in Older Adults with Mild Cognitive Impairment: A Randomized Control Trial. Front. Aging Neurosci. 2019, 11, 162. [Google Scholar] [CrossRef] [PubMed]
  50. Brassel, S.; Power, E.; Campbell, A.; Brunner, M.; Togher, L. Recommendations for the Design and Implementation of Virtual Reality for Acquired Brain Injury Rehabilitation: Systematic Review. J. Med. Internet Res. 2021, 23, e26344. [Google Scholar] [CrossRef]
  51. Faria, A.L.; Latorre, J.; Silva Cameirão, M.; Bermúdez I Badia, S.; Llorens, R. Ecologically Valid Virtual Reality-Based Technologies for Assessment and Rehabilitation of Acquired Brain Injury: A Systematic Review. Front. Psychol. 2023, 14, 1233346. [Google Scholar] [CrossRef]
  52. Gamito, P.; Oliveira, J.; Alves, C.; Santos, N.; Coelho, C.; Brito, R. Virtual Reality-Based Cognitive Stimulation to Improve Cognitive Functioning in Community Elderly: A Controlled Study. Cyberpsychology Behav. Soc. Netw. 2020, 23, 150–156. [Google Scholar] [CrossRef] [PubMed]
  53. Maeng, S.; Hong, J.P.; Kim, W.-H.; Kim, H.; Cho, S.-E.; Kang, J.M.; Na, K.-S.; Oh, S.-H.; Park, J.W.; Bae, J.N.; et al. Effects of Virtual Reality-Based Cognitive Training in the Elderly with and without Mild Cognitive Impairment. Psychiatry Investig. 2021, 18, 619–627. [Google Scholar] [CrossRef] [PubMed]
  54. Astle, D.E.; Holmes, J.; Kievit, R.; Gathercole, S.E. Annual Research Review: The Transdiagnostic Revolution in Neurodevelopmental Disorders. Child Psychol. Psychiatry 2022, 63, 397–417. [Google Scholar] [CrossRef] [PubMed]
  55. Bekele, E.; Wade, J.; Bian, D.; Fan, J.; Swanson, A.; Warren, Z.; Sarkar, N. Multimodal Adaptive Social Interaction in Virtual Environment (MASI-VR) for Children with Autism Spectrum Disorders (ASD). In Proceedings of the 2016 IEEE Virtual Reality (VR), Greenville, SC, USA, 19–23 March 2016; pp. 121–130. [Google Scholar]
  56. Ip, H.H.S.; Wong, S.W.L.; Chan, D.F.Y.; Byrne, J.; Li, C.; Yuan, V.S.N.; Lau, K.S.Y.; Wong, J.Y.W. Enhance Emotional and Social Adaptation Skills for Children with Autism Spectrum Disorder: A Virtual Reality Enabled Approach. Comput. Educ. 2018, 117, 1–15. [Google Scholar] [CrossRef]
  57. Kourtesis, P.; Kouklari, E.-C.; Roussos, P.; Mantas, V.; Papanikolaou, K.; Skaloumbakas, C.; Pehlivanidis, A. Virtual Reality Training of Social Skills in Adults with Autism Spectrum Disorder: An Examination of Acceptability, Usability, User Experience, Social Skills, and Executive Functions. Behav. Sci. 2023, 13, 336. [Google Scholar] [CrossRef] [PubMed]
  58. Romero-Ayuso, D.; Toledano-González, A.; Rodríguez-Martínez, M.D.C.; Arroyo-Castillo, P.; Triviño-Juárez, J.M.; González, P.; Ariza-Vega, P.; Del Pino González, A.; Segura-Fragoso, A. Effectiveness of Virtual Reality-Based Interventions for Children and Adolescents with ADHD: A Systematic Review and Meta-Analysis. Children 2021, 8, 70. [Google Scholar] [CrossRef] [PubMed]
  59. Bauer, A.C.M.; Andringa, G. The Potential of Immersive Virtual Reality for Cognitive Training in Elderly. Gerontology 2020, 66, 614–623. [Google Scholar] [CrossRef]
  60. Lorentz, L.; Simone, M.; Zimmermann, M.; Studer, B.; Suchan, B.; Althausen, A.; Estocinova, J.; Müller, K.; Lendt, M. Evaluation of a VR Prototype for Neuropsychological Rehabilitation of Attentional Functions. Virtual Real. 2023, 27, 187–199. [Google Scholar] [CrossRef]
  61. Halbig, A.; Latoschik, M.E. A Systematic Review of Physiological Measurements, Factors, Methods, and Applications in Virtual Reality. Front. Virtual Real. 2021, 2, 694567. [Google Scholar] [CrossRef]
  62. Wilf, M.; Korakin, A.; Bahat, Y.; Koren, O.; Galor, N.; Dagan, O.; Wright, W.G.; Friedman, J.; Plotnik, M. Using Virtual Reality-Based Neurocognitive Testing and Eye Tracking to Study Naturalistic Cognitive-Motor Performance. Neuropsychologia 2024, 194, 108744. [Google Scholar] [CrossRef] [PubMed]
  63. Saffaryazdi, N.; Wasim, S.T.; Dileep, K.; Nia, A.F.; Nanayakkara, S.; Broadbent, E.; Billinghurst, M. Using Facial Micro-Expressions in Combination With EEG and Physiological Signals for Emotion Recognition. Front. Psychol. 2022, 13, 864047. [Google Scholar] [CrossRef] [PubMed]
  64. Wascher, E.; Alyan, E.; Karthaus, M.; Getzmann, S.; Arnau, S.; Reiser, J.E. Tracking Drivers’ Minds: Continuous Evaluation of Mental Load and Cognitive Processing in a Realistic Driving Simulator Scenario by Means of the EEG. Heliyon 2023, 9, e17904. [Google Scholar] [CrossRef] [PubMed]
  65. Chiossi, F.; Gruenefeld, U.; Hou, B.J.; Newn, J.; Ou, C.; Liao, R.; Welsch, R.; Mayer, S. Understanding the Impact of the Reality-Virtuality Continuum on Visual Search Using Fixation-Related Potentials and Eye Tracking Features. Proc. ACM Hum.-Comput. Interact. 2024, 8, 1–33. [Google Scholar] [CrossRef]
  66. Seo, K.; Kim, J.K.; Oh, D.H.; Ryu, H.; Choi, H. Virtual Daily Living Test to Screen for Mild Cognitive Impairment Using Kinematic Movement Analysis. PLoS ONE 2017, 12, 1–11. [Google Scholar] [CrossRef] [PubMed]
  67. Cavedoni, S.; Cipresso, P.; Mancuso, V.; Bruni, F.; Pedroli, E. Virtual Reality for the Assessment and Rehabilitation of Neglect: Where Are We Now? A 6-Year Review Update. Virtual Real. 2022, 26, 1663–1704. [Google Scholar] [CrossRef] [PubMed]
  68. Larsen, O.F.P.; Tresselt, W.G.; Lorenz, E.A.; Holt, T.; Sandstrak, G.; Hansen, T.I.; Su, X.; Holt, A. A Method for Synchronized Use of EEG and Eye Tracking in Fully Immersive VR. Front. Hum. Neurosci. 2024, 18, 1347974. [Google Scholar] [CrossRef]
  69. Long, X.; Mayer, S.; Chiossi, F. Multimodal Detection of External and Internal Attention in Virtual Reality Using EEG and Eye Tracking Features. In Proceedings of the Mensch und Computer 2024, Karlsruhe, Germany, 1 September 2024; pp. 29–43. [Google Scholar]
  70. Chicchi Giglioli, I.A.; Bermejo Vidal, C.; Alcañiz Raya, M. A Virtual Versus an Augmented Reality Cooking Task Based-Tools: A Behavioral and Physiological Study on the Assessment of Executive Functions. Front. Psychol. 2019, 10, 2529. [Google Scholar] [CrossRef] [PubMed]
  71. Berger, A.M.; Davelaar, E.J. Frontal Alpha Oscillations and Attentional Control: A Virtual Reality Neurofeedback Study. Neuroscience 2018, 378, 189–197. [Google Scholar] [CrossRef] [PubMed]
  72. Chiossi, F.; Turgut, Y.; Welsch, R.; Mayer, S. Adapting Visual Complexity Based on Electrodermal Activity Improves Working Memory Performance in Virtual Reality. Proc. ACM Hum.-Comput. Interact. 2023, 7, 1–26. [Google Scholar] [CrossRef]
  73. Lapborisuth, P.; Koorathota, S.; Sajda, P. Pupil-Linked Arousal Modulates Network-Level EEG Signatures of Attention Reorienting during Immersive Multitasking. J. Neural Eng. 2023, 20, 046043. [Google Scholar] [CrossRef] [PubMed]
  74. Lustig, A.; Wilf, M.; Dudkiewicz, I.; Plotnik, M. Higher Cognitive Load Interferes with Head-Hand Coordination: Virtual Reality-Based Study. Sci. Rep. 2023, 13, 17632. [Google Scholar] [CrossRef]
  75. Poupard, M.; Larrue, F.; Sauzéon, H.; Tricot, A. A Systematic Review of Immersive Technologies for Education: Learning Performance, Cognitive Load and Intrinsic Motivation. Br. J. Educ. Tech. 2024. [Google Scholar] [CrossRef]
  76. Smith, A.D. Virtual Reality and Spatial Cognition: Bridging the Epistemic Gap Between Laboratory and Real-World Insights. Sci. Educ. 2024. [Google Scholar] [CrossRef]
  77. Voinescu, A.; Sui, J.; Stanton Fraser, D. Virtual Reality in Neurorehabilitation: An Umbrella Review of Meta-Analyses. J. Clin. Med. 2021, 10, 1478. [Google Scholar] [CrossRef]
  78. Salminen, M.; Jarvela, S.; Ruonala, A.; Harjunen, V.J.; Hamari, J.; Jacucci, G.; Ravaja, N. Evoking Physiological Synchrony and Empathy Using Social VR With Biofeedback. IEEE Trans. Affect. Comput. 2022, 13, 746–755. [Google Scholar] [CrossRef]
  79. Wen, D.; Yuan, J.; Li, J.; Sun, Y.; Wang, X.; Shi, R.; Wan, X.; Zhou, Y.; Song, H.; Dong, X.; et al. Design and Test of Spatial Cognitive Training and Evaluation System Based on Virtual Reality Head-Mounted Display With EEG Recording. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 2705–2714. [Google Scholar] [CrossRef]
  80. Blackmore, K.L.; Smith, S.P.; Bailey, J.D.; Krynski, B. Integrating Biofeedback and Artificial Intelligence into eXtended Reality Training Scenarios: A Systematic Literature Review. Simul. Gaming 2024, 55, 445–478. [Google Scholar] [CrossRef]
  81. Selaskowski, B.; Wiebe, A.; Kannen, K.; Asché, L.; Pakos, J.; Philipsen, A.; Braun, N. Clinical Adoption of Virtual Reality in Mental Health Is Challenged by Lack of High-Quality Research. npj Ment. Health Res. 2024, 3, 24. [Google Scholar] [CrossRef] [PubMed]
  82. Lopez-Nava, I.H.; Munoz-Melendez, A. Wearable Inertial Sensors for Human Motion Analysis: A Review. IEEE Sens. J. 2016, 16, 7821–7834. [Google Scholar] [CrossRef]
  83. Mirzaei, M.; Kan, P.; Kaufmann, H. Multi-Modal Spatial Object Localization in Virtual Reality for Deaf and Hard-of-Hearing People. In Proceedings of the 2021 IEEE Virtual Reality and 3D User Interfaces (VR), Lisboa, Portugal, 27 March–1 April 2021; pp. 588–596. [Google Scholar]
  84. Parsons, T.D.; Carlew, A.R. Bimodal Virtual Reality Stroop for Assessing Distractor Inhibition in Autism Spectrum Disorders. J. Autism. Dev. Disord. 2016, 46, 1255–1267. [Google Scholar] [CrossRef] [PubMed]
  85. Coleman, B.; Marion, S.; Rizzo, A.; Turnbull, J.; Nolty, A. Virtual Reality Assessment of Classroom—Related Attention: An Ecologically Relevant Approach to Evaluating the Effectiveness of Working Memory Training. Front. Psychol. 2019, 10, 1851. [Google Scholar] [CrossRef]
  86. Iriarte, Y.; Diaz-Orueta, U.; Cueto, E.; Irazustabarrena, P.; Banterla, F.; Climent, G. AULA—Advanced Virtual Reality Tool for the Assessment of Attention: Normative Study in Spain. J. Atten. Disord. 2016, 20, 542–568. [Google Scholar] [CrossRef]
  87. Grübel, J.; Thrash, T.; Hölscher, C.; Schinazi, V.R. Evaluation of a Conceptual Framework for Predicting Navigation Performance in Virtual Reality. PLoS ONE 2017, 12, e0184682. [Google Scholar] [CrossRef]
  88. Pflueger, M.O.; Mager, R.; Graf, M.; Stieglitz, R.-D. Encoding of Everyday Objects in Older Adults: Episodic Memory Assessment in Virtual Reality. Front. Aging Neurosci. 2023, 15, 1100057. [Google Scholar] [CrossRef] [PubMed]
  89. Cogné, M.; Auriacombe, S.; Vasa, L.; Tison, F.; Klinger, E.; Sauzéon, H.; Joseph, P.-A.; N’Kaoua, B. Are Visual Cues Helpful for Virtual Spatial Navigation and Spatial Memory in Patients with Mild Cognitive Impairment or Alzheimer’s Disease? Neuropsychology 2018, 32, 385–400. [Google Scholar] [CrossRef] [PubMed]
  90. Cerasuolo, M.; De Marco, S.; Nappo, R.; Simeoli, R.; Rega, A. The Potential of Virtual Reality to Improve Diagnostic Assessment by Boosting Autism Spectrum Disorder Traits: A Systematic Review. Adv. Neurodev. Disord. 2024. [Google Scholar] [CrossRef]
  91. Gramouseni, F.; Tzimourta, K.D.; Angelidis, P.; Giannakeas, N.; Tsipouras, M.G. Cognitive Assessment Based on Electroencephalography Analysis in Virtual and Augmented Reality Environments, Using Head Mounted Displays: A Systematic Review. Big Data Cogn. Comput. 2023, 7, 163. [Google Scholar] [CrossRef]
  92. Wimmer, M.; Weidinger, N.; ElSayed, N.; Müller-Putz, G.R.; Veas, E. EEG-Based Error Detection Can Challenge Human Reaction Time in a VR Navigation Task. In Proceedings of the 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Sydney, Australia, 16 October 2023; pp. 970–979. [Google Scholar]
  93. Kourtesis, P.; Collina, S.; Doumas, L.A.A.; MacPherson, S.E. Technological Competence Is a Pre-Condition for Effective Implementation of Virtual Reality Head Mounted Displays in Human Neuroscience: A Technological Review and Meta-Analysis. Front. Hum. Neurosci. 2019, 13, e00342. [Google Scholar] [CrossRef] [PubMed]
  94. Kourtesis, P.; Collina, S.; Doumas, L.A.A.; MacPherson, S.E. Validation of the Virtual Reality Neuroscience Questionnaire: Maximum Duration of Immersive Virtual Reality Sessions Without the Presence of Pertinent Adverse Symptomatology. Front. Hum. Neurosci. 2019, 13, e00417. [Google Scholar] [CrossRef] [PubMed]
  95. Mittelstaedt, J.M.; Wacker, J.; Stelling, D. VR Aftereffect and the Relation of Cybersickness and Cognitive Performance. Virtual Real. 2019, 23, 143–154. [Google Scholar] [CrossRef]
  96. Kourtesis, P.; Linnell, J.; Amir, R.; Argelaguet, F.; MacPherson, S.E. Cybersickness in Virtual Reality Questionnaire (CSQ-VR): A Validation and Comparison against SSQ and VRSQ. Virtual Worlds 2023, 2, 16–35. [Google Scholar] [CrossRef]
  97. Papaefthymiou, S.; Giannakopoulos, A.; Roussos, P.; Kourtesis, P. Mitigating Cybersickness in Virtual Reality: Impact of Eye–Hand Coordination Tasks, Immersion, and Gaming Skills. Virtual Worlds 2024, 3, 506–535. [Google Scholar] [CrossRef]
  98. Correia, P.H.B. Adaptive Virtual Reality Solutions: A Literature Review. In Proceedings of the Symposium on Virtual and Augmented Reality, Manaus, Brazil, 30 September 2024; pp. 1–10. [Google Scholar]
  99. Parsons, T.D.; Gaggioli, A.; Riva, G. Extended Reality for the Clinical, Affective, and Social Neurosciences. Brain Sci. 2020, 10, 922. [Google Scholar] [CrossRef]
  100. Jonson, M.; Avramescu, S.; Chen, D.; Alam, F. The Role of Virtual Reality in Screening, Diagnosing, and Rehabilitating Spatial Memory Deficits. Front. Hum. Neurosci. 2021, 15, 628818. [Google Scholar] [CrossRef] [PubMed]
  101. Hampstead, B.M.; Gillis, M.M.; Stringer, A.Y. Cognitive Rehabilitation of Memory for Mild Cognitive Impairment: A Methodological Review and Model for Future Research. J. Int. Neuropsychol. Soc. 2014, 20, 135–151. [Google Scholar] [CrossRef]
  102. Mondellini, M.; Arlati, S.; Pizzagalli, S.; Greci, L.; Sacco, M.; Arlati, S.; Ferrigno, G. Assessment of the Usability of an Immersive Virtual Supermarket for the Cognitive Rehabilitation of Elderly Patients: A Pilot Study on Young Adults. In Proceedings of the 2018 IEEE 6th International Conference on Serious Games and Applications for Health (SeGAH), Vienna, Italy, 16–18 May 2018; pp. 1–8. [Google Scholar]
  103. Wang, J.; Wang, W.; Hou, Z.-G. Toward Improving Engagement in Neural Rehabilitation: Attention Enhancement Based on Brain–Computer Interface and Audiovisual Feedback. IEEE Trans. Cogn. Dev. Syst. 2020, 12, 787–796. [Google Scholar] [CrossRef]
  104. Huygelier, H.; Schraepen, B.; Lafosse, C.; Vaes, N.; Schillebeeckx, F.; Michiels, K.; Note, E.; Vanden Abeele, V.; Van Ee, R.; Gillebert, C.R. An Immersive Virtual Reality Game to Train Spatial Attention Orientation after Stroke: A Feasibility Study. Appl. Neuropsychol. Adult 2022, 29, 915–935. [Google Scholar] [CrossRef] [PubMed]
  105. Moulaei, K.; Sharifi, H.; Bahaadinbeigy, K.; Dinari, F. Efficacy of Virtual Reality-Based Training Programs and Games on the Improvement of Cognitive Disorders in Patients: A Systematic Review and Meta-Analysis. BMC Psychiatry 2024, 24, 116. [Google Scholar] [CrossRef]
  106. Araiza-Alba, P.; Keane, T.; Chen, W.S.; Kaufman, J. Immersive Virtual Reality as a Tool to Learn Problem-Solving Skills. Comput. Educ. 2021, 164, 104121. [Google Scholar] [CrossRef]
  107. Ou, Y.-K.; Wang, Y.-L.; Chang, H.-C.; Yen, S.-Y.; Zheng, Y.-H.; Lee, B.-O. Development of Virtual Reality Rehabilitation Games for Children with Attention-Deficit Hyperactivity Disorder. J. Ambient Intell. Hum. Comput. 2020, 11, 5713–5720. [Google Scholar] [CrossRef]
  108. Ren, X.; Wu, Q.; Cui, N.; Zhao, J.; Bi, H.-Y. Effectiveness of Digital Game-Based Trainings in Children with Neurodevelopmental Disorders: A Meta-Analysis. Res. Dev. Disabil. 2023, 133, 104418. [Google Scholar] [CrossRef]
  109. Buele, J.; Varela-Aldás, J.L.; Palacios-Navarro, G. Virtual Reality Applications Based on Instrumental Activities of Daily Living (iADLs) for Cognitive Intervention in Older Adults: A Systematic Review. J. NeuroEng. Rehabil. 2023, 20, 168. [Google Scholar] [CrossRef] [PubMed]
  110. Liao, Y.; Tseng, H.; Lin, Y.; Wang, C.; Hsu, W. Using Virtual Reality-Based Training to Improve Cognitive Function, Instrumental Activities of Daily Living and Neural Efficiency in Older Adults with Mild Cognitive Impairment. Eur. J. Phys. Rehabil. Med. 2020, 56, 47–57. [Google Scholar] [CrossRef] [PubMed]
  111. To’mah, V.; Du Toit, S.H.J. Potential of Virtual Reality to Meaningfully Engage Adults Living with Dementia in Care Settings: A Scoping Review. Aus. Occup. Ther. J. 2024, 71, 313–339. [Google Scholar] [CrossRef] [PubMed]
  112. Pei, S.; Chen, A.; Chen, C.; Li, F.M.; Fozzard, M.; Chi, H.-Y.; Weibel, N.; Carrington, P.; Zhang, Y. Embodied Exploration: Facilitating Remote Accessibility Assessment for Wheelchair Users with Virtual Reality. In Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, New York, NY, USA, 22 October 2023; pp. 1–17. [Google Scholar]
  113. Birckhead, B.; Khalil, C.; Liu, X.; Conovitz, S.; Rizzo, A.; Danovitch, I.; Bullock, K.; Spiegel, B. Recommendations for Methodology of Virtual Reality Clinical Trials in Health Care by an International Working Group: Iterative Study. JMIR Ment. Health 2019, 6, e11973. [Google Scholar] [CrossRef] [PubMed]
  114. Dehn, L.B.; Piefke, M.; Toepper, M.; Kohsik, A.; Rogalewski, A.; Dyck, E.; Botsch, M.; Schäbitz, W.-R. Cognitive Training in an Everyday-like Virtual Reality Enhances Visual-Spatial Memory Capacities in Stroke Survivors with Visual Field Defects. Top. Stroke Rehabil. 2020, 27, 442–452. [Google Scholar] [CrossRef] [PubMed]
  115. Zhao, Y.; Li, L.; He, X.; Yin, S.; Zhou, Y.; Marquez-Chin, C.; Yang, W.; Rao, J.; Xiang, W.; Liu, B.; et al. Psychodynamic-Based Virtual Reality Cognitive Training System with Personalized Emotional Arousal Elements for Mild Cognitive Impairment Patients. Comput. Methods Programs Biomed. 2023, 241, 107779. [Google Scholar] [CrossRef] [PubMed]
  116. Teo, W.-P.; Muthalib, M.; Yamin, S.; Hendy, A.M.; Bramstedt, K.; Kotsopoulos, E.; Perrey, S.; Ayaz, H. Does a Combination of Virtual Reality, Neuromodulation and Neuroimaging Provide a Comprehensive Platform for Neurorehabilitation?—A Narrative Review of the Literature. Front. Hum. Neurosci. 2016, 10, e00284. [Google Scholar] [CrossRef] [PubMed]
  117. Jeffay, E.; Ponsford, J.; Harnett, A.; Janzen, S.; Patsakos, E.; Douglas, J.; Kennedy, M.; Kua, A.; Teasell, R.; Welch-West, P.; et al. INCOG 2.0 Guidelines for Cognitive Rehabilitation Following Traumatic Brain Injury, Part III: Executive Functions. J. Head Trauma. Rehabil. 2023, 38, 52–64. [Google Scholar] [CrossRef]
  118. Valenza, G.; Alcaniz, M.; Alfeo, A.L.; Bianchi, M.; Carli, V.; Catrambone, V.; Cimino, M.C.; Dudnik, G.; Duggento, A.; Ferrante, M.; et al. The EXPERIENCE Project: Unveiling Extended-Personal Reality Through Automated VR Environments and Explainable Artificial Intelligence. In Proceedings of the 2023 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE), Milano, Italy, 25 October 2023; pp. 757–762. [Google Scholar]
  119. Hadley, W.; Houck, C.; Brown, L.K.; Spitalnick, J.S.; Ferrer, M.; Barker, D. Moving Beyond Role-Play: Evaluating the Use of Virtual Reality to Teach Emotion Regulation for the Prevention of Adolescent Risk Behavior Within a Randomized Pilot Trial. J. Pediatr. Psychol. 2019, 44, 425–435. [Google Scholar] [CrossRef] [PubMed]
  120. Chiossi, F.; Ou, C.; Mayer, S. Optimizing Visual Complexity for Physiologically-Adaptive VR Systems: Evaluating a Multimodal Dataset Using EDA, ECG and EEG Features. In Proceedings of the 2024 International Conference on Advanced Visual Interfaces, Arenzano, Genoa, Italy, 3 June 2024; pp. 1–9. [Google Scholar]
  121. Liu, Q.; Song, H.; Yan, M.; Ding, Y.; Wang, Y.; Chen, L.; Yin, H. Virtual Reality Technology in the Detection of Mild Cognitive Impairment: A Systematic Review and Meta-Analysis. Ageing Res. Rev. 2023, 87, 101889. [Google Scholar] [CrossRef] [PubMed]
  122. Lucas-Pérez, G.; Ramírez-Sanz, J.M.; Serrano-Mamolar, A.; Arnaiz-González, Á.; Bustillo, A. Personalising the Training Process with Adaptive Virtual Reality: A Proposed Framework, Challenges, and Opportunities. In Extended Reality; De Paolis, L.T., Arpaia, P., Sacco, M., Eds.; Lecture Notes in Computer Science; Springer Nature: Cham, Switzerland, 2024; Volume 15027, pp. 376–384. ISBN 978-3-031-71706-2. [Google Scholar]
  123. Oliveira, E.; Pereira, N.A.A.; Alves, J.; Henriques, P.R.; Rodrigues, N.F. Validating Structural Cognitive Training Using Immersive Virtual Reality. In Proceedings of the 2023 IEEE 11th International Conference on Serious Games and Applications for Health (SeGAH), Athens, Greece, 28 August 2023; pp. 1–8. [Google Scholar]
  124. Dudley, J.; Yin, L.; Garaj, V.; Kristensson, P.O. Inclusive Immersion: A Review of Efforts to Improve Accessibility in Virtual Reality, Augmented Reality and the Metaverse. Virtual Real. 2023, 27, 2989–3020. [Google Scholar] [CrossRef]
  125. Kourtesis, P.; Amir, R.; Linnell, J.; Argelaguet, F.; MacPherson, S.E. Cybersickness, Cognition, & Motor Skills: The Effects of Music, Gender, and Gaming Experience. IEEE Trans. Vis. Comput. Graph. 2023, 29, 2326–2336. [Google Scholar] [CrossRef]
  126. Kourtesis, P.; Papadopoulou, A.; Roussos, P. Cybersickness in Virtual Reality: The Role of Individual Differences, Its Effects on Cognitive Functions and Motor Skills, and Intensity Differences during and after Immersion. Virtual Worlds 2024, 3, 62–93. [Google Scholar] [CrossRef]
  127. Kourtesis, P.; Vizcay, S.; Marchal, M.; Pacchierotti, C.; Argelaguet, F. Action-Specific Perception & Performance on a Fitts’s Law Task in Virtual Reality: The Role of Haptic Feedback. IEEE Trans. Vis. Comput. Graph. 2022, 28, 3715–3726. [Google Scholar] [CrossRef]
  128. Vizcay, S.; Kourtesis, P.; Argelaguet, F.; Pacchierotti, C.; Marchal, M. Design, Evaluation and Calibration of Wearable Electrotactile Interfaces for Enhancing Contact Information in Virtual Reality. Comput. Graph. 2023, 111, 199–212. [Google Scholar] [CrossRef]
  129. Kourtesis, P.; Argelaguet, F.; Vizcay, S.; Marchal, M.; Pacchierotti, C. Electrotactile Feedback Applications for Hand and Arm Interactions: A Systematic Review, Meta-Analysis, and Future Directions. IEEE Trans. Haptics 2022, 15, 479–496. [Google Scholar] [CrossRef]
  130. Fernandes, A.S.; Feiner, S.K. Combating VR Sickness through Subtle Dynamic Field-of-View Modification. In Proceedings of the 2016 IEEE Symposium on 3D User Interfaces (3DUI), Greenville, SC, USA, 19–20 March 2016; pp. 201–210. [Google Scholar]
  131. Kourtesis, P.; MacPherson, S.E. How Immersive Virtual Reality Methods May Meet the Criteria of the National Academy of Neuropsychology and American Academy of Clinical Neuropsychology: A Software Review of the Virtual Reality Everyday Assessment Lab (VR-EAL). Comput. Hum. Behav. Rep. 2021, 4, 100151. [Google Scholar] [CrossRef]
Table 1. Multimodal systems in XR cognitive applications.

Modality: GSR (Galvanic Skin Response)
Description: Measures the skin’s electrical conductivity, which changes with levels of physiological arousal. It is a direct indicator of emotional states such as stress, excitement, or calmness.
Key applications in XR: Used to track and analyze emotional responses during immersive experiences, such as stress levels during virtual simulations or training exercises.

Modality: EEG (Electroencephalography)
Description: Records the brain’s electrical activity using non-invasive sensors placed on the scalp. It provides real-time data on neural processes related to attention, cognitive workload, and emotional regulation.
Key applications in XR: Applied in monitoring cognitive load, attention, and engagement levels, especially during tasks requiring high mental effort, such as virtual learning environments or problem-solving scenarios.

Modality: ET (Eye Tracking)
Description: Monitors and records eye movements, including where and how long a person focuses on specific elements. It helps understand visual attention and perception in XR environments.
Key applications in XR: Used for evaluating user attention, navigation patterns, and visual processing. Commonly implemented in user interface testing, training simulations, and studies on how users interact with complex visual scenes.

Modality: Hand Tracking
Description: Detects and interprets hand movements and gestures, allowing for natural and intuitive interaction with virtual objects without the need for handheld controllers.
Key applications in XR: Enables realistic manipulation of virtual objects, essential for training simulations, virtual prototyping, and enhancing user immersion through gesture-based controls.

Modality: Body Tracking
Description: Captures full-body movements and postures, providing comprehensive data on physical behavior and motor coordination. It is crucial for assessing how users move and interact within the virtual space.
Key applications in XR: Utilized in applications that require accurate assessment of motor skills, spatial awareness, or physical training. It is particularly valuable in rehabilitation, sports training, and VR experiences that simulate physical activities.