Article

Effects of Action Observation Plus Motor Imagery Administered by Immersive Virtual Reality on Hand Dexterity in Healthy Subjects

Paola Adamo, Gianluca Longhi, Federico Temporiti, Giorgia Marino, Emilia Scalona, Maddalena Fabbri-Destro, Pietro Avanzini and Roberto Gatti

1 Physiotherapy Unit, IRCCS Humanitas Research Hospital, Via Manzoni 56, 20089 Rozzano, Milan, Italy
2 Department of Biomedical Sciences, Humanitas University, Via Rita Levi Montalcini 4, 20072 Pieve Emanuele, Milan, Italy
3 Dipartimento di Scienze Medico Chirurgiche, Scienze Radiologiche e Sanità Pubblica (DSMC), Università Degli Studi di Brescia, Viale Europa 11, 25123 Brescia, Italy
4 Consiglio Nazionale Delle Ricerche, Istituto di Neuroscienze, Via Volturno 39-E, 43125 Parma, Italy
* Author to whom correspondence should be addressed.
Bioengineering 2024, 11(4), 398; https://doi.org/10.3390/bioengineering11040398
Submission received: 30 January 2024 / Revised: 3 April 2024 / Accepted: 16 April 2024 / Published: 19 April 2024
(This article belongs to the Special Issue Bioengineering of the Motor System)

Abstract

Action observation and motor imagery (AOMI) are commonly delivered through a laptop screen. Immersive virtual reality (VR) may enhance the observer’s embodiment, a factor that may boost AOMI effects. This study aimed to investigate the effects on manual dexterity of AOMI delivered through immersive VR compared to AOMI administered through a laptop. To evaluate whether VR can enhance the effects of AOMI, forty-five young volunteers were enrolled and randomly assigned to the VR-AOMI group, which underwent AOMI through immersive VR; the AOMI group, which underwent AOMI through a laptop screen; or the control group, which observed landscape video clips. All participants underwent a 5-day treatment consisting of 12 min per day. We investigated between- and within-group differences after treatment in functional manual dexterity tasks using the Purdue Pegboard Test (PPT), which included right-hand (R), left-hand (L), both-hands (B), R + L + B, and assembly tasks. Additionally, we analyzed kinematic parameters, including total and sub-phase duration, peak and mean velocity, and normalized jerk, during the Nine-Hole Peg Test to examine whether changes in functional scores were accompanied by specific kinematic patterns. Participants were assessed at baseline (T0), after the first training session (T1), and at the end of training (T2). Significant time-by-group interaction and time effects were found for the PPT, with both the VR-AOMI and AOMI groups improving by the end of training. Larger PPT-L improvements were found in the VR-AOMI group (d: 0.84, CI95: 0.09–1.58) than in the AOMI group from T0 to T1. Immersive VR used to deliver AOMI sped up hand dexterity improvements.

Graphical Abstract

1. Introduction

Cognitive facilitation approaches, including action observation (AO) and motor imagery (MI), have recently been adopted to promote motor learning in athletes and in subjects with motor impairments [1,2].
AO entails observing motor tasks through video clips, typically delivered via a laptop screen and depicting motor acts executed by age- and gender-matched subjects [1,3,4]. This stimulation recruits the mirror neuron system (MNS), a frontoparietal network active both during the execution and the observation of motor acts. The MNS plays a key role in understanding actions performed by others [5] and is involved in building motor memories related to the observed tasks [6]. The neural recruitment driven by AO depends on the features of the observed stimuli, such as the observer’s perspective or the transitive or intransitive nature of the action. For example, a first-person perspective induces higher MNS activation during the observation of upper limb transitive actions than a third-person perspective [7,8]. In addition, the impact of AO on motor learning is larger when the observed actions are accompanied by action-related sounds [9].
MI consists of a cognitive process in which subjects imagine themselves executing motor tasks without performing them [10,11]. MI performed immediately after AO has been reported to further enhance motor learning when compared to the execution of AO alone [11]. The rationale behind this finding lies in the overlap between the neural substrates of AO and MI, both of which share large territories over frontoparietal circuits [10]. In fact, the association of AO and MI (AOMI) has been adopted to enhance the performance of sport-related gestures [12]. Moreover, AOMI-related benefits have been demonstrated in terms of motor re-learning in patients with motor impairments [13] when compared to the administration of AO or MI alone.
When considering treatments integrating action observation (e.g., AOMI), an additional resource is virtual reality (VR), a computer-generated simulation of a three-dimensional image or environment with which users can interact in a seemingly real or physical manner. This interaction is facilitated through specialized devices such as headsets with a built-in screen or controllers fitted with sensors [14,15]. VR technologies are considered “immersive” when the subject is completely surrounded by the virtual environment and the visual information changes according to the movements of the user’s head [16]. Immersive VR offers multiple advantages. On one hand, it creates the illusion of being in a virtual place, and the first-person perspective generates the illusion that the virtual body is the observer’s own [17]. The observed upper limbs are coherent with the real limbs of the subject, enhancing the observer’s embodiment [18], sense of agency, self-attribution of the virtual body [19,20], and overall engagement during the treatment [21]. On the other hand, neuroimaging studies have shown that action observation presented in 3D evokes a greater sensorimotor response than 2D stimuli [22]. All these advantages have induced clinical researchers to adopt immersive VR in the field of motor rehabilitation, demonstrating its efficacy in favoring upper limb recovery in patients with neurological disorders [23,24]. In this scenario, previous studies have suggested that implicit motor learning, defined as the acquisition of new motor skills without awareness of the learning process, may be enhanced when subjects are embodied in immersive virtual environments and adopt a first-person perspective [25]. Embodiment with the virtual body might increase AO effects, since the mirror mechanism has been reported to be modulated not only by the observed movement, but also by action meaning and environmental, contextual, and emotional factors [26,27,28].
Few studies have described the link among AO, VR, and MI in terms of motor learning. Choi et al. [29] described higher motor imagery abilities following hand movements observed in immersive VR. Moreover, the administration of AO through immersive VR to patients in the subacute phase after a stroke has shown improvements in upper limb motor function compared to upper limb motor rehabilitation performed without VR [30]. It is worth noting that the aforementioned cognitive approaches have frequently been associated with exercises aimed at improving hand dexterity [31,32]. Manual dexterity is the ability to execute coordinated hand and finger movements and derives from the integration of hand biomechanics with sensorimotor and cognitive processes [32]. When considering hand function, improvements in manual dexterity after training represent an index of enhanced motor control during hand and finger movements [33]. However, although the literature suggests that embodiment can be enhanced through immersive VR systems, no studies have investigated the effect of AOMI training administered through immersive VR on manual dexterity compared to conventional AOMI training.
Therefore, the aim of this study was to investigate the effects on manual dexterity of AOMI delivered through immersive VR compared to conventional AOMI delivered via a laptop screen in healthy subjects. The study hypothesis was that immersive VR might boost the effects of AOMI on motor learning in healthy subjects.
This modality of administration may increase motor resonance, since AOMI is modulated by contextual and environmental factors associated with the observed movement [26,28]. The novelty of the study lies in leveraging the embodiment and first-person perspective induced by delivering action observation and motor imagery through immersive VR, rather than via a laptop screen, to induce motor learning in terms of manual dexterity changes.
To measure the extent of and changes in manual dexterity, we identified specific functional tests that were administered both before and after a week of daily training. However, motor learning may manifest not only as an improvement in a functional test, but also as a higher speed at which this improvement occurs [34]. For this reason, the functional battery was also administered after the first training session. Additionally, we investigated movement kinematics during manual dexterity tasks to examine whether changes in functional scores were accompanied by specific kinematic patterns.
We provide an outline of the manuscript to guide readers through its content. In the Introduction, we highlight the rationale behind why VR may boost AOMI effects on motor learning. Subsequently, in the Materials and Methods section, we describe the study design, the sample included in the study, the characteristics of the treatments administered, and the outcome measures chosen to assess manual dexterity. The Results section presents the effects of the different treatments, whose explanations, interpretations, and implications are addressed in the Discussion and Conclusions.

2. Materials and Methods

2.1. Participants

The sample size was estimated based on previous reports on the Purdue Pegboard Test ([35], see below). It was estimated that, considering an alpha error of 5%, a minimum of 15 participants per group would be required to provide 80% power to detect a Cohen’s d = 1.0 (large effect size) between the VR-AOMI and AOMI groups at T2 [36].
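As an illustrative aid only (not the authors’ original computation), this type of a priori calculation can be reproduced with standard power-analysis tools; the exact number returned depends on the test and sidedness assumed, which are not detailed above. A minimal Python sketch using statsmodels:

```python
# Minimal a priori power-analysis sketch for an independent-samples comparison
# (illustrative only; the sidedness of the test is an assumption).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=1.0,         # hypothesized Cohen's d (large effect)
    alpha=0.05,              # type I error rate
    power=0.80,              # desired statistical power
    alternative="two-sided"  # a one-sided test lowers the required n
)
print(f"Participants required per group: {n_per_group:.1f}")
```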
Forty-five healthy subjects (15 females, 30 males; mean age 23.4 ± 2.68 years) were enrolled; their characteristics are reported in Table 1. Inclusion criteria were: (1) age between 20 and 35 years, and (2) right-handedness according to the Edinburgh Handedness Inventory [37]. Conversely, exclusion criteria were the presence of upper limb sensorimotor disorders or recent traumatic injuries, the regular practice of motor activities or sports involving remarkable manual skills (e.g., playing instruments, juggling), a history of epileptic seizures, and visual impairments that could not be corrected with lenses. All participants provided written informed consent, and the study protocol was approved by the Ethical Committee of the Humanitas Clinical and Research Center (approval number: VR-AOT-GR-2019).

2.2. Intervention

The action stimulus depicted a humanoid avatar playing a pianola with the left hand from a first-person perspective (Figure 1). The action performed by the avatar replicated the kinematics recorded from a healthy subject [38] by a motion capture system (Awinda, Xsens, Enschede, The Netherlands) combined with industrial gloves to track hand and finger movements (Manus Prime II, Xsens, Enschede, The Netherlands). Given the immersive nature of the stimuli, VR-AOMI participants wearing the headset could explore the surrounding space by moving their head and gaze accordingly.
Both VR-AOMI and AOMI participants were asked to carefully observe the action for 3 min, accompanied by an audio track consistent with the pressing of the piano keys. After the observation, they had to imagine themselves performing the action (MI) for one minute, avoiding any active movement. Observation and imagination were repeated three times (12 min in total) for five consecutive days, keeping the time of day of task administration constant. Finally, participants in the CTRL group observed landscapes (free of any biological motion content) in immersive VR for the same amount of time as the AOMI groups.

2.3. Study Design, Randomization, and Enrollment

The study had a three-arm, single-blind, randomized, controlled design. Participants were recruited by an independent researcher not involved in the subsequent stages of the study and were assigned to one of the three experimental groups according to a random computer-generated list. The VR-AOMI group (15 subjects) underwent AOMI through immersive presentation of the action stimuli, the AOMI group (15 subjects) observed the same stimuli via a laptop screen, and the CTRL group (15 subjects) observed landscape videos through the VR headset (Oculus II).
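The randomization software is not specified above; purely as an illustration, a computer-generated allocation list for three equal groups could be produced as follows (the seed and printed output are arbitrary):

```python
# Hypothetical generation of a random allocation list for 45 participants
# split evenly across the three study arms (illustrative sketch only).
import random

random.seed(42)  # arbitrary seed, fixed so the example list is reproducible
allocation = ["VR-AOMI"] * 15 + ["AOMI"] * 15 + ["CTRL"] * 15
random.shuffle(allocation)
print(allocation[:5])  # groups assigned to the first five enrolled participants
```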

2.4. Functional and Kinematic Assessment

The kinesthetic and visual imagery questionnaire (KVIQ) was administered to participants at baseline to assess their motor imagery abilities [39].
The main outcome of the study pertained to the participants’ hand dexterity. For this reason, all subjects underwent a functional and kinematic assessment at baseline (T0), after the first training session (T1—day 1), and at the end of training (T2—day 5). The study timeline is shown in Figure 2.
The assessment procedures were conducted by a researcher unaware of group allocation at the Motion Analysis Lab of the Humanitas Clinical Institute, Milan, Italy.
The functional assessment consisted of the Purdue Pegboard Test (PPT), including four subtests: PPT-R (right hand), PPT-L (left hand), PPT-B (both hands simultaneously), and an assembly task. Participants had 30 s for each peg-insertion task and 60 s for the assembly task [35].
The kinematic assessment included the Nine-Hole Peg Test (NHPT) [40] and the finger tapping test (FTT) [41]. During the NHPT, participants were seated on a height-adjustable chair with the pegboard positioned on a table in front of them. They were asked to grasp the pegs from a container one by one, place them into a nine-hole board, remove the pegs from the board, and replace them in the container as quickly as possible [40]. Kinematic data were recorded using an optoelectronic system (SMART-DX, BTS, Milan, Italy) equipped with eight infrared cameras and 16 reflective markers, of which three were placed on the table to define the global reference system and 13 on anatomical landmarks. The system calibration involved a 10 s static trial with four additional markers on the NHPT board.
Marker trajectories were filtered using a fourth-order low-pass Butterworth filter (cut-off frequency: 4 Hz). Subsequently, we extracted the total and single-phase durations (peg grasp, peg transfer, peg in hole, hand return), as well as the normalized jerk and the mean and peak velocity during the peg transfer and hand return phases [40]. Subjects performed two trials for both the right and the left side, and the trial with the shortest total execution time was considered for the analysis.
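The sketch below illustrates this type of processing in Python: a zero-lag fourth-order low-pass Butterworth filter (4 Hz cut-off) applied to a marker trajectory, followed by mean/peak speed and a dimensionless (normalized) jerk index for one movement phase. It is not the authors’ code; the sampling rate `fs`, the trajectory array `pos`, and the specific jerk normalization are assumptions, since several normalized-jerk formulations exist in the literature.

```python
# Illustrative kinematic post-processing sketch (assumed inputs: `pos`, an
# [n_samples, 3] marker trajectory in metres for one phase, and `fs` in Hz).
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass(pos, fs, cutoff=4.0, order=4):
    """Zero-lag fourth-order Butterworth low-pass filter along the time axis."""
    b, a = butter(order, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, pos, axis=0)

def phase_metrics(pos, fs):
    """Mean speed, peak speed, and a dimensionless jerk index for one phase."""
    vel = np.gradient(pos, 1 / fs, axis=0)                       # velocity, m/s
    speed = np.linalg.norm(vel, axis=1)
    jerk = np.gradient(np.gradient(vel, 1 / fs, axis=0), 1 / fs, axis=0)
    duration = len(pos) / fs                                     # phase duration, s
    path = np.sum(np.linalg.norm(np.diff(pos, axis=0), axis=1))  # path length, m
    # One common normalization: integrated squared jerk scaled by
    # duration^5 / path^2 (other formulations exist).
    jerk_integral = np.sum(np.sum(jerk**2, axis=1)) / fs
    norm_jerk = np.sqrt(0.5 * jerk_integral * duration**5 / path**2)
    return speed.mean(), speed.max(), norm_jerk
```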
The kinematic assessment also included the finger tapping test (FTT), performed with participants seated on a chair, forearms resting on a desk and elbows at 90° of flexion. They performed a tapping sequence (thumb, index, middle, ring, and little finger) at maximum speed for 15 s without visual feedback. The fingers were marked distally with smaller reflective markers (6 mm diameter). Movements performed in an incorrect sequence were recorded as errors. The total number of errors and the total number of movements across all fingers were analyzed [41].
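As a purely hypothetical illustration of this scoring logic (total movements and sequence errors), assuming each tap is logged as a finger index (1 = thumb to 5 = little finger):

```python
# Hypothetical FTT scoring sketch: counts total movements and taps that break
# the expected cyclic thumb-to-little sequence (illustrative only).
def score_ftt(taps):
    errors = 0
    for prev, curr in zip(taps, taps[1:]):
        expected = prev % 5 + 1   # next finger in the 1-2-3-4-5 cycle
        if curr != expected:
            errors += 1
    return len(taps), errors

# Example: one out-of-sequence tap (4 after 2) in an otherwise correct run
print(score_ftt([1, 2, 4, 5, 1, 2, 3, 4, 5]))  # -> (9, 1)
```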

2.5. Statistical Analysis

After verifying the normality assumption through the Shapiro–Wilk test, a parametric pipeline was adopted. Univariate ANOVA and the chi-square test were used to investigate between-group differences in demographic characteristics and KVIQ scores at baseline. A 3 × 3 mixed ANOVA, with time as the within-subjects factor and group as the between-subjects factor, was used to evaluate between-group differences over time in manual dexterity assessed by the functional test and kinematic analysis. Post-hoc analyses were Bonferroni-corrected to control the false positive rate. In addition, changes from baseline to post-treatment (deltas) were calculated, and the effect size (Cohen’s d) with its 95% confidence interval (CI95) was computed between deltas in the three groups and between each pair of time points within the same group, and interpreted as small (0.2 < d < 0.5), medium (0.5 < d < 0.8), large (0.8 < d < 1.3), or very large (d > 1.3). Analyses were conducted using SPSS Statistics 29.0 for macOS, and the level of statistical significance was set to α = 0.05.
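The analysis was run in SPSS; as a non-authoritative sketch of an equivalent open-source pipeline, assuming a long-format table `df` with columns `subject`, `group` (VR-AOMI/AOMI/CTRL), `time` (T0/T1/T2), and `score` for one outcome:

```python
# Illustrative mixed-ANOVA pipeline using pandas/pingouin (not the authors' SPSS syntax).
import pingouin as pg

# Shapiro-Wilk normality check of the outcome within each group
normality = pg.normality(df, dv="score", group="group")

# 3 x 3 mixed ANOVA: time (within-subjects) x group (between-subjects)
aov = pg.mixed_anova(df, dv="score", within="time", subject="subject", between="group")

# Bonferroni-corrected post-hoc comparisons
posthoc = pg.pairwise_tests(df, dv="score", within="time", between="group",
                            subject="subject", padjust="bonf")

# Cohen's d between T2-T0 deltas of two groups
wide = df.pivot_table(index=["subject", "group"], columns="time", values="score").reset_index()
wide["delta"] = wide["T2"] - wide["T0"]
d = pg.compute_effsize(wide.loc[wide["group"] == "VR-AOMI", "delta"],
                       wide.loc[wide["group"] == "AOMI", "delta"],
                       eftype="cohen")
```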

3. Results

None of the participants withdrew from the study and no between-group differences were found in terms of baseline characteristics including age, weight, height, gender, and KVIQ score, as shown in Table 1.

3.1. The Effects of AOMI Applied through VR on Manual Dexterity Assessed with the Purdue Pegboard Test

Significant time-by-group interaction and time effects were found for the PPT-R, PPT-L, PPT-R + L + B, and PPT-Assembly tasks, while a time effect alone was found for the PPT-Both task.
The graphs in Figure 3 show the PPT scores obtained across the three groups, while numeric values for within- and between-group comparisons of deltas are reported in Table 2 and Table 3, respectively.
Table 2 reports within-group differences expressed as Cohen’s d and CI95. Both VR-AOMI and AOMI groups improved from T0 to T1 and further experienced enhanced manual dexterity from T0 to T2 during PPT-R, PPT-L, PPT-Both, PPT-R + L + B, and PPT-Assembly tasks. These results suggest a major effect driven by action observation regardless of its visual display features. The CTRL group improved from T0 to T2 only for PPT-Both, PPT-R + L + B, and PPT-Assembly tasks.
Table 3 reports between-group differences expressed as Cohen’s d and CI95. Differences between the VR-AOMI and AOMI groups in terms of deltas were found exclusively for PPT-L at T1 in favor of VR-AOMI (p = 0.029, d: 0.84, CI95: 0.09–1.58), indicating that the use of virtual reality sped up the motor learning process, with a significant increase in dexterity already after the first training session (Figure 4). In addition, between-group differences in terms of deltas (T1–T0 and T2–T0) were detected in favor of both AOMI groups compared to the CTRL group in all PPT subtasks (Table 3).

3.2. The Effects of AOMI Applied through VR on Kinematic Assessment during the Nine-Hole Peg Test

Results for the NHPT with the left hand revealed a time effect for removing time (p = 0.003), peak return velocity (p = 0.009), transfer time (p = 0.040), return time (p = 0.029), and transfer velocity (p = 0.042) (Table S1).
Within-group post-hoc analyses revealed a decrease in removing time in the AOMI group at T2 compared to T0 (MD: 0.93, p = 0.041, CI95: 0.03, 1.83), while the VR-AOMI group revealed a decrease in transfer time at T2 when compared to T0 (MD: 0.38, p = 0.05, CI95: 0.76). The CTRL group revealed a decrease in return time at T2 compared to T1 (MD: 0.31, p = 0.013, CI95: 0.05, 0.57). Both the AOMI (MD: −0.11, p = 0.04, CI95: −0.19, 0.03) and control groups (MD: −0.09, p = 0.012, CI95: −0.18, −0.02) achieved higher peak velocity during the return phase at T2 compared to T1.
The NHPT performed with the right hand revealed a time effect for removing time (p = 0.003) and a time-by-group interaction for total test time (p = 0.025) and peg-in-hole time (p = 0.043). Between-group post-hoc analyses revealed a lower peg-in-hole time in the CTRL group compared to the AOMI group at T2 (MD: 0.76, p = 0.036, CI95: 0.04, 1.48). Finally, both the VR-AOMI (MD: 0.47, p = 0.027, CI95: 0.06, 0.89) and CTRL (MD: 0.46, p = 0.030, CI95: 0.05, 0.88) groups showed a decrease in removing time from T0 to T2 (Table S2).

3.3. The Effects of AOMI Applied through VR on Kinematic Assessment during the Finger Tapping Test

A time effect was found for total errors in the FTT performed with the left hand (p = 0.002). Post-hoc analyses revealed an improvement in the AOMI group (p = 0.027) at T2 compared to T0. A time effect was also found for the total number of finger movements (p < 0.001). Post-hoc analyses revealed an improvement only in the AOMI group (p = 0.003) from T0 to T1, while all groups showed an improvement (p < 0.001) at T2 compared to T0. Only the VR-AOMI group improved (p = 0.025) from T1 to T2. No improvements were found for the FTT performed with the right hand.

4. Discussion

This was the first study to investigate whether immersive VR boosts the effects of AOMI. The main finding was that AOMI delivered via immersive VR sped up improvements in manual dexterity in healthy subjects. In fact, although the AOMI groups showed similar manual dexterity abilities at the end of training, the VR-AOMI group achieved significantly greater dexterity changes than the AOMI group after the first training session, whereas the AOMI group improved manual dexterity gradually over the five training sessions. Finding effective strategies to speed up the acquisition of motor skills is fundamental for motor learning in both athletes and patients undergoing motor rehabilitation, where reducing recovery time is a key goal.
Our study applied this approach to hand dexterity, since the effects of AOMI on the upper limbs are the most investigated in the literature and are described as the most promising for motor rehabilitation [13]. The choice to investigate healthy subjects derived from the desire to avoid any discomfort in patients assigned to the VR-AOMI group, since previous studies have described cybersickness as a potential side effect of immersive virtual reality systems [42]. However, studies have reported only minimal cybersickness when head-mounted displays were used by healthy subjects and stroke patients during upper limb exercises in fully immersive VR systems [43]. Finally, the choice to combine AO and MI derives from literature data supporting the efficacy of AOMI in promoting the learning of complex motor tasks compared to the use of AO or MI alone in healthy subjects [44].
The current study demonstrates that virtual reality sped up the motor learning induced by AOMI, as shown by the PPT left-hand and assembly task results. Based on these findings, immersive virtual reality may have enhanced embodiment and amplified the 3D kinematic details of the motor task, boosting the acquisition of complex motor skills in young healthy subjects [1]. Interestingly, the largest effect in both AOMI groups was found for the left, ‘trained’ hand, consistent with previous studies demonstrating that motor learning is related to the characteristics of the observed movement [45]. Improvements in the right, untrained hand may derive from an interlimb transfer effect from the non-dominant to the dominant hand after unilateral dexterity training [46].
Nevertheless, a certain heterogeneity in the PPT-Assembly task was found at baseline, where the control group showed a higher mean score than the AOMI groups. However, this heterogeneity may be attributed to the higher variability of assembly task performance compared to unimanual tasks during PPT execution in healthy subjects [35]. The opportunity to speed up the effects of AOMI through its association with VR may depend on a greater sense of embodiment, since the motor learning process has been shown to increase when embodiment with the observed movements is higher [15]. In fact, the resonance of the MNS does not depend only on the observed actions, since previous studies have suggested that environmental and contextual factors also modulate corticospinal excitability during AO [26,28]. In particular, the inferior frontal gyrus and the ventral premotor cortex showed a significant signal increase when the context suggested the intention associated with hand actions [26]. Familiarity with and expertise in the observed action have also been shown to increase MNS resonance [47], and the embodiment induced by immersive VR may increase the perception of familiarity with the observed action [48].
Moreover, stereoscopic 3D stimuli may provide a greater number of kinematic details than two-dimensional visual stimuli, allowing for a more precise assessment and understanding of the observed actions [22,49]. Finally, it is worth noting that the sense of embodiment has also been described as influenced by the adopted perspective, since an egocentric perspective has been demonstrated to trigger the illusion of body ownership and self-attribution of the virtual body [50]. Overall, VR-AO delivered in the first-person perspective may also be considered a multisensory stimulation that exploits the simultaneous use of different sensory inputs to enhance motor learning processes [51,52]. In particular, immersive VR systems allow for the integration of realistic scenarios engaging the patient’s sensorimotor system and promoting the feeling of being in the virtual environment [53]. In this scenario, the current findings lead us to consider AO, MI, and VR as mutually beneficial. In fact, immersive VR might enhance motor imagery performance, providing a rich, immersive, and illusory experience, especially in first-person-perspective virtual scenarios [29,54].
The current study found improvements after the AO treatments in the PPT, whereas no differences between the AOMI and CTRL groups were detected in the NHPT or finger tapping test. The PPT has previously been described as more sensitive than the Nine-Hole Peg Test in detecting changes in manual dexterity, especially in healthy subjects [55]. Small improvements in the finger tapping test and bimanual tasks were found in the control group, probably as a result of a learning effect caused by repeating the tests three times in one week [56].
Although VR seems to speed up improvements in manual dexterity, specifically in the left, observed hand, changes at the end of the treatment period were the same for immersive and non-immersive AOMI. However, confirming this result in clinical practice would be particularly relevant in the rehabilitation field, where faster functional recovery is a primary goal [57].

5. Conclusions

In conclusion, the results of the study suggest that AOMI delivered through immersive VR sped up hand dexterity improvements. The study has some limitations. First, a single exercise was delivered during the treatment period, and improvements may have been greater with a progression of exercise difficulty over the treatment period. This limitation derived from the fact that the exercises recorded in virtual reality were specifically designed for the rehabilitation of post-stroke patients in the subacute phase; thus, the authors chose the task most challenging for a healthy young subject. Furthermore, visually induced motion sickness, which has been described as a possible side effect of VR immersion with symptoms including nausea, disorientation, and oculomotor discomfort [38], was not explored. However, no dropouts were reported in either the landscape or the VR-AOMI group. Finally, the study was unbalanced in terms of gender and age, as the number of males was twice that of females and the mean age was lower than the midpoint of the age range specified in the inclusion criteria. However, the proportions of males and females in the VR-AOMI, AOMI, and CTRL groups, as well as the mean ages of the three groups, did not show statistically significant differences.
Further studies are needed to confirm the current findings in clinical practice and explore the opportunity to reduce the recovery time in subjects undergoing rehabilitation programs.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/bioengineering11040398/s1. Table S1: Comparison between the VR-AOMI, AOMI, and Control groups at T0, T1, and T2 for kinematic indexes during the Nine-Hole Peg Test performed with the left hand; data are shown as mean and standard deviation; significant results are shown in bold text. Table S2: Comparison between the VR-AOMI, AOMI, and Control groups at T0 and T2 for kinematic indexes during the Nine-Hole Peg Test (NHPT) performed with the right hand; data are shown as mean and standard deviation; significant results are shown in bold text.

Author Contributions

Conceptualization, R.G. and P.A. (Paola Adamo); methodology, G.L., G.M. and P.A. (Paola Adamo); software, E.S., M.F.-D. and P.A. (Pietro Avanzini); formal analysis, G.M., P.A. (Paola Adamo) and F.T.; investigation, G.L. and P.A. (Paola Adamo); resources, E.S. and F.T.; data curation, G.L. and P.A. (Paola Adamo); writing—original draft preparation, P.A. (Paola Adamo) and G.L.; writing—review and editing, R.G. and F.T.; visualization, P.A. (Pietro Avanzini), G.M. and M.F.-D.; supervision, P.A. (Pietro Avanzini) and M.F.-D.; project administration, R.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethical Committee of the Humanitas Clinical and Research Center, Milan, Italy (protocol code: VR-AOT-GR-2019, date of approval: 2 July 2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent has been obtained from the participants to publish this paper.

Data Availability Statement

The dataset used and analyzed during the current study is available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

Nomenclature

VR: Virtual reality
AOMI: Action observation and motor imagery
CTRL: Control
MNS: Mirror neuron system
KVIQ: Kinesthetic and Visual Imagery Questionnaire
NHPT: Nine-Hole Peg Test
PPT: Purdue Pegboard Test
FTT: Finger Tapping Test
T0: Baseline assessment
T1: Assessment after the first treatment
T2: Assessment at the end of the treatment period

References

  1. Rizzolatti, G.; Fabbri-Destro, M.; Nuara, A.; Gatti, R.; Avanzini, P. The role of mirror mechanism in the recovery, maintenance, and acquisition of motor abilities. Neurosci. Biobehav. Rev. 2021, 127, 404–423. [Google Scholar] [CrossRef] [PubMed]
  2. Simonsmeier, B.A.; Andronie, M.; Buecker, S.; Frank, C. The effects of imagery interventions in sports: A meta-analysis. Int. Rev. Sport Exerc. Psychol. 2021, 14, 186–207. [Google Scholar] [CrossRef]
  3. Buccino, G. Action observation treatment: A novel tool in neurorehabilitation. Philos. Trans. R. Soc. Lond. B Biol. Sci. 2014, 369, 20130185. [Google Scholar] [CrossRef] [PubMed]
  4. Buccino, G.; Solodkin, A.; Small, S.L. Functions of the mirror neuron system: Implications for neurorehabilitation. Cogn. Behav. Neurol. 2006, 19, 55–63. [Google Scholar] [CrossRef] [PubMed]
  5. Rizzolatti, G.; Cattaneo, L.; Fabbri-Destro, M.; Rozzi, S. Cortical mechanisms underlying the organization of goal-directed actions and mirror neuron-based action understanding. Physiol. Rev. 2014, 94, 655–706. [Google Scholar] [CrossRef] [PubMed]
  6. Stefan, K.; Cohen, L.G.; Duque, J.; Mazzocchio, R.; Celnik, P.; Sawaki, L.; Ungerleider, L.; Classen, J. Formation of a motor memory by action observation. J. Neurosci. 2005, 25, 9339–9346. [Google Scholar] [CrossRef] [PubMed]
  7. Angelini, M.; Fabbri-Destro, M.; Lopomo, N.F.; Gobbo, M.; Rizzolatti, G.; Avanzini, P. Perspective-dependent reactivity of sensorimotor mu rhythm in alpha and beta ranges during action observation: An EEG study. Sci. Rep. 2018, 8, 12429. [Google Scholar] [CrossRef] [PubMed]
  8. Gatti, R.; Rocca, M.A.; Fumagalli, S.; Cattrysse, E.; Kerckhofs, E.; Falini, A.; Filippi, M. The effect of action observation/execution on mirror neuron system recruitment: An fMRI study in healthy individuals. Brain. Imaging Behav. 2017, 11, 565–576. [Google Scholar] [CrossRef] [PubMed]
  9. De Manzano, Ö.; Kuckelkorn, K.L.; Ström, K.; Ullén, F. Action-Perception Coupling and Near Transfer: Listening to Melodies after Piano Practice Triggers Sequence-Specific Representations in the Auditory-Motor Network. Cereb. Cortex. 2020, 30, 5193–5203. [Google Scholar] [CrossRef]
  10. Hardwick, R.M.; Caspers, S.; Eickhoff, S.B.; Swinnen, S.P. Neural correlates of action: Comparing meta-analyses of imagery, observation, and execution. Neurosci. Biobehav. Rev. 2018, 94, 31–44. [Google Scholar] [CrossRef]
  11. Bisio, A.; Bassolino, M.; Pozzo, T.; Wenderoth, N. Boosting Action Observation and Motor Imagery to Promote Plasticity and Learning. Neural Plast. 2018, 2018, 8625861. [Google Scholar] [CrossRef] [PubMed]
  12. Romano-Smith, S.; Wood, G.; Wright, D.J.; Wakefield, C.J. Simultaneous and alternate action observation and motor imagery combinations improve aiming performance. Psychol. Sport. Exerc. 2018, 38, 100–106. [Google Scholar] [CrossRef]
  13. Chye, S.; Valappil, A.C.; Wright, D.J.; Frank, C.; Shearer, D.A.; Tyler, C.J.; Diss, C.E.; Mian, O.S.; Tillin, N.A.; Bruton, A.M. The effects of combined action observation and motor imagery on corticospinal excitability and movement outcomes: Two meta-analyses. Neurosci. Biobehav. Rev. 2022, 143, 104911. [Google Scholar] [CrossRef]
  14. Prasertsakul, T.; Kaimuk, P.; Chinjenpradit, W.; Limroongreungrat, W.; Charoensuk, W. The effect of virtual reality-based balance training on motor learning and postural control in healthy adults: A randomized preliminary study. Biomed. Eng. Online 2018, 17, 124. [Google Scholar] [CrossRef]
  15. Pastel, S.; Petri, K.; Chen, C.H.; Wiegand Cáceres, A.M.; Stirnatis, M.; Nübel, C.; Schlotter, L.; Witte, K. Training in virtual reality enables learning of a complex sports movement. Virtual Real. 2023, 27, 523–540. [Google Scholar] [CrossRef]
  16. Fusco, A.; Tieri, G. Challenges and Perspectives for Clinical Applications of Immersive and Non-Immersive Virtual Reality. J. Clin. Med. 2022, 11, 4540. [Google Scholar] [CrossRef]
  17. Slater, M.; Pérez-Marcos, D.; Ehrsson, H.H.; Sanchez-Vives, M.V. Towards a digital body: The virtual arm illusion. Front. Hum. Neurosci. 2008, 2, 6. [Google Scholar] [CrossRef]
  18. Banakou, D.; Slater, M. Body ownership causes illusory self-attribution of speaking and influences subsequent real speaking. Proc. Natl. Acad. Sci. USA 2014, 111, 17678–17683. [Google Scholar] [CrossRef]
  19. Kilteni, K.; Normand, J.M.; Sanchez-Vives, M.V.; Slater, M. Extending body space in immersive virtual reality: A very long arm illusion. PLoS ONE 2012, 7, e40867. [Google Scholar] [CrossRef] [PubMed]
  20. Argelaguet, F.; Hoyet, L.; Trico, M.; Lecuyer, A. The role of interaction in virtual embodiment: Effects of the virtual hand representation. In Proceedings of the 2016 IEEE Virtual Reality (VR), Greenville, SC, USA, 19–23 March 2016; pp. 3–10. [Google Scholar]
  21. Nuara, A.; Fabbri-Destro, M.; Scalona, E.; Lenzi, S.E.; Rizzolatti, G.; Avanzini, P. Telerehabilitation in response to constrained physical distance: An opportunity to rethink neurorehabilitative routines. J. Neurol. 2022, 269, 627–638. [Google Scholar] [CrossRef] [PubMed]
  22. Jastorff, J.; Abdollahi, R.O.; Fasano, F.; Orban, G.A. Seeing biological actions in 3D: An fMRI study. Hum. Brain Mapp. 2016, 37, 203–219. [Google Scholar] [CrossRef] [PubMed]
  23. Laver, K.E.; Lange, B.; George, S.; Deutsch, J.E.; Saposnik, G.; Crotty, M. Virtual reality for stroke rehabilitation. Cochrane Database Syst. Rev. 2017, 11, CD008349. [Google Scholar] [CrossRef] [PubMed]
  24. Burin-Chu, S.; Baillet, H.; Leconte, P.; Lejeune, L.; Thouvarecq, R.; Benguigui, N. Effectiveness of virtual reality interventions of the upper limb in children and young adults with cerebral palsy: A systematic review with meta-analysis. Clin. Rehabil. 2024, 38, 15–33. [Google Scholar] [CrossRef] [PubMed]
  25. Dejian, L.; Dede, C.; Huang, R.; Richards, J. Implicit Learning through Embodiment in Immersive Virtual Reality. In Virtual, Augmented, and Mixed Realities in Education; Springer: Singapore, 2018; pp. 19–33. [Google Scholar]
  26. Iacoboni, M.; Molnar-Szakacs, I.; Gallese, V.; Buccino, G.; Mazziotta, J.C.; Rizzolatti, G. Grasping the intentions of others with one’s own mirror neuron system. PLoS Biol. 2005, 3, e79. [Google Scholar] [CrossRef]
  27. Liuzza, M.T.; Candidi, M.; Sforza, A.L.; Aglioti, S.M. Harm avoiders suppress motor resonance to observed immoral actions. Soc. Cogn. Affect. Neurosci. 2015, 10, 72–77. [Google Scholar] [CrossRef] [PubMed]
  28. Amoruso, L.; Urgesi, C. Contextual modulation of motor resonance during the observation of everyday actions. NeuroImage 2016, 134, 74–84. [Google Scholar] [CrossRef] [PubMed]
  29. Choi, J.W.; Kim, B.H.; Huh, S.; Jo, S. Observing Actions Through Immersive Virtual Reality Enhances Motor Imagery Training. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 1614–1622. [Google Scholar] [CrossRef]
  30. Mao, H.; Li, Y.; Tang, L.; Chen, Y.; Ni, J.; Liu, L.; Shan, C. Effects of mirror neuron system-based training on rehabilitation of stroke patients. Brain Behav. 2020, 10, e01729. [Google Scholar] [CrossRef]
  31. Tofani, M.; Santecchia, L.; Conte, A.; Berardi, A.; Galeoto, G.; Sogos, C.; Petrarca, M.; Panuccio, F.; Castelli, E. Effects of Mirror Neurons-Based Rehabilitation Techniques in Hand Injuries: A Systematic Review and Meta-Analysis. Int. J. Environ. Res. Public Health 2022, 19, 5526. [Google Scholar] [CrossRef]
  32. Agnelli, M.; Libeccio, B.; Frisoni, M.C.; Bolzoni, F.; Temporiti, F.; Gatti, R. Action observation plus motor imagery and somatosensory discrimination training are effective non-motor approaches to improve manual dexterity. J. Hand Ther. 2023, 37, 94–100. [Google Scholar] [CrossRef]
  33. Chan, T. An investigation of finger and manual dexterity. Percept. Mot. Skills. 2000, 90, 537–542. [Google Scholar] [CrossRef]
  34. Karni, A.; Meyer, G.; Rey-Hipolito, C.; Jezzard, P.; Adams, M.M.; Turner, R.; Ungerleider, L.G. The acquisition of skilled motor performance: Fast and slow experience-driven changes in primary motor cortex. Proc. Natl. Acad. Sci. USA 1998, 95, 861–868. [Google Scholar] [CrossRef]
  35. Buddenberg, L.A.; Davis, C. Test-retest reliability of the Purdue Pegboard Test. Am. J. Occup. Ther. 2000, 54, 555–558. [Google Scholar] [CrossRef]
  36. Lakens, D. Calculating and reporting effect sizes to facilitate cumulative science: A practical primer for t-tests and ANOVAs. Front. Psychol. 2013, 4, 863. [Google Scholar] [CrossRef]
  37. Oldfield, R.C. The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia 1971, 9, 97–113. [Google Scholar] [CrossRef]
  38. Scalona, E.; De Marco, D.; Bazzini, M.C.; Nuara, A.; Zilli, A.; Taglione, E.; Pasqualetti, F.; Della Polla, G.; Lopomo, N.F.; Fabbri-Destro, M.; et al. A Repertoire of Virtual-Reality, Occupational Therapy Exercises for Motor Rehabilitation Based on Action Observation. Data 2022, 7, 9. [Google Scholar] [CrossRef]
  39. Malouin, F.; Richards, C.L.; Jackson, P.L.; Lafleur, M.F.; Durand, A.; Doyon, J. The Kinesthetic and Visual Imagery Questionnaire (KVIQ) for assessing motor imagery in persons with physical disabilities: A reliability and construct validity study. J. Neurol. Phys. Ther. 2007, 31, 20–29. [Google Scholar] [CrossRef]
  40. Temporiti, F.; Mandaresu, S.; Calcagno, A. Kinematic evaluation and reliability assessment of the Nine Hole Peg Test for manual dexterity. J. Hand Ther. 2023, 36, 560–567. [Google Scholar] [CrossRef]
  41. Li, J.; Zhu, H.; Pan, Y.; Wang, H.; Cen, Z.; Yang, D.; Luo, W. Three-Dimensional Pattern Features in Finger Tapping Test for Patients with Parkinson’s disease. Proc. Annu. Int. Conf. IEEE Eng. Med. Biol Soc. 2020, 2020, 3676–3679. [Google Scholar]
  42. Huygelier, H.; Mattheus, E.; Abeele, V.V.; van Ee, R.; Gillebert, C.R. The Use of the Term Virtual Reality in Post-Stroke Rehabilitation: A Scoping Review and Commentary. Psychol. Belg. 2021, 61, 145–162. [Google Scholar] [CrossRef]
  43. Lee, S.H.; Jung, H.Y.; Yun, S.J.; Oh, B.M.; Seo, H.G. Upper Extremity Rehabilitation Using Fully Immersive Virtual Reality Games with a Head Mount Display: A Feasibility Study. PM&R 2020, 12, 257–262. [Google Scholar] [CrossRef]
  44. Romano Smith, S.; Wood, G.; Coyles, G.; Roberts, J.W.; Wakefield, C.J. The effect of action observation and motor imagery combinations on upper limb kinematics and EMG during dart-throwing. Scand. J. Med. Sci. Sports 2019, 29, 1917–1929. [Google Scholar] [CrossRef]
  45. Gatti, R.; Tettamanti, A.; Gough, P.M.; Riboldi, E.; Marinoni, L.; Buccino, G. Action observation versus motor imagery in learning a complex motor task: A short review of literature and a kinematics study. Neurosci. Lett. 2013, 540, 37–42. [Google Scholar] [CrossRef]
  46. Pereira, E.A.; Raja, K.; Gangavalli, R. Effect of training on interlimb transfer of dexterity skills in healthy adults. Am. J. Phys. Med. Rehabil. 2011, 90, 25–34. [Google Scholar] [CrossRef]
  47. Calvo-Merino, B.; Grèzes, J.; Glaser, D.E.; Passingham, R.E.; Haggard, P. Seeing or Doing? Influence of Visual and Motor Familiarity in Action Observation. Curr. Biol. 2006, 16, 1905–1910. [Google Scholar] [CrossRef]
  48. Turhan, B.; Gümüş, Z.H. A Brave New World: Virtual Reality and Augmented Reality in Systems Biology. Front. Bioinform. 2022, 2, 873478. [Google Scholar] [CrossRef]
  49. Ferri, S.; Pauwels, K.; Rizzolatti, G.; Orban, G.A. Stereoscopically Observing Manipulative Actions. Cereb. Cortex. 2016, 26, 3591–3610. [Google Scholar] [CrossRef]
  50. Petkova, V.I.; Khoshnevis, M.; Ehrsson, H.H. The perspective matters! Multisensory integration in ego-centric reference frames determines full-body ownership. Front. Psychol. 2011, 2, 35. [Google Scholar] [CrossRef]
  51. Šlosar, L.; Peskar, M.; Pišot, R.; Marusic, U. Environmental enrichment through virtual reality as multisensory stimulation to mitigate the negative effects of prolonged bed rest. Front. Aging Neurosci. 2023, 15, 1169683. [Google Scholar] [CrossRef]
  52. Tinga, A.M.; Visser-Meily, J.M.; van der Smagt, M.J.; Van der Stigchel, S.; van Ee, R.; Nijboer, T.C. Multisensory Stimulation to Improve Low- and Higher-Level Sensory Deficits after Stroke: A Systematic Review. Neuropsychol. Rev. 2016, 26, 73–91. [Google Scholar] [CrossRef]
  53. Brugada-Ramentol, V.; Bozorgzadeh, A.; Jalali, H. Enhance VR: A multisensory approach to cognitive training and monitoring. Front. Digit. Health 2022, 4, 916052. [Google Scholar] [CrossRef] [PubMed]
  54. Waltemate, T.; Gall, D.; Roth, D.; Botsch, M.; Latoschik, M.E. The impact of avatar personalization and immersion on virtual body ownership, presence, and emotional response. IEEE Trans. Vis. Comput. Graph. 2018, 24, 1643–1652. [Google Scholar] [CrossRef] [PubMed]
  55. Proud, E.L.; Miller, K.J.; Bilney, B.; Morris, M.E.; McGinley, J.L. Construct validity of the 9-Hole Peg Test and Purdue Pegboard Test in people with mild to moderately severe Parkinson’s disease. Physiotherapy 2020, 107, 202–208. [Google Scholar] [CrossRef] [PubMed]
  56. Tseng, Y.C.; Chang, K.Y.; Liu, P.L.; Chang, C.C. Applying the purdue pegboard to evaluate precision assembly performance. In Proceedings of the IEEE International Conference on Industrial Engineering and Engineering Management (IEEM) 2017, Singapore, 10–13 December 2017; pp. 1179–1183. [Google Scholar]
  57. Cortes, J.C.; Goldsmith, J.; Harran, M.D.; Xu, J.; Kim, N.; Schambra, H.M.; Luft, A.R.; Celnik, P.; Krakauer, J.W.; Kitago, T. A short and distinct time window for recovery of arm motor control early after stroke revealed with a global measure of trajectory kinematics. Neurorehabilit. Neural Repair 2017, 31, 552–560. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Study participant undergoing the VR-AOMI through the Oculus headset (VR-AOMI), AOMI through a laptop screen (AOMI), and landscape observation through VR (CTRL).
Figure 2. Study timeline.
Figure 3. The scores obtained in the Purdue Pegboard Test across the three groups are shown. Data are presented as means (dots and triangles) and standard deviations (vertical bars).
Figure 4. Between-group differences for deltas from T0 to T1 and from T0 to T2 for the PPT-L task. Boxes represent the range between the first and third quartiles, the middle horizontal line is the mean value, and the ends of the vertical lines, from top to bottom, are the maximum and minimum values, respectively. Symbols show differences in terms of deltas between the VR-AOMI and AOMI groups from T0 to T1 (#) and between the VR-AOMI and AOMI groups and the CTRL group from T0 to T1 and from T0 to T2 (*).
Table 1. Characteristics of study participants. Data are shown as mean ± standard deviation.

                  VR-AOMI Group (n = 15)   AOMI Group (n = 15)   CTRL Group (n = 15)   p-Value
Age (years)       24.06 ± 3.1              23.53 ± 3.24          22.53 ± 1.06          0.32
Weight (kg)       67.43 ± 10.82            66.06 ± 12.52         69.53 ± 8.71          0.67
Height (cm)       176.53 ± 6.76            172.6 ± 11.09         172.73 ± 6.96         0.39
Gender            11M/4F (27%F)            9M/6F (40%F)          10M/5F (33%F)         0.74
KVIQ              36.6 ± 7.22              40.87 ± 6.42          38.27 ± 8.13          0.35

VR-AOMI: action observation and motor imagery through immersive VR; AOMI: action observation and motor imagery through laptop; CTRL: control; KVIQ: kinesthetic and visual imagery questionnaire; M: male; F: female.
Table 2. PPT scores in the VR-AOMI (A), AOMI (B), and CTRL (C) groups at T0, T1, and T2 are expressed as mean ± standard deviation. Within-group comparisons between T1/T0 and T2/T0 are expressed as Cohen’s d with 95% confidence interval (CI95).

(A) VR-AOMI
                  T0             T1             T2             d (CI95) T1–T0      d (CI95) T2–T0
R task            15.49 ± 1.57   16.60 ± 1.14   17.17 ± 1.57   1.03 (0.39, 1.65)   1.51 (0.75, 2.52)
L task            14.58 ± 1.03   16.06 ± 0.85   16.40 ± 0.99   2.37 (1.53, 3.36)   1.84 (0.99, 2.67)
B task            12.04 ± 1.22   12.71 ± 1.09   13.35 ± 1.38   0.75 (0.16, 1.31)   1.79 (0.95, 2.61)
R + L + B task    42.62 ± 2.94   45.42 ± 2.82   46.98 ± 3.80   1.75 (0.92, 2.56)   2.07 (1.15, 2.97)
Assembly task     40.86 ± 5.02   44.55 ± 4.94   46.86 ± 5.70   1.86 (0.99, 2.69)   2.73 (1.60, 3.85)

(B) AOMI
                  T0             T1             T2             d (CI95) T1–T0      d (CI95) T2–T0
R task            15.53 ± 1.33   16.51 ± 1.53   17.13 ± 1.67   0.98 (0.34, 1.58)   1.51 (0.74, 2.42)
L task            14.49 ± 1.83   15.42 ± 1.55   16.22 ± 1.38   1.35 (0.63, 2.05)   1.70 (0.89, 2.50)
B task            12.18 ± 1.37   12.77 ± 1.50   13.13 ± 1.41   0.84 (0.24, 1.42)   1.09 (0.43, 1.72)
R + L + B task    42.33 ± 3.94   44.29 ± 3.90   46.46 ± 3.56   1.38 (0.65, 2.08)   1.91 (1.03, 2.76)
Assembly task     43.86 ± 5.91   46.15 ± 5.58   48.40 ± 4.68   0.62 (0.06, 1.17)   1.11 (0.45, 1.75)

(C) CTRL
                  T0             T1             T2             d (CI95) T1–T0      d (CI95) T2–T0
R task            16.00 ± 1.58   16.10 ± 1.57   16.22 ± 2.00   0.10 (−0.41, 0.60)  0.22 (−0.30, 0.73)
L task            14.64 ± 1.81   15.02 ± 1.43   15.11 ± 1.92   0.52 (−0.30, 1.05)  0.50 (−0.04, 1.04)
B task            12.11 ± 1.58   12.24 ± 1.40   12.80 ± 1.77   0.16 (−0.35, 0.67)  1.18 (0.54, 1.84)
R + L + B task    42.95 ± 4.73   43.15 ± 4.18   44.11 ± 5.42   0.09 (−0.42, 0.60)  0.72 (0.14, 1.28)
Assembly task     45.09 ± 5.77   46.78 ± 5.69   47.78 ± 6.53   0.90 (0.28, 1.49)   1.53 (0.76, 2.28)

Abbreviations. VR-AOMI: action observation through immersive virtual reality group; AOMI: action observation group; CTRL: control group; d: Cohen’s d; CI95: 95% confidence interval; R: right; L: left; B: both.
Table 3. Comparisons of deltas among the VR-AOMI, AOMI, and CTRL groups for the Purdue Pegboard Test. Data are expressed as Cohen’s d with 95% confidence interval (CI95). Significant results are shown in bold text.

                  ΔT1–T0 Cohen’s d (CI95)                                        ΔT2–T0 Cohen’s d (CI95)
                  VR-AOMI/AOMI        VR-AOMI/CTRL        AOMI/CTRL              VR-AOMI/AOMI        VR-AOMI/CTRL        AOMI/CTRL
R task            0.13 (−0.59–0.84)   0.90 (0.14–1.65)    0.81 (0.06–1.55)       0.08 (−0.63–0.80)   1.39 (0.58–2.19)    1.35 (0.54–2.13)
L task            0.84 (0.09–1.58)    1.64 (0.79–2.46)    0.79 (0.04–1.52)       0.09 (−0.63–0.80)   1.42 (0.60–2.21)    1.31 (0.50–2.09)
B task            0.08 (−0.63–0.80)   0.62 (−0.12–1.35)   0.60 (−0.14–1.33)      0.44 (−0.29–1.16)   0.94 (0.18–1.69)    0.36 (−0.37–1.08)
R + L + B task    0.18 (−0.54–0.90)   1.38 (0.57–2.17)    1.16 (0.38–1.93)       0.09 (−0.62–0.81)   1.71 (0.85–2.54)    1.57 (0.73–2.38)
Assembly task     0.48 (−0.26–1.20)   1.03 (0.26–1.79)    0.21 (−0.51–0.92)      0.45 (−0.28–1.17)   1.67 (0.82–2.49)    0.59 (−0.15–1.32)

Abbreviations. VR-AOMI: action observation performed through immersive virtual reality group; AOMI: action observation group; CTRL: control group; CI95: 95% confidence interval; R: right; L: left; B: both.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
