Article

Analysis of Gait Kinematics in Smart Walker-Assisted Locomotion in Immersive Virtual Reality Scenario

by Matheus Loureiro 1, Arlindo Elias 2, Fabiana Machado 3, Marcio Bezerra 1, Carla Zimerer 1, Ricardo Mello 1 and Anselmo Frizera 1,3,*

1 Graduate Program in Electrical Engineering, Federal University of Espírito Santo, Vitória 29075-910, ES, Brazil
2 Graduate Program in Physiotherapy, Estacio de Sa University, Vitória 29092-095, ES, Brazil
3 Graduate Program in Informatics, Federal University of Espírito Santo, Vitória 29075-910, ES, Brazil
* Author to whom correspondence should be addressed.
Sensors 2024, 24(17), 5534; https://doi.org/10.3390/s24175534
Submission received: 31 July 2024 / Revised: 20 August 2024 / Accepted: 22 August 2024 / Published: 27 August 2024
(This article belongs to the Collection Sensors for Gait, Human Movement Analysis, and Health Monitoring)

Abstract
The decline in neuromusculoskeletal capabilities of older adults can affect motor control, independence, and locomotion. Because the elderly population is increasing worldwide, assisting independent mobility and improving rehabilitation therapies have become priorities. The combination of rehabilitation robotic devices and virtual reality (VR) tools can be used in gait training to improve clinical outcomes, motivation, and treatment adherence. Nevertheless, VR tools may be associated with cybersickness and changes in gait kinematics. This paper analyzes the gait parameters of fourteen elderly participants across three experimental tasks: free walking (FW), smart walker-assisted gait (AW), and smart walker-assisted gait combined with VR assistance (VRAW). The kinematic parameters of both lower limbs were captured by a 3D wearable motion capture system. This research aims to assess the kinematic adaptations that occur when using a smart walker and how the integration between this robotic device and the VR tool can influence such adaptations. Additionally, cybersickness symptoms were investigated using a questionnaire for virtual rehabilitation systems after the VRAW task. The experimental data indicate significant differences between FW and both AW and VRAW. Specifically, there was an overall reduction in sagittal motion of 16%, 25%, and 38% in the hip, knee, and ankle, respectively, for both AW and VRAW compared to FW. However, no significant differences between the AW and VRAW kinematic parameters and no adverse symptoms related to VR were identified. These results indicate that VR technology can be used in walker-assisted gait rehabilitation without compromising kinematic performance, while offering potential benefits related to motivation and treatment adherence.

1. Introduction

The proportion of elderly people in the world has been increasing over the years, and the pace of aging is expected to intensify in the coming decades [1]. This population profile requires a large number of healthcare professionals and institutions to maintain well-being and quality of life, which contributes to higher health-related direct and indirect costs [2].
The World Health Organization (WHO) describes healthy aging as the development and maintenance of the capabilities that enable the elderly to live and perform their activities independently [3]. Nevertheless, the inherent decline in physical and cognitive capabilities faced by aging individuals can affect their independence levels [4]. Musculoskeletal dysfunctions are important consequences of this decline and are manifested as loss of muscle strength, balance disorders, and gait impairments [5]. The evolution of musculoskeletal problems leads to mobility impairments that affect an individual’s ability to move freely and perform daily tasks [6]. Furthermore, it increases the risk of falls, which are the most common cause of accidental death in the elderly [1,5].
Gait and balance disorders are also common in people affected by neurological conditions, such as cerebral palsy, multiple sclerosis, Parkinson’s disease, and stroke [5]. Rehabilitation plays an important role in these people’s lives by improving independence and confidence during motion tasks. The WHO emphasizes that rehabilitation should be available to the entire world population and all ages [7].
Rehabilitation typically involves repetitive tasks performed over a certain period, which can be very long depending on the subject’s condition [8]. This process is not linear and can be further complicated by the presence of pain, frustration related to not accomplishing partial goals, and failure to notice improvement in daily tasks. These factors can have a significant impact on rehabilitation dropout rates [8,9]. Recently, technologies such as robot-assisted gait training, virtual reality systems, and wearable sensors have been introduced into the biomechanical and rehabilitation fields and integrated into training to enhance motor recovery, increase motivation, and improve treatment reporting and patient adherence [10].
Virtual reality (VR) is an emerging technology that has been explored to improve such goals and develop scenarios to investigate balance issues, gait analysis, and assistive device developments [11,12]. VR technologies can be used to create personalized environments according to the patient’s needs in gait rehabilitation [11,13]. Several studies have compared the impact of VR strategies to traditional methods of balance and gait rehabilitation [11].
VR setups have been utilized not only for gait rehabilitation but also for the rehabilitation of upper and lower limbs in general [14,15]. VR scenarios can be combined with game interfaces, particularly serious games, to enhance user engagement and motivation while performing activities [15,16]. For example, in [17], a VR game was developed for hand rehabilitation of subacute stroke patients, incorporating exercises like grasping and hitting obstacles within a game setting. Additionally, in [16], volunteers using a treadmill were motivated to walk through a VR forest accompanied by a virtual dog, which aimed to increase motivation during gait rehabilitation. At the end of the walk, participants reported a reduced physical demand while immersed in the game.
Despite suggestions that VR can provide rehabilitation that is more effective and motivating than traditional rehabilitation, little evidence exists to date to support these claims [11,15,18,19]. The use of VR can cause adverse symptoms, such as nausea, disorientation, headache, and eye strain. These symptoms are collectively called cybersickness and can occur after a VR session [20]. Besides cybersickness, VR systems can also affect gait biomechanical parameters, such as reductions in walking speed, stride length, or cadence, when compared with a traditional rehabilitation system [21,22,23].
For example, the works presented in [21,22] compared overground walking in the physical world and in immersive VR. The results showed that the participants exhibited only small changes in walk distance and in stride and step length, but significant differences were found in ankle plantar flexion, cadence, and walking speed. The authors suggested that these changes could be associated with a more conservative and cautious walking pattern due to instability induced by the VR environment [22].
Immersive VR walks can also be combined with robotic assistive devices. These devices are divided into stationary systems, in which the robot is fixed to a structure such as a treadmill, and overground walking systems, in which the user can move freely with a mobile-base robot, such as an exoskeleton or smart walker (SW) [24,25,26]. There are several studies on gait rehabilitation with VR on instrumented treadmills, because stationary systems impose fewer requirements on actuation and maneuverability for VR integration than overground walking systems [27,28,29].
However, it is important to evaluate VR scenarios with overground walking systems, as this helps the user practice a more natural, unrestricted gait and can also improve the outcomes of rehabilitation programs [25,30]. SWs stand out among overground walking robots because of their potential to improve balance and provide body weight support during walking [31]. Additionally, such devices can include monitoring of biomechanical parameters, fall detection algorithms, obstacle avoidance strategies, and guidance to follow specific paths [31]. These features can be used in the rehabilitation of a wide spectrum of neurological and cognitive deficits [24,32,33,34].
Over the past decade, our research group has been conducting research in the rehabilitation robotics field, focusing on applications for SWs. We developed strategies to enhance human–robot interaction while enabling safe navigation that respects the user’s residual capabilities [35,36,37,38,39]. Furthermore, we integrated our SW with cloud computing systems to execute tasks with high computational cost [40]. Recently, our work has focused on integrating the SW into VR to develop interactive scenarios for gait training and rehabilitation, combining physical and virtual environments [32,41,42]. The results indicate positive acceptance of the VR system, as shown by questionnaire feedback, and task performance improvements, mainly in execution time, when gait tasks were performed using VR and the SW. Nevertheless, these works did not include biomechanical parameters.
The monitoring of human biomechanical parameters during gait is an important tool for diagnosing pathologies and evaluating treatments [43]. Recent advancements in the development of wearable systems based on Inertial Measurement Units (IMUs) to track three-dimensional kinematic variables have allowed these human parameters to be monitored in real-world scenarios (outside the laboratory) in several fields, including rehabilitation and sports [44]. The combination of such a wearable system with an SW or other assistive devices can be used as an additional tool to study the kinematic effects of walking with these devices [34]. However, the clinical applications of such device combinations are scarce in the literature and require investigation.
This paper investigates the gait kinematic parameters of elderly individuals across three experimental tasks: normal free walking (FW), smart walker-assisted gait (AW), and smart walker-assisted gait plus VR assistance (VRAW). The FW task was designed to capture the standard walking patterns of each individual, which were then compared with the patterns observed in the AW and VRAW tasks. The goal of this paper is to evaluate if the differences between FW and AW are the same as those between FW and VRAW and how AW and VRAW differ from each other to assess the effects of VR in gait-assisted tasks. The results can help in understanding the kinematic adaptations to the use of an SW and how the VR environment can influence such adaptations and can ultimately provide an integration strategy between different gait assistance technologies.

2. Materials and Methods

2.1. Participants

A total of 14 elderly individuals (5 men and 9 women) from the local community were enrolled in the experiment. These volunteers were recruited from an extension program at our university that offers recreational activities to the elderly population and from an external exercise guidance center within the community. The sample size calculation was based on a repeated-measures ANOVA, within factors, with one group of participants and three measurements. The level of significance and power were set to 0.05 and 0.80, respectively, and the expected effect size was moderate. The calculations were performed in G*Power (version 3.1.9.2). The participants presented a mean age of 66.3 (±3.9) years, mean weight of 71.43 (±10.8) kg, and mean height of 1.66 (±0.09) m. The inclusion criteria were an age above 60 years and sufficient physical condition to perform the walking tests and training with the smart walker device. Subjects were excluded in the case of additional comorbidities that prevented the walking tests, previous experience with VR technology, being a smoker or using thermogenic supplements or medications, or scoring below 24 points on the Mini-Mental test, the cutoff used to ensure proper cognitive function [45]. This information was obtained through an online interview conducted before receiving the participants in our laboratory. During this interview, we performed an anamnesis to collect comprehensive details about each participant’s medical history, including previous and current illnesses, disorders, living conditions, and potential risk factors. Additionally, we administered the Mini-Mental State Examination to assess cognitive function. The participants were encouraged to maintain their usual diet and medication use throughout the experiments; they were also informed about the potential risks and discomforts associated with the research procedures and were required to voluntarily sign a Consent Form. This experiment was approved by the Research Ethics Committee of the Federal University of Espírito Santo (registration number 6.294.101).
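As an illustration of the sample size procedure described above, the sketch below reproduces the within-factors repeated-measures ANOVA power calculation implemented by tools such as G*Power, written here in Python with SciPy. The effect size f, the correlation among repeated measures, and the sphericity factor are placeholder assumptions (the paper only states a moderate expected effect), so the resulting number is not necessarily the authors' value.

```python
# Within-factors repeated-measures ANOVA power calculation, mirroring the
# G*Power procedure described above. Effect size f, correlation rho, and
# sphericity eps are illustrative assumptions, not values from the paper.
from scipy import stats

def rm_anova_power(n, m=3, f=0.25, rho=0.5, alpha=0.05, eps=1.0):
    """Power for one group measured m times (n = number of participants)."""
    lam = f ** 2 * n * m / (1.0 - rho)       # noncentrality parameter
    df1 = (m - 1) * eps                      # numerator degrees of freedom
    df2 = (n - 1) * (m - 1) * eps            # denominator degrees of freedom
    f_crit = stats.f.ppf(1.0 - alpha, df1, df2)
    return stats.ncf.sf(f_crit, df1, df2, lam)

# Smallest sample size reaching 80% power under these assumed inputs
n = 3
while rm_anova_power(n) < 0.80:
    n += 1
print(n, round(rm_anova_power(n), 3))
```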

2.2. Materials

The materials used in this paper were the UFES vWalker, an SW developed by our group [32], a VR headset (Oculus Quest 2, Meta Quest, Menlo Park, CA, USA), and a 3D motion capture IMU-based system (MVN Awinda, Movella, NV, USA). These materials were divided into five subsystems: Odometry and Control (OC), Human–Robot-Environment Interaction (HREI), Human–Robot Interaction (HRI), Motion Capture (MC), and Virtual Reality Integration (VRI). The first three subsystems were related to the UFES vWalker, and the last two were associated with the VR headset and 3D motion capture system, respectively. All the materials and subsystems are shown in Figure 1.

2.3. UFES vWalker

From the subsystems of the UFES vWalker, the OC subsystem is responsible for odometry and localization using an IMU (BNO055 9-DOF BOSCH, Gerlingen, Germany) and two encoders (H1 series—US Digital, Vancouver, WA, USA). It also facilitates propulsion and stability through two motorized wheels and two caster wheels. The HREI subsystem is composed of a Light Detection and Ranging (LiDAR) system (URG-04LX Hokuyo, Osaka, Japan) used in front of the SW for obstacle identification.
The HRI subsystem is composed of another LiDAR system (RPLIDAR A3 SLAMTEC, Shanghai, China) positioned in front of the user’s legs to track the distance between the user and the device, using the leg clustering technique proposed in [37]. Another component of the HRI is a pair of triaxial force sensors (MTA400 FUTEK, Irvine, CA, USA) responsible for capturing the user’s movement intention and converting it into linear $\nu_c(t)$ and angular $\omega_c(t)$ velocity commands through the admittance controller proposed in [35]. The controller is given by Equations (1)–(4).
$$F(t) = \frac{F_{LY}(t) + F_{RY}(t)}{2}, \qquad (1)$$

$$\tau(t) = \left( \frac{F_{LY}(t) - F_{RY}(t)}{2} \right) d, \qquad (2)$$

$$\nu_c(t) = \frac{F(t) - m_\nu \, \dot{\nu}(t)}{d_\nu}, \qquad (3)$$

$$\omega_c(t) = \frac{\tau(t) - m_\omega \, \dot{\omega}(t)}{d_\omega}. \qquad (4)$$
$F_{LY}(t)$ and $F_{RY}(t)$ are the forces measured at the left and right forearm supports, respectively. These forces are used to calculate the forward force $F(t)$ and the torque $\tau(t)$ in Equations (1) and (2), where $d$ is the distance between the left and right sensors. The controller then generates the UFES vWalker linear $\nu_c(t)$ and angular $\omega_c(t)$ reference velocities in Equations (3) and (4), respectively. The constants $d_\nu$ and $d_\omega$ are damping parameters, $m_\nu$ and $m_\omega$ are virtual masses, and $\dot{\nu}(t)$ and $\dot{\omega}(t)$ represent the linear and angular accelerations.
The admittance controller is used in the UFES vWalker to promote a more natural interaction between the user and robot [46]. With this controller, it is possible to modulate the SW inertia, emulating different haptic levels to the user and controlling the device through the force sensors.
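To make the control law concrete, the following is a minimal discrete-time sketch of the admittance behavior in Equations (1)–(4), integrated with a forward-Euler step. The virtual mass, damping, and sensor spacing values are placeholders for illustration and are not the gains used on the UFES vWalker.

```python
# Minimal sketch of the admittance controller in Eqs. (1)-(4), discretized
# with a forward-Euler step. Gains, masses and the sensor spacing d are
# placeholder values for illustration, not the UFES vWalker's tuning.
from dataclasses import dataclass

@dataclass
class AdmittanceController:
    m_v: float = 10.0   # virtual mass, linear channel [kg]
    d_v: float = 15.0   # virtual damping, linear channel [N.s/m]
    m_w: float = 8.0    # virtual inertia, angular channel [kg.m^2]
    d_w: float = 12.0   # virtual damping, angular channel [N.m.s/rad]
    d: float = 0.5      # distance between the two force sensors [m]
    v: float = 0.0      # current linear velocity command [m/s]
    w: float = 0.0      # current angular velocity command [rad/s]

    def update(self, f_left_y, f_right_y, dt):
        """Map the two forward force readings into velocity commands."""
        force = (f_left_y + f_right_y) / 2.0              # Eq. (1)
        torque = ((f_left_y - f_right_y) / 2.0) * self.d  # Eq. (2)
        # First-order admittance dynamics: m*acceleration + d*velocity = input,
        # which is the same relation solved for velocity in Eqs. (3)-(4)
        self.v += dt * (force - self.d_v * self.v) / self.m_v
        self.w += dt * (torque - self.d_w * self.w) / self.m_w
        return self.v, self.w

ctrl = AdmittanceController()
v_c, w_c = ctrl.update(f_left_y=20.0, f_right_y=15.0, dt=0.01)
```

Lowering the virtual mass and damping makes the emulated device feel lighter to push, which is the haptic modulation mentioned above.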
The data from the sensors mentioned above are captured by a microcontroller (STM32) responsible for the low-level control. This information is sent to a minicomputer (OptiPlex Micro, 16 GB RAM, DELL, Austin, TX, USA) running the Robot Operating System (ROS) middleware, which is used to control the UFES vWalker.

2.4. The 3D Motion Capture System

The participant’s gait information is captured using a wireless IMU-based 3D motion capture system. This portable equipment has been previously validated for extracting gait information [44,47]. In this paper, this equipment comprised the MC subsystem. Seven IMUs were attached with straps: one on each foot, lower leg, and upper leg, and one on the pelvis, as shown in Figure 1. This information is combined with the individual’s anthropometric measurements to obtain the 3D gait information of the lower limbs in the manufacturer’s software (MVN Analyze, version 2024.1, MVN Awinda, Movella, NV, USA).
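For readers unfamiliar with IMU-based motion capture, the sketch below illustrates the core idea of deriving a joint angle from the orientation quaternions of two adjacent segments. MVN Analyze performs this internally together with calibration and anthropometric scaling; the quaternion values, segment names, and Euler sequence here are illustrative assumptions.

```python
# Conceptual sketch: a lower-limb joint angle as the relative orientation of
# two adjacent body-segment IMUs. The MVN software does this internally (with
# calibration and anthropometrics); the values below are illustrative only.
from scipy.spatial.transform import Rotation as R

def joint_angles_deg(q_proximal, q_distal, seq="ZXY"):
    """Relative segment orientation expressed as Euler angles [deg]."""
    r_prox = R.from_quat(q_proximal)   # (x, y, z, w), segment-to-global
    r_dist = R.from_quat(q_distal)
    r_joint = r_prox.inv() * r_dist    # distal segment in the proximal frame
    return r_joint.as_euler(seq, degrees=True)

# Example: thigh and shank quaternions from one sample of a recording
thigh = [0.02, 0.71, 0.01, 0.70]
shank = [0.10, 0.69, 0.05, 0.71]
angles = joint_angles_deg(thigh, shank)
print("knee joint angles (deg), first component ~ flexion/extension:", angles)
```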

2.5. Immersive VR Scenario

The VRI subsystem is responsible for the immersion of the UFES vWalker and the participants in the VR scenario, which was developed in Unity, a game engine that enables the development of games and other interactive applications. Using the VR headset, as shown in Figure 1, the participants are able to see the user interface. In this interface, the participant sees a star; each time they collect a star with the UFES vWalker, the score shown in the left corner of the screen increases and a new star appears in front of them. The stars were distributed to guide the participant in a straight line, and the black lines on the ground marked the boundaries of the physical world, considering that the experiments were conducted in a 60 m × 5 m hallway.
The user interface is also a feedback interface used to assist participants in maintaining safe navigation and enhancing their presence in the VR scenario. The feet in the middle represent the distance from the user’s leg to the device. The dotted lines are the limits that guarantee a safe distance to the SW. The minimum distance was 0.25 m and the maximum distance was 0.75 m. Each sidebar represents the downward force applied in the force sensor: green bars indicate a correct body weight discharge, while red bars indicate that the user is poorly positioned on the UFES vWalker. A downward force above 5% of the participant’s weight was considered a correct discharge. If the users do not meet the specified distance or force values, the UFES vWalker remains stopped to avoid risk to the participants.
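The safety gating described above can be summarized in a few lines of Python. The distance band (0.25 to 0.75 m) and the 5% body weight threshold come from the text; the function itself is an illustrative sketch, not the UFES vWalker firmware.

```python
# Sketch of the safety gate described above: the walker only moves when the
# user stands within the allowed leg-distance band and unloads at least 5%
# of body weight onto the forearm supports. Thresholds come from the text;
# the function and names are illustrative, not the UFES vWalker firmware.
MIN_DIST_M, MAX_DIST_M = 0.25, 0.75
MIN_WEIGHT_FRACTION = 0.05

def walker_may_move(leg_distance_m, downward_force_n, body_weight_n):
    distance_ok = MIN_DIST_M <= leg_distance_m <= MAX_DIST_M
    support_ok = downward_force_n >= MIN_WEIGHT_FRACTION * body_weight_n
    return distance_ok and support_ok

# The interface colors the side bars green/red from the same check
assert walker_may_move(0.40, 50.0, 700.0) is True
assert walker_may_move(0.90, 50.0, 700.0) is False
```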
Figure 1 on the right side also shows the digital twin of the UFES vWalker. Every movement of the UFES vWalker in the physical world is replicated in the VR scenario using odometry messages from the OC subsystem through a ROS protocol (ROS#). The motion capture data are also reproduced in the VR scenario using a Unity plugin in MVN Analyze. This information is used to monitor the user during the experiments.
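Conceptually, the digital twin update reduces to consuming the walker's odometry and extracting its planar pose. The minimal ROS sketch below shows that step; in the real system ROS# streams these messages to Unity, and the "/odom" topic name used here is an assumption.

```python
#!/usr/bin/env python
# Conceptual sketch of the odometry link that feeds the digital twin:
# subscribe to the walker's odometry and extract its planar pose (x, y, yaw).
# In the actual system ROS# streams these messages to Unity; the "/odom"
# topic name is an assumption.
import math
import rospy
from nav_msgs.msg import Odometry

def on_odom(msg):
    p = msg.pose.pose.position
    q = msg.pose.pose.orientation
    # Yaw from the orientation quaternion (planar motion assumed)
    yaw = math.atan2(2.0 * (q.w * q.z + q.x * q.y),
                     1.0 - 2.0 * (q.y * q.y + q.z * q.z))
    rospy.loginfo("twin pose: x=%.2f m  y=%.2f m  yaw=%.1f deg",
                  p.x, p.y, math.degrees(yaw))

if __name__ == "__main__":
    rospy.init_node("vwalker_twin_bridge")
    rospy.Subscriber("/odom", Odometry, on_odom)
    rospy.spin()
```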

2.6. Experimental Protocol

The participants were randomly divided into two equally sized groups. Each group participated in three days of experiments, with sessions separated by 48 h. During the experiments, the participants completed three different gait scenario tasks. The first task was the free walk (FW): a 10-Meter Walk Test (10 MWT) without the UFES vWalker and the VR headset. The second task was the UFES vWalker-Assisted Walk (AW): a straight-line walk for 90 s with the UFES vWalker. The last task was the UFES vWalker Virtual Reality-Assisted Walk (VRAW): a straight-line walk for 90 s with the UFES vWalker and the VR headset. In all the tasks, the participants used the wearable motion capture system.
The FW was designed to extract the standard walking patterns of each individual. These patterns were then compared with the AW and VRAW. It was expected that a walk assisted by SW (whether with or without VR) would differ from a free walk due to body weight support and other characteristics related to the SW. Nonetheless, the goal of this paper is to evaluate if the differences between FW and AW are the same as those between FW and VRAW and how AW and VRAW differ from each other to assess the effects of VR in gait-assisted tasks.
On the first day, the UFES vWalker, the VR headset, and the motion capture system were introduced to the participants, and they performed a familiarization test consisting of one FW trial, one AW trial, and one VRAW trial. The data from these initial tests were not used in the analysis. If any volunteers were unable to perform one of the tests due to physical limitations, they would be excluded from the research. On the second day, half of the participants performed one FW and three AW tasks, while the other half performed one FW and three VRAW tasks. On the third day, the groups swapped their tasks from day two. The goal of this group distribution was to ensure that half of the participants started the second day with AW tasks and the other half with VRAW tasks.
This experimental protocol was designed as a gait training study using the UFES vWalker to evaluate the effects of incorporating a VR headset on gait parameters and cybersickness symptoms. Based on the results obtained, future protocols could be adapted to rehabilitation settings to explore the practical applications of the UFES vWalker in gait rehabilitation contexts.

2.7. Variables

This paper investigated two groups of kinematic variables of the gait cycle across the three gait scenario tasks during a 10 MWT. The 10 m straight-line distance was chosen because it is a standard tool for measuring functional capacity in gait analysis [48]. For the AW and VRAW tasks, the 10 MWT was selected from the middle portion of the 90 s walk, during the period in which the participants covered this distance. The first group of variables comprised the overall spatiotemporal parameters of the walking cycle:
  • Stride length (meters): the mean of the distance between two consecutive heel strikes of the same foot in the 10 MWT.
  • Stride number: the number of strides in the 10 MWT.
  • Gait speed (meters per second): the mean walk velocity in the 10 MWT.
  • Cadence (steps per second): the mean number of steps per second in the 10 MWT.
  • Stance phase (seconds): the mean time in the stance phase in each gait cycle in the 10 MWT.
  • Swing phase (seconds): the mean time in the swing phase in each gait cycle in the 10 MWT.
  • Time (seconds): the time to complete the 10 MWT.
The second group of variables consisted of the lower limb joint angles of the hip, knee, and ankle, extracted at specific instants (mainly extreme points) of each joint cycle waveform in the sagittal, coronal, and transverse planes. The parameters are based on the work in [48] and are summarized in Table 1. They are also illustrated in Figure 2 on a gait cycle from a representative volunteer during FW for the hip, knee, and ankle. A 3D kinematic gait analysis relies on extensive data that can be complex to interpret visually [49]. To facilitate the evaluation of the volunteers’ gait patterns, the gait point segmentation method proposed in [48] is used, enhancing the clarity of the analysis [49].
All these variables were extracted from MVN Analyze, which exports a specific file extension (.mvnx) containing all the 3D gait data. These files were imported into MATLAB (R2024a), where the heel and toe strikes of each foot were identified to extract the variables from each gait cycle during the 10 MWT.
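The event detection step can be illustrated with a short script. The authors performed this processing in MATLAB on the .mvnx export; the Python sketch below uses a common minima-based heel strike heuristic, which is not necessarily their exact algorithm, to derive per-stride spatiotemporal parameters.

```python
# Hedged sketch of the post-processing step described above: detect heel
# strikes from the heel trajectory and derive spatiotemporal parameters.
# The authors used MATLAB on the .mvnx export; the minima-based event
# detection below is a common heuristic, not necessarily their exact method.
import numpy as np
from scipy.signal import find_peaks

def heel_strike_indices(heel_z, fs, min_stride_s=0.6):
    """Heel strikes approximated as local minima of the vertical heel position."""
    refractory = int(min_stride_s * fs)            # minimum samples between strikes
    idx, _ = find_peaks(-np.asarray(heel_z), distance=refractory)
    return idx

def spatiotemporal(heel_xy, heel_z, fs):
    """Stride length (m), stride time (s) and stride rate (strides/s) per cycle."""
    hs = heel_strike_indices(heel_z, fs)
    xy = np.asarray(heel_xy)
    stride_len = np.linalg.norm(np.diff(xy[hs], axis=0), axis=1)
    stride_time = np.diff(hs) / fs
    stride_rate = 1.0 / stride_time
    return stride_len, stride_time, stride_rate

# Usage with a 60 Hz recording (the sampling rate here is an assumption)
# lengths, times, rates = spatiotemporal(heel_xy, heel_z, fs=60.0)
```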
Before starting the VRAW task, the volunteers completed the Simulator Sickness Questionnaire (SSQ) to identify any pre-existing motion sickness symptoms associated with VR systems [50]. Volunteers who exhibited any of these symptoms before the experiments were excluded from the research. After the VRAW task, the volunteers filled out the SEQ (Suitability Evaluation Questionnaire for Virtual Rehabilitation Systems) [51]. This questionnaire is composed of fourteen questions with a 5-point Likert scale, resulting in a score range from 0 to 65. The questions evaluate the participants’ perception of usability, acceptance, and security and detect frequent VR issues (cybersickness symptoms such as nausea, eye discomfort, and disorientation) after using VR rehabilitation systems.

2.8. Statistical Analysis

The distributions of the spatiotemporal and kinematic variables were analyzed graphically using histograms and statistically with the Shapiro–Wilk test to check for significant deviations from normality. The general features of each variable were analyzed by descriptive statistics. A repeated-measures one-way ANOVA was conducted to compare each kinematic variable across the different gait tasks (FW, AW, and VRAW). In a repeated-measures design, sphericity is an important assumption that refers to the equality of the variances of the differences between conditions. Mauchly’s test was used to assess sphericity in each comparison, and if the result was significant, the Greenhouse–Geisser correction was applied to adjust the degrees of freedom and provide a more reliable F-test [52]. A p-value of less than 0.05 indicated a statistically significant difference across the three gait tasks. The Bonferroni post hoc adjustment was used to determine specific between-task differences, with a p-value of less than 0.05 indicating significance. In the event of normality violations, a robust version of the repeated-measures ANOVA and a robust post hoc test were performed, both using a significance level of p < 0.05. All the statistical analyses were carried out using the R statistical computing software (version 4.3.2) and RStudio (version 2023.06.0+421 “Mountain Hydrangea”) for Windows.
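For reference, the same workflow (normality check, repeated-measures ANOVA with sphericity correction, and Bonferroni-adjusted contrasts) can be expressed in Python with the pingouin package, although the authors used R; the data file and column names below are assumptions.

```python
# Illustration of the statistical workflow in Python with pingouin (>= 0.5.3);
# the authors used R. The long-format file and column names are assumptions.
import pandas as pd
import pingouin as pg
from scipy.stats import shapiro

# One row per participant x task, e.g. columns: subject, task, value
df = pd.read_csv("stride_length_long.csv")        # hypothetical file

# 1. Normality of each task's distribution (Shapiro-Wilk)
for task, grp in df.groupby("task"):
    print(task, shapiro(grp["value"]))

# 2. Repeated-measures ANOVA; correction=True reports the sphericity test
#    and the Greenhouse-Geisser-corrected p-value when sphericity is violated
aov = pg.rm_anova(data=df, dv="value", within="task",
                  subject="subject", correction=True, detailed=True)
print(aov)

# 3. Bonferroni-adjusted pairwise contrasts (FW vs. AW, FW vs. VRAW, AW vs. VRAW)
post = pg.pairwise_tests(data=df, dv="value", within="task",
                         subject="subject", padjust="bonf", effsize="cohen")
print(post)
```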

3. Results

3.1. Spatiotemporal Parameters

The results of the descriptive statistics (mean, standard deviation, minimum, and maximum) for the spatiotemporal parameters are shown in Table A1. The repeated-measures ANOVA revealed that all the spatiotemporal parameters differed significantly, with large effect sizes, between FW and both walker-assisted tasks (AW and VRAW) but not between AW and VRAW, as shown in Table A2.
AW and VRAW, compared to FW, showed a reduction in stride length, an increase in the number of strides, a decrease in gait speed, an increase in cadence, and an increase in stance and swing time, as well as the time to complete the 10 MWT. This information is shown in Figure 3.

3.2. Hip Joint

The descriptive statistics for the hip joint parameters are presented in Table A3. There were no significant differences between the right and left parameters, indicating that both legs exhibited similar kinematic patterns at the hip joint.
In the sagittal plane, the repeated-measures ANOVA revealed significant differences between the FW and AW, as well as between the FW and VRAW for all the sagittal hip parameters for both legs, with large effect sizes (Cohen’s d > 0.8), as shown in Table A4. The sagittal parameters were similar between the AW and VRAW. In the coronal plane, the parameters H8 and H9 of the left leg showed significant differences between the FW and AW with large effect sizes, but these variables were similar between the FW and VRAW. In the transverse plane, the H12 parameter was significantly different between the FW and both walker-assisted tasks (AW and VRAW) in both legs, with large effect sizes.
Compared to the FW parameters, the hip kinematics of AW and VRAW in the sagittal plane were characterized by increased flexion at the heel strike (H1), increased flexion at the loading response (H2), lower extension during the stance phase (H3), increased flexion at toe-off (H4), and increased maximum flexion during the swing phase (H5). Conversely, the total sagittal plane range of motion (H6) was lower in both walker-assisted gaits than in FW, as shown in Figure 4. In the transverse plane, AW and VRAW only showed a statistically significant difference in H12 of the hip, presenting greater hip external rotation in the swing phase. In the coronal plane, no significant differences were found.

3.3. Knee Joint

Table A5 shows the descriptive statistics of the knee joint parameters. Unlike the hip joint, the participants exhibited significant differences between the right and left legs for some parameters across the tasks. For FW, the differences were in K7 (mean difference = 1.18° and effect size = 0.60), K9 (mean difference = 1.27° and effect size = 0.52), and K10 (mean difference = 1.37° and effect size = 0.52). For AW, the differences were in K5 (mean difference = 2.56° and effect size = 0.52) and K6 (mean difference = 2.53° and effect size = 0.58). VRAW showed differences in the same parameters as AW, with mean differences of 2.41° (effect size = 0.55) and 2.53° (effect size = 0.59), respectively. Despite these differences, none had large effect sizes.
The results of the repeated-measures ANOVA for the sagittal and transverse parameters of the knee joint showed significant differences for both legs in almost all the parameters (except K11 in the right leg) between the FW and both walker-assisted tasks (AW and VRAW), with large effect sizes for almost all the parameters, except K3, as shown in Table A6. No differences were found between the AW and VRAW. In the coronal plane, no significant differences were observed between any gait tasks.
In summary, AW and VRAW, compared to FW, exhibited increased flexion in the sagittal plane at the heel strike (K1), increased flexion at the loading response (K2), reduced maximum extension during the stance phase (K3), reduced flexion at toe-off (K4), lower maximum flexion during the swing phase (K5), and an overall reduction in the sagittal plane excursion (K6). In the transverse plane, the knee joint exhibited lower internal rotation in the stance phase (K11), greater external rotation in the swing phase (K12), and lower total plane excursion (K10), as shown in Figure 5. No significant differences were found in the coronal plane.

3.4. Ankle Joint

For the ankle joint, the descriptive statistics parameters are shown in Table A7. Both legs exhibited similar kinematic patterns at the ankle joint, except for the A3 parameter in FW, which presented a significant difference (mean difference = 1.07° and effect size = 0.83).
Similar to the hip and knee joints, the repeated-measures ANOVA revealed significant differences between the FW and both walker-assisted tasks (AW and VRAW) for the sagittal plane parameters, as shown in Table A8. However, for the ankle, no significant differences were found in the A3 parameter in any gait tasks, and significant differences in the A4 parameter were found only between the FW and AW in the left leg. No significant differences were found between the AW and VRAW. In the coronal plane, no significant differences were observed between any gait tasks.
The AW and VRAW tasks in the sagittal plane of the ankle parameters were characterized by increased dorsiflexion at the heel strike (A1), reduced maximum plantar flexion at the loading response (A2), increased dorsiflexion at toe-off (A4), increased dorsiflexion in the swing phase (A5), and an overall reduction in the sagittal plane excursion (A6), as shown in Figure 6. The A3 parameter showed no differences across the tasks, meaning the maximum dorsiflexion in the stance phase was similar for FW, AW, and VRAW. No significant differences were found in the coronal plane.

3.5. SEQ

The score obtained from the SEQ after the VRAW day was 57.7 ± 8.4. This score indicates that the VR environment was suitable for a VR rehabilitation system. In general, all the volunteers reported being comfortable and having fun during the VRAW. From the questions designed to detect frequent issues from VR applications, none of the volunteers reported discomfort, dizziness, nausea, or eye discomfort, and only one volunteer reported being neutral about being confused or disoriented during the task. All the volunteers expressed interest in continuing to use the VR system during an eventual rehabilitation program.

4. Discussion

This paper investigated the kinematic patterns of the lower limb joints of elderly subjects during three gait tasks at self-selected speeds: FW, AW, and VRAW. The main goal of this paper was to evaluate whether the differences between FW and AW are comparable to those between FW and VRAW and to analyze how AW and VRAW differ from each other to assess the effects of VR on gait-assisted tasks.
The main results highlight that both walker-assisted gait tasks (AW and VRAW) produced significantly different kinematic patterns of joint motion compared to FW, particularly in the sagittal plane. However, the joint kinematics between AW and VRAW were similar in both legs. The results suggest that the UFES vWalker was the major factor influencing the kinematic changes, not the VR environment. The combination of overground assistive devices, especially SW, gait analysis, and VR for rehabilitation is scarce in the literature, and the results of this study may provide an initial background for future developments in this field.
During the AW and VRAW tasks, the participants took more time to complete the 10 m walk, with a reduced stride length and an increased number of steps. The slower gait pattern featured more than double the stance time and 20% more swing time compared to FW. Although the participants had their body weight partially supported by the walker structure, the need to drive the device forward may have influenced these spatiotemporal characteristics. Previous studies that investigated the biomechanical properties of walker models have also provided evidence that slower walking is a major feature of this gait assistance method [34,53,54]. However, such features are expected in clinical rehabilitation scenarios, where more controlled walking patterns are required [34].
In the sagittal plane, the hip kinematics of the AW and VRAW tasks were characterized by increased flexion at the heel strike and loading response, reduced extension at the stance phase, increased flexion at toe-off, and maximum flexion at the swing phase. The total sagittal plane range of motion was reduced in the AW and VRAW gait compared to FW. The hip joints presented greater flexion parameters in AW and VRAW. The explanation for this feature lies in the structure of the UFES vWalker itself. The forearm supports were designed to lead the user into a forward flexion position of the trunk to aid in body weight support during walking, but this position influenced the entire hip gait cycle by keeping the hip in an increased flexion position. This scenario was not changed by the VRAW task.
However, it is important to highlight that the increase in hip flexion observed with the UFES vWalker may be related to an increased percentage of weight discharge in the device and greater activation of the lower limb muscles [55]. These properties have potential applications for the rehabilitation of musculoskeletal and neurological clinical conditions that require gait training with weight-bearing assistance. In particular, the walker can be used to address early gait recovery in various inpatient and outpatient settings, including hip and knee arthroplasties, ligament reconstructions of the knee, arthroscopic procedures of lower limb joints, bone fractures, stroke, and cardiopulmonary conditions [56,57,58,59]. Some conditions, however, will require specific adjustments to the walker to ensure patient safety. For example, patients who have undergone total hip arthroplasty must not walk with excessive hip flexion during early rehabilitation to prevent hip subluxation [58].
In the transverse plane, AW and VRAW also showed greater hip external rotation during the swing phase for both legs. Future studies must consider this change in transverse plane motion when studying clinical conditions where an increase in hip external rotation might represent some risk, such as hip instability [60].
The knee kinematics between the right and left legs revealed discrepancies during FW for some frontal and transverse parameters. Previous studies have shown that these parameters are highly variable and may reflect random sample variations [49,61]. The AW and VRAW gait tasks showed increased maximum knee flexion at the swing phase of the left leg and, consequently, the overall sagittal motion of this leg was slightly higher. This finding was unexpected, but the greater range observed for the K5 and K6 data suggests that this may reflect adaptations regarding walker maneuverability. Compared to FW, the AW and VRAW knee kinematics were initially characterized by increased knee flexion at the heel contact and loading response. During the stance phase, the knee did not reach maximum extension, and toe-off occurred with reduced knee flexion. The maximum knee flexion at the swing phase was reduced during the AW and VRAW tasks, and the overall sagittal motion was about 25% smaller than FW.
The AW and VRAW tasks showed less knee motion in all the parameters of the transverse plane, meaning that the participants developed a walking pattern with less rotation between the tibia and femur. This feature may have applications in clinical conditions that require supported gait training with local stability requirements, such as ligament reconstructions, incomplete spinal cord injury, or stroke [56,57].
For the ankle joint, changes in kinematics occurred in the sagittal plane. Compared to FW, the AW and VRAW tasks exhibited increased plantar flexion at the heel strike and loading response. Maximum dorsiflexion at the stance phase was not different from FW, but toe-off occurred much earlier, with increased dorsiflexion that was sustained throughout the swing phase. The total sagittal plane excursion of the ankle joint was reduced by approximately 38%.
The overall reduction in the joint range of motion observed during the AW and VRAW tasks may be associated with the forward trunk posture and the level of body weight support. These results confirm previous studies that showed that increased body weight assistance during walking significantly influenced spatiotemporal and kinematic parameters [54,55].
Additionally, future studies should address the influence of the forearm support height on the hip joint flexion posture and, consequently, on the overall 3D joint kinematics and spatiotemporal parameters. In this paper, this height was adjusted according to each volunteer’s height. Another consideration for future studies is to adjust the admittance controller parameters for each volunteer, as emulating different haptic sensations may influence lower limb kinematics in the attempt to move an SW [34]. Conversely, if the objective of such studies is not precise kinematic measurement but gait training, admittance control may be used as an accessory to lower limb strengthening protocols because it can simulate dynamic behaviors according to the VR environment [32,35].
The VR scenario developed in this research consisted of a simple star-gathering game that encouraged the participants to walk in a straight line. The findings in this paper indicate that VR did not influence the walking kinematics parameters compared to a similar task performed without VR. Our results differ from the findings of [11,21,22,23], which reported changes in joint kinematics with VR gait. This discrepancy could be attributed to differences in VR applications across studies, which investigated overground free walking and treadmill gait. Furthermore, the posture adopted by the participants in the walker-assisted tasks was more stable, allowing for limited variations in the joint kinematics and spatiotemporal parameters. Future studies in the field of VR have enormous potential to develop gait rehabilitation strategies that mimic real-world challenges in clinical scenarios, which are typically found in the elderly population, potentially contributing to treatment quality and adherence.
During the experiments, some volunteers spontaneously shared their experiences while performing the tasks. The participants expressed enthusiasm about using the UFES vWalker, both with and without the VR headset. This excitement likely stemmed from their unfamiliarity with robotics and virtual reality technology. The level of enthusiasm appeared to increase when using the VR headset, as the participants were positively surprised by the virtual elements, such as the stars they collected and the feedback interface. Some volunteers were even surprised by the distance they had covered once they removed the VR headset. A common suggestion from the participants was to improve the comfort of the UFES vWalker’s forearm support, which we plan to address in the current iteration of our system.
Finally, the mean scores obtained from the SEQ indicate that the VR game was comfortable, and the participants did not report adverse effects. In this way, our results provide evidence that the data from the VRAW group were not influenced by cybersickness related to immersive VR environments. The factors that may mitigate adverse effects in the VRAW include the support offered by the UFES vWalker during the tasks, providing more stability and security for the volunteers during the walk. Additionally, the synchronized movement between the VR and the physical world, where every displacement by the volunteers and the UFES vWalker was reproduced in VR, likely contributed to these findings. This synchronization does not happen in VR treadmills, for example. Furthermore, the straight-line walk reduced the head’s range of motion, which may help to avoid cybersickness outcomes.
The straight-line walk was chosen to establish a baseline for comparison between the FW, AW, and VRAW tasks. With this path, we could more confidently attribute any observed difference in gait kinematics to the presence of VR itself rather than to specific features of a complex virtual environment, such as turns or obstacles. Additionally, given that this was our elderly participants’ first contact with both a robotic walker and a VR headset, a simple VR task helped minimize potential safety risks. Future research should explore different walking paths, such as incorporating curves, to assess how variations in head movement or motion might affect cybersickness symptoms.

5. Conclusions

This paper investigated the effects on gait kinematics of fourteen elderly participants in assisted locomotion within a VR scenario across three experimental tasks: normal free walking (FW), smart walker-assisted gait (AW), and smart walker-assisted gait with VR assistance (VRAW). The FW task was designed to capture each individual’s standard walking patterns, which were then compared with the patterns observed in the AW and VRAW tasks to assess if the changes from FW to AW were similar to those from FW to VRAW. After the VRAW task, the volunteers responded to the SEQ to evaluate the VR rehabilitation environment.
The main results highlight that both walker-assisted gait tasks (AW and VRAW) produced significantly different kinematic patterns of the spatiotemporal parameters and hip, knee, and ankle joint motions compared to FW, particularly in the sagittal plane. However, the joint kinematics between the AW and VRAW were similar in both legs, suggesting that VR did not influence SW gait kinematics. Both AW and VRAW were characterized by a reduction in the range of motion of the hip, knee, and ankle in the sagittal plane; a reduced stride length and gait speed; and an increased stride number, cadence, and time to complete the 10 MWT. The posture adopted by the participants in our SW model was very stable, allowing for very few variations in joint kinematics and spatiotemporal parameters. The scores from the SEQ suggest high usability with almost no side effects.
The similar gait parameters of AW and VRAW, combined with the high acceptance of the SEQ, indicate that VR environments could be a promising tool during rehabilitation programs with SW. The SW’s advanced features could be combined with the motivation and engagement of VR applications for the development of tasks with different cognitive and physical demands.
However, implementing the UFES vWalker in clinical settings presents challenges, including the need for prior training for both healthcare professionals and patients. Additionally, clinical facilities may require adaptations to accommodate the UFES vWalker and ensure its proper functioning.
Our research group is already developing some tools to overcome these challenges. Healthcare professionals will receive training with our research group to explain how to operate the UFES vWalker and select and customize rehabilitation scenarios in the VR headset. For patients, we are developing explainable user interfaces designed to provide instructions on safe navigation, moving forward, making turns, and avoiding both real and virtual obstacles in the VR headset. We are also focused on developing cloud-based systems for VR. Because VR may require significant computational power to render virtual obstacles and virtualize physical environments, cloud-based VR systems could facilitate remote data processing, enabling clinical facilities to only need the UFES vWalker and the VR headset while the cloud handles all data processing and provides remote support.
These tools will enable future research to conduct long-term clinical trials and develop interactive VR environments tailored to the specific needs of patients undergoing gait rehabilitation. This includes adjusting the height of the UFES vWalker according to individual patient requirements. By customizing both the walker’s design and the VR environment, we can create more effective and personalized gait rehabilitation protocols.

Author Contributions

M.L.: Conceptualization, Methodology, Validation, Formal Analysis, Investigation, and Writing—Original Draft. A.E.: Formal Analysis, Investigation, and Writing—Original Draft. F.M.: Conceptualization, Formal Analysis, Investigation, and Writing—Original Draft. M.B.: Conceptualization, Methodology, and Validation. C.Z.: Conceptualization, Methodology, Formal Analysis, and Investigation. R.M.: Writing—Review and Editing, Supervision, Conceptualization, Investigation, and Funding Acquisition. A.F.: Conceptualization, Methodology, Formal Analysis, Investigation, Resources, Writing—Review and Editing, Supervision, Project Administration, and Funding Acquisition. All authors have read and agreed to the published version of the manuscript.

Funding

Financial support was received from Fundação de Amparo à Pesquisa e Inovação do Espírito Santo—FAPES (2022-D48XB and 2022-C5K3H and 2023-XDHHW), the Brazilian National Council for Scientific and Technological Development—CNPq (308155/2023-8 and 403753/2021-0), and the Financiadora de Estudos e Projetos—FINEP (2784/20).

Institutional Review Board Statement

This study was conducted following the Declaration of Helsinki and received approval from the Research Ethics Committee of the Federal University of Espírito Santo, with registration number 6.294.101.

Informed Consent Statement

Informed consent was obtained from all the subjects involved in this study.

Data Availability Statement

The original contributions presented in this study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
10 MWT: 10-Meter Walk Test
AW: Smart Walker-assisted Gait
FW: Free Walking
HREI: Human–Robot-Environment Interaction
HRI: Human–Robot Interaction
LiDAR: Light Detection and Ranging
MC: Motion Capture
OC: Odometry and Control
ROS: Robot Operating System
SEQ: Suitability Evaluation Questionnaire for Virtual Rehabilitation Systems
SSQ: Simulator Sickness Questionnaire
SW: Smart Walker
VRAW: Smart Walker-assisted Gait Plus VR Assistance
VR: Virtual Reality
VRI: Virtual Reality Integration
WHO: World Health Organization

Appendix A

Table A1. Descriptive statistics include the mean, standard deviation (SD), minimum (min), and maximum (max) values of the spatiotemporal parameters by each task.
Parameter             | FW Mean (SD) | Min   | Max   | AW Mean (SD) | Min   | Max   | VRAW Mean (SD) | Min   | Max
Stride Length (m)     | 1.32 (0.1)   | 1.19  | 1.54  | 0.64 (0.11)  | 0.44  | 0.80  | 0.67 (0.13)    | 0.48  | 0.85
Stride Number (steps) | 7.29 (0.8)   | 6     | 9     | 16.48 (3.03) | 12    | 23    | 15.69 (3.93)   | 11    | 24
Gait Speed (m/s)      | 1.14 (0.11)  | 0.90  | 1.3   | 0.32 (0.07)  | 0.17  | 0.38  | 0.33 (0.05)    | 0.19  | 0.39
Cadence (steps/s)     | 1.12 (0.09)  | 0.99  | 1.31  | 1.97 (0.42)  | 1.21  | 2.72  | 2.08 (0.43)    | 1.45  | 2.99
Stance Phase (s)      | 0.65 (0.07)  | 0.57  | 0.78  | 1.37 (0.35)  | 0.80  | 2.05  | 1.45 (0.33)    | 0.99  | 1.95
Swing Phase (s)       | 0.47 (0.05)  | 0.41  | 0.57  | 0.59 (0.12)  | 0.40  | 0.80  | 0.63 (0.15)    | 0.45  | 1.02
Time 10 MWT (s)       | 7.52 (0.65)  | 6.35  | 8.90  | 30.61 (8.44) | 23.98 | 53.44 | 32.11 (8.71)   | 24.70 | 50.38
Table A2. The results of the repeated-measures one-way ANOVA to assess the differences between the tasks in the spatiotemporal parameters. Contrasts made by Bonferroni adjustment.
Parameter     | F Statistic              | p      | Effect Size (η²) | Contrast    | p      | Effect Size (d)
Stride Length | F (2, 26) = 239.05       | <0.001 | 0.89             | FW vs. AW   | <0.001 | 4.64
              |                          |        |                  | FW vs. VRAW | <0.001 | 4.17
              |                          |        |                  | AW vs. VRAW | =1     | 0.45
Stride Number | F (1.42, 18.2) = 50.80 * | <0.001 | 0.69             | FW vs. AW   | <0.001 | 3.01
              |                          |        |                  | FW vs. VRAW | <0.001 | 2.11
              |                          |        |                  | AW vs. VRAW | =1     | 0.36
Gait Speed    | F (2, 26) = 963.49       | <0.001 | 0.96             | FW vs. AW   | <0.001 | 8.94
              |                          |        |                  | FW vs. VRAW | <0.001 | 9.67
              |                          |        |                  | AW vs. VRAW | =1     | 0.12
Cadence       | F (2, 26) = 52.80        | <0.001 | 0.61             | FW vs. AW   | <0.001 | 2.17
              |                          |        |                  | FW vs. VRAW | <0.001 | 2.31
              |                          |        |                  | AW vs. VRAW | =0.72  | 0.33
Stance Phase  | F (2, 26) = 55.27        | <0.001 | 0.63             | FW vs. AW   | <0.001 | 2.19
              |                          |        |                  | FW vs. VRAW | <0.001 | 2.42
              |                          |        |                  | AW vs. VRAW | =0.96  | 0.28
Swing Phase   | F (1.32, 17.16) = 13.58  | <0.001 | 0.28             | FW vs. AW   | =0.003 | 1.44
              |                          |        |                  | FW vs. VRAW | =0.001 | 1.25
              |                          |        |                  | AW vs. VRAW | =0.26  | 0.49
Time 10 MWT   | F (2, 26) = 67.39        | <0.001 | 0.73             | FW vs. AW   | <0.001 | 2.79
              |                          |        |                  | FW vs. VRAW | <0.001 | 2.96
              |                          |        |                  | AW vs. VRAW | =1     | 0.15
* Degrees of freedom and p-value corrected by the Greenhouse–Geisser method due to violation of sphericity.
Table A3. The descriptive statistics include the mean, standard deviation (SD), minimum (min), and maximum (max) values of the hip joint by each task.
Parameter | FW: Mean (SD), Min, Max, p | AW: Mean (SD), Min, Max, p | VRAW: Mean (SD), Min, Max, p
R and L rows correspond to the right and left legs; the p-value within each task refers to the right-left comparison.
H1 (°)R28.23 (2.99)24.0233.87=0.5450.83 (7.42)38.7770.11=0.8949.70 (6.05)39.9864.28=0.33
L27.85 (2.82)24.1135.2250.70 (7.45)42.2767.8350.61 (5.20)43.0559.57
H2 (°)R22.32 (3.41)16.9029.26=0.7745.28 (6.76)35.1362.96=0.5743.84 (6.33)35.7859.57=0.73
L22.53 (3.10)18.2530.4244.73 (6.06)37.5957.3644.24 (5.00)33.9153.49
H3 (°)R−8.42 (3.46)−15.75−4.04=0.0818.98 (8.99)6.8235.94=0.9919.46 (8.29)7.4033.39=0.40
L−7.51 (3.05)−13.59−2.4419.22 (8.87)9.6238.3218.72 (7.14)7.8133.96
H4 (°)R−2.85 (3.25)−9.062.19=0.1624.85 (9.66)8.8242.56=0.9923.95 (8.17)12.0338.52=0.58
L−2.16 (3.49)−8.304.8624.85 (9.38)14.3744.0823.45 (8.03)13.5342.28
H5 (°)R32.79 (2.65)28.0036.10=0.6552.85 (8.03)39.5871.96=0.6950.71 (7.42)35.3165.87=0.08
L32.58 (2.70)27.2636.8853.20 (7.79)43.5470.5852.40 (6.72)36.9362.34
H6 (°)R41.22 (4.04)34.7549.02=0.0634.45 (5.30)24.1844.24=0.9832.78 (6.05)19.9741.56=0.71
L40.10 (3.28)35.1545.5534.46 (4.17)27.2841.7833.99 (6.23)20.1042.14
H7 (°)R9.41 (3.30)5.2416.09=0.119.60 (2.65)4.9314.46=0.759.81 (2.87)5.6415.11=0.71
L10.5 (2.93)6.6117.439.71 (2.57)5.4015.559.60 (3.18)3.4414.53
H8 (°)R−4.32 (1.75)−9.98−3.87=0.85−8.47 (2.67)−12.88−3.18=0.07−8.45 (2.91)−11.55−0.35=0.23
L−7.27 (2.11)−10.56−3.71−10.23 (2.01)−13.15−6.20−9.55 (4.94)−13.996.18
H9 (°)R2.05 (2.73)−1.296.18=0.101.12 (4.20)−5.297.52=0.121.93 (6.25)−3.6721.70=0.06
L3.23 (2.56)−0.568.21−0.54 (2.61)−3.985.360.34 (6.04)−6.9717.12
H10 (°)R9.81 (4.24)4.1418.67=0.277.58 (3.15)3.7814.55=0.908.11 (3.24)3.2515.97=0.90
L9.23 (3.84)2.7115.657.69 (2.74)3.9013.358.15 (3.56)2.1813.54
H11 (°)R7.47 (3.28)2.5514.05=0.099.80 (4.49)1.1117.27=0.509.24 (4.87)0.5118.67=0.27
L6.01 (2.77)2.2413.239.12 (6.04)1.1021.6910.11 (5.69)0.8919.48
H12 (°)R−2.34 (4.65)−13.616.15=0.352.22 (4.63)−6.0710.00=0.341.50 (5.12)−8.7010.88=0.45
L−3.21 (4.24)−11.172.021.44 (5.12)−5.4113.721.16 (5.07)−7.189.63
Table A4. The results of the repeated-measures one-way ANOVA to assess the differences between the tasks in the hip parameters. Contrasts made by Bonferroni adjustment.
Parameter | F Statistic | p | Effect Size (η²) | Contrasts | p (R) | Effect Size (d), R | p (L) | Effect Size (d), L
Each parameter spans three rows: the R and L rows give the F statistic, p, and η² for the right and left legs, while the three contrasts (FW vs. AW, FW vs. VRAW, AW vs. VRAW) with their right- and left-leg p-values and effect sizes run across the rows.
H1 FW vs. AW<0.0013.23<0.0012.75
RF (2, 26) = 116.47<0.0010.77FW vs. VRAW<0.0013.34<0.0014.07
LF (2, 26) = 103.68<0.0010.80AW vs. VRAW=10.22=10.01
H2 FW vs. AW<0.0014.44<0.0013.50
RF (2, 26) = 152.21<0.0010.79FW vs. VRAW<0.0013.34<0.0013.95
LF (2, 26) = 129.53<0.0010.83AW vs. VRAW=10.30=10.08
H3 FW vs. AW<0.0013.61<0.0013.40
RF (2, 26) = 166.04<0.0010.77FW vs. VRAW<0.0014.40<0.0014.30
LF (2, 26) = 161.35<0.0010.78AW vs. VRAW=10.08=10.10
H4 FW vs. AW<0.0013.34<0.0013.31
RF (1.7, 22.1) = 120.51 *<0.0010.77FW vs. VRAW<0.0014.04<0.0013.40
LF (1.82, 23.92) = 110.34 *<0.0010.75AW vs. VRAW=10.15=10.22
H5 FW vs. AW<0.0012.64<0.0012.51
RF (2, 26) = 66.51<0.0010.67FW vs. VRAW<0.0012.25<0.0012.83
LF (2, 26) = 64.56<0.0010.72AW vs. VRAW=0.630.35=10.10
H6 FW vs. AW<0.0011.82<0.0011.81
RF (2, 26) = 26.67<0.0010.35FW vs. VRAW<0.0011.59=0.010.95
LF (1.34, 17.42) = 8.54 *=0.0010.27AW vs. VRAW=0.590.36=10.09
H7 FW vs. AW=10.09=10.25
RF (1.24, 16.12) = 0.12 *=0.700.003FW vs. VRAW=10.12=10.20
LF (1.3, 16.9) = 0.39 *=0.480.02AW vs. VRAW=10.12=10.04
H8 FW vs. AW=0.460.40=0.400.47
RF (2.26, 16.38) = 1.99=0.150.04FW vs. VRAW=0.280.40=0.280.48
LF (1.16, 15.08) = 1.05 *=0.210.11AW vs. VRAW=10.001=10.12
H9 FW vs. AW=0.580.36=0.461.76
RF (1.26, 16.38) = 0.22=0.610.001FW vs. VRAW=10.02=0.280.48
LF (1.16, 15.08) = 0.96 *=0.300.09AW vs. VRAW=10.17=10.17
H10 FW vs. AW=0.380.44=0.640.35
RF (2, 26) = 1.96=0.160.07FW vs. VRAW=0.570.37=10.17
LF (2, 26) = 0.69=0.510.03AW vs. VRAW=10.15=10.10
H11 FW vs. AW=0.130.60=0.160.56
RF (2, 26) = 2.28=0.120.06FW vs. VRAW=0.590.36=0.070.68
LF (2, 26) = 4.02=0.030.11AW vs. VRAW=10.13=10.18
H12 FW vs. AW=0.0051.04=0.0150.89
RF (2, 26) = 9.17<0.0010.15FW vs. VRAW=0.0170.88=0.0080.98
LF (2, 26) = 9.58<0.0010.17AW vs. VRAW=10.17=10.07
* Degrees of freedom and p-value corrected by the Greenhouse–Geisser method due to violation of sphericity.
Table A5. The descriptive statistics include the mean, standard deviation (SD), minimum (min), and maximum (max) values of the knee joint by each task.
Parameter | FW: Mean (SD), Min, Max, p | AW: Mean (SD), Min, Max, p | VRAW: Mean (SD), Min, Max, p
R and L rows correspond to the right and left legs; the p-value within each task refers to the right-left comparison.
K1 (°)R2.86 (5.34)−3.6715.09=0.1126.89 (16.57)2.0170.49=0.35 *22.79 (13.31)−0.0346.05=0.88
L3.81 (4.85)−4.3414.7826.33 (14.81)1.2963.5122.97 (13.71)−0.2345.54
K2 (°)R10.43 (6.43)1.1322.01=0.35 *20.39 (11.55)−2.5032.42=0.29 *21.89 (11.66)−2.2344.93=0.67
L11.81 (5.23)−1.2418.0419.53 (9.76)2.2430.1321.32 (11.97)1.4544.08
K3 (°)R0.95 (3.44)−3.677.45=0.066.68 (6.67)−5.6021.35=0.865.54 (6.50)−5.0120.91=0.74
L2.03 (3.2)−4.696.866.87 (6.30)−1.0623.335.76 (5.89)−2.3420.35
K4 (°)R32.73 (3.34)25.3036.31=0.3528.22 (6.02)17.4137.58=0.8427.48 (5.81)17.9639.53=0.70
L33.26 (4.00)22.4839.5828.44 (4.74)21.0338.7227.1 (4.88)17.7937.50
K5 (°)R61.87 (4.35)54.2567.43=0.39 *49.62 (10.78)32.4673.87=0.0447.67 (9.09)30.3762.33=0.03
L62.59 (4.06)56.8068.5752.18 (8.25)36.8468.2550.08 (7.78)34.2160.35
K6 (°)R60.92 (4.13)56.4268.17=0.6344.42 (9.66)28.8071.32=0.03 *42.42 (4.98)30.8948.91=0.02
L60.56 (3.26)54.9967.2546.96 (8.28)31.6267.2544.43 (5.83)32.7451.51
K7 (°)R1.67 (1.16)0.714.25=0.02 *1.76 (1.13)0.593.92=0.852.08 (1.28)0.895.27=0.26 *
L2.85 (2.04)0.658.612.08 (1.51)0.775.841.71 (1.11)0.303.69
K8 (°)R−1.59 (0.56)−2.59−0.73=0.72−2.30 (1.00)−4.31−0.40=0.09−2.13 (0.87)−3.82−0.92=0.69
L−1.65 (0.59)−3.14−0.78−1.66 (0.87)−3.81−0.36−2.24 (1.33)−5.31−0.89
K9 (°)R−2.92 (1.46)−6.53−1.40=0.05 *−3.01 (2.36)−7.76−0.33=0.65−2.74 (2.51)−7.021.87=0.38
L−4.2 (2.32)−10.33−1.30−3.38 (2.34)−7.590.52−3.46 (2.52)−7.331.00
K10 (°)R11.16 (2.82)6.1715.18=0.046.74 (3.21)2.9814.90=0.796.04 (2.74)2.0111.7=0.20
L12.53 (2.55)9.0316.026.96 (2.20)3.7810.016.88 (2.51)1.6910.91
K11 (°)R4.14 (1.66)0.946.87=0.443.16 (1.41)0.736.13=0.602.64 (1.18)0.954.74=0.13
L4.59 (1.56)2.357.152.95 (1.19)1.564.913.08 (1.23)0.724.88
K12 (°)R−7.02 (2.06)−10.98−3.09=0.11−3.57 (2.34)−8.77−0.29=0.49−3.4 (2.05)−6.950.57=0.29
L−7.94 (2.03)−10.95−4.71−4.01 (1.33)−5.94−1.92−4 (1.53)−6.69−1.78
Table A6. The results of the repeated-measures one-way ANOVA to assess the differences between the tasks in the knee parameters. Contrasts made by Bonferroni adjustment.
For each parameter, the F statistic, p-value, and effect size (η²) are reported for the right (R) and left (L) sides, followed by the pairwise contrasts with the p-value and effect size (d) for each side.
K1. ANOVA: R F(2, 26) = 23.33, p < 0.001, η² = 0.43; L F(2, 26) = 23.37, p < 0.001, η² = 0.43. Contrasts: FW vs. AW, p(R) < 0.001 (d = 1.42), p(L) < 0.001 (d = 1.43); FW vs. VRAW, p(R) < 0.001 (d = 1.44), p(L) < 0.001 (d = 4.34); AW vs. VRAW, p(R) = 0.55 (d = 0.38), p(L) = 0.54 (d = 0.38).
K2. ANOVA: R F(2, 26) = 9.78, p < 0.001, η² = 0.21; L F(2, 26) = 129.53, p < 0.001, η² = 0.17. Contrasts: FW vs. AW, p(R) = 0.016 (d = 0.89), p(L) < 0.001 (d = 0.85); FW vs. VRAW, p(R) = 0.005 (d = 1.06), p(L) < 0.001 (d = 0.90); AW vs. VRAW, p(R) = 1 (d = 0.16), p(L) = 1 (d = 0.15).
K3. ANOVA: R F(2, 26) = 8.95, p = 0.001, η² = 0.17; L F(2, 26) = 5.69, p = 0.008, η² = 0.14. Contrasts: FW vs. AW, p(R) = 0.006 (d = 1.02), p(L) = 0.03 (d = 0.78); FW vs. VRAW, p(R) = 0.04 (d = 0.78), p(L) = 0.05 (d = 0.73); AW vs. VRAW, p(R) = 1 (d = 0.25), p(L) = 1 (d = 0.19).
K4. ANOVA: R F(2, 26) = 14.45, p < 0.001, η² = 0.18; L F(2, 26) = 15.31, p < 0.001, η² = 0.27. Contrasts: FW vs. AW, p(R) = 0.004 (d = 1.07), p(L) = 0.003 (d = 1.09); FW vs. VRAW, p(R) = 0.001 (d = 1.24), p(L) = 0.002 (d = 1.17); AW vs. VRAW, p(R) = 1 (d = 0.22), p(L) = 0.43 (d = 0.42).
K5. ANOVA: R F(2, 26) = 19.83, p < 0.001, η² = 0.37; L F(2, 26) = 28.67, p < 0.001, η² = 0.40. Contrasts: FW vs. AW, p(R) = 0.002 (d = 1.23), p(L) < 0.001 (d = 1.56); FW vs. VRAW, p(R) < 0.001 (d = 1.72), p(L) < 0.001 (d = 1.97); AW vs. VRAW, p(R) = 1 (d = 0.21), p(L) = 0.81 (d = 0.31).
K6. ANOVA: R F(2, 26) = 30.09, p < 0.001, η² = 0.62; L F(2, 26) = 29.27, p = 0.001, η² = 0.59. Contrasts: FW vs. AW, p(R) < 0.001 (d = 1.57), p(L) < 0.001 (d = 1.46); FW vs. VRAW, p(R) < 0.001 (d = 2.67), p(L) < 0.001 (d = 2.86); AW vs. VRAW, p(R) = 1 (d = 0.20), p(L) = 1 (d = 0.25).
K7. ANOVA: R F(2, 26) = 0.77, p = 0.47, η² = 0.02; L F(2, 26) = 4.55, p = 0.02, η² = 0.08. Contrasts: FW vs. AW, p(R) = 1 (d = 0.08), p(L) = 0.13 (d = 0.59); FW vs. VRAW, p(R) = 0.84 (d = 0.30), p(L) = 0.09 (d = 0.64); AW vs. VRAW, p(R) = 1 (d = 0.22), p(L) = 0.82 (d = 0.30).
K8. ANOVA: R F(2, 26) = 3.13, p = 0.06, η² = 0.12; L F(1.4, 18.2) = 1.75 *, p = 0.09, η² = 0.08. Contrasts: FW vs. AW, p(R) = 0.11 (d = 0.61), p(L) = 1 (d = 0.02); FW vs. VRAW, p(R) = 0.20 (d = 0.53), p(L) = 0.25 (d = 0.50); AW vs. VRAW, p(R) = 1 (d = 0.14), p(L) = 0.40 (d = 0.42).
K9. ANOVA: R F(2, 26) = 0.14, p = 0.86, η² = 0.01; L F(2, 26) = 1.15, p = 0.33, η² = 0.02. Contrasts: FW vs. AW, p(R) = 1 (d = 0.05), p(L) = 0.28 (d = 0.48); FW vs. VRAW, p(R) = 1 (d = 0.09), p(L) = 0.87 (d = 0.29); AW vs. VRAW, p(R) = 1 (d = 0.14), p(L) = 1 (d = 0.03).
K10. ANOVA: R F(2, 26) = 21.73, p < 0.001, η² = 0.39; L F(2, 26) = 54.85, p < 0.001, η² = 0.56. Contrasts: FW vs. AW, p(R) < 0.001 (d = 1.23), p(L) < 0.001 (d = 2.01); FW vs. VRAW, p(R) < 0.001 (d = 1.84), p(L) < 0.001 (d = 2.55); AW vs. VRAW, p(R) = 1 (d = 0.23), p(L) = 1 (d = 0.03).
K11. ANOVA: R F(2, 26) = 6.30, p = 0.005, η² = 0.17; L F(2, 26) = 4.02, p < 0.001, η² = 0.25. Contrasts: FW vs. AW, p(R) = 0.18 (d = 0.54), p(L) = 0.01 (d = 0.94); FW vs. VRAW, p(R) = 0.01 (d = 0.93), p(L) = 0.01 (d = 0.90); AW vs. VRAW, p(R) = 0.55 (d = 0.37), p(L) = 1 (d = 0.11).
K12. ANOVA: R F(2, 26) = 49.43, p < 0.001, η² = 0.39; L F(2, 26) = 49.43, p < 0.001, η² = 0.58. Contrasts: FW vs. AW, p(R) < 0.001 (d = 1.32), p(L) < 0.001 (d = 1.97); FW vs. VRAW, p(R) < 0.001 (d = 1.87), p(L) < 0.001 (d = 2.25); AW vs. VRAW, p(R) = 1 (d = 0.07), p(L) = 1 (d = 0.003).
Table A7. The descriptive statistics include the mean, standard deviation (SD), minimum (min), and maximum (max) values of the ankle joint by each task.
Parameters | FW: Mean (SD), Min, Max, p | AW: Mean (SD), Min, Max, p | VRAW: Mean (SD), Min, Max, p
A1 (°) R | −1.24 (3.44), −6.34, 4.92, p = 0.43 | 8.83 (7.14), −2.89, 17.73, p = 0.97 | 9.14 (7.67), −3.63, 24.60, p = 0.97
A1 (°) L | −0.64 (4.30), −8.95, 6.54 | 8.85 (6.60), 0.13, 17.90 | 8.12 (7.23), −1.07, 18.59
A2 (°) R | −8.34 (2.68), −13.27, −5.29, p = 0.22 | 2.0 (7.44), −11.73, 12.70, p = 0.98 | −0.60 (7.26), −10.37, 11.83, p = 0.98
A2 (°) L | −7.37 (2.67), −13.73, −2.13 | 2.03 (6.69), −12.26, 12.88 | 0.18 (6.96), −14.36, 7.72
A3 (°) R | 14.45 (2.47), 9.17, 19.54, p = 0.007 | 16.85 (3.83), 9.02, 21.50, p = 0.79 | 17.55 (4.48), 9.46, 28.69, p = 0.79
A3 (°) L | 15.52 (2.43), 12.60, 22.06 | 17.06 (3.70), 10.89, 21.69 | 16.45 (3.72), 11.09, 21.03
A4 (°) R | −3.95 (5.41), −13.27, 7.36, p = 0.80 | 4.52 (8.45), −11.59, 16.12, p = 0.80 | 3.85 (6.16), −9.1, 11.89, p = 0.91
A4 (°) L | −4.15 (5.06), −10.10, 7.46 | 4.72 (6.32), −11.10, 12.08 | 2.16 (6.43), −12.26, 8.92
A5 (°) R | −17.73 (4.78), −27.26, −7.10, p = 0.99 | −3.02 (9.78), −20.63, 15.50, p = 0.99 | −3.61 (7.53), −15.48, 9.98, p = 0.44
A5 (°) L | −17.73 (6.38), −27.90, −3.29 | −4.54 (8.70), −21.69, 9.15 | −6.31 (8.35), −22.46, 3.47
A6 (°) R | 32.18 (3.48), 25.10, 37.27, p = 0.28 | 20.29 (8.04), 5.10, 34.23, p = 0.28 | 21.47 (6.86), 7.74, 32.60, p = 0.26
A6 (°) L | 33.26 (4.71), 25.52, 41.89 | 22.07 (7.78), 9.31, 33.79 | 23.05 (6.73), 14.23, 36.35
A7 (°) R | 15.10 (4.27), 8.28, 23.18, p = 0.22 | 14.91 (4.71), 6.99, 22.53, p = 0.22 | 14.55 (5.74), 6.66, 25.41, p = 0.87
A7 (°) L | 16.95 (4.18), 11.51, 25.84 | 15.08 (4.56), 8.83, 24.74 | 15.64 (6.99), 4.69, 27.77
A8 (°) R | 6.08 (1.29), 3.35, 8.30, p = 0.30 | 7.79 (3.30), −0.12, 11.87, p = 0.29 | 7.37 (2.71), 2.84, 11.43, p = 0.19
A8 (°) L | 5.62 (1.36), 2.83, 7.69 | 6.54 (3.02), −0.72, 11.68 | 7.58 (2.96), 2.05, 12.80
A9 (°) R | −9.02 (3.94), −14.88, −1.10, p = 0.15 | −7.11 (3.86), −14.21, −0.17, p = 0.15 | −7.17 (4.04), −14.09, 0.24, p = 0.19
A9 (°) L | −11.33 (4.05), −20.14, −5.44 | −8.54 (3.14), −14.25, −3.51 | −8.64 (4.45), −15.87, −1.03
Table A8. The results of the repeated-measures one-way ANOVA to assess the differences between the tasks in the ankle parameters. Contrasts made by Bonferroni adjustment.
For each parameter, the F statistic, p-value, and effect size (η²) are reported for the right (R) and left (L) sides, followed by the pairwise contrasts with the p-value and effect size (d) for each side.
A1. ANOVA: R F(1.42, 18.46) = 17.04 *, p < 0.001, η² = 0.38; L F(1.14, 18.46) = 9.10 *, p < 0.001, η² = 0.34. Contrasts: FW vs. AW, p(R) < 0.001 (d = 1.39), p(L) = 0.001 (d = 1.23); FW vs. VRAW, p(R) < 0.001 (d = 1.40), p(L) = 0.008 (d = 0.98); AW vs. VRAW, p(R) = 1 (d = 0.08), p(L) = 0.54 (d = 0.25).
A2. ANOVA: R F(1.36, 17.68) = 10.40 *, p < 0.001, η² = 0.35; L F(2, 26) = 11.61, p < 0.001, η² = 0.35. Contrasts: FW vs. AW, p(R) = 0.002 (d = 1.20), p(L) = 0.002 (d = 1.15); FW vs. VRAW, p(R) = 0.013 (d = 0.94), p(L) = 0.007 (d = 1.00); AW vs. VRAW, p(R) = 0.11 (d = 0.63), p(L) = 1 (d = 0.25).
A3. ANOVA: R F(1.42, 18.46) = 2.16 *, p = 0.06, η² = 0.12; L F(1.1, 14.3) = 0.58 *, p = 0.36, η² = 0.14. Contrasts: FW vs. AW, p(R) = 0.34 (d = 0.45), p(L) = 0.77 (d = 0.32); FW vs. VRAW, p(R) = 0.22 (d = 0.52), p(L) = 1.00 (d = 0.19); AW vs. VRAW, p(R) = 1 (d = 0.23), p(L) = 0.30 (d = 0.47).
A4. ANOVA: R F(2, 26) = 7.53, p = 0.002, η² = 0.25; L F(2, 26) = 9.40, p < 0.001, η² = 0.30. Contrasts: FW vs. AW, p(R) = 0.03 (d = 0.78), p(L) = 0.004 (d = 1.07); FW vs. VRAW, p(R) = 0.01 (d = 0.93), p(L) = 0.07 (d = 0.68); AW vs. VRAW, p(R) = 1 (d = 0.08), p(L) = 0.35 (d = 0.45).
A5. ANOVA: R F(2, 26) = 21.35, p < 0.001, η² = 0.46; L F(2, 26) = 18.73, p < 0.001, η² = 0.37. Contrasts: FW vs. AW, p(R) < 0.001 (d = 1.35), p(L) < 0.001 (d = 1.66); FW vs. VRAW, p(R) < 0.001 (d = 1.50), p(L) = 0.005 (d = 1.05); AW vs. VRAW, p(R) = 1 (d = 0.07), p(L) = 0.81 (d = 0.25).
A6. ANOVA: R F(2, 26) = 18.66, p < 0.001, η² = 0.43; L F(2, 26) = 29.27, p < 0.001, η² = 0.39. Contrasts: FW vs. AW, p(R) < 0.001 (d = 1.50), p(L) < 0.001 (d = 1.69); FW vs. VRAW, p(R) < 0.001 (d = 1.31), p(L) = 0.002 (d = 1.19); AW vs. VRAW, p(R) = 1 (d = 0.15), p(L) = 1 (d = 0.14).
A7. ANOVA: R F(2, 26) = 0.07, p = 0.93, η² = 0.002; L F(2, 26) = 0.70, p = 0.50, η² = 0.02. Contrasts: FW vs. AW, p(R) = 1 (d = 0.03), p(L) = 0.66 (d = 0.34); FW vs. VRAW, p(R) = 1 (d = 0.09), p(L) = 1 (d = 0.17); AW vs. VRAW, p(R) = 1 (d = 0.08), p(L) = 1 (d = 0.12).
A8. ANOVA: R F(2, 26) = 2.71, p = 0.08, η² = 0.07; L F(2, 26) = 2.87, p = 0.74, η² = 0.09. Contrasts: FW vs. AW, p(R) = 0.16 (d = 0.56), p(L) = 0.91 (d = 0.28); FW vs. VRAW, p(R) = 0.18 (d = 0.55), p(L) = 0.14 (d = 0.59); AW vs. VRAW, p(R) = 1 (d = 0.14), p(L) = 0.48 (d = 0.39).
A9. ANOVA: R F(1.26, 16.38) = 1.87, p = 0.17, η² = 0.05; L F(2, 26) = 4.14, p = 0.03, η² = 0.10. Contrasts: FW vs. AW, p(R) = 0.57 (d = 0.36), p(L) = 0.12 (d = 0.61); FW vs. VRAW, p(R) = 0.47 (d = 0.40), p(L) = 0.16 (d = 0.56); AW vs. VRAW, p(R) = 1 (d = 0.03), p(L) = 1 (d = 0.03).
* Degrees of freedom and p-value corrected by the Greenhouse–Geisser method due to violation of sphericity.
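For readers who wish to reproduce the structure of the repeated-measures ANOVA tables above on their own data, the sketch below illustrates a one-way repeated-measures ANOVA with Greenhouse–Geisser correction followed by Bonferroni-adjusted pairwise contrasts. It is only a minimal illustration, not the authors' analysis code: the pandas/pingouin toolchain, the file name, and the column names are assumptions, and the per-contrast effect size is computed here as Cohen's d.

```python
# Minimal sketch (not the authors' original pipeline): repeated-measures one-way ANOVA
# with Greenhouse-Geisser correction plus Bonferroni-adjusted pairwise contrasts, i.e.,
# the structure reported in the appendix tables above. The pandas/pingouin toolchain,
# the file name, and the column names ('subject', 'task', 'value') are assumptions.
import pandas as pd
import pingouin as pg

# Hypothetical long-format data: one row per participant and task (FW, AW, VRAW),
# holding a single gait parameter such as hip flexion at heel strike (right side).
df = pd.read_csv("h1_right.csv")  # placeholder file name

# Repeated-measures ANOVA; correction=True reports Mauchly's sphericity test and the
# Greenhouse-Geisser-corrected degrees of freedom and p-value when it is violated.
aov = pg.rm_anova(data=df, dv="value", within="task", subject="subject",
                  correction=True, detailed=True)
print(aov)

# Pairwise contrasts (FW vs. AW, FW vs. VRAW, AW vs. VRAW) with Bonferroni adjustment
# and Cohen's d as the per-contrast effect size (the function is named pairwise_ttests
# in older pingouin releases).
posthoc = pg.pairwise_tests(data=df, dv="value", within="task", subject="subject",
                            padjust="bonf", effsize="cohen")
print(posthoc)
```

In practice, this would be repeated per parameter and per side (right and left), which is how the rows of the tables above are organized.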

References

  1. World Health Organization. Ageing and Health. 2024. Available online: https://www.who.int/health-topics/ageing (accessed on 5 February 2024).
  2. Ravankar, A.A.; Tafrishi, S.A.; Luces, J.V.S.; Seto, F.; Hirata, Y. Care: Cooperation of ai robot enablers to create a vibrant society. IEEE Robot. Autom. Mag. 2022, 30, 8–23. [Google Scholar] [CrossRef]
  3. Rudnicka, E.; Napierała, P.; Podfigurna, A.; Męczekalski, B.; Smolarczyk, R.; Grymowicz, M. The World Health Organization (WHO) approach to healthy ageing. Maturitas 2020, 139, 6–11. [Google Scholar] [CrossRef] [PubMed]
  4. Ma, L.; Chan, P. Understanding the physiological links between physical frailty and cognitive decline. Aging Dis. 2020, 11, 405. [Google Scholar] [CrossRef]
  5. Osoba, M.Y.; Rao, A.K.; Agrawal, S.K.; Lalwani, A.K. Balance and gait in the elderly: A contemporary review. Laryngoscope Investig. Otolaryngol. 2019, 4, 143–153. [Google Scholar] [CrossRef]
  6. Winter, D.A. Biomechanics and Motor Control of Human Movement; John Wiley & Sons: Hoboken, NJ, USA, 2009. [Google Scholar]
  7. Gimigliano, F.; Negrini, S. The World Health Organization “Rehabilitation 2030: A call for action”. Eur. J. Phys. Rehabil. Med. 2017, 53, 155–168. [Google Scholar] [CrossRef]
  8. Ferreira, B.; Menezes, P. Gamifying motor rehabilitation therapies: Challenges and opportunities of immersive technologies. Information 2020, 11, 88. [Google Scholar] [CrossRef]
  9. Howard, M.C. A meta-analysis and systematic literature review of virtual reality rehabilitation programs. Comput. Hum. Behav. 2017, 70, 317–327. [Google Scholar] [CrossRef]
  10. Mikolajczyk, T.; Ciobanu, I.; Badea, D.I.; Iliescu, A.; Pizzamiglio, S.; Schauer, T.; Seel, T.; Seiciu, P.L.; Turner, D.L.; Berteanu, M. Advanced technology for gait rehabilitation: An overview. Adv. Mech. Eng. 2018, 10, 1687814018783627. [Google Scholar] [CrossRef]
  11. Canning, C.G.; Allen, N.E.; Nackaerts, E.; Paul, S.S.; Nieuwboer, A.; Gilat, M. Virtual reality in research and rehabilitation of gait and balance in Parkinson disease. Nat. Rev. Neurol. 2020, 16, 409–425. [Google Scholar] [CrossRef]
  12. Saposnik, G. Virtual reality in stroke rehabilitation. In Ischemic Stroke Therapeutics: A Comprehensive Guide; Springer: Cham, Switzerland, 2016; pp. 225–233. [Google Scholar]
  13. Kaplan, A.D.; Cruit, J.; Endsley, M.; Beers, S.M.; Sawyer, B.D.; Hancock, P.A. The effects of virtual reality, augmented reality, and mixed reality as training enhancement methods: A meta-analysis. Hum. Factors 2021, 63, 706–726. [Google Scholar] [CrossRef]
  14. Patil, V.; Narayan, J.; Sandhu, K.; Dwivedy, S.K. Integration of virtual reality and augmented reality in physical rehabilitation: A state-of-the-art review. In Revolutions in Product Design for Healthcare: Advances in Product Design and Design Methods for Healthcare; Springer: Singapore, 2022; pp. 177–205. [Google Scholar]
  15. Vieira, C.; da Silva Pais-Vieira, C.F.; Novais, J.; Perrotta, A. Serious game design and clinical improvement in physical rehabilitation: Systematic review. JMIR Serious Games 2021, 9, e20066. [Google Scholar] [CrossRef] [PubMed]
  16. Kern, F.; Winter, C.; Gall, D.; Käthner, I.; Pauli, P.; Latoschik, M.E. Immersive virtual reality and gamification within procedurally generated environments to increase motivation during gait rehabilitation. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 500–509. [Google Scholar]
  17. Amin, F.; Waris, A.; Syed, S.; Amjad, I.; Umar, M.; Iqbal, J.; Gilani, S.O. Effectiveness of Immersive Virtual Reality Based Hand Rehabilitation Games for Improving Hand Motor Functions in Subacute Stroke Patients. IEEE Trans. Neural Syst. Rehabil. Eng. 2024, 32, 2060–2069. [Google Scholar] [CrossRef] [PubMed]
  18. Casuso-Holgado, M.J.; Martín-Valero, R.; Carazo, A.F.; Medrano-Sánchez, E.M.; Cortés-Vega, M.D.; Montero-Bancalero, F.J. Effectiveness of virtual reality training for balance and gait rehabilitation in people with multiple sclerosis: A systematic review and meta-analysis. Clin. Rehabil. 2018, 32, 1220–1234. [Google Scholar] [CrossRef]
  19. Tieri, G.; Morone, G.; Paolucci, S.; Iosa, M. Virtual reality in cognitive and motor rehabilitation: Facts, fiction and fallacies. Expert Rev. Med. Devices 2018, 15, 107–117. [Google Scholar] [CrossRef]
  20. Li, X.; Luh, D.B.; Xu, R.H.; An, Y. Considering the consequences of cybersickness in immersive virtual reality rehabilitation: A systematic review and meta-analysis. Appl. Sci. 2023, 13, 5159. [Google Scholar] [CrossRef]
  21. Horsak, B.; Simonlehner, M.; Schöffer, L.; Dumphart, B.; Jalaeefar, A.; Husinsky, M. Overground walking in a fully immersive virtual reality: A comprehensive study on the effects on full-body walking biomechanics. Front. Bioeng. Biotechnol. 2021, 9, 780314. [Google Scholar] [CrossRef]
  22. Canessa, A.; Casu, P.; Solari, F.; Chessa, M. Comparing Real Walking in Immersive Virtual Reality and in Physical World using Gait Analysis. In Proceedings of the VISIGRAPP (2: HUCAPP), Prague, Czech Republic, 25–27 February 2019; pp. 121–128. [Google Scholar]
  23. Held, J.P.O.; Yu, K.; Pyles, C.; Veerbeek, J.M.; Bork, F.; Heining, S.M.; Navab, N.; Luft, A.R. Augmented reality–based rehabilitation of gait impairments: Case report. JMIR mHealth uHealth 2020, 8, e17804. [Google Scholar] [CrossRef]
  24. Khanuja, K.; Joki, J.; Bachmann, G.; Cuccurullo, S. Gait and balance in the aging population: Fall prevention using innovation and technology. Maturitas 2018, 110, 51–56. [Google Scholar] [CrossRef]
  25. Calabrò, R.S.; Cacciola, A.; Bertè, F.; Manuli, A.; Leo, A.; Bramanti, A.; Naro, A.; Milardi, D.; Bramanti, P. Robotic gait rehabilitation and substitution devices in neurological disorders: Where are we now? Neurol. Sci. 2016, 37, 503–514. [Google Scholar] [CrossRef]
  26. Yuan, F.; Klavon, E.; Liu, Z.; Lopez, R.P.; Zhao, X. A systematic review of robotic rehabilitation for cognitive training. Front. Robot. AI 2021, 8, 605715. [Google Scholar] [CrossRef]
  27. Hao, J.; Buster, T.W.; Cesar, G.M.; Burnfield, J.M. Virtual reality augments effectiveness of treadmill walking training in patients with walking and balance impairments: A systematic review and meta-analysis of randomized controlled trials. Clin. Rehabil. 2023, 37, 603–619. [Google Scholar] [CrossRef]
  28. Zukowski, L.A.; Shaikh, F.D.; Haggard, A.V.; Hamel, R.N. Acute effects of virtual reality treadmill training on gait and cognition in older adults: A randomized controlled trial. PLoS ONE 2022, 17, e0276989. [Google Scholar] [CrossRef]
  29. Winter, C.; Kern, F.; Gall, D.; Latoschik, M.E.; Pauli, P.; Käthner, I. Immersive virtual reality during gait rehabilitation increases walking speed and motivation: A usability evaluation with healthy participants and patients with multiple sclerosis and stroke. J. Neuroeng. Rehabil. 2021, 18, 68. [Google Scholar] [CrossRef]
  30. Andrade, R.M.; Sapienza, S.; Mohebbi, A.; Fabara, E.E.; Bonato, P. Transparent Control in Overground Walking Exoskeleton Reveals Interesting Changing in Subject’s Stepping Frequency. IEEE J. Transl. Eng. Health Med. 2023, 12, 182–193. [Google Scholar] [CrossRef]
  31. Martins, M.; Santos, C.; Frizera, A.; Ceres, R. A review of the functionalities of smart walkers. Med. Eng. Phys. 2015, 37, 917–928. [Google Scholar] [CrossRef]
  32. Machado, F.; Loureiro, M.; Mello, R.C.; Diaz, C.A.; Frizera, A. A novel mixed reality assistive system to aid the visually and mobility impaired using a multimodal feedback system. Displays 2023, 79, 102480. [Google Scholar] [CrossRef]
  33. Moreira, R.; Alves, J.; Matias, A.; Santos, C. Smart and assistive walker–asbgo: Rehabilitation robotics: A smart–walker to assist ataxic patients. In Robotics in Healthcare: Field Examples and Challenges; Springer: Cham, Switzerland, 2019; pp. 37–68. [Google Scholar]
  34. Sierra M, S.D.; Múnera, M.; Provot, T.; Bourgain, M.; Cifuentes, C.A. Evaluation of physical interaction during walker-assisted gait with the AGoRA Walker: Strategies based on virtual mechanical stiffness. Sensors 2021, 21, 3242. [Google Scholar] [CrossRef]
  35. Jimenez, M.F.; Mello, R.C.; Loterio, F.; Frizera-Neto, A. Multimodal Interaction Strategies for Walker-Assisted Gait: A Case Study for Rehabilitation in Post-Stroke Patients. J. Intell. Robot. Syst. 2024, 110, 13. [Google Scholar] [CrossRef]
  36. Cifuentes, C.A.; Frizera, A. Human-Robot Interaction Strategies for Walker-Assisted Locomotion; Springer: Cham, Switzerland, 2016; Volume 115. [Google Scholar]
  37. Scheidegger, W.M.; De Mello, R.C.; Jimenez, M.F.; Múnera, M.C.; Cifuentes, C.A.; Frizera-Neto, A. A novel multimodal cognitive interaction for walker-assisted rehabilitation therapies. In Proceedings of the 2019 IEEE 16th International Conference on Rehabilitation Robotics (ICORR), Toronto, ON, Canada, 24–28 June 2019; pp. 905–910. [Google Scholar]
  38. Cardoso, P.; Mello, R.C.; Frizera, A. Handling Complex Smart Walker Interaction Strategies with Behavior Trees. In Advances in Bioengineering and Clinical Engineering; Springer: Cham, Switzerland, 2022; pp. 297–304. [Google Scholar]
  39. Rocha-Júnior, J.; Mello, R.; Bastos-Filho, T.; Frizera-Neto, A. Development of Simulation Platform for Human-Robot-Environment Interface in the UFES CloudWalker. In Proceedings of the Brazilian Congress on Biomedical Engineering, Vitoria, Brazil, 26–30 October 2020; Springer: Cham, Switzerland, 2020; pp. 1431–1437. [Google Scholar]
  40. Mello, R.C.; Ribeiro, M.R.; Frizera-Neto, A. Implementing cloud robotics for practical applications. In Springer Tracts in Advanced Robotics; Springer: Cham, Switzerland, 2023; Volume 10. [Google Scholar] [CrossRef]
  41. Loureiro, M.; Machado, F.; Mello, R.C.; Frizera, A. A virtual reality based interface to train smart walker’s user. In Proceedings of the XII Congreso Iberoamericano de Tecnologias de Apoyo a la Discapacidad, Sao Carlos, Brazil, 20–22 November 2023; pp. 1–6. [Google Scholar]
  42. Machado, F.; Loureiro, M.; Mello, R.C.; Diaz, C.A.; Frizera, A. UFES vWalker: A Preliminary Mixed Reality System for Gait Rehabilitation using a smart walker. In Proceedings of the XII Congreso Iberoamericano de Tecnologias de Apoyo a la Discapacidad, Sao Carlos, Brazil, 20–22 November 2023; pp. 1–6. [Google Scholar]
  43. Roberts, M.; Mongeon, D.; Prince, F. Biomechanical parameters for gait analysis: A systematic review of healthy human gait. Phys. Ther. Rehabil. 2017, 4, 10–7243. [Google Scholar] [CrossRef]
  44. Schepers, M.; Giuberti, M.; Bellusci, G. Xsens MVN: Consistent tracking of human motion using inertial sensing. Xsens Technol. 2018, 1, 1–8. [Google Scholar]
  45. Folstein, M.F.; Folstein, S.E.; McHugh, P.R. “Mini-mental state”: A practical method for grading the cognitive state of patients for the clinician. J. Psychiatr. Res. 1975, 12, 189–198. [Google Scholar] [CrossRef]
  46. Keemink, A.Q.; Van der Kooij, H.; Stienen, A.H. Admittance control for physical human–robot interaction. Int. J. Robot. Res. 2018, 37, 1421–1444. [Google Scholar] [CrossRef]
  47. Robert-Lachaine, X.; Mecheri, H.; Larue, C.; Plamondon, A. Validation of inertial measurement units with an optoelectronic system for whole-body motion analysis. Med. Biol. Eng. Comput. 2017, 55, 609–619. [Google Scholar] [CrossRef]
  48. Benedetti, M.; Catani, F.; Leardini, A.; Pignotti, E.; Giannini, S. Data management in gait analysis for clinical applications. Clin. Biomech. 1998, 13, 204–215. [Google Scholar] [CrossRef]
  49. Kirtley, C. Clinical Gait Analysis: Theory and Practice; Elsevier Health Sciences: Philadelphia, PA, USA, 2006. [Google Scholar]
  50. Kennedy, R.S.; Lane, N.E.; Berbaum, K.S.; Lilienthal, M.G. Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. Int. J. Aviat. Psychol. 1993, 3, 203–220. [Google Scholar] [CrossRef]
  51. Gil-Gómez, J.A.; Gil-Gómez, H.; Lozano-Quilis, J.A.; Manzano-Hernández, P.; Albiol-Pérez, S.; Aula-Valero, C. SEQ: Suitability evaluation questionnaire for virtual rehabilitation systems. Application in a virtual rehabilitation system for balance rehabilitation. In Proceedings of the 2013 7th International Conference on Pervasive Computing Technologies for Healthcare and Workshops, Venice, Italy, 5–8 May 2013; pp. 335–338. [Google Scholar]
  52. Field, A.P.; Miles, J.; Field, Z. Discovering statistics using R/Andy Field, Jeremy Miles, Zoë Field; Sage: Thousand Oaks, CA, USA, 2012; pp. 664–666. [Google Scholar]
  53. Baniasad, M.; Farahmand, F.; Arazpour, M.; Zohoor, H. Kinematic and electromyography analysis of paraplegic gait with the assistance of mechanical orthosis and walker. J. Spinal Cord Med. 2020, 43, 854–861. [Google Scholar] [CrossRef]
  54. Mun, K.R.; Lim, S.B.; Guo, Z.; Yu, H. Biomechanical effects of body weight support with a novel robotic walker for over-ground gait rehabilitation. Med. Biol. Eng. Comput. 2017, 55, 315–326. [Google Scholar] [CrossRef]
  55. Ishikura, T. Biomechanical analysis of weight bearing force and muscle activation levels in the lower extremities during gait with a walker. Acta Medica Okayama 2001, 55, 73–82. [Google Scholar]
  56. Musahl, V.; Engler, I.D.; Nazzal, E.M.; Dalton, J.F.; Lucidi, G.A.; Hughes, J.D.; Zaffagnini, S.; Della Villa, F.; Irrgang, J.J.; Fu, F.H.; et al. Current trends in the anterior cruciate ligament part II: Evaluation, surgical technique, prevention, and rehabilitation. Knee Surg. Sport. Traumatol. Arthrosc. 2022, 30, 34–51. [Google Scholar] [CrossRef]
  57. Hornby, T.G.; Reisman, D.S.; Ward, I.G.; Scheets, P.L.; Miller, A.; Haddad, D.; Fox, E.J.; Fritz, N.E.; Hawkins, K.; Henderson, C.E.; et al. Clinical practice guideline to improve locomotor function following chronic stroke, incomplete spinal cord injury, and brain injury. J. Neurol. Phys. Ther. 2020, 44, 49–100. [Google Scholar] [CrossRef]
  58. Maxey, L.; Magnusson, J. Rehabilitation for the Postsurgical Orthopedic Patient; Elsevier Health Sciences: St. Louis, MO, USA, 2013. [Google Scholar]
  59. Wang, C.; Xu, Y.; Zhang, L.; Fan, W.; Liu, Z.; Yong, M.; Wu, L. Comparative efficacy of different exercise methods to improve cardiopulmonary function in stroke patients: A network meta-analysis of randomized controlled trials. Front. Neurol. 2024, 15, 1288032. [Google Scholar] [CrossRef] [PubMed]
  60. Dumont, G.D. Hip instability: Current concepts and treatment options. Clin. Sport. Med. 2016, 35, 435–447. [Google Scholar] [CrossRef] [PubMed]
  61. Baker, R.; Esquenazi, A.; Benedetti, M.G.; Desloovere, K. Gait analysis: Clinical facts. Eur. J. Phys. Rehabil. Med. 2016, 52, 560–574. [Google Scholar] [PubMed]
Figure 1. A participant from the experiment using the UFES vWalker and the five subsystems: Odometry and Control (OC), Human–Robot-Environment Interaction (HREI), Human–Robot Interaction (HRI), Motion Capture (MC), and Virtual Reality Integration (VRI).
Figure 2. An example of the joint angles of the hip, knee, and ankle, respectively, during a gait cycle from a participant in the FW task.
Figure 3. The boxplot of all the spatiotemporal parameters divided by the FW, AW, and VRAW tasks. The boxplots for AW and VRAW, marked with an asterisk (*), indicate that statistically significant differences were found between these tasks and FW.
Figure 4. The boxplot of the kinematic parameters of the right hip joint in the sagittal plane, and both sides in the coronal and transverse planes, divided by the FW, AW, and VRAW tasks. The boxplots for AW and VRAW, marked with an asterisk (*), indicate that statistically significant differences were found between these tasks and FW.
Figure 5. The boxplot of the kinematic parameters of the right knee joint in the sagittal plane, and both sides in the coronal and transverse planes, divided by the FW, AW, and VRAW tasks. The boxplots for AW and VRAW, marked with an asterisk (*), indicate that statistically significant differences were found between these tasks and FW.
Figure 6. The boxplot of the kinematic parameters of the right ankle joint in the sagittal plane, and both sides in the coronal plane, divided by the FW, AW, and VRAW tasks. The boxplots for AW and VRAW, marked with an asterisk (*), indicate that statistically significant differences were found between these tasks and FW.
Table 1. The parameters of the lower limb joint angles of the hip, knee, and ankle analyzed.
Hip Joint | Knee Joint | Ankle Joint
H1: flexion at heel strike. | K1: flexion at heel strike. | A1: dorsiflexion at heel strike.
H2: max. flexion at load. response. | K2: max. flexion at load. response. | A2: max. plantar dorsiflex. at load. response.
H3: max. extension in stance phase. | K3: max. extension in stance phase. | A3: max. dorsiflexion in stance phase.
H4: flexion at toe-off. | K4: flexion at toe-off. | A4: dorsiflexion at toe-off.
H5: max. flexion in swing phase. | K5: max. flexion in swing phase. | A5: max. dorsiflexion in swing phase.
H6: total sagittal plane excursion. | K6: total sagittal plane excursion. | A6: total sagittal plane excursion.
H7: total coronal plane excursion. | K7: total coronal plane excursion. | A7: total coronal plane excursion.
H8: max. adduction in stance phase. | K8: max. adduction in stance phase. | A8: max. abduction in stance phase.
H9: max. abduction in swing phase. | K9: max. abduction in swing phase. | A9: max. adduction in swing phase.
H10: total transverse plane excursion. | K10: total transverse plane excursion. |
H11: max. internal rot. in stance phase. | K11: max. internal rot. in stance phase. |
H12: max. external rot. in swing. | K12: max. external rot. in swing phase. |
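The parameters in Table 1 are discrete values read from each joint-angle curve at specific gait events (heel strike, toe-off) or over specific phases (stance, swing) and planes. As a rough, hypothetical illustration of that extraction step, the snippet below computes a few sagittal-plane parameters from one time-normalized gait cycle. It is not the authors' processing code: the 101-sample cycle normalization, the helper name, the flexion-positive sign convention, and the explicit toe-off index are assumptions.

```python
# Illustrative sketch (not the authors' pipeline): extracting a few of the discrete
# parameters defined in Table 1 from a single gait cycle of sagittal-plane joint angles.
# Assumes the cycle is already segmented from heel strike to the next heel strike and
# time-normalized to 101 samples (0-100% of the gait cycle); the toe-off index would
# normally come from an event-detection stage and is passed in explicitly here.
import numpy as np

def sagittal_parameters(angle_cycle: np.ndarray, toe_off_idx: int) -> dict:
    """Return a few Table 1-style sagittal parameters for one joint-angle cycle."""
    stance = angle_cycle[:toe_off_idx + 1]   # heel strike up to toe-off
    swing = angle_cycle[toe_off_idx:]        # toe-off to the next heel strike
    return {
        "flexion_at_heel_strike": float(angle_cycle[0]),        # H1/K1-style value
        "max_extension_in_stance": float(stance.min()),         # H3/K3-style value
                                                                 # (min flexion under a
                                                                 # flexion-positive convention)
        "flexion_at_toe_off": float(angle_cycle[toe_off_idx]),  # H4/K4-style value
        "max_flexion_in_swing": float(swing.max()),             # H5/K5-style value
        "total_sagittal_excursion": float(angle_cycle.max() - angle_cycle.min()),  # H6/K6
    }

# Usage with synthetic data: a crude, purely illustrative hip-like flexion curve (degrees).
t = np.linspace(0, 1, 101)
hip_angle = 30 * np.cos(2 * np.pi * t) + 5
print(sagittal_parameters(hip_angle, toe_off_idx=60))
```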
