Article

New Lower-Limb Gait Asymmetry Indices Based on a Depth Camera

Edouard Auvinet, Franck Multon and Jean Meunier

1 University Rennes 2, ENS Rennes, Campus de Ker lann, Avenue Robert Schuman, Bruz 35170, France
2 Université de Montréal, C.P. 6128, succ. Centre-ville, Montréal H3C 3J7, QC, Canada
3 Inria, Campus Universitaire de Beaulieu, Rennes 35052, France
* Author to whom correspondence should be addressed.
Sensors 2015, 15(3), 4605-4623; https://doi.org/10.3390/s150304605
Submission received: 23 October 2014 / Revised: 13 January 2015 / Accepted: 9 February 2015 / Published: 24 February 2015
(This article belongs to the Collection Sensors for Globalized Healthy Living and Wellbeing)

Abstract

Background: Various asymmetry indices have been proposed to compare the spatiotemporal, kinematic and kinetic parameters of lower limbs during the gait cycle. However, these indices rely on gait measurement systems that are costly and generally require manual examination, calibration procedures and the precise placement of sensors/markers on the body of the patient. Methods: To overcome these issues, this paper proposes a new asymmetry index, which uses an inexpensive, easy-to-use and markerless depth camera (Microsoft Kinect™) output. This asymmetry index directly uses depth images provided by the Kinect™ without requiring joint localization. It is based on the longitudinal spatial difference between lower-limb movements during the gait cycle. To evaluate the relevance of this index, fifteen healthy subjects were tested on a treadmill walking normally and then via an artificially-induced gait asymmetry with a thick sole placed under one shoe. The gait movement was simultaneously recorded using a Kinect™ placed in front of the subject and a motion capture system. Results: The proposed longitudinal index distinguished asymmetrical gait (p < 0.001), while other symmetry indices based on spatiotemporal gait parameters failed using such Kinect™ skeleton measurements. Moreover, the correlation coefficient between this index measured by Kinect™ and the ground truth of this index measured by motion capture is 0.968. Conclusion: This gait asymmetry index measured with a Kinect™ is low cost, easy to use and is a promising development for clinical gait analysis.

1. Introduction

In pathological gait, significant differences in kinematic and kinetic parameters can be observed between the left and right human lower extremities [1]. Gait asymmetry can consequently be used to identify pathology and track recovery. To do so, various asymmetry indices and ratios have been developed [2]. These measurements could provide insight into the efficiency of the treatment of people with asymmetric gait, whatever the origin, e.g., cerebral palsy [3], stroke [4], hip arthritis [5] and leg length discrepancy [6].
As reported in [4], most of these indices rely on a simple comparison between left and right variables, such as step length, step duration, joint angles or ground reaction force. However, they provide only a single value for the whole gait cycle and ignore how asymmetry may evolve within the cycle. Measuring this variation within the gait cycle would also make it possible to identify when the maximum asymmetry occurs.
Moreover, most of the current indices rely on data provided by sensors or markers placed on the body, such as optical motion capture [3] or inertial systems [7,8]. These methods require expertise for marker/sensor placement, calibration and manual editing of the data, which involves recruiting trained staff and takes time for measurement preparation and analysis. Gaitrite [4] or a force plate can also be used to measure spatiotemporal parameters in an almost automatic manner. However, they fail to give joint coordination information at each instant of the gait cycle and could underestimate asymmetry due to compensation strategies along the kinematic chain [9]. Corazza et al. [10] proposed a method using 16 video cameras to reconstruct the volume of the subject in order to compute joint orientations, but this method still requires a calibration stage. To sum up, an index that could estimate the evolution of gait asymmetry during the gait cycle, without markers or sensors on the patient's body, without calibration and without manual editing, does not exist yet.
Depth cameras have recently reached the consumer market with the widespread adoption of the Microsoft Kinect™ sensor. It has rapidly been used to measure 3D human motion, as it provides a low-cost, markerless and calibration-free system [11]. Paolini et al. [12] proposed a method that uses the Kinect™ to measure foot position on a treadmill, but a marker on each foot is still required. Two methods have been introduced to measure spatiotemporal gait parameters (mainly step length and duration) without markers using a Kinect™. Stone and Skubic [13] computed the subject's centroid in depth images to deduce the step length and used the lower part of the legs to detect gait cycles. Clark et al. [11] computed gait parameters thanks to a 3D skeleton fitted to the depth maps [14]. However, for both methods, inaccuracies can appear because of possible confusion between the feet and the ground during the contact phase, due to the noisy Kinect™ output. This leads to inaccurate computation of the spatiotemporal gait parameters, which are key information in any asymmetry assessment. Moreover, neither of them addressed gait asymmetry measurements, especially within the step. The use of this skeleton for gait analysis on a treadmill has been evaluated in [15], where hip and knee angle measurements were compared with a reference motion capture system. The hip angle correlation was lower than 0.3, and the knee angle correlation was mostly below 0.8, which could lead to erroneous gait parameter estimation. According to [15], the skeleton information would need improved sensitivity before being clinically usable. Moreover, the skeleton-fitting algorithm was trained with healthy subjects and could fail to deliver accurate information in the case of impaired subjects.
To overcome these limitations, this paper proposes a new index to assess gait asymmetry within the step, using depth images with no skeleton fitting. This asymmetry index is based on spatial differences between the lower-limb positions at each instant of the step. It uses information recorded by a Kinect™ depth camera placed in front of the subject walking on a treadmill, as shown in Figure 1a.
Figure 1. (a) Representation of the experimental set-up featuring the depth camera (1), the recording computer (2) and the treadmill (3); (b) representation of the orthographic projection operation and the bounding box in red, chosen to isolate the points belonging to the subject; (c) the orthographic depth image showing the parts of the body defining the limits of the regions of interest used for the registration operation and asymmetry indices’ computation.

2. Method

The key concept of the proposed asymmetry index is to compare the spatial position of the left and the right legs at comparable times within their respective step cycle (i.e., at the same percentage of time in each step cycle).
Kinect™ sensors are based on depth images that could be processed by the Shotton algorithm [14] to segment body parts and estimate joint centers. We assume that this post-processing may lead to less accurate results in gait asymmetry measurements compared to the use of raw depth images. A mean step cycle model is introduced in this paper in order to obtain a more representative and reliable sequence of depth images before asymmetry analysis. The overall process is depicted in Figure 2. The details are given in the following sections.

2.1. Raw Depth Image Preprocessing

Four stages are needed to process the raw depth images provided by the depth camera before computing the asymmetry index, as shown in Figure 2. Let us consider these stages in detail.
Figure 2. Representation of the method for measuring asymmetry indices of lower limbs during the step cycle. In the red box are the depth image pretreatment operations. In the green box are the mean step cycle model construction steps. Finally, in the blue box are the different operations to construct the asymmetry indices.

2.1.1. Transformation from Raw Depth Images to 3D Points

The depth camera is considered as a pinhole camera with its optical axis intersecting the center of the image plane and the focal length f = 575.82 px given by the manufacturer. The horizontal (X), vertical (Y) and depth (Z) coordinates of the 3D points are then reconstructed according to the following equations, using the depth information z returned for each pixel (x, y), where w and h are the width and height of the depth image (in pixels), respectively.

X = (x − w/2) · z / f,  Y = (y − h/2) · z / f,  Z = z    (1)
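As an illustration, the following Python sketch applies Equation (1) to a full depth image; the function name and array layout (depth in metres, shape (h, w)) are assumptions made for illustration, not part of the original processing pipeline.

```python
# Sketch of Equation (1): back-projection of every depth pixel to a 3D point.
import numpy as np

F_PX = 575.82  # focal length f in pixels (manufacturer value quoted above)

def depth_to_points(depth: np.ndarray) -> np.ndarray:
    """Convert an (h, w) depth image in metres into an (h*w, 3) array of (X, Y, Z)."""
    h, w = depth.shape
    x, y = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    X = (x - w / 2.0) * depth / F_PX
    Y = (y - h / 2.0) * depth / F_PX
    Z = depth
    return np.stack([X, Y, Z], axis=-1).reshape(-1, 3)
```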

2.1.2. Clipping

Because the subject is above the treadmill, only points located in the 2.5 m × 3 m × 1.5 m (height × depth × width) bounding box above the treadmill belt surface (depicted in Figure 1b) are associated with the subject and kept for further processing.
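A minimal sketch of this clipping stage is given below; the axis conventions (Y measured upward from the belt surface, Z along the optical axis, X centred on the belt) are assumptions made for illustration.

```python
# Keep only the 3D points inside the 2.5 m x 3 m x 1.5 m (height x depth x width)
# box above the treadmill belt.
import numpy as np

def clip_to_box(points: np.ndarray, height=2.5, depth=3.0, width=1.5) -> np.ndarray:
    """points: (N, 3) array of (X lateral, Y vertical, Z depth) in metres."""
    keep = (
        (np.abs(points[:, 0]) <= width / 2.0)               # lateral extent
        & (points[:, 1] >= 0.0) & (points[:, 1] <= height)  # above the belt surface
        & (points[:, 2] >= 0.0) & (points[:, 2] <= depth)   # in front of the camera
    )
    return points[keep]
```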

2.1.3. Orthographic Projection

The depth information returned by the Kinect™ is provided in a projective representation. As shown in Equation (1), the lateral and vertical position of a point in this projective representation depends on its depth. Hence, two points in space with identical heights but different depths will have different vertical positions in the projective image. Without projection correction, the same ankle joint could be displayed at two different heights, depending on its depth, with a difference of up to 60 pixels. To ensure that left and right body parts are correctly compared in the following method, the 3D surfaces are orthographically projected on a plane perpendicular to the optical axis of the depth camera, as illustrated in Figure 1b. The spatial resolution of this orthographic image was set to 0.5 cm·px−1, which is close to the resolution of the depth camera (Kinect™) when the subject is 2 m away from the sensor. The depth value of each pixel was computed using inverse distance interpolation, as described by Shepard [16]. As an output, this preprocessing stage transforms each raw projective view into an orthographic projection of the 3D surface of the subject (depth image (DI)).
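The following sketch illustrates the resampling idea under simplifying assumptions: each clipped 3D point is binned to its nearest pixel of the 0.5 cm·px−1 orthographic grid and depths are averaged, whereas the paper uses Shepard's inverse distance interpolation [16].

```python
# Simplified orthographic projection: resample the clipped 3D points onto a regular
# 0.5 cm/px grid perpendicular to the optical axis.
import numpy as np

RES = 0.005  # 0.5 cm per pixel

def orthographic_depth_image(points: np.ndarray, width_m=1.5, height_m=2.5) -> np.ndarray:
    """points: (N, 3) (X lateral, Y vertical, Z depth) in metres.
    Returns a (rows, cols) depth image; pixels with no point are NaN."""
    rows, cols = int(height_m / RES), int(width_m / RES)
    depth_sum = np.zeros((rows, cols))
    count = np.zeros((rows, cols))
    u = np.clip(((points[:, 0] + width_m / 2.0) / RES).astype(int), 0, cols - 1)
    v = np.clip((points[:, 1] / RES).astype(int), 0, rows - 1)
    np.add.at(depth_sum, (v, u), points[:, 2])  # accumulate depths per pixel
    np.add.at(count, (v, u), 1)
    return np.where(count > 0, depth_sum / np.maximum(count, 1), np.nan)
```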

2.1.4. Lower-Limb Region of Interest and Pelvis Zone

The asymmetry index presented in this paper focuses on lower-limb movements. This lower-limb region is called the leg zone. The leg zone is defined as the volume between the lower leg zone limit at 0.22 H and the upper leg zone limit at 0.47 H, where H is the height of the subject. These parameter values were estimated a posteriori with our dataset in order to maximize the correlation between the Kinect™ measurements and the ground truth obtained with the reference motion capture system. In the same way, to ensure a correct comparison of the left and right legs at various times in the gait cycles, we need to displace the pelvis location (origin of the kinematic chain) to a unique position, i.e., the same distance to the camera. To this end, the pelvis zone is defined by a 20-cm square centered on the pelvis, the position of which is estimated at 0.61 H, according to anthropometrical tables [17].
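The two zones can be expressed as row ranges of the orthographic image, as sketched below; the conversion assumes the 0.5 cm·px−1 resolution defined above and rows counted upward from the treadmill belt. The fractions of the subject's height H are those given in the text.

```python
RES = 0.005  # m per pixel

def leg_and_pelvis_rows(H: float):
    """Return (low, high) rows of the leg zone and (low, high) rows of the pelvis zone."""
    leg_low = int(0.22 * H / RES)
    leg_high = int(0.47 * H / RES)
    pelvis_centre = int(0.61 * H / RES)
    half = int(0.10 / RES)  # 20-cm square -> +/- 10 cm around the pelvis centre
    return (leg_low, leg_high), (pelvis_centre - half, pelvis_centre + half)
```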

2.2. Mean Step Cycle Model Computation

The Kinect™ measurements have a random error [18] with a standard deviation up to 1 cm when the subject is 2.5 m from the sensor. This error could result in a decreased accuracy of the asymmetry estimation. The mean step cycle model (MSCM) aims at helping to decrease the impact of the sensor’s noise on the computation of the index. It consists of computing a sequence of mean depth images (MDI) representing the most representative (averaged) gait cycle for each subject. This is performed in three main stages.

2.2.1. Step Cycle Detection

Firstly, it is necessary to determine the beginning and end moment of each step cycle in order to extract a set of step cycles. Traditionally, step cycles are defined as two successive heel strikes of the right and left feet. Unfortunately, heel positions are difficult to track in a depth image, because the points of the feet could be confused with the treadmill belt. We therefore used the cyclic longitudinal distance between the knees to estimate heel strike and identify steps, as described in [19].
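A hedged sketch of this idea is shown below: heel strikes are approximated by the extrema of the longitudinal distance between the two knees, in the spirit of [19]. The peak-detection settings and which extremum corresponds to which side depend on the camera convention and are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def detect_steps(knee_depth_left: np.ndarray, knee_depth_right: np.ndarray,
                 fps: float = 30.0, min_step_s: float = 0.3):
    """Return two arrays of frame indices of estimated heel strikes, one per step side."""
    d = knee_depth_right - knee_depth_left      # signed longitudinal knee separation
    min_frames = max(1, int(min_step_s * fps))  # enforce a minimal step duration
    strikes_a, _ = find_peaks(d, distance=min_frames)   # one side furthest forward
    strikes_b, _ = find_peaks(-d, distance=min_frames)  # the other side furthest forward
    return strikes_a, strikes_b
```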

2.2.2. Assignment of Frames to Mean Depth Images

Once these events are identified, each step is divided into ten 10%-long intervals. The MSCM is also cut into 10 intervals. The i-th MDI of the MSCM is the average image of the DI belonging to the i-th interval of all of the recorded steps. Now, consider the details of this averaging process.

2.2.3. Mean Depth Image Construction

The objective here is to compute a representative mean depth image MDI(s,i) for each step side s (s ∈ {R, L}) and for each interval i (i = 1,...,10) of the step, which corresponds to the average of the depth images DI(s,c,i) over all of the recorded steps c (c = 1,...,n). This averaging process requires a registration procedure in order to suppress the lateral and longitudinal movements of the subject on the treadmill, which are not relevant. To this end, we consider each first step DI(s,1,i), for each interval i and step side s, as a reference depth image.
Consequently, each DI(s,c,i) can be laterally registered by computing the lateral shift Δx(s,c,i) that maximizes the cross-correlation between DI(s,c,i) and the corresponding reference DI(s,1,i) leg of each step c. This process is only applied to the pixels belonging to the leg zones. Δx(s,c,i) is computed with Equation (2), where x and y are, respectively, the lateral and vertical coordinates in the depth image, and y lies between 0.22 H and 0.47 H.

Δx(s,c,i) = argmax_Δx Σ_x Σ_{y=0.22H}^{0.47H} DI(s,c,i)(x + Δx, y) · DI(s,1,i)(x, y)    (2)
The longitudinal registration is done by subtracting the mean depth of the pelvic zone from all of the DI(s,c,i).
The MDI for each step side s and each interval i is computed using Equation (3):

MDI(s,i)(x, y) = (1/n) Σ_{c=1}^{n} [ DI(s,c,i)(x + Δx(s,c,i), y) − Δz(s,c,i) ]    (3)

where Δz(s,c,i) is the mean depth of the pelvis for step side s, step c and interval i. The final MSCM is consequently composed of 10 MDI for the right step and 10 MDI for the left step, denoted respectively MDI(R,i) and MDI(L,i), i = 1,…,10. Finally, in order to allow the horizontal flip, the subject is laterally centered. This is done in each MDI by computing the center of the subject and applying a lateral translation to align the center of the subject with the center of the image.
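The following sketch illustrates Equations (2) and (3) under simplifying assumptions (circular shifts instead of padded shifts, NaN for empty pixels); function and variable names are illustrative.

```python
# Lateral registration by maximising a cross-correlation score, longitudinal
# registration by subtracting the mean pelvis depth, then averaging into one MDI.
import numpy as np

def lateral_shift(di: np.ndarray, ref: np.ndarray, max_shift: int = 40) -> int:
    """Shift (in pixels) of `di` that best aligns it with `ref` over the leg-zone rows."""
    a, b = np.nan_to_num(di), np.nan_to_num(ref)  # treat empty pixels as zero
    scores = [np.sum(np.roll(a, s, axis=1) * b) for s in range(-max_shift, max_shift + 1)]
    return int(np.argmax(scores)) - max_shift

def mean_depth_image(depth_images, pelvis_depths, ref) -> np.ndarray:
    """Average the registered depth images DI(s, c, i) of one side and interval.
    pelvis_depths: mean pelvis depth of each image, used for longitudinal registration."""
    registered = []
    for di, z in zip(depth_images, pelvis_depths):
        shifted = np.roll(di, lateral_shift(di, ref), axis=1)  # lateral registration, Eq. (2)
        registered.append(shifted - z)                         # subtract pelvis depth
    return np.nanmean(np.stack(registered), axis=0)            # averaging, Eq. (3)
```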

2.3. Asymmetry Index Based on the Mean Gait Cycle Model

The goal of this phase is to compute the index that quantifies the longitudinal difference between the left and the right leg movements within the step. To this end, we use the MSCM as the representative step of the subject. The analysis of the MSCM begins with the selection of corresponding intervals i (i = 1,…,10) in the right and left steps. If the gait is symmetric, we expect the leg positions of the right step in MDI(R,i) to be similar to those of the left step in MDI(L,i) after being horizontally flipped (denoted fMDI(L,i)), for each interval i. In the following, the right step is used as a reference, and the left step is compared to this reference. In each MDI(R,i) and fMDI(L,i), the right and left legs are identified with a two-class K-means algorithm, as described in Auvinet et al. [19]. We denote by leftleg(·) (resp. rightleg(·)) the function that returns the points belonging to the left leg (resp. right leg) of the depth image provided as input.
Let us recall that the resulting index aims at assessing gait asymmetry by computing the spatial difference between the two legs at a comparable time, at each interval i of the step. It consists of computing the positional difference between the corresponding legs in MDI(R,i) and the corresponding fMDI(L,i), as shown in Figure 3. These differences are computed for each height y of the leg zone (i.e., y between 0.22 H and 0.47 H) and at each interval i of the step (i.e., i = 1,...,10). We propose to compute this spatial difference along the longitudinal axis to provide the user with information concerning longitudinal asymmetry. This therefore leads to the Ilong (longitudinal asymmetry index). Let us consider now the computational details for this index.
Before computing the longitudinal difference between corresponding legs, it is necessary to displace each leg of fMDI(L,i) laterally in order to align it with the corresponding leg of MDI(R,i). This lateral difference between the right leg (resp. left leg) of the reference step from MDI(R,i) and the corresponding leg of the compared step from fMDI(L,i) is denoted ΔRlat(i) (resp. ΔLlat(i)). These lateral differences are computed, at each height y of the leg zone and interval i in the step, in order to maximize the correlation between corresponding legs:

ΔRlat(i)(y) = argmax_Δx Σ_x rightleg(MDI(R,i)(x, y)) · rightleg(fMDI(L,i)(x + Δx, y))

ΔLlat(i)(y) = argmax_Δx Σ_x leftleg(MDI(R,i)(x, y)) · leftleg(fMDI(L,i)(x + Δx, y))

Once corresponding legs are laterally aligned, the longitudinal index is obtained by computing the mean longitudinal difference between each leg of MDI(R,i) and the corresponding aligned leg of fMDI(L,i). This longitudinal difference between the right leg (resp. left leg) of the reference step from MDI(R,i) and the corresponding aligned leg of the compared step from fMDI(L,i) is denoted ΔRlong(i) (resp. ΔLlong(i)). These longitudinal differences are computed at each height y of the leg zone and step interval i with Equations (8) and (9):

ΔRlong(i)(y) = mean_x [ rightleg(MDI(R,i)(x, y)) − rightleg(fMDI(L,i)(x + ΔRlat(i)(y), y)) ]    (8)

ΔLlong(i)(y) = mean_x [ leftleg(MDI(R,i)(x, y)) − leftleg(fMDI(L,i)(x + ΔLlat(i)(y), y)) ]    (9)
The longitudinal asymmetry index Ilong(i) is given by the average value of these two distances at each interval i:

Ilong(i) = mean_y [ ΔRlong(i)(y) + ΔLlong(i)(y) ]    (10)

From this index at each interval i, it is also possible to compute a unique mean value for the whole step:

Ilong = mean_{i=1,…,10} Ilong(i)    (11)
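A simplified sketch of this computation is given below; the boolean leg masks (obtained in practice with the two-class K-means segmentation of [19]), the lateral search window and the NaN handling are illustrative assumptions.

```python
# For every leg-zone row y, laterally align the corresponding legs of MDI(R,i) and of
# the flipped fMDI(L,i), then average their remaining depth (longitudinal) difference.
import numpy as np

def row_lat_shift(ref_row: np.ndarray, cmp_row: np.ndarray, max_shift: int = 30) -> int:
    """Lateral shift maximising the correlation between two single-row depth profiles."""
    a, b = np.nan_to_num(ref_row), np.nan_to_num(cmp_row)
    scores = [np.sum(a * np.roll(b, s)) for s in range(-max_shift, max_shift + 1)]
    return int(np.argmax(scores)) - max_shift

def longitudinal_index(mdi_r: np.ndarray, fmdi_l: np.ndarray, masks_r: dict, masks_l: dict) -> float:
    """Ilong(i) for one interval. masks_r / masks_l: {'right': bool array, 'left': bool array}
    giving the leg segmentation of MDI(R,i) and of the flipped fMDI(L,i), respectively."""
    diffs = []
    for side in ("right", "left"):
        for y in range(mdi_r.shape[0]):
            ref = np.where(masks_r[side][y], mdi_r[y], np.nan)
            cmp_ = np.where(masks_l[side][y], fmdi_l[y], np.nan)
            if np.isnan(ref).all() or np.isnan(cmp_).all():
                continue
            s = row_lat_shift(ref, cmp_)   # lateral alignment of the two legs
            d = ref - np.roll(cmp_, s)     # per-pixel longitudinal difference
            if not np.isnan(d).all():
                diffs.append(np.nanmean(d))  # Equations (8) and (9)
    return float(np.mean(diffs)) if diffs else float("nan")  # Equation (10)
```

Averaging longitudinal_index over the 10 pairs of intervals then gives the whole-step value Ilong of Equation (11).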
Figure 3. (a) Position of the horizontal lines (y) selected in the legs from the right step MDI(i) and the corresponding left step fMDI(i), for lateral and longitudinal asymmetry evaluation; (b) representation of these horizontal lines in the transverse plane. Each leg depth curve is plotted in red and blue for, respectively, the right and left legs of the right step MDI(i), and in cyan and magenta for, respectively, the right and left legs of the left step fMDI(i); (c) lateral differences ΔRlat(i)(y) and ΔLlat(i)(y) are calculated with a lateral inter-correlation of corresponding leg depth curves; (d) once the legs of the left step are laterally registered on the legs of the right step, the depth differences ΔRlong(i)(y) and ΔLlong(i)(y) are computed by averaging the depth difference between depth curves.

2.4. Asymmetry Index Based on Skeleton Information

As previously mentioned, we assumed in this work that skeleton information based on Shotton's method [14] would not be reliable enough to obtain an accurate asymmetry index. To evaluate this assumption, we applied the method described above with the MSCM to the skeleton information provided by the Kinect™ software. We also applied this method to the skeleton information from the reference motion capture system in order to have a ground truth. The computation follows the same scheme as previously presented. Firstly, steps are detected with the distance between the knees at a constant estimated knee height, as described in [19]. Each pose is then associated with one of the 10%-long intervals i of the step. Secondly, the skeleton in each pose is laterally and longitudinally registered thanks to the root joint position, as described above for depth images. Thirdly, poses belonging to the same 10%-long step interval i are averaged in order to create a skeleton-based mean gait cycle model. Fourthly, the spatial leg positions of poses from corresponding 10%-long intervals are compared using the same method as previously described for depth images. The pose of the left step is horizontally flipped, and the longitudinal differences ΔRlong(i)(y) and ΔLlong(i)(y) are evaluated at each height y of the leg zone, as shown in Figure 4.
Figure 4. Localization of the longitudinal difference between corresponding legs when using skeleton information.
In this comparison, legs are considered as lines joining two successive joint centers. Linear interpolation between the hip, knee and ankle joint locations is used to compute intermediate points along the segments. Finally, the longitudinal asymmetry indices Ilong(i) for each 10%-long interval i are computed using Equation (10), and the mean longitudinal asymmetry value for the step Ilong is computed using Equation (11). The performance of this index is then compared to the one obtained with depth images to evaluate the relevance of our assumption.
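As an illustration, a leg of the skeleton can be densified by linear interpolation between its joint centers, as in the sketch below; the (y, z) joint representation and the sampling step are illustrative assumptions.

```python
# Each leg is represented by the hip-knee and knee-ankle segments; intermediate
# points are interpolated so a longitudinal difference can be read at any height y.
import numpy as np

def leg_depth_profile(hip, knee, ankle, n_per_segment: int = 20) -> np.ndarray:
    """hip, knee, ankle: (y, z) coordinates (height, depth) of the joint centres.
    Returns an (N, 2) array of interpolated (y, z) points along the leg."""
    hip, knee, ankle = (np.asarray(j, dtype=float) for j in (hip, knee, ankle))
    upper = np.linspace(hip, knee, n_per_segment, endpoint=False)  # hip -> knee
    lower = np.linspace(knee, ankle, n_per_segment)                # knee -> ankle
    return np.vstack([upper, lower])
```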

2.5. Experimentation

We carried out experiments to evaluate the following hypotheses: (1) the longitudinal asymmetry index based on depth images should be reliable enough to quantify asymmetrical gait; and (2) the longitudinal asymmetry index computed thanks to the skeleton information provided by the Kinect™ would lead to significantly less accurate results. Fifteen healthy subjects (12 males, 3 females, 25.3 ± 3.6 years old, 172 ± 7 cm height and 69.4 ± 11.6 kg mass) with no reported clinical asymmetry or gait impairment participated in this study. The institutional ethical review board approved the study.
Each subject walked on a treadmill (LifeFitness F3). The depth images were recorded with a Kinect™ (Microsoft, Redmond, WA, USA) at 30 frames per second with a resolution of 640 × 480 pixels. The Kinect™ was placed 2 m in front of the subject, as shown in Figure 1a. The optical axis of the camera was parallel to the motion direction of the treadmill belt.
To evaluate the accuracy of the longitudinal asymmetry index computed with the skeleton information, we also captured the motion of the subject with an accurate motion capture system (Vicon, Oxford, UK) consisting of 12 infrared cameras capturing motion at 120 Hz. Surface reflective markers were placed over standardized anatomical landmarks, as suggested by the International Society of Biomechanics [20], enabling us to accurately evaluate joint centers, as described in [21]. We used the full body marker set as suggested in [19]. The resulting joint centers could be used to compute the longitudinal asymmetry index as described in Section 2.4, but with reliable and accurate input data.
For each subject, a comfortable speed was selected according to the standard procedure from Holt et al. [22]: the subjects selected their preferred walking speed as the treadmill speed was systematically increased. This comfortable speed was then used for all trials. The mean comfortable speed was 1.03 ± 0.2 m·s−1.
Each subject performed three trials: one normal gait and two asymmetrical gaits artificially generated by alternately adding a 5 cm-thick sole under the left and then the right foot, as classically suggested in previous works [6,8]. The choice of a 5.0-cm sole was inspired by Bhave et al. [23], who indicated that a minimum leg length discrepancy of 4.9 cm was necessary to generate externally visible effects. The duration of each trial was set at 6 min, with a 2-min adaptation period followed by a 4-min testing period, in order to record a large number of steps. The analysis was automatically carried out on the last 120 gait cycles using the proposed method.
Spatiotemporal gait parameters, such as step length and duration, were measured following the method of Clark et al. [11] with both the Kinect™ skeleton data and the motion capture skeleton data on the last 120 gait cycles. Step duration and length were used to compute the reference gait symmetry ratios (SRlength and SRduration), as suggested by Patterson et al. [4]. These reference values from the motion capture system were used to confirm the existence of an induced asymmetry, to evaluate the relevance of the index introduced in this paper and to evaluate SRlength and SRduration computed with the Kinect™ skeleton.
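A minimal sketch of such a symmetry ratio is given below; which side is placed in the numerator is a convention (Table 1 reports left/right ratios), and the function name is illustrative.

```python
# Step symmetry ratio in the spirit of Patterson et al. [4]: the ratio of the mean
# left to the mean right step parameter (length or duration). A value of 1 indicates
# perfect symmetry.
import numpy as np

def symmetry_ratio(left_values, right_values) -> float:
    """left_values / right_values: per-step lengths (m) or durations (s) of each side."""
    return float(np.mean(left_values) / np.mean(right_values))

# sr_length = symmetry_ratio(left_step_lengths, right_step_lengths)
# sr_duration = symmetry_ratio(left_step_durations, right_step_durations)
```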

2.6. Statistical Analysis

A statistical study was conducted to assess the sensitivity of the methods to detect the artificially-introduced asymmetry. The statistical analysis designed by Gouwanda and Senanayake [8] was used because of protocol similarities. One-way ANOVA tests were conducted to examine the significant differences between normal and asymmetric gaits for each index. This one-way ANOVA was applied to each index value for the normal, left-deformed and right-deformed gaits of all of the subjects. The alpha level was set at 0.01. Tukey-HSD multiple comparison tests were then carried out to identify the differences between the experiments when the null hypothesis was rejected. This test evaluates whether the studied index is able to distinguish the left (respectively right) deformed gait from the normal gait.
This statistical test was conducted on the values of Ilong(i) obtained with the three trial conditions (normal, left deformation, right deformation) at each step interval i, in order to evaluate the sensitivity of this index to distinguish the left (respectively right) deformed gait from the normal gait at each step interval.
Moreover, this statistical test was also conducted independently on the values of SRlength, Ilong and SRduration obtained with the three trial conditions, in order to evaluate the sensitivity of these whole-step indices to distinguish the left (respectively right) deformed gait from the normal gait.
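A hedged sketch of this analysis with scipy and statsmodels is given below; the per-condition arrays are placeholders for the index values (Ilong, SRlength or SRduration) of all subjects.

```python
# One-way ANOVA across the three walking conditions, followed by Tukey-HSD post-hoc
# comparisons when the null hypothesis is rejected.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def compare_conditions(normal, left_def, right_def, alpha: float = 0.01):
    """normal, left_def, right_def: 1-D arrays of index values, one value per subject."""
    f_stat, p_value = f_oneway(normal, left_def, right_def)
    print(f"One-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
    if p_value < alpha:  # null hypothesis rejected -> post-hoc pairwise comparisons
        values = np.concatenate([normal, left_def, right_def])
        groups = (["normal"] * len(normal) + ["left deformed"] * len(left_def)
                  + ["right deformed"] * len(right_def))
        print(pairwise_tukeyhsd(values, groups, alpha=alpha))
```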

3. Results

The data analysis process worked correctly for all subjects using both our method and the motion capture method. However, one subject was excluded from the results computed with the Kinect™ skeleton data: for one of this subject's trials, the results were unreliable and an outlier compared with the other subjects.

3.1. SRduration and SRlength

SRlength and SRduration computed with Kinect™ skeleton data and the motion capture skeleton are detailed in Table 1. With the Kinect™ skeleton information, no statistical difference exists between these indices for gaits with and without a sole under a foot. However, with the motion capture skeleton data, both SRlength and SRduration are statistically different (p < 0.001) between gait with a sole under one foot and normal gait. These results show that the skeleton information from Kinect™ is not accurate enough to allow SRlength and SRduration to identify potential adaptations in gait cycles for such a perturbation.
Table 1. Results of the symmetry ratios for step duration and for step length (SRlength and SRduration), computed with the spatio-temporal step information, as suggested by Patterson [4]. The results are reported with the mean value and the standard deviation. Significant statistical results are written in bold and marked with * when p < 0.001. SRlength and SRduration values close to one indicate perfect symmetry of the gait.
Index                             Left Deformed Gait    Normal Gait    Right Deformed Gait

Step length symmetry ratio (SRlength) with Kinect™ skeleton
  Left/right SR                   0.93 ± 0.11           1.01 ± 0.10    1.10 ± 0.18
  Statistical difference          p = 0.12                             p = 0.34

Step duration symmetry ratio (SRduration) with Kinect™ skeleton
  Left/right SR                   0.99 ± 0.15           1.07 ± 0.18    1.18 ± 0.14
  Statistical difference          p = 0.17                             p = 0.49

Step length symmetry ratio (SRlength) with motion capture skeleton
  Left/right SR                   0.87 ± 0.04           1.00 ± 0.04    1.15 ± 0.05
  Statistical difference          p < 0.001 *                          p < 0.001 *

Step duration symmetry ratio (SRduration) with motion capture skeleton
  Left/right SR                   0.93 ± 0.04           1.00 ± 0.02    1.06 ± 0.04
  Statistical difference          p < 0.001 *                          p < 0.001 *

3.2. Asymmetry Index Based on a Skeleton Estimated with a Kinect™

The results for Ilong(i) computed at 10 intervals (i = 1,…,10) of the step cycle when using the skeleton data delivered by the Kinect™ software are presented in Figure 5c. There is no statistical difference between the right-deformed gait and the normal gait from 50% to 70% of the step, and no statistical difference between the left-deformed gait and the normal gait at 50% and 80% of the step. There is a significant difference for Ilong (p < 0.001) between the left-deformed gait (resp. right-deformed) and the normal gait, as shown in Table 2. Comparison between the Ilong(i) computed with the Kinect™ skeleton and the Ilong(i) computed with accurate motion capture data leads to a correlation coefficient of 0.796. The mean value of Ilong was on average 22.5% lower than the reference Ilong computed from motion capture for the left and right deformations. This means that the asymmetry obtained with the Kinect™ skeleton tends to underestimate the asymmetry computed with more precise data. Moreover, the standard deviation of the Ilong computed with this method was on average 44% higher than the reference Ilong from motion capture data for both the normal and deformed gaits.
Table 2. Total asymmetry index results for the average longitudinal asymmetry index (Ilong) computed with different methods for normal, left deformed and right deformed gait. The results are reported with the mean value and the standard deviation. Significant statistical results are written in bold and marked with * when p < 0.001. An Ilong value close to zero indicates perfect symmetry of the gait.
Index                             Left Deformed Gait    Normal Gait    Right Deformed Gait

Average longitudinal asymmetry index (Ilong) with the proposed method
  Asymmetry index                 −50 ± 20 mm           3 ± 15 mm      53 ± 29 mm
  Statistical difference          p < 0.001 *                          p < 0.001 *

Average longitudinal asymmetry index (Ilong) with motion capture data
  Asymmetry index                 −57 ± 20 mm           8 ± 17 mm      56 ± 28 mm
  Statistical difference          p < 0.001 *                          p < 0.001 *

Average longitudinal asymmetry index (Ilong) with Kinect™ skeleton data
  Asymmetry index                 −38 ± 30 mm           −3 ± 25 mm     48 ± 38 mm
  Statistical difference          p < 0.001 *                          p < 0.001 *

3.3. Asymmetry Index Computed with the Proposed Method Based on Raw Depth Images

The results for Ilong(i) computed at 10 intervals (i = 1,…,10) of the step cycle based on depth images are presented in Figure 5a. At each interval, there is a significant difference (p < 0.001) between the left-deformed gait (resp. right-deformed) and the normal gait. There is a significant difference for Ilong (p < 0.001) between the left-deformed gait (resp. right-deformed) and the normal gait, as shown in Table 2. Comparing the Ilong(i) computed from the depth images with the proposed method against the Ilong(i) computed with accurate motion capture data yields a correlation coefficient of 0.968. The mean value of Ilong was on average 9.0% lower than the reference Ilong computed from motion capture for the left and right deformations. The standard deviations of the Ilong computed with this method were on average the same (less than 2.5% of variation) as the reference Ilong from the motion capture data for both the normal and deformed gaits.
Figure 5. Representation of the mean Ilong(i) computed for each group (normal, left deformed, right deformed) from raw depth data with our method (a), from the skeleton computed with the motion capture data (b) and from the skeleton computed with the Kinect™ data (c). The colored zones define standard deviation intervals. An Ilong value close to zero indicates perfect symmetry of the gait.
These results tend to validate Hypothesis 1: Ilong was able to distinguish asymmetrical gaits from symmetrical ones with a high statistical significance. Furthermore, Ilong(i) computed with our method has a better correlation with reference data, a lower underestimation of the asymmetry and a lower standard deviation of the results than Ilong(i) computed with the Kinect™ skeleton. Moreover, Ilong(i) computed with our method is always statistically different for deformed gait versus normal gait, whereas Ilong(i) computed with the Kinect™ skeleton is not statistically different at several intervals of the step. Ilong(i) computed with depth images is then significantly more correlated and sensitive than the one computed with the Kinect™ skeleton, supporting Hypothesis 2.
Let us now consider the sensitivity of this asymmetry index to the parameters used in the method.

3.4. Sensitivity of the Leg Zone Estimation

A sensitivity study was carried out regarding the parameters used in the proposed method. These parameters are mainly the knee height, the lower leg zone limit and the upper leg zone limit estimations.
To this end, we tested various heights for the knee height estimation, ranging from 0.25 H to 0.29 H instead of 0.27 H, which is the parameter used to estimate gait events, as described in [19]. Whatever the height used, the gait event estimation remained unchanged, which shows the robustness of this parameter selection.
Figure 6. Correlation coefficient between the longitudinal asymmetry index computed from raw depth information with our method and the reference longitudinal asymmetry index computed from motion capture information. The upper leg zone parameter was varied from 0.3 H to 0.5 H, and the lower leg zone parameter was varied from 0.1 H to 0.25 H. The maximum correlation coefficient is 0.968, obtained for the upper leg zone at 0.47 H and the lower leg zone at 0.22 H.
In the same way, we sampled the upper-leg (resp. lower-leg) zone estimation from 0.30 H to 0.50 H (resp. 0.10 H to 0.25 H). The impact on the correlation coefficients between the resulting asymmetry index and the reference value from the motion capture is depicted in Figure 6. Results show that the correlation coefficient ranged from 0.75 to 0.968, with the maximum corresponding to the parameters finally selected for the method: 0.22 H for the lower leg zone and 0.47 H for the upper leg zone. The correlation coefficient was higher than 0.9 for 85% of the leg zone parameters tested, which shows the robustness of the method.
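A sketch of such a parameter sweep is given below; compute_ilong is a hypothetical placeholder standing for the full pipeline of Section 2, re-run with the given leg zone limits and returning one Ilong estimate per trial.

```python
# Recompute the index for a grid of lower/upper leg-zone limits and correlate each
# result with the motion-capture reference.
import numpy as np

def leg_zone_sweep(compute_ilong, reference,
                   lowers=np.arange(0.10, 0.26, 0.01),
                   uppers=np.arange(0.30, 0.51, 0.01)) -> np.ndarray:
    """reference: array of reference Ilong values (one per trial).
    Returns a (len(lowers), len(uppers)) grid of correlation coefficients."""
    grid = np.full((len(lowers), len(uppers)), np.nan)
    for i, lo in enumerate(lowers):
        for j, up in enumerate(uppers):
            estimates = compute_ilong(lower=lo, upper=up)   # one value per trial
            grid[i, j] = np.corrcoef(estimates, reference)[0, 1]
    return grid
```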

4. Discussion

The results show that Ilong is a very promising index to detect asymmetry with depth cameras, such as the Kinect™. This index revealed asymmetry globally for the whole step cycle, but also locally for each interval of the step cycle. When estimating spatiotemporal gait parameters with the method proposed in Clark et al. [11], we noticed that the feet were sometimes incorrectly located when in contact with the treadmill surface, which partly explains the poor results of the SRlength and SRduration indices. The indices introduced in this paper are consequently more robust to Kinect™ inaccuracies and errors compared to classical approaches.
The impact of the sole under one foot on SRduration and SRlength was as described by Gurney [6]: the step length and duration of the shortened leg, computed with motion capture data, decreased, as shown in Table 1.
We observed that the longitudinal asymmetry index (Ilong) changed over time during the step cycle, with a maximum value during the double support phase and a minimum during the swing phase. This is consistent with the results reported in the literature by Gouwanda and Senanayake [8]. The Ilong variability for normal walking (shaded green area in Figure 5a) can be partly explained by the slight natural asymmetry of healthy subjects [2,6]. In the future, we plan to better estimate the sensitivity of this index to subtler asymmetry by using different sole thicknesses under the feet and different modifications.
The method proposed in this paper relied on two main hypotheses, as stated in Section 2.5. Our approach was able to highlight significant differences between normal and deformed gaits, as did the classical symmetry ratios computed from spatio-temporal parameters obtained with motion capture, whereas the same ratios computed from the Kinect™ skeleton were not. This result supports the first assumption, showing that Ilong and Ilong(i) computed with depth images are promising gait asymmetry indices based on a simple Kinect™ placed in front of a treadmill. The results for Ilong(i) obtained with the Kinect™ skeleton are less satisfactory, supporting the second assumption. Hence, skeleton estimation inaccuracies lead to a less satisfactory asymmetry assessment for the same experimental set-up, which motivates the development of a method based on depth images instead of the Kinect™ skeleton. Furthermore, the correlation coefficient of 0.796 obtained in this study between gait asymmetry measurements using the Kinect™ skeleton and the motion capture system is close to the correlation coefficient between Kinect™ and motion capture measurements obtained in [15]. We principally noticed a problem with the segmentation of the feet: in these cases, parts of the treadmill belt were segmented as belonging to the body. The results could therefore be improved with a better foot segmentation algorithm. Furthermore, it is possible that the sole under the foot disturbed the algorithm. Finally, one of the issues of the Kinect™ skeleton method resides in the need for an open space in front of the subject. Most treadmills have a console in front, which causes the Shotton method [14] to fail to reconstruct the skeleton from Kinect™ data. Pfister et al. [15] tackled this issue by positioning the Kinect™ on the side of the treadmill, with 45 degrees between the optical axis and the progression axis of the treadmill. However, this could lead to issues for gait asymmetry assessment, as shown by the different correlation coefficients depending on the side measured in [15], and, as specified in [15], this angle could reduce the algorithm's accuracy. In contrast, the backward view of a treadmill is commonly free. To adapt the Shotton method [14] to this viewpoint, the learning process would have to be carried out again, whereas our method could be adapted directly by moving the Kinect™ from the front to the back of the treadmill. We plan to evaluate this new orientation in future work.
The asymmetry index is fully automatic to compute. However, the proposed method relies on two main parameters: (1) the number of intervals used to decompose the gait cycle (set to 10 in this paper); and (2) the number of strides used to compute the mean step cycle model (set to 120 cycles in this paper). We evaluated the sensitivity of these parameters by varying the number of intervals from five to 20 and the number of gait cycles from 60 to 180, and we did not notice significant changes in the results. We chose 120 gait cycles, which approximately corresponds to a 2-min walking test on the treadmill. Some patients may have difficulties in maintaining a stable walking pattern, and it would be interesting to evaluate to what extent the asymmetry index introduced in this paper could deal with a very small number of steps (down to a single step in extreme cases).

5. Conclusions

In this paper, we introduced two new gait asymmetry indices based on positional differences between the left and right legs during the gait cycle. They deliver asymmetry information in the two main locomotion planes: lateral and longitudinal. Contrary to many other asymmetry indices, they do not focus on a unique measuring point (compared to force plates or Gaitrite sensors) and can deliver values at different instants during the cycles (compared to SR and other ratios that deliver a unique value for the whole cycle). In this paper, we especially focused on another advantage: robustness when using inexpensive and noisy sensors, such as a Microsoft Kinect™. We have shown that the longitudinal index was able to detect asymmetry caused by a 5 cm sole placed under one shoe, whereas classical indices were not so accurate when using similar data.
This robustness to noisy Kinect™ data is the result of several factors. Firstly, these indices do not rely on a unique point, but on surfaces, which are less sensitive to noise. Secondly, they rely on a mean step cycle model that consists of averaging surfaces of several gait cycles to decrease the influence of noise. In the future, it will be important to evaluate the sensitivity of these indices to the number of cycles used in this model. Especially, one could expect to reach an acceptable accuracy even with one cycle, enabling the use of these indices for over-ground walking.
These indices have been tested on depth images, but it could be possible to adapt them to marker-based systems, such as an optoelectronic motion capture device. Using this standard type of measurement, it would be interesting to compare the performance of this index to previously published ones. Moreover, it would be necessary to analyze more carefully the sensitivity of these indices to other types of asymmetry, especially for actual patients with impaired gaits.
A treadmill was used in order to have a relatively constant and ideal distance between the sensor and the subject and to facilitate the acquisition of several gait cycles. However, treadmill walking differs from over-ground walking, although [24] showed that the two are sufficiently equivalent to allow clinical analysis. Nevertheless, it would be interesting to extend this work to a walking path, by using several Kinects along a walkway and by reducing the number of gait cycles used to estimate the MSCM.
This method focuses on the lower limb movement asymmetry because of its significant contribution to gait movement. It would be very interesting in future work to analyze the spatial difference for the upper leg and for the lower leg separately, as well as to adapt the method for the rest of the body in order to increase the information assessment of gait asymmetry.
Furthermore, the lateral displacement used to align the legs could be analyzed in order to create a lateral asymmetry index. A new protocol inducing asymmetry along the lateral axis could be designed to evaluate the potential of such an index, which would be relevant because lateral gait asymmetry occurs in some pathologies, such as the circumduction movement in hemiplegic walking.
Being able to obtain relevant asymmetry information even when using a simple Kinect™ sensor is a real advantage for clinical applications: the method is fully automatic, with no markers, no calibration and no manual editing. In addition to screening, it could enable clinicians to perform a follow-up of patients after surgery or treatment (such as joint replacement), or to measure recovery after a stroke. The resulting system is low cost and easy to use, and is a promising tool for a wide range of gait analysis applications.

Acknowledgment

The authors wish to thank the Fonds de recherche du Québec - Nature et technologies (FRQNT) (Québec) and the Association nationale de la recherche et de la technologie (ANRT) (France) for the Conventions Industrielles de Formation par la REcherche (CIFRE) 146/2009 funding.

Author Contributions

Edouard Auvinet, Franck Multon and Jean Meunier participated in the conception of and design of the experiments. Edouard Auvinet performed the experiments and analyzed the data. Edouard Auvinet, Franck Multon and Jean Meunier wrote the paper.

Abbreviation Table

Abbreviation    Type            Name
DI              2D image        Orthographic projection of the 3D surface of the subject (depth image)
MDI             2D image        Mean depth image
fMDI            2D image        Horizontally-flipped mean depth image
MSCM            Group of MDI    Mean step cycle model
n               Numeric         Number of step cycles
i               Numeric         Interval of the step
Δz              Numeric         Longitudinal global displacement
Δx              Numeric         Lateral global displacement
c               Numeric         Step cycle number
s               {R, L}          Step side
ΔRlat(i)        Vector          Lateral asymmetry at the right leg
ΔLlat(i)        Vector          Lateral asymmetry at the left leg
ΔRlong(i)       Vector          Longitudinal asymmetry at the right leg
ΔLlong(i)       Vector          Longitudinal asymmetry at the left leg
Ilong(i)        Numeric         Longitudinal asymmetry index value at the instant i of the step
Ilong           Numeric         Mean value of the longitudinal asymmetry index for the step
SRlength        Numeric         Right step length versus left step length ratio
SRduration      Numeric         Right step duration versus left step duration ratio

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Griffin, M.P.; Olney, S.J.; McBride, I.D. Role of symmetry in gait performance of stroke subjects with hemiplegia. Gait Posture 1995, 3, 132–142. [Google Scholar] [CrossRef]
  2. Sadeghi, H.; Allard, P.; Prince, F.; Labelle, H. Symmetry and limb dominance in able-bodied gait: A review. Gait Posture 2000, 12, 34–45. [Google Scholar] [CrossRef] [PubMed]
  3. Böhm, H.; Döderlein, L. Gait asymmetries in children with cerebral palsy: Do they deteriorate with running? Gait Posture 2012, 35, 322–327. [Google Scholar] [CrossRef] [PubMed]
  4. Patterson, K.K.; Gage, W.H.; Brooks, D.; Black, S.E.; McIlroy, W.E. Evaluation of gait symmetry after stroke: A comparison of current methods and recommendations for standardization. Gait Posture 2010, 31, 241–246. [Google Scholar] [CrossRef] [PubMed]
  5. James, P.; Nicol, A.; Hamblen, D. A comparison of gait symmetry and hip movements in the assessment of patients with monoarticular hip arthritis. Clin. Biomech. 1994, 9, 162–166. [Google Scholar] [CrossRef]
  6. Gurney, B. Leg length discrepancy. Gait Posture 2002, 15, 195–206. [Google Scholar] [CrossRef] [PubMed]
  7. Auvinet, B.; Berrut, G.; Touzard, C.; Moutel, L.; Collet, N.; Chaleil, D.; Barrey, E. Reference data for normal subjects obtained with an accelerometric device. Gait Posture 2002, 16, 124–134. [Google Scholar] [CrossRef] [PubMed]
  8. Gouwanda, D.; Senanayake, S.A. Identifying gait asymmetry using gyroscopes-A cross-correlation and Normalized Symmetry Index approach. J. Biomech. 2011, 44, 972–978. [Google Scholar] [CrossRef] [PubMed]
  9. Kaufman, K.R.; Miller, L.S.; Sutherland, D.H. Gait asymmetry in patients with limb-length inequality. J. Pediatr. Orthop. 1996, 16, 144–150. [Google Scholar] [CrossRef] [PubMed]
  10. Corazza, S.; Mündermann, L.; Chaudhari, A.M.; Demattio, T.; Cobelli, C.; Andriacchi, T.P. A markerless motion capture system to study musculoskeletal biomechanics: Visual hull and simulated annealing approach. Ann. Biomed. Eng. 2006, 34, 1019–1029. [Google Scholar] [CrossRef] [PubMed]
  11. Clark, R.A.; Bower, K.J.; Mentiplay, B.F.; Paterson, K.; Pua, Y.H. Concurrent validity of the Microsoft Kinect for assessment of spatiotemporal gait variables. J. Biomech. 2013, 46, 2722–2725. [Google Scholar] [CrossRef] [PubMed]
  12. Paolini, G.; Peruzzi, A.; Mirelman, A.; Cereatti, A.; Gaukrodger, S.; Hausdorff, J.M.; Della Croce, U. IEEE Transactions on Neural Systems and Rehabilitation Engineering. Available online: http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=6998126 (accessed on 17 February 2015).
  13. Stone, E.E.; Skubic, M. Unobtrusive, continuous, in-home gait measurement using the Microsoft Kinect. IEEE Trans. Biomed. Eng. 2013, 60, 2925–2932. [Google Scholar] [CrossRef] [PubMed]
  14. Shotton, J.; Fitzgibbon, A.; Cook, M.; Sharp, T.; Finocchio, M.; Moore, R.; Kipman, A.; Blake, A. Real-time human pose recognition in parts from single depth images. In Proceeding of the 24th IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Colorado Springs, CO, USA, 21–23 June 2011; pp. 1297–1304.
  15. Pfister, A.; West, A.M.; Bronner, S.; Noah, J.A. Comparative abilities of Microsoft Kinect and Vicon 3D motion capture for gait analysis. J. Med. Eng. Technol. 2014, 38, 274–280. [Google Scholar] [CrossRef] [PubMed]
  16. Shepard, D. A two-dimensional interpolation function for irregularly-spaced data. In Proceedings of the 1968 23rd ACM National Conference, New York, NY, USA, 27–29 August 1968; pp. 517–524.
  17. NASA. Anthropometric Source Book Volume 2: A Handbook of Anthropometric Data, NASA Reference Publication 1978. Available online: https://archive.org/download/nasa_techdoc_19790005540/19790005540.pdf (accessed on 17 February 2015).
  18. Khoshelham, K.; Elberink, S.O. Accuracy and Resolution of Kinect Depth Data for Indoor Mapping Applications. Sensors 2012, 12, 1437–1454. [Google Scholar] [CrossRef] [PubMed]
  19. Auvinet, E.; Multon, F.; Aubin, C.E.; Meunier, J.; Raison, M. Detection of gait cycles in treadmill walking using a Kinect. Gait Posture 2014. [Google Scholar] [CrossRef]
  20. Wu, G.; Siegler, S.; Allard, P.; Kirtley, C.; Leardini, A.; Rosenbaum, D.; Whittle, M.; D’Lima, D.; Cristofolini, L.; Witte, H.; et al. ISB recommendation on definitions of joint coordinate system of various joints for the reporting of human joint motion—Part I: Ankle, hip, and spine. J. Biomech. 2002, 35, 543–548. [Google Scholar] [CrossRef] [PubMed]
  21. Leardini, A.; Cappozzo, A.; Catani, F.; Toksvig-Larsen, S.; Petitto, A.; Sforza, V.; Cassanelli, G.; Giannini, S. Validation of a functional method for the estimation of hip joint center location. J. Biomech. 1999, 32, 99–103. [Google Scholar] [CrossRef] [PubMed]
  22. Holt, K.G.; Hamill, J.; Andres, R.O. Predicting the minimal energy costs of human walking. Med. Sci. Sports Exerc. 1991, 23, 491–498. [Google Scholar] [CrossRef] [PubMed]
  23. Bhave, A.; Paley, D.; Herzenberg, J.E. Improvement in gait parameters after lengthening for the treatment of limb-length discrepancy. J. Bone Joint Surg. Am. 1999, 81, 529–534. [Google Scholar] [PubMed]
  24. Riley, P.O.; Paolini, G.; Croce, U.D.; Paylo, K.W.; Kerrigan, D.C. A kinematic and kinetic comparison of overground and treadmill walking in healthy subjects. Gait Posture 2007, 26, 17–24. [Google Scholar] [CrossRef] [PubMed]
