Methods for Spatiotemporal Analysis of Human Gait Based on Data from Depth Sensors
Abstract
1. Introduction
- estimates of the positions of anatomical landmarks obtainable using depth sensors are corrupted by non-negligible measurement uncertainty—in particular, estimates of the positions of feet [19];
- during walking, the lower limbs may occlude each other from time to time, making it difficult to accurately track the movement of both of them simultaneously;
- estimates of positions of anatomical landmarks can usually be obtained only in part of the depth sensor’s field of view, typically covering no more than 1 or 2 strides;
- the angle between the walking direction and the depth sensor’s line of sight, as well as the subject’s clothing, may affect the accuracy of localisation of anatomical landmarks.
- a method based on the analysis of the anteroposterior distance between the examined person’s knees, being a variant of the method described in [33];
- a method based on the analysis of the vertical oscillations of the examined person’s centre of mass, being a variant of the method described in [22];
- a method based on the analysis of the horizontal velocity of the examined person’s feet, being a variant of the method described in [20].
2. Materials and Methods
2.1. Studied Data-Processing Methods
2.1.1. Transformation of the Coordinate System
2.1.2. KD Method
1. smoothing of the sequences of knee position coordinates, e.g., using a Savitzky-Golay filter (Figure 3a);
2. computation of the anteroposterior distance between the knees, $d_n = \hat{z}_{\mathrm{L},n} - \hat{z}_{\mathrm{R},n}$, where $\hat{z}_{\mathrm{L},n}$ and $\hat{z}_{\mathrm{R},n}$ denote the smoothed anteroposterior coordinates of the left and right knee, respectively, at the $n$-th time instant;
3. detection of the left and right FC moments by identifying the local minima and maxima, respectively, of the sequence $d_n$ (Figure 3b);
4. estimation of the step and stride times based on the detected FC moments;
5. estimation of the step and stride lengths and widths based on estimates of positions of feet at the detected FC moments.
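The steps above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the window and polynomial order of the filter, and the mapping of minima/maxima to left/right FC moments (which depends on the orientation of the walking-direction axis), are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter, find_peaks

def kd_foot_contacts(knee_left_z, knee_right_z, fs, window=15, polyorder=3):
    """KD method sketch: detect foot-contact (FC) moments from the
    anteroposterior distance between the knees.

    knee_left_z, knee_right_z: anteroposterior (walking-direction) knee
    coordinates; fs: sampling rate in Hz.
    """
    # Step 1: smooth the coordinate sequences with a Savitzky-Golay filter.
    zl = savgol_filter(np.asarray(knee_left_z, dtype=float), window, polyorder)
    zr = savgol_filter(np.asarray(knee_right_z, dtype=float), window, polyorder)
    # Step 2: anteroposterior distance between the knees.
    d = zl - zr
    # Step 3: local minima -> left FC, local maxima -> right FC
    # (assumed axis orientation).
    right_fc, _ = find_peaks(d)
    left_fc, _ = find_peaks(-d)
    # Step 4: stride time = interval between consecutive same-side FCs.
    stride_times = np.diff(right_fc) / fs
    return left_fc, right_fc, stride_times
```

Step times would follow from interleaving the left and right FC indices, and the lengths and widths (step 5) additionally require the foot positions at the detected FC moments.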
2.1.3. CH Method
1. smoothing of the sequence of the height of the examined person’s centre of mass, e.g., using a Savitzky-Golay filter (Figure 4a);
2. identification of the local minima and maxima of that height, which correspond to the foot-contact (FC) and mid-stance (MS) moments, respectively (Figure 4b);
3. estimation of the mean step and stride time based on the detected FC moments;
4. estimation of the mean step and stride length based on the detected MS moments.
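The temporal part of this procedure can be sketched as follows; the code is an illustrative reconstruction under assumed filter settings, not the authors' implementation:

```python
import numpy as np
from scipy.signal import savgol_filter, find_peaks

def ch_mean_times(com_height, fs, window=15, polyorder=3):
    """CH method sketch: mean step and stride time from the vertical
    oscillations of the centre of mass (CoM), whose height is minimal
    near foot contact (FC) and maximal near mid-stance (MS)."""
    # Step 1: smooth the CoM height sequence.
    h = savgol_filter(np.asarray(com_height, dtype=float), window, polyorder)
    # Step 2: local minima approximate the FC moments; the local maxima
    # (MS moments) would serve the length estimation, which additionally
    # requires position data.
    minima, _ = find_peaks(-h)
    # Step 3: the CoM height oscillates once per step, so consecutive
    # minima are one step apart, and a stride spans two steps.
    mean_step_time = float(np.mean(np.diff(minima))) / fs
    mean_stride_time = 2.0 * mean_step_time
    return mean_step_time, mean_stride_time
```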
2.1.4. FV Method
- smoothing of the coordinates of the positions of feet, e.g., using a Savitzky-Golay filter (Figure 5a);
- estimation of the horizontal velocity of the feet by numerical differentiation of the sequences of position estimates;
- comparison of that velocity with an empirically selected threshold value;
- detection of the FC moments and the foot-off (FO) moments (i.e., the moments when a foot is lifted off the floor) by identifying the moments when velocity falls below or rises above—respectively—the aforementioned threshold value (Figure 5b);
- estimation of the following spatiotemporal gait parameters based on the detected FC and FO moments and the estimates of positions of feet:
- left and right step time,
- left and right step length,
- stride time,
- stride length,
- left and right swing time,
- left and right stance time,
- double-support time,
- step width.
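The event-detection core of the FV method can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the threshold value is hypothetical and would have to be selected empirically, as the text notes.

```python
import numpy as np
from scipy.signal import savgol_filter

def fv_events(foot_xy, fs, threshold=0.2, window=15, polyorder=3):
    """FV method sketch: detect foot-contact (FC) and foot-off (FO)
    moments by thresholding the horizontal foot velocity.

    foot_xy: (N, 2) horizontal foot positions in metres;
    threshold: assumed velocity threshold in m/s.
    """
    # Step 1: smooth the position coordinates.
    pos = savgol_filter(np.asarray(foot_xy, dtype=float), window, polyorder,
                        axis=0)
    # Step 2: horizontal speed by numerical differentiation.
    v = np.linalg.norm(np.diff(pos, axis=0), axis=1) * fs
    # Steps 3-4: FC where the speed falls below the threshold,
    # FO where it rises back above it.
    below = v < threshold
    crossings = np.diff(below.astype(int))
    fc = np.where(crossings == 1)[0] + 1   # entering the slow (stance) phase
    fo = np.where(crossings == -1)[0] + 1  # leaving the slow (stance) phase
    return fc, fo
```

Given the FC and FO indices for both feet, the parameters listed above (step, stride, swing, stance and double-support times; step and stride lengths; step width) follow from the event times and the foot positions at those events.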
2.2. Methodology of Experimentation
- depth sensor’s line of sight parallel to the walking direction (φ ≅ 180°);
- depth sensor’s line of sight diagonal to the walking direction (φ ≅ 135°);
- depth sensor’s line of sight perpendicular to the walking direction (φ ≅ 90°).
- the mean error ME, defined in the following way: $\mathrm{ME} = \frac{1}{M} \sum_{m=1}^{M} \left( \hat{p}_m - \tilde{p}_m \right)$;
- the standard deviation of errors SDE, defined in the following way: $\mathrm{SDE} = \sqrt{\frac{1}{M-1} \sum_{m=1}^{M} \left( \hat{p}_m - \tilde{p}_m - \mathrm{ME} \right)^2}$;
- the mean absolute value of the relative errors MARE, defined in the following way: $\mathrm{MARE} = \frac{1}{M} \sum_{m=1}^{M} \left| \frac{\hat{p}_m - \tilde{p}_m}{\tilde{p}_m} \right| \cdot 100\%$;

where:
- $p$ denotes a given spatiotemporal gait parameter under analysis, such as the step length or time;
- $\hat{p}_m$ ($m = 1, \ldots, M$) denote the estimates of the parameter $p$, based on the data from the depth sensor;
- $\tilde{p}_m$ denote the reference values of the parameter $p$, obtained using the Zebris FDM platform;
- $M$ is the number of obtained estimates of the parameter $p$, i.e., the number of triplets of the subjects’ passages across the tensometric platform.
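The three accuracy measures can be computed in a few lines; this is a direct sketch of the standard definitions of a mean error, a sample standard deviation of errors, and a mean absolute relative error expressed in percent:

```python
import numpy as np

def error_metrics(estimates, references):
    """Compute ME, SDE and MARE over M pairs of estimates (from the
    depth sensor) and reference values (from the Zebris FDM platform)."""
    est = np.asarray(estimates, dtype=float)
    ref = np.asarray(references, dtype=float)
    errors = est - ref
    me = errors.mean()                          # mean error
    sde = errors.std(ddof=1)                    # standard deviation of errors
    mare = np.mean(np.abs(errors / ref)) * 100  # mean absolute relative error, %
    return me, sde, mare
```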
3. Results
- those obtained only for Subjects #1–#3 (i.e., those who wore typical trousers) and only in the experiments in which φ = 180° (i.e., in which the subjects walked towards the depth sensor along its line of sight);
- those obtained for all subjects in all experiments (i.e., for three different angles φ between the walking direction and the depth sensor’s line of sight: 90°, 135° and 180°).
4. Discussion
5. Conclusions
- The results of gait analysis based on data from a Kinect v2 device can be comparable in accuracy to those based on data from a Zebris FDM platform if the subject is wearing typical trousers and walking toward that device. In such a case, the FV method allows for the most detailed characterisation of human gait.
- The angle between the walking direction and the depth sensor’s line of sight significantly affects the accuracy of the estimates of the spatiotemporal gait parameters, obtained using all three studied data-processing methods.
- The subject’s clothing significantly affects the accuracy of the gait analysis results obtained using the FV method and the KD method, but not the CH method.
- The practical advantages and disadvantages of the studied data-processing methods, identified from the experimental results described above, may serve as a basis for further research aimed at developing more versatile methods for spatiotemporal gait analysis, dedicated to in-home monitoring systems.
- the development and testing of other methods for processing data from depth sensors, aimed at obtaining more accurate and robust estimates of spatiotemporal gait parameters;
- conducting more experiments aimed at testing the usefulness of gait analysis systems based on depth sensors in clinical practice and in systems for in-home monitoring of the elderly.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
- the angle φ between the depth sensor’s line of sight and the walking direction;
- the subjects whose gait was analysed: Subjects #1–#3, who wore typical trousers, and, separately, Subjects #4 and #5, who wore wide-leg trousers and a skirt, respectively.
Parameter | Subject #1 Mean | Subject #1 Std | Subject #2 Mean | Subject #2 Std | Subject #3 Mean | Subject #3 Std | Subject #4 Mean | Subject #4 Std | Subject #5 Mean | Subject #5 Std |
---|---|---|---|---|---|---|---|---|---|---|
Left step time [s] | 0.539 | 0.022 | 0.517 | 0.011 | 0.480 | 0.022 | 0.538 | 0.020 | 0.528 | 0.016 |
Right step time [s] | 0.536 | 0.013 | 0.505 | 0.015 | 0.485 | 0.016 | 0.545 | 0.028 | 0.517 | 0.021 |
Left step length [cm] | 73.1 | 2.8 | 71.6 | 1.9 | 68.2 | 2.6 | 66.1 | 0.9 | 65.1 | 3.2 |
Right step length [cm] | 75.4 | 1.6 | 70.4 | 1.6 | 65.8 | 4.6 | 64.3 | 3.4 | 67.9 | 1.0 |
Stride time [s] | 1.074 | 0.033 | 1.023 | 0.017 | 0.965 | 0.030 | 1.083 | 0.035 | 1.045 | 0.022 |
Stride length [cm] | 148.5 | 3.7 | 142.1 | 3.3 | 134.0 | 5.5 | 130.3 | 3.1 | 133.0 | 3.8 |
Left swing time [%] | 38.5 | 1.2 | 40.4 | 0.8 | 36.3 | 1.6 | 35.7 | 1.1 | 39.0 | 1.6 |
Right swing time [%] | 37.4 | 0.3 | 39.3 | 0.7 | 37.7 | 2.3 | 38.3 | 1.3 | 37.9 | 1.1 |
Left stance time [%] | 61.5 | 1.2 | 59.6 | 0.8 | 63.7 | 1.6 | 64.3 | 1.1 | 61.0 | 1.6 |
Right stance time [%] | 62.6 | 0.3 | 60.7 | 0.7 | 62.3 | 2.3 | 61.7 | 1.3 | 62.1 | 1.1 |
Double-support time [%] | 24.1 | 1.4 | 20.1 | 1.0 | 25.3 | 2.1 | 26.0 | 0.5 | 23.1 | 1.0 |
Step width [cm] | 12.3 | 1.1 | 7.2 | 0.6 | 12.8 | 0.6 | 15.6 | 1.2 | 8.4 | 1.9 |
Parameter | φ = 180° (Subjects #1–#3) | φ = 180° (Subjects #4, #5) | φ = 135° (Subjects #1–#3) | φ = 135° (Subjects #4, #5) | φ = 90° (Subjects #1, #2) |
---|---|---|---|---|---|
ME | |||||
Left step time [s] | −0.007 | −0.016 | 0.003 | −0.073 | −0.053 |
Right step time [s] | 0.004 | 0.004 | 0.003 | −0.137 | −0.033 |
Left step length [cm] | −1.5 | −5.1 | 1.6 | 10.4 | 1.1 |
Right step length [cm] | −1.8 | −2.8 | −2.2 | −33.2 | −15.9 |
Stride time [s] | 0.004 | −0.016 | 0.011 | −0.150 | −0.125 |
Stride length [cm] | 0.8 | −4.3 | −2.3 | −15.3 | −19.1 |
Step width [cm] | 7.6 | 7.6 | 4.6 | −1.2 | 1.4 |
SDE | |||||
Left step time [s] | 0.021 | 0.040 | 0.028 | 0.121 | 0.125 |
Right step time [s] | 0.022 | 0.029 | 0.036 | 0.062 | 0.066 |
Left step length [cm] | 2.9 | 4.3 | 4.1 | 9.9 | 6.2 |
Right step length [cm] | 3.4 | 3.2 | 5.1 | 32.8 | 13.1 |
Stride time [s] | 0.032 | 0.058 | 0.013 | 0.146 | 0.140 |
Stride length [cm] | 5.3 | 5.1 | 4.7 | 21.5 | 14.3 |
Step width [cm] | 0.8 | 1.9 | 3.2 | 2.6 | 3.9 |
MARE | |||||
Left step time | 3.7% | 5.0% | 3.9% | 20.9% | 19.1% |
Right step time | 2.9% | 4.0% | 5.1% | 25.5% | 11.6% |
Left step length | 3.9% | 7.7% | 5.2% | 16.9% | 6.8% |
Right step length | 4.4% | 5.1% | 5.6% | 52.4% | 22.5% |
Stride time | 2.6% | 3.4% | 1.4% | 14.1% | 12.8% |
Stride length | 2.9% | 3.6% | 2.5% | 12.7% | 13.7% |
Step width | 81.8% | 67.8% | 53.9% | 20.6% | 26.3% |
Parameter | φ = 180° (Subjects #1–#3) | φ = 180° (Subjects #4, #5) | φ = 135° (Subjects #1–#3) | φ = 135° (Subjects #4, #5) | φ = 90° (Subjects #1, #2) |
---|---|---|---|---|---|
ME | |||||
Step time [s] | 0.000 | −0.004 | 0.003 | −0.031 | −0.047 |
Stride time [s] | 0.002 | −0.012 | 0.010 | −0.057 | −0.114 |
Step length [cm] | −2.3 | −1.8 | −1.4 | −4.6 | −3.6 |
Stride length [cm] | −1.9 | −3.8 | 1.3 | −4.0 | −3.1 |
SDE | |||||
Step time [s] | 0.013 | 0.013 | 0.015 | 0.026 | 0.047 |
Stride time [s] | 0.027 | 0.025 | 0.025 | 0.062 | 0.081 |
Step length [cm] | 2.5 | 2.1 | 2.2 | 4.3 | 4.1 |
Stride length [cm] | 3.3 | 4.4 | 4.4 | 8.2 | 4.0 |
MARE | |||||
Step time | 1.9% | 2.0% | 2.3% | 5.8% | 9.2% |
Stride time | 1.9% | 2.0% | 2.1% | 6.0% | 11.0% |
Step length | 4.2% | 3.1% | 2.8% | 8.6% | 4.9% |
Stride length | 2.3% | 3.3% | 2.7% | 6.0% | 2.9% |
Parameter | φ = 180° (Subjects #1–#3) | φ = 180° (Subjects #4, #5) | φ = 135° (Subjects #1–#3) | φ = 135° (Subjects #4, #5) | φ = 90° (Subjects #1, #2) |
---|---|---|---|---|---|
ME | |||||
Left step time [s] | 0.003 | −0.001 | −0.002 | −0.251 | −0.065 |
Right step time [s] | 0.012 | 0.012 | 0.004 | −0.141 | −0.032 |
Left step length [cm] | −0.6 | −3.2 | 0.0 | −10.7 | −1.5 |
Right step length [cm] | −1.6 | −3.9 | −0.8 | −31.9 | −4.9 |
Stride time [s] | 0.016 | −0.013 | −0.001 | −0.415 | −0.143 |
Stride length [cm] | 3.6 | −4.2 | −0.1 | −56.2 | −8.5 |
Left swing phase [%] | −1.0 | 0.9 | 0.6 | 0.0 | −0.8 |
Right swing phase [%] | −0.8 | −1.4 | 1.7 | −0.4 | 5.1 |
Left stance phase [%] | 1.0 | 0.9 | 0.3 | 0.1 | 0.8 |
Right stance phase [%] | 0.8 | 1.4 | −1.7 | 1.0 | −5.1 |
Double-support phase [%] | 2.1 | 2.0 | −1.4 | 0.5 | −3.5 |
Step width [cm] | 7.6 | 9.0 | 4.9 | 0.6 | 0.7 |
SDE | |||||
Left step time [s] | 0.020 | 0.019 | 0.016 | 0.138 | 0.088 |
Right step time [s] | 0.022 | 0.040 | 0.018 | 0.076 | 0.049 |
Left step length [cm] | 3.3 | 3.2 | 3.5 | 10.6 | 1.7 |
Right step length [cm] | 3.5 | 2.2 | 2.9 | 35.8 | 11.3 |
Stride time [s] | 0.034 | 0.023 | 0.042 | 0.128 | 0.205 |
Stride length [cm] | 7.8 | 3.5 | 3.7 | 19.5 | 13.1 |
Left swing phase [%] | 1.7 | 2.9 | 3.1 | 5.2 | 8.0 |
Right swing phase [%] | 1.1 | 2.2 | 2.2 | 6.1 | 6.4 |
Left stance phase [%] | 1.7 | 2.9 | 3.6 | 4.2 | 8.0 |
Right stance phase [%] | 1.1 | 2.2 | 2.2 | 5.3 | 6.4 |
Double-support phase [%] | 2.2 | 1.6 | 3.4 | 6.4 | 3.0 |
Step width [cm] | 0.9 | 1.8 | 2.4 | 4.0 | 3.6 |
MARE | |||||
Left step time | 3.0% | 2.5% | 2.4% | 48.4% | 13.2% |
Right step time | 4.3% | 6.2% | 3.1% | 26.3% | 8.4% |
Left step length | 3.8% | 5.1% | 4.0% | 19.2% | 2.4% |
Right step length | 4.6% | 5.9% | 3.4% | 54.4% | 10.0% |
Stride time | 2.8% | 1.7% | 3.1% | 39.0% | 14.9% |
Stride length | 4.6% | 3.7% | 2.1% | 42.7% | 6.3% |
Left swing phase | 4.4% | 4.5% | 6.1% | 12.6% | 14.2% |
Right swing phase | 2.8% | 4.9% | 5.2% | 12.4% | 13.8% |
Left stance phase | 2.8% | 2.8% | 4.5% | 5.7% | 9.2% |
Right stance phase | 1.7% | 3.0% | 3.1% | 6.8% | 8.8% |
Double-support phase | 10.7% | 8.2% | 11.1% | 20.9% | 16.6% |
Step width | 80.7% | 80.6% | 53.9% | 34.0% | 29.8% |
References
1. Montero-Odasso, M.; Camicioli, R. (Eds.) Falls as a manifestation of brain failure: Gait, cognition, and the neurobiology of falls. In Falls and Cognition in Older Persons: Fundamentals, Assessment and Therapeutic Options; Springer International Publishing: Cham, Switzerland, 2020; pp. 3–20.
2. Armand, S.; Decoulon, G.; Bonnefoy-Mazure, A. Gait analysis in children with cerebral palsy. EFORT Open Rev. 2016, 1, 448–460.
3. Newman, A.B.; Simonsick, E.M.; Naydeck, B.L.; Boudreau, R.M.; Kritchevsky, S.B.; Nevitt, M.C.; Pahor, M.; Satterfield, S.; Brach, J.S.; Studenski, S.A.; et al. Association of long-distance corridor walk performance with mortality, cardiovascular disease, mobility limitation, and disability. JAMA 2006, 295, 2018–2026.
4. Ewins, D.; Collins, T. Clinical Gait Analysis. In Clinical Engineering; Taktak, A., Ganney, P., Long, D., White, P., Eds.; Academic Press: Oxford, UK, 2014; pp. 389–406.
5. Maki, B.E. Gait changes in older adults: Predictors of falls or indicators of fear? J. Am. Geriatr. Soc. 1997, 45, 313–320.
6. Pieruccini-Faria, F.; Montero-Odasso, M.; Hausdorff, J.M. Gait variability and fall risk in older adults: The role of cognitive function. In Falls and Cognition in Older Persons: Fundamentals, Assessment and Therapeutic Options; Montero-Odasso, M., Camicioli, R., Eds.; Springer: Cham, Switzerland, 2020; pp. 107–138.
7. Hausdorff, J.M.; Cudkowicz, M.E.; Firtion, R.; Wei, J.Y.; Goldberger, A.L. Gait variability and basal ganglia disorders: Stride-to-stride variations of gait cycle timing in Parkinson’s disease and Huntington’s disease. Mov. Disord. 1998, 13, 428–437.
8. Robinovitch, S.N.; Feldman, F.; Yang, Y.; Schonnop, R.; Leung, P.M.; Sarraf, T.; Sims-Gould, J.; Loughin, M. Video capture of the circumstances of falls in elderly people residing in long-term care: An observational study. Lancet 2013, 381, 47–54.
9. Rubenstein, L.Z.; Josephson, K.R. The epidemiology of falls and syncope. Clin. Geriatr. Med. 2002, 18, 141–158.
10. United Nations Department of Economic and Social Affairs—Population Division. World Population Prospects. Available online: https://population.un.org/wpp/publications/ (accessed on 28 October 2022).
11. Allali, G.; Launay, C.P.; Blumen, H.M.; Callisaya, M.L.; De Cock, A.-M.; Kressig, R.W.; Srikanth, V.; Steinmetz, J.-P.; Verghese, J.; Beauchet, O.; et al. Falls, cognitive impairment, and gait performance: Results from the GOOD initiative. J. Am. Med. Dir. Assoc. 2017, 18, 335–340.
12. Mulas, I.; Putzu, V.; Asoni, G.; Viale, D.; Mameli, I.; Pau, M. Clinical assessment of gait and functional mobility in Italian healthy and cognitively impaired older persons using wearable inertial sensors. Aging Clin. Exp. Res. 2021, 33, 1853–1864.
13. Bock, J.-O.; König, H.-H.; Brenner, H.; Haefeli, W.E.; Quinzler, R.; Matschinger, H.; Saum, K.-U.; Schöttker, B.; Heider, D. Associations of frailty with health care costs—Results of the ESTHER cohort study. BMC Health Serv. Res. 2016, 16, 128.
14. Peetoom, K.K.B.; Lexis, M.A.S.; Joore, M.; Dirksen, C.D.; De Witte, L.P. Literature review on monitoring technologies and their outcomes in independently living elderly people. Disabil. Rehabil. Assist. Technol. 2015, 10, 271–294.
15. World Health Organisation. Integrated Care for Older People: Guidelines on Community-Level Interventions to Manage Declines in Intrinsic Capacity. Available online: https://apps.who.int/iris/handle/10665/258981 (accessed on 28 October 2022).
16. Rudisch, J.; Jöllenbeck, T.; Vogt, L.; Cordes, T.; Klotzbier, T.J.; Vogel, O.; Wollesen, B. Agreement and consistency of five different clinical gait analysis systems in the assessment of spatiotemporal gait parameters. Gait Posture 2021, 85, 55–64.
17. Prasanth, H.; Caban, M.; Keller, U.; Courtine, G.; Ijspeert, A.; Vallery, H.; von Zitzewitz, J. Wearable sensor-based real-time gait detection: A systematic review. Sensors 2021, 21, 2727.
18. Clark, R.A.; Mentiplay, B.F.; Hough, E.; Pua, Y.H. Three-dimensional cameras and skeleton pose tracking for physical function assessment: A review of uses, validity, current developments and Kinect alternatives. Gait Posture 2019, 68, 193–200.
19. Otte, K.; Kayser, B.; Mansow-Model, S.; Verrel, J.; Paul, F.; Brandt, A.U.; Schmitz-Hübsch, T. Accuracy and reliability of the Kinect Version 2 for clinical measurement of motor function. PLoS ONE 2016, 11, e0166532.
20. Ferraris, C.; Cimolin, V.; Vismara, L.; Votta, V.; Amprimo, G.; Cremascoli, R.; Galli, M.; Nerino, R.; Mauro, A.; Priano, L. Monitoring of gait parameters in post-stroke individuals: A feasibility study using RGB-D sensors. Sensors 2021, 21, 5945.
21. Dubois, A.; Bihl, T.; Bresciani, J.-P. Automatic measurement of fall risk indicators in timed up and go test. Inform. Health Soc. Care 2019, 44, 237–245.
22. Dubois, A.; Bihl, T.; Bresciani, J.-P. Identifying fall risk predictors by monitoring daily activities at home using a depth sensor coupled to machine learning algorithms. Sensors 2021, 21, 1957.
23. Guffanti, D.; Brunete, A.; Hernando, M. Non-invasive multi camera gait analysis system and its application to gender classification. IEEE Access 2020, 8, 95734–95746.
24. Chaparro-Rico, B.D.M.; Cafolla, D. Test-retest, inter-rater and intra-rater reliability for spatiotemporal gait parameters using SANE (an eaSy gAit aNalysis systEm) as measuring instrument. Appl. Sci. 2020, 10, 5781.
25. Cui, X.; Zhao, Z.; Ma, C.; Chen, F.; Liao, H. A gait character analyzing system for osteoarthritis pre-diagnosis using RGB-D camera and supervised classifier. In Proceedings of the World Congress on Medical Physics and Biomedical Engineering, Prague, Czech Republic, 3–8 June 2018; pp. 297–301.
26. Burle, A.D.; Lafayette, T.B.D.; Fonseca, J.R.; Teichrieb, V.; Da Gama, A.E.F. Real-time approach for gait analysis using the Kinect v2 sensor for clinical assessment purpose. In Proceedings of the 22nd Symposium on Virtual and Augmented Reality (SVR), Porto de Galinhas, Brazil, 7–10 November 2020; pp. 144–153.
27. Castaño-Pino, Y.J.; González, M.C.; Quintana-Peña, V.; Valderrama, J.; Muñoz, B.; Orozco, J.; Navarro, A. Automatic gait phases detection in Parkinson disease: A comparative study. In Proceedings of the 42nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Montreal, QC, Canada, 20–24 July 2020; pp. 798–802.
28. Albert, J.A.; Owolabi, V.; Gebel, A.; Brahms, C.M.; Granacher, U.; Arnrich, B. Evaluation of the pose tracking performance of the Azure Kinect and Kinect v2 for gait analysis in comparison with a gold standard: A pilot study. Sensors 2020, 20, 5104.
29. Vilas-Boas, M.d.C.; Rocha, A.P.; Choupina, H.M.P.; Cardoso, M.N.; Fernandes, J.M.; Coelho, T.; Cunha, J.P.S. Validation of a single RGB-D camera for gait assessment of polyneuropathy patients. Sensors 2019, 19, 4929.
30. Rocha, A.P.; Choupina, H.M.P.; Vilas-Boas, M.D.C.; Fernandes, J.M.; Cunha, J.P.S. System for automatic gait analysis based on a single RGB-D camera. PLoS ONE 2018, 13, 24.
31. Atanasov, M.; Kampel, M. Automated determination of gait parameters using depth based person tracking. In Proceedings of the 16th IEEE International Conference on Embedded and Ubiquitous Computing, Bucharest, Romania, 29–31 October 2018; pp. 96–102.
32. Geerse, D.; Coolen, B.; Kolijn, D.; Roerdink, M. Validation of foot placement locations from ankle data of a Kinect v2 sensor. Sensors 2017, 17, 2301.
33. Auvinet, E.; Multon, F.; Manning, V.; Meunier, J.; Cobb, J.P. Validity and sensitivity of the longitudinal asymmetry index to detect gait asymmetry using Microsoft Kinect data. Gait Posture 2017, 51, 162–168.
34. Amini, A.; Banitsas, K.; Hosseinzadeh, S. A new technique for foot-off and foot contact detection in a gait cycle based on the knee joint angle using Microsoft Kinect v2. In Proceedings of the 4th IEEE EMBS International Conference on Biomedical and Health Informatics (BHI), Orlando, FL, USA, 16–19 February 2017; pp. 153–156.
35. Xu, X.; McGorry, R.W.; Chou, L.-S.; Lin, J.-H.; Chang, C.-C. Accuracy of the Microsoft Kinect™ for measuring gait parameters during treadmill walking. Gait Posture 2015, 42, 145–151.
36. Hynes, A.; Czarnuch, S.; Kirkland, M.C.; Ploughman, M. Spatiotemporal gait measurement with a side-view depth sensor using human joint proposals. IEEE J. Biomed. Health Inform. 2021, 25, 1758–1769.
37. Latorre, J.; Llorens, R.; Colomer, C.; Alcañiz, M. Reliability and comparison of Kinect-based methods for estimating spatiotemporal gait parameters of healthy and post-stroke individuals. J. Biomech. 2018, 72, 268–273.
38. Valencia-Jimenez, N.; Leal-Junior, A.; Avellar, L.; Vargas-Valencia, L.; Caicedo-Rodriguez, P.; Ramirez-Duque, A.A.; Lyra, M.; Marques, C.; Bastos, T.; Frizera, A. A comparative study of markerless systems based on color-depth cameras, polymer optical fiber curvature sensors, and inertial measurement units: Towards increasing the accuracy in joint angle estimation. Electronics 2019, 8, 173.
39. Pathegama, M.P.; Marasinghe, D.M.; Wijayasekara, K.; Karunanayake, I.; Edussooriya, C.U.S.; Silva, P.; Rodrigo, R. Moving Kinect-based gait analysis with increased range. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, Miyazaki, Japan, 7–10 October 2018; pp. 4126–4131.
40. Andre, J.; Lopes, J.; Palermo, M.; Goncalves, D.; Matias, A.; Pereira, F.; Afonso, J.; Seabra, E.; Cerqueira, J.; Santos, C. Markerless gait analysis vision system for real-time gait monitoring. In Proceedings of the IEEE International Conference on Autonomous Robot Systems and Competitions, Ponta Delgada, Portugal, 15–16 April 2020; pp. 269–274.
41. Paulo, J.; Asvadi, A.; Peixoto, P.; Amorim, P. Human gait pattern changes detection system: A multimodal vision-based and novelty detection learning approach. Biocybern. Biomed. Eng. 2017, 37, 701–717.
42. Wagner, J.; Morawski, R.Z.; Mazurek, P. Non-Invasive Monitoring of Elderly Persons: Systems Based on Impulse-Radar Sensors and Depth Sensors; Springer: Cham, Switzerland, 2022.
43. Dubois, A.; Charpillet, F. Measuring frailty and detecting falls for elderly home care using depth camera. J. Ambient. Intell. Smart Environ. 2017, 9, 469–481.
44. Kırcalı, D.; Tek, F.B. Ground plane detection using an RGB-D sensor. In Proceedings of the 29th International Symposium on Computer and Information Sciences, Kraków, Poland, 27–28 October 2014; pp. 69–77.
45. Pterneas, V. Floor Detection Using Kinect. Available online: https://pterneas.com/2017/09/10/floor-kinect/ (accessed on 6 November 2022).
46. Auvinet, E.; Multon, F.; Meunier, J. Lower limb movement asymmetry measurement with a depth camera. In Proceedings of the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 28 August–1 September 2012; pp. 6793–6796.
47. zebris Medical GmbH. zebris FDM Software Manual. Available online: https://www.zebris.de/fileadmin/Editoren/zebris-PDF-Manuals/Medizin/Software/Alte_Versionen/Manual_zebris_FDM_1.16.x_R1_EN_web.pdf (accessed on 3 November 2022).
48. Dubois, A.; Charpillet, F. A gait analysis method based on a depth camera for fall prevention. In Proceedings of the 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Chicago, IL, USA, 26–30 August 2014; pp. 4515–4518.
49. Mazurek, P.; Wagner, J.; Morawski, R.Z. Use of kinematic and mel-cepstrum-related features for fall detection based on data from infrared depth sensors. Biomed. Signal Process. Control 2018, 40, 102–110.
50. MathWorks. MATLAB Smooth. Available online: https://www.mathworks.com/help/curvefit/smooth.html (accessed on 8 November 2022).
51. MathWorks. MATLAB Islocalmin. Available online: https://www.mathworks.com/help/matlab/ref/islocalmin.html (accessed on 8 November 2022).
52. MathWorks. MATLAB Islocalmax. Available online: https://www.mathworks.com/help/matlab/ref/islocalmax.html (accessed on 8 November 2022).
53. Steinert, A.; Sattler, I.; Otte, K.; Röhling, H.; Mansow-Model, S.; Müller-Werdan, U. Using new camera-based technologies for gait analysis in older adults in comparison to the established GAITRite system. Sensors 2020, 20, 125.
54. Dubois, A.; Bresciani, J.-P. Validation of an ambient system for the measurement of gait parameters. J. Biomech. 2018, 69, 175–180.
55. Vilas-Boas, M.d.C.; Rocha, A.P.; Cardoso, M.N.; Fernandes, J.M.; Coelho, T.; Cunha, J.P.S. Supporting the assessment of hereditary transthyretin amyloidosis patients based on 3-D gait analysis and machine learning. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 1350–1362.
56. Thilo, F.J.S.; Hahn, S.; Halfens, R.J.G.; Schols, J.M.G.A. Usability of a wearable fall detection prototype from the perspective of older people: A real field testing approach. J. Clin. Nurs. 2019, 28, 310–320.
Subject | Age [Years] | Gender | Weight [kg] | Height [cm] | Clothing |
---|---|---|---|---|---|
#1 | 33 | m | 85 | 188 | typical trousers |
#2 | 22 | f | 58 | 163 | typical trousers |
#3 | 25 | m | 82 | 175 | typical trousers |
#4 | 42 | f | 74 | 165 | wide-leg trousers |
#5 | 45 | f | 58 | 168 | skirt |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wagner, J.; Szymański, M.; Błażkiewicz, M.; Kaczmarczyk, K. Methods for Spatiotemporal Analysis of Human Gait Based on Data from Depth Sensors. Sensors 2023, 23, 1218. https://doi.org/10.3390/s23031218