Can Machine Learning Enhance Computer Vision-Predicted Wrist Kinematics Determined from a Low-Cost Motion Capture System?
Abstract
1. Introduction
2. Materials and Methods
2.1. Experimental Procedures
2.2. Data Preprocessing
2.2.1. Video Editing
2.2.2. VICON Data Processing
2.2.3. K-Fold Cross Validation
2.2.4. DeepLabCut Data Labeling and Model Training
2.2.5. Anipose 3D Data Labeling and 3D Dataset
2.3. Wrist Angle Prediction with a Multi-Camera System and Open-Source Software (3DCV)
2.4. Wrist Angle Prediction Using Machine Learning and a Single Camera (2DML)
2.5. Statistical Analysis
3. Results
3.1. Wrist Angle Prediction Using a Multi-Camera System and Open-Source Software (3DCV)
3.2. Wrist Angle Prediction Using Machine Learning and a Multi-Camera System (3DML)
3.3. Wrist Angle Prediction Using Machine Learning and a Single-Camera System (2DML)
4. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Blatter, B.M.; Bongers, P.M. Work Related Neck and Upper Limb Symptoms (RSI): High Risk Occupations and Risk Factors in the Dutch Working Population; Technical report; Netherlands Organization for Applied Scientific Research: Hague, The Netherlands, 1999; Available online: https://publications.tno.nl/publication/34613606/62U9cq/blatter-1999-workrelated.pdf (accessed on 17 December 2023).
- Palmer, K.T. Pain in the forearm, wrist and hand. Best Pract. Res. Clin. Rheumatol. 2003, 17, 113–135. [Google Scholar] [CrossRef]
- You, D.; Smith, A.H.; Rempel, D. Meta-Analysis: Association Between Wrist Posture and Carpal Tunnel Syndrome Among Workers. Saf. Health Work 2014, 5, 27–31. [Google Scholar] [CrossRef] [PubMed]
- Institute of Musculoskeletal Health and Arthritis. Strategic Plan 2014–2018, 2014. ISBN 978-1-100-25019-9. Available online: https://publications.gc.ca/site/eng/472758/publication.html (accessed on 1 November 2023).
- Tjepkema, M. Repetitive strain injury. Health Rep. 2003, 14, 11–30. [Google Scholar] [PubMed]
- Marín, J.; Marín, J.J. Forces: A Motion Capture-Based Ergonomic Method for the Today’s World. Sensors 2021, 21, 5139. [Google Scholar] [CrossRef]
- Caimmi, M.; Carda, S.; Giovanzana, C.; Maini, E.S.; Sabatini, A.M.; Smania, N.; Molteni, F. Using Kinematic Analysis to Evaluate Constraint-Induced Movement Therapy in Chronic Stroke Patients. Neurorehabilit. Neural Repair 2008, 22, 31–39. [Google Scholar] [CrossRef]
- Engdahl, S.M.; Gates, D.H. Reliability of upper limb and trunk joint angles in healthy adults during activities of daily living. Gait Posture 2018, 60, 41–47. [Google Scholar] [CrossRef]
- Henmi, S.; Yonenobu, K.; Masatomi, T.; Oda, K. A biomechanical study of activities of daily living using neck and upper limbs with an optical three-dimensional motion analysis system. Mod. Rheumatol. 2006, 16, 289–293. [Google Scholar] [CrossRef] [PubMed]
- Kontson, K.L.; Wang, S.; Barovsky, S.; Bloomer, C.; Wozniczka, L.; Civillico, E.F. Assessing kinematic variability during performance of Jebsen-Taylor Hand Function Test. J. Hand Ther. 2020, 33, 34–44. [Google Scholar] [CrossRef]
- Mackey, A.H.; Walt, S.E.; Lobb, G.A.; Stott, N.S. Reliability of upper and lower limb three-dimensional kinematics in children with hemiplegia. Gait Posture 2005, 22, 1–9. [Google Scholar] [CrossRef]
- Colyer, S.L.; Evans, M.; Cosker, D.P.; Salo, A.I.T. A Review of the Evolution of Vision-Based Motion Analysis and the Integration of Advanced Computer Vision Methods Towards Developing a Markerless System. Sport. Med.-Open 2018, 4, 24. [Google Scholar] [CrossRef]
- Harsted, S.; Holsgaard-Larsen, A.; Hestbæk, L.; Boyle, E.; Lauridsen, H.H. Concurrent validity of lower extremity kinematics and jump characteristics captured in pre-school children by a markerless 3D motion capture system. Chiropr. Man. Ther. 2019, 27, 39. [Google Scholar] [CrossRef]
- Schurr, S.A.; Marshall, A.N.; Resch, J.E.; Saliba, S.A. Two-dimensional video analysis is comparable to 3D motion capture in lower extremity movement assessment. Int. J. Sport. Phys. Ther. 2017, 12, 163–172. [Google Scholar] [PubMed] [PubMed Central]
- Simon, S.R. Quantification of human motion: Gait analysis—Benefits and limitations to its application to clinical problems. J. Biomech. 2004, 37, 1869–1880. [Google Scholar] [CrossRef] [PubMed]
- Kanko, R.M.; Laende, E.K.; Davis, E.M.; Selbie, W.S.; Deluzio, K.J. Concurrent assessment of gait kinematics using marker-based and markerless motion capture. J. Biomech. 2021, 127, 110665. [Google Scholar] [CrossRef]
- Riazati, S.; McGuirk, T.E.; Perry, E.S.; Sihanath, W.B.; Patten, C. Absolute Reliability of Gait Parameters Acquired With Markerless Motion Capture in Living Domains. Front. Hum. Neurosci. 2022, 16, 867474. [Google Scholar] [CrossRef]
- Lahkar, B.K.; Muller, A.; Dumas, R.; Reveret, L.; Robert, T. Accuracy of a markerless motion capture system in estimating upper extremity kinematics during boxing. Front. Sport. Act. Living 2022, 4, 939980. [Google Scholar] [CrossRef]
- Wade, L.; Needham, L.; McGuigan, P.; Bilzon, J. Applications and limitations of current markerless motion capture methods for clinical gait biomechanics. PeerJ 2022, 10, e12995. [Google Scholar] [CrossRef]
- Fang, H.S.; Li, J.; Tang, H.; Xu, C.; Zhu, H.; Xiu, Y.; Li, Y.L.; Lu, C. AlphaPose: Whole-Body Regional Multi-Person Pose Estimation and Tracking in Real-Time. IEEE Trans. Pattern Anal. Mach. Intell. 2023, 45, 7157–7173. [Google Scholar] [CrossRef]
- Cao, Z.; Hidalgo, G.; Simon, T.; Wei, S.E.; Sheikh, Y. OpenPose: Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields. IEEE Trans. Pattern Anal. Mach. Intell. 2021, 43, 172–186. [Google Scholar] [CrossRef]
- Jeong, S.o.; Kook, J. CREBAS: Computer-Based REBA Evaluation System for Wood Manufacturers Using MediaPipe. Appl. Sci. 2023, 13, 938. [Google Scholar] [CrossRef]
- Mathis, A.; Mamidanna, P.; Cury, K.M.; Abe, T.; Murthy, V.N.; Mathis, M.W.; Bethge, M. DeepLabCut: Markerless pose estimation of user-defined body parts with deep learning. Nat. Neurosci. 2018, 21, 1281–1289. [Google Scholar] [CrossRef] [PubMed]
- Karashchuk, P.; Rupp, K.L.; Dickinson, E.S.; Walling-Bell, S.; Sanders, E.; Azim, E.; Brunton, B.W.; Tuthill, J.C. Anipose: A toolkit for robust markerless 3D pose estimation. Cell Rep. 2021, 36, 109730. [Google Scholar] [CrossRef] [PubMed]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar] [CrossRef]
- Lei, M.; Wang, Z.; Chen, F. Ballet Form Training Based on MediaPipe Body Posture Monitoring. J. Phys. Conf. Ser. 2023, 2637, 012019. [Google Scholar] [CrossRef]
- Insafutdinov, E.; Pishchulin, L.; Andres, B.; Andriluka, M.; Schiele, B. DeeperCut: A Deeper, Stronger, and Faster Multi-person Pose Estimation Model. In Computer Vision—ECCV 2016; Leibe, B., Matas, J., Sebe, N., Welling, M., Eds.; Springer International Publishing: Cham, Switzerland, 2016; Volume 9910, pp. 34–50. [Google Scholar] [CrossRef]
- Pishchulin, L.; Insafutdinov, E.; Tang, S.; Andres, B.; Andriluka, M.; Gehler, P.; Schiele, B. DeepCut: Joint Subset Partition and Labeling for Multi Person Pose Estimation. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 4929–4937. [Google Scholar] [CrossRef]
- Fiker, R.; Kim, L.H.; Molina, L.A.; Chomiak, T.; Whelan, P.J. Visual Gait Lab: A user-friendly approach to gait analysis. J. Neurosci. Methods 2020, 341, 108775. [Google Scholar] [CrossRef]
- Gupta, A.; Shrestha, P.L.; Thapa, B.; Silwal, R.; Shrestha, R. Knee Flexion/Extension Angle Measurement for Gait Analysis Using Machine Learning Solution “MediaPipe Pose” and Its Comparison with Kinovea. IOP Conf. Ser. Mater. Sci. Eng. 2023, 1279, 012004. [Google Scholar] [CrossRef]
- Stenum, J.; Rossi, C.; Roemmich, R.T. Two-dimensional video-based analysis of human gait using pose estimation. PLoS Comput. Biol. 2021, 17, e1008935. [Google Scholar] [CrossRef]
- Tang, Y.-m.; Wang, Y.-H.; Feng, X.-Y.; Zou, Q.-S.; Wang, Q.; Ding, J.; Shi, R.C.-J.; Wang, X. Diagnostic value of a vision-based intelligent gait analyzer in screening for gait abnormalities. Gait Posture 2022, 91, 205–211. [Google Scholar] [CrossRef]
- Fan, J.; Gu, F.; Lv, L.; Zhang, Z.; Zhu, C.; Qi, J.; Wang, H.; Liu, X.; Yang, J.; Zhu, Q. Reliability of a human pose tracking algorithm for measuring upper limb joints: Comparison with photography-based goniometry. BMC Musculoskelet. Disord. 2022, 23, 877. [Google Scholar] [CrossRef]
- Goulermas, J.Y.; Howard, D.; Nester, C.J.; Jones, R.K.; Ren, L. Regression Techniques for the Prediction of Lower Limb Kinematics. J. Biomech. Eng. 2005, 127, 1020–1024. [Google Scholar] [CrossRef]
- Ren, S.; Wang, W.; Hou, Z.G.; Chen, B.; Liang, X.; Wang, J.; Peng, L. Personalized gait trajectory generation based on anthropometric features using Random Forest. J. Ambient. Intell. Humaniz. Comput. 2019, 14, 15597–15608. [Google Scholar] [CrossRef]
- Sivakumar, S.; Gopalai, A.A.; Lim, K.H.; Gouwanda, D. Artificial neural network based ankle joint angle estimation using instrumented foot insoles. Biomed. Signal Process. Control 2019, 54, 101614. [Google Scholar] [CrossRef]
- Garrido-Jurado, S.; Muñoz Salinas, R.; Madrid-Cuevas, F.; Marín-Jiménez, M. Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognit. 2014, 47, 2280–2292. [Google Scholar] [CrossRef]
- Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef]
- HAS-Motion. Marker Set Guidelines—Visual3D Wiki Documentation. 2024. Available online: https://wiki.has-motion.com/index.php/Marker_Set_Guidelines#References (accessed on 17 December 2023).
- Wu, G.; Van Der Helm, F.C.; DirkJan Veeger, H.; Makhsous, M.; Van Roy, P.; Anglin, C.; Nagels, J.; Karduna, A.R.; McQuade, K.; Wang, X.; et al. ISB recommendation on definitions of joint coordinate systems of various joints for the reporting of human joint motion–Part II: Shoulder, elbow, wrist and hand. J. Biomech. 2005, 38, 981–992. [Google Scholar] [CrossRef]
- Young, C.; DeDecker, S.; Anderson, D.; Oliver, M.L.; Gordon, K.D. Accuracy of a Low-Cost 3D-Printed Wearable Goniometer for Measuring Wrist Motion. Sensors 2021, 21, 4799. [Google Scholar] [CrossRef] [PubMed]
- Finstad, M. Mifi/Lossless-Cut, Original-Date: 30 October 2016. 2023. Available online: https://github.com/mifi/lossless-cut (accessed on 17 December 2023).
- Hillstrom, H.J.; Garg, R.; Kraszewski, A.; Lenhoff, M.; Carter, T.; Backus, S.I.; Wolff, A.; Syrkin, G.; Cheng, R.; Wolfe, S.W. Development of an Anatomical Wrist Joint Coordinate System to Quantify Motion During Functional Tasks. J. Appl. Biomech. 2014, 30, 586–593. [Google Scholar] [CrossRef]
- DeepLabCut. DeepLabCut User Guide (for Single Animal Projects). 2023. Available online: https://deeplabcut.github.io/DeepLabCut/docs/standardDeepLabCut_UserGuide.html (accessed on 6 February 2024).
- Anipose. Anipose Tutorial—Anipose 0.8.1 Documentation. 2020. Available online: https://anipose.readthedocs.io/en/latest/tutorial.html (accessed on 17 December 2023).
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2017, arXiv:1412.6980. [Google Scholar]
- Saarela, M.; Jauhiainen, S. Comparison of feature importance measures as explanations for classification models. SN Appl. Sci. 2021, 3, 272. [Google Scholar] [CrossRef]
- Munn, M.; Pitman, D. Explainable AI for Practitioners: Designing and Implementing Explainable ML Solutions; O’Reilly: Beijing, China; Sebastopol, CA, USA, 2022. [Google Scholar]
- McHugh, B.P.; Morton, A.M.; Akhbari, B.; Molino, J.; Crisco, J.J. Accuracy of an electrogoniometer relative to optical motion tracking for quantifying wrist range of motion. J. Med. Eng. Technol. 2020, 44, 49–54. [Google Scholar] [CrossRef]
- McKinnon, C.D.; Ehmke, S.; Kociolek, A.M.; Callaghan, J.P.; Keir, P.J. Wrist Posture Estimation Differences and Reliability Between Video Analysis and Electrogoniometer Methods. Hum. Factors J. Hum. Factors Ergon. Soc. 2021, 63, 1284–1294. [Google Scholar] [CrossRef]
- Kidziński, L.; Yang, B.; Hicks, J.L.; Rajagopal, A.; Delp, S.L.; Schwartz, M.H. Deep neural networks enable quantitative movement analysis using single-camera videos. Nat. Commun. 2020, 11, 4054. [Google Scholar] [CrossRef] [PubMed]
DeepLabCut Marker | Anatomical Landmark |
---|---|
mcp2_right | Second metacarpal |
mcp3_right | Third metacarpal |
mcp5_right | Fifth metacarpal |
ulna_right | Ulnar styloid process |
radial_right | Radial styloid process |
centre_wrist | Dorsal side of wrist centered between ulnar and radial markers |
lat_elbow | Humeral lateral epicondyle |
med_elbow | Humeral medial epicondyle |
centre_elbow | 1” proximal to olecranon fossa |
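The marker set above provides the landmarks from which wrist angles are derived. As a rough illustration of the geometry involved, the Python sketch below estimates flexion/extension and radial/ulnar deviation by projecting a hand vector onto forearm-fixed axes built from these markers; it is a simplified stand-in for the ISB-recommended wrist joint coordinate system used in the study, and the `markers` dictionary interface, exact marker spellings, and sign conventions are assumptions made here for illustration only.

```python
import numpy as np

def unit(v):
    """Return v scaled to unit length."""
    return v / np.linalg.norm(v)

def wrist_angles(markers):
    """Rough wrist flexion/extension (FE) and radial/ulnar deviation (RUD),
    in degrees, from 3D marker positions (name -> np.array([x, y, z])).
    Simplified vector projection, not the ISB joint coordinate system."""
    # Forearm long axis: elbow centre toward wrist centre.
    forearm = unit(markers["centre_wrist"] - markers["centre_elbow"])
    # Hand long axis: wrist centre toward the third metacarpal marker.
    hand = unit(markers["mcp3_right"] - markers["centre_wrist"])
    # Radial-ulnar axis across the wrist: ulnar styloid toward radial styloid.
    rad_ulnar = unit(markers["radial_right"] - markers["ulna_right"])
    # Dorsal-palmar axis, perpendicular to the other two.
    dorsal = unit(np.cross(forearm, rad_ulnar))

    # FE: deviation of the hand from the forearm axis in the dorsal-palmar plane.
    fe = np.degrees(np.arctan2(hand @ dorsal, hand @ forearm))
    # RUD: deviation of the hand from the forearm axis in the radial-ulnar plane.
    rud = np.degrees(np.arctan2(hand @ rad_ulnar, hand @ forearm))
    return fe, rud
```

In a neutral posture the hand vector is nearly collinear with the forearm axis, so both angles are close to zero; the sign of each angle depends on which styloid is treated as the positive direction.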
Arm Position | Wrist Posture and Range of Motion Tasks (each task performed at every arm position) | | | |
---|---|---|---|---|
90° from torso, pointing to front camera | 3 cycles of a radial/ulnar deviation range of motion in pronated posture | 3 cycles of a flexion/extension range of motion in pronated posture | 3 cycles of a radial/ulnar deviation range of motion in handshake posture | 3 cycles of a flexion/extension range of motion in handshake posture |
90° from torso, pointing to oblique camera | | | | |
90° from torso, pointing to right-side camera | | | | |
45° from torso, pointing to front camera | | | | |
45° from torso, pointing to oblique camera | | | | |
45° from torso, pointing to right-side camera | | | | |
Model | Shuffle 1 FE MAE (°) | Shuffle 1 RUD MAE (°) | Shuffle 2 FE MAE (°) | Shuffle 2 RUD MAE (°) | Shuffle 3 FE MAE (°) | Shuffle 3 RUD MAE (°) | Shuffle 4 FE MAE (°) | Shuffle 4 RUD MAE (°) | Mean FE MAE ± SD (°) | Mean RUD MAE ± SD (°) |
---|---|---|---|---|---|---|---|---|---|---|
Linear Regression | 10.4 | 7.2 | 13.7 | 8.0 | 11.7 | 8.2 | 11.0 | 6.5 | 11.7 ± 1.4 (B) | 7.5 ± 0.8 (B) |
Quadratic Regression | 8.7 | 5.8 | 10.6 | 9.4 | 11.3 | 7.4 | 10.2 | 6.1 | 10.2 ± 1.1 (B,C) | 7.2 ± 1.6 (B,C) |
Cubic Regression | 36.2 | 21.8 | 83.3 | 74.1 | 142.9 | 134.2 | 40.8 | 28.3 | 76 ± 50 (A) | 65 ± 52 (A) |
XGB Regressor | 7.8 | 6.2 | 8.4 | 6.3 | 9.0 | 5.9 | 9.5 | 6.1 | 8.7 ± 0.7 (B,C) | 6.1 ± 0.2 (B,C) |
Support Vector Regressor | 6.6 | 5.6 | 9.4 | 6.2 | 7.5 | 4.9 | 8.2 | 5.2 | 7.9 ± 1.2 (B,C) | 5.5 ± 0.6 (C) |
RBF Sampler and Ridge Regression | 10.3 | 6.8 | 10.9 | 7.2 | 11.4 | 7.5 | 11.0 | 6.3 | 10.9 ± 0.5 (B,C) | 7.0 ± 0.5 (B,C) |
Neural Network | 5.9 | 5.2 | 8.0 | 6.1 | 7.6 | 5.2 | 7.4 | 5.4 | 7.2 ± 0.9 (C) | 5.5 ± 0.4 (C) |
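The table above reports per-shuffle and mean MAE for seven regression models. The sketch below shows one way such a comparison could be assembled with scikit-learn and XGBoost; the hyperparameters, the group-wise fold splitting, and the use of `MLPRegressor` as a stand-in for the study's neural network (which was trained with the Adam optimizer) are assumptions, not the configuration used in the paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR
from sklearn.kernel_approximation import RBFSampler
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import GroupKFold
from sklearn.metrics import mean_absolute_error
from xgboost import XGBRegressor

# Candidate regressors; hyperparameters are illustrative placeholders.
MODELS = {
    "Linear Regression": LinearRegression(),
    "Quadratic Regression": make_pipeline(PolynomialFeatures(2), LinearRegression()),
    "Cubic Regression": make_pipeline(PolynomialFeatures(3), LinearRegression()),
    "XGB Regressor": XGBRegressor(n_estimators=200),
    "Support Vector Regressor": make_pipeline(StandardScaler(), SVR(kernel="rbf")),
    "RBF Sampler and Ridge Regression": make_pipeline(StandardScaler(), RBFSampler(random_state=0), Ridge()),
    "Neural Network": make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)),
}

def per_shuffle_mae(X, y, groups, n_splits=4):
    """Return {model name: [MAE per shuffle]} for one wrist angle (e.g. FE).

    X: keypoint-coordinate features, y: VICON-derived angles (degrees),
    groups: participant IDs so no participant spans train and test folds.
    """
    results = {name: [] for name in MODELS}
    for train_idx, test_idx in GroupKFold(n_splits=n_splits).split(X, y, groups):
        for name, model in MODELS.items():
            model.fit(X[train_idx], y[train_idx])
            pred = model.predict(X[test_idx])
            results[name].append(mean_absolute_error(y[test_idx], pred))
    return results
```

Taking the mean and sample standard deviation of each model's list of fold errors then yields a summary column like Mean MAE ± SD.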
Feature Importance Rank | 3DML Flexion/Extension | 3DML Radial-Ulnar Deviation | 2DML Flexion/Extension | 2DML Radial-Ulnar Deviation |
---|---|---|---|---|
1 | mcp2_z | mcp2_x | mcp3_y | mcp2_y |
2 | mcp2_x | mcp3_x | centre_wrist_y | mcp5_y |
3 | centre_wrist_z | centre_wrist_x | mcp2_y | mcp3_x |
4 | ulna_z | ulna_z | mcp3_x | radial_y |
5 | mcp2_y | ulna_x | ulna_y | mcp2_x |
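Rankings like those in the table above indicate which keypoint coordinates drive the angle predictions. Permutation importance is one common model-agnostic way to obtain such a ranking, in the spirit of the explainability literature cited above (Saarela and Jauhiainen); the sketch below is an assumed illustration, and the scoring metric, repeat count, and helper name are not taken from the paper.

```python
import numpy as np
from sklearn.inspection import permutation_importance

def top_features(model, X_test, y_test, feature_names, top_n=5):
    """Rank input coordinates (e.g. 'mcp2_z', 'centre_wrist_y') by how much
    shuffling each one degrades held-out MAE for an already-fitted model."""
    result = permutation_importance(
        model, X_test, y_test,
        scoring="neg_mean_absolute_error",  # larger drop => more important
        n_repeats=10, random_state=0,
    )
    order = np.argsort(result.importances_mean)[::-1][:top_n]
    return [(feature_names[i], float(result.importances_mean[i])) for i in order]
```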
Model | Shuffle 1 FE MAE (°) | Shuffle 1 RUD MAE (°) | Shuffle 2 FE MAE (°) | Shuffle 2 RUD MAE (°) | Shuffle 3 FE MAE (°) | Shuffle 3 RUD MAE (°) | Shuffle 4 FE MAE (°) | Shuffle 4 RUD MAE (°) | Mean FE MAE ± SD (°) | Mean RUD MAE ± SD (°) |
---|---|---|---|---|---|---|---|---|---|---|
Linear Regression | 15.1 | 9.5 | 16.9 | 9.7 | 18.8 | 9.1 | 18.6 | 8.9 | 17.4 ± 1.7 (B) | 9.3 ± 0.4 (B,C) |
Quadratic Regression | 14.2 | 9.0 | 24.5 | 12.5 | 17.7 | 10.3 | 15.9 | 8.8 | 18.1 ± 4.5 (B) | 10.2 ± 1.7 (B,C) |
Cubic Regression | 28.9 | 13.4 | 89.0 | 75.6 | 68.5 | 56.2 | 57.0 | 40.7 | 61 ± 25 (A) | 47 ± 26 (A) |
XGB Regressor | 14.8 | 9.4 | 16.6 | 9.9 | 14.0 | 8.1 | 14.9 | 8.0 | 15 ± 1.1 (B,C) | 8.9 ± 0.9 (A,B) |
Support Vector Regressor | 13.8 | 8.8 | 16.9 | 9.7 | 18.4 | 8.9 | 17.8 | 8.8 | 16.7 ± 2.0 (B) | 9.1 ± 0.4 (B,C) |
RBF Sampler and Ridge Regression | 14.1 | 9.5 | 16.9 | 9.6 | 15.8 | 9.1 | 15.4 | 8.6 | 15.6 ± 1.2 (B,C) | 9.2 ± 0.5 (B,C) |
Neural Network | 10.9 | 7.5 | 13.8 | 5.2 | 12.1 | 7.1 | 11.9 | 6.9 | 12.2 ± 1.2 (C) | 6.7 ± 1.0 (C) |
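The Mean MAE ± SD columns summarize the four per-shuffle errors for each model. As a quick arithmetic check, the snippet below reproduces the 12.2 ± 1.2 reported for the Neural Network flexion/extension row in the table directly above; the use of the sample standard deviation (ddof=1) is an assumption that happens to match the reported value.

```python
import numpy as np

# Per-shuffle MAE values for one model (Neural Network, flexion/extension),
# taken from the table above.
shuffle_mae = np.array([10.9, 13.8, 12.1, 11.9])

mean_mae = shuffle_mae.mean()        # 12.175, reported as 12.2
sd_mae = shuffle_mae.std(ddof=1)     # ~1.20, reported as 1.2
print(f"{mean_mae:.1f} ± {sd_mae:.1f}")   # -> 12.2 ± 1.2
```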