Vision-Based Grasping Method for Prosthetic Hands via Geometry and Symmetry Axis Recognition
Abstract
1. Introduction
2. Division of Grasping Mode
3. Feature-Based Grasping Method
3.1. Object Geometry Recognition
3.1.1. Dataset Establishment
3.1.2. Training of Neural Network Models
3.2. Acquisition of Object Position Information
3.3. Object Symmetry Axis Detection
- (1) Extract the image region inside each object prediction box and use it as the input image for the symmetry axis detection algorithm.
- (2) Mirror the carton image, then convert both the original and mirrored images into array matrices, i.e., collections of row and column index points. The SIFT algorithm detects key points in each image, namely points that can be reliably detected and matched under varying scale and rotation conditions.
- (3) The Brute-Force Matcher (BFMatcher) matches the symmetric key points between the original image and its mirror image, producing a set of matched feature point pairs. A smaller distance between a pair of feature points indicates a better symmetry match. Figure 16a shows the top 10 pairs of feature points.
- (4) A Hexagonal Binning (Hexbin) diagram displays the counts of all candidate mirror lines, as shown in Figure 16b. The bin with the highest count (the darkest color) determines the mirror line drawn on the object image, which is taken as the symmetry axis of the object.
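The voting geometry behind steps (2)–(4) can be sketched as follows. Here the SIFT/BFMatcher stage is replaced by synthetic matched point pairs so the example is self-contained; the function name `axis_from_matches` and the use of a plain 2D histogram in place of a hexbin plot are illustrative choices, not the paper's implementation. Each matched pair votes for its perpendicular bisector in normal form (θ, r), and the densest bin corresponds to the darkest hexbin cell, i.e., the estimated symmetry axis.

```python
import numpy as np

def axis_from_matches(pairs, n_theta=180, n_r=64):
    """Estimate a symmetry axis from mirror-matched point pairs.

    Each pair (p, q) is a key point and its mirror-matched partner, both in
    original-image coordinates. Every pair votes for its perpendicular
    bisector, written in normal form n . x = r with n = (cos t, sin t);
    the densest (theta, r) bin is returned as the axis estimate.
    """
    votes = []
    for p, q in pairs:
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        d = q - p
        if np.allclose(d, 0.0):          # point lies on the axis: no direction info
            continue
        theta = np.arctan2(d[1], d[0])   # axis normal points from p toward q
        mid = (p + q) / 2.0              # bisector passes through the midpoint
        r = mid @ np.array([np.cos(theta), np.sin(theta)])
        if r < 0:                        # canonicalise so that r >= 0
            theta += np.pi
            r = -r
        votes.append((np.mod(theta, 2.0 * np.pi), r))
    votes = np.array(votes)
    hist, t_edges, r_edges = np.histogram2d(
        votes[:, 0], votes[:, 1], bins=(n_theta, n_r))
    i, j = np.unravel_index(np.argmax(hist), hist.shape)
    # Centre of the winning bin = estimated axis parameters
    return ((t_edges[i] + t_edges[i + 1]) / 2.0,
            (r_edges[j] + r_edges[j + 1]) / 2.0)

# Synthetic pairs mirrored about the vertical line x = 50: the recovered
# axis normal is horizontal (theta near 0) at distance r near 50.
pairs = [((50 - dx, y), (50 + dx, y))
         for dx, y in [(10, 5), (20, 30), (5, 80), (15, 55)]]
theta, r = axis_from_matches(pairs)
```

In the real pipeline, `pairs` would come from BFMatcher matches between SIFT descriptors of the original and mirrored images, with the mirrored key points mapped back into original-image coordinates before voting.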
3.3.1. Determination of Grasp Posture
3.3.2. Determination of Grasp Points
4. Experiment
4.1. Introduction to the Experimental Platform
4.2. Grasp Experiment
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Luo, Q.; Chou, C.-H.; Liang, W.; Tang, H.; Du, R.; Wei, K.; Zhang, W. Reflex Regulation of Model-Based Biomimetic Control for a Tendon-Driven Prosthetic Hand. Biomed. Signal Process. Control 2025, 101, 107223.
- Abbasi, B.; Noohi, E.; Parastegari, S.; Žefran, M. Grasp Taxonomy Based on Force Distribution. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 1098–1103.
- Feix, T.; Romero, J.; Schmiedmayer, H.-B.; Dollar, A.M.; Kragic, D. The GRASP Taxonomy of Human Grasp Types. IEEE Trans. Hum.-Mach. Syst. 2016, 46, 66–77.
- Liu, Y.; Jiang, L.; Liu, H.; Ming, D. A Systematic Analysis of Hand Movement Functionality: Qualitative Classification and Quantitative Investigation of Hand Grasp Behavior. Front. Neurorobot. 2021, 15, 658075.
- Stival, F.; Michieletto, S.; Cognolato, M.; Pagello, E.; Müller, H.; Atzori, M. A Quantitative Taxonomy of Human Hand Grasps. J. Neuroeng. Rehabil. 2019, 16, 2.
- Yoon, D.; Kim, K. Fully Passive Robotic Finger for Human-Inspired Adaptive Grasping in Environmental Constraints. IEEE/ASME Trans. Mechatron. 2022, 27, 3841–3852.
- Sun, Y.; Liu, Y.; Pancheri, F.; Lueth, T.C. LARG: A Lightweight Robotic Gripper With 3-D Topology Optimized Adaptive Fingers. IEEE/ASME Trans. Mechatron. 2022, 27, 2026–2034.
- Zhang, N.; Ren, J.; Dong, Y.; Yang, X.; Bian, R.; Li, J.; Gu, G.; Zhu, X. Soft Robotic Hand with Tactile Palm-Finger Coordination. Nat. Commun. 2025, 16, 2395.
- Teng, Z.; Xu, G.; Zhang, X.; Chen, X.; Zhang, S.; Huang, H.-Y. Concurrent and Continuous Estimation of Multi-Finger Forces by Synergy Mapping and Reconstruction: A Pilot Study. J. Neural Eng. 2023, 20, 066024.
- Vargas, L.; Shin, H.; Huang, H.H.; Zhu, Y.; Hu, X. Object Stiffness Recognition Using Haptic Feedback Delivered through Transcutaneous Proximal Nerve Stimulation. J. Neural Eng. 2019, 17, 016002.
- Chen, J.-C. Bridging the Finger-Action Gap between Hand Patients and Healthy People in Daily Life with a Biomimetic System. Biomimetics 2023, 8, 76.
- Sun, Q.; Merino, E.C.; Yang, L.; Van Hulle, M.M. Unraveling EEG Correlates of Unimanual Finger Movements: Insights from Non-Repetitive Flexion and Extension Tasks. J. Neuroeng. Rehabil. 2024, 21, 228.
- Xu, H.; Xiong, A. Advances and Disturbances in sEMG-Based Intentions and Movements Recognition: A Review. IEEE Sens. J. 2021, 21, 13019–13028.
- Zhong, B.; Huang, H.; Lobaton, E. Reliable Vision-Based Grasping Target Recognition for Upper Limb Prostheses. IEEE Trans. Cybern. 2022, 52, 1750–1762.
- Yang, J.; Wang, C.; Jiang, B.; Song, H.; Meng, Q. Visual Perception Enabled Industry Intelligence: State of the Art, Challenges and Prospects. IEEE Trans. Ind. Inform. 2021, 17, 2204–2219.
- DeGol, J.; Akhtar, A.; Manja, B.; Bretl, T. Automatic Grasp Selection Using a Camera in a Hand Prosthesis. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 431–434.
- Cui, J.-W.; Du, H.; Yan, B.-Y.; Wang, X.-J. Research on Upper Limb Action Intention Recognition Method Based on Fusion of Posture Information and Visual Information. Electronics 2022, 11, 3078.
- Cognolato, M.; Atzori, M.; Gassert, R.; Müller, H. Improving Robotic Hand Prosthesis Control with Eye Tracking and Computer Vision: A Multimodal Approach Based on the Visuomotor Behavior of Grasping. Front. Artif. Intell. 2022, 4, 744476.
- Peng, C.; Yang, D.; Zhao, D.; Cheng, M.; Dai, J.; Jiang, L. Viiat-Hand: A Reach-and-Grasp Restoration System Integrating Voice Interaction, Computer Vision, Auditory and Tactile Feedback for Non-Sighted Amputees. IEEE Robot. Autom. Lett. 2024, 9, 8674–8681.
- Sato, M.; Takahashi, A.; Namiki, A. High-Speed Catching by Multi-Vision Robot Hand. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 25–29 October 2020; pp. 9131–9136.
- Weiner, P.; Starke, J.; Hundhausen, F.; Beil, J.; Asfour, T. The KIT Prosthetic Hand: Design and Control. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 3328–3334.
- Shi, C.; Yang, D.; Zhao, J.; Liu, H. Computer Vision-Based Grasp Pattern Recognition with Application to Myoelectric Control of Dexterous Hand Prosthesis. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 2090–2099.
- Chen, W.; Wang, Y.; Xiao, Z.; Zhao, Z.; Yan, C.; Li, Z. Design and Experiments of a Three-Fingered Dexterous Hand Based on Biomechanical Characteristics of Human Hand Synergies. IEEE/ASME Trans. Mechatron. 2022, 27, 2930–2941.
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
- Lowe, D.G. Distinctive Image Features from Scale-Invariant Keypoints. Int. J. Comput. Vis. 2004, 60, 91–110.
- Hu, R.; Xu, X.; Zhang, Y.; Deng, H. Dynamic Fractional-Order Nonsingular Terminal Super-Twisting Sliding Mode Control for a Low-Cost Humanoid Manipulator. Electronics 2022, 11, 3693.
- Zhang, Y.; Zhao, Q.; Deng, H.; Xu, X. Design of a Multi-Mode Mechanical Finger Based on Linkage and Tendon Fusion Transmission. Biomimetics 2023, 8, 316.
- Zhang, Y.; Xu, X.; Xia, R.; Deng, H. Stiffness-Estimation-Based Grasping Force Fuzzy Control for Underactuated Prosthetic Hands. IEEE/ASME Trans. Mechatron. 2023, 28, 140–151.
- Xu, X.; Deng, H.; Zhang, Y.; Yi, N. Compliant Grasp Control Method for the Underactuated Prosthetic Hand Based on the Estimation of Grasping Force and Muscle Stiffness with sEMG. Biomimetics 2024, 9, 658.
| Geometric Shape of Object | Prosthetic Hand Finger Movement Mode |
| --- | --- |
| Cylinder | Fixed-coupling adaptive mode |
| Globe | Fixed-coupling adaptive mode |
| Cuboid | Fixed-coupling adaptive mode |
| Tablet | Uncoupled adaptive mode |
| Ellipsoid | Variable-coupling adaptive mode |
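The table above can be read as a direct lookup from the recognized geometry class to the finger movement mode. A minimal sketch of that mapping follows; the names `FINGER_MODE` and `select_mode` are illustrative, not from the paper.

```python
# Shape-to-mode lookup mirroring the table; applied after the neural
# network has classified the object's geometric shape.
FINGER_MODE = {
    "cylinder": "Fixed-coupling adaptive mode",
    "globe": "Fixed-coupling adaptive mode",
    "cuboid": "Fixed-coupling adaptive mode",
    "tablet": "Uncoupled adaptive mode",
    "ellipsoid": "Variable-coupling adaptive mode",
}

def select_mode(shape: str) -> str:
    """Return the prosthetic hand finger movement mode for a shape class."""
    return FINGER_MODE[shape.lower()]
```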
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhang, Y.; Xie, Y.; Zhao, Q.; Xu, X.; Deng, H.; Yi, N. Vision-Based Grasping Method for Prosthetic Hands via Geometry and Symmetry Axis Recognition. Biomimetics 2025, 10, 242. https://doi.org/10.3390/biomimetics10040242