Search Results (235)

Search Parameters:
Keywords = haptic applications

20 pages, 3191 KB  
Article
Visuomotor Control Accuracy of Circular Tracking Movement According to Visual Information in Virtual Space
by Jihyoung Lee, Kwangyong Han, Woong Choi and Jaehyo Kim
Sensors 2025, 25(19), 5998; https://doi.org/10.3390/s25195998 - 29 Sep 2025
Viewed by 501
Abstract
The VR-based circular tracking movement evaluation system (CES) was developed to quantitatively assess visuomotor control. The virtual stick, a component of the CES, provides visual cues in the virtual environment and haptic feedback when holding the controller. This study examined the effects of stick presence and presentation order on control accuracy for distance, angle, and angular velocity. Twenty-seven participants (12 females; mean age 23.3 ± 2.3 years) performed tasks in the frontal plane followed by the sagittal plane. In each plane, the stick was visible for the first 1–3 revolutions and invisible for the subsequent 4–6 revolutions in the invisible condition, with the reverse order in the visible condition. In the invisible condition, control accuracy with the stick was 1.10 times higher for distance only in the sagittal plane, and 1.13 and 1.09 times higher for angle and angular velocity in the frontal plane, and 1.11 and 1.08 times higher in the sagittal plane. No significant differences were observed in the visible condition. The improved control accuracy when the stick was visible is likely due to enhanced precision in constructing the reference frame, internal models, body coordinates, and effective multisensory integration of visual and haptic information. Such visual information may enable fine control in virtual environment-based applications, including games and surgical simulations. Full article
(This article belongs to the Special Issue Sensors Technologies for Measurements and Signal Processing)

16 pages, 1756 KB  
Article
The Effects of Vibrotactile Stimulation of the Upper Extremity on Sensation and Perception: A Study for Enhanced Ergonomic Design
by Abeer Abdel Khaleq, Yash More, Brody Skaufel and Mazen Al Borno
Theor. Appl. Ergon. 2025, 1(2), 8; https://doi.org/10.3390/tae1020008 - 29 Sep 2025
Viewed by 204
Abstract
Vibrotactile stimulation has applications in a variety of fields, including medicine, virtual reality, and human–computer interaction. Eccentric Rotating Mass (ERM) vibrating motors are widely used in wearable haptic devices owing to their small size, low cost, and low-energy features. User experience with vibrotactile stimulation is an important factor in ergonomic design for these applications. The effects of ERM motor vibrations on upper-extremity sensation and perception, which are important in the design of better wearable haptic devices, have not been thoroughly studied previously. Our study focuses on the relationship between user sensation and perception and different vibration parameters, including frequency, location, and number of motors. We conducted experiments with vibrotactile stimulation on 15 healthy participants while the subjects were both at rest and in motion to capture different use cases of haptic devices. Eight motors were placed on a consistent set of muscles in the subjects’ upper extremities, and one motor was placed on their index fingers. We found a significant correlation between voltage and sensation intensity (r = 0.39). This finding is important in the design and safety of customized haptic devices. However, we did not find a significant aggregate-level correlation with the perceived pleasantness of the stimulation. The sensation intensity varied based on the location of the vibration on the upper extremities (with the lowest intensities on the triceps brachii and brachialis) and slightly decreased (5.9 ± 2.9%) when the participants performed reaching movements. When a single motor was vibrating, the participants’ accuracy in identifying the motor without visual feedback increased as the voltage increased, reaching up to 81.4 ± 14.2%. When we stimulated three muscles simultaneously, we found that most participants were able to identify only two out of three vibrating motors (41.7 ± 32.3%). Our findings can help identify stimulation parameters for the ergonomic design of haptic devices. Full article
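
The reported voltage–sensation relationship is a Pearson correlation (r = 0.39). As a minimal sketch of how such a statistic can be computed from rating data, assuming hypothetical per-trial drive voltages and 0–10 intensity ratings rather than the authors' dataset:

```python
import numpy as np

# Hypothetical per-trial data: ERM drive voltage (V) and subjective
# sensation-intensity rating (0-10 scale). Illustrative values only.
voltage = np.array([1.5, 2.0, 2.5, 3.0, 3.3, 1.8, 2.7, 3.1])
sensation = np.array([2.0, 3.5, 4.0, 6.0, 7.0, 3.0, 5.5, 6.5])

# Pearson correlation coefficient between voltage and sensation intensity.
r = np.corrcoef(voltage, sensation)[0, 1]
print(f"Pearson r = {r:.2f}")
```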

19 pages, 312 KB  
Review
Beyond Da Vinci: Comparative Review of Next-Generation Robotic Platforms in Urologic Surgery
by Stamatios Katsimperis, Lazaros Tzelves, Georgios Feretzakis, Themistoklis Bellos, Panagiotis Triantafyllou, Polyvios Arseniou and Andreas Skolarikos
J. Clin. Med. 2025, 14(19), 6775; https://doi.org/10.3390/jcm14196775 - 25 Sep 2025
Viewed by 580
Abstract
Robotic surgery has become a cornerstone of modern urologic practice, with the da Vinci system maintaining dominance for over two decades. In recent years, however, a new generation of robotic platforms has emerged, introducing greater competition and innovation into the field. These systems aim to address unmet needs through features such as modular architectures, enhanced ergonomics, haptic feedback, and cost-containment strategies. Several platforms—including Hugo™ RAS, Versius™, Avatera™, REVO-I, Hinotori™, Senhance™, KangDuo, MicroHand S, Dexter™, and Toumai®—have entered clinical use with early results demonstrating perioperative and short-term oncologic outcomes broadly comparable to those of established systems, particularly in procedures such as radical prostatectomy, partial nephrectomy, and radical cystectomy. At the same time, they introduce unique advantages in workflow flexibility, portability, and economic feasibility. Nevertheless, important challenges remain, including the need for rigorous comparative trials, standardized training curricula, and long-term cost-effectiveness analyses. The integration of artificial intelligence, augmented reality, and telesurgery holds the potential to further expand the role of robotics in urology, offering opportunities to enhance precision, improve accessibility, and redefine perioperative care models. This review summarizes the evolving landscape of robotic platforms in urology, highlights their clinical applications and limitations, and outlines future directions for research, training, and global implementation. Full article
(This article belongs to the Special Issue The Current State of Robotic Surgery in Urology)
9 pages, 394 KB  
Proceeding Paper
From Human-Computer Interaction to Human-Robot Manipulation
by Shuwei Guo, Cong Yang, Zhizhong Su, Wei Sui, Xun Liu, Minglu Zhu and Tao Chen
Eng. Proc. 2025, 110(1), 1; https://doi.org/10.3390/engproc2025110001 - 25 Sep 2025
Viewed by 608
Abstract
The evolution of Human–Computer Interaction (HCI) has laid the foundation for more immersive and dynamic forms of communication between humans and machines. Building on this trajectory, this work introduces a significant advancement in the domain of Human–Robot Manipulation (HRM), particularly in the remote operation of humanoid robots in complex scenarios. We propose the Advanced Manipulation Assistant System (AMAS), a novel manipulation method designed to be low cost, low latency, and highly efficient, enabling real-time, precise control of humanoid robots from a distance. This method addresses critical challenges in current teleoperation systems, such as delayed response, expensive hardware requirements, and inefficient data transmission. By leveraging lightweight communication protocols, optimized sensor integration, and intelligent motion mapping, our system ensures minimal lag and accurate reproduction of human movements in the robot counterpart. In addition to these advantages, AMAS integrates multimodal feedback combining visual and haptic cues to enhance situational awareness, close the control loop, and further stabilize teleoperation. This transition from traditional HCI paradigms to advanced HRM reflects a broader shift toward more embodied forms of interaction, where human intent is seamlessly translated into robotic action. The implications are far-reaching, spanning applications in remote caregiving, hazardous environment exploration, and collaborative robotics. AMAS represents a step forward in making humanoid robot manipulation more accessible, scalable, and practical for real-world deployment. Full article
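
The abstract credits AMAS's low latency to lightweight communication and motion mapping but does not describe its wire format. As a generic, illustrative sketch of the kind of minimal pose streaming such systems rely on (packet layout, port, and update rate are assumptions, not the AMAS protocol):

```python
import socket
import struct
import time

# Illustrative packet layout (not the AMAS protocol): a timestamp plus a
# 6-DOF pose (x, y, z in metres; roll, pitch, yaw in radians) packed as
# one little-endian double and six floats = 32 bytes per update.
POSE_FORMAT = "<d6f"
ROBOT_ADDR = ("127.0.0.1", 9000)  # hypothetical robot-side endpoint

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_pose(x, y, z, roll, pitch, yaw):
    """Stream one operator wrist pose; UDP keeps per-update overhead minimal."""
    packet = struct.pack(POSE_FORMAT, time.time(), x, y, z, roll, pitch, yaw)
    sock.sendto(packet, ROBOT_ADDR)

# Example: stream a static pose at roughly 100 Hz for one second.
for _ in range(100):
    send_pose(0.30, 0.05, 0.95, 0.0, 0.1, 0.0)
    time.sleep(0.01)
```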

10 pages, 1329 KB  
Article
Initial Experience with the Saroa Surgical System in Robot-Assisted Hysterectomy: First Clinical Case Series and Haptic Feedback Assessment
by Noriko Oshima, Naoyuki Yoshiki, Yusuke Kohri, Maki Takao and Naoyuki Miyasaka
Medicina 2025, 61(9), 1716; https://doi.org/10.3390/medicina61091716 - 21 Sep 2025
Viewed by 380
Abstract
Background and Objectives: Laparoscopic surgery has evolved with the integration of robotic systems, offering enhanced precision and ergonomic benefits. However, conventional robotic systems often lack haptic feedback and are associated with high cost. The Saroa surgical system is a compact, pneumatically driven robot that integrates real-time haptic feedback, potentially addressing the limitations associated with conventional robotic systems. This preliminary study reports the first clinical use of the Saroa system in gynecologic surgery, aiming to assess its feasibility, safety, and usability in robot-assisted hysterectomy. Materials and Methods: Five patients underwent robot-assisted total laparoscopic hysterectomy using the Saroa surgical system. The clinical outcomes, setup and console times, estimated blood loss, and subjective surgeon evaluation were recorded. Results: All surgeries were successfully completed without any intraoperative complications or the need for conversion to conventional surgery. The median setup time was 12 min, the console time was 211 min, and the median blood loss was 80 mL. Surgeons subjectively noted that the system’s real-time haptic feedback substantially improved precision during vaginal cuff tissue manipulation, based on their tactile sensation and real-time force display, thereby reducing the perceived risk of traction-related tissue injuries. Conclusions: This study represents the first clinical application of the Saroa surgical system in gynecologic surgery. The findings suggest that the system is feasible and safe for robot-assisted hysterectomy. Despite limitations such as small sample size and the absence of objective force data, the favorable surgeon-reported experience highlights the potential value of haptic feedback in improving surgical performance. These results support further investigation through larger, controlled studies and quantitative performance evaluation. Full article
(This article belongs to the Special Issue Clinical Advances in Gynecological Surgery)

24 pages, 4385 KB  
Review
Clinical Applications of Anterior Segment Optical Coherence Tomography in Managing Phakic and Secondary Intraocular Lens Implants: A Comprehensive Review
by José Ignacio Fernández-Vigo, Bárbara Burgos-Blasco, Lucía De-Pablo-Gómez-de-Liaño, Ignacio Almorín-Fernández-Vigo, Pedro Arriola-Villalobos, Diego Ruiz-Casas, Ana Macarro-Merino and José Ángel Fernández-Vigo
Diagnostics 2025, 15(18), 2385; https://doi.org/10.3390/diagnostics15182385 - 19 Sep 2025
Viewed by 548
Abstract
Anterior segment optical coherence tomography (AS-OCT) has emerged as a crucial imaging technique in ophthalmology, particularly for evaluating intraocular structures and the behavior of phakic and secondary intraocular lenses (IOLs). This narrative review summarizes the latest findings and clinical applications of OCT regarding phakic and secondary IOLs, focusing on their effectiveness, safety, and factors influencing performance. Through a comprehensive analysis of current literature, we explore how OCT facilitates the assessment of IOLs on key anatomical parameters—such as vault, angle configuration, lens centration, tilt, and haptic positioning—essential for optimizing surgical outcomes and minimizing postoperative complications. In phakic IOLs, including posterior chamber lenses such as the Implantable Collamer Lens (ICL, STAAR Surgical, Monrovia, CA, USA) and iris-fixated lenses, such as Artiflex (Ophtec BV, Groningen, The Netherlands), OCT enables precise evaluation of the anterior segment, aiding both candidate selection and long-term monitoring. In secondary implants for aphakia—especially iris-fixated lenses like Artisan (Ophtec BV, Groningen, The Netherlands) and sutureless scleral-fixated lenses such as the Carlevale IOL (Soleko, Rome, Italy)—or those implanted via the Yamane technique, OCT provides high-resolution visualization of haptic fixation, IOL stability, and potential complications, including tilt or decentration. This review also highlights comparative insights between fixation techniques, underscores the need for standardized OCT protocols, and discusses the integration of artificial intelligence tools. In summary, the routine use of OCT in the preoperative and postoperative management of phakic and secondary IOLs has been increasingly incorporated into clinical practice, as it enhances clinical decision-making and improves patient outcomes. Full article
(This article belongs to the Section Biomedical Optics)

15 pages, 1297 KB  
Review
Haircutting Robots: From Theory to Practice
by Shuai Li
Automation 2025, 6(3), 47; https://doi.org/10.3390/automation6030047 - 18 Sep 2025
Viewed by 723
Abstract
The field of haircutting robots is poised for a significant transformation, driven by advancements in artificial intelligence, mechatronics, and humanoid robotics. This perspective paper examines the emerging market for haircutting robots, propelled by decreasing hardware costs and a growing demand for automated grooming services. We review foundational technologies, including advanced hair modeling, real-time motion planning, and haptic feedback, and analyze their application in both teleoperated and fully autonomous systems. Key technical requirements and challenges in safety certification are discussed in detail. Furthermore, we explore how cutting-edge technologies like direct-drive systems, large language models, virtual reality, and big data collection can empower these robots to offer a human-like, personalized, and efficient experience. We propose a business model centered on supervised autonomy, which enables early commercialization and sets a path toward future scalability. This perspective paper provides a theoretical and technical framework for the future deployment and commercialization of haircutting robots, highlighting their potential to create a new sector in the automation industry. Full article
(This article belongs to the Section Robotics and Autonomous Systems)

21 pages, 1251 KB  
Review
Haptic Feedback Systems for Lower-Limb Prosthetic Applications: A Review of System Design, User Experience, and Clinical Insights
by Mohammadmahdi Karimi, Nashmin Yeganeh, Ivan Makarov, Atli Örn Sverrisson, Karl Fannar Gunnarsson, Kristín Briem, Sigurður Brynjólfsson, Árni Kristjánsson and Runar Unnthorsson
Bioengineering 2025, 12(9), 989; https://doi.org/10.3390/bioengineering12090989 - 18 Sep 2025
Viewed by 808
Abstract
Systems presenting haptic information have emerged as an important technological advance in assisting individuals with sensory impairments or amputations, where the aim is to enhance sensory perception or provide sensory substitution through tactile feedback. These systems provide information on limb positioning, environmental interactions, and gait events, significantly improving amputees’ mobility and their confidence in using such devices. This review summarizes recent progress in haptic feedback systems by providing a comparative analysis of different feedback approaches, evaluating their clinical effectiveness and usability, tactile feedback system design, and user experience, while identifying key gaps in the literature. These insights can contribute to the advancement of more effective, user-centered haptic feedback systems tailored for lower limb prosthetics. The findings are aimed at guiding future research in designing adaptive, intuitive, and clinically viable feedback mechanisms, fostering the widespread implementation of haptic systems in both assistive and rehabilitative applications. Full article
(This article belongs to the Section Biomechanics and Sports Medicine)

21 pages, 2603 KB  
Article
Sensing What You Do Not See: Alerting of Approaching Objects with a Haptic Vest
by Albina Rurenko, Devbrat Anuragi, Ahmed Farooq, Marja Salmimaa, Zoran Radivojevic, Sanna Kumpulainen and Roope Raisamo
Sensors 2025, 25(18), 5808; https://doi.org/10.3390/s25185808 - 17 Sep 2025
Viewed by 636
Abstract
Workplace accidents in high-risk environments remain a major safety concern, particularly when workers’ visual and auditory channels are overloaded. Haptic feedback offers a promising alternative for alerting individuals to unseen dangers and enhancing situational awareness. Motivated by challenges commonly observed in construction, this study investigates haptic alerting strategies applicable across dynamic, attentionally demanding contexts. We present two empirical experiments exploring how wearable vibration cues can inform users about approaching objects outside their field of view. The first experiment evaluated variations of pattern-based vibrations to simulate motion and examined the relationship between signal parameters and perceived urgency. A negative correlation between urgency and pulse duration emerged, identifying a key design factor. The second experiment conducted a novel comparison of pattern-based and location-based haptic alerts in a complex virtual environment, with tasks designed to simulate cognitive engagement with work processes. Results indicate that location-based alerts were more efficient for hazard detection. These findings offer insights into the design of effective user-centred haptic-based safety systems and provide a foundation for future development and deployment in real-world settings. This work contributes a generalisable step toward wearable alerting technologies for safety-critical occupations, including but not limited to construction. Full article
(This article belongs to the Section Wearables)
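
The location-based strategy that proved more efficient maps an approaching object's direction and range onto a specific vest actuator and vibration intensity. A minimal sketch of that idea, with an assumed eight-motor ring and a linear distance-to-intensity ramp (not the authors' implementation):

```python
import math

NUM_MOTORS = 8  # assumed ring of actuators around the torso (not the paper's layout)

def location_based_alert(bearing_deg, distance_m, max_range_m=5.0):
    """Map an approaching object's bearing to a vest motor and its distance
    to a vibration intensity. Purely illustrative parameters."""
    # Pick the motor whose angular sector contains the object's bearing.
    sector = 360.0 / NUM_MOTORS
    motor = int(((bearing_deg % 360.0) + sector / 2) // sector) % NUM_MOTORS
    # Closer objects drive stronger vibration (linear ramp, clamped to [0, 1]).
    intensity = max(0.0, min(1.0, 1.0 - distance_m / max_range_m))
    return motor, intensity

# Example: an object approaching from behind-left at 2 m.
print(location_based_alert(bearing_deg=225.0, distance_m=2.0))
```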

33 pages, 1120 KB  
Review
Wearables in ADHD: Monitoring and Intervention—Where Are We Now?
by Mara-Simina Olinic, Roland Stretea and Cristian Cherecheș
Diagnostics 2025, 15(18), 2359; https://doi.org/10.3390/diagnostics15182359 - 17 Sep 2025
Cited by 1 | Viewed by 1059
Abstract
Introduction: Wearable devices capable of continuously sampling movement, autonomic arousal and neuro-electrical activity are emerging as promising complements to traditional assessment and treatment of Attention-Deficit/Hyperactivity Disorder (ADHD). By moving data collection from the clinic to everyday settings, these technologies offer an unprecedented window onto the moment-to-moment fluctuations that characterise the condition. Methods: Drawing on a comprehensive literature search spanning 2013 to February 2025 across biomedical and engineering databases, we reviewed empirical studies that used commercial or research-grade wearables for ADHD-related diagnosis, monitoring or intervention. Titles and abstracts were screened against predefined inclusion criteria, with full-text appraisal and narrative synthesis of the eligible evidence. A narrative synthesis was conducted, with inclusion criteria targeting empirical studies of wearable devices applied to ADHD for monitoring, mixed monitoring-plus-intervention, or intervention-only applications. No quantitative pooling was undertaken due to heterogeneity of designs, endpoints, and analytic methods. Results: The reviewed body of work demonstrates that accelerometers, heart-rate and electrodermal sensors, and lightweight EEG headsets can enrich clinical assessment by capturing ecologically valid markers of hyperactivity, arousal and attentional lapses. Continuous monitoring studies suggest that wearable-derived metrics align with symptom trajectories and medication effects, while early intervention trials explore haptic prompts, attention-supporting apps and non-invasive neuromodulation delivered through head-worn devices. Across age groups, participants generally tolerate these tools well and value the objective feedback they provide. Nevertheless, the literature is limited by heterogeneous study designs, modest sample sizes and short follow-up periods, making direct comparison and clinical translation challenging. Conclusions: Current evidence paints an optimistic picture of the feasibility and acceptability of wearables in ADHD, yet larger, standardised and longer-term investigations are needed to confirm their clinical utility. Collaboration between clinicians, engineers and policymakers will be crucial to address data-privacy, equity and cost-effectiveness concerns and to integrate wearable technology into routine ADHD care. Full article

18 pages, 3097 KB  
Article
Deep Neural Network-Based Alignment of Virtual Reality onto a Haptic Device for Visuo-Haptic Mixed Reality
by Hyeonsu Kim, Hanbit Yong and Myeongjin Kim
Appl. Sci. 2025, 15(18), 10071; https://doi.org/10.3390/app151810071 - 15 Sep 2025
Viewed by 338
Abstract
Precise alignment between virtual reality (VR) and haptic interfaces is essential for delivering an immersive visuo-haptic mixed reality experience. Existing methods typically depend on markers, external trackers, or cameras, which can be intrusive and hinder usability. In addition, previous network-based approaches generally rely on image data for alignment. This paper introduces a deep neural network-based alignment method that eliminates the need for such external components. Unlike existing methods, our approach is designed based on coordinate transformation and leverages a network model for alignment. The proposed method utilizes the head-mounted display (HMD) position, fingertip position obtained via hand tracking, and the six-degrees-of-freedom (6-DOF) pose of a haptic device’s end-effector as inputs to a neural network model. A shared multi-layer perceptron and max pooling layer are employed to extract global feature vectors from the inputs, ensuring permutation invariance. The extracted feature vectors are then processed through fully connected layers to estimate the pose of the haptic device’s base. Experimental results show a mean positional error of 2.718 mm and a mean rotation error of 0.5330°, which equates to 1.3% relative to the haptic device’s maximum length. The proposed method demonstrates robustness to noise, indicating its applicability across various domains, including medical simulations, virtual prototyping, and interactive training environments. Full article
(This article belongs to the Special Issue Advances in Human–Machine Interaction)
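
The abstract describes a PointNet-style pipeline: a shared MLP over the three tracked inputs, max pooling for permutation invariance, and fully connected layers regressing the base pose. A hedged PyTorch sketch of that structure; the layer widths, the zero-padding of 3-D positions to match the 6-DOF pose, and the 7-D position-plus-quaternion output are assumptions rather than the paper's values:

```python
import torch
import torch.nn as nn

class BasePoseEstimator(nn.Module):
    """Shared-MLP + max-pooling network in the spirit of the abstract.
    Layer widths, the zero-padding of 3-D points to 6-D, and the 7-D output
    (3-D position + unit quaternion) are assumptions, not the paper's values."""

    def __init__(self, in_dim=6, feat_dim=128):
        super().__init__()
        # Shared MLP applied independently to each input token (HMD position,
        # fingertip position, end-effector pose); max pooling over the token
        # dimension makes the global feature order-independent.
        self.shared_mlp = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Linear(feat_dim, 64), nn.ReLU(),
            nn.Linear(64, 7),  # base pose: x, y, z + quaternion (assumed)
        )

    def forward(self, hmd_pos, finger_pos, ee_pose):
        # Pad 3-D positions to the 6-D pose length so one MLP can be shared.
        pad = torch.zeros_like(hmd_pos)
        tokens = torch.stack(
            [torch.cat([hmd_pos, pad], -1),
             torch.cat([finger_pos, pad], -1),
             ee_pose], dim=1)                    # (B, 3 tokens, 6)
        feats = self.shared_mlp(tokens)          # (B, 3, feat_dim)
        global_feat = feats.max(dim=1).values    # permutation-invariant pooling
        return self.head(global_feat)            # (B, 7) estimated base pose

# Example forward pass with a batch of two synthetic samples.
model = BasePoseEstimator()
out = model(torch.randn(2, 3), torch.randn(2, 3), torch.randn(2, 6))
print(out.shape)  # torch.Size([2, 7])
```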

14 pages, 2032 KB  
Article
Surface Reading Model via Haptic Device: An Application Based on Internet of Things and Cloud Environment
by Andreas P. Plageras, Christos L. Stergiou, Vasileios A. Memos, George Kokkonis, Yutaka Ishibashi and Konstantinos E. Psannis
Electronics 2025, 14(16), 3185; https://doi.org/10.3390/electronics14163185 - 11 Aug 2025
Viewed by 523
Abstract
In this research paper, we implemented an XML-based computer program that conveys differences in image color depth through haptic/tactile devices. With the use of “Bump Map” techniques and tools such as “Autodesk’s 3D Studio Max”, “Adobe Photoshop”, and “Adobe Illustrator”, we were able to obtain the desired results. The haptic devices used for the experiments were the “PHANTOM Touch” and the “PHANTOM Omni R” from “3D Systems”. The software installed and configured to model the surfaces and run the experiments comprised “H3D Api”, “Geomagic_OpenHaptics”, and “OpenHaptics_Developer_Edition”. The purpose of this project was to let users feel different textures, shapes, and objects in images by using a haptic device. The primary objective was to create a system from the ground up that renders visuals on the screen and facilitates interaction with them via the haptic device. The main focus of this work is to propose a novel pattern of images that can be classified as different textures so that they can be identified by people with reduced vision. Full article
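
The tactile rendering described here starts from a bump map derived from image color depth. A minimal sketch of that first step, turning an image's luminance into a normalized height map that a haptic renderer could sample; the grayscale mapping and the synthetic demo image are illustrative, not the paper's H3D pipeline:

```python
import numpy as np
from PIL import Image

def image_to_bump_map(img):
    """Convert a PIL image's luminance into a normalized height (bump) map.
    A haptic renderer could read these heights as surface relief; the mapping
    is illustrative, not the paper's H3D pipeline."""
    gray = np.asarray(img.convert("L"), dtype=np.float32)
    return gray / 255.0            # brighter pixels -> higher relief, in [0, 1]

# Synthetic example image: a horizontal gradient standing in for a texture.
demo = Image.fromarray(np.tile(np.arange(256, dtype=np.uint8), (64, 1)))
bump = image_to_bump_map(demo)

# Sample the relief under a (hypothetical) haptic cursor position in pixels.
row, col = 32, 200
print(f"relief at cursor: {bump[row, col]:.2f}")
```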

15 pages, 2908 KB  
Article
Bioinspired Design of Ergonomic Tool Handles Using 3D-Printed Cellular Metamaterials
by Gregor Harih and Vasja Plesec
Biomimetics 2025, 10(8), 519; https://doi.org/10.3390/biomimetics10080519 - 8 Aug 2025
Viewed by 1069
Abstract
The design of ergonomic tool handles is crucial for user comfort and performance, yet conventional stiff materials often lead to uneven pressure distribution and discomfort. This study investigates the application of 3D-printed cellular metamaterials with tunable stiffness, specifically gyroid structures, to enhance the ergonomic and haptic properties of tool handles. We employed finite element analysis to simulate finger–handle interactions and conducted subjective comfort evaluations with participants using a foxtail saw with handles of varying gyroid infill densities and a rigid PLA handle. Numerical results demonstrated that handles with medium stiffness significantly reduced peak contact pressures and promoted a more uniform pressure distribution compared to the stiff PLA handle. The softest gyroid handle, while compliant, exhibited excessive deformation, potentially compromising stability. Subjective comfort ratings corroborated these findings, with medium-stiffness handles receiving the highest scores for overall comfort, fit, and force transmission. These results highlight that a plateau-like mechanical response of the 3D-printed cellular metamaterial handle, inversely bioinspired by human soft tissue, effectively balances pressure redistribution and grip stability. This bioinspired design approach offers a promising direction for developing user-centered products that mitigate fatigue and discomfort in force-intensive tasks. Full article
(This article belongs to the Special Issue 3D Bio-Printing for Regenerative Medicine Applications)

27 pages, 6183 KB  
Article
A Cartesian Parallel Mechanism for Initial Sonography Training
by Mykhailo Riabtsev, Jean-Michel Guilhem, Victor Petuya, Mónica Urizar and Med Amine Laribi
Robotics 2025, 14(7), 95; https://doi.org/10.3390/robotics14070095 - 10 Jul 2025
Cited by 1 | Viewed by 538
Abstract
This paper presents the development and analysis of a novel 6-DOF Cartesian parallel mechanism intended for use as a haptic device for initial sonography training. The system integrates a manipulator designed for delivering force feedback in five degrees of freedom; however, at the current stage, only the mechanical architecture and kinematic validation have been completed. Future enhancements will focus on implementing and evaluating closed-loop force control to enable complete haptic feedback. To assess the kinematic performance of the mechanism, a detailed kinematic model was developed, and both the Kinematic Conditioning Index (KCI) and Global Conditioning Index (GCI) were computed to evaluate the system’s dexterity. A trajectory simulation was conducted to validate the mechanism’s movement, using motion patterns typical in sonography procedures. Quasi-static analysis was performed to study the transmission of force and torque for generating realistic haptic feedback, critical for simulating real-life sonography. The simulation results showed consistent performance, with dexterity and torque distribution confirming the suitability of the mechanism for haptic applications in sonography training. Additionally, structural analysis verified the robustness of key components under expected loads. In order to validate the proposed design, the prototype was constructed using a combination of aluminum components and 3D-printed ABS parts, with Igus® linear guides for precise motion. The outcomes of this study provide a foundation for the further development of a low-cost, effective sonography training system. Full article
(This article belongs to the Section Medical Robotics and Service Robotics)
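
Both dexterity indices derive from the conditioning of the mechanism's Jacobian: the local index is the inverse condition number at a configuration, and the GCI averages it over the workspace (exact definitions vary slightly in the literature). A sketch of that computation with a placeholder jacobian(q), since the actual parallel kinematics are not given in the abstract:

```python
import numpy as np

def jacobian(q):
    """Placeholder for the mechanism's Jacobian at joint configuration q.
    The real 6-DOF Cartesian parallel kinematics are not given in the abstract,
    so a synthetic configuration-dependent matrix stands in for illustration."""
    return np.eye(6) + 0.1 * np.outer(np.sin(q), np.cos(q))

def local_conditioning_index(q):
    """Inverse condition number of J(q): 1 = isotropic, 0 = singular."""
    return 1.0 / np.linalg.cond(jacobian(q))

def global_conditioning_index(samples):
    """GCI as the average local index over sampled workspace configurations."""
    return float(np.mean([local_conditioning_index(q) for q in samples]))

rng = np.random.default_rng(0)
configs = rng.uniform(-np.pi, np.pi, size=(1000, 6))  # assumed joint limits
print("local index at home pose:", round(local_conditioning_index(np.zeros(6)), 3))
print("GCI over sampled workspace:", round(global_conditioning_index(configs), 3))
```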

27 pages, 10314 KB  
Article
Immersive Teleoperation via Collaborative Device-Agnostic Interfaces for Smart Haptics: A Study on Operational Efficiency and Cognitive Overflow for Industrial Assistive Applications
by Fernando Hernandez-Gobertti, Ivan D. Kudyk, Raul Lozano, Giang T. Nguyen and David Gomez-Barquero
Sensors 2025, 25(13), 3993; https://doi.org/10.3390/s25133993 - 26 Jun 2025
Viewed by 1074
Abstract
This study presents a novel investigation into immersive teleoperation systems using collaborative, device-agnostic interfaces for advancing smart haptics in industrial assistive applications. The research focuses on evaluating the quality of experience (QoE) of users interacting with a teleoperation system comprising a local robotic arm, a robot gripper, and heterogeneous remote tracking and haptic feedback devices. By employing a modular device-agnostic framework, the system supports flexible configurations, including one-user-one-equipment (1U-1E), one-user-multiple-equipment (1U-ME), and multiple-users-multiple-equipment (MU-ME) scenarios. The experimental set-up involves participants manipulating predefined objects and placing them into designated baskets by following specified 3D trajectories. Performance is measured using objective QoE metrics, including temporal efficiency (time required to complete the task) and spatial accuracy (trajectory similarity to the predefined path). In addition, subjective QoE metrics are assessed through detailed surveys, capturing user perceptions of presence, engagement, control, sensory integration, and cognitive load. To ensure flexibility and scalability, the system integrates various haptic configurations, including (1) a Touch kinaesthetic device for precision tracking and grounded haptic feedback, (2) a DualSense tactile joystick as both a tracker and mobile haptic device, (3) a bHaptics DK2 vibrotactile glove with a camera tracker, and (4) a SenseGlove Nova force-feedback glove with VIVE trackers. The modular approach enables comparative analysis of how different device configurations influence user performance and experience. The results indicate that the objective QoE metrics varied significantly across device configurations, with the Touch and SenseGlove Nova set-ups providing the highest trajectory similarity and temporal efficiency. Subjective assessments revealed a strong correlation between presence and sensory integration, with users reporting higher engagement and control in scenarios utilizing force feedback mechanisms. Cognitive load varied across the set-ups, with more complex configurations (e.g., 1U-ME) requiring longer adaptation periods. This study contributes to the field by demonstrating the feasibility of a device-agnostic teleoperation framework for immersive industrial applications. It underscores the critical interplay between objective task performance and subjective user experience, providing actionable insights into the design of next-generation teleoperation systems. Full article
(This article belongs to the Special Issue Recent Development of Flexible Tactile Sensors and Their Applications)
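
The objective QoE metrics here are completion time and the similarity of the executed trajectory to the reference path. The abstract does not specify the similarity measure, so the sketch below assumes a simple resample-and-average point-wise deviation as a stand-in:

```python
import numpy as np

def resample(traj, n=200):
    """Resample an (N, 3) trajectory to n points by arc-length interpolation."""
    traj = np.asarray(traj, dtype=float)
    seg = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    s_new = np.linspace(0.0, s[-1], n)
    return np.column_stack([np.interp(s_new, s, traj[:, k]) for k in range(3)])

def spatial_accuracy(executed, reference):
    """Mean point-wise deviation (metres) after resampling both paths;
    an assumed stand-in for the paper's trajectory-similarity metric."""
    a, b = resample(executed), resample(reference)
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

# Hypothetical example: a noisy execution of a straight 3D reference path.
reference = np.column_stack([np.linspace(0, 1, 50), np.zeros(50), np.zeros(50)])
executed = reference + np.random.default_rng(1).normal(0, 0.01, reference.shape)
completion_time_s = 14.2  # temporal efficiency (illustrative value)
print(f"deviation: {spatial_accuracy(executed, reference):.4f} m, "
      f"time: {completion_time_s} s")
```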
