Velocity-Oriented Dynamic Control–Display Gain for Kinesthetic Interaction with a Grounded Force-Feedback Device
Abstract
1. Introduction
How does the velocity-oriented dynamic CD gain affect user performance in kinesthetic interaction, in terms of interaction speed, positioning accuracy and touch perception?
2. Background
2.1. Professional Applications and Challenges of Kinesthetic Interaction
2.2. CD Gain Techniques in Human–Computer Interaction
3. Method
3.1. Velocity-Oriented Dynamic CD Gain for Kinesthetic Interaction
3.2. Experiment Design
3.3. Pilot Study
- The minimum and maximum velocity thresholds (0.05 m/s and 0.25 m/s) were selected based on users' hand movement velocity in 3D space while holding the arm of the force-feedback device. In setting these values, we also considered the velocity thresholds used by the PRISM technique, which likewise relies on hand movement speed. Both values were verified in the pilot study (see the sketch after this list for how they bound the gain mapping).
- The size of the virtual objects (4.5 cm) and their spatial positions (see Table 1) were selected based on the workspace of the force-feedback device and the size of the 2D screen. The objects were easy to interact with, while the easy-to-reach and difficult-to-reach groups still differed clearly in horizontal distance (along the x- and y-axes) and depth (along the z-axis).
- The softness parameter values (see Table 1) were chosen so that every difference in softness was perceivable by the participants, given the sensitivity of the force-feedback device and the limits of human kinesthetic perception. In the tasks with high perception difficulty, participants had to pay closer attention when comparing two soft tissues to judge which one was harder.
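A minimal sketch of a velocity-oriented dynamic CD gain is given below for illustration. Only the velocity thresholds (0.05 m/s and 0.25 m/s) and the range of maximum gains evaluated in the experiment (1 to 3.5) come from the study; the linear interpolation between a baseline gain of 1 and the maximum gain, as well as all function and variable names, are assumptions.

```python
# Hedged sketch: velocity-oriented dynamic CD gain.
# Only the velocity thresholds (0.05 and 0.25 m/s) and the range of
# maximum gains (1.0-3.5) are taken from the study; the linear ramp
# between them is an assumed mapping for illustration.

V_MIN = 0.05   # m/s: below this speed the gain stays at the baseline
V_MAX = 0.25   # m/s: above this speed the gain saturates at the maximum
G_MIN = 1.0    # baseline CD gain (direct 1:1 mapping)

def cd_gain(hand_speed: float, g_max: float = 2.0) -> float:
    """Return the CD gain for the current hand speed (m/s)."""
    if hand_speed <= V_MIN:
        return G_MIN
    if hand_speed >= V_MAX:
        return g_max
    # Linear interpolation between the two thresholds (assumed form).
    t = (hand_speed - V_MIN) / (V_MAX - V_MIN)
    return G_MIN + t * (g_max - G_MIN)

def update_cursor(cursor_pos, hand_delta, hand_speed, g_max=2.0):
    """Move the virtual cursor by the scaled hand displacement."""
    g = cd_gain(hand_speed, g_max)
    return [c + g * d for c, d in zip(cursor_pos, hand_delta)]

# Example: a hand speed of 0.15 m/s lies halfway between the thresholds.
print(cd_gain(0.15, g_max=2.0))   # -> 1.5
```

In this form, slow and careful hand movements keep the cursor near a 1:1 mapping for precise positioning, while fast movements scale the displacement up toward the maximum gain to cover larger on-screen distances.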
4. Experiment
4.1. Apparatus
4.2. Participants
4.3. Procedure
5. Results
5.1. Task Completion Time
5.2. Positioning Accuracy
5.3. Touch Perception
5.4. Subjective Response
6. Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Saddik, A.E.; Orozco, M.; Eid, M.; Cha, J. Haptics Technologies: Bringing Touch to Multimedia; Springer: Berlin, Germany, 2011.
- HaptX Gloves. Available online: https://haptx.com/ (accessed on 27 November 2022).
- Geomagic Touch Haptic Device. Available online: https://www.3dsystems.com/haptics-devices/touch-x (accessed on 27 November 2022).
- Force Dimension Omega Device. Available online: https://www.forcedimension.com/products (accessed on 27 November 2022).
- Massie, T.H.; Salisbury, J.K. The Phantom Haptic Interface: A Device for Probing Virtual Objects. In Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, IL, USA, 13–18 November 1994.
- Read, A.; Ritchie, J.; Lim, T. A UNITY sketch based modelling environment for virtual assembly and machining to evaluate DFMA metrics. In Proceedings of the ASME International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Charlotte, NC, USA, 21–24 August 2016.
- Licona, A.R.; Liu, F.; Pinzon, D.; Torabi, A.; Boulanger, P.; Lelevé, A.; Moreau, R.; Pham, M.T.; Tavakoli, M. Applications of Haptics in Medicine. In Haptic Interfaces for Accessibility, Health, and Enhanced Quality of Life; McDaniel, T., Panchanathan, S., Eds.; Springer: Cham, Switzerland, 2020; pp. 183–214.
- Kinnison, T.; Forrest, N.D.; Frean, S.P.; Baillie, S. Teaching bovine abdominal anatomy: Use of a haptic simulator. Anat. Sci. Educ. 2009, 2, 280–285.
- Okamura, A.M. Methods for haptic feedback in teleoperated robot-assisted surgery. Ind. Robot Int. J. 2004, 31, 499–508.
- Medellín-Castillo, H.I.; Govea-Valladares, E.H.; Pérez-Guerrero, C.N.; Gil-Valladares, J.; Lim, T.; Ritchie, J.M. The evaluation of a novel haptic-enabled virtual reality approach for computer-aided cephalometry. Comput. Methods Programs Biomed. 2016, 130, 46–53.
- Kangas, J.; Li, Z.; Raisamo, R. Expert evaluation of haptic virtual reality user interfaces for medical landmarking. In Proceedings of the CHI Conference on Human Factors in Computing Systems Extended Abstracts, New Orleans, LA, USA, 29 April–5 May 2022.
- Argelaguet, F.; Andujar, C. A survey of 3D object selection techniques for virtual environments. Comput. Graph. 2013, 37, 121–136.
- Casiez, G.; Roussel, N. No more bricolage! Methods and tools to characterize, replicate and compare pointing transfer functions. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, Santa Barbara, CA, USA, 16–19 October 2011.
- Casiez, G.; Vogel, D.; Balakrishnan, R.; Cockburn, A. The impact of control-display gain on user performance in pointing tasks. Hum.-Comput. Interact. 2008, 23, 215–250.
- Frees, S.; Kessler, G.D. Precise and rapid interaction through scaled manipulation in immersive virtual environments. In Proceedings of the IEEE Virtual Reality 2005, Bonn, Germany, 12–16 March 2005.
- König, W.A.; Gerken, J.; Dierdorf, S.; Reiterer, H. Adaptive pointing—Design and evaluation of a precision enhancing technique for absolute pointing devices. In Proceedings of the IFIP Conference on Human-Computer Interaction, Uppsala, Sweden, 24–28 August 2009.
- Li, Z.; Akkil, D.; Raisamo, R. The impact of control-display gain in kinesthetic search. In Proceedings of the 12th International Conference on Haptics: Science, Technology, Applications, Leiden, The Netherlands, 6–9 September 2020.
- Jellinek, H.D.; Card, S.K. Powermice and user performance. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Seattle, WA, USA, 1–5 April 1990.
- Lin, M.L.; Radwin, R.G.; Vanderheiden, G.C. Gain effects on performance using a head-controlled computer input device. Ergonomics 1992, 35, 159–175.
- Teklemariam, H.G.; Das, A.K. A case study of phantom omni force feedback device for virtual product design. Int. J. Interact. Des. Manuf. 2017, 11, 881–892.
- Ribeiro, M.L.; Lederman, H.M.; Elias, S.; Nunes, F.L.S. Techniques and devices used in palpation simulation with haptic feedback. ACM Comput. Surv. 2017, 49, 1–28.
- Ortmaier, T.; Deml, B.; Kübler, B.; Passig, G.; Reintsema, D.; Seibold, U. Robot assisted force feedback surgery. In Advances in Telerobotics; Springer Tracts in Advanced Robotics; Ferre, M., Buss, M., Eds.; Springer: Berlin, Germany, 2007; Volume 31, pp. 361–379.
- Li, Z.; Kiiveri, M.; Rantala, J.; Raisamo, R. Evaluation of haptic virtual reality user interfaces for medical marking on 3D models. Int. J. Hum. Comput. Stud. 2021, 147, 102561.
- Hinckley, K.; Tullio, J.; Pausch, R.; Proffitt, D.; Kassell, N. Usability analysis of 3D rotation techniques. In Proceedings of the 10th Annual ACM Symposium on User Interface Software and Technology, Banff, AB, Canada, 14–17 October 1997.
- Bowman, D.A.; Kruijff, E.; LaViola, J.J., Jr.; Poupyrev, I. 3D User Interfaces: Theory and Practice; Addison-Wesley Professional: Boston, MA, USA, 2004.
- Kim, H.; Choi, Y. Performance comparison of user interface devices for controlling mining software in virtual reality environments. Appl. Sci. 2019, 9, 2584.
- Johnsgard, T. Fitts' law with a virtual reality glove and a mouse: Effects of gain. In Proceedings of the Graphics Interface 1994, Banff, AB, Canada, 18–20 May 1994.
- Accot, J.; Zhai, S. Scale effects in steering law tasks. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Seattle, WA, USA, 31 March–5 April 2001.
- Kwon, S.; Choi, E.; Chung, M.K. Effect of control-to-display gain and movement direction of information spaces on the usability of navigation on small touch-screen interfaces using tap-n-drag. Int. J. Ind. Ergon. 2011, 41, 322–330.
- Poupyrev, I.; Billinghurst, M.; Weghorst, S.; Ichikawa, T. The go-go interaction technique: Non-linear mapping for direct manipulation in VR. In Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology, Seattle, WA, USA, 6–8 November 1996.
- Lecuyer, A.; Coquillart, S.; Kheddar, A.; Richard, P.; Coiffet, P. Pseudo-haptic feedback: Can isometric input devices simulate force feedback? In Proceedings of the IEEE Virtual Reality 2000, New Brunswick, NJ, USA, 18–22 March 2000.
- Samad, M.; Gatti, E.; Hermes, A.; Benko, H.; Parise, C. Pseudo-Haptic Weight. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019.
- H3D API. Available online: https://h3d.org/ (accessed on 27 November 2022).
- Wobbrock, J.O.; Findlater, L.; Gergle, D.; Higgins, J.J. The aligned rank transform for nonparametric factorial analyses using only ANOVA procedures. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011.
- Holm, S. A simple sequentially rejective multiple test procedure. Scand. J. Stat. 1979, 6, 65–70. Available online: https://www.jstor.org/stable/4615733 (accessed on 27 November 2022).
- Berthier, N.E.; Clifton, R.K.; Gullapalli, V.; McCall, D.D.; Robin, D.J. Visual information and object size in the control of reaching. J. Mot. Behav. 1996, 28, 187–197.
Table 1. Object positions used in the first task and softness parameters used in the second task.

| Position Values (x, y, z), First Task | Levels | Softness Parameter (k), Second Task | Levels |
|---|---|---|---|
| (0.08, 0.06, 0.06), (0.08, −0.06, 0.04), (−0.08, 0.06, 0.04), (−0.08, −0.06, 0.06) | Easy to reach | 0.0585 and 0.0405 (difference 0.018) | Easy to perceive |
| (0.1, 0.08, 0.04), (0.1, −0.08, 0.06), (−0.1, 0.08, 0.06), (−0.1, −0.08, 0.04) | Easy to reach | 0.057 and 0.042 (difference 0.015) | Easy to perceive |
| (0.12, 0.1, 0.06), (0.12, −0.1, 0.04), (−0.12, 0.1, 0.04), (−0.12, −0.1, 0.06) | Easy to reach | 0.0555 and 0.0435 (difference 0.012) | Easy to perceive |
| (0.15, 0.12, 0.07), (0.15, −0.12, 0.03), (−0.15, 0.12, 0.03), (−0.15, −0.12, 0.07) | Difficult to reach | 0.054 and 0.045 (difference 0.009) | Difficult to perceive |
| (0.17, 0.14, 0.03), (0.17, −0.14, 0.07), (−0.17, 0.14, 0.07), (−0.17, −0.14, 0.03) | Difficult to reach | 0.0525 and 0.0465 (difference 0.006) | Difficult to perceive |
| (0.19, 0.16, 0.07), (0.19, −0.16, 0.03), (−0.19, 0.16, 0.03), (−0.19, −0.16, 0.07) | Difficult to reach | 0.051 and 0.048 (difference 0.003) | Difficult to perceive |
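For illustration, the sketch below shows one common way a softness parameter such as k in Table 1 could drive kinesthetic rendering: a spring model in which the resisting force grows with penetration depth. The spring model, the assumed device limits, and all names are illustrative assumptions; only the k values and their pairwise differences come from Table 1.

```python
# Hedged sketch: a spring-model contact force for a deformable object.
# Only the softness parameters k come from Table 1; interpreting k as a
# normalized stiffness (fraction of an assumed maximum stiffness) and the
# capping at an assumed peak device force are illustrative assumptions.

MAX_STIFFNESS = 800.0   # N/m, assumed upper bound for the device
MAX_FORCE = 3.3         # N, assumed peak continuous output force

def contact_force(penetration_m: float, k: float) -> float:
    """Resisting force (N) when the probe penetrates the surface by penetration_m."""
    stiffness = k * MAX_STIFFNESS          # smaller k -> softer object
    return min(stiffness * penetration_m, MAX_FORCE)

# Example: the pair with the smallest difference (k = 0.051 vs. 0.048)
# at 5 mm penetration differs by only about 0.01 N under these assumptions,
# which illustrates why this pair was classified as difficult to perceive.
print(contact_force(0.005, 0.051) - contact_force(0.005, 0.048))
```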
| Task Completion Time (Seconds) | Maximum Gain 1.5 | Maximum Gain 2 | Maximum Gain 2.5 | Maximum Gain 3 | Maximum Gain 3.5 |
|---|---|---|---|---|---|
| M (SD) | 5.28 (2.84) | 5.18 (2.43) | 5.14 (2.12) | 5.27 (2.04) | 5.57 (2.42) |
| Baseline maximum gain 1: M = 6.42 (SD = 3.61) | Z = −4.171, p < 0.001 | Z = −2.600, p = 0.036 | Z = −2.571, p = 0.030 | Z = −1.314, p = 0.378 | Z = −0.857, p = 0.391 |
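The bottom row above reports pairwise comparisons of each maximum-gain condition against the baseline gain of 1 (Z statistics and p values). The sketch below outlines one plausible way to compute such comparisons with Wilcoxon signed-rank tests and Holm's sequentially rejective correction (Holm, 1979, cited above); the choice of test and the placeholder data are assumptions, not the study's exact analysis.

```python
# Hedged sketch: pairwise comparisons of each maximum-gain condition
# against the baseline (gain 1), with Holm-corrected p values.
# The Wilcoxon signed-rank test and the placeholder data are assumptions.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
baseline = rng.normal(6.4, 3.6, size=24)               # placeholder completion times
conditions = {g: rng.normal(5.3, 2.5, size=24) for g in (1.5, 2, 2.5, 3, 3.5)}

# Raw p values for each condition vs. the baseline (paired samples).
raw = {g: wilcoxon(times, baseline).pvalue for g, times in conditions.items()}

def holm(pvals: dict) -> dict:
    """Holm's step-down correction: scale the i-th smallest p value by (m - i)
    and enforce monotonicity, capping at 1."""
    items = sorted(pvals.items(), key=lambda kv: kv[1])
    m, adjusted, running_max = len(items), {}, 0.0
    for i, (key, p) in enumerate(items):
        running_max = max(running_max, (m - i) * p)
        adjusted[key] = min(running_max, 1.0)
    return adjusted

print(holm(raw))
```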
| Error Distances (cm) | Maximum Gain 1 | Maximum Gain 1.5 | Maximum Gain 2 | Maximum Gain 2.5 | Maximum Gain 3 | Maximum Gain 3.5 |
|---|---|---|---|---|---|---|
| M (SD) | 0.49 (0.27) | 0.47 (0.21) | 0.47 (0.20) | 0.48 (0.22) | 0.48 (0.20) | 0.50 (0.19) |
| Number of Errors | Maximum Gain 1.5 | Maximum Gain 2 | Maximum Gain 2.5 | Maximum Gain 3 | Maximum Gain 3.5 |
|---|---|---|---|---|---|
| M (SD) | 1.8 (1.2) | 2.3 (1.3) | 2.4 (1.5) | 2.8 (1.6) | 2.4 (1.3) |
| Baseline maximum gain 1: M = 1.4 (SD = 1.1) | Z = −1.669, p = 0.095 | Z = −2.578, p = 0.040 | Z = −2.462, p = 0.028 | Z = −3.200, p = 0.005 | Z = −2.572, p = 0.030 |