Article

Effects of Using Vibrotactile Feedback on Sound Localization by Deaf and Hard-of-Hearing People in Virtual Environments

Mohammadreza Mirzaei, Peter Kán and Hannes Kaufmann
1 Institute of Visual Computing and Human-Centered Technology, Vienna University of Technology (TU Wien), 1040 Wien, Austria
2 Department of Computer Science, Aarhus University, 8200 Aarhus, Denmark
* Author to whom correspondence should be addressed.
Electronics 2021, 10(22), 2794; https://doi.org/10.3390/electronics10222794
Submission received: 17 October 2021 / Revised: 9 November 2021 / Accepted: 11 November 2021 / Published: 15 November 2021

Abstract

Sound source localization is important for spatial awareness and immersive Virtual Reality (VR) experiences. Deaf and Hard-of-Hearing (DHH) persons have limitations in completing sound-related VR tasks efficiently because they perceive audio information differently. This paper presents and evaluates a special haptic VR suit that helps DHH persons efficiently complete sound-related VR tasks. Our proposed VR suit receives sound information from the VR environment wirelessly and indicates the direction of the sound source to the DHH user by using vibrotactile feedback. Our study suggests that using different setups of the VR suit can significantly improve VR task completion times compared to not using a VR suit. Additionally, the results of mounting haptic devices on different positions of users’ bodies indicate that DHH users can complete a VR task significantly faster when two vibro-motors are mounted on their arms and ears compared to their thighs. Our quantitative and qualitative analysis demonstrates that DHH persons prefer using the system without the VR suit and prefer mounting vibro-motors in their ears. In an additional study, we did not find a significant difference in task completion time when using four vibro-motors with the VR suit compared to using only two vibro-motors in users’ ears without the VR suit.

1. Introduction

Deafness and hearing loss affect millions of people around the world (World Federation of the Deaf (Available Online: http://wfdeaf.org/our-work/ (accessed on 5 October 2021))). Hearing loss varies in severity and cause and can affect many aspects of Deaf and Hard-of-Hearing (DHH) persons’ social lives [1]. Recent advances in technology and medicine, such as Cochlear Implants (CIs), help DHH people to use and enjoy technology more than before [2], but CI technology has disadvantages: it is costly and not practical for all DHH persons [3], it requires invasive surgery, and the surgery’s success rate depends on the person’s age [3]. Therefore, alternative technologies that help DHH persons are beneficial.
One low-cost option is inexpensive vibrotactile devices that transmit information as vibration signals through the DHH person’s body. Previous studies have shown that haptic actuators in wearable devices such as suits, belts, bracelets, shoes, gloves, and chairs can help DHH persons perceive information from their environments [4,5,6]. Because DHH persons perceive sound differently, they have limitations in completing sound-related tasks, especially in Virtual Reality (VR). However, DHH persons can perceive sound information through other senses, such as tactile sensation [7]. They can sense vibrations and feel sounds in the same part of the brain that hearing persons use to hear sounds [8].
Previous research has shown the usability of vibrotactile systems for hearing and DHH persons as they complete different tasks related to navigation [9] and sound awareness [10]. In VR, haptic feedback is usually used in special VR suits or other wearable devices, and it improves the immersive VR experience. A few versions of vibrotactile-based VR suits are available on the market, such as “TactSuit” (Available Online: https://www.bhaptics.com/ (accessed on 5 October 2021)) and “TeslaSuit” (Available Online: https://teslasuit.io/ (accessed on 5 October 2021)), but they have not been tested on DHH users. Haptic VR suits can help DHH persons to perform sound-related VR tasks, but more comprehensive studies are needed about the effects of using different setups of haptic VR suits for DHH users. We also need to know if the number of haptic devices positively affects the completion of sound-related VR tasks among DHH users. This paper investigates the capabilities and effects of using different setups of haptic VR suits among DHH users. Our main hypotheses in this study are as follows:
Hypothesis 1 (H1). 
DHH persons can complete sound-related VR tasks faster using different VR suit setups compared to not using the VR suit.
Hypothesis 2 (H2). 
Increasing the number of haptic devices on a VR suit does not significantly affect the performance of sound source localization in VR among DHH persons.
A special VR suit with four adjustable vibro-motors was designed for this study to analyze different aspects of haptic VR suit setups for DHH users. We intended to find the optimal number of haptic devices (vibro-motors) that DHH persons need to perform sound source localization in VR. In addition, we determined the best positions for mounting vibro-motors on DHH persons’ bodies for completing sound-related VR tasks by analyzing questionnaire results on discomfort scores and the desire to use different haptic setups of our proposed VR suit. In summary, the main contributions of our study are as follows:
  • The effects of using haptic VR suit setups among DHH users on the completion of sound-related VR tasks are analyzed.
  • The optimal number of haptic devices necessary for DHH persons to perform sound source localization in VR is identified.
  • The best positions for mounting haptic devices on DHH persons’ bodies for completing sound-related VR tasks are defined.
The rest of the paper is organized as follows. In Section 2, related work on the use of wearable vibrotactile feedback systems in VR is presented. In Section 3, we explain the study design and methodologies of our approaches. In Section 4, the experimental results of different setups of our proposed haptic VR suit are presented. In Section 5, the results of a complementary study related to the optimal number of haptic devices are presented. In Section 6, we discuss the effects, limitations, and future work of our proposed VR haptic device, and finally, we conclude the paper in Section 7.

2. Related Work

Vibrotactile feedback has been used in many previous studies for navigation and spatial awareness, and it has been shown in various application scenarios to deliver information through the skin as vibrations. Hashizume et al. [9] developed a wearable haptic suit called “LIVEJACKET” that uses vibrotactile feedback to improve the quality of the music experience when listening to digital media. They did not test “LIVEJACKET” on DHH persons, but other studies, such as Petry et al. [5] and Shibasaki et al. [11], used a similar approach to improve the music experience of DHH persons.
Other researchers have tried to deliver haptic cues from virtual environments to users’ bodies. Lindeman et al. [12] implemented a system that delivers vibrotactile stimuli from virtual environments to the user’s whole body. Their system improved the immersive VR experience and the feedback time in critical situations in a VR environment. Kaul et al. [13] proposed “HapticHead”, which utilizes multiple vibrotactile actuators around the head for intuitive haptic guidance through moving tactile cues and was able to effectively guide users towards virtual and real targets in 3D environments. Peng et al. [14] proposed “WalkingVibe”, which uses vibrotactile feedback during walking in VR to reduce VR sickness. Vibrotactile devices have also been used in gloves: Sziebig et al. [15] and Regenbrecht et al. [16] presented vibrotactile gloves with vibration motors that provide sensory feedback from a virtual environment to the user’s hands.
A few other researchers have proposed systems that help DHH persons with sound awareness, such as Saba et al. [6], Jain et al. [10], and Mirzaei et al. [17]. Saba et al. [6] proposed a wearable interaction system called “Hey yaa” that allows DHH persons to call each other using sensory-motor communication through vibration. In a qualitative study, Jain et al. [10] showed the importance of sound awareness and vibrotactile wearable devices for DHH persons. In addition, Mirzaei et al. [17] proposed a wearable system for DHH users called “EarVR” that can be mounted on VR Head-Mounted Displays (HMDs) and locates sound sources in VR environments in real time. Their results suggest that “EarVR” helps DHH persons complete sound-related VR tasks and encourages DHH users to use VR technology more than before [17].
Almost all of these studies show the positive effects of using vibrotactile feedback systems in VR or the real world. However, to the best of our knowledge, none of them investigated the effects of using different setups of haptic VR suits by mounting vibration devices on different body positions of DHH persons for completing sound-related VR tasks.

3. User Study

For our study, we designed a special VR suit with four vibro-motors to demonstrate the four main directions of incoming sounds (front, back, left, and right) for deaf users (Figure 1). This VR suit can deliver vibrotactile cues from a VR environment to DHH persons’ bodies. The vibro-motors are controlled wirelessly using an Arduino (Available Online: https://www.arduino.cc/ (accessed on 5 October 2021)) processing unit with a Bluetooth module.
We conducted a test to investigate the effects of mounting vibro-motors on different sections of DHH persons’ bodies, such as thighs, arms, and ears. At the end of the test, we asked participants to fill out questionnaires about their preferred position for mounting the vibro-motors and the discomfort score of different setups of our proposed VR suit.

3.1. Hardware Design

An Arduino Micro Pro with an HC-06 Bluetooth module controls the coin vibro-motors (10–14 mm) wirelessly from the host computer that runs the VR application. We assembled the processing unit in a mountable package on the back of the VR suit (Figure 2). The whole system is powered by a customized rechargeable lithium-ion (Li-ion) battery with a capacity of 8000 mAh and a voltage of 7.4 V. The vibro-motors operate at 3–4 V and 40–80 mA with a vibration frequency of 150–205 Hz.
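Conceptually, the firmware on the Arduino side only needs to listen for single-character direction commands arriving over the Bluetooth serial link and switch the corresponding vibro-motor on. The following Arduino-style sketch illustrates this idea; it is not the exact firmware used in the study, and the pin numbers, the command characters ('F', 'B', 'L', 'R'), and the use of Serial1 for the HC-06 module are assumptions.

// Minimal Arduino-style sketch (assumed pin mapping and command protocol).
// The HC-06 module is wired to the hardware UART (Serial1 on the Micro Pro),
// and each coin vibro-motor is switched through a transistor driver on a
// digital output pin.

const int MOTOR_FRONT = 5;   // chest
const int MOTOR_BACK  = 6;   // back
const int MOTOR_LEFT  = 9;   // left thigh/arm/ear, depending on the setup
const int MOTOR_RIGHT = 10;  // right thigh/arm/ear

void setAllMotors(int level) {
  digitalWrite(MOTOR_FRONT, level);
  digitalWrite(MOTOR_BACK,  level);
  digitalWrite(MOTOR_LEFT,  level);
  digitalWrite(MOTOR_RIGHT, level);
}

void setup() {
  pinMode(MOTOR_FRONT, OUTPUT);
  pinMode(MOTOR_BACK,  OUTPUT);
  pinMode(MOTOR_LEFT,  OUTPUT);
  pinMode(MOTOR_RIGHT, OUTPUT);
  Serial1.begin(9600);  // HC-06 default baud rate
}

void loop() {
  if (Serial1.available() > 0) {
    char cmd = (char)Serial1.read();
    setAllMotors(LOW);  // only one direction cue is active at a time
    switch (cmd) {
      case 'F': digitalWrite(MOTOR_FRONT, HIGH); break;
      case 'B': digitalWrite(MOTOR_BACK,  HIGH); break;
      case 'L': digitalWrite(MOTOR_LEFT,  HIGH); break;
      case 'R': digitalWrite(MOTOR_RIGHT, HIGH); break;
      default:  break;  // any other byte stops all motors
    }
  }
}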
The flat surfaces of the vibro-motors were placed in close contact with the user’s body so that the vibrations could be felt clearly. Previous studies, such as Rupert [18] and Toney et al. [19], reported that maintaining good contact between vibro-motors and the user’s body is a major problem. They suggest that haptic devices (active motors) be fitted in their positions with an appropriate degree of pressure to ensure that the haptic feedback is perceived. Therefore, we designed our VR suit with Velcro tapes that hold the vibro-motors in fixed positions on the suit during the tests. The Velcro tapes also allowed us to easily change the positions of the vibro-motors on the users’ bodies. We fitted the VR suit to each participant before the main test and asked them to wear thin clothes for the main experiment to ensure that they felt the vibrations from all four vibro-motors on the VR suit.

3.2. Software Design

We designed a simple VR task using the Unreal Engine 4 (UE4) (Available Online: https://www.unrealengine.com/en-US/ (accessed on 5 October 2021)) game engine with an Arduino plugin to communicate with the Arduino processing unit. The Arduino Integrated Development Environment (IDE) was used to develop the code for the Arduino Micro Pro. In the VR task, the player is spawned at the center of an enclosed VR room and can only rotate in place. We added a “FRONT” label to one of the four walls of the VR room to indicate the front direction in the VR environment (Figure 3a). This label serves as an index corresponding to the vibro-motor mounted on the front of the VR suit.
The player had to start the task standing in front of the wall labeled “FRONT”. This procedure let us know the exact positions of the sound sources in the VR environment with respect to the user’s orientation, so we could send the proper signals to the correct vibro-motors on the VR suit. After standing in this position, the player could start the VR task by pressing the grip button on the VR controller and could select sound sources (speakers) by pressing the trigger button with the help of a ray-cast laser pointer (Figure 3b). We designed the task so that every time it was started, only one sound source (speaker) appeared randomly in one of the four main positions in the VR environment (front, back, left, and right).
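In essence, each round of the task picks one of the four canonical positions at random, spawns the speaker there, and measures the time until the player selects it with the ray-cast pointer. The engine-agnostic C++ sketch below outlines this round logic under assumed type and function names; it does not use the actual UE4 classes or the Arduino plugin API of our implementation.

// Engine-agnostic sketch of one task round (all names are illustrative).
#include <chrono>
#include <functional>
#include <random>

enum class Direction { Front, Back, Left, Right };

// Pick one of the four canonical spawn positions uniformly at random.
Direction spawnRandomSpeaker(std::mt19937& rng) {
  std::uniform_int_distribution<int> dist(0, 3);
  return static_cast<Direction>(dist(rng));
}

// Run one round: spawn a speaker, wait until the player selects it, and
// return the completion time in seconds. The playerSelected callback stands
// in for the engine event fired by the ray-cast trigger selection.
double runRound(std::mt19937& rng,
                const std::function<bool(Direction)>& playerSelected) {
  const Direction target = spawnRandomSpeaker(rng);
  const auto start = std::chrono::steady_clock::now();
  while (!playerSelected(target)) {
    // In the real application the game loop ticks here and the haptic cue
    // for the target direction is updated (see Section 3.4).
  }
  const auto end = std::chrono::steady_clock::now();
  return std::chrono::duration<double>(end - start).count();
}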

3.3. Main Experiment

For the main experiment, we prepared three different setups of our proposed haptic VR suit. We used two fixed vibro-motors to represent the front and back directions but changed the positions of the other two vibro-motors in different setups as follows:
  • Setup 1: Two vibro-motors mounted on the front and back and two on the left and right sides of the thighs (left side of the left thigh and right side of the right thigh), Figure 4a;
  • Setup 2: Two vibro-motors mounted on the front and back and two on the left and right sides of the arms (left side of the left arm and right side of the right arm), Figure 4b;
  • Setup 3: Two vibro-motors mounted on the front and back and two on the left and right ears, Figure 4c.
We called these setups “Setup 1 (thighs)”, “Setup 2 (arms)”, and “Setup 3 (ears)”, respectively. We also considered an additional condition without the haptic suit or any other assisting tool. In this condition, deaf users had to locate the sound source using vision alone, rotating in place until they found the rendered 3D model of the sound source (speaker). We called this additional condition “Setup 0 (no suit)” and compared the results of using the VR suit in Setups 1 (thighs), 2 (arms), and 3 (ears) with it.

3.4. Procedure

We recruited 20 DHH participants from the deaf community with no hearing in either ear (12 men, 8 women; ages 20–40, mean = 30.4). We briefly introduced the goal of our experiment to the participants and described the equipment and the test environment. Because none of the participants had tried a haptic suit before, we asked each of them to wear our proposed haptic suit with the different setups before the main experiment and asked for their opinion in an open-ended question. Thus, the participants became familiar with the different setups of the haptic suit and could withdraw from the experiment if they felt uncomfortable with any of them. In addition, we carefully stated all VR safety warnings in our consent form, so all participants were completely aware of them and signed the form before the main experiment. None of the participants withdrew from the experiment.
For “Setup 3 (ears)”, we asked the participants about their preferred position on the head for mounting the vibro-motors. All 20 participants preferred mounting the vibro-motors inside the ears rather than on the temples or behind the ears. We attached soft, flexible plastic covers to the ear vibro-motors to prevent unpleasant sensations when DHH users put them inside their ears. Many participants commented that putting the vibro-motors inside their ears felt like wearing headphones, similar to persons without hearing problems.
For the main experiment, we asked each participant to play each condition (setup) for 10 rounds. The player was able to start the VR task by pressing the grip button on the VR controller. After starting the VR task, one speaker appeared randomly in one of the four main positions in the VR environment (front, back, left, and right) based on the player’s starting position.
Every time a speaker appeared in the VR environment, the vibro-motor corresponding to the speaker’s position vibrated (controlled by the Arduino) so that the player could find the correct speaker position and select it. The vibro-motor on the player’s chest was activated when the sound source was in front of the user, the one on the back when it was behind the user, and the left and right vibro-motors according to the respective sound location. After the player selected the speaker, it disappeared, and the player had to face the wall labeled “FRONT” to start the next round. This process continued until the player had selected 10 speakers (completing 10 rounds).
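The active vibro-motor is therefore determined by the sound direction relative to the player’s orientation. The short sketch below shows one way to quantize this relative angle into the four command characters sent to the suit; the angle convention and the function name are illustrative assumptions rather than our exact implementation.

#include <cmath>

// Quantize the sound direction, measured relative to the player's yaw, into
// one of the four command characters sent to the suit's Arduino unit.
// Angles are in degrees; 0 = straight ahead, positive = clockwise (assumed).
char directionCommand(float soundYawDeg, float playerYawDeg) {
  float rel = std::fmod(soundYawDeg - playerYawDeg, 360.0f);
  if (rel < 0.0f) rel += 360.0f;                 // normalize to [0, 360)
  if (rel < 45.0f || rel >= 315.0f) return 'F';  // front -> chest motor
  if (rel < 135.0f)                 return 'R';  // right -> right motor
  if (rel < 225.0f)                 return 'B';  // back  -> back motor
  return 'L';                                    // left  -> left motor
}

// Example: directionCommand(270.0f, 0.0f) returns 'L'; the host then writes
// this byte to the Bluetooth serial port that the suit listens on.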
For each setup, the completion time of every round was recorded for each player, and the player’s average task completion time for that setup was calculated. Finally, the overall average task completion times of all players were calculated for each setup. A Friedman test with a significance level of α = 0.05 and post hoc Wilcoxon signed-rank tests with Bonferroni correction (significance level α < 0.008) were used to assess statistical significance between the average task completion times of the setups.
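For readers who wish to reproduce the analysis, the sketch below computes the Friedman chi-square statistic from a participants-by-setups matrix of average completion times; it is a simplified illustration that ignores tied ranks, and the post hoc pairwise Wilcoxon signed-rank tests with Bonferroni correction are not shown.

#include <cstddef>
#include <vector>

// Friedman chi-square statistic for an n-participants x k-setups matrix of
// average task completion times. Ties within a participant are not handled
// (they would require averaged ranks and a tie correction).
double friedmanChiSquare(const std::vector<std::vector<double>>& data) {
  const std::size_t n = data.size();          // participants (blocks)
  const std::size_t k = data.front().size();  // setups (treatments)
  std::vector<double> rankSums(k, 0.0);

  for (const auto& row : data) {
    // Rank the k values within this participant: 1 = fastest setup.
    for (std::size_t j = 0; j < k; ++j) {
      double rank = 1.0;
      for (std::size_t m = 0; m < k; ++m)
        if (row[m] < row[j]) rank += 1.0;
      rankSums[j] += rank;
    }
  }

  double sumSq = 0.0;
  for (double r : rankSums) sumSq += r * r;
  // chi^2_F = 12 / (n k (k + 1)) * sum_j R_j^2 - 3 n (k + 1), with df = k - 1.
  return 12.0 / (n * k * (k + 1.0)) * sumSq - 3.0 * n * (k + 1.0);
}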
We asked participants to rate the discomfort of each setup, with the possibility of changing their answers at the end of the experiment (after completing all of the setups). To determine the preferred setup, a questionnaire at the end of the experiment included a 5-point Likert-scale question (“1 = most negative” and “5 = most positive”) about which setup was more desirable to use. To identify the discomfort level, we collected discomfort scores ranging from 1 to 10 (lower value = more comfortable). The preference score and discomfort level were calculated by averaging the scores of all participants for each setup at the end of the experiment. We analyzed the data for each setup based on the participants’ average task completion times and their responses to our questionnaires on discomfort and preferred setup.
We asked the participants to complete one setup per day, starting with Setup 0. Starting the experiment with “Setup 0 (no suit)” and then “Setup 1 (thighs)” was important for our further study because using the sense of touch in the lower body as a cue for audio direction was very unusual for the DHH participants, and they wanted to test it first (after Setup 0). Therefore, we fixed the order of the setups to “Setup 0 (no suit)” on day 1, “Setup 1 (thighs)” on day 2, “Setup 2 (arms)” on day 3, and “Setup 3 (ears)” on day 4.

4. Results

The average task completion time of each setup is shown in Figure 5. Deaf users completed the VR task with average task completion times of 27.05 s for “Setup 0 (no suit)”, 23.39 s for “Setup 1 (thighs)”, 21.7 s for “Setup 2 (arms)”, and 21.41 s for “Setup 3 (ears)”.
The Friedman test revealed a statistically significant effect of the setup on the average task completion times (χ² = 49.020, p < 0.001). Figure 6 shows the Wilcoxon test results for the different setups. The Wilcoxon tests revealed statistically significant differences for “Setups 1 (thighs), 2 (arms), and 3 (ears)” vs. “Setup 0 (no suit)”, as well as for “Setup 2 (arms)” and “Setup 3 (ears)” vs. “Setup 1 (thighs)”, but no significant difference was found for “Setup 2 (arms)” vs. “Setup 3 (ears)”. These results indicate that the position of the vibro-motors on the VR suit affects the task completion times of DHH users.
The results of comparing Setups 1, 2, and 3 vs. Setup 0 indicate that using a haptic VR suit, in any of the tested setups, positively affects the task completion times of DHH users in VR. Therefore, hypothesis H1 is supported. In addition, the results show that the arms and ears are better suited than the thighs for mounting the vibro-motors. Some deaf participants commented that a tactile cue on the upper body, such as the arms and ears, was more familiar to them as a signal to focus attention in a specific direction than a tactile cue on the lower body (thighs). We assume that this is the main reason why mounting vibro-motors on the upper body was more effective for completing sound localization tasks in VR among DHH users. Future studies are required to understand why the arms and ears work better than the thighs for DHH persons.
Figure 7 shows the average responses to the question about the discomfort of Setups 0 (no suit), 1 (thighs), 2 (arms), and 3 (ears). The Friedman test revealed significant differences between the setups in the average discomfort level (χ² = 58.898, p < 0.001).
Figure 8 shows the Wilcoxon test results for the average discomfort level of the different setups. The Wilcoxon tests (with α < 0.008) revealed statistically significant differences for “Setup 0 (no suit)” vs. “Setups 1 (thighs), 2 (arms), and 3 (ears)”, for “Setup 2 (arms)” and “Setup 3 (ears)” vs. “Setup 1 (thighs)”, and for “Setup 3 (ears)” vs. “Setup 2 (arms)”. These results indicate that participants rated “Setup 0 (no suit)” as significantly more comfortable than all of the other setups. Comparing the VR suit setups, “Setup 3 (ears)” was rated as significantly more comfortable than Setups 1 and 2.
Figure 9 shows the frequencies of responses to the question about the desire to use the setups. The Friedman test revealed significant differences between the setups in the desire to use them (χ² = 54.279, p < 0.001).
Figure 10 shows the Wilcoxon test results for the desire to use the different setups. The Wilcoxon tests (with α < 0.008) revealed statistically significant differences for “Setups 1 (thighs), 2 (arms), and 3 (ears)” vs. “Setup 0 (no suit)”, for “Setup 2 (arms)” and “Setup 3 (ears)” vs. “Setup 1 (thighs)”, and for “Setup 3 (ears)” vs. “Setup 2 (arms)”.
These results indicate that participants preferred to use at least one setup with the VR suit over “Setup 0 (no suit)”. Many deaf participants commented that although the setups with the VR suit (Setups 1, 2, and 3) felt a little uncomfortable, they were very useful compared to “Setup 0 (no suit)” and helped them to complete the VR task much more quickly and easily. In addition, the comparison of the VR suit setups shows that participants preferred “Setup 3 (ears)” over the other setups.

5. Experiment with the Number of Vibro-Motors

Almost all of the participants (18 out of 20) commented that they would prefer to use an assistive system in VR without a VR suit. Therefore, we conducted another experiment without the VR suit, with only two vibro-motors mounted in deaf users’ ears. We used soft plastic covers for each vibro-motor to minimize the unpleasant feeling of mounting them inside the users’ ears (Figure 11). All participants commented that putting the vibro-motors inside their ears felt like using regular in-ear headphones, without any unpleasant sensation.
This new condition is very similar to “Setup 3 (ears)”, and we used the same VR task as in the previous experiment. The only difference was the removal of the VR suit and of the two vibro-motors on the front and back. In this condition, when the sound source was in the front, none of the vibro-motors vibrated because the player could see the sound source immediately, and when the sound source was in the back, the vibro-motors in the left and right ears vibrated at the same time. We wanted to determine whether DHH users could still localize sound sources in VR with this reduced encoding and compared the results with those obtained using the VR suit.
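With only two motors, the four directions are thus encoded by which ear motors are active: none for front, both for back, and the matching single motor for left or right. The following sketch spells out this encoding; the struct and function names are illustrative.

// Two-motor encoding used conceptually in "Setup 4 (only ears)":
// front -> no vibration (the speaker is already in view), back -> both ear
// motors vibrate simultaneously, left/right -> the matching single motor.
struct EarCue {
  bool leftEar;
  bool rightEar;
};

EarCue encodeDirection(char direction) {  // 'F', 'B', 'L', 'R' as before
  switch (direction) {
    case 'B': return {true,  true };
    case 'L': return {true,  false};
    case 'R': return {false, true };
    case 'F':
    default:  return {false, false};
  }
}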
We called this new condition “Setup 4 (only ears)” and tested both “Setup 3 (ears)” from the main study and the new “Setup 4 (only ears)” on a new group of 10 DHH persons from the deaf community (7 men, 3 women; ages 25–35, mean = 28.3; no hearing in either ear), following the same procedure as in Section 3.4. Each participant completed both the “Setup 3 (ears)” and “Setup 4 (only ears)” tests on one day, in random order. After finishing both tests, we asked the participants to complete a questionnaire about the discomfort score and the desire to use each setup. We then calculated the average task completion time, the average discomfort level, and the desire to use each setup and compared the results. Figure 12 shows the average task completion time of “Setup 4 (only ears)—without the VR suit” compared to “Setup 3 (ears)—with the VR suit”. Participants completed the VR task with an average task completion time of 21.058 s for “Setup 3 (ears)” and 21.389 s for “Setup 4 (only ears)”.
We applied a Wilcoxon signed-rank test with α = 0.05 to the average task completion times of the two setups. The test showed no significant difference between them (Z = 0.714, p = 0.475). This result indicates that using the VR suit with four vibro-motors does not significantly affect the average task completion time of deaf users compared to using only two vibro-motors on the ears. Therefore, hypothesis H2 is supported.
Figure 13 shows the average responses to the questions about the discomfort score and the desire to use “Setup 3 (ears)” and “Setup 4 (only ears)”. A Wilcoxon test comparing the average discomfort level of “Setup 4 (only ears)—without the VR suit” with that of “Setup 3 (ears)—with the VR suit” showed a significant difference between the two setups (Z = 2.809, p = 0.005); participants rated “Setup 4 (only ears)” as more comfortable than “Setup 3 (ears)”.
In addition, the Wilcoxon test on the desire to use the setups showed a significant difference between the two setups (Z = 2.831, p = 0.005). DHH participants preferred “Setup 4 (only ears)—without the VR suit” over “Setup 3 (ears)—with the VR suit” for completing sound-related VR tasks. This is an exciting result because it can inform the development of small, portable VR assistants for deaf persons that do not require a VR suit.

6. Discussion and Future Work

The results of our first experiment suggest that DHH persons complete sound-related VR tasks significantly faster with our proposed haptic VR suit than without haptic feedback. In addition, the results for the discomfort level and the desire to use different setups show that DHH persons prefer mounting the vibro-motors on their upper body sections, such as arms and ears, compared to lower body sections, such as thighs, when using the VR suit. However, the results of our second experiment suggest that DHH persons prefer mounting vibro-motors on their ears when not using a VR suit. According to these results, both of our hypotheses in this study are supported, but more neurological studies are required to understand why DHH persons react differently to tactile sensations from different parts of their bodies.
In this study, the experiment was entirely new to the DHH participants, so we limited the number of vibro-motors to four on a custom VR suit and limited the directions of incoming sounds to front, back, left, and right. In addition, we tested our suggested setups of the VR suit only on the same group of participants, on different days and in a fixed order. Although the VR environment changed randomly in each test for each participant, the fixed order of the setups in our first experiment may have introduced learning effects. Therefore, it will be essential to randomize the order of conditions across participants in our future work.
Our study results can ultimately lead to a more efficient design and save resources and costs while providing a more enjoyable VR experience for DHH persons. In summary, our study suggests the following guidelines for VR developers who intend to design haptic devices for DHH users with a focus on sound source localization in VR:
1. When using a VR suit, DHH persons prefer mounting vibro-motors on the upper body.
2. Front and back vibro-motors (haptic devices) are not mandatory for sound source localization in VR.
3. Haptic VR suits are not preferred by DHH persons for sound source localization in VR, and they prefer compact and portable haptic devices.
4. DHH persons prefer mounting a vibro-motor inside each of their ears for sound source localization in VR.

7. Conclusions

In this study, we investigated different setups of a haptic VR suit that helps DHH persons to complete sound-related VR tasks. Our experimental results suggest that sound source localization in VR is not significantly faster with four haptic devices than with two. Therefore, within the configurations we tested, the efficiency of completing sound-related VR tasks does not appear to depend on the number of haptic devices on the suit. However, further studies are required to analyze more complex sound source localization scenarios in VR for DHH persons (e.g., diagonal directions for incoming sounds).
Furthermore, our complementary study shows that DHH persons prefer mounting vibro-motors in their ears when not using a VR suit. We consider this an exciting result because it can help VR developers to develop compact, cheap, and portable haptic devices for DHH users for purposes such as sound source localization in VR.
Our results suggest that using haptic devices in VR can help deaf persons to complete sound-related VR tasks faster. Using haptic devices can also encourage DHH persons to use VR technology more than before. Further studies are required to test and analyze the effects of haptic devices on sound source localization VR tasks and VR enjoyment among DHH persons. We hope to inspire VR developers to design and develop wearable assistive haptic technologies for deaf persons to help them to use and enjoy VR technology as much as persons without hearing problems.

Author Contributions

The authors in this research contributed to this article as follows: Conceptualization, M.M.; data curation, M.M.; formal analysis, M.M., P.K. and H.K.; funding acquisition, M.M.; investigation, M.M.; methodology, M.M., P.K. and H.K.; project administration, M.M.; resources, M.M.; software, M.M.; supervision, H.K.; validation, M.M., P.K. and H.K.; visualization, M.M.; writing—original draft, M.M.; writing—review and editing, M.M., P.K. and H.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding. Open access publication was supported by the TU Wien Open Access publication fund.

Informed Consent Statement

Signed consent forms were obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy requests from the DHH participants.

Acknowledgments

The authors acknowledge TU Wien Bibliothek for financial support through its Open Access Funding Program, and they also wish to thank all DHH participants who volunteered for this study.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
3D      3-Dimensional
CI      Cochlear Implant
DHH     Deaf and Hard-of-Hearing
IDE     Integrated Development Environment
Li-ion  Lithium-ion
VR      Virtual Reality

References

  1. Araujo, F.A.; Brasil, F.L.; Santos, A.C.L.; Batista Junior, L.S.; Dutra, S.P.F.; Batista, C.E.C.F. Auris System: Providing Vibrotactile Feedback for Hearing Impaired Population. BioMed Res. Int. 2017, 2017, 2181380.
  2. Maiorana-Basas, M.; Pagliaro, C. Technology use among adults who are deaf and hard of hearing: A national survey. J. Deaf Stud. Deaf Educ. 2014, 19, 400–410.
  3. Spencer, P.E.; Marschark, M.; Spencer, L.J. Cochlear implants: Advances, issues, and implications. In The Oxford Handbook of Deaf Studies, Language, and Education; Marschark, M., Spencer, P.E., Eds.; Oxford University Press: New York, NY, USA, 2011; pp. 452–470.
  4. Karam, M.; Branje, C.; Nespoli, G.; Thompson, N.; Russo, F.A.; Fels, D.I. The emoti-chair: An interactive tactile music exhibit. In Proceedings of the CHI ’10 Extended Abstracts on Human Factors in Computing Systems (CHI EA ’10), Atlanta, GA, USA, 10–15 April 2010; Association for Computing Machinery: New York, NY, USA, 2010; pp. 3069–3074.
  5. Petry, B.; Illandara, T.; Nanayakkara, S. MuSS-bits: Sensor-display blocks for deaf people to explore musical sounds. In Proceedings of the 28th Australian Conference on Computer-Human Interaction (OzCHI ’16), Launceston, Australia, 29 November–2 December 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 72–80.
  6. Saba, M.P.; Filippo, D.; Pereira, F.R.; de Souza, P.L.P. Hey yaa: A Haptic Warning Wearable to Support Deaf People Communication. In Collaboration and Technology. CRIWG 2011. Lecture Notes in Computer Science; Vivacqua, A.S., Gutwin, C., Borges, M.R.S., Eds.; Springer: Berlin/Heidelberg, Germany, 2011; pp. 7–17.
  7. Levänen, S.; Hamdorf, D. Feeling vibrations: Enhanced tactile sensitivity in congenitally deaf humans. Neurosci. Lett. 2001, 301, 75–77.
  8. Schmitz, A.; Holloway, C.; Cho, Y. Hearing through Vibrations: Perception of Musical Emotions by Profoundly Deaf People. arXiv 2020, arXiv:2012.13265.
  9. Hashizume, S.; Sakamoto, S.; Suzuki, K.; Ochiai, Y. LIVEJACKET: Wearable Music Experience Device with Multiple Speakers. In Distributed, Ambient and Pervasive Interactions: Understanding Humans. DAPI 2018. Lecture Notes in Computer Science; Streitz, N., Konomi, S., Eds.; Springer: Cham, Switzerland, 2018; pp. 359–371.
  10. Jain, D.; Lin, A.; Guttman, R.; Amalachandran, M.; Zeng, A.; Findlater, L.; Froehlich, J. Exploring Sound Awareness in the Home for People who are Deaf or Hard of Hearing. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19), Glasgow, UK, 4–9 May 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–13.
  11. Shibasaki, M.; Kamiyama, Y.; Minamizawa, K. Designing a Haptic Feedback System for Hearing-Impaired to Experience Tap Dance. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST ’16 Adjunct), Tokyo, Japan, 16–19 October 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 97–99.
  12. Lindeman, R.W.; Page, R.; Yanagida, Y.; Sibert, J.L. Towards full-body haptic feedback: The design and deployment of a spatialized vibrotactile feedback system. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST ’04), Hong Kong, China, 10–12 November 2004; Association for Computing Machinery: New York, NY, USA, 2004; pp. 146–149.
  13. Kaul, O.B.; Rohs, M. HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17), Denver, CO, USA, 6–11 May 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 3729–3740.
  14. Peng, Y.; Yu, C.; Liu, S.; Wang, C.; Taele, P.; Yu, N.; Chen, M.Y. WalkingVibe: Reducing Virtual Reality Sickness and Improving Realism while Walking in VR using Unobtrusive Head-mounted Vibrotactile Feedback. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20), Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–12.
  15. Sziebig, G.; Solvang, B.; Kiss, C.; Korondi, P. Vibro-tactile feedback for VR systems. In Proceedings of the 2009 2nd Conference on Human System Interactions, Catania, Italy, 21–23 May 2009; pp. 406–410.
  16. Regenbrecht, H.; Hauber, J.; Schoenfelder, R.; Maegerlein, A. Virtual reality aided assembly with directional vibro-tactile feedback. In Proceedings of the 3rd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia (GRAPHITE ’05), Dunedin, New Zealand, 29 November–2 December 2005; Association for Computing Machinery: New York, NY, USA, 2005; pp. 381–387.
  17. Mirzaei, M.; Kán, P.; Kaufmann, H. EarVR: Using Ear Haptics in Virtual Reality for Deaf and Hard-of-Hearing People. IEEE Trans. Vis. Comput. Graph. 2020, 26, 2084–2093.
  18. Rupert, A.H. An instrumentation solution for reducing spatial disorientation mishaps. IEEE Eng. Med. Biol. Mag. 2000, 19, 71–80.
  19. Toney, A.; Dunne, L.; Thomas, B.H.; Ashdown, S.P. A shoulder pad insert vibrotactile display. In Proceedings of the 7th IEEE International Symposium on Wearable Computers, White Plains, NY, USA, 21–23 October 2003; pp. 35–44.
Figure 1. The concept design of our proposed VR suit.
Figure 2. The prototype version of our proposed VR suit.
Figure 3. The VR environment: (a) the “FRONT” label position; (b) sound source position based on the “FRONT” label direction.
Figure 4. Suggested setups of the VR suit.
Figure 5. Average task completion times of each setup.
Figure 6. Wilcoxon test results on differences in average task completion times between each pair of setups.
Figure 7. Average discomfort level of each setup.
Figure 8. Wilcoxon test results on average discomfort level between each pair of setups.
Figure 9. Desire to use the setups.
Figure 10. Wilcoxon test results on desire to use the setups.
Figure 11. Mounting vibro-motors with soft covers inside users’ ears.
Figure 12. Average task completion times of “Setup 3 (ears)” and “Setup 4 (only ears)”.
Figure 13. Results of average discomfort level and desire to use “Setup 3 (ears)” and “Setup 4 (only ears)”.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
