Systematic Review

The Role of Haptics in Training and Games for Hearing-Impaired Individuals: A Systematic Review

1 Multisensory Experience Lab, Department for Architecture, Design and Media Technology, Aalborg University, 2450 Copenhagen, Denmark
2 Department of Mathematics, Computer Science and Physics, Università di Udine, 33100 Udine, Italy
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2024, 8(1), 1; https://doi.org/10.3390/mti8010001
Submission received: 28 October 2023 / Revised: 5 December 2023 / Accepted: 14 December 2023 / Published: 22 December 2023

Abstract

Sensory substitution and augmentation are pivotal concepts in multi-modal perception, particularly when confronting the challenges associated with rehabilitating an impaired or missing sense. The present systematic review investigates the role of haptics in training and gamified activities for the hearing impaired. We applied a set of keywords to the Scopus® and PubMed® databases, obtaining a collection of 35 manuscripts spanning 23 years. Each article was categorized following a documented procedure and thoroughly analyzed. Our findings reveal a rising number of studies in this field in the last five years, mostly testing the effectiveness of the developed rehabilitative method (77.14%). Despite a wide variety in almost every category we analyzed, such as haptic devices, body locations, and data collection methods, we report a constant difficulty in recruitment, reflected in the low number of hearing-impaired participants (mean of 8.31). This review found that in all six papers reporting statistically significant positive results, the vibrotactile device in use generated vibrations starting from a sound, suggesting that some perceptual aspects connected to sound are transmittable through touch. This provides evidence that haptics and vibrotactile devices could be viable solutions for hearing-impaired rehabilitation and training.

1. Introduction

Human perception is by nature multisensory, and an extensive amount of research has been dedicated to understanding how we perceive the world through the interaction of our senses, starting with the pioneering work presented in [1]. In Biocca et al. [2], the authors outline different ways in which the senses interact: for instance, cross-modal enhancement refers to the fact that stimuli from one sensory channel enhance or alter the perceptual interpretation of stimulation from another sensory channel. In certain situations, individuals can lack a sensory modality or have it reduced, as is the case for individuals with hearing, visual, or tactile impairment. In these situations, technologies can help augment or substitute the missing modality.
In this paper we focus on individuals with hearing impairment and investigate how devices based on the sense of touch can help them make better sense of different stimuli; in recent years, several devices have been proposed for this purpose. This approach is supported by the fact that hearing and touch offer a higher temporal resolution than vision, especially when touch is experienced through the hands, which have a greater resolution than other body parts [3].
Since the 1920s, researchers have conducted experiments with the goal of investigating the perception of vibrating objects through tactile sensitivity and comparing its characteristics with auditory perception. These efforts also led to new research paths, such as investigating how deaf people experience sound through the sense of touch [4].
Recent research by Cieśla et al. [5] has shown that a speech-to-touch sensory substitution device significantly improves speech recognition in both cochlear implant users and individuals with normal hearing. This finding aligns with the longstanding idea that the sense of touch can be effectively employed to substitute or enhance auditory experiences in such devices. One of the first experiments in this direction was the “hearing glove”, presumably invented by Norbert Wiener in the 1940s: a speech technology modeled on the cochlea but constrained by the limited sensitivity of human skin [6]. Other prominent experiments were performed by Clark and colleagues, who proposed the Tickle Talker [7], an eight-channel electro-tactile speech processor. The Tickle Talker was used to reinforce residual hearing or to supplement lip reading; the device showed potential in the rehabilitation of severely hearing-impaired children and adults. Since then, tactile feedback has been used for several applications aimed at aiding hearing-impaired individuals, such as music listening [8], and even tap dancing [9]. In the works considered by this review, vibrotactile devices have been used to enhance various dimensions of hearing, such as sound source localization [10], pitch discrimination [11,12], and speech comprehension [13,14]. Experiments have also been proposed to improve non-auditory perceptual abilities, including environmental perception [15], voice tone control [16], as well as cognitive ones like braille perception [17], lip reading [18], or web browsing [19].
Most of the time, developers of games and video games do not take the needs of individuals with disabilities into account while creating their products [20]. Thus, accessible games play an important role in including a population that would otherwise be excluded [21]. In the last twenty years [22], a strong focus has been placed on creating accessible games for populations with different abilities. For individuals with severe hearing loss or those who may not benefit from traditional speech training, augmentative and alternative communication methods can support effective rehabilitation [23]. Among them, gamification principles have been demonstrated to be effective in strengthening children’s learning performance and improving their training experience [24,25]; this strategy has been widely applied in children’s education and training products, bringing principles and mechanics from the gaming world to increase user engagement and motivation [26]. Therefore, a gamification approach to auditory-verbal training is also a promising direction for hearing rehabilitation [27], merging the fields of gamification and training to benefit individuals with hearing impairments.
We have briefly discussed the development of devices that enhance or replace acoustic signals, as well as the extensive use of tactile and vibrotactile feedback in games and video games over the past several decades [28]. These applications have roots dating back to the early days of gaming [29]. As a result of these developments, the intersection of rehabilitation and training techniques for the deaf, haptic and vibrotactile stimulation, and game dynamics emerges as a promising area of research that deserves further investigation.
A relatively recent review concludes that there is a lack of research on auditory or cognitive impairments compared with visual and motor disabilities, suggesting this as a topic for further research [20]. While devices that augment or substitute hearing using touch have been continuously developed, less is known about how training with such devices can help improve hearing skills. Systematic reviews have raised questions about the effectiveness of musical training [30] and investigated individualized computer-based auditory training [31]. Some have underscored the influence of variables such as participants’ age, training duration, and the type of hearing device used [32], while others examined the use of tactile displays in music applications designed for hearing-impaired individuals [8]. Additionally, studies have explored the impact of gamification on the learning process [24], while others have focused on deaf students without incorporating the haptic aspect into the assessment [33]. Therefore, evaluating the impact of vibrotactile technology is a crucial consideration for providing assistance in training activities to individuals with hearing impairments.
In this paper, we present a systematic review of the literature regarding training and gamified experiences that use haptic feedback to help individuals with hearing impairments. Section 2 introduces the (often ambiguous) terminology, Section 3 addresses the research questions, Section 4 the methodology, Section 5 the results, Section 6 the discussion, Section 7 the limitations of this study and Section 8 the conclusions.

2. Definitions

To establish a foundational understanding of this review and facilitate comprehension of the central concepts addressed in it, we will commence by providing relevant definitions. These definitions will serve as a framework for the subsequent analysis and discussion throughout the document.
Haptic In the Dictionary of Philosophy and Psychology, James M. Baldwin defined haptics as “[…] the concomitant sensations and perceptions […] cover[ing] the whole range of function of skin, muscle, tendon, and even of the static sense—thus including the senses of temperature and pain, and the perceptions of position, movement, etc.” [34].
Sensory augmentation Involves extending an individual’s perception of a sense by utilizing another sense or the same sense, and can involve various sensory systems [35].
Sensory substitution Is the replacement of a missing sensory perception by conveying the information typically acquired through one sense to another [36].
Tactile Is an umbrella term for the perception of vibrations, static pressure, skin stretch, or friction [37].
Vibrotactile Is a subcategory of tactile perception, where the tactile sensation is caused by an oscillating object [37].

3. Problem Statement and Research Questions

This systematic review explores methods that integrate haptic and vibrotactile stimulation into rehabilitation and, more generally, assistance for people with hearing impairments. This research is motivated by a recognized gap in the existing literature, as discussed in the introduction. We want to update the corpus of existing devices and techniques (e.g., training) in this area and report their impact and effectiveness on the disabled population under study. Our systematic review focuses on the following research questions:
RQ1 
What are the main methodological characteristics of the reviewed articles?
RQ2 
What are the most common strategies for designing haptic-enhanced games or training programs to facilitate skill development, communication, or accessibility for individuals with varying degrees of hearing impairment?
RQ3 
Are the studies successful in reproducing positive effects when haptic feedback is applied?

4. Methodology

In this section, we describe the methodology we employed to select relevant literature for the systematic review. The whole study has been conducted following the PRISMA 2020 guidelines, and the related checklist [38].

4.1. Keywords

We curated a set of keywords concatenated with the logical operator “AND”, organized into three essential categories: tactile, hearing impairment, and games for rehabilitation and training. To cover various aspects of the core topics, we used the logical operator “OR” to connect alternative keywords. The keyword combination with the operators used for the database search is presented below (Listing 1):
Listing 1. Keywords combination used for the database search.
            haptic OR vibrotactile
            OR tactile OR touch
        AND
            hearing-impaired OR deaf
            OR (hearing AND impaired)
        AND
            game OR training OR education
            OR videogame OR gamification
The first group of keywords comprises terms that cover aspects of tactile interaction, such as haptics and vibrotactile. The term ‘haptics’ refers to the broad sense of touch, while ‘vibrotactile’ specifically entails the presence of a vibrating object that stimulates the tactile sensation. For a more in-depth explanation of these terms, we invite the reader to consult Section 2. The second group of keywords focuses on the target group: hearing-impaired or deaf individuals. Lastly, the third group includes keywords related to both training and gamification, which are at the core of this research.
With this set of keywords, we intended to cover the majority of terms that are commonly used in the research fields of tactile perception, hearing impairment, and game-based interventions.
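For illustration only, the sketch below assembles the query of Listing 1 as a single Boolean string; wrapping it in a database-specific search field (e.g., Scopus’ TITLE-ABS-KEY) is an assumption for the example and does not necessarily reflect how the actual search was executed.
    # Minimal sketch: build the Boolean query from the three keyword groups of Listing 1.
    # Any database-specific field wrapper (e.g., TITLE-ABS-KEY in Scopus) is an assumption.
    tactile_terms = ["haptic", "vibrotactile", "tactile", "touch"]
    hearing_terms = ["hearing-impaired", "deaf", "(hearing AND impaired)"]
    activity_terms = ["game", "training", "education", "videogame", "gamification"]

    def or_group(terms):
        # Join alternative keywords with OR and wrap the group in parentheses.
        return "(" + " OR ".join(terms) + ")"

    query = " AND ".join(or_group(group) for group in
                         (tactile_terms, hearing_terms, activity_terms))
    print(query)
    # (haptic OR vibrotactile OR tactile OR touch) AND (hearing-impaired OR deaf
    #  OR (hearing AND impaired)) AND (game OR training OR education OR videogame
    #  OR gamification)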

4.2. Inclusion/Exclusion Criteria

Together with the keywords, we established specific inclusion criteria. The manuscripts accepted in our review had to be written in English, undergo peer review, present primary research, be published in the last 25 years, and be designed to address the specific needs of the hearing-impaired population.
During the analysis, we adopted different exclusion criteria codes to better track the process. In the first iteration, where we took into account only the abstract, title, and keywords, we applied the codes reported in Table 1, with the exception of the last two (missing validation or intervention), which were used for the in-depth analysis.

4.3. Database Selection

To identify relevant literature, we conducted searches in two prominent electronic databases: Scopus® and PubMed®. We chose these databases due to their extensive and pertinent literature in the technical and medical domains. Since Scopus® includes more than 90 million records (Scopus blog, https://blog.scopus.com/posts/scopus-now-includes-90-million-content-records, accessed on 5 December 2023) and PubMed® more than 36 million (Pubmed about page, https://pubmed.ncbi.nlm.nih.gov/about/, accessed on 5 December 2023), we deemed incorporating additional data sources into this review unnecessary.

4.4. Data Collection

Using the aforementioned keywords and criteria, we retrieved a total of 187 entries from the Scopus® and 180 from the PubMed® database (as of 26 September 2023). In the former database, one paper was automatically removed by the Scopus® search engine due to the lack of a peer review. The search results were stored in the reference manager Zotero (https://www.zotero.org/, accessed on 5 December 2023); here, we merged the two collections and removed the duplicates, obtaining 294 unique records. Subsequently, we exported the results to a spreadsheet that allowed us to better organize the references and keep only the relevant information. We performed a second filtering operation by choosing only the manuscripts published after 1998 (i.e., within the last 25 years). For each of the 159 records obtained, we analyzed the title, the abstract, and the keywords, finally selecting only 42 relevant records. As a last step, we performed a comprehensive review of the full papers, narrowing down the eligible records for this review to 35 manuscripts. In Figure 1 we report the diagram of the whole data collection process for this systematic review.
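A minimal sketch of this merging and filtering bookkeeping is shown below; the file names and the "Title", "DOI", and "Year" column names are assumptions for illustration and do not reflect the actual database exports.
    import pandas as pd

    # Minimal sketch of the record-merging and filtering steps; file and column
    # names ("Title", "DOI", "Year") are illustrative assumptions.
    scopus = pd.read_csv("scopus_export.csv")
    pubmed = pd.read_csv("pubmed_export.csv")
    records = pd.concat([scopus, pubmed], ignore_index=True)

    # Remove duplicates shared between the two databases, keyed on DOI and
    # falling back to a normalized title when the DOI is missing.
    records["key"] = records["DOI"].fillna(records["Title"].str.lower().str.strip())
    records = records.drop_duplicates(subset="key")    # 294 unique records in our search

    # Keep only manuscripts published after 1998 (i.e., within the last 25 years).
    records = records[records["Year"] > 1998]          # 159 records remained

    print(len(records))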

4.5. Coding and Analysis

To answer the research questions, we categorized the selected entries with the methodology illustrated in Figure 1, analyzing the content of each manuscript. We reported the most relevant aspects of each study in a spreadsheet that contains, among other things, the following elements: study type, haptic body location, haptic usage, vibrotactile technology, mappings, vibrotactile processing, target impairment, training, and data collection method. This choice was made to create an overview of the different applied methodologies and technologies, with the goal of answering the research questions.
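To make the coding scheme concrete, the sketch below mirrors the spreadsheet columns listed above as a simple record type; the example values are hypothetical and do not describe any specific reviewed paper.
    from dataclasses import dataclass

    # Illustrative record mirroring the coding spreadsheet; values are hypothetical.
    @dataclass
    class CodedStudy:
        study_type: str              # e.g., "experimental", "quasi-experimental", "pre-post test"
        haptic_body_location: str    # e.g., "fingertip", "wrist", "full body"
        haptic_usage: str            # "sensory substitution" or "sensory augmentation"
        vibrotactile_technology: str # e.g., "ERM", "electrodynamic shaker"
        mapping: str                 # e.g., "sound-vibrotactile", "none"
        vibrotactile_processing: str # e.g., "temporal envelope", "synthetic", "complex"
        target_impairment: str       # e.g., "hearing impaired", "deaf", "deaf-blind"
        training: str                # e.g., "pre-test", "one month or more"
        data_collection: str         # e.g., "task accuracy"

    example = CodedStudy("quasi-experimental", "wrist", "sensory augmentation",
                         "ERM", "sound-vibrotactile", "temporal envelope",
                         "hearing impaired", "pre-test", "task accuracy")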

4.6. Categorization

Here we introduce and explain some of the categories we applied. We divided the entries based on their research focus. Those that primarily aim to demonstrate or validate the efficacy and outcomes of a particular approach or intervention are categorized as effectiveness. Those that focus on exploring and refining design methodologies and proving their validity fall into the design category. A second relevant category is the type of study. We differentiated the experimental from the quasi-experimental designs when we found that sample randomization, i.e., the random selection and assignment of participants to groups, was missing [39]. We indicated with pre-post tests the studies that analyzed the effect of an intervention by measuring the subjects’ performances before and after it. The development and usability evaluation study category was created for less structured studies that mainly focused on design and functionality aspects rather than the effects of the treatment. Finally, mixed methods was the category chosen for experiments whose study design featured aspects of different study types.
Considerable attention has been paid to categorizing and describing the approaches for the haptic usage, the choices around mappings, and the processing techniques for the generation of vibrotactile stimuli (where present). In the literature, the use of haptic feedback mainly addresses the substitution or the augmentation of one or more senses (e.g., hearing).
Diverse mapping strategies include employing full sound for actuator feedback, generating vibrotactile stimuli through text or gestures, and utilizing synthesis techniques that diverge from traditional sound-based or input-related methods. Countless combinations are possible for vibrotactile processing. The categories we used try to simplify this plethora of techniques by pointing at two main features that can be identified in most of them: fundamental frequency (F0) extraction and modulation of a carrier with a temporal envelope. For more complex choices, we invite the reader to refer directly to the related manuscripts, since it would have been impractical to put this information inside a table.
The manuscripts reviewed in this study employed various mappings in their research projects, which we categorized into three groups: input-vibrotactile, input-location vibrotactile, and body location-output. In the first category, an input source is recognized and mapped to a specific vibration output. For example, sound-vibrotactile mapping involves generating vibrations from a manipulated sound sample to achieve specific perceptual effects. Researchers also utilized other inputs such as visual input (e.g., associating a specific image with a vibration), gestural input (e.g., associating a specific movement with a vibration), and textual input (e.g., associating a specific word or group of words with a vibration). The second category involves the use of sound-vibration maps on a specific body area, where the information includes both the spatial position and the vibration itself. Lastly, the third category incorporates the use of body location as an input source (e.g., touch of a part of the hand) mapped to a text output (e.g., letter, phoneme).

5. Results

By applying the filters presented in Section 4, we selected 35 out of the 159 articles obtained with the screening process (22.01% rate of inclusion). Of the excluded items, 27.35% were marked as out of topic due to the lack of multiple key aspects for this research (i.e., more than one exclusion code applied). An additional 23.94% did not focus on the hearing-impaired population, and 22.22% were missing haptic feedback. Other reasons for exclusion and the associated rates can be found in Table 1. As a result, we present in Table 2, Table 3, Table 4 and Table 5 the papers that constitute this systematic review, highlighting key characteristics of each publication.
In Figure 2, we can observe the distribution of manuscripts over the years. An increase in publications concerning this review’s topic is evident over the last four years, starting from 2019.
In the following sections, we will present the data retrieved from the manuscripts and organized into charts and tables that categorize the main themes: Section 5.1—metrics, Section 5.2—methodologies, Section 5.3—haptics, Section 5.4—vibrotactile technologies, Section 5.5—subjects, and Section 5.6—outcomes. The consequent plots have been generated with MATLAB (version: 23.2.0 (R2023b), https://www.mathworks.com/products/matlab.html, accessed on 5 December 2023) using a combination of plot, scatter and bar functions.

5.1. Metrics

Here we display the metrics in terms of type of publication and number of citations per article.

5.1.1. Publication Types

In Figure 3 we report the type of publication of the included articles: the vast majority (68.57%) of them are journal articles, while only one is a book chapter. The remaining papers are conference proceedings.

5.1.2. Citations

Here, we present the citation count from Google Scholar along with the citations per year. The latter are calculated by dividing the total number of citations by the number of years between the publication date and the current year. In Figure 4, we also show the means for both categories: 18.43 for total citations and 3.01 for citations per year.
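As a small illustration of this metric (the numbers below are hypothetical, not taken from Figure 4):
    # Citations per year: total citations divided by the years since publication.
    def citations_per_year(total_citations, publication_year, current_year=2023):
        years = max(1, current_year - publication_year)  # guard for same-year publications
        return total_citations / years

    # Hypothetical example: a 2019 paper with 40 Google Scholar citations.
    print(citations_per_year(40, 2019))  # 10.0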

5.2. Methodologies

The first step of the analysis process included an investigation of the practices involved in the study. We considered the type of study design, the aim of the research, and the data collection procedure.

5.2.1. Study Type

We classified each article based on the study typology, as shown in Figure 5. The majority of articles fall into two main categories: experimental (9, 25.71%) and quasi-experimental design (8, 22.86%). We employed this distinction to clearly identify studies randomizing the participants’ groups (experimental) [67].
Two other frequently occurring study designs are the pre-post test and the development and usability evaluation study. The former examines the impact of a treatment by assessing performance before and after treatment administration [68]; based on our research criteria, we observe that this design has only been adopted during the last four years. The latter, as its name implies, is attributed to manuscripts whose aim is to design a process or device and test it on a small group of participants.
We came across only one field trial, in which researchers aimed to enhance the ability of individuals with severe hearing impairment or deaf-blindness to detect, identify, and recognize the direction of sound-producing events [15]. Lastly, we encountered two mixed methods studies where both qualitative and quantitative evaluations have been made [43,45].

5.2.2. Research Focus

In this section, we report the focus of each study included in this review. Despite the high variability in study design approaches and topics, we tried to summarize the principal goals in only two categories: design and effectiveness. The difference between the two groups is the main focus: in the first group, specific attention is paid to developing a solution, leaving the evaluation as a secondary aspect; in the second group, the core of the research is the assessment of a specific method/device in terms of its performance. We can see in Figure 6 that the vast majority of the articles can be grouped in the effectiveness category (27 articles, 77.14%), and they can be found across the whole period of time that we took into account.

5.2.3. Data Collection

Figure 7 showcases a diverse range of data collection methods. Notably, task accuracy stands out as the most commonly employed method. In this category, we included all the manuscripts that contain accuracy measurements to evaluate user performance in specific tasks that are crucial for assessing the effectiveness of a treatment or a particular design. This prevalence of task accuracy as a method is not surprising, especially when compared to the findings in Figure 6, which indicate that the majority of papers are exploring the effectiveness of novel solutions.

5.3. Haptics

The objective of this systematic review is to explore the utilization of haptic feedback in studies involving the hearing-impaired population and their training. To achieve this goal, it is essential to delve into various aspects of haptic feedback. In this section, we will examine the diverse roles of haptic feedback, investigate the specific body parts involved in this process, and provide an overview of the devices commonly used for this purpose.

5.3.1. Usage

The breakdown in Figure 8 reveals distinct patterns: 54.29% of the manuscripts designed their studies to convey specific information through touch, completely bypassing other senses (sensory substitution). Conversely, the remaining 45.71% use haptic feedback to enhance one or more senses, falling into the category of sensory augmentation, as explained in Section 2.

5.3.2. Haptic Body Location

The human body presents different sensitivity to haptic stimuli depending on the body location involved. Therefore, we investigated the distribution of stimulus application across the body and present the results in Figure 9. A significant portion of the studies focused on stimulating either the hands (13 studies) or the fingertips (12 studies).

5.3.3. Mappings

A final aspect of great importance in the design of a haptic feedback experience is the mapping, that is, the way a source stimulus is connected with the haptic feedback. It is important to note that haptic feedback can also play the role of an input, as in [55], where the shape/texture of a symbol was matched with a word. In Figure 10, we can observe that 18 articles (51.42%) use the sound-vibrotactile mapping. The group none encompasses all the manuscripts that do not present a specific connection between a sensory input and the generated vibrotactile feedback, but instead investigate a perceptual aspect related to haptic feedback.

5.4. Vibrotactile Technology

The majority of the publications involved the use of some vibrotactile feedback technology, provided either by a prototype conveying vibrations or by a commercially available device. In the following sections, we investigate which kinds of solutions have been used.

5.4.1. Device

Figure 11 displays the devices utilized in various articles. Smartphones are the most frequently used, with six publications employing their features to provide vibrotactile feedback. We can also observe that a great variety of solutions have been investigated, from measuring devices [10,14,48,50] to industrial products [11], and specifically designed devices for conveying vibrotactile feedback [40,41,57,60,64].

5.4.2. Actuators

Another important aspect of vibrotactile feedback is the actuator technology. In Figure 12, it is evident that the Eccentric Rotating Mass (ERM) is the most commonly employed type of actuator. This aligns with our earlier discussion in Section 5.4.1, where we discussed the use of smartphones as tactile devices. Notably, ERMs are the most prevalent actuators found in smartphones due to their low cost and small dimensions. The studies that opted for one of the Tactaid devices have been tagged as not specified, since to the best of our knowledge it is not clear which technology operates behind these patented devices. The second most common type of actuator is the electrodynamic shaker, a high-precision device for laboratory experiments that offers higher fidelity at the expense of greater cost and size compared to ERMs.

5.4.3. Vibrotactile Processing

The choice of a specific processing technique for generating vibrotactile stimuli is as crucial as selecting the target body part and the device. In Figure 13, we can observe that 10 of the studies employed a temporal envelope to modulate a carrier signal, while an additional 10 generated bespoke signals without starting from a pre-existing sound or source; both such techniques have seen increased usage in the last decade.
It is worth noting that five studies did not specify how vibrotactile stimuli were created. Three of those studies, falling under the category none, are primarily focused on haptic interactions [43,55] or measure the perception of a vibrotactile stimulus that is not associated with other sources [63]. Furthermore, three studies employed a unique and convoluted approach to derive vibrations from sound signals that did not fit any of the categories in the figure. As a result, we categorized them as complex [13,61,66].

5.5. Subjects

Upon examining the various participant groups in each study, we found that the mean number of total participants was 16.46 (STD = 15.67). By contrast, when counting only the sensory-impaired participants, the mean value was 8.31 (STD = 12.13). Figure 14 indicates the number of total and sensory-impaired participants in each article. It is evident that the sensory-impaired group shows less consistency compared to the non-impaired group across different experiments. Fourteen studies (40.00%) in this review lack an impaired testing pool altogether. This inconsistency can be attributed to the challenge of recruiting individuals with specific sensory impairments who are willing to participate in the tests. As a result, it is more common to simulate sensory impairments by depriving non-impaired individuals of a sense (e.g., using earplugs).

5.5.1. Target Impairment

The distribution of target groups is quite homogeneous, with the majority of the articles dedicated to the hearing impaired (17 studies), followed by the deaf (nine studies) and the deaf-blind (eight studies). Other categories included in Figure 15 are associated with at least one of the above-mentioned groups.

5.5.2. Training

A final key point of our systematic review is the training aspect. The pre-test label reported in Figure 16 indicates a short training experience conducted right before the test, of variable and often unspecified duration. This condition was reported in 13 articles (37.14%), while in four manuscripts we found an extended training experience that lasted at least one month [40,55,56,65].

5.6. Outcome

In this section we examine the overall outcomes of the included articles and their statistical significance.

5.6.1. Positive/Negative

Figure 17 depicts the results obtained in each study. None of the articles reported only negative effects of their treatments. On the contrary, it is quite surprising to see that 26 articles out of 35 (74.28%) obtained a positive result from their tests, and almost all of them have been published in the last 12 years. This may reflect the effect of positive findings on submission rates [69]. The category complex represents all the studies where more than one outcome was found and not all of them were positive.

5.6.2. Statistical Significance

The outcomes obtained from each study may or may not be statistically significant, and can relate to both qualitative and quantitative measurements. In Figure 18 we can see that 19 articles (54.29%) report a statistically significant outcome.

6. Discussion

The primary objective of this systematic review is to gather and analyze articles that propose haptic treatments or design solutions for the hearing-impaired population. In our methodology we detail the sampling approach, and in the subsequent sections we evaluate the 35 identified papers, published between 2000 and 2023.
Two central themes underpin our exploration: training and gamification. To include all relevant literature where haptic technology intersects with gamification for hearing-impaired individuals, specific keywords such as game and gamification were introduced. The research in this particular domain yielded a limited number of papers. Despite the interest from industry and academia in video games equipped with vibrotactile feedback [70], the literature reporting their application to enhance the experience for the hearing-impaired population appears to be sparse. In our research, only three articles directly addressed gamification aspects in their design processes [46,47,62]. Cano et al. [46] focused on a table game for children aged 7 to 11 with hearing impairment. The game board and cards are the principal means of engagement. Additionally, a smartphone provides visual and vibrotactile feedback by reading QR codes on the physical interface. However, the vibrotactile feedback is somewhat limited and plays a secondary role: it offers only a buzz-like sensation when a child’s answer is incorrect, while the smartphone screen simultaneously displays a corresponding sad face emoticon. The second paper [47] introduces a mobile app that offers vibrotactile feedback in response to a drumming gesture detected by the smartphone. The interaction and feedback are described clearly, but the study lacks emphasis on the gaming aspect, even if the keyword game is included. The final paper addressing gamification is authored by Evreinov et al. [62]. In this work, the authors showcase a pen that can provide vibrotactile feedback when connected to a pocket PC. The primary objective of the vibrotactile feedback in this context is to convey tactile icons (i.e., tactons [71]) to deaf or blind users during their interaction with two video games designed for this specific purpose. Bringing together these considerations, we noticed a gap in the literature regarding games and haptics for the hearing impaired, making this a valuable path to investigate in the future.
Figure 2 reveals a growing interest in haptics applied to training for the hearing impaired. This can be paired with the increase in the number of new systems for music applications for the same target population [8]. The majority of papers focus on the effectiveness of the developed rehabilitative method, as stated in Section 5.2.2. Seventeen out of 27 studies (62.96%) measure the user’s accuracy on a specific task which is the most recurrent measurement, as reported in Figure 7. The remaining ones rely on psychophysiological measurements (such as two-interval forced choice scores), speech comprehension evaluations, or qualitative observations. Conversely, all the studies that collected data regarding task accuracy have effectiveness as a research focus, except for three [12,52,62]. This finding can be read as a shared methodology construction; the experimental design of a rehabilitative or training method includes the definition of a task whose outcome serves as a measurable quantity that can be used as a metric for training effectiveness.
We relate the almost equal partition in Figure 8 to some of the observed themes’ main patterns. For the target population, we note that there are seven studies involving blind participants in which vibrotactile technology was used for sensory substitution; only two used it for sensory augmentation. It is reasonable to think that absence of sight drives this design choice. Conversely, all three studies involving users with cochlear implants use haptics for sensory augmentation. Cochlear implant users receive a new electrical stimulation to their auditory nerve that gives them a mode of perception; making it multisensory could be a way to acquaint them with hearing. We note that sensory substitution and augmentation have a symmetric distribution concerning the main trends in vibrotactile processing in Figure 13. Among the studies using the augmentation approach, four used temporal envelope and seven synthetic generation; in the other group, six used temporal envelope, and only three generated vibrotactile stimulation synthetically. Even if distributed almost uniformly, sensory substitution leans toward creating the vibrotactile stimulus from scratch; sensory augmentation tends to use a temporal envelope perceivable by the other senses.
Determining the placement of vibrotactile stimuli is crucial for achieving the intended outcomes. The sensitivity distribution of our body to haptic stimuli is quite diverse, and considering these concepts is pivotal for good design. From Figure 9 it can be seen that most devices deliver haptic feedback to hands, palms, and fingertips. This is because these areas are rich in mechanoreceptors such as Pacinian receptors and Meissner corpuscles, which are crucial for perceiving vibrotactile stimulation [72]. Notably, even in the early stages of human life, during infants’ exploration, it has been demonstrated that we commonly rely on our hands and fingers to make sense of our surroundings. This tactile exploration allows humans to discern objects’ shapes, textures, and temperatures, even before having the ability to investigate them visually [73]. Another area of the body used in the selected manuscripts is the wrist. This is a more convenient area for conducting other activities while receiving haptic feedback, since our hands are left free to perform other tasks. The drawbacks are the presence of body hair, which affects sensitivity, and clothes, which might interfere with the experience. It is worth mentioning that in the article by Tufatulin et al. [45], the researchers used a loudspeaker to convey a full-body haptic feedback experience through water, using it as a medium to provide a multisensory experience, combining sound and vibrations to improve children’s hearing activation after hearing aid fitting or cochlear implantation.
As observed in Section 5.3.3, a variety of mappings have been explored. However, more than 50% of these studies utilized vibrations derived from sound stimuli (sound-vibrotactile mapping). This finding is unsurprising given the well-established connection between auditory and tactile modalities in the literature [74], as these two senses show good potential when working together and present some close interactions [75]. When we look at perceptual aspects, such as the different sensitivities and thresholds of frequency perception for tactile and auditory channels, we can observe similar integration, masking, gap detection, and just noticeable difference (JND) effects [76]. From a practical standpoint, sound-to-vibrotactile mapping proves technically convenient, as it often allows direct feeding of sounds within the audio range (20–20,000 Hz). Since humans present limited tactile capabilities compared to hearing ones (e.g., reduced frequency spectrum and resolution [76]), Digital Signal Processing (DSP) techniques are often applied to the input sound stimuli to extract specific features such as the fundamental frequency (F0), harmonics, and temporal envelope. This way, the vibrotactile stimulation can emphasize certain aspects of the sound input while omitting secondary ones, aligning with the research objectives and tactile capabilities. In Figure 13, we can observe that one of the most common approaches involves extracting the temporal envelope from sound signals and applying it to the vibrotactile signal (that could, for instance, be generated using a synthesis method). If we focus on the articles that employed the sound-vibrotactile mapping, a remarkable pattern emerges: all of them administered vibrotactile stimuli to either the fingertip, the palm, or the whole hand, capitalizing on the high sensitivity of these body parts to vibrations [72]. Furthermore, stimulating the hand or fingertip requires minimal preparation from the participants, often eliminating the need for additional garments or wearable equipment that might increase the task duration and discomfort. Out of the 19 studies applying the sound-vibrotactile mapping, nine present positive statistically significant results, and an additional six show more complex results with negative and positive outcomes [51,57,60,61,64,65]. These outcomes are tightly linked with both the design choices and the characteristics of the participants. Upon examining individual experiments, a common trend emerges in eight of the 14 studies that reported a positive or statistically significant result: the temporal envelope processing technique. This technique involves extracting the amplitude of sound stimuli over time and applying it to the vibrotactile signal, aiming at a clear amplitude correlation between the two. Summarizing these findings, one could argue that employing sound-vibrotactile mapping with temporal envelope processing techniques and delivering this stimulus to the hand (or fingertip or palm) may result in positive and statistically significant outcomes.
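To make the temporal envelope approach concrete, the sketch below (our illustration under assumed parameters, not code from any reviewed study) extracts a sound’s amplitude envelope and uses it to modulate a 250 Hz carrier, a frequency close to the peak sensitivity of Pacinian receptors:
    import numpy as np
    from scipy.signal import hilbert, butter, filtfilt

    # Minimal sketch of sound-vibrotactile mapping with temporal envelope processing;
    # the input signal, filter order, cut-off, and 250 Hz carrier are illustrative choices.
    fs = 44100                                             # audio sample rate (Hz)
    t = np.arange(0, 1.0, 1 / fs)
    sound = np.sin(2 * np.pi * 440 * t) * np.exp(-3 * t)   # toy input: decaying 440 Hz tone

    # 1. Extract the temporal envelope: magnitude of the analytic signal, then low-pass
    #    filter so only slow amplitude changes remain.
    envelope = np.abs(hilbert(sound))
    b, a = butter(2, 50 / (fs / 2))                        # 50 Hz low-pass
    envelope = filtfilt(b, a, envelope)

    # 2. Modulate a vibrotactile carrier with the envelope; the result would be sent
    #    to the actuator.
    carrier = np.sin(2 * np.pi * 250 * t)
    vibrotactile_signal = envelope * carrier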
Moving to the device choice, we can observe that almost every article adopts a unique approach. The most common device is the smartphone [38,44,47,54,58], given its near-ubiquity; with most people owning one or at least being familiar with it, smartphones serve as convenient and portable tools for training and enhancing experiences. However, the compact size of smartphones comes with some drawbacks, particularly concerning vibrotactile performance. Due to their small form factor, the actuators in these devices must also be small, resulting in reduced frequency performance. Additionally, the design focus for the vibrotactile experience on smartphones has consistently prioritized conveying simple messages or notifications rather than complex sounds. To reduce costs and keep them as compact as possible, the majority of smartphones are equipped with ERM actuators that usually operate on one single frequency (resonant frequency) [77]. Furthermore, using such devices in this field introduces significant challenges in controlling potentially confounding variables that are typically less pronounced in controlled laboratory settings and equipment, hence complicating and reducing the reliability of experiments and evaluations. A contrasting approach is evident in the studies by Fletcher et al. [10,14,48,50], where electrodynamic shakers are employed to convey vibrations through a complex and high-fidelity piece of equipment. Specifically, electrodynamic shakers are closely linked to voice-coils and find extensive use in industrial applications. Using this method, the HVLab device reproduces the input signal with good quality, covering a frequency range of 16 to 500 Hz with a low tolerance for frequency deviation (<0.1%). Given the variety of tools and devices available, researchers should exercise caution when choosing an actuator technology, bearing in mind that each has its pros and cons. Broadly, two major categories can be distinguished: piezoelectric, ERM, and linear moving magnets favor small size and low cost, whereas electromagnetic vibrators, voice-coils, loudspeakers, and inertial transducers emphasize high-quality performance.
Our research has unveiled haptic solutions that have evolved over the years, often utilizing unique vibrotactile processing techniques tailored to specific devices, as shown in Section 5.4.1 and Section 5.4.3. The lack of documentation on both hardware and software for the patented solutions generated issues concerning transparency and replicability. The lack of standardization of processing (Section 5.4.3) and device technology raises concerns about the generalizability of the findings to broader user populations. This issue becomes even more evident when considering the target population (Section 5.5.1).
The retrieved data reveals a significant disparity in the participants involved in these experiments: most studies either include a limited number of individuals with target impairment, or simulate impairments by depriving people of one or more senses. Thirteen publications present more than eight participants with impairments (above the mean of the whole study group; see Figure 14), and ten of these studies declared an affiliation with a hospital or collaboration with a school, health institution, or association for impaired individuals [14,40,45,48,51,54,56,59,60,63,64,65]. While recognizing the substantial challenges in the recruitment process, particularly within minority groups, we recommend that researchers establish close collaborations with hospitals, schools, and care centers to access a more diverse and representative population. Working closely within a clinical environment can also shed light on challenges that might not be apparent to academics alone. This collaborative approach can foster a better understanding of the real-world needs and experiences of the hearing-impaired population, ultimately leading to more effective haptic solutions.
A consistent pattern emerges when filtering the included articles to focus on those with positive statistically significant outcomes involving impaired individuals. All six studies meeting these criteria have been published within the last eleven years and employed sound-to-vibrotactile feedback mapping. If we dig into the details, four of the six articles applied haptic technology to enhance another sensory modality by applying vibrotactile stimulation on the wrist [14,40,48] or the full body, as observed by Nanayakkara et al. [56]. The remaining two studies used vibrations in other body parts for sensory substitution [11,54]. In three of them, the vibrotactile processing techniques utilized temporal envelope-based methods [14,48,54], while the other three applied full sound [56] or generated synthetic stimuli [11]. Since Daza Gonzalez et al. [40] utilized the Lofelt bracelet, the specifics of the DSP method for the vibrotactile generation were not disclosed.
Considering the studies with either no training or only a brief training experience before exposing the participants to the experiment (pre-test), we observed no relevant pattern relating the training length and the statistical significance of the results. Nine studies reported no significant results, whereas seven studies did.
In conclusion, the evidence that all the significant positive outcomes involved a sound-to-vibrotactile feedback mapping confirms the long-standing idea that the multiple perceptual aspects connected to sound are transmittable through touch. Thus, a sensory substitution of this type is a viable solution for hearing-impaired rehabilitation and training.

7. Limitations

For this systematic review, we exclusively used two databases: Scopus® and PubMed®. We did not employ alternative methods for the literature search, such as secondary references or websites, as we believed that these two databases comprehensively covered the available literature. However, it is worth noting that we may have missed some grey literature.
Given the wide range of topics covered in the selected manuscripts, we acknowledge that justifying the inclusion of some articles, even if they met the selection criteria, presented challenges. For example, we are aware that some literature primarily aims to measure perception thresholds rather than to assess the effectiveness of haptic treatments or designs. Another hurdle was comparing studies involving haptics with those focusing on vibrotactile feedback. Some employed categories may not perfectly align with studies that do not exclusively involve vibrotactile feedback. Moreover, we reviewed studies where haptics was used to train individuals with sensory abilities to communicate with those who have impairments, presenting a different perspective from the majority of the included studies. Despite this difference, we chose not to exclude these articles because they offered insights into valuable aspects relevant to all the manuscripts.

8. Conclusions and Future Research

This systematic review compiles a set of papers exploring the integration of haptic feedback in training and gamification protocols to enhance the auditory experience for individuals with hearing impairments. We initially identified 294 articles from two prominent databases using relevant keywords. After careful screening and eligibility checks, we included 35 manuscripts in our analysis. Our examination primarily centers on study design, hardware and software solutions, training protocols, and the resulting test outcomes. Finally, we derive insights from the findings to provide recommendations for future researchers and designers.
Within the literature review, we observed a notable scarcity of studies addressing games and haptics for hearing-impaired individuals, underlining the urgency for further exploration in this critical area. Furthermore, those that delved into the topic often had a limited focus, either on vibrotactile or gamification aspects, leaving the combination relatively unexplored.
A noteworthy discovery is a consensus on targeting hands and wrists with haptic feedback alongside temporal envelope-processed sound, yielding positive and statistically significant results. This presents a promising avenue for future research. On the contrary, the diverse array of devices conveying vibrotactile feedback adds complexity, making it challenging to establish clear correlations between treatment administration and observed outcomes.
We emphasize the importance of conducting research in real-world, ecologically valid environments, collaborating closely with end-users, rather than confining studies solely to controlled laboratory settings. While acknowledging the challenges of field research, we contend that testing in real-world scenarios offers a more accurate understanding of the practical challenges and benefits experienced by hearing-impaired individuals with haptic solutions. Inspired by the diversity of design choices for training programs (see Section 5.5.2), we believe that combining qualitative assessments with quantitative data can provide a more comprehensive understanding of this multifaceted sensory domain and richer interpretation of the results.
Referring to the results in Section 5.6.2, it is crucial to note that several papers were excluded from our analysis due to a lack of statistically significant quantitative findings, attributed to a low participant count. This challenge can be addressed by designing studies involving organizations and hospitals, thereby ensuring a more extensive population to collaborate with and emphasizing the importance of qualitative results alongside quantitative ones.
As a final remark for future research, we recommend exploring more engaging technologies tailored to the younger population. While researchers and industries have developed immersive technologies over the past decade, it is noteworthy that previous studies emphasize the importance of clinical environments [78]. However, there is a limited inclusion of haptic feedback in immersive technology specifically designed for hearing-impaired individuals, with only a few examples found in the literature [79]. Therefore, we propose further investigation into the potential benefits of immersive experiences coupled with haptic feedback for this demographic.

Funding

This research was funded by the NordForsk Nordic University Hub through the Nordic Sound and Music project (number 86892), as well as by the Nordic Sound and Music Computing Network and Aalborg University.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Stein, B.E.; Meredith, M.A. The Merging of the Senses; MIT Press: Cambridge, MA, USA, 1993. [Google Scholar]
  2. Biocca, F.; Kim, J.; Choi, Y. Visual touch in virtual environments: An exploratory study of presence, multimodal interfaces, and cross-modal sensory illusions. Presence: Teleoper. Virtual Environ. 2001, 10, 247–265. [Google Scholar] [CrossRef]
  3. Lederman, S.J.; Klatzky, R.L. Haptic perception: A tutorial. Atten. Percept. Psychophys. 2009, 71, 1439–1459. [Google Scholar] [CrossRef] [PubMed]
  4. Knudsen, V.O. “Hearing” with the sense of touch. J. Gen. Psychol. 1928, 1, 320–352. [Google Scholar] [CrossRef]
  5. Cieśla, K.; Wolak, T.; Lorens, A.; Heimler, B.; Skarżyński, H.; Amedi, A. Immediate improvement of speech-in-noise perception through multisensory stimulation via an auditory to tactile sensory substitution. Restor. Neurol. Neurosci. 2019, 37, 155–166. [Google Scholar] [CrossRef] [PubMed]
  6. Mills, M. On disability and cybernetics: Helen Keller, Norbert Wiener, and the hearing glove. Differences 2011, 22, 74–111. [Google Scholar] [CrossRef]
  7. Cowan, R.; Galvin, K.; Sarant, J.; Millard, R.; Blamey, P.; Clark, G. Improved electrotactile speech processor: Tickle Talker. Ann. Otol. Rhinol. Laryngol. Suppl. 1995, 166, 454–456. [Google Scholar]
  8. Paisa, R.; Nilsson, N.C.; Serafin, S. Tactile displays for auditory augmentation—A scoping review and reflections on music applications for hearing impaired users. Front. Comput. Sci. 2023, 5, 1085539. [Google Scholar] [CrossRef]
  9. Shibasaki, M.; Kamiyama, Y.; Minamizawa, K. Designing a haptic feedback system for hearing-impaired to experience tap dance. In Proceedings of the UIST’16 Adjunct: Adjunct Proceedings of the 29th Annual ACM Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016; pp. 97–99. [Google Scholar]
  10. Fletcher, M.D.; Zgheib, J. Haptic sound-localisation for use in cochlear implant and hearing-aid users. Sci. Rep. 2020, 10, 14171. [Google Scholar] [CrossRef]
  11. Hopkins, C.; Maté-Cid, S.; Fulford, R.; Seiffert, G.; Ginsborg, J. Perception and learning of relative pitch by musicians using the vibrotactile mode. Music. Sci. 2023, 27, 3–26. [Google Scholar] [CrossRef]
  12. Shin, S.; Oh, C.; Shin, H. Tactile Tone System: A Wearable Device to Assist Accuracy of Vocal Pitch in Cochlear Implant Users. Int. Conf. Comput. Access. (ASSETS) 2020, 1–3. [Google Scholar] [CrossRef]
  13. Tan, H.Z.; Reed, C.M.; Jiao, Y.; Perez, Z.D.; Wilson, E.C.; Jung, J.; Martinez, J.S.; Severgnini, F.M. Acquisition of 500 English Words through a TActile Phonemic Sleeve (TAPS). IEEE Trans. Haptics 2020, 13, 745–760. [Google Scholar] [CrossRef] [PubMed]
  14. Fletcher, M.D.; Hadeedi, A.; Goehring, T.; Mills, S.R. Electro-haptic enhancement of speech-in-noise performance in cochlear implant users. Sci. Rep. 2019, 9, 11428. [Google Scholar] [CrossRef] [PubMed]
  15. Ranjbar, P.; Stenström, I. Monitor, a vibrotactile aid for environmental perception: A field evaluation by four people with severe hearing and vision impairment. Sci. World J. 2013, 2013, 206734. [Google Scholar] [CrossRef] [PubMed]
  16. Sakajiri, M.; Nakamura, K.; Fukushima, S.; Miyoshi, S.; Ifukube, T. Effect of voice pitch control training using a two-dimensional tactile feedback display system. In Proceedings of the 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Seoul, Republic of Korea, 14–17 October 2012; pp. 2943–2947. [Google Scholar] [CrossRef]
  17. Rantala, J.; Raisamo, R.; Lylykangas, J.; Surakka, V.; Raisamo, J.; Salminen, K.; Pakkanen, T.; Hippula, A. Methods for presenting braille characters on a mobile device with a touchscreen and tactile feedback. IEEE Trans. Haptics 2009, 2, 28–39. [Google Scholar] [CrossRef] [PubMed]
  18. Bernstein, L.E.; Demorest, M.E.; Coulter, D.C.; O’Connell, M.P. Lipreading sentences with vibrotactile vocoders: Performance of normal-hearing and hearing-impaired subjects. J. Acoust. Soc. Am. 1991, 90, 2971–2984. [Google Scholar] [CrossRef] [PubMed]
  19. Kaklanis, N.; Tzovaras, D.; Moustakas, K. Haptic navigation in the world wide web. In Proceedings of the Universal Access in Human-Computer Interaction. Applications and Services: 5th International Conference, UAHCI 2009, Held as Part of HCI International 2009, San Diego, CA, USA, 19–24 July 2009; Proceedings, Part III 5. Springer: Berlin, Germany, 2009; pp. 707–715. [Google Scholar]
  20. Aguado-Delgado, J.; Gutierrez-Martinez, J.M.; Hilera, J.R.; de Marcos, L.; Otón, S. Accessibility in video games: A systematic review. Univers. Access Inf. Soc. 2020, 19, 169–193. [Google Scholar] [CrossRef]
  21. Porter, J.R.; Kientz, J.A. An empirical study of issues and barriers to mainstream video game accessibility. In Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, Bellevue, WA, USA, 21–23 October 2013; pp. 1–8. [Google Scholar]
  22. Grammenos, D.; Savidis, A.; Stephanidis, C. Designing universally accessible games. Comput. Entertain. (CIE) 2009, 7, 1–29. [Google Scholar] [CrossRef]
  23. Elsahar, Y.; Hu, S.; Bouazza-Marouf, K.; Kerr, D.; Mansor, A. Augmentative and alternative communication (AAC) advances: A review of configurations for individuals with a speech disability. Sensors 2019, 19, 1911. [Google Scholar] [CrossRef]
  24. Busarello, R.I.; Ulbricht, V.R.; Fadel, L.M.; de Freitas e Lopes, A.V. Gamification Approaches to Learning and Knowledge Development: A theorical review. In New Advances in Information Systems and Technologies. Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2016; pp. 1107–1116. [Google Scholar]
  25. Rodríguez-Ferrer, J.M.; Manzano-León, A.; Aguilar-Parra, J.M.; Cangas, A. Effectiveness of gamification and game-based learning in Spanish adolescents with dyslexia: A longitudinal quasi-experimental research. Res. Dev. Disabil. 2023, 141, 104603. [Google Scholar] [CrossRef]
  26. Christopoulos, A.; Mystakidis, S. Gamification in Education. Encyclopedia 2023, 3, 1223–1243. [Google Scholar] [CrossRef]
  27. Xiang, Y.; Zhang, Z.; Chang, D.; Tu, L. The Impact of Gamified Auditory-Verbal Training for Hearing-Challenged Children at Intermediate and Advanced Rehabilitation Stages. arXiv 2023, arXiv:2310.11047. [Google Scholar]
  28. Bayousuf, A.; Al-Khalifa, H.S.; Al-Salman, A. Haptics-based systems characteristics, classification, and applications. In Advanced Methodologies and Technologies in Artificial Intelligence, Computer Simulation, and Human-Computer Interaction; IGI Global: Hershey, PA, USA, 2019; pp. 778–794. [Google Scholar] [CrossRef]
  29. Wolf, M.J. The Video Game Explosion: A History from PONG to PlayStation and Beyond; Bloomsbury Publishing USA: New York, NY, USA, 2007. [Google Scholar]
  30. McKay, C.M. No evidence that music training benefits speech perception in hearing-impaired listeners: A systematic review. Trends Hear. 2021, 25, 2331216520985678. [Google Scholar] [CrossRef] [PubMed]
  31. Henshaw, H.; Ferguson, M.A. Efficacy of individual computer-based auditory training for people with hearing loss: A systematic review of the evidence. PLoS ONE 2013, 8, e62836. [Google Scholar] [CrossRef] [PubMed]
  32. Ab Shukor, N.F.; Lee, J.; Seo, Y.J.; Han, W. Efficacy of music training in hearing aid and cochlear implant users: A systematic review and meta-analysis. Clin. Exp. Otorhinolaryngol. 2021, 14, 15–28. [Google Scholar] [CrossRef] [PubMed]
  33. Costa, C.; Marcelino, L.; Neves, J.; Sousa, C. Games for Education of Deaf Students: A Systematic Literature. In ECGBL 2019 13th European Conference on Game-Based Learning; Academic Conferences and Publishing Limited: Oxfordshire, UK, 2019; p. 170. [Google Scholar]
  34. Baldwin, J.M. Dictionary of Philosophy and Psychology: Volume 1 (1901), Volume 2 (1902), Volume 3 (1905); Macmillan: New York, NY, USA, 1901. [Google Scholar]
  35. Macpherson, F. (Ed.) Sensory Substitution and Augmentation: An Introduction. In Sensory Substitution and Augmentation; Oxford University Press: Oxford, UK, 2018. [Google Scholar]
  36. Bach-y-Rita, P.; Kercel, S.W. Sensory substitution and the human–machine interface. Trends Cogn. Sci. 2003, 7, 541–546. [Google Scholar] [CrossRef] [PubMed]
  37. Choi, S.; Kuchenbecker, K.J. Vibrotactile display: Perception, technology, and applications. Proc. IEEE 2012, 101, 2093–2104. [Google Scholar] [CrossRef]
  38. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Int. J. Surg. 2021, 88, 105906. [Google Scholar] [CrossRef]
  39. Kirk, R.E. Experimental design. In Sage Handbook of Quantitative Methods in Psychology; Sage Publications Ltd.: Thousand Oaks, CA, USA, 2009; pp. 23–45. [Google Scholar]
  40. Daza Gonzalez, M.; Phillips-Silver, J.; Maurno, N.; García, L.; Ruiz-Castañeda, P. Improving phonological skills and reading comprehension in deaf children: A new multisensory approach. Sci. Stud. Read. 2023, 27, 119–135. [Google Scholar] [CrossRef]
  41. Ganis, F.; Serafin, S.; Vatti, M. Tickle Tuner—Haptic Smartphone Cover for Cochlear Implant Users’ Musical Training; Springer International Publishing: Cham, Switzerland, 2022; pp. 686–687. [Google Scholar]
  42. Janidarmian, M.; Roshan Fekr, A.; Radecka, K.; Zilic, Z. Wearable Vibrotactile System as an Assistive Technology Solution. Mob. Netw. Appl. 2022, 27, 709–717. [Google Scholar] [CrossRef]
  43. Xohua-Chacón, A.; Benítez-Guerrero, E.; Muñoz-Arteaga, J.; Mezura-Godoy, C. Using a tangible system to promote inclusive, collaborative activities to learn relational algebra for students with hearing impairment. Univers. Access Inf. Soc. 2022, 22, 1185–1197. [Google Scholar] [CrossRef]
  44. Domenici, N.; Inuggi, A.; Tonelli, A.; Gori, M. A novel Android app to evaluate and enhance auditory and tactile temporal thresholds. In Proceedings of the 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Guadalajara, Mexico, 1–5 November 2021; Volume 2021, pp. 5885–5888. [Google Scholar] [CrossRef]
  45. Tufatulin, G.; Koroleva, I.; Artyushkin, S.; Yanov, Y. The benefits of underwater vibrostimulation in the rehabilitation of children with impaired hearing. Int. J. Pediatr. Otorhinolaryngol. 2021, 149, 110855. [Google Scholar] [CrossRef] [PubMed]
  46. Cano, S.; Naranjo, J.; Henao, C.; Rusu, C.; Albiol-pérez, S. Serious game as support for the development of computational thinking for children with hearing impairment. Appl. Sci. 2021, 11, 115. [Google Scholar] [CrossRef]
  47. Iijima, R.; Shitara, A.; Sarcar, S.; Ochiai, Y. Smartphone Drum: Gesture-Based Digital Musical Instruments Application for Deaf and Hard of Hearing People; Association for Computing Machinery: New York, NY, USA, 2021. [Google Scholar] [CrossRef]
  48. Fletcher, M.; Song, H.; Perry, S. Electro-haptic stimulation enhances speech recognition in spatially separated noise for cochlear implant users. Sci. Rep. 2020, 10, 12723. [Google Scholar] [CrossRef] [PubMed]
  49. Giulia, C.; Chiara, D.V.; Esmailbeigi, H. GLOS: GLOve for Speech Recognition. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 3319–3322. [Google Scholar] [CrossRef]
  50. Fletcher, M.D.; Mills, S.R.; Goehring, T. Vibro-Tactile Enhancement of Speech Intelligibility in Multi-talker Noise for Simulated Cochlear Implant Listening. Trends Hear. 2018, 22, 2331216518797838. [Google Scholar] [CrossRef] [PubMed]
  51. González-Garrido, A.; Ruiz-Stovel, V.; Gómez-Velázquez, F.; Vélez-Pérez, H.; Romo-Vázquez, R.; Salido-Ruiz, R.; Espinoza-Valdez, A.; Campos, L. Vibrotactile discrimination training affects brain connectivity in profoundly deaf individuals. Front. Hum. Neurosci. 2017, 11, 28. [Google Scholar] [CrossRef] [PubMed]
  52. Schmidt, M.; Bank, C.; Weber, G. Zoning-based gesture recognition to enable a mobile lorm trainer. In Computers Helping People with Special Needs. ICCHP 2016. Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2016; Volume 9759, pp. 479–486. [Google Scholar] [CrossRef]
  53. Norberg, L.; Westin, T.; Mozelius, P.; Wiklund, M. Web accessibility by morse code modulated haptics for deaf-blind. In Technology, Rehabilitation and Empowerment of People with Special Needs; Nova Science Pub Inc.: Hauppauge, NY, USA, 2015; pp. 123–134. [Google Scholar]
  54. Parivash, R. Signal processing methods for improvement of environmental perception of persons with deafblindness. Adv. Mater. Res. 2014, 902, 398–404. [Google Scholar] [CrossRef]
  55. Snodgrass, M.; Stoner, J.; Angell, M. Teaching conceptually referenced core vocabulary for initial augmentative and alternative communication. AAC: Augment. Altern. Commun. 2013, 29, 322–333. [Google Scholar] [CrossRef] [PubMed]
  56. Nanayakkara, S.; Wyse, L.; Taylor, E.A. The haptic chair as a speech training aid for the deaf. In Proceedings of the 24th Australian Computer-Human Interaction Conference, OzCHI ’12, Melbourne, Australia, 26–30 November 2012; Association for Computing Machinery: New York, NY, USA, 2012; pp. 405–410. [Google Scholar] [CrossRef]
  57. Wang, N.; Huang, J. The modulation of tactile stimulation on Mandarin tone production in children with cochlear implant. In Proceedings of the 20th International Congress on Acoustics, ICA 2010, Sydney, Australia, 23–27 August 2010; Volume 4, pp. 3199–3202. [Google Scholar]
  58. Jayant, C.; Acuario, C.; Johnson, W.; Hollier, J.; Ladner, R. V-Braille: Haptic Braille perception using a touch-screen and vibration on mobile phones. In Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS ’10, Tempe, AZ, USA, 25–27 October 2010; pp. 295–296. [Google Scholar] [CrossRef]
  59. Barbacena, I.; Freire, R.; Barros, A.; Neto, B.; Carvalho, E.; De Macedo, E. Voice codification evaluation based on a real-time training system with tactile feedback applied to deaf people. In Proceedings of the 2009 IEEE Instrumentation and Measurement Technology Conference, Singapore, 5–7 May 2009; pp. 697–700. [Google Scholar] [CrossRef]
  60. Karimi-Yazdi, A.; Sazgar, A.A.; Nadimi-Tehran, A.; Faramarzi, A.; Nassaj, F.; Yahyavi, S. Application and usage of tactile aid in Iran. Arch. Iran. Med. 2006, 9, 344–347. [Google Scholar]
  61. Yuan, H.; Reed, C.M.; Durlach, N.I. Tactual display of consonant voicing as a supplement to lipreading. J. Acoust. Soc. Am. 2005, 118, 1003–1015. [Google Scholar] [CrossRef]
  62. Evreinov, G.; Evreinova, T.; Raisamo, R. Mobile games for training tactile perception. In Entertainment Computing—ICEC 2004. ICEC 2004. Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2004; Volume 3166, pp. 468–475. [Google Scholar] [CrossRef]
  63. Arnold, P.; Heiron, K. Tactile memory of deaf-blind adults on four tasks. Scand. J. Psychol. 2002, 43, 73–79. [Google Scholar] [CrossRef]
  64. Andersson, U.; Lyxell, B.; Rönnberg, J.; Spens, K.E. Effects of tactile training on visual speechreading: Performance changes related to individual differences in cognitive skills. J. Deaf. Stud. Deaf. Educ. 2001, 6, 116–129. [Google Scholar] [CrossRef] [PubMed]
  65. Bernstein, L.; Auer, E., Jr.; Tucker, P. Enhanced Speechreading in Deaf Adults: Can Short-Term Training/Practice Close the Gap for Hearing Adults? J. Speech, Lang. Hear. Res. 2001, 44, 5–18. [Google Scholar] [CrossRef] [PubMed]
  66. Galvin, K.; Blamey, P.; Cowan, R.; Oerlemans, M.; Clark, G. Generalization of tactile perceptual skills to new context following tactile-alone word recognition training with the Tickle Talker(TM). J. Acoust. Soc. Am. 2000, 108, 2969–2979. [Google Scholar] [CrossRef]
  67. Seltman, H.J. Experimental Design and Analysis; Carnegie Mellon University Pittsburgh: Pittsburgh, PA, USA, 2012. [Google Scholar]
  68. Robinson, E.A.; Doueck, H.J. Implications of the pre/post/then design for evaluating social group work. Res. Soc. Work. Pract. 1994, 4, 224–239. [Google Scholar] [CrossRef]
  69. Coursol, A.; Wagner, E.E. Effect of positive findings on submission and acceptance rates: A note on meta-analysis bias. Prof. Psychol. Res. Pract. 1986, 17, 136–137. [Google Scholar] [CrossRef]
  70. Singhal, T.; Schneider, O. Juicy haptic design: Vibrotactile embellishments can improve player experience in games. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–11. [Google Scholar]
  71. Brewster, S.A.; Brown, L.M. Tactons: Structured tactile messages for non-visual information display. In Proceedings of the Australasian User Interface Conference 2004, Dunedin, New Zealand, 18–22 January 2004. [Google Scholar]
  72. Verrillo, R.T.; Bolanowski, S.J. Tactile responses to vibration. In Handbook of Signal Processing in Acoustics; Springer: New York, NY, USA, 2008; pp. 1185–1213. [Google Scholar]
  73. Wilcox, T.; Woods, R.; Chapa, C.; McCurry, S. Multisensory exploration and object individuation in infancy. Dev. Psychol. 2007, 43, 479. [Google Scholar] [CrossRef] [PubMed]
  74. Ro, T.; Ellmore, T.M.; Beauchamp, M.S. A neural link between feeling and hearing. Cereb. Cortex 2013, 23, 1724–1730. [Google Scholar] [CrossRef]
  75. Verma, T.; Aker, S.C.; Marozeau, J. Effect of vibrotactile stimulation on auditory timbre perception for normal-hearing listeners and cochlear-implant users. Trends Hear. 2023, 27, 23312165221138390. [Google Scholar] [CrossRef]
  76. Merchel, S.; Altinsoy, M.E. Psychophysical comparison of the auditory and tactile perception: A survey. J. Multimodal User Interfaces 2020, 14, 271–283. [Google Scholar] [CrossRef]
  77. Granado, E.; Quizhpi, F.; Zambrano, J.; Colmenares, W. Remote experimentation using a smartphone application with haptic feedback. In Proceedings of the 2016 IEEE Global Engineering Education Conference (EDUCON), Abu Dhabi, United Arab Emirates, 10–13 April 2016; pp. 240–247. [Google Scholar]
  78. Serafin, S.; Adjorlu, A.; Percy-Smith, L.M. A Review of Virtual Reality for Individuals with Hearing Impairments. Multimodal Technol. Interact. 2023, 7, 36. [Google Scholar] [CrossRef]
  79. Mirzaei, M.; Kan, P.; Kaufmann, H. EarVR: Using ear haptics in virtual reality for deaf and Hard-of-Hearing people. IEEE Trans. Vis. Comput. Graph. 2020, 26, 2084–2093. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Block diagram of the data inclusion method.
Figure 2. Number of articles per year of publication.
Figure 3. Publication type. The size of each data point represents the number of articles per year.
Figure 4. Citations per article [5,10,11,12,13,14,15,16,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66].
Figure 5. Study types. The size of each data point represents the number of articles per year.
Figure 6. Research focus. The size of each data point represents the number of articles per year.
Figure 7. Type of data collection method. The size of each data point represents the number of articles per year.
Figure 8. Use of haptics in the articles. The size of each data point represents the number of articles per year.
Figure 9. Body locations where haptic feedback has been applied. The size of each data point represents the number of articles per year.
Figure 10. Haptic feedback mappings. The size of each data point represents the number of articles per year.
Figure 11. Vibrotactile devices. The size of each data point represents the number of articles per year.
Figure 12. Actuator technology. The size of each data point represents the number of articles per year.
Figure 13. Vibrotactile processing and generation techniques. The size of each data point represents the number of articles per year.
Figure 14. Sample size for each study. The two lines indicate the mean number of participants for each group [5,10,11,12,13,14,15,16,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66].
Figure 15. Target impairment. The size of each data point represents the number of articles per year.
Figure 16. Users’ training. The size of each data point represents the number of articles per year.
Figure 17. Article outcomes. The size of each data point represents the number of articles per year.
Figure 18. Statistical significance. The size of each data point represents the number of articles per year.
Table 1. Exclusion codes and data.
Description | No. | % | Note
Not Available | 5 | 4.27% | Cannot find the manuscript
Not English | 1 | 0.85% |
Not Game/Training/Edu | 17 | 14.53% |
Not Hearing-Impaired | 28 | 23.93% |
Not Last Publication | 3 | 2.56% | Newer publications, same project
Not Primary Research | 5 | 4.27% | e.g., review
Not Vibrotactile | 26 | 22.22% |
Off Topic | 32 | 27.35% | Multiple reasons (e.g., NV + NGTE + NHI)
No Intervention | 1 | 0.85% |
No Validation | 6 | 5.13% |
Total | 117 | 100% |
Table 2. Summary of the selected articles (Part one).
Article | Year | Description | Study Type | Haptic Usage | Body Location | Mappings | Vibrotactile Processing | Target Group | Participants | Training
Hopkins et al. [11] | 2023 | Pitch discrimination study with training on amateur and professional musicians with normal or severely impaired hearing | Pre-post test | Sensory substitution | Fingertip, forefoot | Sound—vibrotactile | Synthetic generation | Hearing impaired | 19 participants: 15 normal hearing, 4 hearing impaired | ≤2 months
Daza Gonzalez et al. [40] | 2023 | Multisensory phonological and syntactic training | Pre-post test | Sensory augmentation | Wrist | Sound—vibrotactile | Not specified | Deaf | 40 deaf and 28 hearing children | >2 months
Ganis et al. [41] | 2022 | Design of a vibrotactile feedback device and test with melodic contour identification | Pre-post test | Sensory augmentation | Hand, fingertip | Sound—vibrotactile | Temporal envelope, full sound | Hearing impaired | 15 normal-hearing participants | Pre-test
Janidarmian et al. [42] | 2022 | Design of a vibrotactile feedback device for delivering customizable spatiotemporal tactile patterns | Pre-post test | Sensory substitution | Lower back | Text—vibrotactile | Synthetic generation | Sensory impairment | 10 healthy participants | Pre-test
Xohua-Chacón et al. [43] | 2022 | Investigate the algebra learning experience of university students with hypoacusis using tangible systems | Mixed methods | Sensory augmentation | Hand | None | None | Hearing impaired | One cochlear implanted, one normal hearing | Pre-test
Domenici et al. [44] | 2021 | Investigate whether temporal abilities can be enhanced using a novel Android app | Pre-post test | Sensory substitution | Hand | None | Synthetic generation | Sensory impairment | 12 participants (no impairment specified) | ≤1 week
Tufatulin et al. [45] | 2021 | Determine limits of underwater vibrotactile stimulus perception and measure training | Mixed methods | Sensory substitution | Full body | Sound—vibrotactile | Synthetic generation, full sound | Hearing impaired | Five hearing impaired; 30 children (15 with severe hearing loss, 15 normal hearing) | None
Cano et al. [46] | 2021 | Design of a serious game for children with hearing impairment with physical and digital interfaces | Development and usability eval. | Sensory augmentation | Hand | Visual—vibrotactile | Synthetic generation | Hearing impaired | Seven hearing-impaired children | Pre-test
Iijima et al. [47] | 2021 | Design of a musical game to let the hearing impaired enjoy music playing | Development and usability eval. | Sensory substitution | Hand | Gesture—vibrotactile | Synthetic generation | Deaf, hearing impaired | Six deaf and hard of hearing | Pre-test
Table 3. Summary of the selected articles (Part two).
Article | Year | Description | Study Type | Haptic Usage | Body Location | Mappings | Vibrotactile Processing | Target Group | Participants | Training
Tan et al. [13] | 2020 | Test a tactile phonemic sleeve for word recognition | Quasi-experimental | Sensory substitution | Forearm | Phoneme—vibrotactile | Complex | Hearing impaired | 51 normal hearing | ≤1 month
Fletcher et al. [48] | 2020 | Assess whether electro-haptic stimulation substantially improves speech recognition in multi-talker noise when the speech and noise come from different locations | Experimental | Sensory augmentation | Wrist | Sound—vibrotactile | Temporal envelope | Cochlear implant | Nine CI users, each implanted in only one ear | ≤1 h
Shin et al. [12] | 2020 | Tactile glove that helps hearing-impaired individuals recognize pitch | Pre-post test | Sensory augmentation | Hand | Sound—location vibrotactile | Synthetic generation | Hearing impaired | Two cochlear implant users | ≤1 month
Fletcher and Zgheib [10] | 2020 | Improve haptic sound-localization accuracy using a varied stimulus set and assess whether accuracy improved with prolonged training | Experimental | Sensory augmentation | Wrist | Sound—location vibrotactile | Temporal envelope | Hearing impaired | 32 adults with normal touch perception (16 experimental group, 16 control group) | ≤1 month
Giulia et al. [49] | 2019 | Tactile glove for speech-to-vibrotactile feedback | Development and usability eval. | Sensory substitution | Hand | Sound—location vibrotactile | Synthetic generation | Deaf-blind | Three normal hearing | ≤1 month
Cieśla et al. [5] | 2019 | Assess whether multisensory stimulation, pairing audition and a minimal-size touch device, improves intelligibility of speech in noise | Development and usability eval. | Sensory substitution | Fingertip | Sound—vibrotactile | Temporal envelope | Deaf, hearing impaired | 12 normal hearing | ≤1 h
Fletcher et al. [14] | 2019 | Vibrotactile feedback algorithm to improve speech-in-noise perception | Pre-post test | Sensory augmentation | Wrist | Sound—vibrotactile | Temporal envelope | Hearing impaired | 10 cochlear implant users | ≤2 weeks
Fletcher et al. [50] | 2018 | Tactile presentation of low-frequency sound information to improve speech-in-noise performance for CI users | Quasi-experimental | Sensory augmentation | Fingertip | Sound—vibrotactile | Temporal envelope | Cochlear implant | Eight normal-hearing participants listened to CI-simulated speech in noise | ≤1 week
González-Garrido et al. [51] | 2017 | EEG study on vibrotactile language discrimination in deaf and hearing individuals | Quasi-experimental | Sensory substitution | Fingertip | Sound—vibrotactile | Not specified | Deaf | 14 deaf, 14 normal hearing | ≤1 month
Table 4. Summary of the selected articles (Part three).
Article | Year | Description | Study Type | Haptic Usage | Body Location | Mappings | Vibrotactile Processing | Target Group | Participants | Training
Schmidt et al. [52] | 2016 | Design of an app for training the Lorm alphabet to facilitate communication between deaf-blind and sensory-abled individuals | Development and usability eval. | Sensory augmentation | Fingertip | Location vibrotactile—text | Synthetic generation | Deaf-blind | Three normal hearing | ≤1 h
Norberg et al. [53] | 2015 | Design of a Morse-code-modulated haptics prototype for deaf-blind individuals to navigate web pages | Pilot study | Sensory substitution | Hand | Text—vibrotactile | Not specified | Deaf-blind | Four normal hearing | ≤1 h
Parivash [54] | 2014 | Assessment of four signal processing methods in an app for environmental perception of sounds in deaf-blind people | Quasi-experimental | Sensory substitution | Ankle, palm | Sound—vibrotactile | Temporal envelope | Deaf-blind | 13 deaf, 5 deaf-blind | Pre-test
Ranjbar and Stenström [15] | 2013 | Improve the ability of people with severe hearing impairment or deaf-blindness to detect, identify, and recognize the direction of sound-producing events | Field trial | Sensory substitution | Forearm, palm | Sound—vibrotactile | Temporal envelope | Hearing impaired, deaf-blind | Four with Usher syndrome I (deaf-blind) | Individual
Snodgrass et al. [55] | 2013 | Intervention to teach three conceptually referenced tactile symbols to a child with multiple disabilities | Quasi-experimental | Sensory substitution | Hand | Shape/texture—word | None | Deaf-blind, intellectual disability | One deaf-blind | >1 month
Nanayakkara et al. [56] | 2012 | Vibrotactile chair to perform speech production training in deaf children | Experimental | Sensory augmentation | Full body | Sound—vibrotactile | Full sound | Deaf | Six deaf children; 20 deaf children | >2 months
Sakajiri et al. [16] | 2012 | Investigate the effect of voice pitch training using a tactile feedback system | Quasi-experimental | Sensory substitution | Fingertip | Sound—vibrotactile | F0 extraction | Deaf, hearing impaired | Eight normal hearing | None
Wang and Huang [57] | 2010 | Vibrotactile feedback to improve speech production of Mandarin words | Experimental | Sensory augmentation | Fingertip | Sound—vibrotactile | F0 extraction | Cochlear implant | 12 cochlear-implanted children | None
Jayant et al. [58] | 2010 | Vibrotactile feedback to improve braille perception | Development and usability eval. | Sensory augmentation | Fingertip | Text—vibrotactile | Not specified | Deaf-blind, blind | Six deaf-blind, three blind | Pre-test
Table 5. Summary of the selected articles (Part four).
Article | Year | Description | Study Type | Haptic Usage | Body Location | Mappings | Vibrotactile Processing | Target Group | Participants | Training
Barbacena et al. [59] | 2009 | Real-time vibrotactile and visual feedback to train hearing-impaired individuals | Experimental | Sensory augmentation | Fingertip | Sound—vibrotactile | F0 extraction | Deaf | 53 hearing impaired | Pre-test
Karimi-Yazdi et al. [60] | 2006 | Comparison of one-, two-, and seven-channel tactile aids for speech recognition in severely hearing-impaired individuals | Quasi-experimental | Sensory substitution | Fingertip, wrist, neck, chest, abdominal skin | Sound—vibrotactile | Not specified | Hearing impaired | 23 hearing impaired | Pre-test
Yuan et al. [61] | 2005 | Design and evaluation of a tactual display to reinforce lipreading | Experimental | Sensory augmentation | Fingertip | Sound—vibrotactile | Complex | Deaf | Four normal hearing | Pre-test
Evreinov et al. [62] | 2004 | Design of a tactile pen and evaluation of tacton generation | Development and usability eval. | Sensory substitution | Hand | None | Synthetic generation | Deaf, blind | 26 normal hearing | Pre-test
Arnold and Heiron [63] | 2002 | Verify whether deaf-blind people’s tactile memory is better than that of sighted-hearing people through recognition and recall memory tasks and a matching-pairs game | Quasi-experimental | Sensory substitution | Hand | None | None | Deaf-blind | 10 deaf-blind and 10 sighted-hearing | Pre-test
Andersson et al. [64] | 2001 | Investigate the effects of tactile aids on a visual lipreading task | Experimental | Sensory augmentation | Hand | Sound—vibrotactile | Temporal envelope | Hearing impaired | 14 hearing impaired | Pre-test
Bernstein et al. [65] | 2001 | Investigate how speechreading is affected by hearing impairment and vibrotactile training | Experimental | Sensory substitution | Forearm | Visual—vibrotactile | Temporal envelope | Hearing impaired, normal hearing | Eight normal hearing; eight hearing impaired | ≤2 months
Galvin et al. [66] | 2000 | Investigate the potential value of tactile-alone training for the hearing impaired | Experimental | Sensory substitution | Hand | Sound—electrotactile | Complex | Hearing impaired | Six normal hearing | ≤1 week
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
