Utilization of Detection of Non-Speech Sound for Sustainable Quality of Life for Deaf and Hearing-Impaired People: A Systematic Literature Review
Abstract
1. Introduction
1.1. Difficulties Faced by DHI Individuals
1.2. Sound Detection
1.3. The Challenges and Missing Points Faced by Sound Detection
2. Methodology
2.1. Search Strategy
2.2. Search Query
2.3. Inclusion and Exclusion Criteria
2.4. Selection Procedure
3. Results
3.1. Common Methods Used to Notify DHI Individuals About Sound Events
3.2. Common Methods Used to Interpret Sound Events for DHI Individuals
3.3. The Target Group Aimed by Included Studies
3.4. Assistive Technologies/Tools Used by DHI People to Recognize the Sound Event
4. Discussion
5. Conclusions
5.1. Limitations
5.2. Recommendations and Future Directions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- World Health Organization. Deafness and Hearing Loss. Available online: https://www.who.int/news-room/fact-sheets/detail/deafness-and-hearing-loss#:~:text=Over%205%25%20of%20the%20world’s,will%20have%20disabling%20hearing%20loss (accessed on 27 February 2024).
- Hermawati, S.; Pieri, K. Assistive technologies for severe and profound hearing loss: Beyond hearing aids and implants. Assist. Technol. 2019, 32, 182–193. [Google Scholar] [CrossRef] [PubMed]
- Zare, S.; Ghotbi-Ravandi, M.R.; ElahiShirvan, H.; Ahsaee, M.G.; Rostami, M. Predicting and weighting the factors affecting workers’ hearing loss based on audiometric data using C5 algorithm. Ann. Glob. Health 2019, 85, 88. [Google Scholar] [CrossRef] [PubMed]
- Alkhalifa, S.; Al-Razgan, M. Enssat: Wearable technology application for the deaf and hard of hearing. Multimed. Tools Appl. 2018, 77, 22007–22031. [Google Scholar] [CrossRef]
- Mweri, J. Privacy and Confidentiality in Health Care Access for People Who are Deaf: The Kenyan Case. Health Pol. 2018, 1, 2–5. [Google Scholar]
- Halim, Z.; Abbas, G. A kinect-based sign language hand gesture recognition system for hearing-and speech-impaired: A pilot study of Pakistani sign language. Assist. Technol. 2015, 27, 34–43. [Google Scholar] [CrossRef]
- Singleton, J.L.; Remillard, E.T.; Mitzner, T.L.; Rogers, W.A. Everyday technology use among older deaf adults. Disabil. Rehabil. Assist. Technol. 2019, 14, 325–332. [Google Scholar] [CrossRef]
- Dornhoffer, J.R.; Holcomb, M.A.; Meyer, T.A.; Dubno, J.R.; McRackan, T.R. Factors influencing time to cochlear implantation. Otol. Neurotol. 2020, 41, 173. [Google Scholar] [CrossRef]
- Hrastinski, I.; Wilbur, R.B. Academic achievement of deaf and hard-of-hearing students in an ASL/English bilingual program. J. Deaf Stud. Deaf Educ. 2016, 21, 156–170. [Google Scholar] [CrossRef]
- Ashori, M.; Jalil-Abkenar, S.S. Emotional intelligence: Quality of life and cognitive emotion regulation of deaf and hard-of-hearing adolescents. Deaf Educ. Int. 2021, 23, 84–102. [Google Scholar] [CrossRef]
- Jaiyeola, M.T.; Adeyemo, A.A. Quality of life of deaf and hard of hearing students in Ibadan metropolis, Nigeria. PLoS ONE 2018, 13, e0190130. [Google Scholar] [CrossRef]
- Otoom, M.; Alzubaidi, M.A.; Aloufee, R. Novel navigation assistive device for deaf drivers. Assist. Technol. 2020, 34, 129–139. [Google Scholar] [CrossRef] [PubMed]
- Mesaros, A.; Heittola, T.; Virtanen, T. Metrics for polyphonic sound event detection. Appl. Sci. 2016, 6, 162. [Google Scholar] [CrossRef]
- Mirzaei, M.R.; Ghorshi, S.; Mortazavi, M. Audio-visual speech recognition techniques in augmented reality environments. Visual Comput. 2014, 30, 245–257. [Google Scholar] [CrossRef]
- Gfeller, B.; Roblek, D.; Tagliasacchi, M. One-shot conditional audio filtering of arbitrary sounds. In Proceedings of the ICASSP 2021—2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Toronto, ON, Canada, 6–11 June 2021; pp. 501–505. [Google Scholar]
- Agrawal, J.; Gupta, M.; Garg, H. A review on speech separation in cocktail party environment: Challenges and approaches. Multimed. Tools Appl. 2023, 82, 31035–31067. [Google Scholar] [CrossRef]
- Zhao, C.; Liu, Y.; Yang, J.; Chen, P.; Gao, M.; Zhao, S. Sound-localisation performance in patients with congenital unilateral microtia and atresia fitted with an active middle ear implant. Eur. Arch. Otorhinolaryngol. 2021, 278, 31–39. [Google Scholar] [CrossRef]
- Mirzaei, M.; Kán, P.; Kaufmann, H. Effects of Using Vibrotactile Feedback on Sound Localization by Deaf and Hard-of-Hearing People in Virtual Environments. Electronics 2021, 10, 2794. [Google Scholar] [CrossRef]
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Syst. Rev. 2021, 10, 89. [Google Scholar] [CrossRef]
- Normadhi, N.B.A.; Shuib, L.; Nasir, H.N.M.; Bimba, A.; Idris, N.; Balakrishnan, V. Identification of personal traits in adaptive learning environment: Systematic literature review. Comput. Educ. 2019, 130, 168–190. [Google Scholar] [CrossRef]
- Aktaş, F.; Kavuş, E.; Kavuş, Y. A Real Time Infant Health Monitoring System for Hard of Hearing Parents by using Android-based Mobile Devices. J. Electr. Electron. Eng. 2017, 17, 3121–3127. [Google Scholar]
- Liu, T.; Wu, C.-C.; Huang, K.-C.; Liao, J.-J. Effects of frequency and signal-to-noise ratio on accuracy of target sound detection with varied inferences among Taiwanese hearing-impaired individuals. Appl. Acoust. 2020, 161, 107176. [Google Scholar] [CrossRef]
- Ramirez, A.E.; Donati, E.; Chousidis, C. A siren identification system using deep learning to aid hearing-impaired people. Eng. Appl. Artif. Intell. 2022, 114, 105000. [Google Scholar] [CrossRef]
- Saifan, R.R.; Dweik, W.; Abdel-Majeed, M. A machine learning based deaf assistance digital system. Comput. Appl. Eng. Educ. 2018, 26, 1008–1019. [Google Scholar] [CrossRef]
- Yağanoğlu, M.; Köse, C. Real-time detection of important sounds with a wearable vibration based device for hearing-impaired people. Electronics 2018, 7, 50. [Google Scholar] [CrossRef]
- Perrotta, M.V.; Asgeirsdottir, T.; Eagleman, D.M. Deciphering sounds through patterns of vibration on the skin. Neuroscience 2021, 458, 77–86. [Google Scholar] [CrossRef] [PubMed]
- Bandara, M.; Balasuriya, D. Design of a road-side threat alert system for deaf pedestrians. J. Inst. Eng. 2017, 2, 61–70. [Google Scholar] [CrossRef]
- Shimoyama, R. Wearable Hearing Assist System to Provide Hearing-Dog Functionality. Robotics 2019, 8, 49. [Google Scholar] [CrossRef]
- Teki, S.; Kumar, S.; Griffiths, T.D. Large-scale analysis of auditory segregation behavior crowdsourced via a smartphone app. PLoS ONE 2016, 11, e0153916. [Google Scholar] [CrossRef]
- Yağanoğlu, M.; Köse, C. Wearable vibration based computer interaction and communication system for deaf. Appl. Sci. 2017, 7, 1296. [Google Scholar] [CrossRef]
- Asakura, T. Augmented-Reality Presentation of Household Sounds for Deaf and Hard-of-Hearing People. Sensors 2023, 23, 7616. [Google Scholar] [CrossRef]
- Chin, C.-L.; Lin, C.-C.; Wang, J.-W.; Chin, W.-C.; Chen, Y.-H.; Chang, S.-W.; Huang, P.-C.; Zhu, X.; Hsu, Y.-L.; Liu, S.-H. A Wearable Assistant Device for the Hearing Impaired to Recognize Emergency Vehicle Sirens with Edge Computing. Sensors 2023, 23, 7454. [Google Scholar] [CrossRef]
- Lundbeck, M.; Grimm, G.; Hohmann, V.; Laugesen, S.; Neher, T. Sensitivity to angular and radial source movements as a function of acoustic complexity in normal and impaired hearing. Trends Hear. 2017, 21, 2331216517717152. [Google Scholar] [CrossRef] [PubMed]
- Bhat, G.S.; Shankar, N.; Panahi, I.M.S. Design and Integration of Alert Signal Detector and Separator for Hearing Aid Applications. IEEE Access 2020, 8, 106296–106309. [Google Scholar] [CrossRef] [PubMed]
- Lundbeck, M.; Grimm, G.; Hohmann, V.; Bramsløw, L.; Neher, T. Effects of Directional Hearing Aid Settings on Different Laboratory Measures of Spatial Awareness Perception. Audiol. Res. 2018, 8, 215. [Google Scholar] [CrossRef] [PubMed]
- Lundbeck, M.; Hartog, L.; Grimm, G.; Hohmann, V.; Bramsløw, L.; Neher, T. Influence of multi-microphone signal enhancement algorithms on the acoustics and detectability of angular and radial source movements. Trends Hear. 2018, 22, 2331216518779719. [Google Scholar] [CrossRef]
- Pralus, A.; Hermann, R.; Cholvy, F.; Aguera, P.-E.; Moulin, A.; Barone, P.; Grimault, N.; Truy, E.; Tillmann, B.; Caclin, A. Rapid Assessment of Non-Verbal Auditory Perception in Normal-Hearing Participants and Cochlear Implant Users. J. Clin. Med. 2021, 10, 2093. [Google Scholar] [CrossRef]
- da Rosa Tavares, J.E.; Victória Barbosa, J.L. Apollo SignSound: An intelligent system applied to ubiquitous healthcare of deaf people. J. Reliab. Intell. Environ. 2021, 7, 157–170. [Google Scholar] [CrossRef]
- Otálora, A.S.; Moreno, N.C.; Osorio, D.E.C.; Trujillo, L.A.G. Prototype of bracelet detection alarm sounds for deaf and hearing loss. ARPN J. Eng. Appl. Sci. 2017, 12, 1111–1117. [Google Scholar]
- An, J.-H.; Koo, N.-K.; Son, J.-H.; Joo, H.-M.; Jeong, S. Development on Deaf Support Application Based on Daily Sound Classification Using Image-based Deep Learning. JOIV: Int. J. Inform. Vis. 2022, 6, 250–255. [Google Scholar] [CrossRef]
- Jemaa, A.B.; Irato, G.; Zanela, A.; Brescia, A.; Turki, M.; Jaïdane, M. Congruent auditory display and confusion in sound localization: Case of elderly drivers. Transp. Res. Part F Traffic Psychol. Behav. 2018, 59, 524–534. [Google Scholar] [CrossRef]
- Mirzaei, M.; Kan, P.; Kaufmann, H. EarVR: Using ear haptics in virtual reality for deaf and Hard-of-Hearing people. IEEE Trans. Vis. Comput. Graph. 2020, 26, 2084–2093. [Google Scholar] [CrossRef]
- Brungart, D.S.; Cohen, J.; Cord, M.; Zion, D.; Kalluri, S. Assessment of auditory spatial awareness in complex listening environments. J. Acoust. Soc. Am. 2014, 136, 1808–1820. [Google Scholar] [CrossRef] [PubMed]
- Picou, E.M.; Rakita, L.; Buono, G.H.; Moore, T.M. Effects of increasing the overall level or fitting hearing aids on emotional responses to sounds. Trends Hear. 2021, 25, 23312165211049938. [Google Scholar] [CrossRef] [PubMed]
- Hamel, B.L.; Vasil, K.; Shafiro, V.; Moberly, A.C.; Harris, M.S. Safety-relevant environmental sound identification in cochlear implant candidates and users. Laryngoscope 2020, 130, 1547–1551. [Google Scholar] [CrossRef] [PubMed]
- Yağanoğlu, M. Real time wearable speech recognition system for deaf persons. Comput. Electr. Eng. 2021, 91, 107026. [Google Scholar] [CrossRef]
- Fullerton, A.M.; Vickers, D.A.; Luke, R.; Billing, A.N.; McAlpine, D.; Hernandez-Perez, H.; Peelle, J.E.; Monaghan, J.J.; McMahon, C.M. Cross-modal functional connectivity supports speech understanding in cochlear implant users. Cereb. Cortex 2023, 33, 3350–3371. [Google Scholar] [CrossRef]
- Dillard, L.K.; Der, C.M.; Laplante-Lévesque, A.; Swanepoel, D.W.; Thorne, P.R.; McPherson, B.; de Andrade, V.; Newall, J.; Ramos, H.D.; Kaspar, A. Service delivery approaches related to hearing aids in low-and middle-income countries or resource-limited settings: A systematic scoping review. PLoS Glob. Public Health 2024, 4, e0002823. [Google Scholar] [CrossRef]
- Han, J.S.; Lim, J.H.; Kim, Y.; Aliyeva, A.; Seo, J.-H.; Lee, J.; Park, S.N. Hearing Rehabilitation with a Chat-Based Mobile Auditory Training Program in Experienced Hearing Aid Users: Prospective Randomized Controlled Study. JMIR mHealth uHealth 2024, 12, e50292. [Google Scholar] [CrossRef]
- Kuriakose, B.; Shrestha, R.; Sandnes, F.E. Tools and technologies for blind and visually impaired navigation support: A review. IETE Tech. Rev. 2022, 39, 3–18. [Google Scholar] [CrossRef]
- Andersen, A.H.; Santurette, S.; Pedersen, M.S.; Alickovic, E.; Fiedler, L.; Jensen, J.; Behrens, T. Creating clarity in noisy environments by using deep learning in hearing aids. Semin. Hear. 2021, 42, 260–281. [Google Scholar] [CrossRef]
- Fitria, T.N. Augmented reality (AR) and virtual reality (VR) technology in education: Media of teaching and learning: A review. Int. J. Comput. Inf. Syst. 2023, 4, 14–25. [Google Scholar]
- Athanasopoulos, M.; Samara, P.; Athanasopoulos, I. Advances in 3D Inner Ear Reconstruction Software for Cochlear Implants: A Comprehensive Review. Methods Protoc. 2024, 7, 46. [Google Scholar] [CrossRef] [PubMed]
- Iyortsuun, N.K.; Kim, S.-H.; Jhon, M.; Yang, H.-J.; Pant, S. A review of machine learning and deep learning approaches on mental health diagnosis. Healthcare 2023, 11, 285. [Google Scholar] [CrossRef] [PubMed]
- Halbouni, A.; Gunawan, T.S.; Habaebi, M.H.; Halbouni, M.; Kartiwi, M.; Ahmad, R. Machine learning and deep learning approaches for cybersecurity: A review. IEEE Access 2022, 10, 19572–19585. [Google Scholar] [CrossRef]
- ZainEldin, H.; Gamel, S.A.; Talaat, F.M.; Aljohani, M.; Baghdadi, N.A.; Malki, A.; Badawy, M.; Elhosseini, M.A. Silent no more: A comprehensive review of artificial intelligence, deep learning, and machine learning in facilitating deaf and mute communication. Artif. Intell. Rev. 2024, 57, 188. [Google Scholar] [CrossRef]
- Smailov, N.; Dosbayev, Z.; Omarov, N.; Sadykova, B.; Zhekambayeva, M.; Zhamangarin, D.; Ayapbergenova, A. A novel deep CNN-RNN approach for real-time impulsive sound detection to detect dangerous events. Int. J. Adv. Comput. Sci. Appl. 2023, 14, 271–280. [Google Scholar] [CrossRef]
- Ohshiro, K.; Cartwright, M. How people who are deaf, Deaf, and hard of hearing use technology in creative sound activities. In Proceedings of the 24th International ACM SIGACCESS Conference on Computers and Accessibility, Athens, Greece, 23–26 October 2022; pp. 1–4. [Google Scholar]
- Findlater, L.; Chinh, B.; Jain, D.; Froehlich, J.; Kushalnagar, R.; Lin, A.C. Deaf and hard-of-hearing individuals’ preferences for wearable and mobile sound awareness technologies. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–13. [Google Scholar]
- Birinci, F.G.; Saricoban, A. The effectiveness of visual materials in teaching vocabulary to deaf students of EFL. J. Lang. Linguist. Stud. 2021, 17, 628–645. [Google Scholar] [CrossRef]
- Campos, V.; Cartes-Velásquez, R.; Bancalari, C. Development of an app for the dental care of Deaf people: Odontoseñas. Univers. Access Inf. Soc. 2020, 19, 451–459. [Google Scholar] [CrossRef]
- Alothman, A.A. Language and Literacy of Deaf Children. Psychol. Educ. 2021, 58, 799–819. [Google Scholar] [CrossRef]
- Domagała-Zyśk, E.; Podlewska, A. Strategies of oral communication of deaf and hard-of-hearing (D/HH) non-native English users. Eur. J. Spec. Needs Educ. 2019, 34, 156–171. [Google Scholar] [CrossRef]
- Papatsimouli, M.; Sarigiannidis, P.; Fragulis, G.F. A Survey of Advancements in Real-Time Sign Language Translators: Integration with IoT Technology. Technologies 2023, 11, 83. [Google Scholar] [CrossRef]
- Sanders, M.E.; Kant, E.; Smit, A.L.; Stegeman, I. The effect of hearing aids on cognitive function: A systematic review. PLoS ONE 2021, 16, e0261207. [Google Scholar] [CrossRef] [PubMed]
- Guan, W.; Wang, S.; Liu, C. Influence of perceived discrimination on problematic smartphone use among Chinese deaf and hard-of-hearing students: Serial mediating effects of sense of security and social avoidance. Addict. Behav. 2023, 136, 107470. [Google Scholar] [CrossRef] [PubMed]
- Li, X.; Yang, Y.; Ye, Z.; Wang, Y.; Chen, Y. EarCase: Sound Source Localization Leveraging Mini Acoustic Structure Equipped Phone Cases for Hearing-challenged People. In Proceedings of the Twenty-Fourth International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing, Washington, DC, USA, 23–26 October 2023; pp. 240–249. [Google Scholar]
- Ashraf, I.; Hur, S.; Park, Y. Smartphone sensor based indoor positioning: Current status, opportunities, and future challenges. Electronics 2020, 9, 891. [Google Scholar] [CrossRef]
- Khan, M.A.; Ahmad, I.; Nordin, A.N.; Ahmed, A.E.-S.; Mewada, H.; Daradkeh, Y.I.; Rasheed, S.; Eldin, E.T.; Shafiq, M. Smart android based home automation system using internet of things (IoT). Sustainability 2022, 14, 10717. [Google Scholar] [CrossRef]
- Gilberto, L.G.; Bermejo, F.; Tommasini, F.C.; García Bauza, C. Virtual Reality Audio Game for Entertainment & Sound Localization Training. ACM Trans. Appl. Percept. 2024. [Google Scholar] [CrossRef]
- Lee, S.; Hubert-Wallander, B.; Stevens, M.; Carroll, J.M. Understanding and designing for deaf or hard of hearing drivers on Uber. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–12. [Google Scholar]
| Criteria for Inclusion | Criteria for Exclusion |
|---|---|
| Objective | Methodology | Target Group | Technology/Tool | Alert Type | Environment of Test | Results | Ref. |
|---|---|---|---|---|---|---|---|
| Translation services and alerts about surrounding sounds. | Mixed-methods research. | DHI | Smartphone and Google Glass (GG) | Vibration and visual | Internal and external environments | The results demonstrate its ease of use and utility. | [4] |
| Monitoring infants by hearing-impaired parents. | Applied research. | Hearing impaired | Smartphone and an Arduino board | Vibration and visual | Internal environment | Successful real-time monitoring and notification of abnormal conditions. | [21] |
| The impact of noise on the precision of target sound detection. | Mixed-design experimental study. | Hearing impaired | A digital audiometer and laptop | Auditory | Internal environment | Detection accuracy was lower at 6000 Hz than at 125 and 1000 Hz. | [22] |
| Siren detection system. | Deep learning-based siren sound identification. | Hearing impaired | Computer and deep learning | Visual | Internal and external environments | 91% accuracy was reached in the real world. | [23] |
| Enable the deaf to distinguish alarm sounds. | User-centric development. | Deaf | Deaf assistance digital system (DADS) and neural networks | Vibration and visual | Internal and external environments | More than 90% accuracy. | [24] |
| Improve the quality of life of the hearing impaired. | Mixture of experimental and field-testing approaches. | Hearing impaired | Wearable device | Vibration | Internal and external environments | Usefulness was rated at 97% and clarity at 90%. | [25] |
| Decode sounds through vibration patterns. | Quantitative experimental methodology. | DHI | Wearable device | Vibration | Internal environment | The results showed that younger participants tended to perform better. | [26] |
| Alert system for deaf pedestrians. | User-centered approach. | Deaf | Wearable device | Vibration | External environment | Detection accuracy was 85% for large vehicles and 92.5% for small vehicles. | [27] |
| Estimating sound direction. | Mixed-methods research. | Deaf | Wearable device | Vibration | Internal environment | The estimated time to track the sound was 2.8 s. | [28] |
| Creating a synthetic signal. | Mixed-methods research. | Hearing impaired | Smartphone | Auditory | External environment | The results highlighted that target-detection performance by app users was robust. | [29] |
| Estimating sound direction. | User-centered design. | DHI | Wearable device | Vibration and visual | Internal and external environments | The best success rate was 98%. | [30] |
| Identify sounds occurring in daily life. | User-centered design. | DHI | Augmented reality (AR) and machine learning (ML) | Vibration and visual | Internal and external environments | The outcomes demonstrated that household areas can be made more comfortable for everyday living. | [31] |
| Recognize emergency vehicle sirens. | Applied research. | Hearing impaired | Wearable device | Vibration and visual | External environment | Offline mode achieved an accuracy of 97.1%, and online mode an accuracy of 95.2%. | [32] |
| Detectability of sound-source movements under multiple simultaneous sound sources. | Comparative experimental design. | Hearing impaired | Hearing aids | Auditory | Internal and external environments | The results showed that echo impairs this ability. | [33] |
| Recognizing alarms in different noisy environments. | Quantitative experimental design. | Hearing impaired | Hearing aids, smartphone, and convolutional recurrent neural network (CRNN) | Auditory | Internal and external environments | The results show the practical use and effectiveness of the system in real, noisy scenarios. | [34] |
| How various hearing aid directional processing settings impact spatial awareness. | Quantitative experimental design. | Hearing impaired | Hearing aids | Auditory | External environment | The number of simultaneous sound sources significantly affected the participants' ability to perceive spatial information, while different hearing aid settings had no effect. | [35] |
| Potential methods to improve spatial awareness. | Quantitative experimental design with comparative and perceptual testing elements. | Hearing impaired | Hearing aids | Auditory | Internal and external environments | The results provide a foundation for enhancing the detectability of spatially dynamic sounds with hearing aids. | [36] |
| Assess the pattern of nonverbal auditory perceptual deficits in ten cochlear implant (CI) users. | Comparative experimental design. | Hearing impaired | CI and iPad | Auditory | Internal environment | The CI users showed deficits in emotion perception and in recognizing small pitch changes. | [37] |
| Detect ambient dangers. | Mixed-methods research. | Deaf | Smartphone and ML | Vibration and visual | Internal environment | 90% of participants approved of the perceived utility and ease of use. | [38] |
| Distinguish between the sounds of an emergency and an alarm. | Applied engineering and design. | Deaf | Bracelet and artificial neural networks (ANN) | Vibration and visual | Internal environment | The test showed an efficiency of 75%. | [39] |
| Convert common daily sounds into Mel-spectrograms. | Quantitative research methodology. | DHI | Bracelet, smartphone, and deep learning | Vibration and visual | Internal and external environments | The classification rate for mixed sounds was 80%. | [40] |
| Effect of hearing impairment on older drivers' performance. | Quantitative experimental design. | Hearing impaired | Hearing aids | Auditory | External environment | Drivers who wore hearing aids did not outperform those who did not. | [41] |
| Importance of using VR for DHI people. | Mixed-methods research. | DHI | Virtual reality (VR) | Vibration and visual | Internal environment | DHI people were inspired by EarVR to utilize VR more frequently. | [42] |
| Auditory spatial awareness in complicated listening contexts. | Quantitative experimental design. | Hearing impaired | Hearing aids | Auditory | Internal environment | Performance declined as task complexity and the number of competing sound sources in the acoustic landscape increased. | [43] |
| Evaluated the effect of hearing aid use on emotional responses to nonspeech sounds. | Comparative experimental design. | DHI | Hearing aids | Auditory | Internal environment | The findings suggest that current interventions do not ameliorate the effects of hearing loss on emotional responses to sound. | [44] |
| Identification of safety-relevant ambient sound. | Cross-sectional comparative methodology. | Hearing impaired | CI | Auditory | External environment | There was no appreciable difference in safety-relevant environmental sound recognition between experienced CI users (CI-E) and CI candidates (CI-C). | [45] |
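Several of the systems summarized above share a common first stage: framing the audio stream and flagging frames whose energy indicates a salient non-speech event before classification or user notification. As an illustrative sketch only (not the method of any included study), a minimal energy-based event detector can be written with NumPy; the function name, frame length, and threshold are assumptions chosen for the example.

```python
import numpy as np

def detect_loud_events(signal, sr, frame_ms=50, threshold_db=-20.0):
    """Flag frames whose RMS energy exceeds a threshold relative to the
    loudest frame, and merge consecutive flagged frames into events.

    Returns a list of (start_seconds, end_seconds) intervals.
    """
    frame_len = max(1, int(sr * frame_ms / 1000))
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt(np.mean(frames ** 2, axis=1) + 1e-12)
    # Energy of each frame in dB relative to the loudest frame.
    db = 20 * np.log10(rms / (rms.max() + 1e-12) + 1e-12)
    active = db > threshold_db
    events, start = [], None
    for i, is_active in enumerate(active):
        if is_active and start is None:
            start = i                      # event begins
        elif not is_active and start is not None:
            events.append((start * frame_len / sr, i * frame_len / sr))
            start = None                   # event ends
    if start is not None:                  # event runs to end of signal
        events.append((start * frame_len / sr, n_frames * frame_len / sr))
    return events

# Synthetic example: 1 s of near-silence with a loud 1 kHz tone from 0.4-0.6 s.
sr = 16000
t = np.arange(sr) / sr
sig = 0.001 * np.random.default_rng(0).standard_normal(sr)
tone = (t >= 0.4) & (t < 0.6)
sig[tone] += 0.5 * np.sin(2 * np.pi * 1000 * t[tone])
print(detect_loud_events(sig, sr))
```

In a deployed assistive device, the flagged intervals would feed a classifier (e.g., the CRNN or Mel-spectrogram pipelines cited in the table) and then trigger the vibration or visual alert channel.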
Disclaimer/Publisher's Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Mohammed, H.B.M.; Cavus, N. Utilization of Detection of Non-Speech Sound for Sustainable Quality of Life for Deaf and Hearing-Impaired People: A Systematic Literature Review. Sustainability 2024, 16, 8976. https://doi.org/10.3390/su16208976