Search Results (2)

Search Parameters:
Keywords = conditioned play audiometry

15 pages, 6077 KB  
Article
Pediatric Speech Audiometry Web Application for Hearing Detection in the Home Environment
by Stanislav Ondáš, Eva Kiktová, Matúš Pleva, Mária Oravcová, Lukáš Hudák, Jozef Juhár and Július Zimmermann
Electronics 2020, 9(6), 994; https://doi.org/10.3390/electronics9060994 - 13 Jun 2020
Cited by 15 | Viewed by 9092
Abstract
This paper describes the development of a speech audiometry application for pediatric patients in the Slovak language and the experience gained during testing with healthy children, hearing-impaired children, and elderly persons. The initial motivation behind the presented work was to reduce the stress and fear of children who must undergo postoperative audiometry, but over time we shifted our focus to a simple, game-like mobile application for detecting possible hearing problems in children in the home environment. Conditioned play audiometry principles were adopted to create a speech audiometry application in which children help the virtual robot Thomas assign words to pictures; this can be described as a speech recognition test. Several game scenarios, along with issues related to the test settings and conditions, were created, tested, and discussed. Initial experience shows a positive influence on the children’s mood and motivation.
(This article belongs to the Special Issue Human Computer Interaction for Intelligent Systems)
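The application adopts conditioned play audiometry: the child hears a recorded word, assigns it to one of several pictures, and the virtual robot rewards correct matches, which yields a word-recognition score. A minimal sketch of such a trial loop is given below, assuming hypothetical word-picture sets and a simulated child response; the class and function names are illustrative and are not taken from the authors’ implementation.

```python
import random
from dataclasses import dataclass

@dataclass
class PlayAudiometryTrial:
    """One word-picture matching trial in conditioned-play-audiometry style.

    Word lists, pictures, and scoring are illustrative placeholders only;
    they are not taken from the published application.
    """
    word: str        # target word played back as recorded speech
    pictures: list   # picture labels shown on screen; exactly one matches

    def present(self) -> str:
        # The real app plays the word and waits for the child to tap a picture;
        # here the child's answer is simulated so the sketch is runnable.
        return random.choice(self.pictures)

def run_block(word_picture_sets, level_db: int) -> float:
    """Run one block of trials and return the word-recognition score.

    level_db would set playback intensity in a real application; it is
    only carried along for documentation in this sketch.
    """
    correct = 0
    for word, pictures in word_picture_sets:
        response = PlayAudiometryTrial(word, pictures).present()
        if response == word:
            correct += 1   # the virtual robot would reward a correct match here
    return correct / len(word_picture_sets)

# Hypothetical word-picture sets; a low score at a comfortable listening level
# would flag a possible hearing problem worth a clinical follow-up.
sets = [("pes", ["pes", "dom", "lopta"]),
        ("dom", ["mak", "dom", "auto"]),
        ("auto", ["auto", "pes", "strom"])]
print(f"word recognition score at 40 dB: {run_block(sets, level_db=40):.2f}")
```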

8 pages, 1830 KB  
Article
New Method for Pure-Tone Audiometry Using Electrooculogram: A Proof-of-Concept Study
by Do Yeon Kim, Jinuk Kwon, Joo-Young Kim, Ho-Seung Cha, Yong-Wook Kim, In Young Kim and Chang-Hwan Im
Sensors 2018, 18(11), 3651; https://doi.org/10.3390/s18113651 - 28 Oct 2018
Cited by 2 | Viewed by 4847
Abstract
Precise and timely evaluation of an individual’s hearing loss plays an important role in determining appropriate treatment strategies, including medication and aural rehabilitation. However, currently available hearing assessment systems do not satisfy the need for an objective assessment tool with a simple and non-invasive procedure. In this paper, we propose a new method for pure-tone audiometry, which may potentially be used to assess an individual’s hearing ability objectively and quantitatively, without the need for the user’s active response. The proposed method is based on the auditory oculogyric reflex, in which the eyes involuntarily rotate towards the source of a sound, in response to spatially moving pure-tone audio stimuli modulated at specific frequencies and intensities. We quantitatively analyzed horizontal electrooculograms (EOG) recorded with a pair of electrodes under two conditions: when pure-tone stimuli were (1) “inaudible” or (2) “audible” to a participant. Preliminary experimental results showed significantly increased EOG amplitude in the audible condition compared to the inaudible condition for all ten healthy participants. This demonstrates the potential use of the proposed method as a new non-invasive hearing assessment tool.
(This article belongs to the Special Issue Sensors for Biosignal Processing)
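The abstract’s central comparison, EOG amplitude under audible versus inaudible pure-tone stimulation in the same ten participants, is naturally a paired analysis. The sketch below shows one way such a comparison could be run, assuming simulated placeholder amplitudes; the data and the choice of a paired t-test are assumptions here, not necessarily the analysis reported in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated horizontal EOG amplitudes for ten participants (arbitrary units).
# These are random placeholders, not the study's measurements; the real data
# came from electrode recordings during spatially moving pure-tone stimulation.
inaudible = rng.normal(loc=1.0, scale=0.2, size=10)
audible = inaudible + rng.normal(loc=1.5, scale=0.3, size=10)

# Paired comparison across participants; the abstract reports a significant
# amplitude increase in the audible condition. The test choice here is ours.
t_stat, p_value = stats.ttest_rel(audible, inaudible)
print(f"mean increase: {np.mean(audible - inaudible):.2f} a.u., p = {p_value:.4g}")
```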
