Article

Adaptive Real-Time Translation Assistance Through Eye-Tracking

by Dimosthenis Minas *, Eleanna Theodosiou, Konstantinos Roumpas and Michalis Xenos

Software Quality and Human-Computer Interaction Laboratory, University of Patras, 26504 Rio, Greece

* Author to whom correspondence should be addressed.
Submission received: 31 October 2024 / Revised: 1 December 2024 / Accepted: 16 December 2024 / Published: 2 January 2025
(This article belongs to the Special Issue Machine Learning for HCI: Cases, Trends and Challenges)

Abstract

This study introduces the Eye-tracking Translation Software (ETS), a system that leverages eye-tracking data and real-time translation to enhance reading flow for non-native language users in complex, technical texts. By measuring fixation duration to detect moments of cognitive load, ETS selectively provides translations, maintaining reading flow and engagement without undermining language learning. The key technological components include a desktop eye-tracker integrated with a custom Python-based application. Through a user-centered design, ETS dynamically adapts to individual reading needs, reducing cognitive strain by offering word-level translations when needed. A study involving 53 participants assessed ETS's impact on reading speed, fixation duration, and user experience, with findings indicating improved comprehension and reading efficiency. Results demonstrated that gaze-based adaptations significantly improved participants' reading experience and reduced cognitive load. Participants rated ETS's usability positively and expressed preferences for customization, such as pop-up placement and sentence-level translations. Future work will integrate AI-driven adaptations, allowing the system to adjust based on user proficiency and reading behavior. The study contributes to the growing evidence of eye-tracking's potential in educational and professional applications, offering a flexible, personalized approach to reading assistance that balances language exposure with real-time support.
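The core mechanism the abstract describes — treating a long fixation on a word as a proxy for cognitive load and then surfacing a word-level translation — can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the threshold value, sample interval, and toy lexicon are all assumptions for demonstration.

```python
FIXATION_THRESHOLD_MS = 600   # assumed trigger threshold (illustrative)
SAMPLE_INTERVAL_MS = 16       # assumed eye-tracker sample interval (~60 Hz)

# Toy bilingual lookup standing in for a real translation service.
LEXICON = {"ephemeral": "εφήμερος", "latency": "καθυστέρηση"}

class FixationTrigger:
    """Accumulates per-word fixation time and fires a translation once."""

    def __init__(self, threshold_ms=FIXATION_THRESHOLD_MS):
        self.threshold_ms = threshold_ms
        self.current_word = None
        self.elapsed_ms = 0
        self.translated = set()  # words already shown, to avoid re-triggering

    def on_gaze_sample(self, word):
        """Feed one gaze sample mapped to the word under the gaze point.

        Returns a translation string the first time the fixation on a
        known word crosses the threshold; otherwise returns None.
        """
        if word != self.current_word:
            # Gaze moved to a new word: restart the fixation timer.
            self.current_word = word
            self.elapsed_ms = 0
        self.elapsed_ms += SAMPLE_INTERVAL_MS
        if (self.elapsed_ms >= self.threshold_ms
                and word in LEXICON
                and word not in self.translated):
            self.translated.add(word)
            return LEXICON[word]
        return None
```

In the actual system this logic would be driven by the desktop eye-tracker's gaze stream and would render the returned translation as a pop-up near the fixated word; resetting the timer on word change keeps brief glances from triggering unwanted pop-ups.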
Keywords: eye-tracking technology; real-time translation; reading comprehension; human-computer interaction; adaptive learning tools; non-native language assistance; reading aid

Share and Cite

MDPI and ACS Style

Minas, D.; Theodosiou, E.; Roumpas, K.; Xenos, M. Adaptive Real-Time Translation Assistance Through Eye-Tracking. AI 2025, 6, 5. https://doi.org/10.3390/ai6010005

AMA Style

Minas D, Theodosiou E, Roumpas K, Xenos M. Adaptive Real-Time Translation Assistance Through Eye-Tracking. AI. 2025; 6(1):5. https://doi.org/10.3390/ai6010005

Chicago/Turabian Style

Minas, Dimosthenis, Eleanna Theodosiou, Konstantinos Roumpas, and Michalis Xenos. 2025. "Adaptive Real-Time Translation Assistance Through Eye-Tracking" AI 6, no. 1: 5. https://doi.org/10.3390/ai6010005

APA Style

Minas, D., Theodosiou, E., Roumpas, K., & Xenos, M. (2025). Adaptive Real-Time Translation Assistance Through Eye-Tracking. AI, 6(1), 5. https://doi.org/10.3390/ai6010005
