Using Synchronized Eye Movements to Predict Attention in Online Video Learning
Abstract
1. Introduction
1.1. Online Learning and Attention Assessment
1.2. Eye Tracking and Synchronized Eye Movements
1.3. Learning Styles and Eye Movements
- Sensing/Intuitive continuum refers to the way individuals prefer to take in information. Sensing learners prefer concrete and practical information, relying on facts and details. Intuitive learners prefer conceptual and innovative information, concerned with theories and meanings.
- Visual/Verbal continuum describes how learners prefer to receive information. Visual learners learn best through visual aids such as diagrams, charts, and graphs. Verbal learners prefer written and spoken explanations.
- Active/Reflective continuum refers to the way individuals prefer to process information. Active learners prefer to learn through doing or discussing, while reflective learners prefer to think about and consider information before acting on it.
- Sequential/Global continuum refers to the way individuals prefer to organize information and progress toward understanding it. Sequential learners prefer organized, linear presentations, while global learners prefer a broader context and like to see the overall structure before focusing on details.
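To make the four continua concrete, the sketch below encodes them in a small Python structure. The dimension groupings follow the Felder–Silverman model as summarized above; the dictionary keys, class name, and field names are illustrative assumptions for this sketch, not identifiers from the study.

```python
from dataclasses import dataclass

# The four Felder-Silverman continua, each spanning two opposing preferences.
# The groupings follow the model described above; the key names are only
# illustrative labels for this sketch.
LEARNING_STYLE_DIMENSIONS = {
    "processing": ("Active", "Reflective"),     # how information is processed
    "perception": ("Sensing", "Intuitive"),     # what kind of information is taken in
    "input": ("Visual", "Verbal"),              # how information is best received
    "understanding": ("Sequential", "Global"),  # how understanding is organized
}

@dataclass
class LearningStyleProfile:
    """A learner's preferred category on each continuum (hypothetical representation)."""
    processing: str      # "Active" or "Reflective"
    perception: str      # "Sensing" or "Intuitive"
    input: str           # "Visual" or "Verbal"
    understanding: str   # "Sequential" or "Global"
```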
1.4. The Current Study
2. Method
2.1. Participants
2.2. Stimuli and Procedure
2.3. Data Analyses
2.3.1. Preprocessing of Eye Movement Data
2.3.2. Intersubject Correlation of Eye Movement Data
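As a rough illustration of the general intersubject correlation (ISC) idea used in this literature (e.g., Madsen et al., 2021), the sketch below correlates each learner's gaze traces with every other learner's traces and averages the result. The array layout, channel composition (e.g., horizontal gaze, vertical gaze, pupil size), and pairwise-averaging scheme are assumptions made for illustration only, not the authors' exact preprocessing or weighting.

```python
import numpy as np

def isc_eye_movements(gaze: np.ndarray) -> np.ndarray:
    """Per-subject ISC: mean correlation of one subject's traces with all others.

    gaze: array of shape (n_subjects, n_samples, n_channels); channels might be
    horizontal gaze, vertical gaze, and pupil size on a common time base.
    Generic illustration of the ISC idea, not the authors' exact pipeline.
    """
    n_subj, _, n_chan = gaze.shape
    isc = np.zeros(n_subj)
    for i in range(n_subj):
        pair_means = []
        for j in range(n_subj):
            if j == i:
                continue
            # Correlate each channel separately, then average across channels.
            r_channels = [
                np.corrcoef(gaze[i, :, c], gaze[j, :, c])[0, 1]
                for c in range(n_chan)
            ]
            pair_means.append(np.mean(r_channels))
        isc[i] = np.mean(pair_means)
    return isc
```

Under this reading, a higher per-subject value indicates gaze behavior more tightly coupled to the rest of the group, and a group-level ISC can be obtained by averaging across subjects.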
2.3.3. Learning Style Questionnaire
- If the score for a dimension is 1 or 3, learners are fairly well balanced on the two categories of that dimension, with only a mild preference for one or the other.
- If the score for a dimension is 5 or 7, learners have a moderate preference for one category of that dimension. Learners may learn less easily in an environment that fails to address that preference at least some of the time than they would in a more balanced environment.
- If the score for a dimension is 9 or 11, learners have a strong preference for one category of that dimension. Learners may have difficulty learning in an environment that fails to address that preference at least some of the time.
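These bands reduce to a simple lookup on the magnitude of each dimension score. The sketch below is a minimal, hypothetical helper that assumes the ILS yields an odd score from 1 to 11 on each dimension; the function name and returned labels are illustrative, not part of the instrument.

```python
def ils_preference_strength(score: int) -> str:
    """Map the magnitude of an ILS dimension score (odd, 1-11) to a band.

    1 or 3  -> fairly well balanced (mild preference)
    5 or 7  -> moderate preference
    9 or 11 -> strong preference
    Hypothetical helper; the ILS itself defines only the bands, not this code.
    """
    magnitude = abs(score)
    if magnitude not in (1, 3, 5, 7, 9, 11):
        raise ValueError("ILS dimension scores are odd values from 1 to 11")
    if magnitude <= 3:
        return "balanced (mild preference)"
    if magnitude <= 7:
        return "moderate preference"
    return "strong preference"

# Example: a learner scoring 7 toward one category of a dimension
print(ils_preference_strength(7))  # -> "moderate preference"
```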
3. Results
3.1. Scores of the Pretest and Posttest
3.2. ISC Difference between Attending and Distracted Conditions
3.3. ISC Prediction of Test Scores and Attention Levels
3.4. ISC of Different Learning Styles
4. Discussion
4.1. ISC of Eye Movements Predicts Learners’ Attention at a More Nuanced Level but Not Their Academic Performance
4.2. Learners’ Learning Styles Do Not Affect ISC’s Prediction Rate of Attention Assessment in Video Learning
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Pokhrel, S.; Chhetri, R. A Literature Review on Impact of COVID-19 Pandemic on Teaching and Learning. High. Educ. Future 2021, 8, 133–141. [Google Scholar] [CrossRef]
- Madsen, J.; Júlio, S.U.; Gucik, P.J.; Steinberg, R.; Parra, L.C. Synchronized Eye Movements Predict Test Scores in Online Video Education. Proc. Natl. Acad. Sci. USA 2021, 118, e2016980118. [Google Scholar] [CrossRef] [PubMed]
- Newmann, F.M. Student Engagement and Achievement in American Secondary Schools; Teachers College Press: New York, NY, USA, 1992. [Google Scholar]
- Polderman, T.J.C.; Boomsma, D.I.; Bartels, M.; Verhulst, F.C.; Huizink, A.C. A Systematic Review of Prospective Studies on Attention Problems and Academic Achievement. Acta Psychiatr. Scand. 2010, 122, 271–284. [Google Scholar] [CrossRef] [PubMed]
- Reynolds, R.E.; Shirey, L.L. The Role of Attention in Studying and Learning. In Learning and Study Strategies; Weinstein, C.E., Goetz, E.T., Alexander, P.A., Eds.; Academic Press: San Diego, CA, USA, 1988; pp. 77–100. [Google Scholar] [CrossRef]
- Chen, C.-M.; Wang, J.-Y.; Yu, C.-M. Assessing the Attention Levels of Students by Using a Novel Attention Aware System Based on Brainwave Signals. Br. J. Educ. Technol. 2017, 48, 348–369. [Google Scholar] [CrossRef]
- Chen, C.-M.; Wang, J.-Y. Effects of Online Synchronous Instruction with an Attention Monitoring and Alarm Mechanism on Sustained Attention and Learning Performance. Interact. Learn. Environ. 2018, 26, 427–443. [Google Scholar] [CrossRef]
- Lin, C.-H.; Chen, C.-M.; Lin, Y.-T. Improving Effectiveness of Learners’ Review of Video Lectures by Using an Attention-Based Video Lecture Review Mechanism Based on Brainwave Signals. In 2018 7th International Congress on Advanced Applied Informatics (IIAI-AAI); IEEE: Yonago, Japan, 2018; pp. 152–157. [Google Scholar] [CrossRef]
- Li, Q.; Ren, Y.; Wei, T.; Wang, C.; Liu, Z.; Yue, J. A Learning Attention Monitoring System via Photoplethysmogram Using Wearable Wrist Devices. In Artificial Intelligence Supported Educational Technologies; Pinkwart, N., Liu, S., Eds.; Advances in Analytics for Learning and Teaching; Springer International Publishing: Cham, Switzerland, 2020; pp. 133–150. [Google Scholar] [CrossRef]
- Abate, A.F.; Cascone, L.; Nappi, M.; Narducci, F.; Passero, I. Attention Monitoring for Synchronous Distance Learning. Future Gener. Comput. Syst. 2021, 125, 774–784. [Google Scholar] [CrossRef]
- Terraza Arciniegas, D.F.; Amaya, M.; Piedrahita Carvajal, A.; Rodriguez-Marin, P.A.; Duque-Muñoz, L.; Martinez-Vargas, J.D. Students’ Attention Monitoring System in Learning Environments Based on Artificial Intelligence. IEEE Lat. Am. Trans. 2022, 20, 126–132. [Google Scholar] [CrossRef]
- Alemdag, E.; Cagiltay, K. A Systematic Review of Eye Tracking Research on Multimedia Learning. Comput. Educ. 2018, 125, 413–428. [Google Scholar] [CrossRef]
- Jamil, N.; Belkacem, A.N.; Lakas, A. On Enhancing Students’ Cognitive Abilities in Online Learning Using Brain Activity and Eye Movements. Educ. Inf. Technol. 2023, 28, 4363–4397. [Google Scholar] [CrossRef]
- Pouta, M.; Lehtinen, E.; Palonen, T. Student Teachers’ and Experienced Teachers’ Professional Vision of Students’ Understanding of the Rational Number Concept. Educ. Psychol. Rev. 2021, 33, 109–128. [Google Scholar] [CrossRef]
- Sharma, K.; Giannakos, M.; Dillenbourg, P. Eye-Tracking and Artificial Intelligence to Enhance Motivation and Learning. Smart Learn. Environ. 2020, 7, 13. [Google Scholar] [CrossRef]
- Tsai, M.-J.; Hou, H.-T.; Lai, M.-L.; Liu, W.-Y.; Yang, F.-Y. Visual Attention for Solving Multiple-Choice Science Problem: An Eye-Tracking Analysis. Comput. Educ. 2012, 58, 375–385. [Google Scholar] [CrossRef]
- Hasson, U.; Landesman, O.; Knappmeyer, B.; Vallines, I.; Rubin, N.; Heeger, D.J. Neurocinematics: The Neuroscience of Film. Projections 2008, 2, 1–26. [Google Scholar] [CrossRef]
- Yang, F.-Y.; Chang, C.-Y.; Chien, W.-R.; Chien, Y.-T.; Tseng, Y.-H. Tracking Learners’ Visual Attention during a Multimedia Presentation in a Real Classroom. Comput. Educ. 2013, 62, 208–220. [Google Scholar] [CrossRef]
- Liu, Q.; Yang, X.; Chen, Z.; Zhang, W. Using Synchronized Eye Movements to Assess Attentional Engagement. Psychol. Res. 2023, 87, 2039–2047. [Google Scholar] [CrossRef]
- Esterman, M.; Rosenberg, M.D.; Noonan, S.K. Intrinsic Fluctuations in Sustained Attention and Distractor Processing. J. Neurosci. 2014, 34, 1724–1730. [Google Scholar] [CrossRef]
- Felder, R.M.; Silverman, L.K. Learning and Teaching Styles in Engineering Education. Eng. Educ. 1988, 78, 674–681. [Google Scholar]
- Nugrahaningsih, N.; Porta, M.; Klasnja-Milicevic, A. Assessing Learning Styles through Eye Tracking for E-Learning Applications. ComSIS 2021, 18, 1287–1309. [Google Scholar] [CrossRef]
- Al-Wabil, A.; ElGibreen, H.; George, R.P.; Al-Dosary, B. Exploring the Validity of Learning Styles as Personalization Parameters in eLearning Environments: An Eyetracking Study. In Proceedings of the 2010 2nd International Conference on Computer Technology and Development, Cairo, Egypt, 2–4 November 2010; pp. 174–178. [Google Scholar] [CrossRef]
- Mehigan, T.J.; Barry, M.; Kehoe, A.; Pitt, I. Using Eye Tracking Technology to Identify Visual and Verbal Learners. In Proceedings of the 2011 IEEE International Conference on Multimedia and Expo, Barcelona, Spain, 11–15 July 2011; pp. 1–6. [Google Scholar] [CrossRef]
- Luo, Z. Using Eye-Tracking Technology to Identify Learning Styles: Behaviour Patterns and Identification Accuracy. Educ. Inf. Technol. 2021, 26, 4457–4485. [Google Scholar] [CrossRef]
- Cao, J.; Nishihara, A. Understanding Learning Style by Eye Tracking in Slide Video Learning. J. Educ. Multimed. Hypermedia 2012, 21, 335–358. [Google Scholar]
- Cao, J.; Nishihara, A. Viewing Behaviors Affected by Slide Features and Learning Style in Slide Video from a Sequence Analysis Perspective. J. Inf. Syst. Educ. 2013, 12, 1–12. [Google Scholar] [CrossRef]
- Ou, C.; Joyner, D.A.; Goel, A.K. Designing and Developing Videos for Online Learning: A Seven-Principle Model. OLJ 2019, 23, 82–104. [Google Scholar] [CrossRef]
- Mayer, R.E.; DaPra, C.S. An Embodiment Effect in Computer-Based Learning with Animated Pedagogical Agents. J. Exp. Psychol. Appl. 2012, 18, 239–252. [Google Scholar] [CrossRef] [PubMed]
- Wang, F.; Li, W.; Mayer, R.E.; Liu, H. Animated Pedagogical Agents as Aids in Multimedia Learning: Effects on Eye-Fixations during Learning and Learning Outcomes. J. Educ. Psychol. 2018, 110, 250–268. [Google Scholar] [CrossRef]
- Diedenhofen, B.; Musch, J. cocor: A Comprehensive Solution for the Statistical Comparison of Correlations. PLoS ONE 2015, 10, e0121945. [Google Scholar] [CrossRef]
- Sauter, M.; Hirzle, T.; Wagner, T.; Hummel, S.; Rukzio, E.; Huckauf, A. Can Eye Movement Synchronicity Predict Test Performance With Unreliably-Sampled Data in an Online Learning Context? In 2022 Symposium on Eye Tracking Research and Applications; ACM: Seattle, WA, USA, 2022; pp. 1–5. [Google Scholar] [CrossRef]
- Mu, L.; Cui, M.; Wang, X.; Qiao, J.; Tang, D. Learners’ Attention Preferences of Information in Online Learning: An Empirical Study Based on Eye-Tracking. Interact. Technol. Smart Educ. 2019, 16, 186–203. [Google Scholar] [CrossRef]
Dimension | Active | Reflective | Sensing | Intuitive | Visual | Verbal | Sequential | Global
---|---|---|---|---|---|---|---|---
No. | 4 | 16 | 14 | 6 | 15 | 5 | 5 | 16

Dimension | Active | Reflective | Sensing | Intuitive | Visual | Verbal | Sequential | Global
---|---|---|---|---|---|---|---|---
rEM | 0.68 | 0.55 | 0.49 | 0.73 | 0.46 | 0.73 | 0.55 | 0.63
rEM + PS | 0.69 | 0.58 | 0.56 | 0.74 | 0.48 | 0.67 | 0.70 | 0.59

Dimension | Active | Reflective | Sensing | Intuitive | Visual | Verbal | Sequential | Global
---|---|---|---|---|---|---|---|---
95% CI | [−0.02, 0.02] | [−0.15, 0.13] | [−0.04, 0.03] | [−0.10, 0.01] | [−0.05, 0.03] | [−0.06, 0.06] | [−0.15, 0.13] | [−0.01, 0.04]