SPEED: A Graphical User Interface Software for Processing Eye Tracking Data
Abstract
1. Introduction
1.1. Applications of Eye Tracking
1.2. Eye-Tracking Devices
2. Eye-Tracker Software
3. SPEED Software
3.1. Events Loader
3.2. Features
- Placing AprilTags on each corner of the observation area (https://github.com/AprilRobotics/apriltag) and performing a “Marker Mapper” Enrichment;
- Providing the reference image and a scanning recording for the “Reference Image Mapper” (https://docs.pupil-labs.com/neon/pupil-cloud/enrichments/).
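Once an Enrichment has finished processing on Pupil Cloud, the mapped gaze and fixation data can be downloaded as CSV files and loaded for further analysis. The following is a minimal sketch under assumed file names and folder layout; the exact export schema depends on the Enrichment type and should be checked against the Pupil Cloud documentation:

```python
import pandas as pd

# Hypothetical paths: the folder downloaded from a finished Enrichment.
gaze = pd.read_csv("enrichment_export/gaze.csv")
fixations = pd.read_csv("enrichment_export/fixations.csv")

print(f"{len(gaze)} mapped gaze samples, {len(fixations)} fixations")
print(gaze.columns.tolist())  # inspect which coordinates the Enrichment provides
```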
3.3. Plots
3.4. Interface
4. Example of Use
4.1. Background
4.2. Participants
4.3. Moral Decision-Making Task
- Eight non-moral (NM) or control dilemmas, in which there is no moral choice (no emotional involvement);
- Eight impersonal moral (MI) dilemmas, in which the protagonist neither causes nor induces harm to others by his or her actions but behaves in a socially wrong way (significant emotional involvement);
- Eight personal moral (MP) dilemmas, in which the protagonist behaves in ways that may induce harm to others but with good and positive purposes (very high emotional involvement).
- Unobserved (U): the dilemma was presented in its original form, with no observer present, plus the additional audio cue “Sai che nessun altro ti vede” (Translation: “You know that no one else is observing you”);
- Media (M): a dilemma where journalists or members of the media observe the experimental subject, plus the additional audio cue “Sai che sei osservato da un giornalista” (Translation: “You know that you are observed by a journalist”);
- Authority (A): a dilemma in which a law enforcement officer or security guard observes the subject, plus the additional audio cue “Sai che sei osservato da un poliziotto” (Translation: “You know that you are observed by a policeman”).
4.4. Procedure
4.5. Data Analysis
4.6. Results
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| ASD | Autism Spectrum Disorder |
| GUI | Graphical User Interface |
| MDMT | Moral Decision-Making Task |
| ROI | Region of Interest |
| SPEED | labScoc Processing and Extraction of Eye Tracking Data |
Appendix A. Algorithms
Appendix A.1. Indexing Movements
Algorithm A1 Indexing movements

function FillMissingFixationID(TABLE gaze_mark)
    for each row of gaze_mark do
        if the fixation ID is not a number then
            ⟨fill it⟩
        end if
    end for
    return TABLE gaze_mark
end function

function IndexMovement()
    TABLE gaze_mark ← FillMissingFixationID(TABLE gaze_mark)
    for each row of gaze_mark do
        if the fixation ID == −1 then
            ⟨…⟩ = row number of ⟨…⟩
            while ⟨…⟩ do
                ⟨…⟩ = row number of ⟨…⟩ + 1 + 0.5
            end while
        end if
    end for
    return TABLE gaze_mark
end function
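As an illustration of Algorithm A1, a Python sketch could flag samples that lack a fixation ID and then give them a half-integer index derived from the neighbouring fixations; the column name "fixation id", the −1 marker, and the exact filling rule are assumptions rather than the published implementation:

```python
import pandas as pd

def fill_missing_fixation_id(gaze_mark: pd.DataFrame) -> pd.DataFrame:
    """Replace non-numeric fixation IDs (movements, gaps) with -1."""
    out = gaze_mark.copy()
    out["fixation id"] = pd.to_numeric(out["fixation id"], errors="coerce").fillna(-1)
    return out

def index_movements(gaze_mark: pd.DataFrame) -> pd.DataFrame:
    """Give every movement sample a half-integer index between the surrounding fixations."""
    out = fill_missing_fixation_id(gaze_mark)
    last_fixation = 0
    new_ids = []
    for fid in out["fixation id"]:
        if fid == -1:
            new_ids.append(last_fixation + 0.5)  # movement between fixation N and N+1
        else:
            last_fixation = fid
            new_ids.append(fid)
    out["fixation id"] = new_ids
    return out
```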
Appendix A.2. Extracting Movements
Algorithm A2 Extracting Movements
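A possible reading of Algorithm A2, building on the indexing sketch above, is that a movement is the group of consecutive samples sharing one half-integer index. A hypothetical sketch:

```python
def extract_movements(gaze_mark):
    """Group the rows of each half-integer movement index (see the Algorithm A1 sketch)."""
    movements = {}
    for idx, rows in gaze_mark.groupby("fixation id"):
        if idx % 1 != 0:  # half-integer index -> movement segment, integer -> fixation
            movements[idx] = rows
    return movements
```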
Appendix A.3. Get the First Element of a List
Algorithm A3 Get the First Element of a List
Appendix A.4. Get the Last Element of a List
Algorithm A4 Get the Last Element of a List
Appendix A.5. Euclidean Distance
Algorithm A5 Euclidean distance between the last and first elements of a list
Appendix A.6. Sum of Euclidean Distances
Algorithm A6 Sum of Euclidean distances between consecutive elements in a list
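Algorithms A3 to A6 are small list utilities whose bodies reduce to a few lines of Python. As a sketch (function names are illustrative), the straight-line amplitude of a trajectory and its total path length can be computed as follows:

```python
from math import hypot

def first(points):
    """Algorithm A3: first element of a list."""
    return points[0]

def last(points):
    """Algorithm A4: last element of a list."""
    return points[-1]

def amplitude(points):
    """Algorithm A5: Euclidean distance between the last and first points."""
    (x0, y0), (x1, y1) = first(points), last(points)
    return hypot(x1 - x0, y1 - y0)

def path_length(points):
    """Algorithm A6: sum of Euclidean distances between consecutive points."""
    return sum(hypot(x1 - x0, y1 - y0)
               for (x0, y0), (x1, y1) in zip(points, points[1:]))

# e.g. a short gaze trajectory in pixels
print(amplitude([(100, 100), (130, 110), (160, 140)]))    # straight-line distance
print(path_length([(100, 100), (130, 110), (160, 140)]))  # travelled distance
```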
Appendix A.7. Gaze Fixation Extractor
Algorithm A7 GazeFixationExtractor
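As a hedged illustration of what a fixation extractor computes (not the published implementation; column names such as "timestamp", "gaze x" and "gaze y" are assumptions), each fixation can be reduced to its start, end, duration and centroid:

```python
import pandas as pd

def gaze_fixation_extractor(gaze_mark: pd.DataFrame) -> pd.DataFrame:
    """Aggregate per-fixation features; integer indices are taken to denote fixations."""
    fixations = gaze_mark[gaze_mark["fixation id"] % 1 == 0]
    summary = fixations.groupby("fixation id").agg(
        start=("timestamp", "min"),
        end=("timestamp", "max"),
        x=("gaze x", "mean"),   # centroid x
        y=("gaze y", "mean"),   # centroid y
    )
    summary["duration"] = summary["end"] - summary["start"]
    return summary
```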
Appendix A.8. Gaze Movement Extractor
Algorithm A8 GazeMovementExtractor
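Analogously, a movement extractor can summarize each half-integer-indexed segment by its duration, amplitude and path length, reusing the amplitude() and path_length() helpers sketched for Algorithms A5 and A6; again, column names are assumptions:

```python
import pandas as pd

def gaze_movement_extractor(gaze_mark: pd.DataFrame) -> pd.DataFrame:
    """Summarize each movement segment of the indexed gaze table."""
    rows = []
    movements = gaze_mark[gaze_mark["fixation id"] % 1 != 0]
    for idx, seg in movements.groupby("fixation id"):
        points = list(zip(seg["gaze x"], seg["gaze y"]))
        rows.append({
            "movement id": idx,
            "duration": seg["timestamp"].max() - seg["timestamp"].min(),
            "amplitude": amplitude(points),      # straight-line distance (Algorithm A5)
            "path length": path_length(points),  # cumulative distance (Algorithm A6)
        })
    return pd.DataFrame(rows)
```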
Appendix A.9. Adding on ROI
Algorithm A9 Adding on ROI

Appendix A.10. Process Blink
Algorithm A10 Process Blink
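As hedged illustrations of these last two steps (the exact ROI geometry and blink features in SPEED may differ, and all column names are assumptions), ROI tagging can be a point-in-rectangle test on each fixation centroid, and blink processing can reduce blink events to a few summary indexes:

```python
import pandas as pd

def add_on_roi(fixations: pd.DataFrame, rois: dict) -> pd.DataFrame:
    """Label each fixation with the first rectangular ROI containing its centroid.

    `rois` maps a name to (x_min, y_min, x_max, y_max); this layout is an assumption."""
    def locate(x, y):
        for name, (x0, y0, x1, y1) in rois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return "outside"

    out = fixations.copy()
    out["roi"] = [locate(x, y) for x, y in zip(out["x"], out["y"])]
    return out

def process_blink(blinks: pd.DataFrame, recording_duration_s: float) -> dict:
    """Reduce blink events to simple indexes: count, mean duration, rate per minute."""
    durations = blinks["end"] - blinks["start"]
    return {
        "count": int(len(blinks)),
        "mean duration": float(durations.mean()),
        "blinks per minute": 60.0 * len(blinks) / recording_duration_s,
    }
```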
| Category | Algorithm(s) | Indexes | Plots | Statistics |
|---|---|---|---|---|
| Blink | | | | |
| Fixations | | | | |
| Gaze | | | | |
| Saccade | | | | |
| Pupillometry | | | | |
| Equation | Description |
|---|---|
| $d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}$ | Calculates the Euclidean distance between two points, $(x_1, y_1)$ and $(x_2, y_2)$, in a two-dimensional plane. |
| $\max(L)$ | Finds the maximum element within a given list. |
| $\min(L)$ | Finds the minimum element within a given list. |
| $\sum_{i=1}^{n} L_i$ | Calculates the sum of all elements in a given list. |
| $n = \mathrm{len}(L)$ | Determines the number of elements (length) of a given list. |
| $\bar{L} = \frac{1}{n}\sum_{i=1}^{n} L_i$ | Computes the average (mean) of the elements in a given list. |
| $\sigma = \sqrt{\frac{1}{n}\sum_{i=1}^{n} (L_i - \bar{L})^2}$ | Calculates the standard deviation of the elements in a given list, indicating the dispersion of the values around the mean. |
| Condition | Variation | No. |
|---|---|---|
| Non-Moral (NM) | - | 8 |
| Personal (MP) | Unobserved (U-MP) | 8 |
| | Authority (A-MP) | 8 |
| | Media (M-MP) | 8 |
| Impersonal (MI) | Unobserved (U-MI) | 8 |
| | Authority (A-MI) | 8 |
| | Media (M-MI) | 8 |
| Total | | 56 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).