A Proposed Method of Automating Data Processing for Analysing Data Produced from Eye Tracking and Galvanic Skin Response
Abstract
1. Introduction
How to Extract Records from Eye Tracking Devices and How to Process and Analyse Records
- Produce the database.
- Define the problem.
- Understand the problem.
- Process the data, with the following sub-steps:
  - 4.1. Prepare the data: clean the data, which includes operations to correct erroneous data, filter incorrect data from the dataset, and reduce unnecessary data. Other data cleaning tasks involve detecting discrepancies and dirty data (fragments of original data that make no sense); these tasks are more closely related to understanding the original data and usually need human involvement. Cleaning also covers transformation of the data, in which the data are converted or consolidated so that the result of the data-mining process can be applied or be more effective. Sub-tasks within transformation include smoothing, construction of characteristics, and data aggregation or data summary; these require human supervision.
  - 4.2. Integrate the data (covers the fusion of data from various data stores). This process is applied to avoid redundancies and inconsistencies in the resulting dataset. Typical operations in data integration are identification and unification of variables and domains, analysis of the correlation of attributes, duplication of tuples, and detection of conflicts in data values from different sources.
  - 4.3. Normalise the data (this refers to the unit of measurement used, as this may affect the data analysis). All of the attributes must be expressed in the same units of measurement and must use a common scale or range. This step is particularly useful in statistical learning methods.
  - 4.4. Impute missing data (this is a form of data cleaning, the aim of which is to fill in missing values with a reasonable estimate obtained by different methods, such as applying the mean value); a brief sketch of imputation and normalisation is given after this list.
  - 4.5. Identify noise (this refers to the act of smoothing in data transformation, the main objective of which is to detect random errors or variance in a measured variable).
- Apply machine learning techniques.
- Evaluation.
- Exploit the results.
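As a generic illustration of sub-steps 4.3 and 4.4, the following minimal Python sketch with scikit-learn fills missing values with the column mean and then standardises the features; the toy matrix is invented for illustration and is not the authors' data or code.

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

# Toy feature matrix with one missing value (illustrative only)
X = np.array([[1.0, 200.0], [2.0, np.nan], [3.0, 240.0]])

X_imputed = SimpleImputer(strategy="mean").fit_transform(X)   # 4.4: mean imputation
X_normalised = StandardScaler().fit_transform(X_imputed)      # 4.3: common scale (z-scores)
```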
2. Materials and Methods
2.1. Participants
2.2. Instruments
- (a) Tobii Pro Lab v. 1.241.54542 [59]. This is a hardware and software platform used by researchers to analyse human behaviour. It is highly flexible and can be used to perform advanced studies of attention and in-depth analyses of cognitive processes. This software was chosen because it is currently one of the most widely used platforms in research [60] and because it readily integrates other measuring instruments, such as GSR sensors.
- (b) Shimmer3 GSR+ [61]. The Shimmer3 GSR+ (Galvanic Skin Response) unit provides connections and pre-amplification for one channel of Galvanic Skin Response data acquisition (Electrodermal Resistance, EDR, or Electrodermal Activity, EDA). The GSR+ unit is suitable for measuring the electrical characteristics or conductance of the skin. The device is compatible with Tobii Pro Lab v. 1.241.54542, and its metrics are integrated with those obtained from the eye tracking device in Tobii Pro Lab.
- (c) Virtual Laboratory 8: Eye tracking technology applied to Early Intervention II. A virtual classroom (also called a lab) in the eEarlyCare-T Research Project. This classroom is free and open access after logging into the platform https://www2.ubu.es/eearlycare_t/en/project (accessed on 8 August 2024) [62]. Images from the virtual classroom are provided in Figure A1.
- (d) Python libraries: pandas, scikit-learn (sklearn), matplotlib, and seaborn.
2.3. Procedure
2.4. Data Analysis
3. Results
- Step 1. Clean and filter the files (one file per participant was exported); in this case, the files were extracted in .xlsx format.
- Step 2. Import libraries to read the extracted files.
- Step 3. Create a function to concatenate the files (a Python sketch is given after this list).
- data_folder: path to the folder containing the data files.
- extension: file extension (e.g., .xlsx).
- concatenated_data: a DataFrame formed by concatenating all the files.
- Initialize files as the list of files in data_folder.
- Initialize an empty list list_dataframes.
- For each file in files:
  - Set full_path as the complete path of the file.
  - If full_path is a valid file and its extension matches extension:
    - If the extension is .xlsx:
      - Display the message "Reading file i + 1/len(files): file".
      - Read the file into a DataFrame df.
      - Append df to list_dataframes.
    - Else: continue to the next file.
  - If an error occurs while reading the file:
    - Display the message "Error reading file file: error_message".
- If list_dataframes is not empty:
  - Concatenate all DataFrames in list_dataframes into concatenated_data.
  - Display the message "All files have been processed!".
  - Return concatenated_data.
- Else:
  - Display the message "No files were found for processing".
  - Return None.
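A minimal Python sketch of this step, assuming pandas (with openpyxl for .xlsx reading) is available; the function and variable names follow the pseudocode but are otherwise illustrative:

```python
import os
import pandas as pd

def concatenate_files(data_folder, extension=".xlsx"):
    """Read every file with the given extension in data_folder and concatenate them."""
    files = os.listdir(data_folder)
    list_dataframes = []
    for i, file in enumerate(files):
        full_path = os.path.join(data_folder, file)
        if os.path.isfile(full_path) and full_path.endswith(extension):
            try:
                print(f"Reading file {i + 1}/{len(files)}: {file}")
                df = pd.read_excel(full_path)  # .xlsx export, one file per participant
                list_dataframes.append(df)
            except Exception as error_message:
                print(f"Error reading file {file}: {error_message}")
    if list_dataframes:
        concatenated_data = pd.concat(list_dataframes, ignore_index=True)
        print("All files have been processed!")
        return concatenated_data
    print("No files were found for processing")
    return None
```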
- Step 4. Data Preprocessing (a Python sketch is given after this list).
- data: DataFrame containing the data.
- delete_rows: dictionary where keys are column names and values are the row values to be deleted.
- filter_rows: dictionary where keys are column names and values are the row values of interest.
- select_columns: list containing the columns to be kept.
- processed_data: a DataFrame with the processed data.
- For each column, values in delete_rows:
  - Remove rows from data where the column contains any of the values.
- For each column, values in filter_rows:
  - Filter data to keep only rows where the column contains any of the values.
- Select the columns of interest in data using select_columns.
- Return processed_data, which is now the processed DataFrame.
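A sketch of this preprocessing function, reusing the processing_Data name referenced in Step 5; the implementation details are an assumption based on the pseudocode:

```python
def processing_Data(data, delete_rows, filter_rows, select_columns):
    """Remove unwanted rows, keep only rows of interest, and select columns."""
    processed_data = data.copy()
    # Remove rows whose value in the given column appears in the deletion list
    for column, values in delete_rows.items():
        processed_data = processed_data[~processed_data[column].isin(values)]
    # Keep only rows whose value in the given column appears in the list of interest
    for column, values in filter_rows.items():
        processed_data = processed_data[processed_data[column].isin(values)]
    # Keep only the columns of interest
    return processed_data[select_columns]
```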
- Step 5. Data Processing with Filters and Selection (a usage sketch is given after this list).
- Create a dictionary delete_rows:
  - Keys: 'Participant name'.
  - Values: ['Test XX 2', 'Participant_X3', 'NP_01', 'NP_01_2', 'NP_02'].
  - Purpose: to remove rows where the 'Participant name' matches any value in the list.
- Create a dictionary filter_rows:
  - Keys: 'Sensor'.
  - Values: ['Mouse', 'GSR'].
  - Purpose: to filter rows so that the 'Sensor' column contains only 'Mouse' and 'GSR'.
- Create a list select_columns:
  - ['Participant name', 'Gender', 'Audio', 'Recording duration', 'Eye movement type', 'Galvanic skin response (GSR)'].
  - Purpose: to select the columns of interest from the DataFrame.
- Apply the processing:
  - Call the processing_Data function with the parameters data, delete_rows, filter_rows, and select_columns.
- Return the processed data as processed_data.
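A usage sketch of Step 5, assuming data is the concatenated DataFrame from Step 3 and processing_Data is the function sketched above:

```python
delete_rows = {"Participant name": ["Test XX 2", "Participant_X3", "NP_01", "NP_01_2", "NP_02"]}
filter_rows = {"Sensor": ["Mouse", "GSR"]}
select_columns = ["Participant name", "Gender", "Audio", "Recording duration",
                  "Eye movement type", "Galvanic skin response (GSR)"]

processed_data = processing_Data(data, delete_rows, filter_rows, select_columns)
```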
- Step 6. Data grouping and aggregation (a Python sketch is given after this list).
- data: DataFrame containing participant data.
- grouped: DataFrame with data grouped by 'Participant name' and aggregated metrics.
- Group the data by 'Participant name', applying the following aggregation functions:
  - Gender: select the first value (assuming gender is consistent across all rows).
  - Audio: select the first value (assuming the audio characteristic is consistent across all rows).
  - Recording duration: select the first value (assuming test duration is the same across all rows).
  - Eye movement type:
    - If both 'Fixation' and 'Saccade' exist in the column, compute the ratio of 'Fixation' events to 'Saccade' events.
    - Else: return None.
  - Galvanic skin response (GSR): compute the mean value for GSR.
- Reset the index of the grouped DataFrame.
- Rename the 'Eye movement type' column to 'Fixation to Saccade Ratio' for a more descriptive name.
- Return the grouped DataFrame.
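A pandas sketch of this aggregation, assuming the column names listed above; the helper name fixation_saccade_ratio is illustrative:

```python
def fixation_saccade_ratio(series):
    """Ratio of 'Fixation' to 'Saccade' events, or None if either is missing."""
    counts = series.value_counts()
    if "Fixation" in counts and "Saccade" in counts:
        return counts["Fixation"] / counts["Saccade"]
    return None

grouped = (
    processed_data.groupby("Participant name")
    .agg({
        "Gender": "first",
        "Audio": "first",
        "Recording duration": "first",
        "Eye movement type": fixation_saccade_ratio,
        "Galvanic skin response (GSR)": "mean",
    })
    .reset_index()
    .rename(columns={"Eye movement type": "Fixation to Saccade Ratio"})
)
```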
- Step 7. Load data.
- location: file path of the Excel file.
- data: DataFrame loaded from the Excel file.
- Load the Excel file from location into a DataFrame data.
- Return the data.
- Step 8. Combine data (a sketch covering Steps 7 and 8 is given after this list).
- data1: first DataFrame.
- data2: second DataFrame.
- column: column name on which to merge both DataFrames.
- data: merged DataFrame based on the common column.
- Merge data1 and data2 on the column.
- Return the merged data.
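Steps 7 and 8 map directly onto pandas read and merge operations; a minimal sketch, in which the example file name and key column are hypothetical:

```python
import pandas as pd

def load_data(location):
    """Load an Excel file into a DataFrame."""
    return pd.read_excel(location)

def combine_data(data1, data2, column):
    """Merge two DataFrames on a common column."""
    return data1.merge(data2, on=column)

# Hypothetical usage:
# extra = load_data("participant_characteristics.xlsx")
# data_combined = combine_data(grouped, extra, "Participant name")
```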
- Step 9. Clustering preparation (a Python sketch is given after this list).
- data: DataFrame containing the original data.
- data_normalized: DataFrame with encoded, imputed, and normalized data ready for clustering.
- Detect Boolean variables:
  - Identify Boolean columns.
  - Store these column names in booleans.
- Detect non-Boolean categorical variables:
  - Identify categorical columns.
  - Store these column names in categorical.
- Encode categorical variables:
  - Convert categorical variables into Boolean (binary) variables.
  - Store the result in data_encoded.
- Convert Boolean columns to binary values:
  - For each column in booleans:
    - Convert the column values to binary (0 or 1).
- Handle missing values:
  - Apply a mean imputation strategy to fill missing values:
    - Impute missing values using the mean in data_encoded.
    - Store the imputed data in data_filled.
- Normalize the data:
  - Apply StandardScaler to normalize the data:
    - Scale the imputed data so that all features have the same scale.
    - Store the normalized data in data_normalized.
- Return data_normalized.
- Final Step: Apply Clustering Preparation to Combined Data.
- Call clustering_preparation(data_combined) to normalize and prepare the data for clustering.
- Return data_normalized.
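A sketch of the clustering_preparation routine; since the pseudocode does not name a specific encoder, one-hot encoding with pandas get_dummies is assumed here for the categorical variables:

```python
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

def clustering_preparation(data):
    """Encode, impute, and normalize a DataFrame so that it is ready for clustering."""
    # Detect Boolean and non-Boolean categorical columns
    booleans = data.select_dtypes(include="bool").columns.tolist()
    categorical = data.select_dtypes(include=["object", "category"]).columns.tolist()
    # One-hot encode the categorical variables into binary indicator columns
    data_encoded = pd.get_dummies(data, columns=categorical)
    # Convert the original Boolean columns to 0/1
    for column in booleans:
        data_encoded[column] = data_encoded[column].astype(int)
    # Fill missing values with the column mean
    data_filled = SimpleImputer(strategy="mean").fit_transform(data_encoded)
    # Scale every feature to zero mean and unit variance
    data_normalized = pd.DataFrame(
        StandardScaler().fit_transform(data_filled),
        columns=data_encoded.columns,
    )
    return data_normalized

# Final step: prepare the combined data for clustering
# data_normalized = clustering_preparation(data_combined)
```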
- Algorithm: K-Means Clustering (Elbow Method); a Python sketch is given after this list.
- data: DataFrame containing the preprocessed data for clustering.
- k_range: list of values representing the range of potential cluster numbers.
- inertia: list of inertia values for different cluster sizes.
- Initialize an empty list inertia to store the inertia values for each K.
- For each value of k in k_range:
  - Create a KMeans object with the following parameters:
    - n_clusters = k: specifies the number of clusters.
    - random_state = 42: ensures reproducibility.
    - n_init = 10: number of times the KMeans algorithm will be run with different initializations.
  - Fit the KMeans model to data using the fit method.
  - Append the inertia value of the current KMeans model to the inertia list.
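A scikit-learn sketch of the elbow computation, assuming data_normalized from the previous sketch; the candidate range of k values is illustrative, as the pseudocode does not fix it:

```python
from sklearn.cluster import KMeans
import matplotlib.pyplot as plt

k_range = range(2, 11)  # illustrative range of candidate cluster numbers
inertia = []
for k in k_range:
    kmeans = KMeans(n_clusters=k, random_state=42, n_init=10)
    kmeans.fit(data_normalized)
    inertia.append(kmeans.inertia_)

# Elbow plot: choose the k at which inertia stops decreasing sharply
plt.plot(list(k_range), inertia, marker="o")
plt.xlabel("Number of clusters (k)")
plt.ylabel("Inertia")
plt.show()
```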
- Algorithm: K-Means Clustering Visualization; a Python sketch is given after this list.
- data: original DataFrame with all data (including the cluster assignment).
- normalised_data: data that has been preprocessed and normalized for clustering.
- delete_columns: list of columns to be removed (if necessary) before visualization.
- clusters: the number of clusters from the KMeans model.
- A plot showing relationships between the variables involved in clustering.
- Initialize a figure for the plot with size (12, 12).
- Create a pairplot using Seaborn (sns.pairplot):
  - Set data as the source DataFrame.
  - Specify the variables to plot using numerical_cols (the numerical columns involved in the clustering).
  - Set hue to 'Cluster' to color the points by their assigned cluster.
  - Use palette='tab10' to specify a color palette for the clusters.
  - Set plot_kws with additional plot options:
    - alpha = 0.6: transparency of the points.
    - s = 100: size of the scatter plot points.
    - edgecolor = 'w': white edge around the points.
- Add a title to the plot:
  - Title format: "Relationships between variables involved in clustering with KMeans({clusters} clusters)".
  - Adjust the title position with y = 1.02.
- Save the plot as a PDF file:
  - File name: Variables_KMeans_{clusters}Cluster.pdf.
- Show the plot using plt.show().
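A seaborn sketch of this visualization, assuming data already carries a 'Cluster' column (e.g., taken from kmeans.labels_) and that numerical_cols lists the clustering variables:

```python
import seaborn as sns
import matplotlib.pyplot as plt

def plot_clusters(data, numerical_cols, clusters):
    """Pairplot of the clustering variables, coloured by the assigned cluster."""
    sns.pairplot(
        data,
        vars=numerical_cols,           # numerical columns involved in the clustering
        hue="Cluster",                 # colour points by cluster assignment
        palette="tab10",
        plot_kws={"alpha": 0.6, "s": 100, "edgecolor": "w"},
    )
    plt.suptitle(
        f"Relationships between variables involved in clustering with KMeans({clusters} clusters)",
        y=1.02,
    )
    plt.savefig(f"Variables_KMeans_{clusters}Cluster.pdf", bbox_inches="tight")
    plt.show()
```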
- Step 10. Linear Regression and Feature Impact Visualization (a Python sketch is given after this list).
- data: DataFrame containing the feature variables and the target variable.
- target_column: the column name of the target variable to be predicted.
- df_coefficients: DataFrame containing the coefficients and p-values of each feature.
- Split the DataFrame into features (X) and target variable (y):
  - Set y as the target variable column.
  - Set X as the remaining features.
- Scale the features using StandardScaler:
  - Apply StandardScaler to scale the feature values to the same scale.
  - Store the scaled data in X_scaled.
- Initialize the Linear Regression model:
  - Create a LinearRegression object.
- Create empty lists to store coefficients and p-values:
  - coefficients: stores the regression coefficients (weights).
  - p_values: stores the p-values for each coefficient, indicating their statistical significance.
- Fit the linear regression model for each feature:
  - For each feature column in X:
    - Extract the feature column from X_scaled: X_temporal = X_scaled[:, [i]].
    - Fit the model.
    - Append the coefficient (model.coef_[0]) to coefficients.
    - Add a constant column to X_temporal for calculating the intercept: X_temporal = sm.add_constant(X_temporal).
    - Create and fit the OLS model.
    - Append the p-value (model_sm.pvalues[1]) to p_values.
- Create a DataFrame with the coefficients and p-values:
  - Store the results in df_coefficients.
- Sort the DataFrame by the absolute value of the coefficient:
  - Sort df_coefficients by coefficient magnitude.
- Visualize the regression coefficients:
  - Create a figure for the bar plot.
  - Use Seaborn to create a bar plot.
  - Set plot titles and axis labels.
  - Save the plot as a PDF.
  - Show the plot.
- Return the DataFrame with the sorted coefficients and p-values (df_coefficients).
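A sketch of this single-feature regression loop using scikit-learn and statsmodels; the plot file name, figure size, and axis labels are illustrative:

```python
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
import statsmodels.api as sm
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

def linear_regression_impact(data, target_column):
    """Per-feature linear regression: standardized coefficients and p-values."""
    y = np.asarray(data[target_column])
    X = data.drop(columns=[target_column])
    X_scaled = StandardScaler().fit_transform(X)

    coefficients, p_values = [], []
    for i in range(X.shape[1]):
        X_temporal = X_scaled[:, [i]]                      # single scaled feature
        model = LinearRegression().fit(X_temporal, y)
        coefficients.append(model.coef_[0])
        # OLS with an added constant to obtain the p-value of the feature
        model_sm = sm.OLS(y, sm.add_constant(X_temporal)).fit()
        p_values.append(model_sm.pvalues[1])

    df_coefficients = pd.DataFrame(
        {"Feature": X.columns, "Coefficient": coefficients, "p-value": p_values}
    ).sort_values("Coefficient", key=np.abs, ascending=False)

    # Bar plot of the coefficients, one bar per feature
    plt.figure(figsize=(8, 5))
    sns.barplot(data=df_coefficients, x="Coefficient", y="Feature")
    plt.title(f"Feature impact on {target_column}")
    plt.xlabel("Standardized coefficient")
    plt.tight_layout()
    plt.savefig("Linear_regression_coefficients.pdf")
    plt.show()
    return df_coefficients
```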
4. Discussion
5. Conclusions
5.1. Study Limitations
5.2. Future Lines of Research
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
Appendix A
References
- Retamosa, M.; Aliagas, I.; Millán, A. Displaying ingredients on healthy snack packaging: A study on visual attention, choice, and purchase intention. J. Sens. Stud. 2024, 39, e12944. [Google Scholar] [CrossRef]
- Bajaj, R.; Ali Syed, A.; Singh, S. Analysing applications of neuromarketing in efficacy of programmatic advertising. J. Consum. Behav. 2024, 23, 939–958. [Google Scholar] [CrossRef]
- Bhardwaj, S.; Thapa, S.B.; Gandhi, A. Advances in neuromarketing and improved understanding of consumer behaviour: Analysing tool possibilities and research trends. Cogent Bus. Manag. 2024, 11, 2376773. [Google Scholar] [CrossRef]
- Calderón-Fajardo, V.; Anaya-Sánchez, R.; Rejón-Guardia, F.; Molinillo, S. Neurotourism Insights: Eye Tracking and Galvanic Analysis of Tourism Destination Brand Logos and AI Visuals. Tour. Manag. Stud. 2024, 20, 53–78. [Google Scholar] [CrossRef]
- Modi, N.; Singh, J. An analysis of perfume packaging designs on consumer’s cognitive and emotional behavior using eye gaze tracking. Multimed. Tools Appl. 2024, 83, 82563–82588. [Google Scholar] [CrossRef]
- Thiebaut, R.; Elshourbagi, A. The Effect of Neuromarketing and Subconscious Branding on Business Profitability and Brand Image: A New Business Model Innovation for Startups. In Fostering Global Entrepreneurship Through Business Model Innovation; Gupta, V., Ed.; IGI Global: Hershey, PA, USA, 2024; pp. 217–252. [Google Scholar] [CrossRef]
- Cong, L.; Luan, S.; Young, E.; Mirosa, M.; Bremer, P.; Torrico, D.D. The Application of Biometric Approaches in Agri-Food Marketing: A Systematic Literature Review. Foods 2023, 12, 2982. [Google Scholar] [CrossRef] [PubMed]
- Al-Nafjan, A.; Aldayel, M.; Kharrat, A. Systematic Review and Future Direction of Neuro-Tourism Research. Brain Sci. 2023, 13, 682. [Google Scholar] [CrossRef] [PubMed]
- Khondakar, F.K.; Sarowar, H.; Chowdhury, M.H.; Majumder, S.; Hossain, A.; Dewan, M.A.A.; Hossain, Q.D. A systematic review on EEG-based neuromarketing: Recent trends and analyzing techniques. Brain Inform. 2024, 11, 17. [Google Scholar] [CrossRef]
- Tavares-Filho, E.R.; Hidalgo, L.G.S.; Lima, L.M.; Spers, E.E.; Pimentel, T.C.; Esmerino, E.A.; Cruz, A.G. Impact of animal origin of milk, processing technology, type of product, and price on the Boursin cheese choice process: Insights of a discrete choice experiment and eye tracking. J. Food Sci. 2024, 89, 640–655. [Google Scholar] [CrossRef]
- Madlenak, R.; Chinoracky, R.; Stalmasekova, N.; Madlenakova, L. Investigating the Effect of Outdoor Advertising on Consumer Decisions: An Eye-Tracking and A/B Testing Study of Car Drivers’ Perception. Appl. Sci. 2023, 13, 6808. [Google Scholar] [CrossRef]
- Kim, M.; Lee, J.; Lee, S.Y.; Ha, M.; Park, I.; Jang, J.; Jang, M.; Park, S.; Kwon, J.S. Development of an eye-tracking system based on a deep learning model to assess executive function in patients with mental illnesses. Sci. Rep. 2024, 14, 18186. [Google Scholar] [CrossRef] [PubMed]
- Perkovich, E.; Laakman, A.; Mire, S.; Yoshida, H. Conducting head-mounted eye-tracking research with young children with autism and children with increased likelihood of later autism diagnosis. J. Neurodev. Disord. 2024, 16, 7. [Google Scholar] [CrossRef] [PubMed]
- Amirbay, A.; Mukhanova, A.; Baigabylov, N.; Kudabekov, M.; Mukhambetova, K.; Baigusheva, K.; Baibulova, M.; Ospanova, T. Development of an algorithm for identifying the autism spectrum based on features using deep learning methods. Int. J. Electr. Comput. Eng. (IJECE) 2024, 14, 5513–5523. [Google Scholar] [CrossRef]
- Bent, C.; Glencross, S.; McKinnon, K.; Hudry, K.; Dissanayake, C.; The Victorian ASELCC Team; Vivanti, G. Predictors of Developmental and Adaptive Behaviour Outcomes in Response to Early Intensive Behavioural Intervention and the Early Start Denver Model. J. Autism Dev. Disord. 2024, 54, 2668–2681. [Google Scholar] [CrossRef]
- Ibragimov, B.; Mello-Thoms, C. The Use of Machine Learning in Eye Tracking Studies in Medical Imaging: A Review. IEEE J. Biomed. Health Inform. 2024, 28, 3597–3612. [Google Scholar] [CrossRef] [PubMed]
- Mehmood, I.; Li, H.; Umer, W.; Tariq, S.; Wu, H. Non-invasive detection of mental fatigue in construction equipment operators through geometric measurements of facial features. J. Saf. Res. 2024, 89, 234–250. [Google Scholar] [CrossRef]
- Schmidt, A.; Mohareri, O.; DiMaio, S.; Yip, M.C.; Salcudean, S.E. Tracking and mapping in medical computer vision: A review. Med. Image Anal. 2024, 94, 103131. [Google Scholar] [CrossRef] [PubMed]
- Boujelbane, M.A.; Trabelsi, K.; Salem, A.; Ammar, A.; Glenn, J.M.; Boukhris, O.; AlRashid, M.M.; Jahrami, H.; Chtourou, H. Eye Tracking During Visual Paired-Comparison Tasks: A Systematic Review and Meta-Analysis of the Diagnostic Test Accuracy for Detecting Cognitive Decline. J. Alzheimer’s Dis. 2024, 99, 207–221. [Google Scholar] [CrossRef]
- Klotzek, A.; Jemni, M.; Groves, S.J.; Carrick, F.R. Effects of Cervical Spinal Manipulation on Saccadic Eye Movements. Brain Sci. 2024, 14, 292. [Google Scholar] [CrossRef]
- Pauszek, J.R. An introduction to eye tracking in human factors healthcare research and medical device testing. Hum. Factors Healthc. 2023, 3, 100031. [Google Scholar] [CrossRef]
- Passaro, A.; Zullo, A.; Di Gioia, M.; Curcio, E.; Stasolla, F. A Narrative Review on the Use of Eye-Tracking in Rett Syndrome: Implications for Diagnosis and Treatment. OBM Genet. 2024, 8, 250. [Google Scholar] [CrossRef]
- Hill, W.; Lindner, H. Using eye tracking to assess learning of a multifunction prosthetic hand: An exploratory study from a rehabilitation perspective. J. Neuroeng. Rehabil. 2024, 21, 148. [Google Scholar] [CrossRef] [PubMed]
- Pulay, M.Á.; Szabó, É.F. Developing Visual Perceptual Skills with Assistive Technology Supported Application for Children with Cerebral Palsy. Acta Polytech. Hung. 2024, 21, 25–38. [Google Scholar] [CrossRef]
- Feldmann, L.; Zsigo, C.; Mörtl, I.; Bartling, J.; Wachinger, C.; Oort, F.; Schulte-Körne, G.; Greimel, E. Emotion regulation in adolescents with major depression—Evidence from a combined EEG and eye-tracking study. J. Affect. Disord. 2023, 340, 899–906. [Google Scholar] [CrossRef] [PubMed]
- Tao, Z.; Sun, N.; Yuan, Z.; Chen, Z.; Liu, J.; Wang, C.; Li, S.; Ma, X.; Ji, B.; Li, K. Research on a New Intelligent and Rapid Screening Method for Depression Risk in Young People Based on Eye Tracking Technology. Brain Sci. 2023, 13, 1415. [Google Scholar] [CrossRef]
- Brien, D.C.; Riek, H.C.; Yep, R.; Huang, J.; Coe, B.; Areshenkoff, C.; Grimes, D.; Jog, M.; Lang, A.; Marras, C.; et al. Classification and staging of Parkinson’s disease using video-based eye tracking. Park. Relat. Disord. 2023, 110, 105316. [Google Scholar] [CrossRef] [PubMed]
- Ghiţă, A.; Hernández-Serrano, O.; Moreno, M.; Monràs, M.; Gual, A.; Maurage, P.; Gacto-Sánchez, M.; Ferrer-García, M.; Porras-García, B.; Gutiérrez-Maldonado, J. Exploring Attentional Bias toward Alcohol Content: Insights from Eye-Movement Activity. Eur. Addict. Res. 2024, 30, 65–79. [Google Scholar] [CrossRef]
- Puttevils, L.; De Bruecker, M.; Allaert, J.; Sanchez-Lopez, A.; De Schryver, N.; Vervaet, M.; Baeken, C.; Vanderhasselt, M.-A. Attentional bias to food during free and instructed viewing in anorexia nervosa: An eye tracking study. J. Psychiatr. Res. 2023, 164, 468–476. [Google Scholar] [CrossRef]
- Guo, X.; Liu, Y.; Tan, Y.; Xia, Z.; Fu, H. Hazard identification performance comparison between virtual reality and traditional construction safety training modes for different learning style individuals. Saf. Sci. 2024, 180, 106644. [Google Scholar] [CrossRef]
- Virlet, L.; Sparrow, L.; Barela, J.; Berquin, P.; Bonnet, C. Proprioceptive intervention improves reading performance in developmental dyslexia: An eye-tracking study. Res. Dev. Disabil. 2024, 153, 104813. [Google Scholar] [CrossRef]
- Liang, Z.; Ga, R.; Bai, H.; Zhao, Q.; Wang, G.; Lai, Q.; Chen, S.; Yu, Q.; Zhou, Z. Teaching expectancy improves video-based learning: Evidence from eye-movement synchronization. Br. J. Educ. Technol. 2024. [Google Scholar] [CrossRef]
- Kok, E.M.; Niehorster, D.C.; van der Gijp, A.; Rutgers, D.R.; Auffermann, W.F.; van der Schaaf, M.; Kester, L.; van Gog, T. The effects of gaze-display feedback on medical students’ self-monitoring and learning in radiology. Adv. Health Sci. Educ. 2024. online ahead of print. [Google Scholar] [CrossRef] [PubMed]
- Sáiz-Manzanares, M.C.; Marticorena-Sánchez, R.; Escolar-Llamazares, M.C.; González-Díez, I.; Martín Antón, L.J. Using integrated multimodal technology: A way to personalised learning in Health Sciences and Biomedical engineering Students. Appl. Sci. 2024, 14, 7017. [Google Scholar] [CrossRef]
- Mullen, B. Self-Attention Theory: The Effects of Group Composition on the Individual. In Theories of Group Behavior; Springer Series in Social, Psychology; Mullen, B., Goethals, G.R., Eds.; Springer: New York, NY, USA, 1987. [Google Scholar] [CrossRef]
- Korteland, R.J.; Kok, E.; Hulshof, C.; van Gog, T. Teaching through their eyes: Effects on optometry teachers’ adaptivity and students’ learning when teachers see students’ gaze. Adv. Health Sci. Educ. 2024. [Google Scholar] [CrossRef]
- Sáiz-Manzanares, M.C.; Marticorena-Sánchez, R.; Martín-Antón, L.J.; González-Diez, I.; Carbonero-Martín, I. Using eye tracking technology to analyse cognitive load in multichannel activities in university students. Int. J. Hum. Comput. Interact. 2024, 40, 3263–3281. [Google Scholar] [CrossRef]
- Wronski, P.; Wensing, M.; Ghosh, S.; Gärttner, L.; Müller, W.; Koetsenruijter, J. Use of a quantitative data report in a hypothetical decision scenario for health policymaking: A computer-assisted laboratory study. BMC Med. Inform. Decis. Mak. 2021, 21, 32. [Google Scholar] [CrossRef]
- Lee, Y.; de Jong, N.; Donkers, J.; Jarodzka, H.; van Merriënboer, J.G. Measuring Cognitive Load in Virtual Reality Training via Pupillometry. IEEE Trans. Learn. Technol. 2024, 17, 704–710. [Google Scholar] [CrossRef]
- Wang, Y.; Lu, Y.; Shen, C.-Y.; Luo, S.-J.; Zhang, L.-Y. Exploring product style perception: A comparative eye-tracking analysis of users across varying levels of self-monitoring. Displays 2024, 84, 102790. [Google Scholar] [CrossRef]
- Cazes, M.; Noël, A.; Jamet, E. Cognitive effects of humorous drawings on learning: An eye-tracking study. Appl. Cogn. Psychol. 2024, 38, e4178. [Google Scholar] [CrossRef]
- Tarkowski, S.; Caban, J.; Dzieńkowski, M.; Nieoczym, A.; Zarajczyk, J. Driver’s distraction and its potential influence on the extension of reaction time. Arch. Automot. Eng. 2022, 98, 65–78. [Google Scholar] [CrossRef]
- Cheng, G.; Di, Z.; Xie, H.; Wang, F.L. Exploring differences in self-regulated learning strategy use between high- and low-performing students in introductory programming: An analysis of eye-tracking and retrospective think-aloud data from program comprehension. Comput. Educ. 2024, 208, 104948. [Google Scholar] [CrossRef]
- Omobolanle, O.; Abiola, A.; Nihar, G.; Mohammad, K.; Abiola, A. Detecting Learning Stages within a Sensor-Based Mixed Reality Learning Environment Using Deep Learning. J. Comput. Civ. Eng. 2024, 37, 04023011. [Google Scholar] [CrossRef]
- Bouwer, R.; Dirkx, K. The eye-mind of processing written feedback for revision. Learn. Instr. 2024, 85, 101745. [Google Scholar] [CrossRef]
- Ferreira, C.P.; Soledad Gonzalez-Gonzalez, C.; Francisca Adamatti, D.; Moreira, F. Analysis Learning Model with Biometric Devices for Business Simulation Games: Brazilian Case Study. IEEE Access 2024, 12, 95548–95564. [Google Scholar] [CrossRef]
- Sáiz-Manzanares, M.C.; Marticorena-Sánchez, R.; Martín-Antón, L.J.; Almeida, L.; Carbonero-Martín, I. Application and challenges of eye tracking technology in Higher Education. Comunicar 2023, 76, 1–12. [Google Scholar] [CrossRef]
- Sáiz-Manzanares, M.C.; Ramos Pérez, I.; Arnaiz-Rodríguez, Á.; Rodríguez-Arribas, S.; Almeida, L.; Martin, C.F. Analysis of the learning process through eye tracking technology and feature selection techniques. Appl. Sci. 2021, 11, 6157. [Google Scholar] [CrossRef]
- Holmqvist, K.L.U.; Nyström, M.L.U.; Andersson, R.L.U.; Dewhurst, R.L.U.; Halszka, J.; van de Weijer, J.L. Eye Tracking: A Comprehensive Guide to Methods and Measures; Oxford University Press: Oxford, UK, 2011. [Google Scholar]
- Vortmann, L.-M.; Putze, F. Combining Implicit and Explicit Feature Extraction for Eye Tracking: Attention Classification Using a Heterogeneous Input. Sensors 2021, 21, 8205. [Google Scholar] [CrossRef]
- Sáiz Manzanares, M.C.; Rodríguez Diez, J.J.; Marticorena Sánchez, R.; Zaparaín Yáñez, M.J.; Cerezo Menéndez, R. Lifelong Learning from Sustainable Education: An Analysis with Eye Tracking and Data Mining Techniques. Sustainability 2020, 12, 1970. [Google Scholar] [CrossRef]
- García, S.; Luengo, J.; Herrera, F. Data Preprocessing in Data Mining; Springer: London, UK, 2015. [Google Scholar] [CrossRef]
- Thilderkvist, E.; Dobslaw, F. On current limitations of online eye-tracking to study the visual processing of source code. Inf. Softw. Technol. 2024, 174, 107502. [Google Scholar] [CrossRef]
- Cho, S.-W.; Lim, Y.-H.; Seo, K.-M.; Kim, J. Integration of eye-tracking and object detection in a deep learning system for quality inspection analysis. J. Comput. Des. Eng. 2024, 11, 158–173. [Google Scholar] [CrossRef]
- Liu, Z.; Yeh, W.-C.; Lin, K.-Y.; Lin, C.-S.H.; Chang, C.-Y. Machine learning based approach for exploring online shopping behavior and preferences with eye tracking. Comput. Sci. Inf. Syst. 2024, 21, 593–623. [Google Scholar] [CrossRef]
- Zhang, H.; Wang, C. Integrated neural network-based pupil tracking technology for wearable gaze tracking devices in flight training. IEEE Access 2024, 12, 133234–133244. [Google Scholar] [CrossRef]
- Born, J.; Ram, B.; Ramachandran, N.; Romero Pinto, S.A.; Winkler, S.; Ratnam, R. Multimodal Study of the Effects of Varying Task Load Utilizing EEG, GSR and Eye-Tracking. bioRxiv 2019, 798496. [Google Scholar] [CrossRef]
- Lindsay, G.W. Attention in Psychology, Neuroscience, and Machine Learning. Front. Comput. Neurosci. 2024, 14. [Google Scholar] [CrossRef]
- Tobii AB Corp. Tobii Pro Lab [Computer Software], version 1.241.54542; Tobii Corp: Danderyd, Sweden, 2024. [Google Scholar]
- Grabinger, L.; Hauser, F.; Wolff, C.; Mottok, J. On Eye Tracking in Software Engineering. SN Comput. Sci. 2024, 5, 729. [Google Scholar] [CrossRef]
- Shimmer3 GSR. Shimmer [Computer Software], version 3; Shimmer Corp.: Dublin, Ireland; Boston, MA, USA, 2024. [Google Scholar]
- Sáiz-Manzanares, M.C.; Marticorena-Sánchez, R. Manual for the Development of Self-Regulated Virtual Laboratories; Servicio de Publicaciones de la Universidad de Burgos: Burgos, Spain, 2024. [Google Scholar] [CrossRef]
Metric | Unit of Measurement | Meaning | Interpretation in the Context of Human Learning |
---|---|---|---|
Fixation count | Count | Number of fixations on a stimulus or part of a stimulus. | A higher number of fixations may be related to a difficulty in processing that information because it is novel or difficult for the learner to process. |
Fixation duration | Milliseconds | Duration of fixation. | Refers to the reaction times of the learner. A longer duration may be related to a higher cognitive load in the processing of the stimulus. It may also be related to the use of metacognitive orientation strategies, i.e., searching for information or relating it to previous knowledge. |
Saccade count | Count | Refers to the shift of gaze from one part of the stimulus to another. | A higher number of saccades implies the use of metacognitive orientation strategies, i.e., searching for information or relating it to prior knowledge. Likewise, the greater the amplitude of the saccade, the lower the cognitive effort, although this may also be related to information processing problems. Younger learners apply shorter saccades. |
Pupil diameter | Millimetres | The mean pupil diameter is collected for all fixations within an AOI during a time interval. | It refers to the interest that a stimulus or part of it has for the learner. A larger pupil diameter may be related to increased cognitive load and/or difficulty in processing a task. |
Number of visits | Count | Number of visits within a stimulus or part of a stimulus (an area of interest, or AOI). | Refers to attention to a stimulus or part of a stimulus. |
Scan Path Length or Gaze Point | X and Y position coordinates | Refers to the chain of fixations in order of succession. | It involves a pattern of visual tracking on a stimulus. It gives information about how each learner processes information. |
SCR count | Count | The number of skin conductance responses (SCRs), for each Interval in Time of Interest. | It provides information about the emotional state of a learner. The SCR count can be used to identify which specific moments of a dynamic stimulus and specific information within a stimulus elicit an emotional response in a learner. A higher SCR count indicates a higher level of emotional arousal. |
ER SCR Amplitude | Microsiemens | The amplitude of each event-related skin conductance response (ER-SCR), for each Interval in Time of Interest. ER-SCRs are calculated using filtered GSR data. | When an SCR occurs between 1 and 5 s after an event (ER-SCR), the stimulus is considered to have elicited an emotional reaction in the learner. This measure can provide information about different emotional states such as anxiety, stress, frustration, or relaxation. |
GSR average | Microsiemens | The mean of the average galvanic skin response (GSR) signal, after filtering for each time of interest. | When environmental conditions are held constant, slow fluctuations in the GSR signal (between seconds and minutes) reflect changes in the participant’s emotional arousal. The researcher can use the average GSR metric in different sections of the session to determine whether a learner might be stressed, frustrated, or relaxed during the course of an experiment. |
Metric | Meaning | Unit of Measurement |
---|---|---|
Fixation point | The normalized horizontal and vertical coordinate of the fixation point. | Normalized coordinates (DACS) |
Average pupil diameter | The average diameter of the pupil during the fixation, calculated from the pupil diameter samples after filtering. | Millimeters |
Saccade direction | The angle of the straight line between the preceding fixation and the succeeding fixation. This can only be applied to whole saccades. | Degrees |
Average velocity | The average velocity across all samples belonging to the saccade, even outside the interval. | Degrees/second |
Peak velocity | The maximum velocity across all samples belonging to the saccade, even outside the interval. | Degrees/second |
Saccade amplitude | The amplitude for whole saccades. | Degrees |
Mouse position | The position of the mouse. | Pixels (DACS) |
Galvanic skin response (GSR) | The raw galvanic skin response signal of the participant. | Microsiemens |
Average GSR | The average galvanic skin response (GSR) signal, after filtering, for an interval. | Microsiemens |
Number of SCR | The number of skin conductance responses (SCRs) for an interval. | Count |
Amplitude of event related SCR | The amplitude of each event-related skin conductance response (ER-SCR), for an interval. ER-SCRs are calculated using filtered GSR data. | Microsiemens |
Variables (Characteristics) | Standardized Beta Coefficients | Standard Errors of Coefficients | p-Value |
---|---|---|---|
Type of presentation | 0.48 | 0.23 | 0.07 |
Recording duration | −0.45 | 0.23 | 0.07 |
Gender | −0.15 | 0.25 | 0.57 |
GSR | −0.16 | 0.24 | 0.73 |
Fixation saccade ratio | −0.19 | 0.24 | 0.52 |