This is an early access version; the complete PDF, HTML, and XML versions will be available soon.
Article

The Value of Individual Screen Response Time in Predicting Student Test Performance: Evidence from TIMSS 2019 Problem Solving and Inquiry Tasks

1 Measurement, Evaluation, and Data Science, Faculty of Education, University of Alberta, Edmonton, AB T6G 2G5, Canada
2 Centre for Research in Applied Measurement and Evaluation, Faculty of Education, University of Alberta, Edmonton, AB T6G 2G5, Canada
* Author to whom correspondence should be addressed.
J. Intell. 2025, 13(7), 82; https://doi.org/10.3390/jintelligence13070082
Submission received: 7 March 2025 / Revised: 23 June 2025 / Accepted: 4 July 2025 / Published: 6 July 2025
(This article belongs to the Section Contributions to the Measurement of Intelligence)

Abstract

The time students spend answering a test item (i.e., response time) and its relationship to performance can vary significantly from one item to another. Thus, using total or average response time across all items to predict overall test performance may lead to a loss of information, particularly with respect to within-person variability, which refers to fluctuations in a student's standardized response times across different items. This study aims to demonstrate the predictive and explanatory value of including within-person variability in predicting and explaining students' test scores. The data came from 13,829 fourth-grade students who completed the mathematics portion of the Problem Solving and Inquiry (PSI) tasks in the 2019 Trends in International Mathematics and Science Study (TIMSS). In this assessment, students navigated through a sequence of interactive screens, each containing one or more related items, while response time was recorded at the screen level. This study used a profile analysis approach to show that students' standardized response times—used as a practical approximation of item-level timing—varied substantially across screens, indicating within-person variability. We further decomposed the predictive power of response time for overall test performance into a pattern effect (the predictive power of within-person variability in response time) and a level effect (the predictive power of the average response time). Results show that the pattern effect significantly outweighed the level effect, indicating that most of the predictive power of response time comes from within-person variability. Additionally, each screen response time had unique predictive power for performance, with the relationship varying in strength and direction. This finding suggests that fine-grained response time data can provide richer information for inferring students' response processes during the test. Cross-validation and analyses across different achievement groups confirmed the consistency of results regarding the predictive and explanatory value of within-person variability. These findings offer implications for the design and administration of future educational assessments, highlighting the potential benefits of collecting and analyzing more fine-grained response time data as a predictor of test performance.
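The level/pattern decomposition described in the abstract can be illustrated with a small sketch. This is a toy example on synthetic data, not the authors' actual analysis: the sample sizes, screen counts, weights, and variable names below are invented, and the regressions are plain OLS rather than the study's profile analysis models. The idea it demonstrates is the same, though: split each student's screen response times into a person-level mean (level) and the within-person deviations from that mean (pattern), then compare how much of the test-score variance each component predicts.

```python
import numpy as np

rng = np.random.default_rng(42)
n_students, n_screens = 500, 5  # hypothetical sizes, for illustration only

# Synthetic screen-level response times: a person-average speed component
# plus within-person fluctuations centered at zero for each student.
level = rng.normal(0.0, 1.0, size=n_students)
pattern = rng.normal(0.0, 1.0, size=(n_students, n_screens))
pattern -= pattern.mean(axis=1, keepdims=True)
rt = level[:, None] + pattern

# Simulated scores: in this toy setup, the within-person pattern carries
# most of the signal, with screen-specific weights varying in sign and
# strength (mirroring the abstract's finding of varying relationships).
w = np.array([0.8, -0.5, 0.3, -0.2, 0.6])
score = 0.1 * level + pattern @ w + rng.normal(0.0, 0.5, size=n_students)

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# Decompose observed response times into level (person mean across screens)
# and pattern (per-screen deviations from that mean).
obs_level = rt.mean(axis=1)
obs_pattern = rt - obs_level[:, None]

r2_level = r_squared(obs_level[:, None], score)   # level effect
r2_pattern = r_squared(obs_pattern, score)        # pattern effect
print(f"level effect R^2:   {r2_level:.3f}")
print(f"pattern effect R^2: {r2_pattern:.3f}")
```

In this synthetic setup the pattern effect's R² far exceeds the level effect's, which is the qualitative result the study reports; with real TIMSS data the decomposition would be fit within the study's profile analysis framework rather than with separate OLS regressions.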
Keywords: response time; process data; profile analysis; within-person variability; individual differences

Share and Cite

MDPI and ACS Style

Tan, B.; Bulut, O. The Value of Individual Screen Response Time in Predicting Student Test Performance: Evidence from TIMSS 2019 Problem Solving and Inquiry Tasks. J. Intell. 2025, 13, 82. https://doi.org/10.3390/jintelligence13070082


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
