Article

Spatial Reasoning and Its Contribution to Mathematical Performance Across Different Content Domains: Evidence from Chinese Students

1 School of Educational Science, Nantong University, Nantong 226019, China
2 College of Elementary Education, Capital Normal University, Beijing 100048, China
3 Faculty of Education, East China Normal University, Shanghai 200062, China
* Authors to whom correspondence should be addressed.
J. Intell. 2025, 13(4), 41; https://doi.org/10.3390/jintelligence13040041
Submission received: 18 November 2024 / Revised: 19 March 2025 / Accepted: 21 March 2025 / Published: 24 March 2025

Abstract

Recent studies have provided convincing evidence highlighting the strong relationship between spatial reasoning and mathematical performance. However, there is a limited body of research exploring the contributions of different spatial reasoning constructs to mathematical performance across various content domains, particularly within non-Western contexts. This study investigates the relationship between spatial reasoning skills—including mental rotation, spatial visualization, and spatial orientation—and mathematical performance across various domains (number, geometric shapes and measures, and data display) among Chinese elementary school students in grade four (ages 9–10). The results indicate that overall spatial reasoning significantly predicts mathematical performance across various domains. All three spatial reasoning constructs significantly contribute to performance in the number and geometric shapes and measures domains, with mental rotation and spatial orientation being the strongest predictors of performance in these respective content domains. For data display performance, spatial orientation and spatial visualization significantly contribute, with spatial visualization being the strongest predictor. Although no significant gender differences were found in the overall link between spatial reasoning and mathematical performance, subgroup regression analysis showed variations. For male students, spatial orientation was the main predictor across content areas. For female students, mental rotation was the key predictor for number and geometry, while spatial visualization was most significant for data display.

1. Introduction

Spatial reasoning has garnered increasing attention from researchers in education and psychology due to its critical role in students’ academic performance in STEM subjects and their future engagement in STEM careers (Lubinski 2010; Uttal et al. 2013; Wai et al. 2009). With increasing evidence highlighting the significant correlation between spatial reasoning and mathematical performance (Gunderson et al. 2012; Mix et al. 2016; Verdine et al. 2017; Frick 2019), considerable efforts have been directed toward enhancing students’ math skills through spatial training (Hawes et al. 2017; Lowrie et al. 2017, 2018, 2019, 2021). However, not all positive effects of spatial training have successfully transferred to mathematical performance (Hawes et al. 2015; Cornu et al. 2017). This may be due to the unclear relationship between different spatial reasoning constructs and their varying impacts on mathematical performance across different content domains, including both geometric and non-geometric areas. Cross-cultural comparative research shows that, whereas several Asian mathematics curricula place greater emphasis on geometry, space has received more curricular attention in Western contexts such as Australia (Lowrie et al. 2016). The conception and practice of spatial reasoning differ across cultural contexts. Considering that students from East Asian countries typically perform above the international average in mathematics and science assessments (OECD 2014, 2016, 2019), there is also a need for research in non-Western contexts, such as China, to provide a more comprehensive understanding of this relationship. Moreover, despite the growing recognition of spatial reasoning’s role in mathematical learning, there remains a clear need for further psychometric analyses of spatial reasoning measures (Uttal et al. 2024). Robust psychometric evaluation is crucial for accurately capturing students’ spatial abilities and their relationship with mathematical performance. To address these gaps, this study aimed to validate the modified spatial reasoning test for quality and applicability among Chinese students in grades 4 to 6 (ages 9–12), and to further investigate the nuanced relationships between spatial reasoning constructs and mathematical performance across different content domains among fourth grade students (ages 9–10).

1.1. Spatial Reasoning Constructs

Spatial reasoning has been a topic of significant research interest over a long period, yet its complexity has led to a lack of consensus on its precise definition and components. Nevertheless, it is widely recognized that spatial reasoning encompasses multiple sub-skills rather than being a singular capability. McGee (1979) was one of the earliest researchers to identify two primary components of spatial reasoning: spatial visualization and spatial orientation. Spatial visualization involves the mental manipulation of objects, including imagining rotations, twists, and transformations. Spatial orientation involves understanding the arrangement of elements within visual stimuli, not being confused by changes in direction, and recognizing spatial positions relative to one’s body. Linn and Petersen (1985) later expanded this understanding, categorizing spatial ability into three factors: spatial perception, mental rotation, and spatial visualization. They distinguished mental rotation from spatial visualization, considering them separate constructs. Spatial visualization may involve processes from spatial perception and mental rotation but requires multi-step analytical strategies, making it distinct from the other two. Tartre (1990) expanded on McGee’s work and further subdivided spatial visualization into two distinct factors: mental rotation (manipulating entire objects) and mental transformation (manipulating parts of objects).
Over the past decade, spatial orientation and spatial visualization have consistently been recognized as two primary spatial constructs (Clements and Sarama [2009] 2014). Additionally, some studies identify mental rotation as an independent spatial construct, distinct from spatial visualization. In this context, Ramful et al. (2017) advanced the field by proposing a three-tier framework of spatial reasoning that encompasses mental rotation, spatial orientation, and spatial visualization. This framework was developed through an analysis of tasks from the primary and lower secondary mathematics curricula. Building on this framework, the Spatial Reasoning Instrument, consisting of multiple tasks, was developed specifically to aid teachers and researchers in educational settings. Their work is notable for its emphasis on the practical application of spatial reasoning within schools, distinguishing it from other studies that may be more theoretical or conducted in controlled environments. This focus on the educational context provides a unique perspective on spatial reasoning, bridging the gap between theoretical constructs and their application in real-world educational settings.
Definitions and components of spatial reasoning differ depending on the discipline and focus of the study. Despite these variations, the present study is specifically concerned with spatial reasoning as it manifests in educational contexts, particularly within the framework of school-based learning. Accordingly, this research adopts the spatial reasoning framework proposed by Ramful et al. (2017), which offers a robust and education-oriented approach. Their delineation of spatial reasoning into three distinct constructs—mental rotation, spatial orientation, and spatial visualization—provides a comprehensive structure that is particularly relevant for evaluating students’ spatial reasoning skills in the context of primary and secondary education. The subsequent sections will offer an in-depth examination of each of these three spatial constructs.

1.1.1. Mental Rotation

Mental rotation is the ability to accurately rotate a two-dimensional (2D) shape or a three-dimensional (3D) object in the mind (Lowrie et al. 2018). During this process, the object’s internal structure remains unchanged, and the observer’s perspective stays constant. This ability is typically assessed through tasks that require individuals to identify or match identical figures presented at different orientations, such as the comparison tasks devised by Shepard and Metzler (1971).

1.1.2. Spatial Orientation

Spatial orientation refers to the ability to mentally reposition oneself within space, involving the process of understanding spatial relationships from various scales, perspectives, and contexts. Unlike mental rotation, which involves object-centered transformations, spatial orientation is characterized by egocentric transformations (Hegarty and Waller 2005; Kozhevnikov and Hegarty 2001). This means it is centered on the observer’s perspective, requiring the observer to imagine spatial changes from their own viewpoint and incorporate perspective shifts, visualizing an object or scene from different orientations. In spatial orientation tasks, individuals must position themselves at a specified location, determine spatial relationships with other objects from that position, assess the object’s state or movement, and identify its final position after movement.

1.1.3. Spatial Visualization

Spatial visualization is the ability to mentally manipulate or transform the visual and spatial properties of objects (Lowrie et al. 2019). It is the most complex of the three spatial reasoning constructs. Unlike mental rotation and spatial orientation, which involve treating the object as a whole during transformations, spatial visualization encompasses intricate internal changes within the object (Sorby 1999). These transformations can alter the object’s state or structure, such as unfolding a three-dimensional shape into a two-dimensional figure, folding a two-dimensional net into a three-dimensional shape, performing reflections, decomposing a composite figure into parts, or engaging in activities like origami and paper cutting.

1.2. Links Between Spatial Reasoning and Mathematical Performance

Spatial reasoning has long been recognized as a critical component in the development of mathematical understanding. However, the relationship between spatial thinking and mathematics is not straightforward (Clements and Sarama [2009] 2014). One common theoretical perspective is that mathematical thinking is supported by spatial representations (Hegarty and Kozhevnikov 1999; Mix and Cheng 2012). In this context, spatial representations function as cognitive tools enhancing the understanding of relative magnitude, measurement units, and arithmetic operations (Cipora et al. 2015; Congdon et al. 2018a; Newcombe et al. 2018). Spatial representation acts as a “mental blackboard” on which various mathematical concepts, relationships, and operations can be modeled and visualized (Lourenco et al. 2018; Mix 2019). Alternatively, it is argued that mathematics is inherently spatial. For instance, symmetry is not only a fundamental concept in geometry but also a core attribute of mathematical thinking (Sinclair 2004). In learning about equations, symmetry provides a perspective for interpreting the equal sign beyond its role as a procedural step in calculations (Verdine et al. 2017; Patahuddin et al. 2018). Furthermore, the number line is a clear spatial representation of how children conceptualize integers (Gunderson et al. 2012). Additionally, neuroscientific research indicates that similar neural pathways are engaged when processing both spatial and numerical information (Walsh 2003; Hubbard et al. 2005), suggesting a shared cognitive process between these domains.
There is substantial empirical evidence demonstrating that spatial skills are linked to individual differences in math knowledge. Cheng and Mix (2014) found that training in mental rotation improved students’ performance on missing-term problems (e.g., 4 + __ = 12). An analysis of data from 804 sixth grade students in Singapore revealed significant statistical differences in six mathematical domains, including whole numbers, algebraic patterns, data and probability, and geometry and measurement, between students with high and low visuospatial levels (Logan 2015). Gilligan et al. (2019) explored the developmental relationship between spatial ability and mathematical achievement in children aged 6–10, demonstrating that spatial ability remained a significant predictor of math performance even after controlling for other known predictors. Moreover, several longitudinal studies have shown that spatial ability at one age predicts mathematical performance at a later age. For example, Gunderson et al. (2012) demonstrated that 5-year-olds’ mental rotation skills predicted their approximate calculation abilities at age 8, suggesting that spatial ability helps children acquire linear spatial representations of numbers, which in turn facilitates the development of their numerical knowledge. This study was one of the earliest to investigate the potential developmental mechanisms underlying the relationship between spatial ability and mathematical achievement, and it has since been followed by numerous longitudinal studies exploring this relationship from various perspectives and across different age groups (Verdine et al. 2014; Casey et al. 2015; Lauer and Lourenco 2016; Gilligan et al. 2017; Frick 2019). Furthermore, factor analysis studies suggest that spatial and mathematical abilities are separate but highly correlated (Hawes et al. 2019), and children with strong spatial skills consistently perform better on math achievement tests from kindergarten through to the end of elementary school, even when controlling for the influence of language abilities (Mix et al. 2016, 2017).
Despite extensive empirical evidence supporting a strong correlation between spatial skills and mathematical performance, the transfer of benefits from spatial training to improved math outcomes has not consistently met expectations. While several long-term interventions focused on spatial reasoning have demonstrated significant improvements in math performance following the completion of the training programs (Hawes et al. 2017; Lowrie et al. 2017, 2018, 2019, 2021), other studies have observed that although students’ spatial abilities were enhanced through spatial training, these improvements did not consistently translate into better mathematical achievement (Hawes et al. 2015; Cornu et al. 2017). Among these intervention studies showing a successful transfer effect, the program designs predominantly centered on geometric content. However, research indicates that the importance of spatial reasoning extends beyond the realm of geometry and is crucial for various mathematical tasks (Mix 2019). Therefore, it is crucial to explore the relationship between spatial reasoning and mathematics more comprehensively, including various spatial structures and mathematical content domains.

1.3. Present Study

As described, the fine-grained relationship between different spatial reasoning constructs and their varying impacts on mathematical performance across different content domains remains unclear. This lack of clarity may contribute to the mixed evidence regarding the extent to which improvements in spatial skills translate into better mathematical performance. Rather than making assumptions about these relationships, our study seeks to explore and provide further insights into them. Given the exceptional performance of Chinese students in international mathematics assessments, exploring the spatial reasoning abilities of Chinese students is crucial for a more comprehensive understanding of the relationship between spatial reasoning and mathematical performance. The Chinese education system, deeply rooted in Confucian culture, places a high value on academic achievement and task persistence (Stankov 2010). However, as in other Asian mathematics curricula, space has received less emphasis in Chinese mathematics curricula compared to Western contexts. This cultural context provides a distinctive research perspective that could offer a more comprehensive understanding of how spatial reasoning contributes to mathematical performance. It is essential to assess whether, and to what extent, spatial reasoning remains a significant predictor of mathematical performance in this specific educational context, as it can provide a more comprehensive understanding of how spatial reasoning supports mathematical learning across diverse curricular frameworks.
The overarching goal of the study is to further understand the nuanced relationships between spatial reasoning and mathematical performance across different content domains. Based on the substantial body of research evidence previously reviewed on the links between spatial reasoning and mathematical performance, this study aims to further explore this relationship without presupposing specific outcomes, and additionally address the following research questions:
  • To what extent does spatial reasoning continue to predict mathematical performance among Chinese students who have had relatively limited exposure to space-related curriculum content?
  • How do different constructs of spatial reasoning (mental rotation, spatial visualization, and spatial orientation) specifically impact mathematical performance in various content domains (number, geometric shapes and measures, and data display) among Chinese elementary school students?

2. Materials and Methods

2.1. Design

This study is structured into two phases. The first phase, a preliminary analysis, aims to validate the modified spatial reasoning test for quality and applicability among Chinese students in grades 4 to 6 (ages 9–12). As emphasized in a recent review on how to best assess spatial skills, there is a clear need for more psychometric analyses of spatial thinking measures, and adapted tests require reporting the basic psychometric properties to ensure their reliability and validity (Uttal et al. 2024). This phase addresses these concerns by rigorously evaluating the psychometric properties of the spatial reasoning test, thereby laying a solid foundation for the study findings. The second phase, the main study, examines the predictive relationship between spatial reasoning and mathematical performance through a battery of tests administered among Chinese elementary students in grade 4 (ages 9–10), focusing on how spatial reasoning influences students’ outcomes across various mathematical content domains.

2.2. Procedure and Participants

Four elementary schools located in three cities of China, namely Suzhou, Hangzhou, and Shenzhen, were involved in this study. These schools varied in educational quality, which helped to ensure the diversity of our sample. Data collection was conducted following ethical approval. Participation was voluntary, with verbal informed consent obtained from all students. No incentives were provided for participation.
The participants in the first phase of the study were 477 students (256 boys and 221 girls) from grades 4, 5, and 6, aged between 9 and 12. We focused on this age range because the Spatial Reasoning Instrument (Ramful et al. 2017), which served as a reference for our modified spatial reasoning test, was developed specifically for elementary school students in this age group (Lowrie et al. 2019). One class per grade was selected from each school. In all, 12 intact classes of students completed the spatial reasoning test in their own classrooms to ensure a range of academic ability levels within the respective schools. The tests were administered by classroom teachers within a 40 min session. The final sample consisted of 477 valid responses from the 486 collected test papers, including 158 fourth graders, 158 fifth graders, and 161 sixth graders.
The second phase, the main study, involved 816 fourth grade students (432 boys and 384 girls), aged 9 to 10. This focus on fourth graders was due to availability, time constraints, and other limitations within the four schools, as well as the consideration that the mathematics test we adopted was typically aimed at fourth grade students. Data collection for the main study occurred one month after the first phase, with none of the classes or students overlapping between the two phases. Intact classes were selected from each school, ensuring a diverse range of academic abilities, and a total of 21 classes completed both the spatial reasoning and mathematics tests in their own classrooms. From the 875 collected spatial reasoning tests and 872 mathematics tests, the final sample consisted of 816 students who provided valid responses to both tests.

2.3. Materials

2.3.1. Spatial Reasoning Test

The spatial reasoning test was developed based on the three-tier framework proposed by Ramful et al. (2017), and adjustments were made to the specific subject and corresponding content of each spatial construct. To align with the modified theoretical framework, 28 items from the Spatial Reasoning Instrument (Ramful et al. 2017) were retained, and 4 additional items were adapted from the Thinking About 3D Shapes test (Owens 2001) to ensure comprehensive coverage of each construct subject. Specifically, the test comprises 9 mental rotation tasks, 10 spatial orientation tasks, and 13 spatial visualization tasks. To ensure the content validity of the spatial reasoning test, six experts in mathematics education evaluated the test content using a structured questionnaire. The questionnaire introduced the test framework and asked experts to assess whether the items accurately measured elementary students’ spatial reasoning skills. Based on their ratings of the alignment between the items and the intended constructs, we calculated the Content Validity Index (CVI), which yielded a value of 1, indicating high content validity. Table 1 provides an overview of the three constructs measured by the spatial reasoning test, along with the associated subjects, content, and items. Each item is labeled using the format “construct + number” (e.g., MR2 indicates that it is the second item of the test and it measures mental rotation). The test consists of 32 multiple choice items, scored on a 0–1 scale, with a maximum possible score of 32. Examples of items, along with their corresponding constructs and subjects, are presented in Table 2.
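For transparency, the CVI calculation can be written out in a few lines of R. The ratings matrix below is hypothetical (the paper reports only the final index); the item-level CVI is the proportion of experts endorsing an item, and the scale-level CVI is the mean of the item-level values.

```r
# Minimal sketch of the Content Validity Index (CVI) calculation described above.
# 'ratings' is a hypothetical matrix: rows = 32 items, columns = 6 experts,
# coded 1 if the expert judged the item relevant to its construct, 0 otherwise.
ratings <- matrix(1, nrow = 32, ncol = 6)   # unanimous agreement, as reported

item_cvi  <- rowMeans(ratings)   # item-level CVI: proportion of endorsing experts
scale_cvi <- mean(item_cvi)      # scale-level CVI (average approach)
scale_cvi                        # 1, matching the value reported in the text
```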

2.3.2. Mathematics Test

Among the various assessments of students’ mathematical performance, the Trends in International Mathematics and Science Study (TIMSS) is one of the most comprehensive international studies. This study uses the 32 released fourth-grade mathematics items from TIMSS 2011 as its instrument, since these items are in the public domain and can be downloaded from the IEA website. The TIMSS 2011 items have well-established reliability and validity and have been used in recently published research (Hu et al. 2021; Xu et al. 2023). The mathematics assessment framework for TIMSS 2011 is organized around two dimensions: content and cognitive domains. The TIMSS 2011 Mathematics Framework describes three content domains for the fourth grade: number, geometric shapes and measures, and data display. The released fourth-grade items are divided into six blocks (M01, M02, M03, M05, M06, M07). Considering the time constraints for students, this study selected three blocks—M01, M02, and M03—totaling 32 items as the test material, which include 17 items in the number domain, 11 in geometric shapes and measures, and 4 in data display. Some of these 32 items contain two or three sub-questions. Scoring adhered strictly to the TIMSS 2011 scoring guide, with a maximum possible score of 38: 20 for the number domain, 14 for geometric shapes and measures, and 4 for data display. Examples of items and their corresponding content domains are presented in Table 3.

2.4. Data Analysis

In the first phase, our aim was to validate the psychometric properties of the spatial reasoning test. The data analysis included evaluating the internal consistency of the test using Cronbach’s alpha, analyzing item difficulty and discrimination based on classical test theory, assessing construct validity through confirmatory factor analysis, and performing item analysis based on the Rasch model. In the second phase, the main study, to investigate the relationship between spatial reasoning and mathematics performance, regression analyses were conducted.
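As a minimal illustration of the reliability step in this plan, the following R sketch computes Cronbach's alpha from a 0/1 item-response matrix with the psych package; the simulated `responses` object is a placeholder for the real 477 × 32 data, so the resulting value is illustrative only.

```r
library(psych)

set.seed(2024)
# Placeholder 0/1 response matrix standing in for the 477 examinees x 32 items.
responses <- as.data.frame(matrix(rbinom(477 * 32, 1, 0.7), ncol = 32))
names(responses) <- paste0("item", 1:32)

# Internal consistency of the full test (Cronbach's alpha).
psych::alpha(responses)$total$raw_alpha
```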

3. Results

3.1. Phase I: Instrument Analysis Results

3.1.1. Reliability and Construct Validity

The internal consistency of the spatial reasoning test was evaluated using Cronbach’s alpha, yielding a result of 0.866 (>0.85), which indicates good reliability. The construct validity of the spatial reasoning test was evaluated using confirmatory factor analysis (CFA). As previously mentioned, the test was developed based on a three-tier framework of spatial reasoning ability. It is designed to measure overall spatial reasoning, but includes three constructs: mental rotation, spatial orientation, and spatial visualization. Accordingly, a second-order three-factor model was specified. As shown in Table 4, the fit indices from the CFA indicate a good model fit: the chi-square/df ratio is less than three, suggesting a well-fitting model; the root mean square error of approximation (RMSEA) is 0.030, below the 0.05 threshold, indicating excellent fit, with the upper limit of the 90% confidence interval at 0.035 also supporting good model fit; the standardized root mean square residual (SRMR) is 0.044, under the 0.08 threshold, reflecting an ideal model fit; and both the comparative fit index (CFI) and the Tucker–Lewis index (TLI) exceed 0.9 (0.918 and 0.912, respectively), further indicating that the model is acceptable. Therefore, based on these fit indices, the model demonstrates a good overall fit.
Figure 1 illustrates the second-order three-factor model of spatial reasoning derived from the CFA. The standardized factor loadings of the items range between 0.413 and 0.587, with all p-values for the factor loadings being less than 0.01, and most being less than 0.001, indicating significant relationships between the first-order factors and their respective items. The factor loadings of the three first-order factors—mental rotation (MR), spatial orientation (SO), and spatial visualization (SV)—on the second-order factor, spatial reasoning ability (SR), are 0.868, 0.715, and 0.872, respectively, indicating strong correlations between the second-order factor and the three first-order factors. This suggests that all three constructs effectively measure spatial reasoning ability. Based on this analysis, the test demonstrates good construct validity.
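For readers wishing to reproduce this step, a lavaan sketch of the second-order model is shown below. The item names follow the construct + number labels in Table 1, `sr_data` is a placeholder for a data frame holding the 32 scored (0/1) items, and the WLSMV estimator for ordinal items is our assumption rather than a detail reported in the paper.

```r
library(lavaan)

# Second-order three-factor model: mental rotation (MR), spatial orientation (SO)
# and spatial visualization (SV) load on a general spatial reasoning factor (SR).
model <- '
  MR =~ MR1 + MR2 + MR3 + MR4 + MR5 + MR6 + MR7 + MR8 + MR9
  SO =~ SO1 + SO2 + SO3 + SO4 + SO5 + SO6 + SO7 + SO8 + SO9 + SO10
  SV =~ SV1 + SV2 + SV3 + SV4 + SV5 + SV6 + SV7 + SV8 + SV9 + SV10 + SV11 + SV12 + SV13
  SR =~ MR + SO + SV
'

fit <- cfa(model, data = sr_data,        # 'sr_data': placeholder for the scored item data
           ordered   = TRUE,             # treat the dichotomous items as ordinal
           estimator = "WLSMV")          # assumed estimator; not reported in the paper

fitMeasures(fit, c("chisq", "df", "rmsea", "rmsea.ci.upper", "srmr", "cfi", "tli"))
standardizedSolution(fit)                # first- and second-order factor loadings
```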

3.1.2. Item Difficulty and Discrimination

Table 5 presents the item difficulty and item discrimination indices based on classical test theory analysis. The overall difficulty level of the test items ranges from 0.27 to 0.92. Specifically, the difficulty level for mental rotation items spans from 0.46 to 0.81, for spatial orientation items from 0.46 to 0.92, and for spatial visualization items from 0.27 to 0.85. The discrimination indices fall between 0.29 and 0.52, all of which indicate medium-to-high levels of item discrimination.
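These classical test theory indices can be computed directly from the scored item matrix, as in the sketch below, where difficulty is the proportion correct and discrimination is the corrected item-total correlation; the response matrix is again a simulated placeholder.

```r
set.seed(2024)
# Placeholder 0/1 response matrix (477 examinees x 32 items).
responses <- matrix(rbinom(477 * 32, 1, 0.7), ncol = 32,
                    dimnames = list(NULL, paste0("item", 1:32)))

# Difficulty: proportion of correct responses per item (higher = easier).
difficulty <- colMeans(responses)

# Discrimination: corrected item-total correlation (item vs. rest-of-test score).
discrimination <- sapply(seq_len(ncol(responses)), function(i) {
  cor(responses[, i], rowSums(responses[, -i]))
})

round(data.frame(difficulty, discrimination), 2)
```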

3.1.3. Item Analysis Based on Rasch Model

The Rasch model provides an objective standard for evaluating and refining assessment tools. Since unidimensional models in item response theory (IRT) are applicable only when a test measures a single latent trait, we first used principal component analysis (PCA) to test whether the spatial reasoning test conforms to the unidimensionality assumption. Before conducting PCA, we performed the Kaiser–Meyer–Olkin (KMO) measure of sampling adequacy and Bartlett’s test of sphericity. The KMO statistic was 0.89, falling within the “meritorious” range (0.8–0.9). Bartlett’s test of sphericity yielded a p-value of less than 0.001, well below the 0.05 threshold, indicating the data’s appropriateness for factor analysis (Dziuban and Shirkey 1974). In PCA, a ratio greater than three between the first and second eigenvalues generally supports unidimensionality. Our analysis revealed that the first component’s eigenvalue was 6.325, while the second component’s eigenvalue was 1.931, resulting in a ratio of approximately 3.28, which meets the unidimensionality assumption. Therefore, although the spatial reasoning test encompasses three constructs—mental rotation, spatial orientation, and spatial visualization—it primarily measures a single underlying structure: spatial reasoning.
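A sketch of this unidimensionality check, assuming a 0/1 item matrix and using standard psych-package routines (KMO, Bartlett's test) together with the eigenvalues of the item correlation matrix, is shown below; the simulated data are placeholders, so the statistics it prints are illustrative only.

```r
library(psych)

set.seed(2024)
responses <- matrix(rbinom(477 * 32, 1, 0.7), ncol = 32)   # placeholder 0/1 data

R <- cor(responses)                  # item correlation matrix

# Sampling adequacy and sphericity checks reported in this subsection.
psych::KMO(R)$MSA                    # overall Kaiser-Meyer-Olkin statistic
psych::cortest.bartlett(R, n = nrow(responses))$p.value

# Principal component eigenvalues; a first-to-second ratio > 3 is taken
# as support for an essentially unidimensional structure.
eig <- eigen(R)$values
eig[1] / eig[2]
```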
Parameter estimation was carried out in R (version 4.2.0) using the joint maximum likelihood estimation (JMLE) method, which simultaneously estimates both ability and item parameters. The person separation reliability was 0.845, indicating a high level of reliability (with values closer to one being preferable). The second column of Table 6 presents the item difficulty estimates, ranging from −2.989 to 1.259. Negative values indicate easier items, while positive values denote more difficult items. The mean item difficulty was −0.805, while the mean student ability was −0.040, suggesting that the overall test was slightly below moderate difficulty, with student ability slightly exceeding the test’s difficulty level. The third column of Table 6 shows the standard errors of the item difficulty estimates, which reflect the precision of these estimates—the closer to 0, the better. The standard errors ranged from 0.105 to 0.182, indicating that the estimation errors for item difficulty were relatively small.
Columns four through seven in Table 6 present the fit statistics based on residuals for each item, including the mean square fit (MNSQ) and t-statistics. These statistics assess the degree to which the data fit the Rasch model. The Rasch model reports both outfit and infit MNSQ statistics. Outfit MNSQ, the unweighted fit mean square, is more sensitive to outliers, and thus more capable of identifying misfit issues. Infit MNSQ, the weighted fit mean square, mitigates the sensitivity to outliers, addressing the issue of certain items being misjudged as misfitting due to extreme values. An MNSQ value of one indicates perfect fit, values greater than one suggest underfit (where the data contain too much unexplained variance or noise), and values less than one indicate overfit (where the model overpredicts the data). The acceptable range for MNSQ indices is typically between 0.5 and 1.5, with values closer to 1 being ideal. As shown in Table 6, outfit MNSQ values range from 0.648 to 1.211, and infit MNSQ values range from 0.843 to 1.192, both of which fall within the acceptable range, indicating a good fit. Outfit t and infit t are the transformed t-statistics of the fit mean square values. These t-statistics only need to be further examined if the MNSQ indices fall outside the acceptable range (Boone et al. 2014). Generally, a reasonable range for the t-statistics is between −2 and 2. Apart from items 27 and 30, all items meet these criteria. Overall, the fit indices for the 32 items in this test indicate a good fit with the assumptions of the Rasch model.
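The Rasch estimation and item-fit checks can be sketched in R as below. The paper used JMLE; this sketch instead uses the eRm package, which implements conditional maximum likelihood estimation, as a readily available alternative, and the response matrix is a simulated placeholder, so the numbers it produces are illustrative only.

```r
library(eRm)

set.seed(2024)
# Placeholder 0/1 matrix standing in for the 477 x 32 spatial reasoning responses.
items <- matrix(rbinom(477 * 32, 1, 0.7), ncol = 32,
                dimnames = list(NULL, paste0("item", 1:32)))

# Dichotomous Rasch model. Note: eRm uses conditional ML estimation, whereas the
# paper used joint ML (JMLE); estimates are comparable but not identical.
rasch_fit <- RM(items)
summary(rasch_fit)                  # item parameters (eRm reports easiness; difficulty = -easiness)

pp <- person.parameter(rasch_fit)   # person ability estimates
SepRel(pp)                          # person separation reliability
itemfit(pp)                         # infit/outfit MNSQ and standardized (t) fit statistics
```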

3.2. Phase II: Main Study Results

3.2.1. Descriptive Statistics

In this study, descriptive statistics were calculated for all eight variables, including overall spatial score, the three spatial reasoning constructs (mental rotation, spatial orientation, and spatial visualization), the overall mathematics score, and the three mathematics content domains (number, geometric shapes and measures, and data display). The means, standard deviations, range, skewness, and kurtosis for these variables are presented in Table 7. In line with the guidelines suggested by Kline (2016), if the absolute value of skewness is less than 3 and the absolute value of kurtosis is less than 10, the data can be considered approximately normally distributed.
The gender breakdown for these variables is provided in Table 8. Through independent samples t-tests, we compared females and males on the spatial and mathematics measures. No significant gender differences were found for performance on the mathematics test, t = 1.27, p = 0.20 (>0.05). However, significant gender differences were found for performance on the spatial reasoning test, in favor of males, t = 2.60, p = 0.0096 (<0.01). Further comparisons on the three spatial constructs and three mathematics content domains revealed gender differences in favor of males for mental rotation, t = 4.63, p < 0.001, and number, t = 2.52, p = 0.012 (<0.05); gender differences in favor of females for data display, t = −2.73, p = 0.006 (<0.01); and no significant gender differences for spatial orientation (t = 1.44, p = 0.15), spatial visualization (t = 0.14, p = 0.89), or geometric shapes and measures (t = 0.34, p = 0.73). The findings regarding gender differences in overall performance on the spatial reasoning and mathematics tests, as well as in mental rotation tasks, align with previous research (Linn and Petersen 1985; Voyer et al. 1995; Hegarty 2018; Harris et al. 2021). However, the expected gender differences in spatial orientation reported in prior studies were not observed in this study (Logan and Lowrie 2017; Harris et al. 2021).
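The descriptive checks and gender comparisons in this subsection can be reproduced with the psych package and base R, as in the following sketch; the data frame and its values are simulated stand-ins for the study data, and only two of the eight score variables are shown for brevity.

```r
library(psych)

set.seed(2024)
# Illustrative data frame with one row per student; the real data have
# 816 fourth graders and eight score variables plus gender.
dat <- data.frame(
  gender  = factor(sample(c("male", "female"), 816, replace = TRUE)),
  spatial = rnorm(816, 20, 5),
  math    = rnorm(816, 25, 6)
)

# Means, SDs, range, skewness and kurtosis (checked against Kline's
# |skewness| < 3 and |kurtosis| < 10 guidelines).
psych::describe(dat[, c("spatial", "math")])

# Independent-samples t-tests comparing boys and girls (Welch's test by default).
t.test(spatial ~ gender, data = dat)
t.test(math    ~ gender, data = dat)
```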

3.2.2. Correlations Between Spatial and Math Variables

Table 9 summarizes the results of the bivariate correlations between all measured variables. Significant correlations were found between all spatial and mathematical measures (p < 0.001).

3.2.3. Regression Analysis

Based on the strong correlations observed between spatial measures and mathematics measures, further regression analyses were conducted to explore the predictive relationships between spatial reasoning and mathematics performance. The analyses first aimed to identify the predictors of overall mathematics performance. Subsequently, predictors of performance in specific content domains, including number, geometric shapes and measures, and data display, were identified through separate regression models. In the multiple regression models, the collinearity statistics (i.e., VIF) were all within acceptable limits.
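A minimal sketch of these regression models is given below, using base R and the car package for the VIF check; the data frame is simulated and the variable names (MR, SO, SV, spatial, math) are placeholders for the study's scored variables, so the coefficients it produces are not those reported in the tables.

```r
library(car)

set.seed(2024)
# Illustrative data; in the study MR, SO and SV are subscale scores and
# 'math' is the total TIMSS score.
dat <- data.frame(
  MR   = rnorm(816, 6, 2),
  SO   = rnorm(816, 7, 2),
  SV   = rnorm(816, 8, 3),
  math = rnorm(816, 25, 6)
)
dat$spatial <- dat$MR + dat$SO + dat$SV

# Simple regression: overall spatial reasoning predicting mathematics.
m_simple <- lm(math ~ spatial, data = dat)
summary(m_simple)          # slope and R-squared

# Multiple regression: the three spatial constructs entered together.
m_multi <- lm(math ~ MR + SO + SV, data = dat)
summary(m_multi)
car::vif(m_multi)          # variance inflation factors (collinearity check)
```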

Predictors of Overall Mathematical Performance

As shown in Table 10, a simple linear regression indicated that overall spatial reasoning significantly predicts math performance (β = 0.66, p < 0.001), explaining 34.12% of the variance, with each point increase in spatial reasoning associated with a 0.66-point rise in math scores. A multiple regression analysis of three spatial constructs—mental rotation (MR), spatial orientation (SO), and spatial visualization (SV)—found that all significantly predict math performance (p < 0.001). Controlling for other variables, each one-point increase in MR, SO, and SV corresponds to 0.69, 0.67, and 0.64-point increases in math scores, respectively.
To further examine whether the impact of spatial measures on overall mathematics performance varies by gender, a subgroup regression analysis was conducted. Initial simple linear regressions for male and female students, as shown in Table 10, indicated that the overall spatial score had a significant positive effect on mathematics scores in both groups. Introducing the interaction term between spatial reasoning and gender showed no significant difference between groups (p = 0.982 > 0.05). Further multiple regression analyses for each gender group revealed that MR, SO, and SV positively impacted mathematics performance, with spatial orientation being the strongest predictor for males and mental rotation for females. The interaction terms for MR, SO, and SV with gender (p = 0.229, p = 0.059, p = 0.677) were not significant, indicating consistent effects across genders.
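The gender-moderation tests reported above correspond to interaction terms in the same regression framework; the sketch below illustrates the idea with simulated data (reusing the placeholder variable names from the previous sketch), so its coefficients are not those of the study.

```r
set.seed(2024)
dat <- data.frame(
  gender = factor(sample(c("male", "female"), 816, replace = TRUE)),
  MR     = rnorm(816, 6, 2),
  SO     = rnorm(816, 7, 2),
  SV     = rnorm(816, 8, 3),
  math   = rnorm(816, 25, 6)
)
dat$spatial <- dat$MR + dat$SO + dat$SV

# Does the spatial-mathematics slope differ by gender? (spatial:gender term)
summary(lm(math ~ spatial * gender, data = dat))

# Construct-level interactions, mirroring the MR/SO/SV x gender tests.
summary(lm(math ~ (MR + SO + SV) * gender, data = dat))

# Subgroup models fitted separately for each gender.
by(dat, dat$gender, function(d) coef(lm(math ~ MR + SO + SV, data = d)))
```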

Predictors of Performance in Number

As shown in Table 11, simple linear regression reveals that overall spatial reasoning significantly predicts number performance, with each one-point increase in spatial reasoning corresponding to a 0.35-point increase in scores (β = 0.35, p < 0.001). The model explains 27.04% of the variance. Multiple regression analysis indicates that mental rotation (MR), spatial orientation (SO), and spatial visualization (SV) all significantly affect number scores (p < 0.001), with MR, SO, and SV predicting increases of 0.37, 0.41, and 0.29 points in number scores, respectively.
To assess if the predictive relationship between spatial reasoning and number performance varies by gender, subgroup regression analyses were conducted. Simple linear regressions revealed a significant positive effect of overall spatial score on number performance for both males and females. The interaction term for spatial reasoning and gender had a p-value of 0.706, indicating no significant differences in the effect across genders. Further multiple regression analyses showed that mental rotation, spatial orientation, and spatial visualization all significantly impact number performance in both genders. For males, spatial orientation was the strongest predictor, while for females, mental rotation was the most influential. Interaction terms for these spatial constructs with gender had p-values of 0.097, 0.312, and 0.658, respectively, suggesting no significant differences in their impacts on number performance between genders.

Predictors of Performance in Geometric Shapes and Measures

To assess whether spatial reasoning influences students’ performance in geometric shapes and measures, a simple linear regression was performed. As shown in Table 12, the overall spatial score significantly predicts geometric performance (β = 0.27, p < 0.001), indicating that each one-point increase in overall spatial score corresponds to a 0.27-point increase in scores. The model explains 30.75% of the variance. A multiple regression further revealed that mental rotation (MR), spatial orientation (SO), and spatial visualization (SV) all significantly contribute to performance in geometric shapes and measures. Specifically, MR (β = 0.32, p < 0.001), SO (β = 0.20, p < 0.001), and SV (β = 0.27, p < 0.001) each show positive effects on scores.
In the subgroup regression analyses, the impact of spatial reasoning on performance in geometric shapes and measures was examined for potential gender differences. Simple linear regressions for each gender group, as shown in Table 12, indicated that overall spatial scores significantly and positively influenced performance in geometric shapes and measures for both groups. Introducing an interaction term between spatial reasoning and gender revealed no significant difference between the groups (p = 0.896). Further subgroup multiple regressions showed that in the male group, mental rotation (MR), spatial orientation (SO), and spatial visualization (SV) all significantly predicted performance in geometric shapes and measures, with MR and SO having nearly equal impacts. In the female group, MR and SV were significant predictors, with MR being the stronger predictor, while SO did not significantly influence performance. Interaction terms between each spatial construct and gender indicated that SO’s impact significantly differed by gender (p = 0.02), while MR and SV’s impacts did not (p = 0.676 and p = 0.15, respectively).

Predictors of Performance in Data Display

To evaluate the influence of spatial reasoning on students’ data display performance, a simple linear regression was performed. The results, shown in Table 13, indicate a small but significant effect (β = 0.05, p < 0.001), with each one-point increase in overall spatial score corresponding to a 0.05-point rise in data display scores. Further analysis using multiple regression revealed that mental rotation did not significantly impact data display (p > 0.05), while spatial orientation (β = 0.06, p < 0.001) and spatial visualization (β = 0.08, p < 0.001) both had small, yet significant, positive effects.
Subgroup regression analyses were conducted to assess whether the impact of spatial measures on performance in data display differed by gender. Initial simple linear regressions for male and female students, as shown in Table 13, revealed that overall spatial score had a significant, but small, positive effect on performance in data display in both groups. The interaction term between overall spatial score and gender showed no significant difference in regression coefficients (p = 0.206). Further multiple regression analyses indicated that, while the predictors of performance in data display differed by gender—spatial orientation and spatial visualization were significant for males, and only spatial visualization was significant for females—the tests for coefficient differences between groups revealed no statistically significant differences (p = 0.948, p = 0.116, and p = 0.692, respectively), suggesting that the effects of these spatial constructs are consistent across genders.

4. Discussion

4.1. Summary of Findings

This study aimed to explore the role of spatial reasoning in predicting mathematical performance across different content domains among Chinese elementary school students, who have relatively limited exposure to space-related curriculum content. To achieve this, we administered a set of tests to students from four schools in three cities of China. We developed a spatial reasoning test based on a three-tier framework, adjusting existing instruments to ensure comprehensive coverage of each subject of the construct. This modified test was validated through preliminary analysis for quality and applicability among Chinese students in grades 4 to 6, providing a solid foundation for our study. Using this spatial reasoning test alongside TIMSS mathematics tests, we examined 816 fourth grade students to investigate the predictive relationship between spatial reasoning and mathematics performance. The findings demonstrate that despite the limited exposure to space-related content in the curriculum, spatial reasoning remains a significant predictor of mathematical performance among Chinese students, reinforcing the strong relationship between spatial reasoning and mathematics evidenced in prior studies (Frick 2019; Gunderson et al. 2012; Mix et al. 2016; Verdine et al. 2017). Specifically:
Overall Spatial Reasoning as a Predictor. The results indicate that overall spatial reasoning significantly predicts mathematical performance across various domains. This influence extends beyond the overtly spatial aspects of mathematics, such as geometric shapes and measures (Battista 2007; Clements and Battista 1992), to include the seemingly less spatial aspects of mathematics, such as the number domain, as highlighted in numerous studies (Hawes and Ansari 2020; Mix and Cheng 2012; Xie et al. 2020). Furthermore, spatial reasoning also significantly contributes to the data display domain, which is less-explored in relevant studies. Given that data literacy is regarded as one of the essential skills for citizens in the era of “big data” (Borges-Rey 2017), this finding may provide a new perspective on how spatial reasoning supports the development of data literacy. The significant impact of spatial reasoning on mathematical performance across all content domains in elementary school supports the notion that mathematics is inherently associated with spatial thinking (Congdon et al. 2018b; Jirout and Newcombe 2018).
Spatial reasoning constructs as predictors. All three spatial constructs—mental rotation, spatial orientation, and spatial visualization—significantly contribute to performance in the number and geometric shapes and measures domains. Mental rotation and spatial orientation emerge as the strongest predictors of performance in these respective domains. This finding contrasts with a previous study that found only object-based spatial constructs (mental rotation and spatial visualization) to be significant predictors (Harris et al. 2021). However, our results align closely with research indicating that egocentric transformations (e.g., mental rotation) showed the strongest relation to performance in arithmetic operations within the number domain, whereas allocentric transformations (e.g., spatial orientation) were strongly related to geometry (Frick 2019). The dominance of mental rotation (MR) in the number domain, even after controlling for spatial orientation (SO) and spatial visualization (SV), suggests that it uniquely captures dynamic egocentric transformations, such as mentally rotating numerical symbols (e.g., distinguishing ‘6’ from ‘9’) or manipulating quantities in working memory (Hawes et al. 2019). This persistent effect, despite the theoretical overlap between MR and SV, indicates that MR’s contribution is distinct and tied to real-time numerical transformations rather than stepwise spatial integration. Similarly, spatial orientation (SO) retained its strong predictive power for geometry after accounting for MR and SV. This aligns with its theorized role in allocentric perspective-taking, which is essential for decoding geometric diagrams (e.g., identifying congruent angles from different viewpoints). For the data display domain, both spatial orientation and spatial visualization significantly contribute, with spatial visualization being the strongest predictor. One possible explanation is that these constructs both involve decoding information, specifically interpreting graphic information, including visual elements and the spatial relationships among those elements within the graphics (Lowrie and Diezmann 2007; Lowrie and Logan 2018). When solving problems in the data display domain, students must recognize the various elements within a chart or graph and understand their relationships. For example, in the sample item of data display previously presented in Table 3, students are required to interpret visual elements, such as text or numbers in rows and columns, images of ice cream, and the spatial relationships between these elements and their corresponding mathematical meanings. The primacy of spatial visualization (SV) in predicting data display performance, even when MR and SO are controlled, likely stems from its reliance on schematic representations, which encode abstract spatial relations (e.g., chart layouts) rather than visual appearance (Hegarty and Kozhevnikov 1999). This representational advantage may enable students to efficiently extract, organize, and manipulate spatial information within complex graphical displays, facilitating mathematical reasoning in this domain. Notably, spatial orientation (SO) also uniquely contributed, potentially facilitating the spatial structuring of graphical layouts. However, its smaller effect size compared to SV suggests that SO may primarily aid in structuring spatial relationships within graphical displays, while SV plays a more central role in integrating and reasoning about these relationships.
Gender Differences. Although no significant statistical gender differences were found in the overall relationship between spatial reasoning and mathematical performance across content domains, subgroup regression analysis revealed variations. For male students, spatial orientation is the primary predictor of mathematical performance across content domains. For female students, however, mental rotation is the strongest predictor for number and geometry performance, while spatial visualization is the most significant predictor for data display performance. This pattern may be linked to differences in problem-solving strategies between genders. Prior research suggests that males tend to rely more on spatial strategies when solving mathematical problems, whereas females, even when presented with spatial tasks, may employ more verbal–analytical reasoning that does not require generating and manipulating mental images (Danan and Ashkenazi 2022). While this tendency does not necessarily lead to differences in overall math performance, it may influence which spatial constructs contribute most to mathematical success. In the present study, the stronger predictive role of spatial orientation for males aligns with the idea that they are more likely to engage in spatially based approaches to problem-solving. In contrast, the greater relevance of mental rotation and spatial visualization for females may reflect their tendency to approach mathematical tasks differently, potentially integrating verbal–analytical reasoning with specific spatial skills. This finding contrasts with Harris et al. (2021), who found that mental rotation and spatial visualization were more predictive of male math performance, while spatial orientation was more predictive of female math performance. The authors of that study suggested that this discrepancy might be due to a ceiling effect for males in spatial orientation tasks, potentially limiting the observed contribution of spatial orientation to math performance for males.

4.2. Educational Implications

Given the exceptional performance of students from East Asian countries in international mathematics assessments and the cultural context where space has received less emphasis in mathematics curricula compared to Western contexts (Lowrie et al. 2016), this study focuses on the spatial reasoning and mathematical performance of Chinese elementary school students, offering evidence from a different cultural perspective that further reinforces the strong relationship between spatial reasoning and mathematics. This highlights the importance of integrating spatial reasoning into elementary mathematics curricula as a fundamental component, particularly within the Asian mathematics curricula, where it is less heavily emphasized. To achieve this, it is crucial to encourage more non-Western researchers to focus on and expand the study of spatial reasoning, thereby enhancing reciprocal global research interactions.
By focusing on different mathematics content domains, this study clarifies the relationship between spatial constructs and mathematics, providing insights for more targeted educational interventions, particularly those that involve embedded interventions (Bruce et al. 2017; Davis and the Spatial Reasoning Study Group 2015; Hawes et al. 2017). This approach aims to further promote the spatialization of the mathematics curriculum, providing alternative ways that go beyond traditional approaches focused on computation, memorization, and repetition, for students to engage in mathematics (Mulligan 2015).

4.3. Limitations and Future Directions

This study investigated students’ spatial reasoning and its contribution to mathematics performance through a validated spatial reasoning test and the TIMSS mathematics test. However, several limitations should be noted. First, the participating schools were located in economically developed regions of China. Given that children from diverse economic backgrounds may rely on different skills and methods for problem-solving (Jordan et al. 1994; Butterworth et al. 2011), the generalizability of these findings to broader populations remains uncertain. Future research should include more economically diverse samples and examine the moderating effect of economic background on the relationship between spatial reasoning and mathematical performance. Second, as a correlational study, this research explored the predictive relationship between spatial reasoning and mathematics performance, but could not establish causality. Further intervention studies are necessary to explore the potential causal effects of spatial abilities on mathematical performance. Third, the current correlational analysis is based on test scores, which may not be fine-grained enough for deeper insights. Future studies might employ more sophisticated analytical methods that can better uncover latent traits. Additionally, the mathematics test was divided into three different content domains, with fewer items in the data display domain. This may have resulted in a less varied score distribution and ceiling effect, potentially impacting subsequent analyses. Future research focusing on the data display domain should consider using more comprehensive assessment tools. Finally, our study did not empirically compare the Chinese context with those which have a different curricular focus. Future studies could consider the link between spatial skills and mathematics in different educational settings.

Author Contributions

Conceptualization, T.X., S.S. and Q.K.; Data curation, S.S.; Formal analysis, T.X. and S.S.; Investigation, T.X.; Methodology, T.X. and S.S.; Project administration, Q.K.; Resources, S.S. and Q.K.; Software, T.X.; Supervision, Q.K.; Validation, T.X., S.S. and Q.K.; Visualization, T.X.; Writing—original draft, T.X.; Writing—review & editing, T.X. and S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was approved by the Institutional Review Board (IRB) of East China Normal University (Protocol: HR663-2022, approved on 11 November 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The datasets used and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. The second-order three-factor model of spatial reasoning.
Table 1. Framework of the spatial reasoning test.
Construct | Subject | Content | Item
Mental Rotation (MR) | 2D rotation | Determining the outcome of a rotation of a 2D object; differentiating between reflection and rotation, clockwise and anticlockwise turn | MR2, MR4, MR5, MR7, MR8, MR11
 | 3D rotation | Determining the outcome of a rotation of a 3D object; differentiating between reflection and rotation | MR20, MR26, MR29
Spatial Orientation (SO) | Orientation and location | Determining the position of an object in the situation; determining the position of the object relative to that of another object or the observer | SO1, SO3, SO6, SO12, SO17
 | Alternate views | Front, top, or side view; identifying the orthogonal views of an object | SO9, SO15
 | Navigating with maps | Moving and reorienting in a forward or inverted map according to the given route | SO13, SO24, SO30
Spatial Visualization (SV) | Part–whole relationships | Identifying parts from the whole and vice versa | SV14, SV18
 | Reflection and symmetry | Finding the symmetry in an object; reflecting an object | SV3, SV4, SV5
 | Folding and cutting | Visualizing the outcome of folding/unfolding/cutting a particular configuration; identifying cross sections of 3D objects | SV25, SV28, SV32
 | Transformation between 2D and 3D | Constructing a 3D shape from a given 2D shape and vice versa | SV21, SV22, SV23, SV27, SV31
Table 2. Sample items in the spatial reasoning test.
Construct | Subject | Item
Mental Rotation (MR) | 3D mental rotation | 26. The diagram below represents a model made out of cubes. Which of the following is the same as the model? [item figure omitted]
Spatial Orientation (SO) | Navigating with maps | 24. A hamster was placed at the start of a maze, as shown below. The hamster ran through the maze. It turned to its right, then turned left, then turned right. Where did the hamster finish? [item figure omitted]
Spatial Visualization (SV) | Transformation between 2D and 3D | 21. Diagram 1 represents a rectangular piece of paper. Which of the following hollow 3D shapes cannot be obtained by folding this rectangular paper? [item figure omitted]
Table 3. Sample items in the mathematics test.
Item ID | Content Domain | Item
M051091 | Number | Which fraction is not equal to the others? [item figure omitted]
M051123 | Geometric Shapes and Measures | How many lines of symmetry does this figure have? [item figure omitted]
M051109 | Data Display | How many children chose vanilla as their favorite flavor? [item figure omitted]
Table 4. Fit indices from the CFA.
Model | χ2 | df | χ2/df | CFI | TLI | RMSEA
Second-order three-factor model | 653.664 | 461 | 1.418 | 0.918 | 0.912 | 0.030
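For readers who wish to reproduce a model of this kind (Figure 1, Table 4), the sketch below shows one possible specification in Python, assuming the semopy package (any SEM software with lavaan-style syntax would do). It is a minimal sketch, not the authors' code: the item labels follow the numbering in Tables 5 and 6, the data file and column names are hypothetical, and item scores are treated as continuous for simplicity.

```python
# A minimal sketch (not the authors' code) of the second-order three-factor CFA
# shown in Figure 1 and summarized in Table 4. Assumes the semopy package and a
# pandas DataFrame with one 0/1 column per item (file name is hypothetical).
import pandas as pd
import semopy

MODEL_DESC = """
MR =~ MR2 + MR4 + MR5 + MR7 + MR8 + MR11 + MR20 + MR26 + MR29
SO =~ SO1 + SO3 + SO6 + SO9 + SO12 + SO13 + SO15 + SO17 + SO24 + SO30
SV =~ SV10 + SV14 + SV16 + SV18 + SV19 + SV21 + SV22 + SV23 + SV25 + SV27 + SV28 + SV31 + SV32
SR =~ MR + SO + SV
"""

data = pd.read_csv("spatial_items.csv")  # hypothetical item-level data file
model = semopy.Model(MODEL_DESC)
model.fit(data)                          # items treated as continuous here
print(semopy.calc_stats(model).T)        # chi-square, df, CFI, TLI, RMSEA, ...
```

Under the usual marker-variable identification, a model with 32 indicators, three first-order factors, and one second-order factor has 67 free parameters and 32 × 33/2 − 67 = 461 degrees of freedom, matching the df in Table 4; the fit values themselves may differ slightly from those reported, depending on the estimator used for the dichotomous items.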
Table 5. Item difficulty (p) and item discrimination (D) based on classical test theory analysis.
Mental Rotation | | | Spatial Orientation | | | Spatial Visualization | |
Item | p | D | Item | p | D | Item | p | D
2 | 0.74 | 0.41 | 1 | 0.88 | 0.36 | 10 | 0.30 | 0.36
4 | 0.81 | 0.35 | 3 | 0.88 | 0.38 | 14 | 0.78 | 0.35
5 | 0.68 | 0.37 | 6 | 0.92 | 0.31 | 16 | 0.49 | 0.37
7 | 0.49 | 0.44 | 9 | 0.82 | 0.35 | 18 | 0.85 | 0.39
8 | 0.56 | 0.45 | 12 | 0.87 | 0.37 | 19 | 0.64 | 0.36
11 | 0.54 | 0.40 | 13 | 0.82 | 0.36 | 21 | 0.56 | 0.41
20 | 0.51 | 0.40 | 15 | 0.87 | 0.41 | 22 | 0.59 | 0.35
26 | 0.54 | 0.47 | 17 | 0.64 | 0.41 | 23 | 0.27 | 0.35
29 | 0.46 | 0.41 | 24 | 0.64 | 0.41 | 25 | 0.39 | 0.35
 | | | 30 | 0.46 | 0.29 | 27 | 0.37 | 0.52
 | | | | | | 28 | 0.38 | 0.36
 | | | | | | 31 | 0.66 | 0.35
 | | | | | | 32 | 0.72 | 0.37
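The difficulty (p) and discrimination (D) indices above are standard classical-test-theory quantities. The paper does not state which discrimination index was used, so the sketch below (a hypothetical Python helper, not the authors' code) computes p as the proportion correct and D as the corrected item–total correlation; an upper–lower 27% group difference would be another common choice.

```python
import numpy as np
import pandas as pd

def ctt_item_stats(responses: pd.DataFrame) -> pd.DataFrame:
    """Item difficulty and discrimination for dichotomous (0/1) item scores.

    p: proportion of students answering the item correctly.
    D: corrected item-total correlation (item vs. total score excluding the item).
    """
    total = responses.sum(axis=1)
    rows = []
    for item in responses.columns:
        p = responses[item].mean()
        rest = total - responses[item]                 # total score without the focal item
        d = np.corrcoef(responses[item], rest)[0, 1]
        rows.append({"Item": item, "p": round(p, 2), "D": round(d, 2)})
    return pd.DataFrame(rows)
```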
Table 6. Item information based on the Rasch model.
Item | Difficulty Estimate | SE | Outfit MNSQ | Outfit t | Infit MNSQ | Infit t
1 | −2.413 | 0.150 | 0.662 | −1.451 | 0.887 | −1.224
2 | −1.364 | 0.117 | 0.831 | −1.209 | 0.950 | −0.905
3 | −2.391 | 0.149 | 0.648 | −1.542 | 0.857 | −1.593
4 | −1.840 | 0.128 | 0.828 | −0.926 | 0.962 | −0.533
5 | −0.978 | 0.110 | 1.072 | 0.645 | 1.011 | 0.248
6 | −2.989 | 0.182 | 0.928 | −0.105 | 0.866 | −1.044
7 | 0.001 | 0.105 | 0.980 | −0.230 | 0.976 | −0.532
8 | −0.342 | 0.105 | 0.927 | −0.848 | 0.954 | −1.075
9 | −1.856 | 0.129 | 0.862 | −0.714 | 0.960 | −0.561
10 | 1.049 | 0.114 | 1.124 | 1.206 | 1.028 | 0.516
11 | −0.224 | 0.105 | 1.012 | 0.177 | 1.009 | 0.214
12 | −2.369 | 0.148 | 0.804 | −0.776 | 0.843 | −1.794
13 | −1.856 | 0.129 | 0.894 | −0.528 | 0.945 | −0.789
14 | −1.612 | 0.122 | 0.958 | −0.204 | 0.973 | −0.425
15 | −2.287 | 0.145 | 0.653 | −1.611 | 0.843 | −1.881
16 | 0.022 | 0.105 | 1.066 | 0.854 | 1.066 | 1.504
17 | −0.748 | 0.108 | 1.028 | 0.301 | 0.991 | −0.188
18 | −2.150 | 0.139 | 0.689 | −1.528 | 0.891 | −1.376
19 | −0.759 | 0.108 | 1.004 | 0.070 | 1.055 | 1.196
20 | −0.106 | 0.105 | 1.004 | 0.075 | 1.022 | 0.523
21 | −0.331 | 0.105 | 0.994 | −0.047 | 1.014 | 0.347
22 | −0.504 | 0.106 | 1.077 | 0.847 | 1.072 | 1.629
23 | 1.259 | 0.118 | 1.098 | 0.881 | 1.042 | 0.707
24 | −0.782 | 0.108 | 0.984 | −0.119 | 0.994 | −0.123
25 | 0.537 | 0.108 | 1.133 | 1.580 | 1.085 | 1.747
26 | −0.256 | 0.105 | 0.899 | −1.226 | 0.939 | −1.418
27 | 0.628 | 0.109 | 0.785 | −2.719 | 0.864 | −2.907
28 | 0.616 | 0.109 | 1.088 | 1.044 | 1.069 | 1.410
29 | 0.184 | 0.106 | 0.972 | −0.332 | 1.003 | 0.083
30 | 0.152 | 0.106 | 1.211 | 2.567 | 1.192 | 4.098
31 | −0.850 | 0.109 | 1.034 | 0.342 | 1.065 | 1.372
32 | −1.197 | 0.114 | 0.901 | −0.740 | 1.011 | 0.238
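The outfit and infit mean-square statistics in Table 6 follow the standard Rasch definitions: the unweighted and the information-weighted mean of the squared standardized residuals, respectively. The Python sketch below shows these formulas only; it assumes person ability (theta) and item difficulty (b) estimates have already been obtained from a Rasch calibration, and it does not reproduce the standardized t values, which require a further transformation.

```python
import numpy as np

def rasch_item_fit(X: np.ndarray, theta: np.ndarray, b: np.ndarray):
    """Outfit and infit mean-square fit statistics for dichotomous Rasch data.

    X     : persons x items matrix of 0/1 responses
    theta : person ability estimates (length: number of persons)
    b     : item difficulty estimates (length: number of items)
    """
    X = np.asarray(X, dtype=float)
    P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))  # model-expected scores
    W = P * (1.0 - P)                                         # Bernoulli variances
    sq_resid = (X - P) ** 2                                   # squared residuals
    outfit_mnsq = (sq_resid / W).mean(axis=0)                 # unweighted mean square
    infit_mnsq = sq_resid.sum(axis=0) / W.sum(axis=0)         # information-weighted mean square
    return outfit_mnsq, infit_mnsq
```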
Table 7. Descriptive statistics for eight variables.
 | Mean | SD | Min | Max | Skewness | Kurtosis | SE
Spatial measures
Overall spatial score | 19.09 | 5.17 | 6 | 31 | −0.02 | −0.58 | 0.18
Mental rotation | 5.11 | 2.28 | 0 | 9 | −0.1 | −0.8 | 0.08
Spatial orientation | 7.38 | 1.81 | 1 | 10 | −0.59 | −0.08 | 0.06
Spatial visualization | 6.6 | 2.3 | 1 | 13 | 0.01 | −0.4 | 0.08
Mathematics measures
Overall mathematics score | 29.56 | 5.88 | 6 | 38 | −1.09 | 1.13 | 0.21
Number | 15.7 | 3.48 | 1 | 20 | −1.06 | 1.11 | 0.12
Geometric shapes and measures | 10 | 2.52 | 2 | 14 | −0.75 | 0.14 | 0.09
Data display | 3.46 | 0.79 | 0 | 4 | −1.53 | 2.19 | 0.03
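The SE column in Table 7 appears to be the standard error of the mean (for example, 5.17/√816 ≈ 0.18 for the overall spatial score). The sketch below shows one way to reproduce these descriptive statistics in Python; the bias-corrected skewness and excess-kurtosis definitions are assumptions, since the paper does not name the software it used.

```python
import numpy as np
import pandas as pd
from scipy import stats

def describe(scores: pd.Series) -> dict:
    """Descriptive statistics matching the columns of Table 7 (assumed definitions)."""
    n = scores.count()
    sd = scores.std(ddof=1)
    return {
        "Mean": scores.mean(),
        "SD": sd,
        "Min": scores.min(),
        "Max": scores.max(),
        "Skewness": stats.skew(scores, bias=False),
        "Kurtosis": stats.kurtosis(scores, bias=False),  # excess kurtosis
        "SE": sd / np.sqrt(n),                           # standard error of the mean
    }
```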
Table 8. Descriptive statistics for eight variables by gender.
 | Male (N = 432) | | | | | Female (N = 384) | | | |
 | Mean | SD | Range | Skewness | Kurtosis | Mean | SD | Range | Skewness | Kurtosis
Spatial measures
Overall spatial score | 19.53 | 5.35 | 6–31 | −0.08 | −0.66 | 18.59 | 4.93 | 6–31 | 0.01 | −0.5
Mental rotation | 5.45 | 2.25 | 0–9 | −0.27 | −0.67 | 4.72 | 2.25 | 0–9 | 0.08 | −0.81
Spatial orientation | 7.46 | 1.82 | 1–10 | −0.6 | −0.1 | 7.28 | 1.78 | 1–10 | −0.58 | −0.07
Spatial visualization | 6.61 | 2.48 | 1–13 | 0 | −0.55 | 6.59 | 2.09 | 2–12 | 0.04 | −0.31
Mathematics measures
Overall mathematics score | 29.81 | 5.95 | 7–38 | −1.09 | 1.03 | 29.29 | 5.81 | 6–38 | −1.1 | 1.25
Number | 15.99 | 3.42 | 2–20 | −1.12 | 1.25 | 15.38 | 3.51 | 1–20 | −1 | 1
Geometric shapes and measures | 10.43 | 2.51 | 2–14 | −0.75 | 0.16 | 10.37 | 2.52 | 2–14 | −0.75 | 0.1
Data display | 3.39 | 0.84 | 0–4 | −1.47 | 2.07 | 3.54 | 0.73 | 1–4 | −1.54 | 1.82
Table 9. Correlations among the variables.
 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8
Spatial measures
1. Overall spatial score | -
2. Mental rotation | 0.82 *** | -
3. Spatial orientation | 0.76 *** | 0.44 *** | -
4. Spatial visualization | 0.84 *** | 0.51 *** | 0.48 *** | -
Mathematics measures
5. Overall mathematics score | 0.58 *** | 0.48 *** | 0.44 *** | 0.49 *** | -
6. Number | 0.52 *** | 0.43 *** | 0.41 *** | 0.42 *** | 0.93 *** | -
7. Geometric shapes and measures | 0.55 *** | 0.48 *** | 0.39 *** | 0.47 *** | 0.86 *** | 0.62 *** | -
8. Data display | 0.29 *** | 0.18 *** | 0.25 *** | 0.29 *** | 0.63 *** | 0.52 *** | 0.44 *** | -
*** p < 0.001.
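As a reproducibility aid, the hypothetical sketch below computes a Pearson correlation matrix of this kind together with p-values, assuming the eight scores are columns of a pandas DataFrame; it is only one way to obtain the values in Table 9.

```python
import numpy as np
import pandas as pd
from scipy import stats

def correlation_table(df: pd.DataFrame):
    """Pairwise Pearson correlations (rounded to 2 dp) and their p-values."""
    cols = list(df.columns)
    r = pd.DataFrame(np.eye(len(cols)), index=cols, columns=cols)
    p = pd.DataFrame(np.zeros((len(cols), len(cols))), index=cols, columns=cols)
    for i, a in enumerate(cols):
        for j, b in enumerate(cols):
            if i < j:
                r_ab, p_ab = stats.pearsonr(df[a], df[b])
                r.loc[a, b] = r.loc[b, a] = round(r_ab, 2)
                p.loc[a, b] = p.loc[b, a] = p_ab
    return r, p
```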
Table 10. Predictors of overall mathematical performance.
Model | F | Adj. R2 | β | SE | t
Overall Sample Regression (N = 816)
Step 1: Simple Linear Regression
Overall spatial score | 421.5 *** | 0.34 | 0.66 | 0.03 | 20.53 ***
Step 2: Multiple Regression | 140.2 *** | 0.34 | | |
Mental rotation | | | 0.69 | 0.09 | 7.74 ***
Spatial orientation | | | 0.67 | 0.11 | 6.10 ***
Spatial visualization | | | 0.64 | 0.09 | 7.09 ***
Subsample Regression: Male (N = 432), Female (N = 384)
Step 3: Simple Linear Regression
Male: Overall spatial score | 240.8 *** | 0.36 | 0.67 | 0.04 | 15.52 ***
Female: Overall spatial score | 177.6 *** | 0.32 | 0.66 | 0.05 | 13.33 ***
Step 4: Multiple Regression
Male | 80.89 *** | 0.36 | | |
Mental rotation | | | 0.59 | 0.12 | 4.77 ***
Spatial orientation | | | 0.86 | 0.15 | 5.75 ***
Spatial visualization | | | 0.60 | 0.12 | 5.11 ***
Female | 60.69 *** | 0.32 | | |
Mental rotation | | | 0.81 | 0.13 | 6.17 ***
Spatial orientation | | | 0.45 | 0.16 | 2.78 **
Spatial visualization | | | 0.68 | 0.14 | 4.77 ***
*** p < 0.001; ** p < 0.01.
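Tables 10–13 all follow the same two-step design: a simple regression of the outcome on the overall spatial score, then a multiple regression on the three spatial constructs, each run for the full sample and for the male and female subsamples. The sketch below outlines this pipeline with statsmodels; the variable names are assumptions, and the reported β values appear to be unstandardized coefficients (for Table 10, 0.58 × 5.88/5.17 ≈ 0.66), so `.params` is the matching quantity.

```python
import pandas as pd
import statsmodels.formula.api as smf

CONSTRUCTS = "mental_rotation + spatial_orientation + spatial_visualization"

def spatial_math_regressions(df: pd.DataFrame, outcome: str = "math_total") -> dict:
    """Simple and multiple regressions, overall and by gender (hypothetical column names)."""
    results = {
        "overall_simple": smf.ols(f"{outcome} ~ spatial_total", data=df).fit(),
        "overall_multiple": smf.ols(f"{outcome} ~ {CONSTRUCTS}", data=df).fit(),
    }
    for sex, sub in df.groupby("gender"):
        results[f"{sex}_simple"] = smf.ols(f"{outcome} ~ spatial_total", data=sub).fit()
        results[f"{sex}_multiple"] = smf.ols(f"{outcome} ~ {CONSTRUCTS}", data=sub).fit()
    return results

# Each fitted model exposes .fvalue, .rsquared_adj, .params, .bse and .tvalues,
# corresponding to the F, Adj. R2, beta, SE and t columns reported above.
```

Swapping the hypothetical `outcome` column for the number, geometric shapes and measures, or data display score reproduces the structure of Tables 11–13.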
Table 11. Predictors of performance in number.
Model | F | Adj. R2 | β | SE | t
Overall Sample Regression (N = 816)
Step 1: Simple Linear Regression
Overall spatial score | 301.7 *** | 0.27 | 0.35 | 0.02 | 17.37 ***
Step 2: Multiple Regression | 101 *** | 0.27 | | |
Mental rotation | | | 0.37 | 0.06 | 6.68 ***
Spatial orientation | | | 0.41 | 0.07 | 5.98 ***
Spatial visualization | | | 0.29 | 0.06 | 5.19 ***
Subsample Regression: Male (N = 432), Female (N = 384)
Step 3: Simple Linear Regression
Male: Overall spatial score | 170.1 *** | 0.28 | 0.34 | 0.03 | 13.04 ***
Female: Overall spatial score | 126.3 *** | 0.25 | 0.36 | 0.03 | 11.24 ***
Step 4: Multiple Regression
Male | 57.6 *** | 0.28 | | |
Mental rotation | | | 0.27 | 0.07 | 3.55 ***
Spatial orientation | | | 0.47 | 0.09 | 5.55 ***
Spatial visualization | | | 0.32 | 0.07 | 4.46 ***
Female | 42.65 *** | 0.25 | | |
Mental rotation | | | 0.45 | 0.08 | 5.45 ***
Spatial orientation | | | 0.33 | 0.10 | 3.24 **
Spatial visualization | | | 0.27 | 0.09 | 2.98 **
*** p < 0.001; ** p < 0.01.
Table 12. Predictors of performance in geometric shapes and measures.
Model | F | Adj. R2 | β | SE | t
Overall Sample Regression (N = 816)
Step 1: Simple Linear Regression
Overall spatial score | 361.4 *** | 0.31 | 0.27 | 0.01 | 19.01 ***
Step 2: Multiple Regression | 121.5 *** | 0.31 | | |
Mental rotation | | | 0.32 | 0.04 | 8.17 ***
Spatial orientation | | | 0.20 | 0.05 | 4.22 ***
Spatial visualization | | | 0.27 | 0.04 | 6.87 ***
Subsample Regression: Male (N = 432), Female (N = 384)
Step 3: Simple Linear Regression
Male: Overall spatial score | 291.8 *** | 0.34 | 0.273 | 0.02 | 14.83 ***
Female: Overall spatial score | 145.7 *** | 0.27 | 0.269 | 0.02 | 12.07 ***
Step 4: Multiple Regression
Male | 57.6 *** | 0.34 | | |
Mental rotation | | | 0.313 | 0.05 | 5.88 ***
Spatial orientation | | | 0.308 | 0.06 | 4.78 ***
Spatial visualization | | | 0.215 | 0.05 | 4.26 ***
Female | 42.65 *** | 0.28 | | |
Mental rotation | | | 0.346 | 0.06 | 5.94 ***
Spatial orientation | | | 0.084 | 0.07 | 1.18
Spatial visualization | | | 0.331 | 0.06 | 5.24 ***
*** p < 0.001.
Table 13. Predictors of performance in data display.
Model | F | Adj. R2 | β | SE | t
Overall Sample Regression (N = 816)
Step 1: Simple Linear Regression
Overall spatial score | 77.15 *** | 0.10 | 0.05 | 0.01 | 8.78 ***
Step 2: Multiple Regression | 29.82 *** | 0.10 | | |
Mental rotation | | | 0.00 | 0.01 | 0.12
Spatial orientation | | | 0.06 | 0.02 | 3.43 ***
Spatial visualization | | | 0.08 | 0.01 | 5.43 ***
Subsample Regression: Male (N = 432), Female (N = 384)
Step 3: Simple Linear Regression
Male: Overall spatial score | 53.51 *** | 0.11 | 0.05 | 0.01 | 7.32 ***
Female: Overall spatial score | 29.31 *** | 0.07 | 0.04 | 0.01 | 5.41 ***
Step 4: Multiple Regression
Male | 19.61 *** | 0.34 | | |
Mental rotation | | | 0.01 | 0.02 | 0.54
Spatial orientation | | | 0.09 | 0.02 | 3.42 ***
Spatial visualization | | | 0.07 | 0.02 | 3.40 ***
Female | 11.31 *** | 0.29 | | |
Mental rotation | | | 0.01 | 0.02 | 0.49
Spatial orientation | | | 0.03 | 0.02 | 1.33
Spatial visualization | | | 0.08 | 0.02 | 3.78 ***
*** p < 0.001.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
