Article

Examining Pre-Service Chemistry Teachers’ Technological Pedagogical Content Knowledge (TPACK) of Using Data-Logging in the Chemistry Classroom

1 School of Chemistry, South China Normal University, Guangzhou 510631, China
2 Department of Mathematics and Information Technology, The Education University of Hong Kong, Hong Kong SAR, China
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(21), 15441; https://doi.org/10.3390/su152115441
Submission received: 26 September 2023 / Revised: 24 October 2023 / Accepted: 25 October 2023 / Published: 30 October 2023

Abstract:
In recent decades, the integration of technology, particularly data logging, has become a cornerstone of effective science teaching, placing increased emphasis on the cultivation of teachers’ Technological Pedagogical Content Knowledge (TPACK). While the TPACK framework has garnered attention in science education, limited research has explored the interplay of TPACK components within a constructivist context, particularly in the context of data-logging-supported chemistry class. To bridge this research gap, this study conducted a comprehensive survey on TPACK with 181 pre-service chemistry teachers, probing their TPACK as it pertains to data logging. Results from both exploratory and confirmatory factor analyses underscored the reliability and validity of the survey instrument. Subsequently, structural equation modeling was employed to illuminate the intricate relationships among various facets of TPACK. Findings suggested a sequential developmental pattern within TPACK, with teachers’ Technological Pedagogical Knowledge (TPK), Technological Content Knowledge (TCK), and Pedagogical Content Knowledge (PCK) all exerting a positive influence on their overall TPACK. Moreover, this study unveiled a significant correlation between pre-service chemistry teachers’ data-logging TPACK and their capacity for design thinking. Interestingly, gender differences in TPACK were negligible. In light of these findings, this study not only contributes to our understanding of TPACK development but also has practical implications for nurturing pre-service chemistry teachers’ proficiency in TPACK when using data-logging.

1. Introduction

Since its inception, the Technological Pedagogical Content Knowledge (TPACK) framework has garnered considerable attention in teacher education, serving as a pivotal framework for enhancing educators’ professional competence and guiding teacher training initiatives [1]. TPACK, a crucial measure of teachers’ pedagogical content knowledge in integrating technologies, consists of three primary factors and four composite factors that arise from their intersection. Prior research has validated the viability of this seven-factor structure within the TPACK model [2,3].
Numerous scholars have dedicated their efforts to probing the development of teachers’ TPACK and engaging in comprehensive discussions regarding the structural interplay among TPACK components [1,4,5,6,7]. These studies provide valuable insights into the developmental trajectory of TPACK. Quantitative research has primarily concentrated on creating instruments to assess teachers’ TPACK [8,9,10], acknowledging the inherent complexity in establishing connections between the TPACK model’s structure and its constituent elements. Furthermore, the distinctive characteristics of various disciplines result in variations in the teaching tools and methodologies employed in the classroom. While most pre-service teachers possess a foundational understanding of TPACK, their proficiency levels require further refinement, as indicated by assessments of their TPACK competence [5,9,11]. To bolster teachers’ overall TPACK proficiency, it is imperative to implement targeted strategies across various dimensions. Consequently, researchers should consider multiple factors, including gender and design thinking, that exert influence on TPACK levels [12,13,14,15].
Nonetheless, research on tracing the developmental trajectory of TPACK remains limited, with its findings subject to ongoing examination and validation. The identification and comprehension of these seven factors remain intricate theoretical facets that necessitate further refinement. Furthermore, there is currently a dearth of TPACK assessment tools tailored to specific technologies in science education. Thus, it is imperative to develop TPACK measurement instruments for specific disciplines and technologies, enabling the assessment of teachers’ TPACK proficiency in those specific domains.
Given the aforementioned research gaps, this study employs a quantitative research approach to assess and discuss pre-service teachers’ TPACK levels in using data-logging in chemistry classes and their relationship with design thinking. Additionally, it evaluates the reliability and validity of the questionnaire and the suitability of the seven-factor model in the context of data-logging-supported chemistry classes. Ultimately, the study endeavors to provide implications for the developmental pathway of TPACK in technology-supported chemistry education.

2. Literature Review

2.1. The Composition and Framework of TPACK

The TPACK framework, illustrated in Figure 1, consists of three components: Content Knowledge (CK), Pedagogical Knowledge (PK), and Technological Knowledge (TK). Additionally, it encompasses four composite factors that arise from the intersection of these three components. These composite factors include Pedagogical Content Knowledge (PCK), Technological Content Knowledge (TCK), Technological Pedagogical Knowledge (TPK), and Technological Pedagogical Content Knowledge (TPACK) [16]. Within the theoretical framework of TPACK, Mishra and Koehler [1] underscored the pivotal role of technology in subject-specific instruction when delineating these seven TPACK variables. For definitions and examples of these seven components, please refer to Table 1.
Researchers have strived to underpin the TPACK framework outlined by Mishra and Koehler [1] with empirical evidence that aligns with the theoretical characterization of its components. However, early investigations encountered challenges in disentangling the seven components articulated in the framework through the use of self-assessment questionnaires intended to gauge teachers’ TPACK proficiency. These challenges were primarily attributed to the interplay within the framework [17,18,19].
For instance, Schmidt [20] and Sahin [21] undertook factor analyses in their research, identifying seven factors. Nevertheless, the process of extracting each factor individually during the analysis lacked standardization and rigor, and the research sample size was too modest, thereby constraining the robustness of their findings. Similarly, certain large-scale TPACK surveys have struggled to exhibit strong structural validity through factor analysis. For example, Koh et al. [22] assessed the TPACK levels of 1185 pre-service teachers in Singapore utilizing a modified questionnaire derived from Schmidt et al. [20]. Their exploratory factor analysis yielded only five factors, with CK and TK identified as distinct factors. The remaining technology-related factors (TPK, TCK, and TPACK) were conglomerated into one factor, while non-technology-dependent PK and PCK were grouped as another factor [22]. Similar findings have surfaced in other studies employing factor analysis [23,24,25].
Building on these earlier findings, Cox and Graham [17] embarked on an exploration of the TPACK framework, shedding light on the characteristics of several components. Subsequently, Chai et al. [26] expanded upon Cox and Graham’s conceptual analysis by leveraging exploratory and confirmatory factor analysis. They effectively identified all seven factors and uncovered an additional factor associated with a second CK, reflecting the fact that pre-service teachers in Singapore are trained to teach a second subject, although not all educators receive training in a second teaching discipline. To substantiate the TPACK theoretical framework, Chai et al. [26] further refined the initial questionnaire and assessed the TPACK levels of pre-service teachers in Asia. Through exploratory and confirmatory factor analysis, they successfully elucidated seven meaningful variables. Collectively, these studies highlight that, while the theoretical framework of TPACK is becoming clearer and empirical research supports its seven-factor structure, TPACK remains an intricate construct necessitating continued development and validation. This study aims to contribute to this ongoing endeavor by conducting exploratory and confirmatory factor analyses.

2.2. Factor Structure Relationship of TPACK

Efforts to bolster teachers’ overall TPACK have prompted the proposal of targeted developmental strategies for various factors. Researchers have conducted comprehensive discussions concerning the structural interplay among TPACK factors and have explored pathways to cultivate teachers’ TPACK. However, the academic community grapples with two perspectives regarding the development and structural relationships of TPACK factors [27,28]. The first perspective, known as the integrated view, posits that the central component of the TPACK framework emerges from the integration of a teacher’s other knowledge components (CK, PK, TK, PCK, TPK, TCK). Accordingly, TPACK is intricately linked to each of these domains, with a high level of TPACK expected to correlate with elevated levels of TPK, TCK, PCK, TK, PK, and CK. In contrast, the second perspective, known as the transformative view, accentuates the convergence of knowledge components into a distinct and intricate knowledge construct. According to this view, TPACK represents a unique form of knowledge that transcends the amalgamation of its individual constituents. In the transformative view, TK, PK, and CK do not exert direct influence on TPACK; instead, TPK, TCK, and PCK have a direct impact on the formation of TPACK [6]. These contrasting viewpoints reflect divergent interpretations of the relationships among the various components of TPACK, underscoring the complexity inherent in understanding the formation of TPACK.
Further research is essential to elucidate the structural relationships and developmental trajectories of TPACK factors within the divergent perspectives outlined above. While Mishra and Koehler [1] advocate for the transformative view of TPACK and offer a theoretical framework for it, empirical validation of this perspective remains scarce [29,30,31]. Studies employing structural equation models to probe the interconnections within the TPACK knowledge domain have produced somewhat inconclusive results [22,32,33,34]. While the prevailing body of research aligns with the second developmental pathway, some investigations have indicated that TCK, TPK, and PCK may not directly and positively influence TPACK [6,32,33].
For instance, Dong et al. [33] conducted a comparative analysis of pre-service teachers and in-service teachers, revealing that TPK and TCK served as robust predictors of TPACK, while PCK exhibited no direct impact on TPACK. Additionally, they discovered that CK had no direct effect on PCK among pre-service teachers. Similarly, Koh et al. [22] observed that other factors positively predicted teachers’ TPACK, but CK and PCK did not exhibit such an effect. Pamuk et al. [34] contended that TCK, TPK, and PCK, with TPK and TCK being the most influential contributors, exerted a more positive influence on TPACK compared to TK, PK, and CK. While these studies have offered empirical support for the examination of the TPACK developmental pathway, the research on path analysis of TPACK remains relatively limited, and the findings have yet to achieve consensus. Consequently, this study seeks to further explore and corroborate these findings through the application of a structural equation model.

2.3. TPACK Measurement Tools

The development of measurement tools for assessing teachers’ TPACK has garnered significant attention in research due to the uncertainties of the relationship between the TPACK model structure and its internal factors [26,35,36,37,38,39]. Currently, the most commonly used method to measure TPACK is self-reporting [6,37,38,40,41]. Self-report tools offer the advantage of quickly and inexpensively gathering large amounts of quantitative data [42]. However, in addition to the inherent limitations of self-reporting, such as social expectation bias, response bias, subjective or misunderstood items, and response limitations due to fixed-choice problems [42], there are specific issues with current self-report methods used to measure TPACK [35]. First, a major problem is the lack of a common definition for each factor within TPACK, leading to greater heterogeneity in self-report tools. Second, self-report questionnaires evaluating TPACK tend to be lengthy [36,39,43] with around 30 to 50 items, which can be time-consuming and stressful for participants to complete. Third, some tools exhibit an uneven distribution of items across each subscale, resulting in different measurement accuracies for different factors [34]. Fourth, very few self-report instruments consider the specific educational contexts and teachers’ use of technology, as they often prioritize universality over contextual considerations. These challenges highlight the need for improved measurement tools that address the specific issues associated with self-reporting TPACK.
The TPACK measurement scale developed by Schmidt, based on the seven-factor model proposed by Mishra and Koehler [1], is currently the most widely used self-report tool for assessing TPACK [20]. This scale has demonstrated good reliability and validity and has served as a benchmark for many subsequent studies. For example, Chai et al. [26] modified Schmidt’s scale by converting the original five-point scale into a seven-point scale. They also improved the differentiation of each knowledge component by revising the item descriptions, such as adding descriptions of “no use of technology” to the PCK items. Such modifications can contribute to better reliability and structural validity. Additionally, investigations into specific pedagogical uses of technology can enhance the structural validity of the TPACK self-report tool, as demonstrated in the work by Koh et al. [22]. This is because different disciplines have distinct characteristics, and the teaching tools and techniques employed in the instructional process can vary. Moreover, TPACK is a type of knowledge that is closely tied to specific curriculum themes and teaching practices [17], thus necessitating the consideration of particular educational contexts when assessing TPACK. It is important to acknowledge the influence of these contexts on teachers’ TPACK and to develop measurement tools that account for these contextual factors. This can help ensure the accuracy and relevance of TPACK assessments in various educational settings.

2.4. Current Status of TPACK and Design Thinking for TPACK

In addition to the mentioned studies on the composition and structure of TPACK, researchers have also focused on measuring the levels of the seven factors in TPACK through self-report measures [44,45]. These studies often categorize teachers into different stages, such as recognition, acceptance, adaptation, exploration, and promotion [46]. Furthermore, they differentiate between in-service teachers and pre-service teachers in assessing TPACK levels. Studies examining the TPACK levels of in-service teachers have shown that although teachers generally have some understanding and willingness to integrate technology into their teaching practices and have reached the stage of adaptation, their overall TPACK levels have not reached high-level exploration and promotion. For instance, Yeh et al. [45] found that science teachers, while having a certain understanding of TPACK, still need to strengthen their application and innovative abilities in their teaching practices. Moreover, considering the distinct characteristics of different subjects, there are variations in the teaching tools and techniques employed in different subject areas.
In data-logging techniques, physical parameters are sensed, measured, and recorded in experimental situations using electronic instruments [47]. The data can be recorded remotely by the data logger and input into a computer. The data can be shown in tables and graphs, and recent specialist data-logging software also enables in-depth data analysis. It is now possible to complete the entire cycle of data collection, storage, display, and analysis in a variety of experimental situations [47]. Data-logging is now widely used in chemistry classes and plays a vital role in course design [48]. However, it is still unknown if teachers are able to integrate the course materials and data-logging technology well.
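As a concrete illustration of the collection–storage–display–analysis cycle described above, the following sketch parses a logged temperature run and computes summary quantities. The file layout and column names are hypothetical, not taken from any particular data-logging package:

```python
import csv
import io
import statistics

# Hypothetical export from data-logging software: time (s) vs. temperature (°C),
# e.g. a probe monitoring an exothermic reaction in a chemistry class.
raw = io.StringIO(
    "time_s,temp_c\n"
    "0,21.0\n30,24.5\n60,28.1\n90,31.0\n120,33.2\n"
)
rows = [(float(r["time_s"]), float(r["temp_c"])) for r in csv.DictReader(raw)]

temps = [t for _, t in rows]
mean_temp = statistics.mean(temps)
# Average warming rate over the whole run (°C per second)
rate = (rows[-1][1] - rows[0][1]) / (rows[-1][0] - rows[0][0])
```

In a classroom, the same few lines let students move from raw logger output to a quantitative claim (here, an average heating rate) without manual tabulation.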
Based on the aforementioned results, it is clear that the TPACK levels of both in-service and pre-service teachers are not satisfactory. To improve overall TPACK levels and develop targeted strategies for the enhancement of various factors, researchers have begun to investigate the various factors that influence TPACK levels [49,50,51]. In most studies, gender has been found to have no significant impact on teachers’ TPACK levels [14]. However, some research in the field of computer technology indicates that men may have a higher level of technical knowledge (TK) compared to women [52,53]. Early studies examining the factors influencing TPACK can be categorized into objective and subjective factors. Objective factors primarily refer to TPACK-related training courses and technological support in the teaching environment.
However, research suggests that even after removing the barriers to teachers’ use of technology, there is no significant change in their integration of technology into teaching practices [54]. On the other hand, subjective factors pertain to teachers’ basic concepts of integrating technology. Specifically, it refers to teachers’ acceptance of using technology in teaching activities to optimize teaching effects. Some researchers have linked teachers’ TPACK levels to their beliefs and argue that teachers’ initiative and emphasis on integrating technology in teaching have a significant impact on the development of their TPACK [49]. Considering these factors can help inform the design of professional development programs and interventions aimed at enhancing teachers’ TPACK levels. It is important to address both objective and subjective factors, such as providing relevant training and support, while also fostering positive beliefs and attitudes towards technology integration in teaching.
Indeed, the lack of design thinking has been identified as the third-order obstacle to technology integration, following the first-order technological obstacle and the second-order teacher belief [51,55]. Design thinking (DT) refers to the cognitive processes undertaken by teachers when designing technology-integrated chemistry curricula. It reflects teachers’ intentions and beliefs in actively integrating knowledge from various perspectives for curriculum design [56,57]. In traditional teaching approaches, teachers often view teaching as the transmission of information and may not give adequate attention to designing learning environments and activities that promote students’ knowledge construction. However, the rapid pace of technological change presents new challenges to traditional teaching methods. Several studies have highlighted that TPACK is essentially design knowledge [26], and design thinking is a crucial aspect to consider among the factors influencing TPACK. Some researchers have conducted correlation analyses between design thinking and teachers’ TPACK levels, and the results have demonstrated a significant correlation between teachers’ TPACK levels and design thinking [15,55,58]. However, empirical studies on design thinking as an influencing factor of TPACK are currently limited.
Therefore, this study aims to analyze the correlation between teachers’ TPACK levels and design thinking, with the objective of providing empirical data to support these perspectives. By exploring the relationship between design thinking and TPACK, this study seeks to shed light on the importance of design thinking in enhancing teachers’ TPACK and inform the development of effective strategies for integrating technology in chemistry education.

3. Research Purpose and Questions

Building upon the literature review, this study adopts the TPACK seven-factor model as its theoretical foundation, shaping the TPACK scale tailored for preservice chemistry teachers integrating data-logging technology in chemistry classes. The study aims to address the following research questions:
  • Does the survey demonstrate reliability and validity in measuring preservice chemistry teachers’ TPACK related to data-logging technology?
  • What are the levels of the seven TPACK factors specific to data-logging technology among preservice chemistry teachers, and what developmental trajectory characterizes the acquisition of TPACK in data-logging technology among preservice chemistry teachers?
  • Do gender and design thinking exert any discernible influence on these levels?

4. Methods

4.1. Participants

This study recruited 181 pre-service chemistry teachers from a reputable university in Guangdong Province. The participants were majoring in chemistry education, a program whose major aim is to cultivate future chemistry teachers. All 181 pre-service teachers who participated in the survey returned valid responses, for a validation rate of 100%. The participant cohort comprised 135 females and 46 males, all in their third year of a 4-year university program. These individuals had successfully completed mandatory chemistry courses and teacher education-related modules, equipping them with fundamental knowledge and pedagogical skills in chemistry. Additionally, they were introduced to data-logging technology and TPACK concepts as part of a dedicated course. All participants volunteered for the study, were fully informed of the content of the research, and were free to withdraw at any time.

4.2. Instrumentation and Procedures

In this study, two instruments were employed: the TPACK survey developed by Chai et al. [43] and an assessment tool designed to gauge design thinking in the context of chemistry teaching. To tailor the TPACK survey to the specific context of this research, adaptations were made. It focused on two key aspects: content-related components specific to the field of chemistry (CK, PCK, TCK, and TPACK), and technology-related components pertaining to data-logging technologies (TK, TCK, TPK, and TPACK). The revised survey consisted of 24 items, with each TPACK component featuring 3 to 5 items. Responses were recorded on a 7-point Likert scale. The TPACK survey’s reliability was evaluated using Cronbach’s alpha values, which ranged from 0.79 to 0.93, signifying a satisfactory level of internal consistency.
The assessment tool designed to evaluate design thinking in chemistry teaching, previously utilized for assessing experienced chemistry teachers in middle schools, also exhibited strong reliability, with a Cronbach’s alpha value of 0.92. To enhance content validity, the research instrument underwent a meticulous evaluation by an education professor with expertise in TPACK. This assessment encompassed the scientific rigor of each factor dimension, the precision and relevance of item design, and the accuracy and standardization of item expression. The questionnaire was then refined based on expert feedback to ensure its validity and readiness for use in the formal research phase.
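Cronbach’s alpha, the internal-consistency coefficient used for both instruments, can be computed directly from raw item responses. A minimal sketch with made-up Likert-scale data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of summed scale)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy subscale: 5 respondents x 3 items on a 7-point Likert scale (synthetic data)
scores = np.array([
    [7, 6, 7],
    [5, 5, 6],
    [3, 4, 3],
    [6, 6, 7],
    [2, 3, 2],
])
alpha = cronbach_alpha(scores)
```

With highly consistent items like these, alpha lands well above the conventional 0.7 acceptability threshold; perfectly identical items yield exactly 1.0.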
During the course, the instructor, who also served as the first author of this study, commenced by presenting the course objectives and outlining the expectations for the pre-service chemistry teachers. Subsequently, each participant received a survey tool comprising seven TPACK subscales and Instructional Design Thinking subscales. The instructor guided the participants through each questionnaire item, offering clarifications whenever necessary, prior to their completion of the survey. On average, it required approximately 25 min for participants to finish the questionnaires. Importantly, all participants engaged in the study willingly, resulting in a total collection of 181 questionnaires.
Following the course, participants were tasked with crafting a chemistry lesson that drew from the principles of data-logging technology experiments. They were encouraged to submit their instructional designs for evaluation. Moreover, as part of a deeper exploration into the influence of TPACK on instructional design, two participants were randomly selected for unstructured interviews. During this interview, the interviewer engaged in a more open-ended and flexible conversation with the interviewee. The goal was to allow the interviewee to speak freely and express their thoughts and experiences related to TPACK without being constrained by specific questions.

4.3. Data Analysis

To address the first research question, we initiated a two-step process. First, a Principal Component Analysis (PCA) with varimax rotation was used to scrutinize the seven-factor structure of our tool. Subsequently, a Confirmatory Factor Analysis (CFA) was conducted to further validate this seven-factor structure. As recommended by Hair et al. [59] and Kline and Santor [60], it is prudent to consider multiple fit indices to assess the acceptability of model-data fit. In this study, we examined five indices: χ2/df (less than 3.0), Comparative Fit Index (CFI) (greater than 0.90), Tucker-Lewis Index (TLI) (greater than 0.90), Root Mean Square Error of Approximation (RMSEA) (less than 0.07), and Standardized Root Mean Square Residual (SRMR) (less than 0.08).
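The χ²-based checks above can be sketched as two small functions. The RMSEA formula is the standard one computed from the model χ², its degrees of freedom, and sample size; the χ², df, and index values fed in below are illustrative, not the study’s:

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root Mean Square Error of Approximation:
    sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def fit_acceptable(chi2, df, cfi, tli, rmsea_val, srmr) -> bool:
    """Apply the cutoffs listed in the text: chi2/df < 3.0,
    CFI > 0.90, TLI > 0.90, RMSEA < 0.07, SRMR < 0.08."""
    return (chi2 / df < 3.0 and cfi > 0.90 and tli > 0.90
            and rmsea_val < 0.07 and srmr < 0.08)

# Hypothetical solution: chi2 = 400 on 200 df with n = 181 respondents
example_rmsea = rmsea(400, 200, 181)
ok = fit_acceptable(400, 200, cfi=0.95, tli=0.94, rmsea_val=0.05, srmr=0.04)
```

Note that a model whose χ² does not exceed its degrees of freedom gets an RMSEA of exactly zero, which is why the `max(..., 0)` clamp appears in the formula.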
Convergent validity was assessed using the factor loadings obtained from both PCA and CFA, while the results of the model fit test offered insights into the discriminant validity of TPACK. Additionally, standardized factor loadings enabled the calculation of Average Variance Extracted (AVE) and Construct Reliability (CR), two further indicators of convergent validity. AVE is computed as the sum of the squared standardized factor loadings divided by the number of items. CR is computed as the square of the summed factor loadings divided by that same quantity plus the sum of the items’ error variance terms [59]. To evaluate discriminant validity, we compared the square roots of the AVEs with the estimated correlations between factors.
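The AVE and construct-reliability computations described here reduce to a few lines of arithmetic on standardized loadings. The loadings below are hypothetical, chosen only to show the mechanics:

```python
import numpy as np

def ave(loadings) -> float:
    """Average Variance Extracted: mean of squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float((lam ** 2).sum() / lam.size)

def construct_reliability(loadings) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each standardized item's error variance is 1 - loading^2."""
    lam = np.asarray(loadings, dtype=float)
    errors = 1.0 - lam ** 2
    s = lam.sum()
    return float(s ** 2 / (s ** 2 + errors.sum()))

# Hypothetical standardized loadings for one four-item factor
loadings = [0.77, 0.81, 0.85, 0.90]
factor_ave = ave(loadings)
factor_cr = construct_reliability(loadings)
```

For discriminant validity, the square root of `factor_ave` would then be compared against that factor’s correlations with the other six factors, as the text describes.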
If the measurement model demonstrated an acceptable fit, we proceeded to conduct a higher-order Confirmatory Factor Analysis to address the second research question. This involved testing the goodness-of-fit of the “second-order” structure of the TPACK seven-factor model to ascertain whether the data supported Mishra and Koehler’s model. The assessment criteria and procedures remained consistent with those outlined in the first research question, following the guidelines provided by [59].
To address the second research question, we employed a Structural Equation Model (SEM) after hypothesis formulation and confirmation of the measurement model based on our literature review. The SEM entailed a path analysis to investigate the relationships between the seven TPACK model variables and determine the influence of independent variables on dependent variables. In addition to fulfilling the model fit criteria stipulated by Hair et al. [59], we assessed whether the structural model met specific assumptions by scrutinizing path coefficients. Path coefficients, representing standardized regression coefficients (beta), indicated a positive influence relationship when greater than 0 and a negative influence relationship when the opposite was true. To evaluate the measurement relationship of the model, we looked for significant Pearson correlation values and standardized factor loading coefficients above 0.5, which are indicative of a robust measurement relationship.
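Standardized path coefficients of the kind described above are simply OLS regression weights estimated on z-scored variables. A sketch on synthetic data (the variable names mirror the study’s constructs but the scores are simulated, not the study’s measures):

```python
import numpy as np

def standardized_betas(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Path coefficients as standardized regression weights:
    OLS of the z-scored outcome on z-scored predictors (no intercept needed,
    since z-scores have zero mean)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    zy = (y - y.mean()) / y.std(ddof=1)
    beta, *_ = np.linalg.lstsq(Z, zy, rcond=None)
    return beta

# Simulated composites: TPACK driven more strongly by TPK than by TCK
rng = np.random.default_rng(42)
tpk = rng.normal(size=200)
tck = rng.normal(size=200)
tpack = 0.6 * tpk + 0.3 * tck + rng.normal(scale=0.3, size=200)

betas = standardized_betas(np.column_stack([tpk, tck]), tpack)
```

A beta above 0 indicates a positive influence and below 0 a negative one, matching the interpretation rule stated in the text; here the recovered weights reproduce the built-in ordering (TPK’s path stronger than TCK’s).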
For the third research question, we conducted a descriptive statistical analysis to gauge the average level of data-logging TPACK among pre-service chemistry teachers. Subsequently, we utilized a t-test to explore significant differences in each data-logging TPACK component between genders. Furthermore, a Pearson correlation analysis was performed to examine the association between the scores of each data-logging TPACK component and the scores of the chemistry instructional design thinking scale. All analyses were conducted using widely recognized data analysis software, specifically SPSS (Statistical Package for Social Sciences) 25.0 and AMOS (Analysis of Moment Structures) 22.0.
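The gender comparison and the TPACK–design-thinking correlation can be sketched with SciPy on simulated scores. The group sizes mirror the sample (46 males, 135 females), but all values below are synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical overall TPACK means on a 7-point scale
male = rng.normal(5.1, 0.8, size=46)
female = rng.normal(5.0, 0.8, size=135)
# Welch's t-test (no equal-variance assumption) for the gender comparison
t_stat, p_gender = stats.ttest_ind(male, female, equal_var=False)

# Hypothetical design-thinking scores constructed to correlate with TPACK
tpack = np.concatenate([male, female])
design = 0.6 * tpack + rng.normal(0, 0.5, size=tpack.size)
r, p_corr = stats.pearsonr(tpack, design)
```

With a large mean gap, `p_gender` would fall below 0.05; with simulated groups this close, it typically does not, which is the "negligible gender difference" pattern the study reports. The correlation `r`, by construction, comes out clearly positive.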

5. Results and Discussions

5.1. Reliability and Validity of Data-Logging TPACK Survey

To examine the first-order structure of TPACK, we conducted an exploratory factor analysis utilizing principal component analysis with varimax rotation in SPSS 25.0. This analysis encompassed the dataset comprising 25 items from the TPACK survey. Outlying items were removed from consideration, resulting in the identification of seven distinct factors. To ascertain the appropriateness of the data for factor analysis, we conducted an applicability test for all items. The outcomes revealed a Kaiser–Meyer–Olkin (KMO) value of 0.94 and a highly significant Bartlett’s sphericity test (χ2 = 4320.58, p < 0.001), affirming that the data were well-suited for factor analysis.
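Bartlett’s sphericity statistic reported above follows the standard formula χ² = −(n − 1 − (2p + 5)/6)·ln|R|, with p(p − 1)/2 degrees of freedom, where R is the item correlation matrix. A sketch on synthetic correlated items (the data below are illustrative only):

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(data: np.ndarray):
    """Bartlett's test that the item correlation matrix is an identity matrix.
    Returns (chi-square statistic, p-value)."""
    X = np.asarray(data, dtype=float)
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return chi2, stats.chi2.sf(chi2, df)

# Synthetic data: 120 respondents, 4 items sharing one common factor,
# so the items are strongly intercorrelated and sphericity is rejected.
rng = np.random.default_rng(1)
factor = rng.normal(size=(120, 1))
X = factor + 0.5 * rng.normal(size=(120, 4))
chi2_val, p_val = bartlett_sphericity(X)
```

A significant result (small p-value), as in the study’s χ² = 4320.58, p < 0.001, indicates enough shared variance among items for factor analysis to be meaningful.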
Following the removal of items with factor loadings below 0.5, we retained a total of 24 items. The cumulative variance accounted for by the seven factors amounted to approximately 84.47% (individual factor interpretations are presented in Table 2), signifying an interpretative capacity for TPACK. Furthermore, factor loadings for each item ranged from 0.53 to 0.85, all surpassing the 0.50 threshold, thereby confirming acceptable convergent validity at the item level. Additionally, Cronbach’s alpha values for each factor exceeded 0.85 (as detailed in Table 2), indicative of the questionnaire’s outstanding reliability.
To address the second and third research questions, which involved formulating hypotheses and confirming the measurement model based on the literature review, we employed a Structural Equation Model (SEM). This SEM was instrumental in conducting a path analysis and deriving TPACK scores, facilitating the exploration of relationships between the seven variables of the TPACK model and the assessment of how independent variables influenced dependent variables.
In order to ensure the validity of the structural model, it was imperative to assess whether it met the criteria for model fit, in line with the guidelines established by Hair et al. [59]. Furthermore, we scrutinized the path coefficients to determine if the structural model adhered to specific assumptions. These path coefficients represent standardized regression coefficients (betas) and signify either positive or negative influence relationships. A path coefficient greater than 0 indicates a positive influence, while the converse suggests a negative influence.
The measurement relationship within the model was evaluated by examining the significance of Pearson correlation values and the standardized factor loading coefficients. A significant correlation and a standardized factor loading coefficient exceeding 0.5 were indicative of a robust measurement relationship (Table 3). In summary, the utilization of SEM enabled a comprehensive examination of the TPACK model’s variables and their interrelationships while adhering to stringent criteria for model fit and evaluating measurement relationships.
The results of the confirmatory factor analysis revealed significant correlations among the first-order factors of TPACK, illustrating a robust pairwise correlation structure that aligns well with the TPACK model, as depicted in Figure 2. Building upon the seven-factor TPACK model, we constructed and tested a “second-order” structural model employing high-order confirmatory factor analysis. The outcomes of this analysis affirmed that the second-order structural model aligns effectively with the data (Figure 3). This assertion is substantiated by the following fit indices: χ2/df = 2.05 (well below 3.00), RMSEA = 0.07 (below the 0.08 threshold), SRMR = 0.04 (below the 0.08 threshold), IFI = 0.94 (exceeding 0.90), TLI = 0.93 (surpassing 0.90), and CFI = 0.94 (surpassing 0.90).
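The fit indices reported above are standard functions of the model and baseline chi-square statistics. The sketch below shows how RMSEA, CFI, and TLI are derived; the chi-square inputs are hypothetical values chosen only so that χ2/df ≈ 2.05, similar to the reported thresholds, and are not the study’s actual statistics:

```python
import numpy as np

def fit_indices(chi2, df, chi2_null, df_null, n):
    """RMSEA, CFI and TLI from the model and baseline (null-model)
    chi-square statistics for a sample of size n."""
    rmsea = np.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
    cfi = 1.0 - max(chi2 - df, 0.0) / max(chi2_null - df_null, chi2 - df, 1e-12)
    tli = ((chi2_null / df_null) - (chi2 / df)) / ((chi2_null / df_null) - 1.0)
    return rmsea, cfi, tli

# Hypothetical chi-square values for a model with df = 240 on n = 181 cases
rmsea, cfi, tli = fit_indices(chi2=492.0, df=240, chi2_null=5000.0,
                              df_null=276, n=181)
print(f"chi2/df = {492.0 / 240:.2f}, RMSEA = {rmsea:.3f}, "
      f"CFI = {cfi:.3f}, TLI = {tli:.3f}")
```

SEM software such as AMOS reports these indices directly; the formulas here simply make the thresholds (RMSEA < 0.08, CFI and TLI > 0.90) concrete.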
Furthermore, the load values of each factor in the second-order model range from 0.77 to 0.94, all surpassing the 0.50 threshold. This underscored the robust explanatory power of each first-order factor concerning the higher-order factor, offering strong support for the validity of the “second-order” structural model.

5.2. Descriptive Analysis of Pre-Service Teachers’ TPACK Levels

To further assess the level of each factor, the average scores of participants in the seven factors were calculated (Table 4). Based on the data analysis in the table, using the definition method proposed by Niess et al. [46], all factors of data-logging TPACK for preservice chemistry teachers were found to be above average, with scores averaging around 5. This indicates that the levels of these factors fall between adaptation and exploration, suggesting that preservice chemistry teachers have a solid understanding and competency in data-logging TPACK.
Overall, the teachers’ data-logging TPACK score was 4.95, which falls in the middle range. This could be attributed to the emphasis on integrating data-logging technology with chemical knowledge and teaching methods during the training on data-logging technology. As a result, preservice teachers possess a certain level of awareness and ability to incorporate data-logging technology in chemistry teaching, but not yet at an advanced level. For instance, some of the collected instructional designs primarily focused on teachers’ explanations without fully engaging students in active participation, although a small number did predict and discuss the experimental results of data-logging activities through group collaboration.
The score for two-dimensional integrated knowledge was higher than that of single-dimensional knowledge, indicating that participants were more confident in their understanding and ability to apply two-dimensional integrated knowledge. For example, in the collected instructional designs, most designers were able to effectively utilize teaching strategies such as POE (Predict, Observe, Explain) and ECIE (Explore, Create, Investigate, Evaluate) to leverage the advantages of data-logging technology in teaching. This could be attributed to the participants’ emphasis on the effective integration of single-dimensional knowledge in courses such as Chemistry Didactics, Chemistry Instructional Design, and Modern Educational Technology. As a result, the participants were well-versed in two-dimensional integrated knowledge such as PCK, TPK, and TCK.
Among the knowledge components of a single dimension, CK received the highest score. Additionally, PCK and TCK, which are integrated with CK, also received higher scores. This suggests that preservice teachers have a higher level of confidence in their professional knowledge and understanding of chemistry. They are also able to provide more in-depth and accurate explanations of chemistry knowledge in instructional design. This could be attributed to the fact that the participants have completed the four major chemistry courses and have developed a deeper understanding of the subject.
By contrast, TK received the lowest score of all the knowledge components. The instructional design topics showed little innovation in how data-logging technology was integrated, suggesting that the participants were not yet familiar with it. Moreover, some of the collected instructional designs used data-logging technology only superficially, with little in-depth analysis or exploration of its value. TK in this study was limited to data-logging technology, whose instruments are structurally complex; the participants had only preliminary exposure to integrating data logging into the chemistry classroom and lacked both a systematic understanding of the technology and confidence in its practical operation. Greater attention should therefore be paid to data-logging technology in the education of preservice chemistry teachers.

5.3. Structural Equation Model of TPACK Components

In line with both the integrated and transformative perspectives of TPACK, a path model for TPACK can be constructed, as illustrated in Figure 4. In the integrated view, factors such as TK, PK, CK, TCK, TPK, and PCK exhibit direct and positive effects on teachers’ TPACK. Conversely, the transformative view posits that TK, PK, and CK have direct and positive effects on teachers’ TCK, TPK, and PCK, which in turn have direct and positive effects on teachers’ TPACK. This leads to the establishment of the following assumptions, as summarized in Table 5.
The path analysis was executed using AMOS 22.0 software to explore the hypothetical path model of TPACK, as illustrated in Figure 4. The outcomes of this analysis revealed that the revised model aligned well with the data, as indicated by favorable fit indices, including χ2/df, RMSEA, SRMR, TLI, and CFI. This strong alignment underscored the reliability and appropriateness of the structural equation model for rigorous hypothesis testing. The culmination of this analysis is unveiled in Figure 5, where solid lines signify the significant relationships that corroborate earlier research findings. Conversely, dashed lines denote suggested, albeit empirically insignificant, connections unearthed in our study. For a comprehensive overview of these relationships between TPACK dimensions, please consult Table 6, which succinctly summarizes the path coefficients.
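As noted above, path coefficients are standardized regression coefficients (betas). A minimal illustration, using synthetic data with made-up effect sizes rather than the study’s AMOS output, is:

```python
import numpy as np

def standardized_betas(X, y):
    """Standardized regression coefficients: z-score the predictors and
    the outcome, then fit ordinary least squares without an intercept."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    yz = (y - y.mean()) / y.std(ddof=1)
    betas, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return betas

# Synthetic data with made-up effects: TPACK driven mainly by TPK
rng = np.random.default_rng(0)
tpk, tck, pck = rng.normal(size=(3, 300))
tpack = 0.6 * tpk + 0.3 * tck + 0.1 * pck + 0.2 * rng.normal(size=300)

betas = standardized_betas(np.column_stack([tpk, tck, pck]), tpack)
print("standardized betas (TPK, TCK, PCK):", np.round(betas, 2))
```

A positive beta indicates a positive influence of the predictor on the outcome, which is how the solid paths in Figure 5 are read.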
These path analysis results served to validate the intricate interplay among various facets of TPACK, lending robust support to our hypotheses that stem from the integrated and transformative perspectives on TPACK. Such insights significantly enhance our comprehension of the factors that shape teachers’ TPACK and bear considerable implications for the advancement of educational practices and professional development efforts.
The data presented above indicate that, with the exception of hypotheses H1 and H10–H12, all hypotheses were substantiated. In other words, aside from the insignificant paths, most notably the relationship between CK and PCK, the pathways in the model align with our initial hypotheses. This alignment provides substantial support for the transformative perspective of TPACK and does not lend credence to the integrated perspective. The findings highlight the significant impact on TPACK of the two-dimensional integrated knowledge formed from TK, CK, and PK: these integrated dimensions exerted direct influences on TPACK, strongly underpinning its development. Notably, among the three single-dimensional sources, TK emerged as the most influential, as evidenced by its substantial path coefficient. Additionally, the results revealed that TK and PK directly influenced teachers’ TPK, TK and CK directly affected teachers’ TCK, and PK exerted direct effects on teachers’ PCK. It is important to note that these findings also highlighted interrelationships among the three single-dimensional knowledge factors (CK, PK, and TK).
Nevertheless, our investigation revealed no significant relationship between CK and PCK, suggesting that teachers did not perceive a substantial impact of CK on their PCK. Moreover, the direct effect of CK on TPACK was also statistically insignificant. Several plausible explanations may account for this. First, CK alone may not suffice for the cultivation of PCK: mere possession of content knowledge does not necessarily translate into proficiency in employing teaching strategies tailored to convey that knowledge effectively (PCK). This finding resonates with prior studies that similarly failed to establish a predictive link between CK and PCK among Chinese preservice teachers [26,33]. Second, the disparity may be attributed to variations in item design. In our study, the CK survey items placed greater emphasis on chemical thinking methodologies and subject comprehension, diverging from the CK items emphasized in other investigations [6,32]. These distinct emphases can yield varying impacts on PCK, potentially accounting for the divergence between our findings and those of certain previous studies [6,32]. Lastly, the incongruity between our findings and the hypothesized model could be attributed, at least in part, to the relatively modest sample size employed in our study; random factors may introduce some degree of error into the results.

5.4. T-Test of Gender Differences

Among the 181 preservice teachers who took part in this study, 46 were male and 135 were female. The research examined the participants’ data-logging TPACK and explored potential variations based on gender. The findings for each factor are detailed in Table 7. Furthermore, the study analyzed discrepancies between these groups, encompassing not only the overall scale but also the factors CK, PK, TK, PCK, TPK, TCK, and the holistic TPACK construct (Table 7).
As detailed in Table 7, both male students (with an average score of M = 5.21) and female students (with an average score of M = 4.92) displayed elevated levels of proficiency in the comprehensive TPACK scale. A t-test analysis was executed to ascertain the statistical significance of the disparity between these two groups, resulting in a non-significant outcome (t = 0.424, p > 0.05). In simpler terms, gender did not emerge as a significant determining factor in relation to data-logging TPACK proficiency. Both female and male preservice teachers reported similar levels of effectiveness in this regard.
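The gender comparison relies on an independent-samples t-test. As a hedged illustration (the group means, spreads, and scores below are simulated, not the study’s raw data), Welch’s t statistic can be computed directly:

```python
import numpy as np

def welch_t(a, b):
    """Welch's independent-samples t statistic and degrees of freedom
    (p-values would then come from the t distribution)."""
    va, vb = a.var(ddof=1), b.var(ddof=1)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb
    t = (a.mean() - b.mean()) / np.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Simulated overall TPACK scores mirroring the group sizes in this study
rng = np.random.default_rng(7)
male = rng.normal(loc=5.21, scale=1.0, size=46)
female = rng.normal(loc=4.92, scale=1.0, size=135)

t, df = welch_t(male, female)
print(f"t = {t:.2f}, df = {df:.1f}")
```

A non-significant t, as reported above, means the observed mean difference is small relative to the within-group variability.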
Moreover, there were no significant differences in mean scores between male and female preservice teachers concerning factors such as PK, TPK, TCK, and the holistic TPACK construct (p > 0.05). This suggests that both female and male preservice teachers exhibited comparable elevated levels of proficiency in these facets, akin to their data-logging TPACK effectiveness.
However, gender disparities were evident in the areas of CK, TK, and PCK (as outlined in Table 7). Specifically, male preservice teachers demonstrated a heightened level of competence in CK, TK, and PCK compared to their female counterparts. This disparity may be influenced by gender-related self-efficacy and cognitive styles, with male preservice teachers generally exhibiting greater confidence in their knowledge, particularly in technology-related domains.

5.5. Correlation Analysis of Design Thinking and TPACK

A correlation analysis was conducted to explore the relationship between design thinking and the seven facets of data-logging TPACK, as presented in Table 8. The analysis unveiled a noteworthy positive correlation between the seven TPACK components and design thinking. This observation suggested that the development of data-logging TPACK in preservice chemistry teachers plays a pivotal role in enhancing their capacity to employ higher-order thinking skills, enabling the integration of knowledge from diverse dimensions when designing courses. Additionally, the cultivation of thinking skills during the design refinement and reflection process contributes significantly to elevating the data-logging TPACK proficiency of preservice chemistry teachers.
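The correlation analysis rests on Pearson’s r. A brief sketch on synthetic scores (the effect size is invented purely for illustration) shows the computation:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two score vectors."""
    xd, yd = x - x.mean(), y - y.mean()
    return (xd @ yd) / np.sqrt((xd @ xd) * (yd @ yd))

# Synthetic illustration: design thinking scores that track TPACK scores
rng = np.random.default_rng(1)
tpack = rng.normal(5.0, 0.8, size=181)
design_thinking = 0.7 * tpack + 0.5 * rng.normal(size=181)

r = pearson_r(tpack, design_thinking)
print(f"r = {r:.2f}")
```

An r above 0.7, as found for PK and TPACK with design thinking, is conventionally read as a strong positive association.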
Among the seven dimensions within the TPACK model, it is worth noting that the correlation coefficients of Pedagogical Knowledge (PK) and TPACK with design thinking (DT) both exceeded 0.7, signifying particularly strong connections. This underscores the crucial role of design thinking in fostering the development of teaching and learning principles, practices, and strategies, as well as in the effective utilization of data-logging technology in chemistry instruction. Emphasizing the theory and methodology of “teaching and learning” propels instructional design thinking to a higher level. Simultaneously, advancing knowledge of teaching principles and strategies, coupled with adept use of data-logging technology in chemistry instruction, greatly augments educators’ design thinking.
In interviews conducted with participants, they conveyed that TPACK facilitated a more systematic approach to instructional design. When crafting courseware, teachers could strategically fuse essential knowledge with related expansions, thereby enriching the teaching content and fostering critical thinking within the classroom. This, in turn, improved the overall quality of lessons. Furthermore, the utilization of technology paved the way for the creation of self-directed learning materials, facilitating systematic comparisons of chemical knowledge both horizontally and vertically. This approach further fostered systematic thinking within the realm of chemistry education.

6. Conclusions and Implications

This study aimed to explore the structure, developmental pathway, and influential factors within the data-logging TPACK framework. The study yielded three key findings. First, it successfully established acceptable levels of both convergent validity and discriminant validity, substantiating the validity of the seven-factor TPACK model. This validation provides robust evidence for the soundness of this model’s construction. Second, the results of the structural equation model analysis unveiled that the developmental trajectory of data-logging TPACK among preservice chemistry teachers aligns with the transformative perspective. However, CK did not exert a direct influence on PCK, a finding congruent with the views expressed by Chai [26] and Dong [33]. Third, this study revealed that the level of data-logging TPACK of preservice chemistry teachers exceeded a moderate threshold. Additionally, it found a mutually reinforcing relationship between data-logging TPACK and design thinking. Notably, gender did not emerge as a significant variable with regard to TPACK proficiency.
By successfully bridging the subject specialization in chemistry with the technical domain of data-logging technology, this research significantly contributes to our understanding of the TPACK framework’s structure, developmental trajectory, and influential factors. It lays a robust theoretical foundation for the sustainable growth of teachers’ proficiency in integrating professional technology into their pedagogy. Furthermore, it assists preservice teachers in forging meaningful connections between diverse knowledge domains.

6.1. Deepen the Basic Research of TPACK

Suggestions for basic research on data-logging TPACK include the following three points. First, there is a need to examine the data-logging TPACK of teachers at different stages of development. The relationship between the various factors of TPACK is complex, and this study focused on preservice teachers; it is important to investigate whether the seven-factor model applies to teachers at other career stages and to further validate its development. Second, it is important to explore the TPACK of teachers with different levels of technical expertise. This study only included data-logging technology knowledge, but it would be valuable to investigate, through adapted questionnaires, whether the development path of TPACK differs under other technologies. Third, efforts should be made to improve the validity of the questionnaire. The positively worded self-report items used in this study may have influenced participants’ responses and affected validity, and the varying number of items per factor resulted in different levels of accuracy. Future questionnaire designs can minimize positively worded prompts, unify the number of items per factor, clarify the concepts corresponding to the seven TPACK factors so that participants can accurately distinguish them, express subject and technology specificity in a specific and rigorous manner, and control the total number of items to achieve good reliability and validity. These considerations can serve as a reference for adapting questionnaires in related research.
Lastly, follow-up research can consider incorporating qualitative research methods such as classroom observation, in-depth interviews, or collection of instructional designs and teaching reflections [61]. This approach can provide a bottom-up summary of the structural model and development path of TPACK, as well as explore strategies to promote the generation of TPACK.

6.2. Develop the Research on the Current Status of TPACK

The suggestions for researching the current state of data-logging TPACK include the following three points. First, the participants and region selected in this study had certain limitations, and the data were collected before the preservice teachers’ teaching practice, so they do not represent the actual situation of all preservice chemistry teachers. Follow-up studies can sample from multiple regions, improve the questionnaire used here, and examine and compare preservice teachers at different educational stages or in-service teachers with different attributes, so as to make the data more representative [45]. Second, the relationship between chemistry teachers’ TPACK and their technological environment support and teaching conceptions can be further quantitatively investigated [49,54]. Third, this study mainly adopted quantitative methods; subsequent research can use rubrics to score instructional designs [61], making the measurement more comprehensive, enabling mutual verification of data, and enhancing the reliability and validity of the research.

6.3. Improving Teacher Education on TPACK

The suggestions for chemistry teacher education primarily include the following three points. First, there is a need to adjust cultivation systems that focus solely on single-dimensional knowledge. In this study, the score for single-dimensional knowledge was significantly lower than that for two-dimensional composite knowledge. This could be attributed to the depth of chemistry knowledge and theoretical teaching knowledge, which may lead to the neglect of technological knowledge. To address this, chemistry specialty courses can consciously establish the relationships among professional knowledge, teaching knowledge, and technological knowledge. For example, combining college chemistry knowledge with middle school chemistry knowledge can help clarify concepts. In chemistry specialty experiment courses, sensors such as pH meters and conductivity meters can be associated with data-logging technology knowledge. In education courses, relevant knowledge can be analyzed and practiced in depth, incorporating micro-teaching practice and the sharing of different designs for the same lesson. Emphasizing the study of technology courses and improving curricula, materials, and practical teaching scenarios is also crucial. Schools can increase technology investment to enhance the hardware for data-logging technology experiments, providing more practical teaching opportunities for preservice chemistry teachers.
Second, it is important to strengthen the cultivation of integrated knowledge. Normal university students can be guided to develop an organic integration of TPACK based on the seven-factor model. Ideas of integrating CK, PK, and TK can be infused into the classroom to deepen the understanding of TPACK among preservice chemistry teachers. In course learning, situations that promote the integration of CK, PK, and TK can be created. For instance, in courses related to data-logging technology experiments, students can be guided to not only learn about experimental principles and operations but also to think about how to leverage the advantages of data-logging technology to innovate and enhance chemistry teaching through classroom interactions, homework assignments, and the development of complete instructional designs.
Third, it is essential to focus on the development of design thinking. Considering the mutual promotion between the seven factors of TPACK and design thinking, the TPACK level of preservice chemistry teachers can be enhanced through the manifestation and deepening of design thinking. In courses such as chemistry teaching theory and chemistry instructional design, college teachers can adopt a cooperative learning approach. Each group can select teaching content and teaching strategies, integrate appropriate technology, and engage in instructional design, display, and reporting. This process can follow a cycle of “design-practice-reflection-redesign-practice”. Within their groups, preservice teachers can deepen their design thinking through communication and reflection on their designs. They can then broaden their thinking through learning and evaluation with other groups. This approach helps preservice chemistry teachers continuously improve their TPACK level.

7. Limitations and Future Research

It is important to note that the study may not have ensured the representation of all preservice chemistry teachers in China. The participants in this study were selected from a single school and grade using a convenience sampling method, which may have limited regional representation. Additionally, the sample size was relatively small, with only 181 subjects, which may not accurately reflect the overall situation of all preservice chemistry teachers in China. Furthermore, the study was conducted in the second semester of the junior year, and the participants had not yet completed all teacher-related courses or gained practical teaching experience. Therefore, the data may not fully reflect the TPACK level of preservice chemistry teachers after they have completed their entire teacher education program. We also acknowledge that the scope of technological knowledge in this study was limited to data-logging technology, which may restrict the generalizability of the findings. While data-logging technology is indeed a significant modern technology in chemistry experiment teaching, not all teacher education programs include it in their curriculum. As a result, the level of preservice teachers’ data-logging TPACK may be closely tied to the specific courses they have taken. Moreover, apart from data-logging technology, there are various other information technology applications such as multimedia equipment and network resources that are utilized in chemistry teaching. However, these technologies were not included in this study. Therefore, the conclusions drawn from this study can only be applied to the TPACK level of preservice chemistry teachers in relation to data-logging technology and may not be generalizable to other technologies. Further research is needed to explore the TPACK measurement for other technology-specific domains.
This study primarily measured the level of data-logging technology TPACK among preservice teachers using quantitative and self-report methods, which may require further validation. The questionnaire used in this study had a certain degree of positive bias, which means that the research results were greatly influenced by the subjective perspectives of the participants. This may have an impact on the validity of the study. Additionally, the number of items for each factor in the questionnaire was different, leading to differences in the accuracy of each item. This issue should be further addressed and optimized in future studies. Moreover, this study focused on the declarative level of data-logging technology TPACK and did not assess the performance-based TPACK level of preservice teachers through interviews, instructional design evaluations, teaching records, or other materials. The mutual verification of data was not achieved in this study, which may limit the comprehensive understanding of preservice teachers’ TPACK level in relation to data-logging technology.

Author Contributions

Writing—original draft, F.D. and W.L.; Writing—review & editing, D.S. and Z.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all the participants.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Mishra, P.; Koehler, M.J. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  2. Max, A.L.; Lukas, S.; Weitzel, H. The pedagogical makerspace: Learning opportunity and challenge for prospective teachers’ growth of TPACK. Br. J. Educ. Technol. 2023. [Google Scholar] [CrossRef]
  3. Njiku, J. Assessing the development of mathematics teachers TPACK through an observation rubric. Educ. Inf. Technol. 2023, 1–24. [Google Scholar] [CrossRef]
  4. Kadıoğlu-Akbulut, C.; Cetin-Dindar, A.; Acar-Şeşen, B.; Küçük, S. Predicting preservice science teachers’ TPACK through ICT usage. Educ. Inf. Technol. 2023, 28, 11269–11289. [Google Scholar] [CrossRef] [PubMed]
  5. Saubern, R.; Henderson, M.; Heinrich, E.; Redmond, P. TPACK–time to reboot? Australas. J. Educ. Technol. 2020, 36, 1–9. [Google Scholar] [CrossRef]
  6. Schmid, M.; Brianza, E.; Petko, D. Self-reported technological pedagogical content knowledge (TPACK) of pre-service teachers in relation to digital technology use in lesson plans. Comput. Hum. Behav. 2021, 115, 106586. [Google Scholar] [CrossRef]
  7. Wu, Y.T.; Chai, C.S.; Wang, L.J. Exploring secondary school teachers’ TPACK for video-based flipped learning: The role of pedagogical beliefs. Educ. Inf. Technol. 2022, 27, 8793–8819. [Google Scholar] [CrossRef]
  8. Luo, S.; Zou, D. A systematic review of research on technological, pedagogical, and content knowledge (TPACK) for online teaching in the humanities. J. Res. Technol. Educ. 2022, 1–15. [Google Scholar] [CrossRef]
  9. Voithofer, R.; Nelson, M.J. Teacher educator technology integration preparation practices around TPACK in the United States. J. Teach. Educ. 2021, 72, 314–328. [Google Scholar] [CrossRef]
  10. Zou, D.; Huang, X.; Kohnke, L.; Chen, X.; Cheng, G.; Xie, H. A bibliometric analysis of the trends and research topics of empirical research on TPACK. Educ. Inf. Technol. 2022, 27, 10585–10609. [Google Scholar] [CrossRef]
  11. Valtonen, T.; Leppänen, U.; Hyypiä, M.; Sointu, E.; Smits, A.; Tondeur, J. Fresh perspectives on TPACK: Pre-service teachers’ own appraisal of their challenging and confident TPACK areas. Educ. Inf. Technol. 2020, 25, 2823–2842. [Google Scholar] [CrossRef]
  12. Castéra, J.; Marre, C.C.; Yok Margaret, C.K.; Kezang, S.; Impedovo, M.A.; Tago, S.; Pedregosa, A.D.; Malik, S.K.; Armand, H. Self-reported TPACK of teacher educators across six countries in Asia and Europe. Educ. Inf. Technol. 2020, 25, 3003–3019. [Google Scholar] [CrossRef]
  13. Choi, B.; Young, M.F. TPACK-L: Teachers’ pedagogical design thinking for the wise integration of technology. Technol. Pedagog. Educ. 2021, 30, 217–234. [Google Scholar] [CrossRef]
  14. Özgür, H. Relationships between teachers’ technostress, technological pedagogical content knowledge (TPACK), school support and demographic variables: A structural equation modeling. Comput. Hum. Behav. 2020, 112, 106468. [Google Scholar] [CrossRef]
  15. Yildiz Durak, H.; Atman Uslu, N.; Canbazoğlu Bilici, S.; Güler, B. Examining the predictors of TPACK for integrated STEM: Science teaching self-efficacy, computational thinking, and design thinking. Educ. Inf. Technol. 2023, 28, 7927–7954. [Google Scholar] [CrossRef]
  16. Koehler, M.; Mishra, P. What is technological pedagogical content knowledge (TPACK)? Contemp. Issues Technol. Teach. Educ. 2009, 9, 60–70. [Google Scholar] [CrossRef]
  17. Cox, S.; Graham, C.R. Diagramming TPACK in Practice: Using an Elaborated Model of the TPACK Framework to Analyze and Depict Teacher Knowledge. Techtrends Tech. Trends 2009, 53, 60–69. [Google Scholar] [CrossRef]
18. Krauskopf, K.; Zahn, C.; Hesse, F.W. Leveraging the affordances of YouTube: The role of pedagogical knowledge and mental models of technology functions for lesson planning with technology. Comput. Educ. 2012, 58, 1194–1206.
19. Voogt, J.; Fisser, P.; Pareja Roblin, N.; Tondeur, J.; van Braak, J. Technological pedagogical content knowledge–a review of the literature. J. Comput. Assist. Learn. 2013, 29, 109–121.
20. Schmidt, D.A.; Baran, E.; Thompson, A.D.; Mishra, P.; Koehler, M.J.; Shin, T.S. Technological pedagogical content knowledge (TPACK): The development and validation of an assessment instrument for preservice teachers. J. Res. Technol. Educ. 2009, 42, 123–149.
21. Sahin, I. Development of Survey of Technological Pedagogical and Content Knowledge (TPACK). Turk. Online J. Educ. Technol. 2011, 10, 97–105.
22. Koh, J.H.L.; Chai, C.S.; Tsai, C.C. Examining the technological pedagogical content knowledge of Singapore pre-service teachers with a large-scale survey. J. Comput. Assist. Learn. 2010, 26, 563–573.
23. Archambault, L.M.; Barnett, J.H. Revisiting technological pedagogical content knowledge: Exploring the TPACK framework. Comput. Educ. 2010, 55, 1656–1662.
24. Lee, M.H.; Tsai, C.C. Exploring teachers’ perceived self-efficacy and technological pedagogical content knowledge with respect to educational use of the World Wide Web. Instr. Sci. 2010, 38, 1–21.
25. Ong, Q.K.L.; Annamalai, N. Technological pedagogical content knowledge for twenty-first century learning skills: The game changer for teachers of industrial revolution 5.0. Educ. Inf. Technol. 2023, 1–42.
26. Chai, C.S.; Koh, J.H.L.; Tsai, C.C. A review of technological pedagogical content knowledge. J. Educ. Technol. Soc. 2013, 16, 31–51. Available online: http://www.jstor.org/stable/jeductechsoci.16.2.31 (accessed on 25 September 2023).
27. Archambault, L.; Crippen, K. Examining TPACK among K-12 online distance educators in the United States. Contemp. Issues Technol. Teach. Educ. 2009, 9, 71–88.
28. Graham, C.R. Theoretical considerations for understanding technological pedagogical content knowledge (TPACK). Comput. Educ. 2011, 57, 1953–1960.
29. Angeli, C.; Valanides, N. Preservice elementary teachers as information and communication technology designers: An instructional systems design model based on an expanded view of pedagogical content knowledge. J. Comput. Assist. Learn. 2005, 21, 292–302.
30. Jang, S.J.; Chen, K.C. From PCK to TPACK: Developing a transformative model for pre-service science teachers. J. Sci. Educ. Technol. 2010, 19, 553–564.
31. Jin, Y. The nature of TPACK: Is TPACK distinctive, integrative or transformative? In Proceedings of the Society for Information Technology & Teacher Education International Conference, Las Vegas, NV, USA, 18 March 2019; Association for the Advancement of Computing in Education (AACE): Waynesville, NC, USA, 2019; pp. 2199–2204.
32. Celik, I.; Sahin, I.; Akturk, A.O. Analysis of the relations among the components of technological pedagogical and content knowledge (TPACK): A structural equation model. J. Educ. Comput. Res. 2014, 51, 1–22.
33. Dong, Y.; Chai, C.S.; Sang, G.Y.; Koh, J.H.L.; Tsai, C.C. Exploring the profiles and interplays of pre-service and in-service teachers’ technological pedagogical content knowledge (TPACK) in China. J. Educ. Technol. Soc. 2015, 18, 158–169. Available online: http://www.jstor.org/stable/jeductechsoci.18.1.158 (accessed on 25 September 2023).
34. Pamuk, S.; Ergun, M.; Cakir, R.; Yilmaz, H.B.; Ayas, C. Exploring relationships among TPACK components and development of the TPACK instrument. Educ. Inf. Technol. 2015, 20, 241–263.
35. Abbitt, J.T. Measuring technological pedagogical content knowledge in preservice teacher education: A review of current methods and instruments. J. Res. Technol. Educ. 2011, 43, 281–300.
36. Bilici, S.C.; Yamak, H.; Kavak, N.; Guzey, S.S. Technological Pedagogical Content Knowledge Self-Efficacy Scale (TPACK-SeS) for Pre-Service Science Teachers: Construction, Validation, and Reliability. Eurasian J. Educ. Res. 2013, 52, 37–60.
37. Koehler, M.J.; Shin, T.S.; Mishra, P. How do we measure TPACK? Let me count the ways. In Educational Technology, Teacher Knowledge, and Classroom Impact: A Research Handbook on Frameworks and Approaches; IGI Global: Hershey, PA, USA, 2012; pp. 16–31.
38. Willermark, S. Technological pedagogical and content knowledge: A review of empirical studies published from 2011 to 2016. J. Educ. Comput. Res. 2018, 56, 315–343.
39. Valtonen, T.; Sointu, E.; Kukkonen, J.; Kontkanen, S.; Lambert, M.C.; Mäkitalo-Siegl, K. TPACK updated to measure pre-service teachers’ twenty-first century skills. Australas. J. Educ. Technol. 2017, 33.
40. Mourlam, D.; Chesnut, S.; Bleecker, H. Exploring preservice teacher self-reported and enacted TPACK after participating in a learning activity types short course. Australas. J. Educ. Technol. 2021, 37, 152–169.
41. von Kotzebue, L. Beliefs, Self-reported or Performance-Assessed TPACK: What Can Predict the Quality of Technology-Enhanced Biology Lesson Plans? J. Sci. Educ. Technol. 2022, 31, 570–582.
42. Cautin, R.L.; Lilienfeld, S.O. The Encyclopedia of Clinical Psychology; John Wiley & Sons: New York, NY, USA, 2015; Volume 5.
43. Chai, C.S.; Koh, J.H.L.; Tsai, C.C. Exploring the factor structure of the constructs of technological, pedagogical, content knowledge (TPACK). Asia-Pac. Educ. Res. 2011, 20, 595–603.
44. Lachner, A.; Fabian, A.; Franke, U.; Preiß, J.; Jacob, L.; Führer, C.; Thomas, P. Fostering pre-service teachers’ technological pedagogical content knowledge (TPACK): A quasi-experimental field study. Comput. Educ. 2021, 174, 104304.
45. Yeh, Y.F.; Lin, T.C.; Hsu, Y.S.; Wu, H.K.; Hwang, F.K. Science teachers’ proficiency levels and patterns of TPACK in a practical context. J. Sci. Educ. Technol. 2015, 24, 78–90.
46. Niess, M.; Browning, C.; Driskell, S.; Johnston, C.; Harrington, R. Mathematics teacher TPACK standards and revising teacher preparation. In Proceedings of the Society for Information Technology & Teacher Education International Conference, Charleston, SC, USA, 2 March 2009; Association for the Advancement of Computing in Education (AACE): Waynesville, NC, USA, 2009; pp. 3588–3601.
47. Newton, L.R. Data-logging in practical science: Research and reality. Int. J. Sci. Educ. 2000, 22, 1247–1259.
48. Cetin-Dindar, A.; Boz, Y.; Sonmez, D.Y.; Celep, N.D. Development of pre-service chemistry teachers’ technological pedagogical content knowledge. Chem. Educ. Res. Pract. 2018, 19, 167–183.
49. Deng, F.; Chai, C.S.; Tsai, C.C.; Lee, M.H. The relationships among Chinese practicing teachers’ epistemic beliefs, pedagogical beliefs and their beliefs about the use of ICT. J. Educ. Technol. Soc. 2014, 17, 245–256.
50. Ifinedo, E.; Rikala, J.; Hämäläinen, T. Factors affecting Nigerian teacher educators’ technology integration: Considering characteristics, knowledge constructs, ICT practices and beliefs. Comput. Educ. 2020, 146, 103760.
51. Tsai, C.C.; Chai, C.S. The “third”-order barrier for technology-integration instruction: Implications for teacher education. Australas. J. Educ. Technol. 2012, 28.
52. Lu, G.; Liu, Q.; Xie, K.; Long, T.; Zheng, X. Quality or Quantity: How Do Teachers’ Knowledge and Beliefs Persuade Them to Engage in Technology Integration in a Massive Government-Led Training Programme? Asia-Pac. Educ. Res. 2023, 32, 459–471.
53. Weidlich, J.; Kalz, M. How well does teacher education prepare for teaching with technology? A TPACK-based investigation at a university of education. Eur. J. Teach. Educ. 2023, 1–21.
54. Mueller, J.; Wood, E.; Willoughby, T.; Ross, C.; Specht, J. Identifying discriminating variables between teachers who fully integrate computers and teachers with limited integration. Comput. Educ. 2008, 51, 1523–1537.
55. Kale, U.; Roy, A.; Yuan, J. To design or to integrate? Instructional design versus technology integration in developing learning interventions. Educ. Technol. Res. Dev. 2020, 68, 2473–2504.
56. Baran, E.; AlZoubi, D. Design thinking in teacher education: Morphing preservice teachers’ mindsets and conceptualizations. J. Res. Technol. Educ. 2023, 1–19.
57. Ericson, J.D. Mapping the relationship between critical thinking and design thinking. J. Knowl. Econ. 2022, 13, 406–429.
58. Chen, N.; Hong, H.Y.; Chai, C.S.; Liang, J.C. Highlighting ECE Teachers’ Proximal Processes as Designers: An Investigation of Teachers’ Design Thinking Engagement, TPACK Efficacy, and Design Vitality. Early Educ. Dev. 2023, 1–20.
59. Hair, J.F.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis, 7th ed.; Prentice Hall: Englewood Cliffs, NJ, USA, 2010.
60. Kline, R.B.; Santor, D.A. Principles & practice of structural equation modelling. Can. Psychol. 1999, 40, 381.
61. Hofer, M.; Grandgenett, N. TPACK development in teacher education: A longitudinal study of preservice teachers in a secondary MA Ed. program. J. Res. Technol. Educ. 2012, 45, 83–106.
Figure 1. TPACK framework.
Figure 2. First-order pairwise correlation structure diagram of data-logging TPACK.
Figure 3. Second-order structure diagram of data-logging TPACK.
Figure 4. Structural equation model of the TPACK framework for data-logging technology.
Figure 5. TPACK development framework.
Table 1. TPACK component definitions and examples.
| TPACK Component | Definition | Example |
|---|---|---|
| CK | Teachers’ knowledge about the subject matter to be learned or taught. | Chemistry teachers should have sufficient knowledge of chemistry. |
| PK | Teachers’ deep knowledge about the processes, practices, and methods of teaching and learning. | Classroom management, student learning styles, and student evaluation. |
| TK | Teachers’ knowledge about applying technology tools and resources. | Data-logging technology, the Internet, and the use of computer software and hardware. |
| PCK | Teachers’ knowledge of pedagogy that is applicable to the teaching of specific content (excluding TK). | Finding the best way to help students understand the concept of chemical equilibrium. |
| TCK | Teachers’ knowledge about the manner in which technology and content are reciprocally related (excluding PK). | Allowing students to use data-logging technology to record data and to explore the factors influencing chemical reactions. |
| TPK | Teachers’ knowledge of the existence, components, and capabilities of various technologies as they are used in teaching and learning settings, and, conversely, of how teaching might change as a result of using particular technologies (excluding CK). | Using data-logging technology in teaching and in evaluating students’ learning. |
| TPACK | Teachers’ knowledge that emerges from interactions among content, pedagogy, and technology knowledge. | Applying data-logging technology to salt hydrolysis in the context of inquiry teaching. |
Table 2. Results of exploratory factor analysis, confirmatory factor analysis, and reliability analysis.
| Factor | Item | EFA Loading | CFA Loading | α | Variance Explained/% |
|---|---|---|---|---|---|
| CK | I have sufficient knowledge of chemistry. | 0.85 | 0.75 | 0.87 | 11.4 |
| | I can think in a chemical way. | 0.79 | 0.86 | | |
| | I have many approaches to present my understanding of chemistry. | 0.70 | 0.85 | | |
| PK | I can design effective learning tasks to extend students’ thinking. | 0.77 | 0.77 | 0.86 | 7.3 |
| | I can guide students to adopt appropriate learning strategies. | 0.58 | 0.86 | | |
| | I know how to choose effective teaching methods to guide students to study and think about the teaching content. | 0.53 | 0.87 | | |
| TK | I can recognize and correctly use common data-logging sensors. | 0.75 | 0.82 | 0.91 | 11.3 |
| | I can properly use the data collector of the data-logging system. | 0.75 | 0.94 | | |
| | I can properly use the data-processing software of the data-logging system. | 0.79 | 0.88 | | |
| PCK | Even without using data-logging, I can help students understand chemistry knowledge in other ways. | 0.76 | 0.89 | 0.91 | 16.3 |
| | Even without using data-logging, I can help students overcome chemistry misconceptions. | 0.88 | 0.88 | | |
| | Even without using data-logging, I can guide students to construct chemistry knowledge. | 0.82 | 0.93 | | |
| | Even without using data-logging, I can guide students to solve real problems in chemistry. | 0.83 | 0.87 | | |
| TCK | I can conduct chemical experiments using data-logging. | 0.75 | 0.84 | 0.94 | 11.1 |
| | I can explore chemistry principles using data-logging. | 0.67 | 0.91 | | |
| | I can select the appropriate data-logging sensor based on the specific chemical content. | 0.71 | 0.87 | | |
| TPK | I know how to use data-logging to optimize classroom instruction. | 0.70 | 0.87 | 0.92 | 9.6 |
| | I can design student learning tasks with data-logging. | 0.70 | 0.89 | | |
| | I can use data-logging to help students understand concepts or principles. | 0.65 | 0.91 | | |
| TPACK | I can design topics related to chemistry content and facilitate student autonomy through appropriate data-logging. | 0.60 | 0.87 | 0.95 | 17.4 |
| | I can create real problem situations related to chemistry knowledge and use data-logging to engage students in learning. | 0.74 | 0.90 | | |
| | I can plan activities around chemistry content and help students use data-logging for collaborative learning. | 0.77 | 0.87 | | |
| | I can design chemical inquiry activities based on data-logging to help students understand chemical knowledge or principles. | 0.79 | 0.91 | | |
| | I can integrate chemistry content, data-logging and teaching methods to design student-centered instruction. | 0.72 | 0.87 | | |

Notes: AVE = ∑λ²/n, where n is the number of items per construct; Construct Reliability = (∑λ)²/((∑λ)² + ∑(1 − λ²)).
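The formulas in the note above can be checked by hand. The following sketch (our illustration, not the authors’ code) recomputes AVE and construct reliability from the three CK items’ CFA loadings in Table 2:

```python
# Recompute AVE and construct reliability (CR) for the CK factor
# from its CFA loadings in Table 2, using the formulas in the note.
loadings = [0.75, 0.86, 0.85]  # CK items' CFA loadings

ave = sum(l ** 2 for l in loadings) / len(loadings)  # AVE = Σλ²/n
cr = sum(loadings) ** 2 / (
    sum(loadings) ** 2 + sum(1 - l ** 2 for l in loadings)
)  # CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²))

print(round(ave ** 0.5, 2))  # 0.82, the CK diagonal entry in Table 3
print(round(cr, 2))          # 0.86
```

The square root of AVE (0.82) reproduces the CK diagonal entry in Table 3, which is how that table’s diagonal was constructed.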
Table 3. Results of discriminant validity.
| | CK | PK | TK | PCK | TPK | TCK | TPACK |
|---|---|---|---|---|---|---|---|
| CK | 0.82 | | | | | | |
| PK | 0.79 ** | 0.83 | | | | | |
| TK | 0.60 ** | 0.69 ** | 0.88 | | | | |
| PCK | 0.66 ** | 0.79 ** | 0.59 ** | 0.89 | | | |
| TPK | 0.59 ** | 0.73 ** | 0.73 ** | 0.62 ** | 0.89 | | |
| TCK | 0.58 ** | 0.79 ** | 0.79 ** | 0.61 ** | 0.84 ** | 0.87 | |
| TPACK | 0.66 ** | 0.73 ** | 0.73 ** | 0.65 ** | 0.86 ** | 0.82 ** | 0.88 |

Note: Diagonal entries are the arithmetic square roots of the AVE; ** p < 0.01.
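Table 3 follows the Fornell–Larcker criterion: each construct’s square root of AVE (diagonal) should exceed its correlations with every other construct. A small sketch of that check, with the lower-triangular values transcribed from Table 3 (the check itself is our illustration, not part of the published analysis):

```python
# Fornell-Larcker discriminant-validity check on Table 3.
# Each inner list is one row of the lower-triangular matrix;
# the last entry of each row is the diagonal sqrt(AVE).
rows = [
    [0.82],
    [0.79, 0.83],
    [0.60, 0.69, 0.88],
    [0.66, 0.79, 0.59, 0.89],
    [0.59, 0.73, 0.73, 0.62, 0.89],
    [0.58, 0.79, 0.79, 0.61, 0.84, 0.87],
    [0.66, 0.73, 0.73, 0.65, 0.86, 0.82, 0.88],
]

def fornell_larcker_ok(rows):
    """True if every diagonal sqrt(AVE) exceeds all correlations
    involving that construct (row entries plus column entries below)."""
    n = len(rows)
    for i in range(n):
        diag = rows[i][-1]
        off = rows[i][:-1] + [rows[j][i] for j in range(i + 1, n)]
        if any(r >= diag for r in off):
            return False
    return True

print(fornell_larcker_ok(rows))  # True: discriminant validity holds
```

Every diagonal entry exceeds its off-diagonal correlations (the tightest margin is TPK: 0.89 vs. the 0.86 TPK–TPACK correlation), which is what the table’s note asserts.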
Table 4. The level of pre-service chemistry teachers’ data-logging TPACK.
| Factor | CK | PK | TK | PCK | TPK | TCK | TPACK |
|---|---|---|---|---|---|---|---|
| Mean | 5.02 | 4.90 | 4.88 | 5.14 | 5.00 | 5.17 | 4.95 |
| S.D. | 0.96 | 0.84 | 0.94 | 0.97 | 0.97 | 0.99 | 0.97 |
Table 5. Hypotheses for the TPACK Development Model.
| Hypothesis No. | Hypothesis |
|---|---|
| H1 | CK has a direct and positive effect on PCK |
| H2 | CK has a direct and positive effect on TCK |
| H3 | PK has a direct and positive effect on PCK |
| H4 | PK has a direct and positive effect on TPK |
| H5 | TK has a direct and positive effect on TCK |
| H6 | TK has a direct and positive effect on TPK |
| H7 | PCK has a direct and positive effect on TPACK |
| H8 | TCK has a direct and positive effect on TPACK |
| H9 | TPK has a direct and positive effect on TPACK |
| H10 | PK has a direct and positive effect on TPACK |
| H11 | CK has a direct and positive effect on TPACK |
| H12 | TK has a direct and positive effect on TPACK |
Table 6. Path coefficients of the TPACK structural equation model.
| Hypothesis | Path | Estimate | p Value | Supported |
|---|---|---|---|---|
| H1 | CK→PCK | – | – | No |
| H2 | CK→TCK | 0.19 | ** | Yes |
| H3 | PK→PCK | 0.73 | *** | Yes |
| H4 | PK→TPK | 0.43 | *** | Yes |
| H5 | TK→TCK | 0.69 | *** | Yes |
| H6 | TK→TPK | 0.44 | *** | Yes |
| H7 | PCK→TPACK | 0.17 | ** | Yes |
| H8 | TCK→TPACK | 0.28 | ** | Yes |
| H9 | TPK→TPACK | 0.53 | ** | Yes |
| H10 | PK→TPACK | – | – | No |
| H11 | CK→TPACK | – | – | No |
| H12 | TK→TPACK | – | – | No |

Note: *** p < 0.001, ** p < 0.01.
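The significant paths in Table 6 also imply indirect routes from the base knowledge components to TPACK. As a back-of-envelope illustration (products of the reported standardized coefficients along each route, not output from the fitted model), the `paths` dictionary below simply transcribes Table 6:

```python
# Indirect effects implied by the supported paths in Table 6:
# multiply standardized coefficients along each route to TPACK.
paths = {
    ("CK", "TCK"): 0.19,
    ("PK", "PCK"): 0.73,
    ("PK", "TPK"): 0.43,
    ("TK", "TCK"): 0.69,
    ("TK", "TPK"): 0.44,
    ("PCK", "TPACK"): 0.17,
    ("TCK", "TPACK"): 0.28,
    ("TPK", "TPACK"): 0.53,
}

# TK reaches TPACK via TPK and via TCK.
tk_indirect = (
    paths[("TK", "TPK")] * paths[("TPK", "TPACK")]
    + paths[("TK", "TCK")] * paths[("TCK", "TPACK")]
)
print(round(tk_indirect, 3))  # 0.426
```

Although the direct paths CK→TPACK, PK→TPACK, and TK→TPACK were not supported, such indirect products are consistent with the sequential developmental pattern the study reports, in which base knowledge reaches TPACK through the composite components.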
Table 7. Findings regarding the teachers’ levels of data-logging TPACK in terms of gender.
| Factor | Gender | N | M | SD | t | p |
|---|---|---|---|---|---|---|
| CK | Male | 46 | 5.28 | 1.09 | 2.14 * | 0.034 |
| | Female | 135 | 4.93 | 0.90 | | |
| PK | Male | 46 | 5.01 | 0.93 | 0.94 | 0.348 |
| | Female | 135 | 4.87 | 0.81 | | |
| TK | Male | 46 | 5.14 | 1.02 | 2.21 * | 0.029 |
| | Female | 135 | 4.80 | 0.90 | | |
| PCK | Male | 46 | 5.46 | 1.09 | 2.56 * | 0.011 |
| | Female | 135 | 5.04 | 0.91 | | |
| TPK | Male | 46 | 5.16 | 1.10 | 1.16 | 0.252 |
| | Female | 135 | 4.95 | 0.92 | | |
| TCK | Male | 46 | 5.36 | 1.13 | 1.33 | 0.187 |
| | Female | 135 | 5.12 | 0.94 | | |
| TPACK | Male | 46 | 5.10 | 1.08 | 1.29 | 0.200 |
| | Female | 135 | 4.89 | 0.93 | | |
| TPACK General | Male | 46 | 5.21 | 0.90 | 2.03 | 0.066 |
| | Female | 135 | 4.92 | 0.75 | | |

Note: * p < 0.05.
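The t statistics in Table 7 can be approximately reproduced from the reported group means and standard deviations. The sketch below assumes a pooled-variance independent-samples t-test (the table does not state which variant was used), applied to the CK row; small discrepancies from the published t = 2.14 are expected because the table’s means and SDs are themselves rounded:

```python
import math

def pooled_t(m1, sd1, n1, m2, sd2, n2):
    """Independent-samples t statistic from summary statistics,
    assuming equal variances (pooled)."""
    sp2 = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    se = math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return (m1 - m2) / se

# CK row of Table 7: male (n=46) vs. female (n=135) group statistics.
t_ck = pooled_t(5.28, 1.09, 46, 4.93, 0.90, 135)
print(round(t_ck, 2))  # close to the reported t = 2.14
```

The same function applied to the other rows reproduces the remaining t values to within rounding error.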
Table 8. The correlation between design thinking and data-logging TPACK.
| | CK | PK | TK | PCK | TPK | TCK | TPACK |
|---|---|---|---|---|---|---|---|
| DT | 0.65 ** | 0.79 ** | 0.66 ** | 0.60 ** | 0.66 ** | 0.55 ** | 0.76 ** |

Note: ** p < 0.01.
Deng, F.; Lan, W.; Sun, D.; Zheng, Z. Examining Pre-Service Chemistry Teachers’ Technological Pedagogical Content Knowledge (TPACK) of Using Data-Logging in the Chemistry Classroom. Sustainability 2023, 15, 15441. https://doi.org/10.3390/su152115441
