Article

A New TPACK Training Model for Tackling the Ongoing Challenges of COVID-19

1 Science Education Center, National Taiwan Normal University, Taipei 116, Taiwan
2 Department of Science Education, National Taipei University of Education, Taipei 106, Taiwan
3 Facultad de Ciencias de la Educación, Universidad Pedagógica y Tecnológica de Colombia, Tunja 150003, Colombia
4 Graduate Institute of Science Education, National Taiwan Normal University, Taipei 116, Taiwan
* Author to whom correspondence should be addressed.
Appl. Syst. Innov. 2022, 5(2), 32; https://doi.org/10.3390/asi5020032
Submission received: 17 December 2021 / Revised: 12 February 2022 / Accepted: 18 February 2022 / Published: 25 February 2022
(This article belongs to the Special Issue Applied Systems on Emerging Technologies and Educational Innovations)

Abstract

This study investigated the effects of integrating the “CloudClassRoom” (CCR) and the DEmo-CO-design/teach-feedback-DEbriefing (DECODE) model to improve pre-service teachers’ online technological pedagogical and content knowledge (TPACK). The DECODE model includes five stages: the Teacher’s DEmonstrations, Students CO-train in using CloudClassRoom, Students CO-design a CloudClassRoom-integrated course, Students CO-teach, and finally DE-briefing what they have learned through the stages mentioned above. This model integrates teacher-student experiences, teaching-learning processes, and technology-embedded systems to promote collaborative and active learning, the sharing of information and resources, and creative communication. A self-evaluating questionnaire with open-ended questions evaluated participants’ technological pedagogical and content knowledge outcomes. CloudClassRoom significantly increased technology-related knowledge, a finding especially relevant given the current social distancing measures provoked by COVID-19. The findings show that DECODE with CloudClassRoom provides an integrated process for improving pre-service teachers’ technological pedagogical and content knowledge, assisting pre-service teachers in designing educational technology-integrated courses.

1. Introduction

In the current context of the COVID-19 crisis, the continuity of learning processes is a crucial factor in containing the negative effects of social distancing measures. Technological incorporation has been one of the most important resources for developing university education resilience, maintaining quality standards, and sustaining learning processes during and after the COVID-19 crisis [1]. Online classroom systems have supported different strategies to recognize stakeholders’ situations, promote support among teammates, and strengthen university commitment [2]. In addition, various reports indicate that the current educational contexts, originating from social distancing measures, require responses at the public policy level and research focused on classroom experiences [3]. Therefore, to sustain the learning process, this set of conditions entails a systemic response, conscious of web-based sustainability and online classroom systems [4].

1.1. Online Classroom for Teaching

1.1.1. Definitions of Online Classroom

The online classroom is a flexible, interactive, and interoperable web-based virtual learning environment that articulates learning methods based on information technologies and learning theories. It aims to develop a learning scene composed of interactions and synchronization between expressions, course materials and explanations, context-oriented understanding, and timely feedback. On one hand, the online classroom involves an instructional component related to the teachers’ control over the learning processes; on the other hand, it is a dynamic learning environment that promotes online learners’ satisfaction with their learning experiences. Thus, control of the learning process corresponds to interactional patterns, guidance, and explanation factors. These processes encourage collaboration and the development of cognitive skills, establishing a relationship between group assignments, question formulation, and critical thinking ability [5]. Globally, the predominant professional development criteria in the digital economy promote learning experiences linked to needs, jobs, and technologies [6]. This aspect entails a balance between e-education and traditional education, an equilibrium influenced by new technologies, labor market trends, and the evolution of the digital economy [7].
Traditional learning uses a teacher-centered approach and textbook-oriented methods to convey concepts. Students play a passive role in the teacher-centered approach because information, content, and assessment come from the teacher’s perspective, expectations, and initiative [8,9]. These methods do not provide students with adequate feedback [10]. The online classroom overcomes learning methods and processes reduced to typical learners’ profiles, since experimentation practices, exploratory content, and inclusive instruction focus on different learners’ needs. Online classroom functions affect perception, learning ability, and engagement, seeking to meet learners’ needs (demand) through curriculum planning, design, and assessments [11]. This kind of collaborative learning environment is oriented toward generating content-based interactive learning. Active learning complements technologies, learning objectives, individual needs, and learners’ own paces.

1.1.2. Application of Online Classroom

At present, online classroom systems are implemented in different areas to reach proficiency and increase students’ achievement because they offer many resources and possibilities to foster learner engagement. The web-based learning process promotes belonging to a learning community, aligns online curriculum designs with learning demands [12], and complements self-directed learning with collaborative learning [13]. This required flexibility is based on teacher-student interaction to promote learners’ autonomy in defining learning goals [14]. Online classrooms also highlight students’ dependence on teachers [15] regarding factors such as the nature of the content, infrastructure, skills, readiness, and follow-up [16].
In general terms, the advantages of the online classroom for teaching are related to the flexibility, self-motivation, and engagement that allow students to achieve different levels of autonomy in the learning process. Regarding the teacher’s or instructor’s practices, online classroom advantages are connected to flexibility, improved communication, course management, and course design. The student-centered approach demands accessibility, affordability, sound learning pedagogy, immediate feedback, strengthened skills for problem-solving, critical thinking, and adaptability, and the design of flexible programs [17]. This approach fosters students’ cognitive understanding of course materials and the creation of pertinent instructional supplements, among others [18]. However, the ongoing context has posed many challenges in keeping students engaged, supporting teaching practices, and adapting learning paces. Other factors must also be considered, including the lack of quality standards, the development of e-resources and content, infrastructure needs, and the reduction of gaps in digital equity and literacy.
The pressure derived from the global COVID-19 crisis has forced educational processes to adapt; consequently, leveraging technologies to facilitate students’ scientific inquiry has been widely regarded as an important trend in education worldwide. These pressures have been conceptualized as a disruption whose impacts affected both educational systems and learning spaces [19,20]. Several challenges and obstacles arose from social distancing and the measures taken to adapt educational processes [21]. Teachers, students, and administrators were unprepared to respond suddenly to the new requirements [22]. In addition, face-to-face classes were suspended, and technological instructional modes became crucial tools for developing synchronous and asynchronous interactions [23,24]. Globally, the crisis generated by the COVID-19 pandemic mobilized educational systems’ resilient capacities to avoid learning loss and ensure the right to education [25,26].
An educational transition has now begun from emergency remote learning to online pedagogy [27]. The COVID-19 pandemic has accentuated that quality education includes learning outcomes as well as social and emotional development, implying an unavoidable curriculum change. Therefore, online educators’ pedagogical and technical competencies should be updated to explore different teaching, learning, and evaluation options [28,29]. Hence, digital literacy development [30] and teacher professional development [31] are two strategic fields for innovation, offering technologies and methods to leverage the educational process in the post-pandemic context. Likewise, the incorporation of technology into learning processes should build students’ adaptive capacity based on their abilities, perceptions, and skills [32].
Several types of technological resources have been implemented in this transition period, such as online instructional resources focused on educational content, real-time lessons on virtual platforms, online support for parents and students, and self-paced formalized lessons [33]. Consequently, pedagogical benefits have been identified: information search platforms; recordings of teaching sessions, meetings, and other materials; available and shared resources; student engagement; assessments; and professional development. In contrast, the challenges present in the current transition include digital inequalities, inadequate technological academic infrastructure, lack of teacher training, extended workloads, and obstacles to evaluation and monitoring, among other factors [34]. In summary, the current transition period presents both challenges and opportunities, in which technology incorporation processes, teacher professional development, and student performance require different efforts and tools to adapt by innovating. Consequently, teachers need more competencies in using online classrooms and Technological Pedagogical and Content Knowledge (TPACK, in which “A” means “and”).

1.2. Training of Online Classroom

1.2.1. The Training of Technological Pedagogical Content Knowledge (TPACK)

The TPACK approach “represents an emerging form of transformative knowledge through an integrative process generated from the existing instructional forms into new forms that potentially maximize the effectiveness of integrating technology into teaching” (p. 567, [35]). The TPACK model posits that knowledge (K) about content (C), pedagogy (P), and technology (T) is central to developing practical teaching skills. Additionally, this model emphasizes the complex interplay among these three bodies of knowledge: pedagogical content knowledge (PCK), technological content knowledge (TCK), technological pedagogical knowledge (TPK), and all three taken together as technological pedagogical content knowledge (TPCK) [36].
TPACK is a teacher knowledge framework for technology integration that promotes effective teaching in classroom contexts [37]. The TPACK model is a pedagogical construct that provides a theoretical basis and practical guidance to design learning processes based on teachers’ and students’ prior experiences (perceptions, attitudes, beliefs) using technologies [38]. Moreover, the TPACK framework offers a set of categories for designing measurements and assessment instruments to implement approaches for developing technology incorporation processes [29]. As a means to modernize teacher professional development, TPACK has been defined as a framework linked to collaborative situations, underlining collaborative learning scenarios based on cross-disciplinary expertise [39]. In addition, TPACK contributes to designing learning processes according to students’ needs and paces [40].
Furthermore, in pre-service teacher training, the TPACK framework provides methodologies and tools, such as questionnaires, rubrics, and expert judgments, to identify teachers’ digital competencies [41]. In addition, TPACK guidelines measure the impacts of teaching experience and gender-based perceptions on technology incorporation practices [35]. The TPACK model underlines the alignment among ICT instruments, content goals, and selected pedagogy, thus also emphasizing that ICT integration practice reflects teachers’ knowledge of technology integration, going beyond the use of social networks [42]. Likewise, the TPACK approach proposes transforming pre-service teacher students from passive learners and consumers of technology into active learners and designers of technological resources [43], improving their engagement with technology incorporation in classroom contexts.
The TPACK framework focuses on the integration of information technology into learning situations, highlighting this requirement for teachers’ academic development and for implementing public education policy [44]. Therefore, we underline the importance of TPACK in identifying the “competencies that teachers need to develop to be able to teach with technology adequately” (p. 203, [33]). These competencies are related to identifying factors such as the topics and contents that should be taught through Information and Communication Technologies, appropriate pedagogical representations connecting technologies, learners’ demands, teaching strategies, selected tools, and learner-centered strategies. The TPACK approach also considers forms of assessment: expert, peer, self, and assessment of the teacher’s competencies.
In brief, TPACK is a construct for valuing the challenges solved by implementing online classrooms [29]. This construct is oriented toward integrating technology, pedagogy, and content from a transactional perspective. The model components are “constructivist and project-based approaches such as learning-by-doing, problem-based learning, collaborative learning frameworks, and design-based learning” [28]. We must be conscious that, in this trend, science teachers and pre-service science teachers are obligated to expand their pedagogical knowledge base rather than pursue fancy technological innovation. Technological Pedagogical Content Knowledge (TPACK) is the key issue necessary for scholarly dialogue about educational technology [45].

1.2.2. The Cases of Training of Online Classroom

The definition of TPACK training as constructivist and project-based approaches, which Koehler, Mishra, and Yahya [28] presented, is based on a seminar oriented by the learning-technology-by-design approach and semester-long research. Both activities were developed in the teacher training field, and the digital literacy concept was associated with a personal relationship with technologies. The seminar was proposed to observe the discourse constructed within a collaborative design process of online courses and the transformation of the TPACK conception into a more integrated theoretical framework. Guzey and Roehrig’s (2009) [46] case study presents a research experience on the Technology Enhanced Communities (TEC) program, which connects a learning community with the science teachers’ practices required to support student inquiry. TEC was a yearlong intensive and introductory program on inquiry teaching and technologies, with follow-up group meetings to analyze the teachers’ action research. The TEC program impacted teachers’ knowledge of science, pedagogy, and technology through lab activities, inquiry-based teaching, and classroom discussions. Both studies considered critical factors of TPACK development, the school context, and teachers’ pedagogical reasoning.
Therefore, a pragmatic approach must provide science teachers with a meaningful context to incorporate innovative technologies pedagogically situated in science teaching. The four-phase cyclic model of Modeled Analysis, Guided Development, Articulated Implementation, and Reflected Evaluation (MAGDAIRE) was constructed based on the theoretical framework of cognitive apprenticeship [47]. This framework aims to engage pre-service science teachers in collaboratively designing, developing, and implementing technology-infused instructional modules, supported by a mentoring team composed of educational researchers, senior teachers, and educational technology developers.

1.2.3. Pre-Service Teacher Training

According to Tondeur, Scherer, Siddiq, and Baran (2017) [48], pre-service teacher training in technology entails six domains: “(1) using teacher educators as role models, (2) reflecting on the role of technology in education, (3) learning how to use technology by design, (4) collaborating with peers, (5) scaffolding authentic technologies experiences, and (6) providing continuous feedback” (p. 50, [48]). Swallow and Olofson (2017) [49] relate that the micro-level “beliefs, preferences, and goals of the teachers surfaced as primary conditions for developing TPACK” (p. 235). They also present two other levels: first, the meso-context, related to constraints, limitations, and institutional conditions; second, the social and national contexts at the macro-level.
Despite the meso-level barriers, pre-service and in-service students should be able to interrelate contexts within the school environment and develop a sensibility for tuning the oscillation of the knowledge base. The alignment between curriculum design and student interests influences pedagogical decisions [38]. These decisions are produced within the “community of practices for science education. There is a wide range of efficient technological environments and applications available for science teaching and learning” (p. 3225, [50]). The TPACK approach highlights the connection between pre-service teachers’ technology learning experiences, disciplinary knowledge areas, and subject-specific pedagogies, as well as the positive influence of assessment forms on integrating technology into teaching practices [51]. However, the lack of pedagogical experience is one of the most important barriers to this integration, given that, despite students’ technological and content knowledge, without pedagogical knowledge the interplay within TPACK cannot be mobilized [52]. This barrier is confronted through online classroom design guided by the TPACK model. Therefore, in this study, we develop a new training method for the online classroom.

1.3. Research Aims

This study develops a training model for online-classroom technology, composed of students learning technological materials, designing, and teaching. These features support participants in constructing knowledge and in using online-classroom technology in the course. Moreover, this training model may improve participants’ technological pedagogical content knowledge. The research aims are outlined as follows.
  • To develop a new training model for pre-service science teachers’ TPACK and online-classroom technology.
  • To evaluate pre-service science teachers’ TPACK with online-classroom technology through the training model.
  • To verify the utility of the training model.

2. DECODE Model and CloudClassRoom (CCR)

2.1. Training Model: DECODE Model

The training model is called the DECODE model. Implementing the DECODE model can provide science teachers with a meaningful context for innovative technology. This model was framed from the MAGDAIRE model, which focuses on collaboratively designing, developing, and implementing technology-infused instructional modules [47]. Although MAGDAIRE was useful for training teachers’ TPACK, it is not suitable for an ordinary class because it needs the full support of a mentoring team composed of educational researchers, senior teachers, and educational technology developers. In comparison with the MAGDAIRE model [44] and the Project-Based Learning (PBL) methodology [53], DECODE offers an approach integrating the factors that favor technology incorporation processes in learning practices. MAGDAIRE is a model linked to Content Knowledge (CK) through a lesson planning and course design approach, and PBL aims to articulate ICT integration, attitudes, and students’ confidence in remote learning.
DECODE aims to be used in class to engage pre-service science teachers’ TPACK in a collaborative training environment. DECODE can be read as “DEmo-CO-DEsign/teach”: the first “DE” refers to the teacher’s demonstrations, “CO” refers to the collaboration of students, and the second “DE” refers to the design of the course. Cooperation values are integrated into ICT and TPCK to promote collaborative and active learning, access to information, shared resources, and creative communication [42]. Meanwhile, the DECODE model is based on a device operating within classroom learning, oriented toward pedagogical and technological interaction. The DECODE framework can integrate knowledge, technology, and curriculum; therefore, attitudes, motivation, technology incorporation, and other factors can be developed in pre-service student teaching.
The process moves pre-service science teachers from being passive users of innovative technologies to being active designers, content providers, and practitioners of technology-infused science teaching. This process involves four steps: the Teacher’s Demonstrations, Students Co-train in the use of ICT, Students Co-design an ICT-integrated instructional module, and Students Co-teach the module and receive feedback (see Table 1).
(1)
The first stage is “Teacher Demonstration,” a technique with instructional characteristics such as “passive guidance or support, preparatory activities or tasks, concurrent activities, retrospective activities, and prospective activities” (p. 220, [54]). In the same way, a demonstration is a dynamic model to exemplify task performance linked with knowledge, skills, abilities, and learning contents. This technique is also related to the situated cognitive and collaborative learning approach; it is connected to collaborative problem solving, displaying multiple roles, confronting ineffective strategies and misconceptions, and providing collaborative work skills [55].
(2)
The second stage is “Students Training”, which, guided by the TPACK model, aims to open a set of opportunities to experiment with ICT-specific knowledge areas and increase the degree of integration between technology and teaching practices. Thus, technology should be “understood as a methodical and holistic process inserted in the initial teacher training, starting from a collaborative learning, active participation, and through the design of materials as a final product that leads to significant learning processes” [56]. Teacher training is also linked to improving learning practices such as planning, action and performance, evaluation, and metacognitive question prompts (comprehension, strategic, and reflection questions) [57].
(3)
The third stage is “Students Design Course,” a strategy based on collaborative design teams or peer design “for stimulating and supporting teacher learning. This approach to technology integration will help move pre-service teachers from being passive learners and consumers of technological resources to being more active learners and producer/designers of technology resources, thereby increasing user involvement and local ownership” (p. 561, [43]). The Student Design Course is a type of participation more sensitive towards learning goals requirements, what should be taught, and technology integration: “The idea of learning by design is not new. However, we believe that the TPCK framework provides yet another argument for the pedagogical value of such activities, especially when considering the integration of educational technology in pedagogy” (p. 148, [58]).
(4)
The fourth stage is “Students Teach Course,” a strategy for developing a collaborative learning environment, following the antecedents above. In general terms, the “cooperative teaching (co-teaching or teaching in pairs) contribute to the attainment of several objectives: increasing learning time by facilitating work in several groups at the same time, inclusion through heterogeneous classes, providing alternative teaching approaches adapted to learners by working in stations, providing alternative tasks, increasing personal attention to each pupil, and allowing experienced teachers to mentor young teachers” (p. 1402, [59]).

2.2. Online Classroom: CloudClassRoom (CCR)

This study employed an educational technology called CloudClassRoom within the DECODE model. CloudClassRoom is an interactive response system developed to transform smartphones into powerful interactive tools for classroom learning [60,61]. Instruments such as the “instant response device” of Chien, Lee, Li, and Chang (2015) [62] “have gradually become an integral part of the science classroom” (p. 1089) because they are learning devices that foster student engagement in peer discussion and improve learning outcomes. CloudClassRoom has a few differentiating characteristics compared with traditional clickers: it is an economical solution, stimulates students’ higher-order thinking, develops more student cognitive engagement, and provides more information for learning analytics. More pertinent pedagogical decisions are possible by using voting results to positively influence students’ discussion processes [63]. Moreover, clicker-integrated instruction is more effective than a traditional lecture due to feedback-intervention effects and more permanent learning results.
CloudClassRoom works on every Internet-capable device without additional software or plug-in installation; it operates across iOS, Android, and Windows platforms. Once teachers connect their devices to CloudClassRoom, they can easily initiate anonymous quizzes. In addition to the traditional forced-choice answer format, CloudClassRoom enables students to respond with short texts, pictures, or emoticons. Students’ answers are automatically aggregated and analyzed in real time, providing the teacher with a just-in-time picture of student learning progress. The functions of CloudClassRoom are as follows (see Figure 1):
(1)
Questioning: In CloudClassRoom, there are three basic types of questions: True/False, Multiple Choice, and Open-Ended (see Figure 1a). They form the fundamental dialogue and discussion strategy on CloudClassRoom. Questions enable learning-oriented interactions, timely feedback, peer discussions, quizzes, flipped classes, and opportunities to participate in online classroom experiences. Public questions are one of the most important guides for students, connecting their own pace with the collective pace.
(2)
Analyzing: True/False and Multiple Choice questions can be analyzed by percentage (%) and count; Open-Ended questions can be analyzed with semantic analysis by artificial intelligence. Teachers and students have at their disposal, in different ways, outcomes, students’ responses, trends, tables, and graphics: useful data for dynamic analyses that support pedagogical reasoning and decisions about what should be taught (see Figure 1b).
(3)
Grouping: After answering, participants can be grouped according to homogeneous or heterogeneous responses to foster discussions, checking and exploring other points of view, refining questions, enhancing the comprehension of a specific subject matter, or planning tasks (see Figure 1c); this is a resource to support a collaborative learning environment.
(4)
Testing: A test with items can be published and used offline or organized through a test bank (see Figure 1d). Assessments and feedback are based on testing to develop an integral learning-oriented interaction and to generate references for improving different classroom performances and factors such as planning, roles, paces, distribution, classification, competition, playing, and reinforcement, among others.
(5)
Managing: Different functions are available: roll call, distribution of seats and positions, appointing an assistant, messaging, and a general question area highlighting the most important questions (see Figure 1e). These support classroom interactions with control tools through information about, for example, the students present at a specific time, possibilities and opportunities for grouping, several channels for teacher-student dialogue, and the identification of trends.
(6)
Reviewing: File downloads make learning data available to teachers: discussion records, students’ responses, and progress in the application of tests (see Figure 1f). After a class, information is available to analyze, display, evaluate, check, and plan, and to support decisions. Each classroom produces qualitative and quantitative information related to task and action results.
(7)
Free access: An online webpage, no registration requirement, free use, and a guest profile are features that allow users to explore CloudClassRoom, get to know it, and notice the possibilities of connecting it with learning spaces.
CloudClassRoom is useful because “teachers and students interact with each other by using their own devices, such as PCs, laptops, PDAs, smartphones or tablets at any place, at any time. CloudClassRoom has several features that are more advanced than conventional instant response systems, including text response, multimedia presentation, instant group formation, and teacher-student role swapping” [61]. In addition, CloudClassRoom interrelates the knowledge, comprehension, and application dimensions. CloudClassRoom is an efficient resource for developing formative assessments; it offers open-ended questions, group activities, real-time responses, timely feedback support, and follow-up tools. These characteristics define CloudClassRoom as an example of mobile learning with direct implications for higher education [64].
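The aggregation and grouping functions described above (Analyzing and Grouping) can be sketched in a few lines of Python. This is an illustrative sketch of how an instant-response system might tally multiple-choice answers and form heterogeneous discussion groups; the function names and student data are hypothetical, not CloudClassRoom’s actual implementation:

```python
from collections import Counter, defaultdict

def tally_responses(responses):
    """Aggregate multiple-choice answers into (count, percentage) pairs,
    as an instant-response system might display them to the teacher."""
    counts = Counter(responses.values())
    total = len(responses)
    return {choice: (n, round(100 * n / total, 1)) for choice, n in counts.items()}

def heterogeneous_groups(responses, group_size):
    """Form groups mixing students who chose different answers, so that
    each group contains diverse viewpoints where possible."""
    by_choice = defaultdict(list)
    for student, choice in responses.items():
        by_choice[choice].append(student)
    # Interleave students across answer choices, then chunk into groups.
    pools = list(by_choice.values())
    interleaved = []
    while any(pools):
        for pool in pools:
            if pool:
                interleaved.append(pool.pop(0))
    return [interleaved[i:i + group_size]
            for i in range(0, len(interleaved), group_size)]

# Hypothetical class of six students answering one multiple-choice question.
responses = {"Ann": "A", "Ben": "B", "Cam": "A", "Dia": "C", "Eli": "B", "Fay": "A"}
print(tally_responses(responses))            # → {'A': (3, 50.0), 'B': (2, 33.3), 'C': (1, 16.7)}
print(heterogeneous_groups(responses, 3))    # → [['Ann', 'Ben', 'Dia'], ['Cam', 'Eli', 'Fay']]
```

The interleaving step is one simple way to realize the “heterogeneous responses” grouping the text describes: each group draws, where possible, from students who answered differently.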

3. Methodology and Methods

3.1. Measurements

This study used a self-evaluating questionnaire and an open feedback sheet to evaluate the difference in participants’ TPACK before and after DECODE. Figure 2 shows the flowchart of the research technique.

3.1.1. TPACK Questionnaire

This questionnaire is composed of eight questions developed to evaluate participants’ TPACK. The questionnaire, adapted from Archambault [65,66], was answered on a 6-point Likert scale (strongly agree, agree, slightly agree, slightly disagree, disagree, and strongly disagree), and participants were instructed to select the option that reflected their feelings from the six options. Based on the definition of TPACK, the questions are listed below; the internal consistency reliability score for this questionnaire was 0.88 (Cronbach’s α).
(1)
Content knowledge: “I understand the content in my expert discipline.”
(2)
Pedagogical knowledge: “I understand the various teaching strategies and methods used in the class.”
(3)
Pedagogical content knowledge: “I understand how to present content in my expert discipline and the teaching method conforming to the subject content and students’ level.”
(4)
Technological knowledge: “I understand the interface, operation, and question-making methods in CloudClassRoom”.
(5)
Technological content knowledge: “I understand how to use CloudClassRoom to present the content in my expert discipline conforming to the subject content.”
(6)
Technological pedagogical knowledge: “I understand how to use CloudClassRoom to implement various teaching strategies and evaluations in the class.”
(7)
Technological Pedagogical and Content Knowledge: “I understand what content is suitable for presentation with CloudClassRoom, and can convey knowledge truly in my expert discipline.” “I understand how to use CCR to present the content in my expert discipline and assist students in constructing knowledge well in class.”
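The internal consistency figure reported above (Cronbach’s α = 0.88) can be reproduced from a respondents × items score matrix. The sketch below shows the standard computation; the example data are hypothetical Likert responses coded 1–6, not the authors’ actual data.

```python
import numpy as np

def cronbach_alpha(scores) -> float:
    """Cronbach's alpha for a respondents x items matrix of Likert scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses: 4 respondents x 3 items, coded 1 (strongly
# disagree) to 6 (strongly agree).
responses = [[4, 5, 4],
             [5, 6, 5],
             [3, 4, 3],
             [6, 6, 6]]
alpha = cronbach_alpha(responses)
```

Values above roughly 0.8, such as the 0.88 reported here, are conventionally read as good internal consistency.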

3.1.2. The Questionnaire of Students’ Feedback for Each Group’s Presentation

The questionnaire consisted of two open-ended questions used to obtain participants’ suggestions after each group’s teaching demonstration. The questions were “Please list the advantages of this group” and “Please list the disadvantages of this group.” Content analysis was applied, and the results were used to evaluate the presenting students’ TPACK according to the advantages and disadvantages noted.

3.1.3. The Questionnaire of Students’ Feedback for the DECODE Model

The questionnaire consisted of one open-ended question used to obtain participants’ thoughts on the DECODE model as taught by the teacher. The question was “What do you think about this curriculum and CCR?” Content analysis was applied, and the results were used to evaluate the utility of the DECODE model.

3.2. Procedural and Statistical Analysis

This study was conducted at a university in Taiwan with an average percentile rank of about 95.5. We recruited 60 students, ranging in age from 23 to 25 years, 34 male and 26 female. Participants were from two courses named “Earth science teaching method” and “Science education seminar,” which provide the training of pre-service teachers in science education. Note that their major discipline is a limitation of this study in discussing and generalizing its results.
The participants were randomly divided into groups of two to three. A pre-test and post-test were employed in the research design to evaluate the pre-service teachers’ TPACK. The research process is displayed in Table 2. The total implementation time for DECODE was six hours spread over three weeks. In the first week, the pre-service teachers completed the TPACK questionnaire and learned CCR through the first and second stages of DECODE (Teacher’s Demonstrations and Co-training). In the second week, the groups each designed a course in the third stage of DECODE (Co-design). In the third week, each group presented its course in the co-teaching stage of DECODE and completed the post-test, including the TPACK questionnaire and feedback on DECODE. Figure 3 shows photos of the DECODE course.
The data collection and analysis methods were as follows: (1) The TPACK questionnaire was administered before and after DECODE. A paired-samples t-test was used to evaluate participants’ TPACK, and effect sizes were reported because of the small sample size. (2) Content analysis was employed to evaluate participants’ feedback after each group’s presentation as evidence of participants’ performance. We also assessed pre-service teachers’ TPACK according to their concept maps, teaching materials, and CCR questions [47,67,68,69]. First, a concept map represents a person’s content knowledge of the subject; a person who structures the concept map correctly understands the concept well. Second, the teaching material represents a person’s pedagogical knowledge of the teaching method or procedure; a person who edits the teaching material in a careful sequence has developed the teaching idea well. Third, the CCR questions represent a person’s technological pedagogical knowledge of applying educational technology; a person who uses CCR to pose appropriate questions evinces a favorable disposition toward applying technology. Finally, content analysis was used to examine the participants’ feedback on DECODE collected after the course.
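The paired-samples analysis described above can be sketched as follows. The paper does not state which effect-size index was used; Cohen’s d for paired samples (mean difference divided by the standard deviation of the differences) is one common choice and is shown here under that assumption, with hypothetical pre/post scores.

```python
import numpy as np

def paired_t_and_effect(pre, post):
    """Paired-samples t statistic and Cohen's d (paired-design variant).
    The p-value would come from the t distribution with n-1 degrees of
    freedom (e.g., via scipy.stats.t.sf if SciPy is available)."""
    diff = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    n = diff.size
    sd = diff.std(ddof=1)                    # sample SD of the differences
    t = diff.mean() / (sd / np.sqrt(n))      # paired t statistic
    d = diff.mean() / sd                     # Cohen's d for paired samples
    return t, d

# Hypothetical 6-point Likert means for one TPACK construct (4 participants).
pre_scores = [3, 4, 3, 4]
post_scores = [5, 5, 4, 6]
t_stat, cohens_d = paired_t_and_effect(pre_scores, post_scores)
```

By the usual convention, d values of roughly 0.8 or above (as in the 0.80–0.98 range reported in Section 4.1) indicate large effects.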

4. Results

4.1. The Performance of Pre-Service Science Teachers’ TPACK

According to the t-test results (Table 3), the post-test values were significantly higher than the pre-test values for TK, TCK, TPK and TPACK (t(59) = 5.06, p < 0.01; t(59) = 4.98, p < 0.01; t(59) = 5.09, p < 0.01; t(59) = 5.08, p < 0.01). Scores for CK, PK, and PCK were not significantly different between the pre-test and post-test (t(59) = −1.17, p = 0.65; t(59) = 1.29, p = 0.20; t(59) = 0.46, p = 0.13, respectively). The pre-test to post-test effect sizes were high (0.80–0.98). These results show that the pre-service teachers’ technology-related knowledge (e.g., TK, TCK, TPK) was higher in the post-test than in the pre-test, and the standard deviations decreased, indicating that they came to understand the application of CCR for teaching a course appropriately.

4.2. Pre-Service Science Teachers Designed the Course

Figure 4, Figure 5 and Figure 6 show presentations of the CCR-integrated course from each group. Parts (a) show the subject and the concept map of the course, parts (b) the teaching material, and parts (c) the CCR questions in the course. These materials illustrate the pre-service teachers’ course-design performance. The original language is Chinese; the pictures have been translated into English. Two researchers scored each course designed by the pre-service teachers, checking the correspondence among concept maps, teaching materials, and CCR questions. The average score across all courses was 86.95, with an inter-coder reliability of 0.87. This result shows that participants could construct the concept of a scientific subject and design an instructional course with educational technology (CCR).
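The inter-coder reliability of 0.87 between the two raters can be computed in several ways, and the paper does not specify which index was used. For continuous course scores such as these, the Pearson correlation between the two raters’ scores is one common choice; a minimal sketch with hypothetical ratings follows.

```python
import numpy as np

def pearson_reliability(rater_a, rater_b) -> float:
    """Pearson correlation between two raters' scores. Values near 1
    indicate the raters ranked the courses consistently."""
    return float(np.corrcoef(rater_a, rater_b)[0, 1])

# Hypothetical course scores from two independent raters (0-100 scale).
rater_1 = [80, 85, 90, 95]
rater_2 = [82, 84, 91, 96]
reliability = pearson_reliability(rater_1, rater_2)
```

If the scores were instead coded categorically, a chance-corrected index such as Cohen’s kappa would be the more usual choice.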

4.3. The Participants’ Feedback on the DECODE Model

Through the post-test questionnaire, the participants recognized the utility of CCR and valued DECODE as a useful resource. The results revealed that the stages of DECODE assisted the pre-service teachers in developing an educational-technology-integrated course. Feedback on the advantages of DECODE was as follows (the value in front of each feedback note indicates the percentage of participants):
(30%) This curriculum provided abundant practice in designing a technology-based course. We learn more from CCR and DECODE than from traditional curriculum platforms.
(23%) CCR is a technology for discussion. CCR can help us create a course focusing on student discussion.
(17%) DECODE provides opportunities for curriculum application with educational technology.
(10%) DECODE is useful for us to learn about the CCR through teacher demonstrations, and teachers use CCR to ask us about designing the course in this curriculum.
(8%) Understanding the application of CCR through DECODE is practical for us.
(5%) Use CCR to guide students to apply educational technology media, promoting more design ideas to apply said media into developing a course.
(7%) No comment.
On the other hand, the participants noted that the DECODE model could offer better guidelines for the division of tasks, training, and course design, even though it focuses on implementing educational technology. Feedback suggestions for DECODE were as follows (the value in front of each feedback note indicates the percentage of participants):
(25%) A feature of CCR is to create discussions; teachers need more training to ensure that the students have sufficient focus on the questions.
(17%) Because there are many functions in CCR, the teachers who are first-time users of CCR may give up. Thus, it is important to allow more time for teacher training.
(11%) Although much new information content is provided in DECODE, connecting with design ideas does not seem easy.
(7%) The team report should include a division of labor.
(40%) No comment.
After training, the pre-service teachers thought DECODE could help them understand the functions and operation of CCR and assist them in co-designing a CCR-integrated course. Additionally, DECODE helps groups design a course through discussion. However, while DECODE gives a flexible space for discussion and course design, it does not fully guide novices. Hence, each stage of DECODE needs to provide specifics, such as an introduction to CCR’s functions, course-design guidelines, and a description of the group division of labor.

5. Discussion

5.1. TPACK through the Training Model

The increase in technology-related knowledge was similar to other studies, which showed improvement in participants’ performance. If the curriculum is less teacher-centered and more cooperative, it is more attractive and interactive; this process may therefore have enhanced participants’ confidence in the TPK post-test [70,71]. The participants who gave DECODE feedback in this study indicated similar reasons. For example, some pre-service teachers stated, “There is new content for me to learn in the functions and implements of the online classroom,” “The exercise of posing questions with CCR helped me understand how to use an online classroom,” and “DECODE gives an experience for designing a CCR-integrated course.” Participants’ responses revealed the importance of technology use and course design during training.
Our study shows that the CK, PK, and PCK post-test scores are not significantly different from the pre-test scores. The pre-test averages in this study were high and the standard deviations low, which may mean that most participants believed they knew their major discipline well. However, the post-test averages were lower and the standard deviations larger, which may mean that participants’ self-assessments became lower and more divergent. We hypothesized that asking questions about subject knowledge would increase CK; however, the results do not show an increase. One reason may be self-realization through designing [71,72]: when participants practice designing courses through learner-centered pedagogical content knowledge training, they gain an opportunity to examine themselves and realize that there are more teaching methods and more knowledge than they had previously learned and used.
Overall, the DECODE model is useful for technology-related practical training. When comparing studies, it should be noted that the pre- to post-test effect sizes are similar to those of general educational-technology teaching methods (approximately 0.4–1.15) [29,73,74]. This study showed that the DECODE model is effective, considerably affecting participants’ TK, TCK, and TPK. The teaching materials designed by the pre-service teachers can be considered a practical evaluation, and the evaluation scores show that participants can design an appropriate instructional course. The contents of the students’ feedback questionnaire for each group’s presentation also allowed us to assess participants’ performance; additionally, participants who give feedback may examine their own TPACK capabilities [75]. Stated advantages include: “The topic of the course and the questions can arouse students’ interest in learning.” “The course is very relevant to daily life experience and meets the goals of scientific literacy.” “The course can let students understand the importance of science through science journalism or other media.” “Using CCR for students to explore socio-scientific issues is useful.” Stated disadvantages include: “After CCR questioning, the teacher needs to explain whether students’ answers are true or false.” “There are few scientific explanations for students about their answers.” “CCR is only used for measuring but not learning.” From this feedback, we conclude that the pre-service teachers could develop a CCR-integrated course whose goals were to increase interest and deliver science knowledge.

5.2. Adjustment of the Training Model

The DECODE model integrates features from numerous teacher-professional-development training approaches and creates a flexible discussion space through grouping. Simultaneously, group discussion for designing the course lets participants continually adjust toward deliberate practice. In addition, DECODE offers a framework and a strategy based on a pedagogical scenario supported by functions such as asking, grouping, and feedback. DECODE thus helps identify a specific learning demand according to the actors and context of a learning process, in contrast to projects focused on assessing and measuring practices and performances with cognitive indicators and tools but without pedagogical interactions [76]. The participants’ responses indicated that, through DECODE, they considered CCR useful for teaching, consistent with the theory of planned behavior (TPB), which predicts the link among attitude, behavioral intention, and behavior [77].
However, DECODE should further encourage participants to develop CCR-integrated courses that promote technology applications. While DECODE gives a flexible space for discussion and course design, it does not fully guide novices; hence, each stage of DECODE needs to provide specifics, such as an introduction to CCR’s functions, course-design guidelines, and a description of the group division of labor. Among the participants who gave DECODE feedback in this study, some responded, “The textbook in training proved to be of little help for learning how to design a course” and “It is not very easy to understand the textbooks for learning how to teach.” Participants did not gain much pedagogy-related knowledge through DECODE. Because we did not focus on specific teaching knowledge in a scientific subject, PK and PCK may not have increased. PK needs improvement because the course did not sufficiently address the practice of teaching or interaction with students. To elaborate on DECODE, we should focus on building teacher-student interaction and specific instructional cases [28,78].
Still, we encourage pre-service teachers to integrate educational technology into an instructional course so TPK may increase. This study shows that a training process such as DECODE provides a practical opportunity to design educational technology-integrated courses by developing activities to improve and self-evaluate TPACK [29,79].

6. Conclusions and Future Work

With COVID-19’s global impact on society and education, the online classroom has become fundamental for pre-service teachers’ learning and training [21,27]. As a pilot study, this work shows that the DECODE model can facilitate teachers’ use of educational technology in instructional courses, improving their TPACK toward a more connected model that addresses accessible technologies, pedagogy, and subject matter jointly. The characteristics of DECODE were integrated for teaching educational technology. The DECODE model therefore moves pre-service science teachers from passive users of innovative technologies to active designers, content providers, and practitioners of technology-infused science teaching. In this regard, DECODE presents four characteristics as a model: (a) it is conceptually and methodologically structured to support educational research on learning and teaching processes in higher education, (b) it is oriented by methodological guidelines to construct research cases and identify the learning demands of social contexts, (c) it is designed to combine learning experiences with research tools, and (d) it is linked to experimentation on a web-based system.
Building on these characteristics, CCR and DECODE create technology-supported learning scenarios with a clear pedagogical situation model. This pedagogical model is related to research outcomes integrated into the wider field of knowledge-technology-curriculum. CCR and DECODE contribute to identifying the critical factors and learning demands of learning/teaching practices in pre-service teacher training. There is a correspondence between CCR functions (questioning, analyzing, grouping, testing, managing, reviewing, free use) and a pedagogical interaction pattern: asking, group assignment, assessment and feedback, discussion, and question refinement, among others. These aspects create a scenario for experimenting and testing. An online classroom system influences TK and PK, while CK depends on the specific social and teaching context. From an integrated perspective, the following aspects should be considered: (a) disciplinary-based knowledge (CK) is an agent of pedagogical interaction, whose main scenario is the demonstration, representing student performance as pedagogical experimentation; (b) an environment supported by technology serves as a pedagogical situation model based on collaboration processes and active teacher and student roles; (c) students are pedagogical actors, given that their reflections are related to designer and researcher practices.
Our findings show that DECODE allowed participants to learn an educational technology, CloudClassRoom, and develop a CCR-integrated course. We demonstrated the DECODE process, which integrates four key ideas. First, the teacher demonstration in the first stage guided participants in learning the functions of CCR and when to ask a question in a course; this stage can improve participants’ TK. Second, grouping participants can reproduce the teacher-student interaction of an online classroom: participants can play the role of teacher or student to learn the corresponding CCR functions, and they are encouraged to discuss the viewpoints of the different roles. This stage can improve participants’ TPK. Third, grouping participants to develop an instructional course cooperatively lets them construct the concept of the subject carefully through discussion.
Additionally, DECODE can inspire participants to propose their opinions and discuss them with others while designing instructional courses; this stage can enhance participants’ TCK and TPK. Fourth, the course the participants designed may lead them to reflect on the usefulness of CCR and of the course, and to elaborate on their teaching material and method; this stage improves participants’ TPACK. This study integrated four training ideas and created a flexible discussion space for groups and a practical process based on designing and teaching, which helps users acquire the experience and self-reflection needed to design an online-classroom-integrated course. After DECODE, participants gained a working understanding of TK, TCK, TPK, and TPACK. Although implementing these four stages expands the utility of DECODE for training pre-service teachers’ TPACK with CCR, certain aspects of the operation of DECODE should be revised: given that DECODE offers a flexible space for discussion and course design, it still needs to provide guidance on course design and grouping.
In the future, we will build formative assessment into the DECODE stages to evaluate participants’ TPACK, improve the learning process, and help us elaborate DECODE. In line with the fourth Sustainable Development Goal (SDG), “Quality Education,” ongoing studies will improve the quality of information and communication technology courses. Future plans include implementing the DECODE model with other educational technologies and subjects, such as the social sciences and science-technology-engineering-mathematics (STEM).

Author Contributions

Conceptualization, P.-H.C., C.-Y.C., M.-C.L., and H.-H.L.; Data curation, P.-H.C., M.-C.L., and C.-Y.C.; Formal analysis, P.-H.C.; Funding acquisition, C.-Y.C.; Investigation, P.-H.C., and C.-Y.C.; Methodology, P.-H.C., and C.-Y.C.; Project administration, P.-H.C., and C.-Y.C.; Resources, C.-Y.C.; Supervision, C.-Y.C.; Validation, P.-H.C., and C.-Y.C.; Visualization, P.-H.C., J.M., M.-C.L., and H.-H.L.; Writing—original draft, P.-H.C., and J.M.; Writing—review & editing, J.M., and C.-Y.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was financially supported by the National Science Council of Taiwan under contracts the MOST 107-2634-F-008-003, the MOST 110-2423-H-003-003 and the “Institute for Research Excellence in Learning Sciences” of National Taiwan Normal University (NTNU) from The Featured Areas Research Center Program within the framework of the Higher Education Sprout Project by the Ministry of Education (MOE) in Taiwan.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declared no potential conflict of interest with respect to the research, authorship, and/or publication of this article.

References

  1. Sánchez Ruiz, L.M.; Moll-López, S.; Moraño-Fernández, J.A.; Llobregat-Gómez, N. B-learning and technology: Enablers for university education resilience. An experience case under COVID-19 in Spain. Sustainability 2021, 13, 3532. [Google Scholar] [CrossRef]
  2. Md Yunus, M.; Ang, W.S.; Hashim, H. Factors affecting teaching English as a Second Language (TESL) postgraduate students’ behavioural intention for online learning during the COVID-19 pandemic. Sustainability 2021, 13, 3524. [Google Scholar] [CrossRef]
  3. Nicolò, G.; Aversano, N.; Sannino, G.; Tartaglia Polcini, P. Investigating web-based sustainability reporting in Italian public universities in the era of COVID-19. Sustainability 2021, 13, 3468. [Google Scholar] [CrossRef]
  4. Liu, C.; McCabe, M.; Dawson, A.; Cyrzon, C.; Shankar, S.; Gerges, N.; Kellett-Renzella, S.; Chye, Y.; Cornish, K. Identifying predictors of university students’ wellbeing during the COVID-19 pandemic—A data-driven approach. Int. J. Environ. Res. Public Health 2021, 18, 6730. [Google Scholar] [CrossRef]
  5. Yang, Z.; Liu, Q. Research and development of web-based virtual online classroom. Comput. Educ. 2007, 48, 171–184. [Google Scholar] [CrossRef]
  6. Bickle, M.C.; Rucker, R. Student-to-student interaction: Humanizing the online classroom using technology and group assignments. Q. Rev. Distance Educ. 2018, 19, 1–11, 56. [Google Scholar]
  7. Palvia, S.; Aeron, P.; Gupta, P.; Mahapatra, D.; Parida, R.; Rosner, R.; Sindhi, S. Online Education: Worldwide Status, Challenges, Trends, and Implications; Taylor & Francis: Oxfordshire, UK, 2018. [Google Scholar]
  8. Zafar, H.; Akhtar, S.H. Analyzing the effectiveness of activity based teaching and traditional teaching method through students’ achievement in sub domain knowledge at secondary level. Lang. India 2021, 21, 149–162. [Google Scholar]
  9. Noreen, R.; Rana, A.M.K. Activity-based teaching versus traditional method of teaching in mathematics at elementary level. Bull. Educ. Res. 2019, 41, 145–159. [Google Scholar]
  10. Hokor, E.K.; Sedofia, J. Developing probabilistic reasoning in preservice teachers: Comparing the learner-centered and teacher-centered approaches of teaching. Int. J. Stud. Educ. Sci. 2021, 2, 120–145. [Google Scholar]
  11. Houston, L. Efficient strategies for integrating universal design for learning in the online classroom. J. Educ. Online 2018, 15, n3. [Google Scholar] [CrossRef]
  12. Davis, N.L.; Gough, M.; Taylor, L.L. Online teaching: Advantages, obstacles and tools for getting it right. J. Teach. Travel Tour. 2019, 19, 256–263. [Google Scholar] [CrossRef]
  13. Dumford, A.D.; Miller, A.L. Online learning in higher education: Exploring advantages and disadvantages for engagement. J. Comput. High. Educ. 2018, 30, 452–465. [Google Scholar] [CrossRef]
  14. Stöhr, C.; Demazière, C.; Adawi, T. The polarizing effect of the online flipped classroom. Comput. Educ. 2020, 147, 103789. [Google Scholar] [CrossRef]
  15. Tang, T.; Abuhmaid, A.M.; Olaimat, M.; Oudat, D.M.; Aldhaeebi, M.; Bamanger, E. Efficiency of flipped classroom with online-based teaching under COVID-19. Interact. Learn. Environ. 2020, 28, 1–2. [Google Scholar] [CrossRef]
  16. Muthuprasad, T.; Aiswarya, S.; Aditya, K.; Jha, G.K. Students’ perception and preference for online education in India during COVID-19 pandemic. Soc. Sci. Humanit. Open 2021, 3, 100101. [Google Scholar] [CrossRef]
  17. Dhawan, S. Online learning: A panacea in the time of COVID-19 crisis. J. Educ. Technol. Syst. 2020, 49, 5–22. [Google Scholar] [CrossRef]
  18. Steele, J.P.; Robertson, S.N.; Mandernach, B.J. Beyond Content: The value of instructor-student connections in the online classroom. J. Scholarsh. Teach. Learn. 2018, 18, 130–150. [Google Scholar]
  19. Yang, X.; Zhang, M.; Kong, L.; Wang, Q.; Hong, J.-C. The effects of scientific self-efficacy and cognitive anxiety on science engagement with the “question-observation-doing-explanation” model during school disruption in COVID-19 pandemic. J. Sci. Educ. Technol. 2021, 30, 380–393. [Google Scholar] [CrossRef]
  20. Almahasees, Z.; Mohsen, K.; Amin, M. Faculty’s and students’ perceptions of online learning during COVID-19. Front. Educ. 2021, 6, 638470. [Google Scholar] [CrossRef]
  21. Singh, V.; Thurman, A. How many ways can we define online learning? A systematic literature review of definitions of online learning (1988–2018). Am. J. Distance Educ. 2019, 33, 289–306. [Google Scholar] [CrossRef]
  22. Batubara, B.M. The problems of the world of education in the middle of the COVID-19 pandemic. Bp. Int. Res. Crit. Inst. Humanit. Soc. Sci. Humanit. Open 2021, 4, 450–457. [Google Scholar] [CrossRef]
  23. Husni Rahiem, M.D. Indonesian university students’ likes and dislikes about emergency remote learning during the COVID-19 pandemic. Asian J. Univ. Educ. 2021, 17, 1–18. [Google Scholar] [CrossRef]
  24. Moorhouse, B.L. Adaptations to a face-to-face initial teacher education course ‘forced’ online due to the COVID-19 pandemic. J. Educ. Teach. 2020, 46, 609–611. [Google Scholar] [CrossRef] [Green Version]
  25. Zhu, X.; Liu, J. Education in and after COVID-19: Immediate responses and long-term visions. Postdigital Sci. Educ. 2020, 2, 695–699. [Google Scholar] [CrossRef] [Green Version]
  26. Donnelly, R.; Patrinos, H.A. Learning loss during COVID-19: An early systematic review. Prospects 2021, 1–9. [Google Scholar] [CrossRef]
  27. Hartshorne, R.; Baumgartner, E.; Kaplan-Rakowski, R.; Mouza, C.; Ferdig, R.E. Special issue editorial: Preservice and inservice professional development during the COVID-19 pandemic. J. Technol. Teach. Educ. 2020, 28, 137–147. [Google Scholar]
  28. Koehler, M.J.; Mishra, P.; Yahya, K. Tracing the development of teacher knowledge in a design seminar: Integrating content, pedagogy and technology. Comput. Educ. 2007, 49, 740–762. [Google Scholar] [CrossRef]
  29. Angeli, C.; Valanides, N. Epistemological and methodological issues for the conceptualization, development, and assessment of ICT–TPCK: Advances in technological pedagogical content knowledge (TPCK). Comput. Educ. 2009, 52, 154–168. [Google Scholar] [CrossRef]
  30. Eshet, Y. Digital literacy: A conceptual framework for survival skills in the digital era. J. Educ. Multimed. Hypermedia 2004, 13, 93–106. [Google Scholar]
  31. Philipsen, B.; Tondeur, J.; Pareja Roblin, N.; Vanslambrouck, S.; Zhu, C. Improving teacher professional development for online and blended learning: A systematic meta-aggregative review. Educ. Technol. Res. Dev. 2019, 67, 1145–1174. [Google Scholar] [CrossRef]
  32. Díaz-Noguera, M.D.; Hervás-Gómez, C.; la Calle-Cabrera, D.; María, A.; López-Meneses, E. Autonomy, motivation, and digital pedagogy are key factors in the perceptions of Spanish higher-education students toward online learning during the COVID-19 pandemic. Int. J. Environ. Res. Public Health 2022, 19, 654. [Google Scholar] [CrossRef] [PubMed]
  33. Angeli, C.; Valanides, N. Technology mapping: An approach for developing technological pedagogical content knowledge. J. Educ. Comput. Res. 2013, 48, 199–221. [Google Scholar] [CrossRef]
  34. Oyedotun, T.D. Sudden change of pedagogy in education driven by COVID-19: Perspectives and evaluation from a developing country. Res. Glob. 2020, 2, 100029. [Google Scholar] [CrossRef]
  35. Jang, S.-J.; Tsai, M.-F. Exploring the TPACK of Taiwanese secondary school science teachers using a new contextualized TPACK model. Australas. J. Educ. Technol. 2013, 29, 566–580. [Google Scholar] [CrossRef] [Green Version]
  36. Mishra, P.; Koehler, M.J. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  37. Koehler, M.; Mishra, P. What is technological pedagogical content knowledge (TPACK)? Contemp. Issues Technol. Teach. Educ. 2009, 9, 60–70. [Google Scholar] [CrossRef] [Green Version]
  38. Hechter, R.P.; Phyfe, L.D.; Vermette, L.A. Integrating technology in education: Moving the TPCK framework towards practical applications. Educ. Res. Perspect. 2012, 39, 136. [Google Scholar]
  39. Araújo Filho, R.; Gitirana, V. Pre-service teachers’ knowledge: Analysis of teachers’ education situation based on TPACK. Math. Enthus. 2022, 19, 594–631. [Google Scholar] [CrossRef]
  40. Supriatna, N.; Abbas, E.W.; Rini, T.P.W.; Subiyakto, B. Technological, pedagogical, content knowledge (TPACK): A discursions in learning innovation on social studies. Innov. Soc. Stud. J. 2020, 2, 135–142. [Google Scholar]
  41. Gómez-Trigueros, I.M.; Yáñez de Aldecoa, C. The digital gender gap in teacher education: The TPACK framework for the 21st century. Eur. J. Investig. Health Psychol. Educ. 2021, 11, 1333–1349. [Google Scholar] [CrossRef]
42. Chuang, H.-H.; Weng, C.-Y.; Huang, F.-C. A structure equation model among factors of teachers' technology integration practice and their TPCK. Comput. Educ. 2015, 86, 182–191.
43. Agyei, D.D.; Voogt, J. Developing technological pedagogical content knowledge in pre-service mathematics teachers through collaborative design. Australas. J. Educ. Technol. 2012, 28, 547–564.
44. Khan, S. A model for integrating ICT into teacher training programs in Bangladesh based on TPCK. Int. J. Educ. Dev. Using ICT 2014, 10, 21–31.
45. Niess, M.L. Preparing teachers to teach science and mathematics with technology: Developing a technology pedagogical content knowledge. Teach. Teach. Educ. 2005, 21, 509–523.
46. Guzey, S.S.; Roehrig, G.H. Teaching science with technology: Case studies of science teachers' development of technological pedagogical content knowledge (TPCK). Contemp. Issues Technol. Teach. Educ. 2009, 9, 25–45.
47. Chang, C.-Y.; Chien, Y.-T.; Chang, Y.-H.; Lin, C.-Y. MAGDAIRE: A model to foster pre-service teachers' ability in integrating ICT and teaching in Taiwan. Australas. J. Educ. Technol. 2012, 28, 983–999.
48. Tondeur, J.; Scherer, R.; Siddiq, F.; Baran, E. A comprehensive investigation of TPACK within pre-service teachers' ICT profiles: Mind the gap! Australas. J. Educ. Technol. 2017, 33, 46–60.
49. Swallow, M.J.; Olofson, M.W. Contextual understandings in the TPACK framework. J. Res. Technol. Educ. 2017, 49, 228–244.
50. Srisawasdi, N. The role of TPACK in physics classroom: Case studies of pre-service physics teachers. Procedia-Soc. Behav. Sci. 2012, 46, 3235–3243.
51. Angeli, C.; Valanides, N. TPCK in pre-service teacher education: Preparing primary education students to teach with technology. In Proceedings of the AERA Annual Conference, New York, NY, USA, 24–28 March 2018.
52. Pamuk, S. Understanding pre-service teachers' technology use through TPACK framework. J. Comput. Assist. Learn. 2012, 28, 425–439.
53. Belda-Medina, J. ICTs and Project-Based Learning (PBL) in EFL: Pre-service teachers' attitudes and digital skills. Int. J. Appl. Linguist. Engl. Lit. 2021, 10, 63–70.
54. Grossman, R.; Salas, E.; Pavlas, D.; Rosen, M.A. Using instructional features to enhance demonstration-based training in management education. Acad. Manag. Learn. Educ. 2013, 12, 219–243.
55. Brown, J.S.; Collins, A.; Duguid, P. Situated cognition and the culture of learning. Educ. Res. 1989, 18, 32–42.
56. Rodríguez Moreno, J.; Agreda Montoro, M.; Ortiz Colón, A.M. Changes in teacher training within the TPACK model framework: A systematic review. Sustainability 2019, 11, 1870.
57. Kramarski, B.; Michalsky, T. Three metacognitive approaches to training pre-service teachers in different learning phases of technological pedagogical content knowledge. Educ. Res. Eval. 2009, 15, 465–485.
58. Koehler, M.J.; Mishra, P. What happens when teachers design educational technology? The development of technological pedagogical content knowledge. J. Educ. Comput. Res. 2005, 32, 131–152.
59. Zach, S. Co-teaching—An approach for enhancing teaching-learning collaboration in physical education teacher education (PETE). J. Phys. Educ. Sport 2020, 20, 1402–1407.
60. Chien, Y.-T.; Chang, C.-Y. Supporting socio-scientific argumentation in the classroom through automatic group formation based on students' real-time responses. In Science Education in East Asia; Khine, M.S., Ed.; Springer: Cham, Switzerland, 2015; pp. 549–563.
61. Liou, W.-K.; Bhagat, K.K.; Chang, C.-Y. Beyond the flipped classroom: A highly interactive cloud-classroom (HIC) embedded into basic materials science courses. J. Sci. Educ. Technol. 2016, 25, 460–473.
62. Chien, Y.-T.; Lee, Y.-H.; Li, T.-Y.; Chang, C.-Y. Examining the effects of displaying clicker voting results on high school students' voting behaviors, discussion processes, and learning outcomes. Eurasia J. Math. Sci. Technol. Educ. 2015, 11, 1089–1104.
63. Chien, Y.-T.; Chang, Y.-H.; Chang, C.-Y. Do we click in the right way? A meta-analytic review of clicker-integrated instruction. Educ. Res. Rev. 2016, 17, 1–18.
64. Alexander, B.; Ashford-Rowe, K.; Barajas-Murph, N.; Dobbin, G.; Knott, J.; McCormack, M.; Pomerantz, J.; Seilhamer, R.; Weber, N. Horizon Report 2019 Higher Education Edition; EDU19: Boulder, CO, USA, 2019.
65. Archambault, L.M.; Barnett, J.H. Revisiting technological pedagogical content knowledge: Exploring the TPACK framework. Comput. Educ. 2010, 55, 1656–1662.
66. Schmidt, D.A.; Baran, E.; Thompson, A.D.; Mishra, P.; Koehler, M.J.; Shin, T.S. Technological pedagogical content knowledge (TPACK): The development and validation of an assessment instrument for pre-service teachers. J. Res. Technol. Educ. 2009, 42, 123–149.
67. Jeon, E.S.; Kim, S.L. Analyses of early childhood teachers' concept maps on economic education. Int. J. Adv. Cult. Technol. 2019, 7, 43–48.
68. Wu, P.-H. Effects of real-time diagnosis on mobile learning and self-regulation mechanism on students' learning achievement and behavior of using concept mapping. Int. J. Digit. Learn. Technol. 2017, 9, 1–27.
69. Wang, Y.H. The effectiveness of integrating teaching strategies into IRS activities to facilitate learning. J. Comput. Assist. Learn. 2017, 33, 35–50.
70. Kurt, G.; Mishra, P.; Kocoglu, Z. Technological pedagogical content knowledge development of Turkish pre-service teachers of English. In Proceedings of the Society for Information Technology & Teacher Education International Conference, New Orleans, LA, USA, 25 March 2013; pp. 5073–5077.
71. Agyei, D.D.; Keengwe, J. Using technology pedagogical content knowledge development to enhance learning outcomes. Educ. Inf. Technol. 2014, 19, 155–171.
72. Kleickmann, T.; Richter, D.; Kunter, M.; Elsner, J.; Besser, M.; Krauss, S.; Baumert, J. Teachers' content knowledge and pedagogical content knowledge: The role of structural differences in teacher education. J. Teach. Educ. 2013, 64, 90–106.
73. Durdu, L.; Dag, F. Pre-service teachers' TPACK development and conceptions through a TPACK-based course. Aust. J. Teach. Educ. 2017, 42, 10.
74. Young, J.R.; Young, J.L.; Hamilton, C. The use of confidence intervals as a meta-analytic lens to summarize the effects of teacher education technology courses on pre-service teacher TPACK. J. Res. Technol. Educ. 2013, 46, 149–172.
75. Chien, Y.-T.; Chang, C.-Y.; Yeh, T.-K.; Chang, K.-E. Engaging pre-service science teachers to act as active designers of technology integration: A MAGDAIRE framework. Teach. Teach. Educ. 2012, 28, 578–588.
76. Erna, M.; Elfizar, E.; Dewi, C. The development of E-worksheet using Kvisoft Flipbook Maker software based on lesson study to improve teacher's critical thinking ability. Int. J. Interact. Mob. Technol. 2021, 15, 39–55.
77. Ajzen, I. The theory of planned behavior. Organ. Behav. Hum. Decis. Process. 1991, 50, 179–211.
78. Jang, S.J. Development of a Research-Based Model for Enhancing PCK of Secondary Science Teachers; Nova Science Publishers Inc.: New York, NY, USA, 2009.
79. Dalgarno, N.; Colgan, L. Supporting novice elementary mathematics teachers' induction in professional communities and providing innovative forms of pedagogical content knowledge development through information and communication technology. Teach. Teach. Educ. 2007, 23, 1051–1065.
Figure 1. The functions of CloudClassRoom. (a) questioning, (b) analyzing, (c) grouping, (d) testing, (e) managing, (f) reviewing.
Figure 2. Flowchart of the research procedure.
Figure 3. The procedure of the DECODE model. (a) Teacher demonstration, (b) Students Co-train the CCR, (c) Students Co-design a course, (d) Students Co-teach the module.
Figure 4. A pre-service teacher-designed course on "the evolution of Biology": (a) the subject and concept map, (b) the teaching material, and (c) the CCR questions.
Figure 5. A pre-service teacher-designed course on "the microscope": (a) the subject and concept map, (b) the teaching material, and (c) the CCR questions.
Figure 6. A pre-service teacher-designed course on "earthquakes": (a) the subject and concept map, (b) the teaching material, and (c) the CCR questions.
Table 1. The DECODE Model.

| Feature of the Training Model | Aim | Procedure |
|---|---|---|
| Teacher's Demonstrations | Improving pre-service teachers' TK about ICT. | The educator demonstrates the questioning function of the ICT. |
| Students Co-train the use of ICT | Improving pre-service teachers' TCK with ICT. | Students are grouped into teams of 2–3. Within each group, students take turns acting as teacher and as student to practice operating the ICT. |
| Students Co-design an ICT-integrated course | Improving pre-service teachers' PCK in their major discipline. | Each group develops a course that uses the ICT, producing a concept map, the teaching content, and ICT-based questions. |
| Students Co-teach the course and receive feedback | Improving pre-service teachers' TPACK. | Each group takes turns demonstrating its course to the other students with a briefing and the ICT. After each demonstration, the other students give feedback to that group. |
Table 2. The procedure of the DECODE model in each course.

| Week | Time (min) | Activity |
|---|---|---|
| 1st | 10 | Pre-test: TPACK questionnaire |
| | 50 | DECODE: Teacher's Demonstrations |
| | 60 | DECODE: Students Co-train the CCR |
| 2nd | 120 | DECODE: Students Co-design a course |
| 3rd | 100 | DECODE: Students Co-teach the module and receive feedback |
| | 10 | Post-test: TPACK questionnaire |
| | 10 | Post-test: Feedback for DECODE |
Table 3. The Result of the TPACK Test.

| TPACK Test | Pre-Test Mean | Pre-Test Std | Post-Test Mean | Post-Test Std | t | Effect Size |
|---|---|---|---|---|---|---|
| CK | 4.93 | 0.61 | 4.77 | 1.03 | −1.17 | −0.19 |
| PK | 4.22 | 0.92 | 4.43 | 1.05 | 1.29 | 0.21 |
| PCK | 4.52 | 0.91 | 4.58 | 0.96 | 0.46 | 0.06 |
| TK | 3.58 | 1.33 | 4.50 | 0.93 | 5.06 ** | 0.80 |
| TCK | 3.28 | 1.40 | 4.45 | 1.06 | 4.98 ** | 0.94 |
| TPK | 3.22 | 1.30 | 4.38 | 1.06 | 5.09 ** | 0.98 |
| TPACK | 3.42 | 1.32 | 4.51 | 0.88 | 5.08 ** | 0.97 |
| TPACK test | 3.80 | 0.87 | 4.52 | 0.84 | 4.35 ** | 0.84 |

** p < 0.01.
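The effect sizes reported in Table 3 are consistent with Cohen's d computed from the pre- and post-test means using the pooled standard deviation of the two measurements. A minimal sketch, assuming that formula (the function name `cohens_d` is illustrative, not from the paper):

```python
import math

def cohens_d(pre_mean: float, pre_sd: float,
             post_mean: float, post_sd: float) -> float:
    """Cohen's d: the mean difference divided by the pooled SD of the two tests."""
    pooled_sd = math.sqrt((pre_sd ** 2 + post_sd ** 2) / 2)
    return (post_mean - pre_mean) / pooled_sd

# TK row of Table 3: pre-test 3.58 (SD 1.33), post-test 4.50 (SD 0.93)
print(round(cohens_d(3.58, 1.33, 4.50, 0.93), 2))  # 0.8, matching the table
```

Applying the same calculation to the TCK and TPK rows reproduces the reported 0.94 and 0.98, which supports reading the "Effect Size" column as pooled-SD Cohen's d.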
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.