1. Introduction
Teaching climate change is a demanding task: climate change is a complex topic that spans multiple disciplines, and it remains socially controversial. Different social, political, economic, and scientific groups pursue different objectives in the context of climate change and its public perception. Accordingly, there are deliberate efforts to distort the scientific facts around climate change through the distribution of misinformation [1]. This makes it even more challenging to teach anthropogenic climate change, especially when teachers are directly confronted with misinformation in the classroom. Although there are undoubtedly teachers who have profound knowledge in the area of climate change, many teachers report that they feel underprepared for teaching climate change in their classrooms [
2]. Their knowledge about climate change may even stem from media sources that do not reflect a scientific point of view and portray it as highly controversial [
3,
4], or they might generally lack knowledge regarding climate change [
5]. As an example, many biology teachers in the U.S. report that they do not feel well prepared to teach about climate-related topics [
6], and pre-service teachers see preconceived notions and opinions from the media and the social debate as the most difficult factor in teaching about climate change [
7]. In general, the responsibility to interpret scientific information, for example on social media, is increasingly being shifted towards the public and media consumers. Höttecke and Allchin [8] therefore propose integrating media literacy into science teaching.
To teach (socially) controversial issues such as climate change, Oulton, Dillon, and Grace [
9] suggest that pedagogical approaches should focus on the nature of the controversy: they should make students aware that a person’s stance on an issue is shaped by their worldview, and they should encourage teachers to share their personal views with pupils and to make explicit how they arrived at their own stance.
We think that, without explicit training on teaching climate change, it might be difficult for teachers to express how they arrived at their stance without understanding the arguments that speak against it; in our case, against the stance that current climate change is human-caused [
10]. Sezen-Barrie, Shea, and Borman [
11] see teachers’ awareness of climate change denial arguments, and of these arguments’ weak reliance on multiple lines of evidence, as the best support for engaging students with the skepticism about climate change that exists in society.
Considering this, we think that sound skills to debunk and refute climate change misinformation are a core requisite for science teachers, since these strategies have been shown to be effective in responding to misinformation [
12]. In our study, we developed a learning environment that aims to improve physics pre-service teachers’ (PSTs) skills to debunk climate change misinformation and to apply ideas of inoculation theory [
13] in their future teaching.
1.1. Responding to Misinformation Using Inoculation Theory
There are several ways to address misinformation, not only in educational settings, but also in general. One fruitful approach is to prevent misinformation from taking root in the first place [
12]. For educational settings, this would mean that teachers have the skills to teach techniques that allow their students to protect themselves from misinformation they might encounter in the future. Here, the process of inoculation or “prebunking” seems a promising approach to neutralizing the effects of false experts or misinformation, as has been shown in various contexts [
14,
15,
16,
17,
18].
Inoculation theory was first introduced by McGuire in the 1960s [
19,
20]. The main idea is that individuals can be inoculated against misinformation attacks on their attitudes, similar to the way individuals can be immunized against a virus [
14,
21]. An attitudinal inoculation consists of several steps. First, a “threat” that challenges a person’s position is introduced by forewarning people that they may encounter (mis-)information that may challenge their pre-existing beliefs. This first component relates to the recognition that an already existing position is vulnerable to future threats [
22]. Afterwards, one or more (weakened) examples of that (mis-)information are presented and directly refuted in a process called “refutational preemption” or “prebunking” [
17]. This means that an inoculation message is usually a two-sided argument that both questions and supports the attitude or belief being protected. As an example, a simple inoculation message, including a forewarning, crafted to protect students from misinformation regarding the causes of current climate change might look like this:
“You already know that current climate change is human-caused due to greenhouse gas emissions. In the future, some people may try to challenge this idea. For example, some people on the internet, friends, or media outlets may try to convince you that this idea is wrong: climate has changed naturally before, and this is also the predominant factor now. But that is simply untrue, considering the full scientific picture. Greenhouse gases have controlled most changes in the climate in the past. So it comes down to the concentration of greenhouse gases, and this time, we humans are the cause of the increase in greenhouse gases, mainly through CO2 emissions.”
The first part of this example functions as a forewarning that alerts the recipients that a belief they hold might be challenged in the future [13]. Research has shown that such forewarnings boost reactions to an inoculation message. The sentence “climate has changed naturally before, and this is also the predominant factor now” represents the counterargument; Compton et al. (2021) describe this part as functioning like the viral component of a vaccination: it triggers a protective response. The lines “But that is simply untrue. Greenhouse gases have controlled most changes in the climate in the past. This time, we humans are the cause of the increase in greenhouse gases, mainly through CO2 emissions” function as a weakening of the counterargument by refuting it.
Additionally, there are at least two different approaches to inoculation messages, both of which have proven effective in past studies [
23,
24]. Inoculation messages can either be fact-based, where misinformation is refuted by factual explanations [
25], or logic-based, where the logical fallacy used to mislead people is explained [
15,
26]. There is some evidence that a logic-based inoculation especially prevents people from being misled by the same technique in other contexts [
15,
27].
Recent studies even found that actively inoculating adults through online games significantly reduced the perceived reliability of tweets that embedded several common online misinformation strategies [28]. This technique is also used, for example, in the “Cranky Uncle” mobile game [
23] in the context of climate change. In contrast to passive inoculation, where recipients read or receive an inoculation message, active inoculation encourages people to engage during the inoculation process, for example, by hypothetically crafting misinformation using science denial techniques themselves.
1.2. Responding to Misinformation Using Debunking Strategies
In the introduction section, we outlined why we think it is also important for science teachers to have the skills to properly respond to, or “debunk”, climate change misinformation. Inoculation proves especially successful against future exposure to misinformation. Hence, we see it as a fruitful approach for teacher education to instruct pre-service teachers on how to inoculate their future students against science-related misinformation. However, there will be situations in which teachers are confronted with misinformation directly in the classroom without having inoculated their students beforehand. Hence, they should also be able to directly debunk climate change misinformation in those situations. Research has shown that merely correcting misinformation does not usually work and is therefore unlikely to “unstick” misinformation [
12]. Lewandowsky et al. [
12], for example, proposed a three- to four-component process for successful debunking: introduce the fact first, then warn about the upcoming myth, explain how the myth misleads, and finish by reinforcing the fact. However, in order to explain how a myth misleads, teachers must have sound knowledge of the reasoning errors and strategies used in spreading misinformation.
Cook, Ellerton, and Kinkead [29], for example, propose deconstructing climate misinformation by using argumentation analysis to identify such reasoning errors. A commonly used framework covering the rhetorical techniques used in misinformation is the FLICC taxonomy [
29]. This taxonomy refers to five primary techniques of science denial: fake experts, logical fallacies, impossible expectations, cherry picking, and conspiracy theories. Hence, the basic idea is to identify reasoning errors in order to perform an argument-based refutation rather than a fact-based refutation. Furthermore, Cook, Bedford, and Mandia [
30] propose that “agnotology-based learning”, or the “study of how and why ignorance or misconceptions exist”, might be a successful approach for teaching conceptual understanding of climate change, since it has the additional potential to foster students’ argumentative skills [
31].
As discussed above, in-service teachers frequently feel ill-prepared to teach climate change. If climate change education is to be implemented at full scale in our educational systems (and we think this is necessary to raise awareness and action in society), we cannot afford insecure teachers who may even put off teaching climate change due to their insecurities. Therefore, we followed a strategy to equip physics pre-service teachers with essential knowledge about how to respond to climate change misinformation and how to inoculate future students against it. To facilitate this, we developed a learning environment that takes up both ideas: inoculation against, and debunking of, climate change misinformation. Based on these ideas, we identified a set of research questions guiding an explorative study to determine whether our chosen approach should be explored in more detail in future studies.
2. Method
In total, we derived four research questions that we want to answer in this article. We see this study as an explorative case study: its main purpose is to gain first insights into our approach combining inoculation and debunking, to generate hypotheses, and to identify fruitful directions for future studies. Hence, our research questions reflect this explorative character.
2.1. Research Questions
Research question 1 (RQ1): Does an intervention based on active inoculation and debunking tasks in the context of climate change contribute to an increase in physics pre-service teachers’ perceived knowledge about climate change?
Although the development of conceptual understanding of climate change was not in the foreground, we hypothesized that the intervention would contribute to an increase in the PSTs’ perceived knowledge about climate change. If so, future studies should use conceptual knowledge tests on climate change.
Research question 2 (RQ2): Does an intervention based on active inoculation and debunking tasks contribute to an increase in physics pre-service teachers’ debunking skills?
In this explorative study, we were interested in whether our chosen approach seems fruitful to support the development of debunking skills. Hence, we hypothesized that the intervention contributes to an increase in the PSTs’ debunking skills.
Research question 3 (RQ3): Does an intervention based on active inoculation and debunking tasks contribute to an increase in the participating physics pre-service teachers’ climate myth debunking self-efficacy?
Regarding research question 3, we were interested in two different aspects. On the one hand, we wanted to find hints as to whether the PSTs’ myth debunking self-efficacy increases during the intervention. This would be one indication that our approach is promising for fostering PSTs’ myth debunking self-efficacy and should hence be picked up in future studies. On the other hand, we were interested in whether the PSTs’ myth debunking self-efficacy correlates with their actual debunking skills. Hence, we measured the PSTs’ myth debunking self-efficacy at four points in time, as outlined in the next section. We hypothesized that the intervention would contribute to an increase in the PSTs’ self-efficacy and that the correlation between self-efficacy and debunking skills would be higher after the intervention. The results concerning these aspects of our explorative study can serve as signposts for research on future, similar interventions.
Research question 4 (RQ4): How do students evaluate the learning environment based on active inoculation and debunking?
Since this study represents a first exploration, we were additionally interested in how the students perceived the intervention and whether they enjoyed participating. Due to the current importance of the topic, we hypothesized that the PSTs would evaluate the learning environment positively.
2.2. Study Design
The explorative study was carried out in a pre-post format. In the pretest, the pre-service physics teachers filled in a questionnaire on demographics (see demographics below), climate change beliefs, and their perceived knowledge about climate change (see measures below). Furthermore, they completed a debunking task to measure the PSTs’ debunking skills. Additionally, we measured the PSTs’ climate myth debunking self-efficacy before and after completion of the debunking task as part of the measurement.
For the posttest, the pre-service teachers were again asked about their climate change beliefs and perceived knowledge about climate change. The PSTs were confronted with a debunking task, and their climate myth debunking self-efficacy was measured before and after the debunking task. Overall, the PSTs’ myth debunking self-efficacy was measured four times: before (t1pre) and after (t2pre) the debunking task in the pretest, and again before (t1post) and after (t2post) the debunking task in the posttest. We chose to measure the PSTs’ self-efficacy at four points in time because we assumed that the pre-service teachers might overestimate their ability to debunk climate change myths; the idea behind this assumption was that they might misjudge their ability due to their lack of experience in debunking. Hence, a comparison based only on measurements taken before the debunking task in the pre- and posttest might be misleading, since the PSTs’ self-efficacy might differ before and after the debunking task (t1 and t2), but not between the pre- and posttest before the corresponding debunking task. Furthermore, we developed the idea that the match between the PSTs’ self-efficacy and their debunking skills can be seen as a measure of the PSTs’ self-assessment of their debunking skills. Finally, the participating students’ intrinsic motivation regarding the learning environment was measured based on self-report, and we asked them three open questions regarding their perception of the learning environment.
2.3. Sample
The study was conducted during a physics teacher preparation course about digital media in physics education. This course is a compulsory part of the high school teacher education program and aims to support students’ digital competencies. The study was conducted between October 2020 and January 2021. Participation in this study was voluntary for the students; out of the 28 PSTs enrolled in the course, 20 participated. Hence, our sample consists of
n = 20 physics pre-service teachers at a bachelor level. Personal data of these students can be found in
Table 1.
2.4. Intervention
The main goal of our learning environment was to foster physics pre-service teachers’ debunking skills in the context of climate change in order to prepare them for teaching this topic. The learning environment consists of three intervention phases; each phase lasts about 3 h in a course setting at university and additionally includes one assignment that has to be completed at home.
In the first phase of the learning environment, the PSTs briefly learn about core scientific principles of climate change in a direct instruction format. Furthermore, they are introduced to science denial techniques that are typically used to cast doubt on climate change [
24]. Subsequently, students participate in an active inoculation against climate change misinformation [
27]. In this setting, the participating pre-service teachers have to slip into the role of a climate change denier and forge a misinformation document (a “blog entry”) based on specific given science denial strategies. The PSTs work in pairs, and each pair is assigned two specific science denial strategies and corresponding “climate myths” that they are asked to incorporate into their misinformation document.
Table 2 shows the assigned myths and corresponding science denial techniques.
The overarching idea of this first phase of the learning environment is to support students in refreshing and consolidating their knowledge of the scientific underpinnings of climate change. The active inoculation serves to support students in developing their debunking skills, since this approach has proven promising in the past [
28]. We see this as a prerequisite for developing meta-skills to inoculate their future students.
In the second phase of the learning environment, the student pairs exchange the documents created in the first phase. As a result, each pair works on a misinformation document that includes different climate myths than the ones incorporated in their own document. Their task is to spot the climate myths and climate denial strategies used by their peers and to write a text (a commentary on the blog entry) in which they debunk the addressed climate change myths according to known debunking strategies [
12]. In processing this task, the PSTs need to incorporate their scientific knowledge regarding climate change in combination with their knowledge about science denial strategies. The PSTs were asked to use a known debunking strategy that adheres to a fact-myth-fallacy-fact structure as described in the debunking handbook [
12]. The students were expected to first state the scientific fact that corresponds to the climate change myth, then warn the recipients about the upcoming myth and address it. Afterwards, the PSTs should explain how the addressed myth misleads and finish their response by reinforcing the scientific fact stated at the beginning.
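To make this expected four-part structure concrete, it can be sketched as a small text template in R, the software used for the analyses in this study. The helper function and its example arguments below are hypothetical illustrations, not part of the actual intervention materials:

```r
# A minimal sketch of the fact-myth-fallacy-fact structure; the helper and its
# example arguments are illustrative and not taken from the study materials.
compose_debunking <- function(fact, myth, fallacy) {
  paste(
    fact,                                                # 1. lead with the fact
    paste0("A common myth claims that ", myth, "."),     # 2. warn about the myth
    paste0("This is misleading because ", fallacy, "."), # 3. explain the fallacy
    fact,                                                # 4. reinforce the fact
    sep = " "
  )
}

compose_debunking(
  fact    = "Greenhouse gases have driven most past changes in climate.",
  myth    = "climate has changed naturally before, so humans cannot be the cause now",
  fallacy = "a natural cause in the past does not rule out a human cause today"
)
```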
In the last phase of the learning environment, the PSTs evaluate and reflect on whether they successfully spotted the climate myths and science denial strategies. After that, they plan lessons on how the ideas and concepts they have encountered can be transferred into school settings in their future teaching. An overview of the three phases of the intervention can be found in
Figure 1.
2.5. Measures
2.5.1. Climate Change Beliefs
To get an idea of the participants’ climate change beliefs, we asked them five questions regarding their beliefs about climate change in general. The items were based on previous studies [
15,
17] and translated into German.
Table 3 shows the corresponding items and the scaling of the answers.
2.5.2. Perceived Climate Change Knowledge
To get an idea of participants’ knowledge about climate change, we asked them to estimate their climate change knowledge. Hence, we measured participants’ perceived climate change knowledge on a scale from 0 (I know very little about that) to 6 (I know a lot about that).
2.5.3. Climate Myth Debunking Self-Efficacy
In order to assess pre-service teachers’ self-efficacy regarding the debunking of climate change myths, we developed a five-item 7-point Likert scale ranging from 0 (I do not agree) to 6 (I completely agree), with an internal consistency (Cronbach’s alpha) of α = 0.93. The wording of the items, as well as the item correlations with the total score (corrected for item overlap and scale reliability), can be found in
Table 4.
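As a sketch of how such a reliability analysis can be carried out in R (the software used for the analyses in this study), assuming a hypothetical data frame se_items that holds the five item responses:

```r
library(psych)  # provides the alpha() function

# se_items: hypothetical responses (rows = participants, columns = items, 0-6);
# the values below are invented stand-ins for the actual data
se_items <- data.frame(
  i1 = c(4, 5, 3, 6, 2), i2 = c(4, 6, 3, 5, 2),
  i3 = c(3, 5, 4, 6, 1), i4 = c(4, 5, 3, 6, 2), i5 = c(5, 6, 4, 5, 2)
)

res <- alpha(se_items)
res$total$raw_alpha    # Cronbach's alpha (reported value: 0.93)
res$item.stats$r.drop  # item-total correlations corrected for item overlap
```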
2.5.4. Debunking Skills
To measure participants’ skills in debunking climate change myths and/or climate change misinformation, we used a debunking task based on real climate change misinformation. The misinformation document was a shortened guest commentary published in a state-owned Austrian online newspaper. Overall, we used three different paragraphs of the climate change misinformation document: one section was identical in the debunking task in the pre- and posttest, one section was presented only in the pretest, and one section was presented only in the posttest; all three sections were of approximately equal length. The task for the participants was to read, identify, and debunk possible climate change misinformation presented in specific sections of the document.
To assess the PSTs’ debunking skills, we used two different indicators, debunking score and debunking quality, based on a rubric we developed. We used these two measures to get an idea of both the quality of the PSTs’ debunking and their fluency in debunking: we think that when PSTs are confronted with climate myths in classrooms, they not only need the skills to debunk climate myths well, but also have to be able to react immediately within a favorable time window. According to the rubric, we assessed students’ debunking skills using a scale from 0 to 3; the rubric is explained in
Table 5. We scored one point when the person identified a climate myth but was not able to state the corresponding fact or explain why it is a climate myth. Two points were scored when the PST identified the myth and was able to state the corresponding fact, but did not explain why the myth is false or misleading. Finally, three points were scored when the PST correctly identified the climate myth, stated the corresponding scientific fact, and explained the fallacy of the myth, hence adhering to the debunking strategy described in
Section 1.2 and
Section 2.4. The PSTs’ answers were coded by three people: the first author of this article and two people who were made familiar with the coding scheme. To measure the agreement between the coders, we calculated Fleiss’ kappa, which indicated almost perfect agreement. In cases where the evaluations differed, the codings of the first author of this article were used. The maximum score for the task was 27 points for both the pre- and posttest; however, the time given for this task was limited to 15 min. Due to this time restriction, we did not expect any of the participants to achieve more than half of the points in the pretest.
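As an illustration of how inter-rater agreement for three coders can be computed in R, here is a minimal sketch using kappam.fleiss() from the irr package; the ratings below are hypothetical stand-ins for the actual codings:

```r
library(irr)  # provides kappam.fleiss()

# ratings: one row per coded answer, one column per coder, each cell holding
# the rubric score (0-3); these values are hypothetical
ratings <- matrix(
  c(3, 3, 3,
    1, 1, 2,
    0, 0, 0),
  ncol = 3, byrow = TRUE
)

kappam.fleiss(ratings)  # kappa > 0.81 is conventionally read as almost perfect
```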
The debunking score was calculated as the sum of the points awarded on the debunking task according to the rubric, and the debunking quality as the mean points awarded per debunked argument. We calculated both measures for the section of the document that was presented in both the pre- and posttest, so that we could directly compare whether the debunking score and quality increased.
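Under these definitions, both indicators follow directly from the rubric points; a minimal sketch, assuming a hypothetical data frame rubric with one row per participant and coded argument:

```r
library(dplyr)

# rubric: hypothetical scores, one row per participant and coded argument,
# restricted to the section shared between pre- and posttest
rubric <- data.frame(
  participant = c(1, 1, 1, 2, 2),
  points      = c(3, 1, 0, 2, 2)  # rubric points (0-3)
)

rubric |>
  group_by(participant) |>
  summarise(
    debunking_score   = sum(points),  # fluency: total points on the task
    debunking_quality = mean(points)  # quality: mean points per argument
  )
```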
2.5.5. Intrinsic Motivation
To measure participants’ self-reported intrinsic motivation during the intervention, we used six translated items of the intrinsic motivation inventory developed by Deci and Ryan [32], with a 4-point Likert scale ranging from 0 (I do not agree) to 3 (I totally agree) and an internal consistency (Cronbach’s alpha) of α = 0.88.
2.5.6. Feedback
To get an idea about the participants’ impressions/perceptions of the intervention, we asked them three open questions at the end of the intervention:
1. What did you particularly like about the intervention?
2. What would you change about the intervention?
3. What else do you want to say about the intervention?
3. Results
3.1. Descriptive Statistics
As a first step, the study variables were analyzed regarding their mean score, range, and standard deviation to identify possible floor or ceiling effects (see
Table 6). No such effects were found for the relevant variables.
In the next sections, the proposed research questions are discussed based on the collected data. The analysis of the data was carried out with either paired t-tests or Wilcoxon signed-rank tests, depending on the distribution of the variables. For this, we used the software R (version 4.2.0) [
33].
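A minimal sketch of this decision rule in R, assuming (since the article does not name the normality check) that a Shapiro-Wilk test on the paired differences was used; the data are hypothetical:

```r
# pre, post: hypothetical paired measurements of one study variable (n = 20)
set.seed(1)
pre  <- sample(0:6, 20, replace = TRUE)
post <- pmin(pre + sample(0:2, 20, replace = TRUE), 6)

d <- post - pre

if (shapiro.test(d)$p.value > 0.05) {
  t.test(post, pre, paired = TRUE)       # differences compatible with normality
} else {
  wilcox.test(post, pre, paired = TRUE)  # Wilcoxon signed-rank test (statistic V)
}
```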
3.2. RQ1: Perceived Knowledge
To answer research question 1, concerning the PSTs’ perceived climate change knowledge, we calculated a Wilcoxon signed-rank test to investigate whether the participating PSTs increased their perceived knowledge about climate change. The pre-service teachers’ perceived knowledge about climate change increased significantly between the pre- and posttest (V = 8.5, p < 0.05), with a medium effect size of r = 0.3.
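The article does not state how the effect size r was derived; one common approximation recovers the z value from the two-sided p-value of the test, as in this hypothetical sketch (conventionally, r ≈ 0.1 is small, 0.3 medium, and 0.5 large):

```r
# knowledge_pre, knowledge_post: hypothetical paired ratings (0-6), n = 20
set.seed(2)
knowledge_pre  <- sample(1:5, 20, replace = TRUE)
knowledge_post <- pmin(knowledge_pre + sample(0:2, 20, replace = TRUE), 6)

wt <- wilcox.test(knowledge_post, knowledge_pre, paired = TRUE)

# z recovered from the two-sided p-value; effect size r = |z| / sqrt(n)
z <- qnorm(wt$p.value / 2)
r <- abs(z) / sqrt(length(knowledge_pre))
```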
3.3. RQ2: Debunking Score & Quality
To answer research question 2, the debunking scores for the pre- and posttest were compared using a
t-test. A boxplot for the debunking scores is displayed in
Figure 2. The PSTs’ debunking score increased significantly during the intervention (t(19) = 2.63, p < 0.05), with a medium effect size of d = 0.59.
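A sketch of this comparison, including one common paired-samples variant of Cohen’s d computed from the difference scores (the article does not specify which variant was used; the data are hypothetical):

```r
# score_pre, score_post: hypothetical paired debunking scores, n = 20
set.seed(3)
score_pre  <- rpois(20, 3)
score_post <- score_pre + rpois(20, 2)

t.test(score_post, score_pre, paired = TRUE)  # reported: t(19) = 2.63, p < 0.05

diffs <- score_post - score_pre
d_z <- mean(diffs) / sd(diffs)  # Cohen's d for dependent samples (reported: 0.59)
```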
Additionally, the debunking quality in the pre- and posttest was compared using a Wilcoxon signed-rank test. A boxplot for the debunking quality, measured as the mean score per debunked argument, can be seen in
Figure 3. The PSTs’ debunking quality also increased significantly during the intervention (V = 131.5, p < 0.001), with a medium effect size of r = 0.30.
All in all, at the aggregate level, we can see that both the participating PSTs’ debunking score and the quality of their debunking increased in the posttest.
3.4. RQ3: Self-Efficacy
Since self-efficacy corresponds to an individual’s belief in their ability to perform the actions necessary to produce specific outcomes, we were firstly interested in how confident the PSTs felt when confronted with common climate myths. Secondly, we were interested in whether the students’ self-efficacy matched their actual debunking skills as measured in this study, and thirdly, whether the intervention contributed to a better self-assessment. Consequently, we measured the PSTs’ self-efficacy at four points in time. The first two measurements were performed directly before (t1pre) and after (t2pre) the debunking task in the pretest. Measurements three and four were performed directly before (t1post) and after (t2post) the debunking task in the posttest.
Figure 4 shows a boxplot of all four measurements.
Several analyses were performed to answer research question three. First, we performed a t-test to analyze whether the students’ climate myth debunking self-efficacy changed between the measurement before the first debunking task in the pretest (t1pre) and the measurement after the debunking task in the posttest (t2post). No significant difference was found between these two measurements (t(19) = 0.80, p = 0.44). Therefore, at first sight, one could conclude that the PSTs’ self-efficacy did not change during the intervention.
As a next step, we were interested in whether there is a relationship between the PSTs’ debunking self-efficacy and their demonstrated debunking skills for the data collected in the pre- and posttest. Therefore, for each test (pretest and posttest), we calculated the correlation between the climate myth debunking self-efficacy before the debunking task (t1pre and t1post) and the debunking score on the task. The correlation coefficient hence gives information about the relationship between the PSTs’ climate myth debunking self-efficacy and their debunking skills at different points in time: the higher the correlation coefficient, the more in line the participants’ self-efficacy is with their demonstrated skills.
For the pretest, no significant correlation was found between the t1pre climate myth debunking self-efficacy and the debunking skills measured in the pretest (p = 0.269). For the posttest, a positive relationship between the t1post climate myth debunking self-efficacy and students’ debunking skills, as measured in the posttest, was found, with a correlation coefficient of r = 0.51, p < 0.05. Hence, we concluded that the intervention helped the PSTs to self-assess their debunking skills more realistically.
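A minimal sketch of this check, assuming Pearson correlations (the article reports r but does not name the coefficient); both vectors are hypothetical:

```r
# se_t1post: hypothetical mean self-efficacy scores before the posttest task (0-6)
# score_post: hypothetical debunking scores in the posttest
se_t1post  <- c(3.2, 4.0, 2.8, 5.0, 3.6, 4.4, 2.4, 3.8, 4.2, 3.0)
score_post <- c(4, 6, 3, 8, 5, 7, 2, 5, 6, 4)

cor.test(se_t1post, score_post)  # reported for the posttest: r = 0.51, p < 0.05
```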
Next, we were interested in how the debunking task as part of the test, i.e., being confronted with actual climate science denial arguments, affected the PSTs’ climate myth debunking self-efficacy. To do so, we calculated t-tests to compare the students’ self-efficacy directly before and after the debunking task for both the pre- and posttest. For the pretest, we found a significant decrease between the t1pre and t2pre measurements (t(19) = −5.03, p < 0.001), with a large effect size of Cohen’s d = 1.12. For the posttest, no significant difference between the t1post and t2post measurements was found (t(19) = −0.90, p = 0.379).
From these findings, we concluded that the students adjusted their climate myth debunking self-efficacy when confronted with climate science denial in the pretest, but not in the posttest, since it was by then more in line with their actual skills. Subsequently, we analyzed whether there was a significant difference between the PSTs’ climate myth debunking self-efficacy after the debunking task in the pretest (t2pre) and after the debunking task in the posttest (t2post). This comparison seems a more appropriate measure for evaluating the effectiveness of the intervention in terms of climate myth debunking self-efficacy. A t-test revealed a significant increase in the PSTs’ climate myth debunking self-efficacy (t(19) = 4.19, p < 0.001), with a large effect size of Cohen’s d = 0.94. This means that not only did the PSTs’ debunking self-efficacy increase during the intervention, but it was also more in line with their actual debunking skills than before the intervention.
3.5. RQ4: Intrinsic Motivation and Feedback
To answer research question 4, we analyzed the PSTs’ self-reported intrinsic motivation for the intervention. The analysis showed very high intrinsic motivation (on a scale from 0 = low intrinsic motivation to 3 = high intrinsic motivation), with an average value of 2.5 ± 0.52; the distribution of the values is shown in
Figure 5.
In the last part of the posttest, we asked the students to answer three open questions regarding the developed intervention to receive feedback for further improvement. Overall, the students especially enjoyed the topicality of the subject, and they particularly embraced getting to know common science denial strategies and how to counter them. Furthermore, 12 of the 20 students explicitly mentioned that they felt better prepared to teach the topic of climate change in their future teaching. This supports our initial assumption that, in order to be self-confident in teaching climate change, science teachers should have a basic understanding of the mechanisms behind (climate) science denial. However, we cannot rule out that an intervention solely focusing on conceptual understanding of climate change might have the same effect. Nonetheless, the students wished for more time for discussions during the course, with seven students even suggesting a full-semester course addressing the topic of science denial.
4. Limitations and Discussion
Before we discuss the results of this exploratory study, we want to mention several limitations associated with it to allow a deeper understanding of the findings. First, we want to discuss the sample and sample size of this study, which are associated with several limitations. Our sample consisted of only 20 PSTs, all of whom were studying at the same university; therefore, their previous experiences and learning opportunities might be more similar than those of PSTs from different universities. Furthermore, our sample was a convenience sample stemming from a specific seminar, in which 20 out of 28 PSTs voluntarily participated. Additionally, one could argue that 20 is a relatively small sample size; however, we expected to detect medium to large effect sizes regarding the development of the PSTs’ debunking skills, for which sample size is less of an issue, and this study was conceptualized as an explorative study. Another limitation can be attributed to the development of the PSTs’ self-efficacy when comparing the t1pre measurement and the t2post measurement (see
Figure 4). Although we did not detect a significant difference, the mean values of the self-efficacy scale differ: the PSTs’ mean self-efficacy is lower in the t2post measurement. There may be a decrease in the participants’ self-efficacy during the intervention (with a rather small effect size) that our sample was too small to detect. This issue should definitely be addressed in future studies.
Another limitation concerns the participating PSTs’ climate change beliefs in terms of their pre-existing attitudes. As the results show, all PSTs already held the belief that climate change is happening and is predominantly caused by humans. Hence, we cannot make any statements about the effect of the intervention on PSTs who hold contrarian views about climate change, i.e., that current climate change is either not human-caused or not happening at all.
Next, we want to discuss the (external) validity of the debunking task and the associated limitations of our study. Although we tried to make the debunking task as realistic as possible by using sections of a real guest commentary, it is clear that the results of our study cannot be directly transferred to real-life settings, since the measurement of the PSTs’ debunking skills took place in a simulated environment. Although we think that the debunking score and quality represent good indicators of the PSTs’ actual debunking skills in real-life situations, we cannot show the correctness of this assumption with the applied methods. Additionally, the debunking task addressed a specific topic, in our case climate change, so we cannot make any statements about the PSTs’ debunking skills for other topics (for example, SARS-CoV). However, there is some evidence that a logic-based inoculation in particular prevents people from being misled by the same technique in other contexts [
15,
27].
Finally, the fact that we asked the participating PSTs only about their perceived climate change knowledge is, of course, a limitation in itself. We would have liked to use a climate change concept inventory or test instrument to measure whether the students’ conceptual knowledge about climate change also increased during our intervention, but to our knowledge, no such test instrument was available in German at the time of our study [
34]. We think that it would be very interesting for future studies to investigate the relationship between participants’ conceptual understanding of climate change and their climate-change-related debunking skills.
Besides these limitations, our exploratory study provides a number of interesting findings that can be built upon in the future. We started our study under the assumption that physics teachers need to be prepared and supported in order to be willing and confident enough to teach a complex, demanding, and socially controversial topic such as climate change. We regarded two aspects as relevant: PSTs do not only need a sound basis regarding their professional knowledge (i.e., content knowledge about climate change), but also debunking skills that help them confront and refute climate misinformation if encountered in an educational setting. Although, in general, the relationship between self-assessed knowledge and actual knowledge can be rather ambiguous, we have nevertheless shown that through participation in our intervention, the perceived knowledge of the participating PSTs increased. This might mostly be due to the first phase of the learning environment, in which we covered the most important scientific underpinnings of climate change; we did not further address conceptual understanding of climate change in phases two and three of the intervention. Phase two focused on logic-based rather than fact-based refutation of climate science denial [
14].
We have shown that an intervention combining active inoculation and explicit debunking tasks can support physics pre-service teachers in improving their debunking skills. The maximum score on the debunking task was 27; however, the mean score of the participating PSTs was only 2.8 points in the pretest and 5.4 points in the posttest. Although there was a time restriction of 15 min for the task, these scores still seem rather low compared with the maximum score. This means that either it took the students a long time to identify the relevant misinformation in the document and they ran out of time, or it was very difficult for them to properly debunk the misinformation at hand. Looking at the mean argument quality, we think that it might be a combination of both. The debunking quality was 1.08 in the pretest and 1.68 in the posttest (the maximum debunking quality was three). However, this value also includes students who did not identify any misinformation in the posttest and were thus attributed a debunking quality of zero; without these students, the mean quality was 2.10 in the posttest. This means that after the intervention, the PSTs were able to identify more false statements in the misinformation document, and the quality of their debunking also increased on average. However, the mean quality of the debunking was still at a level where the students were able to spot the misinformation as such but had considerable difficulty attributing the appropriate type of logical fallacy. Since our intervention was relatively short for the complex topic of debunking and misinformation, we conclude that for future studies or courses on this topic, the intervention should be extended to either cover the debunking techniques in more detail or also address other topics. This also aligns with the feedback from the students, some of whom asked for a complete course addressing science misinformation. Hence, we infer that our approach can be seen as fruitful, but future interventions should be expanded with regard to time and content.
Furthermore, we showed that the PSTs greatly overestimated their actual debunking skills before the intervention, since their mean climate myth debunking self-efficacy decreased significantly when they first encountered climate change misinformation. Through the intervention, however, the participants not only regained their initial level of self-efficacy, but were also able to better self-evaluate their actual debunking skills. This may prompt the PSTs to engage further with the topic if they feel the need to, and may hence contribute positively to their perceived competence to teach climate change.
It was also interesting to us that a few students still did not identify any misinformation in the posttest. We think it would be interesting for future studies to investigate how science teachers approach the debunking of such misinformation documents. A further important aspect for future research is whether there actually is a relationship between science teachers’ debunking skills and their perceived competence to teach the demanding topic of climate change. Additionally, there is the question of whether interventions such as the one proposed in this article should be taken up as mandatory education offers, in a separate course for science pre-service teachers or as part of professional development programs for in-service teachers. This exploratory study provides first hints that support this idea.
In conclusion, building upon the results of this explorative case study, we can formulate a few hypotheses and research issues that should be picked up and expanded in future studies. We think future studies should take a closer look at pre-service teachers’ development of conceptual understanding of climate change when using an intervention that is focused on prebunking and debunking climate change misinformation.
Furthermore, we have shown that our debunking task, using real-world climate change misinformation, is a feasible approach to measuring debunking skills. We want to encourage future studies to investigate other approaches to fostering the development of debunking skills using similar measurements. In addition, the relationship between conceptual understanding or attitudinal beliefs and PSTs’ debunking skills should especially be investigated in future studies.
With this study, we gathered first hints that our chosen approach might be fruitful to incorporate into teacher education programs. We think future studies taking up these ideas should especially use qualitative approaches to portray PSTs’ learning processes during the intervention. This might be interesting because our results show that the PSTs’ debunking skills increased during the intervention, but, on the other hand, they still seemed to struggle to fluently debunk common climate change myths.
Overall, we want to emphasize the importance of incorporating new approaches into teacher education that do not only address digital media or the digital landscape from an effectiveness point of view (e.g., how can we use digital media to enhance learning processes?), but also address how digital media and growing digitality shape how we as consumers interact with science.