Article

Misinformation as a Societal Problem in Times of Crisis: A Mixed-Methods Study with Future Teachers to Promote a Critical Attitude towards Information

by Angelika Bernsteiner 1,*, Thomas Schubatzky 2 and Claudia Haagen-Schützenhöfer 3
1 Centre for Didactics of Natural Sciences and Mathematics, University of Graz, 8010 Graz, Austria
2 Department of Subject-Specific Education and Institute for Experimental Physics, University of Innsbruck, 6020 Innsbruck, Austria
3 Department of Physics Education Research, Institute of Physics, University of Graz, 8010 Graz, Austria
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(10), 8161; https://doi.org/10.3390/su15108161
Submission received: 15 February 2023 / Revised: 15 May 2023 / Accepted: 16 May 2023 / Published: 17 May 2023

Abstract

Global society is facing major challenges, which are to be met by pursuing the Sustainable Development Goals (SDGs). Digitalization processes bring many opportunities for achieving SDGs, but they also bring pitfalls. For example, on one hand, social media makes it easier for more parts of society to participate. On the other hand, the ability to rapidly circulate unfiltered information can lead to the spread of misinformation and subsequently interfere with the achievement of SDGs. This effect could be observed during the COVID-19 pandemic and continues to occur in the context of climate change. Young people are especially likely to be exposed to misinformation on social media. With this in mind, it is enormously important for schools to prepare young people to critically handle the overload of information available online. The aim of this study was to provide future middle and high school teachers with a fruitful approach to foster a critical attitude towards information in classrooms. To this end, we expanded an existing approach by implementing active, technique-based inoculation and technique-based debunking within the COVID-19 content framework in a teacher education course. This implementation was monitored by a mixed-methods study with n = 24 future middle and high school teachers who participated in two courses in subsequent semesters. By performing statistical analysis on pretests and posttests and qualitative content analysis on reflective journal entries, we found that future teachers’ self-efficacy expectations for detecting and debunking misinformation, as well as their debunking skills, increased throughout the courses. In addition, our results show that future teachers perceive active, technique-based inoculation as a helpful approach for their future teaching. They feel that this approach can be a way to implement education for sustainable development in schools with a focus on the promotion of critical thinking. In summary, we believe that the approach presented in this article may be beneficial for teaching the critical treatment of information in various thematic contexts.

1. Introduction

Digitalization is driving social change in various spheres of society. The growing importance of digital technologies presents numerous opportunities for innovative thinking and implementation across a wide range of processes. However, along with these opportunities come challenges. One such challenge is the inundation of unfiltered information to which people are constantly exposed. In order to make sense of this information, individuals must create their own frame of reference to interpret its meaning and gain clarity [1]. This situation has facilitated the spread of misinformation, which has become a significant and perilous problem, particularly when society is facing significant challenges [2,3]. A recent example is the COVID-19 pandemic, during which the consequences of encountering unfiltered information in times of health, economic, and social crises have been highlighted.
The global COVID-19 pandemic posed a huge number of challenges for global society. The World Health Organization (WHO) reported that, as of 17 January 2023, 6,706,305 deaths occurred worldwide due to infection with the SARS-CoV-2 virus [4]. To contain the pandemic, measures such as wearing protective masks, isolating people, and providing vaccinations were implemented [5]. Support of these measures was critical to protect people’s health and stop the pandemic. However, the rapid spread of the SARS-CoV-2 virus was accompanied by the spread of misinformation [6,7,8,9,10,11]. The WHO refers to such a situation as an “infodemic,” in which the spread of too much information, including false or misleading information, fuels uncertainty among people and leads to mistrust in health systems [12]. During the COVID-19 pandemic, there was widespread dissemination of statements about the potential harmful health effects of wearing protective masks, vaccination, and alternative theories regarding the origin of COVID-19 [6,7,8,9,10,11]. Evidence indicates that belief in such alternative ideas, conspiracy theories, and misinformation resulted in inadequate support for measures aimed at containing the pandemic [13,14]. In particular, it has been observed that conspiracy theories and misinformation can negatively impact willingness to get vaccinated [15,16]. Similar detrimental effects of misinformation on crisis management have been observed in other crisis situations; prior to the onset of the COVID-19 pandemic, studies have shown that exposure to misinformation can adversely affect individuals’ willingness to vaccinate [2,3,17].
The COVID-19 pandemic was not only a short-term medical crisis; it has also triggered lasting changes. For example, the United Nations (UN) describes the COVID-19 pandemic as a threat to the progress made in health policy in recent decades [18]. The 2030 Agenda for Sustainable Development, adopted by all UN member states in 2015, pursues 17 SDGs [19]. These goals unite the common aim of peace and well-being for humans and the environment [19]. SDG 3, “ensure healthy lives and promote well-being for all at all ages,” pursues, among other things, the control of pandemics and communicable diseases as well as the prevention of diseases [18]. The course of the COVID-19 pandemic and the associated infodemic have had a strong impact on the achievement of the third SDG [18,20,21] and, thus, on societal prosperity.
The COVID-19 pandemic has likewise influenced the fourth SDG, “ensure inclusive and equitable quality education and promote lifelong learning opportunities for all” [22]. The implementation of distance learning as a pandemic mitigation measure caused a push for digitalization in schools. This increased students’ exposure to digital media and information from the internet [23,24], which in turn increased students’ exposure to misinformation circulating on the internet, including on the topic of COVID-19 [25]. The circulation of misinformation fueled by the COVID-19 pandemic, as in other global crises [2,3], does not stop at the classroom. Thus, the spread of misinformation has become a serious issue in schools [26,27,28]. It is important to train students in school to be aware that information may contain misinformation, to recognize misinformation as such, and to debunk misinformation, or at least not fall for it.
In the context of COVID-19, a critical approach to misinformation may be accompanied by a greater willingness to support mandated anti-COVID-19 measures. In general, a critical approach to information, especially in times of crisis, can lead to the stabilization of trust in science and politics and, even more generally, to the strengthening of sustainable societal development [14,20,21]. Against this background, a report by the UN Secretary-General describes the spread of false information as an existential threat to humanity [29]. Especially in crisis situations, individuals need reliable information in order to be able to make reflective decisions [30].
The primary mission of schools is to prepare students to effectively navigate the opportunities and challenges of everyday life and to equip them with skills to become self-determined and critically reflective participants in society [31,32,33]. The SDGs emphasize that, by 2030, all students should possess the knowledge and skills necessary to advance sustainable development [22]. Critical thinking is recognized as a key competency for promoting sustainable development [34,35,36]. Therefore, education for sustainable development should prioritize fostering a critical approach when engaging with the abundant information available on the internet, social media, and analog sources [34,35,36]. A critical approach enables students to make informed decisions in the context of global problems and crises, such as pandemics, and actively contribute to the achievement of the SDGs [30].
To successfully support students in developing critical information literacy skills and identifying misinformation, it is crucial for teachers to be familiar with methods and approaches for critically evaluating information that is disseminated online and via analog media [37,38]. In line with the emancipatory approach of education for sustainable development, the goal is not for teachers to impose their own opinions and values on students, but to cultivate a critical mindset for reading, interpreting, and evaluating information [39,40].
Our goal is to provide future teachers with a sound set of techniques [41,42,43] to detect and debunk misinformation in a wide variety of contexts, and to train them how to teach these strategies to their students. As part of this study, we implemented learning opportunities for future teachers to promote the detection and debunking of misinformation in the context of the COVID-19 pandemic. Our work builds upon findings of Schubatzky and Haagen-Schützenhöfer [44]. The study was performed according to the principle of inoculation theory [45] and the debunking process introduced by Lewandowsky et al. [41]. We introduced inoculation theory and debunking strategies to future teachers [45] as an approach to enable them and their future students to critically engage with (mis)information [12].

1.1. Inoculation Theory

Inoculation theory has emerged as an effective approach for protecting people from alternative beliefs in a wide variety of contexts [46]. This social-psychological theory was first introduced by McGuire [47]. Comparable to vaccination as a way to immunize people against pathogens, inoculation theory aims to “immunize” people against alternative beliefs. In this process, a person is confronted with an alternative belief that contradicts, and may challenge, their initial belief. However, refutation of the alternative belief should help the person to resist the belief in the future (see Figure 1, “misinformation is expected”).
There are different types of inoculation. Passive inoculation occurs when a message warning about possibly circulating alternative beliefs is spread. In this type of inoculation, people are confronted with an alternative belief that is subsequently refuted (see Figure 1) [46,48,49]. Active inoculation protects against potentially circulating misinformation by working with strategies that are also used by science deniers to spread misinformation [50]. Active inoculation has been applied, for example, in various online games in the context of COVID-19 and climate change [51,52,53].
It is worth noting that the inoculation approach has a limitation [51,54]: it assumes that people’s initial beliefs are consistent with scientifically based beliefs. For example, the inoculation message “You already know that current climate change is human-caused due to greenhouse gas emissions” [41] assumes that people’s initial belief is that climate change is caused by humans [44]. However, for such a controversial topic as climate change, people’s initial beliefs can vary widely. The same is true in regard to the COVID-19 pandemic. For example, some individuals may be inherently more hesitant about COVID-19 vaccination. For such individuals, the following inoculation message may not be convincing: “You already know in the context of diseases such as mumps, measles, or pox that vaccination is an effective disease prevention tool. In the context of the COVID-19 pandemic, people may question the effectiveness of COVID-19 vaccination”. Against the background of this limitation, Compton et al. suggest that the inoculation approach can be effective for helping people whose beliefs align with scientific opinion to build resistance to misinformation [45]. The authors refer to this process as preventive inoculation [45]. Compton et al. show that inoculation can work in a therapeutic way, supporting people holding controversial views in changing their undesired attitudes and building resistance against misinformation [45]. Accordingly, the inoculation approach may be particularly well suited for highly controversial issues.
Some people might argue that another limitation of the inoculation approach is that persuasive knowledge is necessary for refuting the misinformation cited in an inoculation message. However, either content-based argumentation or technique-based argumentation can be used. Content-based refutation can only work when it is based on convincing knowledge [44,51]. The technique-based approach focuses less on the content of the misinformation and more on the logical structure of the argument being debunked [55,56] (see Figure 1 and Section 1.2). In 2009, Diethelm and McKee [42] presented five techniques used by science denialists in a variety of contexts to spread alternative beliefs. These five techniques (later summarized [41] as FLICC techniques; see Section 1.2) led to the idea of technique-based refutation in an inoculation process. When people are aware of the techniques underlying misinformation, they become protected against misinformation in different contexts [55]. There is evidence in different settings supporting the effectiveness of technique-based refutation during inoculation and indicating that learning some techniques might enable people to recognize other techniques and misconceptions [55,56]. Use of technique-based refutation during inoculation can avoid the aforementioned limitation regarding persuasive knowledge.
As stated previously, our aim is to support future teachers in dealing critically with (mis)information and to provide them with an approach to use in the classroom to support their students in engaging critically with (mis)information. We assume that teachers do not always have persuasive knowledge to refute alternative beliefs on different topics. For this reason, we implement the active and technique-based inoculation approaches in teacher training to increase future teachers’ resistance to misinformation.

1.2. Debunking Strategies for Refuting Misinformation

Section 1.1 explained how people can be supported in dealing critically with (mis)information they may encounter by applying either a content-based or technique-based inoculation approach (see Figure 1). As described in the introduction, there is already a great amount of misinformation regarding societal crises such as the COVID-19 pandemic. In the context of the COVID-19 pandemic, for example, alternative beliefs were circulating about the effectiveness of protective masks, vaccination, and the cause of COVID-19 [6,7,8,9,10,11]. Given the number of alternative beliefs on these issues, many people have probably already come into contact with misinformation. It is therefore not always sufficient to protect people from misinformation using inoculation theory. If misinformation is already circulating, it must be debunked directly. Misinformation can be debunked with scientifically sound arguments (content-based) or by uncovering the fallacies underlying the misinformation (technique-based) [41] (see Figure 1).
Lewandowsky et al. [41] show that scientific, content-based debunking of misinformation is not always effective against misinformation. However, a technique-based approach can be effective when it applies the fact-myth-fallacy-fact structure [41]. According to this approach, one states a fact, warns that there is an alternative statement, and then presents the statement. Next, the individual explains why the statement is misleading and not true. This is done in a technique-based manner, with the individual pointing out the fallacies underlying the misinformation. The debunking process is concluded by stating the corresponding fact [41]. Figure 2 contrasts technique-based debunking with argument-based debunking.
Five common fallacies underlying misinformation were introduced by [42] and summarized by Cook et al. [43] using the FLICC taxonomy, which includes fake experts, logical fallacies, impossible expectations, cherry picking, and conspiracy theories. In order to perform technique-based debunking, it is important to have knowledge about FLICC techniques. In-depth background knowledge about the topic is not required [41,43].
As part of this study, we provide future teachers with technique-based debunking skills because misinformation might appear in the classroom without prior warning, and teachers may not have the expertise to present content-based arguments for diverse topics [44]. Therefore, our main aim is not to provide future teachers with extensive and scientifically sound content knowledge, but to familiarize them with FLICC techniques [43]. These techniques can be used by future teachers to perform technique-based inoculation to support their students in dealing critically with (mis)information and to debunk existing misinformation in a technique-based manner.

2. Research Questions

As already mentioned, our study is based on prior work [44]. We address several research questions.
Research Question 1 (RQ1) is as follows: To what extent is the intervention presented in the study of Schubatzky and Haagen-Schützenhöfer [44] transferable to the context of COVID-19?
We think that the approach presented by Schubatzky and Haagen-Schützenhöfer [44], which is based on active, technique-based inoculation (see Section 1.1) and technique-based debunking (see Section 1.2), can be applied to a broad range of contexts, including COVID-19. We adapted the intervention to apply to the context of the COVID-19 pandemic. The intervention we developed is presented in Section 4.2.
Climate change myths are addressed in [43] and debunked using FLICC techniques (see Section 1.2, [43]). However, to our knowledge, such a treatment has not yet been applied to the topic of COVID-19. Schubatzky and Haagen-Schützenhöfer [44] worked with climate change myths and their debunking (elaborated in [43]). In our study, future teachers do not work with predefined misinformation and debunkings related to COVID-19, but produce their own misinformation using FLICC techniques, data they collected (see Section 4.2), and their own searches of COVID-19 misinformation circulating in the media. Then, the future teachers incorporate the misinformation they formulate into blog articles, which are subsequently analyzed by their colleagues to identify the misinformation. In the context of the intervention (see Section 4.2), the future teachers are unaware of the particular misinformation that their colleagues are putting into the blog articles they have to debunk. However, the future teachers are aware of FLICC techniques [43]. We hypothesize that this active engagement with misinformation related to the COVID-19 pandemic will make this study’s approach more difficult to implement, but may be very effective.
Research Question 2 (RQ2) is as follows: Does the intervention, which is based on active inoculation and debunking skills, contribute to an increase in self-assessed knowledge about COVID-19?
In our study, future teachers do not receive content instructions on scientific knowledge in the context of COVID-19. Therefore, we hypothesize that our intervention will have no effect on future teachers’ self-assessment of their knowledge about COVID-19. However, future teachers may believe they have acquired an increased knowledge of COVID-19 due to their engagement with it while producing, recognizing, and debunking misinformation. If future teachers do report increased self-assessed knowledge about COVID-19 without explicit learning opportunities, further research on the development of conceptual knowledge will be needed.
Research Question 3 (RQ3) is as follows: To what extent does the intervention, which is based on active inoculation and debunking skills, contribute to an increase in future teachers’ self-efficacy expectations regarding the detection and debunking of misinformation in the context of COVID-19?
Based on the results of [44], we hypothesize that our intervention will lead to an increase in future teachers’ self-efficacy expectations regarding the detection and debunking of misinformation in the context of COVID-19.
Research Question 4 (RQ4) is as follows: To what extent does the intervention, which is based on active inoculation and debunking skills, contribute to an increase in future teachers’ debunking skills and debunking quality in the context of COVID-19, and what difficulties do future teachers report in this context?
Based on the findings of the preliminary work of Schubatzky and Haagen-Schützenhöfer [44], we hypothesize that our intervention, which is adapted for the COVID-19 context, will contribute to an increase in future teachers’ debunking skills and quality. However, because FLICC techniques (cf. Section 1.2, [43]) are not yet well reported in the COVID-19 domain, we assume that it is difficult for future teachers to debunk misinformation regarding COVID-19 (including the topics of protective masks, vaccination, and COVID-19 origin) using FLICC techniques.
Research Question 5 (RQ5) is as follows: To what extent do future teachers consider the approaches of active inoculation and debunking as useful tools for their teaching?
We expect that future teachers will consider active inoculation and debunking to be helpful approaches to promote critical attitudes toward (mis)information in the classroom.

3. Sample

Our study was conducted in a teacher education course offered in two consecutive semesters. The first course took place in the 2022 summer semester, and the second took place in the 2022/2023 winter semester. The demographic data of the participants were collected before the pretest, which is described in Figure 3. Participation in the study was voluntary.
In the 2022 summer semester, there were n = 17 participants in both the course and our study. Our data set includes n = 15 participants because two participants did not complete either the pretest or posttest and were thus excluded. In the 2022/2023 winter semester, there were n = 13 participants in both the course and the study. Since 4 future teachers did not complete either the pretest or posttest, the data set included n = 9 participants. Thus, in total, n = 24 future teachers fully participated in the study. None of the future teachers participated twice. Table 1 presents a description of the sample.

4. Methods

In their exploratory case study, Schubatzky and Haagen-Schützenhöfer [44] present a fruitful approach for combining active inoculation and debunking to improve debunking skills and quality. Their study investigated future physics teachers’ learning of active inoculation and debunking in the context of climate change. Active inoculation was conducted by requiring participants to frame myths about climate change in a blog article using FLICC techniques. These blog articles were then debunked by other participants using a technique-based approach [44]. To investigate the effectiveness of this intervention, Schubatzky and Haagen-Schützenhöfer [44] used quantitative methods of data collection as well as a debunking task to collect qualitative data with a pre-post design. The data were quantitatively analyzed.
We apply this approach and extend Schubatzky and Haagen-Schützenhöfer’s study [44] on three levels:
  • In our study, the target group of the intervention is extended to future teachers of various subjects, with a focus on STEM-related subjects.
  • Our study was conducted within the contextual framework of the COVID-19 pandemic and therefore provides additional context to the study performed by Schubatzky and Haagen-Schützenhöfer [44]. To our knowledge, an approach using FLICC techniques (cf. Section 1.2, [43]) has rarely been reported in the context of COVID-19. Additionally, COVID-19 is an extremely controversial topic on which future teachers might have many different attitudes, including undesirable ones. We investigate whether the approach of active inoculation and technique-based debunking is also effective in this context for helping students develop skills to critically deal with (mis)information.
  • In our study, we use a mixed-methods approach to answer the research questions described in Section 2. Our study design is presented in Section 4.1.

4.1. Study Design and Measures

Our study replicates the intervention implemented by Schubatzky and Haagen-Schützenhöfer [44], which involved active inoculation and debunking. In our study, participants frame COVID-19 myths in blog articles using FLICC techniques [43]. The blog articles are subsequently debunked by other participants using a technique-based approach. Our intervention is described in Section 4.2. At the research level, we extend the pre-post design used by Schubatzky and Haagen-Schützenhöfer [44] and collect qualitative data (i.e., reflections of the future teachers) to answer the research questions described in Section 2.
Figure 3 shows the pre-post design of our study. The instruments for the pretest and the posttest (perceived knowledge, Section 4.1.1; self-efficacy, Section 4.1.2; debunking task, Section 4.1.3) are strongly based on the work of Schubatzky and Haagen-Schützenhöfer [44]. The pretest is performed before the implementation of active inoculation and technique-based debunking by creating and analyzing blog articles, and the posttest is performed afterward.
In both the pretest and posttest, future teachers carried out a debunking task. Debunking tasks (letters to the editor containing misinformation) were used to collect qualitative data, quantitative analysis of which allows us to infer participants’ debunking skills (see Section 4.1.3). Before starting the debunking task, we assessed future teachers’ perceived knowledge of COVID-19 (see Section 4.1.1) and their self-efficacy expectations (see Section 4.1.2) with regard to detecting and debunking misinformation in the context of COVID-19 (time points t1pre and t2pre; see Figure 3). After completing the debunking task, future teachers’ self-efficacy expectations regarding recognizing and debunking misinformation in the context of COVID-19 were assessed again by both the pretest (time point t2pre) and the posttest (time point t2post).
We surveyed future teachers’ self-efficacy expectations at four time points to determine the extent to which future teachers feel able to identify and debunk misinformation before and after being confronted with misinformation about COVID-19. Future teachers were exposed to misinformation in the pre- and posttests through the debunking task. In the intervention for active inoculation and technique-based debunking that took place between the tests, the future teachers were confronted with misinformation. Across all four time points, we were interested in the development of future teachers’ self-efficacy expectations as a result of their participation in the study.
In addition to the pre-post design shown in Figure 3, qualitative data were collected to answer the research questions. The future teachers compiled a reflective journal as part of the course in which the study was embedded. During the pretest and approximately one week after the posttest, the future teachers wrote two entries in their reflection journals. These were subjected to content analysis [58].

4.1.1. Perceived COVID-19 Knowledge

Perceived knowledge about COVID-19 (RQ2) was measured using a self-assessment with a 7-point scale (0 = I know very little about that; 6 = I know a lot about that). The following question was asked: “How do you rate your knowledge about COVID-19?”

4.1.2. COVID-19 Debunking Self-Efficacy

To survey future teachers’ self-efficacy related to detecting and debunking misinformation about COVID-19 (RQ3), we adapted the “climate myth debunking self-efficacy scale” [44]. Self-efficacy was assessed with items using a 7-point scale (0 = I do not agree; 6 = I completely agree), as shown in Table 2. For this COVID-19 debunking self-efficacy scale, internal consistency was calculated as Cronbach’s alpha α = 0.89. Thus, the internal scale consistency can be described as good.
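To illustrate how such a reliability estimate can be reproduced, the following is a minimal sketch of computing Cronbach's alpha from an item-by-respondent score matrix. The response values and the number of items are purely illustrative assumptions and do not represent the study's data.

    import numpy as np

    def cronbachs_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]                               # number of items on the scale
        sum_item_variances = items.var(axis=0, ddof=1).sum()
        total_score_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - sum_item_variances / total_score_variance)

    # Illustrative 7-point (0-6) responses of five respondents to four hypothetical items
    responses = np.array([
        [5, 4, 5, 4],
        [3, 3, 2, 3],
        [6, 5, 6, 5],
        [4, 4, 3, 4],
        [2, 3, 2, 2],
    ])
    print(round(cronbachs_alpha(responses), 2))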

4.1.3. Debunking Task: Debunking Skills and Debunking Quality

Debunking skills and debunking quality (RQ4) were assessed with a debunking task in the pretest and posttest.
For the debunking task, we chose snippets from real letters to the editor about COVID-19 that were published online by daily newspapers. We wanted to ensure that future teachers are confronted with misinformation that is topical and currently circulating in the media. The texts were selected to ensure that they included a variety of misinformation in terms of content and underlying argumentation techniques (as described in Section 1.2 and [43]). When completing the debunking tasks in the pretest and posttest, study participants were required to identify the misinformation in the text, correct it, and state the fallacies underlying it.
Regarding the course structure (intervention between the pretest and the posttest; see Section 4.2), we dedicated 3 course sessions (each lasting 120 min) to an inquiry-based learning sequence (see Figure 4, phase 1) in which the future teachers worked on research questions about the functioning of protective masks. In the subsequent part of the course (2 course sessions), the focus was placed on inoculation and debunking. As part of the active inoculation (see Figure 4, phase 2), the future teachers worked with misinformation on the subject of protective masks as well as various other misinformation in the context of COVID-19. To support future teachers in developing skills in inoculating against and debunking misinformation, we focused, as mentioned above, on FLICC techniques [43]. It was important to ensure that the pre- and posttests were not confounded with knowledge acquired during the inquiry-based learning sequence and the active inoculation. Therefore, we were careful to exclude statements about protective masks from the debunking task and to include other misinformation in the context of COVID-19.
Two letters to the editor were used to develop the debunking task. Because the survey time was limited to a maximum of 60 min due to the duration of course sessions, a shortened version of one letter to the editor was used for the pretest. For the posttest, parts of the pretest letter were replaced with a snippet of another letter to the editor. This allows for pre-post comparison of the participants’ data and ensures that participants completing the posttest do not already know the full text of the debunking task.
Examples of misinformation from the included snippets are as follows: “The mortality rate of vaccinated persons is 30 times higher than that of unvaccinated persons”, “it is foreseeable that with general vaccination for all people, the overall harm will be greater than the benefit” and “the media report only one-sidedly, negative aspects are not reported”.
For the pre- and posttests, future teachers were asked to identify and debunk misinformation in the letters to the editor and to explain the underlying fallacies (as described in Section 1.2 and [43]). To analyze the future teachers’ debunking skills and quality, we used the 4-level point rubric proposed by Schubatzky and Haagen-Schützenhöfer [44]. Points are assigned based on the quality of future teachers’ debunking of the misinformation in the debunking task. To apply a technique-based approach to debunk misinformation, in general, future teachers first have to recognize misinformation as such in a text. Therefore, future teachers do not receive a point for a relevant passage if they do not recognize the misinformation. One point is awarded if the misinformation is detected but not corrected (see Table 3, statement 1). Two points are awarded if the misinformation is recognized and corrected, or recognized and the underlying fallacy (as described in Section 1.2 and [43]) is explained (see Table 3, statement 2). Three points are awarded if the misinformation is recognized and corrected, and an explanation of the fallacy underlying the misinformation is given [44] (see Table 3, statement 3). The debunking tasks in the pretest and posttest each include 12 myths. Thus, the maximum achievable score in both tests was 36 points (see Table 3).
To determine debunking quality, we calculated the mean score achieved per debunking task. To compare the change in debunking quality between the pretest and posttest, we only used the text segments that appeared in both the pretest and posttest [44]. Future teachers’ answers were scored by two people using the described scoring scale and a coding manual. Intercoder reliability was calculated as Cohen’s Kappa κ = 0.78, which represents substantial agreement between the two raters. If the two raters differed, the rating of the first author of this article was used.
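As an illustration of how the rubric aggregation and the agreement check could be operationalized, the following is a minimal sketch. The point values, the variable names, and the use of scikit-learn's cohen_kappa_score are assumptions for illustration only and do not reproduce the study's actual coding.

    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical rubric points (0-3) assigned by two raters to one participant's
    # answers across the 12 myths of a debunking task; values are illustrative only.
    rater_1 = np.array([0, 1, 2, 2, 3, 1, 0, 2, 2, 1, 3, 2])
    rater_2 = np.array([0, 1, 2, 3, 3, 1, 0, 2, 1, 1, 3, 2])

    debunking_score = rater_1.sum()      # summed score, out of a maximum of 36 points
    debunking_quality = rater_1.mean()   # mean points per myth in this task

    # Intercoder agreement between the two raters over the same passages
    kappa = cohen_kappa_score(rater_1, rater_2)
    print(debunking_score, round(debunking_quality, 2), round(kappa, 2))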

4.1.4. Perceived Implications for School Teaching

The posttest examined the future teachers’ self-assessment of their achievement of the following learning objective addressed in the course: “future teachers know strategies for teaching responsible use of information, including strategies for recognizing misinformation for the mathematics and science classrooms”. For this purpose, a four-point Likert scale (1 = insufficient; 4 = sufficient) was used. The results of this investigation, combined with qualitative information provided by the future teachers in their reflection journals (see Section 4.1.6), are used to answer RQ5.

4.1.5. Statistical Analysis

We analyzed the data from the pre- and posttests both descriptively and using inferential statistics. Before performing the inferential statistical analyses, we conducted power analyses to be able to correctly interpret the results afterward. Depending on the distribution of the data, a t-test or Wilcoxon rank sum test was used to analyze the data in SPSS. A post hoc power analysis shows that, with a sample size of n = 24 and normal distribution of the data, an effect size of d = 0.692 is required for a statistical power of 1 − β = 0.95 and a significance level of α = 0.05 to avoid misinterpretation of the Wilcoxon rank sum test and t-test.
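As a rough illustration of this workflow (sensitivity analysis, normality check, and test selection), the following Python sketch uses SciPy and statsmodels rather than SPSS. The placeholder data, the two-sided default, and the use of the signed-rank variant of the Wilcoxon test for paired data are assumptions, so the detectable effect size it reports will differ somewhat from the paper's value depending on the test and sidedness assumed.

    import numpy as np
    from scipy import stats
    from statsmodels.stats.power import TTestPower

    rng = np.random.default_rng(0)
    pre = rng.integers(0, 7, size=24).astype(float)   # placeholder pretest scores
    post = pre + rng.normal(0.5, 1.0, size=24)        # placeholder posttest scores

    # Post hoc sensitivity analysis: smallest effect size detectable with
    # n = 24, alpha = 0.05 and power = 0.95 for a paired t-test.
    required_d = TTestPower().solve_power(nobs=24, alpha=0.05, power=0.95)
    print(f"smallest detectable effect size d ~ {required_d:.3f}")

    # Choose the test depending on whether the paired differences look normal.
    diff = post - pre
    if stats.shapiro(diff).pvalue > 0.05:
        print(stats.ttest_rel(pre, post))    # paired t-test
    else:
        print(stats.wilcoxon(pre, post))     # non-parametric alternative for paired data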

4.1.6. Qualitative Investigations

At two points in the course (i.e., between the pretest and approximately one week after the posttest), the future teachers reflected on the intervention (described in Section 4.2) and its use for their future teaching, their learning processes, and their self-efficacy expectations regarding the recognition and debunking of misinformation. In addition, qualitative data were collected during the posttest. In the posttest, an open-ended question asked future teachers to indicate how confident they feel in dealing with misinformation and to what extent the learning opportunities in the course were helpful for gaining confidence in this regard. For each future teacher, we performed content analysis [58] on the two brief reflections and the response to the open-ended question. Then, we triangulated the results with the quantitative data. Personalized, quasi-anonymized participant codes were used to link data from the pre- and posttests with information in the reflective journals.

4.1.7. Presentation of the Qualitative Data

When describing the results of this study (see Section 5), we provide information about the qualitative data using an identification code. The first part of this code represents the future teacher. The future teachers (n = 24, see Section 3) were alphabetically sorted according to their personalized, quasi-anonymized participant codes, resulting in, for example, S1. The identification code then indicates the course in which the future teacher participated (for example, C2). Finally, the identification code indicates the source of the future teacher’s statement (for example, R1, indicating a reflection journal, or Q2, indicating a posttest).

4.2. Intervention

The intervention of our study was embedded in a course that presented methods for implementing digitally transformed teaching [59]. Digitally transformed teaching is supposed to include learning with digital media as well as learning about digital media [60,61]. For this reason, and based on competency models [60,62] as well as preliminary findings [63], the course design addresses two content areas: digital data acquisition with Arduino (i.e., learning with digital media) and dealing with (mis)information in a digital society (i.e., learning about digital media) [59].
The study presented in this article relates to the second content area. We believe that, in order to address the challenges of the 21st century, such as pandemics, SDGs, climate change, and digitalization, and the accompanying spread of misinformation, it is necessary for teachers to know strategies to debunk misinformation in their classes and to be able to inoculate their students against misinformation (see Section 1). Our aim to foster critical use of information and critical thinking in general aligns with the core competencies needed to achieve the SDGs [34,35].
The intervention for this study was implemented in the course as follows [44] (see Figure 4). In the first part of the course (phase 1 in Figure 4), future teachers work on research questions in the context of COVID-19. Arduino microcontrollers, 3D-printed model heads, CO2-, O2- and fine dust sensors, and different protective masks are used to collect data on the functionality of FFP2 protective masks [59]. Future teachers investigated the following research questions, among others: To what extent does CO2 accumulate under different protective masks? Do protective masks protect against fine dust? When wearing a protective mask, can less oxygen be inhaled than without a mask?
In the second part of the course, future teachers receive a theoretical introduction to misinformation in general and to inoculation theory specifically [53]. To clarify the concept of misinformation, future teachers collect various misinformation related to the context of COVID-19 (e.g., masks, vaccination, treatment, and the origin of COVID-19) and discuss it in the course. In addition, FLICC techniques [43] and the debunking process presented by Cook et al. [43] are explained (beginning of phase 2; see Figure 4).
Subsequently (phase 2), future teachers work in small groups and take on the role of a hypothetical disseminator of misinformation. They have to write a document in the style of a blog entry related to COVID-19. In doing so, they have to use the data they collected in phase 1 and transform it into misinformation. Future teachers are given two of the five FLICC techniques [43], which they must use to present the misinformation. Future teachers are asked to write the blog entry in a way that mixes misinformation with science-based arguments so that the subsequent debunking process is more realistic and difficult for the other future teachers. This active, technique-based inoculation is intended to enhance the debunking skills of future teachers, as has been empirically shown in various studies [44,50,54,55,56].
The process of active inoculation in our intervention differs somewhat from the intervention performed by Schubatzky and Haagen-Schützenhöfer [44]. In their study, future teachers had precise specifications regarding the misinformation to be included in the blog article. In our study, future teachers were only given specifications regarding the FLICC techniques [43] to be used. Misinformation is formulated individually based on the data collected using Arduino in phase 1 of the intervention.
In phase 3 of the intervention (see Figure 4), the future teachers are provided the blog articles created by their fellow participants. They are asked to identify the misinformation in the texts, debunk it, and explain the underlying argumentation techniques [43]. The aim of this phase is to introduce the future teachers to the process of technique-based debunking. This is expected to enable them to debunk misinformation that their students bring to their classroom without the need for extensive content knowledge.
In phase 4 of the intervention (see Figure 4), the process of generating blog posts for active inoculation and the process of debunking are discussed in plenary. The opportunities and challenges of these strategies are addressed and possible implications for teaching are discussed.
This very learner-centered intervention to promote critical approaches to information is consistent with the findings of Corres et al. [36]. In their review, the authors state that learner-centered learning opportunities can promote critical thinking as a core competency for achieving the SDGs [36]. With the intervention implemented in our study, future teachers experience learner-centered learning opportunities for themselves and learn about them for use in their future teaching. The presented intervention aligns with the emancipatory approach of education for sustainable development. Opinions about COVID-19 are not imposed upon future teachers; rather, the future teachers are presented with techniques to critically interpret information [39,40].

5. Results

This section presents the results of the study according to the research questions formulated in Section 2. Descriptive statistics, results of statistical calculations, and results of the qualitative analyses are presented and triangulated in line with the research questions. After the results for each research question are presented, they are interpreted.

5.1. RQ1: Intervention

To transfer the approach of active inoculation and debunking to the context of COVID-19, we developed an intervention for active inoculation in which future teachers produce misinformation themselves. The future teachers used data they collected during the course on the functionality of FFP2 protective masks. Using FLICC techniques [43], future teachers created misinformation using the collected data. The misinformation was written in the form of a blog article.
The following excerpts were taken from blog posts written by future teachers. The following statement includes the misleading strategy “cherry-picking” [43]: “The study, conducted by a team of leading respiratory experts, found that wearing a face mask causes the discrepancy between delivered and ingested oxygen to increase with increasing rate and depth of simulated breathing” (S7S11C2B). The misleading technique “conspiracy theory” [43] was implemented in the following statement: “However, these masks are only meant to protect our illegal, fascist government in order to suppress the voice of the crowd and cover up the truth. It is hereby urged to boycott the wearing of the masks immediately and not expose our children to such danger” (S5S17C2B).
In order to reveal the future teachers’ perspectives on this active inoculation, evaluative content analysis was performed on their reflective journal entries, in line with Kuckartz [58]. The category “blog entry” and the subcategories “creating misinformation: difficult”, “creating misinformation: easy”, “debunking misinformation: difficult” and “debunking misinformation: easy” were generated deductively. The future teachers’ statements were assigned to these categories in a communicatively validating manner. In the reflections, 7 of the 24 future teachers indicated that they found it difficult to write misinformation independently. The future teachers mentioned the difficulties of packaging misinformation in a way that appears as credible as possible and applies the FLICC techniques [43]. One future teacher commented, “It would have been helpful not only to discuss the FLICC techniques theoretically, but also to discuss and expose them in advance by using text passages” (translated from German) (S23C1R1). However, 3 of the 24 future teachers found this task easy. Three future teachers indicated that they found analyzing their peers’ blog posts to be easy, and three indicated that they found it to be difficult. In particular, detecting FLICC techniques [43] was identified as a difficulty.
All in all, these results show that the intervention presented in the study of Schubatzky and Haagen-Schützenhöfer [44] is also suitable for implementation in the COVID-19 context. However, the fact that future teachers were not given specific misinformation to use in their blog article-like document made it difficult for them to practice active inoculation.

5.2. RQ2: Perceived Knowledge about COVID-19

Table 4 presents descriptive statistics regarding perceived knowledge about COVID-19. Future teachers’ self-assessment data regarding their knowledge of COVID-19 in the pre- and posttest are not normally distributed, as assessed by the Shapiro-Wilk test (pretest: p = 0.005; posttest: p < 0.001). Therefore, using a Wilcoxon rank sum test, we can answer RQ2: the intervention does not lead to a statistically significant change in perceived knowledge about COVID-19 (z = −0.243, p = 0.808, n = 24).
As we hypothesized, our intervention did not increase future teachers’ perceived knowledge of COVID-19 because there were no explicit learning opportunities related to it. Instead, future teachers implicitly dealt with the issue of COVID-19 through the production of misinformation and the detection and debunking of misinformation.

5.3. RQ3: Self-Efficacy

To answer RQ3, we analyzed quantitative survey data on future teachers’ self-efficacy expectations regarding the detection and debunking of misinformation as well as future teachers’ written reflections. Future teachers’ self-efficacy expectancy was quantitatively surveyed twice at the pretest (t1pre and t2pre) and twice at the posttest (t1post and t2post) before and after the debunking task (Figure 3). Qualitative data were collected from the future teachers’ reflection journals. The journals were produced in the period between the pretest and one week after the posttest. Table 5 presents descriptive statistics of the determined self-efficacy score.
Table 5 shows that future teachers indicated higher self-efficacy before working on the debunking task than they did after working on the debunking task. Statistically, there was no significant difference between future teachers’ responses at t1pre and t1post (t(23) = 1.952, p = 0.063). Future teachers reported higher self-efficacy on the posttest than on the pretest (see Table 5). The posttest also shows that future teachers reported lower self-efficacy after completing the debunking task than they did before the task. Again, no statistically significant difference was shown between future teachers’ responses at t2pre and t2post (t(23) = 0.860, p = 0.399).
One future teacher described the process of developing self-efficacy in the reflection journal in a way that aligns with the quantitative data in Table 5: “In the first part of the (…) survey, I indicated that I was not so bad at dealing with misinformation and so on. After a rather tricky task during this survey, however, I realized that I had clearly overestimated myself. It’s not quite so easy to identify these “fakes” from a text. This was followed by a few units where we learned how to deal with these fakes, and these were very helpful in improving me in the process” (translated from German) (S1C1R1).
To detect a statistical change in future teachers’ responses regarding their self-efficacy expectancy for dealing with misinformation, a t-test was performed and the responses at t1pre (before the first debunking task) and t2post (after the second debunking task) were compared. The results show that future teachers’ self-efficacy expectancy significantly increased (t(23) = −3.527, p = 0.002), with a medium effect (d = 0.721).
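To make the reported effect size concrete, the following is a minimal sketch of how Cohen's d for paired pre-post data can be computed alongside the paired t-test; it uses the d_z convention (mean difference divided by the standard deviation of the differences), which is one of several possible conventions and is an assumption here, and the score vectors are invented for illustration rather than taken from the study.

    import numpy as np
    from scipy import stats

    def paired_cohens_d(pre: np.ndarray, post: np.ndarray) -> float:
        """Cohen's d_z: mean of the paired differences over their standard deviation."""
        diff = post - pre
        return diff.mean() / diff.std(ddof=1)

    # Illustrative self-efficacy scores at t1pre and t2post (not the study's data)
    pre = np.array([3.2, 4.0, 2.8, 3.5, 4.1, 3.0, 3.8, 2.5])
    post = np.array([4.0, 4.5, 3.1, 4.2, 4.0, 3.6, 4.4, 3.0])

    t_stat, p_value = stats.ttest_rel(pre, post)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}, d_z = {paired_cohens_d(pre, post):.3f}")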
Analysis of the data for each future teacher shows that 4 future teachers reported a lower self-efficacy expectancy on the posttest (t2post) than on the pretest (t1pre). Two future teachers reported the same self-efficacy expectancy at the pretest (t1pre) and the posttest (t2post).
In addition to quantitative data analysis, we performed evaluative content analysis on the future teachers’ written reflections, in line with Kuckartz [58]. To answer RQ3, the two main categories “positive self-efficacy expectancy” and “negative self-efficacy expectancy” were defined deductively. The future teachers’ statements in their reflections related to self-efficacy were assigned to these two main categories. Assignment to the categories followed a communicative validation process.
Eight of the 24 future teachers reported positive self-efficacy expectancy for dealing with misinformation, referring not only to the topic of COVID-19. Figure 5 presents the quantitative data associated with these teachers from t1pre and t2post in the form of boxplots and contrasts them with qualitative data from the future teachers’ written reflections. Only one future teacher expressed negative self-efficacy expectancy: “debunking fake news remains difficult, even if there are quite nice techniques for doing so” (translated from German) (S14C1R1). At the quantitative level, this future teacher indicated higher self-efficacy expectancy on the posttest than on the pretest. Fifteen future teachers did not reflect on their self-efficacy expectations to detect and debunk misinformation.
Overall, the hypothesis for RQ3 can be verified. Future teachers report higher self-efficacy expectations after the intervention than they do before. This result can be inferred from both the quantitative and qualitative data. In addition, the future teachers’ reflections provide initial indications that self-efficacy expectations with regard to detecting and debunking misinformation increase in general, not only in the context of COVID-19.

5.4. RQ4: Debunking Skills and Debunking Quality

Table 6 presents descriptive statistics regarding future teachers’ debunking scores, which were determined using the 4-level scoring rubric (see Section 4.1.3). To answer RQ4, debunking scores from the pretest were compared to debunking scores from the posttest. The results show that debunking scores increased significantly (t(23) = −2.506, p = 0.020), with a medium effect (d = 0.512).
Table 6 also shows descriptive statistics on the debunking quality of the future teachers at the pretest and posttest. Debunking quality was calculated as the mean score for debunked misinformation using the scoring system presented in Section 4.1.3. Debunking quality from the pretest was compared to debunking quality from the posttest using a t-test. No significant change was found. Future teachers tended to give answers awarded 2 points (see Table 3) in the posttest, such as the following: “Wrong, the body forms the ‘antibodies’ itself” (S13C1Q2) and “All COVID vaccines have been carefully reviewed by the EMA and only then approved” (S8C1Q2). In the posttest, only a few answers were awarded 3 points (see Table 3), including the following: “The title ‘Dr.’ suggests medical credibility and conveys trust, yet Mr. Lutz is a psychologist and not a virologist—that makes a big difference” (S8C1Q2).
At the qualitative level, evaluative content analysis was performed on future teachers’ reflections, in line with Kuckartz [58]. The categories “difficulties in recognizing and debunking misinformation” and “simplicity of detection and debunking of misinformation” were created deductively. The subcategories “FLICC techniques”, “background knowledge” and “recognizing misinformation” were created inductively based on future teachers’ statements.
Statements from 4 of the future teachers could be assigned to the FLICC techniques subcategory. These future teachers expressed that it was difficult for them to recognize underlying FLICC techniques [43]. One future teacher expressed that debunking misinformation was easy due to knowledge of FLICC techniques [43]. Four future teachers indicated that, for them, prior knowledge concerning the content, the study, and the debunking method is crucial for determining whether recognizing and debunking misinformation is easy or difficult. Three future teachers indicated that recognizing and debunking misinformation was easy for them because of existing background knowledge. Four future teachers expressed that it was difficult to recognize misinformation interwoven in a text with scientifically based statements or in statistics. Two future teachers wrote that recognizing misinformation was not a problem, but debunking it was more difficult. Twelve future teachers did not write any statements in their reflection journals regarding the difficult or easy aspects of recognizing and debunking misinformation.
For further analysis of the reflections, future teachers’ texts were divided into three groups based on the quantitative development of the future teachers’ debunking score from the pretest to the posttest (group 1: n = 7, debunking score decreased; group 2: n = 10, debunking score increased; group 3: n = 7, debunking score did not improve or improved only by less than 5 points). This analysis shows that five of the future teachers who expressed the difficulties described above belonged to group 1, two belonged to group 2, and three belonged to group 3.
Accordingly, we can answer RQ4: the intervention presented here leads to a statistically significant increase in future teachers’ debunking skills, but there was no statistically significant change in debunking quality. As difficulties, future teachers cited missing content and methodological background knowledge, recognizing misinformation that is well wrapped in a text, and recognizing the underlying FLICC techniques. Based on the qualitative data, it can be deduced that the future teachers whose debunking score decreased expressed difficulties regarding the detection and debunking of misinformation. Therefore, the qualitative results support the quantitative results from the tests. Both the quantitative and qualitative results suggest that the intervention promoted future teachers’ ability to detect misinformation, but debunking the misinformation remains difficult for them.

5.5. RQ5: Perceived Implications for School Teaching

Table 7 presents descriptive statistics on future teachers’ assessment of acquiring knowledge of strategies for teaching responsible use of information.
Content analysis was performed on the future teachers’ statements in their reflections, in line with Kuckartz [58]. The main category “addressing misinformation in class” was created deductively. Based on the statements collected in this category, the following subcategories were created inductively: “expression of interest”, “usefulness of intervention” and “expression of importance”. In total, 19 of the 24 future teachers made statements that could be assigned to the main category, addressing misinformation in class. Six future teachers explicitly wrote that they were interested in addressing the topic of misinformation in their future classes. Fourteen future teachers wrote that the intervention provided them with valuable methods and content for implementing the topic of misinformation in their classes. Five future teachers described it as very important to promote a critical approach to information in the classroom.
Overall, RQ5 can be answered as follows: future teachers perceive the intervention as very valuable for their future teaching and feel that they learned methods to use in the classroom as a result of the intervention.

5.6. Further Results: Future Teachers with Alternative Attitudes toward COVID-19 Measures

Our study did not explicitly survey attitudes toward COVID-19 and the measures proposed to slow the progression of the pandemic. However, analysis of future teachers’ reflection journals shows that at least one future teacher holds attitudes contrary to scientific opinion. This future teacher expressed that they hold a hesitant view towards COVID-19 vaccination and that they believe that diverse opinions should be accepted in a university course. Analysis of this future teacher’s quantitative data shows that self-efficacy expectancy and debunking score increased, while debunking quality remained unchanged. This is consistent with the results of the other future teachers who participated in the study. Descriptive statistics for this future teacher can be found in Table 8.

6. Discussion

In this section, the results of our study are discussed in light of existing research. We address limitations and make suggestions for further research in this area. Furthermore, we discuss the extent to which the intervention can promote critical thinking in the context of education for sustainable development.
The results of our study were derived from a sample of n = 24. At first sight, this sample may be considered small. However, the sample is relatively heterogeneous. The participating future teachers had backgrounds in various subjects and had varying levels of academic progress. Thus, it can be assumed that the future teachers approached the intervention with varying levels of prior knowledge and attitudes. Prior exploratory work on inoculation and debunking [44] was implemented with a more homogeneous target group. Thus, the results of this study may be more broadly applicable.
Further research should include an even more heterogeneous target group and compare subgroups to determine the effectiveness of the approach. The reflection statements in the present study indicate that future teachers of different subjects perceive the difficulty of technique-based debunking differently. Because the group sizes for the individual subjects were too small, such a comparison between subject groups was not possible in our study, and further research is therefore recommended. Our mixed-methods approach increases the validity of the quantitative results obtained from our relatively small sample, because the quantitative data were supported by the qualitative data.
A limitation of the prior explorative study [44] was that all participants shared the same attitudes towards climate change; therefore, no statement could be made about the extent to which the results transfer to people with other attitudes towards the topic [44]. In our study, attitudes towards COVID-19, preventative measures, and vaccination were not explicitly surveyed. However, one study participant expressed an alternative attitude towards these topics in the reflection journal. Data from this future teacher show an increase in self-efficacy expectancy from t1pre to t2post and an increased debunking score. Thus, the quantitative results are quite similar to those of the other future teachers. This case suggests that an intervention based on active inoculation and debunking might also be effective for individuals with different preconceptions, particularly for promoting the ability to recognize misinformation. Whether this approach is also effective in triggering a change in attitude could not be determined in the current study. To make statements about whether this approach can lead to a change in attitude in the sense of therapeutic inoculation [45], further research is needed.
The explorative work of Schubatzky and Haagen-Schützenhöfer [44] cited a possible decrease in future teachers’ self-efficacy expectations as a limitation: due to a decreasing mean value of self-efficacy expectation from t1pre to t2post, a decrease during the intervention was suspected, although no significant effect could be established [44]. Our results, by contrast, show that future teachers’ self-efficacy expectations regarding the recognition and debunking of misinformation increased significantly after an initial drop. The qualitative results from the future teachers’ reflections supported the quantitative results.
The results of our study indicate that future teachers overestimate their self-efficacy before completing the debunking task on the pretest. This is revealed by comparison of future teachers’ statements regarding self-efficacy at the t1pre and t1post time points and is consistent with statements and assumptions from other studies [44,64,65]. In the posttest, future teachers reported higher self-efficacy expectations before (t2pre) completing the debunking task than they did after (t2post). However, the difference was much smaller than in the pretest. It can be assumed that by participating in the intervention, future teachers learned to assess their self-efficacy more realistically in terms of recognizing and debunking misinformation. This assumption is consistent with the findings of Schubatzky and Haagen-Schützenhöfer [44]. The initial overestimation of future teachers’ self-efficacy expectations, which we surveyed qualitatively, could not be significantly confirmed by the quantitative data. It is necessary to investigate this possible effect in further studies.
In the present study, we were able to show that the implemented intervention, consisting of active, technique-based inoculation and technique-based debunking, led to an increase in future teachers’ debunking score from the pretest to the posttest. The statements in the future teachers’ reflections indicate that they feel more confident in their ability to detect misinformation as a result of participating in the intervention. However, the future teachers described detecting the techniques underlying misinformation as difficult. This is consistent with the results on debunking quality, for which we found no significant change. It is interesting to note that while future teachers indicated in their reflections that they had learned helpful tools and approaches, they also indicated that recognizing the underlying FLICC techniques [43] was difficult. One reason for this could be the relatively short intervention time (approximately 4 h). Further research should allow for a longer intervention time so that the different FLICC techniques [43] can be examined and learned in more detail. In addition, in further studies, future teachers should be specifically instructed on the structure they should use to debunk misinformation in the debunking task of the pretest and posttest. In the pretest and posttest of the current study, future teachers were simply asked to identify misinformation, justify why it was misinformation, and debunk it. If the argumentation techniques [43] were specifically mentioned in the assignment, this could lead to higher debunking quality.
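To illustrate how such a pre-post difference in debunking scores can be checked for statistical significance, the following minimal sketch applies a Wilcoxon signed-rank test to paired scores. The data are invented, and the choice of test is an assumption for a small, possibly non-normal sample; it is not a description of the inferential procedure actually used in this study.

```python
# Minimal sketch of a paired pre/post comparison with invented data.
# Using the Wilcoxon signed-rank test is an assumption for a small sample,
# not a description of the test actually reported in this article.
from scipy.stats import wilcoxon

debunking_pre = [9, 5, 12, 8, 14, 6, 10, 7, 11, 9]
debunking_post = [12, 7, 15, 8, 18, 9, 13, 10, 14, 12]

stat, p_value = wilcoxon(debunking_pre, debunking_post)
print(f"Wilcoxon statistic = {stat:.2f}, p = {p_value:.4f}")
```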
Some future teachers indicated that recognizing and debunking misinformation was difficult due to a lack of background knowledge. Technique-based debunking, however, aims to debunk misinformation through logical arguments without requiring in-depth content knowledge. The difficulty expressed by the future teachers could be counteracted by a longer intervention time, which could lead to better internalization of the FLICC techniques [43].
Although our study showed that future teachers’ self-efficacy expectations and debunking scores increased after the intervention, the results do not reliably indicate whether future teachers are any better at detecting and debunking misinformation in their daily lives or on topics other than COVID-19. In the qualitative data, future teachers reported increased self-efficacy expectations regarding recognizing and debunking misinformation, and they did not refer only to the COVID-19 context in their statements. We suggest that future studies choose different contexts for the intervention and the debunking tasks in order to draw conclusions about the transferability of debunking skills from one topic to another. Similarly, we suggest that follow-up studies use a different task format; for example, the debunking task could take the form of a realistic conversation. It is strongly recommended that further research investigate the extent to which the applied approach can help future teachers identify and debunk misinformation in classroom situations.
In our intervention, future teachers were not explicitly provided with knowledge about scientific findings related to COVID-19; they engaged with this topic only implicitly when creating the misinformation documents and when completing the debunking tasks. The future teachers did not report an increase in perceived knowledge about COVID-19, suggesting that providing knowledge only implicitly may not be sufficient to increase perceived knowledge. This pattern is consistent with the results of Schubatzky and Haagen-Schützenhöfer [44]: the intervention in that study explicitly provided knowledge about climate change, and the participants reported an increase in perceived knowledge about climate change.
We wanted to provide future teachers with an approach they could implement in their future teaching to support students in developing skills for critically dealing with (mis)information as well as to debunk misinformation that emerges in the classroom on the fly. The statements made by future teachers in the reflection journals indicate that they consider the approach of active inoculation and debunking to be quite valuable for their future teaching, and they are willing to implement this approach.
In the context of education for sustainable development, a learner-centered intervention guided by the emancipatory approach can effectively foster critical thinking among future teachers. Critical thinking is considered a core competency required to achieve the SDGs [36,39,40]. The expressed intentions of future teachers to implement this approach in their future teaching are promising. By equipping students with critical thinking skills, this approach can enable them to make informed decisions even during times of societal crisis, thereby contributing to the attainment of SDGs [22,30].
Future research should explore the extent to which this approach can promote critical thinking in a general sense, beyond just recognizing and debunking misinformation. This will allow for a more comprehensive understanding of the potential impact of this approach on the development of critical thinking skills among students.
In conclusion, our study provides evidence that active, technique-based inoculation and technique-based debunking can support future teachers in developing skills for critically dealing with (mis)information and can provide them with an approach for supporting their own students in critically dealing with or debunking (mis)information. As a critical attitude towards the overload of information currently circulating in society is enormously important for mitigating pandemics and climate change, as well as for achieving the stated SDGs, we advocate for further research in this area. Future research should focus on a more heterogeneous target group, interweave different contexts in the intervention and in the pretest and posttest, and allow a longer period of time for implementing the intervention in order to sufficiently train participants in technique-based debunking and to address its use in different contexts. In particular, we suggest that future studies focus on the effectiveness of this approach for detecting and debunking misinformation in classroom situations. With insight into teachers’ possible need for support in critically dealing with (mis)information in real classroom situations, the approach of active inoculation and technique-based debunking can be further developed.

Author Contributions

Conceptualization, A.B., T.S. and C.H.-S.; methodology, A.B., T.S. and C.H.-S.; validation, A.B., T.S. and C.H.-S.; formal analysis, A.B., T.S. and C.H.-S.; investigation, A.B.; data curation, A.B.; writing—original draft preparation, A.B., T.S. and C.H.-S.; writing—review and editing, A.B., T.S. and C.H.-S.; visualization, A.B.; supervision, T.S. and C.H.-S.; project administration, C.H.-S.; funding acquisition, C.H.-S. and A.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Austrian Federal Ministry of Education, Science and Research within the project Teaching Digital Thinking and by the Joachim Herz Stiftung within the project Fächerverbindendes Lehren und Lernen mit und über digitale Medien im mathematisch-naturwissenschaftlichen Fachunterricht.

Institutional Review Board Statement

All participants were students at the University of Graz. They took part voluntarily and signed an informed consent form. Pseudonymization of the participants was guaranteed throughout the study. Due to these measures, an audit by an ethics committee was waived.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Acknowledgments

Open Access Funding by the University of Graz. We thank the future middle and high school teachers who participated in our study. We also thank the reviewers for the constructive review process and for the helpful comments and suggestions, which greatly improved the article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Stalder, F. The Digital Condition, 1st ed.; John Wiley & Sons: New York, NY, USA, 2018. [Google Scholar]
  2. Mehta, A.M.; Liu, B.F.; Tyquin, E.; Tam, L. A process view of crisis misinformation: How public relations professionals detect, manage, and evaluate crisis misinformation. Public Relat. Rev. 2021, 47, 102040. [Google Scholar] [CrossRef]
  3. Austin, L.; van der Meer, T.G.; Lee, Y.-I.; Spangler, J. Managing misinformation and conflicting information. In Advancing Crisis Communication Effectiveness; Jin, Y., Reber, B.H., Nowak, G.J., Eds.; Routledge: New York, NY, USA, 2020; pp. 113–129. [Google Scholar]
  4. WHO Health Emergency Dashboard. Available online: https://covid19.who.int/ (accessed on 18 January 2023).
  5. World Health Organization. Considerations for Implementing and Adjusting Public Health and Social Measures in the Context of COVID-19: Interim Guidance. 14 June 2021. Available online: who.int (accessed on 18 January 2023).
  6. Brennen, J.S.; Simon, F.; Howard, P.N.; Nielsen, R.K. Types, Sources, and Claims of COVID-19 Misinformation. Available online: https://reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation (accessed on 18 January 2023).
  7. Bruns, A.; Harrington, S.; Hurcombe, E. ‘Corona? 5G? or both?’: The dynamics of COVID-19/5G conspiracy theories on Facebook. Media Int. Aust. 2020, 177, 12–29. [Google Scholar] [CrossRef]
  8. Cuan-Baltazar, J.Y.; Muñoz-Perez, M.J.; Robledo-Vega, C.; Pérez-Zepeda, M.F.; Soto-Vega, E. Misinformation of COVID-19 on the internet: Infodemiology study. JMIR Public Health Surveill. 2020, 6, e18444. [Google Scholar] [CrossRef] [PubMed]
  9. Hakim, M.S. SARS-CoV-2, COVID-19, and the debunking of conspiracy theories. Rev. Med. Virol. 2021, 31, e2222. [Google Scholar] [CrossRef]
  10. Ries, M. The COVID-19 infodemic: Mechanism, impact, and counter-measures—A review of reviews. Sustainability 2022, 14, 2605. [Google Scholar] [CrossRef]
  11. Stein, R.A.; Ometa, O.; Pachtman Shetty, S.; Katz, A.; Popitiu, M.I.; Brotherton, R. Conspiracy theories in the era of COVID-19: A tale of two pandemics. Int. J. Clin. Pract. 2021, 75, e13778. [Google Scholar] [CrossRef]
  12. Infodemic. Available online: https://www.who.int/health-topics/infodemic#tab=tab_1 (accessed on 18 January 2023).
  13. Kantorowicz-Reznichenko, E.; Folmer, C.R.; Kantorowicz, J. Don’t believe it! A global perspective on cognitive reflection and conspiracy theories about COVID-19 pandemic. Pers. Individ. Differ. 2022, 194, 111666. [Google Scholar] [CrossRef]
  14. Romer, D.; Jamieson, K.H. Conspiracy theories as barriers to controlling the spread of COVID-19 in the U.S. Soc. Sci. Med. 2020, 263, 113356. [Google Scholar] [CrossRef]
  15. Zimmerman, T.; Shiroma, K.; Fleischmann, K.R.; Xie, B.; Jia, C.; Verma, N.; Lee, M.K. Misinformation and COVID-19 vaccine hesitancy. Vaccine 2023, 41, 136–144. [Google Scholar] [CrossRef]
  16. Carrieri, V.; Madio, L.; Principe, F. Vaccine hesitancy and (fake) news: Quasi-experimental evidence from Italy. Health Econ. 2019, 28, 1377–1382. [Google Scholar] [CrossRef]
  17. Wu, Y.; Kuru, O.; Kim, D.H.; Kim, S. COVID-19 news exposure and vaccinations: A moderated mediation of digital news literacy behavior and vaccine misperceptions. Int. J. Environ. Res. Public Health 2023, 20, 891. [Google Scholar] [CrossRef]
  18. Goal 3: Ensure Healthy Lives and Promote Well-being for All at All Ages. Available online: https://sdgs.un.org/goals/goal3 (accessed on 18 January 2023).
  19. Transforming Our World: The 2030 Agenda for Sustainable Development. Available online: https://documents-dds-ny.un.org/doc/UNDOC/GEN/N15/291/89/PDF/N1529189.pdf?OpenElement (accessed on 18 January 2023).
  20. Seshaiyer, P.; McNeely, C.L. Challenges and opportunities from COVID-19 for global sustainable development. World Med. Health Policy 2020, 12, 443–453. [Google Scholar] [CrossRef]
  21. Pan, S.L.; Zhang, S. From fighting COVID-19 pandemic to tackling sustainable development goals: An opportunity for responsible information systems research. Int. J. Inf. Manag. 2020, 55, 102196. [Google Scholar] [CrossRef]
  22. Goal 4: Ensure Inclusive and Equitable Quality Education and Promote Lifelong Learning Opportunities for All. Available online: https://sdgs.un.org/goals/goal4 (accessed on 18 January 2023).
  23. Huber, S.G.; Helm, C. COVID-19 and schooling: Evaluation, assessment and accountability in times of crises-reacting quickly to explore key issues for policy, practice and research with the school barometer. Educ. Assess. Eval. Account. 2020, 32, 237–270. [Google Scholar] [CrossRef]
  24. Mhlanga, D.; Moloi, T. COVID-19 and the digital transformation of education: What are we learning on 4IR in South Africa? Educ. Sci. 2020, 10, 180. [Google Scholar] [CrossRef]
  25. Hidayat, F.P. Media literacy education for students during learning online the COVID-19 pandemic. Edunesia J. Ilm. Pendidik. 2021, 2, 628–634. [Google Scholar] [CrossRef]
  26. Milondzo, T.; Meyer, J.C.; Dochez, C.; Burnett, R.J. Misinformation drives low human papillomavirus vaccination coverage in South African girls attending private schools. Front. Public Health 2021, 9, 598625. [Google Scholar] [CrossRef]
  27. Owen, D.; Moroney, E.; Ren, W.; Zhao, Z. Misinformation in the Civics Classroom. In Proceedings of the 79th Annual Midwest Political Science Conference, Chicago, IL, USA, 7–10 April 2022. [Google Scholar]
  28. Penzer, M.; Breig, A. Combating the consequences of COVID-19 misinformation: Comparative analysis. J. Stud. Res. 2021, 10. [Google Scholar] [CrossRef]
  29. Our Common Agenda: Report of the Secretary-General, New York. Available online: https://reliefweb.int/report/world/our-common-agenda-report-secretary-general?gclid=EAIaIQobChMIjcmz75Gf_gIVB9Z3Ch2rZQMgEAAYASAAEgKj8vD_BwE (accessed on 10 April 2023).
  30. Countering Disinformation and Promoting Integrity in Public Information: Breaking Away from Echo Chambers; SDG Learncast No. 14. 2022. Available online: https://www.unsdglearn.org/podcast/countering-disinformation-and-promoting-integrity-in-public-information-breaking-away-from-echo-chambers/ (accessed on 10 April 2023).
  31. Schratz, M. Lehren und Lernen aus der entstehenden Zukunft. In Impulse für Lehrkräftebildung in der Digitalen Welt: Wissenschaft trifft Schulpraxis; Forum Bildung Digitalisierung e.V., Ed.; Forum Bildung Digitalisierung: Berlin, Germany, 2019; pp. 7–12. [Google Scholar]
  32. Mau, T.; Diethelm, I.; Friedrichs-Liesenkötter, H.; Schlöndorf, C.; Weich, A. Lehrkräftebildung in der digital vernetzten Welt: Ein interdisziplinärer Kompetenzrahmen für (angehende) Lehrkräfte und dessen Umsetzung in einem Pilotseminar. In Kompetenzmodelle für den Digitalen Wandel; Knackstedt, R., Sander, J., Kolomitchouk, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2022; pp. 247–266. [Google Scholar]
  33. Muuß-Merholz, J. Die Digitalisierung als große Verflechterin. Available online: https://www.netzwerk-bildung-digital.de/2022/05/dialogforum-dogotale-kompetenze/ (accessed on 6 April 2022).
  34. Rieckmann, M. Education for Sustainable Development Goals. Learning Objectives; UNESCO: Paris, France, 2017. [Google Scholar]
  35. Rieckmann, M. Learning to transform the world: Key competencies in Education for Sustainable Development. In Education on the Move. Issues and Trends in Education for Sustainable Development; Leicht, A., Heiss, J., Byun, W.J., Eds.; United Nations Educational, Scientific and Cultural Organization: Paris, France, 2018; pp. 39–59. [Google Scholar]
  36. Corres, A.; Rieckmann, M.; Espasa, A.; Ruiz-Mallén, I. Educator competences in sustainability education: A systematic review of frameworks. Sustainability 2020, 12, 9858. [Google Scholar] [CrossRef]
  37. Fasching, M.; Schubatzky, T. Beyond truth: Teaching digital competences in secondary school against disinformation. Medienimpulse 2022, 60. [Google Scholar] [CrossRef]
  38. Hodgin, E.; Kahne, J. Misinformation in the information age: What teachers can do to support students. Soc. Educ. 2018, 82, 208–211, 214. [Google Scholar]
  39. Rieckmann, M.; Schank, C. Sozioökonomisch fundierte Bildung für nachhaltige Entwicklung. Kompetenzentwicklung und Werteorientierungen zwischen individueller Verantwortung und struktureller Transformation. Science 2016, 1, 65–79. [Google Scholar]
  40. Vare, P.; Scott, W. Learning for a change. Exploring the relationship between education and sustainable development. J. Educ. Sustain. Dev. 2007, 1, 191–198. [Google Scholar] [CrossRef]
  41. Debunking Handbook 2020. Available online: https://www.climatechangecommunication.org/wp-content/uploads/2020/10/DebunkingHandbook2020.pdf (accessed on 9 January 2023).
  42. Diethelm, P.; McKee, M. Denialism: What is it and how should scientists respond? Eur. J. Public Health 2009, 19, 2–4. [Google Scholar] [CrossRef]
  43. Cook, J.; Ellerton, P.; Kinkead, D. Deconstructing climate misinformation to identify reasoning errors. Environ. Res. Lett. 2018, 13, 24018. [Google Scholar] [CrossRef]
  44. Schubatzky, T.; Haagen-Schützenhöfer, C. Debunking climate myths is easy—Is it really? An explorative case study with pre-service physics teachers. Educ. Sci. 2022, 12, 566. [Google Scholar] [CrossRef]
  45. Compton, J.; van der Linden, S.; Cook, J.; Basol, M. Inoculation theory in the post-truth era: Extant findings and new frontiers for contested science, misinformation, and conspiracy theories. Soc. Personal. Psychol. Compass 2021, 15, e12602. [Google Scholar] [CrossRef]
  46. Banas, J.A.; Rains, S.A. A meta-analysis of research on inoculation theory. Commun. Monogr. 2010, 77, 281–311. [Google Scholar] [CrossRef]
  47. McGuire, W.J. Resistance to persuasion conferred by active and passive prior refutation of the same and alternative counterarguments. J. Abnorm. Soc. Psychol. 1961, 63, 326–332. [Google Scholar] [CrossRef]
  48. Compton, J. Inoculation theory. In The SAGE Handbook of Persuasion: Developments in Theory and Practice; Dillard, J., Shen, L., Eds.; SAGE Publications, Inc.: Thousand Oaks, CA, USA, 2012; pp. 220–236. [Google Scholar]
  49. Traberg, C.S.; Roozenbeek, J.; van der Linden, S. Psychological inoculation against misinformation: Current evidence and future directions. Ann. Am. Acad. Pol. Soc. Sci. 2022, 700, 136–151. [Google Scholar] [CrossRef]
  50. Lewandowsky, S.; Ecker, U.K.; Cook, J. Beyond Misinformation: Understanding and coping with the “post-truth” era. J. Appl. Res. Mem. Cogn. 2017, 6, 353–369. [Google Scholar] [CrossRef]
  51. Ma, J.; Chen, Y.; Zhu, H.; Gan, Y. Fighting COVID-19 misinformation through an online game based on the inoculation theory: Analyzing the mediating effects of perceived threat and persuasion knowledge. Int. J. Environ. Res. Public Health 2023, 20, 980. [Google Scholar] [CrossRef]
  52. Cook, J.; Ecker, U.K.H.; Trecek-King, M.; Schade, G.; Jeffers-Tracy, K.; Fessmann, J.; Kim, S.C.; Kinkead, D.; Orr, M.; Vraga, E.; et al. The cranky uncle game—Combining humor and gamification to build student resilience against climate misinformation. Environ. Educ. Res. 2022, 29, 607–623. [Google Scholar] [CrossRef]
  53. Basol, M.; Roozenbeek, J.; Berriche, M.; Uenal, F.; McClanahan, W.P.; van der Linden, S. Towards psychological herd immunity: Cross-cultural evidence for two prebunking interventions against COVID-19 misinformation. Big Data Soc. 2021, 8, 205395172110138. [Google Scholar] [CrossRef]
  54. Bonetto, E.; Troïan, J.; Varet, F.; Lo Monaco, G.; Girandola, F. Priming resistance to persuasion decreases adherence to conspiracy theories. Soc. Influ. 2018, 13, 125–136. [Google Scholar] [CrossRef]
  55. Roozenbeek, J.; Traberg, C.S.; van der Linden, S. Technique-based inoculation against real-world misinformation. R. Soc. Open Sci. 2022, 9, 211719. [Google Scholar] [CrossRef]
  56. Cook, J.; Lewandowsky, S.; Ecker, U.K.H. Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLoS ONE 2017, 12, e0175799. [Google Scholar] [CrossRef]
  57. Lewandowsky, S.; Cook, J.; Schmid, P.; Holford, D.L.; Finn, A.; Leask, J.; Thomson, A.; Lombardi, D.; Al-Rawi, A.K.; Amazeen, M.A.; et al. The COVID-19 Vaccine Communication Handbook. A Practical Guide for Improving Vaccine Communication and Fighting Misinformation. 2021. Available online: https://sks.to/c19vax (accessed on 15 May 2023).
  58. Kuckartz, U. Qualitative Inhaltsanalyse: Methoden, Praxis, Computerunterstützung, 4th ed.; Beltz Juventa: Weinheim, Germany; Basel, Switzerland, 2018. [Google Scholar]
  59. Mandl, A.; Haagen-Schützenhöfer, C.; Spitzer, P.; Schubatzky, T. Digitalität im mathematisch-naturwissenschaftlichen Fachunterricht: Entwicklung und Beforschung einer Masterlehrveranstaltung für die Lehramtsausbildung. In PhyDid B, Didaktik der Physik Beiträge zur Virtuellen DPG-Frühjahrstagung 2022. DPG-Frühjahrstagung, 21.03.–23.03.2022; Grötzebauch, H., Heinicke, S., Eds.; German National Library (DNB): Frankfurt am Main, Germany, 2022; pp. 161–168. ISSN 2191-379X. [Google Scholar]
  60. Brinda, T.; Diethelm, I. Education in the Digital Networked World. In Tomorrow’s Learning: Involving Everyone. Learning with and about Technologies and Computing, Proceedings of the 11th IFIP World Conference on Computers in Education (WCCE), Dublin, Ireland, 3–6 July 2017; Tatnall, A., Webb, M., Eds.; Springer: Dublin, Ireland, 2017; pp. 637–641. [Google Scholar]
  61. Vuorikari, R.; Kluzer, S.; Punie, Y. DigComp 2.2: The Digital Competence Framework for Citizens: With New Examples of Knowledge, Skills and Attitudes; Publications Office of the European Union: Luxembourg, 2022. [Google Scholar]
  62. Mishra, P.; Koehler, M.J. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  63. Mandl, A.; Haagen-Schützenhöfer, C.; Spitzer, P.; Schubatzky, T. Digitale Transformation der mathematisch-naturwissenschaftlichen Lehramtsausbildung: Entwicklung und Beforschung eines Masterlehrveranstaltungsformates zur Professionalisierung angehender Lehrkräfte. In Unsicherheit als Element von naturwissenschaftsbezogenen Bildungsprozessen; Habig, S., von Vorst, H., Eds.; GDCP: Düsseldorf, Germany, 2022; pp. 532–535. [Google Scholar]
  64. Weiler, D.; Burde, J.-P.; Lachner, A.; Riese, J.; Schubatzky, T.; Große-Heilmann, R. Entwicklung eines Seminars zur Förderung des Konzeptverständnisses mittels digitaler Medien. In PhyDid B, Didaktik der Physik Beiträge zur Virtuellen DPG-Frühjahrstagung 2021. DPG-Frühjahrstagung, 22.03.–24.03.2021; Grebe-Ellis, J., Grötzebauch, H., Eds.; DPG: Düsseldorf, Germany, 2021; pp. 209–215. [Google Scholar]
  65. Eickelmann, B.; Bos, W.; Gerick, J.; Goldhammer, F.; Schaumburg, H.; Schwippert, K.; Senkbeil, M.; Vahrenhold, J. (Eds.) ICILS 2018 #Deutschland: Computer- und Informationsbezogene Kompetenzen von Schülerinnen und Schülern im Zweiten Internationalen Vergleich und Kompetenzen im Bereich Computational Thinking; Waxmann: Münster, Germany; New York, NY, USA, 2019. [Google Scholar]
Figure 1. Overview of options for dealing with misinformation (adapted from [41]).
Figure 2. Contrasting technique-based reasoning and knowledge-based reasoning for debunking misinformation [57].
Figure 3. The pre-post design was implemented closely following Schubatzky and Haagen-Schützenhöfer [44].
Figure 4. Design of the intervention (adapted from [44]).
Figure 5. Self-efficacy regarding the detection and debunking of misinformation. Quantitative data from the pretest and posttest are triangulated with qualitative data from the future teachers’ reflections. The future teachers’ qualitative data were translated from German.
Table 1. Description of the sample of future middle and high school teachers participating in our study.
Gender: n(male) = 14; n(female) = 10.
Age: 24.3 ± 2.9 years.
Subject: n(physics) = 4; n(biology) = 7; n(mathematics) = 4; n(geography) = 1; n(no math/science) = 1; n(chemistry/biology) = 1; n(physics/biology) = 1; n(physics/maths) = 5.
Program and semester of teacher education (± SD): n(bachelor) = 17, semester 8.2 ± 2.7; n(master) = 7, semester 2.1 ± 1.1.
Table 2. Item wording of the COVID-19 debunking self-efficacy scale (adapted from [44]). Item wording (translated from German):
I am always able to recognize COVID myths, for example, on the internet or social media, even if they are formulated in scientific language.
I am able to detect misinformation about COVID in any case, even when mixed with correct information.
I can debunk myths about COVID in discussions in any case, even if my counterpart puts forward seemingly valid arguments for the myth.
I can explain the scientific context associated with COVID myths, even without extra preparation.
I am able to recognize logical fallacies in COVID myths in any case, even if these myths are new to me.
Table 3. Anchor example for scoring future teachers’ statements in the debunking task. The statements were translated from German.
Myth in the letter to the editor: “The mortality rate of vaccinated persons is 30 times higher than that of unvaccinated persons.”
Statement of a future teacher (1 point): “Who is supposed to believe that?” (S1C1Q2)
Statement of a future teacher (2 points): “The mortality rate of the unvaccinated is significantly higher!” (S19C1Q1)
Statement of a future teacher (3 points): “Different data pools were used to determine the percentages. Accordingly, a comparison as made here is not possible.” (S22C1Q2)
Table 4. Descriptive statistics regarding perceived knowledge about COVID-19.
Perceived COVID knowledge PRE: median = 4; minimum = 1; maximum = 6.
Perceived COVID knowledge POST: median = 4; minimum = 3; maximum = 6.
Scale: 0 to 6.
Table 5. Descriptive statistics regarding the self-efficacy score.
Self-Efficacy t1pre: mean = 3.32; SD = 0.90; minimum = 1.60; maximum = 5.40.
Self-Efficacy t2pre: mean = 3.03; SD = 1.17; minimum = 1.00; maximum = 4.80.
Self-Efficacy t1post: mean = 4.05; SD = 0.86; minimum = 2.60; maximum = 5.40.
Self-Efficacy t2post: mean = 3.93; SD = 1.07; minimum = 1.80; maximum = 5.80.
Scale: 0 to 6.
Table 6. Descriptive statistics regarding debunking skills and debunking quality.
Debunking Score PRE: mean = 9.17; SD = 4.53; minimum = 2.00; maximum = 18.00.
Debunking Score POST: mean = 11.88; SD = 5.57; minimum = 3.00; maximum = 22.00.
Debunking Quality PRE: mean = 0.80; SD = 0.34; minimum = 0.25; maximum = 1.63.
Debunking Quality POST: mean = 0.99; SD = 0.53; minimum = 0.25; maximum = 2.00.
Maximum debunking score: 36; maximum debunking quality: 3.
Table 7. Descriptive statistics on the learning objective regarding strategies for teaching.
Item: “By participating in the course, I know strategies for teaching responsible use of information, including strategies for identifying misinformation for the mathematics and science classroom.”
Mean = 3.54; SD = 0.66; minimum = 2.00; maximum = 4.00. Scale: 1 to 4.
Table 8. Descriptive statistics of a future teacher with alternative beliefs about COVID-19 measures.
Self-Efficacy t1pre: 3.20
Self-Efficacy t2post: 3.40
Debunking Score PRE: 7.00
Debunking Score POST: 9.00
Debunking Quality PRE: 0.88
Debunking Quality POST: 0.88