Article

Credibility Judgments in Higher Education: A Mixed-Methods Approach to Detecting Misinformation from University Instructors

Department of Animal Sciences, Auburn University, Auburn, AL 36849, USA
Educ. Sci. 2024, 14(8), 852; https://doi.org/10.3390/educsci14080852
Submission received: 14 June 2024 / Revised: 30 July 2024 / Accepted: 30 July 2024 / Published: 7 August 2024

Abstract

Given the convenience with which information can now be acquired, it is crucial to analyze cases of potential misinformation and disinformation in postsecondary education. Instructor credibility judgments were measured using descriptive survey research, with the main objective of investigating trends related to misinformation, credibility, trust, and bias among graduate students and across graduate programs. Participants were recruited from a land-grant institution in the southeastern United States, where 186 graduate students completed an electronic survey on the detection of misinformation and related experiences. Graduate students were divided by graduate program into STEM (science, technology, engineering, and mathematics) and non-STEM groups. Quantitative methodologies included researcher-developed, validated questionnaires containing Likert-type scale questions. Chi-square tests of independence and frequency statistics served as the primary analyses. Participants in both STEM and non-STEM groups detected misinformation, bias, challenges, intimidation, risk of measurable consequences, pressure to conform, and skepticism from postsecondary instructors. There were significant differences between student types for trust in claims (p < 0.05), while the perception of potential consequences tended to differ between the types of graduate students (0.05 < p < 0.10). Both groups reported perceived bias in the presentation of science material, with STEM students reporting less bias. Qualitative methodologies included optional open-response boxes for supporting details or narratives, analyzed through reliable and validated thematic coding.
Graduate students reported consistent instances of misinformation and bias about science and agriculture topics in both science and non-science-focused classrooms.

1. Introduction

Science literacy extends beyond familiarity with and comprehension of scientific subjects. Individuals who are science literate are competent outsiders with respect to science [1]. These individuals have mastered the ability to recognize the circumstances in which science is relevant to their needs and interests and to interact with sources of scientific knowledge in a way that advances their own objectives [1]. The quest for science literacy, then, is not just about detecting relevance; rather, it is fundamentally about learning to recognize how science is or may be significant to things that citizens care about most [1].
On the other hand, facilitators of learning environments are involved in the process of gathering information, analyzing it, and disseminating it. Chronic misinformation undermines this process [2]. Although misinformation and disinformation are distinct, their definitions are sometimes conflated [3]. Both terms refer to the dissemination of false or disproven information, although with different motives and objectives [3]. Misinformation is the accidental dissemination of false and misleading information [4]. Disinformation is the purposeful spread of false or incorrect information with the intent of damaging a person or group [4]. Bias refers to a preference for (or opposition to) a specific concept, individual, or object based on one’s own feelings or values [4].
Opinion-based instruction and political prejudice are the main sources of misinformation and disinformation in college courses [5,6]. Belief polarization occurs when people who acquire comparable information nonetheless arrive at conflicting opinions [7]. Similarly, when information is supplied that is credible and accepted by individuals with similar intentions but rejected by others with opposing agendas, science becomes politicized [8]. Research has concluded that there are influential relationships between opinionated instructors and students. Specifically, students were influenced by their instructors, which impacted their daily lives [9,10]. Giving present and future educators the skills necessary to explain scientific concepts gives students outstanding role models in the field of science communication. By developing their scientific communication skills, instructors can better explain emerging research to students, which will increase understanding among a wide range of citizens [11,12]. Competent graduates are widely sought after in today’s competitive employment environment because competent professors produce competent students with great science communication abilities [13]. Science communication and critical thinking proficiency can be likened to speaking two languages fluently [13]. The interaction between students and teachers is an important topic to cover when discussing misinformation in higher education. Several frameworks explain the trust between students and their professors. Simply put, the institutional environment in which instructors work, and the idea of paying for a more advanced, complex education, support students’ trust in their teachers [14,15]. Students seek specialized knowledge and pursue academic achievement when they enroll in college, strengthening students’ faith in their professors [14,15].
In college, students learn about epistemic confidence and put it into practice, which helps the instructor guide their constituents with epistemic competence [14]. Furthermore, it has been observed that students succeed more frequently when they have a good working relationship with their teacher [14].
Research has shown a need for reform efforts for science educators because, although science is advancing, our teaching and communication of science have not followed the same trend [16,17], revealing knowledge and ability gaps. A portion of these reform efforts specifically target science literacy as the main goal of advancing science education from kindergarten through postsecondary studies [16,18]. Barnes et al. [16] reported that both explicit and relative instruction are critical to narrowing these gaps.

1.1. Theoretical Framework

In their role as disseminators of scientific knowledge, educators often turn to Pragmatic Theories of Truth, commonly known as pragmatism, to explicate the nature of truth. Originating from American philosophers, pragmatism represents a contemporary justification of truth. Pragmatic theories posit that truths are determined by their practical consequences and empirical validation. William James asserted that truth consists of ideas that can be assimilated, validated, corroborated, and verified [19]. Similarly, John Dewey contended that truth is synonymous with what works effectively in practice [20]. Charles Peirce proposed that truth, although provisional, remains valid until superseded by scientific inquiry [19,21]. Pragmatism employs empirical evidence to differentiate between truth and falsehood, making it a dominant paradigm in research [22]. Understanding truth is likened to assembling a puzzle, where each new piece of information is deemed true if it fits into the larger framework [22]. However, pieces with unconventional shapes often appear incompatible with the final product, a phenomenon commonly encountered in research [21]. Scientists may prioritize narrowly focused research topics over broader social implications, potentially overlooking significant societal concerns [23]. Nevertheless, when appropriately placed, even the most unconventional puzzle pieces contribute to the comprehensive understanding of truth. Pragmatism posits that scientific progress continually introduces new knowledge, necessitating constant reassessment of perspectives and beliefs to apprehend truth accurately. In formal educational settings, the concept of knowledge sharing is highly significant. Within an organization, knowledge sharing refers to the intentional exchange of information, ideas, insights, and experiences between individuals or teams [24]. 
Various theories have been proposed to explain knowledge-sharing behaviors, including the Theory of Reasoned Action, the Theory of Planned Behavior, and Social Exchange Theory, among others [25]. In the context of education, Social Exchange Theory offers a particularly relevant framework. This theory, based on a cost–benefit analysis, elucidates the interactions between individuals, such as teachers and students, highlighting the dynamics of leadership, relationship building, and trust in the learning process [26]. In summary, educators are more inclined to impart “accurate information” to their students, albeit influenced by potential risks and biases, which can inadvertently misinform students about scientific concepts and hinder scientific literacy.

1.2. Study Purpose, Research Objectives, and Research Questions

Research into perspectives on detecting misinformation among students and their credibility judgments of instructors in higher education is not well understood. This study aimed to discreetly explore trends and perspectives in the detection of misinformation and bias in past classrooms of graduate students, on a graduate program basis, at a land-grant institution in the southeastern United States. A land-grant institution delivers higher education within the United States and is designated by a state to receive the benefits of the Morrill Acts of 1862 and 1890 or is a beneficiary under the Equity in Educational Land-Grant Status Act of 1994. Graduate students are classified as STEM (science, technology, engineering, or mathematics) or non-STEM depending on their graduate program of study. This study examined both general and science-specific misinformation, as well as pseudoscientific evidence. It was hypothesized that graduate students of STEM and non-STEM disciplines would yield differing responses to these phenomena. Specifically, this study sought to report on the perspectives and stories of graduate students who experience misinformation and other obstacles in higher education. Sharing the narratives of students who likely otherwise would have kept quiet about these sensitive topics would be informative to educational administrators and higher education curriculum curators. The research questions (RQ) that this study answers are as follows:
  • What do STEM and non-STEM graduate students perceive and rank as credible sources of scientific information?
  • Is there a relationship between graduate program (STEM and non-STEM) and the perceived value of incorporating science into non-science classrooms? What are the perspectives on including science in non-science classrooms?
  • Is there a relationship between graduate program (STEM and non-STEM) and sensing misinformation, bias, challenge, intimidation, consequences, or pressure to conform?
  • How often is credible, evidence-based science discussed in non-science college classrooms?
  • If science was introduced in non-science classrooms, are the scientific claims perceived as trustworthy by graduate students (STEM and non-STEM), or are graduate students skeptical that the claims could be misinformation?
  • What are the experiences of misinformation, bias, challenge, intimidation, risk of consequences, pressure to conform, trust, and skepticism between STEM and non-STEM graduate students?

2. Materials and Methods

2.1. Study Design

This study used survey data (n = 186) following a descriptive research design and was implemented using a mixed-methods approach. The research was approved by the university institutional review board. The initial questionnaire was composed of five sections: demographics, a science literacy questionnaire (SLQ), a detecting misinformation questionnaire (DMQ), science communication, and teaching responsibilities. The questions were designed by research personnel described in the next subsection, approved by a panel of three science educators, and tested for internal consistency. The reliability analysis resulted in a Cronbach’s alpha of 0.647, which falls below the conventional 0.7 benchmark; however, considering the smaller sample size and fewer items, this was deemed sufficient for continued analysis. The electronic survey, built in Qualtrics Survey software (Version 2022, Provo, UT, USA), was distributed to graduate students by email using methodologies suited for web-based questionnaires [27]. Data collection ran from July to August 2022. After narrowing responses to full completion of the demographics, SLQ, and DMQ, statistical analysis was performed on 186 of 327 responses.
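Cronbach’s alpha, as reported above, can be computed directly from the item and total-score variances of a response matrix. The following is a minimal sketch using hypothetical Likert responses, not the study’s actual data:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for rows of respondent scores, one column per item."""
    k = len(items[0])                                     # number of items
    item_var = sum(variance(col) for col in zip(*items))  # sum of per-item variances
    total_var = variance(sum(row) for row in items)       # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical 5-point Likert responses: 4 respondents x 3 items.
scores = [
    [4, 5, 4],
    [3, 3, 2],
    [5, 5, 4],
    [2, 2, 1],
]
alpha = cronbach_alpha(scores)
```

Values near 0.7 are conventionally treated as acceptable internal consistency, though, as noted above, scales with few items often yield lower alphas.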

2.2. Survey Development and Questionnaires Measured

For the purposes of this study, the demographic data, the SLQ responses, and the DMQ responses were analyzed. The questionnaire on detecting misinformation was developed by research personnel with the help of other university professionals. Three professors from three different departments (agriculture, engineering, and general sciences) were recruited to discuss and advise on questionnaire development. The discussion began with the original study purpose regarding misinformation and bias in science and non-science classrooms. After these discussions, the researchers distilled two questions for the SLQ and ten questions for the DMQ.
Following demographic questions was the science literacy questionnaire (SLQ). A “select-all-that-apply” question for credible sources of scientific information was presented to participants, “What do you consider as credible sources of scientific information?” The second question in this sub-questionnaire was a Likert-type response asking, “Do you think there is value or importance in incorporating science topics into the nonscience classroom?”.
Following the SLQ was the detecting misinformation questionnaire (DMQ), an eight-item Likert-type instrument. These questions asked how often graduate students experienced misinformation and bias on science-related topics during their undergraduate and graduate careers. Additionally, this section inquired whether graduate students had held opinions different from their instructors or most of the class, with resulting feelings of challenge, intimidation, potential consequences, and pressure to conform to the opposing opinion. Finally, participants were asked whether they had experienced science presented by instructors in non-science classrooms. If responding yes, these participants (n = 144) completed three additional Likert-type questions that probed perceived trust or skepticism of scientific claims. After each survey item, participants had the opportunity to share experiences related to the question or provide context for their responses.

2.3. Analysis

Quantitative data analysis involved descriptive statistics and chi-square tests of independence, analyzed using SPSS (Version 28). The α-level for mean differences was set at 0.05, and the threshold for tendencies at 0.1. When effects had p < α, differences or tendencies were discussed. Quantitative analysis of demographic characteristics involved frequency statistics, also computed in SPSS (Version 28). Open-response questions served as qualitative measures, and thematic coding was performed in ATLAS.ti (ATLAS.ti, Web Version, 2022). Optional comments were organized by discipline group by the first author, yielding a transcript of 17 pages of single-spaced text. As encouraged by Saldaña [28], three different coders read, annotated, and coded the transcript. Open, or inductive, coding grounded the coding process, in which first- and second-cycle coding [28] followed open coding methodologies and the thematic analysis first defined by Braun and Clarke [29]. Specifically, first-cycle coding included familiarization and initial rounds of coding [29]. To ensure stable responses, coders collaborated when there were discrepancies in interpretation, and codes of similar meaning were combined into one code through second-cycle coding. The codes were then classified and grouped into themes after second-cycle coding and reported in these results [29,30,31,32]. Dependability and reliability were achieved by having three experienced personnel review and validate the transcribed material, themes, and codes [33]. These coding methods ensured that the research study was grounded in the concepts of reliability and validity [34].
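The chi-square tests of independence described above were run in SPSS; as a self-contained sketch, the test statistic and the Cramér’s V effect size reported later can be computed from a two-way count table as follows (the counts here are hypothetical, not the study’s data):

```python
import math

def chi_square_independence(table):
    """Chi-square statistic, degrees of freedom, and Cramér's V for a count table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n  # under independence
            chi2 += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    cramers_v = math.sqrt(chi2 / (n * (min(len(table), len(table[0])) - 1)))
    return chi2, df, cramers_v

# Hypothetical STEM vs. non-STEM counts across four Likert categories.
chi2, df, v = chi_square_independence([
    [30, 45, 20, 10],
    [12, 25, 30, 14],
])
```

The resulting statistic would then be compared against the critical value for the table’s degrees of freedom at α = 0.05 (7.815 for df = 3), with 0.05 < p < 0.10 reported here as a tendency.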

2.4. Participants

Participants in this study were selected using convenience sampling at a land-grant institution in the southeastern United States. Graduate students were chosen due to their unique workload in higher education, which typically includes research, teaching, coursework, and community engagement [35,36,37,38,39]. A web-based survey methodology [27] was employed to distribute emails to full-time, part-time, and distance-learning graduate students. Shown in Table 1 are the main demographic characteristics of the participants, who predominantly fell within the 20–29 age range, identified as White, aligned with Democratic political affiliations or chose not to disclose, and came from suburban backgrounds. The sample encompassed various degree programs, with Master of Science and Doctor of Philosophy comprising the largest segments. The roles of graduate students were also diverse, with graduate students primarily dedicated to research, teaching, or other roles. Demographic characteristics are compared on a STEM and non-STEM discipline basis since the detection of misinformation and science misinformation comprises the core of this study. The categorization of STEM and non-STEM was based on each participant’s declared graduate program of study, which research personnel classified into one category or the other.
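The frequency statistics used to summarize demographics like those in Table 1 reduce to counts and percentages over categorical responses; a minimal sketch with hypothetical responses (not the study’s data):

```python
from collections import Counter

# Hypothetical program declarations for six respondents.
responses = ["STEM", "non-STEM", "STEM", "STEM", "non-STEM", "STEM"]

counts = Counter(responses)  # tally each category
n = len(responses)
freq_pct = {group: round(100 * count / n, 1) for group, count in counts.items()}
# freq_pct -> {'STEM': 66.7, 'non-STEM': 33.3}
```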

3. Results

3.1. Credible Sources of Information Identified by Graduate Students

The types of credible sources of scientific information selected by the STEM and non-STEM graduate student groups are shown in Figure 1. Popular selections for both groups were peer-reviewed or open-access journals and articles, government- or education-based websites, and industry professionals. An interesting observation was that seven categories were not viewed as credible by graduate students in non-STEM disciplines: commodity organizations, social media and influencers, friends, blogs, Google, and Wikipedia were among the sources not selected as credible by non-STEM students. The “other” sources of credible information included textbooks, journal search engines, news, and extension publications.

3.2. Perceived Value or Importance of Incorporating Science into Non-Science Classrooms for Graduate Students

Both STEM and non-STEM graduate students indicated that there is some value in incorporating science into non-science classrooms (Table 2). The chi-square test of independence revealed no association between graduate program type and the perceived value of incorporating science into non-science classrooms (χ2(2) = 3.68; p = 0.16).

3.3. Graduate Student Perspectives of Including Science in Non-Science Classrooms

Of 62 total responses, 51 comments were from respondents in STEM disciplines and 11 from non-STEM disciplines. The themes that emerged were “Interdisciplinary Efforts”, “Personal and Professional Development”, and “Interferences”. For the theme “Interdisciplinary Efforts”, the codes for the participant responses included “connections”, “useful”, “perspective”, “exposure”, and “awareness”. These responses indicated positive support for including science in non-science classrooms, as respondents saw opportunities to advance quality education by immersing science into classrooms. For example, one STEM student explained, “It’s important for everyone to have some understanding of how different aspects of how our world works whether it be related to agriculture, medicine, construction, or any other important sector of human lives so that they understand why some practices are adopted”. The theme “Personal and Professional Development” resulted in codes such as “critical thinking”, “writing”, “social issues”, “global & human application”, “relevant”, and “improve or enhance”. Responses within this theme described specific instances in which incorporating science can improve students both personally and professionally. A positive perspective from a STEM student indicated, “If we practice more integration of topics and how different schools of thought intersect to create deeper knowledge, there would be better base knowledge in society and that would lead to better outcomes in many areas”. The last theme, “Interferences”, describes responses that were hesitant about incorporating science in non-science classrooms, anticipating disruptions of learning or loss of student interest. Codes under this theme included “inappropriate”, “less engaging”, “personal anecdotes”, “politics”, “student interest”, and “opinions”.
One STEM student pointed to the distractions of incorporating science in non-science classrooms: “If people wanted to learn science, or learn about science, they would take science classes or sign up for a science major”. A non-STEM student, hesitant about incorporating science in non-science classrooms, offered, “A topic like climate change can be politicized, I think it’s important for teachers to think about not only what they teach, but how they teach it”.

3.4. Graduate Students Sensing Misinformation, Bias, Challenge, Intimidation, Consequences, and Pressure to Conform

Students in both graduate program groups indicated that they have sensed misinformation taught outside of their disciplines to some degree, with nearly 20% of each group reporting that they experienced equal amounts of correct information and misinformation. However, the results did not indicate differences between the groups in sensing misinformation (χ2(4) = 5.73; p = 0.22). Similarly, both groups detected bias in the teaching of science-related topics, though this was not significant (χ2(4) = 6.28; p = 0.18). Approximately 20% of each group witnessed or sensed bias often, while 17.7% of STEM-disciplined students never sensed bias, compared to 8.1% in non-STEM disciplines. In addition, both types of graduate programs sensed challenges (χ2(4) = 1.50; p = 0.83) because of differing perspectives on science topics, with the majority indicating that the challenge did not bother them. Both groups also detected intimidation to some degree (χ2(5) = 3.95; p = 0.56), but most were not intimidated by differing opinions. For non-STEM graduate students, 5.4% were intimidated enough to withdraw from a class, while 1.4% of STEM graduate students were intimidated enough to conform to the majority opinion. STEM and non-STEM students tended to sense consequences differently, with a medium effect (χ2(3) = 6.92; p = 0.08; Table 3). Although most graduate students did not feel at risk of suffering measurable consequences (grade suffering, judgment, etc.) because of their differing opinions, STEM students felt greater risk than non-STEM students. Though no effect was found for pressure to conform (χ2(3) = 2.38; p = 0.50), non-STEM students reported being more pressured to conform than STEM students, and 2% of STEM students indicated they were heavily pressured to conform because of potential consequences, intimidation, etc.

3.5. Frequency of Evidence-Based Science Introduced in Non-Science Classrooms

To analyze the frequency with which graduate students experienced instructors in non-science university-level classrooms addressing science, Likert-type scale responses were counted. Out of 186 participants, 42 graduate students indicated that they had not previously experienced this phenomenon. About 11% and 16% of STEM and non-STEM graduate students, respectively, have had science introduced in non-science classrooms multiple times (Figure 2). Chi-square tests revealed a relationship between graduate program type and experiencing science in non-science classrooms with a large effect (χ2(3) = 9.074, p = 0.028, V = 0.221). To understand how many of these scientific claims (n = 144) were supported by credible sources at the time they were presented, the frequencies of Likert-type responses were collected, analyzed, and displayed in Figure 3. Some participants (24% of STEM and 15% of non-STEM respondents) were unable to recall whether sources were provided at all. Typically, credible sources were provided most of the time, if not always, for both disciplines. Approximately 6% of STEM graduate students and 3% of non-STEM graduate students indicated that these scientific claims were never backed by credible sources (χ2(3) = 4.56, p = 0.21).

3.6. Credibility or Possibility of Misinformation in Scientific Claims

For those students who experienced instructors introducing science to non-science classrooms (n = 144), participants were asked to assess their perceived level of trust and skepticism about whether the scientific claims made were misinformation. Graduate students were found to experience varying levels of trust in scientific claims with large effect (χ2(4) = 10.39; p = 0.03; V = 0.27; Table 4).
Specifically, all non-STEM graduate students either trusted the statements from a somewhat to a very high amount or had no opinion (Table 4). However, 8.4% of STEM graduate students reported somewhat low to very low trust in these scientific claims. Graduate students also had varying degrees of skepticism that the scientific claims could be misinformation. Both STEM and non-STEM graduate students reported similar levels of skepticism, with most respondents skeptical at a low to medium level (χ2(3) = 4.70; p = 0.18).

3.7. Experiences of Misinformation, Bias, Challenge, Intimidation, Risk of Consequences, Pressure to Conform, Trust, and Skepticism

Additional comments regarding the researcher-developed questionnaires about detecting misinformation and similar items were organized by STEM and non-STEM disciplines and coded in a cyclic manner by research personnel. Overall, STEM graduate students shared a greater number of experiences than non-STEM graduate students. In total, 168 responses were recorded across multiple categories, with 134 responses from graduate students in STEM and 34 responses from graduate students not in STEM. Repeated themes across multiple experiences include “Memorable Experiences”, “Consistencies”, and “Natural Instructor and Student Tendencies”.

3.7.1. Experiences with Misinformation

The two themes that emerged were “Chronic Consistency” and “Memorable Points of Misinformation”. “Chronic Consistency” contained codes such as “influence”, “repetition”, “irrelevant”, “fast”, “corrected”, and “bother”. Respondents demonstrating this theme showed irritation toward the frequency with which they experienced misinformation or how often instructors were corrected by students. As an example, one STEM student explained, “I had an undergrad professor who was often corrected by students”. Similarly, a non-STEM student recalled, “Having taken classes outside my main discipline, misinformation is common in places where scientific information is not formally studied. Often this is a result of misunderstanding the information or simply not being interested”. The other more-discussed theme, “Memorable Points of Misinformation”, includes codes such as “misinformation topics”, “politics”, “COVID-19”, and “personal anecdotes”. These codes represent the specific instances in which respondents recalled experiencing misinformation. Graduate students in both STEM and non-STEM groups experienced persistent misinformation as students and gave similar examples. STEM-disciplined graduate students recalled misinformation about agriculture, climate change, health and diet culture, quantum mechanics, and fish health, to name a few. Non-STEM graduate students mentioned climate science, animal production, and clinician errors. Several respondents described misinformation experiences outside the classroom, typically through social media. One respondent went as far as explaining, “Sometimes our universities value a highly accomplished individual in an industry over a multi-disciplined industry worker who can speak on different perspectives and impacts of an industry. We often value success over diversity and sound science”.

3.7.2. Experiences with Bias

The themes that emerged from qualitative analysis and cyclic coding regarding experiences of bias were “Chronic Consistency”, “Memorable Points of Bias”, and “Natural Bias”. The theme “Chronic Consistency” resulted from multiple codes of “repetition” and “frequently”. As with experiences of misinformation, graduate students reflected on how often instructors presented bias toward science topics in the classroom. For example, a non-STEM graduate student reported, “Biases can be common in places where scientific information is not formally studied”. “Memorable Points of Bias” included codes of “bias topics” and “politics”, reflecting specific memories of bias presented. Bias topics recalled by STEM students were evolution, agriculture, and aerospace concepts. Non-STEM graduate students expressed consistent bias with specific memories of agriculture and climate-related information. The “Natural Bias” theme contained both negative and neutral statements regarding the inevitable bias instructors will present. The codes that made up “Natural Bias” were “opinion”, “self-determination”, “recognition”, and “not harmful”. Comparing groups, some non-STEM graduate students expressed a greater urgency or magnitude in their experiences, while other STEM graduate students did not see the bias as malicious. Specifically, one non-STEM student reflected, “I’ve noticed misinformation designed to persuade people to make specific decisions or actions”. One STEM student highlighted, “There is always going to be misinformation”, while another said, “I have but it is not harmful or persuading, I guess. I like to listen to professors’ point of view and opinions and then I can make my own decisions”.

3.7.3. Experiences with Challenges, Intimidation, Risk of Consequences, and Pressure to Conform

Since sense of challenge, intimidation, risk of consequences, and pressure to conform can rely on each other, these categories were read, organized, and coded together, resulting in three themes. The first theme was “Memorable Challenges and Pressures”, which included codes named “challenge topics”, “intimidation topics”, “example”, “conform topics”, “consequences”, “politics”, “opinions”, “irrelevant”, and “influence”. Both disciplines had experiences to explicitly name, but STEM graduate students provided more evidence of challenges, pressures, and consequences that were frustrating and could have negatively influenced others. As an example, one STEM student said, “I have received lower grades for writing articles with contradicting views of professor”. “Challenge topics” included the origins of Earth, animal rights, production agriculture, secularism, and demonizing industries or practices. “Intimidation topics”, accounted for by STEM respondents, were Earth system sciences, chemistries, and health sciences. The “example” codes represented topics that induced measurable consequences, such as agriculture topics and instances where science entered humanities discussions. The examples provided for “consequences” were suffering grade cuts, judgment from instructors, judgment from peers, and long-term career downfalls. Another theme was “Unexpected Troubles and Caution”, comprising the codes “bother”, “unsafe space”, “saving face”, “pressure”, “afraid”, and “conform”. Both STEM and non-STEM students reported that they were less likely to speak up or share their opinions because of potential consequences, or because staying silent was easier than potential arguments with classmates and instructors.
Specifically, one non-STEM student shared, “Never conformed, but never felt able to have an open discussion without being absolutely teamed up against with a mob mentality that felt threatening (not to physical health but to departmental and work life)”. The third theme was “Welcoming Challenge”, which included codes such as “accept challenge” and “perspective”. Respondents who welcomed challenges explained that without challenge there is no need for science and that understanding different perspectives is foundational for critical thinking and respect. All respondents who fell under this theme were from STEM disciplines. For example, STEM students responded, “Challenges drive me to excel”, and, “Science is all about being challenged and be able to provide information to stand your ground. Without challenge, there would be no science. Everyone would just accept information immediately. Questions are important”.

3.7.4. Experiences with Trust and Skepticism

Participants were provided opportunities to explain their experiences of trust toward scientific claims in non-science classrooms and their experiences of skepticism that information could be misinformation. These responses were analyzed, coded, and condensed into three themes: “Memorable Trust and Skepticism”, “Innate Trust”, and “Passive Tolerance”. “Memorable Trust and Skepticism” contained codes of “example”, “social media”, “bias”, and “current events”.
Both STEM and non-STEM graduate students provided specific examples of topics where their trust in content was questionable. For STEM students, these topics were public policy containing science topics, implications of technological history, and aircraft. For example, one STEM student noted, “Most professors had some sort of evidence to back up their claims but often times their evidence was from a major news outlet, not scientific journals”. For non-STEM students, the topics described included climate change, artificial intelligence, space and astronomy, technology, and psychology. A non-STEM student said, “I had a few courses that discussed climate change, artificial intelligence, and also space/astronomy, so needless to say I was hesitant to trust”. The “Innate Trust” theme was composed of the codes “truth” and “authority”. Mentioned equally in both groups, these codes labeled statements that instructors were, at the time, trusted sources of scientific information who did not need to be questioned, or that skepticism of misinformation was unnecessary. As an example, one non-STEM student said, “Because he is a published researcher and professor in the area he was speaking of, I was not skeptical”. “Passive Tolerance” was a theme made up of the codes “gut-check”, “self-determination”, “prior knowledge”, and “critical thinking”. Both STEM and non-STEM students explained that if science was presented in non-science classrooms, they relied on prior knowledge and gut-level decisions to judge whether the information was accurate. Specifically, one STEM student recalled, “Always would take conversations that were had with a grain of salt and follow up to confirm validity”. Often, these interpretations did not cause stress or bother; rather, the students tolerated the claims. Only non-STEM students mentioned “critical thinking”, stating that skepticism was an important part of critical thinking skills.
To provide further evidence, this non-STEM student said, “Skepticism is an important part of critical thinking. Simply taking a statement as fact or truth is irresponsible and is just not good science. It is important to dig further”.

4. Discussion

Given that information is more accessible than ever, it is important to critically question and evaluate instances of possible misinformation in post-secondary classrooms. It is particularly important to judge the credibility of instructors’ scientific claims to preserve the integrity and mission of land grant universities. As described in previous studies [9,10] and the current study, instructors are influential individuals who can have long-term impacts on their students beyond the scope of the class. This phenomenon arises from the remarkable level of trust and authority between instructors and students and is also explained by student–teacher relationships [14,15]. Trust between students and teachers can be explained by Social Exchange Theory, in which there is a clear division of leadership and knowledge [26,40], and teachers rely on pragmatism to corroborate ideas [19,20,21,22]. An effective way to judge credibility is to survey student perceptions [41]. Previous studies have primarily focused on detecting misinformation across news media, social platforms, and other online sources [42,43,44,45,46]. Prior research has explored trends in students’ identification of credible sources, highlighting the primary criterion of topic relevance [47]. Other investigations have examined various factors contributing to source credibility, such as content accuracy, authorship credentials, website design and structure, domain credibility (e.g., ‘.org’), usability, academic rigor, and other attributes [48]. Consistent with these findings, the current study reveals that nearly all STEM and non-STEM students perceive peer-reviewed journals as highly credible, followed by domains ending in ‘.org’ and government agencies. These resources are noted for their specificity, comprehensibility, and rigorous review processes, aligning with insights from prior research.
Generation Z and Millennials have reported the highest confidence in their ability to detect misinformation, compared to other generations [48]. The current study revealed that both STEM and non-STEM students reported on the consistency of misinformation and bias and recalled specific topics that they perceived as misinformation or biased in free responses.
Instructors frequently express their perspectives on scientific topics during teaching, sometimes incorporating political viewpoints [5,6,7,8]. This can lead to varying student reactions depending on whether the classroom environment is neutral or contentious [49]. Repeated exposure to misinformation or the anecdotal presentation of information can inflate its perceived accuracy [50,51]. To facilitate discussion on potentially sensitive subjects, instructors often refer to creating a “safe space”, aimed at fostering inclusivity and providing a secure environment for students to express vulnerable opinions [52]. However, achieving this can be challenging depending on university policies and class size dynamics. Smaller class settings generally encourage greater student engagement than larger ones [53]. Nevertheless, students across all class sizes may still hesitate to voice dissenting opinions due to perceived challenges or social pressures, potentially leading to debates over conformity and the consequences of expressing non-majority views.
Graduate students reflected that when these situations (instances of misinformation, bias, etc.) occurred, they had less knowledge of credibility than they do now and would judge the information differently today. Through qualitative analysis, it was found that graduate students reacted to these challenges positively, negatively, or neutrally. Positive reactions embraced obstacles, negative reactions expressed frustration or annoyance with these occurrences, and neutral statements did not express an emotional reaction. Considering all these answers regarding misinformation, bias, and related experiences, graduate students see numerous benefits of including science in non-science classrooms. The perceived value of incorporating science into non-science classrooms was attributed to developing critical thinking skills and improving overall science understanding and science literacy. With improved knowledge and understanding of science, student confidence is expected to increase, along with critical thinking and other higher-order thinking skills [54,55]. However, incorporating science in an interdisciplinary fashion requires skill. As highlighted both in previous research and in the current study, the effective teaching of science in non-science classrooms requires effort from the instructor [56,57]. Without positive acceptance from instructors, there would be a lack of skill and knowledge and an ineffective delivery of science content, which would defeat the purpose of science integration [58,59]. Another perspective removes the teacher from this equation [1]: teachers are not solely responsible for making science relevant, as typical pedagogical tools assume [1]. Rather, it is up to students to learn how to make science relevant through questioning, practice, and providing their own social context [1].
Graduate students encounter various challenges that undermine their trust in instructors’ assertions. Research conducted by Linvill and Havice reveals that students often perceive bias in higher education, particularly when instructors’ political views diverge from the course content [60]. Concerns about grade penalties, ridicule, and judgment can compel students to align their assignments with the instructor’s perspectives. Instructor epistemologies significantly influence students’ reception of information [16,60], and the predominance of liberal perspectives among university instructors may inadvertently propagate misinformation about scientific topics [61]. However, instructors’ political orientations generally do not measurably affect students’ own political leanings [62]. Some students who perceive bias in higher education express frustration over feeling that their efforts are devalued when confronted with biased instruction [60].
Regarding susceptibility to influence, several factors determine graduate students’ acceptance of misinformation. It was hypothesized that STEM and non-STEM groups of graduate students would differ significantly in their levels of trust and skepticism. This study found that trust in scientific claims varied significantly based on whether students were in STEM or non-STEM disciplines. STEM students tended to exhibit greater skepticism compared to their non-STEM peers, although both groups demonstrated a similar tendency to question potentially misleading claims. Factors influencing acceptance of misinformation include knowledge of the issue, information processing skills, and reliance on media sources [63]. Evaluating the credibility and validity of information is crucial, with analytically minded individuals being less prone to accepting misinformation [63,64]. Non-STEM students, typically having less prior exposure to scientific claims, often rely on external sources of scientific information to assess trustworthiness.
Among all the opportunities to provide additional context for their answers, the most commonly mentioned topic was production agriculture. Consistent with public concerns about agriculture, the majority of agriculture topics involved the environment and food production or safety [65]. Specifically, participants reported misconceptions, misinformation, and opinion-based teaching about genetically modified agricultural organisms, the use of antibiotics in livestock, exaggerated contributions to climate change, the use of pesticides, and others. Discussing agriculture outside of agricultural classrooms is not unique. Bias against agriculture, typically concerning the ethics of consuming meat, is a frequent topic for instructors. Schwitzgebel et al. [9] examined the influence of instructors on student meat consumption behaviors. In an ethics course, students were assigned to sections focusing either on the ethics of eating meat or on charity ethics [9]. Surveys tracked changes in students’ meat-eating habits and dining choices, revealing that those exposed to meat ethics discussions were more inclined to opt for non-meat alternatives [9]. Similarly, studies evaluating interventions such as “nudging” toward plant-blended burgers or educational methods demonstrated effectiveness in reducing the preference for all-beef hamburgers relative to control groups [10]. These influences were observed across both STEM and non-STEM classrooms.
Incorporating science into non-science classrooms is crucial for bridging knowledge gaps and fostering confidence in future science communication [13,54]. Learning science communication skills is a foundational component of critical thinking [13,16,54], and students in this study preferred interdisciplinary approaches for these benefits. To realize these benefits, however, instructors need to integrate science into their classrooms in ways that effectively communicate complex topics to a larger audience.

5. Conclusions, Limitations, and Recommendations

This descriptive research study utilized a mixed-methods approach to systematically quantify and elucidate graduate students’ perceptions of the credibility of university instructors. Across diverse STEM and non-STEM disciplines, graduate students reported encountering or observing instances of misinformation during their academic careers. Similarly, both groups noted instances of instructor bias in science-related topics, with non-STEM disciplines reporting higher frequencies compared to STEM disciplines.
However, our study encountered several limitations. First, a larger sample of graduate student responses would enhance the breadth and depth of representation [66,67,68]. Additionally, survey length contributed to a notable rate of incomplete responses due to respondent fatigue [67,68]. Future research efforts should consider shortening survey durations to mitigate respondent fatigue and potential bias while facilitating larger sample sizes.
Further investigations are warranted to explore the broader impacts of misinformation and faculty influence on students. Future research should encompass multiple universities and student cohorts to longitudinally assess changes in instructor credibility, potential impairments, and evolving trustworthiness over time. Additionally, interventions aimed at mitigating bias and misinformation in science education should be systematically evaluated.
The results of the current study reflect the importance of continually measuring student perceptions and the urgency of efforts to mitigate and minimize bias and misinformation surrounding science topics at higher education institutions. Such efforts or interventions may include modules or training in science communication included in teacher education or faculty development programs. In this study, the sense of challenge, the feeling of intimidation from those challenges, the risk of consequences, and the pressure to conform were measured. Students in both discipline groups had experienced each of these phenomena. Some of the obstacles that graduate students faced also challenged their trust in instructors’ claims. In general, non-STEM graduate students expressed greater trust in claims than STEM students, while both groups were equally skeptical that the claims could be misinformation.

Author Contributions

Conceptualization, K.C. and D.M.; methodology, K.C. and D.M.; validation, K.C. and M.C.; formal analysis, K.C.; investigation, K.H., K.C. and D.M.; resources, D.M.; data curation, K.C. and W.B.S.; writing—original draft preparation, K.C.; writing—review and editing, S.R., W.B.S. and D.M.; visualization, K.C.; supervision, D.M.; project administration, D.M.; funding acquisition, D.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by The Alabama Agricultural Experiment Station grant 2021-38420-34060 ‘Bolstering the Social Licensure of Agriculture—Discovery and Curation of Ag Issue Modalities’. The APC was funded by the Department of Animal Sciences, Auburn University.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board (or Ethics Committee) of Auburn University (protocol code #22-223 EX 2205, approved May 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data that support the findings of this study are available upon reasonable request from the corresponding author, D.M. The data are not publicly available due to potential privacy concerns among research participants.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Feinstein, N. Salvaging science literacy. Sci. Educ. 2011, 95, 168–185. [Google Scholar] [CrossRef]
  2. West, J.D.; Bergstrom, C.T. Misinformation in and about science. Proc. Natl. Acad. Sci. USA 2021, 118, e1912444117. [Google Scholar] [CrossRef]
  3. Gebel, M. Misinformation vs. Disinformation: What to Know about Each Form of False Information, and How to Spot Them Online. Available online: https://www.businessinsider.com/guides/tech/misinformation-vs-disinformation (accessed on 26 July 2024).
  4. Polger, M.A. CSI Library: Misinformation and Disinformation: Thinking Critically about Information Sources: Definitions of Terms. Available online: https://library.csi.cuny.edu/c.php?g=619342&p=4310781 (accessed on 26 July 2024).
  5. Linvill, D.L. The Relationship between Student Identity Development and the Perception of Political Bias in the College Classroom. Coll. Teach. 2011, 59, 49–55. [Google Scholar] [CrossRef]
  6. Kunkle, K.A.; Monroe, M.C. Cultural cognition and climate change education in the U.S.: Why consensus is not enough. Environ. Educ. Res. 2019, 25, 633–655. [Google Scholar] [CrossRef]
  7. Cook, J.; Lewandowsky, S. Rational Irrationality: Modeling Climate Change Belief Polarization Using Bayesian Networks. Top. Cogn. Sci. 2016, 8, 160–179. [Google Scholar] [CrossRef] [PubMed]
  8. van der Linden, S.; Leiserowitz, A.; Rosenthal, S.; Maibach, E. Inoculating the Public against Misinformation about Climate Change. Glob. Chall. 2017, 1, 1600008. [Google Scholar] [CrossRef]
  9. Schwitzgebel, E.; Cokelet, B.; Singer, P. Do ethics classes influence student behavior? Case study: Teaching the ethics of eating meat. Cognition 2020, 203, 104397. [Google Scholar] [CrossRef]
  10. Prusaczyk, E.; Earle, M.; Hodson, G. A brief nudge or education intervention delivered online can increase willingness to order a beef-mushroom burger. Food Qual. Prefer. 2021, 87, 104045. [Google Scholar] [CrossRef]
  11. Kompella, P.; Gracia, B.; LeBlanc, L.; Engelman, S.; Kulkarni, C.; Desai, N.; June, V.; March, S.; Pattengale, S.; Rodriguez-Rivera, G.; et al. Interactive youth science workshops benefit student participants and graduate student mentors. PLoS Biol. 2020, 18, e3000668. [Google Scholar] [CrossRef]
  12. Amin, A.M.; Karmila, F.; Pantiwati, Y.; Adiansyah, R.; Yani, A. The Communication Skills Profile of Pre-Service Biology Teachers. J. Penelit. Pendidik. IPA 2022, 8, 1814–1819. [Google Scholar] [CrossRef]
  13. Beardsworth, S.J. Building Knowledge Bridges through Effective Science Communication. Chem.—A Eur. J. 2020, 26, 1698–1702. [Google Scholar] [CrossRef]
  14. Platz, M. Trust Between Teacher and Student in Academic Education at School. J. Philos. Educ. 2021, 55, 688–697. [Google Scholar] [CrossRef]
  15. Basch, C. Student-Teacher Trust Relationships and Student Performance. Ph.D. Thesis, St. John Fisher University, Rochester, NY, USA, 2012. [Google Scholar]
  16. Barnes, C.; Angle, J.; Montgomery, D. Teachers Describe Epistemologies of Science Instruction Through Q Methodology. Sch. Sci. Math. 2015, 115, 141–150. [Google Scholar] [CrossRef]
  17. Schuh, G.E. Revitalizing Land Grant Universities: It’s Time To Regain Relevance. Choices Mag. Food Farm Resour. Issues 1986, 1, 6–10. [Google Scholar]
  18. American Association for the Advancement of Science. Atlas of Science Literacy; National Science Teachers Association: Washington, DC, USA, 2001. [Google Scholar]
  19. Legg, C.; Hookway, C. Pragmatism. In The Stanford Encyclopedia of Philosophy; Zalta, E.N., Ed.; Metaphysics Research Lab, Stanford University: Stanford, CA, USA, 2021. [Google Scholar]
  20. Sorrell, K. Pragmatism and moral progress: John Dewey’s theory of social inquiry. Philos. Soc. Crit. 2013, 39, 809–824. [Google Scholar] [CrossRef]
  21. McCarthy, C.L.; Sears, E. Deweyan Pragmatism and the Quest for True Belief. Educ. Theory 2000, 50, 213–227. [Google Scholar] [CrossRef]
  22. Heikkinen, H.L.T.; Kakkori, L.; Huttunen, R. This is my truth, tell me yours: Some aspects of action research quality in the light of truth theories. Educ. Action Res. 2001, 9, 9–24. [Google Scholar] [CrossRef]
  23. Kuhn, T. The Structure of Scientific Revolutions. Philos. Pap. Rev. 2013, 4, 41–48. [Google Scholar] [CrossRef]
  24. Nazim, M.; Mukherjee, B. Chapter 11—Factors Critical to the Success of Knowledge Management. In Knowledge Management in Libraries; Nazim, M., Mukherjee, B., Eds.; Chandos Publishing: England, UK, 2016; pp. 263–286. ISBN 978-0-08-100564-4. [Google Scholar]
  25. Abdul, N.; Razak, N.; Pangil, F.; Lazim, M.; Mohd Zin, M.L.; Azlina, N.; Mohamed Yunus, A.; Asnawi, N. Theories of Knowledge Sharing Behavior in Business Strategy; Elsevier: Amsterdam, The Netherlands, 2014; Volume 37. [Google Scholar]
  26. Elstad, E.; Christophersen, K.A.; Turmo, A. Social Exchange Theory as an Explanation of Organizational Citizenship Behaviour among Teachers. Int. J. Leadersh. Educ. 2011, 14, 405–421. [Google Scholar] [CrossRef]
  27. Dillman, D.; Smyth, J.; Christian, L. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th ed.; Wiley: Hoboken, NJ, USA, 2014; Available online: https://www.wiley.com/en-nl/Internet%2C+Phone%2C+Mail%2C+and+Mixed-Mode+Surveys%3A+The+Tailored+Design+Method%2C+4th+Edition-p-9781118456149 (accessed on 26 July 2024).
  28. Saldaña, J. The Coding Manual for Qualitative Researchers (3rd edition). Qual. Res. Organ. Manag. Int. J. 2017, 12, 169–170. [Google Scholar] [CrossRef]
  29. Braun, V.; Clarke, V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006, 3, 77–101. [Google Scholar] [CrossRef]
  30. Braun, V.; Clarke, V. Toward good practice in thematic analysis: Avoiding common problems and be(com)ing a knowing researcher. Int. J. Transgender Health 2023, 24, 1–6. [Google Scholar] [CrossRef] [PubMed]
  31. Clarke, V.; Braun, V. Teaching thematic analysis: Overcoming challenges and developing strategies for effective learning. Psychol. 2013, 26, 120–123. [Google Scholar]
  32. O’Sullivan, T.A.; Jefferson, C.G. A Review of Strategies for Enhancing Clarity and Reader Accessibility of Qualitative Research Results. Am. J. Pharm. Educ. 2020, 84, 7124. [Google Scholar] [CrossRef] [PubMed]
  33. Cypress, B.S. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations. Dimens. Crit. Care Nurs. 2017, 36, 253. [Google Scholar] [CrossRef] [PubMed]
  34. Seale, C. Quality in Qualitative Research. Available online: https://www.semanticscholar.org/paper/Quality-in-Qualitative-Research-Seale/6c636b18a99ae8cfd44352501fc199973233b056 (accessed on 26 July 2024).
  35. Swanson, H.L.; Pierre-Louis, C.; Monjaras-Gaytan, L.Y.; Zinter, K.E.; McGarity-Palmer, R.; Clark Withington, M.H. Graduate student workload: Pandemic challenges and recommendations for accommodations. J. Community Psychol. 2022, 50, 2225–2242. [Google Scholar] [CrossRef] [PubMed]
  36. Scully, G.; Kerr, R. Student Workload and Assessment: Strategies to Manage Expectations and Inform Curriculum Development. Account. Educ. 2014, 23, 443–466. [Google Scholar] [CrossRef]
  37. Longfield, A.; Romas, J.; Irwin, J.D. The Self-Worth, Physical and Social Activities of Graduate Students: A Qualitative Study. Coll. Stud. J. 2006, 40, 282–292. [Google Scholar]
  38. Graybill, J.K.; Dooling, S.; Shandas, V.; Withey, J.; Greve, A.; Simon, G.L. A Rough Guide to Interdisciplinarity: Graduate Student Perspectives. BioScience 2006, 56, 757–763. [Google Scholar] [CrossRef]
  39. Oswalt, S.B.; Riddock, C.C. What to Do About Being Overwhelmed: Graduate—ProQuest. Available online: https://www.proquest.com/docview/224810970?sourcetype=Scholarly%20Journals (accessed on 26 July 2024).
  40. What Is Social Exchange Theory?|Tulane School of Social Work. Available online: https://socialwork.tulane.edu/blog/social-exchange-theory/ (accessed on 26 July 2024).
  41. Ramos, R.; Gonçalves, J.; Gonçalves, S.P. The Unbearable Lightness of Academic Fraud: Portuguese Higher Education Students’ Perceptions. Educ. Sci. 2020, 10, 351. [Google Scholar] [CrossRef]
  42. Majerczak, P.; Strzelecki, A. Trust, Media Credibility, Social Ties, and the Intention to Share towards Information Verification in an Age of Fake News. Behav. Sci. 2022, 12, 51. [Google Scholar] [CrossRef] [PubMed]
  43. Nygren, T.; Wiksten Folkeryd, J.; Liberg, C.; Guath, M. Students Assessing Digital News and Misinformation. In Proceedings of the Disinformation in Open Online Media; van Duijn, M., Preuss, M., Spaiser, V., Takes, F., Verberne, S., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 63–79. [Google Scholar]
  44. Wineburg, S.; McGrew, S. Why Students Can’t Google Their Way to the Truth. Educ. Week 2016, 36, 22–28. [Google Scholar]
  45. Evanson, C.; Sponsel, J. From Syndication to Misinformation: How Undergraduate Students Engage with and Evaluate Digital News. Commun. Inf. Lit. 2019, 13, 228–250. [Google Scholar] [CrossRef]
  46. Svrovátková, J.; Pavliček, A. Social Media News Credibility among Students in the Czech Republic. In Proceedings of the 2021 Eighth International Conference on Social Network Analysis, Management and Security (SNAMS), Gandia, Spain, 6–9 December 2021; pp. 1–7. [Google Scholar]
  47. Metzger, M.J.; Flanagin, A.J. Digital Media, Youth, and Credibility; Mit Press, Cop: Cambridge, MA, USA; London, UK, 2008; pp. 49–72. [Google Scholar]
  48. Liu, Z. Perceptions of credibility of scholarly information on the web. Inf. Process. Manag. 2004, 40, 1027–1038. [Google Scholar] [CrossRef]
  49. Rodriguez, F.; Ng, A.; Shah, P. Do College Students Notice Errors in Evidence when Critically Evaluating Research Findings? J. Excell. Coll. Teach. 2016, 27, 63–78. [Google Scholar]
  50. Rodriguez, F.; Rhodes, R.; Miller, K.; Shah, P. Examining the Influence of Anecdotal Stories and the Interplay of Individual Differences on Reasoning. Think. Reason. 2016, 22, 274–296. [Google Scholar] [CrossRef]
  51. Pennycook, G.; Cannon, T.D.; Rand, D.G. Prior exposure increases perceived accuracy of fake news. J. Exp. Psychol. Gen. 2018, 147, 1865–1880. [Google Scholar] [CrossRef] [PubMed]
  52. Flensner, K.K.; Von der Lippe, M. Being safe from what and safe for whom? A critical discussion of the conceptual metaphor of ‘safe space’. Intercult. Educ. 2019, 30, 275–288. [Google Scholar] [CrossRef]
  53. Wright, A.; Gottfried, M.A.; Le, V.-N. A Kindergarten Teacher Like Me: The Role of Student-Teacher Race in Social-Emotional Development. Am. Educ. Res. J. 2017, 54, 78S–101S. [Google Scholar] [CrossRef]
  54. Train, T.L.; Miyamoto, Y.J. Research and Teaching: Encouraging Science Communication in an Undergraduate Curriculum Improves Students’ Perceptions and Confidence. J. Coll. Sci. Teach. 2017, 46, 76–83. [Google Scholar] [CrossRef]
  55. You, H.S. Why Teach Science with an Interdisciplinary Approach: History, Trends, and Conceptual Frameworks. J. Educ. Learn. 2017, 6, p66. [Google Scholar] [CrossRef]
  56. Sadler, P.M.; Sonnert, G.; Coyle, H.P.; Cook-Smith, N.; Miller, J.L. The Influence of Teachers’ Knowledge on Student Learning in Middle School Physical Science Classrooms. Am. Educ. Res. J. 2013, 50, 1020–1049. [Google Scholar] [CrossRef]
  57. Goe, L. The Link between Teacher Quality and Student Outcomes: A Research Synthesis; National Comprehensive Center for Teacher Quality: Washington, DC, USA, 2007. [Google Scholar]
  58. Baumert, J.; Kunter, M.; Blum, W.; Brunner, M.; Voss, T.; Jordan, A.; Klusmann, U.; Krauss, S.; Neubrand, M.; Tsai, Y.-M. Teachers’ Mathematical Knowledge, Cognitive Activation in the Classroom, and Student Progress. Am. Educ. Res. J. 2010, 47, 133–180. [Google Scholar] [CrossRef]
  59. Hill, H.; Rowan, B.; Ball, D. Effects of Teachers’ Mathematical Knowledge for Teaching on Student Achievement. Am. Educ. Res. J. Summer 2005, 42, 371–406. [Google Scholar] [CrossRef]
  60. Linvill, D.L.; Havice, P.A. Political Bias on Campus: Understanding the Student Experience. J. Coll. Stud. Dev. 2011, 52, 487–496. [Google Scholar] [CrossRef]
  61. Gross, N. Liberals and Conservatives in Academia: A Reply to My Critics. Society 2015, 52, 47–53. [Google Scholar] [CrossRef]
  62. Mariani, M.D.; Hewitt, G.J. Indoctrination U.? Faculty Ideology and Changes in Student Political Orientation. PS Political Sci. Politics 2008, 41, 773–783. [Google Scholar] [CrossRef]
  63. Hwang, Y.; Jeong, S.-H. Education-Based Gap in Misinformation Acceptance: Does the Gap Increase as Misinformation Exposure Increases? Commun. Res. 2023, 50, 157–178. [Google Scholar] [CrossRef]
  64. Schwarz, N.; Jalbert, M. When (Fake) News Feels True: Intuitions of truth and the acceptance and correction of misinformation. In The Psychology of Fake News; Routledge: London, UK, 2020; ISBN 978-0-429-29537-9. [Google Scholar]
  65. Whitaker, B.K.; Dyer, J.E. Identifying sources of bias in agricultural news REPORTING. J. Agric. Educ. 2000, 41, 125–133. [Google Scholar] [CrossRef]
  66. Galesic, M.; Bosnjak, M. Effects of Questionnaire Length on Participation and Indicators of Response Quality in a Web Survey. Public Opin. Q. 2009, 73, 349–360. [Google Scholar] [CrossRef]
  67. Deutskens, E.; de Ruyter, K.; Wetzels, M.; Oosterveld, P. Response Rate and Response Quality of Internet-Based Surveys: An Experimental Study. Mark. Lett. 2004, 15, 21–36. [Google Scholar] [CrossRef]
  68. Herzog, A.; Bachman, J. Effects of Questionnaire Length on Response Quality. Public Opin. Q. 1981, 45, 549–559. [Google Scholar] [CrossRef]
Figure 1. Credible sources of scientific information identified using check-all multiple-choice options for STEM (sciences, technology, engineering, and math) participants (n = 149) and non-STEM (n = 37) graduate student participants.
Figure 2. Frequencies of experiencing science addressed in non-science classrooms for (a) graduate students in STEM (sciences, technology, engineering, and math) programs (n = 149) and (b) non-STEM programs (n = 37). χ2(3) = 9.07, p = 0.03, V = 0.22.
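The effect size in the Figure 2 caption follows directly from the reported chi-square statistic. A minimal sketch of Cramér’s V, assuming the 2 × 4 contingency table implied by the two student groups and df = 3 (so min(r − 1, c − 1) = 1):

```python
import math

# Values reported in the Figure 2 caption.
chi2 = 9.07
n = 186        # 149 STEM + 37 non-STEM respondents
min_dim = 1    # min(rows - 1, cols - 1), assuming a 2 x 4 table

# Cramér's V = sqrt(chi2 / (n * min(r - 1, c - 1)))
v = math.sqrt(chi2 / (n * min_dim))
print(f"V = {v:.2f}")  # V = 0.22, matching the caption
```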
Figure 3. Frequencies of (a) graduate students in STEM (sciences, technology, engineering, and math) programs (n = 149) and (b) non-STEM programs (n = 37) that reported whether instructors provided credible evidence for scientific claims made inside non-science classrooms. χ2(3) = 4.56, p = 0.21.
Table 1. Demographic characteristics of graduate student survey participants from STEM (sciences, technology, engineering, and math) and non-STEM disciplines z.
| Characteristic | STEM N | STEM % | Non-STEM N | Non-STEM % | Full Sample N | Full Sample % |
|---|---|---|---|---|---|---|
| Gender | | | | | | |
| Female | 65 | 43.6 | 26 | 70.3 | 91 | 48.9 |
| Male | 81 | 54.4 | 10 | 27.0 | 91 | 48.9 |
| Third Gender/Non-Binary | 2 | 1.3 | 1 | 2.7 | 3 | 1.6 |
| Prefer not to say | 1 | 0.7 | 0 | 0.0 | 1 | 0.6 |
| Age | | | | | | |
| 20–29 | 111 | 74.5 | 24 | 64.9 | 135 | 72.5 |
| 30–39 | 29 | 19.5 | 5 | 13.5 | 34 | 18.3 |
| 40–49 | 4 | 2.7 | 3 | 8.1 | 7 | 3.8 |
| 50–59 | 2 | 1.3 | 4 | 10.8 | 6 | 3.2 |
| 60 or older | 3 | 2.0 | 1 | 2.7 | 4 | 2.2 |
| Ethnicity | | | | | | |
| Caucasian/White | 97 | 65.1 | 28 | 75.7 | 125 | 67.2 |
| Hispanic/Latino | 13 | 8.7 | 2 | 5.4 | 15 | 8.1 |
| African American/Black | 6 | 4.0 | 5 | 13.5 | 11 | 5.9 |
| Asian/Pacific Islander | 31 | 20.8 | 0 | 0.0 | 31 | 16.6 |
| Mixed/Other | 2 | 1.3 | 2 | 5.4 | 4 | 2.2 |
| Political Affiliation | | | | | | |
| Democrat | 38 | 25.5 | 18 | 48.6 | 56 | 30.1 |
| Republican | 27 | 18.1 | 8 | 21.8 | 35 | 18.8 |
| Independent | 27 | 18.1 | 6 | 16.2 | 33 | 17.7 |
| Libertarian | 6 | 4.0 | 1 | 2.7 | 7 | 3.8 |
| Prefer not to say | 51 | 34.2 | 4 | 10.8 | 55 | 29.6 |
| Upbringing | | | | | | |
| Urban | 33 | 22.1 | 6 | 16.2 | 39 | 21.0 |
| Suburban | 84 | 56.4 | 22 | 59.5 | 106 | 57.0 |
| Rural | 32 | 21.5 | 9 | 24.3 | 41 | 22.0 |
| Degree Program | | | | | | |
| Graduate Certification | 9 | 6.0 | 2 | 5.4 | 6 | 3.2 |
| Masters | 59 | 39.6 | 15 | 40.5 | 72 | 38.7 |
| Doctor of Philosophy | 80 | 53.7 | 20 | 54.1 | 100 | 53.8 |
| Postdoctoral Studies | 1 | 0.7 | 0 | 0.0 | 1 | 0.6 |
z Qualtrics survey of 186 graduate students (n = 149 in STEM graduate programs; n = 37 in non-STEM graduate programs) completing the credibility judgment questionnaire.
Table 2. Perceived value of incorporating science in non-science classrooms of a sample of STEM (sciences, technology, engineering, and math) and non-STEM graduate students zy.
Value of Incorporating Science Topics into Non-Science Classrooms

| Group | Yes, Always | Yes, Sometimes | No, Never | Total |
|---|---|---|---|---|
| STEM | 88 | 57 | 4 | 149 |
| Non-STEM | 17 | 20 | 0 | 37 |
z Qualtrics survey of 186 graduate students (n = 149 in STEM graduate programs; n = 37 in non-STEM graduate programs) completing the credibility judgment questionnaire; y χ2(2) = 3.68, p = 0.16.
Table 3. Sensing potential consequences with differing opinions than class majority or instructor of a sample of STEM (sciences, technology, engineering, and math) and non-STEM graduate students zy.
Risk of Potential Consequences Witnessed or Sensed

| Group | Did Not Have Differing Opinions | Did Not Feel at Risk, Even with Differing Opinions | Some | Felt at Risk | Total |
|---|---|---|---|---|---|
| STEM | 18 | 69 | 50 | 12 | 149 |
| Non-STEM | 0 | 24 | 11 | 2 | 37 |
z Qualtrics survey of 186 graduate students (n = 149 in STEM graduate programs; n = 37 in non-STEM graduate programs) completing the credibility judgment questionnaire; y χ2(3) = 6.92, p = 0.08, V = 0.19.
Table 4. Perceived trust in claims when instructors introduced science in a non-science classroom in a sample of STEM (sciences, technology, engineering, and math) and non-STEM graduate students zyx.
Level of Trust in Scientific Claims

| Group | Very Low | Somewhat Low | Neither Trust nor Distrust | Somewhat High | Very High | Total |
|---|---|---|---|---|---|---|
| STEM | 2 | 9 | 29 | 64 | 6 | 110 |
| Non-STEM | 0 | 0 | 10 | 17 | 7 | 34 |
z Qualtrics survey of 186 graduate students (n = 149 in STEM graduate programs; n = 37 in non-STEM graduate programs) completing the credibility judgment questionnaire; y n = 144 (n = 110 for STEM graduate students; n = 34 for non-STEM graduate students); x χ2(4) = 10.39, p = 0.03, V = 0.27.
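The chi-square tests of independence and Cramér's V values reported with the tables can be reproduced directly from the published cell counts. The sketch below (plain Python, no external libraries; the function names are illustrative, not from the authors' analysis code) recomputes both statistics for the Table 4 contingency table and recovers the reported χ2(4) = 10.39 and V = 0.27.

```python
from math import sqrt

def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table
    (list of rows of observed counts)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

def cramers_v(table):
    """Cramer's V effect size: sqrt(chi2 / (n * min(r - 1, c - 1)))."""
    n = sum(sum(row) for row in table)
    k = min(len(table), len(table[0])) - 1
    return sqrt(chi_square_stat(table) / (n * k))

# Table 4 counts: rows = STEM, non-STEM; columns = Very Low ... Very High
trust = [[2, 9, 29, 64, 6],
         [0, 0, 10, 17, 7]]

print(round(chi_square_stat(trust), 2))  # 10.39, matching the reported chi2(4)
print(round(cramers_v(trust), 2))        # 0.27, matching the reported V
```

Running the same functions on the Table 3 counts ([[18, 69, 50, 12], [0, 24, 11, 2]]) likewise returns the reported χ2(3) = 6.92 and V = 0.19, confirming that degrees of freedom equal (rows − 1) × (columns − 1) and that V is scaled by n and min(r − 1, c − 1).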

Corbitt, K.; Hiltbrand, K.; Coursen, M.; Rodning, S.; Smith, W.B.; Mulvaney, D. Credibility Judgments in Higher Education: A Mixed-Methods Approach to Detecting Misinformation from University Instructors. Educ. Sci. 2024, 14, 852. https://doi.org/10.3390/educsci14080852

