**Preface to "Scientific Reasoning in Science Education: From Global Measures to Fine-Grained Descriptions of Students' Competencies"**

In modern science- and technology-based societies, competencies that enable citizens to reason scientifically play a key role not only for science- and technology-based careers but also for democratic participation (e.g., OECD, 2019). Developing these competencies is hence considered an important goal for science education in many countries around the globe (e.g., KMK, 2020; NRC, 2012).

Scientific reasoning competencies are defined as a complex construct that encompasses abilities such as identifying scientific problems, developing questions and hypotheses, categorizing and classifying entities, engaging in probabilistic reasoning, generating evidence through modeling and experimentation, and communicating, evaluating, and scrutinizing claims (Lawson, 2004; NRC, 2012). These abilities require different forms of knowledge, such as content knowledge about the concepts of science, procedural knowledge about scientific methods, and epistemic knowledge of how such procedures warrant the claims that scientists advance (Osborne, 2014).

The research on scientific reasoning competencies is quite diverse. This diversity is—at least in part—caused by the manifold abilities that models of scientific reasoning comprise and the wide range of content, procedural, and epistemic knowledge that is deemed necessary to exercise these abilities. Differences exist, for instance, in the specific abilities that are addressed (e.g., applying the control-of-variables strategy: Reith & Nehring, 2020; handling of anomalous data: Chinn & Brewer, 2001; formulating questions and hypotheses: Vorholzer et al., 2016; developing and using models: Göhner & Krell, 2020). In addition, even studies that focus on similar abilities may use different theoretical frameworks and address different procedural and epistemic concepts (Vorholzer et al., 2016). Moreover, studies focus on a broad spectrum of respondents ranging from K-12 students (e.g., Koerber & Osterhaus, 2019; Mayer et al., 2014; Nehring et al., 2015; Vorholzer et al., 2016) to pre-service (e.g., Khan & Krell, 2019) and in-service teachers (e.g., Krell & Krüger, 2016).

Empirical research that focuses on scientific reasoning competencies typically describes the addressed competencies in a rather large-grained way. On a conceptual level, most studies offer a clear description of the addressed competencies, while the specific abilities, as well as the corresponding procedural and epistemic knowledge, are often less precisely defined (Vorholzer et al., 2016). For instance, a study may report that it focuses on students' competencies to develop scientific investigations without stating whether that entails only knowledge of the control-of-variables strategy or also knowledge of strategies such as repeating measurements or measuring with large sample sizes. In addition, empirical studies often report aggregated measures, for instance, in the form of a global scientific reasoning competency measure or a global measure of students' epistemic understanding (e.g., naïve vs. sophisticated), without stating what exactly students are (or are not) able to do or how they understand concepts related to scientific reasoning. It is important to note that the grain size outlined above is completely sufficient when the goal of a study is, for instance, to investigate the effectiveness of a specific instructional intervention or to analyze the dimensionality of a competency model. Studies that utilize this grain size have provided many vital insights regarding the modeling, assessment, and ways of fostering scientific reasoning competencies. However, we argue that more fine-grained perspectives have substantial benefits for instructional practice and research. For instance, detailed insights into students' procedural and epistemic knowledge related to scientific reasoning can inform teachers in designing instruction that matches students' current understanding and specific learning needs.
Such insights also provide manifold opportunities for further research, for instance, regarding the development of students' scientific reasoning competencies and the corresponding learning processes.

This book compiles empirical and theoretical contributions that seek to provide a more fine-grained perspective on scientific reasoning competencies, for instance, by providing precise descriptions of specific abilities and corresponding knowledge or by offering insights into the extent to which students of different age groups are able to reason scientifically. The contributions demonstrate the variety of conceptualizations of scientific reasoning in science education. Several contributions have based their research on well-established conceptualizations, such as formulating research questions, generating hypotheses, planning experiments, observing and measuring, preparing data for analysis, and drawing conclusions (e.g., Bicak et al.). Others have broadened their scope and discuss aspects that are somewhat "on the sidelines" of what is typically considered scientific reasoning, such as the relevance of conceptual knowledge for reasoning in a specific context (Schellinger et al.) or reasoning about controversial science issues (Beniermann et al.). Most of the contributions address abilities related to experimentation and modeling (e.g., Khan & Krell; Upmeier zu Belzen et al.).

The contributions in the book are ordered by their conceptualizations of scientific reasoning. The editorial is followed by six contributions that address scientific reasoning in general (Bicak et al.; Hilfert-Rüppel et al.; Khan & Krell; Krell et al.; Mahler et al.; Schlatter et al.). The second part of the book comprises nine contributions that address specific aspects of scientific reasoning, such as modeling- or data-based reasoning (Beniermann et al.; Cabello et al.; Lang et al.; Masnick & Morris; Meister & Upmeier zu Belzen; Schellinger et al.; Rost & Knuuttila; Upmeier zu Belzen et al.; Wei et al.).

#### **References**

Chinn, C. A., & Brewer, W. F. (2001). Models of data: A theory of how people evaluate data. *Cognition and Instruction, 19*, 323–393.

Göhner, M., & Krell, M. (2020). Preservice science teachers' strategies in scientific reasoning: The case of modeling. *Research in Science Education, 52*, 395–414.

Khan, S., & Krell, M. (2019). Scientific reasoning competencies: A case of preservice teacher education. *Canadian Journal of Science, Mathematics and Technology Education, 19*, 446–464.

KMK [Sekretariat der Ständigen Konferenz der Kultusminister der Länder in der BRD] (Eds.). (2020). *Bildungsstandards im Fach Biologie für die Allgemeine Hochschulreife* [Educational standards in biology for the general higher education entrance qualification]. https://www.kmk.org/fileadmin/Dateien/veroeffentlichungen_beschluesse/2020/2020_06_18-BildungsstandardsAHR_Biologie.pdf

Koerber, S., & Osterhaus, C. (2019). Individual differences in early scientific thinking: Assessment, cognitive influences, and their relevance for science learning. *Journal of Cognition and Development, 20*, 510–533.

Krell, M., & Krüger, D. (2016). Testing models: A key aspect to promote teaching activities related to models and modelling in biology lessons? *Journal of Biological Education, 50*, 160–173.

Lawson, A. (2004). The nature and development of scientific reasoning: A synthetic view. *International Journal of Science and Mathematics Education, 2*, 307–338.

Mayer, D., Sodian, B., Koerber, S., & Schwippert, K. (2014). Scientific reasoning in elementary school children: Assessment and relations with cognitive abilities. *Learning and Instruction, 29*, 43–55.

Nehring, A., Nowak, K. H., Upmeier zu Belzen, A., & Tiemann, R. (2015). Predicting students' skills in the context of scientific inquiry with cognitive, motivational, and sociodemographic variables. *International Journal of Science Education, 37*, 1343–1363.

NRC [National Research Council] (2012). *A framework for K-12 science education: Practices, crosscutting concepts, and core ideas*. Washington, DC: National Academies Press.

OECD (2019). *Conceptual learning framework: Learning Compass 2030*. https://www.oecd.org/education/2030-project/teaching-and-learning/learning/learning-compass-2030/OECD_Learning_Compass_2030_concept_note.pdf

Osborne, J. (2014). Scientific practices and inquiry in the science classroom. In N. G. Lederman & S. K. Abell (Eds.), *Handbook of research on science education*. Volume 2 (pp. 579–599). New York: Routledge/Taylor & Francis Group.

Reith, M., & Nehring, A. (2020). Scientific reasoning and views on the nature of scientific inquiry: Testing a new framework to understand and model epistemic cognition in science. *International Journal of Science Education, 42*, 2716–2741.

Vorholzer, A., von Aufschnaiter, C., & Kirschner, S. (2016). Entwicklung und Erprobung eines Tests zur Erfassung des Verständnisses experimenteller Denk- und Arbeitsweisen [Development and evaluation of a test to assess the understanding of experimental ways of thinking and working]. *Zeitschrift für Didaktik der Naturwissenschaften, 22*, 25–41.

> **Moritz Krell, Andreas Vorholzer, and Andreas Nehring** *Editors*
