Article

Inquiring in the Science Classroom by PBL: A Design-Based Research Study

by Jorge Pozuelo-Muñoz, Ana de Echave Sanz and Esther Cascarosa Salillas *
IUCA Research Institute and Beagle Research Group, Department of Specific Didactics, Faculty of Education, Universidad de Zaragoza, 50009 Zaragoza, Spain
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(1), 53; https://doi.org/10.3390/educsci15010053
Submission received: 30 November 2024 / Revised: 30 December 2024 / Accepted: 2 January 2025 / Published: 8 January 2025

Abstract:
The aim of this study was to design and evaluate a sequence of activities that promotes the development of scientific skills in secondary school. For this purpose, design-based research was conducted, using problem-based learning as a tool to engage students in scientific inquiry practice. The research was structured around design, implementation, and evaluation phases, with specific tools created to assess both student learning outcomes and the validity of the teaching–learning sequence (TLS). These tools helped identify the performance levels achieved by students in the various stages of scientific inquiry, from formulating hypotheses to interpreting data, and also allowed for the evaluation of the teaching methodology’s effectiveness. The results indicated that the TLS significantly enhanced students’ scientific competence by promoting skills related to scientific inquiry, such as hypothesis formulation, variable identification, observation, data collection, and interpretation. Additionally, the use of a weather station as the central topic provided a context closely tied to the students’ local environment, which facilitated deeper engagement and understanding. The evaluation revealed that students progressed in their scientific inquiry skills, moving from “pre-scientific” to “uncertain inquirer” performance levels. While challenges such as initial disorientation and difficulties in representing experimental data were observed, the overall performance of students demonstrated the success of the TLS. Furthermore, the students worked collaboratively, contributing their individual skills and experiences to achieve group goals. This study provides valuable insights into the potential of TLS as an alternative to traditional teaching methods, offering an innovative way to assess and enhance students’ scientific skills.
It also highlights the importance of teacher guidance in inquiry-based activities and suggests that future projects could benefit from allowing students to choose the topic, further enhancing their motivation and engagement.

1. Introduction

Science classes should be designed with the aim of helping students investigate their scientific concerns while learning what teachers must teach (Solbes et al., 2007). Prior studies have revealed that students are interested in solving problems in their immediate environment; that is, if scientific problems are contextualized in the students’ environment, their interest grows (Caamaño, 2018; Fraser & Walberg, 2005; Rodrigues & Mattos, 2011; Swarat et al., 2012). In this respect, whether students develop science process skills should be something that teachers assess. Therefore, an ideal teaching methodology could consist of students themselves posing problems contextualized in their own environments and based on their own interests or, at least, on the needs of their environment (Pozuelo-Muñoz et al., 2023). In this work, we present the design, development, and evaluation of a science teaching strategy based on a problem that the students need to solve. The learning results of the participating students are evaluated, identifying the scientific skills developed and the degree of their achievement. The teaching methodology followed the problem-based learning (PBL) approach in order to develop the practice of inquiry in the classroom. To investigate its development, design-based research (DBR) was implemented through the development of a teaching–learning sequence (TLS), which is fundamental to DBR.

2. Theoretical Framework

2.1. Problem-Based Learning

PBL is an educational method rooted in the work of Dewey (1938). The aim of PBL is to foster both content knowledge and problem-solving abilities in students (Wang et al., 1998). According to scholars like Hmelo-Silver (2004) and Merritt et al. (2017), PBL goals are organized into four categories: (a) content knowledge (building a broad and adaptable knowledge base, achieving academically, retaining information, and developing concepts), (b) procedural knowledge (enhancing problem-solving abilities and self-directed, lifelong learning skills), (c) collaboration (becoming effective team members), and (d) attitudes (developing intrinsic motivation to learn and staying engaged). PBL emphasizes learning through problem-solving and integrating knowledge in real-world contexts (Capraro & Slough, 2013), which helps develop competencies and skills (Bell, 2010).
Drake and Long (2009) explored a typical PBL framework, which consists of eight key components: the problem, small groups, a student-centered iterative inquiry process, resources, technology, community partnerships, communication of findings, and the role of teachers as facilitators. There is limited research on the impact of PBL in science education for secondary school students. However, some studies show promising outcomes, such as enhancing critical thinking skills (Anazifa & Djukri, 2017) and helping both students and teachers engage in scientific practices (Kolodner et al., 2003).
It is important to note that the problem is central to the learning process, acting as a catalyst for student motivation and activity (Chin & Chia, 2004). Therefore, the problem should be complex, related to real-life situations, and ill-structured, allowing for open-ended solutions and broad inquiry (Etherington, 2011; Merritt et al., 2017). Additionally, several researchers suggest that students should have the freedom to identify and address their own problems (Chin & Chia, 2004; Runco & Okuda, 1988). Once the problem is defined, students collaborate in small groups (Kolodner et al., 2003; Runco & Okuda, 1988; Wang et al., 1998), engaging in an iterative inquiry process that is strongly supported by the PBL approach (Chin & Chia, 2004). Taking the above into account, this work proposes research to evaluate the development of students’ science process skills in inquiry through question-posing in a PBL context.

2.2. Design-Based Research

One of the most current lines of research in experimental science teaching is research based on the design of teaching–learning sequences (TLS) (Guisasola et al., 2021). Design-based research (DBR) aims to generate didactic knowledge about the teaching and learning of science through direct classroom intervention (Kortland & Klaassen, 2010). This research is interventionist in nature, theoretically grounded, iterative, process-oriented, and pragmatic. Classroom interventions are carried out through the design of TLS, a set of structured activities adapted to the development of the school curriculum and student development, while also constituting an educational research activity (Méheut & Psillos, 2004). There are two fundamental characteristics that differentiate DBR from other research in the field of science teaching (Sánchez-Azqueta et al., 2019). The first focuses on the context of the activity as a fundamental pillar for understanding teaching practice in the classroom (Guisasola et al., 2021). This positions the objective of DBR as the characterization of specific situations and the development of classroom interventions that characterize the design of practice (Barab & Squire, 2004; Bell, 2004). The second characteristic, and the main difference between DBR and other methodologies, is that its implementation involves changes at the local level. That is, DBR implies a pragmatic research philosophy supported by methodological rigor (Cherryholmes, 1992; Phillips, 2006).
Talking about DBR inevitably means talking about TLS: a series of classroom activities, developed on the basis of prior study, that aim to cover the relevant part of the curriculum while also serving as a research activity within science teaching (Méheut & Psillos, 2004). The existing formats for designing a TLS are diverse in nature (Furió-Más et al., 2003; Guisasola et al., 2021), but they basically focus on three phases: students present their ideas and listen to those of others; activities are conducted to guide their ideas toward the academic concepts; and, upon completion of the sequence, students reflect on how their ideas have changed compared to their initial thoughts.
Design-based research (DBR) is proving to be an important line of research in the field of experimental science didactics (Guisasola et al., 2021). This line of research is supported by the scientific community in the field, and each contribution on the topic helps to underpin advances in scientific knowledge developed through the implementation of such research in the classroom (Kortland & Klaassen, 2010).
As Guisasola et al. (2021) state, to establish a complete DBR study, it is necessary to theoretically ground the research, design the teaching–learning sequence (TLS), implement it, and evaluate it. The results of DBR are directly linked to the quality of the sequence (Guisasola et al., 2021), so by evaluating the quality of the sequence, conclusions can be drawn about the designed research. It should not be forgotten that classroom research must fulfill a dual function: advancing knowledge on the topic under investigation and helping students develop knowledge and procedures inherent to science (Méheut & Psillos, 2004). This is an opportunity to work on scientific practices in the classroom at any educational stage (Crujeiras-Pérez & Cambeiro, 2018). Therefore, this research has a double objective. On the one hand, it seeks to design, implement, and evaluate a sequence of activities from which results are obtained through the DBR methodology; the aim of this part is to investigate the scientific skills that the participating students put into play and the extent to which they achieve them. On the other hand, in relation to advanced students, it seeks to compare the characteristics of the sequence with their educational needs in order to conclude whether this sequence could be an option for science work with these students.
This study aims to contribute to the advancement of knowledge in educational research within the context of secondary education by designing and developing a teaching–learning sequence (TLS) focused on the scientific practice of inquiry. The central tool of this research is a weather station, which is used to promote practical and experimental learning in the classroom. The specific objectives of the research are as follows:
Develop a design-based research study (DBR): Create and evaluate a teaching–learning sequence (TLS) centered on the PBL methodology and the scientific practice of inquiry, and determine the validity of the sequence for classroom implementation in secondary education.
Design assessment tools: Develop specific tools to measure students’ performance in scientific practice throughout the development of the TLS.
Analyze the utility of the weather station: Investigate the extent to which a weather station can serve as an effective tool for developing inquiry skills in the classroom, and evaluate the problem-based learning methodology in this case.

3. Materials and Methods

3.1. Stages of Design-Based Research

Design-based research (DBR) can be defined as a research methodology where the tools available to researchers at each stage are flexible. However, a common theoretical framework is necessary for this methodology to be genuinely useful (Alghamdi & Li, 2013; Juuti & Lavonen, 2006; Reeves, 2006). Therefore, it is essential to clarify the specifics of each stage of design-based research. These stages are as follows: (a) theoretical foundations; (b) design; (c) implementation; (d) evaluation and redesign. Each of these phases is outlined below, with a focus on how they will be concretized in the context of our research.

3.1.1. Theoretical Foundations for Research

The starting point for designing a teaching–learning sequence (TLS) rests on two main ideas. First, one of the primary goals is to understand the epistemology of science (Guisasola et al., 2008, 2021), emphasizing that students should view the production of scientific knowledge as a rational process. Second, considering DBR in its social and constructivist dimensions (Guisasola et al., 2008), learning begins with students’ prior ideas and is focused not only on the individual but is also supported by the entire student body, the learning environment, and the activities conducted (Leach & Scott, 2002).

3.1.2. Design Phase

Based on the theoretical foundations, the experience to be implemented in the classroom is designed. This involves defining learning objectives, identifying difficulties, addressing learning demands, and outlining the teaching strategies to be employed.
In any science teaching process, contextualization is crucial as it allows for the establishment of relationships between science and everyday life, leading to more durable learning (Caamaño, 2018; King & Ritchie, 2012). It facilitates the transfer of learned concepts to other contexts (Gilbert, 2006; King & Ritchie, 2012) and helps observe the presence of science in daily situations (Chamizo & Izquierdo, 2005), which increases interest in science (Gilbert, 2006; Sanmartí & Márquez, 2017).
Teachers must carefully choose an appropriate context to capture students’ attention (Avargil et al., 2012; Romero Ariza & Quesada, 2015). This context should not only be relevant to students’ lives but also serve as a source of the scientific ideas they aim to explore (Dori et al., 2018). This generates a need to know, facilitating lasting learning (Ültay & Çalik, 2012). Additionally, the design considers the “epistemological” and “learning demand” analysis tools (Guisasola et al., 2021), which aim to construct scientific knowledge in the educational context through a comprehensive study of the content to be covered. This will help define the “guiding problems” to be addressed in the TLS (Guisasola et al., 2021) and determine the sequence of activities within the TLS (Cherryholmes, 1992; Savall Alemany et al., 2016, 2019). By analyzing learning demands, the goal is to identify ontological and epistemic differences between students’ ideas and the defined objectives (i.e., what is and what is thought to be) (Leach & Scott, 2002).
Finally, learning strategies are established to align with the objectives and learning difficulties. Based on these blocks, the design of activities that will form the TLS can begin.
In this research, the TLS was developed around inquiry. Inquiry is one of the three scientific practices that students must develop to work with science in the classroom (inquiry, argumentation, and modeling) (Sánchez-Azqueta et al., 2019). It is a process through which students can understand the methods scientists use, evaluate the potential of observations, develop the ability to formulate investigable questions, and make hypotheses that are then tested or refuted based on collected and analyzed data (Crawford, 2007). Inquiry has an epistemic purpose, allowing students to grasp the provisional and evolving nature of science (Aguilar & Barroso, 2015; Jiménez Aleixandre, 2011; Kelly & Duschl, 2002). It is well established that this inquiry-based approach promotes procedural learning, in which both scientific skills and reasoning are addressed. By engaging students in a genuine process of scientific discovery, inquiry fosters reasoning and scientific competence. Thus, learning science is presented as a constructive process in which nothing is final and students “learn science by doing science” (Couso et al., 2020; Pedaste et al., 2015). From a pedagogical perspective, the complex scientific process is divided into smaller, logically connected units that guide students through scientific reasoning. These individual units are called inquiry phases, and their connections form an inquiry cycle. The literature describes various phases and cycles of inquiry, but we base our work on Pedaste et al. (2015) to outline, in the Methodology Section, what these phases are, their objectives, and the actions required from students in the TLS presented in this research.

3.1.3. Implementation of the Activity

Guisasola et al. (2021) define this stage as a “teaching experiment” aimed at studying whether the design truly enhances student learning. Using the teaching–learning sequence (TLS) as a guide, it will be possible to establish improvements on the initial design (Cobb et al., 2003).

3.1.4. Retrospective Analysis: Evaluation and Redesign

Research tools are not predetermined by design-based research (DBR), so they will depend on the didactic strategies used and the aspects that need to be evaluated. However, it is advisable to use multiple evaluation designs that explicitly show the results of the TLS (Nieveen, 2009). Guisasola et al. (2021) propose evaluation instruments in two dimensions: the evaluation of the quality of the sequence and the evaluation of learning outcomes. In this work, tools were designed to assess both dimensions, allowing conclusions to be drawn about the research question posed in the introduction of this study. A summary of the stages of DBR is presented in Figure 1.

3.2. Instruments

The methodology followed in this research adheres to the phases of design-based research established in the previous section, which itself serves as the theoretical foundation of the research. Within the DBR framework, the design of the teaching–learning sequence (TLS) is itself a result of the research, and it has therefore been included in the Results Section.
To collect and analyze the data obtained from the development of the TLS, various instruments were used (semi-structured interviews with teachers, researchers, and students; direct observations; written student reports; questionnaires; and audio and video recordings of the sessions). More than one instrument was often used for the same session, which facilitated the triangulation of both methods and data (Aguilar & Barroso, 2015). We proposed evaluation instruments to analyze both the learning outcomes and the quality of the sequence.
To evaluate the learning outcomes of the TLS, evaluations were first separated for each of the two activities it comprises. As described in the Results Section, the TLS consisted of an activity in which students had to decide on the purchase of a weather station (activity 1) and a second activity in which they had to choose a location for this weather station within the educational center (activity 2). For each activity, the evaluation process followed the stages that the activity underwent during its implementation. To evaluate the development of the skills involved in each stage, we used as a reference the work of Ferrés-Gurt et al. (2015), which synthesizes the Practical Test Assessment Inventory (PTAI) system (Tamir et al., 1982). Before deciding on the reference instrument, we carried out a literature analysis to find other authors who described a similar assessment of the inquiry process (Buty et al., 2004; Lou et al., 2015). Akuma and Callaghan (2019) describe the design of evaluation instruments as one of the most important challenges in the design of a TLS. All assessment tools must be designed for the context in which they are going to be applied; therefore, despite drawing on reference works, we specifically designed a tool that would allow the assessment of the main objectives of our research.
Following their models, we broke each activity down into its constituent stages of investigation, associating the skills involved and adapting each of the skills described by the authors to the work performed in this research. Figure 2 specifies, for each of the two TLS activities, the stages of the activity, the skill associated with the inquiry process according to the literature, and the corresponding skill addressed in our research.
The evaluation of each stage did not always correspond to a single session. This was due to two reasons: firstly, the development of a stage within an activity may not be completed in just one session; for example, measurements may be taken over several sessions. Secondly, some skills were associated with more than one session. This factor needed to be reflected in the results, and for this purpose a rubric was developed that specifies different items within each skill. To develop these items, a new variable was included: the cognitive level, or depth of knowledge, required for each of the proposed actions. This evaluation approach is included within the PISA assessment framework (OCDE, 2016; Rosales Ortega et al., 2020) and derives from the revised Bloom’s taxonomy (Krathwohl, 2002). Its inclusion was useful for establishing different levels of achievement of the objectives beyond a simple yes-or-no dichotomy. Additionally, to make these levels easier to analyze, they were assigned scores. These scores were used to quantify the levels of scientific competence achieved, following a tool adapted from the one proposed by Ferrés-Gurt et al. (2015).
To make the cognitive demand level of each item more visible, a color scale was established to denote cognitive demand from lower to higher requirement. Each demand level was associated with a score. The scale used is shown in Figure 3.
Keeping this scale in mind, Tables S1 and S2 of the Supplementary Materials detail, for activities 1 and 2, the stages, skills, and adapted skills, together with each item and its score.
Table 1 and Table 2 detail the scores for each stage and skill for activities 1 and 2. The Supplementary Materials serve as a detailed evaluation rubric for each of the skills: by adding up the maximum possible scores, it is established that a student who reaches that total has attained the “inquirer” level. On this basis, a scale was established drawing on the specialized literature (Ferrés-Gurt et al., 2015).
Finally, by using a scoring system associated with the skills involved in each activity and stage, as well as with cognitive demand, we can establish a categorization of performance levels in inquiry practice for each stage. The performance levels on which this design is based were proposed by Ferrés-Gurt et al. (2015) and have been adapted in other works, such as the study by Crujeiras-Pérez and Cambeiro (2018). The established levels were as follows: non-participant, non-scientific (or unscientific), pre-scientific, emerging inquirer, uncertain inquirer (or hesitant inquirer), and inquirer.
To clarify the assessment process, the researchers designed a specific document (acting as a rubric), which can be found in the Supplementary Materials. Each section of the rubric is quantified with a maximum score, which corresponds to the highest level of development of that section. Based on it, the researchers simply had to score every section (following the detailed description of each item in the rubric). This allowed us to quantify the cognitive demands of activities 1 and 2 and, given the score thresholds (Figure 4), to classify the level of inquiry developed in each activity.
Figure 4 shows the scores required to achieve the defined performance levels.
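The scoring-and-classification logic described above (summing rubric item scores and mapping the total onto a performance level) can be sketched in a few lines of code. This is an illustrative sketch only: the level names come from the literature cited above (Ferrés-Gurt et al., 2015), but the numeric cut-offs are hypothetical placeholders, since the actual thresholds are those given in Figure 4.

```python
# Illustrative sketch of the rubric scoring logic.
# Level names follow Ferrés-Gurt et al. (2015); the numeric
# thresholds below are HYPOTHETICAL placeholders, not the
# actual cut-offs reported in Figure 4 of this study.

# Ordered (minimum fraction of the maximum score, level) pairs,
# from the most demanding level to the least.
HYPOTHETICAL_THRESHOLDS = [
    (0.85, "inquirer"),
    (0.65, "uncertain inquirer"),
    (0.45, "emerging inquirer"),
    (0.25, "pre-scientific"),
    (0.05, "non-scientific"),
    (0.00, "non-participant"),
]

def inquiry_level(total_score: float, max_score: float) -> str:
    """Map a group's total rubric score to a performance level."""
    fraction = total_score / max_score if max_score else 0.0
    for cutoff, level in HYPOTHETICAL_THRESHOLDS:
        if fraction >= cutoff:
            return level
    return "non-participant"

def activity_score(item_scores: dict[str, float]) -> float:
    """Sum the scores awarded to each rubric item of one activity."""
    return sum(item_scores.values())
```

A teacher-researcher would fill in one `item_scores` dictionary per group and activity, sum it with `activity_score`, and classify the result with `inquiry_level` against the real thresholds of Figure 4.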

3.3. Sample

The research was conducted with a group of 16 first-year high school students (8 female students) in the subject of Scientific Culture, through a situation arising from the needs of the school itself: purchasing a weather station (with a fixed budget of 250 euros) (activity 1) and finding a location to install it in the school (activity 2). Four groups of 4 students each were formed, and each group was tasked with proposing a weather station for purchase. The entire class had to decide and buy the most suitable station for the proposed needs. Subsequently, each group proposed a location, and all the groups had to choose the option they considered most appropriate according to the criteria they established.
The teaching staff acted as teacher-researchers, collecting the questions that students asked in each phase of the project and, as support, providing access to what was required. As Mosquera Bargiela et al. (2018) demonstrated, showing an attitude of support and providing resources to students so that they can answer in the school context encourages their scientific development.
The students started with a problem to solve (chosen by themselves) and had to design the procedure to obtain results and draw conclusions about this problem, which was an everyday one.

4. Results

The stages implemented in each activity can be seen in Figure 2. The first stage of activity 1 was dedicated to presenting the research project and identifying students’ prior knowledge about the main learning outcomes involved in this TLS (see Table 3), which served as the final design of the TLS.
Below (see Figure 5 and Figure 6), the results obtained in the practice of the inquiry are presented for each of the groups with which activity 1 and 2 of the TLS were implemented.

4.1. Activity 1: Choice of Weather Station

Stage 1: Research Proposal: Group 1 used scientific data to propose a hypothesis for choosing the station, stating that “the temperature range is between −10 °C and 50 °C, as temperatures here can range from −3 to 40 °C”. On the other hand, Group 4 presented a very thorough report on their station choice. This report included the group’s initial hypothesis, specifying the physical magnitudes measured by the station along with their units. However, they proposed a station with a temperature measurement range of 0 °C to 50 °C, without considering that this range was not suitable for the intended use of the station. Groups 2 and 3 demonstrated fewer skills at this stage of the activity. Group 2 did not mention specific magnitudes, ranges, or precisions, nor did they relate to the context of the location. For example, they used phrases like “Our station has components commonly found in weather stations, such as a rain gauge, a thermometer, etc.” and “It has a very low error tolerance, meaning high precision”. Finally, Group 3 proposed a station with temperature ranges between 30 and 80 °C, which did not meet the required criteria.
Stage 2: Research Planning: In the second stage, which involved identifying variables influencing the problem, Groups 1 and 4 again stood out. Group 4 organized their discussion by differentiating between physical and technological parameters. Group 3 received the lowest score, as they met only the items with lower cognitive demand and did not plan any sessions. They identified the physical variables of temperature, pressure, wind, and precipitation, but did not refer to any measurement characteristics (e.g., units, ranges, or precisions) and did not propose additional variables such as humidity or solar radiation. Group 2 performed better than Group 3 because they included these additional variables for the station.
Stage 4: Conclusions and Argumentation: Group 1 used arguments related to the physical and technological parameters of their station. Regarding technological ranges, they mentioned the allowed distance between devices (100 m), using these data as justification. They also used data related to the station’s power supply. However, where this group really excelled was in using physical magnitudes as arguments. They provided data on the station’s precision and ranges and related them to the population’s needs. They also identified a disadvantage of their station. This factor was documented in writing and included in the debate. Their station “measures a wind speed range of 0 to 50 km/h. In the environment, winds can exceed 45 km/h, so there may be days when the maximum is surpassed”. In the case of Group 2, they barely participated in the debate, presenting only the general characteristics of their station without using evidence beyond listing some features, and did not mention disadvantages or other groups, although they did listen to their peers. Group 3 also presented their station’s characteristics in a generic manner and countered other proposals only by saying that they believed “the other stations are more or less similar to ours”.
Stage 5: Communication of Results: In this stage, Groups 1 and 4 used their own photographs of the purchased station and edited them to identify each exposed part. The only difference was that Group 1 explained the functioning of most of the measurement instruments (all except the solar radiation meter). However, in this part of the activity, none of the groups referred to the measurement characteristics related to precision and range.
Stage 6: Reflection: The ability of each group to relate the results obtained to their role in the activity and the connection between the project and science and technology was evaluated. In this regard, Group 4 showed the best performance. This group described their role in the activity as “a challenge to face alone” and also related their work to that of scientists.

4.2. Activity 2: Choice of Weather Station Location

Stage 1: Research Proposal: Three of the four groups identified a scientific issue in the activity due to the data collection processes. They made statements such as “we need to take measurements and use the collected data” (Group 1); “it is necessary to collect and record data, otherwise we cannot justify anything later” (Group 2); “we need to take measurements to choose a location” (Group 3). Group 4 identified the scientific issue by relating it to the first activity of the project: “we need to use the information obtained in the first part of the project to continue with our research” (Group 4). Group 3 proposed a location where the station is exposed to both sun and shade throughout the day without providing any arguments for it. Group 1 argued before choosing the locations: “before thinking about the ideal site, we considered several factors: that there should be no buildings or obstructions; the sensors of the station should not fall under an area with sun/shade”; “the station should face north”. Group 2 conducted some preliminary tests to choose the location, which allowed them to evaluate the station they had chosen for activity 1 as incorrect.
Stage 2: Research Planning: In the second stage of activity 2, Groups 1, 2, and 4 showed very similar performance. Group 3 did not achieve the same level of performance, although the difference with the other groups decreased compared to activity 1. All four groups identified the two groups of variables (physical and technological).
Stage 3: Observation and Data Collection: In setting up the instrument, both Group 1 and Group 2 followed similar procedures. Group 2 evaluated locations that met technical requirements (“first we tested sites where the station would function”) and then took measurements (“later we will conduct tests to ensure the measurements are accurate”). Group 1 started with random connection tests and trial and error: “since it did not work in the computer room, we tried the adjacent room”. Group 4 conducted a connection test to establish the maximum distance between devices “to have a range of action”. Group 3 approached this part by making approximations. Group 2 analyzed the data and made decisions based on this analysis. For example, “we did tests in the courtyard and had a wind speed of 17 km/h and on the rooftop 6 km/h. We realized that the building created very strong wind currents”. Upon detecting such anomalies, they decided to consult “official meteorological data sources”. This group consulted data from the State Meteorological Agency. Group 1 received a similar score, as they did not manage to take systematic measurements but shared all the information they had with the other groups. Through this exchange of information, they validated their location. They also consulted meteorological data from nearby weather stations using a mobile app (Weather Underground). Group 4 identified errors in some of their tests but did not design a new experiment to address them and decided to directly “change the location because it did not measure well”, attributing the issues to technological connection factors. Finally, Group 3 detected errors in their tests but did not resolve them and did not attempt to validate their data.
Stage 4: Conclusions and Argumentation: In justifying their proposed choices, Groups 1, 2, and 4 supported their decisions using the data obtained from their measurements (Groups 1 and 2) or by describing in detail the characteristics of the location for accurate measurements (Group 4).
Stage 5: Communication of Results: The highest-scoring groups distinguished in their vocabulary between magnitudes, units of measurement, and measuring instruments.
Stage 6: Reflection: Groups 1, 2, and 4 demonstrated a greater level of achievement by relating the work to their daily lives. Group 1 stated that they learned about the origin of weather maps (“now we know where the weather map data comes from”), which suggests that this group found a connection between the activity and their everyday life. Group 4 related the activity to the procedural aspects of science, stating they “felt like scientists at times and euphoric when they finally managed to connect the station”. Group 2 related the activity to scientific practice, stating that “the whole activity was about constantly applying the way scientists work”. Finally, Group 3 did not refer to any such aspects in this part of the activity.
These results are shown in Figure 7 and Figure 8.
Lastly, the results obtained by each group across activities 1 and 2 of the TLS are presented; these data are shown in Figure 9.

5. Conclusions

This paper presents a design-based research study developed through the design, implementation, and evaluation of a teaching–learning sequence (TLS) based on a PBL methodology. The research methodology used (DBR) proved useful both for research development and for teaching practice (Guisasola et al., 2021). The results from the evaluation of the quality of the TLS, i.e., the evaluation of its design and implementation using the station, address the research objectives. Specific tools were designed for this evaluation, which ultimately allowed for establishing the scientific performance level achieved by the students participating in the research. The adaptation of these tools may be useful for evaluating other sequences in the educational field, both for teaching and for research practice.
The data obtained from the research confirm that the TLS promoted inquiry practice among students. Firstly, the inquiry-based teaching methodology developed skills and abilities related to scientific inquiry practice (Crujeiras-Pérez & Cambeiro, 2018). The use of a weather station as a generating topic helped students formulate hypotheses, identify variables, make observations, collect data, and interpret those data (Aguilar & Barroso, 2015), all in a context closely related to their immediate environment (Caamaño, 2018). To reach this conclusion, students’ performances in the different stages of the teaching sequence were analyzed. The choice of a weather station for purchase served as an introductory activity, allowing students to identify the variables involved in the problem, the characteristics of the measurements, and their relationship to the local climate. Choosing the location for the station, in turn, developed procedural practices, enabling students to create their own experimental designs for data collection and subsequent interpretation. Some difficulties among students were also identified: the sense of disorientation they experience at the beginning of such activities, which highlights the crucial role of the teacher’s guidance throughout the process (Sánchez-Azqueta et al., 2023), and the low performance in representing the experimental data obtained (Aguilar & Barroso, 2015). Despite these challenges, students demonstrated performance levels that allow their classification between “pre-scientific” and “uncertain inquirer”.
All students worked in an integrated manner in their groups and maintained motivation to achieve group objectives from the beginning to the end. It was observed that each of them contributed to the group based on their skills and previous experiences, respecting all opinions and ways of working.
This research work has allowed for evaluating the feasibility of a TLS for working on science in the classroom, offering an alternative to traditional teaching and providing tools to assess the effectiveness of the TLS in developing students’ scientific skills through PBL. The proposed PBL motivated the students and favored the development of the scientific skills analyzed. The topic addressed in this project was close to the students’ everyday lives and had direct applications, so they could see the usefulness of the project’s development. However, one point to consider for the future is the students’ interest in the project topic: in this project, the topic was chosen by the teacher (the purchase and placement of the weather station), but in future projects, to better meet students’ needs, the topic could be chosen by the students themselves.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/educsci15010053/s1, Table S1: Stages, skills, adaptive skills, and items established for activity 1. Table S2: Stages, skills, adaptive skills, and items established for activity 2.

Author Contributions

This research is a part of the Jorge Pozuelo Muñoz Doctoral Thesis. Conceptualization, J.P.-M., A.d.E.S. and E.C.S.; methodology, J.P.-M., A.d.E.S. and E.C.S.; formal analysis, J.P.-M., A.d.E.S. and E.C.S.; investigation, J.P.-M., A.d.E.S. and E.C.S.; resources, J.P.-M., A.d.E.S. and E.C.S.; writing—original draft preparation, J.P.-M., A.d.E.S. and E.C.S.; writing—review and editing, J.P.-M., A.d.E.S. and E.C.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The authors confirm that all participants in the present research provided appropriate informed consent in verbal form. We declare that the institution (and the country) in which the research was developed does not have an ethics committee; therefore, the authors requested the explicit consent of the students and their legal guardians.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Aguilar, S., & Barroso, J. (2015). La triangulación de datos como estrategia en investigación educativa. Pixel-Bit. Revista de Medios y Educación, 47, 73–88. [Google Scholar] [CrossRef]
  2. Akuma, F. V., & Callaghan, R. (2019). A systematic review characterizing and clarifying intrinsic teaching challenges linked to inquiry-based practical work. Journal of Research in Science Teaching, 56(5), 619–648. [Google Scholar] [CrossRef]
  3. Alghamdi, A. H., & Li, L. (2013). Adapting design-based research as a research methodology in educational settings. International Journal of Education and Research, 1(10), 1–12. [Google Scholar]
  4. Anazifa, R. D., & Djukri, D. (2017). Project-based learning and problem-based learning: Are they effective to improve student’s thinking skills? Jurnal Pendidikan IPA Indonesia, 6, 346–355. [Google Scholar] [CrossRef]
  5. Avargil, S., Herscovitz, O., & Dori, Y. J. (2012). Teaching thinking skills in context-based learning: Teacher’s challenges and assessment knowledge. Journal of Science Education and Technology, 21, 207–225. [Google Scholar] [CrossRef]
  6. Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14. [Google Scholar] [CrossRef]
  7. Bell, J. (2010). Doing your research project. McGraw-Hill. [Google Scholar]
  8. Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist, 39(4), 243–253. [Google Scholar] [CrossRef]
  9. Buty, C., Tiberghien, A., & Le Maréchal, J. (2004). Learning hypotheses and an associated tool to design and to analyse teaching–learning sequences. International Journal of Science Education, 26(5), 579–604. [Google Scholar] [CrossRef]
  10. Caamaño, A. (2018). Enseñar química en contexto. Un recorrido por los proyectos de química en contexto desde la década de los 80 hasta la actualidad. [Teaching chemistry in context. A tour of chemistry projects in context from the 1980s to the present]. Education Química, 29, 21–54. [Google Scholar] [CrossRef]
  11. Capraro, R., & Slough, S. (2013). Why PBL? Why STEM? Why now? An introduction to STEM project-based learning: An integrated science, technology, engineering, and mathematics (STEM) approach. In Project-based learning: An integrated science, technology, engineering and mathematics (STEM) approach (2nd ed., pp. 1–6). Sense. [Google Scholar]
  12. Chamizo, J., & Izquierdo, M. (2005). Ciencia en contexto: Una reflexión desde la filosofía. Alambique: Didáctica de las Ciencias Experimentales, 46, 9–17. [Google Scholar]
  13. CherryHolmes, C. H. (1992). (Re)clamación de pragmatismo para la educación. Revista de Educación, 297, 227–262. [Google Scholar]
  14. Chin, C., & Chia, L. (2004). Problem-based learning: Using students’ questions to drive knowledge construction. Science Education, 88, 707–727. [Google Scholar] [CrossRef]
  15. Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13. [Google Scholar] [CrossRef]
  16. Couso, D., Jiménez-Liso, M. R., Refojo, C., & Sacristán, J. A. (2020). Enseñando ciencia con Ciencia. FECYT and Fundación Lilly; Penguin Random House Grupo Editorial S.A.U. [Google Scholar]
  17. Crawford, B. A. (2007). Learning to teach science as inquiry in the rough and tumble of practice. Journal of Research in Science Teaching, 44(4), 613–642. [Google Scholar] [CrossRef]
  18. Crujeiras-Pérez, B., & Cambeiro, F. (2018). Una experiencia de indagación cooperativa para aprender ciencias en educación secundaria participando en las prácticas científicas. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 15(1), 1201. [Google Scholar] [CrossRef]
  19. Dewey, J. (1938). Experience and education. Macmillan. [Google Scholar]
  20. Dori, Y., Avargil, S., Kohen, Z., & Saar, L. (2018). Context-based learning and metacognitive prompts for enhancing scientific text comprehension. International Journal of Science Education, 40(10), 1198–1220. [Google Scholar] [CrossRef]
  21. Drake, K. N., & Long, D. (2009). Rebecca’s in the dark: A comparative study of problem-based learning and direct instruction/experiential learning in two 4th-grade classroom. Journal of Elementary Science Education, 21, 1–16. [Google Scholar] [CrossRef]
  22. Etherington, M. B. (2011). Investigative primary science: A problem-based learning approach. Australian Journal of Teacher Education (Online), 36, 53–74. [Google Scholar] [CrossRef]
  23. Ferrés-Gurt, C., Marbà-Tallada, A., & Sanmartí, N. (2015). Trabajos de indagación de los alumnos: Instrumentos de evaluación e identificación de dificultades. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 12(1), 22–37. [Google Scholar] [CrossRef]
  24. Fraser, B. J., & Walberg, H. J. (2005). Research on teacher-student relationships and learning environments: Context, retrospect and prospect. International Journal of Educational Research, 43, 103–109. [Google Scholar] [CrossRef]
  25. Furió-Más, C., Guisasola, J., Almudí, J., & Ceberio, M. (2003). Learning the electric field concept as oriented research activity. Science Education, 87, 640–662. [Google Scholar] [CrossRef]
  26. Gilbert, J. K. (2006). On the nature of “context” in chemical education. International Journal of Science Education, 28(9), 957–976. [Google Scholar] [CrossRef]
  27. Guisasola, J., Ametller, J., & Zuza, K. (2021). Investigación basada en el diseño de secuencias de enseñanza-aprendizaje: Una línea de investigación emergente en enseñanza de las ciencias. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 18(1), 1801. [Google Scholar] [CrossRef]
  28. Guisasola, J., Furió, C., & Zuza, K. (2008). Science education based on developing guided research. In M. V. Thomase (Ed.), Science education in focus (pp. 55–85). Nova Science Publisher. [Google Scholar]
  29. Hmelo-Silver, C. E. (2004). Problem-based learning: What and how do students learn? Educational Psychology Review, 16, 235–266. [Google Scholar] [CrossRef]
  30. Jiménez Aleixandre, M. P. (2011). 10 Ideas clave. Competencias en argumentación y uso de pruebas. Educatio Siglo XXI, 29(1), 363–366. [Google Scholar]
  31. Juuti, K., & Lavonen, J. (2006). Design-based research in science education: One step towards methodology. Nordic Studies in Science Education, 2(2), 54–68. [Google Scholar] [CrossRef]
  32. Kelly, G. J., & Duschl, R. A. (2002, April 1–5). Toward a research agenda for epistemological studies in science education. Annual Meeting of the National Association for Research in Science Teaching, New Orleans, LA, USA. [Google Scholar]
  33. King, D., & Ritchie, S. M. (2012). Learning science through real-world contexts. In Second international handbook of science education (pp. 69–79). Springer. [Google Scholar]
  34. Kolodner, J. L., Camp, P. J., Crismond, D., Fasse, B., Gray, J., Holbrook, J., Puntambekar, S., & Ryan, M. (2003). Problem-based learning meets case-based reasoning in the middle-school science classroom: Putting learning by design(tm) into practice. The Journal of the Learning Sciences, 12, 495–547. [Google Scholar] [CrossRef]
  35. Kortland, K., & Klaassen, K. (2010). Designing theory-based teaching-learning sequences for science education: Proceedings of the symposium in honour of Piet Lijnse at the time of his retirement as professor of physics didactics at Utrecht University. CDBeta Press. [Google Scholar]
  36. Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory into Practice, 41(4), 212–218. [Google Scholar] [CrossRef]
  37. Leach, J., & Scott, P. (2002). Designing and evaluating science teaching sequences: An approach drawing upon the concept of learning demand and a social constructivist perspective on learning. Studies in Science Education, 38(1), 115–142. [Google Scholar] [CrossRef]
  38. Lou, Y., Blanchard, P., & Kennedy, E. (2015). Development and validation of a science inquiry skills assessment. Journal of Geoscience Education, 63(1), 73–85. [Google Scholar] [CrossRef]
  39. Merritt, J., Lee, M. Y., Rillero, P., & Kinach, B. M. (2017). Problem-based learning in K-18 mathematics and science education: A literature review. Interdisciplinary Journal of Problem-Based Learning, 11(2), 3. [Google Scholar] [CrossRef]
  40. Méheut, M., & Psillos, D. (2004). Teaching–learning sequences: Aims and tools for science education research. International Journal of Science Education, 26(5), 515–535. [Google Scholar] [CrossRef]
  41. Mosquera Bargiela, I., Puig, B., & Blanco Anaya, P. (2018). Las prácticas científicas en infantil. Una aproximación al análisis del currículum y planes de formación del profesorado de Galicia. [Scientific practices in children. An approach to the analysis of the curriculum and teacher training plans in Galicia]. Enseñanza de las Ciencias, 36, 7–23. [Google Scholar]
  42. Nieveen, N. (2009). Formative evaluation in educational design research (T. Plomp, & N. Nieveen, Eds.; pp. 89–101). Enschede. [Google Scholar]
  43. OCDE. (2016). PISA 2015 assessment and analytical framework: Science, reading, mathematics and financial literacy. OECD Publishing. [Google Scholar]
  44. Pedaste, M., Mäeots, M., Siiman, L. A., de Jong, T., van Riesen, S. A. N., Kamp, E. T., Manoli, C. C., Zacharia, Z. C., & Tsourlidaki, E. (2015). Phases of inquiry-based learning: Definitions and the inquiry cycle. Educational Research Review, 14, 47–61. [Google Scholar] [CrossRef]
  45. Phillips, D. C. (2006). Assessing the quality of design research proposals: Some philosophical perspectives. In J. van de Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (pp. 93–97). Routledge. [Google Scholar]
  46. Pozuelo-Muñoz, J., Calvo-Zueco, E., Sánchez-Sánchez, E., & Cascarosa-Salillas, E. (2023). Science skills development through problem-based learning in secondary education. Education Sciences, 13, 1096. [Google Scholar] [CrossRef]
  47. Reeves, T. C. (2006). Design research from a technology perspective. In J. van de Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (pp. 52–66). Routledge. [Google Scholar]
  48. Rodrigues, A., & Mattos, C. (2011). Contexto, negociación y actividad en una clase de física. [Context, negotiation and activity in a physics class]. Enseñanza de las Ciencias, 29, 263–274. [Google Scholar]
  49. Romero Ariza, M., & Quesada, A. (2015). Is the science taught useful to explain daily phenomena? A qualitative study with pre-service teachers. In ICERI2015 Proceedings (pp. 2150–2156). IATED Academy. [Google Scholar]
  50. Rosales Ortega, E. M., Rodríguez Ortega, P. G., & Romero Ariza, M. (2020). Conocimiento, demanda cognitiva y contexto en la evaluación de la alfabetización científica en PISA. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 17(2), 2302. [Google Scholar] [CrossRef]
  51. Runco, M. A., & Okuda, S. M. (1988). Problem Discovery, divergent thinking and the creative process. Journal of Youth and Adolescence, 17, 211–220. [Google Scholar] [CrossRef]
  52. Sanmartí, N., & Márquez, C. (2017). Aprendizaje de las ciencias basado en proyectos: Del contexto a la acción. Ápice Revista de Educación Científica, 1(1), 3–16. [Google Scholar] [CrossRef]
  53. Savall Alemany, F., Doménech, J., Guisasola, J., & Martínez Torregrosa, J. (2016). Identifying student and teacher difficulties in interpreting atomic spectra using a quantum model of emission and absorption of radiation. Physical Review Physics Education Research, 12, 010132. [Google Scholar] [CrossRef]
  54. Savall Alemany, F., Guisasola Aranzábal, J., Rosa Cintas, S., & Martínez-Torregrosa, J. (2019). Problem-based structure for a teaching-learning sequence to overcome students’ difficulties when learning about atomic spectra. Physical Review Physics Education Research, 15, 020138. [Google Scholar] [CrossRef]
  55. Sánchez-Azqueta, C., Cascarosa, E., Celma, S., Gimeno, C., & Aldea, C. (2019). Application of a flipped classroom for model-based learning in electronics. International Journal of Engineering Education, 35(3), 938–946. [Google Scholar]
  56. Sánchez-Azqueta, C., Cascarosa, E., Celma, S., Gimeno, C., & Aldea, C. (2023). Quick response codes as a complement for the teaching of Electronics in laboratory activities. International Journal of Electrical Engineering & Education, 60(2), 153–167. [Google Scholar]
  57. Solbes, J., Montserrat, R., & Furió, C. (2007). El desinterés del alumnado hacia el aprendizaje de la ciencia: Implicaciones en su enseñanza. [The disinterest of students towards learning science: Implications in their teaching]. Didáctica de las Ciencias Experimentales y Sociales, 21, 91–117. [Google Scholar]
  58. Swarat, S., Ortony, A., & Revelle, W. (2012). Activity matters: Understanding student interest in school science. Journal of Research in Science Teaching, 49, 515–537. [Google Scholar] [CrossRef]
  59. Tamir, D. P., Nussinovitz, R., & Friedler, Y. (1982). The design and use of a practical tests assessment inventory. Journal of Biological Education, 16(1), 42–50. [Google Scholar] [CrossRef]
  60. Ültay, N., & Çalik, M. (2012). A thematic review of studies into the effectiveness of context-based chemistry curricula. Journal of Science Education and Technology, 21, 686–701. [Google Scholar] [CrossRef]
  61. Wang, H. A., Thompson, P., & Shuler, C. F. (1998). Essential components of problem-based learning for the K-12 inquiry science instruction. CCMB. [Google Scholar]
Figure 1. Summary of the stages of DBR.
Figure 2. Adaptation of the stages of inquiry according to Pedaste et al. (2015) to this research, including the skills associated with each stage and their development in the TLS.
Figure 3. Color scale used for cognitive demand.
Figure 4. Score to achieve each level of inquiry for activities 1 and 2.
Figure 5. Scores obtained by groups and stages in activity 1.
Figure 6. Proportional representation of the performance achieved by each group in activity 1 (Stage 3 has been eliminated as it is not necessary in this activity).
Figure 7. Scores obtained by groups and stages in activity 2.
Figure 8. Proportional representation of the performance achieved by each group in activity 2.
Figure 9. Proportional representation of the performance achieved by each group in both activities attending to the colors in Figure 3.
Table 1. Distribution of scores by stage and skill for activity 1, and the maximum cognitive demand to be achieved in each activity. Columns 0–6 give the points assigned at each cognitive demand level; empty cells indicate levels not assessed for that skill.

Activity 1: Choosing a Weather Station

| Stage | Skill | 0 | 1 | 2 | 3 | 4 | 5 | 6 | Skill total | Stage total |
|---|---|---|---|---|---|---|---|---|---|---|
| E1: Research approach | Identification of researchable problems | 0 | 1 | 2 |  |  |  |  | 3 | 28 |
|  | Hypothesis formulation | 0 | 1 | 2 | 3 | 4 | 5 |  | 15 |  |
|  | Information search | 0 | 1 | 2 | 3 | 4 |  |  | 10 |  |
| E2: Research planning | Identification of variables: recognize types | 0 | 1 | 2 | 3 |  |  |  | 6 | 55 |
|  | Identification of variables: technological | 0 | 1 | 2 | 3 | 4 |  |  | 10 |  |
|  | Identification of variables: physics | 0 | 1 | 2 | 3 | 8 | 5 |  | 19 |  |
|  | Research planning: long term | 0 | 1 | 2 | 3 | 4 |  |  | 10 |  |
|  | Research planning: each session | 0 | 1 | 2 | 3 | 4 |  |  | 10 |  |
| E3: Data | Observation and data collection | 0 |  |  |  |  |  |  | 0 | 0 |
|  | Interpretation of results | 0 |  |  |  |  |  |  | 0 |  |
| E4: Conclusions | Conclusion and argument: reference to evidence | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 21 | 63 |
|  | Conclusion and argument: disadvantages | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 21 |  |
|  | Conclusion and argument: opposite position | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 21 |  |
| E5: Communication of results | Results presentation: clarity | 0 | 1 |  |  |  |  |  | 1 | 31 |
|  | Results presentation: graphics and images | 0 | 1 | 2 | 3 | 4 | 5 |  | 15 |  |
|  | Results presentation: language | 0 | 1 | 2 | 3 | 4 | 5 |  | 15 |  |
| E6: Reflection | Reflection: science and technology | 0 | 1 | 2 | 3 |  |  |  | 6 | 12 |
|  | Reflection: self-assessment | 0 | 1 | 2 | 3 |  |  |  | 6 |  |
| Total |  | 0 | 16 | 30 | 42 | 48 | 35 | 18 |  | 189 |
Table 2. Distribution of scores by stage and skill for activity 2, and the maximum cognitive demand to be achieved in each activity. Columns 0–6 give the points assigned at each cognitive demand level; empty cells indicate levels not assessed for that skill.

Activity 2

| Stage | Skill | 0 | 1 | 2 | 3 | 4 | 5 | 6 | Skill total | Stage total |
|---|---|---|---|---|---|---|---|---|---|---|
| E1: Research approach | Problem identification | 0 | 1 | 2 |  |  |  |  | 3 | 28 |
|  | Hypothesis formulation | 0 | 1 | 2 | 3 | 4 | 5 |  | 15 |  |
|  | Information search | 0 | 1 | 2 | 3 | 4 |  |  | 10 |  |
| E2: Research planning | Variable identification: recognize types | 0 | 1 | 2 |  |  |  |  | 3 | 49 |
|  | Variable identification: technological | 0 | 1 | 4 |  |  |  |  | 5 |  |
|  | Variable identification: physics | 0 | 1 | 2 | 6 | 12 |  |  | 21 |  |
|  | Research planning: long term | 0 | 1 | 2 | 3 | 4 |  |  | 10 |  |
|  | Research planning: each session | 0 | 1 | 2 | 3 | 4 |  |  | 10 |  |
| E3: Data | Observation and data collection: functioning | 0 | 1 | 2 | 3 | 4 | 5 |  | 15 | 101 |
|  | Observation and data collection: instruments | 0 | 1 | 2 | 3 | 4 | 12 | 6 | 28 |  |
|  | Observation and data collection: location |  | 1 | 2 | 3 | 4 | 6 | 6 | 22 |  |
|  | Interpretation of results: about data |  | 1 | 2 | 3 | 4 | 5 | 6 | 21 |  |
|  | Interpretation of results: validation | 0 |  |  |  | 4 | 5 | 6 | 15 |  |
| E4: Conclusions | Conclusion (argumentation): reference to evidence | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 21 | 63 |
|  | Conclusion (argumentation): disadvantages | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 21 |  |
|  | Conclusion (argumentation): opposite position | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 21 |  |
| E5: Communication of results | Results presentation: clarity | 0 | 1 |  |  |  |  |  | 1 | 31 |
|  | Results presentation: graphics and images | 0 | 1 | 2 | 3 | 4 | 5 |  | 15 |  |
|  | Results presentation: language | 0 | 1 | 2 | 3 | 4 | 5 |  | 15 |  |
| E6: Reflection | Reflection: science and technology | 0 | 1 | 2 | 3 |  |  |  | 6 | 12 |
|  | Reflection: self-assessment | 0 | 1 | 2 | 3 |  |  |  | 6 |  |
| Total |  | 0 | 20 | 40 | 51 | 68 | 63 | 42 |  | 284 |
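The scoring tools above assign each group points per skill and stage, with activity maximums of 189 (activity 1) and 284 (activity 2), and the resulting totals are mapped to performance levels such as “pre-scientific” and “uncertain inquirer”. A minimal sketch of that aggregation step is shown below; it is purely illustrative, with made-up stage scores, and the 40% threshold is a hypothetical placeholder rather than the actual cut-off defined in Figure 4.

```python
# Hypothetical sketch of how the scoring tools in Tables 1 and 2 could be
# applied: sum a group's points across the six stages and compare the total
# against the activity maximum. The threshold is an illustrative placeholder.

def total_score(stage_scores):
    """Sum the points a group earned across the six inquiry stages."""
    return sum(stage_scores.values())

def inquiry_level(total, max_total, threshold=0.40):
    """Map a score fraction to one of the two performance labels
    reported in the study."""
    return "uncertain inquirer" if total / max_total >= threshold else "pre-scientific"

# Example with made-up stage scores for activity 2 (maximum 284 points):
scores = {"E1": 20, "E2": 35, "E3": 60, "E4": 40, "E5": 20, "E6": 8}
print(inquiry_level(total_score(scores), 284))  # → uncertain inquirer
```

Keeping the aggregation and the level mapping separate mirrors how the tools are described in the text: the rubric produces a score, and the score is then interpreted against level thresholds.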
Table 3. Previous ideas of the students participating in the TLS.

| Presented idea | Associated data or phrase |
|---|---|
| **Related to scientific practice and inquiry** |  |
| They understand the activity as buying a weather station online and then placing it in the school | “So we looked online for a station to buy and then we put it in the school.” |
| They justify the scientific component because it is conducted in the subject of Scientific Culture | “It’s a scientific project because otherwise, we wouldn’t do it in the Scientific Culture subject.” |
| They justify the scientific component because a weather station is purchased, an instrument associated with measuring atmospheric phenomena | “It’s a meteorological project because it’s used to measure the weather.” |
| They do not see the usefulness of the weather station, although they do recognize the importance of knowing the weather | “And why do we need a weather station if in the end, the mobile phone tells you the weather forecast?” |
| **Related to physical and meteorological content linked to the project** |  |
| Magnitudes: they do not know the meteorological magnitudes measured by a weather station | “All weather stations will be more or less the same.” “It will measure heat, cold, wind, storms, lightning, and all that.” “Lightning cannot be predicted or measured.” |
| Magnitudes: the only magnitudes clearly mentioned are temperature and wind | “It won’t measure heat and cold, just temperature.” |
| Magnitudes: the weather station is not exactly understood as a measuring instrument but rather as a weather forecasting tool | “The station will be used to know if the weather will be good or if there will be a storm.” |
| Measurements: only the thermometer and the weather vane are named as meteorological instruments; the units of measurement are known, but not the name of the instrument that measures wind speed | “Temperature is measured with a thermometer and wind with a weather vane.” “Temperature is measured in degrees or Kelvin degrees.” “Wind speed is measured in km/h.” |
| Measurements: regarding characteristics, it is only mentioned that there should be no errors; how or what needs to be considered for this is not specified | “It is important that the station measures accurately”… “That means the station should not have measurement errors.” |
| Location: two characteristics are mentioned: there should be no obstacles that interfere with the wind, and it should be placed in the sun (without justification) | “It should not be placed behind a wall, for example, because then the weather vane won’t move.” “And it should also be in the sun” (the reason is not provided). |
| **Related to technological implications and project aspects** |  |
| They believe that all weather stations are the same or very similar in characteristics | “All weather stations will be more or less the same; there won’t be a difference between the ones each group chooses.” |
| They are unaware of the station’s operation regarding its connection formats, parts, etc. | “But the station runs on batteries.” “And we need to check it to know what it is measuring.” |
