Article

German and French Students’ Strategies While Performing Geographical Comparisons in a Group Task Setting

Institute of Geography Education, University of Cologne, Gronewaldstraße 2, 50931 Cologne, Germany
*
Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(8), 849; https://doi.org/10.3390/educsci13080849
Submission received: 11 July 2023 / Revised: 14 August 2023 / Accepted: 18 August 2023 / Published: 20 August 2023

Abstract

Today’s challenges, such as climate change, require developing geographical literacy, which includes discussion and argumentation around scientific results. One important geographical method and competency is comparison. However, learning geographical methods such as comparison can be challenging for students who rarely solve open tasks without simple answers. In this study, we analysed group discussions that took place during an intervention aiming to develop comparison competency with 44 German and French students from the experimental group. Using the documentary method, we reconstructed the students’ main orientations and the strategies they used to solve the open comparison tasks. We related the implementation of the comparison method during group discussions to students’ individual progress during the intervention and explored differences between French and German students. Results show that students’ main task completion orientation was challenged by their uncertainty towards the comparison task. Groups developed strategies to solve the task, showing competency acquisition processes in a few cases. Only a few differences were found between German and French students. Overall, implementing scientific literacy requires a shift in school task culture towards more open tasks that aim to enhance geographical competencies and argumentation.

1. Introduction

In a rapidly changing world, facing important challenges such as climate change or the consequences of the COVID-19 pandemic calls for more science education (in both the natural and social sciences). There is a need to form “scientifically informed” students able to act and exert their agency to build a more sustainable future (e.g., [1], pp. 5–6). Scientific literacy, i.e., one’s ability to understand scientific issues and how science is produced while thinking critically about it [2] (p. 16), is therefore a fundamental competency for tackling future challenges and has long been identified as such [3,4]. Recent analyses emphasize the collective dimension of scientific literacy and of agency [1] (p. 12), i.e., responses to the aforementioned societal challenges are to be discussed and found collectively. Educational contexts in general, and geography education in particular, are therefore pivotal to enhancing scientific literacy and discussion around intrinsically geographical challenges, and they require a deeper understanding of geographical methods.
One of the most important research methods in the natural and social sciences is comparison [5], which is, for example, used by geographers to interpret and understand similarities and differences between geographical places. Mastering comparison is essential to understand “big” geographical questions [6] (p. 307). It can help understand, for example, how climate change affects geographical places differently depending on different criteria (for example, different vulnerabilities of natural or urban systems [7]) and how different reactions or measures (for example, a framework for urban climate resilience [8]) can have different outcomes depending on the studied places. To develop scientific (geographical) literacy and help students understand complex challenges, enhancing key geographical methods, such as comparison, is necessary. While learning these methods, students have to understand that argumentation and reasoning are core scientific features [4] (p. 2), which means that educators have to integrate debate and discussion about scientific methods into common classroom activity. However, this can be difficult. First, on the one hand, there is school and “task” culture and, on the other hand, there are students’ patterns of action, which often lead them to look for only one “right” answer, leaving little room for uncertainty or collective debate [9]. Second, since school and academic geographies often diverge [10,11], it becomes difficult to teach geographical thinking productively [12].
This article is part of a mixed-method study [13] implementing an embedded design. In the quantitative parent study, we conducted a quasi-experimental intervention with 83 German and French students (44 students in the experimental group, 39 in the control group). We tested their comparison skills and used an educational tool to enhance students’ comparison competency [14]. Results showed that although students mastered very low levels of comparison competency in the pre-test, it was possible to enhance it via the explicit teaching of comparison as a scientific method. This article focusses on the embedded qualitative study that took place during the intervention. We analysed qualitative data obtained during group discussions within the experimental group (44 students) to better understand our quantitative results and explain the comparison competency acquisition processes. Our research questions were:
(1) Which action-guiding orientations of student groups can be reconstructed while they perform open comparison tasks?
(2) What strategies do German and French students adopt to solve argumentative and collaborative comparison tasks, and how do these strategies relate to individual performance regarding comparison competency during the intervention?
(3) To what extent can we identify different action-guiding orientations or strategies between French and German students?
The article continues with a description of our theoretical basis (Section 2) and of our embedded mixed-method approach, which aims to explain our quantitative results via qualitative analysis (Section 3). We then present the results of our analysis (Section 4) and, finally, discuss the implications for the design of comparison tasks in geography education (Section 5).

2. Theoretical Background

2.1. Comparison: A Method and a Competency

To compare is the cognitive process of juxtaposing comparison units (for example, regions or urban areas) along comparison variables (for example, access to resources or population density) to identify similarities and/or differences [15] (p. 6). Comparison is a widely used research method. It is used in the natural and social sciences to derive rules from particular examples in theory-oriented approaches [16] (p. 691), or to understand fine complexities and variations between local or specific examples in a more idiographic way [5,17,18]. In geography, the comparison of places and the explanations derived from it are at the core of the discipline: for example, Cutter et al. [6] (p. 307) identified the question “What Makes Places and Landscapes Different from One Another and Why Is This Important?” as the first “big question” there is for geographers to investigate. Morgan [19] (p. 275) also defined geographical thinking as the “trained capacity to construct a mental map to see patterns, recognise relationships, to see movement, to take that map and ‘clothe it in meaning’.” In this definition, identifying patterns means comparing geographical spaces or places. While comparing geographically, many scientific and methodological practices are possible, such as selecting a large number of comparison criteria or variables to identify types or patterns (for example, to study the urban growth of global cities [20]), or exploring, qualitatively, the local specificities of a common characteristic (for example, to analyse gentrification processes [21]). Comparison objectives are also much discussed among geographers around the question of whether it is possible to generalize and derive rules from examples in a nomothetic way (e.g., [22]). Therefore, comparison is not an easy or ready-made geographical method, because it requires careful decisions and the ability to argue not only for one’s own choices when selecting comparison elements, but also for the comparison process followed and the results obtained. As a consequence, argumentation and reflection are at the core of the comparison method. Designing meaningful comparison tasks for geography education that promote the acquisition of geographical methods, such as comparison, therefore means guiding students to argue for and reflect on their choices.
Wilcke and Budke [23] (p. 7) developed a model for the comparison method in geography education, which highlights these necessary elements, such as argumentation, and describes comparison as a process following different steps. In the first step, students formulate a specific question to solve with the help of the comparison (for example, to investigate the reasons why human migrations have changed over time). In the second step, they choose the comparison units, such as past and recent migration waves. In the third step, students select comparison variables, such as political or economic factors, and then, in the fourth step, juxtapose these units along the variables in order to identify similarities and/or differences. Finally, students weigh the different variables and formulate an answer to their question. In this process, each step must be carefully reflected upon. Performing a comparison is therefore a complex competency that can be divided into four dimensions (see Figure 1). Comparison competency requires, first, the ability to organize and implement comparison processes sustained by argumentation (First and Second dimensions of comparison competency: “Planning and implementation of comparison processes”, and “Reflection and argumentative justification of comparison processes”, see Figure 1). Along with these methodological components, there are also content-related dimensions of comparison competency (Third dimension, “Acknowledgement and analysis of interrelations between geographical information,” and Fourth dimension, “Achievement of comparison objectives”, see Figure 1).
To evaluate how far comparison tasks allow students to develop comparison competency, we conducted a textbook analysis showing that French, German and English textbook tasks often focus on the content-related dimensions of comparison competency (Dimension 3: “Acknowledgement and analysis of interactions between geographical information”, and Dimension 4: “Achievement of comparison objectives”, see Figure 1). Most tasks do not allow students to autonomously plan and reflect on comparison processes [24,25]. Our evaluation of the performance of university students and of German and French secondary students before they received training also showed that they had very low levels of comparison competency in the second dimension (Dimension 2: “Reflection and argumentative justification of comparison processes”, see Figure 1), since they only rarely resorted to argumentation to support their results or justify their choice of comparison elements. Overall, although argumentation and reflection are central to comparison processes, those were precisely the skills students in our two former studies lacked while answering comparison tasks [14,26]. This highlights the difficulties for students in overcoming the gap between school and academic geographies [10,11] and the resulting necessity for educators to find tools to help students learn geographical thinking and skills [12].

2.2. Group Discussions to Enhance Comparison Competency Development

Since students have difficulties in the argumentative dimension of comparison competency, there is a need for educational tools and task settings that can enhance the development of comparison competency, and of this dimension in particular. Different authors have already called for more research on geographical skill development [27] and on “effective geography teaching” [28] (p. 8). However, there are still few intervention studies in geography education [29], and no intervention to date has integrated the teaching of the comparison method via group work settings in geography education research [29]. Interventions often focus on other competencies such as system or spatial thinking, e.g., [30,31]. In our parent study, the first, quantitative part of the project, we used the comparison method as a scaffold during the individual learning phases of the intervention and during group discussions (see Materials and Methods for a description of the overarching project). Our results showed that students improved their comparison competency significantly between the pre- and post-test compared to the control group, and that the use of the comparison method [23] during the individual work phase of the intervention was positively correlated with students’ progress between the two tests [14]. However, we do not know to what extent group discussions helped enhance comparison competency; thus, this study focusses on this specific part of the intervention and qualitatively analyses group discussions to evaluate their contribution to comparison competency development.
While in class, and also during group work, students act according to certain action-guiding orientations [32,33,34]. Action-guiding orientations are internalised patterns of perception, thought and action [34], modi operandi that guide practical action [32,33] and lead students to adopt certain strategies. One important student action-guiding orientation is the “task completion orientation”, which corresponds to their “student job” oriented towards the delivery of a work product [34,35,36]. Luhmann [9] (pp. 77 ff.) has described this usual classroom situation in his concept of “trivialization”, which exists in schools because the knowledge to be acquired is pre-defined and a distinction is usually made between false and correct answers. However, our intervention and, specifically, the group work phase were designed to allow for very different answers to the initial task and for a variety of comparisons (as is also the case in scientific comparisons). Therefore, they did not correspond to the usual and reassuring setting of a task leading to only one “correct” answer, which is often the case in secondary [34] and geographical education, and which is also a problem in other natural sciences [4]. Since no other intervention has tested a comparison task allowing for multiple answers, we do not know how students dealt with this “new” situation in which they had to compare their answers and find a solution and, thus, whether their possible “task completion orientation” was challenged by the task. Jiménez–Aleixandre et al. [37] also showed that competing cultures, oriented either to science or to school, could be seen in group work settings, with students sometimes “doing science” and sometimes “doing the lesson”. Thus, our study aims to explore which action-guiding orientations students were more drawn to while answering the group task, to better understand how these orientations could influence groups’ strategies while solving the task.
Group discussions, or having to justify one’s own results in a group setting and debating to find a common answer, can be a way to develop argumentative and reasoning skills [4,38,39]. Argumentation is necessary for one’s own decisions and reflections during the comparison process and the selection of comparison elements, such as comparison units or variables. But it is also necessary so that students can justify their results in front of other students in a group setting and in a scientific context [4]. Osborne et al. [4] showed that student argumentation was enhanced when students were presented with alternative ideas. Interaction and small-group work also lead to better outcomes in developing argumentation and scientific reasoning skills than individual learning [40,41,42]. However, tasks involving group discussions without scaffolds that guide students to develop arguments lead to very little successful argumentation [43]. Other research findings suggest that explicit prompts that encourage reasoning have positive effects on students’ argumentation [4,44]. Although we used the comparison method as a scaffold during the whole intervention, we do not know to what extent students integrated it into their strategies to solve the task and, therefore, collectively developed their comparison competency. Students engaged in group work can adopt very varied strategies to arrive at a common answer. Albe [45] described some of them: discussion, voting, collaborative argumentation, role playing, and imposition of authority or acceptance of other arguments. Such group work also implies the adoption of specific roles among students, who can act as “leaders” or “helpers” [45] (p. 84). This study aims to clarify which strategies were used by the groups, and which of them allowed the groups to develop comparison competency.
Overall, this study aims to qualitatively analyse students’ action-guiding orientations and strategies during the group discussions in relation to their comparison competency acquisition [46] (p. 235), and to identify possible differences between French and German students while using the reconstructive approach of the documentary method [47].

3. Materials and Methods

This qualitative study took place during a quasi-experimental intervention in an embedded mixed-method design [13].

3.1. Previous Study Research Design

We recruited 83 students aged 16 to 18 from two secondary schools, with two classes in Germany (“11.Klasse”) and two classes in France (“classe de Terminale”), who constituted the experimental group of 44 students (29 French, 15 German) and the control group of 39 students (31 French, 8 German). Students from both groups took a pre- and a post-test just before and after the intervention, which took place between October and December 2021 in both countries. Both tests allowed us to assess students’ comparison competency in all dimensions of the competency model (see [14,26] for more details on the assessment; see Figure 1), and they showed that students only mastered low competency levels at the beginning of the intervention. At the end of the intervention, results revealed significant progress in comparison competency in the experimental group compared to the control group. We could also positively correlate the use of the comparison method during the intervention with the difference between post- and pre-tests: students who had used the comparison method during the intervention were those who progressed most between the tests [14].
Students from the experimental group were taught an intervention course over six classes of 45 min each (see Figure 2) based on a digital learning unit available as an OER (Open Educational Resource) (The digital learning unit is available in German: https://www.ilias.uni-koeln.de/ilias/goto_uk_lm_4325913.html (accessed on 19 August 2023). In French: https://www.ilias.uni-koeln.de/ilias/goto_uk_lm_4391846.html (accessed on 19 August 2023). In English: https://www.ilias.uni-koeln.de/ilias/goto_uk_lm_4911773.html (accessed on 19 August 2023)). All data used in the learning unit were developed with scientists from the Collaborative Research Center “CRC-806”, which worked on migration routes of Homo sapiens from Africa to Europe and whose data were adapted to be taught in high schools (Scientists (archaeologists, geographers, climate scientists, anthropologists) from the CRC-806 “Our way to Europe” analysed factors, obstacles and possible routes for human dispersal from Africa to Europe. Our institute participated by adapting scientific results into school material and conducting educational research. See https://www.sfb806.uni-koeln.de/ (accessed on 17 August 2023) for more information). An overview of the intervention can be found in Figure 2.
In the first phase of the intervention, students learned the comparison elements (variables, units) and the different comparison steps as described by Wilcke and Budke [23], which were adapted as a teaching tool for the intervention (see Figure 3).
In the second phase of the intervention, students had to carry out a comparison. To complete the task, students were guided through the steps of the comparison method used as a scaffold: the main subject was migration, and the main question of the digital learning unit was how similar or different migrations are over time. Students were given comparison elements such as comparison units (recent human migration compared to the migration of Homo sapiens). However, they decided autonomously which specific question or variables they wanted to investigate, which was different from the usual comparison tasks found in textbooks in both countries [24,25]. For example, students could choose between investigating the reasons for migration or the routes taken by migrants, and had many different data sources available to choose from to answer their question.

3.2. Description of the Third Phase: Group Discussions (Focus of This Study)

The third phase of the intervention, which is the core of this present qualitative study, was taught in a 45 min class and involved two phases of group discussions. Figure 4 provides an overview of the educational method used during the class.
Since students had previously worked on different questions around migration and had analysed different variables or data, such as a video or a map, they could have slightly different answers at the start of the class to the question they had chosen to study (end of phase 2, see Figure 4). Therefore, in phase 3.1 (see Figure 4), they were sorted into groups, question by question, with the task to compare their answers and then come to a common answer while reflecting on their previous comparison choices from phase 2. In phase 3.2 (see Figure 4), having come to a common answer, students were regrouped with students who had worked on different questions. They had to answer the overall question “Are migration movements in the past and today similar or different?” To do this, they had to contrast and compare their previous answers and defend their conclusions in order to come up with a common answer (Figure 5 provides an example of a task sheet from phase 3.2).
At the end of the discussion, groups were asked to produce a poster summarizing their results (see Figure 5). They were allotted 15 min to solve each task in both phases. Tasks were formulated very openly and did not provide guidance on the method to come to an answer, which allowed us to analyse the groups’ strategies and intentions in solving the task. Both phases included reflection on comparison in two ways: first, because during phase 3.1 students had to reflect on the comparison of migration processes, and second, because during both phases they had to compare their own answers to come to a result. Researchers and classroom teachers were at the students’ disposal but only joined the groups when students asked for help or signalled that they had completed the task. All discussions were audio-recorded and anonymised. Posters produced by the groups during phase 3.1 were also collected. Students, parents and school staff were all informed of the research methods and objectives of our study, and all consented.

3.3. Data Analysis

A total of 12 groups (4 in Germany, 8 in France) participated in each phase of the discussions. However, in each phase (3.1 and 3.2), two groups of French students decided not to deliver a recording, and two other groups delivered only very short recordings of their results without the discussion leading to them. None of the French groups delivered a poster, although students wrote on their task sheets.
The obtained discussions were transcribed using MAXQDA and then analysed qualitatively. Our qualitative analysis was carried out in two stages: firstly, an analysis of the orientations and strategies based on the documentary method, and secondly, an analysis of the groups’ implementation of the comparison method. In the first qualitative analysis, we used the documentary method [47] to better understand the groups’ action-guiding orientations, strategies and competency acquisition during the discussions. The documentary method is used in social science to study group discussions or interviews [48], but also in education research [49]. This method allows us to reconstruct the groups’ implicit collective knowledge and action-guiding orientations [32,33]. First, it involves analysing the content and meaning of what is said or achieved during the group discussion. This stage is known as formulating interpretation and enables the thematic structure of the document to be identified. The second stage of the documentary method is called reflecting interpretation. This examines how the content is formulated and discussed within the group, and thus analyses the organisation of the discourse and the interaction. It reveals the orientations guiding the actions of the members of the group, but also the extent to which the members share these collective orientations. The first author analysed all discussions, reconstructing central action-guiding orientations and deriving similarities and differences between the groups’ strategies. Strategies used to compare the group members’ answers were also analysed in relation to the comparison method learnt during the previous phases of the intervention to check for the integration of this method [46]. After a first validation with the second author, several researchers speaking German and/or French were asked to validate the interpretations in three sessions. During these sessions, the researchers analysed the same examples of group discussions, which allowed us to refine and validate the analysis.
The thematic structure of the group discussions was very much constrained by the comparison task that the students had to solve. To relate the comparison strategies used by the groups to the individual achievements of their members at the end of the intervention, we analysed, more specifically, student discussions from phase 3.1 in a second step of the analysis. Discussions were coded deductively using content analysis [50] to check the use of the comparison method steps (see Figure 3) as a strategy to solve the task, since this was also explicitly formulated in the task sheet. To conduct this, we used the same assessment of comparison steps as in the quantitative study (see Table 1, [14]), which allowed groups to obtain a maximum of 10 points. This was then analysed quantitatively to see the distribution of the groups’ results and their assimilation of the comparison method. The difference between German and French groups was tested for significance using a t-test.
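As an illustration of this group-level analysis, the following Python sketch (not the authors’ code) shows how such a comparison of assessment scores could be computed; the per-group scores are invented placeholders consistent with the 10-point rubric, not the study’s data.

```python
# Minimal sketch, assuming invented placeholder scores (max 10 points per group):
# compare German and French group means with an independent-samples t-test.
from scipy import stats

german_scores = [7, 8, 7, 6]        # hypothetical points of the 4 German groups
french_scores = [7, 7, 6, 6, 5, 3]  # hypothetical points of the 6 French groups

t_stat, p_value = stats.ttest_ind(german_scores, french_scores, equal_var=True)
df = len(german_scores) + len(french_scores) - 2
print(f"t({df}) = {t_stat:.3f}, p = {p_value:.3f}")
```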
Spearman’s ρ was calculated to analyse how being in a group that used the comparison method during phase 3.1 of the discussion correlated with students’ individual improvement between the pre- and post-tests assessing comparison competency. To conduct this, we used results from the pre- and post-test from the quantitative phase of the project. These tests allowed us to assess, with an open comparison task, students’ comparison competency using our already validated assessment tool (see [14,26], Table 2).
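The correlation step can be sketched in the same way; the paired values below are hypothetical and only illustrate how a group-level predictor (points for using the comparison method) can be related to individual pre-/post-test gains with Spearman’s ρ.

```python
# Minimal sketch with invented data: each student inherits the points of their
# group from phase 3.1; improvement is the post-test minus the pre-test score.
from scipy import stats

group_method_points    = [7, 7, 7, 8, 8, 3, 3, 6, 6, 6]
individual_improvement = [3, 4, 2, 5, 4, -1, 0, 2, 3, 1]

rho, p_value = stats.spearmanr(group_method_points, individual_improvement)
print(f"Spearman's rho = {rho:.3f}, p = {p_value:.3f}")
```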
Groups’ comparative strategies during phase 3.2 of the group discussions were reconstructed using the documentary method. To complete the analysis, Fisher’s exact test was calculated to determine whether there was a significant association between the use of a specific strategy to come to a result within the groups and students’ individual progress between the pre- and the post-test of the intervention.
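The association test can likewise be sketched as follows. The 2 × 2 table and its counts are hypothetical; the study’s actual contingency table (strategy categories × improvement) may be larger, in which case an r × c Fisher test (e.g., R’s fisher.test) would be needed, since scipy.stats.fisher_exact only handles 2 × 2 tables.

```python
# Minimal sketch with an invented 2x2 contingency table:
# rows = group completed the task with a specific strategy / did not,
# cols = student improved between pre- and post-test / did not improve.
from scipy import stats

contingency = [[18, 4],
               [5, 9]]

odds_ratio, p_value = stats.fisher_exact(contingency)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```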

4. Results

4.1. Analysis of Group Interactions: Reconstruction of Groups’ Action-Guiding Orientations and Strategies

In a first qualitative analysis, using the documentary method, we reconstructed the groups’ collective orientations while solving the comparison tasks in phases 3.1 and 3.2 to analyse how groups dealt with an open comparison task, and which strategies they used. For clarity, results are presented using selected examples from our corpus, beginning with common action-guiding orientations and then analysing strategies to solve the task.
All groups who delivered a recording shared an orientation towards task completion. This was manifest in the fact that groups organized their work, often implicitly, with one or two students leading the group work, reading the task on the task sheet and distributing turns to speak at the beginning of the task. Additionally, the leading students often controlled the recording of the discussion and repeated or re-read the task formulations on the task sheet.
The task completion orientation was challenged by the openness of the task. In all groups, students had difficulties dealing with this openness, which revealed an orientation towards knowledge reproduction, i.e., the assumption that tasks can only have one “right” or “correct” answer. Box 1 shows a sequence from a French group in phase 3.1 and allows us to see these two main reconstructed orientations.
Box 1. Excerpt from a French group discussion (phase 3.1). All names are fictitious. Translated from French by the authors
John: So now we have to get to question 3. “A common result to the question”... [0:05:17.1]
Charlotte: Do we have to record this too?
John: Yes. So our answers were different to the question…
Charlotte: Yes.
John: We have to come to an agreement and come up with a common answer. [0:05:40.4]
Charlotte: Well we agreed that there were differences and similarities between past and present migrations. [0:05:46.1]
John: Yeah. So does it have to be a nuanced answer or does it have to be a "yes" or "no" answer? [0:05:53.8]
Charlotte: Erm... well if we rephrase question we can show that it's nuanced...
John: Yeah, I don't know.... [0:05:59.9]”
The group in Box 1 is led by John, who refers to the task sheet when introducing a new proposition and reads elements of the task. John also controls the recording of the discussion. He is also the one repeating phrases such as “we have to” or “does it have”, which shows his concern to complete the task. But one can also see how unsure he is of how the task should be answered. The form of the answer (whether it is a nuanced or an exclusive answer) is very important to him, so that it is the expected or “right” one. This uncertainty is also visible in the pauses in speech during the recording. All groups in both countries expressed this uncertainty. Depending on the group, some expressed uncertainty about the task formulations, about the method to use to solve the task, or about their own or the group’s answers after formulating a result.
To overcome this uncertainty, groups developed strategies. Some were strategies to avoid the task and/or the debate around the common answer. Such strategies were, for example, to complete the task as quickly as possible, to state that they agreed even if they did not, or to provide an answer whose validity did not seem important as long as the task was completed quickly. While some students did not intervene in the discussions (as in our former example), other groups simulated debate without really debating and accepted one student’s answer as the group’s without discussion. These group strategies were visible in both countries and are shown in Box 2 with an example from phase 3.2.
Box 2. Excerpt from a French group discussion (phase 3.2). All names are fictitious. Translated from French by the authors
Aude: We have to debate ... (reads) So, in your opinion: “Is migration in the past and today similar or different?” [0:01:42.2]
Etienne: For me it is rather different. For me it is rather different because new factors come into play that didn't happen before. For example, war or even political reasons mean that migration is taking place all over the world. [0:02:01.4]
Hélène: For me it is also different, due to climate change, certain migratory routes have been removed or (laughter) annexed because it was either too hot or too cold so it was no longer possible. The political context, because some migration routes may have been possible before, but for example because of civil wars or... for example through Israel it’s impossible to migrate. And also technological progress because now the means of transport are much more developed than those of Homo sapiens so it's different. [0:02:46.5]
Caroline: Well, for me migration in the past and today is different because the climatic risks have changed, the types of transport used have changed, before they used boats that weren't very well built and now they use more modern boats and there are different types of risks linked to migration because now, because of the borders, migrants can't get back into the country like they used to at the time of Homo sapiens. [0:03:19.5]
Aude: So for our group, migration in the past and today is similar because it's always for the same reasons that people migrate but it's rather that the way of migrating is different. So I think our claim will be that migration is different in the end. And erm... well, so we've given our arguments...
Caroline: Well, yes, we've already set out our arguments with erm... let’s stop now. [0:03:31.5]”
In this excerpt from a French group (see Box 2), debate is only simulated, since all students present their conclusions from phase 3.1 without really debating the common result. Each student says “for me” at the beginning of their answer but does not try to convince the others why their results should weigh in the common answer. The interaction can be summarized as a succession of propositions without relation between them, although the whole discussion is supposed to be a debate. In the end, the leading student (Aude) formulates a common answer rather rapidly (“migration is different”). She follows the others’ conclusions, although her own group had said the contrary. Here, we can clearly see a common strategy of solving the task as quickly as possible and formulating an answer whose validity is not a concern, while simulating a debate. Again, these strategies highlight the common orientations, which were task completion and the culture of the “right answer”, while trying to find a solution to the task.
Other strategies aimed to seek security or help while solving the task. Students resorted to different means, such as using the task sheet and following instructions, asking teachers or researchers for help, trying to look at what other groups answered, or writing down their ideas or creating the poster to structure them. An excerpt from a German discussion in phase 3.2 can serve as an example (see Box 3).
Box 3. Excerpt from a German group discussion (phase 3.2). All names are fictitious. Translated from German by the authors
Nina: “Prepare a poster with your key messages”. Yes, I think we could write reasons colon, obstacles colon, route colon and conclusion and see if that has changed or not. [0:05:26.1]
Lars: Normally I would look somewhere for inspiration, but I don't think it would be a good idea if I got up now.
Anne: The thing is, I thought we were going to say that altogether now.
Nina: Shall I ask?
Anne: Yeah, I would say.
Nina: You ask best. [0:06:07.0]
Anne (to researcher): Are we supposed to say that altogether or is the idea now that everyone from a group writes down, so to speak, reasons, obstacles, routes and then we write down “it is similar because of that”, “it is different because of that”? Or should we really do that altogether? [0:06:20.2]”
In this discussion (see Box 3), the group, led by Nina, discusses different strategies to solve the task within a very short time. While Nina discusses how the poster should be structured, Lars mentions the possibility of looking at another group’s answers. They also ask the researcher to clarify the task formulation. These strategies were used in different ways by all groups, although French students did not deliver posters presenting their ideas but wrote their answers on their task sheets (designing a poster to write down ideas is not a very common task in older classes).
Finally, groups used different strategies to formulate an answer to the common question posed in phase 3.2. All of them started by listing all answers. Then, to come to an answer, three specific strategies could be reconstructed within the groups. Among the ten groups, five (two from France, three from Germany) formulated an answer based on the majority of responses. For example, if three previous groups had said “migration in the past and recent migration are similar”, then this would also be the group’s answer, even if one student disagreed. By contrast, two groups (one in each country) used the weighting of variables to come to an answer and thus used a strategy learned during the intervention to solve the task. One French group only listed answers and did not use a specific strategy, refusing to take a position (their answer was thus: “migration is similar and different”). The last two groups only delivered a short recording, which did not allow the identification of a specific strategy. Box 2 shows how a French group resorted to the majority strategy to come to an answer, whereas Box 4 shows how a group came to weigh variables.
Box 4. Excerpt from a French group discussion (phase 3.2). All names are fictitious. Translated from French by the authors
Juliette: The issue is whether migration in the past and today is similar or different. So if we summarise groups 1, 2 and 3…
Arthur: Well, there are differences and similarities on different scales.
Juliette: I don’t think the answer is necessarily closed. There are necessarily several possibilities and several answers. The answers are not simple. So, group 1 had rather similarities. [0:05:54.2]
Paul: In the similarities we had climatic and political factors, so… in terms of wars. [0:06:01.0]
Juliette: Group 2...
Pierre: Transportation...
Arthur: Different modes of transport.
Pierre: and new ways of preventing migrants from crossing, for example border control. [0:06:33.6]
Juliette: And then in the differences there have been major climatic changes which have meant that migratory routes have diversified and/or have simply been replaced. There have also been political changes which have meant that certain routes have been blocked or have become more difficult to cross. Globalisation has meant that routes have diversified on a global scale rather than on a continental scale, and technological changes have meant that people can move more or less easily over longer or shorter distances. (10 seconds pause).
So... the tendency is... well, personally I think it's different. If we were to... it's not the same scale at all, it's not the same modes of transportation at all...
Pierre: It’s not the same era.
Juliette: Yes, there are a lot of things that make it different after all... [0:07:18.2]
Paul: It's the same factors but there are factors that are different too...
Pierre: It’s the same... war and all that, it’s always existed.
Juliette: But the ways are totally different.
Pierre: And the means are different. [0:07:26.4]
Juliette: Since we’re thinking the ways are more important, I think it's important to underline that, personally, I would choose, and I think you agree that we would choose the claim “migration in the past and today is different”. So can we agree on that now?
All: Yes. [0:07:43.8]
Juliette: So the arguments to support our claim... Well, as we've said, for political, climatic and technological reasons, globalisation has changed the ways and means of creating migratory routes. As Pierre said, customs also make migratory movements more or less difficult or easy in some cases. We could also say, however, that there are similarities that suggest that migration routes are not totally different from those of the past.” [0:08:33.2]
This conversation shows how, in this group, students discuss the fact that the answer is not a simple one. This possible distancing from the orientation towards one specific “correct” answer leads to the following: first, the group lists all answers, as all groups did, but in a second phase, it provides a real weighting of variables. This is visible in the dialogue between students after Juliette says “I think it’s different”, with students discussing variables (such as factors, geographical scale, routes, modes of transportation) and then Juliette reformulating the claim “migration in the past and today is different” after having stated that “ways are more important”. Here, the whole interaction shows how a specific student (Juliette) leads the organisation of speech, but also the reflection. When Juliette reflects on the task, there are some hesitations, which still show signs of uncertainty towards the group’s results. However, the answer is clearly formulated in the end, and Juliette asks the rest of the group to validate her result. In this short excerpt, we can notice the use of the comparison method to solve the task, since the weighting of variables was explained during the whole intervention as a way to evaluate results and nuance them. Thus, a specific moment of comparison competency manifestation is recorded here. However, one specific student (Juliette) dominates the whole discussion, and comparison competency acquisition for the whole group is not confirmed by this excerpt.

4.2. Analysis of the Use of the Comparison Method as a Strategy in the Group Discussions

In a second phase, group discussions in phase 3.1 were evaluated via deductive content analysis [50] to analyse the use of the comparison method in the groups’ answers on migration processes, since the task completion orientation shaped their answers and the task sheets structured the thematic organization of the discussions (see Table 1). Results are presented in Figure 6, which shows, cumulatively and by country, the number of points obtained in our assessment of the use of the comparison method (for example, three groups in total (one German and two French groups) obtained seven points in the assessment).
Groups obtained between 3 and 8 points out of a maximum of 10, with a mean of 6.4 points and a median of 7. These results show that students used the comparison method as asked by the task sheet. Only one group obtained less than half of the 10 possible points. German groups performed slightly better than French groups, obtaining a mean of 7 and a median of 7.5, while French groups obtained a mean of 6 and a median of 6.5. However, as the t-test showed, there was no statistically significant difference between the results of the two groups, t(8) = −0.934, p = 0.378. All groups formulated their research question and answered it (Steps 1 and 6, see Figure 3 and Table 1), with comparison units juxtaposed according to comparison variables (Step 4, see Figure 3 and Table 1). However, only 50% of the German (two out of four) and French (three out of six) groups properly identified the comparison units, and only three German groups (75%) and four French groups (66.6%) identified the comparison variables and the data used to compare (Steps 2 and 3, see Figure 3 and Table 1). All groups explained their results, but one French group did not weigh the variables (Step 5, see Figure 3 and Table 1). This group was also the only one not arguing on the steps of the comparison process, since its recording mainly consisted of the results of the comparison. However, although groups argued about the obtained results (Transversal task, see Figure 3 and Table 1), they rarely argued to justify the choice of comparison elements: five groups (1 German, 4 French) obtained only 1 point out of 4 possible points in this task, and four groups (3 German, 1 French) obtained only 2 points.
We correlated groups’ performances in this assessment with students’ individual progress between the pre- and the post-test of the intervention. Overall, students from the experimental group obtained better results in the post-test (mean of 12.68 points out of a possible 28 points in the test) than in the pre-test (mean of 10.05 points), with a significant net difference of 2.64 points. Progress was made in all dimensions of comparison competency (see [14] for more detail). Being in a group that used the comparison method extensively during phase 3.1 of the intervention (the first part of the group discussion phase) correlated rather strongly [51] with individual students’ improvement between the pre- and the post-test: Spearman’s ρ = 0.412, p = 0.009. This indicates a positive relationship between the use of the comparison method, which was learnt and practised in each phase of the intervention and was also present as a scaffold in the task sheets for the group phase, and the development of comparison competency.

4.3. Correlation of Groups’ Strategies to Students’ Individual Achievements during the Intervention

Our qualitative analysis following the documentary method showed that groups of students selected different strategies to come to an answer to the question “are migration movements today and in the past similar or different?”: either they formulated an answer based on the majority of responses, or they weighted variables, or they did not follow a specific strategy, or they did not deliver a recording. These strategies allowed them to provide four sorts of answers: a nuanced answer, an exclusive answer, an undecided answer, or no answer (see Table 3).
The four German groups all formulated a nuanced answer, as did two French groups (see Table 3). Two French groups delivered an exclusive answer, and another French group formulated an undecided answer (see Table 3). One French group did not deliver an answer. We calculated the relation between being in a group using a specific strategy and individuals’ achievements in the post-test using Fisher’s exact test. Results are presented in Table 4.
Pupils’ improvement between the pre- and post-test and their group’s strategy were significantly associated (p = 0.002). Students who had completed the task entirely during the group discussions and come up with an answer performed better in the post-test than students who did not complete the task (see Table 4). These first elements show that solving the task during group discussions correlated with individual comparison competency acquisition. However, although the strategy of weighting variables had been learned during the intervention, its use during group discussions was not the strategy that correlated best with improvement between the post- and the pre-test. On average, more students progressed in groups that based their answer on the majority of responses, or in the group that did not choose a specific strategy, than in groups that used the weighting of variables (see Table 4).

5. Discussion

In this article, we presented results from a qualitative analysis of group discussions that took place during an intervention study in which 44 French and German students from our experimental group implemented the comparison method. Our objective was to analyse students’ action-guiding orientations and strategies while exploring how groups solved open comparison tasks, in order to relate strategies to individual comparison skill improvement and to identify possible differences between German and French students. This study provides first insights into comparison competency acquisition processes, responding to calls for research in the area of skill development [27]. Our analysis of groups’ strategies and action-guiding orientations while solving open comparison tasks also provides insights into possible explanations for our quantitative results, which showed students’ competency improvement during the intervention [14].
German and French groups shared a similar orientation towards task completion, which could be reconstructed via the use of the documentary method [47,48]. This general action-guiding orientation, visible in all group recordings, is common, since it corresponds to the “student job” [35,36] and was also reconstructed in other research situations [34]. It also corresponds to students “doing the lesson” as analysed by Jiménez–Aleixandre et al. [37]. Martens and Asbrand [34] (p. 64) have described in their typology, within the task completion orientation, how a frequent type of action is to deliver a result (such as a poster or a completed task sheet). While delivering this result, students try to be efficient, do not necessarily identify with the result or with the subject matter, and do not analyse it deeply. In our study, this orientation and corresponding strategies (such as answering as quickly as possible) could also be reconstructed. A second action-guiding orientation was visible in our corpus, in which students from both countries tried to look for the “right” or the “correct” answer. Students did express their uncertainty about the way to solve the task and about their own answers, and showed uneasiness towards the task’s openness. This is consistent with other research results which showed how students tried to tell a “story of success” while solving the task, instead of reflecting on inquiry hesitations or processes [52], and which described this type of action within the task completion orientation as knowledge reproduction [34]. Luhmann [9] and Perrenoud [36] also showed how students are used to closed tasks whose answers are often already known by the teacher. This result shows that using open tasks can be a challenge in interventions and, more generally, in geography education, since students in Germany and France are not used to this approach, as our textbook analysis showed [24,25]. Perrenoud [53] described how the cultural change towards competency acquisition can meet resistance from students, who have to accept the change in the “didactic contract”. In our study, this resistance was also visible in students’ uneasiness towards the research situation. Some groups either did not deliver a recording or delivered very short recordings (this was the case for two French groups in each phase). Also, some students did not participate in the group discussions, leaving it to other members of the group to take decisions and discuss the tasks. Even in some of the groups who did deliver recordings and answered the task, laughter, off-topic conversations and attitudes of refusal were observed, although this was less the case in phase 3.2 than in phase 3.1, revealing a possible adaptation to the research situation. Among the French groups, uneasiness towards group work was also visible, with some groups experiencing difficulties working collaboratively and no group delivering a poster, although this was part of the task. This shows that comparison competency acquisition via group discussion and, more generally, competency acquisition through open tasks in a scientific context should be implemented in schools in the long term. Although our intervention showed the potential of such an approach, real competency acquisition would indeed need the construction of a new long-term habitus [34,54].
Within the task completion orientation, this would allow a shift from the “delivery of a result” and “knowledge reproduction” types towards the “own construction of [geographical] knowledge and processes” type [34], and a step back from the “normative nature of classroom discourse” [4] (p. 5). It would also narrow the gap between school and academic culture [10,11] and enhance the development of geographical thinking in schools [12,19].
Strategies to overcome uncertainty towards an open task were developed by student groups from both countries. Some were strategies of task or “job” avoidance, also identified in other works [36] (pp. 15–16), such as cheating, trying to be forgotten (in our study, students who did not intervene) or simulating task solving. For example, our group discussions showed avoidance and simulation of argumentative debate, with students often accepting one (often the leading) student’s idea as the group’s solution (see Box 2 and Box 4), as was also shown in other works [45] (p. 83). Other strategies could be reconstructed, which related to solving the comparison task in an effective manner. A first strategy was to use the comparison method to discuss the subject matter, as expected in the task sheet in phase 3.1, confirming the task completion orientation by applying the comparison method. The use of our assessment tool allowed us to analyse this strategy more deeply. German students performed slightly better than French students, though we found no statistically significant difference, as the t-test showed. Our assessment allowed us to notice that groups rarely used argumentation to justify their choices in the comparison process, confirming the results of our former studies, in which we showed that students had difficulties in this dimension of comparison competency [14,26]. Other studies also showed that students’ skills concerning argumentation are rather low [55] (p. 68), [56] (p. 59). However, students who were in groups that used the comparison method also performed better in the intervention’s post-test than students who were in groups that did not use it to solve this task. This result is in accordance with our quantitative study, in which we could positively correlate the use of the comparison method during the individual learning phase with individual achievements during the intervention. This confirms, first, the necessity to reinforce argumentation skills to support students in the acquisition of geographical competencies such as comparison and, second, that the comparison method used as a scaffold contributes to enhancing comparison competency [14].
Some of the groups solved the comparison task in phase 3.2 while adopting different strategies to come to different types of answers. This could be positively correlated with individual improvements in comparison competency between the pre- and the post-test, allowing us to state that group discussions on open tasks can contribute to competency acquisition. Answers and strategies which did not relate to the comparison method (such as the majority strategy or not deciding on a specific strategy) were also used by students who performed better in the post-test during the intervention. This was also the case in the study by Knight et al. [39], in which even incorrect answers within the groups contributed to positive learning outcomes. Our reconstruction of strategies allowed us to see how comparison competency was also trained in a few groups (see Box 4) while solving the comparison task in phase 3.2, in which students had to compare their answers. Although the strategy of weighting variables was only observed in two groups (one French, one German), and its use was hesitant and often led by one student (see Box 4), students chose freely to use it to solve the task, and its use can indicate a moment of collaborative competency acquisition through discussion. Finally, German and French students showed similar uncertainty towards the open comparison task and adopted rather similar strategies and patterns of action in group discussions. Differences between the groups’ use of the comparison method were not significant. German groups formulated nuanced answers, whereas French groups formulated different types of answers (exclusive, nuanced, undecided). Since the number of French students differed from the number of German students in our intervention, this result would need replication and further research.

6. Conclusions

Enhancing scientific literacy in schools is a challenge if educational actors and students doing their “student job” are still very much influenced by a closed-task culture. In our study, we used an open task in group discussions with French and German students to train comparison competency after students had learned the comparison method in the first part of the intervention. Our analysis allowed us to combine qualitative and quantitative approaches in a mixed-method design. We found that group discussions contributed to individual comparison competency acquisition, but that students had difficulties with the openness of the task. Using the comparison method as a scaffold provided with the task helped students structure their answers and develop strategies. However, consolidating competency acquisition seems necessary in the long term. Competency acquisition to enhance scientific and geographical literacy should be reflected in classrooms in a new task culture that allows for errors and reflection around the scientific process.

Author Contributions

Conceptualization, M.S. and A.B.; Investigation, M.S.; Data curation, M.S.; Writing—original draft, M.S.; Writing—review & editing, M.S. and A.B.; Visualization, M.S.; Supervision, A.B.; Project administration, A.B.; Funding acquisition, A.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation)–Project Number 57444011–SFB 806. The digital learning unit (OER) was funded by the Bundesministerium für Bildung und Forschung (BMBF, Federal Ministry of Education and Research) in the DiGeo project under the funding code 16DHB3003.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki but did not require approval by an ethics committee because, in line with the criteria stated by the German Research Foundation (DFG), the data collection posed no risks and involved no physical or psychological stress for the respondents.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors would like to thank Frank Schäbitz for his support, and the students and teachers who participated in the study for their collaboration.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. White, P.J.; Ardoin, N.M.; Eames, C.; Monroe, M.C. Agency in the Anthropocene: Supporting Document to the PISA 2025 Science Framework; OECD Education Working Papers; OECD: Paris, France, 2023; Volume 297, p. 44. [Google Scholar]
  2. Rychen, D.; Salganik, L. The Definition and Selection of Key Competencies. Available online: https://www.oecd.org/pisa/35070367.pdf?_ga=2.268275273.674286943.1663773478-1082129267.1663773478 (accessed on 21 September 2022).
  3. OECD. PISA 2018 Science Framework; OECD: Paris, France, 2019; pp. 97–117. [Google Scholar]
  4. Osborne, J.; Erduran, S.; Simon, S. Enhancing the Quality of Argumentation in School Science. J. Res. Sci. Teach. 2004, 41, 994–1020. [Google Scholar] [CrossRef]
  5. Piovani, J.I.; Krawczyk, N. Comparative Studies: Historical, Epistemological and Methodological Notes. Educ. Real. 2017, 42, 821–840. [Google Scholar] [CrossRef]
  6. Cutter, S.L.; Golledge, R.; Graf, W.L. The Big Questions in Geography. Prof. Geogr. 2002, 54, 305–317. [Google Scholar] [CrossRef]
  7. Kumar, P.; Geneletti, D.; Nagendra, H. Spatial Assessment of Climate Change Vulnerability at City Scale: A Study in Bangalore, India. Land Use Policy 2016, 58, 514–532. [Google Scholar] [CrossRef]
  8. Tyler, S.; Moench, M. A Framework for Urban Climate Resilience. Clim. Dev. 2012, 4, 311–326. [Google Scholar] [CrossRef]
  9. Luhmann, N. Das Erziehungssystem der Gesellschaft; Suhrkamp: Frankfurt, Germany, 2002. [Google Scholar]
  10. Butt, G. Debating the Place of Knowledge Within Geography Education: Reinstatement, Reclamation or Recovery? In The Power of Geographical Thinking; Brooks, C., Butt, G., Fargher, M., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 13–26. [Google Scholar] [CrossRef]
  11. Maude, A. Applying the Concept of Powerful Knowledge to School Geography. In The Power of Geographical Thinking; Brooks, C., Butt, G., Fargher, M., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 27–40. [Google Scholar] [CrossRef]
  12. Brooks, C.; Butt, G.; Fargher, M. (Eds.) The Power of Geographical Thinking; Springer International Publishing: Cham, Switzerland, 2017. [Google Scholar] [CrossRef]
  13. Creswell, J.W.; Plano Clark, V.L. Designing and Conducting Mixed Methods Research, 3rd ed.; SAGE Publications: Thousand Oaks, CA, USA, 2017. [Google Scholar]
  14. Simon, M.; Budke, A. An Intervention Study: Teaching the Comparison Method to Enhance Secondary Students’ Comparison Competency. Int. Res. Geogr. Environ. Educ. 2023, 1–18. [Google Scholar] [CrossRef]
  15. Namy, L.L.; Gentner, D. Making a Silk Purse out of Two Sow’s Ears: Young Children’s Use of Comparison in Category Learning. J. Exp. Psychol. Gen. 2002, 131, 5–15. [Google Scholar] [CrossRef] [PubMed]
  16. Lijphart, A. Comparative Politics and the Comparative Method. Am. Polit. Sci. Rev. 1971, 65, 682–693. [Google Scholar] [CrossRef]
  17. Kantor, P.; Savitch, H.V. How to Study Comparative Urban Development Politics: A Research Note. Int. J. Urban Reg. Res. 2005, 29, 135–151. [Google Scholar] [CrossRef]
  18. Krehl, A.; Weck, S. Doing Comparative Case Study Research in Urban and Regional Studies: What Can Be Learnt from Practice? Eur. Plan. Stud. 2020, 28, 1858–1876. [Google Scholar] [CrossRef]
  19. Morgan, J. What do we mean by thinking geographically? In Debates in Geography Education; Lambert, D., Jones, M., Eds.; Routledge: London, UK, 2013; pp. 273–281. [Google Scholar]
  20. Schneider, A.; Woodcock, C.E. Compact, Dispersed, Fragmented, Extensive? A Comparison of Urban Growth in Twenty-Five Global Cities Using Remotely Sensed Data, Pattern Metrics and Census Information. Urban Stud. 2008, 45, 659–692. [Google Scholar] [CrossRef]
  21. Chabrol, M.; Collet, A.; Giroud, M.; Launay, L.; Rousseau, M.; Ter Minassian, H. Gentrifications; Éditions Amsterdam: Paris, France, 2016. [Google Scholar]
  22. Robinson, J. Cities in a World of Cities: The Comparative Gesture. Int. J. Urban Reg. Res. 2011, 35, 1–23. [Google Scholar] [CrossRef]
  23. Wilcke, H.; Budke, A. Comparison as a Method for Geography Education. Educ. Sci. 2019, 9, 225. [Google Scholar] [CrossRef]
  24. Simon, M.; Budke, A. How Geography Textbook Tasks Promote Comparison Competency—An International Analysis. Sustainability 2020, 12, 8344. [Google Scholar] [CrossRef]
  25. Simon, M.; Budke, A.; Schäbitz, F. The Objectives and Uses of Comparisons in Geography Textbooks: Results of an International Comparative Analysis. Heliyon 2020, 6, 1–13. [Google Scholar] [CrossRef] [PubMed]
  26. Simon, M.; Budke, A. Students’ Comparison Competencies in Geography: Results from an Explorative Assessment Study. J. Geogr. High. Educ. 2023, 1–21. [Google Scholar] [CrossRef]
  27. Kidman, G.; Chang, C.-H. Assessment and evaluation in geographical and environmental education. Int. Res. Geogr. Environ. Educ. 2022, 31, 169–171. [Google Scholar] [CrossRef]
  28. Bednarz, S.; Heffron, S.; Huynh, N. A Road Map for 21st Century Geography Education: Geography Education Research; Association of American Geographers: Washington, DC, USA, 2013. [Google Scholar]
  29. Abricot, N.; Zuniga, C.G.; Valencia-Castaneda, L.; Miranda-Arredondo, P. What learning is reported in social science classroom interventions? A scoping review of the literature. Stud. Educ. Eval. 2022, 74, 101187. [Google Scholar] [CrossRef]
  30. Cox, M.; Elen, J.; Steegen, A. The Use of Causal Diagrams to Foster Systems Thinking in Geography Education: Results of an Intervention Study. J. Geogr. 2019, 118, 238–251. [Google Scholar] [CrossRef]
  31. Bednarz, R.; Lee, J. What improves spatial thinking? Evidence from the Spatial Thinking Abilities Test. Int. Res. Geogr. Environ. Educ. 2019, 28, 262–280. [Google Scholar] [CrossRef]
  32. Asbrand, B. Wie erwerben Jugendliche Wissen und Handlungsorientierungen in der Weltgesellschaft? Globales Lernen aus der Perspektive qualitativ-rekonstruktiver Forschung. ZEP Z. Für Int. Bild. Entwicklungspädagogik 2008, 31, 4–8. [Google Scholar] [CrossRef]
  33. Nohl, A.-M.; Von Rosenberg, F.; Thomsen, S. Bildung und Lernen im Biographischen Kontext: Empirische Typisierungen und Praxeologische Reflexionen; Springer Fachmedien: Wiesbaden, Germany, 2015. [Google Scholar] [CrossRef]
  34. Martens, M.; Asbrand, B. “Schülerjob” revisited: Zur Passung von Lehr- und Lernhabitus im Unterricht. Z. Bild. 2021, 11, 55–73. [Google Scholar] [CrossRef]
  35. Breidenstein, G. Teilnahme am Unterricht; VS Verlag für Sozialwissenschaften: Wiesbaden, Germany, 2006. [Google Scholar]
  36. Perrenoud, P. Métier D’élève et Sens Du Travail Scolaire, 8th ed.; ESF: Issy-les-Moulineaux, France, 2013. [Google Scholar]
  37. Jiménez-Aleixandre, M.P.; Bugallo Rodríguez, A.; Duschl, R.A. “Doing the lesson” or “doing science”: Argument in high school genetics. Sci. Educ. 2000, 84, 757–792. [Google Scholar] [CrossRef]
  38. Johnson, D.W.; Johnson, R.T. Making Cooperative Learning Work. Theory Into Pract. 1999, 38, 67–73. [Google Scholar] [CrossRef]
  39. Knight, J.K.; Wise, S.B.; Southard, K.M. Understanding Clicker Discussions: Student Reasoning and the Impact of Instructional Cues. CBE Life Sci. Educ. 2013, 12, 645–654. [Google Scholar] [CrossRef]
  40. Chi, M.T.H. Active-Constructive-Interactive: A Conceptual Framework for Differentiating Learning Activities. Top. Cogn. Sci. 2009, 1, 73–105. [Google Scholar] [CrossRef]
  41. Mercer, N.; Dawes, L.; Wegerif, R.; Sams, C. Reasoning as a scientist: Ways of helping children to use language to learn science. Br. Educ. Res. J. 2004, 30, 359–377. [Google Scholar] [CrossRef]
  42. Osborne, J. Arguing to Learn in Science: The Role of Collaborative, Critical Discourse. Science 2010, 328, 463–466. [Google Scholar] [CrossRef]
  43. Maier, V.; Budke, A. Wie Planen Schüler/Innen? Die Bedeutung Der Argumentation Bei Der Lösung von Räumlichen Planungsaufgaben. GW-Unterricht 2018, 149, 36–49. [Google Scholar]
  44. Zohar, A.; Nemet, F. Fostering Students’ Knowledge and Argumentation Skills through Dilemmas in Human Genetics. J. Res. Sci. Teach. 2002, 39, 35–62. [Google Scholar] [CrossRef]
  45. Albe, V. When Scientific Knowledge, Daily Life Experience, Epistemological and Social Considerations Intersect: Students’ Argumentation in Group Discussions on a Socio-Scientific Issue. Res. Sci. Educ. 2008, 38, 67–90. [Google Scholar] [CrossRef]
  46. Bonnet, A. Die Dokumentarische Methode in Der Unterrichtsforschung. Ein Integratives Forschungsinstrument Für Strukturrekonstruktion Und Kompetenzanalyse. ZQF-Z. Für Qual. Forsch. 2010, 10, 7–8. [Google Scholar]
  47. Bohnsack, R. Documentary Method. The SAGE Handbook of Qualitative Data Analysis; Fick, U., Ed.; SAGE: London, UK, 2014; pp. 217–233. [Google Scholar]
  48. Bohnsack, R.; Nentwig-Gesemann, I.; Nohl, A.-M. Einleitung: Die dokumentarische Methode und ihre Forschungspraxis. In Die Dokumentarische Methode und ihre Forschungspraxis: Grundlagen Qualitativer Sozialforschung; Bohnsack, R., Nentwig-Gesemann, I., Nohl, A.-M., Eds.; VS Verlag für Sozialwissenschaften: Wiesbaden, Germany, 2013; pp. 9–32. [Google Scholar]
  49. Asbrand, B.; Martens, M. Dokumentarische Unterrichtsforschung; Springer Fachmedien: Wiesbaden, Germany, 2018; ISBN 978-3-658-10891-5. [Google Scholar]
  50. Mayring, P. Qualitative Inhaltsanalyse: Grundlagen Und Techniken, 12th ed.; Beltz: Weinheim, Germany; Basel, Switzerland, 2015. [Google Scholar]
  51. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Lawrence Erlbaum Associates: Hillsdale, NJ, USA, 1988. [Google Scholar]
  52. Olschewski, P.; Herzmann, P.; Schlüter, K. Group Work during Inquiry-Based Learning in Biology Teacher Education: A Praxeological Perspective on the Task of (Collaborative) Protocol Generation. Educ. Sci. 2023, 13, 401. [Google Scholar] [CrossRef]
  53. Perrenoud, P. Des Savoirs Aux Compétences: Les Incidences Sur Le Métier d’enseignant et Sur Le Métier d’élève. Pédagogie Collégiale 1995, 9, 6–10. [Google Scholar]
  54. Bourdieu, P. La Distinction. Critique Sociale Du Jugement; Le sens Commun; Les Editions de Minuit: Paris, France, 1979. [Google Scholar]
  55. Budke, A.; Schiefele, U.; Uhlenwinkel, A. ’I Think It’s Stupid’ Is No Argument: Investigating How Students Argue in Writing. Teach. Geogr. 2010, 35, 66–69. [Google Scholar]
  56. Uhlenwinkel, A. Geographisches Wissen Und Geographische Argumentation. In Fachlich Argumentieren. Didaktische Forschungen zur Argumentation in den Unterrichtsfächern; Budke, A., Kuckuck, M., Meyer, M., Schäbitz, F., Schlüter, K., Weiss, G., Eds.; LehrerInnenbildung gestalten; Waxmann: Münster, Germany, 2015; Volume 7, pp. 46–61. [Google Scholar]
Figure 1. Competency model for comparison in geography education; for more details, see [24] (p. 5). Own elaboration.
Figure 2. Overview of the intervention study and different analyses [14]. Own elaboration.
Figure 3. Comparison steps as provided to students during the intervention. Translated from German. Own elaboration on the basis of Wilcke and Budke [23].
Figure 4. Organisation of the group discussions. Colours stand for different questions. Own elaboration.
Figure 5. Task sheet for phase 3.2 (group discussion 2). Translated from German by the authors.
Figure 6. Group distribution of points obtained in the assessment of the use of the comparison method. Own elaboration.
Table 1. Assessment tool to evaluate elements of group discussions corresponding to comparison steps (based on Figure 3). Own elaboration.
Comparison Steps and Tasks | Possible Points
Step 1: formulate a question | 0 or 1
Step 2: determine comparison units | 0 or 1
Step 3: determine comparison variables and material used | 0 or 1
Step 4: juxtapose comparison units according to comparison variables | 0 or 1
Step 5: weigh comparison variables and explain results | 0 or 1
Step 6: formulate an answer to the question | 0 or 1
Transversal task: justify and argue on each step of the comparison process (choice of units, variables, material and justification of results) | 0–4
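To make the scoring logic of this assessment concrete, the following minimal sketch (our illustration, not the authors’ instrument) tallies a group’s points: one point per comparison step observed in the discussion, plus zero to four points for the transversal argumentation task, giving a maximum of ten points per group.

```python
# Minimal sketch of the Table 1 scoring logic (an illustration, not the authors'
# tool): six comparison steps score 0 or 1 each, the transversal argumentation
# task scores 0-4, so a group can obtain at most 10 points.
def score_group(steps_present, argumentation_points):
    """steps_present: six booleans, one per comparison step (cf. Figure 3).
    argumentation_points: 0-4 points for justifications across the steps."""
    if len(steps_present) != 6:
        raise ValueError("expected one entry per comparison step")
    if not 0 <= argumentation_points <= 4:
        raise ValueError("the transversal argumentation task is scored 0-4")
    return sum(1 for step in steps_present if step) + argumentation_points

# Hypothetical group: formulated a question, chose units, variables and material,
# juxtaposed the units, but neither weighted variables nor answered the question,
# and justified one choice argumentatively.
print(score_group([True, True, True, True, False, False], 1))  # -> 5
```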
Table 2. Assessment tool for comparison competency assessment: list of categories to measure comparison competency [26] (p. 5). Own elaboration.
Categories to Measure Comparison Competency | Possible Points
Elements of a comparison (units and variables) are set in relation to each other in order to carry out a comparison | 0 or 1
The question is implicitly or explicitly chosen | 0 or 1 (implicitly) or 2 (explicitly)
Variables are implicitly or explicitly chosen | 0 or 1 (implicitly) or 2 (explicitly)
Units are implicitly or explicitly chosen | 0 or 1 (implicitly) or 2 (explicitly)
Material is implicitly or explicitly chosen | 0 or 1 (implicitly) or 2 (explicitly)
The result of the comparison is justified argumentatively | 0 or 1
The argumentative justification for the results of the comparison is successful | 0 or 1
The chosen question is justified argumentatively | 0 or 1
The argumentative justification for the choice of the question is successful | 0 or 1
Chosen units are justified argumentatively | 0 or 1
The argumentative justification for the choice of the units is successful | 0 or 1
Chosen variables are justified argumentatively | 0 or 1
The argumentative justification for the choice of the variables is successful | 0 or 1
Chosen material is justified argumentatively | 0 or 1
The argumentative justification for the choice of the material is successful | 0 or 1
A result of the comparison is provided | 0 or 1
Comparison is made with more than 1 variable | 0 or 1
Comparison is made with more than 2 units | 0 or 1
Variables are weighted | 0 or 1
Underlying geographical concepts are reflected with the weighting of variables | 0 or 1
Comparison is used to juxtapose or rank units along the variables | 0 or 1
Comparison is used to test a rule/model or show change | 0 or 2
Comparison is used to question a rule/model or define a process | 0 or 3
Comparison is used to formulate a rule/model or highlight the particularity of examples | 0 or 4
TOTAL | Max. 28 points
Table 3. Distribution of strategies and types of answers to solve the comparison task. N = 12 (Two French groups did not deliver a recording). Own elaboration.
Types of Strategies to Come to an Answer | Types of Answers to the Overall Comparison Task
| Nuanced Answer (for example: “Recent and Past Migration Is Rather Different But Some Elements Are Still Similar”) | Exclusive Answer (for example: “Recent and Past Migration Is Similar”) | Undecided Answer (for example: “Recent and Past Migration Is Similar and Different”) | No Answer
Strategy: majority of responses | 3 German groups | 2 French groups | - | -
Strategy: weighting of variables | 1 French group, 1 German group | - | - | -
No specific strategy | - | - | 1 French group | 1 French group
No recording of the strategy | 1 French group | - | - | -
Table 4. Cross table of students’ individual improvement in comparison competency between post- and pre-test, related to the use of specific strategies to solve the comparison task in phase 3.2 of the group discussion. Own elaboration.
Individual Improvement of Students’ Comparison Competency between the Pre- and Post-Test | Strategies during the Group Discussion
| Answer Based on the Majority of Responses | Answer Based on the Weighting of Variables | Answer Based on No Specific Strategy | No Recording Delivered or No Recording of the Strategy | Total
Individual improvement in comparison competency between the pre- and the post-test | 2 | 4 | 0 | 8 | 14
No individual improvement in comparison competency between the pre- and the post-test | 17 | 4 | 5 | 4 | 30
Total of students | 19 | 8 | 5 | 12 | 44
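Readers who wish to probe the association summarised in this cross table could, for instance, run a chi-square test on the counts. The sketch below uses the figures exactly as reported in Table 4, but it is our illustration rather than the authors’ analysis, and the small expected cell frequencies mean any result should be read as indicative only.

```python
# Sketch only: testing the association between group strategy and individual
# improvement with the counts from Table 4 (rows: improved / not improved;
# columns: majority, weighting of variables, no specific strategy, no recording).
from scipy.stats import chi2_contingency

counts = [
    [2, 4, 0, 8],   # students with individual improvement (n = 14)
    [17, 4, 5, 4],  # students without individual improvement (n = 30)
]

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# Several expected frequencies fall below 5, so an exact test or a larger sample
# would be needed before drawing firm conclusions.
```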
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
