Article

What Pattern of Progression in Geoscience Fieldwork can be Recognised by Geoscience Educators? †

School of Social Science and Public Policy, Keele University, Keele, Staffordshire ST5 5BG, UK
Running title: Teaching progression in geoscience fieldwork?
Geosciences 2019, 9(5), 192; https://doi.org/10.3390/geosciences9050192
Submission received: 21 March 2019 / Revised: 19 April 2019 / Accepted: 22 April 2019 / Published: 26 April 2019
(This article belongs to the Special Issue Educating for Geoscience)

Abstract

The question in the title was addressed by dividing the elements of geoscience fieldwork into eight different strands and then subdividing these into different items addressed during fieldwork. Separate small groups of experienced geoscience educators (mainly geology teachers of 16–18-year-old and younger students) were then asked to sort the items for each strand into order, from the simplest to the most difficult, and then to assign the items to levels of difficulty identified in the progression of scientific skills given in the National Curriculum for Science in England. The results indicate that, whilst those involved found the exercise fairly difficult, it was nevertheless possible to identify an agreed progression in each of the strands. It is hoped that this exercise will provoke geoscience educators to carry out further research into progression in geoscience fieldwork education, enabling them to confirm such a progression. Meanwhile, raising awareness amongst teachers of geoscience fieldwork that there is the opportunity to teach elements of fieldwork in order of increasing difficulty, as identified in the research, should enable them to plan more effective progression in their fieldwork teaching. The strategy and methodology used could also enhance professional development in the teaching of geoscience fieldwork.

1. The Research Question

The question of ‘What pattern of progression in geoscience fieldwork can be recognised by geoscience educators?’ was prompted by a meeting of senior science fieldwork educators at an Association for Science Education, Outdoor Working Group meeting in the UK. The meeting included senior members of national organisations concerned with fieldwork in their subject areas: biology, chemistry, physics and Earth science. During the meeting, a biology colleague pointed out (in paraphrase), ‘many pupils undertake pond-dipping five times during their school experience—surely there should be some progression in this activity’ [1]. He was commenting on the fact that pupils in the UK may carry out a pond-dipping exercise when they are in infant school (5–7 years old), junior school (7–11 years old), lower secondary school (11–14 years old), upper secondary school (14–16 years old) and post-16 education (16–18 years old).
So, can educators recognise progression in the teaching of fieldwork? If so, how can such a progression be identified? Further, how can this be applied to the teaching of geoscience fieldwork?
Geoscience-related fieldwork in the UK today might be undertaken in primary (elementary) school, in secondary (high) school, as part of the 16–18 curriculum in schools and colleges, and as part of an undergraduate degree in higher education. Given this span, the research question, ‘What pattern of progression in geoscience fieldwork can be recognised by geoscience educators?’, applies across all of these educational years.

2. Literature Review

2.1. Progression

The idea of progression in learning in different subject areas has a long history. Early work was carried out by Piaget [2] through his research into levels of thinking in children of different ages. In 1956, Bloom [3] developed his taxonomy of cognitive levels, focussed on the development of thinking skills (later updated by Anderson et al. in 2001 [4]). In 1960, Bruner [5] published his work on the spiral curriculum, suggesting that concepts should be revisited regularly at higher theoretical levels as learning in a subject progressed. Vygotsky’s work (1978, [6]) suggested how higher-level thinking skills might be promoted through social interactions.
In the UK, some of these ideas were picked up in the late 1980s, in the debate about the development, for the first time, of a National Curriculum. The National Curriculum Task Group on Assessment and Testing commented in its 1988 report [7] that, at that stage, ‘No country appears to have a national assessment system which is well developed in relation to […] a framework of progression’ (p 9, para 12) and went on to recommend ‘stages of progress to be marked by levels of achievement [...] derived from […] [the] curriculum.’ (p 30, para 93). The report introduced the term ‘level’ ‘to define one of a sequence of points on a scale to be used in describing the progress of attainment …’ (p 32, para 100) and further recommended that ‘For a profile component which applies over the full age range 7 to 16, there should be ten such levels …’ (p 32, para 101).
Some time later, the idea of a ‘learning progression’ was mooted, and defined by the National Research Council in the USA [8] (8:214) as ‘descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as children learn about and investigate a topic over a broad span of time’. This triggered further research into progression in learning, as described below.

2.2. Progression in Science

The first version of the National Curriculum for Science [9], implemented in all government schools in England, Wales and Northern Ireland from 1989 onwards, contained 17 separate areas of scientific study (called Attainment Targets), each of which was subdivided into ten progressive levels.
The approach of subdividing different strands of the science curriculum into series of levels continued through various revisions of the National Curriculum for Science until the version published for England in 2007 [10]. By this time, the levels had been renamed Attainment Levels and their focus had been reduced to 5–14-year-olds. Each area of the science curriculum had a range of levels from 1 to 8, with an additional ‘exceptional performance’ section (aimed at the most able 14-year-olds, who might be achieving the previous levels 9 and 10, which had been aimed at 16-year-olds). As previously, part of the science curriculum was focused on scientific methodology, now called ‘How science works’.
The ‘How science works’ strand encompassed: scientific thinking, applications and implications of science, cultural understanding, collaboration, practical and enquiry skills, critical understanding of evidence, and communication [10] (10:208/9). By the time of the publication of this ‘new’ curriculum in 2007, all teachers of 5–14-year-olds, and particularly primary (elementary) teachers, were very familiar with ‘levelling’ children in a range of subject areas. Thus, the idea of progression was part of the daily work and experience of teachers across England (and also Wales and Northern Ireland).
The research background for the development of the levels of attainment in the National Curriculum implemented across England has not been published; nevertheless, it seems clear from the language used that the version of progression adopted is linked to Bloom’s taxonomy of knowledge, comprehension, application, analysis, synthesis and evaluation [3] and subsequent modifications, together with the work of Piaget on concrete and formal (abstract) thinking skills (as espoused in the ‘Thinking science’ project [11]).
The ‘Thinking science’ project, otherwise called the ‘Cognitive Acceleration in Science Education’ (CASE) project, was a research-based project in science education that sought development in thinking skills through science for 11–14-year-olds. It was run by King’s College, London and, despite not being government supported, was at one stage being used in perhaps a third of secondary (high) schools in England. The project assessed thinking skill development through assessment tasks based on a hierarchy that originated with Piaget and was validated by research into the assessment of thinking skills by Shayer and Adey [12,13,14,15] and Adey and Shayer [16].

2.3. Learning Progressions in Science

The specific idea of a ‘learning progression’ seems to have developed in the USA from reports published by Catley et al. [17] and Smith et al. [18]. This work was followed by a ‘Learning Progressions in Science (LeaPS)’ conference in 2009 [19]. The approach gathered pace with the publication in 2009 of a special edition of the Journal of Research in Science Teaching focussed on learning progressions, containing an editorial paper outlining the issues by Duncan and Hmelo-Silver [20]. A book entitled ‘Learning Progressions in Science’ was published in 2012 [21] which summarised earlier research and suggested ways forward.
The learning progression definition of the National Research Council [8] has been stated more succinctly by Alonzo and Gotwals [19] (19:3) as, ‘descriptions of increasingly sophisticated ways of thinking about or understanding a topic’. Meanwhile, Shavelson and Kurplus described a learning progression as, ‘a sequence of progressively more complex ways of reasoning about a set of ideas.’ [22] (22:15). These studies recognise that learning may not be linear. Research into the ‘messy middle’ of student learning described by Gotwals and Songer [23] (23:277) may, according to Alonzo and Gotwals [19], provide important perspectives on developing formative assessment, curriculum development and teaching standards.
Krajcik [24] describes four requirements for learning progressions, ‘First the big idea should be identified and explained. […] Second the learning progression should be clearly described at each level [including] student reasoning and prerequisite understandings […] Third each learning progression should include psychometrically validated assessment items that can identify students at a particular level. Fourth, each learning progression should include classroom-tested instructional components […] to use in advancing learners to the next level of understanding […] providing components that are useful for building curriculum materials.’ [24] (24:31). Thus, Krajcik’s view of a full learning progression involves a big idea described through levels with related assessment and curriculum materials.
Work on identifying levels in learning progressions in science has often focussed on four levels, a lowest ‘anchor’ of the naive ideas that pupils bring with them in starting to study a big idea, and a highest ‘anchor’ of the scientifically-correct interpretation, with intermediate levels of increasing understanding, as described by Mohan and Plummer [25] and Alonzo and Gotwals [21] and exemplified by Gunckel et al. [26], Plummer [27] and Schwarz et al. [28].
One of the recommendations of the ‘Learning Progressions in Science’ book is that teams of researchers should work on particular learning progressions in science, ‘learning what does and does not work, fine-tuning the progressions, and making their findings available to others …’ (Shavelson and Kurplus, [22] (22:19)); however, Krajcik cautions that, ‘creating learning progressions, unfortunately, requires years of systematic research’ [24] (24:28) and Alonzo and Gotwals [21] (21:484) add, ‘The development, refinement and validation of learning progressions require long-term work.’
More recently, Jin et al. [29] have worked on a learning progression in scientific reasoning, and Osborne et al. [30] have been developing a learning progression in argumentation in science. Hadenfeldt et al. [31] have studied students’ progression in understanding the concept of matter, and Hovardas [32] has developed a hypothetical learning progression for reasoning in ecology. Meanwhile, Alonzo [33] has argued for the use of formative assessment of learning progressions.

2.4. Learning Progressions in Geoscience

In geoscience, studies include those by Lindsey et al. [34], Gunckel et al. [26] and Jin et al. [35] on the carbon cycle, by Gunckel et al. [26,36] on the water cycle, and by Plummer [27] and Plummer and Maynard [37] on celestial motion and the seasons. Covitt et al. [38] studied formative assessment based on learning progression in teaching about water, while McDonald et al. [39] sought to map a learning progression in student understanding of plate tectonics. The learning progressions in these studies are identified by means of assessments given to students of various ages from 9 to 18 years old (depending on the study). Breslyn et al. [40] investigated a learning progression in sea level rise by working with school students and trainee (pre-service) teachers.
Further research in geoscience education at the ‘learning progression’ level described above would involve a great investment of time and substantial resources. Such resources have so far not been available for researching potential progression in geoscience fieldwork. Research into geoscience fieldwork at the ‘learning progression’ level would only be justified if there were general agreement amongst educators that progression in geoscience fieldwork can be recognised. This study therefore addresses the exploratory precursor question, ‘What pattern of progression in geoscience fieldwork can be recognised by geoscience educators?’

2.5. Progression in Geoscience Fieldwork

Geographers have attempted to address the question of progression in fieldwork by producing materials that include progression, for example the framework produced by the Field Studies Council centre at Brockhole in the UK [41], which tabulates the skills and concepts appropriate for teaching geography to 5–18-year-olds. Meanwhile, Rutter and Sharp [42] produced a table of recommended progression in primary (elementary, 5–11-year-old) geography fieldwork. In both these tabulations, progression is described from simple data-collection and enquiry-based skills to more complex ones, but neither table is accompanied by a rationale for the progression given.
Meanwhile, the lack of progression in the teaching of biological fieldwork has been bemoaned by Tilling [43] and for archaeological fieldwork by Brooks [44].
Orion [45] (45:113) discusses Earth science fieldwork education for school students. The students addressed are those of the ‘Science for All Americans’ project [46], namely K-12 or 5–18-year-olds. He argues that their fieldwork should involve, ‘a gradual progression from the concrete levels of the curriculum towards its more abstract components […] to prepare the students for their outdoor learning activities.’ The progression he recommends is that teaching should begin with practical student activities in the classroom before progressing to the outdoors. This is the model followed by Esteves et al. [47] for Portuguese secondary (high school) students.
Whilst progression in geoscience fieldwork has been little addressed at school level, it has been given more consideration in the context of undergraduate geoscience teaching (of post-18-year-olds). For example, in the ‘UK Geosciences Fieldwork Symposium’ [48], Wright [49] noted: ‘Most existing field trips fall into three categories, generally as a progression:
  • First, to show students a variety of rock-types and structures;
  • second, how to record data on a large-scale map and;
  • third, how to compile the data into a map and define the geological history.’
However, Hawley [50] then commented on, ‘the need for many other questions to be asked—questions such as [four other questions, followed by] […] ‘what is progression in fieldwork?’
In his ‘Teaching geoscience through fieldwork’ guide, in the section on ‘Progression and development of field skills’, Butler [51] (51:16) recommends the following example of progression as part of a two-year degree scheme to prepare undergraduate students for independent project work:
  • operating and navigating in the countryside
  • use of equipment (e.g., compass/clinometer)
  • elementary observations, measurements and recording (e.g., simple lithologies, structural relationships, use of compass-clinometer, keeping a good notebook, field sketching)
  • more advanced observations, recording and analysis (e.g., hazard assessment, sedimentary logging, stereonet plotting, model building)
  • synthesis (e.g., cross-sections, sedimentary environments, simple geological histories)
  • making a geological map and related cross-sections
  • advanced synthesis skills (e.g., complex geological histories, model building and testing, report writing).
However, Butler provides no rationale for this progression.
So, in summary, little seems to have been researched or written about progression in the teaching of geoscience fieldwork, or indeed about progression in fieldwork education in general.
This could be because fieldwork is a rather ad hoc activity at school level, usually experienced by pupils from the age of five years old at long intervals of time, unless they undertake a geoscience-specific course. Thus a new set of skills has to be taught and remembered each time. The apparent lack of research into fieldwork progression at undergraduate level could be because many students arrive with no previous experience of geoscience fieldwork, and so must be taught the requisite skills ‘all in one go’ to enable them to engage with the fieldwork activities, meaning that there is no time for a logical progression to be developed.
Whether or not these surmises are true, there is still a good case for discussion amongst geoscience fieldwork educators into a logical order for the teaching of fieldwork skills, based on their perceived difficulty, as a means of promoting such a logical order in their future teaching.

3. Method

3.1. Rationale

There is normally no formal assessment of fieldwork skills across the school or undergraduate system in the UK which could be used to discover from student work a learning progression for geoscience fieldwork. Indeed, geoscience fieldwork does not lend itself to generating the level of student data that could be used to identify such progression. An alternative way of addressing the question of progression is to consult experienced practitioners on their views. In order to do this, different elements of fieldwork needed to be identified and then referenced against a generally agreed system. This approach can be implemented by inviting experienced practitioners to undertake a type of sorting exercise of fieldwork items against a recognised progression.

3.2. A Progression in Scientific Skills Reference System

The system of progression in scientific skills widely used in England was the ‘How science works’ Attainment Level system, published as part of the National Curriculum for Science in 2007 [10]. A further document summarising the Attainment Level system for all subject areas was published in 2010 [52]. Although this system, as published, referred to 5–14-year-olds, it had, as explained above, been derived from an earlier system aimed at 5–16-year-olds. The published levels ranged from 1 (lowest) to 8, with an additional ‘exceptional performance’ section aimed at the most able pupils. An example of the full detail of a ‘How science works’ level descriptor for one of the Attainment Levels is given in Box 1.
Box 1. An exemplar level descriptor for a ‘How science works’ Attainment Level—from the 2007 version of the National Curriculum for Science (Qualifications and Curriculum Authority (QCA), 2007).
Level 8
Pupils recognise that different strategies are required to investigate different kinds of scientific questions and use scientific knowledge and understanding to select an appropriate strategy. In consultation with their teacher they adapt their approach to practical work to control risk. They record data that are relevant and sufficiently detailed and choose methods that will obtain these data with the precision and reliability needed. They analyse data and begin to explain, and allow for, anomalies. They carry out multi-step calculations and use compound measures, such as speed, appropriately. They communicate findings and arguments, showing awareness of a range of views. They evaluate evidence critically and suggest how inadequacies can be remedied.
The Attainment Levels are not age-specific; nevertheless, the National Curriculum for Science document [10] aimed at 11–14-year-olds begins at Level 4 and describes the top level as ‘exceptional achievement’. This suggests that, whilst the attainment levels overlap age ranges, levels 1–4 are most appropriate for primary (elementary) age pupils of 5–11 years old, levels 4–8 are most appropriate for lower secondary (high school) pupils of ages 11–14, while level 8 and ‘exceptional performance’ are most appropriate for students of ages 14 and above.
For the purposes of this study, the published ‘How science works’ attainment levels were abbreviated, whilst retaining their critical features, to reduce the reading time for those involved in the sorting exercise. For example, Level 8, as described in full in Box 1, was abbreviated to: ‘Tackle different questions using different strategies; obtain data with precision and reliability; analyse data and begin to explain anomalies; evaluate evidence critically’. The ‘exceptional performance’ category (originally aimed at 14–16-year-olds, as described above) was designated as Level 9 for this study.
Since it became clear, as different items of fieldwork were being considered in preparation for the exercise, that some involved skills even higher than the new Level 9 category, an additional Level 10 was added to include geoscience field skills that might form part of fieldwork in schools and colleges for 16–18-year-olds (A-level in England) and those undertaking the lower levels of undergraduate courses. The new Level 10 added to the table (in italics to distinguish it from the published National Curriculum for Science levels) was intended to link to the ‘cognitive gains from learning in the field’ described by Mogk and Goodwin [53] (53:138) as ‘Higher-order thinking skills […] the ability to synthesise a broad range of theoretical knowledge, evaluate uncertainties, and distinguish between observation and inference, communicate results and generally develop “scientific habits of mind”.’
The resulting descriptor table is given in Table 1.
An approximate guide to the level of thinking involved, from concrete to formal, was added to Table 1, based on the levels of thinking skills discussed within the ‘Thinking science’ approach [11] (11:1). This approach was described as ‘using Piaget’s labels’ and further defined as, ‘concrete operations are thought processes in which a child […] can cope with only a limited number of variables, and that they allow the child to describe situations but not explain them.’ [11] (11:1) whereas, ‘By contrast, formal operations can handle multi-variable explanations and allow people to provide explanations …’ [11] (11:1). This approach was then extended in the table to include more complex abstract thinking skills.
The ‘How science works’ descriptor table in Table 1 was used by the groups of educators to assess the thinking levels of the different items of geoscience fieldwork skill. The table was intended for students of ages 5 to 18 and above.
Note that, since the publication and widespread usage of the ‘How science works’ level descriptors in 2007, the UK government has changed, and the whole levelling system has now been abandoned for what appear to be political reasons. Individual schools are now required to develop their own systems of assessing student ability with no reference to a national system of ‘levelling’.

3.3. The Fieldwork-Leading Experience of the Experienced Practitioners Involved in the Sorting Exercise

More than half of the ten participants in the pilot study and of the 42 participants in the revised study outlined below were active or retired teachers of A-level geology (for 16–18-year-olds). [Note that the equivalence of A-levels with the qualifications of other educational systems is difficult to establish, but a guide to equivalence with US university entry qualifications is given by the Fulbright Commission [54]; in response to the question ‘Will I get university credit for my A-levels, Pre-U or IB Diploma?’, the answer given is, ‘You will most likely receive university-level credit for these qualifications.’ This implies that A-levels overlap some university teaching in the USA.] Most of the fieldwork experience of A-level teachers is in leading the fieldwork coursework element of A-level courses, in which students undertake a range of fieldwork approaches including independent investigational work.
Other participants in the exercise had experience of leading university-level fieldwork or running fieldwork for science or geography classes of younger secondary school (high school) students of ages 11–16.

3.4. The Pilot Strategy

Members of the Earth Science Teachers’ Association (ESTA) gather annually in the UK for a secondary workshop (aimed at teaching the 11–18 Earth science/geology curriculum) to develop new teaching materials and discuss current matters of concern. In what was retrospectively called ‘the pilot strategy’, the views of the ten experienced geoscience teachers and educators who attended one of the workshops were sought. Most of these were teachers of A-level geology in schools or colleges, with extensive experience of leading 16–18-year-old student fieldwork groups through their geology A-level coursework. Half were male, half female, and they ranged in age from around 35 to 60 years.
The sorting exercise was devised as follows. Prior to the workshop, a column entitled ‘Appropriate Earth science skills and techniques’ was added to the ten-level ‘How science works’ descriptor table. The author, as an experienced leader of fieldwork for students of ages 11–18 and particularly 16–18-year-olds, then added to this column the fieldwork skills most commonly taught during fieldwork in his experience. For the purposes of the exercise, the ‘Appropriate Earth science skills and techniques’ column was then cut off with scissors, separated and cut into ten slips of paper.
During the workshop, the geoscience educators were invited, in discussion with one another, to place each of the ten slips opposite the ‘best fit’ section of the table.
The strategy failed, since, whilst those involved readily agreed verbally that there is progression in the teaching of geoscience fieldwork, they were unable to agree how the table should be reassembled and so were unable to agree what the progression actually was. The failure of the strategy was probably because too many separate elements of fieldwork were covered by each slip of paper. Another shortcoming may have been that there was no methodology for reducing the impact of those geoscience educators with more forceful personalities, whilst increasing the impact of those with less strong views.

3.5. The Evolved Strategy

The evolved strategy attempted to take the learning from the first failed attempt into account by:
  • subdividing the different elements of geoscience fieldwork into a series of strands, each with its own sorting task, so that different areas of fieldwork were not conflated;
  • inviting the geoscience educators to work on the task in mixed-gender groups of three, to encourage decisions made on the basis of professional discussion and to give each participant equal input into the discussions;
  • repeating the activity with different groups of practitioners on three different occasions, to test the methodology with different groups.
Details of the teachers and educators involved are given in Table 2.
The different elements of fieldwork considered in the pilot strategy were subdivided for the ‘evolved strategy’ task and seemed to break down naturally into eight strands, with no major areas of overlap or omission. The eight strands were: ‘Identify landform features correctly’; ‘Use field equipment accurately’; ‘Identify rocks/minerals correctly’; ‘Record field information effectively’; ‘Use a topographic map effectively’; ‘Identify exposed structures correctly’; ‘Map geological boundaries properly’ and ‘Collect and use geological information effectively’.
In the written feedback to the exercise, none of those involved commented that the subdivision into eight strands was inappropriate.
The sorting exercise is described here for the ‘Identify landform features correctly’ strand.

Pre-preparation

A table was prepared with the ‘Identify landform features correctly’ title, and each of the items listed in the ‘Identify landform features correctly’ column in Table 3 was listed beneath. The table was printed. Each of the items was then cut away from the printed table using scissors, to give a series of separate items. The title and separated items were put into an envelope.

The exercise undertaken

Each group of three participants was given a set of written instructions, the envelope of items and a copy of the ‘How science works’ descriptor table (Table 1). They were taken through the instructions to ensure understanding, and then invited to continue as independent groups. Under the title slip, they sorted the item slips into an agreed order of progression from the ‘easiest’ at the top to the ‘hardest’ at the bottom. They then allocated each slip a level using the ‘How science works’ descriptor table, writing the allocated level number (1–10) on each item slip. Through a questionnaire, described in Box 2, they provided written comments on their perception of the difficulty of the task.
Box 2. The eight-section questionnaire. The first section, addressing one strand only, is shown here; the remaining sections had identical wording with different strand headings.
Names of individuals in group:
Questionnaire response to: ‘Is there progression in Earth Science fieldwork? If so—what is it and how should it be defined?’ Please circle the appropriate number.
Strand: Identify landform features correctly
Is there a progression in this element of Earth science fieldwork?   1 2 3 4 5   (1 = progression obvious; 5 = none found)
How easy was it to assign NCS levels?   1 2 3 4 5   (1 = very easy; 5 = impossible)
Please write a comment below (e.g., are there things missing, could the wording be improved? Is this an important part of ES fieldwork?)
When the sorting activity for the ‘Identify landform features correctly’ strand had been completed, the participants were asked to repeat the activity for the other seven strands. Different groups were asked to address the strands in different orders, so that work on one strand would not be diminished by every group undertaking it last.
Each group took around an hour to complete the eight sorting tasks and all were involved in deep discussion during the experience. No questions were asked of the author during these discussions.

4. Results and Analysis of the Data Collected Through the ‘Evolved Strategy’

For each strand, the median NCS level for each item was calculated in order to place the items in order of increasing level, to show progression in that strand. The median was used, rather than the mean, because the educators’ judgements comprised ordinal, rather than interval level data (that the median is the best measure of the ‘middle’ for ordinal data is discussed on several websites, such as the QuickMBA statistics website [56]). The results showing the perceived increase in level of difficulty of the eight strands (indicated by the median of the levels of difficulty for each strand) are given at the top of Table 3. Then progression in each individual strand, shown in order of increasing level (or difficulty) as assigned during the sorting exercise, is given further down Table 3.
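As an illustration only (the paper does not specify the software used for the original analysis), the aggregation of the educators’ ordinal judgements into item medians can be sketched in a few lines of Python; the item names and level values below are hypothetical placeholders, not data from the study.

```python
from statistics import median

# Hypothetical levels (1-10) assigned to three items of one strand by five
# groups of educators; these values are illustrative only.
levels_by_item = {
    "Identify a large-scale landform (e.g., hill, bay)": [2, 2, 3, 2, 2],
    "Identify a river feature": [3, 4, 3, 3, 4],
    "Identify a glacial feature": [4, 5, 4, 4, 4],
}

# The median, not the mean, is used because the judgements are ordinal data.
item_medians = {item: median(levels) for item, levels in levels_by_item.items()}

# Order the items from 'easiest' to 'hardest' by their median level,
# which is how the progression within a strand is displayed in Table 3.
for item, med in sorted(item_medians.items(), key=lambda kv: kv[1]):
    print(f"{item}: median level {med}")
```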
The results in Table 3 are presented visually by using boxplots in Figure 1a,b which show the eight strands in order of increasing median from the left to the right. For each strand, the boxplot of all the data for that strand is presented on the right-hand side. The data to the left of this is the item data, presented in order of increasing median, from left to right.
Comments on how boxplots are used are available on the Minitab website [57]. Briefly:
  • the coloured box is the interquartile range box, showing the middle 50% of the data;
  • the heavy horizontal line is the median;
  • the upper and lower whiskers represent the upper and lower 25% of the data;
  • the dots are outliers of data beyond the upper or lower whisker.
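A figure of this general form could be reproduced with standard plotting tools. The sketch below uses Python’s matplotlib, which is an assumption made here for illustration only (the paper does not state how Figure 1 was produced), together with hypothetical level data for a single strand.

```python
import matplotlib.pyplot as plt

# Hypothetical levels assigned by the groups to three items of one strand;
# the pooled data for the whole strand form the right-hand box.
item_a = [2, 2, 3, 2, 2]
item_b = [3, 4, 3, 3, 4]
item_c = [4, 5, 4, 4, 4]
strand_all = item_a + item_b + item_c

# Default boxplot elements match the description above: interquartile range
# box, median line, whiskers, and outliers plotted as individual points.
fig, ax = plt.subplots()
ax.boxplot([item_a, item_b, item_c, strand_all],
           labels=["Item A", "Item B", "Item C", "Whole strand"])
ax.set_ylabel("'How science works' level (1-10)")
ax.set_title("Perceived difficulty of items within one strand (hypothetical data)")
plt.show()
```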
The comments collected by questionnaire are collated in Appendix A. They indicated that the groups found progression within the strands moderately straightforward to identify, but found it more difficult overall to assign levels to the different items within each strand, as shown by Table 4. A comment from one group on the difficulty of the task was: ‘A very useful and thought-provoking exercise which we found very challenging due to difficulty in assigning HSW levels.’ (‘HSW levels’ refers to the ‘How science works’ levels in the descriptor table).

4.1. The ‘Identify Landform Features Correctly’ Strand

This was indicated to be the most straightforward of the eight strands considered (median level 3.0). The participants indicated that the progression in the statements relating to landforms was fairly clear (median of 2.0) and that it was relatively straightforward to assign levels to the statements (median of 2.0). Their combined results did indicate a clear progression, suggesting a teaching sequence beginning with the identification of larger-scale landform features (hill, bay, etc.) at level 2 (involving concrete thinking skills, as shown in Table 1) and ending with glacial features at level 4 (lower-level formal thinking).

4.2. The ‘Use Field Equipment Accurately’ Strand

The results suggest that, overall, the educators found the ‘Use field equipment accurately’ strand to be relatively straightforward, with progression fairly clear to see (median of 2.0) and assigning NCS levels relatively straightforward (median of 2.0). They commented, ‘Slightly more straightforward than other categories’, ‘Large jump in cognitive thinking from 2D to 3D’ and ‘Use a geological hammer safely was difficult to deal with’. The results of the sorting exercise identified a clear progression from relatively simple activities at levels 1, 2 and 3 to the measurement of dip and dip direction, requiring the higher-level formal thinking skills of levels 6 and 7. A jump in thinking skills was identified from the use of simple tools to the more complex clinometer and compass.

4.3. The ‘Identify Rocks/Minerals Correctly’ Strand

The educators found that progression in this strand was fairly clear to see (median of 2.0) and levels were relatively straightforward to assign (median of 2.0). The overall progression identified was clear, from the use of simple identification features to the use of more complex features requiring formal thinking skills. Identification using secondary sources (such as reference books or the internet) in the field was deemed a complex abstract task.

4.4. The ‘Record Field Information Effectively’ Strand

Progression was fairly clear in this strand (median of 2.0) but NCS levels were more difficult to assign (median of 3.0), despite comments from some groups of ‘Found this one easier’ and ‘Straightforward’. Although recording on sheets and photographing were seen to be relatively easy, most skills were thought to require formal thinking skills, with the exception of graphic logging, which was seen to involve complex abstract thinking. The progression was clear and could readily form the basis of a teaching scheme from younger or less able pupils to older/more able students.

4.5. The ‘Use a Topographic Map Effectively’ Strand

The educators indicated that the progression in these statements was fairly clear (median of 2.0) and that it was more difficult to assign levels to the statements (median of 3.0), with comments such as ‘uneven progression’ and ‘important part of fieldwork’. Most of this fieldwork strand was allocated to levels 4, 5 and 6, indicating a reasonably narrow range of progression, except for ‘Identify position on a topographic map using bearings’, which was perceived to be at the higher level 7. This relatively narrow range of levels indicates that the progression in this strand covered a narrow band of understanding. Thus, if teaching were based on these findings, the items could all be taught at the same time (apart from ‘Identify position on a topographic map using bearings’) to pupils of appropriate age/ability.

4.6. The ‘Identify Exposed Structures Correctly’ Strand

Progression in the ‘Identify exposed structures correctly’ sorting exercise was fairly clear (median of 2.0) and levels were relatively straightforward to assign (median of 2.0). Most of the skills involved were deemed to be higher level and all involved formal thinking. Since many of the skills had similar levels, it was difficult to see a clear overall progression in the data, apart from the identification of bedding planes, joints/fractures and mineral veins (seen as straightforward) and the identification of folds, faults etc. from indirect data (involving complex thinking skills).

4.7. The ‘Map Geological Boundaries Properly’ Strand

Those involved indicated that progression in this strand was fairly clear (median of 2.0) but that it was difficult to assign levels (median of 3.5), with comments of ‘Contains very difficult stuff’, ‘High-level stuff’ and ‘Appears to be some progression, but not linear’. With the exception of ‘Identify exposed geological boundary’, which was perceived to be relatively easy, all other elements were found to be difficult and abstract, ranging up to the highest level of 10. The participants were able to allocate a range of different levels, and thus identify a reasonable progression.

4.8. The ‘Collect and use Geological Information Effectively’ Strand

This was considered the most complex of the eight strands, involving formal and complex abstract thinking skills. Progression was fairly clear (median of 2.0) but levels more difficult to assign (median of 3.0) with comments such as, ‘Lots of high-level skills’, ‘All fairly high-level’, ‘All very high-level skills. Very specialist’ and ‘Discussed it the longest—it’s difficult stuff!’ Nevertheless, an overall progression was identified ranging steadily from level 6 to level 9. A clear teaching sequence could be devised from the progression identified.

5. Discussion

These exercises were carried out in response to the question ‘What pattern of progression in geoscience fieldwork can be recognised by geoscience educators?’. Small groups of experienced geoscience fieldwork practitioners were asked to undertake a sorting task of fieldwork items printed on slips of paper, seeking a progression in the items and then allocating them a level according to a pre-prepared progression sequence. The tasks were successful in that, over about an hour, the participants working in groups of three were able to complete the task despite, in some cases, commenting that the task was a difficult one.
The completion of the tasks, and the analysis provided, shows that these practitioners, for the items they were given, were able to denote a progression in each of the strands. This finding suggests that different geoscience field activities can usefully be put into learning sequences such as the one proposed in Table 5. The Table 5 suggestion is based on asking different levels of scientific question at a locality or series of localities.
The findings of the research so far suggest that it is worth pursuing the strategy through further research. The next stages of research could include extending the particular methodology used in these tasks to a wider audience of teachers and educators to test the findings of this explorative study. Alternatively, the tasks could be developed by extending the fieldwork item bank to include more or broader aspects of geoscience fieldwork, by changing or enhancing the level descriptor table using progressions from different sources, or both, before using these as the basis of tasks for wider audiences of teachers and educators. These approaches would test some of the methodology used for the present exploration.
Another approach to tackling this research question would be by assessing the fieldwork of students undertaking these items. If student fieldwork feedback were to be used to identify progression, then this would need to encompass a range of student ages, abilities and experiences, implying a much larger research project. A process would have to be developed for obtaining student fieldwork feedback, at least in the UK, since the current assessment of fieldwork enquiry, carried out for small numbers of students as part of GCSE (14–16-year-olds) and A-level (16–18-year-olds) geology examinations, not only does not provide feedback on the attainment of particular fieldwork skills, but has also been phased out for what seem to be largely political rather than educational reasons.
A larger research project, if undertaken, might begin to approach the search for a ‘learning progression’ in geoscience fieldwork of the type described in the literature and outlined in Section 2.3 above. This approach might follow the recommendations of Krajcik [24] who recognised four stages: identifying a big idea; describing the idea through levels; developing related assessment materials; preparing suitable curriculum materials.
The suggestion from the study here that there should be a progression in the teaching of geological fieldwork prompts a further question of how such a progression could be implemented by the teaching profession. This is a difficult question to respond to, at least in a UK context, since most science teachers do not undertake geoscience fieldwork with their students (as shown by King [58]). A small number of 14–16-year-old students who opt for a General Certificate of Secondary Education (GCSE) course in geology, in schools where this is available, do undertake geoscience fieldwork. A slightly larger, but still small, number of 16–18-year-old students who opt for A-level geology courses, where available, also undertake geoscience fieldwork (around 0.3% of the A-level entry in all subjects across the UK—some 2000 students, based on 2017 figures). Meanwhile, a small number of students study geology at university (some 4000 student applications for courses in 2010—data supplied by the UK Universities and Colleges Admissions Service, UCAS). Some of these students may have undertaken geoscience-related fieldwork in primary (elementary) school, lower secondary (high) school, at GCSE, at A-level and at undergraduate level, whilst others may have only undertaken geoscience fieldwork at one of these levels. This means that if a learning progression for geoscience fieldwork, from five years old to undergraduate level, were developed, it would not be applicable to most students in the UK. Not only this, but it would also probably be impossible to educate all the teachers involved in the teaching of fieldwork, and then to ensure that details of where students had reached in their own fieldwork learning were passed on to successive teachers.
So, if a learning progression in fieldwork cannot be identified because it would be difficult or impossible to implement for the reasons given above and/or, ‘The development, refinement and validation of learning progressions require long-term work.’ (Alonzo and Gotwals, [21]), the value of continuing to research the proposal might be questioned.
However, further research is justified to enable all geoscience educators to become aware that there is likely to be progression in geoscience fieldwork learning, thus enabling them to plan their teaching accordingly. Awareness that some elements of geoscience fieldwork involve lower cognitive skills than others will allow them to prepare their own teaching progressions from lower-level to higher-level skills. GCSE and A-level courses in the UK are normally two years long and undergraduate geology courses are three or four years long, so there is time within such courses for a teaching progression of fieldwork skills to be implemented. The data gathered by this study suggest what such a teaching progression might involve.
The research also suggests a focus for professional development for geoscience educators in fieldwork skills. By undertaking an exercise similar to the one described, they would become aware of potential progressions in elements of fieldwork and be able to take these into account in planning their future teaching strategies. Use of a levelled series of questions, such as those proposed in Table 5, would enable them to develop a levelled enquiry approach to their fieldwork teaching.

6. Limitations

The current exploratory survey was limited by the time available in busy CPD days and the number of occasions when experienced geoscience educators were gathered together. More time and a wider range of individuals would allow questions about, and differences in, the progressions perceived by individuals of different ages or experience to be addressed. They would also allow testing of whether group size plays an important role, for example from individuals working alone to groups of four or five.
The necessity for a sorting and levelling exercise to be undertaken in limited time in a controlled way meant that there was no opportunity for discussion about the numbers of strands to be investigated, the numbers and range of the items in each strand, and the relevance of the ‘How science works’-based levelling table. Broader opportunities for research could address all these issues.
This study was based on the perspectives of geoscience educators alone; the involvement of students and the assessment of their work would allow investigation into whether the perceived progressions are actually those experienced in student learning.
Further research questions could be developed from these comments on the limitations of the exploratory study.

7. Conclusions

This exercise has shown that it is possible for geoscience educators to identify progression in various strands of geoscience fieldwork. This suggests that further research using different geoscience fieldwork strands, different items, differing descriptors of levels and different participants is also likely to identify progression, even if the fine details of that progression are likely to differ.
If such progressions are identifiable, then teachers of geoscience-related fieldwork should take this into account. Whilst it is difficult to envisage a full learning progression in geoscience fieldwork being devised, as described in the literature (with levels, assessment tools and curriculum materials) and implemented across the system, nevertheless geoscience courses are normally long enough for some progression in learning to be implemented. Teachers should be aware of such progression and plan their teaching accordingly.
Consideration of fieldwork progression should also be included in the professional development of geoscience teachers. At the least, this is likely to be ‘A very useful and thought-provoking exercise’ (quotation from one of the groups involved in the sorting task) and would enable teachers to better prepare teaching sequences for their geoscience fieldwork.

Funding

This research received no external funding.

Acknowledgments

I am most grateful to the Earth Science Education Unit (ESEU) facilitators and the Earth Science Teachers’ Association (ESTA) members who willingly took part in this exercise and also to ESTA for supporting the research. I would particularly like to thank Ben Jones and Caroline Paget, both researchers for the AQA Awarding Body in the UK, for their guidance and help on presenting the results most effectively.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A. Written Feedback Comments by Participant Groups to the Questionnaire Questions. Notes on the Acronyms Used Are Given at the Foot of These Comments

Table A1. Identify landform features correctly.
Group F: Progression because harder to spot
Group G: How about rather than ‘identify’—’characterise’ or ‘distinguish’—the process is the important part
Group H: Clarify this is all in the field
Group L: Identify probably means low level. Not an exhaustive list of things that could be identified/distinguished. Yes—it is important (for context only)
Group M: The process of identification is low. The knowledge and understanding behind them is quite high. So suggest word changes to include interpretation and analysis
Group N: Identify is ambiguous. You don’t need to know how something is made in order to identify it.
Group O: Need to be very clear if in the field or in the classroom. Identifying complex features, thus the need for pre-understanding
Table A2. Use field equipment accurately.
Group B: Criteria given are HSW descriptors but LO are knowledge/ application-based
Group C: Delete hammer. Split acid from hardness
Group D: Needs to be clear when first introduced—how to use a notebook properly
Group E: Slightly more straightforward than other categories
Group F: Don’t use the hammer. Important part of fieldwork
Group H: Estimating distance/ size needs some idea of accuracy.
Group I: We think ‘estimate’ higher order skill
Group J: Important
Group K: Interpretation of specific words makes the decision difficult. Measurement is important
Group L: Use a geological hammer safely was difficult to deal with. Important to add GPS equipment and camera as items of field equipment. Field equipment should include a field notebook (need to be able to record field data). Yes—it is very important
Group M: Important! Estimating v measuring debate. More info needed on some
Group N: Estimating dip numerically is very hard but qualitatively much easier
Group O: Large jump in cognitive thinking from 2D to 3D
Table A3. Identify rocks/minerals correctly.
Group C: Needs more splitting between cards, e.g., acid from hardness
Group E: Nuance of words ‘obvious features’, ‘requiring careful study’, v. important
Group F: Need to distinguish between rocks + minerals. Important part of fieldwork. Enjoyed doing this exercise—sad or what?
Group G: How hard is the key?—suggest using ‘simple key’
Group H: Significance of ‘fine-grained’ rock?
Group K: Observe and interpret context
Group L: Wording needs to be improved, particularly of the word ‘careful’
Group M: Presumed sorting was grain size rather than compositional
Group O: How is density measured in field. Also properties of fine grained rock
Table A4. Record field information effectively.
Group E: Easier to decide levels when fewer words/ clearer
Group F: Found this one easier. Important part of fieldwork
Group H: OK. Does everyone understand graphic log?
Group J: Very important
Group K: Sample effectively and with a view to geoconservation
Group L: Yes—it is an important part
Group M: Straight forward
Table A5. Use a topographic map effectively.
Group C: Uneven progression. Need v simple sketch map for KS2 before published map
Group F: Important part of fieldwork. Topographic = difficult word
Group G: Change examples of ‘topography’ to features on map
Group H: Clearer than field equipment group
Group L: Yes—it is very important, but less so if you have a GPS system
Group O: Orientating a map is different skill than giving a grid reference
Table A6. Identify exposed structures correctly.
Group C: All fairly high level involving aspects which would be unlikely to be taught until GCSE+
Group E: Level summaries don’t correspond well to descriptors
Group F: Define ‘identify’ 5 out of 6 use this word—could words like distinguish be used?
Group H: Higher level skills—top KS4
Group J: Important
Group K: Identify fossils/ trace fossils
Group L: Should include other structures, e.g., large scale folds, pillow lavas, xenoliths, in the range of examples given in the questions—also fossil (not just tectonic features). Yes—it is very important
Group M: The identify terminology doesn’t take account of the knowledge needed to do so—more interpretation
Group O: Considered 4 of the 6 to be of the same level
Table A7. Map geological boundaries properly.
Group C: Mostly A-level work
Group D: Above GCSE
Group E: All about the same
Group F: Contains very difficult stuff!!
Group G: High-level stuff
Group H: Not clear if following it theoretically (on map) or in field (visually). Much higher level skills
Group I: Observing and plotting different levels of difficulty—too many variables in each statement
Group K: Appears to be some progression, but not linear
Group L: Wording of some statements needs to be improved. Yes—it is important
Group M: Some progression in skills, but not really much in the way of levels
Group N: ‘Infer position of geological boundary’—you need to know what you’re inferring it from to level it
Group O: V important. Follow/ plot = too many categories on same slip
Table A8. Collect and use geological information effectively.
Group B: A very useful and thought-provoking exercise which we found very challenging due to difficulty in assigning HSW levels
Group C: All fairly high level
Group D: Mostly above GCSE
Group E: All higher levels—step jump from distinguish to geol. hist.
Group F: Lots of high-level skills? Beyond NC?
Group G: Starts at a high level—all very high-level skills. Very specialist
Group H: Top GCSE to A-level work—learnt integration and then analysis in fieldwork with experience
Group I: Analysing an area—could be done at lots of different levels
Group K: Compare relative texture of: igneous rocks; metamorphic rocks. Understand significance of igneous mineralogy/ composition
Group L: Wording of some items needs attention. We split off the ‘death or life assemblage’ label—not sure if it is a key part of ES fieldwork. Yes—it is an important part
Group M: We’ve cut a couple up. Discussed it the longest—it’s difficult stuff!
Group N: All more or less the same level (10)—but there is progression within the level
Notes on acronyms used by participants:
  • HSW descriptors— ‘How science works’ descriptors from Table 1
  • LO—probably ‘Learning objectives’
  • KS2— ‘Key Stage 2’ —7–11-year-olds in the UK apart from Scotland
  • GPS system—global positioning system
  • GCSE— ‘General certificate of secondary education’—taught to 14–16-year-olds in the UK apart from Scotland
  • KS4— ‘Key Stage 4’—14–16-year-olds in the UK apart from Scotland
  • A-level— ‘Advanced-level’ taught to 16–18-year-olds in the UK apart from Scotland
  • NC—National Curriculum (for 5–16-year-olds)
  • ES—Earth science.

References

  1. Tilling, S.; (University College, London, UK). Personal Communication, 2010.
  2. Piaget, J. Language and Thought of the Child, 3rd ed.; Routledge: London, UK, 1926. [Google Scholar]
  3. Bloom, B. Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain; David McKay: New York, NY, USA, 1956. [Google Scholar]
  4. Anderson, L.W.; Krathwohl, D.R.; Bloom, B.S. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives; Allyn & Bacon (Pearson Education Group): Boston, MA, USA, 2001. [Google Scholar]
  5. Bruner, J. The Process of Education; Harvard University Press: Cambridge, MA, USA, 1960. [Google Scholar]
  6. Vygotsky, L.S. Mind in Society: The Development of Higher Psychological Processes; Harvard University Press: Cambridge, MA, USA, 1978. [Google Scholar]
  7. National Curriculum Task Group on Assessment and Testing. 1988. Available online: http://www.educationengland.org.uk/documents/pdfs/1988-TGAT-report.pdf (accessed on 23 April 2019).
  8. National Research Council (NRC). Taking Science to School: Learning and Teaching Science in Grade K-8; The National Academies Press: Washington, DC, USA, 2007. [Google Scholar]
  9. Department of Education and Science (DES). Science in the National Curriculum; HMSO: London, UK, 1989.
  10. Qualifications and Curriculum Authority. Science: Programme of Study for Key Stage 3 and Attainment Targets. 2007. Available online: http://webarchive.nationalarchives.gov.uk/20130904095138/https://media.education.gov.uk/assets/files/pdf/s/qca-07-3344-p_science_ks3_tcm8-413.pdf (accessed on 23 April 2019).
  11. Adey, P.; Shayer, M.; Yates, C. Thinking Science: The Curriculum Materials of the CASE Project; Nelson Thornes: London, UK, 1989. [Google Scholar]
  12. Shayer, M.; Adey, P.S. Towards a Science of Science Teaching; Heinemann: London, UK, 1981. [Google Scholar]
  13. Shayer, M.; Adey, P.S. Accelerating the development of formal thinking II: Postproject effects on science achievement. J. Res. Sci. Teach. 1992, 29, 81–92. [Google Scholar] [CrossRef]
  14. Shayer, M.; Adey, P.S. Accelerating the development of formal thinking III: Testing the permanency of the effects. J. Res. Sci. Teach. 1992, 29, 1101–1115. [Google Scholar] [CrossRef]
  15. Shayer, M.; Adey, P. Accelerating the development of formal operational thinking in high school pupils, IV: Three years on after a two-year intervention. J. Res. Sci. Teach. 1993, 30, 351–366. [Google Scholar] [CrossRef]
  16. Adey, P.; Shayer, M. Really Raising Standards: Cognitive Intervention and Academic Achievement; Routledge: London, UK, 1994. [Google Scholar]
  17. Catley, K.; Lehrer, R.; Reiser, B. Tracing a Proposed Learning Progression for Developing Understanding of Evolution; Paper commissioned for the Committee on Test Design for K-12 Science Achievement Centre for Education; National Research Council: Washington, DC, USA, 2005. [Google Scholar]
  18. Smith, C.; Wiser, M.; Anderson, C.W.; Krajcik, J. Implications for children’s learning for assessment: A proposed learning progression for matter and the atomic molecular theory. Measurement 2006, 14, 1–98. [Google Scholar]
  19. Alonzo, A.C.; Gotwals, A.W. Learning Progressions in Science; Sense: Rotterdam, The Netherlands, 2012. [Google Scholar]
  20. Duncan, R.G.; Hmelo-Silver, C.E. Learning progressions: Aligning curriculum, instruction, and assessment. J. Res. Sci. Teach. 2009, 46, 606–609. [Google Scholar] [CrossRef] [Green Version]
  21. Alonzo, A.C.; Gotwals, A.W. Leaping forward. In Learning Progressions in Science; Alonzo, A.C., Gotwals, A.W., Eds.; Sense: Rotterdam, The Netherlands, 2012. [Google Scholar]
  22. Shavelson, R.J.; Kurplus, A. Reflections on learning progressions. In Learning Progressions in Science; Alonzo, A.C., Gotwals, A.W., Eds.; Sense: Rotterdam, The Netherlands, 2012. [Google Scholar]
  23. Gotwals, A.W.; Songer, N.B. Reasoning up and down a food chain: Using an assessment framework to investigate students’ middle knowledge. Sci. Educ. 2010, 94, 259–281. [Google Scholar] [CrossRef]
  24. Krajcik, J. The importance, cautions and future of learning progressions. In Learning Progressions in Science; Alonzo, A.C., Gotwals, A.W., Eds.; Sense: Rotterdam, The Netherlands, 2012; pp. 27–36. [Google Scholar]
  25. Mohan, L.; Plummer, J. Exploring challenges to developing learning progressions. In Learning Progressions in Science; Alonzo, A.C., Gotwals, A.W., Eds.; Sense: Rotterdam, The Netherlands, 2012. [Google Scholar]
  26. Gunckel, K.L.; Mohan, L.; Covitt, B.A.; Anderson, C.W. Developing learning progressions for environmental literacy. In Learning Progressions in Science; Alonzo, A.C., Gotwals, A.W., Eds.; Sense: Rotterdam, The Netherlands, 2012. [Google Scholar]
  27. Plummer, J.D. Defining and validating an astronomy learning progression. In Learning Progressions in Science; Alonzo, A.C., Gotwals, A.W., Eds.; Sense: Rotterdam, The Netherlands, 2012. [Google Scholar]
  28. Schwarz, C.; Reiser, B.J.; Acher, A.; Kenyon, L.; Fortus, D. MoDeLS. In Learning Progressions in Science; Alonzo, A.C., Gotwals, A.W., Eds.; Sense: Rotterdam, The Netherlands, 2012. [Google Scholar]
  29. Jin, H.; Shin, H.; Johnson, M.E.; Kim, J.; Anderson, C.W. Developing learning progression-based teacher knowledge measures. J. Res. Sci. Teach. 2015, 52, 1269–1295. [Google Scholar] [CrossRef]
  30. Osborne, J.F.; Henderson, J.B.; MacPherson, A.; Szu, E.; Wild, A.; Yao, S.-Y. The development and validation of a learning progression for argumentation in science. J. Res. Sci. Teach. 2016, 53, 821–846. [Google Scholar] [CrossRef]
  31. Hadenfeldt, J.H.; Neumann, K.; Bernholt, S.; Liu, X.; Parchmann, I. Students’ progression in understanding the matter concept. J. Res. Sci. Teach. 2016, 53, 683–708. [Google Scholar]
  32. Hovardas, T. A learning progression should address regression: Insights from developing non-linear reasoning in ecology. J. Res. Sci. Teach. 2016, 53, 1447–1470. [Google Scholar] [CrossRef]
  33. Alonzo, A.C. An argument for formative assessment with science learning progressions. Appl. Meas. Educ. 2018, 31, 104–112. [Google Scholar] [CrossRef]
  34. Lindsey, M.; Chen, J.; Anderson, C.W. Developing a multi-year learning progression for carbon cycling in socio-ecological systems. J. Res. Sci. Teach. 2009, 46, 675–698. [Google Scholar]
  35. Jin, H.; Zhan, L.; Anderson, C.W. Developing a Fine-Grained Learning Progression Framework for Carbon- Transforming Processes. Int. J. Sci. Educ. 2013, 35, 1663–1697. [Google Scholar] [CrossRef]
  36. Gunckel, K.L.; Covitt, B.A.; Salinas, I.; Anderson, C.W. A Learning Progression for Water in Socio-Ecological Systems. J. Res. Sci. Teach. 2012, 49, 843–868. [Google Scholar] [CrossRef]
  37. Plummer, J.D.; Maynard, L. Building a learning progression for celestial motion: an exploration of students’ reasoning about the seasons. J. Res. Sci. Teach. 2014, 51, 902–929. [Google Scholar] [CrossRef]
  38. Covitt, B.A.; Gunckel, K.L.; Caplan, B.; Syswerda, S. Teachers’ use of learning progression-based formative assessment in water instruction. Appl. Meas. Educ. 2018, 31, 128–142. [Google Scholar] [CrossRef]
  39. McDonald, S.; Bateman, K.; Gall, H.; Tanis-Ozcelic, A.; Webb, A.; Furman, T. Mapping the increasing sophistication of students’ understandings of plate tectonics: A learning progressions approach. J. Geosci. Educ. 2019, 67, 83–96. [Google Scholar] [CrossRef]
  40. Breslyn, W.; McGinnis, J.R.; McDonald, R.C.; Hestness, E. Developing a learning progression for sea level rise, a major impact of climate change. J. Res. Sci. Teach. 2016, 53, 1471–1499. [Google Scholar] [CrossRef]
  41. Field Studies Council (FSC) Centre Brockhole. Progression in Fieldwork—A Framework Suggesting when Different Skills and Concepts Might be Introduced. Available online: https://www.rgs.org/RGS/media/RGS-Media-Library/In%20the%20field/Fieldwork%20in%20schools/FW_Progressioninfieldwork.pdf (accessed on 23 April 2019).
  42. Rutter, O.; Sharp, S. Skills progression table. Prim. Geogr. 2009, 1. [Google Scholar]
  43. Tilling, S.; (University College, London, UK). Biology Fieldwork: Victim or Sinner? Personal Communication, 2002. [Google Scholar]
  44. Brooks, S. Archaeology in the field: Enhancing the role of fieldwork training and teaching. Res. Archaeol. Educ. 2008, 1, 1–17. [Google Scholar]
  45. Orion, N. A holistic approach for science education for all. Eurasia J. Math. Sci. Technol. Educ. 2007, 3, 111–118. [Google Scholar] [CrossRef]
  46. American Association for the Advancement of Science (AAAS). Science for All Americans; Oxford University Press: New York, NY, USA, 1990. [Google Scholar]
  47. Esteves, H.; Ferreira, P.; Vasconcelos, C.; Fernandes, I. Geological Fieldwork: A Study Carried Out with Portuguese Secondary School Students. J. Geosci. Educ. 2013, 61, 318–325. [Google Scholar]
  48. Proceedings of the UK Geosciences Fieldwork Symposium; Earth Staff Development Project and the UK Earth Science Personal and Career Development Network: Southampton, UK, 2007.
  49. Wright, L. Fieldwork in the Earth sciences. In Proceedings of the UK Geosciences Fieldwork Symposium; King, H., Hawley, D., Thomas, N., Eds.; Earth Staff Development Project and the UK Earth Science Personal and Career Development Network: Southampton, UK, 1997. [Google Scholar]
  50. Hawley, D. Being there—A short review of field-based teaching and learning. In Proceedings of the UK Geosciences Fieldwork Symposium; King, H., Hawley, D., Thomas, N., Eds.; Earth Staff Development Project and the UK Earth Science Personal and Career Development Network: Southampton, UK, 1998; pp. 7–13. [Google Scholar]
  51. Butler, R. Teaching Geoscience through Fieldwork; GEES Learning and Teaching Guide: Plymouth, UK, 2008. [Google Scholar]
  52. Qualifications and Curriculum Authority. The National Curriculum: Level Descriptions for Subjects. 2019. Available online: https://dera.ioe.ac.uk/10747/7/1849623848_Redacted.pdf (accessed on 23 April 2019).
  53. Mogk, D.W.; Goodwin, C. Learning in the field: Synthesis of research on thinking and learning in the geosciences. In Earth and Mind II: A Synthesis of Research on Thinking and learning in the Geosciences; Kastens, K.A., Manduca, C.A., Eds.; Geological Society of America: Boulder, CO, USA, 2012; pp. 131–163. [Google Scholar]
  54. Fulbright Commission Website. Available online: http://www.fulbright.org.uk/study-in-the-usa/faqs/undergraduate-study-faqs/admissions-requirements-faqs/ (accessed on 23 April 2019).
  55. King, C.; Thomas, A. Earth Science Education Unit workshops—An evaluation of their impact. Sch. Sci. Rev. 2012, 94, 25–35. [Google Scholar]
  56. QuickMBA Statistics Website. Available online: http://www.quickmba.com/stats/centralten/ (accessed on 23 April 2019).
  57. Minitab Website Description of the use of Boxplots. Available online: https://support.minitab.com/en-us/minitab/18/help-and-how-to/graphs/how-to/boxplot/interpret-the-results/key-results/ (accessed on 23 April 2019).
  58. King, C. The response of teachers to new content in a National Science Curriculum: The case of the Earth-science component. Sci. Educ. 2001, 85, 636–664. [Google Scholar] [CrossRef]
Figure 1. (a) Boxplot of the median data for the four strands with the lowest medians; (b) boxplot of the median data for the four strands with the highest medians. For each strand in (a) and (b), the boxplot on the right-hand side is the median for all the data in that strand. The strands are presented in order, from the lowest median on the left to the highest median on the right; within each strand, the items step up in difficulty, according to the median, from left to right.
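The quantities plotted in Figure 1 are simply medians of the levels that the groups assigned to each item, pooled per item and per strand. The sketch below is a minimal illustration of that workflow in Python; the strand names, item names and level values are invented placeholders rather than the study's responses, and it assumes matplotlib is available.

```python
# Minimal sketch of the Figure 1 workflow: for each strand, collect the HSW/NCS
# levels assigned to each item by the participating groups, take the median per
# item and per strand, then draw boxplots ordered by the strand medians.
# The data below are hypothetical placeholders, not the values reported here.
from statistics import median
import matplotlib.pyplot as plt

levels_by_item = {
    "Use field equipment accurately": {
        "Use a magnifier": [1, 1, 2, 1],           # levels assigned by different groups
        "Use a hand lens": [3, 3, 2, 4],
        "Use a clinometer to measure dip": [6, 6, 5, 7],
    },
    "Map geological boundaries properly": {
        "Identify exposed geological boundary": [4, 4, 5],
        "Infer position of geological boundaries": [9, 8, 9],
    },
}

# Median level per strand, pooling all group responses for all items in the strand.
strand_medians = {
    strand: median(level for levels in items.values() for level in levels)
    for strand, items in levels_by_item.items()
}

# Present strands from lowest to highest median, as in Figure 1.
for strand in sorted(strand_medians, key=strand_medians.get):
    items = levels_by_item[strand]
    ordered = sorted(items, key=lambda name: median(items[name]))
    # One box per item (ordered by item median), plus a final box for the whole strand.
    data = [items[name] for name in ordered]
    data.append([level for levels in items.values() for level in levels])
    fig, ax = plt.subplots()
    ax.boxplot(data, labels=ordered + ["whole strand"])
    ax.set_ylabel("Assigned 'How science works' level")
    ax.set_title(f"{strand} (strand median {strand_medians[strand]})")
    plt.show()
```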
Table 1. ‘How science works’ descriptor table, based on abbreviated ‘How science works’ level descriptors from the National Curriculum for Science in England (QCA, 2007), with the additions shown in italics of Level 9, the NCS ‘Exceptional performance’ level descriptor, and Level 10, adapted from Mogk and Goodwin [53] (p. 138), as explained in the text.
‘How Science Works’ Level | NCS Level Summary | Approximate Thinking Level Required
NCS Level 1 | Observe; describe observations | Concrete thinking
NCS Level 2 | Suggest how to collect data; observe and compare; review findings |
NCS Level 3 | Suggest how to answer a question; observe and measure with simple equipment; explain simple patterns; suggest improvements |
NCS Level 4 | Plan approach; use fair test; observe and measure with suitable equipment; explain patterns; give reasons for improvements | Low-level formal thinking
NCS Level 5 | Decide approaches; select data to be collected and apparatus; work systematically; analyse findings; evaluate methods to make improvements |
NCS Level 6 | Use investigatory approach; make observations with precision; collect qualitative and quantitative data; analyse findings and explain them scientifically; account for inconsistencies; evaluate methods making reasoned suggestions for improvement | Higher-level formal thinking
NCS Level 7 | Plan approaches that synthesise information; identify factors where variables cannot be controlled; collect reliable systematic data; use quantitative relationships; explain conclusions and limitations scientifically; evaluate conclusions against data collected |
NCS Level 8 | Tackle different questions using different strategies; obtain data with precision and reliability; analyse data and begin to explain anomalies; evaluate evidence critically | Complex abstract thinking
NCS Exceptional performance, Level 9 | Use scientific knowledge to determine investigational strategy; make relevant observations and comparisons; decide appropriate precision levels; analyse findings for trends and patterns and draw conclusions; show awareness of uncertainty levels; evaluate evidence critically and reason what extra evidence is needed |
Added level, Level 10 | Plan complex investigations using several lines of evidence; collect a range of appropriate data reliably; distinguish reliable from anomalous data; modify investigational approach, as necessary; derive reliable conclusions from a range of data; evaluate conclusions critically; suggest further avenues of research | High-level abstract and three- and four-dimensional thinking (high-level thinking)
Table 2. Details of the three groups of educators undertaking the ‘evolved strategy’ sorting task.
Occasion when the Task was Undertaken | Participant Background | Number | Approx. Age-Range | Gender
Annual meeting of Earth Science Education Unit (ESEU) facilitators, England and Wales | The ESEU facilitators had geology degrees and diverse backgrounds. Each had been appointed and trained by the ESEU to present teacher professional development workshops across the country on an ad hoc basis [55]. Most were or had been teachers of A-level geology (to 16–18-year-olds) in schools and colleges, as well as science/geography teachers to 11–16-year-olds. Some had been university lecturers; most had retired. Many had experience of teaching the learning progressions of the secondary (high school) National Curriculum, together with extensive fieldwork-leading experience. | 22 | 40–70 | 14 female, 8 male
Annual meeting of Earth Science Education Unit facilitators, Scotland | | 8 | 35–70 | 3 female, 5 male
Earth Science Teachers’ Association (ESTA) secondary (high school) workshop | Mostly teachers of A-level (16–18-year-old) geology in schools or colleges, with extensive experience of leading fieldwork groups of 16–18-year-old students, as well as experience of leading science or geography fieldwork for 11–16-year-old students. Most also had experience of teaching the learning progressions of the National Curriculum at secondary (high school) level. | 12 | 30–65 | 6 female, 6 male
Total | | 42 | | 23 female, 19 male
Table 3. Progression identified in the fieldwork strands, listed in order of increasing median and hence of generally increasing thinking level across the table.
Fieldwork progression strands, in order of increasing whole-strand median (number of group responses, n, and median for the whole strand in brackets):
  • Identify landform features correctly (n = 13; median 3.0)
  • Use field equipment accurately (n = 14; median 4.0)
  • Identify rocks/minerals correctly (n = 14; median 5.0)
  • Record field information effectively (n = 13; median 5.5)
  • Use a topographic map effectively (n = 14; median 6.0)
  • Identify exposed structures correctly (n = 13; median 6.0)
  • Map geological boundaries properly (n = 13; median 7.0)
  • Collect and use geological information effectively (n = 13; median 8.0)
Items, grouped by ‘How science works’ attainment level, with the median level assigned to each item in brackets:
  • Level 1: Use a magnifier (1.0)
  • Level 2: Identify a cave/arch/stack (1.5); Identify a hill/valley, a cliff/beach, a bay/headland (1.5); Take a photograph with scale (2.0); Use a tape measure/ruler (2.5)
  • Level 3: Distinguish a U-shaped from a V-shaped valley (3.0); Use a hand lens (3.0); Use a size comparator (3.0); Use tools to measure hardness (3.0); Distinguish minerals/rocks from one another in the field using obvious features (colour, grain size, grain orientation) (3.0); Identify minerals/rocks using a key (3.0); Record data on pre-prepared sheets or in tables (e.g., colour, size, shape) (3.0)
  • Level 4: Identify a scarp/escarpment, a ridge (3.5); Identify a cuesta, a plateau (3.5); Identify a corrie, a saddle/col (4.0); Identify glacial mounds (moraine) (4.0); Identify drumlins (4.0); Use acid test (4.0); Orientate a map using a compass (4.0); Identify bedding plane, joint/fracture, mineral vein (4.0); Identify exposed geological boundary (4.0)
  • Level 5: Estimate size (4.5); Estimate distance (5.0); Use a geological hammer safely (5.0); Identify minerals/rocks in the field with obvious features (feldspar, biotite, sandstone, mudstone, granite, slate, etc.) (5.0); Make freehand notes of measurements and observations (5.0); Orientate a map using several topographic clues (footpath, road, wall, building, etc.) (4.8); Give a four-figure grid reference (4.8); Identify small-scale fold (5.0); Identify bedding, cross-bedding, ripple marks/sole structures (5.0)
  • Level 6: Use a clinometer to measure dip (6.0); Use a compass to take a bearing (6.0); Distinguish minerals/rocks from one another in the field using measurement/testing (comparator, hardness, acid test) (6.0); Identify minerals/rocks in the field requiring careful study (calcite, quartz, limestone, breccia, gabbro) (6.0); Draw a small-scale labelled field diagram (6.0); Identify position on a topographic map given several topographic clues (footpath, road, stream, wall, building, etc.) (6.0); Identify position on a topographic map given few topographic clues (contours, footpath, etc.) (6.0); Give a six-figure grid reference (6.0); Identify graded bedding (6.0); Identify fault (6.0); Identify baked margin/chilled margin (6.0); Identify angular unconformity (6.0); Distinguish dyke from sill (6.0); Follow exposed vertical or near-vertical geological boundary and plot on a topographic map of a low relief area (6.5); Follow geological boundaries and plot them on topographic maps in high relief areas (6.5); Describe the depth of formation of an igneous rock (6.0); Distinguish younger from older rocks (superposition, cross-cutting, included fragments) (6.0); Describe a geological sequence from age-relationship data (6.0)
  • Level 7: Use a compass to measure dip direction/strike (7.0); Estimate dip and dip direction (7.0); Distinguish minerals/rocks from one another in the field using features such as sorting, density and properties of fine-grained rock (7.0); Draw a large-scale labelled field diagram (7.0); Draw a sketch map (7.0); Identify position on a topographic map using bearings (7.0); Distinguish sill from lava flow; use way up criteria (7.0); Plot dip of bedding plane; plot position of exposed vertical or near-vertical geological boundary on a topographic map of a low relief area (7.0); Plot position of exposed dipping or horizontal geological boundary on a topographic map of a low relief area (7.0); Describe a depositional environment from collected data; compare a modern depositional environment with an ancient analogue (6.8); Distinguish a life from a death assemblage (7.0)
  • Level 8: Identify unusual rocks/minerals using secondary sources in the field (8.0); Make a graphic log (8.0); Follow position of exposed dipping or horizontal geological boundary and plot on a topographic map of a low relief area (8.0); Distinguish river/beach/eolian/limestone/ice-meltout/volcanic environments (7.5)
  • Level 9: Identify fold from strike/dip data; identify concealed fault/unconformity from map data (9.0); Infer position of geological boundaries and plot them on a topographic map (9.0); Use metamorphic minerals to assign metamorphic grade; describe the conditions of formation of a metamorphic rock (9.0); Describe the geological history of an area (9.0); Relate the geological history of an area to its plate tectonic environment (9.0)
  • Level 10: Follow poorly-exposed geological boundaries using field clues and plot on a topographic map (9.5)
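Teachers wanting to sequence fieldwork against this progression could treat Table 3 as a simple lookup structure: encode each strand as a list of items with their median levels and select items at or below the level a class is working at. The sketch below is a minimal illustration of that idea in Python, encoding only a small hypothetical excerpt of the table rather than all eight strands.

```python
# Minimal sketch: select fieldwork items judged to sit at or below a target
# 'How science works' level. Only a small excerpt of Table 3 is encoded here
# as an illustration; a real use would transcribe all eight strands.
PROGRESSION = {
    "Identify landform features correctly": [
        ("Identify a cave/arch/stack", 1.5),
        ("Distinguish a U-shaped from a V-shaped valley", 3.0),
        ("Identify a corrie", 4.0),
    ],
    "Identify exposed structures correctly": [
        ("Identify bedding plane", 4.0),
        ("Identify small-scale fold", 5.0),
        ("Identify angular unconformity", 6.0),
    ],
}


def items_up_to(level: float) -> dict[str, list[str]]:
    """Return, per strand, the items with a median assigned level <= `level`."""
    return {
        strand: [item for item, median_level in items if median_level <= level]
        for strand, items in PROGRESSION.items()
    }


if __name__ == "__main__":
    # Example: items that might be planned for a class working at around Level 5.
    for strand, items in items_up_to(5.0).items():
        print(strand)
        for item in items:
            print("  -", item)
```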
Table 4. Summary of feedback on the questions ‘Is there a progression in this element of Earth science fieldwork?’ and ‘How easy was it to assign National Curriculum for Science (NCS) levels?’ for each strand.
Fieldwork Strand | ‘Is there a progression in this element of Earth science fieldwork?’ Median response on a Likert scale of 1 (progression obvious) to 5 (none) | ‘How easy was it to assign NCS levels?’ Median response on a Likert scale of 1 (very easy) to 5 (impossible)
Identify landform features correctly | 2.0 | 2.0
Use field equipment accurately | 2.0 | 2.0
Identify rocks/minerals correctly | 2.0 | 2.0
Record field information effectively | 2.0 | 3.0
Use a topographic map effectively | 2.0 | 3.0
Identify exposed structures correctly | 2.0 | 2.0
Map geological boundaries properly | 2.0 | 3.5
Collect and use geological information effectively | 2.0 | 3.0
Table 5. Suggested Earth science investigative questions forming a teaching sequence through the ‘How science works’ levels identified through the National Curriculum for Science (NCS), with additions, as explained for Table 1.
Each ‘How science works’ level is given below with its NCS level summary, followed by Earth science investigative questions that might be appropriate for that level; the approximate thinking level required is as in Table 1.
NCS Level 1 (Observe; describe observations):
  • What does the rock face look like?
  • What do the stones look like?
NCS Level 2 (Suggest how to collect data; observe and compare; review findings):
  • Which rock is the toughest?
  • Which rock is the most waterproof?
  • Which stone would make the best door-stop?
NCS Level 3 (Suggest how to answer a question; observe and measure with simple equipment; explain simple patterns; suggest improvements):
  • Which rock contains the most different sorts of grains?
  • Do the pebbles come from the nearby cliffs—or not?
  • Where are the thickest beds found in the exposure?
  • Why is the steepest part of the beach so steep?
NCS Level 4 (Plan approach; use fair test; observe and measure with suitable equipment; explain patterns; give reasons for improvements):
  • How can we find out which rocks would make the best building stones?
  • Which rock is the most permeable, and why?
  • Which of the rocks are non-sedimentary?
  • Which rocks form the highest land in the area?
NCS Level 5 (Decide approaches; select data to be collected and apparatus; work systematically; analyse findings; evaluate methods to make improvements):
  • How does the dip of the rocks change across the area; are any folds open or tight?
  • Were all the sedimentary rocks laid down in the same environment—or not?
  • Which rock would make the best reservoir/cap rock?
NCS Level 6 (Use investigatory approach; make observations with precision; collect qualitative and quantitative data; analyse findings and explain them scientifically; account for inconsistencies; evaluate methods making reasoned suggestions for improvement):
  • What is the trend of the structural features in the area; how does this relate to the pressures that formed them?
  • How does the directional data given by the sedimentary structures vary?
  • What type of faulting is present; how does this relate to the stresses that caused the faulting?
NCS Level 7 (Plan approaches that synthesise information; identify factors where variables cannot be controlled; collect reliable systematic data; use quantitative relationships; explain conclusions and limitations scientifically; evaluate conclusions against data collected):
  • In which sedimentary environment were the rocks deposited?
  • Is the igneous body a lava flow, sill, dyke or pluton?
  • Is the geological boundary associated with topographic features?
  • How does the simple geological boundary plot on a map of the area?
  • How could elements of the sedimentary environment be modelled?
NCS Level 8 (Tackle different questions using different strategies; obtain data with precision and reliability; analyse data and begin to explain anomalies; evaluate evidence critically):
  • How can the variability in the sedimentary strata be explained?
  • What is the geological history of this small, relatively simple area?
  • How is the landscape controlled by the local geology?
NCS Exceptional performance, Level 9 (Use scientific knowledge to determine investigational strategy; make relevant observations and comparisons; decide appropriate precision levels; analyse findings for trends and patterns and draw conclusions; show awareness of uncertainty levels; evaluate evidence critically and reason what extra evidence is needed):
  • How does the geological boundary plot on a map of the area?
  • What would a geological cross section of the area of dipping/folded geology look like?
  • What is the geological history of this more complex area?
  • Are the strata the correct way up—or inverted?
  • What would an interpretative sign, explaining the geology of the area to the general public, show?
Added level, requiring high-level abstract and three- and four-dimensional thinking (Plan complex investigations using several lines of evidence; collect a range of appropriate data reliably; distinguish reliable from anomalous data; modify investigational approach, as necessary; derive reliable conclusions from a range of data; evaluate conclusions critically; suggest further avenues of research):
  • How does a largely obscured geological boundary plot on a map of the area?
  • What would a geological cross section of the area of fairly complex geology look like?
  • What is the geological history of this large and complex area?
  • How did the depositional environment change over time?
  • How does the sedimentary environment identified in an ancient rock sequence compare with a modern analogue?
  • What evidence does the rock sequence contain for the plate tectonic environment of the time?
