Article

Preservice Teachers’ Eliciting and Responding to Student Thinking in Lesson Plays

1 School of Education and Human Services, Oakland University, Rochester, MI 48309, USA
2 Graduate School of Education, Yonsei University, Seoul 03722, Korea
* Author to whom correspondence should be addressed.
Mathematics 2021, 9(22), 2842; https://doi.org/10.3390/math9222842
Submission received: 18 September 2021 / Revised: 19 October 2021 / Accepted: 2 November 2021 / Published: 9 November 2021
(This article belongs to the Special Issue Research on Powerful Ideas for Enriching School Mathematical Learning)

Abstract:
This study presents an analysis of 95 lesson play scripts—hypothetical dialogues between a teacher and a student—written by 32 preservice teachers (PSTs). Writing lesson play scripts was part of an assessment design activity aimed at eliciting and responding to students' thinking. The findings present the types and frequencies of teacher talks/moves in fraction-related tasks during each stage of the lesson plays: launch, active elicitation, and closure. Our analysis indicates a wide range in the number of turns taken by the PSTs, with little correlation between the number of turns and effectiveness at eliciting and responding to student thinking. The study also confirmed that some unproductive talk moves were still present in the lesson play context, even though the PSTs had ample time to craft their scripts. This study draws implications for mathematics teacher education regarding PSTs' prior perceptions, experiences, knowledge, and needs, and the ways to create learning opportunities for them to elicit and respond to student thinking.

1. Introduction

Students' mathematical learning occurs via numerous visible and invisible interactions in the classroom. One challenge for teachers is gaining access to students' mathematical thinking in order to provide appropriate support. Accordingly, research on mathematics teaching urges teachers to develop professional noticing skills in the classroom [1,2] along with their capacity to elicit, make sense of, and respond to student thinking [3,4,5,6]. Meanwhile, there has been an increasing focus on the teacher education curriculum for supporting preservice teachers (PSTs) in developing professional core practices [7,8], an emphasis that gained traction as the use of approximations of practice in teacher education [8] became common. Through approximations of practice, novices can have opportunities to "engage in practices that are more or less proximal to the practices of a profession" [8] (p. 2058). These approximations are designed to support PSTs' core skills and include analyzing written cases, engaging in live role plays, generating components of practice, or enacting the practices in an intellectually safe and productive environment in teacher education programs.
Recently, some researchers have suggested writing lesson plays as a way to offer approximations of practice [9]. Lesson plays are imagined interactions written in verbatim form, which create a "bridge between a plan for action and its implementation" while offering researchers a window into one's mathematical knowledge for teaching [10] (p. 271). This study utilized the lesson play method to identify a group of PSTs' ways of eliciting and responding to students' thinking and understanding. In recent years, lists of core competencies and high-leverage practices have often been referenced as part of the teacher education curriculum. While these practices—such as eliciting students' thinking—sound self-evident, it remains elusive how the construct is actually interpreted and put into practice by novice teachers. By offering the opportunity to write lesson plays with approximations of practice in mind, this study intended to provide PSTs with the space to interpret for themselves the practice of eliciting and responding to students' thinking. The overarching question that guided this study is as follows: What are the characteristics of PSTs' conversational strategies as they elicit and respond to student thinking as reflected in lesson play scripts? To answer this question, this study analyzes patterns regarding the following aspects: (a) the frequency of the teacher's follow-up questions to elicit and respond to student thinking in lesson play scripts, and (b) the types of the teacher's conversational strategies in the launch, elicitation, and closure stages of lesson play scripts.

2. Related Literature

2.1. Teacher Questioning, Talk Moves, and Eliciting and Responding to Students’ Thinking

Posing appropriate questions in the classroom with a clear intention and purpose is a critical instructional skill, which should be learned and refined [7,11]. Combined with the importance of promoting student engagement and noticing students’ mathematical thinking [12] for productive mathematics instruction, recent research in mathematics teacher education stresses that skillful elicitations and responses to student thinking should be among the core practices to be fostered in teacher education programs [13,14].
Research on verbal classroom interactions has a long history, probing both quantitative and qualitative aspects. Some studies focus on the quantitative aspect of interactions, reporting that teachers' questioning made up the majority of classroom verbal interactions [15,16]. However, research has also considered the purposes and patterns of teacher questioning. A typical pattern of classroom interaction is frequently referred to as Initiation-Response-Evaluation (IRE) or Initiation-Response-Feedback (IRF). This three-step sequence is often associated with closed questions, which are mostly used to check for the acquisition of general information; some educators suggest avoiding or minimizing this conversational pattern. Many studies of teachers' questioning report that teachers predominantly use closed, low-level questions rather than open, high-order questions [17,18].
Nevertheless, educators also admit that this IRE/IRF sequence of interactions is not exclusively associated with closed questions; this pattern can offer more open opportunities when it is combined with appropriate follow-up questions, moving beyond simply providing evaluations or feedback (cited in [19,20,21]). Building on the potential of teachers’ follow-up questions to enrich the discourse patterns in mathematics classrooms, many educators suggest that teachers use the third turn in this sequence more productively and extend the third turn with various follow-up questions purposefully responding to students’ responses.
For example, Chapin, O'Connor, and Anderson [22] suggested various talk moves, which refer to teachers' verbal actions that effectively elicit, expand, and support students' thinking. Examples include the following: (a) using wait-time to create space for students to process the question and prepare an answer, (b) revoicing a student's contribution, (c) asking students to restate another student's contribution, (d) prompting students for further discussion, and (e) asking students to apply their own reasoning to someone else's reasoning. These talk moves not only offer teachers the opportunity to better elicit and respond to student thinking but also help students feel heard. A recent study [23] reports that teachers identified by students as promoting mathematics discussions tended to ask follow-up questions that increased and sustained student participation in mathematics discussions. These findings suggest that, by asking follow-up questions, the teacher listened and responded to students' ideas, and the students felt heard.
Prior research on teacher questioning often describes the IRE/IRF pattern. However, given the recent attention to productive talk moves, along with the recognition of elicitation and response as core teaching practices, the present study suggests that the IRE/IRF pattern is unsatisfactory. This study proposes that the IRE/IRF pattern be redefined as Initiate-Respond-Elicit and Initiate-Respond-Follow-Up. Additionally, PSTs should be given more intentional opportunities to practice this skill in their teacher education programs, as it might not have been a familiar practice when they were students.

2.2. Studies on PSTs’ Eliciting and Responding to Student Thinking

There has been some discussion of what constitutes good practice in eliciting and responding to students' thinking for in-service teachers. Lampert [24] urges teachers to initiate and support social interactions for mathematical arguments in response to their students' conjectures. This contrasts with conventional approaches in school mathematics that rely on authority for ratification, emphasize rules and formulas, and keep students silent so that their thinking is never made explicit. Borko's review [25] of research on teacher professional development highlights the teacher's facilitator role in guiding student thinking and elaborates on how teachers develop ways to elicit and listen to their students' mathematical thinking.
Studies on PSTs’ practicing of eliciting and responding to student thinking have been relatively sparse until recent developments in practice-based teacher education. In practice-based teacher education, sets of core practices or high-leverage practices are identified to support novice teachers in learning how to competently enact those practices [26,27].
There have been some studies probing PSTs' skills of questioning, eliciting, and responding to student thinking. Some studies used clinical interviews with real students [11,28,29]. Others used teaching simulations with a hypothetical student profile (i.e., a person whose actions and statements are guided by artificial reasoning and responding, including scripted responses) [4]. In recent years, teaching simulations via digital platforms have also become more popular [30].
This overview briefly summarizes, compares, and contrasts the contexts and findings of five selected studies that reported PSTs’ performance on eliciting and responding to student thinking (Table 1).
These five studies involved elementary PSTs who were at different stages in their teacher education programs. The studies specifically examined knowledge of and skills regarding questioning, eliciting, and responding to students' thinking, and all used a one-on-one assessment interview context with real or hypothetical students. They share some findings in terms of areas for improvement: (a) instructing students rather than eliciting their thinking, asking leading questions, and filling in the student's thinking are commonly observed teacher actions when PSTs are asked to elicit student thinking, and (b) PSTs need to probe more to elicit students' mathematical thinking. Informed by these prior studies, the present study intentionally included some components in the planning process to draw the PSTs' attention to these areas.

2.3. Lesson Play as a Medium for Approximation

In recognition of the challenges involved in developing pedagogical practices, educators have emphasized designing pedagogical approaches that help PSTs engage in developing teaching practices. Grossman et al. [8] propose the pedagogies of enactment, including representation, decomposition, and approximation of practice. Representation of practice aims to make teaching practice visible for analysis and reflection. Decomposition, the breaking of a practice into its constituent parts, helps PSTs attend to the discrete components of teaching. Approximation refers to engagement in components of practice under conditions of reduced complexity [8] (p. 2055).
As teacher educators enact pedagogies of “approximation” [8], they offer “opportunities [to PSTs] to engage in practices that are more or less proximal to the practices of a profession” with varied authenticity [8] (p. 2056). An example of an approximation pedagogy is the lesson play: authoring verbatim an imaginary written interaction between a teacher and students [9,10]. Zazkis et al. [9] suggest that lesson plays can offer opportunities for in-depth discussions of key aspects of teaching mathematics prior to enactment. Teacher educators and researchers have utilized these imaginative scripting approaches for various purposes in teacher education (see [32] for more information).
The present study utilizes lesson plays as a medium for PSTs to practice the skills of eliciting and responding to student thinking, as well as for teacher educators to identify patterns of questions and moves PSTs employ. The prior studies reviewed in the previous section offered approximations of practice through simulations with peers or teacher educators, through work with real students, or through hypothetical student profiles in one-on-one scenarios. In both contexts, PSTs still face unexpected student responses and make in-the-moment decisions, which can hinder researchers from deciphering the PSTs' original intentions. Thus, the reason for working with lesson plays in this study is to provide PSTs with space and structure for more elaborated thought processes and opportunities to make their intentions more visible in a less authentic context.

3. Method

3.1. Participants and Context

The study's participants were 32 undergraduate PSTs enrolled in a required elementary mathematics methods course at a Midwestern university in the United States. This paper's author was the course instructor. Before taking this methods course, the PSTs had completed two mathematics content courses focusing on number theory and geometry. Typically, the PSTs took this course one or two semesters before their final full-time student teaching experience in an authentic classroom setting. This had been a lecture-based course with plenty of hands-on activities, aiming to present an overview of the teaching of mathematics in elementary school. At the time of data collection for this study, the program and course were transitioning to a more practice-based teacher education model. Concurrently with this course, the PSTs also had field experiences in local elementary classrooms, consisting of observation and limited instructional activities under the supervision of mentor teachers. However, at the time of data collection, authentic interactions with real students were not available due to the global pandemic. Thus, the lesson play method was utilized as a "bridge between a plan for action and its implementation" [10] (p. 271).
Because this was the only mathematics methods course offered to PSTs in the program, it was challenging to focus on specific topics or teaching practices over a long period. By the time of data collection, the PSTs had reviewed several documents, including the Common Core State Standards for Mathematics (CCSSM) and the accompanying Standards for Mathematical Practice [33], along with the guidelines for effective teaching practices [34] and high-leverage teaching practices [14]. They also read several articles regarding teacher questioning and talk moves [22] and watched short video clips related to teacher questioning techniques available via Internet-based teacher resources. The PSTs were also provided with a brief introduction to lesson plays by reading a chapter of Zazkis et al. [10]. Aside from these introductory materials, the actual experiences with students in the field settings varied depending on the discretion of individual mentor teachers. Thus, this study focused on the PSTs' developing perceptions of what constitutes effective elicitation of students' thinking as shaped by this exploratory stage of their teacher education program.

3.2. Task and Procedures

As part of the course activities, the PSTs were required to develop a plan to assess a fourth-grade student's understanding of fractions through three sets of problems. The sets focused, respectively, on the basic concept of fractions using the area model, number line representations, and the comparison of two fractions. Table 2 below shows the target standards/objectives and requirements for each set.
The task consisted of two parts that were completed in and outside class time over three weeks. The first part was planning a one-on-one interview with a fourth-grade student. The PSTs were required to produce three question sets and include the following components in their planning document: (a) three differentiated tasks (core, less challenging, and more challenging questions); (b) anticipated student confusion; (c) a list of follow-up questions/prompts; and (d) anticipated student explanations. Informed by prior studies reviewed in the previous section, the PSTs were asked to prepare additional problems that could be used to learn more about the student’s thinking [4] as well as follow-up questions for responding to both correct and incorrect student responses [11,28,29].
The second part was writing a lesson play script for each question set based on the planning document. When writing lesson play scripts, the PSTs were asked to consider the following factors:
  • These imagined assessment interviews would serve as diagnostic assessments of a randomly chosen student whose personal and academic background was unknown. Using the planned assessment, it was the goal of the PSTs to determine the level of the student’s understanding against the chosen standards by eliciting their thinking. Thus, there was no predetermined profile of the student as used in the prior study [4]. It was intentionally left open to see how the PSTs’ anticipation of the students’ confusion and their improvisation of the breadth and depth of elicitation could be tailored to various imaginary students in the lesson play scripts.
  • There was no required minimum or maximum number of talk turns included in the lesson play scripts.
  • The PSTs needed to include all the teacher talk and student talk from the beginning to the end, imagining they met a new student for an assessment interview.
  • The PSTs could end the lesson play scripts once they determined that they fully elicited the student’s thinking.
This work intended to develop PSTs’ skills at eliciting individual student thinking, which is a high-leverage practice in which teachers must be competent or else face significant obstacles while trying to effectively teach [7]. This activity’s purpose was shared with the PSTs before they began developing the planning document and lesson play scripts.
The PSTs produced three lesson plays for the three sets of questions. One PST out of 32 PSTs did not write a script for one set of questions. Thus, we collected a total of 95 lesson play scripts for analysis. Appendix A shows a sample of the planning document that contains the required components and the associated lesson play script by a PST for Set 1.

4. Data Analysis

A total of 95 lesson play scripts (two or three scripts from each of the 32 PSTs) were analyzed to find trends within the data and reflect on their meaning [35]. The scripts were analyzed following an inductive content analysis approach by developing data-driven codes [36,37]; the analysis was thus grounded primarily in the data collected for this study.
Several aspects were considered when examining the data. The first aspect was the frequency of teacher–student talk: the number of turns between the teacher and the student in each lesson play script was counted, with particular attention to the teacher's follow-up questions to the initial student response.
The second aspect considered was the nature of the teacher's follow-up questions in three segments of a lesson play: (a) launch, (b) active elicitation, and (c) closure. This part of the analysis employed multiple stages to accomplish the open-coding process. First, data were categorized based on the themes that emerged as the researchers read all the lesson play scripts. To ensure reliability in coding the nature of the teacher's follow-up questions, two research personnel independently coded the data. There was 86% agreement between the two coders, which met the minimum acceptable percent agreement of 80% ([38] as cited in [39]). The two coders resolved any discrepancies between their initial coding through discussion until they reached agreement. To highlight different approaches taken by the PSTs, the frequencies of codes in the PSTs' scripts were calculated.
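For illustration, a minimal sketch of the percent agreement check described above is given below. The sketch is hypothetical: the study's coding was done manually, and the code labels and data shown here are invented for the example rather than taken from the study's coding scheme.

```python
# Minimal sketch (hypothetical): simple percent agreement between two coders.
# The code labels below are illustrative, not the study's actual categories.

def percent_agreement(codes_a, codes_b):
    """Proportion of meaning units to which both coders assigned the same code."""
    assert len(codes_a) == len(codes_b), "Coders must code the same set of units"
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

coder_1 = ["probe", "revoice", "lead", "probe", "connect"]
coder_2 = ["probe", "revoice", "lead", "elicit", "connect"]

print(f"Percent agreement: {percent_agreement(coder_1, coder_2):.0%}")
# Prints "Percent agreement: 80%", the minimum acceptable threshold cited above.
```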

5. Findings

The first part of the findings reports the types of teacher talk and their frequencies of occurrence. Frequency is reported as the percentage of specific themes present in individual PSTs' responses. Some PSTs addressed multiple aspects, and thus we coded their work into multiple categories, which resulted in some columns totaling more than 100%. The second part of the findings provides examples that illustrate the PSTs' ways of eliciting and responding to students' thinking.

5.1. Frequency of Teacher Talks

By design, the task asked the PSTs to write a sequence of interactions consisting of the teacher's initiation of a question, the student's responses, and the teacher's listening and posing of additional questions to further elicit student thinking. This was intended as a special form of IRF in which the teacher's feedback takes the form of non-evaluative questioning. Using the 95 lesson play scripts written by 32 PSTs, the number of turns of teacher talk—including the first initiation questions—was examined. The example shown in Appendix A consists of 14 turns of teacher talk.
The average number of teacher talk turns was about eight; the lowest number was two, which ended the interaction after one simple I-R-F cycle, and the highest was 34. For analysis purposes, the number of teacher talk turns in each script was counted and categorized into three levels for convenience, as shown in Table 3. The related analysis should be interpreted with caution: although longer exchanges may indicate more complex teacher–student interactions [23], this categorization signifies only the quantity, not the quality, of talk.
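As a rough illustration of this counting step, the sketch below counts teacher turns in a script and assigns a frequency level. It is hypothetical: the script representation and the level cut points are assumptions made for the example, since the actual boundaries are those reported in Table 3 (not reproduced here).

```python
# Hypothetical sketch: count teacher turns in a lesson play script and assign a
# frequency level. The cut points below are illustrative assumptions only; the
# study's actual level boundaries are given in Table 3.

def count_teacher_turns(script):
    """script: list of (speaker, utterance) pairs, e.g., ("Teacher", "...")."""
    return sum(1 for speaker, _ in script if speaker == "Teacher")

def frequency_level(n_turns, low_max=5, middle_max=10):
    """Bucket a turn count into low/middle/high (assumed cut points)."""
    if n_turns <= low_max:
        return "low"
    if n_turns <= middle_max:
        return "middle"
    return "high"

sample_script = [
    ("Teacher", "What fraction of the rectangle is shaded?"),
    ("Student", "Four-sixths."),
    ("Teacher", "Can you explain how you got your answer?"),
    ("Student", "Four of the six equal boxes are shaded."),
]

turns = count_teacher_turns(sample_script)
print(turns, frequency_level(turns))  # 2 low
```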
Since each of the 32 PSTs wrote two or three lesson play scripts, the frequencies of teacher talk across the multiple scripts developed by each PST were also examined to see whether the number of teacher talk turns was related to the individual PST's capacity or preference. The scripts written by 12 PSTs (36%) were all at the same frequency level (e.g., all three of a PST's scripts were at the low level). The scripts of 17 PSTs (53%) spanned two frequency levels (e.g., two of a PST's scripts were at the low level and one at the middle level). The remaining three PSTs' (9%) scripts spanned all three levels (e.g., a PST wrote one low-, one middle-, and one high-frequency script).
Because the PSTs wrote three sets of scripts focusing on different mathematical topics/representations, the frequency of teacher talk was also examined to see whether the content was related to the teacher talk frequencies in the scripts. The average number of teacher talk turns was 7.75 for Set 1 (fraction concepts using the area model), 8.06 for Set 2 (fraction concepts using the number line representation), and 8.90 for Set 3 (comparing fractions without specifying the representation to use).

5.2. Types of Teacher Talk

In addition to the frequency of teacher talk turns, the type/intention of teacher talk was also examined. The inductive approach yielded various codes/themes. Some are similar to the items appearing in the extant literature as reviewed in the literature review section. The codes/categories were identified in three sections of lesson play scripts: (a) Launch (first teacher talk), (b) Elicitation, and (c) Closure.
Launch (first teacher talk). Table 4 lists the inductive categories used in the analysis of the first stage of teacher talk, the launch portion of the lesson play scripts up to the point where the problems were first presented, prior to active elicitation. Here, the frequencies report whether specific categories were present or absent in each script.
About 69% of lesson play scripts started in a very neutral and straightforward way by presenting prepared written problems and/or slightly rephrasing and elaborating on the prepared questions. The low- and middle-frequency groups took this launching strategy more often than the high-frequency group.
The rest of the lesson play scripts (about 31%) added additional components to presenting prepared problems, such as checking on background knowledge/experience and key terms and informing the student on interview protocol and the mathematics topics to be focused on. The high-frequency group took this launching strategy more than the low- or middle-frequency groups.
Teacher talks during active elicitation. Active elicitation starts after the teacher presents the task and the student provides their first response. In this study, the majority of lesson play scripts presented the task in the first teacher talk; thus, for most cases, active elicitation extended from the third turn of the interaction to the teacher's talk immediately preceding the student's final talk. Because each turn of teacher talk may include multiple sentences with different purposes/intentions, each turn could be coded into multiple categories. Thus, in this analysis, as shown in Table 5, the frequencies are reported as the number of occurrences out of the total meaning units (i.e., individual sentences or meaningful phrases) used for analysis. The total number of meaning units considered was 748: 113 units, 234 units, and 401 units in the low, middle, and high groups, respectively (see Appendix B for more detailed categories and analysis).
Elicitation of methods or reasoning for actions. About 30% of the total teacher talks belong to this category. More than half of these examples focused on asking the student to describe the methods used (e.g., “Can you tell me what you used and what you did?”). The remaining examples asked for the reason for specific actions (e.g., “Could you explain why you put the 1/2 and 4/6 where you did?”) or asked an open-ended question to explain the response (e.g., “Could you explain your answer?”). Looking at each group’s percentage frequency, the low group’s frequency was the highest.
Follow-up probing. About 26% of total teacher talks were devoted to further probing. Almost 85% of the examples in this category intended to get students to further explain their thinking about their actions or answers (e.g., “Okay, so you counted five squares. How did you know it was ⅝ though?”). The remaining examples consist of either asking for step-by-step procedures (e.g., “Okay, what is your next step after that?”) or asking the student to explain/clarify the underlying meaning of their response (e.g., “So, you got six. What does six mean?”). The high-frequency group wrote this type of teacher talk more frequently than the other two groups, as shown in the groups’ percentage frequencies.
Teacher-led process. In about 14% of teacher talks, the PSTs tried to offer guidance or hints. Among these examples, about 78% of the talks led the student to a specific method/answer, explained the procedures to take, or suggested alternative strategies (e.g., “How about we try to find the common denominator? We want to try to make it the same to see which one is bigger. Do you know how to find the common denominator?”, “First, you need to find a common denominator by multiplying denominators,” “Another skill you could also try is drawing dotted lines and breaking up B into as many pieces of A as possible”). Other examples in this category attempted to draw the student’s attention to certain key aspects (e.g., “Are all the pieces evenly divided?”) or stated or wrote the final answer before the student finalized their answer (e.g., “Yes, then you can put ¼ under here.”). In this category, the high-frequency group produced a slightly higher percentage frequency than the other two groups.
Making connections. About 13% of the total teacher talks asked the student to show their ideas using different representations such as drawings or manipulatives (e.g., “Can you draw a picture to help me understand your thinking?”, “Could you show what that would look like if you use fraction circles?”), or asked if the solution can be extended to other situations (e.g., “So, how would we do this with non-unit fractions?”). The percentage frequencies in the low and middle groups were slightly higher than in the high group.
Modifying questions. In about 8% of total teacher talks, the PSTs asked questions that modified the original problems. About 46% of those talks utilized the differentiated problems (e.g., less or more challenging questions) that were prepared in advance. For the remaining examples, the PSTs modified the problem by breaking the original questions into a series of small questions (e.g., “Let’s look at the square for the corn only. How much of the square does the corn take up?”) or by rephrasing or changing the used representations (e.g., “Let me rephrase,” “Let me write it down”) when facing the student’s confusion. The percentage frequencies in the low and middle groups were slightly higher than in the high group.
Revoicing. In about 5% of the total teacher talks, the PSTs used revoicing by restating the student’s response to confirm (e.g., “Because you had a whole circle and you cut it in half… Is it what you are saying?”). The percentage frequencies in each group were similar.
Miscellaneous talks. About 3% of the total teacher talks did not specifically belong to the above categories. Those include offering the student wait time (e.g., “I will give you a minute to think”), checking for the student’s confusion (e.g., “So, did the parts of the shaded circle confuse you?”), asking the student to recall what they have done and/or said (e.g., “Do you remember what we talked about a fraction being in the first question?”), and reminding the student of what to do (e.g., “Remember to use that number line.”).

5.3. Illustrative Examples from Active Elicitation

The types and patterns of teacher talk during the active elicitation stage reported above revealed many different frequencies and types of requests the PSTs used to identify student thinking through imagined interactions. Because the PSTs were the writers of these plays, designing the plot and defining each character's actions and talk, the lesson play scripts may be indicative of the PSTs' knowledge and performance as well as their perceptions about students' ability and this specific core teaching practice. In addition to the quantitative report, this section presents several examples that offer more nuanced insights into the ways different categories might be intertwined.

5.4. Looking into Some Short Scripts

Although a small number of teacher talk turns does not necessarily indicate ineffective eliciting, some cases are worth noting.
Ending scripts without active elicitation. A couple of scripts in the low-frequency group presented the problem in several steps without starting any active elicitation. The example shown below consists of two teacher turns, both of which were coded into the launch category because they only read the prepared question (see Figure 1). The student provided their reasoning, but it was not prompted by the teacher's eliciting.
Teacher: "John says that 1/3 of the square is shaded. Do you agree? If not, approximately how much of the square is shaded?"
Student: “No, I don’t agree because if you split this in half (the square), this is half of the square (pointing to the shaded part), and there are three parts, but they are not equal.”
Teacher: “How much of the square do you think is shaded?”
Student: “Half of it.”
This type of lesson play may be an indication of PSTs’ very narrow understanding of what is involved in eliciting and responding to students’ thinking; this case indicates that, for some PSTs, the main goal is to capture the student’s answer.
Overestimating the students' ability and performance. Some lesson play scripts ended abruptly without much teacher eliciting. In some of these scripts, the student fluently explained with great elaboration, and the script ended with the teacher deciding that they had fully elicited the student's thinking. Here are two examples (see Figure 2a,b):
Example 1
Teacher: “What fraction is located at the dotted spot on the number line?”
Student: “6/8.”
Teacher: “Can you explain how you got your answer?”
Student: “I counted the spaces on the number line. One, two, three, four, five, six, seven … Oh, I miscounted. The answer is 7/8 because the dot is between the numbers zero and one and the number line is split into eight sections and the dot is located in the seventh section. I thought the dot was at the sixth spot at first.”
Teacher: “Why is 7 the numerator and 8 the denominator?”
Student: “The dot is at seven, but the entire number line is divided into eight sections.”
Example 2
Teacher: "So the first question says: How are the two fractions pictured below equivalent if each circle represents one whole?" (see Figure 2b)
Student: “The first fraction is 1/2 because there are two parts, and one of them is colored blue. The second fraction has six parts, and three of them are shaded blue, so that is 3/6. I know that 3/6 can be divided by 3/3, which will equal one half. So, they are equivalent because they are both equal to one half.”
Teacher: “I see what you did. You started by looking at the first fraction and saw that there were two equal parts and one of those parts was shaded in making the fraction 1/2. Then you looked at the other whole circle which had six equal parts and three shaded in so you knew the fraction was 3/6, but you noticed that both the 3 and the 6 could be simplified by 3 and that would make this fraction also 1/2. When looking at 1/2 and 1/2, you know they are equal. Did I explain this correctly?”
Student: “Yes.”
Teacher: “Okay, let’s take a look at the next problem.”
In the first example, the student realized the error in the initial answer and explained the correct answer and what caused the error without any teacher prompts. In the second example, the student immediately provided an answer and explanation in one turn, and the teacher revoiced what the student said and concluded the interview.
The PSTs' weak mathematical knowledge. In this study, the PSTs were asked to develop their own assessment items. Although the fractions used were simple, some scripts contained invalid or incorrect mathematical ideas and representations.
Example 1 (see Figure 3a) was the core question for Set 1 and was used in the lesson play script. The PST's expected explanation in the planning document, shown below, did not match the representation provided in the problem: "3/6—There are six equal sections and three are shaded, and the student can simplify to 1/2 if needed".
The lesson play script also did not reflect this, as shown below:
Teacher: “Can you try to solve this for me?”
Student: “I am going to count the pieces because I was taught that in class. I see one, two, three, four, five, and six pieces in the circle. I now see that there are three shaded pieces.”
Teacher: “What would you do next? Use the markers if needed to solve the problem.”
Student: “So, I am going to put 3 with a line over the 6, and I get 3/6 fraction for the answer.”
Teacher: “How did you get your answer? Talk me through it.”
Student: “I just counted the pieces of the circle and then counted the shaded parts after.”
Teacher: “That is an awesome way to do it.”
This PST used a similar incorrect representation for the more challenging differentiated problem in the planning document but did not incorporate it in the lesson play.
Example 2 (see Figure 3b) was prepared as the more challenging differentiated problem in Set 3 but was not utilized in the lesson play script. This PST noted in the planning document that the answer is that the fractions are equivalent and that the problem would be more challenging because different shapes are used as the whole, apparently without being aware that the wholes must be the same (with the same area) for such a comparison. Thus, it is possible that, had the student answered that they are equivalent, this PST would have ended the assessment without further elicitation.
As the key understandings about fractions include equal partitioning [40] and the invariance of the whole [41], these PSTs' lack of knowledge prevented them from further eliciting the student's thinking. In other scripts, similar problems were used as non-examples, asking students to explain and justify whether the presented solutions were correct, which yielded more elicitation.
Leading: teaching algorithms. Some scripts ended with very short interactions when the teacher focused on teaching how to get the answer using the common denominator algorithm. In the following example, a PST asked the student to compare 5/11 and 3/10 for the Set 3 core question (Note: according to the standards document, 11 is not a recommended denominator size in grade 4, but this PST chose this number).
Example
Teacher: "Which fraction is greater, 5/11 or 3/10? You can use your pencil and show me your math, too."
Student: (drew out 5/11 and 3/10 by making blocks and shading in the numbers to compare them.) "3/10 is bigger because 5/11 is split up into more parts and the 3/10 overlaps the 5/11."
Teacher: "Okay, let's look at this another way. So, we have 5/11 and 3/10. Instead of drawing out squares and shading them, what's another way to figure this out?"
Student: “I don’t know.”
Teacher: “How about we try to find the common denominator? We want to try to make it the same to see which one is bigger. Do you know how to find the common denominator?”
Student: “No.”
Teacher: "Find all the multiples of 10 and 11. Then find one that matches. Then multiply the numerators. Now we have 50/110 and 33/110. So, we can say that 5/11 is actually greater than 3/10."
Student: “Okay.”
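For reference, the common-denominator reasoning scripted above amounts to the following arithmetic, restated here in symbols:

```latex
\frac{5}{11} = \frac{5 \times 10}{11 \times 10} = \frac{50}{110}, \qquad
\frac{3}{10} = \frac{3 \times 11}{10 \times 11} = \frac{33}{110}, \qquad
\frac{50}{110} > \frac{33}{110} \;\Rightarrow\; \frac{5}{11} > \frac{3}{10}.
```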
Several scripts specifically guided the process of using the cross-multiply algorithm, which is not considered to be the main problem-solving strategy at the target grade level.

5.5. Looking into Some Longer Scripts

Overall, more types of teacher elicitation prompts appeared in the longer scripts, and anticipated student confusion and additional (differentiated) problems were incorporated more frequently. However, the scripts in the higher frequency group did not always yield effective elicitation. Here are some examples:
More cycles of simple IRE from anticipated confusion and additional (differentiated) problems. The setting and the characteristics of the imaginary students in the lesson plays were intentionally left open to see how the PSTs' anticipation of students' confusion and their improvisation of the breadth and depth of elicitation could be tailored to various imaginary students. However, the PSTs were required to prepare a list of anticipated student confusions and differentiated problems in their planning documents.
Overall, 21 out of 95 scripts partially or extensively incorporated the anticipated confusion in the planning document (4 scripts from the low-frequency group, 5 scripts from the middle-frequency group, and 12 scripts from the high-frequency group). A total of 37 out of 95 scripts partially or extensively incorporated the additional (differentiated) prepared problems (5 scripts from the low-frequency group, 14 scripts from the middle-frequency group, and 18 scripts from the high-frequency group).
One notable aspect is that incorporating the possible student confusion or additional (differentiated) problems increased the number of teacher talk turns. However, this alone does not guarantee meaningful eliciting and responding to the student's thinking. For instance, the following example is from Problem Set 2, for which the PST anticipated some confusion in identifying improper fractions and prepared three problems with different levels of challenge (see Figure 4).
------ Some additional launch talks prior to this------
Teacher: “Could you read that question for me?”
Student: “Find the value of location a.”
Teacher: “Okay, so let’s find location a, and what would the value of that be in fraction form?”
Student: (points to a) “Two and a half.”
Teacher: “Two and a half. Okay, could you explain to me how you got that?”
Student: “Because it’s between two and three, I mean it’s the middle one.”
Teacher: “Okay great, could you also tell me the value of location c?”
Student: “Um, wait, would this one be a quarter?” (Student points to the line at 4 and 1/4)
Teacher: “Yes, so what would the next mark be?”
Student: “Half and a quarter?”
Teacher: “So, how did you get that?”
Student: “Because it’s not a half, it’s more.”
Teacher: “Okay, so let’s start counting from the number four.”
Student: “4, 4 1/4, 4 1/2—I’m not too sure.”
Teacher: “Let’s go to this one, b.”
Student: “Um, that’s a half.”
------ End of the lesson play------
In this script, the PST mainly used the move of eliciting the student's methods or reasoning for actions ("Could you explain to me how you got that?" "How did you get that?") while asking the student to move on to the next question. Thus, although this script belongs to the high-frequency group, the sequence of interactions was straightforward (the teacher asks a question, the student responds, and the teacher poses additional problems without further elicitation). Additionally, it was evident that this PST intentionally tried to avoid evaluating or leading the student; however, overall, this script only seemed to create more cycles of the simplest form of IRE.
Teacher leading. Some lesson play scripts that developed more teacher talk turns during active elicitation contained various teacher-led talks. This included leading the student through a specific method or an answer; explicitly explaining or modeling the concept, procedure, or strategy; or giving an alternative strategy for the student to use.
Example 1 (middle-frequency group)
Teacher: “We have 1/2 and 2/4. Is one bigger than the other or are they equal?”
Student: “2/4 is bigger.”
Teacher: “How come?”
Student: “Because 4 is bigger than 2.”
Teacher: “Alright, let’s draw a picture for this one.”
------ The teacher drew the area model for each fraction and explained procedures -----
In Example 1, the PST incorporated the anticipated confusion. However, once the student responded with an incorrect answer, the PST took the path of explaining and teaching using drawn representations instead of further eliciting.
Example 2 (from the high-frequency group)
------ Some additional launch talks prior to this------
Teacher: “This question is asking you to compare these two fractions. Which is bigger, 2/3 or 9/10?”
Student: (Writes out thoughts to create common denominators) “9/10 is bigger.”
Teacher: “How did you get that?”
Student: “I made the fractions into ones with common denominators so 2/3 is 20/30 and 9/10 is 27/30. And 27 is bigger than 20 so 9/10 is larger.”
Teacher: “That’s awesome! Is there another way you can think about this problem?”
Student: “I don’t know.”
Teacher: “What if you think about what part of the fraction is missing? How much do you need for 9/10 to become whole?”
Student: “1/10?”
Teacher: “And how much do you need for 2/3 to become whole?”
Student: “1/3.”
Teacher: “Right, so, which fraction has a smaller missing part? Which fraction is smaller, 1/10 or 1/3?”
Student: “1/10 is smaller.”
Teacher: “So, what does that mean about 9/10 compared to 2/3?”
Student: “9/10 is a bigger fraction because it has a smaller missing part?”
Teacher: “Exactly! 9/10 has a smaller missing part, so the fraction is bigger!”
------ End of the lesson play------
In Example 2, the student answered using the algorithm involving finding the common denominator. More elicitation questions could have been used in response to the student’s initial method and answer. However, this PST gave an alternative strategy, which was prepared in the planning document. After this point, the teacher offered step-by-step instructions, and the student simply followed them.
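For reference, the "missing part" strategy scripted above rests on the following comparison, restated here in symbols:

```latex
1 - \frac{9}{10} = \frac{1}{10}, \qquad
1 - \frac{2}{3} = \frac{1}{3}, \qquad
\frac{1}{10} < \frac{1}{3} \;\Rightarrow\; \frac{9}{10} > \frac{2}{3}.
```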
Additionally, similar to some cases in the low-frequency group, several scripts specifically guided the process of using the cross-multiply algorithm. There were some additional types of talks in the scripts containing more turns of teacher talk, but the essence was the same in that the teacher attempted to lead the student’s thinking.
Teacher talks during closure. Table 6 summarizes the different ways the PSTs concluded their lesson play scripts. Here, the frequencies report whether specific categories were present or absent in each script.
About 54% of the lesson play scripts ended with the student’s turn without the teacher’s closure. The percentage frequency in the low group was higher than in the other groups. About 20% of the scripts ended with neutral comments or thanking the student for their work. Another 20% of scripts ended with praise for specific aspects the student demonstrated. Additionally, 14% of the scripts wrapped up the interview with the teacher’s evaluative comments on the correctness of the answer or confirming comments about the strategies used.

6. Discussion and Implications for Teacher Educators

Over recent decades, the focus of research on teacher talk has shifted from justifying the need for teacher talk to support student learning as socialization (i.e., conversations) in the classroom to describing the quality of teacher talk, including discerning strategies and measuring effectiveness. That said, research has not reported much about the number of turns of teacher talk in eliciting and responding to student thinking. Considering a PST's learning in teacher education as developmental, this study adds that the number of turns of teacher talk alone may not determine PSTs' success in eliciting and responding to student thinking. Nevertheless, practicing more turns of teacher talk would (1) produce more opportunities to reflect on the process and ultimately (2) increase the chance of facilitating effective teacher talk in mathematics classrooms. This section revisits this study's findings and suggests two areas for further exploration in mathematics teacher education: the total number of turns taken by PSTs, and the types of teacher talk involved in the launch and closure.
Frequency of the number of turns taken by PSTs in lesson plays. As reported in the findings section, longer lesson plays did not necessarily produce rich talk, and some lesson plays primarily gained volume from excessive yet fruitless talk in terms of eliciting, such as guiding students through procedures. However, the data showed that the frequency of probing as a follow-up was even lower in the low-frequency group than in the other two groups. Thus, it is inconclusive whether the number of turns taken by the PSTs in the lesson plays is indicative of their performance. It was also found, however, that short lesson plays did not contain much elicitation, presenting only simple iterations of typical IRE/IRF cycles. It could be that some PSTs' perceptions of eliciting and responding to students' thinking extend no further than quickly asking questions and getting students' answers. This leads us to suspect that the number of teacher talk turns reflects the complexity of student–teacher interactions and that longer lesson plays may serve as a space where PSTs can write a mathematical conversation that starts from checking the correctness of student answers but goes beyond IRE/IRF cycles as they learn, in teacher education, to recognize teacher talk as an opportunity to gain access to (and stretch) student thinking.
Launching and closing lesson plays. Research has not reported much about the launch and closure stages of lessons written by PSTs. This study looked into the PSTs' patterns of launching and closing their scripts as lesson play writers. While the focus of the lesson play was expected to be purposefully eliciting students' thinking, the launch and closure in the PSTs' lesson plays occurred very abruptly in the majority of the scripts, indicating little build-up toward gaining access to student thinking. This confirms Groth et al. [28], who reported that the cognitive and affective preparation needed to establish a relationship with a student in a clinical interview was somewhat neglected by the PSTs in their study, which involved interacting with real students. Likewise, the launch in the majority of lesson play scripts in this study (69% of scripts) rushed to reading the prepared problems and asking the student to provide answers. Few scripts showed efforts to build rapport with the student by assessing their interests, prior knowledge, or experiences. Similarly, more than half of the scripts (54%) ended with the student's response without teacher closure. This is indicative of PSTs' naïve perceptions of teacher practice and strategies to elicit student thinking—namely, asking a question and checking the student's answer. Although the launch and closure were not the most critical parts of a lesson play, it is worth noting that student thinking during launch, elicitation, and closure is not necessarily the same; it changes not only within each stage but also throughout the lesson as a whole. The findings further indicate that PSTs struggle to draw a complete picture of a lesson in which each piece of instruction ties together and builds up to closure; they could therefore benefit from learning with examples in which student thinking develops as a continuum across a series of launches, elicitations, and closures facilitated by teacher talk.
Do lesson plays yield different patterns of elicitation from other approximation approaches? Writing lesson plays afforded the PSTs the opportunity to demonstrate their skills in eliciting and responding to students' thinking in a conversational setting. As a teacher education setting that presents PSTs with approximations of practice, our study's context is not as authentic as the settings of other studies in the literature [4,11,28,29,31]. Those studies had PSTs interact with a hypothetical student with a prescribed profile or with real elementary students and make in-the-moment decisions in order to respond to the students, which is definitely a core pedagogical skill in a real teaching context. Understandably, a classroom setting with students poses challenges, and it is a difficult task for novice teachers to demonstrate the skill of facilitating learning in that scenario. For example, Groth et al. [28] mentioned that probing student thinking—especially formulating teacher talk spontaneously (or unrehearsed)—was particularly difficult for the PSTs in their study. Our study provided a different condition and context from these previous studies, in which making in-the-moment decisions is timed and critical.
Writing lesson plays in this study was not time-bound and allowed ample time for the PSTs to craft their teacher talk. Thus, the context of writing lesson plays in this study was purposeful in that the task was designed to reduce unproductive teacher talk moves and increase clarity and depth (albeit "imagined") in eliciting and responding to student thinking. However, the findings indicate that some undesirable teacher talk moves still appeared in the lesson plays. For example, all five studies discussed in the literature review section reported that undesirable approaches such as leading questions, guiding and instructing students, and filling in student thinking commonly appeared when participating PSTs engaged in one-on-one interactions with students. Shaughnessy and Boerst [4] particularly noted that these are moves that may require unlearning. Our study shows that PSTs with backgrounds and teacher education coursework similar to those in other studies produced teacher talk with undesirable approaches in their lesson plays, and that the setting (one-on-one interactions with students or, as in this study, hypothetical conversations in the participants' minds) made little difference.
Prior studies also reported that posing additional problems was hard for novice teachers and is a skill that needs to be newly learned [4,29]. Moyer and Milewicz [11] reported on PSTs' tendency towards check-listing—proceeding from one question to the next as planned with little regard for the child's response. Although the PSTs in this study anticipated students' confusion and prepared additional (differentiated) problems to a degree, these were not fully played out in their lesson plays as efforts to gain access to and extend students' mathematical thinking. In many cases, the PSTs wrote a lesson play with a successful student in mind (see the cases of overestimating students' ability and performance in the findings section), who provided correct responses. The PSTs then wrapped up the lesson play without further probing. This tendency is very similar to that of the PSTs in Moyer and Milewicz [11].
We view these findings as twofold. First, writing lesson plays can provide PSTs with an appropriate approximation of practice that offers learning opportunities early in their teacher education coursework, such as reviewing what happened and reflecting on what went well and what could be improved. Second, the similar patterns of PSTs' talk moves in prior studies and this study indicate that making in-the-moment decisions under time pressure is not the only reason for unsuccessful talk moves. It is worth noting that, in Kabar and Taşdan's [31] study, four out of nine groups of PSTs did not change their questioning approaches across three clinical interviews with real students over the course of a semester. Given that PSTs' patterns of eliciting and responding to student thinking can remain unchanged regardless of context (e.g., more authentic vs. less authentic) or duration (e.g., one-time vs. multiple practices), more research is needed on improving PSTs' skills of eliciting and responding to students' thinking and on using lesson plays as a pedagogical approach in teacher education to discern PSTs' pedagogical content knowledge.
Before closing, some limitations of the study need to be noted. First, this study asked the PSTs to develop their own tasks for the lesson plays. The quality of the tasks varied, but most had low complexity and cognitive demand. It is also unclear how the quality of a task relates to the quality of the teacher talk about that task. Although this was out of the scope of this study, we suspect that task quality might matter in unveiling PSTs' skills of eliciting and responding to student thinking and propose it as an important research question for future studies. Relatedly, noticing skills [1,2,12] were not clearly addressed in our analysis; future research might carry out a full analysis of lesson plays explicitly in terms of a noticing framework [42,43]. Second, this study did not have information on the PSTs' field experiences. Each PST might have had different experiences in facilitating teacher talk, depending on their field placements. Notably, some mentor teachers actively use various talk moves, while others may keep to teaching methods that involve few student–teacher interactions.
To conclude, we view it as important to teach PSTs to develop the skills to gain access to student thinking and use it to facilitate teaching, and mathematics teacher educators, in turn, need to continue eliciting and responding to PSTs' own perceptions, experiences, knowledge, and needs. This study documents one such effort in teacher education embarking upon this particular mandate.

Author Contributions

Conceptualization, J.-E.L.; Formal analysis, J.-E.L. and W.L.; Funding acquisition, W.L.; Investigation, J.-E.L.; Methodology, J.-E.L. and W.L.; Writing—original draft, J.-E.L. and W.L.; Writing—review and editing, J.-E.L. and W.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Yonsei University Research Grant of 2021.

Institutional Review Board Statement

The study was approved by the Institutional Review Board of Oakland University (Project # 526015-1 approved on 4 November 2013).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. An Example of a PST’s Work.
Core Question
What fraction of the rectangle is shaded?
Mathematics 09 02842 i001
Anticipated Confusion
The shaded squares are not all together so it may be different from how the student has seen the area model being used; students might not have a clear concept of numerators, denominators, and fractions and it may confuse them.
Differentiated Tasks
Less challenging question: What fraction of the rectangle is shaded?
Mathematics 09 02842 i002
More challenging question: What fraction of the rectangle is shaded?
Mathematics 09 02842 i003
Follow-up Prompts
  • Can you explain how you got your answer?
  • (Pointing to numerator/denominator) What is this number? What does it represent?
  • (If the student simplified) Why did you choose to do that? How?
  • (If the student finished very quickly) Is there another way to solve it?
  • (If the student were confused about numerator and denominator) What would happen if 1/2 the rectangle was shaded? What would that fraction look like?
Anticipated Student Explanation
  • Core: 4/6 (or 2/3) because 4 squares out of the 6 equally partitioned squares are shaded
  • Less challenging: 2/4 because 2 out of the 4 equally partitioned rectangles is shaded
  • Harder: 4/9 because if I add one more partition in the end to make 9 equal-sized pieces, I can see that 4 out of 9 parts are shaded.
Lesson Play Script
  • Teacher (T): Hello! We’re going to be working with fractions. I prepared a little worksheet for us to work on together. You can ask me any questions along the way too. How do you feel about fractions?
  • Student (S): I kind of like fractions, but I think they’re hard sometimes.
  • T: I remember feeling like that too. But practice makes perfect! Ready to get going?
  • S: Sure!
  • T: Alright. The first question here is asking us what fraction of the rectangle is shaded. First, though, can you tell me what the top and bottom numbers of a fraction are called?
  • S: The top number is a numerator and the bottom number is the denominator.
  • T: Wow that’s so great! So how would we know what the denominator of this fraction is?
  • S: There are six total boxes in this rectangle, so I’m going to say that it is six.
  • T: Right. There are six total boxes of our whole, so six is the denominator. Go ahead and write that down.
  • S: (writes “six” on the paper with a line above it.)
  • T: Great. The top number we said was the numerator, right?
  • S: Yes, I think that would be four.
  • T: Wow you’re fast! Why do you say the numerator is four?
  • S: Because there are four blue boxes in the rectangle.
  • T: Okay great. So, what is our final answer?
  • S: Four-sixths (writes the four above the six).
  • T: I see something interesting about our fraction here. Is there a way we could simplify this fraction? Do you know what it means to simplify a fraction?
  • S: It means to make it smaller. But that’s kind of hard for me.
  • T: Well let’s take a look here. I brought some manipulatives for us to use. I have four blue squares for the numerator and six white squares for the denominator. What could we do to these squares?
  • S: We could group them together into twos?
  • T: Right, so on the top we take the four and turn them into two. What do we do on the bottom?
  • S: We turn the six squares into two!
  • T: What does that leave us with?
  • S: Two-thirds!
  • T: Is 2/3 equal to 4/6?
  • S: It is!
  • T: Well, write that down then!
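(An editorial illustration, not part of the PST’s submitted work: the equivalence the script closes with can be checked directly as
\[
\frac{4}{6}=\frac{2\times 2}{2\times 3}=\frac{2}{3},
\]
i.e., grouping the four shaded squares and the six total squares into pairs leaves two shaded pairs out of three pairs in all.)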

Appendix B

Table A2. Types of Teacher Talk Moves in Active Elicitation. Each row lists the inductive category and description, example(s) of teacher talk, and counts in the order: Frequency of Occurrence (n = 748) | Low Group (n = 113) | Middle Group (n = 234) | High Group (n = 401).
Eliciting methods or reasoning for actions: 224 (30%) | 47 (42%) | 76 (32%) | 101 (25%)
  • Asking to describe the methods used (how) (e.g., “Can you tell me what you used and what you did?” “How did you get that answer?”): 123 | 28 | 44 | 58
  • Asking to describe the reason (why) (e.g., “Why do you think that?” “Could you explain why you put the 1/2 and 4/6 where you did?”): 63 | 7 | 12 | 18
  • Asking without specification (open) (e.g., “Can you explain your answer?”): 58 | 12 | 20 | 25
Follow-up probing: 198 (26%) | 17 (15%) | 49 (21%) | 132 (33%)
  • Gets the student to further explain their thinking (e.g., “Okay, so you counted five squares. How did you know it was 5/8 though?”): 160 | 14 | 38 | 108
  • Asks the student to show/explain the process step-by-step (e.g., “Okay, what is your next step after that?”): 20 | 0 | 6 | 14
  • Asks the student to explain/clarify underlying meaning (e.g., “What do you mean by ‘half way’?” “So, you got six. So, what does six mean?”): 18 | 3 | 5 | 10
Teacher-led process: 102 (14%) | 9 (8%) | 30 (13%) | 63 (16%)
  • Leads the student through a specific method or towards an answer; explicitly explains or models the concept, procedure, or strategy; or gives an alternative strategy for the students to use after they give their answer (e.g., “How about we try to find the common denominator? We want to try to make it the same to see which one is bigger. Do you know how to find the common denominator?” “First, you need to find a common denominator by multiplying denominators.” “Another skill you could also try is drawing dotted lines and breaking up B into as many pieces of A as possible.”): 80 | 8 | 23 | 39
  • Draws the student’s attention to certain details or differences (e.g., “Are all the pieces evenly divided?”): 16 | 1 | 6 | 9
  • Shows the student how to record the answer, such as writing down or drawing the answer for the student before the student answers (e.g., “Yes, then you can put ¼ under here.”): 6 | 0 | 1 | 5
Making connections: 100 (13%) | 19 (17%) | 37 (16%) | 44 (11%)
  • Asks the student to use different representations (e.g., “Can you draw a picture to help me understand your thinking?” “Could you show what that would look like if you use fraction circles?”): 52 | 9 | 23 | 20
  • Asks if the situation can be extended to other situations (e.g., “So now how would we do this with non-unit fractions?”): 48 | 10 | 14 | 24
Modifying questions: 57 (8%) | 10 (9%) | 23 (10%) | 24 (6%)
  • Gives a slightly harder or an easier prepared question (e.g., “Show 1/2 in a pie circle drawing instead of 4/6.” “Can you tell me one fraction that is larger than 6/10 and one fraction that is smaller than 6/10?”): 26 | 3 | 12 | 11
  • Breaks down the question into parts (e.g., “Let’s look at the square for the corn only. How much of the square does the corn take up?”): 18 | 3 | 7 | 8
  • Rephrases or represents the question in a different form for the student due to confusion (e.g., “Let me rephrase.” “Let me write it down.”): 13 | 4 | 4 | 5
Revoicing: 41 (5%) | 5 (4%) | 13 (6%) | 23 (6%)
  • Restates the student’s response to confirm (e.g., “Because you had a whole circle and you cut it in half. Is it what you are saying?”): 41 | 5 | 13 | 23
Other miscellaneous: 26 (3%) | 6 (5%) | 6 (3%) | 14 (3%)
  • Offers the student time to think or work (e.g., when the student seems unsure, the teacher does not speak and uses wait time; “I’ll give you a minute.”): 10 | 4 | 2 | 4
  • Checks for the student’s confusion (e.g., “So, did the parts of the shaded circle confuse you?”): 7 | 1 | 2 | 4
  • Asks the student to recall a prior problem or statement to help solve the current problem (e.g., “Do you remember what we talked about a fraction being in the first question?”): 6 | 0 | 1 | 5
  • Reminds the student of what to do (e.g., “Remember to utilize that number line.” “So, remember there are two parts to this question.”): 3 | 1 | 1 | 1

References

  1. Schack, E.O.; Fisher, M.H.; Wilhelm, J. (Eds.) Teacher Noticing—Bridging and Broadening Perspectives, Contexts, and Frameworks; Springer: New York, NY, USA, 2017.
  2. Sherin, M.G.; Jacobs, V.R.; Philipp, R.A. (Eds.) Mathematics Teacher Noticing: Seeing through Teachers’ Eyes; Routledge: London, UK, 2011.
  3. Lampert, M.; Beasley, H.; Ghousseini, H.; Kazemi, E.; Franke, M.L. Using designed instructional activities to enable novices to manage ambitious mathematics teaching. In Instructional Explanations in the Disciplines; Stein, M.K., Kucan, L., Eds.; Springer: New York, NY, USA, 2010; pp. 129–141.
  4. Shaughnessy, M.; Boerst, T.A. Uncovering the Skills That Preservice Teachers Bring to Teacher Education: The Practice of Eliciting a Student’s Thinking. J. Teach. Educ. 2017, 69, 40–55.
  5. Shaughnessy, M.; Boerst, T.A.; Farmer, S.O. Complementary assessments of prospective teachers’ skill with eliciting student thinking. J. Math. Teach. Educ. 2019, 22, 607–638.
  6. Sztajn, P.; Confrey, J.; Wilson, P.; Edgington, C. Learning trajectory based instruction: Toward a theory of teaching. Educ. Res. 2012, 41, 147–156.
  7. Ball, D.L.; Forzani, F.M. The Work of Teaching and the Challenge for Teacher Education. J. Teach. Educ. 2009, 60, 497–511.
  8. Grossman, P.; Compton, C.; Igra, D.; Ronfeldt, M.; Shahan, E.; Williamson, P. Teaching practice: A cross-professional perspective. Teach. Coll. Rec. 2009, 111, 2055–2100.
  9. Zazkis, R.; Liljedahl, P.; Sinclair, N. Lesson plays: Planning teaching versus teaching planning. Learn. Math. 2009, 29, 40–47.
  10. Zazkis, R.; Sinclair, N.; Liljedahl, P. Lesson Play in Mathematics Education: A Tool for Research and Professional Development; Springer: New York, NY, USA, 2013.
  11. Moyer, P.S.; Milewicz, E. Learning to question: Categories of questioning used by preservice teachers during diagnostic mathematics interviews. J. Math. Teach. Educ. 2002, 5, 293–315.
  12. Jacobs, V.R.; Lamb, L.L.C.; Philipp, R.A. Professional Noticing of Children’s Mathematical Thinking. J. Res. Math. Educ. 2010, 41, 169–202.
  13. Grossman, P.; McDonald, M. Back to the Future: Directions for Research in Teaching and Teacher Education. Am. Educ. Res. J. 2008, 45, 184–205.
  14. TeachingWorks. High-Leverage Practices. Available online: http://www.teachingworks.org/work-of-teaching/high-leverage-practices (accessed on 5 December 2019).
  15. Levin, T.; Long, R. Effective Instruction; Association for Supervision and Curriculum Development: Alexandria, VA, USA, 1981.
  16. Stevens, R. The Question as a Means of Efficiency in Instruction: A Critical Study of Classroom Practice; Teachers College, Columbia University: New York, NY, USA, 1912.
  17. Boaler, J.; Brodie, K. The importance, nature and impact of teacher questions. In Proceedings of the Twenty-Sixth Annual Meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education, Toronto, ON, Canada, 21–24 October 2004; McDougall, D.E., Ross, J.A., Eds.; University of Toronto: Toronto, ON, Canada, 2004; Volume 2, pp. 774–782.
  18. Brualdi, A.C. Classroom questions. Pract. Assess. Res. Eval. 1998, 6, 6.
  19. Wood, D. Teaching talk. In Thinking Voices: The Work of the National Oracy Project; Norman, K., Ed.; Hodder & Stoughton: London, UK, 1992; pp. 203–214.
  20. Mercer, N.; Dawes, L. The study of talk between teachers and students, from the 1970s until the 2010s. Oxf. Rev. Educ. 2014, 40, 430–445.
  21. Wells, G. Dialogic Inquiry: Towards a Sociocultural Practice and Theory of Education; Cambridge University Press: Cambridge, UK, 1999.
  22. Chapin, S.H.; O’Connor, C.; Anderson, N.C. Classroom Discussions in Math: A Teacher’s Guide for Using Talk Moves to Support the Common Core and More; Math Solutions: Sausalito, CA, USA, 2013.
  23. Lim, W.; Lee, J.; Tyson, K.; Kim, H.; Kim, J. An integral part of facilitating mathematical discussions: Follow-up questioning. Int. J. Sci. Math. Educ. 2020, 18, 377–398.
  24. Lampert, M. When the problem is not the question and the solution is not the answer: Mathematical knowing and teaching. Am. Educ. Res. J. 1990, 27, 29–63.
  25. Borko, H. Professional Development and Teacher Learning: Mapping the Terrain. Educ. Res. 2004, 33, 3–15.
  26. Grossman, P. Teaching Core Practices in Teacher Education; Harvard Education Press: Cambridge, MA, USA, 2018.
  27. Zeichner, K.M. The Turn Once Again Toward Practice-Based Teacher Education. J. Teach. Educ. 2012, 63, 376–382.
  28. Groth, R.E.; Bergner, J.A.; Burgess, C.R. An exploration of prospective teachers’ learning of clinical interview techniques. Math. Teach. Educ. Dev. 2016, 18, 48–71.
  29. Weiland, I.; Hudson, R.; Amador, J. Preservice formative assessment interviews: The development of competent questioning. Int. J. Sci. Math. Educ. 2014, 12, 329–352.
  30. Herbst, P.; Chieu, V.; Rougee, A. Approximating the practice of mathematics teaching: What learning can web-based, multimedia storyboarding software enable? Contemp. Issues Technol. Teach. Educ. 2014, 14, 356–383.
  31. Kabar, M.G.D.; Taşdan, B.T. Examining the change of pre-service middle school mathematics teachers’ questioning approaches through clinical interviews. Math. Teach. Educ. Dev. 2020, 22, 115–138.
  32. Zazkis, R.; Herbst, P. (Eds.) Scripting Approaches in Mathematics Education; Springer: Cham, Switzerland, 2017.
  33. National Governors Association Center for Best Practices & Council of Chief State School Officers. Common Core State Standards for Mathematics; NGA & CCSSO: Washington, DC, USA, 2010.
  34. National Council of Teachers of Mathematics. Principles to Actions: Ensuring Mathematical Success for All; NCTM: Reston, VA, USA, 2014.
  35. Creswell, J.W.; Creswell, J.D. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 5th ed.; Sage: Thousand Oaks, CA, USA, 2017.
  36. DeCuir-Gunby, J.T.; Marshall, P.L.; McCulloch, A.W. Developing and Using a Codebook for the Analysis of Interview Data: An Example from a Professional Development Research Project. Field Methods 2010, 23, 136–155.
  37. Grbich, C. Qualitative Data Analysis: An Introduction, 2nd ed.; Sage: Thousand Oaks, CA, USA, 2013.
  38. Riffe, D.; Lacy, S.; Fico, F. Analyzing Media Messages: Using Quantitative Content Analysis in Research; Erlbaum: Mahwah, NJ, USA, 1998.
  39. Rourke, L.; Anderson, T.; Garrison, D.R.; Archer, W. Assessing social presence in asynchronous text-based computer conferencing. J. Distance Educ. 2001, 14, 50–71.
  40. Lamon, S. The development of unitizing: Its role in children’s partitioning strategies. J. Res. Math. Educ. 1996, 27, 170–193.
  41. Yoshida, H.; Sawano, K. Overcoming cognitive obstacles in learning fractions: Equal-partitioning and equal-whole. Jpn. Psychol. Res. 2002, 44, 183–195.
  42. Banner, D. Frameworks for noticing in mathematics education research. In Proceedings of the 42nd Meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education, Mazatlán, Mexico, 14–18 October 2020; pp. 1572–1576.
  43. Mason, J. Learning about noticing, by, and through, noticing. ZDM 2021, 53, 231–243.
Figure 1. Representing 1/3.
Figure 2. (a) Fractions on a number line. (b) Using models to represent fractions.
Figure 3. (a) Example 1. (b) Example 2.
Figure 4. Locating fractions on a number line.
Table 1. Studies on PST performance in eliciting and responding to student thinking. For each study, the participants and context are listed first, followed by the categories of PST interactions identified and examined.
Moyer and Milewicz [11]
48 elementary PSTs (the senior year before final internship placement for teacher certification)
One-on-one diagnostic mathematics interviews with children ranging in age from 5 to 12
Making a checklist (the PST proceeds from one question to the next with little regard for the child’s response):
  • Posing few follow-up questions
  • Indicating verbal checkmarks only
Instructing rather than assessing:
  • Leading questions that direct the child’s response
  • Abandoning questioning and teaching the concept directly
Probing and follow-up (invite or further investigate the child’s answer):
  • Questioning only incorrect responses
  • Non-specific questioning
  • Competent questioning
Groth, Bergner, and Burgess [28]
Four PSTs (two elementary and two secondary)
One-on-one pre- and post-assessment interviews with elementary students (Grades 3 and 5)
  • Cognitive and affective preparation to establish a relationship with a student in the context of a clinical interview: Somewhat neglected by the PSTs.
  • Teacher vs. interviewer stance: Difficult for PSTs to avoid guiding children rather than assessing their thinking.
  • Objective researcher vs. active participant: Need to learn how to deviate from the prepared script.
  • Quantity and quality of probes: Spontaneously formulating probes was particularly difficult.
Weiland, Hudson, and Amador [29]
One pair of elementary PSTs
Formative assessment interview over ten weeks with two elementary students to examine changes over time
Problem-Posing Questions
  • Protocol: PSTs directly ask questions taken from the protocol or slightly modified from the protocol.
  • Framing: PSTs use tasks/questions that are not on the protocol to introduce/frame the prepared tasks/questions on the protocol.
  • New: PSTs ask tasks/questions that are not on the protocol.
  • Repeat: PSTs repeat questions to refocus the students or react to students’ questions.
Instructing-Rather-Than-Assessing Questions
  • Teaching and telling: Instruct students
  • Leading questions: Talks that offer hints or cues
Follow-Up Questions
  • Competent: Questions posed in response to a student’s answer to build on, justify, and explore the student’s reasoning.
  • Incorrect: Questions posed in response to a student’s answer to point out that the answer was incorrect.
Areas for Further Development
  • Asking leading questions
  • Creating opportunities to probe student thinking
Shaughnessy and Boerst [4]
47 elementary PSTs in the initial stage of teacher education
One-on-one interview with a hypothetical student
Moves that Require New Learning
  • Eliciting/probing students’ mathematical thinking (PSTs were not effective in asking the students to document their process to solve the problem, posing an additional problem to learn more about the students’ thinking, and probing their understanding of the process and key mathematical ideas.)
Moves That Can Be Built Upon
  • Facing the student when asking questions and positioning the student’s work so that the student could see their work and participate in the interaction.
  • Asking questions that elicited the student’s process for solving the problem.
  • Attending to the student’s ideas when they posed questions.
Moves that May Require Unlearning
  • Filling in the student’s process or understanding during the interaction.
  • Influencing the student to solve the problem using a particular process.
Kabar and Taşdan [31]
22 PSTs in nine working groups
Three clinical interviews of middle-school students over a semester
Question Types Identified:
  • Factual Questions
  • Procedural-Next Step Questions
  • Leading Questions
  • Yes-No Questions
  • Probing Questions
  • General Questions
Four groups of PSTs did not change their questioning approaches from the first interview to the third interview.
Table 2. Requirements for each set of tasks.
Problem Set 1. Focus of the question and required representations: Basic concepts of fractions using the area model. Relevant standards: Understand a fraction 1/b as the quantity formed by one part when a whole is partitioned into b equal parts; understand a fraction a/b as the quantity formed by a parts of size 1/b.
Problem Set 2. Focus of the question and required representations: Basic concepts of fractions using the number line representations. Relevant standards: Understand a fraction as a number on the number line; represent fractions on a number line diagram.
Problem Set 3. Focus of the question and required representations: Comparing two fractions (no required representation). Relevant standards: Explain equivalence of fractions in special cases, and compare fractions by reasoning about their size.
Table 3. Leveled frequency of teacher talk. Each level is based on the number of turns of teacher talk, with the corresponding number of lesson play scripts (n = 95 scripts).
Low (1–5 turns): 30 scripts (Set 1: 13 scripts, Set 2: 10 scripts, Set 3: 7 scripts)
Middle (6–9 turns): 33 scripts (Set 1: 11 scripts, Set 2: 11 scripts, Set 3: 11 scripts)
High (10+ turns): 32 scripts (Set 1: 8 scripts, Set 2: 11 scripts, Set 3: 13 scripts)
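To make the leveling in Table 3 concrete, the following is a minimal sketch (not the authors’ analysis code) of how teacher turns in a plain-text lesson play script could be counted and binned into the three frequency levels. It assumes each turn sits on its own line and that teacher turns begin with “T:” or “Teacher (T):”; this formatting is an assumption made for illustration, not a description of the study’s actual data files.

# Minimal sketch (hypothetical): count teacher turns in a lesson play script
# and assign the frequency level defined in Table 3.

def count_teacher_turns(script: str) -> int:
    """Count lines that represent teacher talk turns."""
    return sum(
        1
        for line in script.splitlines()
        if line.strip().lstrip("• ").startswith(("T:", "Teacher (T):"))
    )

def frequency_level(turns: int) -> str:
    """Map a teacher-turn count to the levels used in Table 3."""
    if turns <= 5:
        return "Low (1-5 turns)"
    if turns <= 9:
        return "Middle (6-9 turns)"
    return "High (10+ turns)"

if __name__ == "__main__":
    example = (
        "Teacher (T): What fraction of the rectangle is shaded?\n"
        "Student (S): Four-sixths.\n"
        "T: How did you get that answer?\n"
        "S: Four of the six boxes are blue."
    )
    n = count_teacher_turns(example)
    print(n, frequency_level(n))  # prints: 2 Low (1-5 turns)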
Table 4. Inductive categories and descriptions of teacher talk during launch. Each row lists the inductive category and description, example(s), and counts in the order: Total (n = 95 scripts) | Low (n = 30 scripts) | Middle (n = 33 scripts) | High (n = 32 scripts).
Presenting prepared written problems only: 66 (69%) | 22 (73%) | 23 (70%) | 21 (66%)
  • Reads or asks to read prepared problems as written (e.g., Teacher read: “What fraction of the square is shaded?” “Could you read the question?”): 45 (47%) | 16 (53%) | 14 (43%) | 15 (47%)
  • Slightly elaborates/rephrases problems (e.g., the problem: “What fraction is on the number line?” Teacher said: “We are going to find the number that is shown on the number line. We have four choices here. Can you find the number that’s marked here?”): 21 (22%) | 6 (20%) | 9 (27%) | 6 (19%)
Additional talks in addition to presenting written problems: 29 (31%) | 8 (27%) | 10 (30%) | 11 (34%)
  • Checks on background knowledge/experience (e.g., “Have you ever used these [fraction circles] before?” “Do you ever see fractions outside of school?”): 13 (14%) | 6 (20%) | 5 (15%) | 5 (16%)
  • Describes protocol for interview (e.g., “Here are some markers and papers. Feel free to use them whenever you need.”): 9 (9%) | 2 (7%) | 4 (12%) | 4 (13%)
  • Informs math topics to be asked (e.g., “We’ll be comparing, testing for equivalency, and decomposing fractions.”): 7 (7%) | 3 (10%) | 2 (6%) | 2 (6%)
  • Checks on key terms (e.g., “Do you know what unit fractions mean?”): 7 (7%) | 2 (7%) | 2 (6%) | 5 (16%)
Note. Some PSTs’ first teacher talks consisted of multiple statements with different purposes and forms, and those were included in multiple categories. As a result, the sum of frequencies of subcategories in each column is not equal to the total frequency.
Table 5. Analysis of teacher talk during active elicitation. Each row lists the inductive category and counts in the order: Frequency of Occurrence (n = 748) | Low Group (n = 113) | Middle Group (n = 234) | High Group (n = 401).
Elicitation of methods or reasoning for actions *: 224 (30%) | 47 (42%) | 76 (32%) | 101 (25%)
Follow-up probing: 198 (26%) | 17 (15%) | 49 (21%) | 132 (33%)
Teacher-led process: 102 (14%) | 9 (8%) | 30 (13%) | 63 (16%)
Making connections: 100 (13%) | 19 (17%) | 37 (16%) | 44 (11%)
Modifying questions: 57 (8%) | 10 (9%) | 23 (10%) | 24 (6%)
Revoicing: 41 (5%) | 5 (4%) | 13 (6%) | 23 (6%)
Other miscellaneous: 26 (3%) | 6 (5%) | 6 (3%) | 14 (3%)
* Note: In cases where the PSTs utilized additional (differentiated) problems, more talks were coded in this category.
Table 6. Analysis of teacher talk during closure. Each row lists the inductive category and description, example(s), and counts in the order: Total (n = 95 scripts) | Low (n = 30 scripts) | Middle (n = 33 scripts) | High (n = 32 scripts).
Ends with student answer without teacher’s closure (e.g., Student: “I don’t know (unable to tell me).”): 51 (54%) | 25 (66%) | 16 (53%) | 10 (37%)
Wraps up with a neutral comment (e.g., “I see.” “Okay.” “I think I understand what you mean.” “Thank you.”): 19 (20%) | 9 (24%) | 7 (23%) | 3 (11%)
Praises the student (e.g., “That is well done. I like how you drew arrows to show the simplified area model.” “Wow, that’s fast!” “You are very good with mental math.”): 19 (20%) | 3 (8%) | 7 (23%) | 9 (33%)
Wraps up with an evaluative/confirmative comment on the final answer or strategies used (e.g., “So, when we’re comparing fractions it helps to look at the denominators.” “That’s correct. 2/4 or 1/2 because those are equivalent.”): 14 (14%) | 4 (11%) | 6 (15%) | 4 (14%)
Empathizes that the problem was difficult (e.g., “This problem was difficult because the model isn’t something we normally see.”): 2 (2%) | 0 (0%) | 0 (0%) | 2 (7%)
Asks if the offered tools were helpful (e.g., “Did it help using these fraction circles?”): 2 (2%) | 0 (0%) | 0 (0%) | 2 (7%)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
