Article

How Reducing Discomfort Impacts Peer Assessments of Preservice Teachers

1 Graduate School of Education, Yonsei University, Seoul 03722, Korea
2 Graduate School of Education, University at Buffalo, Buffalo, NY 14260, USA
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(11), 6435; https://doi.org/10.3390/su13116435
Submission received: 25 March 2021 / Revised: 2 June 2021 / Accepted: 3 June 2021 / Published: 5 June 2021

Abstract: This study examined the effects of a feedback model called Peer Review of Teaching (PRT) on preservice teachers’ learning. In this model, preservice teachers (n = 81) offered critical feedback on teaching demonstrations in the absence of the presenters. We present four themes from the experience of teaching and sharing feedback, including how the absence of the peer presenter affected the feedback process. Our findings suggest that teacher educators should create intellectually safe and sensitive learning opportunities for critical feedback so that preservice teachers can engage in the professional practice of peer assessment.

1. Introduction

Notions related to teachers as reflective practitioners have received considerable attention from researchers and teacher educators [1,2,3,4]. A common view of the reflective practitioner is that teachers share effective practice with peers and reflect on the appropriate use of research and feedback [5]. As reflection is considered key to becoming an educator, teacher educators ensure that preservice teachers (PSTs) have authentic opportunities to engage in reflection and to experience teaching as a scholarly endeavor involving a cohesive body of theories, research, critical thinking, and evaluation [6,7,8].
Reflection at the level of preservice teachers is an individual and social experience with the potential to result in professional growth. Such opportunities include sharing professional views on teaching and learning, examining teaching practice, and devising action plans for improvement. In particular, constructing lesson plans and enacting curricula are important aspects of teacher preparation programs. Peer feedback on teaching tasks often plays a significant role, as it enables PSTs to engage in evaluating peers’ performances and to reflect on their own teaching [9,10,11].
The potential benefits of peer feedback on PSTs’ teaching are clear: it helps PSTs identify their strengths and areas of growth [12,13], it increases collegiality and helps to shape professional practice [13,14], and, more importantly, it “helps to clarify some of the ways in which teacher education can model, and influence, the nature of schools as learning communities” [15]. However, more research is necessary to confirm the extent and favorable conditions of critical and constructive peer feedback on PSTs’ teaching [16], that is, the way peer review of teaching is influenced by personal reflection and interpersonal factors, especially when PSTs demonstrate low participation in peer feedback [17,18,19].
The purpose of this study is to examine the impact of a feedback model in which teacher candidates have opportunities to offer critical feedback to their peers and to reflect on the whole experience of teaching and analyzing its delivery. The research questions that guided the study are:
  • How do PSTs perceive peer feedback?
  • How does their peer feedback change with their experience of a new feedback model?
  • How does participation in the feedback model affect PSTs’ learning to teach?
This paper contributes to the fields of feedback and teacher education by investigating the valued but often poorly implemented pedagogical practice of peer review. The study illustrates the specific procedure of a feedback model that can be replicated and discusses the ways carefully and purposefully designed peer feedback in teacher education can develop professional reflective practice for PSTs.

2. Literature Review

2.1. Studies on Peer Feedback

According to Topping [10], peer feedback is an “arrangement in which individuals consider the amount, level, value, worth, quality, or success of the products or outcomes of learning of peers of similar status” (p. 250). A large body of literature on peer assessment highlights the benefits of peer feedback, reporting that peer feedback is not only a tool to evaluate students but also a teaching tool that engages students [9,11].
However, despite the potential benefits of peer feedback on students’ learning, some concerns remain regarding the practice of peer feedback. Literature suggests that students are easily biased or not honest in giving feedback due to gender, race, interpersonal relationships, or personal preferences [20,21,22,23]. Topping [10] documented students’ difficulty in offering critical feedback to classmates because it could damage relationships. This study defines critical feedback as verbal interactions “aimed at redirecting and improving” the receiver’s teaching [24]. Another area of concern with peer feedback is the sense of embarrassment felt by those receiving feedback when weaknesses are identified [25,26]. Research [11,12] indicates that students conducting face-to-face feedback frequently expressed anxiety in sharing their comments for fear of being rejected, refuted, or regarded as having low expertise by their peers. Machin and Jeffries [27] added that peer feedback could impact students’ self-esteem and belonging.
To alleviate these concerns, several researchers have suggested the features of good feedback (e.g., [28]) and instructional approaches to increase the quality of peer feedback content (e.g., [12,27]). In particular, the following instructional approaches have been suggested to improve peer feedback: (1) using computer-mediated communication to avoid the possible embarrassment or discomfort faced by students in face-to-face interactions [27], (2) using multiple evaluators to balance the uneven quality of peer feedback [22,29,30], and (3) using anonymous peer feedback to minimize opportunities for students to reward friends or cheat during the peer feedback process [12,27].

2.2. Anonymous Feedback

Peer feedback can only be as effective as the authenticity of the process. Anonymous peer feedback refers to a peer review condition in which the reviewers and reviewees are kept unknown to one another. This condition can contribute to an environment where reviewers and reviewees may avoid face-to-face discomfort while sharing detailed discussions of performance and candid evaluations. Research has documented the major advantages of anonymous peer feedback [12,21,23,27,30,31,32]. Studies [19,27] reported that anonymity could increase critical feedback because, in anonymous situations, students are relieved of social pressure and can be more honest and less anxious about expressing their opinions regardless of interpersonal factors. In particular, Connolly, Jessup, and Valacich [31] compared the characteristics of feedback between anonymous and non-anonymous groups, and showed that anonymous groups demonstrate more critical thinking. Despite the many benefits of anonymity discussed in the literature, however, research on the impact of anonymous peer feedback on preservice teachers’ learning is sparse. Most previous studies focused on the effect of anonymous peer feedback on student learning. As Scheeler, Ruhl, and McAfee [33] pointed out, few studies have focused on effective feedback for teachers. This study intended to address this gap by developing a feedback model emphasizing anonymity and implementing it with PSTs.

2.3. What Effective Feedback Entails

Drawing on sociocultural theory (which highlights human intentions and possibilities and how they can be developed) and social constructivism (which focuses on how learners are actively engaged in constructing their knowledge), we can assume that teachers guide the learning process and peers are involved through collaboration. When multiple peers and teachers provide feedback, it allows those who provide or receive feedback to have the opportunity to reflect and refine their learning. With respect to specific ways for a system of peer feedback to engage learners in the learning process, the characteristics of effective feedback [28] can be categorized into five clusters (Table 1): (1) overview of feedback characteristics, (2) task-related characteristics, (3) timing, (4) affective and emotional characteristics, and (5) effects on learners.
Given that the task-related cluster of elements in Table 1 mirrors the authenticity of feedback, in exploring the efficacy of a feedback model on PSTs’ learning we examined the levels of PSTs’ engagement in participation.

2.4. Formative Feedback

A key aspect of formative feedback is that the learner receives information to improve learning [34,42]. Black and Wiliam [43] articulated the essential elements of formative feedback: clarifying learning goals and criteria for evaluation; implementing learning tasks to produce evidence of student understanding; providing feedback that enables learners to make changes; using peers as instructional resources; and fostering a sense of ownership in learning for students. Further, Hattie and Timperley [44] conceptualized feedback as information on performance or understanding, involving possible agents such as teacher, peer, self, parent, book, and experience. Over time, the field has begun to examine the relationships between effective feedback and outcomes, looking at how characteristics of agents, different types of knowledge, and feedback types can together create different interactions and influence feedback models [44,45].
Peer feedback models in teacher education settings are connected to various feedback models (e.g., [44]) proposed in the field of formative feedback. In particular, peer review of teaching (PRT) involves agents such as peers with the common professional aspiration, teacher knowledge through performance, and a critical discourse through which peers share constructive feedback on teaching performance [46,47]. The present study implements a PRT practice in a university-based teacher education program and examined the relationship between affective components of feedback and level of engagement in the collaborative model of PRT [12,13]. We aimed to illustrate how preservice teachers process information about content, understand the performance of teaching tasks, and foster a sense of trust through participating in the professional practice of feedback sharing.

3. Materials and Methods

3.1. Research Design

The study used both qualitative data (e.g., survey comments and interviews) and quantitative data (e.g., Likert scale items and engagement counts). Multiple data (i.e., surveys, follow-up interviews, observation notes, and reflection assignments) were used to triangulate the interplay of PSTs’ participation in feedback sessions in relation to the PSTs’ behaviors and attitudes demonstrated in mathematics methods and during student teaching.
In order to answer the first research question concerning participants’ prior experiences and perceptions of peer feedback, we conducted a survey. To answer the second question of how the participants’ feedback content changed as they experienced the PRT model, we conducted a survey, observed feedback sessions, and reviewed PSTs’ written reflections. To answer the third question of how participation in the feedback model affected PSTs’ learning, we focused on the number of students who demonstrated active participation, representing the level of engagement over time. Lastly, we conducted follow-up interviews with students, instructors, and field supervisors in order to complement the written and observed data.

3.2. The Feedback Model and the Procedure

The PRT model was developed by adapting the Small Group Instructional Diagnosis (SGID, [48,49]), a method of feedback on faculty instruction developed at the University of Washington. The SGID feedback system uses facilitators to conduct group interviews that gather student feedback for instructors on a course’s strengths, areas for improvement, and suggestions for change. In principle, SGID provides a set of procedures designed to review teaching as formative evaluation and to build student consensus via classroom interviews about an individual’s effectiveness in teaching, in a guilt-free environment where the instructor is not present. However, the model is not necessarily meant to increase students’ learning of teaching; thus, students evaluate an instructor’s teaching not to pass judgment “but rather to diagnose problems, to enrich the teaching and learning environment, and to promote collegiality” [49]. In order to implement a PRT model for preservice teachers, this study revised the SGID method such that our model provides meaningful opportunities for PSTs (not as students but as peers) to inform and improve their own teaching by drawing on the dialogs about teaching demonstrated by their peers. In particular, our PRT model uses multiple evaluators to build consensus and relies on anonymous peer feedback to increase critical feedback. The detailed procedures of the PRT model in the methods course are as follows:
  • Establish a shared understanding about what constitutes good teaching.
  • Use key assignments, such as writing lesson plans and micro teaching, as highly relevant teaching tasks.
  • The PST (the presenter) conducts a mock teaching assignment (i.e., micro teaching) for 20–30 min.
  • Peers play a role as schoolchildren at an appropriate grade level for the presentation.
  • After the teaching task is complete, the presenter leaves the room.
  • The peer group participates in a feedback session facilitated by the instructor (or a volunteer peer), and a volunteer student takes session minutes. A common rubric is used to structure feedback.
    • Feedback focuses on strengths, weaknesses, and suggestions.
    • The participants are encouraged to use course discussions and readings to present and support opinions, reflect on their own teaching practices, and suggest specific strategies to improve weaknesses.
  • The presenter returns and the instructor highlights positives briefly.
  • The instructor schedules a follow-up meeting with the presenter to examine positives, negatives, and recommendations on areas for improvement.
  • The peer group participants, including the presenter, provide written reflections about their learning experiences in the PRT model.
These procedures were refined and finalized during a pilot study over two semesters, during which several variations of the PRT model were tried before the version used in this study was settled. For example, in the original model, instructors asked PSTs to facilitate feedback sessions because facilitating a group discussion could serve as a learning opportunity to develop leadership skills. We thought PSTs might respond differently when their peers facilitated feedback sessions instead of the instructor. However, we found that the level of participation was about the same regardless of the facilitator. Thus, for the current study, participants were encouraged to volunteer as facilitators, and instructors facilitated when there was no volunteer. Table 2 shows how salient features of the PRT model are supported by the research introduced in the literature review section of this article.

3.3. Participants and Procedure

Eighty-one PSTs participated in the current study over five semesters. The investigator of the study received permission from the institutional review board of the university to conduct this research on preservice teachers with their free and informed consent. Participants were enrolled in middle grades mathematics methods courses at a public university in a southeastern state of the United States from fall 2018 through fall 2020. All of the participants had completed the required course equivalent of a pre-calculus course during their freshman or sophomore year and were in their senior year prior to a field experience course requiring a minimum of 135 h of clinical experience in a local public school. Of the 81 participants, 29 were male. There were 14 PSTs participating in fall 2018, 17 in spring 2019, 20 in fall 2019, 19 in spring 2020, and 11 in fall 2020. Of the 81 participants, 25 PSTs attended follow-up interviews: 5 in fall 2018, 5 in spring 2019, 6 in fall 2019, 5 in spring 2020, and 4 in fall 2020. The last two rounds of interviews (spring 2020 and fall 2020) were conducted online via Zoom due to COVID-19.
The middle grades mathematics methods courses were taught by three instructors who used a common syllabus. The course was designed to engage PSTs in the theory and practice of teaching while supporting their understanding of pedagogy and issues relevant to the teaching and learning of school mathematics. Assignments included writing lesson plans, unit plans, reading research articles, presenting mathematics lessons, and weekly assignments (e.g., solving math problems, analyzing videotaped lessons, analyzing students’ work, etc.). At the beginning of each method course, a survey measuring the PSTs’ past experiences and perceptions of peer feedback was conducted. A key assignment of the course was creating a lesson plan for micro teaching. During the micro teaching assignment, PSTs taught a portion of a lesson (20–30 min) to classmates in which they demonstrated classroom teaching skills; after the class observed each peer presentation, they were encouraged to share feedback.
All 81 PSTs participated in the PRT model. Each class meeting had one or two micro teaching events followed by a PRT session. For example, the course in fall 2018 had 17 PRT sessions and 19 micro teaching events. We note that the study was influenced by the COVID-19 pandemic during the spring and fall semesters of 2020. Adaptations to minimize the disruptions included conducting PRT and feedback sessions online via Zoom. Although COVID-19 caused a drop in course enrollment in fall 2020, there were no COVID-19-related withdrawals from the study. Although some students commented on their online learning experiences via Zoom, we did not include those comments in the analysis because the impact of COVID-19 on (online) teacher education falls outside the scope of this study.
In addition to micro teaching and participation in the PRT model, participants were asked to complete reflective writings six times during the course of the semester. In their writings they were asked to analyze and reflect on their teaching performance, perceived strengths, challenges faced, and changes implemented to improve teaching. Furthermore, at the end of the semester participants were asked to complete a survey measuring their overall satisfaction with the PRT feedback experience.
Follow-up meetings were arranged during the period when the PSTs were placed in field experience. In total, follow-up interviews were composed of 25 PSTs (31%), 3 instructors, and 5 field supervisors. The participants who indicated an interest in attending the interview in the permission form were invited, with no incentive provided for participating. The same interview protocol was applied in each semester of the study.

3.4. Data Collection

Data sources in this study included PSTs’ responses to pre/post surveys, observation notes, session minutes, reflection assignments, and follow-up interviews.

3.4.1. Surveys

Two surveys were used (see Appendix A and Appendix B), one at the beginning and another at the end of the semester. At the beginning of the semester, PSTs were asked to report their past experience and perception of peer feedback. Regarding their prior experience, PSTs listed all of the peer feedback methods and formats in which they had participated. They also rated their overall past experience of peer feedback in terms of the meaningfulness and valuableness of the learning opportunity using a Likert scale of 1–5. The participants were also asked to respond in a commentary box regarding the impact of a presenter remaining in the room and of using writing as the primary feedback tool. A second survey was administered at the end of the course to measure overall satisfaction with the PRT feedback experience. Participants were asked to respond in a commentary box regarding the degree to which the absence of the presenter affected their feedback behavior, the factors that allowed them to participate in peer feedback more meaningfully, and how the experience influenced their own learning in the methods course. Both surveys were anonymous. The response rate for the first survey was 78%, with 63 of the 81 distributed surveys completed, and the rate for the second survey was 75%, with 61 surveys completed.

3.4.2. Observation Notes

The notes were taken by three research assistants who had completed at least one graduate-level qualitative research methods course in the doctoral program. The research assistants observed feedback sessions four times at random per course, taking free notes on settings and interactions to capture concrete, detailed, and textured descriptions of how PSTs participated in the PRT feedback model. The observers stayed only during the PRT sessions, and the notes were used to reach agreement on the operational definition of meaningful participation. Both the research assistants and the class were informed about the study and research questions. The study also used session minutes to calculate the level of participation. The minutes indicated the names of those who provided feedback and a brief description of the feedback.

3.4.3. Reflective Writings

The writing assignments of each participant were collected six times during the semester. The first five assignments asked PSTs to analyze and reflect on their teaching performance, including their perceived strengths, challenges faced, and changes implemented to improve teaching. PSTs were also asked to make connections to the research readings provided by the instructor. There was one writing assignment given to PSTs who agreed to participate in the study asking them to evaluate the PRT feedback model and reflect on their experience engaging with it.

3.4.4. Follow-Up Interviews

Follow-up interviews were conducted twice per course after the course was complete and students were in their fieldwork. First, a follow-up interview session was designed for PSTs to respond to the researchers’ clarifying questions arising from analysis of the written survey responses. A second interview session was designed for instructors and field supervisors to provide their comments about the feedback model and its impact on PSTs’ learning. The following are two representative questions from the second interview: (1) “Some participants said, ‘I felt I was treated professionally when everyone’s involved and serious about a colleague’s teaching performance.’ Can you relate to this statement or help us better understand it through your own experience?” (2) “Can you (as a field supervisor) describe how the PRT model impacted your facilitation of the participants’ learning during student teaching?”

3.5. Data Analysis

The first survey responses were analyzed in order to examine the first research question concerning participants’ prior experiences and perceptions of peer feedback. In order to answer the second research question, concerned with how participants’ feedback content changed as they experienced the PRT model, PSTs’ responses to the second survey questions and observation notes, including thick descriptions and written reflections, were analyzed to identify changes and suggest emerging patterns of peer feedback as demonstrated by the participants.
The survey response analyses and written reflections involved four processes: (1) an initial reading of each participant’s response, (2) identifying themes and exploring the subcategories, (3) coding the themes and subcategories, and (4) quantitatively and qualitatively interpreting the data [52]. A theme was accepted only when it appeared in both the written reflections and the survey responses at least 20 times (about 30% of the total number of survey participants), from different participants, and was verified by the description notes.
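As a minimal sketch of this theme-acceptance rule (the study applied it by hand; the data structures here are hypothetical), the criterion could be expressed as:

```python
from collections import defaultdict

MIN_MENTIONS = 20  # about 30% of the 63 survey respondents


def count_participants(codes):
    """codes: iterable of (participant_id, theme) pairs.
    Returns how many DIFFERENT participants mentioned each theme."""
    seen = defaultdict(set)
    for pid, theme in codes:
        seen[theme].add(pid)
    return {theme: len(pids) for theme, pids in seen.items()}


def accepted_themes(reflection_codes, survey_codes):
    """Keep a theme only if at least MIN_MENTIONS different participants
    raised it in BOTH the written reflections and the survey responses."""
    r = count_participants(reflection_codes)
    s = count_participants(survey_codes)
    return {t for t in r if r[t] >= MIN_MENTIONS and s.get(t, 0) >= MIN_MENTIONS}
```

A theme mentioned often in reflections but rarely in surveys (or vice versa) is rejected under this rule, mirroring the dual-source requirement described above.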
To answer how participants’ peer feedback changed as they experienced the PRT model and the impact of peer feedback on PSTs’ learning, we also focused on the number of students who demonstrated authentic participation in order to represent engagement levels over a period of time. The researchers who observed played a role in devising the working definition of authentic participation. We considered feedback authentic when PSTs made comments supported by evidence and analysis. For example, we did not consider feedback authentic if a PST said, “I like it. I think he did a nice job,” primarily because the comment did not show evidence for the judgment. We did count it authentic when a PST said, “His warm-up was really neat because [the class] keeps going back to the warm-up question and learned to apply to the new law of exponents, dealing with when there is power of a power,” because the comment included evidence supporting the evaluative statement. We also considered participation authentic when PSTs provided comments supporting others’ opinions or shared their personal or academic experiences. Active listening was also considered authentic participation when PSTs paid attention to peers’ commentaries and took notes. The session minutes were used in addition to the observation notes.
Another piece to the analysis was the number of participants who provided feedback. Percent scores were calculated by dividing the number of those who provided feedback at least once by the number of those who attended the session. For example, the first feedback session of the course during fall 2018 was recorded as 23% (see Figure 1) because three people provided feedback out of thirteen attendees.
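This percent-score calculation can be sketched as follows (the participant identifiers are hypothetical; the study computed these scores from the session minutes):

```python
def participation_rate(feedback_providers, attendees):
    """Percent of session attendees who provided feedback at least once.
    Duplicate provider entries count once, matching 'at least once'."""
    if not attendees:
        return 0
    return round(100 * len(set(feedback_providers)) / len(attendees))


# The fall 2018 example from the text: 3 providers among 13 attendees.
score = participation_rate(
    ["p01", "p02", "p03"],                # hypothetical provider IDs
    ["p%02d" % i for i in range(1, 14)],  # 13 attendees
)
print(score)  # → 23
```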
The study conducted follow-up interviews with students, instructors, and field supervisors in order to answer the second research question. Interview data were analyzed in a similar way to the written responses: an initial reading of each participant’s response, identifying themes and exploring subcategories, and coding the themes and subcategories. The preliminary findings of this study were then presented to all participants during follow-up interviews, and the participants voted yes or no to indicate their agreement with each statement. We accepted a finding when the following two conditions were met: (1) more than half of the participants of the same group agreed, and (2) at least two groups agreed by a majority of the votes.
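The two acceptance conditions can be sketched as a simple decision rule (the group names and vote format are illustrative; the study tallied the votes by hand):

```python
def finding_accepted(votes_by_group):
    """Accept a preliminary finding when (1) more than half of a group's
    participants voted yes, and (2) at least two groups show such a majority.

    votes_by_group maps a group name (e.g., 'students', 'instructors',
    'field supervisors') to a list of True/False votes."""
    majorities = sum(
        1
        for votes in votes_by_group.values()
        if votes and sum(votes) > len(votes) / 2
    )
    return majorities >= 2
```

Under this rule, a finding endorsed only by one group, however strongly, is not accepted, which reflects the cross-group agreement requirement described above.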

3.6. Limitations of the Study

The first concern is that the majority of participants were female, which could play a role in the way the participants perceived their experience of peer feedback. This could influence the outcome of the study with respect to gender differences in resolving interpersonal conflicts, processing critical feedback, and verbal communication styles. Second, random sampling of preservice students in the teacher education program was not conducted, since all students enrolled in the mathematics methods courses agreed to participate in the study. Before generalizing the findings, it would be appropriate to conduct additional studies including a larger sample from more university-based teacher education programs in the country.

4. Results

4.1. Perception Change on the Peer Feedback

Table 3 shows how PSTs rated their overall past experience with peer feedback in terms of meaningfulness and value of the learning opportunity, in response to the following instruction: “Reflecting on your overall past experience in previous courses, please indicate how meaningful (or valuable) your experience in peer feedback was.” Around 66% of the PSTs reported a lack of meaningfulness and 50% a lack of valuableness in peer feedback prior to experiencing the PRT model.
Regarding peer feedback formats, 83% of respondents (n = 52) indicated that they had previously experienced paper–pencil-based feedback with rubrics or prompts. Other formats included co-editing, group grading, and voting for the best. Most respondents (97%, n = 61) reported that they had had no opportunity to provide peer feedback with a facilitator while the peer on the receiving end was absent, which implies our PSTs might face difficulty in offering critical feedback to classmates, as noted in previous studies (e.g., [21,50]). Analyses of their written responses concerning the situation in which the presenter remains in the room confirm this interpretation.
Table 4 lists four constraints repeatedly mentioned by the respondents based on their past experience with feedback: Shallow comments, Lack of critical feedback, Sense of disconnect, and Low participation. While 46% of the respondents (n = 29) expressed difficulty in providing detailed feedback or authentic feedback due to their lack of knowledge, the second most common difficulty was the lack of critical comments. Thirty-five percent of the respondents (n = 22) expressed difficulty in providing critical feedback when the presenter was present. The rest of the PSTs also expressed a low level of willingness to provide feedback or participate in the process.
Table 5 presents the ratings of meaningfulness and valuableness of the peer feedback that respondents experienced through the PRT model over the semester: “Please rate your overall experience of peer feedback in this course. How meaningful (or valuable) was your experience?” The majority of respondents reported a high level of meaningfulness (about 73%) and valuableness (about 75%) of feedback through the PRT model by choosing “very much” or “a great deal”.
Our PRT model asked the presenter to leave the room during the peer feedback process. Regarding this situation, about 78% of the respondents (n = 48) reported a positive impact of anonymous peer feedback by choosing “very much” or “a great deal,” consistent with previous studies (e.g., [21,30,32]).

4.2. Level of Engagement

Regarding the change of participant feedback while experiencing the PRT model, we found an overall pattern of increasing participation in all three methods courses. Figure 1 illustrates the different levels of participation by providing the percent of those who contributed comments in the discussion for each session.

4.3. The Efficacy of the PRT Model

Regarding the efficacy of the PRT model, four themes emerged from PSTs’ responses: (1) Specific details of teaching (n = 53), (2) Authentic peer feedback (n = 42), (3) Reflective attitude (n = 39), and (4) Professional practice of feedback sharing (n = 55). The frequency of each theme was determined from the reflective writing assignments. The first two themes may be supported by Thurlings et al. [28], who identified the characteristics of effective feedback. The other two themes, reflective attitude and professional practice of feedback sharing, were drawn from the participants’ comments and may be added as components of effective feedback. Shown below are representative comments in which the proposed theme is evidenced by underlined words and phrases.
  • Theme 1: Specific details of teaching revisited in the PRT model
PSTs focused on the specific details of their teaching. These details afforded opportunities to analyze teaching decisions in the broad context of middle grades mathematics curricula in order to address the needs of students in the fieldwork.
“Within my micro teaching lesson, my peers seemed to have concerns about the focus… I agree that there were several different topics addressed, but I felt that there were several areas (order of operations and inverse operations) that would be helpful if reviewed before presenting the concept of solving equations. I do agree that I should not have focused on expressions versus equations or numerical versus algebraic equations.”
“My peers thought the [warm-up] strategy may not have been completely effective or some of the information may have been lost in translation when going from angles into triangles. A triangle has three interior angles. I hoped the class could figure out people categorize triangles by angles and by sides as they reviewed the types of angles. Not so fast. Maybe I should have started with asking the class to describe the triangle and extended to key properties. Their description will help them notice sides, angles, interiors, exteriors, etc. After all of these, perhaps the students were ready for sorting triangles… I began to realize that a lot of what we do as teachers are also about understanding [how] middle graders think and learn.”
  • Theme 2: Authentic peer feedback abundant in the PRT model
PSTs recognized the positive effects of the feedback model on enabling them to share candid and critical views with little discomfort.
“As a presenter, I would feel a little uncomfortable sitting in the room as my peers discussed my performance. There are some items that I would have wanted to defend but would not have wanted to sound defensive. It would also be hard to hear praise and wonder if it were truly heartfelt or if they were simply saying it because I was sitting there.”
“[As] a peer observer, I felt that I could be totally honest about the performance since the presenter was not in the room. I didn’t feel that I needed to modify my opinion in fear of hurting someone’s feelings. I also knew that I wanted to give careful thought to my comments so that there would not be unnecessary criticism given to a presenter. I wanted my critique to be helpful but not harsh.”
  • Theme 3: Reflective attitude emerging in the PRT model
PSTs demonstrated reflective attitudes as peer groups provided consensus regarding their teaching performance and suggested ways to improve. Reflective attitudes were increasingly evident in multiple feedback sessions (provided below) in which reflective and analytical dialogs enabled PSTs to focus on the growth of their teaching skills rather than defending their practices.
“It was also nice to know that there were some of the positive feedback about my classroom management and making the students feel at ease in my classroom… As I am feeling no confident in this area, I am now more focused on perfecting student questioning and allowing time for students to gather thoughts and respond to questions. This is one area that I had not really considered prior to this class and am definitely still weak. While I understood that it was important, I didn’t have any tools or resources to help build these skills… I am excited about growing in this area of my teaching abilities and skills.”
“[Feedback sessions] did help me start thinking about growing as a professional teacher. It helped me observe and see things that quite frankly I never thought about before until now. … Who am I to criticize my peers? We watched many teaching demos and the things that I saw my peers struggling were timing and teachable moments. Eventually, I started thinking about my own teaching, my classroom, and the kids I will be teaching… My peers seemed to all have the energy and confidence while teaching. I know [nobody’s] perfect. We watched someone teach and discussed the teaching as if we were teaching it. So there is no hard feeling because we are helping one another to review what kinds of mistakes were made and to improve with analysis.”
  • Theme 4: Professional practice of feedback sharing conceptualized in the PRT model
PSTs began to recognize feedback sharing as a professional practice of teachers. This seems partly enabled by the shared goal of improving teaching among fellow PSTs, as opposed to an evaluative, and therefore vulnerable, setting in which PSTs have limited opportunities to critically examine their teaching.
“When I think about this… people look at how you teach and make comments. They can be brutal since I am not there. But it was professionally done. People were serious and I was serious. I know people will talk about my teaching like experts do because that’s what happened when others were gone. …I thought these guys read my mind. When [professor told me] they thought I was asking questions to keep the class quiet, I laughed out loud because that’s exactly what I was thinking. It is encouraging we, educators talk in a very serious meeting about real stuff about teaching. It makes me feel I am part of this great group who knows each other so well. I’d be more comfortable [with] someone with as much or more knowledge of teaching and who understand[s] what I am going through each day in class analyze my teaching and share their honest opinions. [Professors] always talk about professional practice, professional practice, etc. all the time. I sort of begin to think about it—professional community and teaching as professional career…”
“I felt like I could speak openly about the performances, which should be a lot more beneficial than just saying everything was okay. It helped me to grow as a professional teacher because it made me realize where my weaknesses are and different things to be aware of when planning and implementing my own lessons. We did [nit-pick] each other, but those conversations were so beneficial for the people in the room; we learned so many do’s and don’ts from being that candid. … [It] was helpful to get the feedback on how we were seen by our peers teaching along with how [professor] analyzed the teaching like professional consultants. …. There is something only experts could see, but there are other things interns could see as important. For most of us this was our first real time planning and implementing a lesson which I know was quite [nerve-racking] for most but it is great to know we could still talk like we are adults, [professor] supports and challenges, and we do the same to him.”
Aligned with the four themes of efficacy in the PRT model, Table 6 presents a list of thematic comments on the impact of the PRT on PSTs’ learning in mathematics methods courses and field experiences. Each comment was corroborated by PSTs, instructors, or field supervisors, lending validity to the PRT model.

5. Discussion

With respect to the degree to which the PRT model succeeded in fostering engagement, we observed a pattern of increasing participation as the course progressed (see Figure 1). One may argue that such a pattern is not surprising because participants contribute more as they become comfortable with classmates, the classroom environment, or course materials; however, the instructors reported consistently low levels of participation before the PRT model was introduced. For one, PSTs perceived paper-and-pencil-based peer feedback with rubrics as neither meaningful nor valuable. Several factors of passive participation correspond to the reasons why PSTs did not value peer feedback in the past [28,35]: first, PSTs exhibited discomfort in providing critical feedback while the peer was present in the room; second, PSTs felt they did not have authentic opportunities to improve their work or performance by using feedback.
What features of the PRT model, then, motivated preservice teachers to contribute more willingly during peer feedback? Our analysis indicates that the PRT model provided authentic and effective environments for peer feedback, consistent with the five characteristics of effective feedback (i.e., feedback, task-related, timing, affective/emotional, and effect on learners) [28]. For example, the PRT model encouraged peer feedback to be specific, well-balanced, relevant, and data-based. As for appropriate timing, feedback was given immediately after micro teaching. The PRT model also allowed feedback to be task-related, as the methods course’s key outcome was the micro teaching task. Presenters were expected to demonstrate qualities of effective teaching emphasized in the course, which served as an engaging and relevant context for those preparing to be certified to teach in the near future. Feedback also focused on strengths, weaknesses, and suggestions, encouraging varied perspectives supported by course discussions and readings as well as specific strategies to improve weaknesses.
The findings of this study not only reflect the work of Thurlings et al. [28] on effective feedback, but also add to the literature by extending the affective/emotional domain of Thurlings et al.’s [28] five characteristics. We substantiated the benefit of implementing PRT in the absence of the presenter. To lessen discomfort for peer-evaluating PSTs, a primary feature of the PRT model was to ask the presenter to leave the room so that PSTs could comfortably critique and analyze their peer’s performance without the risk of upsetting or offending the presenter. Asking the presenter to leave the room created an effect of de-individuation [31] that helped establish an intellectually safe place for PSTs to analyze performances and receive peer comments. Over time, PSTs’ cynicism towards peer feedback diminished, and PSTs began to participate in feedback sessions more actively. Here, we argue that PSTs’ increasing engagement in peer feedback is not so much a product of their growing comfort with the process as a result of the relationship and trust that formed when PSTs found the feedback offered genuine and helpful.
Whether PSTs are comfortable or uncomfortable with feedback may not matter in the greater scheme of feedback as an essential tool to inform and improve their teaching, as long as teachers recognize the benefits of the practice and become active participants in peer feedback. Nonetheless, the discomfort that PSTs experienced while providing face-to-face feedback is bound to be felt by future teachers when they critique students’ work or co-teach with colleagues. In their past experiences of evaluating peers’ presentations, critical feedback had been avoided because the presenter remained in the room, which sustained a level of discussion that was superficially complimentary. This study concludes that such conditions have led some PSTs to remain aloof and unable to recognize peer feedback as a significant part of their learning.
The fact that nearly 80% of the participants believed that the absence of the presenter played a positive role in their feedback development lends credence to the view that psychological barriers may exist when PSTs are in a position to be critical of a peer’s performance. This also demonstrates how sensitively PSTs think and act in response to their learning environments. In other words, passive participation may not be caused by PSTs’ inability to embark on a reflective process but is more likely due to psychological barriers prohibiting PSTs from fully engaging in tasks such as peer feedback. For instance, PSTs’ vulnerability might stem from a lack of experience with performance analysis or a lack of confidence in their knowledge and skills related to teaching [53]. It is also possible that PSTs have yet to recognize their learning as a process and misunderstand teaching demonstrations as high-stakes evaluative tasks when, in fact, they are collective reflective processes for developing teaching skills [54].
The PRT model demonstrated that regularly participating in non-evaluative performance analysis enables PSTs to remain detached from grades and to use the analysis and advice they receive to inform their teaching. Over time, PSTs began to apply their experience of peer feedback to their teaching in their clinical experience. In the study, PSTs grew increasingly comfortable with critical feedback and, while student teaching, frequently referred to discussions from peer feedback in meetings with instructors and field supervisors. In this way, the PRT model empowered the instructors and supervisors in providing expert guidance, sharing professional perspectives in reflective critique [55], and using PSTs’ performance as a basis for reflection. Our findings highlight the potential of the practice of peer feedback as an authentic experience of sharing the concerns and challenges of teaching and working together towards solutions.

6. Conclusions

The tangible impact of reducing the discomfort of having the presenter in the same room lends credence to the view that affective components in feedback have the potential to increase effort, motivation, and engagement in feedback. Furthermore, feedback models should foster a learning environment that takes into consideration the agents involved, the types of information and knowledge exchanged, and the methods that mediate effective and authentic communication [44,45]. Overall, the study contributes to the body of knowledge on peer feedback in teacher education by supporting the view that PSTs do have the potential for critical reflection in feedback sessions when feedback is less evaluative and more informative. Effective feedback sessions with PSTs should consider the characteristics of effective feedback and involve analysis through consensus seeking and intellectual dialog with support from the instructor.
For future researchers, these findings may motivate more research on feedback models involving teachers at different stages of their careers and on the role of gender in critical feedback. This study did not examine whether PSTs improved their work when given a new opportunity after peer feedback. In the short run, a future study could examine the impact of peer feedback on learning outcomes (e.g., in what specific ways do micro teaching experiences impact PSTs’ performance in student teaching, and how do comments on previous micro teaching presentations influence subsequent presentations as the sessions progress?). In the long run, multiple international studies could examine how teachers in different countries respond to various feedback models; this line of work will prove significant in detailing the essential aspects of peer feedback in various cultural settings. Ultimately, teachers improve their teaching with experience, and feedback adds much to that experience. Further research on how teachers value critical feedback and on how PSTs feel empowered in the process of feedback exchange may create more ways to provide effective learning experiences for teachers.
If teacher education is committed to building critical professional communities as a change agent for sustainability [56], then this study demonstrates an evolving model for preservice teachers in their pre-professional courses. We assert that the PRT model will effectively facilitate critical feedback with PSTs, enabling teacher educators to create authentic learning opportunities to share critical feedback as part of reflective and sustainable practice [57] and fostering meaningful relationships and trust in the teaching profession.

Author Contributions

Conceptualization, W.L.; methodology, W.L. and S.-H.K.; formal analysis, W.L. and S.-H.K.; investigation, W.L. and J.-W.S.; data curation, W.L.; writing—original draft preparation, W.L.; writing—review and editing, J.-W.S. and S.-H.K.; supervision, S.-H.K.; funding acquisition, S.-H.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Yonsei University Research Grant 2020-22-0458.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Survey I (Pre-survey)
  • Please describe your perception of peer feedback. How would you explain peer feedback to others who have not experienced it?
  • Please describe your past experience engaging in peer feedback from previous coursework.
  • From your previous experience, please recall and list any peer feedback methods and formats you can remember.
  • Reflecting on your overall past experience in previous courses, how meaningful was your experience in peer feedback?
    Barely   A little   Somewhat   Very much   A great deal
  • Reflecting on your overall past experience in previous courses, how valuable was your experience in peer feedback?
    Barely   A little   Somewhat   Very much   A great deal
  • When you participated in peer feedback, was the presenter in the same room? If so, describe the impact of the presenter remaining in the room. How did it impact or not impact your honest, critical, or productive feedback?
  • Please describe your experience of using “writing” as the primary mode of providing feedback.

Appendix B

Survey II (Post-survey)
  • Please rate your overall experience of peer feedback in this course. How meaningful was your experience?
    Barely   A little   Somewhat   Very much   A great deal
  • Please rate your overall experience of peer feedback in this course. How valuable was your experience?
    Barely   A little   Somewhat   Very much   A great deal
  • When the presenter left the room, did it positively affect your peer feedback sessions?
    Barely   A little   Somewhat   Very much   A great deal
  • Regarding your response to the previous question, how so?
  • Please describe, in detail, the factors that allowed you to participate more meaningfully in peer feedback than in previous courses.
  • Please describe the ways in which your experience with peer feedback has influenced your learning in this course.

References

  1. Boyd, P.C.; Boll, M.; Brawner, L.; Villaume, S.K. Becoming reflective professionals: An exploration of preservice teacher’s struggles as they translate language and literacy theory into practice. Action Teach. Educ. 1998, 19, 61–75. [Google Scholar] [CrossRef]
  2. Minor, L.C.; Onwuegbuzie, A.J.; Witcher, A.E.; James, T.L. Preservice teachers’ educational beliefs and their perceptions of characteristics of effective teachers. J. Educ. Res. 2002, 96, 116–127. [Google Scholar] [CrossRef]
  3. Pihlaja, P.M.; Holst, T.K. How reflective are teachers? A study of kindergarten teachers’ and special teachers’ level of reflection in daycare. Scand. J. Educ. 2011, 1–17. [Google Scholar] [CrossRef]
  4. Rosen, D. Impact of case-based instruction on student teachers’ reflection on facilitating children’s learning. Action Teach. Educ. 2008, 30, 28–36. [Google Scholar] [CrossRef]
  5. Atkinson, M. The scholarship of teaching and learning: Reconceptualizing scholarship and transforming the academy. Soc. Forces 2001, 79, 1217–1229. [Google Scholar] [CrossRef]
  6. Boyer, E. Scholarship Reconsidered: Priorities of the Professoriate; The Carnegie Foundation for the Advancement of Teaching: Princeton, NJ, USA, 1990. [Google Scholar]
  7. Kreber, C. Teaching excellence, teaching expertise, and the scholarship of teaching. Innov. High. Educ. 2002, 27, 5–23. [Google Scholar] [CrossRef]
  8. McKinney, K. The scholarship of teaching and learning: Past lessons, current challenges, and future visions. Improv. Acad. 2004, 22, 3–19. [Google Scholar] [CrossRef]
  9. Brutus, S.; Donia, M. Improving the effectiveness of students in groups with a centralized peer evaluation system. Acad. Manag. Learn. Educ. 2010, 9, 652–662. [Google Scholar]
  10. Topping, K. Peer assessment between students in colleges and universities. Rev. Educ. Res. 1998, 68, 294–297. [Google Scholar] [CrossRef]
  11. Double, K.S.; McGrane, J.A.; Hopfenbeck, T.N. The impact of peer assessment on academic performance: A meta-analysis of control group studies. Educ. Psychol. Rev. 2020, 32, 481–509. [Google Scholar] [CrossRef] [Green Version]
  12. Zeng, L.M. Peer review of teaching in higher education: A systematic review of its impact on the professional development of university teachers from the teaching expertise perspective. Educ. Res. Rev. 2020, 31, 1–16. [Google Scholar] [CrossRef]
  13. Hendry, G.D.; Georgiou, H.; Lloyd, H.; Tzioumis, V.; Herkes, S.; Sharma, M.D. ‘It’s hard to grow when you’re stuck on your own’: Enhancing teaching through a peer observation and review of teaching program. Int. J. Acad. Dev. 2021, 26, 54–68. [Google Scholar] [CrossRef]
  14. Georgiou, H.; Sharma, M.; Ling, A. Peer review of teaching: What features matter? A case study within STEM faculties. Innov. Educ. Teach. Int. 2018, 55, 190–200. [Google Scholar] [CrossRef]
  15. Buchanan, M.T.; Stern, J. Pre-service teachers’ perceptions of the benefits of peer review. J. Educ. Teach. Int. Res. Pedagog. 2012, 38, 37–49. [Google Scholar] [CrossRef]
  16. Thurlings, M.; den Brok, P. Student teachers’ and in-service teachers’ peer learning: A realist synthesis. Educ. Res. Eval. 2018, 24, 13–50. [Google Scholar] [CrossRef]
  17. Rodman, G.J. Facilitating the teaching-learning process through the reflective engagement of pre-service teachers. Aust. J. Teach. Educ. 2010, 35, 20–34. [Google Scholar] [CrossRef] [Green Version]
  18. Sagor, R. Guiding School Improvement with Action Research; Association of Supervision and Curriculum Development: Alexandria, VA, USA, 2000. [Google Scholar]
  19. Vieira, F.; Marques, I. Supervising reflective teacher development practices. ELTED 2002, 6, 1–18. [Google Scholar]
  20. Ghorpade, J.; Lackritz, J.R. Peer evaluation in the classroom: A check for sex and race/ethnicity effects. J. Educ. Bus. 2001, 76, 274–282. [Google Scholar] [CrossRef]
  21. MacLeod, L. Computer-aided peer review of writing. Bus. Commun. Q. 1999, 62, 87–95. [Google Scholar] [CrossRef]
  22. Nilson, L.B. Improving student peer feedback. Coll. Teach. 2003, 51, 34–39. [Google Scholar] [CrossRef]
  23. Panadero, E.; Alqassab, M. An empirical review of anonymity effects in peer assessment, peer feedback, peer review, peer evaluation and peer grading. Assess. Eval. High. Educ. 2019, 44, 1253–1278. [Google Scholar] [CrossRef]
  24. Griffith, A.N.; Johnson, H.E.; Larson, R.W.; Buttitta, E.K. A qualitative examination of critical feedback processes in project-based youth programs. Contemp. Educ. Psychol. 2020, 62, 1–9. [Google Scholar] [CrossRef]
  25. London, M. Giving feedback: Source-centered antecedents and consequences of constructive and destructive feedback. Hum. Resour. Manag. Rev. 1995, 5, 159–188. [Google Scholar] [CrossRef]
  26. Lu, R.; Bol, L. A comparison of anonymous versus identifiable e-peer review on college student writing performance and the extent of critical feedback. J. Interact. Online Learn. 2007, 6, 100–115. [Google Scholar]
  27. Machin, T.M.; Jeffries, C.H. Threat and opportunity: The impact of social inclusion and likeability on anonymous feedback, self-esteem, and belonging. Personal. Individ. Differ. 2017, 115, 1–6. [Google Scholar] [CrossRef]
  28. Thurlings, M.; Vermeulen, M.; Bastiaens, T.; Stijnen, S. Understanding feedback: A learning theory perspective. Educ. Res. Rev. 2013, 9, 1–15. [Google Scholar] [CrossRef]
  29. Quible, Z.K. The efficacy of several writing feedback systems. Bus. Commun. Q. 1997, 60, 109–124. [Google Scholar] [CrossRef]
  30. Robinson, J. Computer-assisted peer review. In Computer-Assisted Assessment in Higher Education; Brown, S., Bull, J., Race, P., Eds.; Kogan Page: London, UK, 1999; pp. 95–102. [Google Scholar]
  31. Connolly, T.; Jessup, L.M.; Valacich, J.S. Effects of anonymity and evaluative tone on idea generation in computer-mediated groups. Manag. Sci. 1990, 36, 689–703. [Google Scholar] [CrossRef]
  32. Liu, E.Z.; Lin, S.S.; Chiu, C.H.; Yuan, S.M. Web-based peer review: The learner as both adapter and reviewer. IEEE Trans. Educ. 2001, 44, 246–251. [Google Scholar] [CrossRef]
  33. Scheeler, M.C.; Ruhl, K.L.; McAfee, M.K. Providing performance feedback to teachers: A review. Teach. Educ. Spec. Educ. 2004, 27, 396–407. [Google Scholar] [CrossRef]
  34. Shute, V. Focus on formative feedback. Rev. Educ. Res. 2008, 78, 153–189. [Google Scholar] [CrossRef]
  35. Gielen, S.; Peeters, E.; Dochy, F.; Onghena, P.; Struyven, K. Improving the effectiveness of peer feedback for learning. Learn. Instr. 2010, 20, 304–315. [Google Scholar] [CrossRef]
  36. Martens, R.; de Brabander, C.; Rozendaal, J.; Boekaerts, M.; Van der Leeden, R. Inducing mind sets in self-regulated learning with motivational information. Educ. Stud. 2010, 36, 311–327. [Google Scholar] [CrossRef]
  37. Li, L.; Liu, X.; Steckelberg, A.L. Assessor or assessee: How student learning improves by giving and receiving peer feedback. Br. J. Educ. Technol. 2010, 41, 525–536. [Google Scholar] [CrossRef]
  38. Colasante, M. Using video annotation to reflect on and evaluate physical education pre-service teaching practice. Australas. J. Educ. Technol. 2011, 27, 66–68. [Google Scholar] [CrossRef] [Green Version]
  39. Fund, Z. Effects of communities of reflecting peers on student–teacher development—Including in-depth case studies. Teach. Teach. Theory Pract. 2010, 16, 679–701. [Google Scholar] [CrossRef]
  40. Hyland, F. Providing effective support: Investigating feedback to distance language learners. Open Learn. 2001, 16, 233–247. [Google Scholar] [CrossRef]
  41. Orsmond, P.; Merry, S. Feedback alignment: Effective and ineffective links between tutors’ and students’ understanding of coursework feedback. Assess. Eval. High. Educ. 2011, 36, 125–136. [Google Scholar] [CrossRef]
  42. Hattie, J. Visible Learning: A Synthesis of over 800 Meta-Analyses Relating to Achievement; Routledge: London, UK, 2009. [Google Scholar]
  43. Black, P.; Wiliam, D. Assessment and classroom learning. Assess. Educ. Princ. Policy Pract. 1998, 5, 7–74. [Google Scholar] [CrossRef]
  44. Hattie, J.; Timperley, H. The power of feedback. Rev. Educ. Res. 2007, 77, 81–112. [Google Scholar] [CrossRef]
  45. Hattie, J. Know thy impact. Educ. Leadersh. 2012, 70, 18–23. [Google Scholar]
  46. Blackmore, J.A. A critical evaluation of peer review via teaching observation within higher education. Int. J. Educ. Manag. 2005, 19, 218–232. [Google Scholar] [CrossRef]
  47. Fileborn, B.; Wood, M.; Loughnan, C. Peer reviews of teaching as appreciative inquiry: Learning from “the best” of our colleagues. High. Educ. 2020, 1–15. [Google Scholar] [CrossRef]
  48. Clark, D.J.; Redmond, M.V. Small Group Instructional Diagnosis: Final Report. 1982. Available online: http://files.eric.ed.gov/fulltext/ED217954.pdf (accessed on 26 January 2020).
  49. Bowden, D. Small group instructional diagnosis: A method for enhancing writing instruction. WPA J. Counc. Writ. Program Adm. 2004, 28, 115–135. [Google Scholar]
  50. Alqassab, M.; Strijbos, J.W.; Ufer, S. Preservice mathematics teachers’ beliefs about peer feedback, perceptions of their peer feedback message, and emotions as predictors of peer feedback accuracy and comprehension of the learning task. Assess. Eval. High. Educ. 2019, 44, 139–154. [Google Scholar] [CrossRef]
  51. Ballantyne, R.; Hughes, K.; Mylonas, A. Developing procedures for implementing peer assessment in large classes using an action research process. Assess. Eval. High. Educ. 2002, 27, 427–441. [Google Scholar] [CrossRef]
  52. Creswell, J.W.; Miller, D.L. Determining validity in qualitative inquiry. Theory Pract. 2000, 39, 124–130. [Google Scholar] [CrossRef]
  53. Murphy, C.; Neil, P.; Beggs, J. Primary science teacher confidence revisited: Ten years on. Educ. Res. 2007, 49, 415–430. [Google Scholar] [CrossRef]
  54. Pedro, J.Y. Reflection in teacher education: Exploring pre-service teachers’ meanings of reflective practice. Reflective Pract. 2005, 6, 49–66. [Google Scholar] [CrossRef]
  55. Liu, S.H. Effects of an online sharing and feedback programme on preservice teachers’ practical knowledge, learning preferences and satisfaction. Technol. Pedagog. Educ. 2020, 29, 463–475. [Google Scholar] [CrossRef]
  56. Lane, S.; Lacefield-Parachini, N.; Isken, J. Developing novice teachers as change agents: Student teacher placements “against the grain”. Teach. Educ. Q. 2003, 30, 55–68. [Google Scholar]
  57. Burns, H. Meaningful sustainability learning: A research study of sustainability pedagogy in two university courses. Int. J. Teach. Learn. High. Educ. 2013, 25, 166–175. [Google Scholar]
Figure 1. Level of participation, fall 2018–fall 2020.
Table 1. Characteristics of effective feedback in five clusters [28,34,35,36,37,38,39,40,41].
Feedback Cluster | Characteristics
1. Feedback | Specific, consistent, positive, unbiased, balanced between positive and negative, evidence-based, formative, relevant, leaving control to learner, constructive, challenging
2. Task-related | Focused on and related to task, aligned with goals, contains information about progress
3. Timing | Immediate and frequent, when learners remember their actions
4. Affective/emotional | Supporting, honest, promoting positive motivational beliefs
5. Effect on learners | Supporting learners to engage in comparing performance with standard, supporting learners to engage in action to close gap, creating cognitive dissonance
Table 2. Research studies supporting the adapted PRT model.
Key Process of PRT Model | Research Studies
Feedback addresses teaching performance, which is the central task of methods courses | Characteristics of effective feedback in five clusters [28]
The presenter leaves the room during the feedback session; feedback should provide strengths, weaknesses, and specific recommendations | A balance between positive and negative remarks [34,35,36,37,40,41]
Instructor summarizes feedback as content/pedagogy expert and goes over it with the presenter at a later time | Objectivity, quality, and fairness of the process [29,51]
Table 3. Overall rate of past experience of peer feedback.
Scale | Barely | A Little | Somewhat | Very Much | A Great Deal
Item 1. Ratings on meaningfulness (n = 62 *) | 21 (34%) | 20 (32%) | 16 (26%) | 3 (5%) | 2 (3%)
Item 2. Ratings on valuableness (n = 60 **) | 14 (23%) | 16 (27%) | 18 (30%) | 8 (13%) | 4 (7%)
* One response is missing; ** three responses are missing.
Table 4. Recounting past experiences regarding the difficulty of providing authentic feedback.
Theme | Representative Comments | Counts (n = 63)
Lack of critical feedback | “I provided positive comments because the presenter was in the room.” / “People say nice things to be nice…they are so vain… they could be more [constructive] about what they found in my presentation.” | 45 (71%)
Shallow comments | “I don’t like to give details about the teaching because it takes too much time to write about.” / “I used to be unsure how to analyze someone’s teaching. Really. I am not the instructor, why would I get so judging and [analytical]?” | 23 (37%)
Low participation | “I don’t like to criticize people so I don’t say a lot during feedback.” / “It doesn’t count for a grade… it serves no purpose at all.” | 22 (35%)
Sense of disconnect | “Nobody cares about peer feedback, so why bother.” / “Peer feedback’s been around forever… Professors like it because they don’t have to grade themselves…” | 28 (44%)
Table 5. Overall rate of experience of (PRT) model feedback and PSTs’ perception on the policy that the presenter leaves the room.
Scale | Barely | A Little | Somewhat | Very Much | A Great Deal
Item 1. Ratings on meaningfulness (n = 61) | 1 (2%) | 3 (5%) | 12 (20%) | 33 (54%) | 12 (19%)
Item 2. Ratings on valuableness (n = 61) | 1 (2%) | 4 (7%) | 10 (16%) | 31 (51%) | 15 (24%)
“When the presenter left the room, did it positively affect your peer feedback sessions?” (n = 61) | 0 (0%) | 4 (7%) | 9 (15%) | 26 (42%) | 22 (36%)
Table 6. Agreed impact of the PRT model.
Impact of the PRT Process | Supported by
1. Providing opportunities for PSTs to perceive the math methods instructors as instructional coaches or professional mentors rather than as evaluators or distant academics | PSTs and instructors
2. Feedback comments and recommendations for improvement were taken seriously and referred to frequently by PSTs in their field experience and methods courses. | Instructors and field supervisors
3. PSTs developed a habit of exchanging critical and reflective views. | PSTs and instructors
4. Providing opportunities for PSTs to experience a sense of professional community by providing formative feedback comments in a structured intellectual environment | PSTs, field supervisors, and instructors
5. Effective teaching practices and professional teacher behaviors were recognized in the process of consensus seeking, and these practices and behaviors were emulated afterwards. | Instructors and field supervisors
6. Providing opportunities in which meaningful connections were made between content and pedagogy | PSTs and instructors
Share and Cite

Lim, W.; Son, J.-W.; Kang, S.-H. How Reducing Discomfort Impacts Peer Assessments of Preservice Teachers. Sustainability 2021, 13, 6435. https://doi.org/10.3390/su13116435