Article

Unveiling Classroom Assessment Literacy: Does Teachers’ Self-Directed Development Play Out?

1 School of Languages and Communication, Beijing Technology and Business University, Beijing 100048, China
2 Department of Education Studies, Hong Kong Baptist University, Kowloon, Hong Kong
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(9), 961; https://doi.org/10.3390/educsci14090961
Submission received: 23 July 2024 / Revised: 21 August 2024 / Accepted: 29 August 2024 / Published: 1 September 2024

Abstract

Ideally, teachers’ classroom assessment literacy (CAL) can be developed through in-service teacher education or assessment training provided by institutions. Yet in reality, teachers may not receive sufficient assessment training on the job or from institutionalised training programmes. However, this contextual disadvantage does not excuse teacher inertia in advancing their professional knowledge and skills in classroom-based assessment. Instead, teachers are encouraged to rely proactively on themselves to enhance their CAL amid their tried-and-tested assessment practices. The current qualitative case study explores how a university English teacher directed herself to develop CAL through her assessment practices over time. Data were collected through narrative frames, interviews with the teacher and her students, classroom observations, and documents. The study shows that self-directed CAL development may be buttressed by the teacher’s prior assessment experiences. The teacher’s self-agency and reflections further empowered her to acquire the assessment knowledge, skills, and experience needed to improve assessment effectiveness. Implications for enhancing self-directed professional development in assessment are also discussed.

1. Introduction

Classroom assessment literacy (CAL) generally refers to teachers’ ability to conduct valid, reliable, and instructional goal-oriented classroom-based assessment to promote student learning [1,2]. It has become a focal point in educational research, emphasising the necessity for teachers to be proficient in designing and implementing assessments as well as analysing and interpreting assessment data. Classroom-assessment-literate teachers can adeptly identify students’ learning difficulties from assessment results and adjust their teaching to help students achieve desirable learning goals [3]. Teachers who are literate in classroom assessment also need to adopt effective assessment approaches to make informed decisions for various stakeholders [4]. For university English teachers, CAL is particularly crucial owing to the diverse linguistic and cultural backgrounds of university students and the multifaceted nature of language acquisition. Given this, researchers have strongly advocated integrating CAL into teachers’ professional development regardless of educational level, context, or system [1,5].
Ideally, teachers’ CAL can be developed or enhanced through external assistance, including teacher education programmes [6], assessment training or courses [7,8], or other supportive forms provided by their institutions (e.g., in-house training workshops). However, in real educational practice, especially at the tertiary level, teachers may not gain sufficient support for assessment literacy development from their institutions [9]. As a result, they frequently need to rely on themselves on the job to acquire assessment knowledge and skills, accumulate assessment experience, and use assessment methods that meet students’ learning needs and institutional requirements [10].
Against this backdrop, it is vital for teachers to direct themselves to develop their CAL through their classroom assessment practices. Despite its importance, research on self-directed CAL development, particularly in higher education, remains limited [2,11]. The present study therefore aims to fill this gap by examining how a university English teacher self-reliantly develops her CAL through her classroom assessment practices. By doing so, this study sheds light on how language teachers drive their own professional development in language assessment through their agency.

2. CAL and Professional Development

CAL encompasses a comprehensive understanding of assessment principles, methods, and their implications for student learning [12]. Teachers at all educational levels should be classroom-assessment literate so that they can develop and utilise appropriate assessment methods, especially alternative methods (e.g., peer or portfolio assessment), to improve and judge students’ learning, even though teachers rarely utilise such methods in the classroom [13,14]. Given that teachers are the key agents in implementing assessment, their competency to conduct valid, reliable, and effective assessment has become part of their accountability [15]. This accountability also puts pressure on teachers to identify and address their concerns about classroom-based assessment practices, which drives teachers at different educational levels to enhance their CAL.
Effective CAL development is notably influenced by teacher education and professional development. Yet, traditional teacher education programmes often emphasise content knowledge and pedagogy over assessment literacy, leading to gaps in teachers’ abilities to design and implement effective assessments [16]. Professional development programmes focusing on assessment literacy can help bridge these gaps. For example, Koh [17] showed that ongoing and sustainable professional development significantly contributed to primary teachers’ CAL enhancement. Similarly, DeLuca and Johnson [18] discovered that ongoing, collaborative, and context-specific professional development programmes are more effective in enhancing assessment literacy. At the tertiary level, Xu and Brown [19] found that professional development initiatives that include practical training in assessment design and interpretation led to improved CAL among educators.
Recent trends in teacher education emphasise the integration of assessment literacy into both pre-service and in-service training programmes. Lee [7] developed a pedagogical project for pre-service ESOL teachers, promoting their language assessment literacy by having them critique their own assessment practices. Language assessment literacy (LAL) refers to assessment literacy as it applies to language teaching and learning, and researchers have advocated further exploration of classroom-based LAL. Likewise, Fulcher [20] suggested that introducing apprenticeships into language assessment courses could significantly promote language teachers’ LAL. Recently, Ukrayinska [21] proposed a synergetic approach to providing an LAL training programme for pre-service teachers in Ukrainian tertiary institutions, an approach that might work effectively to promote teachers’ LAL. As for in-service teachers’ CAL development at the tertiary level, research has revealed that practice-based teacher education courses focusing on CAL can foster EAP instructors’ development of assessment theories and practices [22].
Existing studies have implied that targeted professional development programmes offer a promising solution to teachers’ CAL enhancement. Integrating practical, collaborative, and context-specific training into teacher education can enhance teachers’ assessment literacy, ultimately benefitting student learning outcomes.

3. CAL and Self-Directed Professional Development

Although developing teachers’ CAL through professional programmes, teacher education courses, or training is undeniably effective, many educational contexts lack such external support for teachers’ CAL development. Consequently, teachers often rely on themselves to enhance their CAL on the job by conducting assessment activities and gaining experience through trial and error with various assessment methods. For example, the college English teacher in Zhang’s study [9] mobilised his self-reflection and self-empowerment to enact effective assessment for learning in a constrained context (e.g., a lack of administrative support and training), although that study focused more on assessment practices than on literacy development. It is also noted that self-directed development is indispensable even when teacher education or professional programmes are provided. For instance, the pre-service teachers in the study of Cowie et al. [23] developed their CAL in terms of assessment purposes by proactively negotiating and resolving the conflicting priorities between their teacher education and school experiences. These emerging landscapes highlight the importance of self-directed professional development for teachers at different educational levels, whether external support is available or not.
Self-directed professional development involves professional learning initiated and driven by teachers through self-motivation and strong willpower [24]. It emphasises the use of teachers’ agency to meet students’ learning needs in localised contexts [25]. Compared with guided development, self-directed professional development is characterised by self-empowerment and self-reliance, enabling teachers to utilise a wide range of resources to support their teaching. These resources include not only pedagogical materials but also external support such as colleagues, researchers, institutions or schools, and online resources [9]. Furthermore, self-directed professional development fosters self-reflection and self-initiative, allowing teachers to assess and refine their instructional practices based on identified challenges and feedback [4].
The process of self-directed professional development is not linear. Teachers must dynamically interact with social and cultural factors to explore and utilise external resources for their classroom instruction [9,23]. This involves overcoming various challenges, both intended and unintended, such as managing teachers’ emotions towards teaching, student responses, and the necessity to adapt instruction for improvement [25]. Reflection serves as a critical framework through which these challenges can be effectively addressed [26].
Despite the recognised importance of self-directed professional development, research on its synergy with assessment literacy is limited. Babaii and Asadnia [27] found that EFL teachers’ reflective practices on recent assessment research helped them to evaluate and improve their classroom assessment practices, contributing to their LAL development. Similarly, Tian et al. [28] revealed that reflections on assessment across cultures and contexts enhanced teachers’ professional development in LAL. The development of teachers’ self-assessment literacy, in turn, noticeably prompted their autonomous learning [29]. The latest study by Gan and Lam [10] suggested that teachers’ self-agency was crucial for their LAL development as they utilised available resources for classroom assessment.
However, it should be pointed out that teachers’ self-agency for CAL development in real classroom environments is sometimes constrained by contextual factors (e.g., institutional policies or requirements). For example, Mansouri et al. [30] found that school assessment policies restricted EFL teachers’ autonomy to implement their preferred assessment practices. In such cases, teachers needed to adjust their agentive decisions to work around these restrictions and practise their CAL.
In summary, the literature indicates that self-directed development significantly impacts teachers’ CAL. However, there is still a paucity of research on how teachers, especially those in tertiary institutions, empower themselves to enhance their CAL over time. This study aims to fill this gap by addressing one research question: how does a university English teacher independently develop her CAL through daily classroom assessment practices?

4. Methodology

An ethnographic case study, interpretive and qualitative in nature [31], was adopted to investigate a university English teacher’s CAL development over time. As this approach features thick descriptions of cases based on long-term data collection [32], it is well aligned with the purpose of this study: revealing the context-specific processes of a phenomenon.

4.1. Research Context

College English is a compulsory course for non-English major students in China. The course aims to inspire student learning and enhance the curriculum through both formative and summative assessments. According to the Guidelines on College English Teaching (GCET) [33], university English teachers are encouraged to assess students’ language performance, provide constructive feedback, and involve students in the assessment process through various methods, including self-assessment, peer assessment, and portfolio assessment.
To effectively implement these activities, GCET emphasises the importance of developing university English teachers’ LAL particularly in classroom-based assessment. It proposes that universities should provide in-house training programmes or opportunities for teachers to take regional or national training offered by external institutes to empower them with assessment knowledge, skills, and practices. However, research indicates that university English teachers often do not receive adequate training in classroom-based assessment [34]. As a result, many teachers primarily rely on themselves as resources to enhance their CAL through practical assessment activities.
This paper focuses on Emma (a pseudonym), one of the four participants in a longitudinal study on LAL development, to illustrate her CAL development process over time.

4.2. Participant

Emma was chosen as the focus of the current study for the following reasons. First, she was an experienced English teacher, capable of being self-directed in her classroom teaching and assessment. Upon joining the study, she had taught College English for 12 years at AU, a university specialising in agricultural fields. Second, although her research area was second language acquisition, Emma showed great interest in language assessment, as she hoped to advance her career through academic publications in this field and was keen to conduct assessment-related research. Third, like most mid-career teachers whose approaches to assessment are influenced by classroom experience [35], Emma was equipped with CAL to some degree at the beginning of the study. She was familiar with language testing and assessment (LTA) content owing to the assessment course she took in her pre-service teacher training programme and her continuous self-learning in LTA.

4.3. Data Collection and Analysis

Guided by the principles of an ethnographic case study, data were collected and triangulated over three semesters using a variety of methods: narrative frames (NF), interviews with Emma (IT), focus group interviews with her students (FGI), classroom observations, and documents. Four narrative frames were utilised, allowing Emma to recount her experiences related to LTA with the frames acting as scaffolding [36]. Under the guidance of the first author, the initial NF was completed at the start of the first semester, with the subsequent three NFs finished at the end of each semester.
Semi-structured interviews with Emma were conducted to complement the narrative frames, addressing any gaps caused by space constraints [36]. Two interviews per semester, at the beginning and end, totalled six interviews. These sessions explored Emma’s assessment plans, tasks, feelings, perceptions, and reflections on assessment practices. Each interview lasted approximately one and a half hours and was conducted in Chinese to facilitate free expression. The interviews were audio-recorded, transcribed verbatim, and only relevant excerpts were translated into English for use as evidence.
To triangulate Emma’s interviews, a videotaped focus group interview with her students was conducted after each round of classroom observations at the end of each semester, resulting in three focus group interviews. These sessions encouraged students to express and exchange their views on Emma’s assessment practices. A series of guiding questions ensured the quality and flow of these discussions. Six or seven students, randomly selected from a name list, participated in each interview.
Classroom observations were also conducted to complement NF and IT data, focusing on how Emma enhanced her CAL through assessment practices. Observations included a total of 11 College English sessions and two in-class speaking tests for the three semesters, with accompanying field notes. Each 90 min observation was videotaped, and segments related to assessment were transcribed verbatim. These segments included how Emma organised teaching, learning, and assessment activities, utilised assessment methods to motivate and promote student learning, provided feedback, and encouraged error correction. As the segments were delivered in English, no translation was necessary.
Documents provided another crucial data source. These included course syllabi, curriculum requirements, assessment rubrics, criteria, grading sheets, assessment plans, student assignments, and presentation slides. Documents were collected at various times during classroom observations, providing insights into assessment policies and teacher development related to assessment.
An inductive approach was adopted for data analysis [37]. The authors repeatedly read and re-read the narrative and interview data to produce preliminary codes through comparison, re-examination, and re-evaluation. Informed by the research question, codes were narrowed to focus on keywords and phrases pertinent to the study. The literature on CAL and self-directed professional development supplemented and refined the codes to avoid overlooking important aspects (e.g., teachers’ emotional responses to assessment, awareness of being an assessor, and CAL development) [19,38]. These codes converged into common categories, such as self-directed assessment effectiveness, assessment frequency, students’ emotional responses, and reflections on assessment research to facilitate self-directed CAL. All codes and categories were iteratively examined and refined until themes relevant to the research question and study purposes were finalised.
Throughout the data analysis, the authors collaboratively examined the datasets and reconciled differences through discussions until agreed themes emerged. Based on the themes from NFs and ITs, videotaped classroom observations were repeatedly reviewed and analysed to identify assessment scenarios demonstrating the participant’s CAL development through assessment practices. These scenarios were also used to triangulate the participant’s narratives. To ensure its trustworthiness, all coded data were sent to Emma for member checking, and her feedback was incorporated into the interpretation of findings.

5. Findings

5.1. Initial Self-Directed Conceptualisation of Classroom Assessment

Emma’s initial understanding of classroom assessment was shaped by her personal experiences as a student. She encountered “tons of paper-and-pencil tests in primary and secondary schools” (IT1), where such tests were widely regarded as the only valid assessment of academic achievement. However, these tests, as she claimed, “cannot assess students’ learning performance comprehensively. They are … paying little attention to assessing learning process”. She argued that paper-and-pencil tests “don’t contribute to students’ learning autonomy. They are useless for me (her)” (IT1). As a student, Emma displayed an aversion towards paper-and-pencil tests, which are usually characterised by knowledge reproduction and a summative, score/grade-oriented focus [39]. As such, Emma insisted that such tests failed to document her learning progress or demonstrate her continuous efforts and positive attitude.
Upon entering university, Emma experienced a shift in assessment methods, particularly in language skill evaluation. These methods could assess students from different perspectives and were more “comprehensive and authentic” (IT1). A notable example was the portfolio assessment used by her writing teacher from an English-speaking country, which Emma found profoundly influential. She valued the portfolio for its ability to document her learning process, academic growth, and achievements. Moreover, it helped her develop the self-regulation skills crucial for her future professional development—an advantage that paper-and-pencil tests could not offer.
Upon becoming a postgraduate student, Emma enrolled in an elective course on language testing and assessment. This course was primarily oriented towards fundamental concepts, including validity, reliability, and the purposes of testing, but excluded hands-on assessment practice. Consequently, by the time Emma began her professional career, she struggled to retain the theoretical content she had learned, which left her considerably disappointed with the course’s unbalanced emphasis.
Apart from formal coursework, Emma expanded her understanding of language assessment through various training programmes. While these programmes enriched her assessment knowledge, they were primarily geared towards grading speaking, writing, and translation for standardised tests, rather than classroom-based assessment. Nonetheless, Emma’s self-directed learning, particularly through reading and teaching practice, played a crucial role in her professional development. Teaching an IELTS speaking course, for instance, compelled her to delve into IELTS assessment criteria and explore effective assessment methods for students’ speaking abilities (NF1).
Emma’s prior assessment experiences ultimately empowered her with a self-directed conceptualisation of classroom-based assessment. She was well-versed in language testing and assessment content and grading but maintained a critical attitude towards traditional tests. Emma preferred alternative assessment methods that track students’ learning progress and support their academic growth.

5.2. Self-Directed Development of CAL: Using and Improving Peer Assessment

In her class, Emma highlighted students’ involvement and interaction in assessment, focusing particularly on productive language skills. To meet her department’s requirement for a greater emphasis on formative assessment over summative assessment, she designed activities such as oral group reports, role plays, and individual presentations. Peer assessment was integrated into these activities specifically to evaluate the students’ speaking skills. As Emma explained, “If they [students] can express in English, there is no problem with their writing. So is their listening” (IT1). She believed that English learning and assessment should be oriented towards productive skills and that peer assessment could “motivate and improve students’ speaking by involving students in criteria and samples learning” (IT2).
Emma’s integration of peer assessment with speaking was intended to maximise its potential to foster students’ learning autonomy. She developed assessment criteria, such as clarity and grammatical accuracy, based on the IELTS speaking criteria, and required students to assess their fellow students’ performance in various speaking activities (e.g., group discussion, role play), as illustrated in the following classroom observation episode.
Having completed a group discussion and a story description regarding the topic “loving your neighbours”, students proceeded to a role play part.
Emma: Now, we’re going to move into another activity, that is a role play. You are still required to work in pairs and groups, okay? In the very first step, you are supposed to tell a story. Firstly, you get prepared to tell a story and choose one of the situations below (shown in the screen) and talk about when you have some funny or amusing experiences, for example, you get stuck in a lift or miss a flight or train. Yes, or some embarrassing moment… And in the second step, you’re going to take turns to play the roles of a storyteller and a listener. Those listeners should write down several keywords (Emma wrote down “clarity, details, errors, and funny/not funny” on the blackboard) on a piece of paper and evaluate the partner’s story by giving a tick.
Students began to tell their stories in pairs within a group. There were seven groups with six students in each. Meanwhile, Emma walked around each group and sat among students to listen and talk. About 10 min later, one student from each pair was asked to share his/her story.
Emma: (After one student from group one finishes her story) Thank you. I’m impressed by the consistency in keeping the past tense. You don’t need to tell a complicated story. Even though the story is very simple, if you can keep it in the past tense, that’s a great job. Ok, now it comes to group two. Who is the group leader? Who would you want to share the story?
This episode shows that despite Emma’s efforts, the integration was not effectively implemented. Emma neither collected students’ assessment sheets nor required students to provide oral feedback to their fellow students, as one student noted, “We did not know how our speaking was graded by our peers… we didn’t see or hear any feedback on our speaking from our peers” (FGI3).
By the end of the first semester, Emma realised that the peer assessment was not as effective as she had anticipated. She observed that most students “cannot identify problems and give feedback to their fellow students’ speaking performance” (NF2). This realisation made her hesitant to continue using peer assessment in her class. Then she considered offering a brief training session to familiarise students with the assessment criteria. Yet she was uncertain of the effectiveness of the training, as she said, “They [students] may not give comments as expected due to their English proficiency” (IT2).
Concerned about the effectiveness of peer assessment, Emma turned to her colleagues for advice. They recommended that she attend a webinar given by a professor reputed to be an expert on peer assessment. Unfortunately, the webinar turned out to be a disappointment, as it focused on the statistical validation of peer assessment rather than its practical use in the classroom. As the webinar provided little guidance on classroom assessment, Emma decided to “look for materials or resources closely related to my [her] assessment concerns” (NF2).
In the second semester, Emma made three attempts to address her concerns. First, she employed a new app introduced by the department to categorise assignments and record peer assessment engagement. This app “… increased the efficiency of peer grading and raised students’ awareness of peer assessment” (IT3). Second, she added a value and moral component to the peer assessment criteria, inspired by a student’s interest in value learning. Value learning through the curriculum has been advocated by the China Ministry of Education to raise university students’ moral consciousness and their ideological quality, and cultivating students’ ability to tell good “Chinese stories” in English is one of the value and moral components. This addition aligned with the current emphasis on value learning in College English, as she claimed, “value elements can make College English teaching look trendy” (IT3). Emma’s conscious adaptation demonstrated her agency to promote her assessment practice and accumulate assessment experience by drawing on her sharp perception of the curriculum. This growing experience, as Emma responded, relied on her own “reflections on the assessment practices and processes in the class” (IT3). Third, Emma offered students a 45-minute training session on the assessment criteria, using detailed examples. This training significantly improved the quality of her peer assessment activities, as most students could assess their peers with increased accuracy and at least “70% of students give almost the same score as I do” (IT4). Figure 1 illustrates the near-identical scores given by Emma and her students, shown in the second and third columns.
The effort Emma invested in refining peer assessment spurred her on to conduct research on its impact on students’ learning autonomy. She conducted studies in two different classes, generating “satisfying” data that encouraged her to write a research manuscript for publication. Emma treated the writing process as a form of reflection on her assessment practices, through which her understanding of using peer assessment in English teaching could be enhanced. She noted, “It [writing the manuscript] prompts me to think about the study thoroughly… I think it is a part of my reflection and an indicator of how far I can reflect. It is also a part of my teaching practice” (IT5).

5.3. Self-Directed Development of CAL: Insisting on Portfolio

Apart from peer assessment, Emma employed portfolios in her class as well. This method was introduced and used by her writing teacher, a foreign teacher in the university. “My foreign teacher used portfolio assessment… with a clear objective to record students’ progress in different writing genres. As College English is a comprehensive course, I record students’ learning process of not only one language skill but all skills. I ask every student to have a notebook where they jot down what they have learned, including vocabulary, mind mapping, and their own reflections on learning. Their notebooks help me assess their learning attitudes and progresses” (IT1). This excerpt indicated that Emma used portfolios to record her students’ learning progress, through which she could make informed assessment decisions on her students’ learning process instead of learning products only.
However, this portfolio-based approach was not welcomed by students. First of all, Emma did not explicitly explain to the students the purposes of using portfolios (FGI2&3). Without a clearly stated objective, students would not “reflect on the learning progress” (FGI2). Second, no useful feedback was added to the portfolios. Emma admitted that she just looked through the notebooks collected at the end of the semester, giving a sign or a tick without specific written feedback, as she could not afford the time to work through these bulky notebooks. This was echoed by students: “it is pointless to hand in the notebooks which are returned without any specific feedback” (FGI2).
Despite students’ dissatisfied comments, Emma insisted on the use of portfolios. From the third semester on, she changed the notebook portfolios into e-portfolios focusing on writing. Students were required to submit different self-edited drafts of their writing to an online platform to create a writing portfolio. This self-editing practice required students to keep multiple edited and revised versions of the same essay, with each revision addressing a different focus, such as linguistic features or academic structure. In addition, the drafts had to be submitted while the course was in progress rather than at the end of the semester. Throughout this process, Emma continually identified, showcased, and shared good portfolio examples with her students, believing that this practice could “prompt students to learn better by telling them what the good examples should be” (IT5). However, she later cast doubt on these “good portfolio examples”, suggesting that they were not good indicators of students’ language proficiency. It seemed that Emma continuously reflected on her practice of using portfolios as an assessment tool, yet this reflection did not lead her to connect the spirit of the portfolio with students’ mastery of metacognitive skills through the use of good examples. This unsuccessful attempt might be ascribed to Emma’s insufficient CAL in this regard.

5.4. Self-Directed Development of CAL: Being Concerned with Students’ Emotional Responses

Initially, Emma aimed to enhance students’ learning autonomy through peer assessment. However, an informal survey conducted by Emma revealed that the students were unsure how to leverage peer assessment for their learning improvement. In the third semester, Emma significantly reduced the use of peer assessment in her General Academic Writing course, an extended course of College English. Instead, she used multiple-choice questions (MCQs) to assess students’ understanding of academic writing concepts, moves, and functions (see Table 1).
The increasing use of MCQs was required by the course coordinator, a veteran instructor who was very confident in and proud of this instruction-and-assessment mode for her General Academic Writing course. Despite her discomfort with the coordinator’s initiative, Emma adopted the mode out of respect for the veteran coordinator.
Student reactions to MCQs were mixed. Some interviewed students applauded the use of MCQs, as these “mini tests” could help them to stay focused. However, some students frowned on the MCQ tests, as there were “fewer interactions between the instructor and us [students]” and these tests made them “bored and uncomfortable” and “restless” (FGI3). Emma noticed these negative emotional responses and recognised that the frequent assessment “put students under great pressure” and made her feel like “a student controller” (NF4). This pressure led to negative feedback from the students at the end of the semester. Emma found this discouraging, as the feedback was “really bad for the teacher-and-student relationship” (IT6). In response, Emma deliberately diminished her profile as an assessor by reducing the number of assessment activities.
To alleviate students’ negative emotions towards assessment, Emma designed a peer assessment task for students’ oral presentations at the end of the third semester. During these presentations, peers scored each other’s performance using Tencent Docs v2.17.0 (a collaborative document tool similar to Google Docs), based on the pre-set assessment criteria. These scores were sent directly to Emma via an app, and the final presentation grade was the average of the peer score and Emma’s score (Classroom observation 12th). Although peer feedback and scores were not shared with students afterwards, most students were content with the activity, as they reported, “this activity involves us and there are more interactions between peers and the instructor. At least, it makes the class not so boring” (FGI3). Despite this positive feedback, Emma continued to withhold peer feedback and scores from the students.

6. Discussion and Conclusions

The findings revealed that Emma’s CAL developed over the three semesters. She acquired her initial conceptualisation of CAL primarily from her experiences of being assessed as a student. These experiences highlighted the importance of alternative assessment methods in promoting student learning and informed her approach to implementing these methods in her teaching practices, aligning with College English curriculum requirements. This finding corroborates Zhang’s study [9], in which teachers’ prior experiences as language learners shaped their pedagogical knowledge and influenced their self-directed assessment practices.
Emma’s CAL development was largely self-directed, rooted in her teaching and assessment practices. She adeptly integrated peer assessment into her speaking lessons to foster student autonomy, thereby gaining a nuanced understanding of the interplay between assessment and language learning. In response to student feedback on the peer assessment, Emma sought assistance from colleagues and webinars, which prompted her to independently seek out additional resources. Her attempts to enhance the effectiveness of peer assessment demonstrated her agency in refining both her assessment and instructional methods. This supports the view that teacher agency is a vital resource for this profession [40].
This study found that Emma’s self-directed CAL development was mediated by her reflections on her research and assessment practices. Her research on peer assessment led her to scrutinise her classroom assessment methods and to re-evaluate their effectiveness. This finding corroborates Babaii and Asadnia’s study [27], which showed that reflection on research and practice can significantly enhance classroom-based assessment practices and CAL. In this study, Emma enriched her assessment experiences through reflective practices, underscoring reflection as a key component of self-directed professional development [28].
Moreover, the study revealed the importance of considering students’ perceptions and emotions towards assessment in CAL development, consistent with Zhang’s findings [9]. Emma was aware of students’ feedback on her alternative assessment practices, recognising the limited impact of peer assessment on students’ learning. Consequently, she reduced the use of peer assessment and increased the use of traditional test formats, such as MCQs. When she noticed the pressure that frequent MCQs placed on students, she adjusted her assessment approach to alleviate this stress. This responsiveness to students’ perceptions and emotions facilitated her self-directed development in CAL.
Notably, Emma’s self-agency to enact assessment practices was influenced by her colleague. For instance, she increased the use of MCQs not because she preferred this traditional assessment method, but because she wished to show respect to the colleague who advocated it. This finding is in line with the study by Mansouri et al. [30], which showed that teachers’ autonomy to implement their preferred assessment practices was constrained by contextual factors. Fortunately, Emma was not confined by this constraint but adjusted her agency to implement assessments tailored to students’ learning.
Despite Emma’s progress, her self-directed CAL development was not without flaws, highlighting the need for professional guidance. While she emphasised alternative assessment methods, she overlooked the importance of providing feedback informed by these assessments. Without feedback, methods like peer assessment may not effectively foster student autonomy [41]. Emma’s continued use of these methods without sufficient feedback, despite efforts to improve their effectiveness, demonstrates the potential limitations of relying solely on personal understanding. Teachers also need to seek conceptual and practical knowledge of classroom-based assessment through extensive reading and resources, especially when external guidance is lacking [42].
These findings underscore the potential of self-directed professional development in CAL through agency and reflection. Encouraging reflective practice as part of self-directed professional development can significantly enhance teachers’ assessment literacy. Effective CAL development is likely to require a healthy balance between teachers’ self-directed learning initiatives and structured opportunities for self-reflection and peer discussion provided by their institutions. Furthermore, teachers’ self-directed development does play out in nurturing CAL so long as they possess an active agency and reflective capacity. Despite these insights, this case study has its limitations. Its findings may not be generalisable because of its focus on a single instructor. Future research should include more instructors with diverse educational backgrounds and work experience to better investigate educators’ self-directed CAL development from multiple perspectives.

Author Contributions

Conceptualization, L.G. and R.L.; methodology, L.G. and R.L.; software, L.G.; validation, L.G. and R.L.; formal analysis, L.G.; investigation, L.G.; resources, R.L.; data curation, L.G.; writing—original draft preparation, L.G.; writing—review and editing, R.L.; supervision, R.L.; project administration, L.G. and R.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Research Ethics Guidelines by Hong Kong Baptist University, and the protocol was approved by the Research Ethics Committee on 22 October 2020.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author due to ethical reasons.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Stiggins, R. The Perfect Assessment System; ASCD: Washington, DC, USA, 2017. [Google Scholar]
  2. Xu, H. Exploring Novice EFL Teachers’ Classroom Assessment Literacy Development: A Three-Year Longitudinal Study. Asia-Pac. Educ. Res. 2017, 26, 219–226. [Google Scholar] [CrossRef]
  3. Gu, P.Y. The Unbearable Lightness of the Curriculum: What Drives the Assessment Practices of a Teacher of English as a Foreign Language in a Chinese Secondary School? Assess. Educ. 2014, 21, 286–305. [Google Scholar] [CrossRef]
  4. Asamoah, D.; Shahrill, M.; Abdul Latif, S.N. Towards Developing Classroom Assessment Literacy: Exploring Teachers’ Approaches to Assessment across Cultures. Cogent Educ. 2023, 10, 2280301. [Google Scholar] [CrossRef]
  5. DeLuca, C.; Braund, H. Preparing Assessment Literate Teachers. In Oxford Research Encyclopedia of Education; Oxford University Press: Oxford, UK, 2019. [Google Scholar]
  6. Baker, B.A.; Riches, C. The Development of EFL Examinations in Haiti: Collaboration and Language Assessment Literacy Development. Lang. Test. 2018, 35, 557–581. [Google Scholar] [CrossRef]
  7. Lee, J. A Training Project to Develop Teachers’ Assessment Literacy. In Handbook of Research on Assessment Literacy and Teacher-Made Testing in the Language Classroom; IGI Global: Hershey, PA, USA, 2019; pp. 58–80. [Google Scholar]
  8. Levi, T.; Inbar-Lourie, O. Assessment Literacy or Language Assessment Literacy: Learning from the Teachers. Lang. Assess. Q. 2020, 17, 168–182. [Google Scholar] [CrossRef]
  9. Zhang, X. Assessment for Learning in Constrained Contexts: How Does the Teacher’s Self-Directed Development Play Out? Stud. Educ. Eval. 2020, 66, 100909. [Google Scholar] [CrossRef]
  10. Gan, L.; Lam, R. Language Assessment Literacy Development of a Novice University English Teacher in the Chinese Context. RELC J. 2024. [Google Scholar] [CrossRef]
  11. Xu, Y.; Liu, Y. Teacher Assessment Knowledge and Practice: A Narrative Inquiry of a Chinese College EFL Teacher’s Experience. TESOL Q. 2009, 43, 492–513. [Google Scholar] [CrossRef]
  12. Mertler, C.A.; Campbell, C. Secondary Teachers’ Assessment Literacy: Does Classroom Experience Make a Difference? Am. Second. Educ. 2004, 33, 49–64. [Google Scholar]
  13. Sultana, N. Language Assessment Literacy: An Uncharted Area for the English Language Teachers in Bangladesh. Lang. Test. Asia 2019, 9, 1. [Google Scholar] [CrossRef]
  14. Vogt, K.; Tsagari, D. Assessment Literacy of Foreign Language Teachers: Findings of a European Study. Lang. Assess. Q. 2014, 11, 374–402. [Google Scholar] [CrossRef]
  15. Darling-Hammond, L. Accountability in Teacher Education. Action Teach. Educ. 2020, 42, 60–71. [Google Scholar] [CrossRef]
  16. Popham, W.J. Assessment Literacy for Educators in a Hurry; ASCD: Washington, DC, USA, 2018. [Google Scholar]
  17. Koh, K.H. Improving Teachers’ Assessment Literacy through Professional Development. Teach. Educ. 2011, 22, 255–276. [Google Scholar] [CrossRef]
  18. DeLuca, C.; Johnson, S. Developing Assessment Capable Teachers in This Age of Accountability. Assess. Educ. 2017, 24, 121–126. [Google Scholar] [CrossRef]
  19. Xu, Y.; Brown, G.T.L. Teacher Assessment Literacy in Practice: A Reconceptualization. Teach. Teach. Educ. 2016, 58, 149–162. [Google Scholar] [CrossRef]
  20. Fulcher, G. Operationalising language assessment literacy. In Language Assessment Literacy: From Theory to Practice; Cambridge Scholars Publishing: Newcastle upon Tyne, UK, 2020; pp. 8–28. [Google Scholar]
  21. Ukrayinska, O. Synergies in Developing Pre-Service Teachers’ Language Assessment Literacy in Ukrainian Universities. Educ. Sci. 2024, 14, 223. [Google Scholar] [CrossRef]
  22. Estaji, M. Perceived Need for a Teacher Education Course on Assessment Literacy Development: Insights from EAP Instructors. Asian-Pac. J. Second. Foreign Lang. Educ. 2024, 9, 50. [Google Scholar] [CrossRef]
  23. Cowie, B.; Cooper, B.; Ussher, B. Developing an Identity as a Teacher-Assessor: Three Student Teacher Case Studies. Assess. Matters 2014, 7, 64–89. [Google Scholar] [CrossRef]
  24. Lopes, J.B.; Cunha, A.E. Self-Directed Professional Development to Improve Effective Teaching: Key Points for a Model. Teach. Teach. Educ. 2017, 68, 262–274. [Google Scholar] [CrossRef]
  25. Gerstein, J. Teacher Agency: Self-Directed Professional Development. User Generated Education. Available online: https://usergeneratededucation.wordpress.com/2013/11/11/teacher-agency-self-directed-professional-development/ (accessed on 15 June 2024).
  26. Minott, M.A. Reflective Teaching as Self-directed Professional Development: Building Practical or Work-related Knowledge. Prof. Dev. Educ. 2010, 36, 325–338. [Google Scholar] [CrossRef]
  27. Babaii, E.; Asadnia, F. A Long Walk to Language Assessment Literacy: EFL Teachers’ Reflection on Language Assessment Research and Practice. Reflective Pract. 2019, 20, 745–760. [Google Scholar] [CrossRef]
  28. Tian, W.; Louw, S.; Khan, M.K. COVID-19 as a Critical Incident: Reflection on Language Assessment Literacy and the Need for Radical Changes. System 2021, 103, 102682. [Google Scholar] [CrossRef]
  29. Arefian, M.H. Perceptions of Self-Assessment Literacy and Self-Directed Reflection during Online Learning for Iranian EFL Student Teachers. Reflective Pract. 2022, 23, 623–634. [Google Scholar] [CrossRef]
  30. Mansouri, B.; Molana, K.; Nazari, M. The Interconnection between Second Language Teachers’ Language Assessment Literacy and Professional Agency: The Mediating Role of Institutional Policies. System 2021, 103, 102674. [Google Scholar] [CrossRef]
  31. Patton, M.Q. Qualitative Research & Evaluation Methods: Integrating Theory and Practice, 4th ed.; SAGE Publications: Thousand Oaks, CA, USA, 2023. [Google Scholar]
  32. Fusch, P.; Fusch, G.; Ness, L. How to Conduct a Mini-Ethnographic Case Study: A Guide for Novice Researchers. Qual. Rep. 2017, 22, 923–941. [Google Scholar] [CrossRef]
  33. China Ministry of Education. Guidelines on College English Teaching; Higher Education Press: Beijing, China, 2020. [Google Scholar]
  34. Sun, H.; Zhang, J. Assessment Literacy of College EFL Teachers in China: Status Quo and Mediating Factors. Stud. Educ. Eval. 2022, 74, 101157. [Google Scholar] [CrossRef]
  35. Coombs, A.; DeLuca, C.; LaPointe-McEwan, D.; Chalas, A. Changing Approaches to Classroom Assessment: An Empirical Study across Teacher Career Stages. Teach. Teach. Educ. 2018, 71, 134–144. [Google Scholar] [CrossRef]
  36. Barkhuizen, G.; Wette, R. Narrative Frames for Investigating the Experiences of Language Teachers. System 2008, 36, 372–387. [Google Scholar] [CrossRef]
  37. Miles, M.B.; Huberman, A.M.; Saldana, J. Qualitative Data Analysis: A Methods Sourcebook, 4th ed.; SAGE Publications: Thousand Oaks, CA, USA, 2019. [Google Scholar]
  38. Looney, A.; Cumming, J.; van Der Kleij, F.; Harris, K. Reconceptualising the Role of Teachers as Assessors: Teacher Assessment Identity. Assess. Educ. 2018, 25, 442–467. [Google Scholar] [CrossRef]
  39. Frey, B.B. Modern Classroom Assessment; SAGE Publications: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  40. Zhang, X. Teaching Reading beyond Language Form: A Case Study of Chinese College English Teachers’ Self-directed Development. Asian EFL J. Q. 2018, 20, 180–214. [Google Scholar]
  41. Yang, M.; Badger, R.; Yu, Z. A Comparative Study of Peer and Teacher Feedback in a Chinese EFL Writing Class. J. Second. Lang. Writ. 2006, 15, 179–200. [Google Scholar] [CrossRef]
  42. Lam, R. Teacher Assessment Literacy: Surveying Knowledge, Conceptions and Practices of Classroom-Based Writing Assessment in Hong Kong. System 2019, 81, 78–89. [Google Scholar] [CrossRef]
Figure 1. A Record of Peer Assessment with a New App. Note: The number in the first column is each student’s speaking score, calculated as the average of the teacher’s mark in the second column and the peers’ mark in the third column. The last column shows each student’s involvement in the peer assessment: the number before the slash is the recorded number of peer assessments the student completed, and the number after the slash is the total number of peer assessment tasks they should have completed.
Table 1. The Frequency of Using MCQs in the Classroom Observations.
Classroom Observation (Third Semester) | Number of MCQs
9th | 10
10th | 2
11th | 10
12th | 1