Multiple Stakeholder Interaction to Enhance Preservice Teachers’ Language Assessment Literacy
Abstract
1. Introduction
2. Literature Review
- (1) How do multiple stakeholders interact during a language assessment development project in teacher education?
- (2) How do such interactions assist preservice teachers' LAL development as well as mutual learning among all the participating stakeholders?
3. Study
3.1. Participants
3.2. Data
3.3. Data Analysis
4. Results
4.1. Research Question 1
4.1.1. Planning Stage
4.1.2. Development Stage
4.1.3. Implementation Stage
4.1.4. Reflection Stage
4.2. Research Question 2
4.2.1. Preservice Teachers’ LAL Enhancement 1: Assessment Knowledge
Realizing the Importance of Theories in Assessment Development
Even for Helen, who had decades of experience teaching English to adults, developing a theoretically sound language assessment was a new experience. In her reflection paper, she said:

As a result of taking part in this process, I feel that I am better equipped at creating assessments that are more deliberate in the constructs to be tested, as well as more confident in what to include in those assessments to let my students demonstrate what they know.
The greatest learning for me throughout the process was in determining the constructs to be used in the assessments. I became aware that many of the tests that are created lack one or more aspects of the theories. I realize that it is important for me to consider all attributes in my own future test development. This course has given the tools needed [to do this].
The discrepancy among their evaluations prompted much discussion within the group. Diana believed that this experience further enhanced her understanding of how inter-rater reliability works. The preservice teachers' reflections, as well as their interactions with the course instructor, showed that the hands-on practice of developing language assessments helped them see assessment theories as less esoteric and technical.

I experienced first-hand how scorer reliability can impact test results. As stated previously, we had varying interpretations of the test results. This disagreement caused our students to have different scores depending on the individual who scored their assessment. Even with a rubric, the scores were different.
Contextualizing Assessment Theories: Reflecting Test-Takers' Needs
Figure 9 is an example of Yuri's group's listening assessment. Reflecting the ESOL students' interest in the ancient Egyptians, the text of their listening assessment was an interview with an Egyptologist.

During the Skype session today, the teacher talked about a lot of things of interest for the students (eighth graders), but the following are the ones that caught my attention:
- The use of texts of the students' interest and make them respond to the comprehension questions, and discussion can be a very good activity for them.
- The students have been working on the passive voice in present tenses; so, any assessment using the passive voice should be worded in those tenses.
- They have been reading about civilization and especially the ancient Egyptians and the Mayans, and they were amazed at discovering that so distant and different civilizations have used the same techniques of communication among themselves and others in writings.
- Assessments for the students should be for improving their learning, not for just testing their language ability.
The first major learning experience was about the importance of context specifics of our tests (the students’ profile—age and L1 background, their language proficiency levels and their familiarity with different testing modes and formats). I learnt that all these factors are to be considered to make more valid, reliable and practical tests with a good likelihood of positive backwash on the students’ learning.
Practicing Assessment Theories: Rethinking Constructs
In terms of what other things could papyrus be used for? We were hoping to stretch the more advanced students to inference other ideas and make the connection of paper. One student did so with the answer of paper airplanes. For our next assessment, we are titling it with the passages. As suggested, this is more in line with what the students are accustomed to.
For one of our open-ended questions, one of our test takers provided both a creative answer and a creative false orthography for a poorly constructed test-item (“What else can papyrus be used for?”). For this question we received a range of responses that demonstrated that the students may not be at a strong level for the critical thinking skill inferencing, which was the construct we were aiming to test.
4.2.2. Preservice Teachers’ LAL Enhancement 2: Assessment Skills
Making Use of Pictorial Aids
According to the inservice teacher, her students mentioned the lack of pictorial aids in the listening assessment. They may have noticed the differences between the WIDA listening assessment and the preservice teachers' assessment, as the WIDA listening assessment mostly includes pictorial aids.

- I feel there was an opportunity to include a picture of the papyrus in the background. It would have given the students the scaffold of comprehensible input.
- The questions need pictorial support.
- Again, adding a picture of an anime character would help the students in providing a comprehensible input scaffold.
- Having some options to choose from pictures would have been beneficial for emerging bilinguals at the beginning stages of language development.
However, the preservice teachers' response to the course instructor was slightly different:

As per your suggestions, pictures would have given the students a scaffold of comprehensible input. This may have better prepared them. This was true in both passages. As a result, we have incorporated illustrations in our writing assessment.
In their response to the course instructor, the preservice teachers justified their choice of not including pictures in the listening assessment: they wanted to avoid a construct-irrelevant factor or a distraction for the students. The course instructor agreed with this justification. However, she also advised her students (i.e., the preservice teachers) to take into account the ESOL students' familiarity with the WIDA exam. In the end, responding to the inservice teacher's comment about the need for pictorial aids, the preservice teachers decided to follow her suggestion in the writing assessment (Figure 11). As the preservice teachers mentioned, the seventh-grade writing assessment included more pictorial aids. The preservice teachers did not indicate whether adding pictorial aids affected the ESOL students' writing performance, and there were no further discussions of the impact of the pictorial aids in the writing assessment on the ESOL students' perception of the assessment. It could have been a fruitful experience if the stakeholders had had a chance to discuss the role of pictures in assessment more explicitly.

After administering the test, the teacher had suggested we use more visuals along with the prompt in order to give more context to draw from when answering the question.

We felt that because this is a listening assessment, we wanted to focus on auditory input rather than provide too much support that would mask their true listening ability.
Motivating Learners While Avoiding Construct-Irrelevant Factors
Receiving direct feedback from the inservice teacher and her students gave the preservice teachers a chance to reflect on their assessment format and gave them some confidence in their decisions.

We thought the silly voices in the listening prompt would have been a distraction, but it appears the students enjoyed the voices; this surprised us, but we're glad it kept the students engaged.
A suggestion from the teacher that we would like to take into consideration for replicating this type of assessment would be to omit having the test name in bold at the top of the page while also asking the students to create their own name for the story. The teacher had informed us that the students were confused by our question “What would be an appropriate title for this passage?” because they were accustomed to having story titles at the top of the page, and for our assessment we had the name of our group project in bold letters in that space.
4.2.3. Preservice Teachers’ LAL Enhancement 3: Assessment Principles/Consequences
Assessment for Learning
The biggest takeaway that I took from the test development process was understanding how to create ways that enable students to learn through assessment... Seeing real life responses from real students made it easier to envision how my choices in determining what to include in an assessment affect student learning.
Bowen also received feedback about his listening and writing performances from three other preservice teachers (Helen, Tim, and Mia) independently. The following quote is Tim's feedback on Bowen's listening performance:

In your listening assessment you showed that you are skilled at finding the main idea and specific points after listening to a passage. An area of improvement for this assessment is inferencing, which means that you would benefit from practicing figuring out what would happen in a situation or place that you have not seen before based on experiences you have had in the past. To practice this skill, I would recommend playing a game with your classmates called 20 questions, where your friend picks an object, and you ask questions to figure out what that object may be based on the answers you receive.
Tim's feedback also started with what Bowen did well. Instead of identifying the punctuation error, he directed Bowen to find the error by himself, and his feedback ended with encouraging and motivating words:

Your answers for the Writing Assessment were great. The way you answered the first two questions was how I imagined them being answered. First you gave a simple sentence. Then, you added onto this sentence to create a compound sentence. You did a good job describing Pikachu.

Your writing on Dark Ben was creative and gave a lot of information. I could see what he looked like, until he goes invisible of course! I could not have asked for a better writing. You wrote introductory and concluding sentences and provided extra information on his powers and future hopes. Furthermore, I only found a single punctuation issue. You wrote, "the character's nickname would be Dark Ben". Can you tell where the error is? Your writing skills are strong enough that I think you know where the mistake is. Your writing will only get better with more practice Bowen. Keep up the good work.

At the end of the project, Bowen shared his response to the experience: "This makes me feel so special!!!!! I am so happy to read this!!! It is not just 'Good Job!'" Although he must have received feedback from his teacher before, he enjoyed the extensive and individualized feedback from the preservice teachers. In his reflection, Tim emphasized the importance of avoiding negative backwash in assessment by using a well-prepared rubric and providing feedback to students.
4.2.4. Inservice Teachers’ and ESOL Students’ LAL Enhancement
Engaging in the language assessment development project, the inservice teacher also had a chance to learn about assessment knowledge and observed how the knowledge was applied in actual language assessment development. In the correspondence with the course instructor, the inservice teacher mentioned that she learned about language assessment theories and their application during the language assessment development project.

I realized that my own teacher education program (20 years ago) did not offer this information [language assessment theories]. I wanted to learn more about the application of the framework to creation of the assessment experience for my students, based on the content I was teaching.
It was beneficial to see the assessment ideas and materials that the pre-service teachers created based on their study of the assessment theory. It was interesting to see how the target content and language skills were reflected in their assessment exercises.
As her reflection indicated, the individualized and focused feedback that the preservice teachers provided to each ESOL student motivated the learners.

The pre-service teachers were providing written feedback to the students, and this practice had a great impact on the student's motivation. In the day-to-day learning and assignments completion with the ubiquitous use of rubrics and computerized responses the feedback tends to be too general and yields limited further learning. When my students got specific feedback praising their effort and success in language structures, as well as opportunities for growth, they were motivated to learn further and appreciated the attention.
It made school initiatives such as American Education Week and Career Day more relevant because the students gained the experience of contributing to a professional field and saw representation of the ESOL field in practice.
4.2.5. The Course Instructor’s LAL Enhancement
4.2.6. Challenges
Becca's experience was not an isolated example. Diana, who planned to teach adult learners upon completion of her master's degree, also reported, "one of the most important learning experiences I had comes from working with two other individuals". Straightforward and timely communication was certainly a challenge in her group. One of the group members shortened part of an audio file in their listening assessment, but the other member did not remove the listening questions associated with the removed portion, which caused confusion among the ESOL students.

Working with peers that had a different way of approaching planning pushed me to remember to pay attention to little details while also being confident enough in my own methods to speak up when I feel that we need to work in a different direction.

Having the ability to respectfully disagree and be strategic in working towards a common goal on a team will be a critical skill in the workforce in the field of teaching, so this experience in test development was definitely beneficial in working towards my future career goals.
5. Discussion
5.1. All Stakeholders as Key-Informants to Enhance LAL
5.2. Inclusive Approaches to LAL Enhancement
5.3. Importance of Assessment Knowledge and Its Application
5.4. Limitations
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Assessment Reform Group. 2002. Assessment for Learning: 10 Principles. Cambridge: Faculty of Education, University of Cambridge. [Google Scholar]
- Bachman, Lyle F., and Adrian S. Palmer. 1996. Language Testing in Practice: Designing and Developing Useful Language Tests. Oxford: Oxford University Press. [Google Scholar]
- Baker, Beverly, and Caroline Riches. 2017. The development of EFL examinations in Haiti: Collaboration and language assessment literacy development. Language Testing 35: 557–81. [Google Scholar] [CrossRef]
- Bøhn, Henrik, and Dina Tsagari. 2021. Teacher educators’ conceptions of language assessment literacy in Norway. Journal of Language Teaching and Research 12: 222–33. [Google Scholar] [CrossRef]
- Brindley, Geoff. 2001. Language assessment and professional development. In Experimenting with Uncertainty: Essays in Honour of Alan Davies. Edited by Cathie A. Elder, Alison Brown, Kathryn Hill, Noriko Iwashita, Tom Lumley, Tim McNamara and Kieran O’Loughlin. Cambridge: Cambridge University Press, pp. 126–36. [Google Scholar]
- Brown, James Dean, and Kathleen M. Bailey. 2008. Language testing courses: What are they in 2007? Language Testing 25: 349–83. [Google Scholar] [CrossRef]
- Butler, Yuko Goto, Xiaolin Peng, and Jiyoon Lee. 2021. Young learners’ voices: Towards a learner-centered approach to understanding language assessment literacy. Language Testing, 0265532221992274. [Google Scholar] [CrossRef]
- Csépes, Ildikó. 2014. Language assessment literacy in English teacher training programmes in Hungary. In Studies in Honour of Nikolov Marianne. Edited by József Horváth and Peter Medgyes. Pécs: Lingua Franca Csoport, pp. 399–411. [Google Scholar]
- Davies, Alan. 2008. Textbook trends in teaching language testing. Language Testing 25: 327–47. [Google Scholar] [CrossRef]
- DeLuca, Christopher, and Don A. Klinger. 2010. Assessment literacy development: Identifying gaps in teacher candidates’ learning. Assessment in Education: Principles, Policy & Practice 17: 419–38. [Google Scholar]
- Fulcher, Glenn. 2012. Assessment literacy for the language classroom. Language Assessment Quarterly 9: 113–32. [Google Scholar] [CrossRef]
- Giraldo, Frank. 2019. Language assessment practices and beliefs: Implications for language assessment literacy. HOW 26: 35–61. [Google Scholar] [CrossRef]
- Inbar-Lourie, Ofra. 2008. Constructing an assessment knowledge base: A focus on language assessment courses. Language Testing 25: 385–402. [Google Scholar] [CrossRef]
- Inbar-Lourie, Ofra. 2017. Language assessment literacy. In Language Testing and Assessment, Encyclopedia of Language and Education, 3rd ed. Edited by Elana S. Shohamy, Stephen May and Iair G. Or. Berlin: Springer, pp. 257–68. [Google Scholar]
- Jeong, Heejeong. 2013. Defining assessment literacy: Is it different for language testers and non-language testers? Language Testing 30: 345–62. [Google Scholar] [CrossRef]
- Koh, Kim, Lydia E. Carol-Ann Burke, Allan Luke, Wengao Gong, and Charlene Tan. 2018. Developing the assessment literacy of teachers in Chinese language classrooms: A focus on assessment task design. Language Teaching Research 22: 264–88. [Google Scholar] [CrossRef]
- Kremmel, Benjamin, and Luke Harding. 2020. Towards a comprehensive, empirical model of language assessment literacy across stakeholder groups: Developing the language assessment literacy survey. Language Assessment Quarterly 17: 100–20. [Google Scholar] [CrossRef]
- Lam, Ricky. 2015. Language assessment training in Hong Kong: Implications for language assessment literacy. Language Testing 32: 169–97. [Google Scholar] [CrossRef]
- Lee, Jiyoon. 2019. A training project to develop teachers’ assessment literacy. In Handbook of Research on Assessment Literacy and Teacher-Made Testing in the Language Classroom. Edited by E. White and T. Delaney. Hershey: IGI Global, pp. 58–80. [Google Scholar]
- Lee, Jiyoon, and Yuko Goto Butler. 2020. Where are language learners? Reconceptualizing language assessment literacy. TESOL Quarterly 54: 1098–111. [Google Scholar] [CrossRef]
- Malone, Margaret E. 2017. Unpacking language assessment literacy: Differentiating needs of stakeholder groups. Paper presented at East Coast Organization of Language Testers, Washington, DC, USA, October 20. [Google Scholar]
- Nowell, Lorelli S., Jill M. Norris, Deborah E. White, and Nancy J. Moules. 2017. Thematic analysis: Striving to meet the trustworthiness criteria. International Journal of Qualitative Methods 16: 1–13. [Google Scholar] [CrossRef]
- Pill, John, and Luke Harding. 2013. Defining the language assessment literacy gap: Evidence from a parliamentary inquiry. Language Testing 30: 381–402. [Google Scholar] [CrossRef]
- Scarino, Angela. 2013. Language assessment literacy as self-awareness: Understanding the role of interpretation in assessment and in teacher learning. Language Testing 30: 309–27. [Google Scholar] [CrossRef]
- Stiggins, Rick. 2007. Conquering the formative assessment frontier. In Formative Assessment Classroom: Theory into Practice. Edited by H. McMillan. New York: Teachers College Press, pp. 8–28. [Google Scholar]
- Taylor, Lynda. 2013. Communicating the theory, practice and principles of language testing to test stakeholders: Some reflections. Language Testing 30: 403–12. [Google Scholar] [CrossRef]
- Vogt, Karin, and Dina Tsagari. 2014. Assessment literacy of foreign language teachers: Findings of a European study. Language Assessment Quarterly 11: 374–402. [Google Scholar] [CrossRef]
- Vogt, Karin, Dina Tsagari, Ildiko Csépes, Anthony Green, and Nicos Sifakis. 2020. Linking learners’ perspectives on language assessment practices to teachers’ assessment literacy enhancement (TALE): Insights from four European countries. Language Assessment Quarterly 17: 410–33. [Google Scholar] [CrossRef]
- Volante, Louis, and Xavier Fazio. 2007. Exploring teacher candidates’ assessment literacy: Implications for teacher-education reform and professional development. Canadian Journal of Education 30: 749–70. [Google Scholar] [CrossRef]
- WIDA. 2020. WIDA English Language Development Standards Framework, 2020 Edition Kindergarten—Grade 12. Board of Regents of the University of Wisconsin System. Madison: WIDA. [Google Scholar]
- Yan, Xun, and Jason Fan. 2021. “Am I qualified to be a language tester?”: Understanding the development of language assessment literacy across three stakeholder groups. Language Testing 38: 219–46. [Google Scholar] [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lee, J.; Butler, Y.G.; Peng, X. Multiple Stakeholder Interaction to Enhance Preservice Teachers’ Language Assessment Literacy. Languages 2021, 6, 213. https://doi.org/10.3390/languages6040213