Article

Low Inter-Rater Reliability of a High Stakes Performance Assessment of Teacher Candidates

Scott A. Lyness, Kent Peterson and Kenneth Yates
Rossier School of Education, University of Southern California, Los Angeles, CA 90089, USA
*
Author to whom correspondence should be addressed.
Educ. Sci. 2021, 11(10), 648; https://doi.org/10.3390/educsci11100648
Submission received: 25 August 2021 / Revised: 3 October 2021 / Accepted: 6 October 2021 / Published: 18 October 2021
(This article belongs to the Section Teacher Education)

Abstract

The Performance Assessment for California Teachers (PACT) is a high stakes summative assessment that was designed to measure pre-service teacher readiness. We examined the inter-rater reliability (IRR) of trained PACT evaluators who rated 19 candidates. As measured by Cohen’s weighted kappa, the overall IRR estimate was 0.17 (poor strength of agreement). IRR estimates ranged from −0.29 (worse than expected by chance) to 0.54 (moderate strength of agreement); all were below the standard of 0.70 for consensus agreement. Follow-up interviews of 10 evaluators revealed possible reasons we observed low IRR, such as departures from established PACT scoring protocol, and lack of, or inconsistent, use of a scoring aid document. Evaluators reported difficulties scoring the materials that candidates submitted, particularly the use of Academic Language. Cognitive Task Analysis (CTA) is suggested as a method to improve IRR in the PACT and other teacher performance assessments such as the edTPA.
Keywords: inter-rater reliability; preservice teacher performance assessment; PACT; edTPA; weighted kappa; cognitive task analysis; qualitative; quantitative
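The IRR estimates reported in the abstract use Cohen's weighted kappa, which discounts chance agreement and penalizes rater disagreements in proportion to their distance on the ordinal scale. The following is an illustrative sketch of the statistic (not the authors' analysis code); the function name and the toy ratings are assumptions for demonstration only:

```python
def weighted_kappa(r1, r2, categories, weights="linear"):
    """Cohen's weighted kappa for two raters over ordinal categories.

    r1, r2     -- equal-length lists of ratings from the two raters
    categories -- ordered list of all possible rating levels
    weights    -- "linear" (|i-j|) or "quadratic" ((i-j)^2) disagreement weights
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(r1)

    # Observed joint proportions: obs[i][j] = share of cases rated
    # category i by rater 1 and category j by rater 2.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1.0 / n

    # Marginal proportions for each rater.
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    # Disagreement weight: 0 on the diagonal, growing with distance.
    def w(i, j):
        d = abs(i - j) / (k - 1)
        return d if weights == "linear" else d * d

    observed_disagreement = sum(w(i, j) * obs[i][j]
                                for i in range(k) for j in range(k))
    expected_disagreement = sum(w(i, j) * p1[i] * p2[j]
                                for i in range(k) for j in range(k))
    return 1.0 - observed_disagreement / expected_disagreement
```

Kappa is 1.0 under perfect agreement, 0.0 when agreement equals chance expectation, and negative when raters agree less than chance would predict (as in the −0.29 estimate reported above).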

Share and Cite

MDPI and ACS Style

Lyness, S.A.; Peterson, K.; Yates, K. Low Inter-Rater Reliability of a High Stakes Performance Assessment of Teacher Candidates. Educ. Sci. 2021, 11, 648. https://doi.org/10.3390/educsci11100648


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
