A Framework of Construct-Irrelevant Variance for Contextualized Constructed Response Assessment
Abstract
Estimating and monitoring construct-irrelevant variance (CIV) is critical to validity, especially for constructed response assessments with rich contextualized information. To examine CIV in contextualized constructed response assessments, we developed a framework comprising a model that accounts for CIV and a measurement approach that can differentiate its sources. Specifically, the model includes CIV due to three factors: the variability of assessment item scenarios, judging severity, and rater scoring sensitivity to the scenarios in tasks. We proposed using many-facet Rasch measurement (MFRM) to examine CIV because this measurement model can compare different CIV factors on a shared scale. To demonstrate this framework, we applied it to a video-based science teacher pedagogical content knowledge (PCK) assessment consisting of two tasks, each with three scenarios. Results for Task I, which assessed teachers’ analysis of student thinking, indicate that the CIV due to the variability of the scenarios was substantial, while the CIV due to judging severity and rater scoring sensitivity to the scenarios in teacher responses was not. For Task II, which assessed teachers’ analysis of responsive teaching, results showed that the CIV due to all three proposed factors was substantial. We discuss the conceptual and methodological contributions and how the results inform item development.
Authors
Xiaoming Zhai,
Kevin Haudek,
Christopher Wilson,
Molly Stuhlsatz
Year of Publication
2021
Journal
Frontiers in Education
Volume
6
Number of Pages
409
ISSN Number
2504-284X
URL
https://www.frontiersin.org/article/10.3389/feduc.2021.751283
DOI
10.3389/feduc.2021.751283