Exploring the item features of a science assessment with complex tasks
Published Web Location: https://doi.org/10.1016/j.measurement.2017.08.039

Abstract
Item explanatory models have the potential to provide insight into why certain items are easier or more difficult than others. Through the selection of pertinent item features, one can gather validity evidence for the assessment if construct-related item characteristics are chosen. This is especially important when designing assessment tasks that address new standards. Using data from the Learning Progressions in Middle School Science Instruction and Assessment (LPS) project, this paper adopts an “item explanatory” approach and investigates whether certain item features can explain differences in item difficulties by applying an extension of the linear logistic test model. Specifically, this paper explores the effects of five features on item difficulty: type (argumentation, content, embedded content), scenario-based context, format (multiple-choice or open-ended), graphics, and academic vocabulary. Interactions between some of these features were also investigated. With the exception of context, all features had a statistically significant effect on difficulty.
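For readers unfamiliar with this model family, the sketch below shows the standard linear logistic test model (LLTM) that the paper's approach extends: the Rasch item difficulty is constrained to be a weighted sum of item-feature effects through a design matrix. The symbols and the normalization constant here are generic notation, not the paper's own; the specific extension applied in the study is not reproduced.

```latex
% Rasch model: probability that person p answers item i correctly,
% with person ability \theta_p and item difficulty \beta_i
P(X_{pi} = 1 \mid \theta_p) = \frac{\exp(\theta_p - \beta_i)}{1 + \exp(\theta_p - \beta_i)}

% LLTM constraint: difficulty decomposed into K item-feature effects,
% where q_{ik} indicates whether item i carries feature k,
% \eta_k is that feature's contribution to difficulty, and c is a constant
\beta_i = \sum_{k=1}^{K} q_{ik}\,\eta_k + c
```

Under this decomposition, a positive estimate of \eta_k means that items with feature k (e.g., an open-ended format or dense academic vocabulary) tend to be harder, which is how feature-level effects on difficulty can be tested.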