Performance variations across reading comprehension assessments: Examining the unique contributions of text, activity, and reader

Abstract: These studies examined the contribution of text, activity, and reader to variance in reading comprehension test scores. Study 1 focused on multiple-choice and open-ended item responses, whereas Study 2 examined retell. Both studies included 79 fourth-grade students (age M = 9.72; SD = 0.34). Each student read six passages from the Qualitative Reading Inventory-Fifth Edition (QRI-5) and completed comprehension assessments of varying response format (open-ended questions, multiple choice, and retell). Measures of cognitive capacity, language knowledge, learning motivation, and word reading fluency were also administered. In Study 1, item-response crossed random effects models revealed statistically significant differences between open-ended question and multiple-choice response formats, and three covariates significantly predicted reading comprehension test scores: (a) attentive behavior, (b) language knowledge, and (c) working memory. Further exploratory analyses identified two two-way interactions: (a) Response Format × Attentive Behavior and (b) Response Format × Language Knowledge. In Study 2, crossed random effects models revealed two statistically significant predictors of retell scores: (a) text genre and (b) language knowledge. Findings suggest that different response format activities may contribute to variance in reading comprehension test scores, and that this test property may further interact with text as well as reader abilities.
Source: Reading and Writing | Category: Child Development | Source Type: research