The effect of response format and presentation conditions on comprehension assessments for students with and without a reading disability

UNCG Author/Contributor (non-UNCG co-authors, if there are any, appear on document)
Ronda Walker (Creator)
The University of North Carolina at Greensboro (UNCG )
Alan Kamhi

Abstract: Previous studies (Collins, 2015; Keenan & Meenan, 2014) have shown how variations in text and task factors and individual reader skills affect performance on reading comprehension assessments. The present study examined whether different presentation conditions (silent reading, watching a video) and response formats (open-ended vs. multiple-choice questions) influenced comprehension performance for students with and without reading disabilities. In addition, measures of word-level reading, vocabulary, working memory, listening comprehension, and prior knowledge were assessed to determine the best predictors of performance on comprehension assessments. Participants were 32 fifth-grade students, 17 with reading disabilities (RD) and 15 who were typically developing (TD). All students were initially administered measures of word-level reading, vocabulary, listening comprehension, working memory, and decoding. Students were then administered four passages: two were read silently and two were presented with videos. For each condition (text and video), comprehension was assessed with open-ended and multiple-choice questions. All assessments were administered individually to each student across two 60-minute testing sessions. All students performed significantly better on the multiple-choice questions than on the open-ended questions. As expected, the TD group had significantly higher comprehension scores on all measures. Presentation condition did not significantly affect performance for either group. Listening comprehension, working memory, and prior knowledge contributed unique variance to performance on the different response formats. For the open-ended questions, 67% of the variance was explained by the measures of listening comprehension and prior knowledge. In contrast, only 38% of the variance in the multiple-choice questions was explained by working memory.
Even though students performed better on the multiple-choice questions, the regression analyses indicated that the open-ended questions better reflected basic language abilities and prior knowledge. Open-ended questions appear to provide a better measure of reader and text factors than multiple-choice questions, which are more influenced by task factors. Future studies should continue to examine how reader, text, and task factors influence comprehension performance.

Additional Information

Language: English
Date: 2017
Keywords: Language, Literacy, Reading Comprehension, Reading Disability, Response Format
Subjects:
Educational tests and measurements -- Evaluation
Reading comprehension -- Ability testing
Reading -- Ability testing
Reading disability
