Elementary and middle school students scored lower on a computer-based exam whose interface did not let them return to questions they had previously completed or skipped, according to a new study from the American Association for the Advancement of Science (AAAS) presented at the annual conference of the American Educational Research Association.
Non-native English speakers also performed better on paper-and-pencil exams than on either of two computer-based testing formats, the study found.
The findings add new wrinkles to the ongoing debate about whether computer- and paper-based tests are functionally the same for students.
Last year, Education Week reported that millions of K-12 students who took 2014-15 PARCC exams via computer tended to score lower than those who took the exams with paper and pencil. Overall, research on the subject of possible “mode effects” on standardized tests remains mixed. Some studies have found little difference between paper- and computer-based exams, while others have found a persistent pattern of student scores being lower on computer.
When such differences have been identified, they have often been attributed to technological features of the computer-based exams, such as the need for students to scroll through long reading passages. Practitioners also worry when students are asked to take exams using devices and interfaces with which they are not familiar.
The AAAS study looked at the results of more than 33,000 students across grades 4-12 on a test of science ideas related to energy. Three versions of the same test were administered: a paper-and-pencil version, in which students used a booklet and scannable answer sheet; a computer-based version via an open-source testing system called TAO; and a computer-based version via the AAAS assessment website.
The two computer-based versions of the test had subtle, but apparently significant, differences. Most notably, on the TAO system students could revisit questions they had skipped or previously answered, giving a response or making a revision later. On the AAAS system, students could not return to questions they had answered or skipped.
Also important: To record their responses on the TAO system, students could click directly on their chosen answer. On the AAAS system, students had to select a separate button that corresponded to their chosen answer.
The researchers found that high school students performed similarly on all three test formats.
Elementary and middle school students, however, performed significantly worse on the computer-based AAAS system.
And non-native English speakers performed slightly worse than their peers on the paper version, but significantly worse than their peers on both computer-based versions.
The findings, paired with earlier studies showing that college students' scores were not affected by the ability to go back to skipped questions, led the researchers to several conclusions.
“This may indicate that being able to skip, review, and change previous responses could be beneficial for younger students in elementary and middle school, but have no influence on older students in high school and college,” they wrote.
In addition, “marking an answer in a different location on multiple-choice tests could be challenging for younger students, students with poor organizational skills, difficulties with concentration, or students who are physically impaired.”
Photo: Seventh graders at Marshall Simonds Middle School in Burlington, Mass., look at a PARCC practice test to give them some familiarity with the format before field-testing in 2014 of the computer-based assessments aligned with the common core.--Gretchen Ertl for Education Week-File
- PARCC Scores Lower for Students Who Took Exams on Computer
- Comparing Paper-Pencil and Computer Test Scores: 7 Key Research Studies
- Online Testing Now More Common Than Paper-and-Pencil, Study Finds
A version of this news article first appeared in the Digital Education blog.