Assessment

Impact of Paper-and-Pencil, Online Testing Is Compared

By Lynn Olson — August 30, 2005

How students perform on computer-delivered tests depends, in part, on how familiar they are with the technology, concludes a set of studies conducted by the Princeton, N.J.-based Educational Testing Service.

The studies looked at how students performed when given mathematics and writing items from the National Assessment of Educational Progress by paper and pencil vs. computer. The results of the studies were released this month by the National Center for Education Statistics, which oversees the federal testing program.

In the math study, nationally representative samples of 4th and 8th graders in 2001 took a computer-based math test and a test of computer facility, among other measures. In addition, at the 8th grade level, a randomly selected control group of students took a paper-based exam containing the same math items as the computer-based test.

“Online Assessment in Mathematics and Writing: Reports From the NAEP Technology-Based Assessment Project, Research and Development Series” is available from National Center for Education Statistics.

Average scores for 8th graders taking the computerized test were about 4 points lower than for those taking the paper version, a statistically significant difference. On average, 5 percent more students answered individual items correctly on paper than on computer.

At both grade levels, students’ facility with the computer—based on hands-on measures of input speed and accuracy—predicted their performance on the online exam.

The writing study compared the performance of a nationally representative sample of 8th graders who took a computer-based writing test in 2002 with that of a second, nationally representative sample of 8th graders taking the same test on paper as part of the regular NAEP administration that year.

Results showed that average scores on the computer-based writing test generally were not significantly different from average scores on the paper-based exam. But, as with the math test, individual students with better hands-on computer skills tended to achieve higher online scores, after controlling for their level of paper writing skills.

Arnold A. Goldstein, the director of reporting and dissemination for the assessment division of the NCES, said that the findings suggest a possible problem in administering the national assessment online, but that further research is needed. “I think we would need to have a larger field test in a more traditional NAEP testing setting in order to determine that,” he said.

Mr. Goldstein added that, while this was a one-time study, the NCES—an arm of the U.S. Department of Education—may do further work in the future to explore the administration of the assessment online.

Scoring by Computer

The studies also examined the feasibility and costs of generating and scoring NAEP math and writing items by computer.

While the machine’s grades on simple math items were generally interchangeable with those of human scorers, that was less true for items requiring extended text responses. On those items, the computer tended to treat correct responses that were misspelled as incorrect, a technical shortcoming that could be addressed by including common misspellings in the automated scoring key or by adding a spell-check step before answers are submitted, according to the study’s authors.
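The first fix the authors describe, expanding the scoring key to credit common misspellings, can be illustrated with a minimal sketch. The item name, the accepted spellings, and the sample responses below are hypothetical, not drawn from the ETS study:

```python
# Hypothetical scoring key: each item maps to a set of accepted spellings,
# including common misspellings of the correct answer.
SCORING_KEY = {
    "perimeter_item": {"perimeter", "perimiter", "perimetre"},
}

def score(item: str, response: str) -> bool:
    """Credit a response if, after trimming and lowercasing,
    it matches any accepted spelling for the item."""
    return response.strip().lower() in SCORING_KEY.get(item, set())

print(score("perimeter_item", "Perimiter"))  # True: misspelled but credited
print(score("perimeter_item", "area"))       # False: wrong answer
```

A production scorer would likely generate misspellings automatically (for example, by edit distance) rather than enumerating them by hand, but the principle is the same: widen the key rather than penalize spelling on a math item.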

On the writing test, however, automated scores did not agree closely enough with the scores awarded by human readers to consider the two types of scores interchangeable.

“That is something that needs to be considered in any further development work,” said Mr. Goldstein. “For example, whether a trend can be extended from a paper-and-pencil to an online administration of the assessment.”
