Incentives May Boost Test Scores, Two Studies Find
ALBUQUERQUE, N.M.--Providing students with incentives such as payment for correct answers may help boost their scores on such examinations as the National Assessment of Educational Progress, two new studies suggest.
The studies were presented here last week during a national meeting on assessment held by the Council of Chief State School Officers.
Researchers at the University of California at Los Angeles's Center for Research on Evaluation, Standards and Student Testing, or CRESST, cautioned that the increases they found in students' test scores were small.
Nonetheless, they said, their findings raise questions about whether NAEP and other "low-stakes'' assessments accurately measure the best that students can do.
"I think this means we have to look at more than one indicator of students' performance,'' said Linda Winfield, a visiting professor and research associate at CRESST. "To say that kids' performance is terrible in math needs to be based on more than just the NAEP.''
The Congressionally mandated NAEP tests are given biennially to national samples of students in grades 4, 8, and 12.
The assessments are considered "low stakes,'' however, because they do not count toward students' grades and the test results are not reported for individual students, schools, or districts.
Raising the Stakes
To find out what would happen if the stakes were raised slightly and students were motivated in some way to do well on the tests, the CRESST researchers began a series of pilot studies last year using items from the 1990 NAEP mathematics tests.
In one study, the researchers asked 749 8th graders and 719 12th graders to answer 41 or more items taken from the NAEP tests given at those levels. Before the test, the students were given written directions indicating they would be paid $1 for every correct answer.
In contrast, the test instructions given to a control group of students were more typical of the kind that accompany the actual NAEP tests. They stated, for example, that the purpose of the test was "to provide information on the knowledge and attitudes of young people throughout the United States.''
"By doing the best you can you will be making an important contribution,'' the directions said. Those students were also told their scores would not be reported to the school.
Among 8th graders, the researchers found, the financial incentive helped improve test performance. Those students answered an average of 28.5 of the 41 test items correctly, while students in the control group answered an average of only 25 items correctly.
Researchers said the difference, though slight, was statistically significant.
The effects were strongest, the researchers said, for students who said they had read and remembered the instructions and on test items considered to be "easier.''
Similarly, some of the biggest improvements were found on open-ended test items.
The effects were the same for males and females and for students of varying racial and ethnic groups, the researchers said.
The researchers said they also saw slight, but nonsignificant, increases in 8th graders' test results when the directions for students in the experimental group were changed to appeal to their competitive instincts. In that case, they were told the tests would show "how good you are in math'' and that the results would be reported to their schools and parents.
The scores for 12th graders were not affected by financial or any other incentive.
More Research Planned
In the second study, the researchers sought to increase students' test performance by embedding items from the NAEP math assessment in a slightly higher-stakes test--a statewide curriculum-based assessment given to more than 40,000 8th graders in Georgia. School and school district scores from that assessment are widely reported in the state. Students, however, do not receive their individual scores.
The researchers then compared students' scores on the embedded items with results from the 1990 NAEP tests for students in Georgia.
They found that on the eight of the 17 embedded items considered "harder,'' students performed similarly on both tests. Most of these items were answered correctly by fewer than 40 percent of the students nationwide in the NAEP assessment.
On the nine items considered "easier,'' the researchers found that students performed slightly better on the higher-stakes test. The effect, however, was not as pronounced as in the study in which financial incentives were provided.
However, Brenda Sugrue, one of the researchers, noted that the study showed that "even raising the stakes indirectly can raise student performance on easier test items.''
The researchers acknowledged that paying students to take tests may prove controversial and impractical.
Gary W. Phillips, the associate commissioner in charge of the education assessment division for the National Center for Education Statistics, which oversees NAEP and funded the studies, said the findings did not immediately portend any changes for the NAEP assessments.
"NAEP has no plans to provide students with any motivational incentives,'' he said.
Mr. Phillips said that NCES plans to continue to fund the California center's research effort to determine what role motivation plays in student performance.
The effort is part of a growing body of research probing how students' motivation affects both their learning and how well they do on tests. (See Education Week, Nov. 7, 1990.)
Ms. Sugrue said further studies would examine, for example, whether the students had had an opportunity to learn the content being tested and whether students' socioeconomic backgrounds would make a difference--particularly if students were given financial incentives.