G.A.O. Blasts Method for Reporting NAEP Results

WASHINGTON--The reporting of results on the National Assessment of Educational Progress in a way that tells whether students are doing better or worse than experts think they should is "fundamentally flawed," a hard-hitting federal report released last week concludes.

The NAEP tests, given to national samples of students in grades 4, 8, and 12 in several academic areas, are meant to provide a national barometer of student achievement.

They have grown in prominence in recent years, as the federal government has begun publishing the results on a state-by-state basis and policymakers have begun looking for ways to measure progress toward the six national education goals.

Recognizing the increasing importance of the tests, Congress in 1988 created the National Assessment Governing Board to set policy for the NAEP program. It also directed the board to shift that policy fundamentally--to move from simply measuring how students perform to measuring how they perform against some standard for high achievement.

The board was charged with determining, in other words, "how good is good enough."

It defined three levels of achievement: basic, proficient, and advanced. Teachers and other experienced educators were recruited to judge what percentages of students might be expected to meet those newly defined levels. The NAEP results were then interpreted against those benchmarks.
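The judging procedure described here resembles what measurement specialists call an Angoff-style method: each judge estimates, item by item, how likely a student on the borderline of a given level is to answer correctly, and those estimates are pooled into a cut score. The sketch below illustrates only that arithmetic; the judges, items, and ratings are hypothetical, and the actual NAEP procedure is more elaborate.

```python
# A minimal sketch of an Angoff-style cut-score calculation.
# All judges, items, and ratings are hypothetical illustrations,
# not figures from the actual NAEP standard-setting panels.

# Each judge estimates, for every test item, the probability that a
# student on the borderline of "proficient" would answer it correctly.
judge_ratings = {
    "judge_1": [0.80, 0.65, 0.40, 0.55],
    "judge_2": [0.75, 0.70, 0.35, 0.60],
    "judge_3": [0.85, 0.60, 0.45, 0.50],
}

num_items = 4
num_judges = len(judge_ratings)

# Average each item's estimates across judges.
item_means = [
    sum(ratings[i] for ratings in judge_ratings.values()) / num_judges
    for i in range(num_items)
]

# Summing across items yields the expected raw score of a borderline
# "proficient" student, which serves as the cut score for that level.
cut_score = sum(item_means)

print(f"Proficient cut score: {cut_score:.2f} out of {num_items} items")
```

The G.A.O.'s practical objection, framed in these terms, is whether judges can make such item-level estimates reliably enough for the pooled cut scores to be meaningful.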

General Accounting Office evaluators, however, said they found both conceptual and practical problems with the new achievement levels.

"Reliance on N.A.G.B.'s results could have serious consequences,'' the federal evaluators warned.

The governing board, in a strongly worded response, said the federal investigators' concerns were largely moot because of changes made in the way NAEP results are reported since the study was undertaken.

G.A.O. Findings

The study was based on a review of the 1990 NAEP math test, which was one of the first to describe its results in terms of achievement levels.

Among other problems, the evaluators said they found:

  • The interpretations of NAEP scores do not show whether students have actually mastered the material--and the achievement levels may even be set too high;
  • The method of setting achievement levels, in which educators estimate how well students ought to do on particular test items, is unsuited to the NAEP program;
  • The educators who judge the test items are not given sufficient technical information on which to base their decisions;
  • With only two testing experts on its 23-member board, N.A.G.B. is ill-equipped to make technical judgments; and
  • The achievement levels fail to indicate how well students do on particular content areas.

"If you were interested in knowing, say, how well did people grasp fundamental skills, [N.A.G.B.] doesn't isolate their performance for that particular category of items,'' explained Gail MacCall, who directed the study.

The board responded that the NAEP tests were never intended to give "skill-by-skill results." Rather, it continued, they are intended to indicate a "general degree of attainment."

The board said G.A.O. investigators' conclusions stemmed from basic misunderstandings about the purposes of NAEP and the move toward measuring student progress against standards.

"All standard-setting is judgmental,'' the board said, and its achievement levels were based on informed judgments from experienced teachers about what students ought to know.

"It is reasonable to debate whether they [the achievement levels] are too high or too low, but to say they are 'not accurate' as if some 'true' levels exist, is absurd,'' the board said.

The board also noted it has revised its processes since the study was conducted.

The educators who judged test items for the 1992 NAEP test in math, for example, were given more time, more training, and more technical support to inform their decisions.

Moreover, the board has added a process to check whether students at specific achievement levels can really master the material. That process will be incorporated for the first time into the report on the NAEP reading test, scheduled for release later this year.
