The board that sets policy for the National Assessment of Educational Progress continues to grapple with how to report scores for states that exclude large numbers of students from NAEP because they have disabilities or limited fluency in English.
Results from the assessment’s 2002 reading tests, to be released next month, will not flag the scores of states with relatively high exclusion rates, said Peggy G. Carr, the associate commissioner of the National Center for Education Statistics, the arm of the U.S. Department of Education that administers NAEP. But, she said in an interview last week, the report will provide additional information to help the public interpret the data, where necessary.
In the long term, said Charles E. Smith, the executive director of the National Assessment Governing Board, “a top agenda item” will be how to interpret and communicate information about differences in exclusion rates across states to the general public.
“What’s missing,” he said during the quarterly meeting of NAGB in Kansas City, Mo., held May 15-17, “is a research-based mechanism—widely accepted—by which exclusion rates may be gauged and interpreted. Current methodologies have not yet achieved that goal.”
Puzzling Trend
Often called “the nation’s report card,” NAEP is the leading national barometer of what representative samples of students know and can do in core subjects.
Under NAEP guidelines, schools selected for the sample may exclude certain students with disabilities or limited English proficiency, if officials deem them unable to participate meaningfully in the assessment. For example, students who have not had at least three years of academic instruction in English may be excused from the exam.
Since the mid-1990s, NAEP has offered a wide range of accommodations for those categories of students, such as more test-taking time, in an effort to reduce exclusion rates. Even so, those rates vary widely across states and have been rising in some, for reasons that are not entirely clear. (“NAEP Board Worries States Excluding Too Many From Tests,” March 19, 2003.)
Those increases have elicited concerns about the fairness or validity of comparing results across states with large differences in their exclusion rates.
In the next few weeks, Mr. Smith said, NAGB staff members will be looking for ways to examine the issue “totally separate and apart from any release of NAEP assessment results.”
In addition, the NCES has formed a working group to consider how NAEP could take a more active approach in standardizing local decisions about whether students take part in the tests and what, if any, accommodations they receive.
When it comes to the 2002 reading results, Mr. Smith said, “there is a sense of comfort that every effort has been made by NCES and its staff to properly interpret and to report” the scores.
World History Delayed
During the Kansas City meeting, the board also voted to delay a world history test for 12th graders until 2010 and to move up U.S. history tests in grades 4, 8, and 12 from 2010 to 2006.
The world history test originally was scheduled for 2006. But the board decided it needed more information about world history instruction before moving forward with the design of an assessment. Meanwhile, board members expressed strong support for testing U.S. history, which was last assessed in 2001.
David Northrup, the president-elect of the World History Association, said that while the field worries that a subject that isn’t tested will not be valued, coming up with a sound test should be the first priority.
“The time frame is not the most important consideration,” Mr. Northrup, a professor of history at Boston College, said in an interview. “There is nothing to be gained by rushing.”