The board that sets policy for the National Assessment of Educational Progress has been meeting in Washington over the past few days, attempting to solve some of the trickiest issues in the world of testing.
An ad hoc committee of the National Assessment Governing Board gathered Thursday to consider one such topic: how to bring more uniformity to the proportions of students that states and cities exclude from taking NAEP, or to whom they provide special help known as accommodations.
It’s not easy, as I discussed in a recent story. States set their own policies on how to deal with English language learners and students with disabilities, and local school officials have a strong say, too. As a result, their exclusion and accommodation rates vary greatly on different NAEP tests, leading critics to question the validity of the results.
The ad hoc committee decided to put several possible fixes out there for public consumption. Those general options include leaving NAEP policies the way they are now (basically, following individual states’ policies); designating figures known as “full-population estimates,” which seek to estimate how well jurisdictions would have done without exclusions, as the official test scores; setting some kind of mandatory exclusion or accommodation policy; or setting a recommended exclusion/accommodation rate and indicating whether jurisdictions were meeting that guideline.
Another interesting option, discussed by the ad hoc committee, would be using some sort of automated “screener” to gauge students’ abilities before they take the test.
The ad hoc committee is likely to put forward those options for public comment, and possibly hold a public hearing at a later date to consider them.
Another tough issue being mulled over by the governing board is that of “preparedness” -- or determining whether NAEP could be used to report on how ready 12th graders are for college or the workforce.
In theory, doing so would allow the public to look at students’ NAEP scores, broken down by state or demographic group, and make a judgment about their preparation for higher ed or a job. The board is trying to determine if including that information would be feasible by 2009 for 12th grade tests in reading and math.
Michael Kirst, a professor at Stanford University who has been chairing a seven-member technical panel studying the issue, presented findings and possibilities to the board on Friday. One of the difficulties in judging student preparedness, of course, is determining what standard to use. For instance, should a college’s placement test for incoming students be used to judge readiness for higher ed? Or some other measure?
Federal officials have long sought to increase the public’s understanding of NAEP, and its value in demonstrating student academic progress. Including information on students’ preparation for college or the job market in NAEP results could increase the public’s regard for the test, Kirst told me after his presentation.
“It would help in communicating what the purpose and implications of the NAEP are,” he said. “The public might find [the test] much easier to understand.”