The “nation’s report card” has a long way to go in translating its data on student achievement for the teachers, parents, and school administrators who might make use of the information, a study suggests.
The study is based on lengthy discussions about the National Assessment of Educational Progress with seven focus groups. Comprising eight to 13 people each, the groups were made up, separately, of state assessment directors, school board members, school administrators, vocational specialists, business people, parents, and teachers.
Of all the groups, only the state testing directors were familiar with NAEP. Only two of the school administrators said they had heard of it.
After they were given some NAEP reports to peruse, most of the groups said they found them to be “too long, too hefty, too complex, too dense, and to contain too much jargon,” according to Phyllis Blaunstein, a senior adviser to the Widmeyer Group Inc., the Washington-based public-relations firm that conducted the study for the National Assessment Governing Board.
One member of a focus group said: “I don’t need 100 copies of the state report stacked up in the basement to be recycled. We don’t need them the size of doorstops.”
“There’s a very small policy audience for NAEP, but it isn’t filtering down to the people in the classroom,” Ms. Blaunstein said. “I think most members of the governing board would like to have the reports make a greater impact on education.”
The results of the study were presented Nov. 19-20 in San Francisco during a meeting of the governing board, which sets policy for the Congressionally mandated assessment of student achievement in key subjects.
More Than Facts
Mark D. Musick, the chairman of the governing board, said the conclusions “rang painfully true.”
The board is scheduled at its next meeting, in March, to look at examples of smaller, more consumer-oriented NAEP reports in an effort to make the data more useful.
But Mr. Musick noted that the effort to repackage NAEP statistics for consumers also raises some knotty questions.
For example, the focus-group participants said they wanted the NAEP reports to provide more interpretation of the statistics, more guidance, and additional practical advice.
“Most importantly, they said it didn’t resolve the question, ‘So what do I do with this now?’” Ms. Blaunstein said.
But Mr. Musick said that kind of interpretive reporting of data may go beyond the scope of the National Center for Education Statistics, the Education Department agency that oversees NAEP.
“It’s supposed to just report the facts,” he said.
However, while critical of the form of the NAEP reports, most of the focus-group participants also said that the NAEP program provides credible data on student achievement and that there is a need for that kind of information.
They said the data can be useful in keeping parents, policymakers, and other people informed about student achievement and in correcting misinformation.
“A legislator will get calls of extraordinary exaggeration from parents and teachers with totally different views,” the report notes. “Armed with the data, these officials can deflect criticism or allay concern.”
Quicker Results
Moreover, if the NAEP statistics were packaged differently and reported in a more timely fashion, the focus-group members said, they could be used to improve teaching and learning in the classroom.
An assessment director from a Western state pointed out, for example, that the 1990 mathematics assessments were done in February of that year. The results were returned in June 1991, and testing was repeated in February 1992.
“So theoretically you had students in school for six months between the time you got your results and the next assessment,” he said. “If you’re talking about measurement of change, that’s ludicrous.”
The public-relations firm also found that most focus-group participants distrusted student-achievement data from standardized tests generally. They perceived NAEP, though, as more reliable than most.
“And the farther the data is from them, the less they like it,” Ms. Blaunstein said. “They don’t like international comparisons at all.”
One participant, for example, commented: “Who really cares how the kid in East Wazoo is doing? All we want is information about our schoolchildren.”
Participants said that standardized tests often do not account for differences in students’ experiences and that they do not measure what students need to know to succeed, or what is taught in the classroom.
In addition, the focus-group members said, test results are frequently negative and are often misinterpreted.