‘Wall Chart’ Seen Diagnostic Tool With Uses, Flaws

By Robert Rothman — May 17, 1989 7 min read

Washington--When Secretary of Education Lauro F. Cavazos released this year’s “wall chart” of state performance indicators, he challenged educators to reverse what he termed signs of “stagnation” in student achievement.

But the data he presented, skeptical state-level officials said last week, are unlikely to drum up many recruits.

They acknowledged that the annual exercise has aided school-improvement efforts by shining at least a one-day spotlight on the state of education. Its impact, they said, has spurred a variety of state and national efforts to increase the flow of information on school performance.

But the performance data on the chart--high-school graduation rates and college-admission test scores--may not support the Secretary’s grim conclusions, the officials insisted.

And, in a broader indictment, they warned that the data are too limited to provide any guidance on how to boost performance.

“Publication of something like this causes political gnashing of teeth, finger-pointing, and excuse-making,” said H.D. Hoover, Iowa’s testing director. “None of those things change education.”

What is needed, he and others suggested, is a broader set of indicators that show how well a state’s students perform in particular subject areas. The forthcoming National Assessment of Educational Progress, which will for the first time provide state-by-state achievement data, will serve this function, many predicted.

Meanwhile, educators said, the wall chart should provide examples of states and districts that have improved their performance over time.

Despite their expressions of disappointment with the chart as it is, however, few officials said they would like to see it abandoned.

“It’s better to have partial information than no information,” said Ramsey Selden, director of the state education assessment center for the Council of Chief State School Officers. “At some point in time, you have to use the best information available.”

But Mr. Selden cautioned that “you’ve got to be careful how you use” that best-available information.

‘A Good Goad’

First unveiled by former Secretary of Education Terrel H. Bell in 1984, the wall chart has evoked strong reactions from educators, who had until then strongly resisted state-by-state comparisons.

They objected in particular to the use of the Scholastic Aptitude Test and the American College Testing Program test as measures of a state’s student performance. Such tests, they pointed out, are taken only by college-bound students, not a sample of all students in the state, and do not reflect the effects of a state’s instructional program.

Nevertheless, the publication of the chart made such comparisons inevitable, and in 1984, the state chiefs voted to establish a new set of indicators that would better reflect state practices. That vote also helped set in motion the idea of expanding NAEP to produce state-level achievement data.

The publicity surrounding the chart also spurred the growth of performance indicators for individual states, according to W. Ross Brewer, director of planning and policy development for the Vermont Department of Education.

“It serves as a good goad to get us moving on collecting and disseminating information,” he said.

An Approximation

In releasing this year’s version of the wall chart May 3, Mr. Cavazos suggested that student performance “has been stagnant.”

Average SAT scores, he noted, had declined by two points--to 904 out of a possible 1,600--since last year, and ACT scores had risen over that time by one-tenth of a point, to 18.8 out of 35.

In addition, he said, while several states had improved their graduation rates substantially since 1982, the national average had improved by less than 2 percentage points.

“We are standing still,” Mr. Cavazos said, “and the problem is that it’s been this way for three years in a row. And, frankly, this situation scares me.” (See Education Week, May 10, 1989)

Some officials conceded last week that the scores may in fact provide a good approximation of student performance.

“My perception,” said Mr. Hoover of Iowa, which ranked first last year among the states that use the ACT, “is that if we had a true state-by-state comparison, truly sampled kids, and if all kids tried, I would be surprised if states such as Iowa didn’t finish where they finish on ACT comparisons. In fact, I’d be amazed.”

Similarly, Terry E. Peterson, special assistant to the South Carolina joint business-education committee, said that his state’s data on college-going rates, college-freshman performance, and other factors correlate well with the wall chart’s finding that South Carolina’s SAT scores have improved substantially since 1982.

Wilmer S. Cody, Louisiana’s superintendent of education, said NAEP data indicate that student achievement is improving more rapidly than the admission-test scores would suggest.

But NAEP scores also show that much of the improvement has been in basic skills, according to Mr. Cody, and that in higher-level skills, such as those measured on the ACT, there has not been a significant increase in performance.

Questioning Conclusions

But while these officials were willing to concede that data on the wall chart may reflect actual performance, they questioned the conclusions Mr. Cavazos drew from them.

“I don’t think it’s useful to seize on year-to-year trends,” said Mr. Selden. “If scores don’t move for one year, does that mean the whole system is stagnant?”

Many school reforms enacted in the past few years have yet to yield achievement gains among the high-school juniors and seniors who take college-admission tests, added Mr. Cody of Louisiana.

“I have no doubt achievement is going up,” he said. “But that has not yet been reflected in the ACT.”

In addition, he noted, “the dropout statistics [on the chart] suggest a serious problem in the U.S. It does not look like we are making much progress.”

But the experience in his state, he said, suggests that the statistics do not tell the entire tale.

“We are aggressively trying to attend to that matter,” he said of the dropout rate. “We’ll know in two or three years whether we have succeeded.”

Moreover, the graduation-rate statistics may not indicate that more students are dropping out of school, Mr. Selden noted. In fact, he said, they could reflect the increased time it may take some students to graduate because of the toughened course requirements and exit tests implemented during the past few years.

“Are they dropouts, or kids the system legitimately applied higher standards to?” he asked. “We don’t have the information yet to interpret this.”

Charles E.M. Kolb, deputy undersecretary of education for budget, planning, and evaluation, acknowledged that “it’s too soon to write off the reforms.”

“People shouldn’t despair,” he added. “They should roll up their sleeves and go to work. Many states have done that.”

“The point here is to be challenging,” he said. “We have more work to do.”

But state officials contended last week that the wall chart has had little effect on state policies because the data do not provide enough information.

More Than a One-Day Story

In fact, as Mr. Hoover of Iowa pointed out, the most significant gains in student achievement nationally--those in the early 1980’s--predated the publication of the chart.

Mr. Hoover also noted that policymakers who draw inaccurate conclusions from the data may slow educational progress. Iowa’s high test-score ranking, he said, has convinced some lawmakers that the school system is good enough as it is.

“People in Iowa worry about complacency,” Mr. Hoover said. “If this has had any impact in Iowa, it has been as a way for politicians not to put as much money in education as some wish they would.”

Mr. Peterson of South Carolina proposed that, to help states bring about improvements, the Education Department should highlight model districts and states that have improved school performance.

“If there are some 5 or 10 examples of successful endeavors to raise sat scores and graduation rates,” he said, “that would give evidence that a concerted long-term effort can make a substantial difference.”

The forthcoming NAEP comparisons should also provide information states can use to boost achievement, said Mr. Cody, who formerly served on NAEP’s policymaking body. That assessment is expected to highlight states’ strengths and weaknesses in individual subject areas and on topics within those subject areas.

“If NAEP follows its past practice, that will be far more informative on what is going on,” said the Louisiana superintendent.

Perhaps the most effective way for the Education Department to influence performance, suggested Mr. Brewer of Vermont, is for Secretary Cavazos to report more frequently on the state of education.

The current annual report, he noted, “produces a one-day story on the front page of the newspapers, and that’s it.”

“What we need is something that would sustain attention on education more regularly,” he said. “The wall chart doesn’t do that.”

A version of this article appeared in the May 17, 1989 edition of Education Week as ‘Wall Chart’ Seen Diagnostic Tool With Uses, Flaws