States Take Stock of Math Programs in Wake of NAEP Results
From the moment he read the newspaper on June 3, Wayne G. Sanstead, North Dakota's superintendent of public instruction, had a feeling the results of the nation's first-ever state-level student assessment would bring him good news.
In Washington to receive an advance briefing on the results, Mr. Sanstead happened to read the astrology column in The Washington Post. It said the scenario for his day featured "recognition, celebration, and support from very important people."
During the course of the day, Mr. Sanstead learned that his state ranked at the top of the national rankings in overall mathematics achievement. It also ranked at the top in several factors--such as a low rate of television viewing and a high rate of two-parent families--associated with high levels of performance.
He had seldom read his horoscope before that day, Mr. Sanstead said, but "I'm going to pay more attention to it from here on out."
Although not all states received the same kind of encouraging news, the data on the assessment, which was conducted by the National Assessment of Educational Progress, offered a wealth of valuable information about student achievement, educators and state officials said last week.
Thanks to the state-by-state comparisons of achievement, the officials said, states were able to see for the first time how their students performed in a key subject area, mathematics, in relation to their peers in other states. Several states, in fact, used the occasion to announce major changes in their math programs.
But others cautioned that the data provide only limited information, and may offer little guidance to state policymakers.
Because the data include only average state performance, Secretary of Education Lamar Alexander acknowledged, the assessment is unable to indicate differences within states, which could be substantial.
In addition, observed Daniel M. Koretz, a senior social scientist at the RAND Corporation, the data offer few clues on factors that cause high performance. As a result, he said, educators who draw inferences about effective programs from the NAEP findings are making a mistake.
The assessment "gives people a concrete, solid barometer of how severe the problem is," Mr. Koretz said. "In other tests, because of teaching to the test, it's hard to say how severe the problem is in state A or state B."
"But to go beyond that and try to figure out why is complicated," he added. "You take a big risk in assuming programs are better in states that performed well."
Nevertheless, said Ramsay W. Selden, director of the state education-assessment center for the Council of Chief State School Officers, the exercise was worthwhile.
"This provides a new point of reference--being able to look at yourself in light of one another," he said.
The results released this month were based on tests administered in 1990 to about 2,500 8th graders in each of 37 states, the District of Columbia, and two territories.
The data represent the culmination of a five-year effort to expand NAEP to provide information on state-level performance, according to Chester E. Finn Jr., professor of education and public policy at Vanderbilt University and a self-described "proud great-uncle" of the assessment.
As an assistant U.S. secretary of education for educational research and improvement during the Reagan Administration, Mr. Finn helped establish a blue-ribbon panel, chaired by Mr. Alexander, that recommended expanding NAEP to permit state-by-state comparisons. That panel's report helped influence legislation, enacted in 1988, that established the trial state assessment.
"This is the first good state-level data on education outcomes," Mr. Finn said. "That's a nontrivial accomplishment."
"If you believe, as I do," he added, "that outcomes are the only measures that matter, you've got to say this is a bright new day in our ability to diagnose and understand education."
Francie M. Alexander, associate state superintendent in the California Department of Public Instruction, said the data provide a truer picture of achievement than state achievement tests, which she said had led to the so-called "Lake Wobegon effect," in which nearly all states performed "above average."
"The results of this test take care of that myth in no uncertain terms," she said.
But H.D. Hoover, director of the Iowa Basic Skills Testing Program, said that Iowa's performance on the NAEP assessment--it was among the top-performing states--was consistent with its performance on the ITBS.
"People say the ITBS is giving a false picture," he said. "For the state of Iowa, not at all. If anything, the Iowa test understates [performance]."
Suzanne E. Triplett, assistant state superintendent for research and development in North Carolina, one of the lower-performing states, said her state became aware of its low level of achievement in 1989, when it appeared at the bottom in national rankings of performance on the Scholastic Aptitude Test.
"In a sense, [the NAEP results] were reinforcing what we already knew," she said. "We weren't real surprised."
Perhaps the most surprising finding in the state rankings, suggested Mark D. Musick, president of the Southern Regional Education Board, is the fact that 20 of the participating states performed at about the same level, near the national average.
"If you had been given a list of states," he said, "would you have said that Kentucky, Maryland, California, and Illinois were in the same boat?"
"That shows, at least in math, we probably do have a national curriculum," Mr. Musick said. "If you look at the states, they are all very different demographically, yet the similarity in scores says there is essentially a national curriculum in math at work."
In addition to providing new information on student performance in math, the NAEP data suggest ways in which states can improve that performance, Ms. Alexander of California said.
"When we looked at sub-populations, we found places where students were doing well, and more where they were not doing well," she said.
Upon closer examination, she added, state officials found that many of the poorer performers were enrolled in "a wasteland called general math," where they received the kind of instruction the NAEP assessment found was associated with lower levels of performance--a heavy emphasis on drill and practice in computation skills.
As a result of these findings, Ms. Alexander added, California plans to revamp its general-math courses and work with 100 middle schools to test alternative instructional materials.
Elsewhere, other officials unveiled reforms in their math programs as they announced the results of the assessment. Franklin L. Smith, the incoming superintendent of schools in the District of Columbia, which performed at the bottom of the national rankings, said that he would institute a package of math-curriculum reforms, including requiring all students to take algebra, upon taking office next month.
But Ms. Triplett of North Carolina said the NAEP data provide few clues to help the state raise its level of performance. The data show, for example, that in calculator use--which is associated with higher levels of achievement--North Carolina students are comparable to those of other states.
"Maybe we're asking too much of NAEP to expect answers," she said. "We aren't finding any."
Mr. Selden of the CCSSO's assessment center cautioned that it was difficult to determine from the NAEP data what caused a state's level of performance, and he urged state officials to "do their analyses carefully."
"I think there is certainly enough to get started," he said. "But I hope the data aren't used to make inappropriate, premature judgments about what is causing performance. Some relationships are fairly subtle."
For example, he noted, the data show that students in advantaged urban areas whose parents have relatively high levels of education performed well on the assessment. But such students may also have better-qualified teachers, he suggested.
Moreover, Mr. Finn pointed out, the data indicate that the factors most strongly related to student performance are those--like television viewing and parental education--over which schools have little control. He said those results reflect the strong influence of what he has called "the other 91 percent," referring to the proportion of time 18-year-olds have spent outside of school.
"Schools have trouble overcoming those [factors], and rarely do they overcome them," Mr. Finn said. "But that mustn't be allowed as an excuse for schools to abandon the field and stop trying."
Mr. Finn added that even in states, such as North Dakota, where the "other 91 percent" is supportive of educational achievement, the level of math performance is relatively low. He noted that the achievement levels set by the National Assessment Governing Board, which are expected to be released in September, would show that relatively few students in any state are performing at the "proficient" or "advanced" level of achievement.
"Even where the 91 percent is in good repair, the education system needs to do a better job than it is doing," Mr. Finn said. "We can't encourage the folks in North Dakota to be complacent."
Mr. Sanstead, the North Dakota state chief, acknowledged that his state's level of performance must improve, particularly in higher levels of skill such as problem solving and data analysis. He said he would consider revamping the curriculum to introduce algebra, now typically taught in the 9th grade, a year earlier.
But he added that, for now at least, he would enjoy the rare privilege of reporting good news about education.
"It's a good feeling," he said. "I can tell you that."
Vol. 10, Issue 39