Plan for State-by-State Comparison of Scores Meets Resistance

By Cindy Currence — August 22, 1984

A pilot project that could lead to the first nationwide testing program specifically designed to assess student achievement on a state-by-state basis has encountered resistance from some state education officials, who say such comparisons are not helpful.

Of the 14 member states of the Southern Regional Education Board, so far only the superintendents of education in Florida, Tennessee, and Virginia have announced that they will participate in the pilot program, which is jointly sponsored by the SREB and the National Assessment of Educational Progress.

“There is some feeling that the states have the constitutional authority for dealing with education and not the federal government--and NAEP is viewed, in some circles, as an arm of the federal government,” said Thomas H. Fisher, administrator of the Florida Education Department’s student-assessment section.

But political leaders and educators who support the project maintain that an accurate national standard is critical to the continuation of current reform efforts in education.

The voluntary testing program will, for the first time in the 15-year history of NAEP, allow a state-by-state comparison of student achievement among the SREB states that choose to participate in the project.

Under the pilot program, the participating states will be able to compare the reading ability of their 11th-grade students and to match their results with national and regional data from the 1983-84 National Assessment of Educational Progress, a Congressionally mandated and federally funded program that regularly surveys the educational attainments of 9-, 13-, and 17-year-old students. Testing for the pilot project will begin in March.

Currently, NAEP provides only a regional breakdown of test scores in comparison with the national standard set by the test.

A National Comparison

If states find the information the pilot project provides useful, the program may be expanded to include any state that wishes to participate, said Archie E. Lapointe, executive director of NAEP.

Approval of a nationwide state-based assessment, however, is still under consideration by the NAEP policy committee, according to Mr. Lapointe.

The U.S. Education Department is in the process of looking at a number of alternatives--including the national assessment--for providing achievement-test data on a state-by-state basis, said Emerson J. Elliott, director of the department’s issues analysis staff.

Last January, when Secretary of Education Terrel H. Bell released the department’s report, “State Performance Outcomes,” which ranked states’ educational performance based on various criteria, he said that the department would investigate the possibility of using the results of the national assessment for future reports. (See Education Week, Jan. 18, 1984.)

“We hope in future years we will be able to get [NAEP] broken down on a state-by-state basis,” the secretary said.

The report drew criticism from state officials for comparing states’ educational progress on the basis of the limited test data provided by students’ college-entrance exams.

Fear of a National Test

But the NAEP committee is considering the proposal carefully, Mr. Lapointe said, because a national testing program that provides state-by-state comparisons of students’ educational progress is a big step for the national assessment.

In fact, when NAEP was established in 1969, it was deliberately designed to exclude the possibility of comparing states, Mr. Lapointe explained.

“There has always been a fear that there will be a national test, and as a result of the test, a national curriculum,” he said.

The usefulness of state-by-state comparisons also was debated at that time, Mr. Lapointe said.

“The argument has been that each state is responsible for setting its own objectives and that a national test can’t measure those objectives because [the states] are unique.”

Those objections still hold true today, although to a lesser degree, Mr. Lapointe and SREB officials say.

Unfair Comparisons

Education officials in South Carolina opted out of the pilot project because they did not think it would add useful information to the state’s existing testing program, said Charlie G. Williams, state superintendent of schools.

“We have a testing program that is measured against our own particular curriculum objectives and it was our posture that [the SREB/NAEP project] would not provide any additional information,” Mr. Williams said.

An aversion to state-by-state comparisons also figured in the decision, Mr. Williams said; the officials took the position that differences in states’ “socioeconomic histories” make such comparisons inappropriate.

“We have said in education for years that you shouldn’t take Johnny’s test score and compare it to Mary’s,” he said. “And if it is not appropriate to compare one child to another, it would be even less appropriate to compare 40,000 children to another 40,000.”

The state, he added, is more interested in programs to assist in the actual instruction of students.

The cost of administering the pilot project also is a problem, some state officials said. A participating state must pay $25,000 to administer the test to about 2,000 of its students.

“Frankly, I don’t see the benefit of spending $25,000 to find out that Georgia is getting higher test scores,” said one education official in Alabama.

Competition as Incentive

But in spite of the objections, there has been a significant shift in attitudes toward state comparisons, Mr. Lapointe said.

“There are still some states that are very reluctant to discuss this option,” Mr. Lapointe said, “but the political and social climate has changed and today there are just as many states that will consider it.”

In addition to providing information to the public about how well school systems are doing, “there is the hope that more accurate, comparative data will provide incentive for states to compete and will encourage those that are successful,” Mr. Lapointe said.

Demonstrating Progress

Political leaders who have sponsored costly educational reforms are particularly interested in demonstrating progress, according to Winfred L. Godwin, president of the SREB.

At a meeting of the SREB board this summer, several governors of Southern states spoke in favor of the SREB/NAEP pilot project.

“As a governor who is extremely interested in trying to get one of the fifth- or sixth-poorest states in America to invest every extra penny it can in support of public education, it is virtually impossible for me to persuade the taxpayers to give up another penny unless I can show them results,” said Gov. Lamar Alexander of Tennessee.

“The more we can do to develop accurate and reasonable ways to measure whether students are learning and how well they are learning, the better system of education we’ll have,” he said.

Students in the southern region have scored below average on the national assessment since it was first administered in 1969.

Testing Problems

The need for “accurate and reasonable” measures of achievement is a primary reason the SREB worked with the national assessment to develop the pilot project, according to SREB officials.

Mr. Godwin maintains that there is a “missing link” in educators’ and policymakers’ ability to assess student achievement.

The lack of a truly “national” achievement test, one used by all states, has created a situation in which “no state in the country knows how its students’ achievement compares with that of other states,” he said.

Policymakers also face the problem of trying to provide reliable information about the quality of education from “norm-referenced” tests, which have lost “much of their public credibility and importance as the limitations of [the tests] are explained,” Mr. Godwin said.

Norm-referenced tests, which rank students on their performance relative to that of other students rather than against a predetermined standard of correct answers, present several problems for accurate assessment of student achievement, SREB officials maintain.

Because each testing company establishes a standard based on the test results of its own particular group of students, state testing officials have found that percentile rankings of their students’ performance may vary a great deal from test to test, the SREB reports in a recent study called “Measuring Educational Progress in the South: Student Achievement.”

Significant fluctuations in test scores also may occur when tests are “re-normed,” which testing firms usually do every five to seven years. In several SREB states, students taking a norm-referenced test in 1983 actually are compared with a group of students who took the test in 1977, the report states.
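To see the arithmetic behind that caution, consider a minimal sketch, using hypothetical score distributions rather than data from the report, of how a percentile rank is computed against a norm group, and how the same raw score can land at a very different percentile when the norm group changes, as it does after a re-norming:

def percentile_rank(score, norm_group):
    # Percentage of the norm group scoring at or below the given score.
    at_or_below = sum(1 for s in norm_group if s <= score)
    return 100.0 * at_or_below / len(norm_group)

# Hypothetical norm groups: students tested in 1977 vs. a stronger 1983 cohort.
norms_1977 = [48, 52, 55, 58, 60, 63, 65, 70, 74, 80]
norms_1983 = [55, 58, 62, 64, 67, 70, 73, 77, 81, 86]

raw_score = 65
print(percentile_rank(raw_score, norms_1977))  # 70.0
print(percentile_rank(raw_score, norms_1983))  # 40.0

In this illustration, the same raw score of 65 ranks at the 70th percentile against the older norm group but only the 40th against the newer one, which is the kind of swing the SREB report describes.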

“Criterion-referenced” tests, such as minimum-competency tests, also do not provide adequate information about achievement levels, the report argues.

The tests are designed to measure “minimums,” according to the SREB, and do not indicate students’ actual level of achievement, particularly in performing higher-order skills.

In addition to allowing state-by-state comparisons, the national assessment, even in the limited form that will be used for the SREB/NAEP pilot project, will test students at all levels of achievement and will provide state leaders with current norms, as well as a single national norm, SREB officials say.

A version of this article appeared in the August 22, 1984 edition of Education Week as “Plan for State-by-State Comparison of Scores Meets Resistance.”
