Opinion

Papering Over the Reality of Education

By Laura Hersh Salganik & M. William Salganik — January 25, 1984

Despite the less-than-splashy title (“State Education Statistics--State Performance Outcomes, Resource Inputs and Population Characteristics, 1972 and 1982”), the latest federal education “report” has attracted an extraordinary amount of attention.

The report itself--a ranking of the 50 states and the District of Columbia in terms of education spending and student performance--is not a book or a pamphlet; it’s a poster that measures 3 feet by 4 feet.

Secretary of Education Terrel H. Bell, when he released it this month, described the report as “the most comprehensive data available on the performance of state educational systems.” Newspapers rushed to report how their states’ schools compared to those in other states on test scores, student-teacher ratios, per-pupil spending, and other measures. In all, the report compares the states and the District of Columbia in 14 categories.

When he released the report, Secretary Bell cautioned reporters and others to take care in interpreting the numbers, yet he had the first crack at interpretation, and he failed to heed his own warning. What he read from the chart is what the Reagan Administration has been saying all along: Public schools are doing badly, but more money--especially more federal money--won’t help. The first of these propositions has proven politically popular (and this is an election year), and the second absolves the President of responsibility to resolve the “crisis.”

Mr. Bell’s comments, asserting that the states spending the most are not necessarily providing the best education, are not surprising. Neither is the attention the report has garnered from the media. Hope springs eternal, especially when it comes to making difficult decisions based on numbers.

Many newspapers printed the results on page one, and the articles were often accompanied by charts and graphs. “How the States’ Schools Stack Up” read a USA Today headline over a story with a salmon-colored table. “State Schools Flunk” was the headline in The Atlanta Constitution over an article that called the report “exhaustive.” The Baltimore Sun emphasized Secretary Bell’s own conclusion in its headline--“Bell School Data Show Little Link Between Quality and Spending.”

But most of the data in the “new” report are not even new. Except for state-by-state results on the American College Testing (ACT) exams, everything on the poster is published annually in two books of statistical tables issued by the U.S. Education Department itself. These books are available in any good library. That the “report” got such attention is less a reflection of what it contained than of the fact that news media do not react to the publication of thick books of statistical tables the way they react to press conferences by Cabinet secretaries.

The poster is different from the usual statistical tables chiefly in that, in addition to reporting, for example, the student-teacher ratio for each state, it includes a rank for each state.

But what do these 2,856 numbers (28 for each of the 50 states and the District of Columbia in each of the two years, 1972 and 1982) tell us? Less than it might seem. The choice of information on the poster, and the manner in which it is presented, can easily lead to misinterpretation.

Here are some of the poster’s pitfalls:

  • The rankings are a deceptive artifice because, on most of the measures used in the report, most states are bunched in the middle. For example, the poster ranks the 22 states in which most high-school seniors took the Scholastic Aptitude Test (SAT) of the College Board in 1982. Pennsylvania ranked 13th on the poster, the same as its 1972 ranking on average SAT scores. Yet, if Pennsylvania students had scored just six points higher in 1982--a change of no educational significance--the state would have “improved” its ranking to 8th.
  • The financial figures used in the report are not adjusted for inflation over the 10-year period covered, and other factors that would alter their significance are not taken into account.

    The New York Times, for example, in its report on the poster, noted that “Teachers’ [average] salaries in New York rose from $11,000 annually to $20,000.” In constant 1972 dollars, however, the 1980-81 New York salary of $20,000 would be worth only about $9,500. (This figure was calculated using the Consumer Price Index; a rough sketch of the conversion appears below.)

    And the cost of living does not just vary over time, it also varies from place to place. For example, the federal Bureau of Labor Statistics calculated in 1981 that an “intermediate” budget for a family of four in Anchorage was 26 percent above the U.S. urban average. In the New York metropolitan area, the budget was 16 percent above the U.S. average. In Atlanta, by contrast, the family budget was 8 percent below the U.S. average. Is it any surprise, then, that Alaska and New York rank high in teachers’ salaries (and first and second in per-pupil spending), while Georgia ranks low?

    Besides the cost of living, there are other factors--such as the amount of transportation and heat needed in an area--that influence how much a state must spend to run the schools.
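
    The conversion is simple enough to check. A minimal sketch follows, assuming round Consumer Price Index values (annual CPI-U averages on the 1982-84 = 100 base); the exact figure depends on which months are used, but every reasonable choice tells the same story: the real value of the salary fell.

    ```python
    # Rough sketch: deflating the nominal 1980-81 New York salary to 1972 dollars.
    # The index values are assumed annual CPI-U averages (1982-84 = 100 base).
    CPI_1972 = 41.8           # assumed 1972 annual average
    CPI_1980_81 = 86.6        # assumed midpoint of 1980 (82.4) and 1981 (90.9)

    nominal_salary = 20_000   # reported 1980-81 New York average salary
    real_salary_1972_dollars = nominal_salary * CPI_1972 / CPI_1980_81

    print(f"${real_salary_1972_dollars:,.0f} in constant 1972 dollars")
    # Prints about $9,650 -- close to the $9,500 cited above, and well below
    # the $11,000 that New York teachers were paid a decade earlier.
    ```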

  • The two tests used in the rankings--the SAT and ACT--provide very limited information about what students have learned in school, and the tests are difficult to compare with each other.

    Neither test is taken by all seniors. The rate in each state depends both on the proportion of students who plan to attend college and on entrance requirements of colleges. Among ACT states listed on the poster, this proportion varied from over two-thirds in Colorado, Illinois, Mississippi, and Nebraska to less than one-third in Minnesota and Wisconsin.

    Declines in SAT and ACT scores during the 1970’s are well documented and have been highly publicized. Yet, during the same period, elementary students’ scores on reading tests have been increasing. Between 1975 and 1980, the reading scores of 9-year-olds on the National Assessment of Educational Progress rose 2.6 percent. These scores, however, are not available on a state-by-state basis and could not be used in the rankings.

    In addition, in the states in which the ACT is taken, academic achievement generally is lower than that in SAT states. Aaron Pallas, a sociology graduate student at The Johns Hopkins University, worked out a rough formula equating ACT scores to SAT scores, using the scores of 1,000 students who took both tests. (The students in his group make up about 5 percent of the sample used in the National Longitudinal Study of the high-school class of 1972.) Using this formula, Florida would have ranked 5th among the 28 ACT states, making it look much more successful (with no more achievement) than it does with its 9th-place ranking among 22 SAT states.
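
    Mr. Pallas’s rough formula is not reproduced here, so the sketch below can only illustrate the general technique--linear equating from a sample of students who took both tests. The paired scores and the resulting conversion are hypothetical, not his.

    ```python
    # Illustrative only: linear equating of ACT composite scores onto the SAT scale.
    # The paired sample below is hypothetical; it is NOT Pallas's data or formula.
    from statistics import mean, stdev

    # Hypothetical students who took both tests: (ACT composite, SAT total)
    paired_scores = [(15, 700), (18, 810), (20, 880), (23, 990), (26, 1100), (29, 1210)]
    act_scores = [a for a, _ in paired_scores]
    sat_scores = [s for _, s in paired_scores]

    # Linear equating matches the means and standard deviations of the two scales:
    #   SAT_equivalent = mean(SAT) + [sd(SAT) / sd(ACT)] * (ACT - mean(ACT))
    slope = stdev(sat_scores) / stdev(act_scores)
    intercept = mean(sat_scores) - slope * mean(act_scores)

    def act_to_sat(act_composite: float) -> float:
        """Place an ACT composite score on an approximate SAT scale."""
        return intercept + slope * act_composite

    # A state's average ACT score can then be placed on the SAT scale and compared
    # directly with the SAT states -- the kind of comparison behind the Florida example.
    print(round(act_to_sat(18.9)))  # hypothetical state average ACT composite
    ```

    Any such conversion is rough, and a different sample would give different coefficients; the point is simply that a state’s apparent standing shifts depending on which pool it is compared with.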

  • Americans love decades as much as they love rankings, but 10-year comparisons do not accurately reflect long-term trends or recent changes. Everything on the poster was based on a 10-year comparison (1972 and 1982) and showed that the two “output” measures--test scores and high-school completion rates--were dropping. Using the most recent four years, however, the SAT picture looks quite different. Between the school years ending in 1980 and 1983, the national SAT average rose slightly, from 890 to 893. And although the graduation rate has declined over the past 10 years, it has risen dramatically over the past 50 years.

    The point is not that any of these figures is better or more true than the others, but only that two points in time are hardly adequate to provide a “benchmark” for “perhaps the greatest and most broad-based effort at educational reform in American history.” But “benchmark” is the term Mr. Bell used to describe the report in his accompanying press release.

  • Presenting only the state averages in spending and “outcome” ignores variation within the states.

    Montana appears on the poster as a state in which per-pupil spending is slightly above average. Yet in 1979-80, when the state’s average per-pupil spending was $1,849, about 25 percent of its students were enrolled in districts that spent over $2,600 per pupil. About 15 percent were enrolled in districts that spent less than $1,400 per pupil.

    Similarly, there is variation within states on “outcome” measures. For the 1982-83 school year, while the overall SAT average in Maryland was 893--precisely equal to the national average--averages for individual districts ranged from 722 (in the city of Baltimore) to 968 (in wealthy suburban Montgomery County).

Mr. Bell’s comments during the press conference are a good example of the misinterpretation the poster has inspired. He cited Idaho, New Hampshire, and South Dakota as states in which per-pupil spending is low but achievement is high. Certainly, Mr. Bell knew this statement would appear in the newspapers. But after converting their ACT scores to SAT scores, South Dakota and Idaho (both low-spending states) rank 23rd and tied for 26th, respectively. This is hardly outstanding. New Hampshire--an average-spending state--ranks first in SAT scores. And New Hampshire illustrates a point that Mr. Bell did not stress, but which has been demonstrated again and again in educational research: The best predictors of test achievement are income and race. New Hampshire, first in test scores, ranks 50th in terms of percentage of school-age children from “poverty” families and 49th in proportion of minority students.

In fact, the states with the five highest SAT/ACT averages--Wisconsin, Minnesota, Iowa, New Hampshire, and Oregon--are all average-spending or high-spending states with relatively low percentages of minority or poor students.

Conversely, the states with the five lowest SAT/ACT averages--Mississippi, South Carolina, Louisiana, West Virginia, and Alabama--are all low-spending and are all in the top five in terms of either poverty or minority enrollment.

The overwhelming relationship between background and achievement points to a fundamental problem with attempts to evaluate schools on a simple “scorecard” basis--that is, what is the definition of good performance?

Undoubtedly, there will be suggestions about ways to use numbers differently to make the comparisons more fair. There will be discussions about how to adjust for cost-of-living or student characteristics, and about whether to use states, districts, or students as what statisticians call “the unit of analysis.”

But sophisticated use of the statistics will not answer the question of how to rate school systems. Such efforts quickly degenerate into debates between statisticians that no one else can understand, and they divert attention from the more difficult question of what we really want out of our schools.

Do we want all states to have the same average (the goal suggested by Mr. Bell’s report), or do we want all students to learn the basic skills, or do we want all students with the same “potential” to have the same achievement? Do we want all students to graduate from high school, or do we want higher graduation standards? Do we want all states to have the same average per-pupil spending, or do we want states in which students are more difficult to educate to spend more per pupil? Do we want all states to have the same goals, or do we want each state to set its own?

Decisions about the goals of schooling will hardly be easier to make now that the Education Department has produced a poster for all to see. These questions will ultimately be answered not with numbers, but with politics.

In fact, although the poster won’t tell us how our schools are doing, it can contribute to a political phenomenon to which the Administration might rather not draw attention. It can encourage states to step up efforts to reassert decision-making authority they have traditionally delegated to local districts. It is certainly unrealistic to expect the states to sit idly by while the results of decisions made at the local level lead to highly publicized federal rankings of states.

The Florida Times-Union, a Jacksonville paper, reported after the release of the report that Florida’s education commissioner, Ralph Turlington, said “the report legitimizes Florida’s strategy of comparing itself with other states in its attempt to improve public schools.” Once that strategy is really accepted, we can expect to see more schools run like those in Florida--one of the most centralized states in the country. Is that what Secretary Bell really wants?

A version of this article appeared in the January 25, 1984 edition of Education Week as “Papering Over the Reality of Education.”
