Three years after a federal law required states to collect a host of education data, much of that information and more will now be available in one place—giving the public a newfound resource and giving educators headaches over how schools can be compared.
On a free Web site to be launched this week, a public-private partnership will post test scores, school spending, student demographics, and other relevant data. The site will feature research tools that allow users to compare achievement across districts, track districts’ and individual schools’ progress in reaching student-achievement goals under the federal No Child Left Behind Act, and find schools and districts that may be outperforming others.
The site—www.schoolmatters.com—also will give people ways to measure whether school spending translates into student learning, as well as the chance to compare schools’ effectiveness.
Developed by Standard & Poor’s School Evaluation Services with help from the Council of Chief State School Officers, the project marks a new era of so-called transparency of school-related data, some analysts say.
“It’s a significant step,” said Chrys Dougherty, the research director for the National Center for Educational Accountability. The Austin, Texas-based group has worked on similar projects in the past, but is not involved in this one.
“This is taking information that states already have and making it more accessible to the public,” Mr. Dougherty said.
But some educators have questioned whether the Web site provides a fair way to compare schools, especially in the section that calculates a school’s “return on spending.”
Standard & Poor’s—a New York City-based division of the McGraw-Hill Cos. known for its research on stocks and bonds—delayed the launch of the site by two months while its staff responded to complaints from the Washington-based CCSSO that centered on the spending index. (“Albeit Late, State Data to Go Online in March,” Feb. 9, 2005.)
State officials said that they would be watching how critics and supporters of public schools use the data to bolster arguments that specific schools should or shouldn’t get more money.
“We still have concerns about it as a simple method of determining a school’s efficiency,” Lisa Y. Gross, a spokeswoman for the Kentucky Department of Education, said of the spending index. “In business, you can do that. Schools are not that simple.”
Yet even though state leaders don’t always think the bang-for-the-buck debate is fair, they realize it’s going to occur whether or not Standard & Poor’s puts those measures on the site, one state official said.
“A lot of our members understand that this was the inevitable next step in the way data is reported,” said Scott S. Montgomery, the CCSSO’s chief of staff.
For their part, the site’s developers say it will help educators and parents find solutions to their problems by pointing to schools with similar demographics and spending patterns that are doing better at raising achievement.
“The point of this … is to figure out where they’re doing something right and what can we learn from that,” said Paul Gazzerro, a director of Standard & Poor’s School Evaluation Services, which collected and organized the data on the site.
The new site builds on an existing database of states’ student-achievement data by adding new information and features. That earlier database—www.schoolresults.org—was completed last year with funding from the U.S. Department of Education and the Broad Foundation, a Los Angeles-based philanthropy that supports efforts to improve education.
Broad joined with the Bill & Melinda Gates Foundation in underwriting the new site with $45 million. That funding is to last until early 2007. After that, the funders expect states to pay for the project to continue.
The www.schoolmatters.com site was scheduled to launch on March 29 at 7 a.m. Eastern Time. All users of www.schoolresults.org were expected to be redirected to the new site.
Schoolresults.org provided data on student demographics, published scores on state tests for every school in a state, and listed whether each school was making adequate yearly progress toward achieving student proficiency in reading and mathematics under the No Child Left Behind Act. Only Nebraska refused to provide data to the site.
That earlier site offered the ability to compare a school’s achievement with that of others with similar demographics.
By comparison, the new site collects a wealth of data and offers several new tools that help users analyze the information. In addition to all the data on the previous site, it includes:
• School spending amounts, with estimates of how much a district spends on instruction;
• State and district scores on the SAT and the ACT and participation rates on those college-admission tests;
• “School environment data,” such as class sizes, pupil-teacher ratios, and student-suspension and -retention rates;
• Community information, such as income levels, property values, and educational attainment of adults in the area; and
• Teacher compensation.
Schoolmatters.com also includes new tools that help users compare district and school performance with that of others having similar backgrounds and offers several indexes that help quantify school success.
One tool identifies schools that are outperforming others with similar demographics. A new index—called RAMP—combines reading and mathematics proficiency and measures how close a school, district, or state is to the No Child Left Behind law’s target of 100 percent student proficiency in those two subjects by 2014.
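The article does not publish S&P’s actual RAMP formula. As a toy illustration only—assuming the simplest possible combination, an average of the two subjects’ proficiency rates—the idea of “distance to the 2014 target” could be sketched like this:

```python
def ramp_index(reading_pct, math_pct):
    """Toy combined reading-and-math proficiency figure.

    Hypothetical sketch: S&P's real RAMP methodology is not
    described in the article. This version simply averages the
    two subjects' proficiency rates and reports the remaining
    gap to the 100 percent proficiency target.
    """
    combined = (reading_pct + math_pct) / 2
    gap_to_goal = 100 - combined
    return combined, gap_to_goal

combined, gap = ramp_index(68.0, 62.0)
# combined proficiency of 65.0 percent, 35.0 points short of the target
```

Whatever the actual weighting, the appeal of such an index is that it reduces two separate accountability numbers to a single distance-to-goal figure that can be tracked year over year.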
The most controversial index quantifies a school’s or district’s “return on spending.” It uses student-achievement and school-finance data to calculate a figure that suggests whether a school or district is spending its money effectively.
During the development of the Web site, CCSSO members complained that the index could unfairly label schools. A low-performing school that’s received an influx of resources, for example, may score low on the index even though the extra money may be helping achievement.
As S&P produced the site, state officials expressed concerns about the data on it and how the data were presented, Mr. Montgomery of the CCSSO said.
The research firm made more than 100 changes to the Web site from its original version, Mr. Montgomery said. Some were minor corrections to posted data; others changed how the site presented the indexes so they didn’t conflict with previously released data.
For example, in earlier versions of the site, the RAMP index appeared when a user requested information about state student performance. Mr. Montgomery said that juxtaposition would confuse users, because states report test scores by individual subject and grade level.
Now the layout separates the RAMP index from the test scores that states report on their own.
“It reshapes the look of the site so it doesn’t conflict with what states have previously reported,” Mr. Montgomery said.
While most state education officials endorse the resulting version, concerns persist.
“You don’t have the context in there,” said Andy Tompkins, the Kansas commissioner of education and a CCSSO board member. “It might not represent it in a way that we might have represented it.”
Still, Mr. Gazzerro of S&P’s School Evaluation Services said he expects that the site’s users will investigate further when the data provide results that are either too good or too bad to be true.
“Sometimes the data tell the whole story,” he said. “Sometimes you have to find out more.”