Federal

New Uses Explored for ‘Value Added’ Data

By Debra Viadero — May 28, 2008

With “value added” methods of measuring student-learning gains continuing to grow in popularity, policymakers and researchers met here last week to explore possible new ways of using the sometimes controversial approaches and to debate their pluses and pitfalls.

The May 23 conference at the Urban Institute, a think tank based here in the nation’s capital, examined the policy implications of value-added statistical designs, which typically measure students’ learning gains from one year to the next. Such methods have been spreading since the early 1990s.

While value-added designs are still imperfect technically, various speakers at the gathering said, they can, for example, provide new information to help identify ineffective teaching and gauge the impact of particular programs and practices. The data they provide can help educators reflect on their own practices, give administrators grounds for denying tenure to poorly performing teachers, or be used by states to calculate whether districts are making adequate yearly progress under the federal No Child Left Behind Act.

And value-added models can answer such important research questions as what makes a good teacher and whether problems in retaining early-career teachers actually harm or help schools, speakers said.

Yet when it comes to high-stakes decisions, supporters, critics, and scholars of value-added research models seemed to agree on one point: Value-added calculations, if they’re used at all, should be one among several measures used in judging the quality of schools or teachers.

“Assessment results are one critically important measure,” said Ross Wiener, the vice president for programs and policy at the Education Trust, a Washington-based research and advocacy group that focuses on educational inequities. “There are other things that teachers do that are important.”

Last week’s Urban Institute event piggybacked on an April conference at the University of Wisconsin-Madison, where researchers aired technical cautions about value-added research methodology and shared some other research supporting its usefulness. (“Scrutiny Heightens for ‘Value Added’ Research Methods,” May 7, 2008.)

An organizer of the Wisconsin meeting said at the Washington event that the limitations of value-added designs should be kept in perspective. Both the Washington conference and the Wisconsin gathering that preceded it were sponsored jointly by the Carnegie Corporation of New York, the Joyce Foundation, and the Spencer Foundation. (All three philanthropies underwrite coverage in Education Week.)

“I ask you not to lose sight of what I think is the main message,” said Adam Gamoran, the director of the Madison-based Wisconsin Center for Education Research, “which is that value-added models are better than the alternatives.”

Measuring Change

When it comes to accountability efforts, the alternatives for most education systems are techniques that rely on snapshots of student achievement at a single time, such as percentages of students who meet state academic targets.

The theoretical appeal of value-added accountability systems, which measure learning gains from one year to the next, is that educators would get credit only for the progress students made in their classrooms and not get penalized for the learning deficiencies that students brought with them to school.
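The distinction can be illustrated with a toy calculation. The sketch below uses invented scores and a bare-bones gain measure; real value-added models layer on statistical controls for student background and measurement error, which this example deliberately omits. It shows only why a snapshot measure and a gains measure can rank the same classrooms in opposite order.

```python
# Toy illustration (invented scores, simplified model): a snapshot
# proficiency rate vs. a simple mean-gain measure. Real value-added
# systems are far more elaborate; this only shows the basic contrast.

PROFICIENCY_CUTOFF = 70  # hypothetical state proficiency target

def snapshot_rate(end_scores):
    """Share of students at or above the cutoff this year (snapshot)."""
    return sum(s >= PROFICIENCY_CUTOFF for s in end_scores) / len(end_scores)

def mean_gain(start_scores, end_scores):
    """Average year-over-year learning gain per student (value-added-style)."""
    return sum(e - s for s, e in zip(start_scores, end_scores)) / len(start_scores)

# Classroom A starts high and coasts; Classroom B starts low and grows.
a_start, a_end = [85, 88, 90], [84, 87, 91]
b_start, b_end = [40, 45, 50], [60, 66, 72]

print(snapshot_rate(a_end), snapshot_rate(b_end))   # A looks far better
print(mean_gain(a_start, a_end), mean_gain(b_start, b_end))  # B looks far better
```

Under the snapshot measure, Classroom A is fully proficient and B is not; under the gains measure, A shows essentially no growth while B shows large gains, which is the sense in which educators "get credit only for the progress students made in their classrooms."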

In practice, though, various value-added models are proving controversial. A case in point is the New York City school system’s efforts to use such techniques to rate schools and evaluate teachers’ job performance, noted Leo E. Casey, the vice president for academic high schools of the 200,000-member United Federation of Teachers, the local teachers’ union.

At the conference, Mr. Casey faulted the school system’s teacher-evaluation project for relying on scores from tests taken by students in January, for failing to take into account the fact that students are not randomly assigned to classes, and for employing statistical calculations that he said are unintelligible to nonstatisticians.

“It’s really important that teachers, students, and parents believe the system on which they are being graded is a fair system,” he told conference-goers.

Opposition from his group, which is an affiliate of the American Federation of Teachers, and other teachers’ unions led New York state lawmakers in April to legislate a two-year moratorium on any efforts by districts to link student-performance data to teacher-tenure decisions. In the meantime, a state task force will be formed to study the issue.

Teacher Characteristics

Studies that try to identify which characteristics of teachers are linked to students’ learning gains are another, less controversial use of value-added methodology. Do veteran teachers do a better job, for example, than novices?

Studies examining such questions have shown that, while experience has proved to be important in some ways, possession of other credentials, such as a master’s degree, seems to have no impact on student performance, according to Douglas N. Harris, an assistant professor of educational policy studies at the Wisconsin research center.

Given the cost of a master’s degree—about $80,000, by his calculations—value-added methods might be a less expensive way to reward good teachers and signal which ones a school system ought to hire, Mr. Harris suggested.

“But we still need a path to improvement, and existing credentials might serve that function,” he said.

Value-added research models can also provide more information than experimental studies about the long-term effectiveness of particular programs or interventions in schools, said Anthony S. Bryk, a Stanford University scholar who is the incoming president of the Carnegie Foundation for the Advancement of Teaching, based in Stanford, Calif.

Mr. Bryk is currently using the statistical technique to track the progress of a professional-development program known as the Literacy Collaborative in 750 schools. He said that, while randomized studies are considered the gold standard for research on effectiveness, they can’t provide information about the different contexts in which a particular program works, the range of effect sizes that are possible, or whether the improvements change over time.

“You can only get so far by weighing and measuring,” he said. “What I’m arguing for is the use of value-added models toward building a science of improvement.”

From Data to Decisions

Whether schools will know how to make use of data collected through value-added statistical techniques is an open question, however.

Daniel F. McCaffrey, a senior statistician in the Pittsburgh office of the Santa Monica, Calif.-based RAND Corp., studied 32 Pennsylvania school districts taking part in the first wave of a state pilot program aimed at providing districts with value-added student-achievement data in mathematics.

He and his research colleagues surveyed principals, other administrators, teachers, and parents in the districts involved in the program and compared their responses with those from other districts having similar demographic characteristics.

“We found it was really having no effect relative to the comparison districts,” Mr. McCaffrey said.

Even though educators, for instance, seemed to like the data they were getting and viewed the information as useful, few were doing anything with the results, he said. Twenty percent of the principals didn’t know they were participating in the study, Mr. McCaffrey said, noting also that the program was still young at that point in the evaluation process.

Despite such challenges, other speakers at the conference argued that the use of value-added methodology should become more widespread. Said Robert Gordon, a senior fellow at the Center for American Progress, a Washington think tank: “The way we will learn about implementation problems, I think, is to implement.”
