Federal

AERA Stresses Value of Alternatives to ‘Gold Standard’

Experiment not only route to solid findings, panel says.
By Debra Viadero — April 12, 2007 4 min read


At a time when federal education officials are holding up scientific experiments as a gold standard for studies in the field, a report by the nation’s largest education research group suggests there are also other methods that are nearly as good for answering questions about what works in schools.

The report, produced by a committee of scholars of the Washington-based American Educational Research Association, was released April 11 at the group’s 88th annual meeting here. It highlights ways in which researchers can use large-scale data sets, such as those maintained by the U.S. Department of Education, for analyzing cause-and-effect questions in education.


“It’s not a question of either-or,” said Barbara Schneider, the chair of the committee and a professor of educational administration and sociology at Michigan State University in East Lansing. “It’s about the importance of building capacity in our field. We need researchers who can use a variety of methods to answer appropriate research questions.”

Under the Bush administration, the Education Department has been promoting randomized-control trials in a campaign to transform education into an evidence-based field, not unlike medicine, and improve the quality of education research.

The AERA scholars, in their report, don’t argue with the value of rigorous experimentation for making causal inferences. Yet, they note, there are also times when such studies, which can involve randomly assigning students or classrooms to either an experimental or a control group, are not feasible or ethical. To test what happens when students repeat a grade, for instance, researchers can’t ask schools to randomly hold back some students while promoting others.

“Randomized-control trials are the gold standard,” said Richard J. Shavelson, a study author and a professor of education and psychology at Stanford University. “But they have limitations, and there are a lot of excellent data sets available that can also be used and that don’t necessarily fit with the randomized-control-trial model.”

Statistical Techniques

The problem with some studies that draw on large-scale observational data sets, such as the Education Department’s High School and Beyond Study or the National Education Longitudinal Study, is that researchers fail to statistically account for preexisting differences between the groups under study.

One example: Students who attend private schools might come from wealthier homes, and start out with greater educational advantages, than their public school counterparts. Such differences make simple comparisons between the two groups suspect.

In recent decades, though, with advances in high-speed computing and the importing of research techniques pioneered in fields such as economics, reliable methods for reducing potential biases between study groups have become more accessible to education researchers. In their report, the AERA researchers highlighted four such methods:

  • Fixed-effects models, which involve adjusting for unmeasured characteristics that don’t change over time, such as the impact of a mother’s personality on children in the same family;
  • Instrumental-variables analyses, which rely on characteristics that are linked with the treatment but affect the outcome only through it;
  • Propensity scoring, a method that calls for building statistical profiles that predict the probability that individuals with certain characteristics will be part of a treatment group, and testing results against alternative hypotheses; and
  • Regression-discontinuity analyses, a technique in which researchers compare subjects who fall just below or just above some cutoff point, such as a proficiency level on a standardized test.
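To make the last of these concrete: a regression-discontinuity analysis can be sketched in a few lines. The example below is not from the report; the data, cutoff, and effect size are all simulated. A line is fit on each side of the cutoff, and the gap between the two fits at the cutoff estimates the effect of crossing it.

```python
import random

random.seed(0)

CUTOFF = 60.0     # hypothetical proficiency cutoff on a 0-100 test
BANDWIDTH = 5.0   # only compare students scoring near the cutoff

# Simulated students: the outcome rises smoothly with the test score,
# plus a +4-point jump for those at or above the cutoff (the effect
# the method should recover). All numbers here are made up.
data = []
for _ in range(20000):
    score = random.uniform(0.0, 100.0)
    jump = 4.0 if score >= CUTOFF else 0.0
    outcome = 0.5 * score + jump + random.gauss(0.0, 2.0)
    data.append((score, outcome))

def fit_line(points):
    """Ordinary least-squares slope and intercept."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    slope = sxy / sxx
    return slope, my - slope * mx

below = [(x, y) for x, y in data if CUTOFF - BANDWIDTH <= x < CUTOFF]
above = [(x, y) for x, y in data if CUTOFF <= x < CUTOFF + BANDWIDTH]

# Students just on either side of the cutoff are nearly comparable,
# so the gap between the two local fits, evaluated at the cutoff,
# estimates the causal effect of crossing it.
b_slope, b_int = fit_line(below)
a_slope, a_int = fit_line(above)
effect = (a_slope * CUTOFF + a_int) - (b_slope * CUTOFF + b_int)
print(round(effect, 1))  # should land near the simulated +4-point jump
```

Note that simply comparing mean outcomes above and below the cutoff would be biased here, because the outcome also rises with the score itself; fitting a line on each side absorbs that trend.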

While all four methods have their own drawbacks, the researchers say, they also represent an improvement over most of the techniques, such as simple correlational studies, that researchers relied on to make sense of the data in those large-scale studies. Researchers need to know which methods are appropriate for answering which kinds of questions, according to the report.
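In the spirit of the propensity-scoring idea, the sketch below shows why adjusting for what drives treatment assignment beats a naive comparison, using the private-versus-public school example above. Everything is simulated, and a single made-up covariate (family income) stands in for a full propensity model, since assignment here depends on income alone.

```python
import random

random.seed(1)

# Simulated students: family income drives both private-school
# attendance and the outcome, so a naive comparison is biased.
# The true private-school effect is set to +2 points; every
# number and variable name here is hypothetical.
students = []
for _ in range(50000):
    income = random.random()               # 0 = lowest, 1 = highest
    private = random.random() < income     # richer -> more likely private
    outcome = 10.0 * income + (2.0 if private else 0.0) + random.gauss(0.0, 1.0)
    students.append((income, private, outcome))

def mean(xs):
    return sum(xs) / len(xs)

# Naive comparison: mixes the school effect with the income advantage.
naive = (mean([o for i, p, o in students if p])
         - mean([o for i, p, o in students if not p]))

# Stratify on income (a stand-in for the propensity score, which is
# monotone in income here) and average the within-stratum differences.
diffs, weights = [], []
for b in range(10):
    lo, hi = b / 10, (b + 1) / 10
    stratum = [(p, o) for i, p, o in students if lo <= i < hi]
    priv = [o for p, o in stratum if p]
    pub = [o for p, o in stratum if not p]
    if priv and pub:
        diffs.append(mean(priv) - mean(pub))
        weights.append(len(stratum))

adjusted = sum(d * w for d, w in zip(diffs, weights)) / sum(weights)
print(round(naive, 1), round(adjusted, 1))
```

The naive gap comes out well above the true +2 points because it also captures the income difference between the groups; the stratified estimate lands much closer to it. Real propensity-score analyses model the probability of treatment from many covariates at once, but the comparison-within-similar-groups logic is the same.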


“There are lots of people who, with limited information, try to make causal inferences leading to major policy directions,” said William H. Schmidt, a study author and an education professor at Michigan State. “This is an attempt to say there are principles for this.”

Speaking to the Field

Titled “Estimating Causal Effects: Using Experimental and Observational Designs,” the 142-page consensus report was produced by the research group’s grants board, an expert panel created 17 years ago with the aim of building the field’s capacity for conducting quantitative analyses.

Panel members said they undertook the study project in 2003 at the behest of the National Science Foundation, which along with the Education Department’s National Center for Education Statistics underwrites the AERA grants board’s work. Leading methodologists outside the board also reviewed and critiqued drafts of the study, the authors said.

Panelists said the report, which the research group hopes to make available for free on its Web site, can provide guidance to policymakers and the news media, as well as to colleges of education and other researchers.

“One of the things this monograph does is it really speaks to our field,” said Anthony S. Bryk, a Stanford education professor who commented on the panel’s recommendations at the April 9-13 meeting. He noted that many policy analyses in education are now done by scholars outside the field, such as economists or think tank researchers.

“We need people who can do this kind of work at the same level of expertise and skill as people in schools of public policy,” Mr. Bryk said, “but who want to work in colleagueship with people who have deep understanding about schools and how they work.”

“Otherwise,” he added, “we’ll end up with elegant studies that reach wrong conclusions.”


Coverage of education research is supported in part by a grant from the Spencer Foundation.
A version of this article appeared in the April 18, 2007 edition of Education Week
