
Campbell Collaboration Seeks To Firm Up ‘Soft Sciences’

By Debra Viadero — April 03, 2002

The pursuit of scientific inquiry has long known no national boundaries. So it’s not uncommon for similar studies on, say, an effective treatment for tuberculosis or a drug to combat breast cancer to turn up in any number of countries where scientists do their work.

It wasn’t until the 1970s, though, that a British epidemiologist named Archie Cochrane hit on the idea of gathering up all those studies, analyzing the findings, and distilling nuggets of scientific truth to guide the decisions that practitioners make in the field.

More information on The Campbell Collaboration is available at http://campbell.gse.upenn.edu.

Dr. Cochrane’s idea gave birth years later to the Cochrane Collaboration, an international organization based in the United Kingdom that systematically reviews findings from randomized experiments in health care. Since its 1993 founding, the group has made media splashes worldwide with reports on everything from the effectiveness of daily doses of aspirin in preventing heart disease to whether regular mammograms improve women’s cancer-survival rates.

Now, an international group of scientists and policymakers is working to do the same for the “soft sciences,” including education research. The aim of the Campbell Collaboration, which recently held its second annual meeting here at the University of Pennsylvania, is to find out what research has to say about the myriad interventions tried in the social sciences, and then translate those results into recommendations for policymakers, public administrators, educators, police agencies, and social workers.

Named for the late Donald T. Campbell, the American social scientist who called for an “experimenting society,” the group formed three years ago with backing from foundations and government agencies.

“We’re trying to stand on the shoulders of the Cochrane Collaboration,” said Robert F. Boruch, the Penn professor who launched the Campbell group. “The emphasis is on evidence right now, and we’re in the right position to do something about it.”

Demanding Proof

The new interest in “evidence-based” social policy was apparent at the group’s February meeting. More than 200 researchers and policymakers from 16 countries showed up for the two-day event hosted by Penn’s graduate school of education—more than double the number that came to the group’s first official meeting, held in Stockholm.

The collaboration is gathering steam just as policymakers on both sides of the Atlantic Ocean are stepping up their demands for empirical proof that programs or approaches work.

Perhaps nowhere is that emphasis more apparent than in the United States, where such pressure increasingly is being felt in education. Federal lawmakers laced the new “No Child Left Behind” Act of 2001 with phrases such as “scientifically based research” and required states and school districts to rely on research in choosing, for example, school improvement programs and professional-development lessons.

In Britain, Prime Minister Tony Blair, who was elected after promising to build an “evidence-based” government, has even formed a Cabinet-level agency to guide that effort. And the Swedish government last year launched an initiative to better incorporate empirical knowledge into social work practice.

“If you look across the European scene over the last two years, there is a growing interest and conscientiousness among practitioners, policymakers, and researchers to make decisions and practices that are more research-based,” said Haluk Soydan, a social work professor from Sweden who co-chairs the Campbell steering group with Mr. Boruch.

And although cultures vary from nation to nation, what’s true for Sweden or Canada in the social sciences sometimes applies in other countries, too.

“Scientists say that similarities between a great many of our societies are augmenting rather than diminishing,” Mr. Soydan noted.

Like the Cochrane Collaboration, the Campbell Collaboration uses a statistical technique known as meta-analysis to synthesize findings. Traditionally, scientists looking for a research consensus on a particular intervention merely tallied the positive or negative findings from studies. But statistical experts said that method, known as “vote counting,” often yielded the same, lukewarm conclusion: “Results are inconsistent.”

“That’s where researchers shot themselves in the foot. They didn’t know how to synthesize studies, and constantly undersold what the research had to say to policymakers,” said Harris M. Cooper, a University of Missouri psychologist who heads Campbell’s methods group.

He said meta-analyses, in comparison, give a finer-grained look that takes into account the size of each individual effect, the consistency of those effects, and the confidence that researchers can place in them.
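To make the contrast concrete, the sketch below works through both approaches on made-up numbers. It is written in Python, and the effect sizes and standard errors are hypothetical placeholders rather than results from any Campbell or Cochrane review; the point is simply that a vote count keeps only each study's verdict, while inverse-variance (fixed-effect) pooling, one common form of meta-analysis, also uses each estimate's size and precision.

```python
# Illustrative sketch only: the study effects and standard errors below are
# hypothetical, not taken from any Campbell or Cochrane review.
import math

# (effect size, standard error) for several made-up studies
studies = [(0.30, 0.16), (0.10, 0.20), (-0.05, 0.25), (0.22, 0.12), (0.08, 0.18)]

# "Vote counting": tally which studies are individually significant at the 5% level.
positive = sum(1 for d, se in studies if d - 1.96 * se > 0)
negative = sum(1 for d, se in studies if d + 1.96 * se < 0)
inconclusive = len(studies) - positive - negative
print(f"Vote count: {positive} positive, {negative} negative, {inconclusive} inconclusive")

# Fixed-effect meta-analysis: weight each study by the inverse of its variance,
# so larger, more precise studies count for more in the pooled estimate.
weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled effect: {pooled:.2f} (95% CI {low:.2f} to {high:.2f})")
```

On these placeholder numbers, every study is inconclusive on its own, so a vote count simply reports that results are mixed; the pooled estimate, by contrast, is clearly positive, which is the kind of finer-grained reading Mr. Cooper describes.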

To safeguard the objectivity of the reviews, researchers who take one on have to follow strict guidelines. Reviewers must ferret out every study ever done on a topic, whether published or not, as long as it meets the group’s criteria for a credible study. If reviewers have a potential bias toward a particular outcome, they have to reveal it, and they have to ensure that their funding comes from more than one source.

The group’s first review is a look at seven studies on “scared straight” programs. Used mostly in the United States, the programs enlist convicted criminals to give delinquent teenagers a harsh look at prison life, hoping to deter budding criminals from a life of crime.

It doesn’t, according to the review by Anthony J. Petrosino, a researcher with the American Academy of Arts and Sciences in Cambridge, Mass. His analysis suggests that scared-straight participants are more likely to rack up a subsequent arrest than nonparticipants.

Heeding Advice

The reviews of educational interventions, in comparison, will be slower in coming. Some of the hesitancy comes because those in the field have traditionally been wary of experiments and more quantitatively oriented studies. Some education researchers contend, for example, that it’s unethical to provide a potentially beneficial intervention to one group of children and not another. Others fear that researchers could “miss the trees for the forest” by focusing only on strict experiments with control groups and treatment groups.

For their part, the leaders of the Campbell Collaboration say they are not against qualitative research. They just haven’t figured out how to incorporate it.

“We know in these control trials we need to know what’s going on inside the black box—whether the cops are following orders or whether the initiatives in housing projects are really carried out,” Mr. Boruch said. “That’s where good qualitative people come in. The challenge is how to integrate it all.”

The collaboration’s education group is taking some cues for now from the Centre for Evidence-Informed Policy in Education at the University of London’s Institute of Education. With support from the British government, the center coordinates research syntheses—both qualitative and quantitative—on “what works” in education.

Its forthcoming reports focus on such topics as problem-based learning and volunteer tutoring, and Campbell’s creators are hoping to borrow some of them for their own archives.

In the meantime, the education group’s co-chairman, C. Kent McGuire, wants to prod others to do Campbell reviews. Mr. McGuire, the U.S. Department of Education’s assistant secretary for educational research and improvement under President Clinton, is also enlisting policymakers to decide what research questions the education group should undertake.

“When they don’t feel like they ‘own’ the questions, policymakers will not trust the evidence,” Mr. McGuire said.

Whether the politicians will indeed “trust the evidence” is still an open question for the fledgling group. As Philip Davies, the education panel’s other co-chairman, points out, so-called faith-based initiatives in social policy remain popular in both the United States and the United Kingdom even though their research base is shaky.

“I am not naive enough to assume that we’re going to get the British government or the U.S. government to give up some of their treasured beliefs,” said Mr. Davies, a policy evaluator in the agency that Prime Minister Blair set up to lead his push for evidence-based government. “What we’re trying to say is this is what the evidence tells us about what works. The test is for government to run with it or not, and see if they get re-elected.”

A version of this article appeared in the April 03, 2002 edition of Education Week as Campbell Collaboration Seeks To Firm Up ‘Soft Sciences’
