Published Online: January 29, 2013
Published in Print: January 30, 2013, as 'i3' Raises Ante in Evidence, Research Push

Ed. Dept. Raises Evidence, Research Ante in Grant Awards

Ed. Dept. takes step to broaden standards for other aid contests

Using the Investing in Innovation program as a building block, the U.S. Department of Education is taking the next formal step to make research and evidence far more important factors as it awards competitive grants.

The goal is twofold: to reward projects that already have established a research-based track record of success and to encourage grant winners to produce rigorous evidence detailing the extent to which their project does—or does not—work.

To make that happen, the Education Department is proposing significant changes to an arcane, bureaucratic set of rules known as EDGAR, or the Education Department General Administrative Regulations, as part of a governmentwide push to introduce more evidence into decisionmaking.

Those rule changes, which are open for public comment, will serve as an umbrella over all the department's competitive programs, potentially governing more than $2 billion in grants. They would not apply to large, hallmark grants—such as Title I for disadvantaged students—that are given to states under a formula set by law.

But the proposed rules nonetheless signal to educators and policymakers that evidence matters.

Staying Power

"This has the potential to be a major step forward," said Jon Baron, the president of the Coalition for Evidence-Based Policy, a Washington-based group of researchers working to increase government effectiveness through evidence. "The question is the extent to which this would be used. But for the first time, in many department programs, it would begin to institutionalize the development and the use of rigorous evidence."

It's not that the Education Department could not have done this before on a piecemeal basis, but placing those rules in EDGAR gives the ideas more staying power. They can now be applied to any competitive-grant program without having to go through the cumbersome rulemaking process each time a new competition is launched.

The department has not yet indicated what competitive-grant programs, beyond i3, might begin to use the standards.

Still, "this shows, again, that the Education Department has a preference for funding programs that have a very strong evidence base," said Michele McLaughlin, the president of the Knowledge Alliance, a national association of research groups based in Washington.

The groundwork for the rules was laid early in the Obama administration with the launch in 2009 of the relatively small, yet important, $650 million Investing in Innovation contest. Known as i3, it rewarded not just promising ideas but projects with strong evidence of past success. The contest was the first effort in years by the department to award grants based, in part, on results proved by sound research.

And last year, the Office of Management and Budget issued a memo to federal agencies directing them to include with their fiscal 2014 budget requests their "most innovative uses of evidence and evaluation." A lack of emphasis on evidence has been considered a problem across agencies, not just education, particularly when it comes to social policy.

The proposed EDGAR changes are out for public comment until Feb. 12; the timeline for getting them on the books depends on how many comments are received.

Once they are finalized, the rules would:

• Create tiers of evidence. This recognizes that depending on the competition and its goals, more or less evidence may be appropriate.

• Standardize definitions of what it means to have "strong" or "moderate" evidence, or "evidence of promise" and "strong theory."

• Define what kind of evidence a project must produce at the end of a grant period. This seeks to ensure some projects produce strong evidence that meets the What Works Clearinghouse's evidence standards, in which the gold standard is an experiment that randomly assigns students to either an intervention or a control group. (The online clearinghouse is an initiative of the Education Department to vet education research against rigorous standards.)

• Extend the official project time frame for a grant winner. This would enable some winners to collect data on a project's impact long after the grant officially ends.

Link to Outcomes

Several groups that monitor the use of evidence in federal grantmaking, including Mr. Baron's coalition, are urging Education Department officials to make one change in their proposal: to make clear that strong evidence should be linked to outcomes that matter—such as high school graduation rates—and not an outcome that may be statistically significant but not related to improving student achievement.


The department said it will consider, and respond to, all comments once the deadline passes.

The Investing in Innovation program "was a way of seeing how this could actually work," said Jim Shelton, the department's assistant deputy secretary for innovation and improvement, referring to evidence-based grantmaking. "Now, we want to institutionalize the evidence framework for use by other programs now and in the future."

Robert E. Slavin, the director of the Center for Research and Reform in Education at Johns Hopkins University, in Baltimore, pointed out that in many federal contests, there are often bonus points—also called competitive preferences—given for focusing a grant proposal on a rural area, or for doing something in the STEM (science, technology, engineering, and math) subjects, for example.

"Oftentimes the difference between a successful proposal and an unsuccessful proposal is a couple of points," said Mr. Slavin, whose school improvement organization Success for All has won two i3 grants and is involved in a third. "These competitive preferences can be hugely important. Well, why not ask whether projects actually work?"

Success for All has been so successful in winning i3 grants in part because it started building an evidence base for its work when it was founded in 1987, Mr. Slavin said.

"Part of what we were trying to do is build a good program," he said, "but also build the idea that rigorously evaluated programs were a way forward for education in general."

Vol. 32, Issue 19, Pages 19, 22
