If improving the “rigor” of education studies has been the watchword for much of the work carried out by the U.S. Department of Education’s key research agency over the past seven years, “relevance” and “usefulness” seem to be shaping up as twin themes for the half-dozen years ahead.
At least that’s the message John Q. Easton, the new director of the department’s Institute of Education Sciences, is communicating as he speaks to national groups around the country. Five months into his six-year term, the 60-year-old Mr. Easton has perfected what he calls his “five-bullet talk” on his plans for the $617-million-a-year agency, founded in 2002. While not yet a hard and fast agenda, his presentation outlines his own goals for the direction the government plans to take in shepherding federal education research.
One point that Mr. Easton makes clear is that while promoting rigorous research through randomized experiments will be an important part of that agenda, it won’t be the agency’s guiding star as it was under his predecessor, Grover J. “Russ” Whitehurst.
“The IES did a fabulous job of increasing the rigor of education research. I’m not retreating from that,” Mr. Easton told a national advisory board last month. “At the same time, I’m very interested in questions of usability, and one way you do that is by involving policymakers and practitioners early on.”
Research programs being launched by the institute so far call for using multiple kinds of research strategies. Mr. Easton also said that, when a new competition for federally funded regional education laboratories is held in the next year or two, he hopes to drop requirements for them to conduct large-scale randomized controlled trials, or RCTs, which randomly assign participants to either a treatment or a control group.
“When you need evidence of whether something works or not, you do RCTs,” he elaborated in a recent interview, “but you also have to have much more information about context and implementation so that you get an understanding of why or why not we got the finding that we did.”
Beyond ‘What Works’
The shift “is kind of an interesting next step for IES,” said Gerald E. Sroufe, the director of government relations for the Washington-based American Educational Research Association.
“Clearly, the emphasis was on rigorous research methods,” he added. “I think the new method is going to be to look at what would make research more relevant.”
Under Mr. Whitehurst, the institute’s first director, the agency moved early to increase funding for studies using randomized controlled trials and other rigorous methods in response to widespread dissatisfaction among policymakers and practitioners with the quality of education research.
The agency also created the What Works Clearinghouse, which vetted the research evidence on education programs and policies and made the results widely available on a user-friendly Web site.
Those and other efforts improved the agency’s reputation with federal policymakers from what it had been during the institute’s previous incarnation as the Education Department’s office of educational research and improvement.
But the studies issued by the IES yielded some disappointing results. Most of the education strategies tested were found to produce little, if any, effect on student learning.
In his talks, Mr. Easton, a veteran of the education research community in Chicago, has said that the field needs to know more than “what works.” Educators need to develop a better understanding of schools as organizations and how improvement happens in them, he believes.
“The kind of rigorous evaluations that Russ Whitehurst was talking about work much better when there’s a well-defined program without the fuzz around the edge,” said Eric A. Hanushek, a senior fellow at the Hoover Institution at Stanford University and the president of the national board that advises the IES. “Now we’re talking about looking at the fuzz.”
In his five-bullet talk, Mr. Easton says he wants to sharpen the field’s understanding of how the research-and-development process works in education, and of the kind of cohesive government infrastructure that might support it.
On that question, Mr. Easton is working with James H. Shelton III, the Education Department’s assistant deputy secretary for innovation and improvement. The collaboration is potentially important: Mr. Shelton’s office presides over the Investing in Innovation Fund, $650 million in economic-stimulus money aimed at spurring educational innovations. (“Stimulus Rules on ‘Turnarounds’ Shift,” this issue.)
Mr. Easton said he is also soliciting suggestions from the field and studying writings by Anthony S. Bryk, who heads the Carnegie Foundation for the Advancement of Teaching, a research and policy center at Stanford. Mr. Bryk advocates a “design-engineering” approach to innovation that calls for designing an intervention, testing it, reviewing and redesigning it, and testing it again.
How all of that will play out in the federal research agency is still an open question.
“I would also like to see us move from a dissemination model to a facilitation model,” Mr. Easton said, noting one of his bullet points, “so that we’re not just dropping findings out for policymakers to use.”
He said he sees a major role for the IES in helping states build longitudinal-data systems with the $250 million in economic-stimulus money being directed to those ongoing efforts, and in helping them develop the research capacity to use the data. “A lot of school districts don’t have this capacity,” he said, “but they could.”
That’s important, said Susan Fuhrman, the president of Teachers College, Columbia University, because “so much of what’s going on in education we don’t have evidence for, and the federal government doesn’t have the capacity to do it all.”
Mr. Easton’s ideas are getting good reviews so far from Ms. Fuhrman and other leaders in education research.
“I think he’s right on the mark,” said James W. Kohlmoos, the president of the Knowledge Alliance, a Washington-based trade group whose interests in promoting educational R&D dovetail with Mr. Easton’s ideas.
Mr. Hanushek said he worries a bit about how the IES will study the organization of schools in a rigorous way. “That’s tough stuff,” he said. “Some of it involves novel research and evaluation, and there might be some missteps.”
Mr. Easton’s orientation to collaborate with local educators grows out of a career spent doing practical research. He was involved in 1990 when Mr. Bryk formed the Consortium on Chicago School Research, and later became the group’s executive director. Mr. Easton was also the research director for the Chicago school system from 1994 to 1997.
The Chicago consortium’s model of researcher-practitioner partnerships has spread, with consortia being formed to emulate it in Texas, the New York City area, Baltimore, and other regions.
“I think it’s a really powerful model,” Mr. Easton said, and one in which the 10 regional education labs that the IES oversees may have a future role.
Mr. Easton’s work at the consortium put him in close contact with U.S. Secretary of Education Arne Duncan, who was Chicago’s schools chief from 2001 to 2008. That history raises questions, for some, over whether the “firewall” that shields the research agency from possible political influence from other federal education officials will show some cracks.
“I believe we need the firewall,” Mr. Easton said. “We also need to be responsive to the needs of the field. But I don’t think I should be sitting at the table formulating policy.”
A version of this article appeared in the December 02, 2009 edition of Education Week as Director of Research Aiming for ‘Usability’ With Federal Studies