
‘Scientifically Based Practice’

By Deborah Stipek — March 22, 2005

The pressure is on. As a nation, we are asking teachers and administrators to bring all students to high standards of achievement, and we are holding them accountable. By raising the stakes for demonstrating better student outcomes, we have created a desperate need for information on how to achieve these challenging new goals. Everyone seems to agree that it is time for education researchers to deliver the kind of systematic knowledge that policymakers and practitioners need to do the job the nation is asking of them.

Nowhere has faith in the value of research for informing policy and practice been more forcefully expressed than in the nation’s capital. The U.S. Department of Education’s recent strategic plan claims that “we will change education to make it an evidence-based field.” Indeed, “scientifically based practice” has become the constant refrain of the Bush administration.

Improving the quality of education research does not solve the problem of how the findings will be implemented.

But the administration is also recommending significant changes in the way education researchers do business. According to the Institute of Education Sciences’ director, Grover J. “Russ” Whitehurst, the focus of research should be on identifying effective teaching practices. Borrowing from the field of medicine, the federal government has also put its faith, and its money, in a particular methodology—randomized field trials. This methodology is considered to be more rigorous than any other used in education research, and it allows causal conclusions that no other method can boast.

Also concerned with the quality and reputation of education research, the National Research Council Committee on Scientific Principles in Education Research offers a somewhat different set of recommendations. The committee suggests that the fit between the method and the questions being asked is more important than the particular method. Its recommendations focus primarily on the culture of education research—the need to foster a greater commitment to objectivity, high standards of scientific inquiry, replication, and the free flow of constructive critique.

Yet a third set of recommendations is well articulated in two documents—one issued by the National Academy of Education in 1999 (Recommendations Regarding Research Priorities: An Advisory Report to the National Education Research Policy and Priorities Board), and another by the National Research Council (Strategic Education Research Partnership, SERP). These reports promote, as the administration does, research that focuses on the problems of practice. Their recommendations differ from the administration’s strategy in several important ways, however. First, they encourage research in what Donald Stokes, in his 1997 book, calls Pasteur’s Quadrant—research on practical problems that develops, at the same time, general principles that can guide future research and practice. The reports suggest particular qualities of research that they claim will be more useful for improving education practice.

They recommend, for example, research that is embedded in practice and that involves collaborations between researchers and practitioners. Unlike the traditional linear model of “research-into-practice,” their view of productive research and development involves moving back and forth between research and practice. Innovations are developed by researchers collaborating with practitioners. They are tried out in classrooms, refined or developed by practitioners in their schools and classrooms, and then systematically studied by researchers. The link between research and practice is assumed to be complex, reciprocal, and dynamic.

Thus we have three well-developed proposals for how educational researchers can get their act together and then deliver. All three have merit, and they are not mutually exclusive, except inasmuch as time, resources, and talent are limited.

The culture of research organizations, especially universities, has not been particularly supportive of collaborative research that focuses on practical issues. But let us suppose, optimistically, that we are able to effect the needed changes in research contexts and make progress on all of the recommendations: We increase the number of randomized field trials that produce evidence for the value of particular instructional approaches; we increase the commitment and culture of rigorous scientific methods among education researchers; and we develop sustained collaborations between researchers and practitioners in which effective teaching strategies are developed, tested, refined, and disseminated.

We are still only halfway to scientifically based practice. There is more to do.

First, research findings must be made more accessible. Most research evidence is published in places and forms that only other researchers visit and can comprehend. The Bush administration’s effort to give policymakers and practitioners easy access to research findings through its What Works Clearinghouse is a laudable beginning.

We also need to create an appetite for research findings. Practitioners’ decisions are based primarily on their own intuitions and experience and occasionally on advice from colleagues, principals, or workshop leaders. The idea of basing decisions on research findings or even data collected at the local level is not part of the culture of teaching. New technology and the push for data-based decisionmaking and evidence-based practice are beginning to change the situation, but basing decisions on research and data is a new concept. Both the desire to consult research and the skills to interpret it will need to be developed within the teaching community.

We might expect the demand for and use of education research to rise if the quality and clarity of findings improve significantly. This occurred to some degree in medicine. But even in medicine, the path from findings to local use is indirect, often slow, and sometimes nonexistent. Education presents more serious obstacles to the implementation of research findings because the implications for practice are rarely straightforward.

We will also need to change the organization of teachers’ work to make it possible for them to learn new, effective practices. Evidence-based teaching involves more than prescribing the right pill. Research findings can never be specific enough to guide all of the myriad decisions that teachers need to make, moment by moment, in their own classrooms with their own students.

As a consequence, teachers need to have a deep understanding of the innovative methods and programs they are asked to implement. This requires far more time out of the classroom than they have available during the workday, and more training and support than most schools are organized to provide. Without these, however, the instruction that is actually implemented may bear little resemblance to the instruction that research demonstrated as effective.

Productive use of research findings at the policy level also requires many judgment calls. A policy found to be effective in one context is not necessarily effective in another, and there are often many details related to the original conditions of the research that need to be attended to when applying findings in new contexts.

Consider the example of class-size reduction in California. A large, random-assignment study in Tennessee demonstrating the benefits of reducing class sizes to about 15 students was used to support a policy of reducing class size to 20 in California. But unlike in Tennessee, where trained teachers were in good supply, in California there was a serious teacher shortage. Because crucial variables related to the context of the study were ignored, the implementation of this very costly policy in California may have done more harm than good, at least for children in the low-income communities that could not compete for the limited supply of trained and experienced teachers.

Another example is a random-assignment study of the High/Scope preschool intervention in Ypsilanti, Mich., cited repeatedly as support for preschool education. True, the study demonstrated impressive, long-term effects of a preschool experience, but the devil is in the details. Many of the preschool programs that were spawned by this compelling research evidence look nothing like the Ypsilanti program. It is very likely that many of the preschool programs based on this research confer nothing close to the advantages seen in the original High/Scope program.

These examples illustrate the complexity of making evidence-based policy decisions. Researchers will need to make sure that they communicate clearly what contextual variables and details of the intervention or program are necessary to achieve positive results. And policymakers will need either training or assistance to make judgments about the implications of research findings for their local context.

It is also important to consider that evidence-based education practices will not be implemented broadly without cooperation from the private sector. In the field of medicine, pharmaceutical companies use a substantial portion of their profits to develop and study more effective strategies to prevent or cure illness. The motive is profit, to be sure, but the rigor of the research is monitored, and an elaborate federal bureaucracy exists to constrain dissemination of products that have not met high standards of evidence for effectiveness and safety.

The situation is quite different in education. Although educational practices are hugely influenced by products developed in the private sector, objective evidence on the effects of these products on student learning is rare. Until recently, there have been no incentives for carefully designed studies because buyers haven’t asked for evidence, and no outside agency has monitored the quality or even the existence of evidence.

There are signs that this situation may change as a consequence of the Bush administration’s policy of limiting funding (for example, in the Reading First initiative) to instructional programs that are research-based. The potential value of such a policy is clearly evident. Companies that produce educational products are beginning to figure out how to do credible research that will demonstrate the positive effects of their products on student learning. But we have a long way to go to develop mechanisms and organizational structures that will ensure critical and fair reviews of the evidence offered.

Finally, when evidence, however rigorous, is pitted against politics, politics always wins. Student retention is a good example of evidence that is consistently ignored. The lack of evidence for positive effects of retaining children in their current grade when they fail to meet minimum standards appears not to have stemmed the trend of “no social promotion” policies. More rigorous, clearer, and more consistent findings may help, but policymakers will need to be willing to give more weight to research findings than they now do if evidence is to have an impact on practice.

The bottom line is that education researchers, like educational practitioners, are being asked to approach their work differently from how they did in the past. We are being challenged to impose high standards of scientific rigor on ourselves, to focus on problems of practice, and to develop sustained collaborations with practitioners. If the resources needed to do this kind of research become available (they currently are not), we should be able to live up to the challenge.

But until many other institutional changes occur, and the organizational structures to support evidence-based practice are developed, research findings, however clear and useful, will have a feather’s weight on teaching and student learning in the nation’s schools.

We do need to improve the quality and relevance of education research, but that’s not all we need to do.
